TECHNOLOGY NEWS

eWeek, Apr 03, 2026
Mac Studio 2026: Apple's Biggest Desktop Leap Yet Is Coming
New leaks reveal Apple's M5 Mac Studio with major performance upgrades, a shifting release timeline, and rising prices. Here's what we know so far.




RELATED ARTICLES
Apple Releases First iOS 26.5, iPadOS 26.5 and macOS Tahoe 26.5 Public Betas (Mac Rumors)

Mac Rumors, Apr 03, 2026
iOS 26.5 and iOS 27 Will Add These New Features to Your iPhone
Earlier this week, Apple seeded the first beta of iOS 26.5 to developers. The software update is relatively minor so far, which is not too surprising given that Apple is likely shifting its focus towards iOS 27. Apple is expected to unveil iOS 27 during its WWDC 2026 keynote on June 8, and the update should be released in September.


Mac Rumors, Apr 03, 2026
Happy Birthday, iPad: Apple's Tablet Turns 16
Today marks the 16th anniversary of when Apple released the first-generation iPad. After Steve Jobs announced the "iPad" on January 27, 2010, it launched a few months later on April 3, 2010.


RELATED ARTICLES
Apple Now Sells Refurbished M4 iPad Pro Models Starting at $759 (Mac Rumors)

Engadget, Apr 02, 2026
Google releases Gemma 4, a family of open models built off of Gemini 3
When Google released Gemini 3 Pro at the end of last year, it was a significant step forward for the company's proprietary large language models. Now, the company is bringing some of the same technology and research that made those models possible to the open source community with the release of its new family of Gemma 4 open-weight models.

Google is offering four different versions of Gemma 4, differentiated by the number of parameters on offer. For edge devices, including smartphones, the company has the 2-billion and 4-billion "Effective" models. For more powerful machines, there's the 26-billion "Mixture of Experts" and 31-billion "Dense" systems. For the unfamiliar, parameters are the settings a large language model can tweak to generate an output. Typically, models with more parameters will deliver better answers than ones with fewer, but running them also requires more powerful hardware.
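As a rough illustration of why parameter count drives hardware requirements, the sketch below estimates the memory needed just to hold each variant's weights. The parameter counts come from the article; the 16-bit default precision and the helper function itself are illustrative assumptions, not anything Google has published.

```python
# Back-of-the-envelope sketch: loading a model's weights takes roughly
# (parameter count) x (bytes per parameter) of RAM or VRAM.
# Parameter counts below are the Gemma 4 sizes cited in the article;
# 2 bytes/parameter corresponds to 16-bit weights (an assumption).

def weight_memory_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate memory for weights in GiB (ignores activations and KV cache)."""
    return num_params * bytes_per_param / 1024**3

for name, params in [("2B Effective", 2e9), ("4B Effective", 4e9),
                     ("26B Mixture of Experts", 26e9), ("31B Dense", 31e9)]:
    print(f"{name}: ~{weight_memory_gib(params):.1f} GiB at 16-bit")
```

By this estimate, the 2-billion-parameter model fits comfortably on a phone-class device, while the 31-billion "Dense" variant needs workstation-class memory, which matches the edge-versus-powerful-machine split described above.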

With Gemma 4, Google claims it's managed to engineer systems with "an unprecedented level of intelligence-per-parameter." To back up this claim, the company points to the performance of Gemma 4's 31-billion and 26-billion variants, which claimed the third and sixth spots respectively on Arena AI's text leaderboard, beating out models 20 times their size.     

All of the models can process video and images, making them ideal for tasks like optical character recognition. The two smaller models are also capable of processing audio inputs and understanding speech. Separately, Google says the Gemma 4 family can generate code offline, meaning you could use the models for vibe coding without an internet connection. Google has also trained the models in more than 140 languages.


©1999-2026 CEOExpress Company LLC