|
Google will bring Gemini, the company's new large language model, to Pixel 8 smartphones after all. The phone will incorporate Gemini Nano, a version of the model built to run locally on personal devices. This follows a successful rollout to the Pixel 8 Pro late last year and the Samsung Galaxy S24 in January.
The Pixel 8 features the same proprietary Tensor G3 chip as the Pro, which Google designed to accelerate AI performance, so the overall experience should be similar on both phones. The feature arrives in the next Pixel Feature Drop, though only as a developer preview for now: Google wants to collect feedback and make sure everything runs smoothly on the slightly lower-specced phone.
This is a fairly sudden reversal for Google. The company originally said that the Pixel 8 couldn't handle on-device Gemini because of "hardware limitations," despite it having the same chip as the Pro model. The main difference between the two phones is the RAM allotment, which doesn't seem like a deal-breaker when it comes to running an on-device AI model. It looks like Google eventually came around to that line of thinking.
So what can you do with this thing? The company is expanding two features that make use of the LLM, both of which have already been available to Pro users. The Reco
|
|
Snapchat has a new AI-powered perk for subscribers: Bitmoji versions of your pet. The feature, which is unfortunately not called "petmoji," allows users to snap a photo of their four-legged friend to create a cartoon-like avatar to accompany their Bitmoji in the Snap Map.
Based on screenshots shared by the company, it seems users will be able to choose from a few different variations of the AI-generated images after sharing a photo of their pet. That's considerably less customization than what you can do with your own human-inspired Bitmoji, though it should allow users to create something that looks similar to their IRL pet. (No word on whether Snap could one day introduce branded pet accessories for animal avatars, as it does for human Bitmoji.)
The addition is also the latest example of how Snap has embraced AI features in its subscription offering. Since debuting Snapchat+ in 2022, the company has used the premium service to experiment with generative AI features, including its MyAI assistant as well as camera-powered features like Dreams and AI-generated snaps. Snapchat has more than
|
|
Elon Musk has announced new changes to X (formerly Twitter) that will allow certain accounts to unlock premium features.
|
|
Microsoft's Copilot AI service is set to run locally on PCs, Intel told Tom's Hardware. The company also said that next-gen AI PCs would require built-in neural processing units (NPUs) with over 40 TOPS (trillion operations per second) of performance — beyond the capabilities of any consumer processor currently on the market.
Intel said that the AI PCs would be able to run "more elements of Copilot" locally. Currently, Copilot runs nearly everything in the cloud, even small requests, which creates lag that's tolerable for larger jobs but not ideal for smaller ones. Adding local compute capability would reduce that lag while potentially improving performance and privacy as well.
Microsoft was previously rumored to require 40 TOPS on next-gen AI PCs (along with a modest 16GB of RAM). Right now, Windows doesn't make much use of NPUs, apart from running video effects like background blurring for Surface Studio webcams. ChromeOS and macOS both use NPU power for more video and audio processing features, though, along with OCR, translation, live transcription and more, Ars Technica noted.
So far, the processor with the fastest NPU is Apple's M3, which offers 18 TOPS across the lineup (M3, M3 Pro and M3 Max). AMD's Ryzen 8040 and 7040 laptop
|
|
Many home security companies are pushy with subscriptions. These cams give you the features we find especially important, without any monthly fees.
|
|
GitHub, the online developer platform that allows users to create, store, manage, and share their code, has been on a generative AI (genA) journey since before ChatGPT or Copilot was widely available to the public.
Through an early partnership with Microsoft, the dev platform adopted Copilot two-and-a-half years ago, tweaking it to create its own version: GitHub Copilot.
The genAI-based conversational chat interface is now used as a tool for both GitHub users and internal employees to assist in code development, as well as an automated help desk tool.
|
|