|
Google will bring Gemini, the company's new large language model, to Pixel 8 smartphones after all. The phone will incorporate Gemini Nano, a version of the model built to run locally on personal devices. This follows a successful rollout to the Pixel 8 Pro late last year and the Samsung Galaxy S24 in January.
The Pixel 8 features the same proprietary Tensor G3 chip as the Pro, which was designed to speed up AI performance, so the overall experience should be similar on both devices. The feature will arrive in the next Pixel Feature Drop, though only as a developer preview for now. Google wants to collect feedback and make sure everything runs smoothly on the slightly lower-specced phone.
This is a fairly sudden change of course for Google. The company originally said the Pixel 8 couldn't handle on-device Gemini because of "hardware limitations," despite the phone having the same chip as the Pro model. The main difference between the two phones is the RAM allotment, which never seemed like a deal-breaker for running an on-device AI. Google appears to have come around to that line of thinking.
So what can you do with this thing? The company is expanding two features that make use of the LLM, both of which have already been available to Pro users. The Reco
|
32 Degrees North sunglasses allow you to adjust your reading glasses prescription in an app and tap in and out of reading-distance mode.
|
NYC Mayor Eric Adams has a new plan to make the city's subways safe: metal detectors. But these aren't just any metal detectors: these come with AI. As for how well they work, the early results aren't looking good.
|
Microsoft's Copilot AI could soon run locally on PCs rather than relying on the cloud.
Intel told Tom's Hardware that the chatbot could run on future AI-enabled PCs that would need to incorporate neural processing units (NPUs) capable of exceeding 40 trillion operations per second (TOPS) — a performance level not yet matched by any consumer processor currently available.
Intel mentioned that these AI PCs would be equipped to handle "more elements of Copilot" directly on the machine. Copilot currently relies predominantly on cloud processing for most tasks, leading to noticeable delays, especially for minor requests. Enhancing local computing power is expected to reduce such delays, potentially boosting performance and privacy.
|
Many home security companies push subscriptions aggressively. These cameras give you the features we find especially important, without any monthly fees.
|