Microsoft's Copilot AI could soon run locally on PCs rather than relying on the cloud.
Intel told Tom's Hardware that the chatbot could run on future AI-enabled PCs, which would need to incorporate neural processing units (NPUs) capable of more than 40 trillion operations per second (TOPS). No consumer processor currently available reaches that performance level.
Intel said these AI PCs would be equipped to handle "more elements of Copilot" directly on the machine. Copilot currently relies on cloud processing for most tasks, which leads to noticeable delays, even for minor requests. Moving more of that work onto local hardware should reduce those delays and could improve both performance and privacy.
GitHub, the online developer platform that allows users to create, store, manage, and share their code, has been on a generative AI (genA) journey since before ChatGPT or Copilot was widely available to the public.
Through an early partnership with Microsoft, the dev platform adopted Copilot two-and-a-half years ago, tweaking it to create its own version — GitHub Copilot.
The genAI-based conversational chat interface now serves both GitHub users and internal employees, assisting with code development and acting as an automated help desk tool.