On Monday, Microsoft introduced "Phi-3-mini," a 3.8-billion-parameter language model that the company claims rivals the performance of the older GPT-3.5 (the model behind the free version of ChatGPT) and Mixtral 8x7B. The paper is titled "Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone," a strong signal that Microsoft believes it now has an LLM small enough to run directly on a phone, and by extension on your PC.
Microsoft hasn't said that Phi-3-mini will be the next Copilot, running locally on your PC. But there's a case to be made that it will be, and the benchmarks in the paper give us an idea of how well it might work.
1.) Local AI matters
It's a familiar argument: if you issue a query to Bing, Google Gemini, Claude, or Copilot, that query lives in the cloud. It could be embarrassing ("is this wart bad?"), legally sensitive ("can I get in trouble for stealing mail?"), or something you don't want to leak at all, like a list of bank statements.
Corporations would like to ask questions of their own data via AIs like Copilot, but to date there is no "on-premises" version of Copilot. A local version is fast becoming a necessary option.
Nearly two dozen dating apps were flagged by Mozilla's Privacy Not Included researchers for failing to meet privacy and security standards, for sharing customer data with third parties, and for denying users the right to wipe their data from the app. (The story, "Mozilla Waves Red Flag Over Data Hungry Dating Apps," appeared first on TechNewsWorld.)