OpenAI is pushing back the launch of its GPT Store to early 2024, according to an email seen by The Verge. The company introduced its GPT Builder tool in early November at its first developer conference, giving subscribers an easy way to create their own custom AI bots. At the time, OpenAI also said it would soon release the GPT Store, where users could list their GPTs and potentially make money from them. The store was initially slated for a November launch. But with the surprise ouster of OpenAI's since-reinstated CEO Sam Altman, the month didn't quite pan out as planned.
"In terms of what's next, we are now planning to launch the GPT Store early next year," OpenAI said in its email to GPT Builder users on Friday. "While we had expected to release it this month, a few things have been keeping us unexpectedly busy!" The email also notes that the company has been making improvements to GPTs based on users' feedback, and says some updates to ChatGPT are on the way.
OpenAI has been in the proc
Meta is failing to stop vast networks of people from using its platforms to promote child abuse content, a new report in The Wall Street Journal says, citing numerous disturbing examples of child exploitation it uncovered on Facebook and Instagram. The report, which comes as Meta faces renewed pressure over its handling of children's safety, has prompted fresh scrutiny from European Union regulators.
In the report, The Wall Street Journal detailed tests it conducted with the Canadian Centre for Child Protection showing how Meta's recommendations can suggest Facebook Groups, Instagram hashtags, and other accounts that are used to promote and share child exploitation material. According to those tests, Meta was slow to respond to reports about such content, and its own algorithms often made it easier for people to find abuse content and connect with others interested in it.
For example, the Canadian Centre for Child Protection told the paper a "network of Instagram accounts with as many as 10 million followers each has continued to livestream videos of child sex abuse months after it was reported to the company." In another disturbing example, Meta initially declined to take action on a user report about a public-facing Facebook Group called "Incest." The group was eventually taken down, along with other similar communities.
In a lengthy update