OpenAI Introduces New Level of AI, FTC Moves To Penalize Use Of AI For Impersonation

OpenAI will soon roll out its text-to-video artificial intelligence model, taking AI to a dazzling new level, and the company has been at pains to emphasize how cautiously it plans to proceed. At the same time, the US Federal Trade Commission (FTC) is introducing a new way of attempting to rein in the rapidly developing technology.

OpenAI introduced Sora, the text-to-video model that can “generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background” from a user’s prompt.

The model can generate videos up to 60 seconds long. The samples the company provided range from humans in street scenes to scenery, a very realistic-looking space movie trailer, and 3D animation.

Sora “understands not only what the user has asked for in the prompt, but also how those things exist in the physical world,” the company wrote in the announcement.

But the model still has weaknesses: “It may struggle with accurately simulating the physics of a complex scene, and may not understand specific instances of cause and effect. For example, a person might take a bite out of a cookie, but afterward, the cookie may not have a bite mark.”

OpenAI is making sure users know it’s taking safety seriously. Before it’s deployed to the public, Sora will first be adversarially tested by red teamers, “domain experts in areas like misinformation, hateful content, and bias.”

More importantly, the company is also building tools to help detect AI-generated content: “we plan to include C2PA metadata in the future if we deploy the model in an OpenAI product.”
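For readers curious what checking that kind of provenance data might look like in practice, here is a minimal sketch that reads a file’s C2PA (Content Credentials) manifest by shelling out to the open-source c2patool utility maintained by the C2PA project. It assumes c2patool is installed and the file format is supported; it is an illustration of the standard, not a description of OpenAI’s own detection tools.

```python
import json
import subprocess
import sys


def read_c2pa_manifest(path: str):
    """Attempt to read a file's C2PA (Content Credentials) manifest.

    Assumes the open-source `c2patool` CLI is installed and on PATH;
    when given a supported file, it prints the manifest store as JSON.
    """
    try:
        result = subprocess.run(
            ["c2patool", path],
            capture_output=True,
            text=True,
            check=True,
        )
    except FileNotFoundError:
        sys.exit("c2patool is not installed or not on PATH")
    except subprocess.CalledProcessError:
        # No manifest found, or the file format is unsupported.
        return None
    return json.loads(result.stdout)


if __name__ == "__main__":
    manifest = read_c2pa_manifest(sys.argv[1])
    if manifest is None:
        print("No C2PA provenance metadata found.")
    else:
        # The manifest store lists signed claims about how the asset was made.
        print(json.dumps(manifest, indent=2))
```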

This announcement comes just as the FTC is seeking public comment on whether it should make companies liable for creating technology that they know can be “used to harm consumers through impersonation.”

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina M. Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”


Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
