OpenAI Introduces New Level of AI, FTC Moves To Penalize Use Of AI For Impersonation

OpenAI will soon roll out its text-to-video artificial intelligence model, taking AI to a dazzling new level, and the company was careful to emphasize how cautiously it intends to proceed. At the same time, the US Federal Trade Commission (FTC) is introducing a new way of attempting to rein in the rapidly developing technology.

OpenAI introduced Sora, the text-to-video model that can “generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background” from a user’s prompt.

The model can generate videos up to 60 seconds long. The samples the company provided range from humans in street scenes, to scenery, to a very realistic-looking space movie trailer, to 3D animation.

Sora “understands not only what the user has asked for in the prompt, but also how those things exist in the physical world,” the company wrote in the announcement.

But the model still has weaknesses: “It may struggle with accurately simulating the physics of a complex scene, and may not understand specific instances of cause and effect. For example, a person might take a bite out of a cookie, but afterward, the cookie may not have a bite mark.”

OpenAI is making sure users know it’s taking safety seriously. Before Sora is deployed to the public, it will first be adversarially tested by “red teamers”: domain experts in areas like misinformation, hateful content, and bias.

More importantly, the company is also building tools that will help detect AI-generated content: “we plan to include C2PA metadata in the future if we deploy the model in an OpenAI product.”

This announcement comes just as the FTC is seeking public comment on whether it should make companies liable for creating technology that they know can be “used to harm consumers through impersonation.”

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina M. Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”


Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
