OpenAI Introduces New Level of AI, FTC Moves To Penalize Use Of AI For Impersonation

OpenAI will soon roll out its text-to-video artificial intelligence model, taking AI to a dazzling new level, and the company was careful to emphasize how cautious it intends to be. At the same time, the US Federal Trade Commission (FTC) is introducing a new way of attempting to rein in the rapidly developing technology.

OpenAI introduced Sora, the text-to-video model that can “generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background” from a user’s prompt.

The model can generate videos up to 60 seconds long. The samples the company provided range from humans in street scenes, to scenery, to a very realistic-looking space movie trailer, to 3D animation.

Sora “understands not only what the user has asked for in the prompt, but also how those things exist in the physical world,” the company wrote in the announcement.

But the model still has weaknesses: “It may struggle with accurately simulating the physics of a complex scene, and may not understand specific instances of cause and effect. For example, a person might take a bite out of a cookie, but afterward, the cookie may not have a bite mark.”

OpenAI is making sure users know it’s taking safety seriously. Before Sora is deployed to the public, it will first be adversarially tested by “red teamers,” or “domain experts in areas like misinformation, hateful content, and bias.”

More importantly, the company is also building tools to help detect AI-generated content: “we plan to include C2PA metadata in the future if we deploy the model in an OpenAI product.”

This announcement comes just as the FTC is seeking public comment on whether it should make companies liable for creating technology that they know can be “used to harm consumers through impersonation.”

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina M. Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”


Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
