OpenAI Introduces New Level of AI, FTC Moves To Penalize Use Of AI For Impersonation

OpenAI will soon roll out its text-to-video artificial intelligence model, taking AI to a dazzling new level, and the company was quick to emphasize how cautiously it intends to proceed. At the same time, the US Federal Trade Commission (FTC) is introducing a new way of attempting to rein in the rapidly developing technology.

OpenAI introduced Sora, the text-to-video model that can “generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background” from a user’s prompt.

The model can generate videos up to 60 seconds long. The samples the company provided range from people in street scenes, to landscapes, to a very realistic-looking space movie trailer, to 3D animation.

Sora “understands not only what the user has asked for in the prompt, but also how those things exist in the physical world,” the company wrote in the announcement.

But the model still has weaknesses: “It may struggle with accurately simulating the physics of a complex scene, and may not understand specific instances of cause and effect. For example, a person might take a bite out of a cookie, but afterward, the cookie may not have a bite mark.”

OpenAI is making sure users know it’s taking safety seriously. Before it’s deployed to the public, Sora will first be adversarially tested by “red teamers,” or “domain experts in areas like misinformation, hateful content, and bias.”

More importantly, the company is also building tools to help detect AI-generated content: “we plan to include C2PA metadata in the future if we deploy the model in an OpenAI product.”
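C2PA, the Coalition for Content Provenance and Authenticity standard, embeds a cryptographically signed provenance manifest directly in a media file so that downstream tools can check where a piece of content came from and what tool produced it. As a rough illustration only, and not anything OpenAI has announced shipping, the sketch below shows how a verifier might check a downloaded file for such a manifest by shelling out to the open-source c2patool command-line utility; the exact report format varies between tool versions, so treat the output handling as an assumption.

```python
# Hedged sketch: check a media file for an embedded C2PA manifest using the
# open-source `c2patool` CLI. Assumes c2patool is installed and on PATH, and
# that it prints the manifest store as JSON when given a file path (behavior
# may differ between versions).
import json
import subprocess
import sys
from typing import Optional


def get_c2pa_manifest(path: str) -> Optional[dict]:
    """Return the file's C2PA manifest store as a dict, or None if absent."""
    try:
        result = subprocess.run(
            ["c2patool", path],          # prints the manifest report for the file
            capture_output=True, text=True, check=True,
        )
    except (subprocess.CalledProcessError, FileNotFoundError):
        return None                      # no manifest found, or tool unavailable
    try:
        return json.loads(result.stdout)
    except json.JSONDecodeError:
        return None                      # output was not JSON in this tool version


if __name__ == "__main__":
    manifest = get_c2pa_manifest(sys.argv[1])
    if manifest is None:
        print("No C2PA provenance metadata found.")
    else:
        # A manifest typically records which generator produced the asset.
        print(json.dumps(manifest, indent=2))
```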

This announcement comes just as the FTC is seeking public comment on whether it should make companies liable for creating technology that they know can be “used to harm consumers through impersonation.”

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina M. Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”


Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
