OpenAI Introduces New Level of AI, FTC Moves To Penalize Use Of AI For Impersonation

OpenAI will soon roll out its text-to-video artificial intelligence model, taking AI to a dazzling new level, and the company has been careful to emphasize just how cautious it intends to be. At the same time, the US Federal Trade Commission (FTC) is introducing a new way of attempting to rein in the rapidly developing technology.

OpenAI introduced Sora, the text-to-video model that can “generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background” from a user’s prompt.

The model can generate videos up to 60 seconds long. The samples the company provided range from humans in street scenes to scenery, a very realistic-looking space movie trailer, and 3D animation.

Sora “understands not only what the user has asked for in the prompt, but also how those things exist in the physical world,” the company wrote in the announcement.

But the model still has weaknesses: “It may struggle with accurately simulating the physics of a complex scene, and may not understand specific instances of cause and effect. For example, a person might take a bite out of a cookie, but afterward, the cookie may not have a bite mark.”

OpenAI is making sure users know it’s taking safety seriously. Before it’s deployed to the public, Sora will first be adversarially tested by “red teamers,” or “domain experts in areas like misinformation, hateful content, and bias.”

More importantly, the company is also building tools that will help detect AI-generated content: “we plan to include C2PA metadata in the future if we deploy the model in an OpenAI product.”
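
The C2PA standard attaches signed provenance metadata to media files, which downstream tools can inspect to see whether content carries a declaration that it was machine-generated. As a rough illustration of what such a check might involve, the sketch below walks the top-level boxes of an MP4 file and flags any that could carry embedded metadata; the idea that a manifest would surface as an extra “uuid” box, and the script itself, are assumptions for illustration rather than anything OpenAI has described.

```python
# Minimal sketch: walk the top-level boxes of an MP4/ISO-BMFF file and flag
# any that could carry an embedded provenance manifest. The 'uuid' box check,
# and the notion that C2PA data would surface this way, are assumptions for
# illustration only.
import struct
import sys

def list_top_level_boxes(path):
    """Yield (box_type, size) for each top-level box in an ISO-BMFF file."""
    with open(path, "rb") as f:
        while True:
            header = f.read(8)
            if len(header) < 8:
                break
            size, box_type = struct.unpack(">I4s", header)
            header_len = 8
            if size == 1:  # 64-bit "largesize" follows the type field
                size = struct.unpack(">Q", f.read(8))[0]
                header_len = 16
            elif size == 0:  # box extends to the end of the file
                yield box_type.decode("latin-1"), None
                break
            yield box_type.decode("latin-1"), size
            f.seek(size - header_len, 1)  # skip payload, move to next box

if __name__ == "__main__":
    for box, size in list_top_level_boxes(sys.argv[1]):
        # Anything beyond the usual ftyp/moov/mdat set is worth a closer look;
        # extension data such as a provenance manifest would typically sit in
        # a 'uuid' box (an assumption for this sketch).
        marker = "  <-- candidate for embedded metadata" if box == "uuid" else ""
        print(f"{box}  {size if size is not None else 'to EOF'} bytes{marker}")
```

In practice, verifying a C2PA manifest also means validating its cryptographic signature with a dedicated toolkit; the box walk above only shows where such metadata would live inside the file.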

This announcement comes just as the FTC is seeking public comment on whether it should make companies liable for creating technology that they know can be “used to harm consumers through impersonation.”

“Fraudsters are using AI tools to impersonate individuals with eerie precision and at a much wider scale. With voice cloning and other AI-driven scams on the rise, protecting Americans from impersonator fraud is more critical than ever,” said FTC Chair Lina M. Khan. “Our proposed expansions to the final impersonation rule would do just that, strengthening the FTC’s toolkit to address AI-enabled scams impersonating individuals.”


Information for this story was found via the sources and companies mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
