Sora, the latest text-to-video tool from OpenAI, the maker of ChatGPT, has left at least one AI specialist feeling “frightened”. The new tool, which transforms written prompts into original videos, is expected to hasten the spread of manipulated videos and affect a wide range of industries.
According to Oren Etzioni, the founder of TruMedia.org, generative AI tools are advancing rapidly, and combined with the reach of social networks, they pose a threat to democracy at the worst possible time. TruMedia.org is a nonprofit organization that combats AI-generated misinformation in political campaigns by detecting manipulated media, including so-called deepfake videos.
The company shared a sneak peek of its text-to-video technology on X, saying the system can quickly generate high-quality, 60-second videos featuring complex scenes, sophisticated camera movements, and multiple characters with vivid emotions.
Currently, the tool is not accessible to the public. OpenAI has limited its usage to “red teamers” and a select group of visual artists, designers, and filmmakers to beta test the product and provide feedback before it is made available to a wider audience.
According to OpenAI, safety specialists will assess the tool in order to determine its potential for generating false information and harmful content.
Landing soon
According to Etzioni, the technology is developing faster than the oversight and regulation of these tools. He advocates for responsible and ethical use of AI, with the necessary precautions in place.
Using an aviation metaphor, Etzioni said the industry is building the airplane while it is already in flight, with a landing expected by November, yet it lacks the equivalent of the Federal Aviation Administration, any prior experience, and the tools needed to do the job safely.
Etzioni said that the only obstacle to widespread availability of the tool is the company itself. He believes that Sora, or a similar technology from one of OpenAI’s competitors, will be made accessible to the public within the next few months.
“Deepfake scams can impact not only celebrities, but also everyday individuals.”
Dr. Andrew Newell, the chief scientific officer at iProov, told CBS MoneyWatch that tools like Sora will make it easier for people with malicious intent to create convincing deepfake videos, and will give them greater freedom to produce offensive content.
Organizations such as banks are left to build their own AI-powered safeguards to protect consumers from these dangers.
He stated that financial institutions that depend on video authentication for security are at the highest risk.
Actors, creators, and others at risk
The tool’s most direct impact will fall on people who produce creative content such as film and other media.
Newell said that voice actors and people who make short videos for video games, education, or advertising will feel the effects most immediately.
According to Reece Hayden, a senior analyst at ABI Research, a technology intelligence firm, multimodal models could bring major cost reductions for film and TV producers in marketing and the creative industries, and could lead to a rise in AI-generated content in place of human actors.
Because Sora allows even people without artistic skills to create visual content, it could enable users to develop media in a choose-your-own-adventure style.
Hayden suggested that even a major corporation like Netflix could give consumers the opportunity to create their own content using provided prompts.
Megan Cerullo
Source: cbsnews.com