A new AI tool creates hyperrealistic photos. Can you tell the difference?

Several new artificial intelligence tools have been released this summer that allow for the creation of hyperrealistic photos, making it easier than ever to alter, or entirely fabricate, an image. Experts say it’s becoming increasingly difficult for consumers to discern what is real and what is fake.

Among the most powerful new tools is FLUX.1, or Flux, a free AI image generator released in August that produces hyperrealistic images without requiring a subscription.

What is Flux?

CBS News tested the new tool and found it was able to create convincing images of people in recognizable locations within seconds. Unlike similar tools, the results lacked many of the telltale signs of AI-generated images, such as skin that appears uncannily smooth.

Elon Musk's X has also rolled out image generation through its Grok 2 chatbot, which produced images of celebrities and copyrighted material, as well as offensive messages, CBS News testing found.

The AI model that powers Grok 2’s image generation was made by Black Forest Labs, the same startup that created Flux. Experts say that unlike other online tools, which are typically accessed through a browser, Flux can be freely downloaded and modified for personal use offline, making it not only remarkably realistic but also open to abuse.

Flux’s terms of service prohibit users from creating content that infringes on existing copyrights, creating unlawful content, or deceiving others by presenting AI-generated images as human-made.

Many companies developing AI tools, including Adobe, Google and OpenAI, have imposed restrictions in their software aimed at preventing misuse. Experts say that this isn’t something that should be taken for granted in a highly competitive market.

“The thing with Flux is not so much that it’s highly realistic, because that was going to come one way or another. That was inevitable. It’s the guardrails,” said Hany Farid, a professor at the University of California, Berkeley, specializing in digital forensics and authenticating digital media.

Experts warn that an open-source, client-side tool such as Flux opens the door to modification by a wider community of users, which can lead to uses that directly violate its terms of service. “You can literally look at the code. And you can download the code and you can modify the code,” said Farid.

“What is going to happen to those other services? Are they going to start saying, ‘Oh, I see guardrails are bad for business? Maybe we should lower those guardrails to be competitive.’ In this business we’re only as good as the lowest common denominator. And in this case it’s Elon Musk,” said Farid.

What’s next in generative AI?

Video generation tools are also becoming widely available. Black Forest Labs, the startup behind Flux, has said it plans to release video generation tools in the future. Other tools are already available to consumers; the AI-generated video below was made with a paid tool called Runway, using an AI-generated image as its starting point.

AI-generated video produced with an image from Flux.

Alex Clark, CBS News Confirmed


This new array of tools can create images without many of the typical indications of AI, meaning people need to be more cautious when viewing images and videos online. Experts advise applying basic media literacy skills when vetting visuals, including paying close attention to background elements and other details and, most importantly, considering the source when determining whether something is authentic.

At the time of publication, X and Black Forest Labs had not responded to CBS News' requests for comment.

Kara Fellows and contributed to this report.


Source: cbsnews.com