There’s no denying it at this point: AI is going to generate media (videos, articles, pictures, etc.) with such authenticity that unaided people will not be able to tell the difference between what’s real and what’s not. We’ve already seen multiple examples of insanely realistic deepfake videos, and now it appears we have the text equivalent of those fakes in this AI-powered text generator, dubbed Talk to Transformer.
Before reading on, open up that link to Talk to Transformer in a new tab and pop in your own prompt. The site, which was created by machine learning engineer Adam King, will take somewhere between five and ten seconds to load, and then, from just the half-sentence you’ve written, will spit out a couple hundred words (the rough average across our test prompts) that are astoundingly coherent. And we don’t use the word “astoundingly” lightly. Check out this bit of sample text:
Talk to Transformer is able to generate such humanlike text thanks to—you probably guessed it—neural networks coupled with big data. The endlessly entertaining prompt-completing site uses source code from the nonprofit organization OpenAI (founded by Elon Musk, among others), which “trained” neural networks to essentially write content based on what they learned after processing eight million web pages’ worth of human-written text. That’s roughly equivalent to all of Shakespeare’s works combined… multiplied by 8,000.
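The core idea is simpler than it sounds: a language model estimates, for a given context, a probability distribution over the next word, then samples from it over and over to extend the prompt. Here’s a toy sketch of that loop using a tiny word-bigram model — this is an illustration of the principle only, not OpenAI’s code, and the training “corpus” below is made up:

```python
import random
from collections import defaultdict

# A few made-up training sentences standing in for GPT-2's
# eight million web pages.
corpus = (
    "the model reads the prompt and predicts the next word . "
    "the model samples the next word and repeats the process . "
    "the prompt guides the model toward coherent text ."
).split()

# "Training": record which word was observed following which.
next_words = defaultdict(list)
for current, following in zip(corpus, corpus[1:]):
    next_words[current].append(following)

def complete(prompt, n_words=10, seed=0):
    """Extend `prompt` by sampling one next word at a time."""
    rng = random.Random(seed)
    tokens = prompt.split()
    for _ in range(n_words):
        candidates = next_words.get(tokens[-1])
        if not candidates:  # dead end: no observed continuation
            break
        tokens.append(rng.choice(candidates))
    return " ".join(tokens)

print(complete("the model"))
```

GPT-2 replaces the bigram lookup with a large Transformer network that conditions on the entire prompt rather than just the last word, which is why its completions stay coherent over whole paragraphs instead of a few words.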
OpenAI’s language model, referred to as GPT-2, is explained in more detail in the Two Minute Papers video above. The most important detail discussed in the video comes at around 5:12. It seems that OpenAI thinks the full version of GPT-2 is too powerful to release to the public right now, which means a significantly more capable version of the AI powering Talk to Transformer already exists behind closed doors. OpenAI notes that GPT-2 could be used beneficially, for something like an AI writing assistant, or in a nefarious way, like spreading huge amounts of real-sounding, yet completely made-up, news. By the way, GPT-2 is the second iteration of GPT. Imagine what GPT-3 will be able to do.
Feature image: OpenAI