A lot of impactful trends have been overshadowed by news of the election cycle and the pandemic.
One “tech”-tonic shift you may have missed is the recent debut of GPT-3, which stands for Generative Pre-trained Transformer 3. In a nutshell, it is a computer model trained to write like a human being after being given some introductory text. It has progressed to a shockingly good, and dare I say creepy, level.
But GPT-3 goes well beyond the parlor trick of a computer being able to write coherent articles (see the example below from The Guardian). People are getting it to do some impressive stuff, like writing computer code, designing web pages, writing poetry and more.
The biggest deal is the jump in scale. OpenAI, the research lab behind GPT-3, “taught” the system with 175 billion parameters, more than 100 times the 1.5 billion of its predecessor, GPT-2.
I submit that you need to pay close attention to the impact this will have, especially on jobs. Virtually every industry populated by knowledge workers will be affected as this technology improves. That said, even with the latest advancements, it still has a long way to go.
I’ve included at least one Cool Tool to let you try GPT-3 for yourself.
BTW, I did write this entire newsletter—but someday I might just ask a computer to do it for me. And you won’t know the difference.