“The article tells the story of a man who, after his father’s death, turned to GPT-4, an advanced language model, to analyze his father’s thousands of emails and try to generate new ones in his father’s voice.” So starts the ChatGPT summary of an article by Oliver Bateman, a digital pal of mine, who did in fact input thousands of his father’s emails into ChatGPT. His article documenting the experiment highlights the limitations of AI and, perhaps even more, the complexity of humans.
GPT-4 was able to generate text that resembled Bateman’s dad’s writing to some extent. However, his idiosyncratic, likely dyslexic writings were poorly reflected by the AI. The deeply personal, often nonsensical aspects of human language cannot be properly captured by a machine now, and perhaps never can be.
The WGA is on strike right now, in part because of a fear that creatively bankrupt media executives will try to replace writers with generative AI. The concern is valid only because of a reduction of media, from semi-magical storytelling, into content units. Without quirks and imperfections, communication is meaningless.
In my day job, as a senior content manager for a research consultancy, I often use GPT-4. It is competent at various brute-force operations, such as turning lengthy transcripts into notes and proofreading content, even if the inputs require constant fine-tuning and the outputs require careful attention to ensure accuracy. But could it write content that would bring back the dead? Could GPT-4, if sufficiently trained, analyse my father’s emails, and perhaps even write new ones?
Photo: “3d render of the minecraft character at a computer typing” by chris-hayes is marked with Public Domain Mark 1.0.