ChatGPT is boring
Talk about ChatGPT is now all over the place. Some herald it as the dawn of the AI revolution; others fear it and call for regulation. While I am firmly in the second camp, I have to say: ChatGPT is boring.
It is a bullshitter.
From the vantage point of philosophy, it does not lie, because it does not know the truth. It mimics writing humanly readable texts, and in doing so it also simulates telling the truth. But it does not tell it.
As Harry Frankfurt, a Princeton philosophy professor, tells us in his essay “On Bullshit”: “the bullshitter hides […] that the truth-values of his statements are of no central interest to him.” What is relevant for ChatGPT is the same as for the bullshitter: it is neither on the side of truth nor on that of untruth. ChatGPT’s interest is getting away with what it says. ChatGPT does not care whether the things it writes describe reality correctly; it just picks them out (from the internet), or makes them up (by generating language), to serve the purpose of being asked and being able to answer in a human way.
As Frankfurt reminds us: “It is impossible for someone to lie unless he thinks he knows the truth.” ChatGPT does not know the truth, as it does not know anything. It is a bullshitter par excellence.
“Why is there so much bullshit?” Frankfurt asks. He argues that bullshit is unavoidable whenever circumstances require someone to talk without knowing what he is talking about. When the interests of efficiency or financial gain require machines to help people talk without knowing what they are talking about, even more bullshit is generated. In a democracy, people have a propensity to hold opinions about everything. So ChatGPT does democratic bullshitting.
Because it writes better than humans, it is assumed to know better as well. Readers of its bullshit mistake it for the truth. The good news: machine bullshit can be regulated; with people it is much harder.
The author is a signatory of the open letter “Pause Giant AI Experiments”, which calls on all AI labs to immediately pause for at least six months the training of AI systems more powerful than GPT-4.