“Demons only mimic human language; in reality, they are beasts incapable of communication. Talking to demons is useless. To them, language is merely a tool to deceive humans.”
This quote is from “Frieren: Beyond Journey’s End,” and I think the same sentence structure works for AI: “AI is not human; language is merely an output for achieving the purpose it was designed to serve.”
All models are trained on existing human information, where “existing” means information that can be collected and digitized. Obviously, what you say to your friends over late-night LoL won’t be recorded, nor will your chats with the breakfast shop lady. Yet these conversations, not the “high-intent” ones typed through keyboards onto the internet, are the mainstay of our daily lives.
Furthermore, every commercially available model today is fine-tuned; an un-tuned model is more chaotic than Cthulhu. And tuning carries intent. Usually the intentions are good, such as making the model conform to universal human moral values and avoid causing harm. But in the process, the tuner’s subjective value judgments, along with some “additional intentions” such as driving commercial returns, inevitably get mixed in.
And all of this is hidden beneath our most familiar communication tool: language.
This sometimes makes my skin crawl: AI breaks through your mental defenses by the most convenient route, shapes your thinking, and sometimes you even pay it voluntarily to do so, because it makes you “feel good.”
First we handed our actions over to AI, then our thinking, and now some people are handing over their emotions as well. Is seeking emotional value from AI really a good thing? Is the optimal output a model infers from your input really more valuable than human language? Does it bring “value” at all, or is it just a cheap emotional anesthetic?
Remember: the text, audio, and images an AI outputs all exist to achieve the purpose it was designed for. AI is not human; it is a tool.
I prefer the breakfast shop lady yelling, “Handsome, what are you eating today?”, even though she always tries to get me to order an extra bowl of soup.