One thing I have noticed is that a lot of neuro-atypical people share lessons they’ve learned about how to relate to other people. For example, an autistic person might share, via Twitter, lessons they’ve learned about how to understand others’ interests. To the neurotypical, this kind of thing seems forced, if not outright absurd: understanding another person’s interests is as easy as speaking to the person and listening to what she has to say.
And yet. There are a lot of people, autistic people among them, who frequently say that they have trouble understanding what other people are interested in. And this tends to harm their relationships with others; it makes them seem standoffish. Hence the tweets and other social media posts from autistic people, which essentially say, “Hey autistic people! Here’s something I learned that will probably make your life easier.”
Enter ChatGPT
I was playing around with ChatGPT earlier today. There is a whole genre of ChatGPT prompts in which the interlocutor tells it to imagine itself as a participant in a situation involving other people. Here’s one example. The author instructs ChatGPT to simulate a game of Tyler Cowen’s Overrated & Underrated. For the uninitiated, Overrated & Underrated refers to economist Tyler Cowen’s practice of asking his interview subjects to rate a variety of things as Overrated (excessively praised by society) or Underrated (insufficiently praised by society). This is the type of thing that is easy enough for an opinionated person to do, but hard, at least until the advent of ChatGPT, for an AI to do.
So. What does all of this have to do with the subject of this post, which is empathy? Check out my prompt, and ChatGPT’s response below.
Read the last paragraph of ChatGPT’s output carefully. It’s essentially explaining how to empathize with another person’s interests. It is making concrete what comes naturally to many people, though not to the neuro-atypical, as discussed above. In some very real sense, ChatGPT is trying to teach its interlocutor how to empathize with other people.
Now, it would be too big a stretch to ascribe to ChatGPT a conscious desire to impart information about how to be empathetic. But you can certainly see the similarity between “It might be helpful to ask them directly what they would like, or to think about what they have mentioned in the past as something they would like to have or do” and autistic people’s tweets about how to empathize with other people.
Of course, ChatGPT “learns” by inhaling vast amounts of data and performing a variety of statistical machinations (yes, I’m being very hand-wavy), and this in no way resembles how people, autistic or not, learn to empathize with others. But. It seems that one use for ChatGPT is to help people relate to others more effectively.
Here’s another example, in a much different context:
I previously wrote that the best way I have found to think of ChatGPT is as an adjunct for cognition. The context in which I wrote that was my experience in using ChatGPT to remember how to build a sliding fee schedule in Excel.
But it appears that ChatGPT can be used for less cerebral things than building sliding fee schedules in Excel. It can provide some pretty cogent and useful advice for relating to other people.
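For readers unfamiliar with the term: a sliding fee schedule just maps a client’s income bracket to a fraction of the full fee. Here is a minimal sketch in Python rather than Excel, with entirely hypothetical brackets and rates (the originals aren’t in this post); in Excel this would typically be an approximate-match VLOOKUP against the same table:

```python
def sliding_fee(income: float, full_fee: float) -> float:
    """Return the fee owed for a given income level.

    Mimics an Excel-style lookup table: the first bracket whose
    income ceiling exceeds the client's income determines what
    fraction of the full fee is charged.
    """
    # (income ceiling, fraction of full fee) -- hypothetical values
    brackets = [
        (20_000, 0.25),
        (40_000, 0.50),
        (60_000, 0.75),
        (float("inf"), 1.00),
    ]
    for ceiling, fraction in brackets:
        if income < ceiling:
            return full_fee * fraction
    return full_fee


# Example: a $200 full fee at a $30,000 income falls in the
# second bracket, so the client pays half.
print(sliding_fee(30_000, 200))  # 100.0
```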
Here’s another example I just came up with:
This is all solid advice! It may be self-evident or obvious to you, but it isn’t necessarily self-evident or obvious to someone prompted to, well, prompt ChatGPT for some advice.
If you search for “ChatGPT therapy” one of the first links is this Redditor’s experience using it as a kind of therapeutic muse.
To the extent that ChatGPT proves useful for people looking for this kind of therapeutic advice, it is scalable in a way that conventional therapy could never be.