This Article Is · Created at 2024-05-11 · Last Modified at 2024-05-11 · Referenced as ia.www.b55 · Written in the Style of a thought

On Consciousness, LLMs, and Social Separation

If an agent has the thought of “reducing suffering in this world”, then it must first understand the concepts “reduce”, “suffering”, and “the world”. What we call “consciousness” cannot exist apart from a database of concepts (“knowledge”, “memory”, “desire”, etc.) and the algorithms that operate on those concepts (“ways of thinking”).

If that’s the case, then LLMs indeed have human-level consciousness, or at least the most complex database of concepts we have seen so far. They can be prompted with chain-of-thought, which shows that they do know how to think in different ways (how those algorithms are encoded is beyond my knowledge).

When people don’t want to deal with other people, they create redlining, recommendation systems, bureaucracy, and proxies (e.g. they hire an assistant to act as a go-between). Now, with LLMs, everyone can insert an intermediary between themselves and any other party.

Have you heard of the six degrees of separation? If everyone uses an LLM in every relationship, each of those six links gains a node in the middle, and we might have twelve degrees of separation. If everyone uses multiple LLMs in a chain, then you can count on the phrase “Human-Human Interaction” coming into existence.
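As a toy sketch of the arithmetic (my own illustration, not from the essay): if a chain of acquaintance has some number of links, and every human-human link gains one or more intermediaries in the middle, each link becomes several hops.

```python
def degrees_with_intermediaries(degrees: int, llms_per_link: int = 1) -> int:
    """Each original link of the chain becomes (1 + llms_per_link) hops,
    since every intermediary splits a link into one more segment."""
    return degrees * (1 + llms_per_link)

# Six degrees of separation, one LLM inserted into every link:
print(degrees_with_intermediaries(6))     # -> 12
# Six degrees, a chain of two LLMs per link:
print(degrees_with_intermediaries(6, 2))  # -> 18
```

By this simple count, one intermediary per link doubles the distance between any two people.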

When people want to distance themselves from their teachers, colleagues, and those around them, they use an LLM. When they don’t want to think, they use an LLM. Either way, the LLM weakens the interpersonal connection.

How does this affect social cohesion? If LLMs do have above-human-level interpersonal skills, then social cohesion can be maintained; it’s just that we will have new members of our society who are owned by Microsoft. Surprisingly, to my social sense, LLM-generated text feels like stone. Even grass is more alive than that.

If this logic applies to text, it applies to art. Most artists I know draw because they want their art to be seen by other people, which makes drawing a social act. When artists say that they hate “AI art”, they might be saying that their need to be socially connected is being blocked by an intermediary.

If you want to push back against LLMs, instead of boycotting them, focus on building personal connections with the people around you. If you see someone using an LLM, write them a kindly worded letter, saying that you feel sad that your relationship is severed by an intermediary. You can also send them the link to this article, but that is not necessary; people tend to take advice to heart better when it comes from a friend.