We love community
AI hyperpersonalization, the homogeneity of crowds, and the future of community.
I am back to writing.
Recently, I watched this interesting interview from the All-In Summit 2024, where Michael Ovitz talks about Hollywood and how he introduced technology to the film industry.1 It included a fascinating discussion about AI and how it will change the film industry.
The point about hyperpersonalized media struck me - how AI will be able to generate individually personalized media. Within that, it was so well articulated that watching Netflix is more a community experience than an individual one. A multitude of people watch the same thing (something probably hyped up by marketing) around the same time, and it becomes an asynchronous shared experience.
I think this is a great argument against AI hyperpersonalization. Are we really looking for movies that are personalized to our tastes? Or are we looking for movies that are representative of our tastes?
AI today is a top-tier generalist. It creates an averaged understanding of a task from the data it is exposed to. If we are to believe Sam Altman, AGI is going to be a median human in any field - be it medicine, software engineering, or anything else.2
When the Besties interviewed Peter Thiel at the All-In Summit, he talked about how AI is great at the “woke stuff”, which is the current consensus on the socially acceptable way of thinking.3 An unconventional bit of wisdom he dropped in the interview was that if you’re looking to be an actor, you should probably target being slightly racist or sexist, because that’s what will make you stand out. Hilarious, but also a perfectly viable strategy. One could argue that the comedy group AIB’s success was built on this strategy; involving expletives in their act made them stand out.4
We just love our own personal choices in music and movies. We want to be unique, we want to be different. And yet, we crave community like nothing else. A subtle thing I personally noticed: after loving a particular movie or TV show, I would always seek out others who liked it - redditors, TV show forums, etc. The discussions that followed were so much more fun than the content itself. And invariably, the dopamine hit of being in a club of like-minded people who love the same things I do is comparable to, if not greater than, the dopamine hit I get from the content itself. And don’t we all push our friends to watch the same things we do? I think that’s a testament to how deeply entrenched our need for community is.
The dopamine effects of groups are as old as time. Massively multiplayer online games are built on this premise, and the most successful games now are all massively multiplayer. Their popularity is often attributed to the social aspect of the game, which acts as an amplifier to the dopamine hit we get from the game itself.
One could invert the question - are there scenarios where community becomes irrelevant (or partially so)? I think there are. When we are exposed to a hyperpersonalized world, where everything is tailored to our tastes, we might not need community as much. The dopamine hit we derive from the nuanced differences in the content might be more pronounced, and the social aspect of the content might not be as important.
Or maybe the character of community itself evolves. We’ve seen this happen with mobile phones. We are all so excessively connected that the meaning of that connection is diluted - however, the checkmark of being connected remains ticked. AI hyperpersonalization might follow a similar path. In the future, we might all watch Batman XXXII, but it would be a slightly different version depending on who’s watching it. And we’d be enjoying the nuances of the difference, while still having the checkmark of being in the club of Batman fans. The community aspect would still exist, but its meaning would be diluted by the hyper-division of tastes, reduced to our automatic addition to “DC Die Hards Ultra” on “FriendNet”.
I think the motivation for everything we do is derived from community. The people we don’t want to disappoint, the people we want to impress, the people we want to be like. I think at some primal level, we are hardwired to be permanently attuned to the love and hate of the people who surround us and matter to us. I’ve been interested in how to engineer this into a force for personal gain. Somehow, I think the hate part is much tougher to engineer - simply because the degree of hate that has to be felt before it changes anything is so much higher.
If you’ve read this far, you’re probably part of our little community ;)