I had the idea the other week to make my own little game - a murder mystery where you interrogate a suspect powered by GPT. My boyfriend put the code together, and I just need to finalise the prompt for the AI. But, it seems like there are lot of people out there pushing a lot further - this video shows a future for AI NPCs.
This makes me think about Facebook’s Horizon Worlds, the Zuck’s moribund virtual space populated by toddlers who just got an Oculus Quest for Christmas. Miserable, miserable experiences. These virtual spaces live and die on their communities, which is why a project that started smaller like VRChat has had longer legs.
But here, I’d be surprised if Facebook isn’t looking to pull out one of the oldest solutions in the book: remember Agar.io? Or maybe Slither.io? There was a wave of these sort of online 2D games in the mid-2010s, which all had players face off against each other. Supposedly. But what some of the less scrupulous of these operators realised is that you could replace player characters with bots, and people would often not notice! Certainly makes the netcode much easier.
It doesn’t take a rocket scientist to see where I’m going: virtual communities require populations to get off the ground and be sustained (no one stays in an empty chatroom) —> AI will be able to serve as fake people soon —> Facebook can just slap a bunch of fake AI people into Horizon Worlds to give the semblance of life.
This obviously has a lot of interesting implications: could malevolent actors deploy scammers into these spaces? Clone your voice and try to rip off your real friends? There are a lot of ways this could go bad quick if AI proliferates, and ultimately Meta may be trading one death spiral for another.