ChatGPT Has Brought Us Closer To Tech That Allows You To Talk To Dead Loved Ones In The Metaverse
Artur Sychov, the founder of metaverse company Somnium Space, recently told Vice that advancements in AI, specifically OpenAI's ChatGPT, have shortened the development timeline for the metaverse's "Live Forever" mode by several years.
Live Forever mode lets people record the way they talk, move, and sound in Somnium Space and preserve that data even after they die, so they can "return" as an online avatar to speak and interact with their loved ones. (Remember Black Mirror's San Junipero episode?)
The big push came from Artific, one of Somnium Space's users, who began integrating ChatGPT into the platform by building a virtual assistant — essentially an AI-powered bot that could, in the near future, store a person's physical characteristics, manner of speaking and moving, and the sound of their voice.
Artific, who was initially skeptical of artificial intelligence, reportedly asked ChatGPT how it could be integrated into Somnium Space and, to his astonishment, the chatbot responded with a rough plan. He then refined and fine-tuned that plan and began working it into the metaverse platform.
The company's next challenge is to store the recordings that will power the Live Forever avatars in a way that lets an avatar "speak" naturally while continuously drawing on that stored data.
"It's a perfect condition for AI because it can learn from every digital object instantly," said Sychov, noting that it is harder for AI to interpret objects and experiences in the real world than to learn from data sets already stored in the metaverse.
The potential integration of AI like ChatGPT and OpenAI’s visual generator DALL-E into the metaverse extends beyond talking to dead loved ones. It can also change how avatars and entire worlds are built in virtual reality.
Sychov believes the technology will let people simply ask the AI, verbally, to change an environment or add a personality trait — no coding required. "This is not decades away," he said. Maybe in the next two years, or "maybe it's even faster."
Information for this story was found via Vice Motherboard, Time Magazine, and the sources and companies mentioned. The author has no securities or affiliations related to this organization. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.