Last week, Amazon announced it was integrating AI into a number of products that help users navigate the world—including smart glasses, smart home systems, and its voice assistant, Alexa. This week, Meta unveiled its latest AI and extended reality (XR) features, and next week Google will debut its next line of Pixel phones equipped with Google AI. If you thought AI was already “everywhere,” just wait until it’s part of the increasingly immersive, responsive, personal devices that power our lives.
AI is already hastening technology’s trend toward greater immersion, blurring the boundaries between the physical and digital worlds and allowing users to easily create their own immersive content. When combined with technologies like augmented or virtual reality, it will open up a world of creative possibilities, but it will also raise new issues related to privacy, security, and safety. In immersive spaces, our bodies often forget that the content we’re interacting with is virtual, not physical. This is great for entertainment and training simulations. However, it also means that VR harassment and assault can feel distressingly real, and that manipulation tactics are more effective.
Generative AI could worsen manipulation in immersive environments, creating endless streams of interactive media personalized to be as persuasive, or deceptive, as possible. To prevent this, regulators must avoid the mistakes of the past and act now to ensure that there are appropriate rules of the road for the technology’s development and use. Without adequate privacy protections, integrating AI into immersive environments could amplify the threats posed by these emerging technologies.
Take misinformation. With all the intimate data generated in immersive environments, actors motivated to manipulate people could hypercharge their use of AI to create tailored misinformation. One study by pioneering VR researcher Jeremy Bailenson shows that by subtly editing photos of political candidates’ faces to appear more like a given voter, it’s possible to make that person more likely to vote for the candidate. The threat of manipulation is exacerbated in immersive environments, which often collect body-based data such as eye movements and gestures. That information can reveal details like a user’s demographics, habits, and health, which can lead to detailed profiles of users’ interests, personality, and characteristics. Imagine a virtual salesperson in VR that analyzes data about your online habits and the content your eyes linger on to determine the most convincing way to sell you on a product, service, or idea.
AI-driven manipulation in immersive environments will empower nefarious actors to conduct influence campaigns at scale, personalized to each user. We’re already familiar with deepfakes that spread falsehoods and fuel disinformation, and with microtargeting that drives users toward ever more polarizing and extreme content. The additional element of immersion makes it even easier to manipulate people.
To mitigate the risks associated with AI in immersive technologies and provide individuals with a safe environment in which to adopt them, clear and meaningful privacy and ethical safeguards are necessary. Policymakers should pass strong privacy laws that safeguard users’ data, prevent unanticipated uses of this data, and give users more control over what is collected and why. In the meantime, with no comprehensive federal privacy law in place, regulatory agencies like the US Federal Trade Commission (FTC) should use their consumer protection authority to guide companies on what kinds of practices are “unfair or deceptive” in immersive spaces, particularly when AI is involved. Until more formal regulations are introduced, companies should collaborate with experts to develop best practices for handling user data, govern advertising on their platforms, and design AI-generated immersive experiences to minimize the threat of manipulation.
As we wait for policymakers to catch up, it is critical for people to become educated on how these technologies work, the data they collect, how that data is used, and what harms they may cause individuals and society. AI-enabled immersive technologies are increasingly becoming part of our everyday lives, and they are changing how we interact with others and the world around us. People need to be empowered to make these tools work best for them—and not the other way around.
WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions. Submit an op-ed at .