The future of gaming? NVIDIA's new AI technologies are revolutionising the gaming world
At CES 2025, NVIDIA took the opportunity to emphasise the importance of artificial intelligence for games. More AI-supported interaction with NPCs could soon usher in drastic changes.
New AI technologies from NVIDIA promise to make the gaming experience even more immersive. The innovations below aim to make games more realistic, interactive and varied: in-game customisation controlled by voice commands, convincingly communicative NPCs, and automated lip-syncing. The tools presented could fundamentally change the way we immerse ourselves in game worlds. Three videos show what is already possible to a limited extent, and they offer a foretaste of what we can expect in the future.
In-game customisation and AI-controlled NPC communication
In the game ZooPunk, NVIDIA demonstrates how AI voice commands open up completely new ways of dealing with non-player characters, and how the personal customisation of objects and items in a game could work in the future. NPCs not only react to voice commands but also understand complex instructions and act accordingly. The result is a dynamic world that adapts to the player and evolves automatically. The system deepens immersion, gives players more control, and shows how gamers themselves could exert greater influence on a game world in the future.
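To make the idea concrete, here is a minimal sketch of such a pipeline. It is not NVIDIA's actual system: in a real setup the transcript would come from a speech-to-text model and the intent from a language model, while here a simple keyword matcher stands in, and the `Npc` class and its commands are invented for illustration.

```python
# Toy sketch: map a transcribed voice command to an NPC action.
# A real pipeline would replace handle_command's keyword rules
# with an LLM-based intent parser.

from dataclasses import dataclass, field

@dataclass
class Npc:
    name: str
    inventory: list = field(default_factory=list)

    def handle_command(self, transcript: str) -> str:
        """Turn a player's (already transcribed) voice command into an action."""
        text = transcript.lower()
        if "paint" in text:
            # e.g. "paint my bike red" -> treat the last word as the colour
            colour = text.split()[-1]
            return f"{self.name} repaints the item {colour}."
        if "give" in text or "fetch" in text:
            item = text.split()[-1]
            self.inventory.append(item)
            return f"{self.name} fetches a {item} for you."
        return f"{self.name} shrugs, not understanding."

npc = Npc("Mei")
print(npc.handle_command("Please paint my bike red"))   # Mei repaints the item red.
print(npc.handle_command("Fetch me a wrench"))          # Mei fetches a wrench for you.
```

The point of the sketch is the separation of concerns: speech recognition produces text, intent parsing produces an action, and the game world applies it, which is why each stage can be swapped for a more capable model independently.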
AI People: NPCs with a life of their own
AI People shows how players can create their own NPCs that behave largely freely and independently, based on AI models that run either locally or in the cloud. These characters evolve, make their own decisions and interact with other NPCs. Players can create characters that lead a real life of their own in the world rather than simply following predefined paths and repeating the same speech snippets. This creates a dynamic, living world that adapts to individual play styles and responds to player preferences. The technology not only deepens immersion but also allows for more interaction and flexibility in gameplay.
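A very rough way to picture an NPC "deciding for itself" is a needs-driven agent loop, a classic game-AI pattern rather than anything specific to AI People; the class, need names and utility rule below are all invented for illustration.

```python
# Toy needs-driven NPC: each tick, internal needs drift upward and the
# NPC acts on whichever need is currently most urgent, instead of
# replaying a fixed script. Real systems would use learned models.

import random

class AutonomousNpc:
    def __init__(self, name, seed=None):
        self.name = name
        self.needs = {"hunger": 0.2, "energy": 0.9, "social": 0.5}
        self.rng = random.Random(seed)  # seeded for reproducible behaviour
        self.log = []

    def tick(self):
        """One simulation step: needs drift, then the NPC satisfies the most urgent."""
        for k in self.needs:
            self.needs[k] = min(1.0, self.needs[k] + self.rng.uniform(0.0, 0.1))
        urgent = max(self.needs, key=self.needs.get)
        action = {"hunger": "eats", "energy": "rests", "social": "chats"}[urgent]
        self.needs[urgent] = 0.0  # acting satisfies that need
        self.log.append(f"{self.name} {action}")
        return self.log[-1]

npc = AutonomousNpc("Rook", seed=1)
for _ in range(3):
    print(npc.tick())
```

Because the action comes from the NPC's own drifting state rather than a scripted path, two playthroughs diverge as soon as the state does, which is the behaviour the article describes at a much higher level of sophistication.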
Realistic lip synchronisation with Audio2Face
Audio2Face technology makes the faces of NPCs even more realistic. Using the example of Alien: Rogue Incursion, NVIDIA shows how characters can match their lip movements to spoken text almost perfectly and automatically. This makes dialogue seem more believable and can draw players more emotionally into a story. The improved realism not only makes games more authentic, it also saves developers time. Audio2Face could make a big difference, especially in story-driven games. The video explains how it works.
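The basic idea of driving facial animation from audio can be illustrated with a crude amplitude-to-parameter mapping. Audio2Face itself uses a neural model that produces full facial animation; the sketch below only shows the simplest possible version of the audio-to-parameter step, and the function name and scaling factor are invented.

```python
# Crude audio-driven lip animation: compute a loudness (RMS) envelope
# per audio frame and map it to a "jaw open" blendshape weight in [0, 1].
# Louder frames -> mouth more open; silence -> mouth closed.

import math

def jaw_open_curve(samples, frame_size=160):
    """Return one jaw-open weight per frame of mono audio samples."""
    weights = []
    for start in range(0, len(samples), frame_size):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / len(frame))
        weights.append(min(1.0, rms * 2.0))  # scale RMS up, clamp to [0, 1]
    return weights

# Synthetic "speech": a loud 220 Hz vowel at 8 kHz, followed by silence
audio = [0.6 * math.sin(2 * math.pi * 220 * t / 8000) for t in range(800)]
audio += [0.0] * 800
curve = jaw_open_curve(audio)
print(curve[0] > 0.5, curve[-1] == 0.0)  # True True
```

A real system replaces the RMS rule with a model that also infers phoneme shape, not just loudness, which is what makes the lip movements in the video look accurate rather than merely synchronised.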
How do you see these technical developments? Can you imagine holding proper, longer conversations with NPCs via a microphone? For even more immersion in VR games, this is certainly a great thing. But do the technologies presented also appeal to you when it comes to conventional flat-screen games? Let us know in the comments!