Gaming Giant Unity Wants to Digitally Clone the World

The company is leveraging its technology to help clients make “digital twins”—virtual copies of real-life objects, environments, and even people.

In video games, non-playable characters can be somewhat clueless. An NPC might wander across a city block and face-plant into a streetlamp, and then maybe vanish the next block over. NPCs leap into player-characters’ punches or commit to kicking a wall 400 times, never learning that the wall won’t kick back.

Unity Technologies is in the business of NPCs. Founded in 2004, Unity makes an eponymous game engine that provides the architecture for hundreds of video games using its real-time 3D computer graphics technology. Unity also provides countless tools integrated with that game engine, including AI tools. In the Unity game engine, developers design their 3D city blocks and streetlamps; model their NPCs; animate their punches; and maybe—through Unity’s AI technology—teach them when to stop kicking.

Five years ago, Unity’s executives had a realization: In the real world, there are a lot of situations that would enormously benefit from NPCs. Think about designing a roller coaster. Engineers can’t ask humans to stand up on a roller coaster ahead of a hairpin turn to test whether they’d fly off. And they definitely can’t ask them to do it 100 or 1,000 times, just to make sure. But if an NPC had all the pertinent qualities of a human being—weight, movement, even a bit of impulsiveness—the engineer could whip them around that bend 100,000 times, like a crazed kid playing RollerCoaster Tycoon, to discern under which circumstances they’d be ejected. The roller coaster, of course, would be digital too, with its metal bending over time and the speed of its cars sinking and rising depending on the number of passengers.

Unity spun that idea into an arm of its business and is now leveraging its game engine technology to help clients make “digital twins” of real-life objects, environments, and, recently, people.

Read More at Wired