TECHCRUNCH

It seems like a simple concept: when rendering something in augmented reality (like, say, a Pikachu in Pokémon Go) and something in actual reality (like, say, a human, or a car, or a planter) passes in front of it, make the rendered object appear to be “behind” the real one.

In practice, it’s pretty damn hard. A device would need the ability to tell which pixels are close, or far, or somewhere in between. While adding more cameras (or lasers!) to the mix can help, achieving that sense of depth from the single rear RGB camera found on most smartphones — and doing it fast — is a helluva task. That’s why objects in most augmented reality apps tend to just sort of float in front of anything that gets too close.

Niantic (the company behind the aforementioned Pokémon Go) has just acquired Matrix Mill, a startup that has been working to tackle that challenge.

Founded in 2017 and spun out of a lab at University College London, Matrix Mill has been working on a product it calls “Monodepth” — a tool that takes data from a single RGB camera, passes it through a neural network, and spits out depth data fast enough to be integrated into a real-time game.
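The core idea, stripped to its essentials, is that once you have a per-pixel depth estimate for the camera frame, occlusion becomes a per-pixel comparison: hide any rendered pixel where the real world is closer than the virtual object. Here's a minimal sketch of that compositing step, assuming a depth map has already been estimated from the RGB frame (as Monodepth produces); all names and array layouts here are illustrative, not Niantic's actual API:

```python
import numpy as np

def composite_with_occlusion(camera_rgb, rendered_rgba, rendered_depth, estimated_depth):
    """Overlay a rendered AR object onto a camera frame, hiding every
    pixel where the estimated real-world surface sits in front of the
    virtual one.

    camera_rgb      -- (H, W, 3) uint8 camera frame
    rendered_rgba   -- (H, W, 4) uint8 rendered object (alpha = coverage)
    rendered_depth  -- (H, W) float, virtual object's depth per pixel
    estimated_depth -- (H, W) float, depth estimated from the RGB frame
    """
    # The object is visible only where it was drawn (alpha > 0) AND the
    # real scene is farther away than the virtual surface.
    alpha = rendered_rgba[..., 3:4] / 255.0
    visible = (estimated_depth > rendered_depth)[..., None]
    mask = alpha * visible
    return (camera_rgb * (1 - mask) + rendered_rgba[..., :3] * mask).astype(np.uint8)
```

The hard part, of course, isn't this blend; it's producing `estimated_depth` from a single camera fast enough for a real-time game, which is precisely what the neural network is for.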

The company demonstrated its tech to a handful of press yesterday, showing an example of a Pikachu running around in Niantic’s existing AR engine vs. one integrating Matrix Mill’s tool:

Note how, in the second version (starting at 0:33), Pikachu is occluded behind those passing by, or how it can seemingly run behind a planter rather than through it. While the visuals aren’t perfect (there’s definitely a bit of stray clipping), the integrations are early — and even at this stage, it’s a pretty big step forward in realism. Alas, no word yet on when this work might actually make its way into Pokémon Go.

Terms of the deal weren’t disclosed.

Niantic also demonstrated some of the tech it’s been working on behind the scenes but has yet to integrate into its games – specifically, low-latency, shared AR experiences.

“Codename: Neon”, for example, is an experimental game that has players run around a big open space, picking up orb “ammo” and firing it at those around them. Check it out:

Note how the game recognizes different players, labeling them on screen with a marker that follows them around. Projectiles track around the room in real time, with all players (and any observers) seeing the rockets whiz around them. By using multiple devices to build a shared map of the real world room, the game is able to understand where players are in relation to each other. It’s easy to imagine this tech being adapted for wizard duels for Niantic/Portkey’s upcoming Harry Potter game.
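What makes the shared-map approach work is that every device tracks its own pose relative to the same real-world anchor, so any player's position can be re-expressed in any other player's frame. A minimal sketch of that coordinate math, under the assumption that each device reports a standard 4x4 pose matrix in the shared anchor's frame (this is generic AR geometry, not Niantic's implementation):

```python
import numpy as np

def pose_in_other_frame(pose_a_in_world, pose_b_in_world):
    """Given two 4x4 device poses expressed in the shared anchor's
    ("world") frame, return device B's pose as seen from device A.

    Inverting A's pose maps world coordinates into A's local frame;
    composing with B's pose then places B inside that frame.
    """
    return np.linalg.inv(pose_a_in_world) @ pose_b_in_world
```

With a relative pose like this in hand, a game can aim a projectile or pin a floating label at another player, which is essentially what the on-screen markers in the Neon demo are doing.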

The same shared AR concept can be adapted for other genres. Here’s “Tonehenge”, a proof-of-concept co-op puzzler Niantic built in a few days:

Anyone else getting Wizard’s Chess vibes?

Finally, Niantic also disclosed early details on its “Real World Platform” – a set of cross-platform (iOS and Android) tools and APIs it’s been working on for third-party developers to build on top of. Want to build your own game in the same vein as GO? You bring the concept and the mechanics; they’ll bring the maps, the anti-spoofing tools, their massive database of real-world points of interest, etc.

Niantic says it’ll open the platform up to a “select handful” of third parties later this year.