Dear folks at Magic Leap…
… thanks for stopping by! We’d like to give you a visual overview of the game we would love to release on ML1.
The game at a glance
Our game is called Pets Are People Too! - and it’s a virtual pet game unlike any you’ve seen before. It allows you to adopt a little animal, take care of it, play with it, and become its best friend. We’ve already created an adorable baby bird, and we’re currently adding cats. The relationship you have with your little creature depends entirely on your actions - just like with a real pet. The game uses our proprietary AB engine (AB = artificial behavior), which allows us to create highly interactive and believable NPCs. Here are some GIFs from our vertical slice, which currently runs on Gear VR and Daydream.
A virtual pet with feelings and attitude
We’ve developed a unique system for procedural behavior synthesis that allows our pets to display believable emotions and attitudes in real time. If something unexpected happens in the world - and it always does - the pet will react. This sort of responsiveness is crucial for AR because a dynamic world (see below) tends to be full of surprises!
Here you can see three situations where the pet initially has the same attitude, and then adapts rapidly and smoothly to an unexpected event. The scene is unscripted - in fact, all three cases use the same high-level code.
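To give a feel for the idea behind this (a toy sketch only - the names and numbers here are illustrative, not our actual AB engine code): the pet’s attitude can be thought of as a small state vector that is pulled smoothly toward a target set by events. An unexpected event simply retargets the vector; the per-frame update then blends toward it, so reactions look rapid but never discontinuous.

```csharp
using System;

// Illustrative sketch: an "attitude" as a pair of scalars that chase
// targets set by world events. Exponential smoothing makes the
// reaction fast at first, then settling - no scripted transitions.
class Attitude
{
    public double Curiosity, Fear;
    double targetCuriosity, targetFear;

    // An unexpected event (e.g. a loud noise) instantly retargets the state.
    public void OnEvent(double curiosity, double fear)
    {
        targetCuriosity = curiosity;
        targetFear = fear;
    }

    // Called every frame: blend the current state toward the target.
    public void Update(double dt, double rate = 5.0)
    {
        double k = 1.0 - Math.Exp(-rate * dt);
        Curiosity += (targetCuriosity - Curiosity) * k;
        Fear += (targetFear - Fear) * k;
    }
}
```

Because the high-level code only ever sets targets, the same code path covers all three situations in the GIF - only the incoming events differ.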
Why “Pets Are People Too!” is a great fit for the ML1
XR and especially AR are all about interacting with a changing spatial context. The tech behind our game fully supports dynamic spatial interactions. We believe that this feature isn’t optional for virtual pet games in XR - it’s necessary. Most real pets are highly spatial creatures. They will explore their environment, have favorite spots, hide behind furniture, and play with their owner in a spatial manner.
Our technology includes a custom navigation, pathfinding and spatial reasoning system that is completely dynamic yet numerically robust, allowing more than 5000 changes per second on a single core of a standard laptop. It outperforms the navigation solution inside Unity or Unreal by more than 2 orders of magnitude. It’s written in plain-vanilla C# and should thus run on ML1 without problems. Think we’re exaggerating? Then we’ll be happy to send you an interactive demo (apk). 8-)
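A toy sketch of why a dynamic world model keeps environment changes cheap (this is not our engine code - just a self-contained grid/BFS stand-in): an obstacle update is an O(1) flag flip, so thousands of changes per second cost almost nothing, and only replanning pays for search.

```csharp
using System;
using System.Collections.Generic;

// Illustrative only: a grid world where obstacle changes are O(1)
// and pathfinding is a plain breadth-first search. Our actual system
// uses a different representation, but the separation of cheap
// world updates from replanning is the same idea.
class DynamicGrid
{
    readonly bool[,] blocked;
    public readonly int Width, Height;

    public DynamicGrid(int width, int height)
    {
        Width = width; Height = height;
        blocked = new bool[width, height];
    }

    // An environment change (e.g. a collider appearing) is a cheap flag flip.
    public void SetBlocked(int x, int y, bool value) => blocked[x, y] = value;

    // Shortest path length in steps between two cells, or -1 if unreachable.
    public int PathLength((int x, int y) start, (int x, int y) goal)
    {
        var dist = new Dictionary<(int, int), int> { [start] = 0 };
        var queue = new Queue<(int x, int y)>();
        queue.Enqueue(start);
        int[] dx = { 1, -1, 0, 0 }, dy = { 0, 0, 1, -1 };
        while (queue.Count > 0)
        {
            var (x, y) = queue.Dequeue();
            if ((x, y) == goal) return dist[(x, y)];
            for (int i = 0; i < 4; i++)
            {
                int nx = x + dx[i], ny = y + dy[i];
                if (nx < 0 || ny < 0 || nx >= Width || ny >= Height) continue;
                if (blocked[nx, ny] || dist.ContainsKey((nx, ny))) continue;
                dist[(nx, ny)] = dist[(x, y)] + 1;
                queue.Enqueue((nx, ny));
            }
        }
        return -1;
    }
}
```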
We are aware that ML1 does continuous environment monitoring and obstacle detection. Our system can chew up anything thrown at it in the form of a collider. It processes box, sphere, capsule and mesh colliders. The following GIFs illustrate some of its abilities, and how they allow the virtual pet to rapidly adapt to environmental changes.
In the above GIF, note how our real-time pathfinding factors in the width of the path (it corresponds to the diameter of the respective ‘blob’). The scene (minus shadows) runs at 60 FPS on a Google Pixel 1 without overheating.
The above GIF shows an extreme (and contrived) scenario where an NPC traverses a highly dynamic world. The top and bottom parts show the same scene from different perspectives.
* * *
Thanks for checking out our pitch! Please ping us if you want to know more about scope, progress, technology or anything else.