Amazon Sumerian - now available

"Amazon Sumerian is a set of tools for creating high-quality virtual reality (VR) experiences on the web. With Sumerian, you can construct an interactive 3D scene without any programming experience, test it in the browser, and publish it as a website that is immediately available to users."

- Amazon Sumerian User Guide

Amazon Sumerian tutorials

Amazon Sumerian onboarding & interface

A description of the entity-component system, with behaviors and speech, shows how AR & VR fit into Amazon's larger ecosystem. Speech can be powered by Alexa's intelligence and delivered through avatars as an interface.

Asset > Add New Pack > Entity 

[Image: Screen Shot 2018-04-19 at 7.11.58 PM.png]

Entity > Add New Component
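The entity-component pattern above is a common game-engine architecture: an entity is little more than a named container that components attach to, and behaviors (systems) act on whichever entities carry the right components. A minimal sketch of the idea — the names below are my own invention, not Sumerian's actual API:

```typescript
// Minimal entity-component sketch (invented names, not the Sumerian API).
type Component = { kind: string };

class Entity {
  private components = new Map<string, Component>();
  constructor(public readonly name: string) {}

  addComponent(c: Component): void {
    this.components.set(c.kind, c);
  }
  getComponent(kind: string): Component | undefined {
    return this.components.get(kind);
  }
  hasComponent(kind: string): boolean {
    return this.components.has(kind);
  }
}

// A "speech" component, plus a behavior that runs over entities that have it.
interface Speech extends Component { kind: "speech"; text: string }

function speechSystem(entities: Entity[]): string[] {
  const spoken: string[] = [];
  for (const e of entities) {
    const s = e.getComponent("speech") as Speech | undefined;
    if (s) spoken.push(`${e.name} says: ${s.text}`);
  }
  return spoken;
}

const host = new Entity("Host");
host.addComponent({ kind: "speech", text: "Hello" } as Speech);
const prop = new Entity("Prop");
// Only entities carrying a speech component produce output.
const lines = speechSystem([host, prop]);
```

The point is the decoupling: the speech behavior knows nothing about the entities themselves, only about the component it consumes — which is what makes menus like "Add New Component" composable.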

[Image: Screen Shot 2018-04-19 at 7.12.12 PM.png]

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

4D toys

We often represent three dimensions by rendering images on a 2D screen.
This application represents four physical dimensions on a 2D screen, or in real 3D space using VR.

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

Leap Motion's North Star: HMDs are the new black

No one cares about input alone.

People care about output. They care about what they can see.

Keiichi Matsuda (creator of Hyper-Reality) showed these controls by "Introducing Virtual Wearables*" on his Twitter account @keiichiban last month. They looked cool. Beautiful, even: both visually beautiful and functionally beautiful. Intuitive and natural.
But we know this is Leap Motion's strength.

You can see the hand, without the AR overlay, in the gap between the nose pads of the glasses.

[Image: Screen Shot 2018-04-09 at 10.14.54 AM.png]
[Image: 1_north-star-rotation.gif]

The question was: what HMD is it running on? What digital eyewear were they using? What was the output running on?

Apparently it is Leap Motion's own head mounted display, the North Star.

 


image by Leap Motion. Source: RoadtoVR

Now people will care about Leap Motion and what gesture recognition means as the future of input and HCI.
No one cared about gesture recognition before FPV AR output. No one really noticed the value of the mouse until it was attached to a beautiful GUI. At Xerox PARC, they might as well have thrown a rat on the table as described in Walter Isaacson's Steve Jobs.

 

The past decade of computing has largely been defined by mobile phones. New capabilities emerged when users could take a connected device with them, and the information on a mobile site or mobile app became contextually relevant. But the input pattern was multi-touch, and multi-touch defined the interaction design of the era.
There were some fun experiences that required users to shake the phone as a specific, and often secondary, method of input. But we were largely designing for an interface that simulated only two scenarios from real life: 1. Drawing a note on a dusty car window. 2. Finger painting.

These experiences all lacked fine-grained input beyond pinch-to-zoom. The hand movements, grips, and other finger articulations we use every day were not applicable. With augmented reality offering an interface that is contextually relevant, the input methods go beyond multi-touch and offer the chance for a much more natural, or even supernatural, user interface.

Read more: 
http://blog.leapmotion.com/northstar/
https://www.roadtovr.com/leap-motion-reveals-project-north-star-an-open-source-wide-fov-ar-headset-dev-kit/?platform=hootsuite

Digital eyewear is the new black.
Again. And this time it will begin to stick.

 

* Magic Leap patent documents display virtual wearables as "Charms."

 

[ But now, with two HMD/digital eyewear companies, people will keep confusing Leap Motion with Magic Leap :P ]

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

Google Daydream VR Cookbook

This is what has been cooking since the holidays.
I did technical editing for my friend and former colleague Sam Keene. His new book, Google Daydream VR Cookbook: Building Games and Apps with Google Daydream and Unity, launches this summer. Sam is an engineer at Google and previously worked at R/GA.

It's been a privilege to review this content early. It is the most comprehensive source I have read, and the exercises are designed as simple recipes.

There is a lot of chatter about VR and AR. This book is a great technical source, whether you are a developer, a designer, or any creative who wants to understand how this new space works.

Pre-order the book on Amazon here: https://www.amazon.com/Google-Daydream-VR-Cookbook-Building/dp/013484551X.

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

RoboRaid

A basic experience where holograms or AR entities appear to break through the physical walls.
This simple experience only has one method of gesture input - the single Air Tap.
 

When wearing the HMD, the hands occlude the visualizations, so your hands appear on top of the robot holograms.

 

The second method of input is voice: saying "X-ray" shows you where the opponents are "inside the wall."


See more about RoboRaid and some of Microsoft's concept work here: https://www.microsoft.com/en-us/hololens/apps/roboraid

Outside of this simple application, you can give voice commands to Cortana, the HoloLens system AI agent (like Siri, but smarter). And you can use the only other hand gesture HoloLens recognizes - Bloom - to call up the main menu.
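A deliberately tiny input vocabulary like this - two gestures plus a few voice keywords - can be modeled as a simple dispatcher that maps recognized inputs to app actions. This is an illustrative sketch of the pattern, not the actual HoloLens API:

```typescript
// Map a tiny gesture/voice vocabulary to app actions (illustrative only;
// the action names are invented, not HoloLens APIs).
type InputEvent =
  | { type: "gesture"; name: "airTap" | "bloom" }
  | { type: "voice"; phrase: string };

function dispatch(event: InputEvent): string {
  if (event.type === "gesture") {
    // Air Tap is the in-app "select/fire" action; Bloom is system-reserved.
    return event.name === "airTap" ? "fire" : "openMainMenu";
  }
  // Voice commands are matched case-insensitively.
  if (event.phrase.toLowerCase() === "x-ray") return "revealEnemiesInWall";
  return "unrecognized";
}

const a = dispatch({ type: "gesture", name: "airTap" }); // "fire"
const b = dispatch({ type: "voice", phrase: "X-ray" });  // "revealEnemiesInWall"
```

The small vocabulary is the design point: with only two gestures and a handful of phrases, users can learn the entire input space in minutes.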

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

Dematerialization to create applications for 3D experiences

Notes from Jody Medich's "Real-world VR applications beyond entertainment - a look at our rapidly dematerializing world."

Visual Cortex
"Visual sense is 70% of your sensory intake."
The visual cortex is even used by blind people.
Everything from spatial memory to spatial location happens in your visual cortex.

Your visual cortex is about the size of your hand. It serves both hemispheres of your brain and is integral to both sides.

A synthetic visual cortex.

More on dematerialization and its application to kinesthetic learning, for creating muscle memory and new neural pathways.

 

Follow Jody Medich @nothelga

View the original source https://www.oreilly.com/ideas/real-world-vr-applications-beyond-entertainment

 

Full video on Safari Books https://www.safaribooksonline.com/library/view/oreilly-design-conference/9781491976180/video302847.html

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

IA for AR

Rony's example

Three different levels of zoom of volumetric content you can walk around and view from any angle.

LOF
Consists of proximity, angle, and contextual information.
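The idea of proximity- and angle-driven levels of detail can be sketched as a function of how far away the viewer is and how directly they face the content. This is a hypothetical illustration of the pattern, not Rony's implementation, and the thresholds are invented:

```typescript
// Choose how much information to show for a piece of volumetric content,
// based on viewer distance (meters) and viewing angle (degrees off-axis).
// Thresholds are invented for illustration.
type DetailLevel = "glanceable" | "summary" | "full";

function detailLevel(distanceM: number, angleOffAxisDeg: number): DetailLevel {
  if (distanceM > 5 || angleOffAxisDeg > 60) return "glanceable"; // far away, or barely in view
  if (distanceM > 1.5) return "summary";                          // walking distance
  return "full";                                                  // close inspection
}

const far = detailLevel(10, 0);  // "glanceable"
const mid = detailLevel(3, 20);  // "summary"
const close = detailLevel(1, 10); // "full"
```

Because the content is volumetric and walkable, this function would be re-evaluated every frame as the viewer moves, so the information architecture responds to position rather than to taps.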

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

AR Prototyping Tools

The main prototyping tools I have used:

  • Torch 3D

  • Ottifox

  • Halo Labs

There are not many prototyping tools for Augmented Reality - very few, in fact.


These are select tools that I have discovered and vetted in my own time.
They are an advancement on WebVR that is converging with the adoption of mobile AR.

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

Designing and Prototyping for AR


How do we learn to design for AR? By understanding, designing, and prototyping multi-scene augmented reality.

There was the age of personal computing; the dot-com era of the internet, when we realized we could connect people to each other; and more recently interpersonal computing, which defined more than a decade of digital experience with the notion of "mobile first," VC firms that only invested in mobile, and designers who were "mobile designers." We are now entering the age of mixed reality, or spatial computing.

With innovations like autonomous vehicles that you do not need to drive (or that can drive you), blockchain, and cryptocurrency, I believe augmented reality will be the interface to all of this.

AR will be the interface to everything that you cannot see: anything that lacks a dedicated physical interface, such as autonomous vehicles, deliveries, service processes, blockchain, and cryptocurrency. AR will also replace many digital interfaces.

As we enter an age of spatial computing - of designing across space - agencies, brands, and designers are asking, "How do we design for virtual and augmented reality?" When informed that AR experiences are made in game development engines, this often-misunderstood term confuses people, leading them to believe that gaming is the only application, or that the end goal is entertainment alone and that augmented reality cannot provide real business value. These game development engines, with their heavy interfaces, can intimidate many designers, and the mere notion of an integrated development environment where you design and code scares away designers and new creative adopters.

A few select companies are bridging the skill, knowledge and tool gap that currently separates designing and developing 3D experiences. By creating prototyping tools for VR & AR, these companies are bringing these frontier technologies closer to fruition through their creator tools. Currently there is a battle between 2D design and prototyping tools as the phases between design and prototyping merge and the iterations become quicker for more agile processes.  If you are a digital product designer, you are well aware of this battle and keeping track of which tools are better bets to master for your process, teams, and assets. If you are a Venture Capitalist with bets on frontier technologies, you have looked at the tool layer of augmented and virtual realities, because you understand it is the basis of creating experiences and content for reality. And you also understand that this frontier is rapidly approaching. These AR prototyping tools not only bring this frontier closer, they are empowering designers to make things beyond the screen.

What are these companies? Where were they founded, and how did they start? How do they bring us, as creators, closer to this future where frontier technologies are no longer experimental but instrumental in solving problems and visualizing what is currently invisible? I will cover that in a series of posts on the main prototyping tools, a process of design for development, and information architecture for virtual and augmented experiences.

Stay tuned.

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.

Face Blur. Computer Vision

While blurring something out in After Effects recently, I realized the relationship between blur, tracking (including Z-axis tracking), and computer vision.
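That connection - a tracked region driving a per-frame effect - can be sketched as a blur applied only inside a tracked bounding box. A toy illustration on a grayscale pixel grid (my own sketch, not how After Effects works internally):

```typescript
// Blur only the pixels inside a tracked bounding box (toy grayscale image,
// values 0-255; in a real pipeline the box would come from a tracker each frame).
type Box = { x: number; y: number; w: number; h: number };

function blurRegion(img: number[][], box: Box): number[][] {
  const out = img.map(row => row.slice()); // leave pixels outside the box untouched
  for (let y = box.y; y < box.y + box.h; y++) {
    for (let x = box.x; x < box.x + box.w; x++) {
      // 3x3 box blur, averaging whichever neighbors exist in-bounds.
      let sum = 0;
      let n = 0;
      for (let dy = -1; dy <= 1; dy++) {
        for (let dx = -1; dx <= 1; dx++) {
          const row = img[y + dy];
          if (row !== undefined && row[x + dx] !== undefined) {
            sum += row[x + dx];
            n++;
          }
        }
      }
      out[y][x] = sum / n;
    }
  }
  return out;
}

// A single bright pixel inside the tracked box gets averaged with its neighbors.
const img = [
  [0, 0, 0],
  [0, 9, 0],
  [0, 0, 0],
];
const blurred = blurRegion(img, { x: 1, y: 1, w: 1, h: 1 });
```

Computer vision enters when the box is produced automatically: a face detector or point tracker supplies the region per frame (with Z-axis tracking scaling the box), and the effect simply follows.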

 

“Hold Infinity in the palm of your hand...A fibre from the Brain does tear...The Game Cock clipd & armd for fight...Has left the Brain that wont Believe”
— William Blake - Auguries of Innocence

Notes taken on a mobile device. Pardon any auto-corrections or incorrections.