Magic Leap – Spatial Computing and the Future of Enterprise | SIGGRAPH 2019 | Unreal Engine

>>Tricia: Alright! Hello, everybody.
My name is Tricia Katz. I lead our developer evangelism
team at Magic Leap, so I work with creators
across the globe introducing spatial computing
concepts and development on a variety of platforms,
including Unreal Engine. If you have any questions
after my presentation today on development,
on enterprise, or just general spatial
computing or Magic Leap, feel free to reach out to me. You can email me
at [email protected]. You can also find me
on Twitter at @triciaakatz. So, I want to start by just
introducing who Magic Leap is. Some of you may be familiar
with Magic Leap, others may not. Magic Leap, we are building
the future of computing. And so, what that looks like is a personal spatial
computing platform that utilizes our digital light
field technology to seamlessly blend the digital and the physical worlds
together. So, we believe in moving
beyond the screens and bringing what
exists digitally out into the world
around us as one. We believe that people
should be first, that technology
should serve us, that computing should
match human experience, and also respect
human physiology. We are building a world
where entertainment and productivity
and socialization become something
to experience that is integrated
within our real world. So, I want to just
level set a little bit. Many of you may have heard
the terms VR, or AR, and you may or may not
have heard spatial computing. I want to talk
about the differences, and the term that we use,
which is spatial computing. So, with virtual reality, you have got
a digital environment that shuts out
the real world. You have a scene or scenes
that were created just for you. Typically, it is
a solo interaction, and you have no interface
with the real world around you. Augmented reality might
feel like a closer term, but it is not quite right. So, with augmented reality,
you have got digital content that sits on top
of your real world, but the two do not
really understand each other. They do not actually
interact together. Whereas with spatial computing, digital content interacts with
the real world and with you. So, they blend together,
where your digital world can understand
your physical world and the objects in it. We like to say that spatial
computing is empathetic. I will talk about the feature
set here in a minute. What that means is
that you have got access to a lot of device data. You have got access
to what the eyes are doing, how the user is interacting, how they are moving
about the 3D space. That data allows you to craft
unique and personalized experiences that are
tailored to a specific user. We believe
that spatial computing is the next human-computer
interaction paradigm, and we believe in
bringing more contextual and immersive interactive
power to our users. So, this was a quote
that was taken from Forbes, where they talked
about spatial computing and its impact on businesses
and the world around us. And I really like this quote
for a few reasons. So, they said, “Spatial computing
is going to change our lives, not by taking something
that already exists and putting it in our pocket, but by completely
revolutionizing how we view and interact
with the world.” So, the ultimate goal
of spatial computing is to replace screens
and flat displays with an entirely new paradigm for communication
and collaboration. So, truly reimagining the world,
and engagement and experiences that are centered
around gestures and glances instead of screens. And we have seen
enterprise companies that are super eager
to adopt this new technology, because it allows you
to supercharge skill sets. So, you have got AEC
as an early adopter, architecture, engineering,
construction, home improvement, manufacturing, obviously, automotive industry,
heavy machinery, and consumer goods,
and media entertainment. You also have a wide
variety of use cases, and these are continuing
to grow every single day as we see more individuals
kind of come into this space, bringing their skillsets
with them. So, design modeling,
being able to bring a 3D model out into the real world, viewing it,
demonstration purposes, training and employee education, remote assistance
and operations. So, spatial computing offers
an incredibly realistic but safe exposure to various
scenarios and environments. Real-time information overlay, so you can use this alongside
location-based experiences to deliver information that is
unique and specific to a site. Industrial sales, and more. If you have not seen
the hardware yet, this is a look
at our creator edition, our Magic Leap One.
It consists of three components. You have got the Lightpack,
the Lightwear, and the Control. I have this with me today,
so if you have not seen it, please come find me afterwards.
I would love to give you a demo. So, the Lightwear,
it uses our digital light field technology
with environment mapping, with precision tracking,
and soundfield audio, so you can really produce
some great, rich experiences. The control, it offers touch
and haptic feedback for a fluid sensory experience and spatial tracking
with six DoF. So, that means it allows
for more natural interactions for the user that mimic
their own hand movements. So, up and down, left and right,
rotational movements as well. And then the Lightpack
is where the bulk of the processing power happens,
so that is the engine. This is an untethered
experience. And you can do a lot
with spatial computing. So, you have got access
to the head pose, which is the position and
orientation of the Magic Leap One in the real world. This is similar to camera
position and orientation. It is an emerging
input method. It allows you to gauge
the user’s intent, what they are focused on. You also have gestures
and hand tracking. This is one of my favorites. So, you have got access
to eight gestures currently. These are things
like closed fist, open hand, closed point, relaxed point,
pinch, thumbs up, and more. You also have access to hand
tracking with key points. So, in one of our latest
OS and SDK updates, hand tracking was updated
to 15 key points per hand. So, that is an entire skeletal
representation of each hand. In our most recent
update last week, with skeletal tracking, we also introduced
the Hand Mesh API, so now you have got access
to an entire Hand Mesh. With image tracking,
you can detect and track 2D image targets
in the environment as you or they move across
the real world space, and with eye tracking,
you have got access to the 3D point in space
where a user is looking. You can also detect the centers
of the eyes in the 3D space, as well as blinks. Audio, so audio input and output
through our built-in speakers. You can use voice
as an input method, capturing audio and then
translating it to drive experiences
based on that. Multimedia,
the playback of audio and video files or streams,
and then world reconstruction. So, building a representation
of the real world environment using the on-device sensors, and you can use that
for physics-based interactions, for placing virtual objects, or character navigation
path planning. And I want to highlight
the input methods just a little bit more
from that list. You have got head pose,
eye gaze, gesture and voice. These are incredibly important
in an enterprise context, where you want
a hands-free interaction. Used together,
these inputs let us interact with the world to create a natural
human-computer interface. And developer tools.
So, I will not deep dive here, but I want to introduce you,
if you are a creator, where you can kind of go next
and what you can do. So, we have got three components
with Unreal development. You have got the Lumin SDK. This is going to give you access
to our APIs and features. You can download
that via our Package Manager, the Magic Leap
Package Manager. You have got the Magic
Leap UE4 Editor build. That is on
the Epic Games Store. You also have access
to Unreal Soundfield Audio, API documentation, and examples
in our package manager. You can get this
at creator.magicleap.com. This is a quick look at that. It will allow you to access
our creator portal. So, if you are interested
in getting started building spatial computing experiences
with Unreal Engine, check out creator.magicleap.com.
You have got access to our APIs, our docs, our guides,
and our tutorials. You also have access
to our build tools and publishing tools as well. This is one of my favorite
development tools, and I always like
to talk about it, because it is very unique
to Magic Leap. We call it Magic Leap Remote.
It offers you two affordances. So, it offers you
zero iteration. What that means is that you can
do continuous development and continuous iteration without having to rebuild
every time you make a change. So, personally, for me, I love
to use this with visual effects. If I am making a tweak trying
to get something just right, I use zero iteration. It allows me to develop
much faster and fine-tune
my visual effects. You also have access
to a simulator, so if you do not
have a device, and you want to get
started developing, or let us just say
you do not want to build and run on a device
every time, you can do that
with our simulator, which also allows you access
to virtual rooms to test your projects in. I want to talk
a little bit about, we talked about Magic Leap. I want to talk about the power
of Unreal Engine and Magic Leap. So, first and foremost,
Blueprints. Blueprints offer flexibility
and fast iteration. They also open up development
to non-engineers, which is critical
in spatial computing, because as we continue to bring
more people into this space, we are seeing individuals come
with a variety of skill sets that are not necessarily
development skill sets. Our Magic Leap API
is fully enabled in Blueprints as well as C++,
so you have options. You have got
Blueprint nativization, so that allows you to convert
Blueprints to C++ and really allow you to have
massive gains in performance. You have access
to an extensive toolset, so there are built-in tools
for performance analysis, where you can really see
where performance is spent, and then you can focus
on optimization in your project. And then visual capabilities.
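As a concrete aside on the Blueprint nativization mentioned above: in UE4 it was a packaging setting rather than code. A minimal sketch of enabling it project-wide might look like this (setting name per UE4's packaging settings; verify the exact values against your engine version):

```ini
; Config/DefaultGame.ini : nativize Blueprints at package time (UE4)
[/Script/UnrealEd.ProjectPackagingSettings]
BlueprintNativizationMethod=Inclusive
```

The same option is exposed in the editor under Project Settings, Packaging, so no hand-editing is required.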
So, we just launched Undersea. You can see it here
at SIGGRAPH. And Undersea was built
in Unreal Engine. In fact, we actually have
one of our studios people right back there who
worked on the experience. And with Undersea, it was
the highest graphic fidelity in spatial computing that
we have achieved to date. It is a beautiful experience, and we could have only done
that with Unreal Engine. We have also partnered
with Epic to offer 500 Magic Leap devices in their Epic
Mega Grants program. So, if you are
a developer building Unreal Engine spatial computing
experiences, you can apply. There is no deadline, and projects are accepted
on a rolling basis. Now, Unreal Engine experiences are more than
visually compelling. They have really offered
us unique learnings into UE4 development workflows,
as well as just conceptually, how we think about
future experiences, and what we can do
with this technology. Unreal has really allowed us to push the boundaries
of spatial computing, and you are going to see
that very clearly in two of the experiences
I am going to highlight today. So, the first one is Undersea.
Undersea is a room scale spatial computing experience
for Magic Leap One. It was created
with Unreal Engine, and it allows you
to observe underwater life in a dynamically generated
coral reef biome. It provides an opportunity for a sense of presence
and connection between the creatures
and the environment. And this quote from our studio
team sums up so well why we are excited
about Unreal Engine and all that it offers
for spatial computing. So, they said,
“One of our goals was to create a visually compelling
and immersive experience that pushes the graphical
boundaries of spatial computing using
Unreal and Vulkan 3.1 mobile on Magic Leap One. It gave us a glimpse
into the possibilities of building an experience
that truly pushed the limits of our technology
and our graphics hardware.” The Undersea team
really leaned into the strengths
of the Engine. So, they were working
with a small Dev team of two, but a really strong
technical art team. And the Blueprint system
served as a connector between the two worlds. Everything visual was also done
with Blueprints, from the UI to
the onboarding system as well as
the visual effects. And they were able
to continue optimizing and fine-tuning the experience
until it was at peak performance utilizing UE4’s profiling system
and Blueprint nativization. Most importantly, given the timelines
for project launch, the visual effects, and the team dynamic
and makeup, they would not have been able
to do this project in any other engine. Now, I am sure most of you
have heard of Mica before. If you have not seen Mica, we are also demoing Mica
here at SIGGRAPH. And Mica is our first digital
human at Magic Leap. It is important that you note,
digital human, not digital assistant. Because in the context
of enterprise, as we think about
the future of work, we are bringing in a digital human
to co-create and collaborate with us,
versus working for us. Mica is a great example of how
we relate to digital humans. We are learning a lot
in that space, and we will continue
to share those learnings, as well as the impacts
on how we treat and we interact
with digital humans. Mica demonstrates
that collaboration, working with you, not for you.
She is incredibly lifelike. She is also just an example
of a fully-realized AI, and the possibilities
for digital humans as we move into the future. So, throughout 2019,
and for years ahead, we will continue to develop
and release new creator tools, and pathways to enable spatial
computing as well as our vision, which is the Magicverse. So, while I cannot
predict the future, I can tell you the direction
that we are headed. And this is based on the ideas
and our technology goals that will take us there. So, the first one,
we have got four North Stars that drive our product
development. We have got
sensory field computing. This is devices
that touch the senses, so that sight and sound, but conceivably,
it could be any of the senses. You have got Lifestream. So these are the sensors
on the device. They see you as you move
about spaces in the 3D world. You have got human-centered AI, so we do not believe
in a generic AI. We believe in an AI
that is very unique and personalized
to an individual. And then layers, these are digital overlays
on the real world within a city-scale space
or world-scale space. They might offer
different experiences atop a real-world location. The collection of these layers
is what we call the Magicverse. So, this is a look
at the Magicverse and our spatial
application layers. The Magicverse is
an emergent system of systems bridging the physical
and the digital in a large scale
and persistent manner. So, it scales from room level, to building, to city,
to country and world scale. We see data, information,
and experiences being unlocked within
these environments from screens and servers,
and persisting at scale in contextually relevant
physical environments. The Magicverse
can provide massive economic amplifiers to
communities around the world, because it will
enable communication and work to occur
in completely new ways at fractions of the cost
of current physical systems. We just released our latest
Lumin SDK and LuminOS update last week.
It was a great update. With each update comes further
development of the Magicverse, as well as
international deployment and enterprise enablers. So, in our latest
update last week, we now support
internationalization and localization
for the UK, Germany, and France. This is us just getting started
with additional communities. We will continue to expand that
with our localization framework. The updates also
marked another stage of cross-platform
compatibility and mapping. Those are two key platform enablers in our vision
of the Magicverse. We now allow support
for multiplayer, multi-user scenarios
through our out-of-the-box persistent coordinate
frames API. And lastly, we achieved full
skeleton tracking of the hands, so you now have access
to that entire Hand Mesh through the API. As we wrap up,
I just want to say, I hope we continue to just
connect and create together. That is what we are here
for at Magic Leap. We want to help enable you to build spatial
computing experiences and move into this
next phase of computing. If you are on social media, you might be following
Magic Leap. You should also
follow @magicleapdevs. That is where we talk
about our latest tutorials for Unreal Engine,
our latest updates with our SDK, and we also share experiences that other creators
are building. You are here at SIGGRAPH, so we always say showing
is better than telling. I can sit up here and talk
all day about experiences, but seeing the
experiences for yourself is really the best way
to understand this technology. So, we have Mica showing in
the Immersive Pavilion Village, and Undersea is in the Immersive
Pavilion Museum South K. Go check them out. Last but not least, I want to leave you
with a quote from our CEO. I modified it
just a little bit. He said, “Spatial computing
is here now. Write some code,
or use Blueprints. Create, build, don’t be passive.
Participate.” Thank you.
