Design Prototype and Inside-out Tracking – LPVIZ (Part 2)

LPVIZ Prototype Industrial Design

This post is a follow-up to the introduction of our augmented reality (AR) headset LPVIZ. See our previous post here.

For the past two months the LPVIZ team has been working hard to improve our initial prototype. We have enhanced the device's appearance and optimized its ergonomics. My colleague Seeon Mitchel has created draft 3D prints of the design he has been planning for the initial release of LPVIZ. The results look excellent (Figures 1 & 2).

The ring design that fixes the unit to the user's head feels comfortable; even during longer sessions it does not cause neck fatigue. Below are two photos of the current functional prototype with the newly printed shell.

Figures 1 & 2 – The fully functional LPVIZ design prototype

Inside-out Tracking and Gesture Recognition

The latest LPVIZ prototype features a built-in stereo camera. We are using the excellent Rigel module by UltraLeap, which allows us to run a SLAM (simultaneous localization and mapping) algorithm and UltraLeap's hand tracking at the same time.

Using the Rigel’s stereo camera, my colleague Thomas Hauth has developed a state-of-the-art inside-out tracking algorithm that allows the headset to be used inside a vehicle, even if no special cameras are installed. The video (Figure 3) below shows the fundamental functionality of the algorithm.

Figure 3 – The video shows the fundamental functionality of the LPSLAM inside-out tracking algorithm

It is important to note that this will not be a full replacement for ART outside-in tracking inside the vehicle; ART's tracking engine is more accurate and more robust under difficult lighting conditions. Still, our purpose is to also serve customers who have a smaller budget or no possibility of installing additional equipment in their vehicle.

OpenZen 1.0 Release

Going Full Circle for Sensor Data Streaming with OpenZen

Since the founding of LP-Research, it has been important to us not only to provide excellent hardware to our customers, but also to offer software components that ease the adoption and use of our products. Over the years, we have provided various libraries to support customers using our sensor hardware on a diverse set of platforms.

As our range of sensor offerings has grown, we realized that we needed to consolidate our software library stack while still supporting multiple platforms. We took this opportunity to create a more modular system for working with sensors that contain various measurement components.

Figure 1 – OpenZen Unity plugin connected to a LPMS-CU2 sensor and live visualization of sensor orientation.

Based on these requirements, we developed OpenZen, our take on a high-performance sensor data streaming and processing library. It combines the experience we have gained over more than five years of sensor data processing with modern software techniques. The core of OpenZen is written in modern C++14. We host the source code in a publicly accessible open-source repository, so anyone can study and contribute to the code base.

Core Concept

One basic principle of OpenZen is to abstract the measurement components provided by a sensor from the transport layer used for communication. In this way, once a user is familiar with the OpenZen API, a wide range of sensor types can be used over various connection layers. To achieve the lowest latency and the highest sensor data throughput, we designed OpenZen to be fully event-based, without any polling loops that could introduce delays.
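
To make this concrete, below is a minimal sketch of the event-based flow, modeled on the C++ example in the OpenZen documentation. Treat the details as illustrative: the IO-system name ("SiUsb"), the sensor name and the baud rate are placeholders that depend on your platform and device.

```cpp
#include <OpenZen.h>
#include <iostream>

int main() {
    // Create the OpenZen client; it owns all sensor connections and
    // delivers measurement events through a single queue.
    auto clientPair = zen::make_client();
    auto& clientError = clientPair.first;
    auto& client = clientPair.second;
    if (clientError)
        return clientError;

    // Connect to a sensor. The IO-system name, sensor name and baud rate
    // are placeholders for this sketch.
    auto sensorPair = client.obtainSensorByName("SiUsb", "lpmscu2000573", 921600);
    auto& obtainError = sensorPair.first;
    auto& sensor = sensorPair.second;
    if (obtainError) {
        client.close();
        return obtainError;
    }

    // Ask the sensor for its IMU measurement component.
    auto imuPair = sensor.getAnyComponentOfType(g_zenSensorType_Imu);
    auto& hasImu = imuPair.first;
    auto& imu = imuPair.second;
    if (!hasImu) {
        client.close();
        return 1;
    }

    // Event-based readout: block until the next event arrives instead of
    // polling, so no artificial latency is introduced.
    for (int i = 0; i < 100; ++i) {
        auto event = client.waitForNextEvent();
        if (event && event->component.handle == imu.component().handle)
            std::cout << "acc x = " << event->data.imuData.a[0] << std::endl;
    }

    client.close();
    return 0;
}
```

Note that the loop never sleeps on a timer; waitForNextEvent blocks until data is available, which is what keeps latency low.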

Sensor Types and Connectivity

With release 1.0, OpenZen provides sensor interfaces for the measurements of inertial measurement units (IMUs) and the output of global navigation satellite system (GNSS) receivers. For example, our new LPMS-IG1P sensor is a combined IMU and GNSS unit; both components can be read out via OpenZen.

A list of supported sensors is here.
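
As a sketch of how such a combined unit can be handled, the fragment below extends the earlier example by dispatching each event to the component it originates from. The component-type constants and data fields follow the OpenZen headers as we understand them; treat them as assumptions rather than a definitive reference.

```cpp
#include <OpenZen.h>
#include <iostream>

// Sketch: read out both components of a combined IMU + GNSS unit such as
// the LPMS-IG1P. Constant and field names are assumptions based on the
// OpenZen headers.
void readImuAndGnss(zen::ZenClient& client, zen::ZenSensor& sensor) {
    auto imuPair  = sensor.getAnyComponentOfType(g_zenSensorType_Imu);
    auto gnssPair = sensor.getAnyComponentOfType(g_zenSensorType_Gnss);

    for (int i = 0; i < 100; ++i) {
        auto event = client.waitForNextEvent();
        if (!event)
            continue;

        // Dispatch on the component that produced the event.
        if (imuPair.first && event->component.handle == imuPair.second.component().handle) {
            // Inertial sample: acceleration per axis
            std::cout << "acc x = " << event->data.imuData.a[0] << std::endl;
        } else if (gnssPair.first && event->component.handle == gnssPair.second.component().handle) {
            // GNSS sample: position fix
            std::cout << "lat = " << event->data.gnssData.latitude << std::endl;
        }
    }
}
```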

OpenZen supports sensor connections via various interfaces such as USB, serial port, CAN bus and Bluetooth. Furthermore, measurement data from sensors can also be streamed over a network and received on a second system by another OpenZen instance.

A list of supported transport layers is here.
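
Because the transport layer is abstracted away, switching interfaces amounts to passing a different IO-system name and identifier to the same call. The strings below are assumptions for illustration; the exact names are listed in the OpenZen documentation.

```cpp
#include <OpenZen.h>

// Illustrative fragment: the same obtainSensorByName call is used for every
// transport; only the IO-system string and the identifier change.
// Both identifiers below are placeholders.
void connectOverDifferentTransports(zen::ZenClient& client) {
    auto viaUsb       = client.obtainSensorByName("SiUsb", "lpmscu2000573", 921600);
    auto viaBluetooth = client.obtainSensorByName("Bluetooth", "00:11:22:33:44:55");
}
```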

Operating Systems and Programming Languages

Currently, OpenZen can be compiled and used on Windows, Linux and macOS systems, and we are working on ports to further platforms such as Android. Due to its modular design, the OpenZen API can be accessed from many programming languages. At this time, we support the C, C++ and C# programming languages, and we provide a ready-to-go Unity plugin.

AR HMD for In-Car Applications – LPVIZ (Part 1)

What is In-Vehicle AR

This article describes our first steps in the development of an AR HMD for in-car, aerospace and naval applications.

Over several years we have developed our LPVR middleware. In the first version the purpose of this middleware was to enable location-based VR with a combination of optical and IMU-based headset tracking. Building on this foundation we extended the system to work as a tracking solution for transportation platforms such as cars, ships or airplanes (Figure 1).

In contrast to stationary applications, where a single IMU is sufficient to track the rotations of an HMD, the in-vehicle use-case requires an additional IMU fixed to the vehicle, and the information from this sensor needs to become part of the sensor fusion. We realized this with our LPVR-DUO tracking system.
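
To make the idea concrete, here is a minimal sketch of the underlying differential-rotation principle: the headset's motion is expressed relative to the cabin by removing the vehicle's own rotation. This is not the actual LPVR-DUO fusion filter, which also incorporates optical data, filtering and prediction; the quaternion type and helpers below are defined locally for illustration.

```cpp
// Minimal sketch: express head orientation relative to the vehicle cabin
// from two world-referenced IMU orientations. Not the LPVR-DUO algorithm.
struct Quat { double w, x, y, z; };

// Hamilton product a * b
Quat mul(const Quat& a, const Quat& b) {
    return {
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w
    };
}

// For unit quaternions the conjugate is the inverse.
Quat conj(const Quat& q) { return { q.w, -q.x, -q.y, -q.z }; }

// q_rel = q_vehicle^-1 * q_head: the head pose the renderer should use,
// so that virtual content stays fixed to the cabin while the car moves.
Quat headInCabin(const Quat& qVehicle, const Quat& qHead) {
    return mul(conj(qVehicle), qHead);
}
```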

Applying this middleware to existing augmented reality headsets on the market turned out to be challenging. Most AR HMDs use their own proprietary tracking technology that is only suitable for stationary use-cases, but doesn’t work in moving vehicles. Accessing such a tracking pipeline in order to extend it with our sensor fusion is usually not possible.

Illustration of In-car VR Installation

Figure 1 – Principle of in-car AR/VR as implemented with LPVR-DUO

Applications

There are a large number of applications for in-car augmented reality ranging from B2B use-cases for design and development to consumer-facing scenarios. A few are listed in the illustration below (Figure 2).

AR applications in a car

Figure 2 – In-car AR use cases range from a simple virtual dashboard to interactive e-commerce applications. The “camera pass-through” enables the driver to virtually look through the car to see objects otherwise occluded by the car chassis.

HMD Specifications

For this reason, we decided to start the development of LPVIZ, an AR HMD dedicated to in-vehicle applications. This AR HMD for in-car, aerospace and naval applications is designed to match the requirements of our customers as closely as possible:

  • Strong optical engine with good FOV (LUMUS waveguides), unobstructed lateral vision (safety), low persistence and high refresh rate
  • System satisfies all requirements for immersive AR head tracking (pose prediction, head motion model, late latching, asynchronous timewarp etc.)
  • HMD is tethered to a computing unit in the vehicle by a thin VirtualLink cable
  • Computing unit is compact, but powerful enough to run SteamVR and thus supports a large range of software applications
  • Options to use either outside-in or inside-out optical tracking inside the vehicle, as well as LeapMotion hand tracking

In-Car HMD Hardware Prototype Development

We have recently created the first prototype of LPVIZ. Hardware development is still at a very early stage, but the prototype is sufficient to demonstrate our core functionality and use-case.

Thomas wearing LPVIZ

Figure 3 – Tracking of LPVIZ works based on our LPVR-DUO technology, making use of ART outside-in tracking and our LPMS-CURS2 IMU module. This image shows Dr. Thomas Hauth performing an optical see-through (OST) calibration.

Figure 4 – The LPVIZ prototype is powered by a LUMUS optical engine. This waveguide-based technology has excellent optical characteristics, perfectly suitable for our use-case.

Work in Progress

As you can see from the prototype images, our hardware system is still very much at an alpha stage. Nevertheless, we think it shows the capabilities of our technology very well and points in the right direction. The next hardware version will already be close to a release model; we will reduce the size of the device by applying the points below:

  • Use active LED markers instead of large passive marker balls, or switch to inside-out tracking
  • Collect all electronics components on one compact electronics board, with only one VirtualLink connector
  • Create a compact housing, with a glasses-like fixture instead of a VR-style ring mount (Figure 5)

Figure 5 – First draft of a CAD design for the housing of the LPVIZ release version

Collaboration with Varjo

Varjo VR-2

Varjo High-Resolution HMDs

Headsets by the Finnish start-up Varjo have recently had a profound impact on the market for business-to-business virtual reality devices. Varjo's advanced display technology allows viewing immersive environments at uniquely high resolution, which makes the company's HMDs a great choice for professional design and industrial applications.

We have been working with Varjo for a few months in order to adapt our LPVR driver to work with their headsets. As a result we have recently released a first version of the driver and are ready to deploy it to customers.

LPVR Tracking Technology

Valve Lighthouse is the default tracking technology built into Varjo headsets. While this system is very suitable for games and single-user applications, it is limited in its tracking volume and accuracy (mainly reproducibility). To allow multi-user, large-space applications (location-based VR), an alternative tracking system is needed.

With LPVR-CAD for Varjo we allow combining Varjo headsets with our tracking technology, based on marker-based inside-out tracking, feature-based inside-out tracking, or outside-in tracking such as Advanced Realtime Tracking (ART).

Besides static tracking solutions we also offer support for our LPVR-DUO in-car tracking system.

Varjo marker holder top view

Figure 1 – For LPVR outside-in-based tracking, we offer a customized marker holder for Varjo HMDs.

Varjo marker holder detail view

Figure 2 – The marker holder fits all currently available HMDs: VR-1, VR-2 (Pro) and XR-1

Big in Korea

Location-based Virtual Reality for Automotive Design

Figure 1 – Using LPVR-CAD large room-scale tracking, 3D design content is visualized on VIVE Pro HMDs

In cooperation with the Korean automotive design solutions provider AP-Solutions, we created a large location-based virtual reality installation at the Hyundai research and development center close to Seoul, Korea. The system is used to showcase, amend and modify prototype and production-ready automobile designs (Figure 1).

LPVR Large Room Scale Tracking Engine

Figure 2 – Each VIVE Pro HMD is equipped with optical tracking markers and an LPMS-CU2 IMU. The IMUs are covered with black tape to avoid reflections of infrared light.

The system uses optical tracking together with LP-Research's LPVR solution to track up to 20 users wearing VIVE Pro head-mounted displays (HMDs). Each user carries a VIVE hand controller, for a total of 40 tracked objects in a space of close to 400 m².

Responsiveness is achieved by using LPVR (Figure 2) to fuse LPMS IMU data with the optical tracking data. The optical system uses 36 infrared cameras to track the 160 markers attached to the HMDs and hand controllers. The position and orientation data of each user's HMD is computed by LP-Research's sensor fusion algorithm.

The content of the virtual space is rendered using a CAD software package running on backpack PCs worn by each of the 20 users. The PCs communicate and coordinate via a central server.

Korean News Coverage

Images courtesy of Hyundai Motor Group Newsroom.
