AR HMD for In-Car Applications – LPVIZ (Part 1)

What is In-Vehicle AR?

This article describes our first steps in the development of an AR HMD for in-car, aerospace and naval applications.

Over several years we have developed our LPVR middleware. In its first version, the purpose of this middleware was to enable location-based VR with a combination of optical and IMU-based headset tracking. Building on this foundation, we extended the system to work as a tracking solution for transportation platforms such as cars, ships or airplanes (Figure 1).

In contrast to stationary applications, where an IMU is sufficient to track the rotations of an HMD, the in-vehicle use case requires an additional IMU fixed to the vehicle, whose measurements need to become part of the sensor fusion. We realized this with our LPVR-DUO tracking system.
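The core idea behind this two-IMU fusion can be shown in a minimal sketch (an illustration of the principle, not LP-Research's actual implementation): removing the vehicle's orientation from the head's orientation yields the head pose relative to the cabin.

```python
import numpy as np

def quat_conj(q):
    # Conjugate of a unit quaternion [w, x, y, z] inverts the rotation.
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    # Hamilton product of two quaternions [w, x, y, z].
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def head_in_vehicle_frame(q_head_world, q_vehicle_world):
    # Remove the vehicle's rotation from the head pose: the result is
    # the head orientation relative to the cabin, unaffected by the
    # car turning or pitching.
    return quat_mul(quat_conj(q_vehicle_world), q_head_world)
```

If both IMUs report the same rotation (the car turns while the head stays still relative to the cabin), the relative orientation is the identity, so the rendered cabin stays put around the user.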

Applying this middleware to existing augmented reality headsets on the market turned out to be challenging. Most AR HMDs use proprietary tracking technology that is only suitable for stationary use cases and does not work in moving vehicles. Accessing such a tracking pipeline in order to extend it with our sensor fusion is usually not possible.

Illustration of In-car VR Installation

Figure 1 – Principle of in-car AR/VR as implemented with LPVR-DUO

Applications

There are a large number of applications for in-car augmented reality ranging from B2B use-cases for design and development to consumer-facing scenarios. A few are listed in the illustration below (Figure 2).

AR applications in a car

Figure 2 – In-car AR use cases range from a simple virtual dashboard to interactive e-commerce applications. The “camera pass-through” enables the driver to virtually look through the car to see objects otherwise occluded by the car chassis.

HMD Specifications

For this reason, we decided to start the development of LPVIZ, an AR HMD dedicated to in-vehicle applications. This AR HMD for in-car, aerospace and naval use is designed to reflect the requirements of our customers as closely as possible:

  • Strong optical engine with good FOV (LUMUS waveguides), unobstructed lateral vision (safety), low persistence and high refresh rate
  • System satisfies all requirements for immersive AR head tracking (pose prediction, head motion model, late latching, asynchronous timewarp etc.)
  • HMD is tethered to a computing unit in the vehicle by a thin VirtualLink cable
  • Computing unit is compact but powerful enough to run SteamVR, thus supporting a large range of software applications
  • Options to use either outside-in or inside-out optical tracking inside the vehicle, as well as LeapMotion hand tracking
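One of the head-tracking requirements above, pose prediction, can be sketched as follows (an illustrative outline under simple constant-angular-velocity assumptions, not the actual LPVR algorithm): the last known orientation is propagated forward by the gyroscope rate to cover the rendering latency.

```python
import numpy as np

def predict_orientation(q, gyro, dt):
    # Forward-predict a unit quaternion [w, x, y, z] by integrating the
    # angular rate gyro (rad/s, body frame) over dt seconds, e.g. the
    # expected motion-to-photon latency of the display pipeline.
    angle = np.linalg.norm(gyro) * dt
    if angle < 1e-9:
        return q  # negligible rotation, keep the current pose
    axis = gyro / np.linalg.norm(gyro)
    half = angle / 2.0
    dq = np.concatenate(([np.cos(half)], np.sin(half) * axis))
    # Hamilton product q * dq applies the incremental rotation in the
    # body frame.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])
```

In a real pipeline a prediction like this feeds late latching and asynchronous timewarp, which re-project the rendered frame with the freshest pose just before display.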

In-Car HMD Hardware Prototype Development

We have recently created the first prototype of LPVIZ. Hardware development is still at a very early stage, but the prototype is sufficient to demonstrate our core functionality and use case well.

Thomas wearing LPVIZ

Figure 3 – LPVIZ tracking is based on our LPVR-DUO technology, making use of ART outside-in tracking and our LPMS-CURS2 IMU module. The image shows Dr. Thomas Hauth performing an optical see-through (OST) calibration.

Figure 4 – The LPVIZ prototype is powered by a LUMUS optical engine. This waveguide-based technology has excellent optical characteristics, perfectly suitable for our use-case.

Work in Progress

As you can see from the prototype images, our hardware is still very much at an alpha stage. Nevertheless, we think it demonstrates the capabilities of our technology well and points in the right direction. In the next hardware version, which will already be close to a release model, we will reduce the size of the device by applying the points below:

  • Use active marker LEDs instead of large passive marker balls, or switch to inside-out tracking
  • Collect all electronics components on one compact electronics board, with only one VirtualLink connector
  • Create a compact housing, with a glasses-like fixture instead of a VR-style ring mount (Figure 5)

Figure 5 – First draft of a CAD design for the housing of the LPVIZ release version

Collaboration with Varjo

Varjo VR-2

Varjo High-Resolution HMDs

Headsets by the Finnish start-up Varjo have recently had a profound impact on the market for business-to-business virtual reality devices. Varjo’s advanced display technology allows viewing immersive environments at uniquely high resolution, making the company’s HMDs a great choice for professional design and industrial applications.

We have been working with Varjo for a few months in order to adapt our LPVR driver to work with their headsets. As a result we have recently released a first version of the driver and are ready to deploy it to customers.

LPVR Tracking Technology

Valve’s Lighthouse is the default tracking technology built into Varjo headsets. While very suitable for games and single-user applications, this system is limited in its tracking volume and accuracy (mainly reproducibility). To allow multi-user, large-space applications (location-based VR), an alternative tracking system is needed.

With LPVR-CAD for Varjo we enable the combination of Varjo headsets with our tracking technology, based on marker-based inside-out tracking, feature-based inside-out tracking or outside-in tracking such as Advanced Realtime Tracking (ART).

Besides static tracking solutions we also offer support for our LPVR-DUO in-car tracking system.

Varjo marker holder top view

Figure 1 – For LPVR outside-in-based tracking, we offer a customized marker holder for Varjo HMDs.

Varjo marker holder detail view

Figure 2 – The marker holder fits all currently available HMDs: VR-1, VR-2 (Pro) and XR-1

LPVR Middleware – a Full Solution for AR / VR

Introducing LPVR Middleware

Building on the technology we developed for our IMU sensors and large scale VR tracking systems, we have created a full motion tracking and rendering pipeline for virtual reality (VR) and augmented reality (AR) applications.

The LPVR middleware is a full solution for AR / VR that enables headset manufacturers to easily create a state-of-the-art visualization pipeline customized to their product. Specifically our solution offers the following features:

  • Flexible zero-latency tracking adaptable to any combination of IMU and optical tracking
  • Rendering pipeline with motion prediction, late latching and asynchronous timewarp functionality
  • Calibration algorithms for optical parameters (lens distortion, optical see-through calibration)
  • Full integration in commonly used driver frameworks like OpenVR and OpenXR
  • Specific algorithms and tools to enable VR / AR in vehicles (car, plane etc.) or motion simulators

Overview of LPVR Middleware Functionality

Application of LPVR Middleware to In-Car VR / AR

The tracking backend of the LPVR middleware is particularly advanced in that it allows the flexible combination of multiple optical systems and inertial measurement units (IMUs) for combined position and orientation tracking. Specifically, it enables de-coupling the head motion of a user from the motion of the vehicle the user might be riding in, such as a car or airplane.

As shown in the illustration below, the interior of the vehicle can in this way be displayed as static relative to the user, while the surrounding scenery moves with the motion of the vehicle.

Illustration of In-car VR Installation

For any augmented or virtual reality application in a moving vehicle, this functionality is essential to providing an immersive experience. LP-Research is the industry leader in providing customized sensor fusion solutions for augmented and virtual reality.
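The de-coupling described above can be sketched with homogeneous transforms; this is an illustrative outline of the principle, not the actual LPVR-DUO implementation:

```python
import numpy as np

def translation(x, y, z):
    # Build a 4x4 homogeneous transform containing only a translation.
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def head_pose_in_cabin(T_head_world, T_vehicle_world):
    # Both poses are 4x4 homogeneous transforms in a common world frame.
    # Expressing the head pose in the vehicle frame cancels the vehicle's
    # own motion, so the cabin can be rendered as static around the user
    # while the outside scenery moves with the vehicle.
    return np.linalg.inv(T_vehicle_world) @ T_head_world
```

For example, if the vehicle has driven 100 m down the road while the user's head sits 0.5 m ahead of the vehicle origin, the head pose in the cabin frame stays at 0.5 m, so the rendered interior does not drift.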

If you are interested in this solution, please contact us to discuss your application case.

LPMS Operator’s Manual Update

It has been a long time coming, but we have finally updated our reference manual for the latest generation of sensors.

The manual is accessible through our documentation & support page or directly from here.

Below is a list of the most important updates, some of which are fixes that customers have been requesting for quite a while:

  • Removed hardware specific parts. These are now covered in the quick start manuals.
  • Corrected scaling factors for all non-floating-point data transmission modes.
  • Corrected error in description of reset modes.
  • Moved to-be-deprecated LpSensor detail description to appendix.
  • Added list with APIs for direct sensor programming. OpenZen is to replace LpSensor.

iOS Support for LPMS-B2

LPMS-B2, besides Bluetooth classic, also supports Bluetooth 4 / Bluetooth Low Energy. This allows us to connect the sensor to Apple mobile devices such as the iPad, iPhone or the Apple watch. We recently have created a library that enables development of applications supporting LPMS-B2 on these devices.

The library can be accessed via our open source repository.

The repository contains a skeleton application that shows usage of the most basic parts of the library. The library itself is contained in the following files:

A sensor object is initialized and connected using the following code:

More coming soon...
