Accurate Mixed Reality with LPVR-CAD and Varjo XR-3

Anchoring Virtual Objects with Varjo XR-3

A key aspect of making a mixed reality experience compelling and useful is to correctly anchor virtual content to the real world. An object that is fixed in its position and orientation (pose) relative to the real world should not change its pose as seen by the user while they move around the scene wearing the headset. Imagine a simple virtual cube positioned to sit on a real table. As the user walks around, this cube should remain in place, regardless of the perspective from which the user looks at it.
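
To make this concrete, here is a minimal sketch (our own illustration with numpy, not LPVR-CAD code) of why the render loop needs the head pose every frame: the cube's view-space pose is re-derived from the current head pose, so as the head moves, the cube stays pinned to the world.

```python
# Minimal sketch (illustrative, not LPVR-CAD code) of world-anchoring:
# the cube's view-space pose is re-derived from the head pose each frame.
import numpy as np

def pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# The cube is fixed in the world, e.g. sitting on the table.
T_world_cube = pose(np.eye(3), [0.0, 0.8, -1.5])

def cube_in_view(T_world_head):
    """View-space pose of the cube for the current head pose."""
    return np.linalg.inv(T_world_head) @ T_world_cube

# As the head translates 30 cm to the right, the cube's view-space
# position shifts 30 cm to the left, so it appears fixed in the room.
print(cube_in_view(pose(np.eye(3), [0.0, 1.6, 0.0]))[:3, 3])  # [ 0.  -0.8 -1.5]
print(cube_in_view(pose(np.eye(3), [0.3, 1.6, 0.0]))[:3, 3])  # [-0.3 -0.8 -1.5]
```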

In more detail, correct anchoring of virtual objects to reality depends on the following points:

  1. In order to create correctly aligned content in a mixed reality experience, accurate knowledge of the user’s head pose is essential. We calculate the head pose using LPVR-CAD.
  2. The field of view provided by the cameras and the natural field of view of the human eyes are very different. Appropriate calibration is needed to compensate for this effect. The illustration below shows this inherent problem of video pass-through very clearly. Varjo’s HMDs are factory calibrated to minimize the impact of this effect on the user experience.

To get a better idea of how a correct optical see-through (OST) calibration influences mixed reality performance, we recommend experimenting with the camera configuration options in Varjo Lab Tools.

– Image credit: Varjo mixed reality documentation

Functional Testing of MR Performance

Our LPVR solution must, at the very least, achieve a precision that is satisfactory for our users' typical applications. We therefore ran a series of experiments to evaluate the precision of our system in mixed reality and to compare it with SteamVR Lighthouse tracking.

We evaluated the following configurations:

#   HMD          Engine       Tracking system   Varjo markers
1   Varjo XR-3   Unreal 5.2   LPVR-CAD          No
2   Varjo XR-3   Unreal 5.2   Lighthouse        No
3   Varjo XR-3   Unreal 5.2   LPVR-CAD          Yes
4   Varjo XR-3   Unreal 5.2   Lighthouse        Yes

1 – LPVR Tracking without Varjo Markers

In this scenario we fixed a simple virtual cube to a tabletop in a defined position and orientation. The cube is the only virtual object in this scene; everything else is the live video feed passed through from the HMD cameras. We track the HMD using LPVR-CAD in connection with an ART SmartTrack 3. No markers are used to stabilize the pose of the cube.

Important note: Tracking performance for the mixed reality use case can change significantly if the marker target attached to the HMD is not correctly adjusted. Please refer to the LPVR-CAD documentation or contact us for further support.

2 – Lighthouse Tracking without Varjo Markers

This scenario is identical in its basic setup to scenario #1, except that we use Lighthouse tracking to determine the pose of the HMD.

3 – LPVR Tracking with Varjo Markers

Regardless of how well the tracking of the HMD itself works, as long as distortions of the environment as seen through the video pass-through feed are not perfectly compensated, there will be a discrepancy between where objects appear in reality and where they appear in virtual view space. As there are limits to the precision of such an optical see-through (OST) calibration, another way to compensate for its effect is to extract additional information about the environment directly from the video feed and align objects to it.

One such tool is Varjo markers, i.e. QR-code-style fiducials placed in the scene. Using image analysis, virtual objects can be fixed to these markers and therefore automatically realigned to the video feed as the user moves around. The video below shows the result of this scenario.
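
As a rough sketch of the idea (illustrative only, not the Varjo SDK API), re-anchoring boils down to rebuilding the object's world pose from the marker detection every frame, so residual calibration and tracking error is absorbed by the detection:

```python
# Illustrative sketch only (not the Varjo SDK API): rebuild an object's
# world pose from a marker detection each frame.
import numpy as np

def realigned_object_pose(T_world_head, T_head_marker, T_marker_object):
    """World pose of a virtual object, re-derived from a marker detection.

    T_world_head    -- head pose from the tracking system (4x4)
    T_head_marker   -- marker pose detected in the pass-through video (4x4)
    T_marker_object -- fixed offset of the virtual object from the marker (4x4)
    """
    return T_world_head @ T_head_marker @ T_marker_object
```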

4 – Lighthouse Tracking with Varjo Markers

In our final test scenario we did the same experiment as in scenario 3, just with Lighthouse tracking instead of LPVR tracking.

Conclusion

See a table with our preliminary findings below:

Test scenario           LPVR Tracking   Lighthouse Tracking
Without Varjo Markers   2               2.5
With Varjo Markers      1.5             1.5

– Approximate displacement error on the horizontal plane (in cm)

Using the same room setup and test scene, the mixed reality accuracy of LPVR-CAD and Lighthouse tracking is similar. With both tracking systems, slight shifts of 1-2 cm can be observed, depending on head movement. A way to further reduce this residual drift is to use Varjo markers, which additionally align virtual objects with the video feed from the pass-through cameras. Good results with LPVR tracking require precise adjustment of the optical target attached to the headset.

Note that our method of estimating the displacement error is qualitative rather than quantitative. In this post we made a general comparison of LPVR and Lighthouse tracking, with and without Varjo markers. A more quantitative evaluation will follow.

For customers who want to reduce drift as much as possible, we recommend the use of markers and optical tracking. Results may differ with the Varjo XR-4 and with other variations in the tracking environment or displayed content, which could warrant further testing in the future.

*We provide complete solutions with cutting-edge tracking systems and content for a variety of headsets. For detailed information on supported HMD models, please reach out to us.

Immersive Driving Assistance with LPVIZ

How LPVIZ Augments Driving Reality

Going beyond a simple screen replacement, LPVIZ is an augmented reality driving assistance solution for the car. It displays relevant content to a driver or passenger in 3D, superimposed on reality. Content can be placed anywhere inside the car, such as a virtual speedometer above the dashboard, or anywhere outside the car, such as point-of-interest markers or navigation guidance.

The video on top of this post shows what a drive around the block in Azabujuban, Tokyo with LPVIZ looks like. A virtual dashboard is projected onto the center console of the vehicle. Arrows on the ground show lane guidance to the driver. Red Google Maps-style markers show points of interest. The virtual dashboard stays fixed to the same location in the car, even when the vehicle turns. The navigation arrows move smoothly and the point-of-interest markers are globally anchored.

Perfectly Tuned Components

LPVIZ consists of several components that all have to interact perfectly to create a compelling and safe augmentation experience. The illustration below shows a block diagram of how the hardware components are connected.

Accurate tracking is required to display useful content to the driver: the HMD pose in the local car coordinate system and the vehicle pose in a globally anchored frame. Precise calibration of all components of the solution is essential to provide the highest visual fidelity and driver safety. Our LPVIZ product makes all parts of the system available in a compact form factor, ready to be integrated with any vehicle.
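
The sketch below (our own notation with made-up numbers, not the LPVIZ API) illustrates the two reference frames involved: car-fixed content needs only the in-car head pose, while world-anchored content also chains in the vehicle's global pose.

```python
# Illustrative sketch of LPVIZ's two reference frames; all poses and
# numbers here are assumptions for demonstration.
import numpy as np

def to_view(T_head, p):
    """Transform homogeneous point p into the view (head) frame."""
    return np.linalg.inv(T_head) @ p

# Example poses: vehicle in the world, headset inside the cabin.
T_world_car = np.eye(4); T_world_car[:3, 3] = [10.0, 0.0, 5.0]
T_car_head  = np.eye(4); T_car_head[:3, 3]  = [0.4, 1.1, 0.2]

# Virtual speedometer fixed over the dashboard (car frame): only the
# in-car head pose matters, so it stays put when the vehicle turns.
p_car_speedo = np.array([0.0, 1.3, 0.9, 1.0])
speedo_view = to_view(T_car_head, p_car_speedo)

# Point-of-interest marker fixed in the world: the vehicle's global
# pose is chained in, so the marker stays anchored as the car drives by.
p_world_poi = np.array([35.0, 2.0, -120.0, 1.0])
poi_view = to_view(T_world_car @ T_car_head, p_world_poi)
```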

The Past, Present and Future

In the current development stage we're focusing on the most essential aspects of the solution: displaying a virtual dashboard, navigation information and points of interest. While this content is proprietary, we're opening our software to third-party developers so they can build their own content on our platform.

Currently we offer LPVIZ as a B2B solution for prototyping, design and research. However, we're working on reducing system complexity to make it viable as a consumer-facing automotive after-market solution, to be released later this year.

Towards a Consumer Product

We are very proud of the progress our team has made in the past months. We’re moving closer to making our vision of an augmented reality driving assistance system a reality for everyone. One very important take-away from our recent developments is that it’s indeed possible to provide real utility to the driver using technology that is readily available. It might still be early days, but we’re edging towards a product that could appeal to a wider consumer market. This is just the beginning.

LPVR-DUO in an Airborne Helicopter

In-Flight VR

Imagine soaring through the skies as a pilot, testing the limits of a helicopter's capabilities while feeling the rush of wind and turbulence. Now imagine that instead of the real world outside and the safe landing pad your helicopter is approaching, you see a virtual reality (VR) scene in which you are homing in on a ship in high seas. The National Research Council Canada (NRC) and Defence Research and Development Canada (DRDC) have brought this experience to life with their groundbreaking Integrated Reality In-Flight Simulation (IRIS).

IRIS is not your ordinary simulator; for one, it's not sitting on a hexapod, it's airborne. It is a variable-stability helicopter based on the Bell 412 that can behave like other aircraft and can simulate varying weather conditions; combine that with a VR environment and you have a tool that allows safe training for operations in the most adverse conditions. In particular, it is used for Ship Helicopter Operating Limitations (SHOL) testing.

Mission-Critical Application with LPVR-DUO

The LPVR-DUO system is what makes VR possible on this constantly moving platform. This cutting-edge AR/VR tracking system seamlessly merges the inertial measurements taken by the headset with the helicopter's motion data and with measurements from a camera system mounted inside the cabin to provide the correct visuals to the pilot. The challenges of using cameras to track the VR headset inside the tight environment of the helicopter, with ever-changing lighting conditions, are overcome by using an ART SmartTrack 3 system. This system follows an arrangement of reflective markers attached to the pilot's helmet. The VR headset is attached to the helmet in such a way that the pilot can wear it as if it were a pair of night vision goggles. Put together, this allows displaying a virtual world to the pilot, even in the most extreme maneuvers.
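
To give a flavor of the kind of fusion involved, here is a toy single-axis complementary filter (much simpler than LPVR-DUO's actual algorithm, and purely our own illustration): the high-rate gyro is integrated for responsiveness, and the estimate is gently pulled toward the low-rate optical pose so drift cannot accumulate even on a constantly moving platform.

```python
# Toy single-axis complementary filter (illustrative only, not the
# LPVR-DUO algorithm): gyro prediction plus slow optical correction.
import numpy as np

def fuse_yaw(yaw, gyro_z, dt, optical_yaw=None, gain=0.02):
    """One filter step for yaw, angles in radians."""
    yaw += gyro_z * dt                      # IMU prediction (high rate)
    if optical_yaw is not None:             # optical update (low rate)
        err = np.arctan2(np.sin(optical_yaw - yaw),
                         np.cos(optical_yaw - yaw))
        yaw += gain * err                   # wrap-safe nudge toward optical
    return yaw
```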

To ensure an authentic experience, the IRIS system incorporates real-time turbulence models, meticulously crafted from wind tunnel trials. These turbulence effects are seamlessly integrated into the aircraft’s motion and into the VR scene, providing pilots with precise proprioceptive and vestibular cues. It’s a symphony of technology and innovation in the world of aviation testing.

In-Cockpit Implementation

The optical tracking system relies on highly reflective marker targets on the helmet to track movement in three dimensions. Initially, only five markers were installed, strategically placed for optimal tracking. But the pursuit of perfection led NRC to create custom 3D-printed low-reflectivity helmet molds, allowing them to mount a dozen small passive markers. This significantly improved tracking reliability in various lighting conditions and allowed for a wider range of head movement.

Recently, NRC put this remarkable concept to the test with actual flight trials. The response from pilots was nothing short of exhilarating. They found the system required minimal adaptation, exhibited no noticeable lag, and, perhaps most impressively, didn’t induce any motion sickness. Even the turbulence effects felt incredibly realistic. Surprisingly, the typical VR drawbacks, such as resolution and field of view limitations, had minimal impact, especially during close-in shipboard operations. It’s safe to say that IRIS has set a new standard for effective and immersive aviation testing.

Publication of Results

The NRC team presented their results at the Vertical Flight Society's 79th Annual Forum in two papers [1] and [2], and they also have a blog post on their site.

NOTE: Image contents courtesy of Aerospace Research Centre, National Research Council of Canada (NRC) – Ottawa, ON, Canada

High-performance Use Cases of LPVR & Varjo Headsets

Components of a VR/AR Operating System

Augmented and virtual reality technology helps boost worker productivity in various fields such as automotive, aerospace, industrial production, and more. Whereas the context of these applications is usually fairly specific, some aspects are common to many of these use cases. In this article we will specifically explore pose tracking of Varjo head-mounted displays (HMDs) based on LP-RESEARCH's LPVR operating system. Further on, we will show two customer use cases that utilize LPVR in different ways.

In a typical VR/AR setup, you find three main subsystems as shown in the illustration below:

With our LPVR operating system, we connect these three building blocks of a VR/AR system and make them communicate seamlessly with each other, while providing a simple, unified interface to the user. Depending on the specific use case, users might select different types of hardware to build their VR/AR setup. Therefore, LPVR offers a wide range of interface options to adapt to systems from various manufacturers.

LPVR Flavors

LPVR comes in different flavors; we can group end applications into two categories:

  • LPVR-CAD – Static AR/VR setups, where multiple users operate and collaborate in one or more joint tracking volumes. These tracking volumes can be situated in different locations.
  • LPVR-DUO – AR/VR systems that are located in a vehicle or on a motion platform: such systems have special requirements, especially on the tracking side. If, for example, you want to track a headset inside a car and display a virtual cockpit anchored to the car frame together with a virtual outside world fixed to a global coordinate system, you need a means of locating the car in the world and of referencing the HMD locally within the car frame.

In the following paragraphs, we will introduce two customer use cases that cover these two basic scenarios.

Large-scale Industrial Design at Hyundai

– Varjo XR-3 at Hyundai Design Center with optical markers attached. Image credit: Hyundai

For the Korean automotive company Hyundai Motor Company, we created a large, location-based virtual reality installation at their research and development center in Namyang, Korea. The system is used to showcase, amend and modify prototype and production-ready automobile designs.

This application uses optical outside-in tracking and LP-RESEARCH's LPVR-CAD solution to track up to 20 users wearing head-mounted displays. While LPVR allows a mix of different headset types to operate in the same tracking volume, the Varjo XR-3 offers the most outstanding performance for inspecting objects in high resolution and great detail. In addition to an HMD, users carry hand controllers, for a total of more than 40 tracked objects in a space of close to 400 square meters.

– Hyundai’s collaborative virtual reality design experience. Image credit: Hyundai

Responsiveness is achieved by using LPVR-CAD to combine data from the inertial measurement unit built into each headset with information from the optical tracking system. The optical system uses 36 infrared cameras to track the 160 markers attached to the HMDs and hand controllers. LP-RESEARCH's sensor fusion algorithms deliver smooth and uninterrupted position and orientation data for each user's HMD.

Depending on the type of headset, users either wear a backpack PC, connect to a host wirelessly, or use an extra-long cable to connect directly to a rendering PC outside the tracking volume.

“Currently, we are actively utilizing VR from the initial development stage to the point of development. In the future, we plan to increase accessibility and usability by simplifying equipment using wireless HMDs. For this, improving the speed and stability of wireless internet is essential, which we plan to address by introducing 5G. In addition, LP RESEARCH’s technology is essential for multi-user location sharing within a virtual space.” – SungMook Kang, Visualization Specialist, Hyundai Motor Company

Next-level Automotive Entertainment with CUPRA

Imagine… playing Mario Kart: your hands grip the wheel, and you are in Neo Tokyo, on a race track. Futuristic buildings fly by while you race ahead, drifting through long turns and leaving your competitors behind you.

Now imagine you are no longer in your living room: you are sitting in an actual race car, buzzing around an empty parking lot. Instead of looking through the windshield with your own eyes, you are wearing a Varjo XR-3 HMD. What you see outside the car is a virtual world; it's Neo Tokyo.

– The view through the Varjo XR-3 headset. Image credit: CUPRA

As the car moves on the parking lot, you move inside the virtual world. When you move your head inside the car’s cockpit, the motions of your head are accurately tracked.

– Varjo XR-3 inside the cabin of the Urban Rebel. Image credit: Fonk Magazine

– CUPRA's Urban Rebel drifting on the test course

Together with the Norwegian company Breach VR, we have implemented this experience for the automotive company CUPRA. CUPRA is relentlessly pushing the technology of their vehicles into the future, striving to provide a novel driving experience to their customers.

Tracking of the vehicle, and of the Varjo XR-3 inside it, is achieved with LP-RESEARCH's automotive tracking system LPVR-DUO. As the headset's gyroscope sensors record the superimposed motion of the car and of the user inside the car, a specialized sensing setup and algorithm are required to separate the two.
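
The core of the separation can be sketched in a few lines (our own illustration, not the production algorithm): a reference IMU rigidly mounted to the car measures the car's share of the rotation, which is then subtracted from the headset's gyro reading.

```python
# Illustrative sketch of separating head motion from vehicle motion;
# the real system involves full pose fusion, this shows only the idea.
import numpy as np

def head_rate_in_cabin(omega_headset, omega_car, R_car_head):
    """Angular rate of the head relative to the car, in head axes.

    omega_headset -- headset gyro reading, head axes (rad/s)
    omega_car     -- vehicle IMU gyro reading, car axes (rad/s)
    R_car_head    -- 3x3 rotation of the head frame relative to the car
    """
    # Express the car's angular rate in head axes, then subtract it.
    return omega_headset - R_car_head.T @ omega_car
```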

The result of this cascade of exceptional technology is a compellingly immersive driving experience of the future. The combination of an outstanding visualization device like the Varjo XR-3, LPVR's state-of-the-art tracking, Breach VR's 3D software and design and, last but not least, the incredible CUPRA race cars makes for an exciting ride that you'll greatly enjoy and never forget. Come and join the ride!

Check out this blog post on the Varjo Insider Blog.

Check out our Instagram for further use cases with Varjo’s HMDs: @lpresearchinc

Design of an Efficient CAN-Bus Network with LPMS-IG1

Introduction to Designing an Efficient CAN-Bus Network

This article describes how to design an efficient high-speed CAN bus network with LPMS-IG1. We offer several sensor types with a CAN bus connection. The CAN bus is a popular network standard for applications in automotive, aerospace and industrial automation, where a large number of sensor and actuator units must be connected with a limited amount of cabling.

While creating a CAN bus network is not difficult by itself, there are a few key aspects that an engineer should follow in order to achieve optimum performance.

Efficient CAN-Bus Network Topology

A common mistake when designing a CAN bus network is to connect devices in a star topology. In this topology the signal from each device is routed to a central hub by connections of similar length. The hub is connected to the host, which acquires data from and distributes data to the devices on the network.

To reach the full performance of a CAN bus network, we strongly discourage using this topology. Most CAN bus setups designed in this way will fail to work reliably at high speed.

The CAN bus is fundamentally designed to work in a daisy-chain configuration, with one sensor unit or the data acquisition host as the first device in the chain and one device as the last.

Maximum CAN-Bus Speed and Cable Length

A key aspect of designing an efficient high-speed CAN bus network is to correctly dimension the bus cable lengths. The bus line running past each device should be the longest connection in the network. Each sensor connects to the bus by a short stub. A typical length for such a stub is 10-30 cm, whereas the main bus line can be hundreds of meters long, depending on the desired transmission speed.

Speed        Maximum cable length
1 Mbit/s     20 m
800 kbit/s   40 m
500 kbit/s   100 m
250 kbit/s   250 m
125 kbit/s   500 m

Note that a CAN bus network needs to be terminated with a 120 Ohm resistor at each end. This is especially important for bus lengths of more than 1-2 m and should be considered general good practice.
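
A planned layout can be quickly sanity-checked against the table above with a few lines of Python. The length limits below come from the table; the 30 cm stub limit is the rule of thumb mentioned earlier, and the helper itself is purely illustrative.

```python
# Sanity-check a planned CAN bus layout against the speed/length table.
MAX_BUS_LENGTH_M = {
    1_000_000: 20,   # 1 Mbit/s
    800_000:   40,
    500_000:   100,
    250_000:   250,
    125_000:   500,
}

def topology_ok(bitrate_bps, bus_length_m, stub_lengths_m):
    """True if bus and stub lengths fit the rules of thumb above."""
    return (bus_length_m <= MAX_BUS_LENGTH_M[bitrate_bps]
            and all(stub <= 0.3 for stub in stub_lengths_m))

print(topology_ok(500_000, 80, [0.1, 0.25, 0.3]))  # True
print(topology_ok(1_000_000, 80, [0.1]))           # False: bus too long
```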

LPMS-IG1 CAN-Bus Configuration

One of our products with a CAN bus interface option is our LPMS-IG1 high performance inertial measurement unit. LPMS-IG1 can be flexibly configured to satisfy user requirements. It has the ability to output data using the CANopen standard, freely configurable sequential streaming or our proprietary binary format LP-BUS. These and further parameters can be set via our IG1-Control data acquisition application.
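
For example, frames from the sensor can be read with the open-source python-can library. The sketch below assumes a Linux SocketCAN interface named can0 and a CANopen node ID of 1 (hence the 0x181 TPDO1 COB-ID); the actual channel and message IDs depend on the configuration you set in IG1-Control.

```python
# Minimal receive loop with python-can. Channel name and node ID are
# assumptions for illustration; configure the real IDs in IG1-Control.
import can

NODE_ID = 1
TPDO1_COB_ID = 0x180 + NODE_ID  # CANopen TPDO1 for node 1 -> 0x181

bus = can.interface.Bus(channel="can0", interface="socketcan")

while True:
    msg = bus.recv(timeout=1.0)
    if msg is None:
        continue                 # no frame within the timeout
    if msg.arbitration_id == TPDO1_COB_ID:
        # Payload layout depends on the configured PDO mapping.
        print(f"{msg.timestamp:.3f}  {msg.data.hex()}")
```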

Some CAN bus data loggers that rely on the CANopen standard require users to provide an EDS file to automatically configure each device on the network. While we don’t support the automatic generation of EDS files from our data acquisition applications, depending on the settings in IG1-Control or LPMS-Control, it is possible to manually create an EDS file as described in this tutorial.

In this article we have given a few essential insights into how to design an efficient high-speed CAN bus network with LPMS-IG1. If you would like to know more about this topic or have any questions, let us know!
