LPMS-NAV Navigation Sensors for Mobile Robots and Automated Guided Vehicles (AGV)

We are proud to announce the release of a new series of high-precision sensors for applications in autonomous vehicle navigation. The sensors are based on quartz-vibration gyroscopes with low-noise, low-drift characteristics. They have excellent capabilities for measurement of slow to medium speed rotations.

We offer the new sensors in various versions with different communication interfaces and housing options: LPMS-NAV2, LPMS-NAV2-RS232 and LPMS-NAV2-RS422. Please find more detailed information on our products page.

The following video shows a use case of one of our customers in China. The company is using the LPMS-NAV2-RS232 sensor for mobile robot navigation. Automatic navigation of automated guided vehicles (AGVs) and cleaning robots are two of the principal application areas of the LPMS-NAV series.

If you have any interest, please contact us for further information.

IMU-based Dead Reckoning (Displacement Tracking) Revisited

In a blog post a few years ago, we published results of our experiments with direct integration of linear acceleration from our LPMS-B IMU. At that time, although we were able to process data in close to real-time, displacement tracking only worked on one axis and for very regular up-and-down motions.

In the meantime, the measurement quality of our IMUs has improved and we have put further work into researching dead reckoning applications. The fact remains that low-cost MEMS sensors, as used in our LPMS-B2 devices, are not suitable for displacement measurement over extended periods of time or with great accuracy. But for some applications, such as sports motion measurement or as one component in a larger sensor fusion setup, the results are very promising.
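The core of IMU-based displacement tracking is double integration of linear acceleration. The following minimal sketch (not our production algorithm; the function name and rectangular-integration scheme are illustrative assumptions) shows why short intervals work well while long runs drift: any small bias in the acceleration grows quadratically in the displacement.

```python
import numpy as np

def integrate_displacement(accel, dt):
    """Doubly integrate one-axis linear acceleration (m/s^2) into displacement (m).

    accel: 1-D array of acceleration samples with gravity already removed.
    dt: sample period in seconds.
    A constant bias b in accel produces a displacement error of 0.5*b*t^2,
    which is why pure IMU dead reckoning only works over short intervals.
    """
    velocity = np.cumsum(accel) * dt          # first integration: a -> v
    displacement = np.cumsum(velocity) * dt   # second integration: v -> s
    return displacement

# A constant 1 m/s^2 acceleration for 1 s at 100 Hz gives roughly 0.5 m.
dt = 0.01
accel = np.ones(100)
print(integrate_displacement(accel, dt)[-1])
```

In practice the result is re-zeroed at known rest phases (zero-velocity updates), which is what makes repetitive motions such as boxing punches tractable.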

A further experiment shows this algorithm applied to the evaluation of boxing motions. This system might work as a base component for IoT boxing gloves that allow automatic evaluation of an athlete's technique and strength, or it might be integrated into an advanced controller for virtual reality sports.

As usual, please contact us for further information.

IMUcore Sensor Fusion

Introducing IMUcore

IMUcore is the central algorithm working inside all LP-RESEARCH IMUs. It collects orientation data from several sources and combines them into fast and drift-free tilt and direction information. To work with any type of MEMS sensor, various online and offline calibration methods have been implemented to guarantee high-quality data output. The algorithm is versatile and computationally efficient, and can be implemented on embedded MCUs with minimal power consumption.

IMUcore is now available as a solution from LP-RESEARCH. Please contact us here for more information or a price quotation.

Sensor Fusion Filter Overview

IMUcore uses gyroscope data as the basis for calculating orientation. Errors introduced by measurement noise and drift are corrected using accelerometer and compass data. Optionally, the sensor fusion can be extended with an optical (or other) tracking system to additionally provide position information.
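The general principle of correcting integrated gyroscope data with an absolute reference can be illustrated with a basic complementary filter. This is a simplified sketch of the filter class described above, not the actual IMUcore implementation; the function names and the single-axis (pitch) restriction are assumptions for illustration.

```python
import math

def accel_to_pitch(ax, az):
    """Tilt angle (rad) recovered from the gravity vector measured by the
    accelerometer. Noisy, but drift-free."""
    return math.atan2(ax, az)

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """One update step of a basic complementary filter.

    The gyroscope rate (rad/s) is integrated for a fast, smooth estimate;
    the accelerometer-derived tilt slowly pulls that integral back,
    cancelling gyro drift. alpha sets how strongly the gyro path is trusted.
    """
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch
```

With a stationary sensor (zero gyro rate), repeated updates converge to the accelerometer tilt, while during fast motion the estimate follows the low-latency gyro path.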

If this topic sounds familiar to you and you are looking for a solution to a related problem, contact us for further discussion.

Meet Xikaku

We are proud to present our new partner company Xikaku. Xikaku is a US company located in Los Angeles, focusing on the development of technology related to the field of augmented reality (AR). Visit their website here.

LPVR Location-based VR Tracking

LPVR Location-based VR Tracking Introduction

UPDATE 1: Full SteamVR platform support

UPDATE 2: Customer use case with AUDI and Lightshape

UPDATE 3: Customer use case with Bandai Namco

Consumer virtual reality head mounted display (HMD) systems such as the HTC VIVE support so-called room-scale tracking. These systems are able to track the head and controller motion of a user not only in a sitting or other stationary position, but also during free, room-wide movement. The volume of this room-scale tracking is limited by the capabilities of the specific system, usually covering around 5m x 5m x 3m. While this space may be sufficient for single-user games or applications, multi-user applications in particular, such as arcade-style game setups or enterprise applications, require larger tracking volumes.

Optical tracking systems such as Optitrack offer tracking volumes of up to 15m x 15m x 3m. Although the positioning accuracy of optical tracking systems is in the sub-millimeter range, orientation measurement in particular is often not sufficient to provide an immersive experience to the user. Image processing and signal routing may introduce further latencies.

Our LPVR-VIVE large room-scale tracking solution solves this problem by combining optical tracking information with inertial measurement data using a special predictive algorithm based on a head motion model.

Compatible HMDs: HTC VIVE, HTC VIVE Pro
Compatible optical tracking systems: Optitrack, VICON, ART, Qualisys, all VRPN-compatible tracking systems
Compatible software: Unity, Unreal, Autodesk VRED, all SteamVR-compatible applications

LPVR-VIVE is now available as a solution from LP-RESEARCH. Please contact us here for more information or a price quotation.

IMU and Optical Tracker Attachment

The system follows each HMD using a combination of optical and IMU tracking. A special 3D-printed holder is used to attach a high-quality IMU (LPMS-CU2) and optical markers to an HTC VIVE headset.

To create a perfectly immersive experience for the user, optical information is augmented with data from an inertial measurement unit at a rate of 800Hz. In addition to these high-frequency, low-latency updates, a head motion model is used to predict future movements of the player's head. This creates an impression of zero-latency gameplay.
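The idea of prediction can be sketched with a constant-acceleration motion model: render each frame for the pose the head will have when the frame actually reaches the display. This is a deliberately simplified stand-in for the head motion model mentioned above; the function name and the single-angle formulation are illustrative assumptions.

```python
def predict_orientation(angle, angular_velocity, angular_acceleration, horizon):
    """Extrapolate a head angle (rad) a short horizon (s) into the future.

    angular_velocity and angular_acceleration typically come from the IMU's
    gyroscope. horizon is the expected motion-to-photon latency, so the
    rendered view matches the head pose at display time.
    """
    return (angle
            + angular_velocity * horizon
            + 0.5 * angular_acceleration * horizon ** 2)

# Head turning at 1 rad/s, predicted 20 ms ahead: the renderer is handed
# an angle 0.02 rad further along the motion.
predicted = predict_orientation(0.0, 1.0, 0.0, 0.020)
```

At 800Hz IMU updates, the prediction horizon stays small, which keeps the extrapolation error of such a simple model low.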

Tracking Camera, HMD and Player Setup

In its current configuration the system can track up to 6 actors simultaneously, each holding two VIVE controllers to interact with the environment. Players wear backpack PCs to provide visualization and audio.

Playground and Camera Arrangement

LPVR-VIVE covers an area of 15m x 15m. There is an outer border of 1.5m that is out of the detection range of the cameras. This results in an actual, usable playground area of 13.5m x 13.5m. The overall size of the playground is 182.25m². The cameras are grouped around the playground to provide optimum coverage of the complete area.


Contact us for further information.
