AVGVST Guest Post: Refining Human Motion

This is a guest post by AVGVST creative agency. AVGVST are our good neighbours here in Nishiazabu, Tokyo, so we thought it would be a good idea to ask them to create a few good-looking blog posts for us.

Human Motion Capture

Human motion capture is a term commonly known from the world of movie production: Gollum in The Lord of the Rings, lurking and smiling at his shiny ring in an eerily human-like manner, or the beautifully alien creatures of Avatar floating through a fantastic landscape.

Although transferring human body movements to a movie character is an established method, it might be surprising to some that human motion capture has a range of applications in areas beyond the world of film production.

Motion capture can improve human life by boosting a person’s work efficiency, supporting injury recovery and helping to prevent excessive strain on the human body under rough working conditions. The medical and manufacturing industries are just two of many fields where motion capture helps to optimize human movements.

IMU-Based Technology for Refining Human Motion

One of LP-RESEARCH’s main goals in applying its advanced sensor technology is to provide the means for quantitatively refining human motion, making movements faster, safer and more efficient.

LP-RESEARCH’s chief scientist Tobias Schlüter is writing software that uses motion sensor data to measure the movements of a person. Small sensors attached to the subject’s limbs track body motion, and based on the acquired information, adjustments can be made to the subject’s movements.

This can result in improved speed, safety and efficiency for a specific activity.
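As a minimal sketch of the underlying principle, the rotation of a limb can be estimated by integrating the angular velocity reported by a body-worn gyroscope over time. The function name and sample values below are purely illustrative, not LP-RESEARCH code:

```python
# Illustrative sketch: estimating a limb's rotation angle by integrating
# angular-velocity samples from a body-worn gyroscope.

def integrate_gyro(samples_dps, dt):
    """Integrate angular velocity samples (deg/s) taken at interval dt (s)."""
    angle = 0.0
    for omega in samples_dps:
        angle += omega * dt
    return angle

# A knee flexing at a constant 90 deg/s for one second, sampled at 100 Hz:
samples = [90.0] * 100
print(round(integrate_gyro(samples, 0.01), 2))  # -> 90.0
```

In practice such an integration drifts over time, which is why real systems fuse the gyroscope with other sensors, as described later in this post series.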

Worker Safety & Well-Being First in Industrial Production

Using this technology, a patient recovering from a severe injury might find a faster way back to normal life. A runner working to improve his running style might gather useful information to optimize his training strategy.

A central topic in industrial manufacturing is the improvement of production efficiency: human workers performing repetitive tasks face fatigue and physical conditions like back pain. Human motion capture and the corresponding analysis methods help to correct sub-optimal movements, helping the worker fatigue less, stay healthy and at the same time become more efficient.

With applications in sports, medical treatment, industrial production and more: Sensor technology from Tokyo – welcome to LP-RESEARCH.

To find out more about how this technology works, please contact us.

LPVR Middleware: a Full Solution for AR / VR

Introducing LPVR Middleware

Building on the technology we developed for our IMU sensors and large-scale VR tracking systems, we have created a full motion tracking and rendering pipeline for virtual reality (VR) and augmented reality (AR) applications.

The LPVR middleware is a full AR / VR solution that enables headset manufacturers to easily create a state-of-the-art visualization pipeline customized to their product. Specifically, our solution offers the following features:

  • Flexible zero-latency tracking adaptable to any combination of IMU and optical tracking
  • Rendering pipeline with motion prediction, late latching and asynchronous timewarp functionality
  • Calibration algorithms for optical parameters (lens distortion, optical see-through calibration)
  • Full integration in commonly used driver frameworks like OpenVR and OpenXR
  • Specific algorithms and tools to enable VR / AR in vehicles (car, plane etc.) or motion simulators
Overview of LPVR Middleware Functionality
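To give a feel for the motion prediction mentioned in the feature list, a common approach is to extrapolate the current head orientation forward by the expected display latency using the latest angular velocity. The sketch below shows a first-order quaternion extrapolation; it is a textbook illustration under simplified assumptions, not LPVR code:

```python
import numpy as np

# Sketch of head-orientation prediction: extrapolate a unit quaternion
# (w-first convention) forward by dt seconds using angular velocity omega
# (rad/s, body frame), via q_pred = q + 0.5 * dt * (q ⊗ (0, omega)).

def predict_orientation(q, omega, dt):
    """First-order quaternion extrapolation, renormalized."""
    w, x, y, z = q
    ox, oy, oz = omega
    dq = 0.5 * np.array([
        -x * ox - y * oy - z * oz,
         w * ox + y * oz - z * oy,
         w * oy - x * oz + z * ox,
         w * oz + x * oy - y * ox,
    ])
    q_pred = np.array(q, dtype=float) + dt * dq
    return q_pred / np.linalg.norm(q_pred)

# Head turning about the vertical axis at 2 rad/s, predicted 20 ms ahead:
q_now = [1.0, 0.0, 0.0, 0.0]
print(predict_orientation(q_now, (0.0, 2.0, 0.0), 0.02))
```

Rendering with the predicted rather than the measured pose is what lets techniques like late latching and asynchronous timewarp hide the latency between tracking and photon output.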

Application of LPVR Middleware to In-Car VR / AR

The tracking backend of the LPVR middleware solution for VR and AR is particularly advanced in that it allows the flexible combination of multiple optical systems and inertial measurement units (IMUs) for combined position and orientation tracking. Specifically, it enables de-coupling the head motion of a user from the motion of a vehicle the user might be riding in, such as a car or airplane.

As shown in the illustration below, the interior of the vehicle can in this way be displayed as static relative to the user, while the scenery outside moves with the vehicle’s motion.

Illustration of In-car VR Installation
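The de-coupling idea can be stated compactly: given the head pose and the vehicle pose, both tracked in world coordinates, the head pose relative to the cabin is the inverse vehicle pose composed with the head pose. The sketch below uses a simplified 2D yaw-only setup with illustrative names; it is not the LPVR implementation:

```python
import numpy as np

# Sketch of de-coupling head motion from vehicle motion:
# T_head_in_cabin = T_vehicle^-1 * T_head (poses as rotation + translation).

def yaw_matrix(theta):
    """2D rotation matrix for a yaw angle in radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def head_in_cabin(R_vehicle, t_vehicle, R_head, t_head):
    """Pose of the head expressed in the vehicle's cabin frame."""
    R_rel = R_vehicle.T @ R_head
    t_rel = R_vehicle.T @ (t_head - t_vehicle)
    return R_rel, t_rel

# Vehicle turned 30 degrees and the rider's head turned with it: relative to
# the cabin the head has not rotated at all, so the cabin renders as static.
R_v = yaw_matrix(np.deg2rad(30))
R_h = yaw_matrix(np.deg2rad(30))
R_rel, t_rel = head_in_cabin(R_v, np.zeros(2), R_h, np.zeros(2))
print(np.allclose(R_rel, np.eye(2)))  # -> True
```

Only the residual relative motion then drives the rendered cabin, while the raw vehicle pose drives the outside scenery.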

For any augmented or virtual reality application in a moving vehicle, this functionality is essential to providing an immersive experience to the user. LP-Research is the industry leader in providing customized sensor fusion solutions for augmented and virtual reality.

If you are interested in this solution, please contact us to start discussing your application case.


LPMS Operator’s Manual Update

It’s been a long time, but finally we have updated our reference manual to the latest generation of sensors.

The manual is accessible through our documentation & support page or directly from here.

Below is a list of the most important updates, some of which are fixes that customers have requested for quite a while:

  • Removed hardware specific parts. These are now covered in the quick start manuals.
  • Corrected scaling factors for all non-floating-point data transmission modes.
  • Corrected error in description of reset modes.
  • Moved the soon-to-be-deprecated LpSensor detailed description to the appendix.
  • Added a list of APIs for direct sensor programming; OpenZen is to replace LpSensor.
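To illustrate what the corrected scaling factors for non-floating-point transmission modes are about: in such modes the sensor sends raw integers, and the host multiplies by a scale factor to recover physical units. The sketch below assumes a ±2000 °/s gyroscope full scale as a typical example; the actual factors are those in the manual, not these:

```python
# Hypothetical example of a non-floating-point data mode: a signed 16-bit
# gyroscope sample mapped back to degrees per second. The full-scale value
# here is a common IMU setting, not a value taken from the LPMS manual.

GYRO_FULL_SCALE_DPS = 2000.0
INT16_MAX = 32767

def raw_to_dps(raw):
    """Convert a signed 16-bit gyroscope sample to degrees per second."""
    return raw * GYRO_FULL_SCALE_DPS / INT16_MAX

print(raw_to_dps(32767))   # -> 2000.0
print(raw_to_dps(-16384))  # close to -1000
```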

AVGVST Guest Post: Technology from Neo Tokyo

This is a guest post by AVGVST creative agency. AVGVST are our good neighbours here in Nishiazabu, Tokyo, so we thought it would be a good idea to ask them to create a few good-looking blog posts for us.

Teamwork

LP-RESEARCH provides answers to some of today’s hard engineering problems and questions. It is in the company’s essence to find those solutions and provide them to the people who need them most.

Find out more about the company and the people behind it in this interview with the LP-RESEARCH team.

Global Scale

We are in Germany, on one of the fastest highways on this planet – die Autobahn. A silver limousine is cruising down the road; gently and elegantly it cuts through traffic. Its driver wears augmented reality glasses projecting his distance to other cars, navigation information and nearby construction zones directly onto the road in the driver’s field of view.

Beijing, China. A forklift is navigating autonomously through a warehouse to organize the stock of an international trading company. Following its pre-programmed routine, the forklift silently moves from A to B, B to C, repeat, effortlessly picking up and delivering pallets loaded with goods to be shipped.

Thousands of miles away, an employee of a Swiss chocolate factory is packing pralines into heart-shaped boxes. His movements are efficient, following a dynamically calculated pattern optimized to maximize productivity, while at the same time helping the employee to keep a healthy posture and minimize the strain on his joints.

From a warehouse in China to chocolate made in Switzerland to cruising on the German Autobahn, one thing connects all these locations: a technology company from Tokyo – Life Performance Research Inc.

Solutions to Hard Problems

The core of LP-RESEARCH’s business, and the starting point of many of its journeys toward solving a hard problem, is its advanced sensing and measurement technology. LP-RESEARCH’s product, the Life Performance Motion Sensor (LPMS), algorithmically fuses together information from a gyroscope, accelerometer and other sensors.

Mathematically “glued” together, these data streams produce a result whose accuracy and responsiveness are superior to information gathered from any single source. This type of sensor fusion represents the focus and core knowledge of LP-RESEARCH’s developments and is applied in all variations of the company’s products and services.
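The textbook illustration of this kind of gyroscope/accelerometer fusion is the complementary filter: the gyroscope gives a smooth but drifting angle estimate, the accelerometer a drift-free but noisy one, and blending the two yields an estimate better than either alone. The sketch below only illustrates the principle; LP-RESEARCH’s LPMS filters are considerably more sophisticated:

```python
# Minimal complementary filter: blend the integrated gyroscope rate (smooth,
# but drifts) with the accelerometer-derived tilt angle (noisy, but drift-free).

def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update step; alpha weights the gyroscope path."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Sensor at rest, tilted 10 degrees: the gyroscope reports ~0 deg/s while the
# accelerometer reports 10 degrees; the estimate converges to the true tilt.
angle = 0.0
for _ in range(500):  # 5 s at 100 Hz
    angle = complementary_filter(angle, 0.0, 10.0, 0.01)
print(round(angle, 2))  # -> 10.0
```

The weighting constant alpha controls the trade-off: values close to 1 trust the gyroscope for fast motion, while the small accelerometer contribution slowly pulls the estimate back to the drift-free reference.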

iOS Support for LPMS-B2

LPMS-B2, besides Bluetooth Classic, also supports Bluetooth 4 / Bluetooth Low Energy. This allows the sensor to connect to Apple mobile devices such as the iPad, iPhone or Apple Watch. We have recently created a library that enables development of applications supporting LPMS-B2 on these devices.

The library can be accessed via our open source repository.

The repository contains a skeleton application that shows usage of the most basic parts of the library. The library itself is contained in the following files:

A sensor object is initialized and connected using the following code:

More coming soon...
