"Motion Capture" is the technique of digitizing the movement of people, animals or objects. In the case of the ART Motion Capture system, markers are attached to the subject's limbs (no need for lycra body suits!) and data is captured as the subject performs a range of movements in front of an array of pre-installed cameras.
Before Motion Capture can take place, it is critical that the body markers are precisely calibrated in order to produce an exact replica of the movements made by the subject. To do this, ART Motion Capture uses a single-step calibration process which is fast, accurate and convenient. The subject simply makes a series of arbitrary movements, from which the system automatically calculates the bone lengths and the positions of the targets. The results are transferred automatically to a digital manikin, where they can be used directly in real time without the need for post-processing.
The ART Motion Capture system has been specifically designed for applications such as immersive Virtual Reality and ergonomics. It is a result of ART's research effort in close cooperation with Volkswagen AG as part of the AVILUS research project. The system supports collaborative working, allowing the Motion Capture of up to three subjects (only upper body for the third) simultaneously.
ART-Human is a software tool for motion-capture projects. It uses the information from our highly accurate 6 degree-of-freedom (6DOF) optical targets to track a human subject. We use up to 17 ergonomically designed targets which can be worn directly on normal clothing.
For normal usage the ART-Human model calibrates the length of each limb segment automatically. After the targets have been attached, the fast and straightforward calibration process is started. The user simply moves their arms and legs for around 20 seconds through the full range of motion of all joints. The accuracy of the calibration is then evaluated by the software, and the results – the length of each limb and the offset data for the 6DOF targets – are applied to the model.
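The bone-length part of such a calibration can be pictured with a minimal sketch: for two rigidly linked points on the body, the segment length is approximately the mean distance between their tracked positions over the arbitrary movement. The function and synthetic data below are hypothetical illustrations only, not the ART-Human algorithm (which additionally solves for the 6DOF target offsets).

```python
# Hypothetical sketch: estimating a fixed bone length from per-frame
# positions of two adjacent joint points tracked during calibration.
import math

def bone_length(samples_a, samples_b):
    """Mean Euclidean distance between two tracked point trajectories."""
    dists = [math.dist(a, b) for a, b in zip(samples_a, samples_b)]
    return sum(dists) / len(dists)

# Synthetic data: elbow fixed at the origin, wrist 0.26 m away,
# forearm rotating during the calibration movement.
elbow = [(0.0, 0.0, 0.0)] * 100
wrist = [(0.26 * math.cos(i / 10), 0.26 * math.sin(i / 10), 0.0)
         for i in range(100)]
print(round(bone_length(elbow, wrist), 3))  # → 0.26
```

With real, noisy marker data the averaging suppresses per-frame measurement error, which is why a short sequence of arbitrary movements suffices.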
ART-Human is now ready to send human model data in real time to your application. We provide our own 6dj format, similar to the Fingertracking hand data format. It is also possible to use Fingertracking with ART-Human at the same time. The movement of the human subject is then linked directly, in real time, to the manikin in the 3D model.
- Full kinematics of human skeleton model
- Simple calibration procedure
- Use of 6DOF optical tracking targets
- Fast automatic bone-length calibration (< 1 min)
- Real-time interface to Autodesk MotionBuilder®, Siemens Jack and Dassault Systèmes Live Motion Standard v1
- Output via VRPN, 6dj
- Recording to C3D, BVH and to RAMSIS CSV files
- Fully compatible with the ART Fingertracking system
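As a rough illustration of consuming such a real-time stream, the sketch below receives datagrams over UDP and parses one joint pose per line. The record layout, port number, and function names are assumptions made for illustration; they do not describe the actual 6dj or VRPN wire formats.

```python
# Hypothetical sketch of a real-time receiver for streamed joint data.
# Assumed illustrative ASCII layout per line:
#   <joint id> <x> <y> <z> <qw> <qx> <qy> <qz>
import socket

def parse_line(line):
    """Parse one assumed joint record into (id, position, quaternion)."""
    fields = line.split()
    joint_id = int(fields[0])
    pos = tuple(map(float, fields[1:4]))
    quat = tuple(map(float, fields[4:8]))
    return joint_id, pos, quat

def joint_stream(port=5005):
    """Yield joint poses as datagrams arrive (blocks forever)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        data, _addr = sock.recvfrom(4096)
        for line in data.decode().splitlines():
            yield parse_line(line)
```

An application would typically drive its manikin from such a generator frame by frame, which is what "real-time interface" means in practice: no recording or post-processing step between capture and display.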
Head-mounted display (HMD)
A tracked HMD provides an excellent immersive viewing experience, and when subjects also wear a Motion Capture system, they can collaborate and interact realistically with other users within a 3D environment. It has been well established that this provides major benefits for multi-person training for tasks that are either very expensive or potentially dangerous in the real world.
When planning an assembly task, it is important to know whether the part itself will fit into the designed space, and whether the worker has enough space to carry out the work safely and efficiently. Motion Capture has been proven to be much more cost-effective than using a physical prototype.
When designing automobile interiors, a "virtual seating buck" is used as an alternative to physical prototypes. This means that the subject uses an HMD to view a detailed 3D model of the vehicle's interior. Often, a target set for the upper body is used, as only the position of the upper body and arms needs to be measured. Customers also use the ART Fingertracking system in conjunction with Motion Capture to give an accurate representation of finger movement.
Training staff in the use of equipment or procedures that are potentially dangerous, expensive or simply unavailable is a constant problem (e.g. a fire in a critical area of the company, or maintenance of heavy machines). VR-based training means that staff can be trained in a safe but realistic computer-generated environment, where repetition of complex tasks can provide a high level of learning and understanding.
In the production planning phase it is important to know how to assemble the product as quickly as possible, and that the worker can carry out the task safely over a long period of time. To do this, ergonomic studies are performed before the actual workplace is set up. Rather than a simulation involving a virtual manikin operated by mouse and keyboard, it is more realistic to have a worker perform the assembly task in the real world using Motion Capture. This accurately digitizes the movement involved in carrying out the work and transfers it to a virtual manikin, which in turn allows detailed analysis so that an objective comparison can be made between different tasks.
One potential drawback of carrying out operations in a real-world setting is that optical tracking can sometimes be hindered by objects between the cameras and the Motion Capture targets. To deal with this, ART has developed a new hybrid Motion Capture system which combines traditional optical targets with robust inertial sensors. This ensures that the flow of accurate tracking data remains uninterrupted even if the subject is out of view of the cameras.
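A minimal way to picture such optical/inertial fusion is a complementary filter on a single rotation axis: the inertial rate is integrated every frame, and the estimate is pulled toward the optical measurement whenever the cameras can see the target. This one-axis sketch is purely illustrative; ART's actual hybrid algorithm is not described here.

```python
# Hypothetical one-axis complementary filter for optical/inertial fusion.
def fuse(angle, gyro_rate, dt, optical_angle=None, alpha=0.98):
    """Advance the angle estimate by one frame.

    angle         -- previous fused estimate (rad)
    gyro_rate     -- inertial angular rate (rad/s)
    dt            -- frame time (s)
    optical_angle -- optical measurement, or None if the target is occluded
    alpha         -- trust in the inertial prediction (0..1)
    """
    predicted = angle + gyro_rate * dt        # inertial dead reckoning
    if optical_angle is None:                 # cameras cannot see the target
        return predicted
    return alpha * predicted + (1 - alpha) * optical_angle

# Usage: while occluded the filter coasts on the gyro alone.
est = 0.0
for _ in range(10):                           # ten occluded frames at 100 Hz
    est = fuse(est, gyro_rate=1.0, dt=0.01)   # est drifts to ~0.1 rad
est = fuse(est, gyro_rate=1.0, dt=0.01, optical_angle=0.12)
```

During an occlusion the filter falls back to pure dead reckoning, so short gaps in camera coverage do not interrupt the data stream, at the cost of slow inertial drift until the target is reacquired.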
The ART Motion Capture system has a direct link to the Alaska/Dynamicus ergonomic software from IfM Chemnitz, which calculates the ergonomic load on the subject according to the EAWS (Ergonomic Assessment Work Sheet) standard.