
Building a Motion Capture Pipeline for British Sign Language in XR with Manus and The Open University

As extended reality moves from niche to mainstream, researchers are beginning to explore how immersive infrastructure can serve accessibility goals, not just entertainment.


One of the more technically ambitious efforts in this space is SIGNATURE-BSL, a project led by The Open University and funded through the Higher Education Innovation Fund from UK Research and Innovation. Its aim is to determine the feasibility of an automated machine translation system for British Sign Language (BSL), delivered via naturalistic 3D virtual humans in XR environments.


[Image: Three hands gesture in sign language against a black background, showing 'L', 'V' and a fist, each in a different skin tone.]


The Challenge


BSL is not a gestural overlay on spoken English. It is a fully embodied language in which meaning is distributed across hand shape, finger articulation, motion, body posture and facial expression simultaneously. That creates an unusually demanding capture problem.


Vocabulary alone poses a further challenge: estimates range from 20,000 to well over 100,000 individual signs. And unlike spoken languages, the order and structure of a sentence depends heavily on context - where a signer's hands are in space and when movements happen relative to each other.


Subtle differences in hand position or facial expression can alter meaning entirely, and transitions between signs also carry linguistic information. Any translation system that fails to capture this nuance will fail fluent users immediately.


This is the core reason why existing natural language processing approaches do not transfer cleanly to BSL. The signal is 3D, time dependent and anatomically specific in ways that 2D video or text based training data cannot adequately represent.
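To make the dimensionality concrete, a captured sign can be thought of as a time series of 3D joint positions rather than a sequence of tokens. The sketch below uses a hypothetical schema (the joint count, class names and 120 Hz rate are illustrative assumptions, not the project's actual data format):

```python
from dataclasses import dataclass

# A single captured sign as a time series of 3D joint positions.
# Joint names and counts here are illustrative, not the project's schema.
HAND_JOINTS = 21  # a common skeletal hand model: wrist + 4 joints per finger

@dataclass
class Frame:
    timestamp_s: float                        # capture time in seconds
    joints: list[tuple[float, float, float]]  # one (x, y, z) per joint

@dataclass
class SignClip:
    gloss: str           # the sign's label, e.g. "HELLO"
    frames: list[Frame]  # ordered samples over the sign's duration

    def duration(self) -> float:
        return self.frames[-1].timestamp_s - self.frames[0].timestamp_s

# Even a one-second sign sampled at 120 Hz yields
# 120 frames x 21 joints x 3 axes of floating-point values per hand -
# far denser, and far more time-dependent, than text or a flat 2D image.
clip = SignClip(
    gloss="HELLO",
    frames=[Frame(t / 120.0, [(0.0, 0.0, 0.0)] * HAND_JOINTS) for t in range(120)],
)
print(len(clip.frames) * HAND_JOINTS * 3)  # 7560
```

The point of the sketch is only scale: a one-second clip already carries thousands of continuous values whose ordering in time is itself meaningful, which is exactly what text-based training data cannot represent.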



The Solution


[Image: An avatar of a man in a checkered sweater gestures with both hands against a black grid background.]

The SIGNATURE-BSL project addresses this by combining two complementary capture approaches.


The first is volumetric video. Using a 32-camera studio setup, the team captures BSL signers in 360-degree recordings, producing animated 3D meshes that preserve both spatial geometry and temporal dynamics.


The second approach is optical motion capture combined with hand and facial tracking - a more data-efficient method that focuses on bone movement, joint position and facial expression rather than full surface geometry. This is where Manus gloves play a central role. 


[Image: Two people in a studio: one in a motion capture suit wearing Manus gloves, the other in a patterned dress with a lanyard, against a black curtain background.]

Sign language is exceptionally demanding on hand capture. The gloves record finger articulation with the precision needed to preserve the distinctions between similar signs.


Because smooth transitions between signs are linguistically meaningful and not just cosmetically desirable, this level of precision is non-negotiable for the data to be linguistically valid.
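A toy example of why precision matters: two hand shapes that differ at a single joint can be different signs. The joint names, angles and the RMS comparison below are illustrative assumptions, not the project's method:

```python
import math

# Two hypothetical hand shapes that differ only in one finger joint's curl.
# Flexion angles are in degrees; joint names are illustrative.
sign_a = {"index_mcp": 10.0, "index_pip": 5.0, "middle_mcp": 85.0}
sign_b = {"index_mcp": 10.0, "index_pip": 45.0, "middle_mcp": 85.0}

def pose_distance(a: dict[str, float], b: dict[str, float]) -> float:
    """Root-mean-square difference in joint flexion between two hand shapes."""
    diffs = [(a[j] - b[j]) ** 2 for j in a]
    return math.sqrt(sum(diffs) / len(diffs))

# A 40-degree difference at one joint is all that separates these two
# shapes - capture that blurs joint angles by tens of degrees would
# make them indistinguishable.
print(round(pose_distance(sign_a, sign_b), 2))  # 23.09
```

If the sensor noise floor approaches the pose distance between two valid signs, no downstream model can recover the distinction, which is why glove-level precision is a data-validity requirement rather than a polish step.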




[Image: Arm wearing a Manus Quantum Metaglove with black and gray details, labeled "QUANTUM", on an aqua background with pixelated green and blue squares.]

Manus Quantum Metagloves are a professional grade hand tracking solution built for applications where precision is non-negotiable. 


At the core of the gloves is Manus's proprietary Quantum tracking technology, which uses millimetre-accurate, drift-free fingertip sensors to capture absolute position and three-axis rotation across all fingers at 120Hz with a signal latency of under 7.5ms. 


Quantum Metagloves maintain consistent fidelity regardless of hand position or motion speed, connect wirelessly via Bluetooth with a range of up to 15 metres, and feature swappable batteries for uninterrupted capture sessions. 


Data streams natively into Manus Core and integrates in real time with Unreal Engine, Unity, MotionBuilder, OptiTrack and other major pipelines, with FBX and CSV export also supported. 
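As a minimal sketch of what consuming a CSV export might look like downstream, here is a parser over a hypothetical schema (the column names and values are invented for illustration; the actual export format is defined by Manus Core):

```python
import csv
import io

# Hypothetical CSV export: one row per frame, one column per joint angle.
# Real export columns will differ - this only shows the parsing pattern.
raw = io.StringIO(
    "timestamp,thumb_cmc_flex,index_mcp_flex,index_pip_flex\n"
    "0.000,12.5,8.1,4.0\n"
    "0.008,12.7,9.3,4.2\n"
)

frames = [{k: float(v) for k, v in row.items()} for row in csv.DictReader(raw)]

# Frame-to-frame deltas carry the transition information that the
# article notes is itself linguistically meaningful.
delta = frames[1]["index_mcp_flex"] - frames[0]["index_mcp_flex"]
print(len(frames), round(delta, 1))  # 2 1.2
```

Keeping per-frame deltas alongside absolute angles is one simple way to make sign-to-sign transitions explicit in the training data.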


For research teams who need anatomically trustworthy hand data, the Quantum Metagloves set the standard.



The Result


Together, these two pipelines generate a multi-modal dataset of BSL performed by expert signers, forming the anatomically meaningful training foundation that future machine learning systems will depend on.


The team has built a web-based proof-of-concept that demonstrates motion replay and control within XR environments. Aligning Manus glove data with anatomical constraints is key to reproducing natural hand motion.
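One common way to enforce anatomical constraints on tracked hand data is to clamp each joint angle to a plausible range, so sensor noise cannot produce impossible poses. The sketch below is a generic illustration of that idea, not the project's implementation, and the joint limits are invented placeholders:

```python
# Hypothetical per-joint flexion limits in degrees; real anatomical
# ranges vary by joint and by person - these values are illustrative.
JOINT_LIMITS = {
    "index_mcp": (-20.0, 90.0),  # metacarpophalangeal flexion
    "index_pip": (0.0, 110.0),   # proximal interphalangeal flexion
}

def constrain(pose: dict[str, float]) -> dict[str, float]:
    """Clamp each tracked joint angle to its anatomical range so noisy
    readings cannot yield impossible hand configurations."""
    out = {}
    for joint, angle in pose.items():
        lo, hi = JOINT_LIMITS[joint]
        out[joint] = min(max(angle, lo), hi)
    return out

noisy = {"index_mcp": 104.0, "index_pip": -3.5}  # beyond plausible range
print(constrain(noisy))  # {'index_mcp': 90.0, 'index_pip': 0.0}
```

In practice a pipeline would apply such limits per signer after calibration, but even this simple clamp shows how anatomical priors keep replayed motion looking physically possible.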


For fluent BSL users, the quality bar is high: unnatural or ambiguous hand motion is immediately identifiable and breaks comprehension. Accurate hand articulation is a threshold requirement for the technology to be usable.


Speaking about the long-term potential of the work, Principal Investigator Prof Dr Fridolin Wild notes that AI-enabled BSL translation could deliver meaningful workforce inclusion benefits, allowing deaf and hard-of-hearing individuals to participate more easily in a broader job market, while also lowering access barriers in lower-stakes contexts and strengthening deaf community independence.




Manus gloves are used across industries, from robotics to animation, surgical simulation and space research. Get in touch with us to schedule a demo or to discuss how we can find a mocap glove solution for you. Email sales@target3d.co.uk or call us on (+44) 0203 488 2575. 


