Students at a London school are testing software that lets people control computer games using just their body movements, making gaming more accessible.
Key Points:
- The “motion input” software uses AI to track body parts, eye gaze, gestures, and speech
- It runs on regular computers without any extra equipment
- Students are testing it out by playing popular games like Minecraft and Rocket League
Wouldn’t it be cool if you could play your favourite games just by moving your body or looking around? Keep reading to learn more!
How It Works
The motion input software uses a computer’s webcam and artificial intelligence to detect how different parts of the user’s body are moving. It tracks things like:
- Eye movements
- Arm motions
- Facial expressions
- Speech commands
The software converts these movements into inputs that control games and applications. For example, you might steer a character by moving your arms, jump by raising your eyebrows, or select options with voice commands.
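To get a feel for how that movement-to-input mapping can work, here is a rough Python sketch. It is not the UCL software itself; it simply uses the freely available MediaPipe hand-tracking and pynput keyboard libraries (our own choice of tools) to turn a raised hand seen by the webcam into a spacebar press.

```python
# Rough sketch only: detect a raised hand with a webcam and turn it into a
# "jump" key press. This is NOT the UCL MotionInput code, just an
# illustration of the webcam -> AI tracking -> game input idea,
# using MediaPipe and pynput as stand-in tools.
import cv2
import mediapipe as mp
from pynput.keyboard import Controller, Key

keyboard = Controller()
mp_hands = mp.solutions.hands

cap = cv2.VideoCapture(0)  # the computer's ordinary webcam
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB images; OpenCV captures them in BGR order.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            wrist = results.multi_hand_landmarks[0].landmark[
                mp_hands.HandLandmark.WRIST
            ]
            # Landmark y runs from 0 (top of the frame) to 1 (bottom),
            # so a small value means the hand is raised high.
            if wrist.y < 0.3:
                keyboard.press(Key.space)   # "jump" in many games
                keyboard.release(Key.space)
        if cv2.waitKey(5) & 0xFF == 27:     # press Esc to quit
            break
cap.release()
```

A full system like the one the students are testing tracks many more things (eyes, face, speech) and lets each player choose which movement controls which action, but the basic loop of camera, AI tracking, and simulated key presses is the same idea.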
Making Gaming Accessible
Traditional game controllers can be difficult or impossible to use for some people with disabilities. With motion input, almost anyone can play, regardless of mobility limitations. As one student tester put it:
“It is easy to use and helps me experience every action of the game. This new controller helps me play better than other ways I’ve tried.”
The team at University College London developed the software precisely to remove barriers and make computer access more inclusive.
Potential Beyond Gaming
While the focus so far has been on video games, the researchers believe the motion input technology could have many other valuable applications, such as:
- Allowing factory workers to control machinery hands-free
- Letting doctors view medical images by using gestures
- Helping disabled people navigate computer interfaces more easily
The accessibility features are already freely available, and the team hopes to eventually bring the other capabilities to industries like manufacturing and healthcare.
The Future of Natural Computer Interaction
Motion input lets people use intuitive movements instead of traditional inputs, opening up new possibilities for human-computer interaction. As the software becomes more advanced, we may interact with devices as naturally as we communicate with other people.
Imagine effortlessly controlling phones, laptops, and other gadgets just by waving your hand, looking around, or speaking. Technologies like this could revolutionize how we engage with the digital world. What other unique innovations might be on the horizon?