2023 Author: Bryan Walter | [email protected]. Last modified: 2023-05-21 22:24
CTRL-labs has presented the first version of its neural interface for application developers. The device reads the activity of the motor neurons in the hand and converts it into commands for other devices. The controller will be shipped to developers in early 2019, VentureBeat reports.
Although electronic computers appeared more than half a century ago, the dominant mode of human-computer interaction is still the pressing of keys and buttons that perform fixed functions. Over the past decade there has been a shift toward touchscreens, but these mostly display virtual incarnations of the same buttons. Meanwhile, a sizeable community of researchers studies human-computer interaction and develops fundamentally new ways to control technology.
These experimental designs rely on a variety of operating principles. Some devices use wearable joysticks or touchpads, some turn the skin itself into a touch surface, while others track hand gestures or whole-body movements and translate them into commands. Other scientists take an entirely different approach and build devices that read the activity of neurons in the brain directly, allowing a computer to be controlled by thought alone.
Although such technologies are considered promising, brain-computer interfaces are still in their infancy and lack both accuracy and ease of implementation. For several years, engineers at CTRL-labs have been developing a different kind of neural controller, one that infers the user's intentions not through electroencephalography but through electromyography, that is, by reading muscle activity.
Developer version of the controller
The device is a bracelet whose inner side carries 16 blocks, some holding two electrodes and the rest three. A control unit sits on the upper side of the bracelet. It is not yet powerful enough to process the signals on its own; in the developer version it instead acts as a transmitter, streaming data from the electrodes and motion sensors to a computer, where neural-network algorithms process them.
With this many sensors, the algorithms can detect a wide range of standard movements even when they are of extremely low intensity. As an example, the developers showed several demo videos in which the controller reconstructed the posture of the hand in real time and even let an ordinary table be used as a keyboard.
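The pipeline described above, raw multi-channel EMG streamed off the bracelet and decoded on a computer, can be sketched in miniature. The sketch below is illustrative only: the channel count mirrors the 16 electrode blocks, but the simple RMS-plus-threshold detector stands in for the trained neural-network algorithms CTRL-labs actually uses, and every name, value, and threshold here is an assumption, not part of the company's software.

```python
import math

NUM_CHANNELS = 16  # one reading per electrode block (assumption for illustration)

def rms(window):
    """Root-mean-square amplitude of one channel's sample window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def active_channels(samples, threshold=0.05):
    """Return indices of channels whose RMS amplitude exceeds the threshold.

    `samples` is a list of per-channel sample lists. A real system would
    feed these features (or the raw signal) to a trained neural network
    rather than a fixed threshold.
    """
    return [i for i, channel in enumerate(samples) if rms(channel) > threshold]

# Synthetic example: channel 3 carries a weak sinusoidal burst, the rest are near-silent.
quiet = [0.001] * 200
burst = [0.2 * math.sin(0.3 * t) for t in range(200)]
samples = [burst if i == 3 else quiet for i in range(NUM_CHANNELS)]
print(active_channels(samples))  # → [3]
```

Even this toy detector illustrates the key point from the demos: a burst with a peak amplitude well below the quiet-signal scale of an ordinary gesture is still separable from the background, which is what allows low-intensity muscle activity to be decoded.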
Alongside the controller itself, the developer kit includes a software development kit (SDK) for building finished applications. The company believes the controller will be most useful to developers of virtual- and augmented-reality applications and has adapted the SDK accordingly; for example, it supports integration with the user-tracking systems used by some VR environments. The controller will reach developers in Q1 2019. The company has not disclosed the release date or the price of the eventual production version.
It is worth noting that the idea of placing electromyographic sensors on the arm is not new: the startup Thalmic Labs implemented it in its Myo bracelet, though the company later discontinued the product. The Myo was also far less accurate, reading only explicit hand gestures rather than the faint impulses that produce almost no visible hand movement.