Scientists have developed a new Avatar-style technology that allows a human operator to remotely steer a turtle's path using only their thoughts. In the 2009 blockbuster "Avatar," a human remotely controls the body of an alien, in effect injecting human intelligence into a remotely located, biological body.
Researchers are developing ‘brain-computer interfaces’ (BCIs) following recent advances in electronics and computing. These technologies can ‘read’ and use human thought to control machines, for example, humanoid robots.
New research has demonstrated the possibility of combining a BCI with a device that transmits information from a computer to a brain, or a ‘computer-to-brain interface’ (CBI). The combination of these devices may be used to establish a functional link between the brains of different species.
Researchers from the Korea Advanced Institute of Science and Technology (KAIST) in South Korea have developed a human-turtle interaction system in which a signal originating from a human brain can affect where a turtle moves.
Unlike previous research that tried to control animal movement through invasive methods, the KAIST researchers proposed a system that guides an animal's path non-invasively, by exploiting its instinctive escape behaviour.
The entire human-turtle setup includes a head-mounted display (HMD) combined with a BCI to immerse the human user in the turtle’s environment. The human operator wears the BCI-HMD system, while the turtle has a ‘cyborg system’ – consisting of a camera, a Wi-Fi transceiver, a computer control module and a battery – all mounted on the turtle’s upper shell.
Also mounted on the turtle's shell is a black semi-cylinder with a slit, which serves as the 'stimulation device' and can be rotated 36 degrees via the BCI. The human operator receives real-time video from the camera on the turtle's shell, and these images allow the operator to decide where the turtle should move.
The wearable BCI records the human's electroencephalography (EEG) signals and decodes them into thought commands, distinguishing between three mental states: left, right and idle.
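As a rough illustration of how a three-state decoder might behave (this is a hypothetical sketch, not the KAIST implementation; the state names and the confidence threshold are assumptions), the final step of such a pipeline often picks the mental state with the highest classifier score, falling back to idle when confidence is low:

```python
from enum import Enum

class MentalState(Enum):
    LEFT = "left"
    RIGHT = "right"
    IDLE = "idle"

def classify_eeg(scores, threshold=0.6):
    """Pick the mental state with the highest classifier score.

    `scores` maps each MentalState to a hypothetical confidence in [0, 1],
    as produced by some upstream EEG feature classifier (not shown).
    Below `threshold` we fall back to IDLE so that noisy, ambiguous
    signals do not steer the turtle.
    """
    state, confidence = max(scores.items(), key=lambda kv: kv[1])
    return state if confidence >= threshold else MentalState.IDLE
```

The idle fallback matters in a BCI loop: a spurious left/right command would needlessly rotate the stimulation device, so ambiguous readings are safest treated as "do nothing".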
The left and right commands rotate the turtle's stimulation device via Wi-Fi so that it obstructs the turtle's view, invoking the animal's natural instinct to move toward open light and thus change direction. The human then receives updated visual feedback from the shell-mounted camera and, in this way, continues to navigate the turtle's route remotely.
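One step of this feedback loop can be sketched as follows (a minimal illustration under assumptions: the command strings and the way the 36-degree rotation is applied are hypothetical, not taken from the published system):

```python
TURN_STEP_DEG = 36  # the article states the stimulation device turns 36 degrees

def control_step(command, device_angle):
    """Apply one decoded thought command to the stimulation device's angle.

    'left' and 'right' rotate the semi-cylinder to obstruct one side of the
    turtle's view; 'idle' leaves the device where it is. In the real system
    this update would be sent to the shell-mounted module over Wi-Fi, and
    the operator would then see the result in the camera feed.
    """
    if command == "left":
        return device_angle - TURN_STEP_DEG
    if command == "right":
        return device_angle + TURN_STEP_DEG
    return device_angle  # 'idle': no stimulation, turtle keeps its heading
```

Repeating this step, with the human watching the camera feed between iterations, closes the loop: decode command, move device, observe, decode again.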