Robotic Revolution! Australian Soldiers Can ‘Instruct & Control’ Robotic Dogs Merely By Thinking To Fight Next-Gen Wars

Australian researchers working with the country’s Defense Department have achieved a breakthrough that allows a human to instruct a robot merely by thinking. The technique could help armed forces in future wars communicate with a wide array of sensors, vehicles, and robots even as an adversary attempts to intercept radio communications.

The researchers, from the University of Technology Sydney in New South Wales, published a paper this month in ACS Applied Nano Materials detailing how a test subject directed a ground robot to waypoints merely by visualizing them through a Microsoft HoloLens.

Guiding a robot or computer simply by thinking is made possible by brain-machine interfaces (BMIs), also called neural interfaces. The history of BMIs begins with the German psychiatrist Hans Berger, who discovered the brain’s electrical activity and, in 1924, invented electroencephalography (EEG), a method for recording it.

The Australian researchers note in their paper that BMIs hold vast potential for the future of robotics, bionics, prosthetics, neurogaming, electronics, and autonomous vehicles.

Neurogaming, or brain-controlled video gaming, goes back to 2006, when scientists from Washington University in St. Louis built an interface that enabled a teenager with epilepsy, a brain disorder characterized by repeated seizures, to play ‘Space Invaders.’

The idea behind BMIs is that rather than wiring neural signals into a purpose-specific device, such as a cochlear implant or a prosthesis, it is better to build a general interface between the brain and a computer, which can then perform a variety of tasks.

BMIs generally consist of three modules: an external sensory stimulus, a sensing interface, and a neural signal-processing unit. Of these, the sensing interface plays a significant part: it detects the cortical electrical activity in the cerebrum’s outer layer, which encodes ‘human intent,’ or, technically speaking, brain waves at frequencies of roughly 1–150 Hz.
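As a rough illustration of what a signal-processing unit does with that band, the sketch below band-pass filters raw EEG to the ~1–150 Hz range before any decoding. It is not from the paper; the sampling rate, filter order, and function names are all assumptions.

```python
# Minimal sketch, assuming raw single-channel EEG sampled at 500 Hz.
# Isolates the ~1-150 Hz band said to carry the cortical activity a
# BMI's processing unit decodes. Not the paper's actual pipeline.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 500.0  # sampling rate in Hz (assumed)

def bandpass_eeg(raw, low=1.0, high=150.0):
    """Fourth-order Butterworth band-pass; filtfilt runs the filter
    forward and backward so it adds no phase shift to the signal."""
    nyq = FS / 2.0
    b, a = butter(4, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, raw)

# One second of synthetic EEG stands in for real sensor data.
raw = np.random.randn(int(FS))
filtered = bandpass_eeg(raw)
```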

The sensing interface detects this cortical electrical activity through either implanted or wearable neural sensors, such as EEG electrodes.

The latter are preferred when no severe disabilities are involved. However, wearable sensors rely on conductive gels, which have to be applied to the scalp or hair, and this does not work well for soldiers wearing helmets.

“The use of the gel contributes to skin irritation, risk of infection, hair fouling, allergic reaction, instability upon motion of the individual, and unsuitability for long-term operation due to the gradual drying of the gel,” researchers note in the paper.

File Image: Robotic Dogs

Robotic Dogs Controlled Via Visualization

The Australian researchers developed a graphene-based dry sensor to detect EEG signals from the occipital region of the head, which overlies the visual cortex and is thus crucial for BMIs that rely on visual stimuli. Reports suggest the sensor could work well inside a helmet.

The Australian military is collaborating with the researchers and tested the system before the paper was published.

The researchers coupled their new graphene-based sensor with a Microsoft HoloLens, through which the test subject looked around while the visual cortex in his occipital lobe generated the corresponding EEG signals.

These signals were collected by the sensor and processed by a small Raspberry Pi 4B computer, which decoded them into commands specifying a particular waypoint at a known position.

These commands were then fed to a Q-UGV (quadrupedal unmanned ground vehicle), a dog-like four-legged robot produced by Ghost Robotics. Upon receiving them, the Q-UGV proceeded to the designated waypoint.
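To make that flow concrete, here is a minimal decode-and-dispatch sketch of the kind of loop such a Raspberry Pi could run. It is purely illustrative: the researchers’ actual decoder is not published here, and the sensor read, classifier, waypoint table, and robot network endpoint are all hypothetical stand-ins.

```python
# Illustrative only: the real UTS/Ghost Robotics interface is not public.
# Reads one EEG window, maps it to a waypoint id, and sends the waypoint
# to the robot as a small JSON command over UDP (assumed transport).
import json
import socket
import numpy as np

WAYPOINTS = {0: (2.0, 0.0), 1: (2.0, 2.0), 2: (0.0, 2.0)}  # id -> (x, y), metres
ROBOT_ADDR = ("192.168.1.50", 9000)  # hypothetical Q-UGV endpoint

def read_eeg_window():
    """Stand-in for pulling one band-passed EEG window from the sensor."""
    return np.random.randn(500)

def classify(window):
    """Stand-in decoder: a real system would run a trained classifier
    here; this just buckets the window's signal energy into an id."""
    energy = float(np.mean(window ** 2))
    return int(energy * 100) % len(WAYPOINTS)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
target = WAYPOINTS[classify(read_eeg_window())]
sock.sendto(json.dumps({"goto": target}).encode(), ROBOT_ADDR)
print(f"Dispatched waypoint {target} to the robot")
```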

A Ghost Robotics Vision 60 Q-UGV supports soldiers of the Australian Army during an autonomous systems demonstration at the Majura Training Area, Canberra. (Twitter)

The Australian Army uploaded a video of the test to YouTube a month ago, in which Lt. Col. Kate Tollenaar and Sgt. Damien Robinson describe the successful experiment.

The Australian Army also conducted a second demonstration, in which a commander instructed the robot and fire-team members to perform a simulated patrol clearance of several buildings at the Majura Range Urban Operations Center. During the demonstration, the soldiers monitored the robot’s video feed via the HoloLens headset.

“This technology enables me to not only control the ghost robot as well as monitor its video feed, but it allows me to be situationally aware of my surroundings as well as my team to be able to control all movements of that battlefield clearance,” said Sgt. Chandan Rana, who led the simulated patrol clearance.

“This is very much an idea of what might be possible in the future,” Lt. Col. Kate Tollenaar says in the video. “We’re really excited to see where the technology might go and to work with our stakeholders.”

Chin-Teng Lin, a professor at the University of Technology Sydney and one of the paper’s authors, told Defense One that until now, brain-computer interfaces (BCIs) have only functioned effectively in laboratory settings, requiring users to wear invasive or cumbersome wet sensors and to remain stationary to minimize signal noise.

In comparison, “our dry sensors are easy to wear with the BCI. They work in real-world environments, and users can move around while using the system,” Lin explained.

The US military has also been working on brain-machine interfaces and has accomplished some remarkable feats.

For example, in 2015, Jan Scheuermann, a paralyzed woman fitted with a brain chip developed by scientists from the US Defense Advanced Research Projects Agency (DARPA) and the University of Pittsburgh’s Human Engineering Research Laboratories, was able to pilot a virtual F-35 Joint Strike Fighter (JSF) using only brain signals.

The scientists had initially approached Scheuermann about using the implant to control a robotic arm. They later connected the same brain interface to a flight simulator instead, letting her use the same neural connections to fly the virtual F-35.