Defining brain–machine interface applications by matching
interface performance with device requirements
Oliver Tonet a,∗ , Martina Marinelli a , Luca Citi a,b , Paolo Maria Rossini c ,
Luca Rossini c,d , Giuseppe Megali a , Paolo Dario a
a CRIM Lab, Scuola Superiore Sant’Anna, Pisa, Italy
b IMT School of Advanced Studies, Lucca, Italy
c Università Campus Bio-Medico, Rome, Italy
d ESA Advanced Concepts Team, Noordwijk, The Netherlands
Received 6 December 2006; received in revised form 6 March 2007; accepted 22 March 2007
Abstract
Interaction with machines is mediated by human–machine interfaces (HMIs). Brain–machine interfaces (BMIs) are a particular class of HMIs
and have so far been studied as a communication means for people who have little or no voluntary control of muscle activity. In this context,
even low-performing interfaces can be considered prosthetic applications. On the other hand, for able-bodied users, a BMI would only be practical
if conceived as an augmenting interface. In this paper, a method is introduced for identifying effective combinations of interfaces and devices for
creating real-world applications. First, devices for domotics, rehabilitation and assistive robotics, and their requirements, in terms of throughput
and latency, are described. Second, HMIs are classified and their performance described, again in terms of throughput and latency. Then device
requirements are matched with the performance of available interfaces. Simple rehabilitation and domotics devices can be easily controlled by means of
BMI technology. Prosthetic hands and wheelchairs are suitable applications but do not attain optimal interactivity. Regarding humanoid robotics,
the head and the trunk can be controlled by means of BMIs, while other parts require too much throughput. Robotic arms, which have been
controlled by means of invasive cortical interfaces in animal studies, could be the next frontier for non-invasive BMIs. Combining smart controllers
with BMIs could improve interactivity and boost BMI applications.
© 2007 Elsevier B.V. All rights reserved.
Keywords: Brain–computer interface; Brain–machine interface; Human–machine interface; Hybrid bionic system; Throughput; Information transfer rate
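The matching between interface performance and device requirements hinges on quantifying interface throughput. In the BCI literature this is commonly measured with Wolpaw's information transfer rate (ITR). The sketch below applies that standard formula; the interface parameters and device requirements are illustrative placeholders, not figures taken from this paper:

```python
import math

def itr_bits_per_trial(n_targets: int, accuracy: float) -> float:
    """Wolpaw's information transfer rate per selection (bits/trial)."""
    if accuracy >= 1.0:
        return math.log2(n_targets)
    if accuracy <= 1.0 / n_targets:  # at or below chance: no information
        return 0.0
    p, n = accuracy, n_targets
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

def throughput_bits_per_s(n_targets: int, accuracy: float,
                          trial_duration_s: float) -> float:
    """Convert per-trial ITR into a throughput in bit/s."""
    return itr_bits_per_trial(n_targets, accuracy) / trial_duration_s

# Hypothetical BMI: 4 selectable targets, 90% accuracy, 4 s per selection
bmi_throughput = throughput_bits_per_s(4, 0.90, 4.0)

# Illustrative device throughput requirements (bit/s) -- placeholder values
devices = {"light switch": 0.1, "wheelchair": 0.5, "robot arm": 5.0}
feasible = [d for d, req in devices.items() if bmi_throughput >= req]
print(f"BMI throughput: {bmi_throughput:.2f} bit/s; feasible: {feasible}")
```

With these placeholder numbers the interface delivers roughly 0.34 bit/s, enough for a simple domotics switch but short of the demands of richer devices, mirroring the paper's matching logic.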
1. Introduction
1.1. Hybrid bionic systems
In everyday life, we increasingly interact with machines,
such as computers, appliances, and even robots. This interaction is
mediated by a human–machine interface (HMI). The ensemble
user-interface-device, comprising both artificial and biological
components, is defined as a hybrid bionic system (HBS).
From a control system viewpoint, Fig. 1a shows the information flow that occurs as we interact with a HMI. Our intention
to interact with the interface for a utilization task, e.g. grasping a
knob, resides in dedicated neural networks within the brain and
is translated into complex motor commands and then dispatched
∗ Corresponding author. Tel.: +39 050883405.
E-mail address: oliver.tonet@sssup.it (O. Tonet).
0165-0270/$ – see front matter © 2007 Elsevier B.V. All rights reserved.
doi:10.1016/j.jneumeth.2007.03.015
from the areas for motor planning and execution toward the target muscles through the cortico-spinal and peripheral nervous
fibres. The results of our action are then gathered by our sensing
system (eyes, touch and proprioceptive receptors, etc.), translated into sensory signals and fed back to the central nervous
system (CNS) through the afferent pathways.
This scenario is over-simplified, but it nonetheless clarifies
the potential of direct brain–machine communication.
A brain–machine interface (BMI), or brain–computer interface
(BCI), can be defined as any system able to monitor brain activity and translate a person’s intentions into commands to a device.
In an ideal BMI, the motor commands, instead of being sent to
the physiological musculo-skeletal effectors, will reach...