Brain-Computer Interfacing: Prospects and Technical Aspects of Functional Cranial Implants


Background

We often perceive human thoughts as powerful forces. However, one can question whether we, as humans, can manipulate the physical world around us just by our thoughts or will.

Myths, legends, and science fiction are full of stories in which thought or will is wielded as a power capable of moving objects. Throughout the 20th century, science fiction was replete with references to brain-controlled devices and machines with artificial intelligence. Most accounts describe some form of physical connection allowing thoughts or will to influence an external object directly. As such, it is possible that brain-computer interfaces (BCIs) originate from this type of thinking, as the logical extension of an electrical human organ (the brain) controlling an external electrical device.

As we move forward into the new millennium, neuroscience, which until recently had no method to replace an aged or damaged brain (in contrast to the successes seen with transplantation of heart pumps and other organs), is now participating in new gene therapy trials aimed at eliminating certain neurodegenerative diseases. Consequently, in due time, the field of neurosurgery will create methods to alter and/or augment the aging or injured brain using external devices, and this is where the field of BCIs and implantable neurotechnology takes shape.

Introduction

Generally speaking, a BCI, or brain-machine interface, is a thought-translation device enabling direct communication between the human brain and an external machine. These machines are designed to harness the brain's intrinsic electrical activity and couple it to the information-processing power of a computer for the purpose of performing various tasks. Over the years, numerous terms, such as neuroprosthetic device, neural interface, and mind-machine interface, have been used to describe a "man-made nervous system interface" working in collaboration with an external or internal device.

In its simplest form, the BCI replaces or enhances the input and/or output of the brain to overcome neuronal dysfunction. Signal input via electroencephalography (EEG), electrocorticography (ECoG), or intracortical recording is digitized and processed by the computer via feature extraction and algorithmic translation, thereby providing a long-awaited solution for various neurological disorders (Fig. 111.1). The fundamental application and overarching objective is to bypass a permanent deficit caused by neurological disease and/or to augment impaired function for improved performance: to restore sight, hearing, movement, communication, and/or cognitive function. However, brain restoration is accompanied by more recent controversy over applications for human enhancement, as opposed to the more commonly accepted applications to handicaps such as deafness, blindness, mutism, and paralysis, or the replacement of diseased nonbrain organs (i.e., heart, kidney, liver). In many ways, the function of BCIs is to decode and alter the activity of neural circuitry, best described as "neuromodulation"; they are thus capable of augmenting, inhibiting, modifying, or regulating the electrical and/or chemical neural interface to achieve therapeutic effect(s).

FIGURE 111.1, Summary diagram of a brain-machine interface and its core components. A typical brain-computer interface (BCI) system. The patient performs motor imagery that generates a specific pattern of brain activity to serve as a signal for the BCI. Signal acquisition occurs via electroencephalography (EEG), electrocorticography (ECoG), or single-/multi-unit recordings. Signal input from the different recording methods is digitized and processed via feature extraction and algorithmic translation by the computer. The algorithm then outputs device commands that can be used to operate a robotic arm, environmental controls, communication devices, wheelchairs, and other assistive devices for neurorehabilitation.
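The acquisition-to-command pipeline summarized above can be illustrated with a minimal sketch. This is a hypothetical toy example, not any specific clinical system: it extracts a single band-power feature (the mu rhythm, 8–13 Hz, whose suppression accompanies motor imagery) from a synthetic EEG epoch and maps it to a device command. The function names, threshold, and command labels are all assumptions chosen for illustration.

```python
import numpy as np

def band_power(signal, fs, band):
    """Average spectral power of `signal` (sampled at `fs` Hz) within `band` (Hz)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return psd[mask].mean()

def translate(features, threshold):
    """Toy 'algorithmic translation': mu-rhythm suppression is read as intent to move."""
    return "MOVE" if features["mu"] < threshold else "REST"

# Synthetic 1-second EEG epoch at 250 Hz: a strong 10 Hz (mu-band) rhythm plus noise,
# i.e., a resting cortex with no motor imagery.
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(fs)

features = {"mu": band_power(epoch, fs, (8, 13))}   # feature extraction
command = translate(features, threshold=1.0)        # algorithmic translation
```

Because the epoch contains an intact mu rhythm, the sketch emits "REST"; an epoch with the 10 Hz component attenuated (mu desynchronization during imagined movement) would fall below the threshold and emit "MOVE".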

Common types of BCIs and neuromodulation devices serving to treat central nervous system diseases include time-tested devices like deep brain stimulators for Parkinson's disease, responsive neurostimulation for drug-resistant epilepsy (DRE), transcranial magnetic stimulation for depression, facilitated neurorehabilitation for stroke victims, magnetic resonance (MR)-guided focused ultrasound for multiple entities, and neuroprosthetics using EEG or ECoG to bypass spinal cord injury, all in an effort to control extraanatomical objects (e.g., prosthetic limbs, cursors on a computer screen, network-connected devices such as lights or televisions). More recently, neurosurgeons and neuroscientists have learned firsthand that the brain's electrical activities are capable of conscious pattern changes that can be actively conditioned and reproduced with great benefit. One example is the production of alpha/theta waves during mindfulness meditation and deep relaxation techniques.

Neuroprosthetics, for the purpose of neuromodulation, is a rapidly expanding field. The emerging technologies in current development will forever affect the practice of neurosurgery as we know it today. In fact, this field spans all inherent body functions controlled by the brain and therefore, by default, embraces many other clinical specialties, such as the burgeoning field of neuroplastic surgery, which combines the reconstructive principles of neurosurgery and the aesthetic/restorative principles of craniofacial plastic surgery. Thus, this chapter will concentrate mainly on the art and science of functional neuroplastic surgery, which includes the design and implantation of various "smart" cranial implants, several forms of implantable neurotechnology modification, and numerous diagnostic/therapeutic tools in development for the modern era of neuro-biohacking. Furthermore, this brief introductory chapter is meant to be a high-level overview of BCIs and does not attempt to explain the fine details of the complex biophysical properties governing communication between individual neurons or the transduction of environmental/sensory signals into electrical activity.

Electricity was first demonstrated to be the basis of neural activity through an unusual series of animal experiments conducted by Luigi Galvani in the late 18th century ( Fig. 111.2 ). While dissecting a frog nerve, Galvani discovered that electrical neural stimulation caused the intact muscles to contract violently in a reproducible manner ( Fig. 111.3 ). However, more than 100 years would pass before the first electroencephalogram (EEG) recordings of the brain were published by Hans Berger. The utility of the EEG was quickly realized, and epileptiform/interictal abnormalities were soon described. Although the term BCI had not yet been formalized, one of its earliest descriptions was within a musical piece entitled Music for Solo Performer (1965), by the American composer Alvin Lucier. This visionary piece makes use of an EEG machine and an analog signal-processing device to stimulate acoustic percussion instruments ( Fig. 111.4 ). From here, the rapid development of various tools and techniques allowed researchers unprecedented insight into the workings of the human brain and nervous system, with the newfound ability to control and assess neural activity at subcellular resolution, as well as to explore healthy and diseased specimens of the intact whole brain at high-end molecular resolution.

FIGURE 111.2, Portrait of Luigi Galvani (1737–1798), the Italian physicist credited with pioneering bioelectricity.

FIGURE 111.3, A diagram of Galvani’s experiment on frog legs in the late 1780s.

FIGURE 111.4, Image of the American composer Alvin Lucier recording the music piece “Illuminated by the Moon.” Of note, there is great resemblance to a preliminary brain-machine interface.

Of note, the classical understanding of a neuroprosthetic device is one that records brain signals from the user, computationally analyzes those signals, and then transforms the user's intentions into an external effector output. These devices may be complementary to an output device, such as a prosthetic robotic arm for movement assistance, or may be independent in application, like a cochlear implant for hearing augmentation. Regardless of function, this proliferation of neuroscience has generated extraordinary interest across many disciplines, including clinical, governmental, and business organizations, such as the Wyss Center (Geneva, Switzerland), Longeviti Neuro Solutions (Hunt Valley, Maryland), and the Department of Defense (DoD)/DARPA. Without these specific entities leading the way, a major gap would exist within industry between cranial implant companies (i.e., supplies for the surgical carpenter) and implantable neurotechnology companies (i.e., supplies for the surgical electrician). The Wall Street Journal eloquently termed this "engineering arrogance": the unfortunate neglect of optimal delivery vehicles for BCI implantation even while developing state-of-the-art neurotechnology. These pioneering organizations and companies now offer resources and expertise to accelerate translation from bench to patient, ranging from neuroscience research to novel clinical solutions, all in an effort to assist patients suffering from various neurological disorders.

Overall, BCIs consist of three individual components: an environmental stimulation or sensing component; a signal-conducting component (such as a wire); and a brain interface component or stimulation electrode, which delivers and/or receives signal to and from neural tissue. More specifically, the process of engineering such BCI systems as assistive devices for functionally impaired patients relies on the fact that, despite their loss of motor function following spinal cord injury or brainstem infarct, paralyzed or "locked-in" individuals often retain the ability to generate characteristic neuronal activity patterns by imagining different movements. That is, their ability to perform motor imagery is remarkably preserved even in the setting of paraplegia or tetraplegia.

Overall, EEG is the most extensively studied noninvasive BCI interface. In the 1980s, Farwell and Donchin developed a scalp EEG-based BCI allowing normal subjects to communicate words to a computer and thereby "speak" through a computer-driven speech synthesizer ( Fig. 111.5 ). Soon thereafter, Niels Birbaumer and his team used EEG recordings of slow cortical potentials to give paralyzed patients (those with locked-in syndrome) limited control over a computer cursor to write responses ( Fig. 111.6 ). From there, Peckham and colleagues used a 64-electrode EEG skullcap to return limited hand movements to a tetraplegic patient ( Fig. 111.7 ). In turn, such systems allow a computer (an integral component of the BCI) to capture the unique cortical signatures associated with different imagined motions and immediately transform them into commands performed by an assistive device, such as a robotic arm or mechanical exoskeleton.

FIGURE 111.5, Images of Dr. Farwell’s brain fingerprinting test being conducted in the 1980s.

FIGURE 111.6, Published photograph of model wearing a brain-computer interface device developed by Dr. Birbaumer’s team.

FIGURE 111.7, Photograph of brain-computer interface and skullcap device being worn to assist a patient with tetraplegia, originally developed by Peckham and colleagues.
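The evoked-response approach behind these early scalp EEG systems rests on time-locked averaging: a single-trial response to a stimulus is buried in noise, but averaging many epochs aligned to stimulus onset cancels the noise and reveals the event-related potential. The sketch below is a hypothetical simulation (the template shape, sampling rate, and trial count are illustrative assumptions, not parameters from any cited study):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100                       # samples per second
t = np.arange(fs) / fs         # 1-second post-stimulus window

# Hypothetical P300-like template: a positive deflection peaking near 300 ms.
template = np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

def record_epoch(target):
    """Simulate one post-stimulus EEG epoch; only target stimuli evoke the response."""
    noise = rng.standard_normal(fs)        # single-trial noise dwarfs the response
    return template + noise if target else noise

# Averaging many time-locked epochs suppresses noise by ~1/sqrt(n_trials).
n_trials = 200
target_avg = np.mean([record_epoch(True) for _ in range(n_trials)], axis=0)
nontarget_avg = np.mean([record_epoch(False) for _ in range(n_trials)], axis=0)

peak_time = t[np.argmax(target_avg)]       # lands near 0.3 s for target stimuli
```

In a speller of the Farwell–Donchin type, comparing averaged responses to attended versus unattended stimuli in this way is what lets the system infer which item the user intended.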

To date, BCI systems have been successfully built using scalp EEG, subdural ECoG, and intracortical microelectrodes. Although ECoG and intracortical microelectrodes provide the best signal quality, their invasive nature renders them less than ideal for BCI operation. Therefore, scientists are challenged by the risk-to-benefit ratio, with one end being the signal interference of the scalp and skull (less invasive, extracranial designs above the skull) and the other end being the invasive nature of intracortical lead placement via craniotomy (more invasive, intracranial designs inside the skull). This is why it is imperative to refine and develop functional cranial implants moving forward, as a novel solution employing the skull space itself (neither above nor below the skull). These newer implants, sometimes referred to as "smart" cranial implants, are being developed and implanted to capitalize on the valuable location of one's own skull (following limited craniectomy). This in turn helps circumvent the impedance presented by the scalp and skull (i.e., cranial implants encapsulating BCIs positioned directly over the targeted brain). Fig. 111.8 demonstrates images published in the "first-in-human experience" article reported by our team (2017).

FIGURE 111.8, Images from a "first-in-human experience" article describing complete integration of a neuromodulation (brain-computer interface) device within a customized cranial implant, involving an epilepsy patient suffering from a large, acquired skull defect and a coexisting need for implantable neurotechnology.

One major shortfall of employing EEG, however, is the physical separation between the cortical source signal and the scalp-based recording electrodes. This distance is occupied by the meninges, bone, and scalp, which limits spectral and spatial resolution. For a scalp-based electrode to record a measurable signal from the cortex, electrical potentials must be summated across an area of cortex. These limitations may be mitigated, in part, by intracranial placement of recording electrodes via standard craniotomy, and this is where the flourishing world of neuroplastic surgery and functional cranial implants lives: utilizing one's inherent skull space (via craniectomy) and replacing it with a low-profile skull implant containing a BCI, which in turn bypasses the impedance concerns of the scalp. At the same time, such an implant retains essential protection of the brain (alloplastic biomaterials are as hard as bone) and prevents suboptimal encroachment onto the brain (the absence of intracranial extension prevents undesirable mass effect).
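The spatial summation described above can be caricatured numerically: if a scalp electrode effectively averages many cortical sources while an intracranial electrode sits beside the one source of interest, the task-related signal is diluted in the scalp recording. The sketch below is a deliberately simplified model (uniform mixing, one active source, correlation as a crude quality score) and makes no claim about real volume conduction physics:

```python
import numpy as np

rng = np.random.default_rng(2)
n_sources, n_samples = 16, 500

# Hypothetical cortical sources: only source 0 carries task-related activity.
sources = rng.standard_normal((n_sources, n_samples))
task = np.sin(2 * np.pi * 10 * np.arange(n_samples) / 250)
sources[0] += task

# A scalp electrode sees a blurred average of many sources (meninges, bone, scalp),
# while an intracranial electrode records the source of interest directly.
scalp = sources.mean(axis=0)
intracranial = sources[0]

def quality(signal):
    """Absolute correlation with the task waveform as a crude signal-quality score."""
    return abs(np.corrcoef(signal, task)[0, 1])
```

Under this toy model, `quality(intracranial)` substantially exceeds `quality(scalp)`, mirroring the trade-off the text describes: closer electrodes buy resolution at the cost of invasiveness.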

Although computational limitations initially restricted BCI success, recent advances in microprocessor design and digital signal analysis now exceed neuroprosthetic requirements, such that computational speed is no longer a rate-limiting factor. These technologic advances have provided the necessary tools for BCI device development, allowing innovative applications that introduce motor, sensory, visual, auditory, speech, and other modalities to the field, which in turn has pushed neural engineering to the forefront of neuroprosthetic development.
