In the 21st century, the brain is the final frontier

Geoff Ling

As a U.S. military physician, I have always understood the strange interplay of conflict and science. In the 20th Century, our trips beyond Earth and our first steps on the Moon would not have been possible without the design of the V-2 rocket. Our understanding of the universe comes from our ability to escape the atmosphere, which in turn comes from our history of war.

In the 21st Century, conflict has again driven scientific advance, but instead of gazing upwards to the sky and unlocking the mysteries of the firmament, we have instead looked inwards, at the brain.

My time developing neurotechnology has been driven by the early conflicts of the 21st Century. We needed methods to restore and repair our injured warfighters, who had sacrificed on the field of battle. Early results from the closing years of the 20th Century had shown that non-human primates could move cursors across a screen by thought alone, so we looked to expand on those methods and determine whether an advanced prosthetic arm could be controlled in the same way. In under a decade, we pushed the science far beyond what was thought possible in 2004. Today, we have not only demonstrated the ability to control an external robotic limb through signals from a human brain, but we have also restored the sensation of touch to an individual with a severe neurological injury.

We are on the cusp of a truly transformative time for neuroscience, and as I depart DARPA I know that my colleagues there will continue to carry the banner of discovery forward. Reflecting on the past, let me make some predictions about the future.

All modern economies are derived from astronomy. This may sound strange at first, but think back. Without astronomy, we would not have the ability to determine our location on the Earth. Without astronomy, we would not have cartography, or navigation, or trade. We used the stars to make maps, and we used maps to make the modern world.

The modern economy also owes a great deal to our limited understanding of the human mind. Advertising, marketing, and focus groups all rely on psychology, on a mechanistic understanding of how we think and come to decisions. So do negotiation, and ethics, and law.

But what we know about the mind is very different from what we know about the stars. The telescope was a tool that allowed for the continual observation of millions of stars, and as technology improved, so did our field of view. To look inwards, at the mind, we have had to rely on far more limited techniques, more like trying to count stars by listening for them than by observing them directly, and so our maps have been limited.

While working at DARPA, I had the good fortune of working with Karl Deisseroth, a professor at Stanford. Karl developed a method, called CLARITY, that allows us to create remarkable maps of the brain, neuron by neuron, at levels of detail we have never seen before.

We have gone, in the last decade, from indirect to direct observation. We can now count thousands of neurons in real-time as they fire, and then generate maps of millions more. These will be our star charts, and we can borrow techniques from astronomy to navigate the properties of the billions of neurons that make up the human brain.

As we make maps of the mind, and chart ourselves, I believe that we will fundamentally uncover better ways to understand how people decide and think, and how to communicate with each other. CLARITY, and similar technologies, will generate a revolution in how we know ourselves.

A second major shift in our understanding of the brain will come through the dissemination of technology. In 2004, when I began at DARPA, deep brain stimulation had been in use in the United States for less than a decade. Today, over 100,000 people around the world are living with an implanted device that improves their control over severely impairing diseases. Over 300,000 people around the world have a cochlear implant, which allows them to experience a world of sound, of speech, and of music.

There is no reason to constrain ourselves to these first forays into improving ourselves. Deep brain stimulators are relatively crude devices that require hand-tuning by clinicians in order to function properly, and they have limited use in research. Cochlear implants interface with a part of the nervous system that happens to respond especially well to analog waveforms.

Since 2004, deep brain stimulators have improved only minimally. In 2004, one of the most advanced cellular phones you could buy was the BlackBerry 7210, the first model to include a color display. That same year, Steve Jobs gathered a small team inside Apple to start "Project Purple." Project Purple became the iPhone.


Diagram Source: Our Mobile Planet

In neurotechnology, we are waiting for our iPhones. Once we have them, I believe the world will change.

Last, it isn't just about our brains. Going directly to an understanding of ourselves through our brains is an obvious starting point. Ever since Phineas Gage's unfortunate encounter with a tamping iron, we have known that changes to our brains result in changes to our minds and to ourselves. What we have learned, and emphasized more recently, is that changes to ourselves can result in changes to our brains. The link between malnutrition and depression is well established, but the link between the species of bacteria in our gut microbiomes and our brains is revelatory research.

We are moving outwards from the conduits of information in the central nervous system to the periphery, at a much more granular scale. New efforts are underway, as massive collaborations between industry, researchers, and federal stakeholders, that seek to understand how modulation of the peripheral nervous system can affect organ function, not just by zapping a nerve with an electrical charge, but by using chemical signaling pathways and even ultrasound to deliver guided therapies.


Diagram Source: DARPA.Mil

These efforts to characterize the information highways of the body will lead to new therapies, not simply through devices but through new drug targets. Ideally, they will also deepen our understanding of the integration of mind and body, and of how traditional and alternative medical practices like yoga or meditation can produce profound outcomes in cellular diseases like cancer.

The 20th Century was the century of the sky and the satellite, of revolutions in microelectronics and computer technology, in control theory and "big data." We will borrow, and steal, those lessons as we move into the 21st Century, the century of biology.

Geoff Ling is Director of the Biological Technologies Office at the Defense Advanced Research Projects Agency (DARPA), USA. He is participating in the Summit on the Global Agenda 2015.

Disclaimer: Views expressed by writers in this section are their own and do not reflect Al Arabiya English's point-of-view.