NEUROSCIENCE

Mind Readings

Researchers can now predict what a monkey will draw, before it even moves

In a laboratory outside San Diego, a rhesus monkey sits and peers through a periscope-like arrangement of mirrors into a computer-generated virtual reality. If monkeys have a sense of wonder, this one must have been awed the first time it saw an elliptical tube float before its eyes. But now the animal has mastered this game. The monkey reaches for the object. As it does, infrared diodes strapped to its arm convey its movement to the computer, which moves a spherical cursor in the virtual world. As one segment of the tube lights up and spins around the ellipse, the rhesus must follow it, tracing ovals in the air. After months of practicing four hours a day, five days a week, the monkey has this down. Every time it completes five orbits, the creature wins a drink of water.

As the rhesus plays, Andrew B. Schwartz, a senior fellow at the Neurosciences Institute, sits in front of a floor-to-ceiling rack of equipment recording the animal's thoughts, or rather the electrical traces of them. His instruments are connected by a wire far thinner than a human hair to a single brain cell lying just below the surface of the animal's primary motor cortex. No electricity goes into the subject's skull. But the moment the monkey decides to move its arm, this neuron starts firing, sending pulses out to the computer, which registers how rapidly they arrive. From the pattern of signals produced by fewer than 100 brain cells sampled as the rhesus repeats its task, Schwartz has all the data he needs to predict where the monkey's arm is going a good tenth of a second before the animal moves a muscle.

Neuroscientists discovered a decade ago that the rate at which a neuron in the motor cortex fires varies with the direction in which the arm is about to move. By averaging the directions favored by a population of brain cells in the region, researchers found that they could predict with uncanny accuracy which way a monkey was going to move its arm, so long as the movement was a straight line. Working with colleagues at Arizona State University, Schwartz has improved the technique to reproduce the spirals and other complex curves the subject draws in three-dimensional space.
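The averaging scheme described here is known as the population-vector algorithm. A minimal sketch of the idea in Python, assuming idealized cosine tuning; the cell count, baseline, and gain are invented for illustration, not taken from the research:

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells = 100                                   # roughly the number sampled

# Each cell has a "preferred" direction along which it fires fastest.
preferred = rng.normal(size=(n_cells, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def firing_rates(direction, baseline=10.0, gain=8.0):
    """Idealized cosine tuning: a cell's rate peaks when the intended
    movement lines up with its preferred direction."""
    return baseline + gain * (preferred @ direction)

def population_vector(rates, baseline=10.0):
    """Weight each preferred direction by how far the cell's rate is
    above baseline, then sum; the result points along the movement."""
    v = (rates - baseline) @ preferred
    return v / np.linalg.norm(v)

intended = np.array([1.0, 0.0, 0.0])            # arm about to move along x
print(population_vector(firing_rates(intended)))  # close to [1, 0, 0]
```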

The advance is important, explains Gary T. Yamaguchi of Arizona State, "because our long-term goal is to try to figure out how to use these neural signals to move a prosthetic limb in a natural way." Schwartz suggests that within a decade or so it should be possible to fit amputees with thought-controlled robotic arms that move naturally.

"One of the big problems with building devices that replace human function is that they often fail or make the wearer feel conspicuous and then get thrown in the closet," Yamaguchi says. But building a bionic arm involves more than just decoding the path a person intends his or her arm to follow.

"The problem is that there are an essentially infinite number of joint positions and movements you can use to move your hand from point A to point B," Yamaguchi explains. "Fortunately, humans and lower primates almost always make common movements in just one way." At a conference in February, Yamaguchi reported that his group has developed a mathematical model that, given a trajectory, can accurately predict just how a human would move six of the seven major joints in the arm.

Schwartz found that the key to translating motor cortex signals is to associate neuron firing rates with velocity as well as direction. He further improved the accuracy of his decoder by accounting for what he calls "a time-warping phenomenon" in the brain. When humans and other primates draw straight lines, the lag between brain signal and muscle movement is tiny, just a few hundredths of a second. Curves are harder: the tighter the curve, the slower we draw it, and the further our brains have to race ahead of our hands.
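In code terms, associating rates with velocity means reading the un-normalized population vector as a velocity and summing those vectors over time to retrace the drawn figure. A toy sketch under that assumption, with invented tuning parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, dt = 100, 0.01
preferred = rng.normal(size=(n_cells, 2))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

t = np.arange(0.0, 2 * np.pi, dt)
velocity = np.column_stack([-np.sin(t), 2.0 * np.cos(t)])  # hand velocity tracing an ellipse

# Rates scale with speed as well as direction (invented baseline 10, gain 5).
rates = 10.0 + 5.0 * (velocity @ preferred.T)

# Un-normalized population vector approximates velocity: in 2-D the
# expected sum of (v . p) p over random unit vectors p is (n/2) v.
decoded_v = (rates - 10.0) @ preferred * (2.0 / (5.0 * n_cells))

path = np.cumsum(decoded_v * dt, axis=0)         # integrate velocity -> drawn shape
true_path = np.column_stack([np.cos(t) - 1.0, 2.0 * np.sin(t)])
print(np.max(np.abs(path - true_path)))          # small next to the ellipse's size
```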

That process complicates movement prediction, however. "If at one moment the lag from neuron firing to movement is 200 milliseconds and then a moment later the lag drops to near zero, the two signals might cross, making the movements occur out of order," Schwartz says. "In reality, the change from moment to moment is never so radical that signals actually cross, but you can see how predictions get really messed up unless we take time warping into account."
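One way to picture the bookkeeping: each neural sample predicts movement some lag into the future, and if the lag shrinks faster than real time advances, the predicted movement times run backward. The hypothetical sketch below deliberately exaggerates the lag change to force the crossing Schwartz says never quite happens, then keeps the predicted times in order; it is not his actual correction:

```python
import numpy as np

def warped_times(sample_times, lags):
    """Map each neural sample to the movement time it predicts,
    nudging any prediction that would land before its predecessor."""
    out, prev = [], -np.inf
    for t, lag in zip(sample_times, lags):
        pred = max(t + lag, prev + 1e-6)   # keep predicted times monotonic
        out.append(pred)
        prev = pred
    return np.array(out)

times = np.arange(0.0, 0.10, 0.01)           # neural samples every 10 ms
lags = np.linspace(0.20, 0.0, len(times))    # lag falling from 200 ms toward zero
raw = times + lags
print(np.any(np.diff(raw) < 0))                          # True: raw predictions cross
print(np.all(np.diff(warped_times(times, lags)) > 0))    # True: order restored
```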

Although Schwartz's current technique works well on tasks that monkeys have been trained to perform, he has yet to test it when the animals draw patterns they have never seen before. His colleagues have constructed an artificial neural network that reorganizes itself to extract as much information as possible from the cortex signals; they expect it to predict novel patterns more accurately.
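The article does not specify the network, but the idea can be sketched as a small regression network that learns the rate-to-velocity mapping from examples rather than assuming a fixed averaging rule. Everything below (architecture, sizes, synthetic training data) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n_cells, hidden = 100, 32
preferred = rng.normal(size=(n_cells, 2))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

# Synthetic training set: noisy cosine-tuned rates for random velocities.
V = rng.normal(size=(5000, 2))
X = 10.0 + 5.0 * (V @ preferred.T) + rng.normal(scale=1.0, size=(5000, n_cells))
X = (X - X.mean(0)) / X.std(0)               # standardize inputs

# One hidden layer, trained by plain stochastic gradient descent on MSE.
W1 = rng.normal(scale=0.1, size=(n_cells, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, 2));       b2 = np.zeros(2)
lr = 0.01
for _ in range(3000):
    i = rng.integers(0, len(X), 64)
    x, v = X[i], V[i]
    h = np.tanh(x @ W1 + b1)
    err = h @ W2 + b2 - v                    # prediction error
    gW2, gb2 = h.T @ err, err.sum(0)
    gh = (err @ W2.T) * (1 - h**2)           # backpropagate through tanh
    gW1, gb1 = x.T @ gh, gh.sum(0)
    for p, g in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        p -= lr * g / len(i)

# Final fit: mean squared error should fall well below the raw variance (~1.0).
print(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - V) ** 2))
```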

The technology still raises more questions than it answers. Deciphering the brain signals that produce finer motions like grasping may be much trickier than predicting arm movement. And no one knows whether it will work in humans as well as it does in lower primates.

Schwartz intends to find out. This summer, he says, his group will start testing two new devices. A larger probe will sense the firing rates of many cortex cells simultaneously, producing real-time predictions such as those generated by muscle and brain-wave monitors. And an early prototype of a wireless probe will radio its host's intentions to an external processor. If the device is successful, the researchers will try to shrink it and implant it within an animal's skull.

-W. Wayt Gibbs in San Francisco