Allen Institute researchers release largest dataset of Neuropixels recordings ever collected

From 300,000 mouse neurons, scientists hope to glean how the brain drives behavior. A newly released, publicly available dataset is the largest of its kind, representing billions of split-second electrical pulses that make up the brain’s language of information. From this massive collection of cellular activity, scientists hope to decode the neural computations that underlie behavior.

The dataset was collected using Neuropixels, ultra-thin silicon probes capable of measuring the activity of hundreds of neurons at once. Before the advent of such technologies, scientists could only eavesdrop on handfuls of single neurons at a time. “It’s like trying to deduce the rules of an unknown sports game by just watching one player,” said Allen Institute neuroscientist Corbett Bennett, Ph.D., who was part of the team that led the creation of the dataset. “People have discovered really valuable, interesting things with that method. But now that we can record from 1,000 neurons in every experiment, we’re seeing so much more of the field, and we can determine a few more pieces of the rules of the game,” Bennett said.

Mice were shown a series of images and trained to lick a spout when an image in the series changed (the “oddball” image). As the animal performed its task, six Neuropixels probes recorded electrical chatter from more than a thousand neurons across several areas of the mouse’s brain.

“What’s amazing is people don’t really think of mice as very smart, but they can do really complex tasks,” said Séverine Durand, Ph.D., a neuroscientist in the Allen Institute’s MindScope Program.

While a mouse’s ability to tell one image from another might seem esoteric, the scientists are hoping to extrapolate broader information from the dataset. The suite of electrical activity represents what they call a “perception-action cycle”: how what is perceived leads to an action. The data encapsulates how the brain processes visual information that comes from the eyes, how the mouse makes sense of what it sees, how it decides to take an action (to lick or not to lick), and how it translates that decision into movement.

“We’re hoping to capture the footprint of perception in this neural activity,” said Shawn Olsen, Ph.D., an investigator in the Allen Institute’s MindScope Program who helped lead the data collection and analysis.
