The human brain is a mysterious hunk of meat. For example, as you read this, you’re “hearing” the words in your brain and sorting out their meaning, but where exactly are these gray matter word lockers? Wonder no longer. This interactive brain map from scientists at the University of California, Berkeley lets you see exactly where words are stored in the furrows of your cerebral cortex.
To create the map, the team used fMRI to scan the brains of seven volunteers as they listened to a two-hour chunk of The Moth Radio Hour, a popular storytelling show. As the volunteers listened, the team could see exactly where and when blood oxygen levels in their brains changed, which let the researchers map certain words onto certain parts of the brain.
The team discovered that it could map 12 different categories of words to different areas of the brain. In general, more parts of the brain are dedicated to processing words related to visual stimuli, violence, time, place, and the human body, which makes sense from both an evolutionary and a grammatical perspective. Fewer parts of the brain light up when you talk about numbers, the indoors, or other people, however. (It’s interesting to wonder whether the semantic brain maps of mathematicians differ from those of mountain climbers, but sadly, Berkeley’s research doesn’t cover that.)
The map lets you explore which words map to which brain folds, just by clicking on them. What’s interesting is that on a furrow-by-furrow basis, these areas don’t necessarily group together—but even so, there are larger continents of the brain that tend to be, say, more visual than violent, or more social than tactile. Depending on how you view the interactive map, the brain either looks like an irregular, candy-colored globe of rigorously defined semantic meaning, or a chaotic, technicolor yarn ball of feelings and impressions.
Although the sample size was small, the researchers behind the brain map think this technique could help other scientists understand what’s going on in the minds of patients with Alzheimer’s. In their published report, they also suggest it could be used to communicate with people who can’t speak or sign, effectively “reading” their minds.