Most of what we call “being smart” happens above the neck. The person who processes data fastest, articulates arguments most cleanly, solves abstract problems in the fewest steps. The conventional wisdom is clear: intelligence is cognitive, verbal, measurable. IQ tests, SAT scores, performance reviews. But some of the sharpest processing happening in any room is occurring in the bodies of people who would never score well on any of those instruments.

The standard counter-argument, of course, is that we already account for this. Howard Gardner’s theory of multiple intelligences, first proposed in the 1980s, gave us the language of “bodily-kinesthetic intelligence” and made space for dancers and surgeons alongside mathematicians. That framework has been widely discussed and expanded. But Gardner’s model still frames body intelligence as a talent category, like being good at sports or having skilled hands. What I’m describing is different. The body as a sensory processing system. Not intelligence expressed through the body, but intelligence conducted by it.

The person who walks into a meeting and can feel something is off before a single agenda item has been raised. The grandmother in rural Maharashtra who knows rain is coming two days before the forecast does. The experienced nurse who looks at a patient and says “something’s wrong” without being able to name what. These people aren’t guessing. They’re reading data streams the rest of us have been trained to ignore.

The body as processor

There’s a term in neuroscience that deserves a much bigger audience: interoception. It refers to the sense by which we perceive internal bodily signals: everything from heartbeat and breath rate to gut feelings, temperature shifts, and muscular tension.

For decades, interoception was treated as plumbing. Background noise the brain handled automatically. But recent work has reframed it as something closer to a sixth sense, one that varies dramatically in sensitivity from person to person. Research covered in Scientific American describes interoception as foundational to mental health, noting that disturbances in mind-body connection may underlie conditions from anxiety to depression to PTSD.

The implication runs deeper than clinical treatment. If internal body signals are a legitimate information channel, then people with heightened interoceptive awareness are essentially running a secondary data processing system. One that operates without language, without conscious deliberation, and often without the person themselves understanding what they know.

That’s worth sitting with.

Photo by Eren Li on Pexels
Why we dismiss what can’t be spoken

Western intellectual culture has a bias so deep it feels like common sense: if you can’t articulate something, you don’t really know it. This shows up everywhere. In workplaces, the person who “has a feeling” about a hire gets overruled by the person with a spreadsheet. In medicine, the patient who says “something feels wrong” gets told their bloodwork is fine. In schools, the child who learns through movement and sensation gets labeled as distracted.

The underlying assumption is that real knowledge must pass through language to count. If you can explain your reasoning, you’re intelligent. If you can’t, you’re just lucky, or emotional, or making things up.

But this confuses articulation with processing. A person can process enormous amounts of environmental information (subtle shifts in someone’s vocal pitch, micro-expressions, barometric pressure changes, the way a group’s energy shifts when someone enters a room) without ever being able to translate that processing into words. The knowledge is real. The mechanism is real. The output, that gut sense that something is wrong, is often more accurate than deliberate analysis.

I’ve come to think that being good at understanding other people’s behavior doesn’t necessarily mean understanding your own. Certainly not the mechanisms. I can watch someone’s posture shift in a conversation and know exactly what it means. Ask me how I know, and I’ll give you a post-hoc rationalization that sounds convincing but probably isn’t the real process. The real process happened somewhere below the floor of consciousness.

The data streams we’re filtering out

Consider what the body actually registers in any given moment. Temperature gradients across the skin. Micro-vibrations through the floor. The acoustic signature of a room, how sound bounces differently when a space is tense versus relaxed. Olfactory signals so subtle they never reach conscious smell but still trigger limbic responses. The pace and rhythm of other people’s breathing.

Individually, none of these are remarkable. Collectively, they constitute an astonishing amount of information. And the body is processing all of it, all the time, without you asking it to.

The question is what happens to that processing. In most modern environments, we’ve learned to suppress it. We sit in climate-controlled offices under fluorescent lights, staring at screens, receiving nearly all our information through two channels: text and speech. We’ve essentially built a civilization that communicates exclusively through the body’s narrowest bandwidth.

My sister, who’s a nurse, once described something that stuck with me. She said experienced nurses develop a sense for when a patient is about to deteriorate. Not based on monitors or charts, but on something they can feel in the room. She called it “the look.” When I pushed her on what “the look” actually consisted of, she couldn’t break it down. Skin pallor, breathing pattern, something about the eyes. But the point was that her body was synthesizing dozens of micro-signals into a single, confident assessment faster than any checklist could.

That’s not mysticism. That’s pattern recognition running on biological hardware.

Embodied cognition and the AI problem

The robotics and AI community has stumbled onto this from the opposite direction. There’s a growing recognition that artificial intelligence may require a body to achieve anything resembling human-level understanding. As New Atlas has explored, researchers are grappling with whether disembodied AI can ever develop the kind of contextual, situational intelligence that humans display effortlessly.

Embodied intelligence is the idea that cognition isn’t just something that happens in the brain (or the processor). It emerges from the interaction between a physical body and an environment. You don’t just think about a problem. You feel the weight of it, literally, in your posture. You sense resistance. You navigate space.

Rapid investment in humanoid robots and open-source embodied intelligence platforms reflects this understanding. The race isn’t just to build smarter algorithms. It’s to build systems that learn the way bodies learn: through contact, feedback, and physical consequence.

Which raises an uncomfortable question. If the AI community is finally recognizing that intelligence requires a body, why do we still treat body-based intelligence in humans as inferior to the verbal, abstract kind?

Photo by Polina Tankilevitch on Pexels
Who has this intelligence, and why

Not everyone’s body processes at the same resolution. Some of this is innate variation. But much of it is shaped by experience, and particularly by environments where paying attention to non-verbal signals was a survival skill.

Children who grew up in unpredictable households often develop extraordinary sensitivity to atmosphere. They learned to read a parent’s mood from the sound of the car door closing. The weight of footsteps on the stairs. The specific quality of silence that meant anger rather than peace. This hypervigilance, usually discussed as a trauma response, is also a form of data processing that the body learned to perform at an elite level under pressure.

In my recent piece on people who always arrive early, I explored how childhood environments wire behavioral patterns that persist long after the original context has disappeared. The same logic applies here. The body learned to process environmental threat cues at high speed and high fidelity. The fact that the original environment was harmful doesn’t make the processing capability any less real or sophisticated.

Similarly, people who’ve spent decades in physical trades (farming, fishing, building) develop a perceptual acuity that office workers simply don’t possess. A farmer who can smell the difference between soil that’s ready and soil that needs another week isn’t performing magic. She’s running a chemical analysis through biological sensors that have been extraordinarily well-calibrated by years of feedback loops.

In my earlier writing on the resourcefulness of people who grew up lower-middle class in the ’60s and ’70s, one of the threads that emerged was how physical interaction with the material world (fixing things, repurposing things, working with your hands) created a kind of intelligence that never gets captured on any résumé. Body intelligence and material intelligence are close cousins.

The cost of ignoring it

There’s a real price to the cultural hierarchy that puts verbal-analytical intelligence at the top and body intelligence at the bottom.

In professional settings, decisions are made by the people who can argue most fluently, not by the people who can sense most accurately. The person in the room who feels that something is off about a deal, a candidate, or a strategy, but can’t build a PowerPoint deck to prove it, gets dismissed. I watched this happen for over a decade. The articulators won the meetings. The sensors were right more often.

In healthcare, the gap between what the body knows and what diagnostics confirm can be months or years. The emerging research on interoception suggests that many mental health conditions may stem from a disrupted connection between body signals and conscious awareness. When people can’t feel their own internal states accurately, the downstream effects ripple through mood, decision-making, and relationships.

In education, entire populations of kinesthetic learners get funneled through verbal-analytical systems and emerge convinced they’re not intelligent. The kid who can feel the physics of a skateboard ramp with perfect accuracy but can’t pass a physics exam isn’t failing because he lacks intelligence. He’s failing because we’ve defined intelligence so narrowly that his processing system doesn’t count.

How to recover what you’ve learned to suppress

The good news, if there is any, is that body intelligence isn’t lost. It’s suppressed. And suppression can be reversed.

The starting point is embarrassingly simple. Stop filling every moment with input. I found, after years of trying to optimize every waking minute, that walking without listening to anything (no podcasts, no music, no phone calls) was more valuable than any of the information I’d been cramming in. Not because the walking generated insights. Because it let the body’s processing system run without interference.

The body needs silence the way the mind needs sleep. Not absence of sound, necessarily, but absence of directed attention. Time when you’re not trying to process anything through language. Time when the body’s sensors can report without being overridden.

Physical exercise helps, and not for the reasons most people think. The standard line is that exercise reduces stress and improves mood. True, but incomplete. Exercise recalibrates the interoceptive system. It forces you to feel your heartbeat, your breath, your temperature, your fatigue. It re-establishes the communication channel between body and brain that sedentary, screen-dominated life gradually shuts down.

I find my brain works better when I’m tired from exercise rather than tired from sitting and overthinking. That’s not a fitness cliché. That’s the body’s processing system having been activated, calibrated, and allowed to contribute.

There are more formal approaches. Mindfulness-based interoceptive exposure, practiced in some therapeutic settings, deliberately trains people to notice and tolerate internal body signals. Flotation therapy, where you lie in a sensory-deprivation tank, strips away external stimuli and amplifies awareness of internal states. Both are based on the principle that interoceptive sensitivity can be trained like any other skill.

What this actually means

We’ve built an entire civilization around the assumption that intelligence lives in the head. Our schools test it there. Our workplaces reward it there. Our technologies are designed to augment it there.

But the body is running a parallel intelligence system that is older, faster, and in many contexts more accurate than conscious thought. It processes environmental data, social data, and threat data at speeds that language-based cognition simply cannot match.

The grandmother who knows rain is coming. The nurse who knows a patient is deteriorating. The person who walks into a room and feels wrongness before anyone speaks. These aren’t mystics. They’re people whose bodies are doing what bodies have done for hundreds of thousands of years: processing the world at full bandwidth.

The rest of us narrowed our bandwidth voluntarily. We can widen it again.

Understanding the world and living well in it are different skills, and they run on different systems. The first lives in language. The second, more often than we’d like to admit, lives in the body. And the body has been trying to tell us things this whole time. We just kept asking it to put that in an email.

Feature image by KEHN HERMANO on Pexels