Credits

Philip Maughan is a writer and researcher based in London.

Over the last few years, breakthroughs in AI have been almost too numerous to track. Chatbots can now pass the same exams required of doctors and lawyers. A cancer drug designed by AI has entered clinical trials. AI agents are serving as autonomous personal assistants.

There have even been reports that AI can smell.

“Computers Are Learning to Smell,” declared The Atlantic. “AI is digitizing our sense of smell,” according to the World Economic Forum. “AI tastebuds are better at identifying what’s in food than you,” claimed TechRadar, while a spellbound BBC Future reported that “An AI started ‘tasting’ colours and shapes.” 

The truth, however, is that these headlines grossly embellish AI’s abilities. If you read the BBC Future article closely, for example, you’ll learn that a large language model (LLM) repeated the associations humans make between tastes, colors and shapes — sweet things are pink and round; sour things are yellow — observations that were captured in its training data. The reality is that very little progress has been made toward giving AI a sense of smell because pretty much nobody working in AI cares. 

Between 2015 and 2025, the number of research papers on artificial olfaction remained stagnant, while papers on machine vision, language processing and computer audio rose by orders of magnitude. Big AI conferences like NeurIPS, ICLR and ICML have shown little interest in integrating olfaction into the next generation of models. Most leaders in the field appear convinced that achieving human-level AI is a question of improving skills like abstract reasoning, planning, language use and problem-solving — things that typically do not depend on sensory capabilities like smell. Even in the design of humanoid and canine robots, as well as other embodied AI systems, it’s rarely considered.

But olfaction is not an optional add-on for developing human-level artificial intelligence, according to a growing body of research. In fact, it could very well be fundamental and irreplaceable. 

Humans with a keen sense of smell can detect a single odor molecule at a concentration of about 0.01 parts per billion — or one in 100 billion air molecules. Geneticists at Columbia University’s Zuckerman Institute believe we can discern up to a trillion scents thanks to how the 300 to 400 different receptor types in our noses combine to create an impression of chemical reality sent directly to the deepest, most ancient regions of the brain. Our sense of smell can help us decide what to eat, alert us to dangerous places and people, recognize family and choose our romantic partners, respond to both bodily and emotional sickness and so much more. 
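The power of that combinatorial code can be sketched with a little arithmetic. This is a toy illustration only, not a model of real receptor biology (actual receptors respond in graded, overlapping ways), but it shows why a few hundred receptor types can, in principle, distinguish far more than a trillion scents:

```python
from math import comb

RECEPTOR_TYPES = 400  # upper estimate of receptor types in the human nose

# If an odor activates some subset of receptor types, the number of
# distinct activation patterns grows explosively with subset size.
for k in (1, 2, 3):
    print(f"patterns using {k} receptor types: {comb(RECEPTOR_TYPES, k):,}")

# Even crude binary on/off coding across all 400 types allows 2**400
# patterns, vastly more than the ~1 trillion scents researchers estimate.
print(2 ** RECEPTOR_TYPES > 10 ** 12)
```

With subsets of just three receptor types there are already over 10 million patterns; allowing every receptor to be on or off gives a space that dwarfs any catalog of smells we could ever name.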

Smell is a vital though poorly understood component of human intelligence. So why is it that in our quest to advance artificial intelligence, it isn’t more of a priority?

Smell And Cognition

There’s a foundational challenge with achieving artificial olfaction: Smell remains shrouded in mystery. For one thing, we don’t know for sure how smell receptors work. One theory is that odorants fit into the receptors in our noses like keys into locks. Another is that our receptors detect the vibrational frequencies made by odor molecules. This is just one enigma that persists in the world of smell because, throughout much of intellectual history, nobody saw much point in trying to figure it out. 

The sense of smell, Charles Darwin concluded in 1874, was “of extremely slight service.” His remarks echoed the chemist and perfumer G. W. Septimus Piesse, who wrote with disappointment in 1855 that of the five senses, “smelling is the least valued.” Earlier still, in 1798, the philosopher Immanuel Kant suggested that smell was “the most dispensable” of our senses. “It does not pay to cultivate it or to refine it,” he argued, “for there are more disgusting objects than pleasant ones.” 

I have some sympathy with Kant here. I often have similar thoughts while pacing the streets of my neighborhood in London, which are commonly scented by jettisoned fast food and festering drains. Today, though, I’m sitting by a tall window in Bloomsbury, and I can’t smell a thing. Looking down through the wilting canopy, I see crowds of tourists queueing for the British Museum on this radiant summer afternoon. Thankfully, they are not looking up. If they were, they’d see me gazing down at them with a blue medical peg on my nose while leaning over a coffee table covered in jellybeans. 

“Humans with a keen sense of smell can detect a single odor molecule at a concentration of about 0.01 parts per billion — or one in 100 billion air molecules.”

I’m at the Institute of Philosophy at the University of London’s School of Advanced Study to meet Barry Smith, a professor of philosophy and the director of the institute. Smith is also the founding director of the Centre for the Study of the Senses, which unites philosophers, psychologists, neuroscientists and, recently, some of the more curious individuals at Google DeepMind, to rethink the senses from first principles — examining how they influence our emotions, perception of the environment and self-awareness. 

“The accountants are always attacking me,” the upbeat Glaswegian Smith jokes as I chew a yellow jellybean. It tastes mildly sugary but has no flavor I can pick up. “Another £20 for sweets on your expenses?” he says they often ask him. “I have to tell them, ‘I’m sorry, but this is equipment. Really crucial equipment.’”

The most striking thing about Smith’s office — other than the piles of empty candy wrappers on his desk — is the conspicuous mountain of wine boxes stacked up against the wall. Aside from being one of the few philosophers interested in how AI and machine learning might advance our understanding of perception, experience and mind, Smith is also a keen oenophile who moonlights as a critic. 

“Here I was, a philosopher of language writing about wine as a sideline, when I suddenly thought, ‘How does taste actually work?’” So, he says, he talked to his colleagues in biology and neurobiology, who explained that taste, smell and touch do not exist independently of one another.

When we describe how things taste, we are mostly describing how they smell. Each time we eat or drink, organic molecules are pulsed into the nasopharynx, the uppermost part of our throat, as we swallow. From there, signals received by the olfactory bulb — the brain’s first processing center for smell — are sent directly to the hippocampus and amygdala, closely related structures in the brain responsible for memory and navigation, processing emotions and alertness.

Whether you are smelling fresh cinnamon buns or a dead rat under the floorboards, parts of the physical world have detached from their source and connected with nerve endings in your nose that are really extensions of your brain. 

The receptors on our tongue pick up only salts, sugars, acids, bitter compounds, glutamates and iron; our ability to perceive anything more complex is thanks to our sense of smell. Learning how much of what we call “taste” or “flavor” is truly olfactory is the rationale behind the jellybean test. 

When I remove the nose clip, the yellow jellybean in my mouth suddenly tastes like pineapple, and it’s as though a certain color and depth has returned to the scene. This fullness, a sensory layer that is present whenever we are conscious, is both objective and subjective. As I continue chewing, I’m reminded of a childhood memory of hiding similar sweets from an armada of perfidious cousins. 

Unlike the other senses, which fade gradually and inexorably over time, we can maintain our sense of smell by training it. Smith tells me about a 2017 study by the German doctor and scientist Thomas Hummel, who split a cohort of people in their 70s into two groups; one trained their sense of smell twice daily by sniffing rose, lemon, clove and eucalyptus essential oils for 10 seconds each, while the other played sudoku. After three months, the smell group showed improved cognition and word recognition and performed better at memory tasks, while the other group had gotten better only at playing sudoku.

Another study, conducted by a team at the University of California, Irvine in 2023, recorded a staggering 226% improvement in the memory performance of older adults after just two hours of exposure to different scents each night as they slept. Other research has found that humans unconsciously time cognitive tasks with nasal inhalation. By matching measurements of nasal airflow to brain activity, scientists observed how inhaling through the nose improved visual and spatial problem-solving even when there was no smell information to take in.

Smell has also been found to impact how we relate to and interact with one another. Noam Sobel and his lab at the Weizmann Institute have determined that social groups tend to be made up of people who smell similarly. They also found that, much like other animals, we evaluate each other’s smells when we meet, sniffing chemical traces on our hands following a handshake — something we do without being aware of it. 

“Whether you are smelling fresh cinnamon buns or a dead rat under the floorboards, parts of the physical world have detached from their source and connected with nerve endings in your nose.”

Without a sense of smell, humans find it difficult to interpret, navigate and reason about the world. Their relationships suffer and elevated anxiety levels are common. As I leave Smith’s office, with a souvenir nose peg stuffed in my pocket, I imagine navigating the street outside without knowing what trees, restaurants, cars or people smell like — much as a self-driving car is engineered to do. Kant was right that smells have the power to attract or repel, but he did not realize how much they can teach us about our environments and one another.

Artificial Olfaction And World Models

Spend enough time listening to AI researchers debate progress in their field and someone will likely bring up Moravec’s Paradox. It’s the idea, articulated by Canadian futurist Hans Moravec in the 1980s, that skills such as speech and high-level reasoning require less computation than simpler-seeming capabilities like movement and sensory perception. 

“We are all prodigious olympians in perceptual and motor areas, so good that we make the difficult look easy. Abstract thought, though, is a new trick,” Moravec wrote. “We have not yet mastered it. It is not all that intrinsically difficult; it just seems so when we do it.”

Meta’s former chief AI scientist Yann LeCun often cites Moravec when arguing that truly intelligent, multimodal, adaptive AI will not emerge from the LLMs “hypnotizing” Silicon Valley today. “We have these language systems that can pass the bar exam, can solve equations, compute integrals, but where is our domestic robot?” he asked in an interview with Newsweek in 2025. “Where is a robot that’s as good as a cat in the physical world? We don’t think the tasks that a cat can accomplish are smart, but in fact, they are.”

A growing number of AI researchers and entrepreneurs believe that LLMs have already plateaued. LeCun is a prominent advocate for world models — an emerging paradigm in AI inspired by the idea that human minds create internal representations of the world made out of a variety of sensory data, and use them to predict, plan and execute actions. 

Whereas LLMs are trained primarily on exabytes of text, proponents of world models point out that a human just existing in the world will pick up vastly more. This includes not just language, sound and vision data, but also touch, temperature, balance, movement and other sensory modalities — including smell. This information may not help us solve a mathematical proof, but it does help us learn and make connections in ways that AI today cannot. 

Even though olfactory data is, by volume, the third-largest sensory input we receive, very few researchers developing world models prioritize it in their work. A recent literature review on the subject did not list any modalities except vision, text, audio and lidar (which uses lasers to create a 3D map of the environment). 

Kordel France, a roboticist and machine olfaction researcher, is determined to change that. He argues that science needs a unifying data standard for smell, comparable to a JPG for images. From there, he believes, researchers worldwide should set benchmarks to compete with one another and share datasets for experimentation. 
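No such standard exists yet, so any concrete format is guesswork, but the idea of a “JPG for smell” can be sketched as a simple data record: raw sensor-channel readings plus the environmental context that shapes them. Every field name below is invented for illustration and does not reflect France’s actual proposal:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class SmellRecord:
    """Hypothetical record for a standardized smell capture.

    Field names are illustrative guesses, not an actual proposed standard.
    """
    timestamp: float            # seconds since epoch, when the sample was taken
    channels: Dict[str, float]  # raw responses from each sensor channel
    temperature_c: float        # context matters: heat changes odor volatility
    humidity_pct: float
    labels: List[str] = field(default_factory=list)  # human annotations, if any

# A hypothetical capture from a flower shop.
sample = SmellRecord(
    timestamp=1700000000.0,
    channels={"mox_ethanol": 0.42, "mox_ammonia": 0.07, "qcm_1": 0.13},
    temperature_c=21.5,
    humidity_pct=48.0,
    labels=["florist", "fresh shipment"],
)
```

Recording context alongside the raw channels matters because, unlike a pixel, a sensor reading is inseparable from the temperature, humidity and airflow around it — one reason a shared standard is harder for smell than it was for images.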

France recently launched Sigma, a portable device that connects to a phone and records smells the way you might record audio or video. An app sends the data to ScentNet — the first open, multimodal dataset integrating smell with vision, depth, audio, language and inertial data. The name is a reference to ImageNet, a visual database of more than 14 million annotated images created in 2009 by Stanford professor Fei-Fei Li, who is also building world models with her company World Labs.

In December I met France in London for coffee. As we sat down, he held up a palm-sized metal ball — a “homemade sensor,” he explained — that looked suspiciously like a grenade. Made of materials such as metal oxides and quartz crystals, olfactory sensors respond to specific odor molecules in the air. They are used widely in the food and fragrance industries, as well as in environmental monitoring, agriculture and defense.

France was on his way back from EurIPS, the European leg of the annual NeurIPS conference, where he and his co-authors delivered a position paper titled, “Olfaction Standardization is Essential for the Advancement of Embodied Artificial Intelligence.” They were the only team to submit a paper on artificial smell. 

“We could see innovations like wearables that monitor cortisol and antibody levels in sweat, or travel gear that can tell you how clean a hotel room really is.”

France, who works on robotics for Toyota North America, and who recently completed a PhD on adaptive learning in machine olfaction at the University of Texas, is married to a botanist who owns a flower shop. He sometimes hides his homemade sensors among the flowers there, where they register chemical reactions taking place in the air. “You can tell when she gets a new shipment,” he said. “Nitrogen levels change. Ammonia levels change. And you can sort of predict the level of spoilage.” 

His sensors and similar tools that are sometimes referred to as “e-noses” represent a radical departure from traditional systems used to detect odorants. The best instrument for identifying odor molecules has long been a gas chromatograph-mass spectrometer (GC-MS), a benchtop device used widely in forensics, environmental testing, food safety and clinical toxicology. Unfortunately, as France pointed out to me, this machine is “the size of a fridge, costs about half a million dollars and takes six hours to run a sample.” 

Newer sensors focused on detecting a small number of chemical compounds can operate in real time, achieve low detection thresholds and distinguish target odorants from background interference. But there is still a long way to go before we have anything that can recognize smells the way an animal nose can.

Although there have been several exciting breakthroughs in sensor technology — like pulsing air to replicate sniffing and integrating biological receptors from locusts into robots — these types of systems tend to be limited in use. Most function well in controlled conditions, but do not last long in the wild. Unlike olfactory neurons in mammals, which continuously turn over and regenerate throughout life, artificial sensors are less durable. What’s more, the maximum number of smells they can detect is limited, even in a laboratory setting, though their accuracy can far outperform biological noses so long as they keep their focus narrow. 

Despite the current shortcomings of e-noses and similar sensors, they could still be used to collect data for AI systems. “Even if we put current olfaction sensors on robots, we could start to pair chemistry with image data,” France explained. “Then we can start to say, ‘Oh, there’s ethanol, methane and heptane in the scene. There’s a vehicle that’s running, or a fire,’ and we can start to coordinate scent to objects, which would be extremely powerful.” 
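France’s idea of coordinating scent with objects can be caricatured in a few lines. The compound-to-scene rules below are invented for illustration; a real system would learn these associations from paired sensor and camera data rather than from hand-written rules:

```python
# Toy sketch of pairing detected compounds with scene hypotheses.
# The rules are invented examples, not real chemical signatures.

SCENE_RULES = {
    frozenset({"ethanol", "heptane"}): "running vehicle",
    frozenset({"methane", "carbon_monoxide"}): "possible fire or leak",
    frozenset({"ammonia"}): "decomposition / spoilage",
}

def hypothesize(detected):
    """Return scene hypotheses whose required compounds were all detected."""
    found = set(detected)
    return [label for required, label in SCENE_RULES.items()
            if required <= found]

print(hypothesize(["ethanol", "heptane", "water"]))  # ['running vehicle']
```

The interesting step, as France suggests, would come from fusing such hypotheses with what a camera sees, so that chemistry and vision confirm or correct one another.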

Almost none of the companies racing to build market-friendly humanoid robots have included olfactory sensors in their designs. One exception is the unfortunately named California startup Ainos — a fusion of “AI” and “nose” — which plans to integrate smell sensors into environmental safety robots in Japan. 

Developing reliable, long-lasting e-noses will surely be a lucrative venture. Because certain diseases can be smelled on patients, healthcare services are likely to snap them up. And if sensors can be built into consumer tech, we could see innovations like wearables that monitor cortisol and antibody levels in sweat, or travel gear that can tell you how clean a hotel room really is. Your phone could perform a “sniff test” on food that might be past its prime or alert you to allergens before symptoms develop. 

Getting these types of devices out in the world could also produce training data for the next round of world models, helping us better understand the connection between intelligence and olfaction in the process. But the question remains as to whether odors are fundamentally unlike images, sounds or words. What if the thing olfaction researchers want to measure is destined to forever escape their grasp?

There Is No Map

While commercial robotics has largely failed to integrate AI and smell, the $100 billion flavor and fragrance industry is pushing ahead. Last March, the New York-based startup Osmo, founded by neuroscientist Alex Wiltschko in 2022, launched Generation, a fragrance house that uses a proprietary “olfactory intelligence” model to assist perfumers in creating scents based on customer prompts. 

Osmo is perhaps the best-known company working in artificial olfaction, but it is not alone. In 2019, Swiss flavor and fragrance giant Givaudan launched Carto, an AI-powered touchscreen sampling tool that suggests combinations from an “odour value map” of niche ingredients to kickstart scent creation. A similar tool, Philyra, was developed by IBM for German chemicals company Symrise. In 2024, Prada released Paradoxe Virtual Flower, remixing its beloved Paradoxe line by using AI to refine the jasmine-ness integral to the perfume. 

“When I say ‘the smell of the ocean’ or ‘a whiff of freshly cut grass,’ these are only verbal caricatures — like giving a name to a ghost.”

Osmo has received investment from Google Ventures, Lux Capital and Two Sigma, and has AI pioneer and Nobel Prize-winner Geoffrey Hinton on its advisory board. As part of the service, Generation will not only design a fragrance, but bottle it, palletize it and drop ship it to brands seeking a smell experience to represent them. But Osmo’s long-term mission “is and always will be to digitize our sense of smell,” Wiltschko told me.

When we spoke on a video call, he acknowledged that this idea in particular was far easier to talk about than it was to build. “People often do not appreciate how poorly understood this sense is,” he said. “There’s this fallacy, that I’m guilty of as well, where we think we live in a modern world where things are figured out. We do not. At Osmo, we only care about smell, but there is a universe of mystery and possibility in that one chemical sense.”

In the 1990s, scientists discovered that certain olfactory receptors could be expressed outside the nose. More recently, a startup called Patina has been creating compounds that activate olfactory receptors in the skin, including one that can renew skin and accelerate wound healing. 

Patina co-founder and CEO Sean Raspet is a flavor scientist with a background making olfactory-themed art. In fact, it was Raspet’s early desire to be an artist that got him interested in olfaction to begin with. “You can’t make a new color because they’ve all been systematized,” he told me. “But you can make new scents.”

Most of the AI systems being applied in the flavor and fragrance industry rely on public datasets that use linguistic tags — terms like “floral,” “resinous” or “citrusy” — applied by perfumers to known chemical compounds. The idea is that an AI trained on this information will be able to predict new molecule combinations that people might like, or suggest replacements for pricey or hard-to-find ingredients. But it will still be human noses sniffing the end result. 
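The tag-driven approach can be sketched as a toy similarity search: represent each ingredient by its descriptor tags and suggest the closest substitute by tag overlap. The ingredients and tags below are invented, and commercial tools like Carto or Philyra rest on far richer proprietary chemical data and trained models, not hand-coded sets:

```python
# Toy sketch: suggest a fragrance substitute by descriptor-tag overlap.
# Ingredient tags are invented for illustration.

LIBRARY = {
    "linalool": {"floral", "citrusy", "woody"},
    "limonene": {"citrusy", "fresh"},
    "vanillin": {"sweet", "creamy"},
    "iso_e_super": {"woody", "ambery"},
}

def jaccard(a, b):
    """Overlap between two non-empty tag sets, from 0 (disjoint) to 1 (equal)."""
    return len(a & b) / len(a | b)

def suggest_substitute(target_tags, library):
    """Return the library ingredient whose tags best match the target."""
    return max(library, key=lambda name: jaccard(target_tags, library[name]))

print(suggest_substitute({"floral", "citrusy"}, LIBRARY))  # linalool wins here
```

Even in this cartoon version, the limitation the passage describes is visible: the model only ever shuffles human-applied labels, so a human nose still has to judge whether the suggestion actually smells right.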

Our sense of smell is modulated by context — including the labels we apply to scents. A 2008 study challenged the claim in Shakespeare’s “Romeo and Juliet” that “a rose by any other name would smell as sweet.” It showed that odors are perceived as more or less pleasant when described using positive, rather than neutral or negative, names.

It has been estimated that there are a novemdecillion (a 1 followed by 60 zeros) possible configurations for potentially smelly molecules. For scale, this is 1,000 times more than the number of atoms in our solar system. Not only is there no map to guide us through the world of smells, such a map may be impossible to produce. 

Almost all smells are accords, mixtures of odor molecules we tend to describe by way of musical analogy (complete with top and bottom “notes”). Even strawberry, which we think of as a discrete flavor, is built out of hundreds of molecules and is interpreted with more or less accuracy depending on the smeller. Odorous molecules enhance or suppress one another, often in unpredictable ways, and are constantly shifting as their proportions change and chemical reactions take place. When I say “the smell of the ocean” or “a whiff of freshly cut grass,” these are only verbal caricatures — like giving a name to a ghost.

Human odor perception may be too subtle and complex to ever recreate fully in machines. But including even a simple system for chemical sensing and olfactory understanding could profoundly expand AI’s model of reality.

That’s Aroma

One evening while doomscrolling, I came across a TikTok featuring a group of young people sniffing test tubes in a bar. “Scent based dating experience where you get matched with people if you like their smell!” the caption read. On further investigation I found that the event, “Scent of Connection,” involved submitting samples of your sweat to be served up inside test tubes on a sort of plinth and sniffed by potential dating matches. 

I tracked down the creator of the post, a Spanish creative technologist and designer named Nicole Alonso. She told me that the plinth contained a computer that made certain test tubes glow blue as you input your preferences, and was an ironic nod to the gamification of dating encouraged by apps like Tinder, Grindr or Hinge. The idea was that introducing smell to the equation would bring back the human component that she and her co-founders felt was missing from the modern dating scene. There would be another event in a few weeks’ time, she told me.

“Truly embodied, contextually aware artificial intelligence will not be possible without linking inhalation and intellect.”

By the time it rolled around, I had enough olfactory factoids memorized to bore even the most hardened enthusiast. Ahead of the big night, I received a formal invitation in the mail along with two small pieces of fabric in a Ziploc bag. “Get sweaty!” read the instructions. “Remove your deodorant and do whatever makes you sweat: do a workout, go on the underground with lots of layers on, sit in a sauna…” 

With no sauna on hand, I dug out my most insulated winter sportswear, raised my hood and went for a jog in the September sun. When I returned home, I rubbed the fabric on my lower back, added a spritz of my favorite perfume (as per the instructions, I wasn’t trying to cheat) and put the little plastic baggie in the fridge while I got ready.

The event took place in a gallery called Filet located at the end of a row of shops near London’s “digital roundabout” at Old Street. The selection board and a rack filled with test tubes had been set up in the center of the room. There was an ice box filled with beers, tinned cocktails and BuzzBallz at the back.  

A long queue formed for the part of the plinth where participants were asked to put in their preferences to receive suggestions. So I decided to just sniff all the scents, one by one. Each test tube was numbered to correspond with the individual behind the scent. While I was sniffing, a guy next to me asked what I thought of the tube in my hand. I said it wasn’t unpleasant but I wasn’t drawn to it. It turned out to be his. 

Some of the tubes had the loud and acrid tang of body odor. Some didn’t smell of much. Two smelled pleasant — not too strongly of perfume or sweat. The word that came to mind, as weird as it sounds, was friendly. By the end of the night my nose was fine, but my social battery was drained. I ran around the corner to a burger restaurant, which smelled primarily of grease.

I may not have found love, but in my time researching olfaction — from the jellybean test to smell-based dating — I’ve discovered that not only is our sense of smell undervalued, its mechanics are profoundly complex. In my view, olfaction represents a critical blind spot in the quest to develop human-level artificial intelligence, although I remain open to the idea that advanced AI need not be human-like at all. Instead, powerful future AI systems might do well being freed from the messy intricacies of brains and bodies, pushing abstract reasoning to places that we lumbering primates cannot ever hope to comprehend.

But if the deleterious effects of losing one’s sense of smell teach us anything, and if we heed the research emerging from both controlled trials and bold small-scale experiments, then we will recognize that truly embodied, contextually aware artificial intelligence will not be possible without linking inhalation and intellect. After all, it’s thought that the Earth contains more than 40 billion odorous molecules, each of which can be mixed in different quantities to produce new ones and embedded in our lives, laddering up to a near-infinity of planetary smells. We make use of this information constantly. AI should do the same.