The recent narrative surrounding artificial intelligence often suggests that machines are rapidly closing the gap between silicon and biological life. We are told that chatbots can pass the bar exam, that AI-designed drugs are entering clinical trials, and, according to a flurry of headlines, that computers are finally learning to smell. These reports describe AI "tasting" colors or digitizing scents, painting a picture of a technology that is becoming increasingly multisensory.

However, a closer look at the research reveals that these claims are largely an exercise in creative data interpretation. When a large language model associates the color pink with sweetness or yellow with sourness, it is not experiencing a sensation; it is merely echoing human linguistic patterns found in its training data. This is mimicry, not olfaction. In reality, while machine vision and natural language processing have seen exponential growth, the field of artificial olfaction has remained remarkably stagnant.

Between 2015 and 2025, the volume of research papers dedicated to machine smell failed to keep pace with the rest of the field. While flagship conferences like NeurIPS and ICML focus heavily on generative media and agentic behavior, the integration of chemical sensing into AI models remains a neglected niche. Most leaders in the field have prioritized the digital senses—sight and sound—leaving the complex, chemical world of smell largely unexplored. This sensory gap suggests that for all its progress, AI remains fundamentally detached from the physical reality of the biological world.

With reporting from Noema Magazine.