A quiet milestone for sensory science
- Gavriel Wayenberg
- Aug 13
- 2 min read
Today, one of the world’s most experienced food companies informed us that it plans to brief multiple internal teams using Ajinomatrix’s technology as an illustration of how AI will change day-to-day work in food R&D and quality.

In the long chronicle of food technology, genuine shifts rarely arrive with fanfare. They appear first as careful internal briefings, small demonstrations, and patient comparisons with what came before. This is one of those moments.
Why this matters (and why we’re saying it softly)
For more than two decades, the food industry has asked a deceptively simple question: can the sensations of taste and aroma be measured, predicted, and engineered as reliably as texture or color? The honest answer has been “not yet, not fully.”
What is now being prepared inside this major player's walls is a different kind of conversation, one that treats sensory intelligence not as a curiosity but as a practical layer that helps scientists and operators make quicker, more consistent decisions: fewer panel loops, clearer diagnostics across plants, faster iteration from idea to product.
We share this update not to boast, but to record a historical waypoint: when a global leader chooses to explain future work through your tools, it signals that an emerging method has crossed into the mainstream of practice.
A respectful nod to long horizons
Innovation in food is a long game. It requires steady hands, institutional memory, and a willingness to keep testing while the rest of the world moves on. We want to acknowledge the discipline of a company that has kept the sensory question open for more than 20 years—and the many scientists, engineers, and product teams whose work made today’s briefings possible. Patience and stewardship matter.
What this is—and what it is not
- It is an internal step: Ajinomatrix will be shown as an example of how AI can support R&D, QC, and product creation in the years ahead.
- It is not a public launch or a commercial announcement. We will maintain the same discretion we ask of our partners.
- It is a signal that predictive sensory tools, once considered out of reach, are now practical enough to be discussed across departments.
- It is not the end of panels, chefs, or craft. It is a way to aim them better and waste less of their time.
The work underneath the headline
Behind this moment are three building blocks that have quietly matured:
- Sensory OS: a way to capture, standardize, and compare sensory data from people, sensors, and processes.
- FEELLM: models that translate chemistry and text into panel-equivalent scores with explanations.
- TasteTuner-o: an optimization layer that turns targets into actionable levers under real-world constraints.
Put together, these tools do something simple: they help teams ask better questions sooner.
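To make the "targets into levers" idea concrete, here is a minimal sketch of what an optimization layer of this kind might do: search a single formulation lever for the setting that best approaches a sensory target without violating a constraint. Every name, model, and number below is an illustrative assumption, not the actual product.

```python
# Hypothetical sketch: turn a sensory target into a lever setting
# under a real-world constraint. The model and numbers are invented.

def predicted_sweetness(sugar_g: float) -> float:
    """Toy stand-in for a panel-equivalent sweetness model (0-10 scale)."""
    return min(10.0, 1.5 * sugar_g)


def tune_lever(target: float, max_sugar_g: float, step: float = 0.1) -> float:
    """Grid-search the sugar lever for the dose whose predicted score
    is closest to the target, never exceeding the formulation limit."""
    best_dose, best_err = 0.0, float("inf")
    grams = 0.0
    while grams <= max_sugar_g + 1e-9:  # stay within the constraint
        err = abs(predicted_sweetness(grams) - target)
        if err < best_err:
            best_dose, best_err = grams, err
        grams += step
    return round(best_dose, 2)


# Example: aim for a sweetness score of 6.0 with at most 5 g of sugar.
dose = tune_lever(target=6.0, max_sugar_g=5.0)
print(f"suggested sugar dose: {dose} g")
```

In practice the predictive model would come from panel-equivalent scoring rather than a toy formula, and there would be many levers and constraints at once, but the shape of the loop (predict, compare to target, respect limits) is the same.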
What comes next
We will continue doing what brought us here—listening, measuring, and improving—while respecting confidentiality and letting results speak. If this work becomes publicly presentable, you will hear about it on the right day, in the right way.
For now, we mark the page: a major player has chosen to teach the future using our tools. That is enough. The industry learns by example; we are grateful that our example will be part of that lesson.
— The Ajinomatrix team