Another Crazy Day in AI: The Science of Digitizing Scent
- Wowza Team

- Nov 15, 2025
- 4 min read
Updated: Dec 1, 2025

Hello, AI Enthusiasts.
Week done? Great. Here’s something to leave you thinking.
A recent podcast episode took a surprisingly cool turn into the world of digital scent—how it’s captured, modeled, and maybe one day streamed like data. If you’ve never thought about what “smell tech” looks like, this one’s worth the listen.
Penn GSE is also shaking things up with a new hands-on program that lets students audit real AI systems to spot bias.
And Google’s newest shopping updates make holiday browsing much easier. You can search conversationally and get real-time product info without the usual headache.
Now go enjoy your weekend. You’ve earned a little mental wandering.
Here's another crazy day in AI:
- How one company "teleported" a plum's smell
- Penn GSE launches classroom-ready AI auditing program
- Google rolls out AI updates for seasonal shopping
- Some AI tools to try out
TODAY'S FEATURED ITEM: Building the World's First Digital Nose

Image Credit: Wowza (created with Ideogram)
What if your next text message didn't just show a photo but delivered the actual smell behind it?
In a recent episode of The Neuron Podcast, host Pete Huang leads a conversation on emerging sensory technology with Osmo Founder and CEO Alex Wiltschko. Together, they discuss the methods Osmo is using to digitize smell, the scientific reasoning behind the work, and the potential applications that may develop as the technology continues to advance. The episode walks through how scent is captured, modeled, and recreated, offering a closer look at a field that rarely receives attention but touches many aspects of everyday life.
What the conversation explores:
- Osmo uses a read→map→write approach to digitize scent, similar to processes used for audio or visual signals.
- Humans rely on over 300 types of olfactory receptors, compared with just three cone types for color vision, making smell a far higher-dimensional sense.
- The team successfully recreated the scent of a fresh-cut plum in another room using sensors and a molecular printer.
- Osmo Studio reduces fragrance development timelines from more than a year to roughly one week.
- The team has developed three new fragrance molecules that do not exist in nature.
- Smell is closely linked to areas of the brain that govern memory and emotion.
- Potential applications include early disease detection using scent-based chemical markers.
- Long-term goals include creating smaller, portable sensors that could be integrated into everyday devices.
- One project generated a museum's signature scent from a single photograph.
- The episode discusses how progress in emerging technologies often accelerates as data and tools improve over time.
Alex has spent about 20 years working on this problem, including time at Google Brain before founding Osmo three years ago. Right now, the technology finds its main use in the fragrance industry, where it genuinely speeds things up. Traditional perfume development takes well over a year, but being able to describe what you want and get samples within days changes how that works. The conversation also gets into other potential uses like creating scents for museums, spotting counterfeit products through molecular analysis, and the bigger ambition of using smell to detect diseases early.
The episode is worth checking out if you're curious about how technology tackles problems that seemed nearly impossible until recently. Smell is one of those things we experience constantly but rarely think about in technical terms. Whether digital scent becomes something we all encounter or stays within specialized fields probably depends on cost, accuracy, and whether enough practical reasons emerge for people to use it. Either way, it's interesting to hear how the science actually works without all the hype that usually surrounds emerging tech.
Watch on YouTube here.
Listen on Spotify here.
OTHER INTERESTING AI HIGHLIGHTS:
Penn GSE Launches Classroom-Ready AI Auditing Program
/Nora Garg, on The Daily Pennsylvanian
Penn’s Graduate School of Education has introduced a high school curriculum focused on helping students recognize and question bias in AI systems. The program, “AI Auditing for High School,” teaches algorithmic bias through hands-on audits and real-world examples, even for students with no coding background. Developed by professors Yasmin Kafai and Danaë Metaxa alongside U.S. educators, it aims to build critical thinking around how AI works, who it benefits, and where it falls short. The launch comes as Penn GSE expands its AI training initiatives, supported by major grants and growing partnerships with school districts.
Read more here.
Google Rolls Out AI Updates for Seasonal Shopping
/Vidhya Srinivasan, VP/GM Ads and Commerce, on Google Blogs — The Keyword
Google is rolling out a major AI shopping update across Search and Gemini to simplify holiday shopping. Users can now describe what they want conversationally in AI Mode and get organized results with visuals, reviews, prices, and inventory data. Gemini also supports shopping directly in-app, helping people compare products, explore ideas, and find real-time listings powered by the Shopping Graph. New agentic features, like Google calling local stores on your behalf and automated budget-friendly purchasing, aim to save time while making shopping more efficient.
Read more here.
SOME AI TOOLS TO TRY OUT:
That’s a wrap on today’s Almost Daily craziness.
Catch us almost every day—almost! 😉
EXCITING NEWS:
The Another Crazy Day in AI newsletter is on LinkedIn!!!

Leveraging AI for Enhanced Content: As part of our commitment to exploring new technologies, we use AI to help curate and refine our newsletters. This keeps our content timely and ensures you stay informed about the latest trends and developments.




