
Another Crazy Day in AI: How Google is Testing Visual AI in Medicine

Another Crazy Day in AI: An Almost Daily Newsletter

Hello, AI Enthusiasts.



Google’s DeepMind and Research teams just gave their diagnostic chatbot a big brain boost. The new version isn’t just conversational; it’s multimodal. It reads your medical scans and documents while it chats with you, adding visual intelligence to its already strong language processing.


College kids are getting smarter (and not just from caffeine). Texas A&M is rolling with Perplexity to teach AI from the ground up.


And if you’d rather not rely on cloud tools, GPT4ALL lets you run LLMs from your own desktop. That’s enough thinking for one night. See you in your inbox again soon.


Here's another crazy day in AI:

  • A diagnostic chatbot with vision

  • Mays Business School launches AI partnership with Perplexity

  • Use your favorite AI models offline with GPT4ALL

  • Some AI tools to try out


TODAY'S FEATURED ITEM: What AMIE’s Update Means for Healthcare


A robotic scientist in a classic white coat with 'AI Scientist' on its back stands beside a human scientist with 'Human Scientist' on their coat, looking towards the AI Scientist.

Image Credit: Wowza (created with Ideogram)


How might visual AI change the future of remote healthcare access?


Google DeepMind and Google Research have taken a major leap forward with AMIE (Articulate Medical Intelligence Explorer), their research diagnostic assistant. The update, shared by Khaled Saab and Jan Freyberg on the Google Research Blog, introduces a multimodal version of AMIE that can not only hold diagnostic conversations but also interpret medical images and documents shared by patients. It highlights how conversational AI systems can now incorporate visual medical information during diagnostic discussions, potentially transforming the future of remote healthcare.


This new version of AMIE shows how far AI has come in healthcare. Beyond holding conversations and answering questions grounded in medical information, it can also interpret and integrate visual data such as medical images. That matters especially for remote consultations, where patients could share images like X-rays or CT scans directly with the system, and AMIE could assess them alongside the patient's description of symptoms and medical history to provide diagnostic assistance.


Source: Google

Here are some of the key points discussed:

  • AMIE’s performance in medical conversations was rated higher than that of primary care physicians across 28 of 32 objective and subjective criteria in a blinded study with licensed professionals.

  • The new model was trained on a combination of publicly available data and simulated patient-doctor interactions created by clinicians.

  • AMIE’s reasoning is visible to users—it shows which parts of an image or document it’s referring to in its responses, making its diagnostic process more transparent.

  • It can reference prior parts of the conversation or shared images and documents to support its medical reasoning.

  • While AMIE is not a product and hasn’t been tested in real-world clinical settings, this research helps explore what might be possible for the future of AI-assisted healthcare.


Source: Google

As the boundaries between conversational AI and medical diagnostics continue to blur, this research offers a glimpse into how integrated tools might one day support physicians or even extend basic access in areas where healthcare professionals are scarce. Being able to process both dialogue and visual data means these systems can engage in more context-aware discussions—something that's especially useful when a patient can’t explain everything in words alone.


Of course, this work remains in the research stage. There are essential questions to resolve—around safety, equity, consent, and how these systems would operate alongside human clinicians in the real world. But it’s a meaningful development. It encourages a broader conversation about how multimodal AI might reshape the tools available in healthcare, not by replacing human judgment, but by enhancing how care can be delivered and understood—especially at a distance.




Read the full blog here. Read the paper here.

OTHER INTERESTING AI HIGHLIGHTS:


Mays Business School Launches AI Partnership with Perplexity

/David Swope, News Reporter, on The Battalion


Texas A&M’s Mays Business School has become the first academic institution to partner with Perplexity, granting students access to its advanced AI-powered platform, Enterprise Pro. The goal is to prepare students for a workforce increasingly shaped by artificial intelligence by giving them hands-on experience with cutting-edge tools. The initiative is part of a broader effort, including a new AI and Business minor and student competitions focused on AI applications. While some express concerns about over-reliance on AI, school leaders emphasize ethical use and AI literacy as essential skills for the future.



Read more here.


Use Your Favorite AI Models Offline with GPT4ALL

/Jack Wallen, Contributing Writer, on ZDNET


Want to run your favorite local AI models directly from your desktop? GPT4ALL is a user-friendly app that makes it easy to run open-source models like Llama, Mistral, and Orca on Linux, macOS, and Windows. With support for model switching, GPU selection, and API serving, GPT4ALL offers a powerful way to use AI privately without relying on the cloud. Installation is quick for Ubuntu-based systems, and users can immediately begin experimenting with various models and workflows, from personal research to coding help.
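
The article covers the desktop app, but the same project also ships Python bindings for readers who prefer scripting. Below is a minimal sketch of the offline workflow using the gpt4all package; the model filename is only an illustrative example, and the first run downloads the model file if it isn't already on disk.

    # pip install gpt4all
    from gpt4all import GPT4All

    # Load a local model. The filename below is illustrative; pick any GGUF
    # model from the GPT4All model list. It is downloaded on first use.
    model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

    # A chat session keeps conversational context between prompts.
    with model.chat_session():
        reply = model.generate("Summarize what running an LLM offline means.", max_tokens=200)
        print(reply)

Once the model file is on disk, everything here runs locally, with no API keys or network calls required.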



Read more here.


SOME AI TOOLS TO TRY OUT:


  • Sans Writer – A clean, private space to write without distractions or metrics.

  • HiBird – AI meetings, translations, and messaging to help startups grow faster.

  • Suna by Kortix – A generalist AI agent that takes action on your behalf.


That’s a wrap on today’s Almost Daily craziness.


Catch us almost every day—almost! 😉

EXCITING NEWS:

The Another Crazy Day in AI newsletter is on LinkedIn!!!



Wowza, Inc.

Leveraging AI for Enhanced Content: As part of our commitment to exploring new technologies, we used AI to help curate and refine our newsletters. This enriches our content and keeps us at the forefront of digital innovation, ensuring you stay informed with the latest trends and developments.






Copyright Wowza, Inc. 2025