Prototyping a personalized care plan for children with autism: Using biosensors, voice AI, and affinity therapy to restore calm and self-control.
A child is on the verge of sensory overload in a busy grocery store. Their heart rate quickens. Their breathing becomes shallow. There’s too much happening, too many noises, and the fluorescent lights are harsh. People are trying to squeeze past with their carts, but where is there to go? Panic starts to wash over the child.

Then, a small, familiar voice starts to speak to them through a wearable on their wrist. It’s their character, someone they chose and helped create. The voice is calm and gentle. A rock of familiarity in an environment that has suddenly become too much. “It’s okay to feel overwhelmed. Let’s take a deep breath together.”

Ron Suskind, a Pulitzer Prize-winning journalist and former Wall Street Journal editor, wrote a book in 2014 called Life, Animated: A Story of Sidekicks, Heroes, and Autism. Suskind’s book chronicles his relationship with his son, Owen, who was diagnosed with regressive autism at age three. Owen had lost his ability to speak, but Suskind observed he had an intense fascination with animated Disney movies. He would watch them constantly, and Suskind noticed that, through these films, Owen began to make sense of his inner world. The world of Disney served as a translational and regulatory mechanism for Owen: it helped him understand and begin to communicate through its characters and story arcs.

Working with his therapists, Suskind and his family helped Owen regain his speech by rehearsing dialogue from Disney movies. As Owen became more comfortable using his preferred language and building a shared understanding with his caregivers through it, he continued to grow his speaking capabilities and autonomy. By the end of the book, Owen is an adult living by himself, still intensely passionate about Disney—its quotes, themes, and characters remaining a guidepost for how he navigates his life—but with a worldview that has expanded as well. 

Suskind would later coin the term “affinity therapy” to describe how Owen’s ability to communicate was rebuilt by using his love, or affinity, for Disney movies as a pathway for communication and learning. These affinities comforted him when the world felt overwhelming, while also providing a unique shared language between Owen and his caregivers on which to build trust. Affinity therapy leverages an autistic child’s deeply held interests to create a zone of comfort during sensory overload, while also enabling a personalized and emotionally resonant form of learning therapy.

Framing the design challenge

Taking lessons from affinity therapy, we wanted to design a tool to support nonverbal autistic children with emotional connection and in-the-moment regulation.

Current AAC devices.

Augmentative and Alternative Communication (AAC) devices already give nonverbal children ways to communicate, and iPads and smartwatches let children, or anyone who struggles with speech, carry those tools wherever they go. But while AAC devices provide a valuable channel for communication, they cannot detect when a child is at risk of becoming overwhelmed or take proactive measures to calm them. Our prototype aims to bridge that gap: moving from a communication device to a proactive, personalized co-regulation partner.

We reframed the device as a co-regulator. It monitors for early cues of physiological stress, knows where the child is located (aisle five of the grocery store, in a school pickup line, or a waiting room, for example), and immediately offers help in the comforting voice of the child’s current passion, whether that’s Winnie the Pooh, Spiderman, or Bluey.

Sensors on the watch detect an increase in heart rate and body temperature.

Why go to that trouble? Because, as Ron Suskind demonstrated with his Disney script-based affinity therapy, autistic children watch and trust their favorite characters. They know every intonation, every pause, and every scene. 

In the prototype, the biosensor detects a rise in heart rate, and the watch invites the child to choose which of their favorite characters will speak to them. When that character guides a breathing exercise or repeats affirmations, the instructions resonate with the child, whereas a generic calm-down prompt would likely be lost amid the overstimulation.

The watch then asks whether the child feels better, logs the outcome, and, if the child remains upset, quietly returns control to the caregiver.

This gives the child an opportunity to develop emotional literacy by recognizing their emotional state and whether it improved. The check-in uses a simple “yes” or “no” format for easy interpretation.
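The detect-intervene-check-in-escalate loop described above can be sketched in a few lines of Python. Everything here is a hypothetical illustration: the baseline values, the threshold multipliers, and the function names are our own placeholders, and real stress detection would require per-child calibration and clinical validation.

```python
from dataclasses import dataclass

@dataclass
class Baseline:
    """Hypothetical per-child resting values; real thresholds need clinical calibration."""
    resting_hr: float = 85.0    # beats per minute
    resting_temp: float = 36.6  # degrees Celsius

def stress_detected(hr: float, temp: float, base: Baseline) -> bool:
    """Flag early signs of dysregulation when readings rise well above baseline."""
    return hr > base.resting_hr * 1.25 or temp > base.resting_temp + 0.8

def co_regulation_step(hr: float, temp: float, base: Baseline,
                       child_feels_better: bool, log: list) -> str:
    """One pass through the loop: detect -> intervene -> check in -> escalate."""
    if not stress_detected(hr, temp, base):
        return "monitoring"
    # Intervention happens here: the chosen character's breathing exercise
    # plays on the watch (stubbed out in this sketch).
    log.append({"hr": hr, "temp": temp, "outcome": child_feels_better})
    if child_feels_better:
        return "resolved"
    # The child is still upset: quietly hand control back to the caregiver.
    return "notify_caregiver"
```

The logged outcomes are what would let caregivers and clinicians review, over time, which interventions actually helped.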

Prototype: emotionally adaptive AAC device

With this vision in mind, we began building a prototype to replicate this user experience. We wanted the prototype to feel alive, responsive, comforting, and tailored to each child’s emotional world. So we turned to AI.

1. Crafting a calming script with ChatGPT, which aims to offer support in a moment of sensory overload.
2. Bringing the character’s voice to life using ElevenLabs’ text-to-speech model.
3. Using the Veo2 frame-to-video model in Flow to animate the character image into a video.
4. Stitching it all together in Premiere—a manual process for now, but hinting at what’s possible.

Every aspect of our video prototype—the character design, script, voice, and animation—was enabled by emerging AI tools. We started with ChatGPT, generating a calming, kid-friendly character and a script for a specific moment: reassuring a child in a grocery store who’s starting to feel overwhelmed. We then brought that script into ElevenLabs, selecting a pre-set voice model that feels warm and natural—more like a friend than a device. Finally, we experimented with Google Flow to convert the character image into a looped video, adding subtle movement to match the tone and rhythm of the character’s voice.

This rapid prototype offers a glimpse of the interactive experiences AI can help bring to life. In future iterations, we could integrate these tools into a unified system—connecting their APIs to automate the creation of video content that adapts in real time to a child’s emotional state and environment. And by leveraging emerging AAC technologies, we can envision a personalized wearable that not only monitors a child’s emotions but also actively supports them in recognizing, self-regulating, and communicating these complex feelings.
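A unified system like the one imagined above might chain these stages behind a single interface. The sketch below is purely illustrative: the stage functions are local stubs standing in for the ChatGPT, ElevenLabs, and Veo2 APIs, and none of the function names or call patterns here come from those real SDKs.

```python
from typing import Callable

# Each stage is a pluggable callable; in a real system these would wrap the
# actual script-generation, text-to-speech, and video-generation APIs.
def generate_script(context: dict) -> str:
    return f"It's okay to feel overwhelmed in the {context['location']}. Let's breathe together."

def synthesize_voice(script: str, character: str) -> bytes:
    return f"[{character} voice] {script}".encode()

def animate_character(audio: bytes, character: str) -> dict:
    return {"character": character, "audio": audio, "frames": "looped-idle-animation"}

def build_intervention(context: dict, character: str,
                       script_fn: Callable = generate_script,
                       voice_fn: Callable = synthesize_voice,
                       video_fn: Callable = animate_character) -> dict:
    """Chain script -> voice -> video, so each stage can be swapped independently."""
    script = script_fn(context)
    audio = voice_fn(script, character)
    return video_fn(audio, character)
```

Keeping each stage as a swappable function is the point of the sketch: the manual Premiere step could later be replaced by an automated stage without touching the rest of the pipeline.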

Diving deeper: Talking to an expert

“Personalization and proactive support are key. Anticipating and reinforcing positive behaviors early can profoundly impact emotional well-being and skill development for neurodivergent children.” – Emma Ruby, M.S. Applied Behavioral Analysis

We recognized early in this process that we had our own limitations in fully grasping the magnitude of the needs and challenges faced by neurodivergent children, so we sought the advice of a trained expert. We shared our prototype with Emma Ruby, a therapist who works with children with developmental disabilities, to test our assumptions and establish a deeper layer of context for how our prototype can fit into the wider ecosystem of care. 

Ruby provided valuable insights, emphasizing the importance of personalization and of reinforcing positive behaviors through proactive support rather than just reactive interventions. Her feedback helped us identify where the prototype could meaningfully enhance emotional regulation and communication, and where additional human support would still be necessary.

For example, although Ruby acknowledged that our prototype creatively addressed proactive emotional support by detecting early signs of stress, she encouraged us to explore complementary care modalities, such as positive reinforcement and the Applied Behavior Analysis (ABA) concept of priming. Integrating positive reinforcement throughout the day, not only during moments of potential dysregulation, could create meaningful learning opportunities by reinforcing ongoing positive behaviors. Priming, meanwhile, is a therapeutic exercise that proactively prepares a child for difficult situations. Ruby highlighted both as important clinical interventions: they support the child in moments of dysregulation while also providing continual, proactive reinforcement throughout the day.

Based on Ruby's input, we revisualized what personalization might look like for a future prototype.

In speaking with Ruby, it became apparent that this device would need to be highly personalized to each child’s unique personality and needs. One way to build on the prototype was to expand the voice options beyond cartoons and characters to include other familiar voices, such as a caregiver’s or even the child’s own. That someone like Owen built foundational relationships through Disney characters shows that every child has their own way of making sense of the world; for our device to have more impact, it would need to align with how each child understands their world and what makes them feel safest and most comfortable.
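One way to represent that per-child personalization is a small profile the device consults when choosing a voice and comfort strategy. Everything below, from the field names to the fallback behavior, is a hypothetical sketch rather than a specification.

```python
from dataclasses import dataclass, field

@dataclass
class ChildProfile:
    """Hypothetical per-child configuration for a co-regulating AAC device."""
    name: str
    # Voices can be characters, a caregiver's recording, or the child's own voice.
    voice_options: list = field(default_factory=list)
    # Preferred comfort modalities, e.g. voice, haptics, visual cues.
    preferred_modalities: list = field(default_factory=lambda: ["voice"])
    calming_phrases: list = field(default_factory=list)

    def pick_voice(self, choice: str) -> str:
        """Honor the child's in-the-moment choice, falling back to the
        first configured option if that voice isn't available."""
        return choice if choice in self.voice_options else self.voice_options[0]
```

A profile like this would be built with the caregiver and updated as the child's affinities change, rather than fixed at setup.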

Ethical Considerations: Balancing AI and human relationships

Any product designed for children requires a deep level of care, intention, and ethical reflection. When that product engages with their emotions, the responsibility deepens. And when it’s designed for neurodivergent kids, that responsibility has to shape every single decision. How can we ensure that the experience empowers rather than corrects—encouraging personal expression rather than enforcing neurotypical norms? How can we provide meaningful, personalized support while ensuring that sensitive data such as biometrics, location, and mood are protected? And what does consent look like for a child when the system is adapting to them in real time? These aren’t just technical considerations; they’re human ones. Any future development would require close collaboration with caregivers, clinicians, and—most importantly—kids themselves, to ensure the technology respects their autonomy, protects their privacy, and meets them on their own terms.

Future Outlook: Expanding voice AI applications

Following our conversation with Ruby, we realized that our prototype is just the first step in a broader, more nuanced exploration. Ruby’s thoughtful feedback highlighted critical areas for enhancement, particularly emphasizing the importance of personalizing experiences to meet each child’s unique emotional and communicative needs. 

Moving forward, we’re excited to engage with a diverse range of experts, including behavioral therapists, caregivers, and individuals with lived experiences, to deepen our understanding and refine our approach. Our goal is to incorporate their valuable perspectives into future iterations, creating a richer, more holistic solution. By continuously evolving our prototype, we aim to design a voice AI-integrated AAC device that not only proactively supports emotional regulation but also promotes ongoing positive behavior, fosters autonomy, and personally resonates with each child. 

Moreover, speaking with Ruby made us wonder: What other modalities could we explore beyond voice? We learned that some children may prefer sensory stimulation, such as vibration, to help them cope with their emotions. Incorporating additional modalities, such as haptics and visual cues, could enhance the impact of future AAC devices.

As we continue to navigate this space, our role is to thoughtfully balance these factors to create human-centric, ethical, and effective solutions. Our goal is not to replace human interaction but to enhance it by designing systems that supplement healthcare professionals and caregivers, foster meaningful connections, and ultimately provide individuals with a deeper sense of safety, comfort, and control. We want to build designs that honor both the potential and limitations of technology, ensuring our innovations meaningfully contribute to a more empathetic, accessible, and inclusive healthcare landscape.
