Creating a Personalized Care Modality for Children with Autism
A reconceptualization of the AAC device, drawing on lessons from Ron Suskind’s “Affinity Therapy” and technological interventions that create more comfort, autonomy, and empathy for children with autism.

A child is on the verge of sensory overload in a busy grocery store. Their heart rate quickens, their breath gets shallow. There’s too much happening: too many noises, the fluorescent lights are harsh, people are trying to pass with their carts, but where is there to go? Panic starts to wash over them.

But then a small, familiar voice starts to speak to them through a wearable on their wrist. It’s their character, someone they chose and helped create. The voice is calm and gentle. A rock of familiarity in an environment that has suddenly become too much. “It’s okay to feel overwhelmed. Let’s take a deep breath together.”

Ron Suskind, a Pulitzer Prize-winning journalist and former Wall Street Journal editor, wrote a book in 2014 called Life, Animated: A Story of Sidekicks, Heroes, and Autism. Suskind’s book chronicles his relationship with his son, Owen, who was diagnosed with regressive autism at age three. Owen had lost his ability to speak, but Suskind noticed that Owen had an intense fascination with animated Disney movies. He would watch them constantly, and Suskind began to notice that it was through these movies that Owen began to make sense of his inner world. The world of Disney served as a translation and regulatory mechanism for Owen: its films helped him understand, and begin to communicate through, their characters and story arcs.

Working with his therapists, Suskind and his family were able to help Owen regain his speech by rehearsing dialogue from Disney movies. As Owen became more comfortable using the language of his choice and building a shared understanding with his caregivers through that language, he continued to grow his speaking capabilities and autonomy. By the end of the book, Owen is an adult living on his own. He is still intensely passionate about Disney, of course, its quotes, themes, and characters remaining a guidepost for how he navigates his life, but his worldview has expanded as well.

Suskind would later coin the term “Affinity Therapy” to describe how Owen’s ability to communicate was established by using Owen’s love of Disney movies, his deep affinity for them, to build a pathway for communication and learning. These affinities establish a form of comfort when the world feels overwhelming, while also using a shared, unique language to establish a relationship between Owen and his caregivers. Through these Disney movies, they had a shared bank of knowledge, a foundation to build trust upon. Affinity Therapy takes these deeply held interests and uses them to create a zone of comfort for individuals who struggle with sensory overload, while also offering a jumping-off point for a personalized and emotionally resonant form of learning therapy.

Framing the Design Challenge

We want to explore how we might take the lessons from Affinity Therapy and design a tool that helps non-verbal autistic children with emotional connection and in-the-moment regulation.

Current AAC devices

AAC (augmentative and alternative communication) devices already give non-verbal children a way to communicate, often through iPads and smartwatches that children, or anyone who struggles with speech, can take with them wherever they go. However, while they provide a valuable method of communication, they can’t sense when a child is at risk of becoming overwhelmed and take proactive action to calm them down. Our prototype looks to bridge that gap, moving from a communication device to a more proactive, personalized, and communicative partner.

We reframed the device as a co-regulator: it listens for early physiological stress cues, knows where the child is (aisle five, school pickup, waiting room), and immediately offers help in the comforting voice of the child’s current passion, whether that’s Winnie the Pooh, Spiderman, or Bluey.

Sensor on watch detects abnormal heart rate and increase in temperature.

Why go to that trouble? Because, as Ron Suskind showed with his Disney-script Affinity Therapy, autistic children know and trust their favorite characters. They know every intonation, every pause, every scene.

In the prototype, the biosensor senses a rise in heart rate and invites the child to choose which of their favorite characters will speak to them. When that voice guides a breathing exercise or repeats affirmations, the instructions land where a generic calm-down prompt would likely be lost amid the overstimulation.
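To make the sensing step concrete, here is a minimal sketch of how a wearable might flag early stress from heart rate and skin temperature. The rolling-baseline approach and every threshold below are illustrative assumptions, not clinical values; a real device would need per-child baselines tuned with clinicians.

```python
from collections import deque

class StressDetector:
    """Rolling-baseline detector for early signs of physiological stress.

    All thresholds here are illustrative placeholders, not clinical
    values: a real wearable would need per-child calibration.
    """

    def __init__(self, window=60, hr_delta=25, temp_delta=0.5):
        self.hr_history = deque(maxlen=window)  # recent heart-rate samples (bpm)
        self.hr_delta = hr_delta                # bpm rise over baseline to flag
        self.temp_delta = temp_delta            # degrees C rise over baseline to flag

    def update(self, heart_rate, skin_temp, baseline_temp=33.0):
        """Ingest one sample; return True if both signals exceed baseline."""
        self.hr_history.append(heart_rate)
        baseline_hr = sum(self.hr_history) / len(self.hr_history)
        hr_elevated = heart_rate - baseline_hr > self.hr_delta
        temp_elevated = skin_temp - baseline_temp > self.temp_delta
        return hr_elevated and temp_elevated
```

When both signals cross their thresholds, the watch would move on to the character-selection prompt described above; a single elevated reading alone is ignored, which keeps ordinary excitement (running, laughing) from triggering a false alarm.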

The watch then asks if the child feels better, logs the outcome, and quietly hands control back to the caregiver if the child is still upset.

This check-in gives the child an opportunity to build emotional literacy, prompting them to recognize their emotional state and whether it has improved. It is simplified to a yes-or-no format for easy interpretation.
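The hand-off logic above can be sketched as a small state machine: log each yes-or-no answer, retry the calming exercise once, then escalate. The attempt limit and the `notify_caregiver` outcome are hypothetical design choices for illustration, not features of any existing AAC platform.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CoRegulationSession:
    """Sketch of the watch's check-in loop after a calming exercise."""
    log: list = field(default_factory=list)
    max_attempts: int = 2  # calming attempts before escalating

    def check_in(self, feels_better: bool, attempt: int) -> str:
        # Record the outcome so caregivers can review patterns later.
        self.log.append({
            "time": datetime.now().isoformat(),
            "feels_better": feels_better,
            "attempt": attempt,
        })
        if feels_better:
            return "resolved"
        if attempt >= self.max_attempts:
            return "notify_caregiver"  # quietly hand control back to an adult
        return "retry_exercise"        # offer another round with the character
```

The logged outcomes double as the data a clinician could later use to see which characters and exercises actually help a given child.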

Prototype: Emotionally Adaptive AAC Device

With this vision in mind, we began to build out a prototype that could replicate this user experience. We wanted to create a prototype that felt alive—responsive, comforting, and tailored to each child’s emotional world. And so we turned to AI.

Crafting a calming script with ChatGPT, aiming to offer support in a moment of sensory overload
Bringing the character’s voice to life using ElevenLabs’ text-to-speech model
Using the Veo2 frame-to-video model in Flow to animate the character image into a video
Stitching it all together in Premiere—a manual process for now, but hinting at what’s possible.

Every aspect of our video prototype—the character design, script, voice, and animation—was made possible with the use of emerging AI tools. We started with ChatGPT, generating a calming, kid-friendly character and a script for a specific moment: reassuring a child in a grocery store who’s starting to feel overwhelmed. We then brought that script into ElevenLabs, selecting a pre-set voice model that feels warm and natural—more like a friend than a device. Finally, we experimented with Google’s Flow to transform the character image into a loopable video, adding subtle movement to match the tone and rhythm of the character’s voice.

This rapid prototype offers a glimpse at the type of interactive experiences that AI can help bring to life. In future iterations, we could integrate these tools into a unified system—connecting their APIs to automate the creation of video content that adapts in real time to a child’s emotional state and environment. And by leveraging emerging AAC technologies, we can envision a personalized wearable that not only monitors a child’s emotions, but actively supports them in recognizing, self-regulating, and communicating these complex feelings.
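As a rough illustration of what that API-level integration might look like, the helpers below assemble the inputs for two of the pipeline’s stages: a chat-style prompt for script generation and a request body for text-to-speech. The message shape mirrors common chat-completion APIs, but the field names (`voice_id`, `stability`) and any endpoint details are assumptions for illustration, not a specific vendor’s schema.

```python
def build_script_prompt(character: str, situation: str) -> list:
    """Chat-style messages asking a language model for a short calming script.

    The role/content structure mirrors common chat-completion APIs;
    the exact endpoint and model would depend on the provider.
    """
    return [
        {"role": "system",
         "content": f"You are {character}, gently reassuring an overwhelmed "
                    "child. Speak in short, calm sentences."},
        {"role": "user",
         "content": f"The child is {situation}. Offer a breathing exercise."},
    ]

def build_tts_request(script: str, voice_id: str) -> dict:
    """Body for a hypothetical text-to-speech call.

    `voice_id` would be the child's chosen character voice; `stability`
    is a placeholder for whatever expressiveness controls the TTS
    service exposes.
    """
    return {"voice_id": voice_id, "text": script, "stability": 0.8}
```

In a unified system, the first helper’s output would be sent to a language model, the resulting script piped into the second helper, and the returned audio paired with a pre-rendered character animation, replacing the manual Premiere step with an automated chain.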

Diving Deeper: Talking to an Expert

“Personalization and proactive support are key. Anticipating and reinforcing positive behaviors early can profoundly impact emotional well-being and skill development for neurodivergent children.” – Emma Ruby, M.S. Applied Behavioral Analysis

We recognized early on in this process that we have our own limitations in fully grasping the magnitude of the needs and challenges faced by neurodivergent children, so we sought the advice of a trained expert. We shared our prototype with Emma Ruby, a therapist who works with children with developmental disabilities, to test our assumptions and establish a deeper layer of context for how our prototype could fit into the wider ecosystem of care.

Emma provided valuable insights, emphasizing the importance of personalization and reinforcing positive behaviors through proactive support, rather than just reactive interventions. Her feedback helped us understand where the prototype could meaningfully enhance emotional regulation and communication, and where additional human support would still be necessary.

For example, although Emma acknowledged that our prototype creatively addresses proactive emotional support by detecting early signs of stress, she encouraged us to explore different modalities of care, like positive reinforcement and the Applied Behavior Analysis (ABA) concept of priming.

She mentioned that integrating positive reinforcement throughout the day, not just during moments of potential dysregulation, could create meaningful educational opportunities that reinforce and teach ongoing positive behaviors, while “priming” is a therapeutic exercise that proactively prepares a child for difficult situations. Emma highlighted these repeated, proactive modules as important clinical interventions that support both in-the-moment regulation and continual reinforcement throughout the day.

Based on Emma’s input, we revisualized what personalization might look like for a future prototype.

Speaking with Emma also made apparent that this device would need to be highly personalized to the unique personality and needs of each child. One build on the prototype was to expand the voice options beyond cartoons or characters to include other familiar voices, like the voice of a caregiver or even the child’s own. Owen was able to build foundational relationships through Disney characters, but every child has their own way of making sense of the world; for our device to be more impactful, it would need to align with how each child understands their world and what makes them feel safest and most comfortable.

Ethical Considerations: Balancing AI and Human Relationships

Any product designed for children requires a deep level of care, intention, and ethical reflection. When that product engages with their emotions, the responsibility deepens. And when it’s designed for neurodivergent kids, that responsibility has to shape every single decision. How do we make sure the experience empowers rather than corrects—encouraging personal expression, not enforcing neurotypical norms? How do we offer meaningful, personalized support while guaranteeing the protection of sensitive data like biometrics, location, and mood? And what does consent look like for a child, when the system is adapting to them in real time? These aren’t just technical considerations; they’re human ones. Any future development would require close collaboration with caregivers, clinicians, and—most importantly—kids themselves, to ensure the technology respects their autonomy, protects their privacy, and meets them on their own terms.

Future Outlook: Expanding Voice AI Applications

After our conversation with Emma, we recognized that our prototype marks just the first step in a broader, more nuanced exploration. Emma’s thoughtful feedback highlighted critical areas for enhancement, particularly emphasizing the importance of personalizing experiences to meet each child's unique emotional and communicative needs. 

Moving forward, we’re excited to engage with a diverse range of experts, including behavioral therapists, caregivers, and individuals with lived experiences, to deepen our understanding and refine our approach. Our goal is to incorporate their valuable perspectives into future iterations, creating a richer, more holistic solution. By continuously evolving our prototype, we aim to design a Voice AI-integrated AAC device that not only proactively supports emotional regulation but also promotes ongoing positive behavior, fosters autonomy, and personally resonates with each child. 

Moreover, speaking with Emma made us wonder: what other modalities could we explore beyond voice? We learned that some children may prefer sensory stimulation, such as vibration, to help them cope with their emotions. Incorporating other modalities, such as haptics or varied visual cues, could broaden the impact of future AAC devices.

While we continue navigating this space, our role is to thoughtfully balance these factors, creating human-centric, ethical, and effective solutions. Our goal is not to replace human interaction, but to enhance it—designing systems that supplement healthcare professionals and caregivers, fostering meaningful connections, and ultimately, a deeper sense of safety, comfort, and control for individuals. We want to build designs that honor both the potential and limitations of technology, ensuring our innovations meaningfully contribute to a more empathetic, accessible, and inclusive healthcare landscape.