AAC and Nonspeaking Communication Reference¶
Historical Context and Medical Evolution¶
Pre-Modern Communication Methods¶
Humans have always found ways to communicate beyond speech. Sign languages have existed for centuries, with documented use dating to classical Rome and Greece. Gesture-based communication, writing, and pictorial representation served nonspeaking individuals throughout history—though access to these methods depended heavily on social status, resources, and whether anyone in the person's community bothered to learn their communication methods.
For most of history, nonspeaking individuals—whether from congenital conditions, acquired disabilities, or illness—were largely excluded from society. The assumption that speech equals intelligence meant that those who couldn't speak were presumed incompetent, often institutionalized, and denied education, employment, and basic human rights. Communication boards and other low-tech methods existed but were not systematically developed or widely available.
Early 20th Century: Communication Boards Emerge¶
The first widely distributed communication aid was developed in the 1920s by and for F. Hall Roe, a man with cerebral palsy in Minneapolis. This letter and word-based communication board—distributed by a local men's group—represented an early recognition that nonspeaking people had things to say and deserved tools to say them. Similar boards were created sporadically throughout the early 20th century, but without systematic development, distribution, or research.
Through the 1930s-1950s, communication boards remained the primary AAC option for those lucky enough to have access. These ranged from simple alphabet boards to more complex picture-based systems. The assumption of incompetence remained dominant—many nonspeaking people were institutionalized without any communication support, their thoughts and needs simply ignored.
1950s-1960s: The Modern AAC Era Begins¶
The modern AAC field emerged in the 1950s in Europe and North America, driven by researchers who recognized that nonspeaking people could communicate if given appropriate tools. Early electronic devices were bulky, expensive, and institutionally based—not portable technology that individuals could carry.
A landmark development was the Patient Operated Selector Mechanism (POSM), designed by Reg Maling in 1960. This sip-and-puff typewriter controller allowed people with severe motor impairments to type, demonstrating that communication access was a matter of technology design, not cognitive capacity. The POSM and similar devices remained primarily institutional tools, available to few.
1970s: Voice Output Emerges¶
The 1970s saw transistorized devices replace mechanical systems, making AAC devices smaller and more reliable. By the late 1970s, devices with voice output—synthesized speech that could speak typed or selected messages—became commercially available. The HandiVoice (1977) was among the first portable voice output communication aids (VOCAs).
These early speech-generating devices sounded robotic and mechanical, but they represented a profound shift: for the first time, nonspeaking individuals could "speak" in real-time conversations, their words heard by anyone in the room without requiring the listener to read a board or understand sign language.
1980s: Field Recognition and ISAAC¶
The late 1970s and 1980s saw explosive growth in AAC research, publication, and training. The International Society for Augmentative and Alternative Communication (ISAAC) was founded in 1983, establishing AAC as a recognized field with international conferences, research journals, and professional standards.
Technology advanced rapidly. The EyeTyper allowed people to type via eye movements. Symbol-based systems (like Blissymbolics and later proprietary symbol sets) expanded options for pre-literate users or those who processed images more readily than text. Dedicated AAC devices became more portable, though they remained extremely expensive.
Critically, this era saw growing advocacy around "presume competence"—the principle that nonspeaking individuals should be assumed to understand and have things to say, rather than assumed incompetent until proven otherwise. This represented a fundamental shift from the institutional model that had warehoused nonspeaking people without communication access for generations.
1990s-2000s: Technological Explosion¶
The Dynavox, introduced in the early 1990s, pioneered touch screens with dynamic displays and integrated word prediction—the ancestor of modern tablet-based AAC systems. As computing technology advanced, AAC devices became more powerful, portable, and capable of natural-sounding speech synthesis.
The internet created new possibilities: email and online communication allowed nonspeaking people to participate in conversations without real-time typing pressure, reach global audiences through writing, and connect with other AAC users worldwide. The disability rights movement, gaining momentum through the 1990s, increasingly included nonspeaking advocates using AAC to articulate their own perspectives.
Cost remained a significant barrier. Dedicated AAC devices could cost $8,000-$15,000 or more, often requiring complex insurance approval processes that assumed nonspeaking people didn't "really" need to communicate. Many families went without, cobbled together low-tech solutions, or fought years-long battles with insurance companies.
2010s-Present: Tablet Revolution and Normalization¶
The introduction of iPads and Android tablets transformed AAC access. Suddenly, devices capable of running sophisticated AAC software cost $300-800 rather than $10,000+. Apps like Proloquo2Go, TouchChat, and LAMP Words for Life put powerful communication tools within reach of many more families.
This democratization came with tradeoffs: tablets aren't as durable as dedicated devices, may not offer the same access options (eye tracking, switch access), and insurance coverage for AAC apps remains inconsistent. But the visibility of tablets in everyday life also reduced stigma—an AAC user with an iPad looks like anyone else using technology.
Custom voice banking—recording a person's biological voice before illness or injury destroys it, then using those recordings to create a synthesized voice that sounds like them—became increasingly accessible. What was once experimental technology available to few became a standard option for people anticipating voice loss from ALS, cancer, or other conditions.
The "presume competence" principle gained wider acceptance, though implementation remained uneven. Research consistently demonstrated that given appropriate AAC systems and communication partners who presumed competence, nonspeaking individuals showed capabilities far exceeding what institutional models had assumed. The problem had never been the nonspeaking person's capacity—it was society's failure to provide access and to listen.
Ongoing Challenges¶
Despite advances, significant barriers persist. Insurance coverage for AAC remains inconsistent and inadequate. Many speech-language pathologists lack AAC training. Medical professionals routinely talk to companions rather than AAC users. Employment discrimination against nonspeaking people remains widespread. The assumption that speech equals intelligence persists despite decades of evidence to the contrary.
For AAC users in crisis—medical emergencies, police encounters, psychiatric holds—communication access can be literally life-or-death. Devices get taken away, batteries die, time pressure prevents adequate communication, and decisions are made without the nonspeaking person's input. The right to communicate, while increasingly recognized in principle, remains inconsistently protected in practice.
Era-Specific Character Implications¶
Cody Matsuda (Born 1979; Became Nonspeaking Spring 1995): Cody became nonspeaking at 16 in 1995, in the midst of the AAC technological explosion but before the tablet revolution made devices widely affordable. His family's financial resources (Ellen and Greg Matsuda's professional careers) meant access to dedicated AAC devices that many families couldn't afford—the expensive Dynavox-era technology that required significant investment.
His transition to AAC occurred during the period when "presume competence" was gaining traction in professional circles but hadn't yet reached mainstream consciousness. The assumption that his loss of speech meant loss of intelligence—demonstrated by institutions, medical providers, and strangers—reflects the dominant framework of that era and the ongoing failure to fully embrace presumed competence.
Growing up with AAC in the late 1990s and 2000s meant navigating a world where technology improved rapidly but social attitudes changed slowly. His sophisticated written work—publishing books, co-authoring academic papers, presenting at conferences—represents both his individual brilliance and the broader truth that the AAC field had always known: nonspeaking people have as much to say as anyone else, given the tools and opportunity.
Cody's experience with medical gaslighting before his brain injury (CFS dismissed, suicidal ideation minimized) and communication barriers after it reflects the compounded discrimination facing disabled people whose conditions are first disbelieved, then whose subsequent disabilities are used to further marginalize them.
Charlie Rivera (Born 2007; Intermittent → Full-Time AAC): Charlie's AAC journey differs fundamentally from Cody's. He retains the physical ability to speak; his AAC use is driven by energy conservation in the context of CFS/ME and dysautonomia. This positions him in a different relationship to AAC—as a tool for energy management rather than motor compensation.
His use of AAC increases over his lifetime, from intermittent during severe flares to full-time in his final years. This trajectory reflects the reality of many people with progressive conditions or fluctuating disabilities: AAC use isn't binary (speaking vs. nonspeaking) but a continuum, with different tools appropriate at different times.
Charlie's custom voice bank and laugh macros represent the cutting edge of contemporary AAC: technology sophisticated enough to preserve individual personality, humor, and emotional expression. His approach to AAC—programming laugh recordings, treating the device as an extension of self rather than a medical necessity to be minimized—reflects disability pride and the understanding that adaptive technology is about living fully, not just surviving.
As a high-profile musician using AAC, Charlie's visibility challenges stereotypes about who uses communication devices and what AAC users can accomplish. His Grammy acceptance speech delivered via AAC (in his own synthesized voice) reaches audiences who may never have seen AAC used by someone at the top of their profession—normalizing adaptive technology in elite professional spaces.
TYPES OF AAC¶
Unaided AAC (No External Device)¶
Sign Language: - ASL (American Sign Language) - Complex, complete language - NOT "broken English" on hands - Own grammar, syntax, culture - Requires motor planning (may be affected by apraxia)
Gestures: - Natural gestures - Pointing - Facial expressions - Body language
For Cody: - Family learned ASL after his injury - Ellen, Greg, all siblings became fluent - ASL is primary with family/friends who know it - Faster, more natural than device - But not everyone knows ASL
Aided AAC (External Tools)¶
No-Tech/Low-Tech: - Communication boards - Picture cards - Alphabet boards - Writing/typing
Mid-Tech: - Simple voice output devices - Single message devices - Pre-recorded messages
High-Tech (SGDs - Speech Generating Devices): - iPad with communication apps - Dedicated AAC devices - Text-to-speech - Symbol-based or text-based - Customizable, portable
For Cody: - Uses high-tech AAC device - Text-based (can type, literate) - Text-to-speech output - Faster than writing - Can save common phrases - Also uses laptop for longer writing (published author)
CODY'S SPECIFIC COMMUNICATION¶
What Happened¶
Before (Birth to Spring 1995): - Spoke verbally (likely formal, precise like other autistic characters) - CFS undiagnosed, dismissed as depression - Medical gaslighting - Told Dr. Sato "I don't want to wake up tomorrow" - Dismissed as "teenage melodrama"
The Crisis (Spring 1995, age 16): - Overdosed on Fluoxetine (~28 capsules) - ICU: Cardiac arrest, seizure - Anoxic brain injury (brain deprived of oxygen) - Survived but lost ability to speak - Motor apraxia for speech specifically
What Was Lost: - Motor planning for speech (apraxia) - Ability to coordinate oral muscles for speaking - Voice production
What Was NOT Lost: - Language comprehension (understands everything) - Intelligence (brilliant writer, 92nd percentile English equivalent) - Thoughts, ideas, personality - Ability to communicate (just changed modality)
Motor Apraxia (Cody's Specific Condition)¶
What It Is: - Brain knows what to say - Can't execute motor plan for speech - Disconnect between intention and motor control - Specific to speech production
What It's Like: - "I know the words. I know exactly what I want to say." - "My mouth won't cooperate." - "It's like knowing a piano piece but fingers won't play it." - Frustrating, isolating - NOT cognitive problem - motor problem
Why Writing/Typing Works: - Different motor pathway than speech - Hands not affected - Can express fully through text - Published author - language skills intact
Communication Modalities¶
With Family (ASL Fluent): - Primary: ASL - Fast, natural, expressive - Full language, complete thoughts - Emotional nuance - Ellen, Greg, all siblings fluent
With Andy (Boyfriend/Partner): - ASL (Andy learning, becoming fluent) - AAC device for complex conversations - Writing/texting - Developed their own shorthand - Andy understands Cody completely
With Strangers/Public: - AAC device (text-to-speech) - Typing to communicate - Writing when needed - Not everyone knows ASL - Device gives voice (literally)
For Writing/Academic Work: - Laptop/computer - Published author - Academic papers - Co-authors with Ellen, Greg - Language skills sophisticated
PRESUME COMPETENCE¶
What It Means¶
The Principle: - Assume nonspeaking people understand - Assume intelligence unless proven otherwise - Default to competence, not incompetence - Communication difference ≠ cognitive difference
Why It Matters: - Nonspeaking people often treated as intellectually disabled - Assumed to not understand - Talked over, talked about (not to) - Denied agency and autonomy - Cody's intelligence questioned constantly
For Cody Specifically¶
What People Assume (Wrongly): - Can't understand complex ideas - Needs simplified language - Can't have sophisticated thoughts - Lost intelligence with speech
The Reality: - Brilliant writer - Analyzes literature at college professor level - Discusses To Kill a Mockingbird, 1984, The Crucible - Published author - Disability rights advocate - Co-presents at academic conferences - Lost speech, NOT mind
How to Write This¶
DO: - Show people underestimating Cody - Show Cody's frustration - Show him proving them wrong - Show allies presuming competence (Andy, family) - Show sophisticated thoughts in AAC/writing
DON'T: - Simplify Cody's language in internal monologue - Make him "learn" things he already knew - Treat loss of speech as loss of intelligence - Have characters surprised he's "so smart" (ableist)
AAC DEVICE LOGISTICS¶
Physical Device¶
What It Looks Like: - Likely iPad with AAC app OR dedicated device - Protective case - Shoulder strap or bag - Always with him - Backup battery/charger
Features: - Text-to-speech (types, device speaks) - Saved phrases (common responses) - Customizable - Portable - Internet access (for research, communication)
Using the Device¶
Process: 1. Type message on device screen 2. Device speaks message aloud 3. OR shows text to person
Considerations: - Takes time (typing = slower than speech) - People interrupting before he finishes - Assuming he's done when he's not - Looking at person, not device (eye contact with human, not screen) - Device malfunction = communication loss - Battery dying = panic
In Scenes: - Show typing delay - Others waiting (or not waiting) patiently - Cody looking at person while device speaks - Adjusting volume - Saving phrases for later use - Frustration when device glitches
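For scenes built around device behavior, the type-then-speak flow above can be sketched as a minimal model. This is purely illustrative and hypothetical—not any real AAC app's API—but it makes concrete why interruptions mid-typing are so jarring: nothing is heard until the user explicitly triggers speech.

```python
# Minimal, hypothetical model of the type -> speak AAC flow described above.
# Not based on any real AAC app's API; names and structure are invented.

class AACDevice:
    def __init__(self):
        self.buffer = ""          # message currently being composed
        self.saved_phrases = {}   # label -> full saved phrase
        self.spoken_log = []      # everything the device has "said" aloud

    def type_text(self, text):
        """Composing is incremental; nothing is spoken until speak()."""
        self.buffer += text

    def save_phrase(self, label, phrase):
        """Common responses can be saved and replayed instantly."""
        self.saved_phrases[label] = phrase

    def speak_saved(self, label):
        """Saved phrases skip the typing delay entirely."""
        phrase = self.saved_phrases[label]
        self.spoken_log.append(phrase)
        return phrase

    def speak(self):
        """Only an explicit 'speak' action voices the buffer --
        interrupting before this point means the message was never heard."""
        message, self.buffer = self.buffer, ""
        self.spoken_log.append(message)
        return message

device = AACDevice()
device.save_phrase("intro", "Hi, I type to talk. Please give me a moment.")
device.type_text("I hate ")
device.type_text("that doctor.")
print(device.speak())            # -> "I hate that doctor."
print(device.speak_saved("intro"))
```

The design mirrors the scene notes: typing accumulates silently (the delay others must wait through), and saved phrases exist precisely because retyping common responses costs time and energy.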
COMMUNICATION ACCESS¶
What It Means¶
Definition: Nonspeaking people have the right to communicate and be understood. Society must provide access.
Components: - AAC devices/tools available - Training for communication partners - Time to communicate - Respect for communication mode - Accessible information - Communication support
Barriers to Access¶
Attitudinal: - Not waiting for AAC response - Talking to companion instead of AAC user - Assuming incompetence - Impatience with communication speed - Dismissing AAC as "not real communication"
Physical: - Device malfunction - No power/internet - Loud environments (can't hear device) - No flat surface to set device - Fine motor difficulties (if applicable)
Systemic: - AAC devices expensive (insurance barriers) - Lack of AAC-trained professionals - No communication support in schools/jobs - Inaccessible information (can't ask questions)
For Cody: - Device expensive (Ellen and Greg could afford) - Finding AAC-competent professionals - People talking to Andy instead of him - Assumptions he doesn't understand - Fight for communication access ongoing
CHARLIE'S SPECIFIC AAC USAGE¶
Context and Need¶
Charlie's Conditions: - Chronic Fatigue Syndrome (CFS/ME) - POTS (Postural Orthostatic Tachycardia Syndrome) - Dysautonomia - Gastroparesis - Chronic migraine syndrome
Why AAC: - Vocal energy depletion during severe flares - Speaking requires physical energy Charlie doesn't always have - Not motor apraxia (like Cody) but energy conservation - Intermittent/part-time use when spoons are low - Full-time use in final years (age 70+)
What Was NOT Lost: - Ability to speak (when energy permits) - Language skills - Desire to communicate - Personality, humor, boldness - Musicality and emotional expression
Custom Voice Bank and Laugh Macros¶
Custom Voice Bank: - Recorded Charlie's actual voice before vocal deterioration - Preserves his specific tone, inflection, accent - Device speaks in Charlie's voice, not generic synthesized voice - Allows others to hear "Charlie" even when he can't produce vocal sound - Maintains identity and personality through technology
Custom Laugh Macros (Charlie's Innovation):
Charlie programmed multiple laugh recordings into his AAC device, each capturing a specific type of laughter and emotional expression:
- "BIG GAY CACKLE" - Loud, unrestrained, head-thrown-back laugh that could fill a room. Used when something is genuinely hilarious and joy cannot be contained.
- "evil goblin giggle" - Mischievous, plotting laugh. The sound he makes when planning something delightfully chaotic or when he's being deliberately impish.
- "sitcom laugh track" - Recorded canned laughter. Used when something is so absurd that only artificial laughter can adequately respond, or when social situations demand laughter but Charlie's exhausted.
- "polite chuckle (dead inside)" - Exactly what it promises. Social obligation wrapped in exhaustion. Used when someone tells a bad joke but social norms require acknowledgment.
- "wheezing at 3am laugh" - The specific hilarity of being sleep-deprived and everything becoming funny. Breathless, almost painful laughter that comes from fatigue-induced giggles.
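Conceptually, the laugh macros amount to a lookup from a named emotional register to a pre-recorded clip. A hypothetical sketch (the labels come from this document; the file names and the playback stub are invented placeholders, not a real device's format):

```python
# Hypothetical sketch of laugh macros: named recordings mapped to audio clips.
# Labels are from the document; file names and play_laugh() are invented.

LAUGH_MACROS = {
    "BIG GAY CACKLE": "cackle.wav",               # unrestrained, room-filling joy
    "evil goblin giggle": "goblin.wav",           # mischievous plotting
    "sitcom laugh track": "canned.wav",           # absurdity, or exhausted socializing
    "polite chuckle (dead inside)": "polite.wav", # social obligation
    "wheezing at 3am laugh": "wheeze.wav",        # sleep-deprived giggles
}

def play_laugh(label):
    """Return which clip would play; a real device would route this to audio out."""
    clip = LAUGH_MACROS.get(label)
    if clip is None:
        raise KeyError(f"No laugh macro named {label!r}")
    return clip

print(play_laugh("evil goblin giggle"))
```

The point of the structure is the labeling, not the audio: each macro names a specific emotional register, so Charlie selects *how* he's laughing, not just *that* he's laughing.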
Why This Matters: - Proves AAC isn't just functional communication but full self-expression - Demonstrates that disabled people using adaptive technology don't lose humor, personality, or joy - AAC as tool for emotional authenticity, not just basic needs - Laugh macros preserve Charlie's specific personality when biological voice fails - Challenges assumptions that AAC communication is sterile or impersonal
Communication Modalities¶
When Charlie Uses Each Mode:
Verbal Speech (Biological Voice): - When energy permits - Good health days - Short conversations - Intimate moments with Logan - Performance (when possible) - Becomes less reliable in late life
AAC Tablet with Custom Voice Bank: - During severe CFS/POTS flares - When vocal energy depleted - Longer conversations requiring sustained communication - Professional accessibility consulting work - Public appearances when fatigued - Primary voice in final years (age 70+)
ASL (American Sign Language): - When vocal energy depleted but hands functional - With family/friends who know ASL (Logan, band family) - When AAC device inaccessible or uncharged - During medical appointments
Music, Humming, Touch: - When all other modalities exhausted - Nonverbal emotional expression - Bedbound days - Final days
AAC and Performance Career¶
Challenge: - Grammy-winning jazz musician whose career depends on voice/sound - How does AAC fit into performance identity?
Reality: - AAC for communication, not performance (plays instruments) - Conserves vocal energy for music when possible - Uses AAC for MC segments between sets - AAC during interviews and public appearances - Later career: AAC allows continued accessibility advocacy work - Proves disabled musicians can use adaptive tech and still create at highest levels
Community Response: - Fans understand AAC as access tool - Music community sees it as practical accommodation - Challenges stereotypes about what "real" musicians look/sound like - Normalizes AAC in professional creative settings
Differences from Cody¶
Cody (Motor Apraxia): - Permanent loss of speech motor planning - AAC full-time from age 16 onward - Can't produce speech even with energy - Writing and typing fully intact - AAC necessary for all communication outside ASL
Charlie (Energy Depletion): - Intermittent AAC use (varies by day/health) - Can speak when energy permits - AAC conserves limited energy resources - Full-time AAC only in final years - AAC as energy management, not motor compensation
Both Share: - Presumed competence essential - AAC as real communication (not "less than") - Multiple modalities (AAC + ASL + other methods) - Fighting ableism around communication access - Chosen family learning ASL to support them
Intimate Relationships and AAC¶
Charlie and Logan: - Logan treats AAC as Charlie's voice (because it is) - No differentiation between "AAC Charlie" and "verbal Charlie" - AAC doesn't reduce intimacy or emotional connection - Logan can hear Charlie's personality through device - Laugh macros during private moments (playfulness preserved) - "I love you" through AAC carries same weight as spoken words
Physical Intimacy: - AAC set aside during some intimate moments - Touch and presence don't require words - AAC nearby for communication when needed - Custom voice bank allows "Charlie's voice" in bedroom - No shame or awkwardness around AAC as part of life
Medical Settings¶
Challenges: - Doctors not waiting for AAC response - Talking to Logan instead of Charlie - Assumptions AAC user can't answer complex questions - Time pressure in appointments - Device battery/technical issues during emergencies
What Works: - Logan advocating but redirecting questions to Charlie - Medical team trained in AAC communication access - Pre-typed common medical responses - Yes/no questions when appropriate - Allowing time for Charlie to type complex answers - Treating AAC as valid medical communication
AAC as Activism¶
Charlie's Impact: - High-profile musician using AAC challenges stereotypes - Shows AAC users as full people with humor, sexuality, creativity - Normalizes adaptive technology in mainstream settings - Accessibility consulting work using AAC demonstrates expertise - Laugh macros go viral as example of disability innovation and joy
WRITING NONSPEAKING CHARACTERS¶
Internal Monologue¶
Cody's Thoughts: - Full, complete, sophisticated - Same intelligence as always - Same personality - Same vocabulary - Complex ideas, emotions, analysis - No "simplification" needed
Example:
The irony isn't lost on Cody—he tried to silence himself permanently,
and the universe decided to take his voice anyway. But not his words.
Never his words. Those belong to him, flowing through his fingers
onto screens and pages, reaching people in ways his voice never did.
External Communication (AAC Device)¶
Format Options:
Option 1: Direct Dialogue
Option 2: Description + Dialogue
Option 3: Description + Italics
Recommendation: - Use direct dialogue ("device says") for flow - Mention typing occasionally to remind readers - Don't over-explain mechanism every time - Trust readers to remember
ASL Communication¶
Format Options:
Option 1: Direct Dialogue (Translated)
Option 2: Description
Option 3: Cultural Notes
Cody's signing gets bigger when he's upset, filling the space with
his anger in ways his device never could.
Recommendation: - Translate ASL to English for readability - Occasionally mention physical aspects - Note emotional expression through signing - Don't write ASL grammar (confusing for readers)
Communication Timing¶
Show the Reality: - Typing takes time - Others waiting (or not) - Interruptions - Cody holding up hand ("not finished") - Frustration when rushed - Relief when given time
Example:
Cody types on his device, and Andy waits. He's learned not to fill
the silence, not to guess what Cody's saying, not to jump in with
assumptions. He just waits.
"I hate that doctor," Cody's device finally says.
Andy nods. "Me too."
ABLEISM NONSPEAKING PEOPLE FACE¶
Communication Denied¶
Talking Over: - Starting to speak before device finishes - Interrupting typing - Finishing sentences (wrong) - Not giving time to respond
Talking Around: - "What does he want?" (to companion) - Discussing them like not present - Medical decisions without input - Assumptions without asking
Communication Dismissed: - "That's not what you really mean" - Ignoring AAC communication - Only valuing verbal speech - Treating AAC as "less than"
Competence Questioned¶
Assumptions: - Can't understand - Needs simple language - Intellectually disabled (by default) - Device operates them (not other way around)
For Cody: - Testing center flagging scores (thought someone else took test) - Doctors talking to Ellen instead of him - Strangers talking to Andy instead of him - Shock when he writes sophisticated essays - "I didn't think nonspeaking people could..."
Access Denied¶
Barriers: - No AAC training for professionals - Inaccessible information (video without captions) - Forms requiring verbal response - Phone calls impossible (no relay) - "Just write it down" (not always possible/appropriate)
Medical Settings: - Doctors not trained in AAC - No time given to type responses - Medical questions to family, not patient - Consent assumed (didn't wait for answer) - Procedures explained verbally only
RELATIONSHIPS AND AAC¶
Cody and Andy¶
How They Communicate: - ASL (Andy learning, becoming fluent) - AAC device - Texting - Writing notes - Developed their own shorthand - Don't need words sometimes (presence enough)
What Works: - Andy waits for communication - Never finishes Cody's sentences - Treats AAC as Cody's voice (because it is) - Learning ASL (showing commitment) - Asks Cody directly (never asks others about him) - Presumes competence always
Intimacy: - Communication isn't just words - Touch, presence, eye contact - ASL in darkness (Andy learning to sign by touch) - Texting "I love you" from across room - Device saying "come here" - Andy knows the tone - Silence comfortable, not awkward
Cody and Family¶
Matsuda Family Culture: - All learned ASL (Ellen, Greg, Susie, Pattie, Joey) - Seamless code-switching (ASL ↔ English) - Cody fully included in conversations - No one talks over him - Time always given - ASL at dinner table, family gatherings
Ellen and Cody: - Co-authoring academic papers - Her understanding his signing nuances - Professional partnership (equals) - Mother-son relationship maintained - Communication access never questioned
Greg and Cody: - Two autistic men (different modalities) - Co-presenting at conferences - Greg using spoken word, Cody using device - Both valid, both valued - Intergenerational autism advocacy
AAC AND IDENTITY¶
Nonspeaking Identity¶
Complex Feelings: - Grief for lost voice - Acceptance of new communication - Pride in AAC skills - Frustration with barriers - Identity as nonspeaking advocate
For Cody: - Lost voice traumatically (not from birth) - Remembers speaking - Grief and acceptance coexist - Found new voice through writing - Identity: nonspeaking autistic advocate - Uses experience to help others
Voice vs. Communication¶
The Debate: - Is AAC "finding voice"? - Or is AAC the voice? - Cody: "Device IS my voice now" - Not "giving me a voice" (always had voice - thoughts, ideas) - Device gives others ACCESS to voice
Language Matters: - Not "trapped" in silence (ableist framing) - Not "locked in" (implies prison) - "Nonspeaking" not "nonverbal" (may vocalize) - AAC user, not "using AAC" (identity vs. tool)
Writing as Voice¶
Cody's Published Work: - Voices Beyond Speech (book title) - Disability rights essays - Academic collaborations - Blog, social media presence - Speaking (writing) for nonspeaking community
The Power: - Words reaching thousands - Published voice - Academic credibility - Advocacy platform - Proof: nonspeaking ≠ nothing to say
MEDICAL/EMERGENCY SITUATIONS¶
Medical Settings¶
Communication Challenges: - Doctors not waiting for AAC - Questions requiring immediate answer - Procedures starting before consent given - Medical jargon not accessible - Pain scales assuming verbal communication
What Cody Needs: - Time to type responses - Questions in writing (process time) - Yes/no questions when appropriate - AAC-trained medical staff - Ava or family present to advocate
What Often Happens: - Doctor talks to Ava: "Does he have pain?" - Doctor doesn't wait for device response - Assumptions made without asking Cody - Frustration, violation, medical trauma compounded
Emergencies¶
The Problem: - AAC takes time (emergencies don't allow) - Device might be inaccessible - Pain/distress affecting typing - First responders untrained - Quick decisions needed
Solutions: - Medical alert bracelet (info about communication) - Emergency card (in wallet) - ICE contacts (phone) - Ava/family knowing medical history - Training first responders (ideally)
Worst Case: - Cody conscious but can't communicate - Device out of reach - No one understands ASL - Medical decisions made without input - Violation of autonomy
WRITING AAC IN SCENES - EXAMPLES¶
Everyday Conversation¶
"How was your day?" Andy asks.
Cody types on his device. "Exhausting. Dr. Morrison didn't wait for
my answers. Talked to Ava the whole time."
Andy's jaw tightens. "Did you tell him to—"
Cody holds up a hand: not finished. Types more. "I tried. He said
'this will be faster' and asked Ava everything. Like I wasn't there."
"That's bullshit."
"I know." Cody's device delivers the words flatly, but his face shows
the anger the synthesized voice can't.
Academic/Professional¶
The conference room falls silent as Cody's device speaks: "The medical
model of disability frames nonspeaking people as broken—as if the
problem is our mouths, not society's refusal to listen."
He pauses, typing. Ellen sits beside him, patient. She knows her son's
thoughts are worth waiting for.
"But I'm not broken," the device continues. "My brain forms words.
Complex, sophisticated words. The only difference is they come through
my fingers instead of my mouth. If that makes you uncomfortable, that's
your problem, not mine."
Intimate/Emotional¶
Andy watches Cody's hands move in the dim light, signing slowly.
*I'm scared.*
"Of what?" Andy signs back, still learning but trying.
Cody switches to his device—some things need more words than he has
signs for. "That you'll get tired of waiting. Of giving me time to say
what I mean. That you'll find someone easier."
Andy catches Cody's hands, stilling them. "Hey. Look at me."
Cody does.
"You're not hard. You're not too much. And I'm not going anywhere."
Cody types with one hand, the other held by Andy. "Promise?"
"Promise."
WHAT NOT TO DO¶
Avoid These Tropes:¶
❌ "Miracle cure" restoring speech - Cody's brain injury is permanent - Speech won't return - AAC IS his voice now
❌ AAC as "less than" real communication - Device IS real communication - Not substitute—actual voice - Valid and complete
❌ "Trapped" or "locked in" language - Cody not trapped - Has communication - Ableist framing
❌ Surprised he's intelligent - Nonspeaking ≠ cognitively impaired - Stop being shocked - Presume competence
❌ Only communicating "important" things - AAC for everything (jokes, small talk, "I love you") - Not just needs/emergencies - Full personhood
❌ Device speaking for itself - "The device says"—Cody says - Device is tool, Cody is person - "Cody's device says" OR "Cody says" (through device)
❌ Perfect AAC access - Barriers exist - Device malfunctions - People don't wait - Show the reality
RESOURCES CONSULTED¶
- AssistiveWare (AAC software)
- ASHA (American Speech-Language-Hearing Association)
- AAC users' first-hand accounts
- Research on motor apraxia
- Nonspeaking advocacy organizations
- Communication access guidelines
WRITING CHECKLIST¶
When writing Cody/AAC scenes: - [ ] Internal monologue sophisticated (presuming competence) - [ ] Communication mode specified (ASL, device, writing) - [ ] Timing realistic (typing takes time) - [ ] Others' responses authentic (waiting or not, respectful or not) - [ ] Device logistics mentioned occasionally - [ ] Communication barriers shown when relevant - [ ] Ableism shown accurately (talked over, presumed incompetent) - [ ] Multiple modalities (ASL with family, device with others) - [ ] Emotional nuance in communication - [ ] AAC treated as real communication (not less-than) - [ ] Lost speech ≠ lost intelligence - [ ] Relationships show accommodation and respect - [ ] Medical settings show access barriers - [ ] Writing/academic work shows sophisticated language - [ ] Avoid miracle cure and ableist tropes
This is a living document. Update as you research further or develop Cody's storyline.
Last Updated: February 5, 2026