Almost a year ago, I drove to a mall, bought an iPad, ordered an iPad case, and made a purchase that felt like a radical gesture: I downloaded Speak For Yourself, an Augmentative and Alternative Communication (AAC) app that might give Fiona the ability to speak.
Here’s why the app felt kind of radical: The main screen features almost 120 words, which means the icons are small, and there’s no way to make the targets bigger. And Fiona at the time could not isolate her pointer finger (she still can’t), so plenty of people on Fiona’s team doubted whether this app was well suited for Fiona. How would she hit the targets? Maybe she needed an app with fewer words? Maybe this app was too complex for her?
Certainly she could have benefited from an app with bigger targets. But bigger targets came with a serious drawback: the app either offered far fewer words, or it buried words in many more layers of programming, so that any given word required several hits to find. A user had to scroll through five screens just to say “mustard.” No fun.
Choosing an app was a yearlong ordeal, and there were plenty of differing opinions. Everyone knew Fiona’s receptive communication far outweighed her expressive, but nobody quite agreed on how best to enable her to “speak.” We consulted with Fiona’s therapists. We met with two AAC experts stationed in Burlington. We met with a Prentke Romich representative. I corresponded with Dana Nieder at Uncommon Sense. We looked at, and in some cases tested out, Proloquo2go, Touch Chat, LAMP: Words for Life, Aacorn, and Go Talk Now. Meanwhile, we used a binder of words and pictures on cards, which Fiona could point to in order to tell us what she wanted. But the binder was a cumbersome and short-term solution. We wanted what AAC folks call a “speech-generating device,” a device that did what her mouth wanted to do but couldn’t. A device that actually talked.
I kept returning to Speak For Yourself. Sure, the icons were tiny, but every word in Speak For Yourself is either on the main screen or on a secondary screen, which means every word only takes one or two hits to say. There were other bonuses:
- With a capacity of almost 14,000 words, the device would grow with Fiona. If she learned it, she would never need to learn another system.
- It was already preprogrammed. I didn’t have to decide which words should go where. The app’s designers already did the research for me.
- It was highly adaptable. I could mask all but four words, or open up thousands. And there was plenty of space to add words from Fiona’s world, like “Auntie Kim” or “Yo Gabba Gabba.”
- It was incredibly user-friendly. Masking or unmasking a word took two seconds. Searching for a word was as fast as I could spell. And adding a new word took under a minute.
- Unlike with other apps, the targets never moved. Imagine learning to type on a keyboard where the letters keep shifting place. Like a keyboard, the app was designed with “motor planning” in mind.
But would it work for Fiona? Accessibility was the biggest concern. She had weak fingers and she couldn’t isolate them. When shown the app, she would splay her fingers and whack at the screen, hitting five words at once, lighting up a train of strange language. Were we setting her up for failure?
I suspect some folks thought Yes. But the advice from our AAC expert made the most sense to me: Presume competence. Give her the chance. Don’t choose a system based on her fine motor skills; they will catch up. Choose a system based on her language needs.
“How many words should we give her?” a therapist once asked the AAC expert from Burlington when she visited our living room.
“Give her all the words,” she said. “She understands them, so give them to her.”
I loved that idea. Give her all the words. It sounded so bountiful, so abundant and promising, like words were apples overflowing from dozens of barrels.
I also liked the AAC expert’s final advice: “Give it a year.” Whatever app we chose, we should try it out for a year before expecting any results. She reasoned that it takes a year for the average child to start producing language. A year of lullabies and Dr. Seuss and ABC’s before a one-year-old finally says mom or ball or no. So we should expect the same with a new AAC app, she said. A year of input before seeing any output.
To show you where we are a year later, here are two conversations I had with Fiona recently:
[I’m sitting in a rocker with both kids. Petra is snuggled to my left. Fiona is cradled in my right arm. Her talker (the iPad) is on a bench across the room.]
Fiona: [Signs eat.]
Me: You wanna eat?
Fiona: Mm-hm. [Nods.]
Me: We just ate.
Fiona: [Signs eat again.]
Me: Okay. [I put her down.] Go to your chair.
[She crawls to her highchair. I’m tired, and Petra’s snuggly, so I don’t budge, hoping Fiona is just bluffing.]
Fiona: [After a minute of waiting by her chair, she walks five shaky steps to the bench where her talker sits. She taps on her talker.] Eat.
[In other words, How many ways do I have to tell you, Mom? I want to eat.]
Me: [Laughing.] Okay. What do you want to eat?
[Fiona sits in her highchair. She’s done eating dinner. Her talker (the iPad) is in front of her.]
Justin: What do you want to do next?
Fiona: [Taps on her talker.] Play.
Me: Play what?
Fiona: [Taps on her talker.] Dad.
Me: [Laughing.] You want to play with Dad?
Fiona: Mm-hm. [Nods.]
Justin [mouth full of food]: Dad’s still eating dinner.
[A few minutes pass.]
Justin: What do you want to do now?
Fiona: [Taps on her talker, then smiles at something on the screen.] Snuggle.
Justin: Who do you want to snuggle with? (I think she’ll say Dad.)
Fiona: [Taps on her talker.] Eli.
Me: Eli? [A kid from school.] Eli’s not here right now. Who else do you want to snuggle with?
Fiona: Erin. [Fiona’s Personal Care Assistant.]
I suppose these look like simple conversations, but they are revelations in our house. We are conversing with our kid! Fiona is answering questions! She’s offering spontaneous ideas! She’s not yet combining two words, but her single-word responses are totally appropriate, which means her little thumb, steadied against her bent index finger, is reaching its intended squares.
Also, any time she says a person’s name, she has navigated to the correct secondary screen, which means her thumb has made two accurate “hits” in a row. In just a few weeks shy of a year, she has gone from randomly batting at words with her whole hand to intentionally hitting them, and even navigating secondary screens in order to communicate.
In part 2 (which I haven’t written yet), I’ll tell you how we got here. I’ll give you some of our steps, sidetracks, and even backtracks on the road to helping Fiona arrive where she is today.