AI? A Response.

At Stillman Academy, we serve students—often dyslexic students—whose intelligence is obvious in conversation but doesn’t reliably show up on paper yet. We begin with a careful analysis of literacy skills (decoding, spelling, fluency, comprehension, and writing), then design an individualized instructional plan using structured, systematic methods—including the Slingerland Approach—so foundational skills become stable and automatic. The goal isn’t to shrink a child’s thinking to match today’s reading level; it’s to raise reading and writing skills to meet the child’s thinking—while protecting confidence, curiosity, and dignity along the way.

We were recently asked, “What’s the use in learning to read if you’re just going to use AI to write your social marketing posts?” The deeper question is what reading is for. Literacy isn’t just a performance. It’s agency: the ability to understand what’s being said to you, test it, and decide what you will do next.

This is how we approach AI use at Stillman Academy. We don’t romanticize it, and we don’t outsource the human center of teaching to it. We treat it the way the world has always treated transformative tools: with guardrails, with judgment, and with purpose. We use it to help more students access big ideas—coastal ecology, systems design, pharmaceutical price gouging, poetry and art, civil rights history—at a reading level they can genuinely read today, while we systematically build the foundational skills that will make tomorrow’s reading automatic. We use it to do more of what matters, not less: to see a child’s intelligence clearly, to affirm it, and to teach the skills that let that intelligence be heard—on the page, in the world, and eventually, in the life they choose to build.

This is exactly how Stillman Academy’s CRUSHER project began: as a teacher-student collaborative tutoring program built around subjects that matter to each student, delivered at their reading level. One of our students, by submitting his questions to AI, conceptualized and designed an amphibious robot that could help restore kelp forests by identifying and killing urchins. In the process, he learned to “read” an ecosystem with that same discipline. We studied the chain of causes behind urchin barrens along the Pacific coast, including how predator disruption and sea star wasting syndrome had tipped an entire system, so reading became a method for understanding interdependence, not just decoding words on a page. What matters in CRUSHER isn’t that a child “uses AI.” It’s that a child uses literacy for a purpose worthy of their mind: proposing a tool concept that could help restore kelp forests and biodiversity. The project makes the practice feel honest. It answers the quiet question many struggling readers carry, Why am I doing this?, with a real-world reason: because reading and writing are one way you can participate in the world you care about. CRUSHER also lets us explore domains of knowledge a teacher may not be an expert in, while the teacher guides the human work: careful reading, questioning, summarizing, revising, and choosing what is solid enough to build on. AI tools extend the project’s reach, but the student remains responsible for interpreting, questioning, and making meaning.

Stillman Academy uses AI for drafting, planning, and yes, marketing posts—not to outsource thinking, but to protect the hours that should never be outsourced: assessment, direct instruction, building engaging decodable materials tied to student interests and lexical knowledge, training staff, and supporting families. The goal of operational leverage is not distance from students; it’s more time with them.

Take Winston Churchill, who was dyslexic. He is remembered for employing language that could steady a nation when fear was the default. That kind of power isn’t about writing quickly—it’s about understanding, choosing, and persuading in public.

Or Jacqueline Woodson, also dyslexic. She became one of the most important voices in contemporary children’s literature by refusing to mistake speed for depth. Her story resonates with what parents often see at home: a child reads one sentence beautifully and guesses at the next as if the words have turned slippery. That unpredictability isn’t a character flaw. It’s a reading system that hasn’t stabilized yet—and the child often experiences that instability as doubt about themselves. The work is to make print predictable so intelligence can show up consistently, not only on “good reading days.”

Jamie Oliver, also dyslexic, was written off in school—and then went on to publish 30+ books and sell more than 14.55 million copies, becoming Britain’s best-selling non-fiction author. His life clarifies a point that matters to families: the turning point usually isn’t “try harder.” It’s instruction and conditions that stop confusing decoding difficulty with laziness. Shame doesn’t build literacy. Explicit teaching and well-designed practice do.

Jack Horner, another dyslexic, built a world-class scientific career by learning to read patterns in fragments—bone shards, footprints, partial evidence—and then insisting the fragments could form a coherent story. That is the posture we take with literacy data. We don’t treat “behind” as one category. We ask what’s secure, what’s fragile, and why—then we teach what the brain needs so reading stops feeling like guessing and starts feeling like knowing.

Gavin Newsom, dyslexic, is one example of a person who learned early that language can become a gate—used to judge, sort, and sometimes underestimate. Dyslexic families recognize that pattern immediately: when adults misunderstand a process, they judge a person. Our job is to reverse that. We protect the person, we respect the mind, and we fix the process—skill by skill—until the child has a stable system that supports their ideas instead of trapping them.

George Lucas and Steven Spielberg, both dyslexic, didn’t change film by letting technology tell the story for them. They changed film by using new tools to expand what stories could be shown, while keeping human judgment and responsibility at the center. That’s the right frame for AI in education. AI is not the teacher. It is a tool that can widen access to learning materials, but it can’t replace the moral obligation a teacher has to protect a child’s dignity and insist that the child still does their own thinking.

Auguste Rodin, dyslexic, didn’t become Rodin by skipping difficult work. He became Rodin by returning to it—again and again—until form emerged. That is what structured literacy feels like when it’s done well: patient craft, systematic practice, and revision without humiliation. AI only belongs in that world if it supports the craft—never if it becomes a way to avoid it.

Frank Whittle, dyslexic, helps explain why the “tools question” is never just about tools. His jet engine didn’t merely make flight faster; it changed what people could reach, what economies could do, and what the modern world would demand of us. Big tools require big questions.

Ansel Adams, dyslexic, teaches the next step: attention. He trained the world to see what a landscape is saying before it disappears, and he developed technical methods (like the Zone System) to bring precision to that seeing. 

Robert Ballard, dyslexic, is a powerful model for what comes next: tools plus interpretation. His 1985 Titanic discovery depended on instruments he invented that could extend human sight—and on the human discipline to interpret what those instruments revealed. 

One of our students was initially captivated by Elon Musk’s stated drive toward a manned mission to Mars. Rather than treat it as a passing fascination, we treated it as fuel for serious reading and writing. We asked: Are resources best spent reaching Mars, or preserving life’s delicate balance on Earth? The aim was not “AI use.” The aim was to turn curiosity into literacy—reading and writing used to weigh tradeoffs, investigate evidence, and make a case.

Apple’s rise wasn’t just about clever ideas and short-term gains; it was about another dyslexic, Steve Jobs, and his intuitive grasp of design that made technology accessible to millions. Jobs’s leadership, combined with Apple’s research into product ecosystems, markets, manufacturing, and law (the structures that turn concepts into realities at scale), made Apple what it is today. That’s why, when Stillman Academy uses an AI-assisted reading passage for students curious about Steve Jobs, we deliver the goods: the literacy skills are foundational; the ideas are adult-sized.

Charles Schwab, dyslexic, shows how systems can widen or restrict access. Schwab built a “discount broker” model that expanded participation when fixed commissions ended—dropping fees by more than 50% in response to deregulation, according to Schwab’s own history of the period.  That structural lens helped students understand the economy and introduced them to financial literacy.

Our patent law reading passage examined medicine pricing—especially life-saving drugs—and the tactics companies use to extend patent protections, raising costs and narrowing access. The moral weight here is not too large for children; it is too large to be left only to the already powerful. A carefully written passage lets a student practice decoding and sentence structure while also learning to hold a real ethical tension: incentivizing innovation versus protecting vulnerable lives. Then we asked the harder question: what happens when protective systems also create harm?

Joshua Wong, dyslexic, illustrates why this kind of literacy matters. He became influential by learning to name systems clearly and demand accountability—co-founding Scholarism as a teenager and pushing back against policies he believed shaped civic identity and freedom.  In the classroom, the “skill” isn’t outrage. It’s precision: reading closely enough to see structure, and writing clearly enough to explain consequences without being swept away by slogans.

Nolan Bushnell, dyslexic, helped pioneer interactive systems that reward iteration—try, adjust, try again—by co-founding Atari and bringing Pong to life as an arcade breakthrough. Britannica notes Atari sold thousands of Pong arcade machines by 1972, marking the beginning of a new industry.  That iterative rhythm is exactly what developing readers need—and exactly what shame interrupts. AI, used carefully, can reduce the shame tax: it can offer low-stakes practice, immediate feedback, and multiple drafts without a child feeling personally judged.

Notch—Markus Persson—created Minecraft, a world built on construction, revision, and experimentation; Microsoft later bought Mojang for $2.5 billion in 2014.  Whether the medium is games, film, or AI, the principle is the same: new tools reshape reality, and they reshape what “being literate” means. In an AI era, literacy must include the ability to verify, to notice manipulation, and to distinguish confident text from true text—because children will be surrounded by confident text.

Sidney Poitier and Harry Belafonte, both dyslexic, offer a different kind of model: courage paired with responsibility. Their excellence wasn’t limited to art; it extended into civil rights. SNCC’s Digital Gateway describes how Belafonte raised $70,000 in two days during Freedom Summer and, with Poitier, personally carried the cash to Greenwood, Mississippi—where they were ambushed by the Klan and escorted to safety by activists.  This is the kind of history many children never meet—not because it’s unimportant, but because it’s inconvenient. Kids can handle moral choice. What they often can’t handle is being quietly excluded from rich history because the reading level wasn’t built for them.

Muhammad Ali, dyslexic, shows why literacy remains essential even if AI can draft sentences. He won an Olympic gold medal at 18 and became heavyweight champion, then paid a steep price for refusing the Vietnam War—losing his title and being barred from boxing for years.  Courage requires language: language to name what you believe, language to argue with pressure, language to refuse manipulation. When a child reads a story of courage at their level, they inherit a moral vocabulary alongside a growing decoding system.

Frank Whittle’s jet engine didn’t make propellers “stupid”; it made travel more possible. No one wants to take a prop plane to Manila from San Francisco when a jet can do the trip—because the point is not nostalgia for the old tool. The point is reaching the destination better. We accepted Whittle’s innovation because it expanded human capability. We accepted structural innovators like Charles Schwab because systems can be redesigned to widen access. We accepted builders like Jobs because tools can become more humane and usable. We accepted creators like Lucas, Spielberg, Rodin, Woodson, Adams, and Ballard because imagination plus craft can change what a generation sees as possible. And we keep learning from people like Joshua Wong, Jamie Oliver, Poitier, Belafonte, and Ali because courage and responsibility are also forms of literacy.