AI? A response.

At Stillman Academy, we serve students—often dyslexic students—whose intelligence is obvious in conversation but doesn’t reliably show up on paper yet. We begin with a careful analysis of literacy skills (decoding, spelling, fluency, comprehension, and writing), then design an individualized instructional plan using structured, systematic methods—including the Slingerland Approach—so foundational skills become stable and automatic. The goal isn’t to shrink a child’s thinking to match today’s reading level; it’s to raise reading and writing skills to meet the child’s thinking—while protecting confidence, curiosity, and dignity along the way.

Winston Churchill, dyslexic, is remembered for language that could steady a nation when fear was the default. That kind of power isn’t about writing quickly—it’s about understanding, choosing, and persuading in public. We were recently asked, “What’s the use in learning to read if you’re just going to use AI to write your social marketing posts?” The deeper question is what reading is for. Literacy isn’t a performance. It’s agency: the ability to understand what’s being said to you, test it, and decide what you will do next.

Jacqueline Woodson, dyslexic, became one of the most important voices in contemporary children’s literature by refusing to mistake speed for depth. Her story resonates with what parents often see at home: a child reads one sentence beautifully and guesses at the next as if the words have turned slippery. That unpredictability isn’t a character flaw. It’s a reading system that hasn’t stabilized yet—and the child often experiences that instability as doubt about themselves. The work is to make print predictable so intelligence can show up consistently, not only on “good reading days.”

Jamie Oliver, dyslexic, was written off in school—and then went on to publish 30+ books and sell more than 14.55 million copies, becoming the best-selling British non-fiction author since records began (and the second best-selling British author overall, behind J.K. Rowling). His life clarifies a point that matters to families: the turning point usually isn’t “try harder.” It’s instruction and conditions that stop confusing decoding difficulty with laziness. Shame doesn’t build literacy. Explicit teaching and well-designed practice do.

Jack Horner, dyslexic, built a world-class scientific career by learning to read patterns in fragments—bone shards, footprints, partial evidence—and then insisting the fragments could form a coherent story. That is the posture we take with literacy data. We don’t treat “behind” as one category. We ask what’s secure, what’s fragile, and why—then we teach what the brain needs so reading stops feeling like guessing and starts feeling like knowing.

Gavin Newsom, dyslexic, is one example of a person who learned early that language can become a gate—used to judge, sort, and sometimes underestimate. Dyslexic families recognize that pattern immediately: when adults misunderstand a process, they judge a person. Our job is to reverse that. We protect the person, we respect the mind, and we fix the process—skill by skill—until the child has a stable system that supports their ideas instead of trapping them.

George Lucas, dyslexic, didn’t change film by letting technology “tell the story for him.” He changed film by using new tools to expand what stories could be shown—while keeping human judgment and responsibility at the center.  That’s the right frame for AI in education. AI is not the teacher. It is a tool that can widen access to learning materials—but it can’t replace the moral obligation a teacher has to protect a child’s dignity and insist that the child still does their own thinking.

Auguste Rodin, dyslexic, didn’t become Rodin by skipping difficult work. He became Rodin by returning to it—again and again—until form emerged. That is what structured literacy feels like when it’s done well: patient craft, systematic practice, and revision without humiliation. AI only belongs in that world if it supports the craft—never if it becomes a way to avoid it.

Frank Whittle, dyslexic, helps explain why the “tools question” is never just about tools. His jet engine didn’t merely make flight faster; it changed what people could reach, what economies could do, and what the modern world would demand of us. The Smithsonian notes his engine was patented in 1932 and flight tested in May 1941 in Britain’s first jet aircraft, the Gloster E.28/39. Breakthrough tools expand possibility—and they expand responsibility. Big tools require big questions.

That is exactly how the CRUSHER project began: a bespoke tutoring program for a single student, built around subjects that mattered to him, written at his reading level. He was initially captivated by Elon Musk’s stated drive toward a manned mission to Mars. Rather than treat it as a passing fascination, we treated it as fuel for serious reading and writing. We asked: Are resources best spent reaching Mars, or preserving life’s delicate balance on Earth? The student went on to design an amphibious robot that could restore kelp forests by identifying and killing the urchins that eat them. The aim was not “AI use.” The aim was to turn curiosity into literacy—reading and writing used to weigh tradeoffs, investigate evidence, and make a case.

Ansel Adams, dyslexic, teaches the next step: attention. He trained the world to see what a landscape is saying before it disappears, and he developed technical methods (like the Zone System) to bring precision to that seeing. In CRUSHER, the student learned to “read” an ecosystem with that same discipline. We studied the chain of causes behind urchin barrens along the Pacific coast—how predator disruption and sea star wasting syndrome had tipped an entire system—so reading became a method for understanding interdependence, not just decoding words on a page.

Robert Ballard, dyslexic, is a powerful model for what comes next: tools plus interpretation. His 1985 Titanic discovery depended on instruments that could extend human sight—and on the human discipline to interpret what those instruments revealed.  In CRUSHER, AI helped us explore engineering domains I don’t claim to be expert in—materials, battery systems, sensing constraints—while I guided the human work: careful reading, questioning, summarizing, revising, and choosing what was solid enough to build on. Tools extended reach; the student remained responsible for meaning.

What mattered in CRUSHER wasn’t that a child “used AI.” It was that a child used literacy for a purpose worthy of their mind: proposing a tool concept that could help restore kelp forests and biodiversity. The project made practice feel honest. It answered the quiet question many struggling readers carry—Why am I doing this?—with a real-world reason: because reading and writing are how you participate in the world you care about.

Steve Jobs, dyslexic, belongs in the next example not as a mascot but as a reminder that invention is never only invention—it is also the systems that decide which innovations survive. Apple’s rise wasn’t just about clever ideas; it was about product ecosystems, markets, manufacturing, and law—structures that turn concepts into realities at scale.  That’s why a second example of AI-assisted passage design centered on a judge who spent a career standardizing patent law to protect inventors. The literacy skills stayed foundational; the ideas stayed adult-sized.

Charles Schwab, dyslexic, shows how systems can widen or restrict access. Schwab built a “discount broker” model that expanded participation when fixed commissions ended—dropping fees by more than 50% in response to deregulation, according to Schwab’s own history of the period.  That structural lens helped students understand patent law as architecture—not trivia. Then we asked the harder question: what happens when protective systems also create harm?

That’s where we examined medicine pricing—especially life-saving drugs—and the tactics companies use to extend patent protections, raising costs and narrowing access. The moral weight here is not too large for children; it is too large to be left only to the already powerful. A carefully written passage lets a student practice decoding and sentence structure while also learning to hold a real ethical tension: incentivizing innovation versus protecting vulnerable lives.

Joshua Wong, dyslexic, illustrates why this kind of literacy matters. He became influential by learning to name systems clearly and demand accountability—co-founding Scholarism as a teenager and pushing back against policies he believed shaped civic identity and freedom.  In the classroom, the “skill” isn’t outrage. It’s precision: reading closely enough to see structure, and writing clearly enough to explain consequences without being swept away by slogans.

Nolan Bushnell, dyslexic, helped pioneer interactive systems that reward iteration—try, adjust, try again—by co-founding Atari and bringing Pong to life as an arcade breakthrough. Britannica notes Atari sold thousands of Pong arcade machines by 1972, marking the beginning of a new industry.  That iterative rhythm is exactly what developing readers need—and exactly what shame interrupts. AI, used carefully, can reduce the shame tax: it can offer low-stakes practice, immediate feedback, and multiple drafts without a child feeling personally judged.

Notch—Markus Persson—created Minecraft, a world built on construction, revision, and experimentation; Microsoft later bought Mojang for $2.5 billion in 2014.  Whether the medium is games, film, or AI, the principle is the same: new tools reshape reality, and they reshape what “being literate” means. In an AI era, literacy must include the ability to verify, to notice manipulation, and to distinguish confident text from true text—because children will be surrounded by confident text.

Sidney Poitier and Harry Belafonte, both dyslexic, offer a different kind of model: courage paired with responsibility. Their excellence wasn’t limited to art; it extended into risk. SNCC’s Digital Gateway describes how Belafonte raised $70,000 in two days during Freedom Summer and, with Poitier, personally carried the cash to Greenwood, Mississippi—where they were ambushed by the Klan and escorted to safety by activists.  This is the kind of history many children never meet—not because it’s unimportant, but because it’s inconvenient. Kids can handle moral choice. What they often can’t handle is being quietly excluded from rich history because the reading level wasn’t built for them.

Muhammad Ali, dyslexic, shows why literacy remains essential even if AI can draft sentences. He won an Olympic gold medal at 18 and became heavyweight champion, then paid a steep price for refusing the Vietnam War—losing his title and being barred from boxing for years.  Courage requires language: language to name what you believe, language to argue with pressure, language to refuse manipulation. When a child reads a story of courage at their level, they inherit a moral vocabulary alongside a growing decoding system.

Steven Spielberg, dyslexic, helps answer the business side of the question we were asked. He didn’t build his career by doing every job alone; he built systems and teams so his human attention could go where it mattered most. That’s how I think about AI for running a school. Individualization at this level is labor-intensive. I use AI for drafting, planning, and yes, marketing posts—not to outsource thinking, but to protect the hours that should never be outsourced: assessment, direct instruction, building engaging decodable materials tied to student interests, training staff, and supporting families. The goal of operational leverage is not distance from students; it’s more time with them.

And now the ending, gathered tightly. Frank Whittle’s jet engine didn’t make propellers “stupid”; it made travel more possible. No one wants to take a prop plane from San Francisco to Manila when a jet can do the trip—because the point is not nostalgia for the old tool. The point is reaching the destination better. We accepted Whittle’s innovation because it expanded human capability. We accepted structural innovators like Charles Schwab because systems can be redesigned to widen access. We accepted builders like Jobs because tools can become more humane and usable. We accepted creators like Lucas, Spielberg, Rodin, Woodson, Adams, and Ballard because imagination plus craft can change what a generation sees as possible. And we keep learning from people like Joshua Wong, Jamie Oliver, Poitier, Belafonte, and Ali because courage and responsibility are also forms of literacy.

That is how we approach AI at Stillman Academy. We don’t romanticize it, and we don’t outsource the human center of teaching to it. We treat it the way the world has always treated transformative tools: with guardrails, with judgment, and with purpose. We use it to help more students access big ideas—coastal ecology, systems design, pharmaceutical price gouging, civil rights history—at a reading level they can genuinely read today, while we systematically build the foundational skills that will make tomorrow’s reading automatic. We use it to do more of what matters, not less: to see a child’s intelligence clearly, to affirm it, and to teach the skills that let that intelligence be heard—on the page, in the world, and eventually, in the life they choose to build.