This is the final article in a 12-part series on why pedagogy fundamentals matter more than ever in the AI age.
We started with Make It Stick. We end with a book that is, in some ways, the most quietly radical of the twelve.
Because every other book in this series argued that AI doesn't change how learning works.
This one argues that AI changes what kind of artifact teaching has to produce.
The artifact has to come apart.
A confession on behalf of the profession
Most of us, at some point in the last twelve months, have asked an AI for "a complete unit on photosynthesis" — and we have stared at the wall of text it produced and felt two things at once.
Relief. That was fast.
And dread. What do I do with this?
The dread is because what came back is one giant slab. One unbroken block. One forty-page document with no joints, no seams, no places to lift a piece out without breaking the whole.
It is a monolith.
Monoliths are terrifying to maintain.
The thesis
In the AI age, the scarce resource is not content. The scarce resource is structure.
AI made content effectively free. Infinite. On tap.
What AI did not do — what AI cannot do, by default, no matter how good the prompt — is make that content learnable.
Learnable means a kid can find a piece of it. A teacher can update a piece of it. A new context can reuse a piece of it. A failure in one part doesn't poison the whole thing.
That property — the property of being made of pieces that snap together, come apart, swap out, and travel — has a name.
Software engineers have been worshipping at this altar for fifty years. Carpenters have known it for centuries. The ALA published a book about it for librarians in 2021.
The name is modularity.
AI generates infinite content. Modularity is what makes infinite content learnable.
The book
Modular Online Learning Design: A Flexible Approach for Diverse Learning Needs by Amanda Nichols Hess (ALA Editions, 2021).
It was written before AI was on anyone's lesson plan.
It turns out, in retrospect, to have been exactly the playbook AI was waiting for.
The lie that needs to die
The story floating around: If something needs to change, just regenerate.
Why bother updating a lesson when you can just ask the AI for a new one? Content is cheap. Generation is free.
I want to argue that this is the most expensive idea in education right now.
Every time you regenerate from scratch, you lose two things.
You lose the judgment you had baked into the previous version — the place where you tweaked the example to match your kids, the place where you swapped out the seductive detail.
And you lose coherence across versions — your Tuesday lesson and your Thursday lesson now drift, because each one was sampled fresh from the model. You can no longer say "this is a unit." You have a pile of one-off generations.
Hess names this directly:
Hess on what modularity buys you
Modularity in learning design makes it easier to update content, run evaluations, add new instructional modules, and adapt these pieces to meet new learning scenarios or needs.
— Amanda Nichols Hess, Modular Online Learning Design
The promise: you update. You don't regenerate.
You swap a piece. You don't rewrite the whole.
The "just regenerate it" mindset is what happens when you have no modules. Your content is one slab, and the only edit you know how to make is to throw the slab away.
What Hess actually says
On the unit of work.
"Modular online learning design is understanding how [educators] can create learning objects... that can be designed and developed with a flexible, adaptable, reusable approach in mind."
A learning object is the unit. Not the lesson. Not the unit plan. The piece. Small enough to lift, name, and reuse.
The AI, by default, does not give you learning objects. It gives you flowing text. Flowing text is what the model was rewarded for during training. But flowing text is the wrong shape for a teacher. A teacher needs a kit of pieces, not a sermon.
On chunking.
"Chunking content is another feature of reordering information for modular online learning projects."
Smaller pieces. Pieces that can be sequenced, reordered, recombined. Not because small is cute, but because small is reusable.
A two-minute "spot the misconception" probe can run in any week of the year. A sixty-minute monolith can run exactly once.
On portability.
"Without considering how content can be flexibly adapted — perhaps by different people, or in different ways — our online learning resources are not, in fact, modular."
A claim that is almost rude in its bluntness. If your lesson cannot survive being lifted by a different teacher in a different week, you do not have a module. You have a fragment.
On scaling.
"Scaling up... involves taking a learning object or objects and finding ways to make broader or more wide-ranging resources. Scaling down involves focusing... for a specific idea, topic, or need."
A unit that supports scaling up, scaling down, and lateral transformation is a unit you will get five years out of.
A unit that doesn't, you will rewrite next August.
On reinvention.
"Redesign, rather than reinvent, the wheel — to revise and reimagine existing content in meaningful ways."
Redesign, not reinvent. Revise, not regenerate. That's a one-line manifesto for working with AI in education in 2026, from a 2021 book.
On adaptation as a property.
"What makes an online learning design task modular is building in adaptability throughout such a project, and especially ensuring that the instructional content is nimble enough to meet a range of diverse needs. This approach ensures that an online learning object can meet more than one need, at more than one point in time, in more than one way."
More than one need. More than one point in time. More than one way. That is the test.
If your lesson only works for one need, at one time, in one way — it is not a module. It is a performance.
AI is very good at producing performances. AI is bad, by default, at producing modules.
That is exactly inverted from what we need.
On rot.
"The most effective learning resources are constantly updated to reflect changing concepts, tools, or processes."
Not a metaphor. An operational claim.
If a learning resource is not on a maintenance schedule, it is on a decay schedule.
There is no third option.
What you do on Monday
Seven skills. The architecture of teaching in the AI age.
1. Object-First Generation.
Before you ask the AI for content, ask it for the inventory of objects. Specify, in writing, the named pieces — each one bounded, each one labeled.
"Do not produce a continuous lesson. Produce a labeled inventory of separate learning objects. For each object give me: a name, a one-line purpose, the content, an estimated time. Each object must be self-contained. I should be able to delete any one of them and the others should still make sense."
That last sentence — I should be able to delete any one of them and the others should still make sense — does all the work. It is the engineering definition of modularity, smuggled into a teaching prompt.
2. Snack-Sized by Default.
When you ask the AI for content, default to micro. Small, named chunks. Combine upward when you need a longer block. Never default to "a lesson."
"Generate a single 5-minute learning chunk on [topic]. It must teach exactly one idea. It must include: a one-line objective, a quick activation, a brief input, and one retrieval prompt. Do not exceed 5 minutes. I will combine multiple chunks myself."
The chunks will feel small. Trust them. By the time you've stacked eight, you have something more flexible than the slab the AI would have given you in one shot.
When one of those chunks needs to change, you change one chunk. Not the whole lesson.
The difference between editing a Lego model and editing a sand sculpture.
3. Interface Contract.
Software engineers solved this decades ago. Every module declares, in writing, what it requires coming in and what it produces going out.
For every learning object, write a one-card contract:
- Prerequisites in. What must the kid already know, have, or have done?
- Objective. What is the single learnable thing inside?
- Artifacts out. What does the kid leave the chunk holding?
- Tested-in contexts. Where has this run successfully?
"For each learning object you produce, prepend a 4-line contract. If a prerequisite is implicit, name it explicitly. Do not assume the reader has seen prior lessons unless the contract says so."
The first time you bolt this on, you'll be shocked by how many hidden dependencies fall out. The lesson can now travel. The lesson becomes a Lego brick, not a snowflake.
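To make the engineering analogy concrete, the one-card contract can be sketched as a tiny data structure. This is an illustrative sketch, not anything from Hess — the class name, field names, and the photosynthesis example are all invented here; the fields simply mirror the four bullets above.

```python
from dataclasses import dataclass, field

@dataclass
class LearningObjectContract:
    """One-card interface contract for a learning object (illustrative sketch)."""
    name: str
    prerequisites_in: list[str]   # what the learner must already know, have, or have done
    objective: str                # the single learnable thing inside
    artifacts_out: list[str]      # what the learner leaves the chunk holding
    tested_in: list[str] = field(default_factory=list)  # contexts it has run in

    def is_satisfied_by(self, learner_has: set[str]) -> bool:
        """True if the learner arrives holding every declared prerequisite."""
        return all(p in learner_has for p in self.prerequisites_in)

# Hypothetical example: a chunk that declares its one hidden dependency
probe = LearningObjectContract(
    name="Spot-the-misconception: photosynthesis",
    prerequisites_in=["plants make their own food"],
    objective="Plant mass comes mostly from CO2, not from soil.",
    artifacts_out=["plants build mass from air"],
)

print(probe.is_satisfied_by({"plants make their own food"}))  # True
print(probe.is_satisfied_by(set()))                           # False: dependency exposed
```

The point of writing it down this explicitly is the `is_satisfied_by` check: once prerequisites are named rather than implicit, you can see at a glance whether a chunk can travel to a class that hasn't seen the prior lessons.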
4. Three-Way Portability.
When you finalize any object, ask three questions:
- Scale up. If I had to teach this to twice the audience, what would I keep, and what would I have to grow?
- Scale down. If I had to teach only the most essential five minutes of this, what would survive?
- Lateral. If I had to deliver this to a totally different group — younger learners, multilingual learners, asynchronous online — without rewriting the content, what would I change about the format?
"For this lesson, generate three companion variants: scaled-up (full unit), scaled-down (10-minute mini), and lateral (different audience or format). Show me what changes and what stays the same."
The next textbook adoption will not eat your weekend. It will eat an afternoon.
5. Build a Library, Not a Bonfire.
Maintain a personal library of learning objects. Treat each generation as a candidate for the library, not a disposable artifact.
After every AI-generated chunk that worked, save it with: a name, a one-line description, the topic and skill, the level it ran at, notes on what to change.
Before generating anything new, search your library first. If something is close, ask the AI to adapt it, not to start over.
"I am attaching a previous learning object I created. Do not write a new one. Modify this one for [new context], preserving its structure, its routines, and the cognitive moves it asks of the learner. Only change what you must."
The teacher who has a curated library of two hundred well-named learning objects has, functionally, more leverage than any prompt-jockey alive. The library is the moat.
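The search-before-generate habit can be sketched in a few lines. Everything here is hypothetical — the entries, the field names, the naive keyword match; a real library might just as well be a folder of markdown files with front-matter, searched the same way.

```python
# Minimal sketch of a search-first library of learning objects.
# Entries and field names are illustrative, not a prescribed schema.
library = [
    {"name": "misconception-probe-photosynthesis",
     "description": "2-min spot-the-misconception on where plant mass comes from",
     "topic": "photosynthesis", "level": "grade 7",
     "notes": "swap the tree example for a local plant"},
    {"name": "retrieval-starter-cells",
     "description": "closed-book 5-question starter on cell organelles",
     "topic": "cells", "level": "grade 7", "notes": ""},
]

def search(query: str) -> list[dict]:
    """Return objects whose name, description, or topic mention the query."""
    q = query.lower()
    return [obj for obj in library
            if q in obj["name"].lower()
            or q in obj["description"].lower()
            or q in obj["topic"].lower()]

# Before generating anything new: search first, adapt the nearest hit.
hits = search("photosynthesis")
if hits:
    print(f"Adapt existing object: {hits[0]['name']}")
else:
    print("Nothing close — generate a new candidate for the library")
```

The discipline matters more than the tooling: the `if hits` branch is where "modify this one, preserving its structure" replaces "write me a new one."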
6. Document the Process, Not Just the Product.
Whenever you finalize an object, also write down the why:
- Design rationale. Why these examples? Why this order? What did you try first and abandon?
- Known failure modes. Where does this lesson predictably break?
- Adaptation notes. What's safe to change? What's load-bearing?
"For each learning object, also produce a design rationale block: why this structure, in plain language; the two or three places this typically breaks; what is safe to change vs. load-bearing."
The very first time you make the AI do this, you'll discover something embarrassing. The AI will guess the design rationale, and the guess will be smarter than what you would have written. Because the rationale is the thinking the AI did to produce the object — and most of us never asked it to expose that thinking.
7. Ship With a Maintenance Stub.
Every learning object you ship gets a tiny attached maintenance stub. Treat it like an expiry date on food.
- Last reviewed. A date. Updated whenever you teach the lesson again.
- Next review due. The next time you commit to checking links, examples, datasets.
- Known stale points. The pieces you already suspect will go bad first.
- Owner. Who is responsible for refresh?
"At the end of this learning object, append a maintenance stub: Last reviewed (date), Next review due (6 months from now), Known stale points (current events, statistics, screenshots, links, tools), Owner (leave blank for me)."
When you ask AI to flag its own stale points, it is shockingly good at it. "The stat about social media usage will be outdated within eighteen months." That is foresight no human teacher can sustain across a hundred lessons.
The AI can. Make it.
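The expiry-date analogy is easy to automate. A minimal sketch, assuming a six-month review window (that number comes from the prompt above, not from Hess) and invented field names that mirror the stub:

```python
from datetime import date, timedelta

# Illustrative sketch: a maintenance stub as data, plus a staleness check.
REVIEW_WINDOW = timedelta(days=182)  # roughly six months

def make_stub(reviewed_on: date, stale_points: list[str], owner: str = "") -> dict:
    """Build a maintenance stub from the date the object was last reviewed."""
    return {
        "last_reviewed": reviewed_on,
        "next_review_due": reviewed_on + REVIEW_WINDOW,
        "known_stale_points": stale_points,
        "owner": owner,
    }

def needs_review(stub: dict, today: date) -> bool:
    """True once the object has passed its review date — decay, not grace."""
    return today >= stub["next_review_due"]

stub = make_stub(date(2025, 9, 1), ["social media usage stat", "tool screenshots"])
print(needs_review(stub, date(2026, 4, 1)))   # True: past the six-month window
print(needs_review(stub, date(2025, 10, 1)))  # False: still fresh
```

Run a check like this over the whole library each term and you get the maintenance schedule Hess describes — the alternative, per the quote above, is a decay schedule.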
The slogan
The Kent Beck refrain, one last time. Invest in the design of the system every day.
For us:
Architect the modules. Fill them with AI.
That's the whole shape of the job now.
The part the machine can do — fill a module, generate an example, draft a worked solution, produce a quiz — will keep getting cheaper, faster, more abundant. Forever.
The part that is irreducibly you — deciding what the modules are, where they snap together, what their interfaces require, how they scale, when they retire, what gets archived, what gets reused, what travels, what stays put — that part is the architecture of your teaching.
That part is what survives generations of models.
That part is the moat.
The series, in one sentence
Twelve articles. Twelve books. One claim:
The fundamentals were never the past.
In the AI age, the fundamentals are the moat.
- Make It Stick told us learning is effortful. AI is the most efficient illusion-of-fluency machine ever built. Hold the line on retrieval, spacing, interleaving.
- Design for How People Learn told us your AI doesn't know your kids. The diagnosis is pedagogy.
- Map It told us information is the equipment, the action is the destination.
- Clark and Mayer told us to edit the cognitive load every day.
- Co-Intelligence told us the partnership is the intelligence in the room.
- AI and Human Agency told us to design for agency every day.
- Pratschke told us to mediate, don't moderate.
- Gagné told us to engineer the conditions for learning every day.
- Kapp told us to design the meaning, delegate the mechanics.
- Brown and Green told us to design the system, let AI ship the artifact.
- Cammy Bean told us to be the designer, every day.
- Hess tells us — today — to architect the modules, fill them with AI.
Twelve different books. Twelve different decades. Twelve different angles.
One unbroken argument.
What's next
If this series landed for you, the EDodo flagship — AI-Powered Learning Design — is the cohort version of all of it. Eight weeks of project-based building, peer review, real artifacts. Educators who care deeply about pedagogy quietly find each other there.
If you've been quietly mastering this in your own classroom — running closed-book starts, five-gap diagnoses, action-mapped units, segmented lessons, cognitive arcs, real motivation, modular libraries — and you can speak fluent AI on top of it, please consider teaching with us.
The world has plenty of educators who know how to ask AI for a lesson.
It does not yet have many educators who know how to build a system of modules that can outlast the model that filled them.
Be one.
The wall of text the AI handed you on Sunday is not a lesson.
It is raw material.
You are the architect. The modules are the work.
The fundamentals are not the past.
In the AI age, the fundamentals are the moat.
That ends the series.
Thank you for reading.
Now go build.
Source: Hess, A. N. (2021). Modular Online Learning Design: A Flexible Approach for Diverse Learning Needs. ALA Editions. All quotes verbatim from the book.
