
Article 3 of 9 in a series on pedagogy fundamentals in the AI age.

You sit down on Sunday night. You type "Make me a unit on photosynthesis for grade 8" into a chatbot.

You get back a thirty-page document. Three lesson plans. A reading. A vocabulary list. A worksheet. A quiz. A formative assessment. A summative. An exit ticket. A reflection prompt.

You run it.

By Friday, no kid is thinking about photosynthesis differently. No kid is looking at a leaf and asking themselves anything new.

The unit ran. The kids got through it. The artifacts exist.

Nothing in any kid moved.

This is the prompt-to-content tornado. Type a prompt, get an artifact. By Friday afternoon you have produced more content than the teacher down the hall produced in her career — and your students have not changed.

The book that names this and dismantles it was written for the corporate training world in 2017.

It happens to be the most useful book in education at the moment.


The book

Map It: The Hands-On Guide to Strategic Training Design by Cathy Moore.

Cathy built Action Mapping because she watched grown professionals burn millions producing content that did not change a single behavior. Then AI gave the information-dump factory a thousandfold speed boost.

Her opening punch:

"As a profession, we're obsessed with knowledge transfer. We spend our days in a cloud of knowledge that floats so high above the real world that we can't see the intelligence in the faces of the people below or the rocks they have to climb over."

— Cathy Moore, Map It

The cloud of knowledge is exactly where the AI lives. It was trained on the entire internet. So when you ask it for a lesson, it gives you back knowledge — beautifully arranged, bulleted, with three learning objectives at the top and a quiz at the bottom.

Our kids are standing at the bottom of that cloud, in the rain of information, with the rocks still in their way.

Information was never the destination.

Information is the equipment they use to do something differently.

The whole game is the something differently.


The expensive lie

The story going around: AI is a productivity multiplier. Use it to crank out lessons. The more material you produce, the better you serve your students.

The most expensive lie we're telling each other.

"The goal of action mapping is to solve business problems by changing job behavior. Our goal isn't to prepare people for a knowledge test."

— Cathy Moore, Map It

Translate to our world: the goal of teaching is to change what students do. Not prepare them for a quiz.

The AI doesn't know that. It assumes any topic has a body of knowledge, that body needs to be transferred, and once transferred, the job is done. That is the school model in a chatbot.

"Our jobs require far more than knowledge. We have to skillfully apply that knowledge to complex situations. We have to use it to make good decisions on the job."

Your students' jobs require far more than knowledge. They have to apply it in the next problem, the next conversation, the next argument with a friend who claims something nutty about climate.

The AI, left alone, will not design for that job. The AI will design for the test.


What Cathy actually says

On our profession's disease.

"As 'instructors,' we spend our days floating high above the real world in a cloud of knowledge. Information is all we have up there, so we've developed an unhealthy obsession with it. We give it magical powers."

We act as if information is a drug. Pour more in, behavior comes out. Cathy is laughing at us, gently. Behavior doesn't work that way.

On the four-word rule.

"Don't include 'nice to know.'"

Print it. Tape it above your monitor. Every minute a kid spends absorbing nice-to-know is a minute they could have been doing something. The AI doesn't know the difference. You do.

On personalization theatre.

"The school model encourages our clients to think that they can solve their problems with a one-time workshop or course. The typical one-hour webinar isn't very likely to change behavior."

Replace "webinar" with "AI-generated personalized learning path." The diagnosis holds. Personalization at the information layer is a fancier injection. It just changes the color of the syringe.

On engagement bling.

"Tina makes an online course engaging by using narration, images, animation, and clicking. Anna makes an online course engaging by challenging people with realistic problems that affect their jobs or lives."

The whole choice in two sentences. Tina is the AI's default. Anna is the educator.

On the unnecessary lesson.

"Often, we need to take the uncommon step of determining whether training will actually solve the problem. Often, it won't. Maybe it's actually caused by an inefficient procedure, or user-hostile software, or impossible deadlines. None of those will be fixed by a webinar."

The AI cannot run this diagnostic. It assumes every request is a teaching request. You have to interrupt.

On the heart of the method.

"Action mapping eliminates unnecessary information. It doesn't organize it."

The whole prompt-to-content tornado is organizing information. Cathy's life's work is the opposite. Eliminate. Cut everything that doesn't serve a real action.


What you do on Monday

1. Action First. Before you open a single AI tool, write this on paper:

"By Friday, my students will [verb] [object] when [realistic situation]."

The verb cannot be "understand," "appreciate," or "be aware of." Those are cloud verbs. Use do verbs: explain, defend, choose, predict, design, critique, revise, ask, refuse. Then list three realistic decisions a student must make in that situation. Now prompt the AI — and ask only for practice activities tied to those three decisions.

2. The Knowledge Audit. For every chunk of information in an AI lesson, ask: "What does the student DO with this, in a realistic situation, this week?" If "they will know it for a quiz" — cut. If "it might come up later" — cut. If "it's interesting context" — cut.

"For every chunk of information in this lesson, name the specific decision a student will make using that information in the next seven days. If you cannot name a real decision, mark the chunk DELETE."

The first time you watch the AI mark sixty percent of its own lesson DELETE, you'll have a small religious experience.

3. The Five-Why Diagnostic. Before you build any lesson, ask Why five times. Why aren't my students doing this? Why is that the case? Keep going. If the bottom is "they don't have the knowledge yet" — design the lesson. If it's "the reading level is wrong" or "they don't see why it matters" or "the platform is broken" — that's the project. Don't put a lesson on top of it.

"You're the last defense. You're the one remaining hero who can save thousands of innocent people from a pointless information dump. Be their hero." — Cathy Moore

Replace "thousands" with "thirty." The math is the same.

4. The Realistic Challenge Test. "Does this decision actually show up in the student's life — at home, in another class, online, with a friend, with a sibling, with a news article?" If no, replace it. Even an imperfect realistic decision beats a perfectly engaging unrealistic one.

5. DCRC Practice Design. Real practice has four ingredients:

  • Decision — the student has to decide what to do, not show what they know.
  • Context — a named person in a specific place doing a specific thing.
  • Reality — the decision actually shows up in the student's real life.
  • Consequence — when the student picks an answer, they see the realistic consequence and get to draw their own conclusion.

"Rewrite this activity to meet four criteria. Decision: students must decide what to do. Context: include a named character in a specific situation. Reality: the situation must be one the student would realistically encounter. Consequence: for each answer choice, write a story-style continuation showing what realistically happens next. Do NOT write 'Correct' or 'Incorrect.' Show the consequence and let the student draw the conclusion."

6. Pull Not Push. The deepest move in the book.

"Anna lets people pull information when they need it to solve realistic problems. Tina pushes information at people by making them read or listen to it."

Redesign every lesson:

  • Open with a realistic challenge (no preamble).
  • Provide information as optional, pullable supports: a job aid, a "Need help?" link.
  • Let students attempt before they consult.
  • Have feedback show the consequence, with the matching slice of reference appearing at the moment of need.
  • Debrief at the end to surface the underlying concept.

Try this once. The kids who already knew the content fly through and feel respected for the first time in a year. The kids who didn't know it pull what they need and learn it while they're using it.


The slogan

Information is the equipment. The action is the destination.

— Vivek, after Cathy Moore's Map It

AI made the information machine essentially free. It will deliver any equipment you want, in any format, in seconds.

What it cannot do is name the destination.

The destination is the action. The action is yours to name. The action is the irreducibly teacherly part of this work — and it's now the most expensive and most valuable thing you do.


If this is landing, the EDodo flagship — AI-Powered Learning Design — is built on these fundamentals. Eight weeks of project-based building, peer review, real artifacts.

While everyone else is racing to produce more content, you'll be quietly producing more change.

That is a different game. It's the only game that ever mattered.


Source: Moore, C. (2017). Map It: The Hands-On Guide to Strategic Training Design. Montesa Press. All quotes verbatim.