Designing AI-Assisted Concept-Based Inquiry Classrooms
Chapter 6

Creating Provocations & Assessments


Provocations launch inquiry by creating intellectual need. Assessments reveal whether that inquiry led to genuine understanding. Together, they bookend the CBI experience—one opens the inquiry, the other demonstrates its impact.

This chapter shows you how to design both: provocations that ignite curiosity and assessments that truly measure conceptual understanding.


6.1 What Makes an Effective Provocation?

A provocation is an experience, question, image, artifact, or situation designed to stimulate curiosity, engagement, and the desire to investigate. It creates the "intellectual itch" that drives inquiry.

Characteristics of Effective Provocations

1. Creates Cognitive Dissonance

Effective provocations present something unexpected, paradoxical, or challenging to existing thinking. They make students say "Wait... that doesn't make sense" or "I didn't know that!"

Example: Showing students that a heavy bowling ball and a light tennis ball dropped from the same height hit the ground at the same time—contradicting intuition about weight and falling.

2. Generates Authentic Questions

Rather than telling students what they'll learn, a provocation makes them want to ask. The questions emerge naturally.

Example: Presenting a sealed box that makes unusual sounds when shaken. Students naturally want to know: "What's inside? Why does it sound like that?"

3. Connects to Concepts

While engaging, a provocation must connect to the conceptual understandings you're targeting. Entertainment without conceptual connection is just entertainment.

Example: For a unit on interdependence, showing a video of wolves being reintroduced to Yellowstone and the cascading effects on the entire ecosystem.

4. Is Accessible Yet Challenging

All students should be able to engage with the provocation, but it should create intellectual challenge for everyone.

5. Allows Multiple Entry Points

Different students might respond to different aspects of the provocation, but all responses should lead toward inquiry.

What Provocations Are NOT

Not a hook that's forgotten: A provocation connects to the entire unit, not just the first day.

Not entertainment for its own sake: There must be conceptual purpose.

Not the answer: Provocations raise questions rather than provide answers.

Not a gimmick: Provocations are intellectually substantive.

The Provocation-to-Inquiry Flow

PROVOCATION FLOW

    PROVOCATION
         │
         ▼
   Creates wonder/
     dissonance
         │
         ▼
 Students generate
     questions
         │
         ▼
  Questions focus
     inquiry
         │
         ▼
Investigation seeks
    answers
         │
         ▼
  Understanding
    develops
         │
         ▼
 Generalization
    emerges

6.2 Provocation Types and Design Principles

Provocations come in many forms. Selecting the right type depends on your content, students, and conceptual goals.

Type 1: Visual Provocations

Images, videos, or visual displays that capture attention and raise questions.

Examples:

  • Photographs showing contrast or change over time
  • Infographics with surprising data
  • Artwork depicting abstract concepts
  • Maps that challenge assumptions
  • Time-lapse videos of processes

Design Principles:

  • Image should be immediately engaging
  • Remove explanatory text initially
  • Allow time for observation before discussion
  • Prepare guiding questions if students struggle

Sample Visual Provocation:

Concept: Perspective
Provocation: Two photographs of the same event, taken from different angles, that tell different stories.
Question to prompt: "Which photograph tells the truth?"

Type 2: Artifact Provocations

Physical objects that students can observe, handle, and investigate.

Examples:

  • Primary source documents
  • Scientific specimens
  • Mathematical manipulatives arranged unexpectedly
  • Cultural artifacts
  • Mystery objects

Design Principles:

  • Objects should be safe for students to handle
  • Provide magnifying glasses or other tools as needed
  • Create "notice and wonder" routines
  • Connect artifacts to conceptual questions

Sample Artifact Provocation:

Concept: Adaptation
Provocation: A collection of bird beaks (models) and various types of "food" (nuts, worms, berries). Students match beaks to food without being told the purpose.
Question to prompt: "What determines which birds survive?"

Type 3: Scenario Provocations

Hypothetical situations or real-world dilemmas that engage students in thinking.

Examples:

  • Moral dilemmas
  • "What would you do if..." situations
  • Historical decision points
  • Design challenges
  • Prediction scenarios

Design Principles:

  • Scenarios should feel relevant and real
  • Include enough detail for engagement
  • Ensure multiple valid responses
  • Connect to conceptual understanding

Sample Scenario Provocation:

Concept: Scarcity and Choice
Provocation: "Your community has been given $10,000 to improve life for residents. Here are five proposals, each costing $10,000. How should the money be spent? What happens to the unselected proposals?"
Question to prompt: "What criteria should guide resource allocation?"

Type 4: Demonstration Provocations

Live or recorded demonstrations that reveal something surprising.

Examples:

  • Science experiments with unexpected results
  • Mathematical puzzles that seem impossible
  • Reading a controversial text passage
  • Playing music that challenges expectations

Design Principles:

  • The "surprise" should be genuine
  • Allow predictions before revealing results
  • Resist explaining immediately
  • Let students struggle with the discrepancy

Sample Demonstration Provocation:

Concept: Density
Provocation: Place objects in water; some that students expect to sink will float, and some they expect to float will sink.
Question to prompt: "What determines whether something floats?"

Type 5: Data Provocations

Statistics, graphs, or data sets that reveal patterns or challenge assumptions.

Examples:

  • Surprising statistics
  • Data that contradicts common beliefs
  • Patterns in data sets
  • Incomplete data that requires interpretation

Design Principles:

  • Data should be authentic and verifiable
  • Present without interpretation initially
  • Ask students to find patterns
  • Connect data to human stories

Sample Data Provocation:

Concept: Correlation and Causation
Provocation: A graph showing a strong correlation between ice cream sales and drowning deaths.
Question to prompt: "Does ice cream cause drowning?"

Type 6: Story/Text Provocations

Narratives, excerpts, or quotes that engage emotionally and intellectually.

Examples:

  • Opening passages of literature
  • Primary source accounts
  • Conflicting eyewitness testimonies
  • Provocative quotations
  • Poems that challenge

Design Principles:

  • Texts should be appropriately complex
  • Read aloud when possible
  • Allow initial response before analysis
  • Connect emotional response to conceptual inquiry

Designing Your Own Provocations

Step 1: Identify your target concept(s). What conceptual understanding do you want students to develop?

Step 2: Find the "hook". What's surprising, paradoxical, or fascinating about this concept?

Step 3: Select a provocation type. Which type would best convey this hook to your students?

Step 4: Design for engagement. How can you maximize student curiosity and question generation?

Step 5: Connect to inquiry. How will this provocation lead into the investigation phase?

Step 6: Test and refine. Does this provocation generate the questions you intended?


6.3 Assessment in Concept-Based Inquiry

Assessment in CBI differs fundamentally from traditional assessment. The goal is not to measure fact retention but to determine whether students have developed transferable conceptual understanding.

What We're Really Assessing

In concept-based inquiry, we assess:

Traditional Assessment → CBI Assessment

  • Can students recall facts? → Can students use facts as evidence?
  • Can students define terms? → Can students apply concepts?
  • Can students follow procedures? → Can students transfer understanding?
  • Can students identify the "right" answer? → Can students construct and defend generalizations?

The Assessment Challenge

Conceptual understanding is invisible. We can't see inside students' minds. We can only infer understanding from what students do—their performances, explanations, applications, and transfers.

This creates the assessment challenge: designing tasks that make thinking visible and require demonstration of transferable understanding.

Types of Assessment in CBI

1. Formative Assessment (Throughout Inquiry)

Ongoing assessment that informs instruction:

  • Observation of inquiry process
  • Student questions and discussions
  • Draft generalizations
  • Exit tickets focused on concepts
  • Self-assessment of understanding

2. Summative Assessment (End of Unit)

Assessment that measures achievement of understanding:

  • Performance tasks requiring transfer
  • Explanation of generalizations with evidence
  • Application to new contexts
  • Defense of conceptual claims

The Transfer Test

The ultimate test of conceptual understanding is transfer: Can students apply what they've learned to new, unfamiliar situations?

Near Transfer: Applying learning to similar situations.
Example: After learning about ecosystems using a forest example, students analyze a marine ecosystem.

Far Transfer: Applying learning to significantly different situations.
Example: After learning about interdependence in ecosystems, students analyze interdependence in economic systems.


6.4 The GRASPS Framework for Performance Tasks

The GRASPS framework, developed by Wiggins and McTighe, provides a structure for designing authentic performance assessments that reveal conceptual understanding.

GRASPS Components

G = GOAL
    What is the goal or challenge?
    What problem needs solving?

R = ROLE
    What role does the student assume?
    Whose perspective do they take?

A = AUDIENCE
    Who is the audience for this work?
    Who will receive or evaluate it?

S = SITUATION
    What is the context or scenario?
    What conditions or constraints exist?

P = PRODUCT/PERFORMANCE
    What will students create or do?
    What format will demonstrate understanding?

S = STANDARDS/CRITERIA
    What criteria define success?
    How will quality be judged?

Why GRASPS Works for CBI

  1. Authentic context makes transfer meaningful
  2. Role and audience create purpose beyond school
  3. Product creation requires synthesis of understanding
  4. Criteria focus on conceptual demonstration

GRASPS Examples

Example 1: Science (Ecosystems)

Goal: Advise the city on whether to approve a new development that will affect local wetlands
Role: Environmental consultant
Audience: City Planning Commission
Situation: A developer wants to build housing on a wetland area; the city must decide
Product: Written recommendation with oral presentation
Standards: Must explain ecosystem interdependence, predict effects of development, use evidence

Example 2: Social Studies (Revolution)

Goal: Analyze whether the current situation qualifies as revolutionary
Role: Political analyst
Audience: News network viewers
Situation: A country is experiencing significant unrest; the network wants analysis
Product: Video analysis segment (3-5 minutes)
Standards: Must apply a generalization about the causes of revolution, use historical comparisons

Example 3: Language Arts (Author's Craft)

Goal: Create a writer's guide on effective perspective use
Role: Craft mentor
Audience: Aspiring writers
Situation: A writing website needs content about narrative perspective
Product: Blog post with examples from studied and new texts
Standards: Must explain how perspective shapes reader understanding, demonstrate through original examples

Example 4: Mathematics (Data Analysis)

Goal: Identify misleading data presentations and create ethical versions
Role: Data ethics advisor
Audience: Marketing department of a company
Situation: The company's marketing team has been accused of misleading data use
Product: Report identifying problems and proposing ethical alternatives
Standards: Must apply principles of honest data representation, explain why the original is misleading

Designing GRASPS Tasks

Step 1: Start with the generalization. What understanding should students demonstrate?

Step 2: Identify the transfer context. Where in the "real world" does this understanding matter?

Step 3: Design the scenario. Create an authentic situation that requires this understanding.

Step 4: Define the role and audience. Who would naturally address this situation? For whom?

Step 5: Determine the product. What would this person create in this situation?

Step 6: Establish criteria. What would distinguish sophisticated understanding from limited understanding?


6.5 Assessing Conceptual Understanding

Beyond GRASPS tasks, multiple strategies help assess conceptual understanding throughout and at the end of inquiry units.

Strategy 1: Generalization Defense

Students state a generalization and defend it with evidence from multiple sources.

Structure:

  1. State the generalization clearly
  2. Provide at least three pieces of supporting evidence
  3. Address potential counterarguments
  4. Explain why the generalization matters

Assessment Focus:

  • Is the generalization accurately stated?
  • Does evidence genuinely support the claim?
  • Are connections between evidence and claim explicit?
  • Can student handle complexity and exceptions?

Strategy 2: New Context Application

Present students with an unfamiliar situation and ask them to apply their conceptual understanding.

Example: After studying how geographic features influence settlement patterns: "Here is information about a newly discovered planet. Based on what you understand about geography and settlement, where would humans most likely establish communities? Why?"

Assessment Focus:

  • Does student recognize relevant concepts in new context?
  • Can student apply generalization without prompting?
  • Is reasoning sound even when content is unfamiliar?

Strategy 3: Concept Mapping

Students create visual representations showing relationships among concepts.

Assessment Focus:

  • Are concept connections accurate?
  • Does student explain relationships, not just draw lines?
  • Are connections labeled with relationship types?
  • Does map show hierarchical understanding (macro/micro concepts)?

Strategy 4: Explanation to Naive Learner

Students explain a concept or generalization to someone who knows nothing about it.

Structure: "Explain [concept/generalization] to a younger student (or alien, or time traveler) who has never encountered this idea. Use examples they would understand."

Assessment Focus:

  • Can student explain without jargon?
  • Are examples truly illustrative?
  • Does explanation capture the essential relationship?
  • Can student adapt explanation to audience?

Strategy 5: What Stays the Same?

Students identify what remains constant across varied examples.

Example: "We've studied revolutions in America, France, and industry. What patterns remain the same across all of them? What is always true about revolution?"

Assessment Focus:

  • Can student abstract from specific examples?
  • Does student identify deep patterns, not surface similarities?
  • Is generalization that emerges accurate and significant?

Strategy 6: If/Then Predictions

Students use conceptual understanding to make predictions about unfamiliar situations.

Example: "Based on what you understand about supply and demand, what would happen if a new law limited how many [product] could be produced each month?"

Assessment Focus:

  • Does prediction follow logically from conceptual understanding?
  • Can student explain the reasoning?
  • Does student consider multiple factors?

Rubric for Conceptual Understanding

CONCEPTUAL UNDERSTANDING RUBRIC

LEVEL 4: SOPHISTICATED
├── Accurately states generalizations
├── Provides multiple, varied evidence
├── Addresses complexity and exceptions
├── Transfers to unfamiliar contexts independently
└── Makes connections across concepts

LEVEL 3: PROFICIENT
├── States generalizations correctly
├── Provides adequate supporting evidence
├── Shows understanding of main relationship
├── Transfers to similar contexts with guidance
└── Connects to related concepts

LEVEL 2: DEVELOPING
├── Partially accurate generalization
├── Evidence incomplete or weakly connected
├── Shows surface-level understanding
├── Limited transfer ability
└── Treats concepts in isolation

LEVEL 1: BEGINNING
├── Cannot state generalization accurately
├── Confuses facts with generalizations
├── Understanding tied to specific examples
├── No evidence of transfer
└── Limited concept vocabulary

Templates & Tools

Template 6.1: Provocation Design Template

PROVOCATION DESIGN TEMPLATE

UNIT: ______________________________________
TARGET CONCEPTS: ____________________________
TARGET GENERALIZATION: ______________________
_________________________________________

PROVOCATION TYPE:
□ Visual  □ Artifact  □ Scenario
□ Demonstration  □ Data  □ Story/Text

DESCRIPTION OF PROVOCATION:
_________________________________________
_________________________________________
_________________________________________

COGNITIVE DISSONANCE IT CREATES:
What assumptions does it challenge?
_________________________________________

QUESTIONS IT SHOULD GENERATE:
What should students wonder?
1. _______________________________________
2. _______________________________________
3. _______________________________________

CONNECTION TO INQUIRY:
How does this lead to investigation?
_________________________________________

MATERIALS NEEDED:
_________________________________________

IMPLEMENTATION NOTES:
How will you present it? What will you say/not say?
_________________________________________
_________________________________________

BACKUP QUESTIONS:
If students don't generate questions naturally:
1. _______________________________________
2. _______________________________________

Template 6.2: GRASPS Task Designer

GRASPS PERFORMANCE TASK DESIGN

UNIT: ______________________________________
TARGET GENERALIZATION: ______________________
_________________________________________

G - GOAL
What challenge or problem must be addressed?
_________________________________________
_________________________________________

R - ROLE
What role does the student assume?
_________________________________________

A - AUDIENCE
Who is the audience for this work?
_________________________________________

S - SITUATION
What is the context? What constraints exist?
_________________________________________
_________________________________________
_________________________________________

P - PRODUCT/PERFORMANCE
What will students create or do?
_________________________________________

S - STANDARDS/CRITERIA
What will distinguish excellent from adequate work?

Criterion 1: _______________________________
Criterion 2: _______________________________
Criterion 3: _______________________________
Criterion 4: _______________________________

CONCEPTUAL UNDERSTANDING REQUIRED:
What must students understand to succeed?
_________________________________________
_________________________________________

TRANSFER DEMAND:
How does this require applying learning to a new context?
_________________________________________

Template 6.3: Conceptual Understanding Assessment Plan

CONCEPTUAL UNDERSTANDING ASSESSMENT PLAN

UNIT: ______________________________________
KEY CONCEPTS: ______________________________
TARGET GENERALIZATION: ______________________
_________________________________________

FORMATIVE ASSESSMENTS (during inquiry)

Assessment Point 1:
□ When: _________________________________
□ Method: _______________________________
□ What I'm looking for: ___________________

Assessment Point 2:
□ When: _________________________________
□ Method: _______________________________
□ What I'm looking for: ___________________

Assessment Point 3:
□ When: _________________________________
□ Method: _______________________________
□ What I'm looking for: ___________________

SUMMATIVE ASSESSMENT (end of unit)

Type: □ GRASPS Task  □ Generalization Defense
      □ New Context Application  □ Other: ______

Description:
_________________________________________
_________________________________________

Transfer Context:
How is this different from contexts studied?
_________________________________________

Success Criteria:
What will demonstrate conceptual understanding?
1. _______________________________________
2. _______________________________________
3. _______________________________________

DIFFERENTIATION:
How will assessment be adapted for varied learners?
_________________________________________
_________________________________________

AI Prompts for Provocations & Assessments

Prompt 6.1: Provocation Generator

Generate 5 provocations to launch a unit on [TOPIC]
targeting these concepts: [CONCEPTS]

The provocations should lead students toward this
generalization: [GENERALIZATION]

Grade Level: [GRADE]
Subject: [SUBJECT]

For each provocation:
1. Describe it in detail
2. Identify its type (visual, artifact, scenario, etc.)
3. Explain what cognitive dissonance it creates
4. List 3-4 questions it should naturally generate
5. Explain how it connects to the inquiry that follows
6. Note any materials or preparation needed
7. Suggest how to facilitate discussion after presenting

Please include a variety of provocation types and
indicate which would be most powerful as the unit opener.
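
If you prefer to run these prompts from a script rather than pasting them into a chat window, the sketch below shows one way to fill Prompt 6.1's bracketed placeholders and send it to a language model. It is a minimal sketch, assuming the OpenAI Python client (v1.x) and an OPENAI_API_KEY environment variable; the model name and the sample unit details are illustrative placeholders, not recommendations.

# Minimal sketch: fill Prompt 6.1's placeholders and send it to a language model.
# Assumes the OpenAI Python client (v1.x) and an OPENAI_API_KEY environment variable.
from openai import OpenAI

PROMPT_6_1 = """\
Generate 5 provocations to launch a unit on {topic}
targeting these concepts: {concepts}

The provocations should lead students toward this
generalization: {generalization}

Grade Level: {grade}
Subject: {subject}

For each provocation:
1. Describe it in detail
2. Identify its type (visual, artifact, scenario, etc.)
3. Explain what cognitive dissonance it creates
4. List 3-4 questions it should naturally generate
5. Explain how it connects to the inquiry that follows
6. Note any materials or preparation needed
7. Suggest how to facilitate discussion after presenting

Please include a variety of provocation types and
indicate which would be most powerful as the unit opener.
"""

# Illustrative unit details -- replace with your own.
filled_prompt = PROMPT_6_1.format(
    topic="ecosystems",
    concepts="interdependence, balance, change",
    generalization="Changes to one part of a system ripple through the entire system.",
    grade="Grade 5",
    subject="Science",
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o",  # any capable chat model you have access to
    messages=[{"role": "user", "content": filled_prompt}],
)
print(response.choices[0].message.content)

The same pattern works for any prompt in this chapter: keep the template text, substitute your unit details, and review the output critically before bringing it into your classroom.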

Prompt 6.2: GRASPS Task Creator

Create a GRASPS performance assessment for a unit on
[TOPIC] that assesses understanding of this
generalization: [GENERALIZATION]

Grade Level: [GRADE]
Subject: [SUBJECT]

Provide:
1. Complete GRASPS framework
   - Goal
   - Role
   - Audience
   - Situation
   - Product/Performance
   - Standards/Criteria

2. Student-facing task description
   (written at appropriate level)

3. Detailed rubric with 4 levels
   - Sophisticated
   - Proficient
   - Developing
   - Beginning

4. Teacher notes including:
   - What conceptual understanding this reveals
   - How it requires transfer (not just recall)
   - Potential scaffolds for struggling students
   - Extensions for advanced students

Prompt 6.3: Transfer Task Designer

Design 3 tasks that assess transfer of this
generalization: [GENERALIZATION]

The unit focused on [ORIGINAL CONTEXT].

Create transfer tasks at three levels:

1. NEAR TRANSFER
   - Similar context to what was studied
   - Should be accessible to most students
   - Describe context and task

2. MODERATE TRANSFER
   - Different but related context
   - Requires adaptation of understanding
   - Describe context and task

3. FAR TRANSFER
   - Significantly different context
   - Tests deep, flexible understanding
   - Describe context and task

For each task, explain:
- What makes it this level of transfer
- What successful response would look like
- Common mistakes to anticipate
- How to scaffold if students struggle

Prompt 6.4: Formative Assessment Menu

Create a menu of formative assessment strategies for
a unit on [TOPIC] focusing on [CONCEPTS].

Target Generalization: [GENERALIZATION]
Grade Level: [GRADE]
Unit Duration: [LENGTH]

Provide 8-10 formative assessment strategies that:
1. Can be used quickly during instruction
2. Reveal conceptual understanding (not just facts)
3. Inform instructional decisions
4. Are appropriate for the grade level

For each strategy:
- Name/title
- Brief description
- When to use it (which phase of inquiry)
- What it reveals about understanding
- How to respond to what you learn
- Variation for different learning needs

Prompt 6.5: Assessment Alignment Check

Review this assessment for alignment with CBI principles:

Assessment Description:
[DESCRIBE YOUR ASSESSMENT]

Target Generalization:
[YOUR GENERALIZATION]

Please evaluate:

1. ALIGNMENT
   - Does this assessment measure conceptual understanding?
   - Does it require transfer or just recall?
   - Does it assess the stated generalization?

2. AUTHENTICITY
   - Is the context meaningful?
   - Would students see purpose in this task?

3. RIGOR
   - Does it demand sophisticated thinking?
   - Could a student with only surface-level understanding succeed?

4. FEASIBILITY
   - Is it manageable for teacher and students?
   - What resources are required?

5. RECOMMENDATIONS
   - What would strengthen this assessment?
   - What alternative approaches might work?
   - How could it better measure transfer?
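
The prompts can also be chained. For example, a draft GRASPS task from Prompt 6.2 can be fed directly into the alignment check in Prompt 6.5. The sketch below illustrates that pattern under the same assumptions as the earlier example (OpenAI Python client v1.x, OPENAI_API_KEY set); the ask helper, the abbreviated prompt text, and the sample unit are illustrative.

# Sketch: chain Prompt 6.2 (create a GRASPS task) into Prompt 6.5 (alignment check).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(prompt: str) -> str:
    """Send one prompt to the model and return its text response."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

generalization = (
    "Scarcity forces individuals and societies to make choices about how to use resources."
)

# Step 1: draft a GRASPS performance task (abbreviated Prompt 6.2, placeholders filled).
draft_task = ask(
    "Create a GRASPS performance assessment for a unit on economics that assesses "
    f"understanding of this generalization: {generalization}\n"
    "Grade Level: Grade 7\nSubject: Social Studies\n"
    "Provide the complete GRASPS framework, a student-facing task description, "
    "a detailed 4-level rubric, and teacher notes."
)

# Step 2: run the alignment check (abbreviated Prompt 6.5) on the draft.
review = ask(
    "Review this assessment for alignment with CBI principles:\n\n"
    f"Assessment Description:\n{draft_task}\n\n"
    f"Target Generalization:\n{generalization}\n\n"
    "Evaluate alignment, authenticity, rigor, and feasibility, and give "
    "recommendations for strengthening the assessment."
)

print(draft_task)
print(review)

The draft and the review are starting points; the professional judgment about what counts as evidence of conceptual understanding stays with you.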

Key Takeaways

  1. Provocations create the intellectual need that drives inquiry. They generate questions rather than answer them.

  2. Effective provocations create cognitive dissonance, generate authentic questions, connect to concepts, are accessible yet challenging, and allow multiple entry points.

  3. Assessment in CBI measures transferable understanding, not fact recall. The key question is: Can students apply their understanding to new contexts?

  4. The GRASPS framework helps design authentic performance tasks with Goal, Role, Audience, Situation, Product, and Standards.

  5. Transfer is the ultimate test of conceptual understanding. Near and far transfer tasks reveal depth of understanding.

  6. Formative assessment throughout inquiry helps monitor and guide conceptual development.

  7. Multiple assessment strategies—generalization defense, new context application, concept mapping, explanation—provide varied evidence of understanding.


Reflection Questions

  1. Think about a unit you teach. What provocation could create more intellectual need at the start?

  2. How do your current assessments measure transfer versus recall? What would you need to change?

  3. Design a GRASPS task for a unit you'll teach soon. What makes it authentic?

  4. How do you currently assess conceptual understanding versus factual knowledge? What balance do you want?


This concludes Part 2: The CBI Toolkit. In Part 3, we'll explore how to implement these tools appropriately across different grade levels, from early elementary through high school.