Building a Play-Based Curriculum That Meets State Standards
You already know play works. You've seen what happens when children are given space to explore, build, negotiate, and create. You've watched a child work through a social conflict during dramatic play and develop more emotional regulation in ten minutes than a worksheet on feelings could produce in a week.
The challenge isn't the philosophy. It's the paperwork.
When your director asks how block play addresses IELDS Benchmark 6.A, when a parent wants to know why there are no handwriting worksheets in the take-home folder, when your QRIS reviewer needs documentation that your program is systematically addressing all developmental domains — that's where the tension lives. Not in whether play works, but in how you prove it.
The Core Reframe
State early learning standards — including the Illinois Early Learning and Development Standards (IELDS) — define what children should know and be able to do. They do not prescribe how you teach it. The IELDS document itself states it is "not a curriculum" and "not an assessment tool." Standards are the destination. Play-based curriculum is a route. The question isn't play or standards. It's how your play-based approach systematically ensures you're addressing all standard domains — and how you document that it's happening.
This article offers a practitioner framework for doing exactly that.
The Play Spectrum: Where Purposeful Play Fits
Not all play is the same, and the research literature draws important distinctions. Understanding where your program sits on the spectrum clarifies both your pedagogy and your documentation strategy.
At one end is unstructured free play — child-directed, adult-absent, no predetermined learning goal. At the other end is direct instruction — teacher-directed, scripted, focused on specific skill acquisition. Neither extreme, practiced exclusively, constitutes effective early childhood practice. NAEYC's fourth edition position statement on Developmentally Appropriate Practice is explicit on this point: effective practice "does not mean simply letting children play in the absence of a planned learning environment, nor does it mean predominantly offering direct instruction."
The productive middle ground is what the research calls guided play. Weisberg, Hirsh-Pasek, and Golinkoff defined guided play as having two essential features: the child maintains autonomy and agency within the play experience, and an adult provides scaffolding — either through environmental design or active facilitation. Guided play takes two forms. In the prepared environment approach, the teacher designs the setting to highlight a learning goal while children have autonomy to explore within it. In active facilitation, the teacher enters ongoing play to extend learning through questioning, modeling, or introducing vocabulary.
What I call the Purposeful Play Framework adds a third layer that most guided play literature doesn't operationalize at the program level: a systematic assessment-to-action cycle. Teachers observe during play, document what they see against developmental benchmarks, and adjust the next day's environment and facilitation based on that data. Play is the vehicle. Purpose is the engine. And the assessment cycle is the steering.
This distinction matters for standards alignment because it gives you a documentation trail. Free play produces learning but doesn't produce evidence. Direct instruction produces evidence but often at the cost of the learning conditions that make early childhood education effective. Purposeful play — guided play embedded in an intentional assessment system — produces both.
What the Evidence Actually Shows
If you're making this case to a director, a board, or a skeptical parent, the research base is stronger than you might realize.
The most rigorous current evidence comes from a 2022 meta-analysis published in Child Development by Skene and colleagues, which reviewed 39 studies and included 17 in a quantitative synthesis. Guided play outperformed direct instruction on early math skills, shape knowledge, and executive function measures including task switching. Guided play also outperformed free play on spatial vocabulary acquisition. The effect sizes were not trivial — the advantage for shape knowledge, for instance, was substantial.
The longitudinal evidence is even more compelling. The HighScope Perry Preschool Study, which followed participants for more than fifty years, used a play-based curriculum built around child-initiated activity with teacher scaffolding. Participants showed sustained gains in cognition, employment, earnings, and health — including intergenerational benefits documented by Heckman and colleagues in 2020. And the HighScope Preschool Curriculum Comparison Study offers perhaps the most striking finding in the field: children in the direct instruction group showed comparable IQ gains to the play-based group initially, but by age 23, the direct instruction group had three times as many felony arrests and significantly higher rates of emotional impairment requiring treatment. The cognitive outcomes were equivalent. The social-emotional outcomes were not.
I want to be intellectually honest here. Lillard and colleagues published a rigorous critical review in Psychological Bulletin in 2013 questioning the causal claims about pretend play's developmental benefits. Their conclusion was not that play doesn't work — it was that the evidence for unstructured pretend play specifically is weaker than commonly asserted, and that other pathways may produce similar outcomes. This critique actually strengthens the case for guided play. The research doesn't support stepping back and hoping play produces learning on its own. It supports intentional design, teacher facilitation, and systematic observation — the defining characteristics of a purposeful approach.
A Practical Framework: Standards-Aligned Play in Five Steps
This is the section most educators tell me they need. The philosophy makes sense. The research is persuasive. But when you're standing in front of a block area on a Tuesday morning with ten children and a standards binder, you need a process.
Here is the framework I use at Spark Academy and the one I walk through with the programs I consult for.
Step 1: Start with the standard, not the activity. This is backward design applied to play-based settings. Instead of planning a block activity and then trying to figure out which standards it addresses, begin with the IELDS benchmark you're targeting. For example: Benchmark 6.A.ECd — "Compare, order, and describe objects by size, length, or weight." Now design a play context where children will naturally encounter comparison, ordering, and descriptive language about physical properties.
Step 2: Design the environment to make the encounter inevitable. Set up the block area with graduated cylinders, blocks of varying weights, a balance scale, and photographs of structures organized by height. A dramatic play bakery with measuring cups, different-sized containers, and recipe cards. A sensory table with objects that sink or float. The children don't know they're addressing Benchmark 6.A.ECd. They're playing. But the environment is designed so that comparison, ordering, and descriptive language emerge naturally from whatever they choose to do.
Step 3: Know what you're watching for. Before the play period begins, your observation prompts should be ready. Does the child use comparative language — bigger, smaller, heavier, lighter? Do they arrange objects by size without being prompted? Do they predict which object will be heavier before picking it up? These prompts turn observation from "watching children play" into formative assessment with developmental targets.
Step 4: Document when you see it. An anecdotal record takes thirty seconds. Date, child's name, learning center, what you observed, developmental domain tag. Some educators use structured templates. Others use a notes app on their phone. The system matters less than the consistency. Photo documentation with learning annotations is especially powerful — a photograph of a child arranging blocks by height, tagged to IELDS 6.A.ECd, tells a richer story than a completed worksheet ever could.
Step 5: Use what you documented to plan what comes next. This is the step most programs skip — and it's the step that transforms documentation from compliance exercise into instructional tool. If your observations show that three children in your group consistently use comparative language but two do not, tomorrow's environment should scaffold those two children specifically. Maybe the balance scale moves to their preferred play area. Maybe you join their play and model the language. The assessment-to-action cycle is what makes this a curriculum system rather than a documentation system.
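For programs that keep their anecdotal records digitally — in a notes app, a spreadsheet export, or a homegrown tool — the assessment-to-action cycle in Steps 4 and 5 can be sketched in a few lines of code. This is a hypothetical illustration, not a product recommendation: the record fields mirror the thirty-second anecdotal record (date, child, learning center, observation, benchmark tag), and the function surfaces children on the roster who have no documented evidence yet for a given benchmark — the planning signal Step 5 acts on. All names and records here are invented examples.

```python
from dataclasses import dataclass

@dataclass
class AnecdotalRecord:
    date: str       # e.g. "2024-10-08"
    child: str      # child's name
    center: str     # learning center where the play occurred
    note: str       # what was observed, in plain language
    benchmark: str  # developmental benchmark tag, e.g. "6.A.ECd"

def children_needing_scaffolding(records, roster, benchmark):
    """Return children on the roster with no documented evidence
    for the given benchmark -- tomorrow's scaffolding targets."""
    evidenced = {r.child for r in records if r.benchmark == benchmark}
    return [child for child in roster if child not in evidenced]

# Invented sample data for illustration only.
records = [
    AnecdotalRecord("2024-10-08", "Ava", "blocks",
                    "Ordered four blocks by height unprompted", "6.A.ECd"),
    AnecdotalRecord("2024-10-08", "Ben", "bakery",
                    "Said 'this cup holds more' while pouring", "6.A.ECd"),
]
roster = ["Ava", "Ben", "Cora", "Dev"]

print(children_needing_scaffolding(records, roster, "6.A.ECd"))
# -> ['Cora', 'Dev']
```

The point isn't the tooling — an index card system produces the same signal. The point is that the cycle is mechanical enough to be systematic: observe, tag, find the gaps, adjust tomorrow's environment.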
One play session, designed intentionally, addresses multiple standard domains simultaneously. A single block-building experience can generate evidence of mathematical thinking (spatial reasoning, counting, comparison), scientific reasoning (balance, cause and effect, prediction), language development (descriptive vocabulary, narrative, negotiation with peers), and social-emotional competence (collaboration, turn-taking, frustration tolerance). One rich play experience addresses what would take four separate worksheets to cover — and it does so in a context where the child is motivated, engaged, and building executive function skills that worksheets cannot reach.
Addressing the Pressure Points
The practical framework works inside the classroom. But the pressure often comes from outside it.
"We need more academics." This directive usually comes from administrators or boards responding to parent anxiety. The Skene meta-analysis is your counter-evidence: guided play produces equal or better academic outcomes than direct instruction across math, spatial reasoning, and executive function. "More academics" doesn't require less play. It requires more intentional play — and better documentation of what that play is producing.
"Parents want to see worksheets." Parents aren't wrong to want evidence of their child's progress. They're working with an incomplete understanding of what evidence looks like. A portfolio showing photographs of their child's block structures growing more complex over three months, annotated with the developmental skills each structure demonstrates, tells a more compelling developmental story than a stack of traced letters. The communication challenge is real, but the answer is better documentation, not a retreat to worksheets. This is a consulting conversation I have frequently — how to build family communication systems that make play-based learning visible and credible.
"How do I prove this to my licensing body?" Your QRIS reviewer and your DCFS licensing specialist are not evaluating whether you use worksheets. They're evaluating whether your program has an intentional curriculum, whether you assess children's development systematically, whether you use assessment data to inform planning, and whether your teacher-child interactions are responsive and stimulating. A well-documented play-based program with a clear assessment-to-action cycle meets every one of those criteria — and often scores higher on process quality measures than programs using scripted, direct-instruction curricula.
"We tried play-based and it was chaos." Then what was tried wasn't play-based curriculum. It was the absence of curriculum. Play without purpose, without environmental design, without observation prompts, without an assessment cycle — that's not a pedagogical approach. That's a gap. The structure in a purposeful play-based program is substantial. At Spark Academy, every morning follows a predictable rhythm: Academic Play, Developmental Playroom, Daily Enrichment, Outdoor Play, Snack. Within that structure, children have agency and choice. The structure is what creates the conditions for purposeful play, not what constrains it.
The Practitioner-Researcher Advantage
Most of what's written about play-based curriculum design comes from one of two places: academic researchers who study play but don't run programs, or curriculum companies selling products. Both have value. Neither has the perspective of someone who designs a play-based framework, implements it with real children every morning, assesses its outcomes against state standards, and then consults with other programs to help them do the same thing.
That's what I do. The Purposeful Play Framework isn't a theoretical model. It's the operational system behind every classroom at Spark Academy — a DCFS-licensed preschool in Morton, Illinois, serving children from age 3 through kindergarten. The assessment-to-action cycle, the daily enrichment rotation, the monthly Developmental Playroom redesign, the individualized learning paths — these aren't conference presentation concepts. They're Tuesday morning.
When I work with other programs, I bring that operational specificity. Not "you should try guided play" — but here's how to restructure your daily schedule to create observation windows, here's how to map your existing play activities to IELDS domains, here's how to build a documentation system your teachers will actually use, and here's how to communicate the value of play-based learning to the families and stakeholders who need to see the evidence.
Frequently Asked Questions
Do play-based programs meet IELDS requirements?
Yes. The Illinois Early Learning and Development Standards define what children should know and be able to do — they do not prescribe a specific instructional methodology. A play-based program that uses intentional environmental design, formative assessment, and systematic documentation is fully aligned with IELDS. The standards document itself states it is "not a curriculum" and "not an assessment tool."
What's the difference between free play and guided play?
Free play is child-directed with no predetermined learning goal and minimal adult involvement. Guided play maintains child agency and autonomy while an adult provides scaffolding — either by designing the environment to highlight a learning concept or by entering play to extend learning through questioning and modeling. The research consistently shows that guided play outperforms both free play and direct instruction on key academic and executive function outcomes in early childhood settings.
How do I document learning during play?
Observation-based documentation is the standard for play-based programs. Practical approaches include anecdotal records (30-second written notes capturing specific behaviors and tagging them to developmental domains), photo documentation with learning annotations, learning stories that connect observed play behaviors to standard benchmarks, and developmental checklists completed from observation data. The key is consistency — a simple system used daily produces a richer evidence base than an elaborate system used sporadically.
Can play-based curriculum work for QRIS quality ratings?
Yes. Quality Rating and Improvement Systems evaluate both structural quality (ratios, credentials, curriculum alignment) and process quality (teacher-child interactions, intentional planning, use of assessment data). Play-based programs with strong documentation systems often score highly on process quality measures because intentional play requires responsive, individualized interactions — exactly what process quality instruments measure.
What if my staff isn't trained in play-based pedagogy?
This is a common starting point. Play-based curriculum requires more teacher skill than scripted instruction, not less — teachers must be intentional environmental designers, skilled observers, and responsive facilitators simultaneously. Professional development focused on observation techniques, formative assessment practices, and guided play facilitation strategies can build these competencies. Staff training on evidence-based play is one of the consulting services I provide.
Bring This Expertise to Your Organization
Dr. Michelle Peterson, Ed.D., consults with schools, districts, and early childhood programs on curriculum design, inclusive classroom strategies, NDBI implementation, and program evaluation.