The Tumblr Experiment, Part 3: Blogging as Formative Assessment

Literacy & Technology Notes from the Classroom Oakland Writing Project

This is part 3 in a series. Parts 1 and 2 explored the in-class use of Tumblr, a blogging platform, as an exercise in writing for an authentic audience. You can read part 1 and part 2 online.

As the Tumblr experiment progresses, I’m faced with a difficult question about evaluation and feedback: What is a good measure of a writer’s success?

The answer, I believe, lies in whether a writer has achieved his or her purpose. This approach forces my students to really think about what they’re trying to accomplish. Yes, I get the obvious student response: “Trying to get an A.” But as we move deeper into the experiment, I’m finding that students are beginning to see other possible purposes. Tumblr is a space in which they can deliberately pursue an idea in writing. It’s also a place to take risks, both in what we think and how we want to write. Still, how do I encourage risks in writing without promoting ones that appeal to me?

This isn’t easy territory for evaluation.

I want this to be formative, but I don’t want my students to write for me or for points. At the same time, I do want them to know that I’m watching, steering us toward writing a solid essay. That said, the essay is really just one aspect of this larger project, whose goal is to produce authentic writing and voices, while developing rhetorical dexterity. 


A Good Exchange

Using their blogs as a lens on the class, we discuss what kinds of writing students are noticing–reblogs and responses–and bring that back to the classroom, where we can talk about why certain posts are creating more action than others. We’ve begun to notice that success often comes down to the writer’s awareness of audience. One student, for example, blogged about a piece of music and was rewarded with a lot of attention and discussion. When we talked about it in class, the writer said that he knew that his friends liked music, and he was betting that if he could draw them in, he’d draw others with the same interest as well.

You can picture me clapping my hands, because isn’t this exactly how real writers–really anyone who produces any kind of product–think? 

The students were all good writers. But as we talked through their writing choices, it became clear that some of these writers valued their own choices over those that appealed to their Tumblr audiences. Some preferred not to “cater” to the audience. This led to a discussion of different rhetorical moves that might attract a different audience–or alienate one.

For me, the real value lies in the conversation about purposes–whether, as writers, they’re achieving their purposes. That’s the rhetorical triangle in action, with real consequences.


As a formative task, this works to let me see how we’re doing without being intrusive. Is what I think I’m teaching actually sticking with my students? Did it show up in the writing? If it did, great; if not, I can see it before the essays come in, make adjustments, and revisit topics. We’ve talked technique and SOAPS and audience, of course, but always as an abstraction, very rarely as a practical “thing” we do as writers, choices we make on purpose. It’s this pivot from abstraction to “real” that’s important with the Tumblr experiment.

By moving students out of the static model of traditional instruction and into an environment that has entirely new and changing demands, I’m looking for a way to change them from people who write for me into people who write more authentically. The feedback they’re getting from their audience is more valuable because it’s authentic, connected to their own goals as writers, and offered by people whose opinions they value–each other, not just me.

 

Rick Kreinbring teaches English at Avondale High School in Auburn Hills, Michigan. His current assignments include teaching AP Language and Composition and AP Literature and Composition. He is a member of a statewide research project through the Michigan Teachers as Researchers Collaborative partnered with the MSU Writing in Digital Environments Program, which concentrates on improving student writing and peer feedback. Rick has presented at the National Advanced Placement Convention and the National Council of Teachers of English Conference. He is in his twenty-third year of teaching and makes his home in Huntington Woods.

Trying to Lose 10 Pounds a.k.a. the Paper Load Problem

Notes from the Classroom

I have a problem with to-do lists. I love them. I gleefully make long, epic lists of my plans for a day or weekend. Sometimes I even write things on my list that I’ve already done, just so I can experience the joy of crossing them off. Other people do this, too. Right? Maybe?

Unfortunately, sometimes my lists spiral into what my husband calls my “ten pounds” problem. He teases me that I insist on trying to cram 10 pounds of fun into a five-pound bag. Often when I tell him my plans for a weekend, he’ll roll his eyes and say, “Sounds like ten pounds to me.”

But this past Friday—the second Friday of the school year—when I dragged home my bag filled with 85 AP English essays, 85 AP English quizzes, 58 ELA-10 narratives, 58 ELA-10 quizzes, and 58 ELA-10 constructed responses, I realized that I, for once, might actually have 10 pounds of fun in my bag. And I was not happy about it.

The Paper Load

It was the second week of school, and I was already overwhelmed.

Some of it was simply a product of a busy first two weeks back: Curriculum Night, activities for my own kids in the evenings, first-week exhaustion. Those pressures will ease up as the year progresses, and things will get easier. But some of the pressures will not go away.

One of the downsides of this job, which I adore, is the paper load. Often, it is easiest to assess students’ comprehension of a text by having them write about it. And students only become better writers if you give them specific feedback on their writing.

As I slogged through the stacks this past weekend, I resolved to be smarter this year. I’m still committed to giving consistent, quality feedback, but I need to figure out how to lighten the load in my 10-pound bag a bit.

Here are some of the things I’m hoping to try this year and some of my concerns:

  • Reading conferences. Some of the assessments I graded this weekend could have been easily replaced with a simple conversation. An individual reading conference would give me the opportunity to connect with my students one on one and ask some pointed questions about their reading. I created this Reading Conference Prep sheet to help them prepare for our reading conferences.
    • My concern: How will I make this work time-wise? What will the other 28 students do while I’m conferencing? Twenty-nine individual conferences will likely take three days of class time. I will have to use this strategy carefully and plan independent activities for the students. Also, it’s not something I’ll be able to do that often. Still, I think it’s a worthwhile “sometimes” solution.
  • Small-group discussion. This is a strategy I have used for several years now, and one I will need to employ more this year. I divide my students into small groups and send them out into the hallway for a 10-minute discussion. They videotape it, and I can grade it later using a rubric like this.
    • My concern: What about my shy students? Will I truly see what they know about the text in this setting? This can’t be the only way I gauge understanding, but it is valuable to assess speaking and listening. That’s a whole strand of the Common Core!
  • Group comments, analysis, and revision. This is a strategy that I used this weekend. As I tackled the third stack of writing assignments, I realized I was writing the same comments over and over. Enough. Why was I working so hard when they were all struggling with the same thing? I stopped writing comments and scored the essays on a 4/3/2/1 scale. I pulled some student samples of 4-level writing to use as models. In class we examined the models and figured out the characteristics of each level of writing. When I handed back the scored, comment-less papers, I had the students tell me what was missing from their writing and then revise to make them better.
    • My concern: Does this even save time? Realistically, no. I spent a lot of time prepping this lesson. Still, I think it was more useful than handing back papers littered with comments. The students were engaged in the revision process and, hopefully, it will pay off in future writing assignments.

None of these ideas is earth-shattering or groundbreaking. Teachers have been doing these types of things for years. For me, though, the key is committing to a better balance between my desire to give quality, timely feedback and my need to not be overwhelmed by grading. As my husband constantly reminds me, there’s only so much room in a bag.

Hattie Maguire is an English teacher and Content Area Leader at Novi High School. She is spending her fourteenth year in the classroom teaching AP English Language and Composition, English 10, Debate, and Practical Public Speaking. She is a National Board Certified Teacher who earned her BS in English and MA in Curriculum and Teaching from Michigan State University.

 

A Dinner Conversation with James Popham on Formative Assessment

Consultants' Corner Oakland Writing Project

A few months ago, I had the opportunity to share dinner and conversation with James Popham. As an avid fan of his work on learning progressions, I was excited to finally meet him. Three years prior, in the midst of a statewide collaboration to build a Common Core Standards-aligned model 6-12 ELA curriculum, I had hit upon Popham’s research.

To build a model curriculum meant we had the blessing and the challenge of vertically aligning across a K-12 grade span. Vertical alignment meant sequencing not only content knowledge, but also conceptual understanding and application of content. Dealing with content, in some ways, is the easier part of sequencing a learning progression: for teachers, a content approach is ingrained in our practice, our district pacing guides, and our state standards. The much harder task is to theorize a learner’s progression of conceptual understanding and application. As we developed the K-12 curriculum, we had to consider:

● what tasks to scaffold across a progression in terms of complexity,
● what embedded instructional practices best support students as they move into increasingly complex content, understanding, and application, and
● what evidence we should focus our attention on when studying student work to determine growth.

So I now sat across the table from the James Popham, my mind spinning. What would I say to, or ask of, a scholar who has been such an influential voice in education for decades? After the perfunctory jokes about bad Michigan weather, the ordering of meals, and introductions among our small group of colleagues, the conversation shifted to formative assessment. Why the continued struggle to build system-wide effective formative assessment practices? Popham’s theory: educators are paying attention to the wrong things. Leaning in, we waited to hear what these wrong things were. He went on to share that, in his experience and research, teachers focus more on instructional procedures than on what happens to students as a result of those procedures. My initial reaction was to push back. Was he saying that instructional practices were not important enough to focus professional learning around? He politely engaged in a back-and-forth volley with me, but it was clear that the exchange didn’t change either of our minds.

Weeks later, I engaged further with Popham’s theory by reading his book Unlearned Lessons: Six Stumbling Blocks to Our Schools’ Success. If you’re wondering, the six stumbling blocks Popham presents as holding school systems back from maximizing student achievement are:
● Too many curricular targets
● Underutilization of classroom assessment
● Preoccupation with instructional process
● Absence of affective assessment
● Instructionally insensitive accountability tests
● Abysmal assessment literacy

In my second “hearing” of Popham’s argument, I found some common ground. His claim was not that no attention should be paid to instructional procedures but rather that, as teachers and administrators, we can easily fall prey to focusing solely on procedures and lose sight of what happens to students as a consequence of instruction. Now pair this preoccupation with instructional procedures with the underutilization of classroom assessment and voilà–formative assessment becomes misinterpreted as studying summative student data long after its collection.

I had to admit–I could now see where Popham was coming from. The question remains, though: how do we better balance attention between instructional practices and formatively assessing student learning? Over the next few weeks, I will share a story of a statewide collaborative of writing teachers and university researchers who have been working in this particular problem space. Their inquiry: what instructional practices produce effective student peer-to-peer feedback, and what learning does peer feedback make visible? In the meantime, if learning progressions and their relationship to effective formative assessment interest you, the following resources may be useful.

Formative Assessment in Practice: A Process of Inquiry and Action, Margaret Heritage (2013)
Transformative Assessment, W. James Popham (2008)
Transformative Assessment in Action, W. James Popham (2011)

Links
Assessment Literacy in Today’s Classroom
Creating Learning Progressions

Learning Progression to Support Self-Assessment and Writing about Themes in Literature: Small Group

Using a Learning Progression to Help Students Work Towards Clear Goals (K-2)


Susan Wilson-Golab joined Oakland Schools in 2010 following 22 years of in-the-field 6-12 experience across two different states and rural, suburban, and urban contexts. Her research and practice focus heavily on the evolving definition of literacy, developmental learning progressions, and formative assessment. At the district level, Susan has served as classroom teacher, Literacy Specialist, and ELA Curriculum Coordinator. These experiences and studies helped Susan in her role as Project Leader for developing a model 6-12 ELA curriculum for the Michigan Association of Intermediate School Administrators (MAISA)–a curriculum resource now globally available. More recently, Susan launched the Michigan Teachers as Researchers Collaborative (MiTRC). The mission: to build collaborative participatory research between university and secondary teachers from around the state interested in exploring and developing the teaching and assessing of writing. In 2000, she joined the National Writing Project through the satellite Oakland Writing Project site based out of the University of Michigan. She now serves as Site Director for the Oakland Writing Project.

Podcast #15: Student Data, Mining of This Data, and Implications

Podcasts

Collecting and storing vast amounts of information on students has become easier and cheaper. At its best, this information can be used to support students. At its worst, it can be used against them, often without their knowledge. And this information can be stored and manipulated forever.

In this podcast, Chris Gilliard, Hugh Culik, Daniel Hoops, and Jason Almerigi provide an insightful and interesting discussion of this issue.


This podcast is also on iTunes.

 

 

Student Reflections Confirm Teaching & Inform Grades

Notes from the Classroom Oakland Writing Project

Several years ago, I developed an inquiry question that asked whether students use the language of workshop. Because I consistently use the “ELA Speak” of mentor texts, seed ideas, and generating strategies, I wondered: do students know these terms and use them to forge their work?

This work began with a checklist of workshop language that I wanted students to leave eighth grade knowing and using. I culled the list from the ELA Common Core State Standards, the MAISA Units of Study, and my lesson plans.

I decided I would look at summative writing work to evaluate students’ use of these skills. Additionally, I felt strongly about students having their voice heard, so final work was accompanied by a reflection which asked students to name skills they now had as writers, to give examples of these skills in their writing, and to set a goal for future use of these skills.

Originally, I modeled a reflection that focused on the end-product skills my writing showed, and student reflections did the same. In my example, I wrote a reflection on our opening unit about narrative poetry. In this reflection you can see how I named skills that are explicitly evident in the published final copy, such as craft skills (alliteration, repetition) and theme.

I shouldn’t have been surprised by this, but as I recorded the language students named and used correctly, or used but didn’t name, I realized that they were using the language of workshop. However, the tool I gave them to show this understanding didn’t allow everything to be shown. Namely, students didn’t name process skills–the skills they used to develop a final product–but I could see evidence of this work in my conference notes and in their drafts. In the next reflection, for the same unit a year later, you can see that I named generating and finding seeds as part of the journey to finding the topic I wrote about. Additionally, I explained several more skills that I used as a writer, such as the overall structure and the type of ending.

So, I made two changes. First, I more explicitly named the skills and their associated lessons. I even hung these up in my classroom during the unit (pictured is the literary essay unit).

Skills Bulletin Board

Second, I created a model reflection that named process skills in addition to the end-product skills shown in my writing. I demonstrated this more thoroughly by writing in specific lines of my text that exhibit the traits I name in my reflection: Revised Reflection 2.

Now, student reflections named all of the skills learned and used. So, I know from reflections that students use the language of workshop in theory and in practice.

Reflections serve another purpose, though. As I grade the writing according to a rubric–for me, a curricular model rubric assessing organization, content, and language use–I use student reflections as an accompaniment to reading the writing. Like many language arts teachers, I take the student into account on these summative grades by considering the growth the student has achieved from conference suggestions, the specific use of skills from lessons, and the ability of the student.

As I read one student’s literary essay, I commented on the depth of commentary with the statement, “Commentary – how does this evidence relate to your claim/topic?” In my classroom, I work with students to understand that commentary re-explains evidence, tells why evidence is important, and relates evidence to the claim, the topic, and other evidence. I wavered between a score of adequate and one below. Deciding on adequate for the overall content of the paper, I read the student’s matching reflection. He stated in the future goals section, “I would take more time, and think deeper about what my claim should be. I think that I took the easy and the most obvious route. If I had taken more time, and thought deeper, I could have created a more sophisticated essay with better evidence and commentary.” This statement validated the “adequate” score I gave for the student’s essay content.

Overall, I use reflections to inform my teaching and to give students voice as I grade their papers. In the example above, it was as if the student were sitting next to me as I graded the writing. Throughout the year, as students reflect on each unit of reading and writing, they can see their growth over time. When students are allowed to think about what they have actually learned through the course of a unit and to show evidence of that learning, they improve more quickly as writers because they can think deeply about their writing decisions and pursue inquiries into their own work. These inquiries increase a writer’s independence, because students can make choices about the writing they publish.

Amy Gurney is an 8th grade Language Arts teacher for Bloomfield Hills School District. She was a facilitator for the release of the MAISA units of study. She has studied, researched, and practiced reading and writing workshop through Oakland Schools, Teachers College, and action research projects. She earned a Bachelor of Science in Education at Central Michigan University and a Master’s in Educational Administration at Michigan State University.

 

Moving from the ACT to the SAT in 2016

News

The Michigan Department of Education announced this shift yesterday.  For more information, see the press release below.

***

MDE News Release

Contact: Martin Ackley, Director of Public and Governmental Affairs, (517) 241-4395

Caleb Buhs, Michigan DTMB, (517) 241-7422

State Awards Future College Assessment to College Board’s SAT for Michigan Students

January 7, 2015

LANSING – All Michigan high school juniors will begin taking the SAT as the state-administered college assessment exam in 2016, after the College Board won the three-year, competitively bid contract, the Michigan Department of Education and Department of Technology, Management and Budget jointly announced today.

The College Board administers the SAT, a globally-recognized college admission test that lets students show colleges what they know and how well they can apply that knowledge. It tests students’ knowledge of reading, writing and math — subjects that are taught every day in high school classrooms in Michigan.

ACT, Inc. will continue to provide its WorkKeys assessment for all high school students. Both the college entrance assessment and work skills tests are required in state law to be provided free to all high school students, and each is periodically competitively bid through the state’s structured procurement process, as directed by the Department of Technology, Management and Budget (DTMB).

 “The College Board’s SAT test is respected and used around the country,” said State Superintendent Mike Flanagan, “and Michigan high schools work with them now through their Advanced Placement program that helps students earn college credits while in high school.

“Their bid was rated the highest; provides valuable assistance to Michigan educators, students, and parents; is more aligned to Michigan’s content standards; and saves the state millions of dollars over the course of the three-year contract,” Flanagan said.

The College Board’s bid was $15.4 million less over the three-year contract than the next bidder’s, and it was scored 10 percentage points higher by the Joint Evaluation Committee (JEC). In addition to staff from MDE and DTMB, the evaluation committee included members representing the education community: a high school principal, a local school superintendent, a testing and assessment consultant from an intermediate school district, and a vice president from a Michigan community college.

Bill Barnes, principal at Charlotte High School and a member of the JEC, said: “The attention to detail with which the College Board created its proposal, and the extensive resources that it will provide to schools and students to help them prepare for the test, make its college readiness assessment the best choice for Michigan.”

Another member of the Joint Evaluation Committee, Jim Gullen, a data and evaluation consultant for the Macomb Intermediate School District, said: “After two days of review and discussion, there was no question that College Board put forth the best proposal. Considering the quality of College Board’s proposal, the value presented in the pricing, and our current legislation, it is a good time to transition to the SAT to assess Michigan’s high school students’ mastery of the Michigan curriculum.”

Each year, the College Board helps more than seven million students prepare for a successful transition to college through programs and services in college readiness and college success — including the SAT and the Advanced Placement program. The organization also serves the education community through research and advocacy on behalf of students, educators and schools.

The Michigan Department of Education (MDE) is forming a team that will include the local, regional, and community college members of the Joint Evaluation Committee to assist in the transition to the SAT. In addition, the department will hold an onsite meeting with the College Board to discuss how it intends to positively affect the transition for Michigan schools, educators, parents, and students.

In its successful bid, the College Board included the following value-added components that will benefit Michigan schools and families:

  • Beginning in Spring 2015, the College Board will provide all schools and students with free test prep materials and online practice tests to help students prepare for the redesigned SAT in 2016.
  • Professional Development
    • In-person and technology-based training for local test administrators, proctors, and technology coordinators
    • Professional development for teachers, students, and parents in understanding the new SAT and analyzing test results
    • Professional development for post-secondary enrollment professionals in using the data/resources for admissions and financial aid decisions
  • An updated and relevant assessment
    • Redesigned SAT beginning in 2016
    • Aligned to Michigan content standards, evidence-based design
    • Additional item types beyond multiple choice
    • New forms developed each year
    • Reports available online
  • Simplification and reduction of school staff effort to request testing accommodations
    • No need to reapply for testing accommodations if already approved for the Advanced Placement Program or the PSAT/National Merit Scholarship Qualifying Test

The college entrance exam and work skills assessment are given free to approximately 115,000 Michigan high school students each year.

ACT WorkKeys is a job skills assessment system that helps employers select, hire, train, develop, and retain a high-performance workforce.  This series of tests measures foundational and soft skills and offers specialized assessments to target institutional needs.

As part of ACT’s Work Readiness System, ACT WorkKeys has helped millions of people in high schools, colleges, professional associations, businesses, and government agencies build their skills to increase global competitiveness and develop successful career pathways.

Successful completion of ACT WorkKeys assessments in Applied Mathematics, Locating Information, and Reading for Information can lead to earning ACT’s National Career Readiness Certificate (ACT NCRC), a portable credential earned by more than 2.3 million people across the United States.

Michigan high school students have taken the WorkKeys assessment since 2007.  Over 413,000 Michigan students have received an NCRC credential.

Although the contracts await final completion and approval of the State Administrative Board, the three-year contract for the college entrance assessment will cost approximately $17.1 million, and the three-year work skills assessment will cost approximately $12.2 million.

More Details on Spring 2015 M-STEP Testing

News

We have a few more details about the tests that will be given in the spring, including types of tests at each grade level. A batch of sample items is in production now. This sample will be available “shortly” to all schools and will demonstrate the online functions and tools of the M-STEP.

The Spring 2015 ELA M-STEP is a comprehensive ELA model:

  • Grades 3-8: Smarter Balanced content plus Michigan-developed field-test items. This will include a Computer Adaptive Test (CAT), a Classroom Activity, and a Performance Task.
  • Grade 11: Smarter Balanced content plus Michigan-developed field-test items. This will include a Computer Adaptive Test (CAT), a Classroom Activity, and a Performance Task. This is in addition to the ACT plus Writing and WorkKeys.
  • The M-STEP (grades 3-8, 11) will include items from the following Michigan Standards: reading, writing, language, and listening.

The most current assessment transition document outlines the details for the M-STEP.  For additional information, click here.

To get up to date news on the state assessments, subscribe to MDE’s Spotlight on Assessment and Accountability Newsletter.

Podcast 5: Richard Koch – Writing Portfolios and Learning

Podcasts

Richard Koch is a respected educator and leader in the area of portfolios and writing. He shares what portfolios are, how they support student learning and the teaching of writing, as well as helpful ideas on how to effectively use portfolios. Richard is the Director of the Michigan Portfolios. He is also the co-author of The Portfolio Guidebook and consults with educators in the area of teaching writing and writing assessment.

Contact Richard Koch at:  [email protected]

State Assessment Choice in MDE Budget

Legislative Updates

As of Friday, April 25, the Senate Appropriations Committee reported the Michigan Department of Education (MDE) budget with an amendment that places several restrictions on the assessment chosen and implemented by the state. Specifically, the amendment states the Department cannot expend funds unless it selects an assessment that meets all the requirements listed:

(a) The assessment system measures student proficiency and growth on the current state curriculum standards and is capable of measuring individual student performance in the following subject matter areas:

  (i) English.
  (ii) Reading.
  (iii) Writing.
  (iv) Mathematics.
  (v) Science.

(b) The content of the assessment system is aligned with the current state curriculum standards.

(c) The content of the assessment is subject to a transparent review process involving public review and comment.

(d) Ensures that students, parents, and teachers are provided with reports that convey individual student proficiency and growth on the assessment.

(e) Ensures that students, parents, teachers, administrators, and community members are provided with reports that convey aggregate student proficiency and growth data for a given school.

(f) Ensures the capability of reporting the necessary data to support educator evaluations.

(g) Ensures that reports are available within 1 month after completion of the exam.

(h) The assessment is capable of being implemented statewide with existing infrastructure in a fully operational manner no later than the 2015-2016 school year.

(i) Except as necessary to support educator evaluations pursuant to subdivision (f), ensures that access to individual student data is available only to the student, parents, legal guardians, administrators, and teachers of the student.

(j) The assessment is pilot tested prior to statewide implementation.

(k) Each exam shall not designate more time to be completed than the previous statewide assessment designated.

(l) The total cost of executing the adopted assessment statewide each year shall not exceed twice the cost of executing the previous statewide assessment after adjustment for inflation.

(2) School districts are not prohibited from adopting interim assessments.

The MDE is currently researching whether these conditions preclude the Smarter Balanced Assessment (SBAC) from being used. Further, the Senate removed the funding for assessments related to educator evaluation and student assessment phase-in from both the MDE budget and the School Aid budget. This may have been done simply to create a point of difference with the House budget. There are still opportunities for change, so please reach out and let your legislators know your thoughts on state assessments.

Formative Assessment

Overview

In thinking about formative assessment, we have found the work of Margaret Heritage (2007) to be particularly helpful.  Heritage describes formative assessment as an organized and planned process used to “gather evidence about learning.”  In this process, the data gathered about student learning are analyzed in order to determine students’ skills and understandings so the teacher can then shape learning activities to better support the students as they move towards their learning goal.  Heritage writes that, “In formative assessment, students are active participants with their teachers, sharing learning goals and understanding how their learning is progressing, what next steps they need to take, and how to take them” (p. 141).

Heritage then describes three primary types of formative assessment: “on-the-fly assessment, planned-for interaction, and curriculum-embedded assessment” (p. 141). On-the-fly assessments take place in the moment of instruction–for example, when a teacher notices a student misconception about an important concept and uses a Stop and Jot writing exercise to quickly gather information on the prevalence of this misunderstanding across the whole class. Planned-for interactions are moments built into lesson plans where the teacher gathers information about student understanding in order to inform instructional moves going forward in the lesson. This might involve, for example, planned small-group discussions about a key question or process in the lesson, followed by groups presenting a graphic organizer representing their thinking. The teacher can then make informed choices about re-teaching important ideas or moving forward in the lesson. Curriculum-embedded formative assessments are, as their name suggests, built into curricula or instructional sequences and are placed at important points in learning progressions to provide teachers and students with insight into the thinking and learning taking place.

Finally, Heritage identifies four “core elements” of formative assessment:

  • identifying the “gap,”
  • feedback,
  • student involvement, and
  • learning progressions (p. 141)

Formative assessment first needs to “identify the gap,” or evaluate where a student currently is with respect to some pre-determined goal for the learning activity. Formative assessment also has to include feedback, and this feedback needs to be geared towards helping the student move closer to the learning goal. As already implied, the student needs to be involved in this process and, very importantly, aware that this is the process taking place. In other words, students need to know about the learning goal, they need to know where they stand, and they need feedback that is designed to help them achieve the learning goal. Underlying all of these other core elements is the notion of learning progressions: the final learning goal needs to have identified sub-goals so that students can set short-term learning goals they can achieve as they advance and learn.

Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi Delta Kappan, 89(2), 140-145.