The Assessment Network has grown to the point that it now contains many different examples of how the power of assessment can be maximized in the classroom. These ideas are scattered throughout the site. To make the site easier to navigate, this one blog will include links to all of the other classroom AFL examples. It's sort of like an AFL Wal-Mart - everything you need in one blog!
- AFL, Art Class, and Failure Management - learning from trying
- Student Self Assessment
- Examples of using rubrics (rubric for students to assess their own learning)
- The Spelling Pre-Test
- A school counselor uses AFL (AFL chart for students to track their progress)
- Examples of using rubrics (rubric for students to assess their own learning)
- Don't confuse it for a specific strategy (various AFL strategies and grading practices)
- An everyday activity can become an AFL tool (review sheets that allow students to track level of mastery)
- AFL + Enthusiasm = Powerful Instruction (students analyzing their own progress and predicting their success)
- Students understanding the value of assessment (using examples of past work to guide current efforts)
- 3 Perspectives on an AFL example (test grades replacing quiz grades)
- Simple AFL activity in Math (getting feedback prior to a graded activity)
- An AFL email to parents and students (communicating AFL to stakeholders)
- A Salem High School teacher uses a GPS (retaking quizzes to reach mastery)
- An Assessment Becomes a Learning Tool (enhancing the impact of test corrections)
- AFL Communication and a Self-Assessment Rubric for Math (a rubric students can use to self-assess their progress)
- The Power of Asking "Can You" - an example of training students to assess their own progress
- Would This Work? (A Question for Math Teachers) - daily quizzes on specific steps to Math processes
- How AFL Shouldn't and Should Look in a Math Class - more than just teach, test, and retest
- Teaching More than the Notes and Rhythm (by Mark Przybylowski)
- How AFL could be applied to a PE class (students charting their own progress)
- AFL and Heart Rate Monitors
Special Education
Spelling
Trades and Industrial
- Laying an AFL Bead in Welding (the importance of feedback)
- An AFL review strategy that can be used by any teacher in any content area (white boards for review)
- Students checking their progress (student progress check sheet)
- LOOPING: Where AFL and SBL all come together
- 10 Formative Assessment Tech Tools by Edutechchick
- 56 Examples of Formative Assessment by David Wees
- The Multi-Colored Note Card
- Practical ideas for more frequent assessment to enhance learning (4 general examples)
- It's about students taking ownership of learning (general examples of how students can assess themselves)
- Getting and Giving Student Feedback (exit slips and more)
- AFL principles can guide many different types of classroom practices (students calculating their grades)
- AFL strategies and descriptions (7 different AFL strategies and descriptions of how to use them)
- School administrators try creating classroom AFL objectives (sample AFL objectives)
- AFL Flashcard Review
- An Elementary Activity that Applies to All Grade Levels (daily review sheet)
- The Pre-Test
- AFL Communication and a Self-Assessment Rubric for Math - an example of communicating the purpose of AFL practices to students and parents
- The Power of Asking "Can You" - an example of training students to assess their own progress
- Students checking their progress (student progress check sheet)
- An AFL review strategy that can be used by any teacher in any content area (white boards for review)
- A Salem High School teacher uses a GPS (retaking quizzes to reach mastery)
- An Assessment Becomes a Learning Tool (enhancing the impact of test corrections)
- An everyday activity can become an AFL tool (review sheets that allow students to track level of mastery)
- AFL, Art Class, and Failure Management - learning from trying
- Ideas for Making Retakes and Redos Work
- Focused Formatives
- Student Self Assessment
- The AFL/SBL Exit Slip
- 5 Fantastic, Fast Formative Assessment Tools from Vicki Davis
- Making Every Assessment a Formative Assessment
I just finished watching a TED TALK by Sal Khan, founder of the Khan Academy. Sal was talking about mastery learning and the importance of building strong learning foundations before layering on additional information.
As I watched the video, I was thinking about why a stubborn 25% of students in most upper elementary, middle, and high schools are reading two or more years below grade level.
Sal cites the example of a child who scores an average grade of 75% on a unit test. Most educators would accept 75% as an average score, and in fact most diagnostic assessments would accept 75-80% as mastery level; however, Sal points out that not knowing 25% of the test content is problematic. From the student's perspective: "I didn't know 25% of the foundational thing, and now I'm being pushed to the more advanced thing."
When students try to learn something new that builds upon these shaky foundations, "they hit a wall" and "become disengaged."
Sal likens the lack of mastery learning to shoddy home construction. What potential homeowner would be happy to buy a new home that has only 75% of its foundation completed (a C), or even 95% (an A)?
Of course, Sal is a math guy, and math lends itself to sequential mastery learning more so than does my field of English-language arts and reading intervention. My content area tends to have a mix of sequential and cyclical teaching and learning, as reflected in the structure of the Common Core State Standards. The author of the School Improvement Network site puts it nicely:
Many teachers view their work from a lens that acknowledges the cyclical nature of teaching and learning. This teaching and learning cycle guides the definition of learning targets, the design of instructional delivery, the creation and administration of assessments and the selection of targeted interventions in response to individual student needs.
At this point, a question arises: What if a shaky foundation is what we're dealing with now? We can't do anything about the past. Teachers can start playing the blame game and complain that we're stuck teaching reading to students who missed key foundational components, such as phonics. All too often, response to intervention teachers ignore shaky foundations and try to layer on survival skills without fixing the real problems.
Instead, teachers should rebuild the foundation. Teachers can figure out what is missing from each student's skill set and fill the gaps... this time with mastery learning.
Mark Pennington, MA Reading Specialist, is the author of the comprehensive reading intervention curriculum Teaching Reading Strategies. A key component of the program is its 13 diagnostic reading assessments. These comprehensive and prescriptive assessments help response to intervention reading teachers find out specifically which reading and spelling deficits have created a shaky foundation for each of their students. I gladly share these FREE Reading Assessments with teachers and welcome your comments and questions.
My daughter's 7th grade English teacher at Andrew Lewis Middle School uses a time-tested, easy-to-apply AFL strategy that motivates my daughter to work, helps her learn, and ensures that her grade is an accurate reflection of that learning.
Every Monday the students are given a pre-test on that week's spelling words. If the student spells 100% of the words correctly on the pre-test, then the grade is recorded in the teacher's grade book, and the student does not have to take the post-test. All other students take a post-test on Friday of that week.
Simple but effective. Students receive feedback on Monday. They now have the rest of the week to work on improving. More importantly, they know exactly what they need to do to improve.
I'm going to brag on my daughter, Kelsey, for just a moment. She is a terrific speller, and almost always scores a 100 on the pre-test. Knowing that she can get out of having to take the post-test is a wonderful incentive for her to prepare for the pre-test. When she occasionally misses a word on the pre-test, she becomes a very focused and motivated student when preparing for the post-test.
However, her teacher uses the pre-test in a more powerful way than just as a motivator. Since Kelsey almost always scores a 100 on the pre-tests, the rest of the week's focus on spelling potentially could be a waste of time for her. However, her teacher turns the better spellers into spelling tutors during the week. This gives Kelsey a much-needed opportunity to be a leader. It allows her to have fun serving her peers, and it helps her peers do better on their spelling by providing one-on-one assistance that a teacher would have a difficult time providing during a busy school day.
Most teachers in America have probably tried pre-tests. This is not a ground-breaking strategy. That's the beauty of AFL. To be a good AFL teacher doesn't mean re-inventing the wheel. It means taking the best of what you already do and focusing your purpose toward providing meaningful feedback that gets used by both the teacher and the students.
One word of warning: You can completely undermine the benefit of this AFL strategy by the way you grade. Never average the pre- and post-tests together, and do not let the pre-test factor into the grade at all unless the student reaches the desired benchmark on it. Otherwise, let the post-test score - the one that reflects the outcome of the teacher's instruction - be the one recorded in the grade book.
Classroom Assessment for Learning
Classroom assessment that involves students in the process and focuses on increasing learning can motivate rather than merely measure students.
Imagine classroom assessment as a healthy part of effective teaching and successful learning. At a time when large-scale, external assessments of learning gain political favor and attention, many teachers are discovering how to engage and motivate students using day-to-day classroom assessment for purposes beyond measurement. By applying the principles of what is called assessment for learning, teachers have followed clear research findings of the effects that high-quality, formative assessment can have on student achievement.
… largely absent from the traditional classroom assessment environment is the use of assessment as a tool to promote greater student achievement (Shepard, 2000). In general, the teacher teaches and then tests. The teacher and class move on, leaving unsuccessful students, those who might not learn at the established pace and within a fixed time frame, to finish low in the rank order. This assessment model is founded on two outdated beliefs: that to increase learning we should increase student anxiety and that comparison with more successful peers will motivate low performers to do better.
By contrast, assessment for learning occurs during the teaching and learning process rather than after it and has as its primary focus the ongoing improvement of learning for all students (Assessment Reform Group, 1999; Crooks, 2001; Shepard, 2000). Teachers who assess for learning use day-to-day classroom assessment activities to involve students directly and deeply in their own learning, increasing their confidence and motivation to learn by emphasizing progress and achievement rather than failure and defeat (Stiggins, 1999; 2001). In the assessment for learning model, assessment is an instructional tool that promotes learning rather than an event designed solely for the purpose of evaluation and assigning grades. And when students become involved in the assessment process, assessment for learning begins to look more like teaching and less like testing (Davies, 2000).
STUDENT-INVOLVED ASSESSMENT
Research shows that classroom assessments that provide accurate, descriptive feedback to students and involve them in the assessment process can improve learning (Black and Wiliam, 1998). As a result, assessment for learning means more than just assessing students often, more than providing the teacher with assessment results to revise instruction. In assessment for learning, both teacher and student use classroom assessment information to modify teaching and learning activities. Teachers use assessment information formatively when they:
• Pretest before a unit of study and adjust instruction for individuals or the entire group.
• Analyze which students need more practice.
• Continually revise instruction on the basis of results.
• Reflect on the effectiveness of their own teaching practices.
• Confer with students regarding their strengths and the areas that need improvement.
• Facilitate peer tutoring, matching students who demonstrate understanding with those who do not.
We tend to think of students as passive participants in assessment rather than engaged users of the information that assessment can produce. What we should be asking is, “How can students use assessment to take responsibility for and improve their own learning?”
Student involvement in assessment doesn’t mean that students control decisions regarding what will or won’t be learned or tested. It doesn’t mean that they assign their own grades. Instead, student involvement means that students learn to use assessment information to manage their own learning so that they understand how they learn best, know exactly where they are in relation to the defined learning targets, and plan and take the next steps in their learning.
Students engage in the assessment for learning process when they use assessment information to set goals, make learning decisions related to their own improvement, develop an understanding of what quality work looks like, self-assess, and communicate their status and progress toward established learning goals. Students involved in their own assessment might:
• Determine the attributes of good performance. Students look at teacher-supplied anonymous samples of strong student performances and list the qualities that make them strong, learning the language of quality and the concepts behind strong performance.
• Use scoring guides to evaluate real work samples. Students can start with just one criterion in the guide and expand to others as they become more proficient in scoring. As students engage in determining the characteristics of quality work and scoring actual work samples, they become better able to evaluate their own work. Using the language of the scoring guide, they can identify their areas of strength and set goals for improvement - in essence, planning the next steps in their learning.
• Revise anonymous work samples. Students go beyond evaluating work to using criteria to improve the quality of a work sample. They can develop a revision plan that outlines improvements, or write a letter to the creator of the original work offering advice on how to improve the sample. This activity also helps students know what to do before they revise their own work.
• Create practice tests or test items based on their understanding of the learning targets and the essential concepts in the class material. Students can work in pairs to identify what they think should be on the test and to generate sample test items and responses.
• Communicate with others about their growth and determine when they are nearing success. Students achieve a deeper understanding of themselves and the material that they are attempting to learn when they describe the quality of their own work. Letters to parents, written self-reflections, and conferences with teachers and parents in which students outline the process they used to create a product allow students to share what they know and describe their progress toward the learning target. By accumulating evidence of their own improvement in growth portfolios, students can refer to specific stages in their growth and celebrate their achievement with others.
Source: From "Classroom Assessment for Learning," by S, Chappuis and R.J. Stiggins, 2002, Educational Leadership, 60(1), pp. 40-44. Copyright 2002 by ASCD.
This network is dedicated to promoting outstanding assessment practices - the kind of assessment practices that help students learn as opposed to simply documenting what they do or don't know. These types of practices are known as Assessment FOR Learning (AFL) strategies - an appropriate name since they are assessment strategies that lead to learning.
One type of AFL strategy falls into the category of Standards Based Learning (SBL). SBL strategies are AFL strategies that focus on specific content standards. Students are assessed and taught based on standards. Their learning is driven by standards mastery, and the ultimate grade they receive communicates how well they have mastered the standards - instead of being the result of averaging a bunch of numbers together in a grade book.
As SBL strategies are shared on The Assessment Network, they also will be added to this blog. This post will become a one-stop-shop for all sorts of SBL ideas scattered throughout the Network. If you have any ideas or suggestions, please let Scott Habeeb know.
Blog Posts:
- It's Time to Take an Assessment Journey from VASCD, Pawel Nazarewicz, & Scott Habeeb
- Getting Students to Focus on Learning
- Redos and Retakes? Yes. But don't forget to LOOP!
- Practical Examples of LOOPING - Philosophy in Action
- An Overview of Standards Based Learning from @CVULearns
- Quit Focusing on Standards Based Grading from @CVULearns
- Standards Based Teaching and Learning After Year One from JumpRope, Pawel Nazarewicz, & Scott Habeeb
- (Video) Standards-Referenced Grading from Des Moines Public Schools
- SBL Success: Things are moving in the right direction when students ask you questions like this...
- Standards Based Grading Flowchart from Matt Townsley
- Response to a Parent from Rick Wormeli
- Standards Based Grading Video from Rick Wormeli
- Formative Assessment and Standards Based Grading from Robert Marzano
- Redos and Retakes: 14 Practical Tips from Rick Wormeli
- How SBL Should and Shouldn't Look in a Math Class from Matt Townsley
- #SBLchat on Twitter
- Grade Like A Torpedo
- Vermont SBL Collective - great collection of resources
- AFL Teachers Reporting Progress in an SBL Method (using PowerSchool)
- Standards Based Learning and Grading Facebook Page
- The AFL/SBL Exit Slip
- This I Believe from Ken O'Connor
- Grading in 3D by Pawel Nazarewicz
- A Math Teacher Shares Her Students' Thoughts on SBL by Kristin Manna
- AFL: As Basic As Taking Off Your Coat
- A Winning Moment: A Math Teacher Shares SBL Success
- Letter to the Editor of Forest City Summit, IA from Rick Wormeli
Pictures:
Videos:
- Standards Based Grading Video from Rick Wormeli
- The teacher will have to guide/train students about how to use the rubric in this manner. Don’t expect magic the first time.
- This will work best if the teacher provides class time for the students to use their rubrics.
- The teacher might want to keep the rubrics in the classroom so that they do not get lost. Students might not take them home until the night before a large test/quiz/graded assignment.
- Be very explicit with your students about the purpose of the rubric. Don't let this become just another "thing" - yet another worksheet provided by a teacher but not effectively used by students. Instead, help your students view self-assessment as a core learning strategy that they can apply to future classes and learning. Help them view the rubric as a key to success.
There are some exciting things going on in the Northeast these days with Standards Based Learning. Here is a link to a phenomenal list of SBL resources from the state of Vermont.
http://vermontsbl.weebly.com/resources.html
Thanks to the following educators for putting this together:
Laurie Singer: Principal, ADL Intermediate School, Essex
Emily Rinkema: Teacher/Instructional Coach, CVUHS, Hinesburg
Check out Seven Practices for Effective Learning from the November 2005 edition of ASCD's Educational Leadership. This is a great description of how to use assessment to promote learning.
Followers of this site will find the 7 practices outlined in the article to be quite familiar. They are:
- Use summative assessments to frame meaningful performance goals.
- Show criteria and models in advance.
- Assess before teaching.
- Offer appropriate choices.
- Provide feedback early and often.
- Encourage self-assessment and goal setting.
- Allow new evidence of achievement to replace old evidence.
Kudos to Salem High School math teacher, Erin Stenger, for thinking to put a sign like this right next to her doorway where students will see it each day as they leave her class.
It has been noted before on this website that for AFL to truly have its greatest possible impact, students need to be using assessment-elicited feedback to measure their own progress and guide their own learning. As with most things we want students to do, though, we must train them to do it. This is especially true for AFL, since most students (just like most parents and most teachers) tend to look at grades from a summative position.
If we want students to view grades as feedback that guides their learning rather than just numbers that get averaged together to determine a grade, then we must do two things:
1. We must grade and assess in a formative manner rather than just collect a bunch of scores to average.
2. We must train our students.
This picture in Mrs. Stenger's room is a subtle but important example of this. Most importantly, it reveals the fact that AFL is a core philosophy that permeates the way Mrs. Stenger runs her classroom.
Here are some other blog posts that deal with the same idea of students knowing what they know:
- Did AFL Guide My Instruction Today?
Recently, several letter writers to the Forest City Summit, an Iowa newspaper, have disparaged standards-based grading. Specifically, they disparaged Rick Wormeli's work in that field. As a result, Mr. Wormeli wrote a response to those letter-writers, and the newspaper agreed to run it.
While I am personally unfamiliar with the events in Forest City Schools, IA that led to these letters being written, public arguments like this over grading issues always cause me to wonder if the school division employed too much of a top-down method of improving assessment strategies.
At its heart, standards based learning really shouldn't be controversial. Learning should be measured against standards and communicated in terms of standards so that grades actually represent learning and, more importantly, so teachers and students know where to focus their instructional and learning efforts.
When individual teachers implement solid and well-communicated SBL strategies, students tend to appreciate the descriptive and helpful nature of the feedback. Students tend to appreciate knowing where their strengths and weaknesses are so that they can then focus on improving where necessary. And typically, when students appreciate what is going on in class and feel like it helps them learn, parents are supportive.
However, when policies are implemented at the division level and then required or mandated, it is not uncommon to create controversy where none need exist. I would encourage schools and divisions to focus on a meaningful professional development journey - to take the long-view approach - instead of looking to change practices by changing policy.
Again, I do not know what exactly went on in this Iowa school district, but I do know that educators exploring the merits of standards based learning would benefit from reading Mr. Wormeli's letter.
Here's a link to the letter in its original form on the Forest City Summit's website:
Below is the same letter copied and pasted into this blog:
To the editor:
In recent letters to the editor in the Summit, my work was mentioned as one catalyst for the shift in grading practices in Forest City Schools from traditional to standards-based grading. Many of the claims made by the authors misrepresent me and these practices, however, and I’d like to set the record straight.
Most of us think the purpose of grading is to report what students are learning, as well as how students are progressing in their disciplines. It is important for grades to be accurate, we say; otherwise we can't use grades to make instructional decisions, provide accurate feedback, or document student progress.
These are wise assertions for grading. Nowhere in these descriptions, however, is grading’s purpose stated as teaching students to meet deadlines, persevere in the midst of adversity, work collaboratively with others, care for those less fortunate than ourselves, or to maintain organized notebooks. While these are important character attributes, we realize that none of the books or research reflecting modern teaching/parenting mentions grading as the way in which we instill these important values in our children.
We actually know how to cultivate those values in others, but it isn’t through punitive measures and antiquated notions of grading. Author of Grading Smarter, Not Harder (2014), Myron Dueck, writes,
“Unfortunately, many educators have fallen into the trap of believing that punitive grading should be the chief consequence for poor decisions and negative behaviors. These teachers continue to argue that grading as punishment works, despite over 100 years of overwhelming research that suggests it does not (Guskey, 2011; Reeves, 2010).”
In 2012, researcher John Hattie published Visible Learning for Teachers: Maximizing Impact on Learning, with research based on more than 900 meta-analyses, representing over 50,000 research articles, 150,000 effect sizes, and 240 million students. He writes,
“There are certainly many things that inspired teachers do not do; they do not use grading as punishment; they do not conflate behavioral and academic performance; they do not elevate quiet compliance over academic work; they do not excessively use worksheets; they do not have low expectations and keep defending low quality learning as ‘doing your best’; they do not evaluate their impact by compliance, covering the curriculum, or conceiving explanations as to why they have little or no impact on their students; and they do not prefer perfection in homework over risk-taking that involves mistakes.”
Those interested in research on standards-based grading and its elements are invited to read books written by Robert Marzano, Tom Guskey, Carol Dweck, Doug Reeves, John Hattie, Susan Brookhart, Grant Wiggins, Tom Schimmer, and Ken O'Connor. Matt Townsley, Director of Instruction in Solon Community School District in Iowa, has an excellent resource collection at https://sites.google.com/a/solon.k12.ia.us/standards-based-grading/sbg-literature.
A caution about worshiping at the research altar, however: 'Not all that is effective in raising our children has a research base. A constant chorus of, "Show me the research," adds distraction that keeps us from looking seriously and honestly at our practices. When we get our son up on his bicycle the first time, and he wobbles for a stretch of sidewalk then crashes abruptly into the rhododendrons, we give him feedback on how to steer his bicycle, then ask him to try again. Where's the vetted research for doing that? It's not there, and we don't stop good parenting because we don't have journaled research.
Trying something, getting feedback on it, then trying it again, is one of the most effective ways to become competent at anything. How does an accountant learn to balance the books? Not by doing it once in a trumped-up scenario in a classroom. Can a pilot re-do his landings? 'Hundreds of times in simulators and planes before he actually pilots a commercial airliner with real passengers. How do we learn to farm? By watching the modeling of elders and doing its varied tasks over and over ourselves. How do we learn to teach? By teaching a lot, not by doing it once or twice, then assuming we know all there is. I want a doctor who has successfully completed dozens of surgeries like the one she's about to do on me, not one who did one attempt during training.
This is how all of us become competent. Some individuals push back against re-doing assignments and tests, however, because there's a limited research base for it, or so they claim (There's actually a lot of research on the power of reiterations in learning). My response to the push back is: When did incompetence become acceptable? How did we all learn our professions? Does demanding adult-level, post-certification performance in the first attempt at something during the young, pre-certification learning experience help students mature?
Parents should be deeply concerned when teachers abdicate their adult roles and let students’ immaturity dictate their learning. A child makes a first attempt to write a sentence but doesn’t do it well, and the teacher records an F for, “Sentence Construction,” in the gradebook with no follow-up instruction and direction to try it again? ‘Really? We can’t afford uninformed, ineffective teaching like this. To deny re-learning and assessment for the major standards we teach is educational malpractice. Parents should thank their lucky stars for teachers who live up to the promise to teach our children, whatever it takes.
We can’t be paralyzed by the notion put forth by Dr. Laura Freisenborg in her Nov. 25 letter of juried journals of research as the only source of credibility. Dr. Friesenborg says that there has been, “…no robust statistical analysis of students national standardized test scores, pre- and post-implementation” of the practices for which I advocate. This is disingenuous because it’s physically and statistically impossible to conduct such study, as there are so many confounding variables as to make the “Limitations of the Study” portion of the report the length of a Tom Clancy novel. We do not have the wherewithal to isolate student’s specific outcomes as a direct function of teachers’ varied and complex implementations of so many associated elements as we find in SBG practices, including the effects of varied home lives and prior knowledge. If she’s so proof driven, where is her counter proof that traditional grading practices have a robust statistical analysis of pre- and post-implementation? It doesn’t exist.
She dismisses my work and that of the large majority of assessment and grading experts as anecdotal and a fad education program, declaring that I somehow think students will magically become intrinsically motivated. This is the comment of someone who hasn’t done her due diligence regarding the topic, dismissing something because she hasn’t explored it deeply yet. Be clear: There’s no magic here – It’s hard work, much harder than the simplistic notion that letter grades motivate children.
Friesenborg diminishes the outstanding work of Daniel Pink, whose book Drive is commonly accepted as well researched by those in leadership and education, and she does not mention the work of Vygotsky, Dweck, Bandura, Lavoie, Jensen, Marzano, Hattie, Reeves, Deci, Ripley, de Charms, Stipek and Seal, Southwick and Charney, and Lawson and Guare, whose collective works speak compellingly to the motivational, resilience-building elements found in standards-based grading. Is it because she is unaware of them, or is it because their studies would run counter to her claims? Here she is distorting the truth, not helping the community.
We DO have research on re-learning/assessing (see the names mentioned above), but it’s very difficult to account for all the variables in the messy enterprise of learning and claim a clear causation. Some strategies work well because there’s support at home, access to technology in the home, or a close relationship with an adult mentor, and some don’t work because the child has none of those things. Sometimes we can infer a correlation in education research, but most of the time, good education research gives us helpful, new questions to ask, not absolute declarations of truth. When research does provide clear direction, we are careful still to vet implications thoughtfully, not dismiss what is inconvenient or doesn’t fit our preconceived or politically motivated notions.
When we are anxious about our community's future, we want clear data points and solid facts, but teaching and learning are imperfect, messy systems, and we're still evolving our knowledge base. Many practices have stood the test of time, of course, but it's only a minority of them that have a strong research base. We can't cripple modern efforts by waiting for one, decisive research report to say, "Yay or Nay." At some point, we use the anecdotal evidence of the moment, asking teachers to be careful, reflective practitioners, and to welcome continued critique of practices in light of new perspective or evidence as it becomes available. If we're setting policy, we dive deeply into what is available in current thinking and research nationwide so our local decisions are informed.
In her letter, Friesenborg describes standards-based grading as, "radical." Please know that it is quite pervasive, with thousands of schools across the country actively investigating how to implement it or having already done so. Most states, in fact, are calling for competency-based learning and reporting to be implemented. Friesenborg states that the Iowa State Board of Education makes standards-based learning a legislative Advocacy Priority. This is a positive thing, and SBG practices promote exactly this. We want accurate reporting. That means we separate non-curriculum reports from the curriculum reports. It helps all of us do our jobs, and it provides more accurate tools for students to self-monitor how they are doing relative to academic goals.
Such grading practices are not even close to the definition of radical. Read the observations of schooling in Greece, Rome, Egypt, Babylonia, and on through the 1700's, the Renaissance, the 1800's, and the 1900's: Grades reporting what students had learned in their subjects were the predominant practice. There were separate reports of children's civility and work habits. That's what we're doing here with SBG, nothing else. It's dramatically more helpful than a grade that indicates a mishmash of, "Knowledge of Ecosystems, plus all the days he brought his supplies in a timely manner, used a quiet, indoor voice, had his parents sign his reading log for the week, and brought in canned food for the canned food drive." In no state in our country does it say, "Has a nice neat notebook" in the math curriculum. That's because it's not a math principle. It has no business obscuring the truth of our child's math proficiency.
We have plenty of research, let alone anecdotal evidence, that reporting work habits in separate columns on the report card actually raises the importance of those habits in students’ minds, helping them mature more quickly in each area. The more curriculum we aggregate into one symbol, however, the less accurate and useful it is as a report for any one of the aggregated elements or as a tool of student maturation. SBG takes us closer to the fundamental elements of good teaching and learning.
Rick Wormeli
Sometimes when you're learning a new skill or trying to figure out how to apply a new philosophy, it helps to watch that skill or philosophy being used or implemented in a totally different arena. Thinking outside the box and adopting new ideas can be difficult when you're extremely familiar with your own domain. Observing the skill or philosophy at work in someone else's domain is less threatening. Once you are able to see the benefit of the skill or the power of the philosophy, it might be easier to figure out how to incorporate it into your personal realm of familiarity.
I think this might hold true for the application to the classroom of the philosophies of Assessment FOR Learning, Standards Based Grading, and Measuring Student Growth.
Below is a recent Sports Illustrated article about the Oklahoma City Thunder's Kevin Durant. As I read it, I was struck by just how much sense it makes to assess for the purpose of learning (not grading), to grade and assess based on standards, and to intentionally and meaningfully measure growth. It just makes so much sense when it comes to improving in life, as evidenced by this article about Durant's attempts to improve as a basketball player. I wonder why it doesn't always make sense in the classroom, where we educators are working tirelessly to get students to improve.
Read the article below for yourself, and as you do, pay attention to the intentional steps Kevin Durant has taken to improve his shooting.
- He is constantly - daily - assessing himself.
- He has broken down shooting into "standards" based on different locations on the floor.
- He is using the feedback from the assessments to determine what "standards" he needs to practice and where he needs to grow.
- His improvement is constantly being charted so that he and his personal trainer/shot doctor/video analyst/advance scout can keep adjusting the learning plan for maximum growth.
It just makes so much sense for him to do this. Durant wants to grow, and this is how one intentionally sets out to grow.
Likewise, it makes sense to me that every teacher would want to:
- Constantly - daily - assess students.
- Break down learning into standards based on content knowledge and skills.
- Use assessment feedback to determine which standards individual students need to focus on in order to grow.
- Constantly chart improvement so that learning plans can be adjusted for maximum growth. (A simple sketch of what this kind of per-standard tracking might look like follows this list.)
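For readers who like a concrete illustration, here is a minimal, hypothetical sketch in Python of what that kind of per-standard tracking could look like - logging scores by standard, flagging standards still below a mastery cutoff, and measuring growth. The standard names, scores, and the 80% cutoff are invented for the example and are not part of the original post or the article below.

```python
# Hypothetical illustration only: per-standard score tracking in the spirit of
# Durant's location-by-location shot charting. Standards, scores, and the 80%
# mastery cutoff are made up for the example.
from collections import defaultdict


class StandardsTracker:
    def __init__(self):
        # Each standard maps to the list of assessment scores recorded for it.
        self.scores = defaultdict(list)

    def record(self, standard, percent):
        """Log one formative assessment result (0-100) for a standard."""
        self.scores[standard].append(percent)

    def focus_areas(self, mastery_cutoff=80):
        """Standards whose most recent score is still below the cutoff."""
        return [s for s, results in self.scores.items() if results[-1] < mastery_cutoff]

    def growth(self, standard):
        """Change from the first recorded score to the most recent one."""
        results = self.scores[standard]
        return results[-1] - results[0]


tracker = StandardsTracker()
tracker.record("solving linear equations", 55)
tracker.record("solving linear equations", 72)
tracker.record("graphing lines", 88)
print(tracker.focus_areas())                       # ['solving linear equations']
print(tracker.growth("solving linear equations"))  # 17
```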
So read the article below, look for the examples of Assessment FOR Learning, Standards Based Grading, and Measuring Student Growth, and then consider how you could better apply them to your classroom.
HOW 'BOUT THEM APPLES?
Copied from http://www.sportsillustrated.com and written by Lee Jenkins (@SI_LeeJenkins)
On the day after the Heat won their 27th game in a row, Kevin Durant sat in a leather terminal chair next to a practice court and pointed toward the 90-degree angle at the upper-right corner of the key that represents the elbow. "See that spot," Durant said. "I used to shoot 38, 39 percent from there off the catch coming around pin-down screens." He paused for emphasis. "I'm up to 45, 46 percent now." Durant wore the satisfied expression of an MIT undergrad solving a partial differential equation. You could find dozens of basic or advanced statistics that attest to Durant's brilliance this season - starting with the obvious, that he became only the seventh player ever to exceed 50% shooting from the field, 40% from three-point range and 90% from the free throw line - but his preferred metric is far simpler. He wants what Miami has, and he's going to seize it one meticulously selected elbow jumper at a time.
The NBA's analytical revolution has been confined mainly to front offices. Numbers are dispensed to coaches, but rarely do they trickle down to players. Not many are interested, and of those who are, few can apply what they've learned mid-possession. Even the most stat-conscious general manager wouldn't want a point guard elevating for an open jumper on the left wing and thinking, Oh no, I only shoot 38% here. But Durant has hired his own analytics expert. He tailors workouts to remedy numerical imbalances. He harps on efficiency more than a Prius dealer. To Durant, basketball is an orchard, and every shot an apple. "Let's say you've got 40 apples on your tree," Durant explains. "I could eat about 30 of them, but I've begun limiting myself to 15 or 16. Let's take the wide-open three and the post-up at the nail. Those are good apples. Let's throw out the pull-up three in transition and the step-back fadeaway. Those are rotten apples. The three at the top of the circle - that's an in-between apple. We only want the very best on the tree."
The Thunder did not win 27 straight games. They did not compile the best record. Durant will not capture the MVP award. All he and his teammates did was amass a season that defies comparison as well as arithmetic. They scored more points per game than last season even though they traded James Harden, who finished the season fifth in the NBA in scoring, five days before the opener. They led the league in free throws even though Harden gets to the line more than anybody. They posted the top point differential since the 2007-08 Celtics, improving in virtually every relevant category, including winning percentage. Their uptick makes no sense unless Durant was afforded more shots in Harden's absence, but the opposite occurred. He attempted the fewest field goals per 36 minutes of his career. He didn't even take the most shots on his team, trailing point guard Russell Westbrook, and he seemed almost proud that his 28.1 points per game weren't enough to earn the scoring title for the fourth consecutive year. "He knows he can score," says Thunder coach Scott Brooks. "He's trying to score smarter."
Durant is lifting Oklahoma City as never before, with pocket passes instead of pull-ups, crossovers instead of fadeaways. He remains the most prolific marksman alive, unfurling his impossibly long arms to heights no perimeter defender can reach, but he has become more than a gunner. He set career marks in efficiency rating, assists and every newfangled form of shooting percentage. "Now he's helping the whole team," says 76ers point guard Royal Ivey, who spent the past two seasons with the Thunder. "Now he's a complete player." The Thunder are better because Durant is better. Of course, the Heat will be favored to repeat as champions, and deservedly so. But Oklahoma City has been undercutting conventional wisdom for six months.
NBA history is littered with stars who languish in another's shadow, notably Karl Malone, Charles Barkley, Patrick Ewing and Reggie Miller through the Michael Jordan reign. Oklahoma City lost to Miami in the Finals last June, and Durant will surely be runner-up to LeBron James in the MVP balloting again. Durant is only 24 and is as respectful of James as a rival can be, but he's nobody's bridesmaid. "I've been second my whole life," Durant says. "I was the second-best player in high school. I was the second pick in the draft. I've been second in the MVP voting three times. I came in second in the Finals. I'm tired of being second. I'm not going to settle for that. I'm done with it."
"I'm not taking it easy on [LeBron]. Don't you know I'm trying to destroy the guy every time I'm on the court?"
Justin Zormelo doesn't have a formal title. He is part personal trainer and part shot doctor, part video analyst and part advance scout. "He's a stat geek," Durant says, expanding the job description. Zormelo sits in section 104 of Oklahoma City's Chesapeake Energy Arena, with an iPad that tells him in real time what percentage Durant is shooting from the left corner and how many points per possession he is generating on post-ups. After games, he takes the iPad to Durant's house or hotel room and they watch clips of every play. Zormelo loads the footage onto Durant's computer in case he wants to see it again. "If I miss a lot of corner threes, that's what I work on the next morning before practice," Durant says. "If I'm not effective from the elbow in the post, I work on that." Zormelo keeps a journal of their sessions and has already filled two notebooks this season. Last year Zormelo noticed that Durant was more accurate from the left side of the court than the right, and they addressed the inconsistency. "Now he's actually weaker on the left," Zormelo says, "but we'll get that straightened out by the playoffs."
Zormelo, 29, was a student manager at Georgetown when Durant was a freshman at Texas, and they met during a predraft workout at Maryland that included Hoyas star Brandon Bowman. Durant embarked on his pro career and so did Zormelo, landing an internship with the Heat and a film-room job with the Bulls before launching a company called Best Ball Analytics in 2010 that has counted nearly 30 NBA players as clients. Zormelo kept in touch with Durant, occasionally e-mailing him cutups of shots. They bonded because Zormelo idolizes Larry Bird and Durant does, too.
Durant left a potential championship on the table in 2011, when Oklahoma City fell to Dallas in the Western Conference finals. About two weeks after the series, Durant scheduled his first workout with Zormelo in Washington, D.C. "I didn't sleep the night before," Zormelo remembers. "I was up until 4 a.m. asking myself, What am I going to tell the best scorer in the league that he doesn't already know?" They met at Yates Field House, where Georgetown practices, and Zormelo told Durant, "You're really good. But I think you can be the best player ever." Durant looked up. "Not the best scorer," Zormelo clarified. "The best player." It was a crucial distinction, considering Durant had just led the league in scoring for the second year in a row yet posted his lowest shooting percentage, three-point percentage and assist average since he was a rookie. He was only 22, so there was no public rebuke, but he could not stand to give away another title.
"He was getting double- and triple-teamed, and in order to win a championship, he needed to make better decisions with the ball," says former Thunder point guard Kevin Ollie, now the head coach at Connecticut. "He needed to find other things he could do besides force up shots. That was the incentive to change his pattern." Over several weeks Zormelo and Durant formulated a written plan focusing on ballhandling, passing and shot selection. They were transforming a sniper into a playmaker. Growing up, Durant dribbled down the street outside his grandmother's house in Capitol Heights, Md. He played point guard as a freshman at National Christian Academy in Fort Washington. He watched And1 DVDs to study the art of the crossover. "Where I'm from, you got to have the ball," Durant says. "That's how we do it. We streetball." But he sprouted five inches as a sophomore, from 6'3" to 6'8," and suddenly he was a forward. Though his stroke didn't suffer, his handle did. "I still had the moves," Durant insists, "but I dribbled way too high."
He could compensate in high school, and even during his one season at Texas, but the NBA was changing to a league where the transcendent are freed from traditional positions and boundaries. When Portland was deciding between Durant and Ohio State center Greg Oden before the 2007 draft, Texas coach Rick Barnes copped a line that Bobby Knight used when the Blazers were debating between Jordan and center Sam Bowie in 1984. "He can be the best guard or he can be the best center," Barnes told G.M.'s. "It doesn't matter. Whatever you need, he'll do." The Trail Blazers selected Oden and Durant was taken second by Seattle, where coach P.J. Carlesimo started him at shooting guard. "Kevin could be all things," Carlesimo says, but back then he was too gangly to hold his spot or protect his dribble. Brooks replaced Carlesimo shortly after the franchise relocated to Oklahoma City the following season and wisely returned him to forward.
This season Durant is averaging two fewer field goals and nearly two more assists than he did in 2011, and he has practically discarded two-point shots outside 17 feet. Brooks tells him on a near nightly basis, "KD, it's time. I need you to shoot now." Says Brooks, "To extend the apple metaphor, I'm now able to put him all over and get fruit." He isolates Durant at the three-point line, posts him up and uses him as the trigger man in the pick-and-roll. When defenders creep too close, Durant freezes them with a crossover at his ankles or deploys a rip move that former Thunder forward Desmond Mason taught him four years ago to pick up fouls.
"Remember when tall guys would come into the league and people would say, 'They handle like a guard!' but they never actually did handle like a guard?" says Thunder forward Nick Collison. "Kevin really does handle like a guard." Durant has become both facilitator and finisher, shuttling between the perimeter and the paint, stretching the limits of what we believe a human being with his build can do. If his progression reminds you of someone else's, well, that's probably not an accident.
Without Harden, Oklahoma City needed a new playmaker, and Durant had spent more than a year preparing for the role. He just didn't realize it at the time. "They were looking for somebody else to move the defense and handle the ball in pick-and-roll," says a scout. "It turned out to be him." When Durant was 20, the Thunder asked him to act 25, and now that he is nearly 25, the plan for his prime has come to fruition. He is the NBA's best and perhaps only answer for James. "I've given up trying to figure out how to stop him," said Celtics coach Doc Rivers. "And I'm not kidding."
On Nov. 24, four weeks after Harden left, the Thunder were a respectable but unremarkable 9-4 and nursing a five-point lead with one minute left in overtime at Philadelphia. Durant posted up on the right wing, bent at the waist, a step inside the perimeter. Dorell Wright, the unfortunate 76er assigned to him, planted one hand on Durant's rib cage and another on his back. "What do I tell a guy in that position?" asks an NBA assistant coach. "I shake his hand and say, 'Good luck.'"
Durant faced up against Wright, tucked the ball by his left hip and swung his right foot behind the arc, toe-tapping the floor like a sprinter searching for the starting block. Durant had scored 35 points, but on the previous possession he fed Westbrook for a three, and on the possession before that he set up a three by Kevin Martin, who had arrived from Houston in the Harden trade. It was time for the Durant dagger, but before he shimmied his shoulders and unfurled his arms he spotted guard Thabo Sefolosha, ignored in the left corner. Sefolosha was 1 for 6, and in the previous timeout Durant had told him, "You're going to make the next shot." Durant could have easily fired over Wright and finished the Sixers, but he let his mind wander to the ultimate destination, seven months away. I'm going to need all these guys to get to the Finals, he thought.
Durant took one dribble to his left, and center Lavoy Allen rushed up to double him at the free throw line. He dribbled twice more, to the left edge of the key, and two other Sixers slid over. Surrounded by four defenders, Durant finally shoveled to Sefolosha, so open that he feared he might hesitate. He didn't. Durant jabbed him in the chest as the ball slipped through the net.
How about them apples?
- Homework would still be given but would either not count for points or add up to a single homework grade of approximately 30 points. Another idea I have contemplated is that at the end of the grading period students with all homework completed would get a reward, perhaps a pizza party, while students with missing assignments would spend that time completing their work.
- Quizzes would still be given almost daily but would now only count 10 or 15 points each. In addition, if a student's test grade was higher than the quizzes that led up to it, I would excuse the quiz grades for that student.
- Tests would count more. In the class I taught, the tests were used as the ultimate gauge of mastery learning. The tests would continue to build on themselves but would probably start somewhere around 300 points and build up to around 800 points.
- To build on the point I made above, the quizzes would be excused if the student's test grade was higher. The quizzes would be considered practice grades. Students would be trained not to fret about quizzes but instead to use them as ways to gauge their learning. I might even borrow Beth Moody's GPS idea and allow students to retake an occasional quiz; however, this would probably not be the case for most quizzes since, whenever possible, I would be repeating quizzes anyway.
- The goal of quizzes would be to practice for the test. In the past I viewed the quizzes more as grades unto themselves. The problem with this, though, was that if I had four 30-point quizzes before a 100-point test, then the quizzes added up to more than the test. Adding in the four or five 10-point homework assignments further got in the way. Yes, they were assessments that helped the students learn, but they also had an inappropriate impact on the grade. They could help the student master the content, as evidenced by the high test score, while simultaneously lowering the student's grade.
- If I were in the classroom today, I would add an entirely new element of students assessing themselves. I would want students to take control of their own learning and to know what they do and don't know. I would then want them to use that knowledge to guide their own studies.
- One thing I would do is make sure that every day (if possible) the students and I would both receive feedback. As I prepared my lessons, I would ask myself the questions posed in this earlier post.
- When I reviewed with students for tests I would change my method and adopt a strategy similar to this one used by Paola Brinkley and many other teachers in our building. (I would probably find a way to turn it into a game since I love playing games in class.)
- At the beginning of each unit/topic I would give students a rubric like the one in this post. At some point during most class periods I would have the students use the rubric to assess themselves and see how well they are mastering content. They would then use the rubric as a study guide as described in the post.
- I would also have students analyze their grades regularly so that they would know how well they needed to do on a test to reach their grade goal. (Implied in this is the fact that I would first have students regularly set goals.) I would use a strategy similar to this one used by Lewis Armistead. A rough sketch of this kind of grade arithmetic appears after this list.
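To make the arithmetic behind these ideas concrete, here is a minimal, hypothetical sketch in Python of two of the calculations described above: excusing quizzes that a later test score beats, and figuring out how many points a student needs on an upcoming test to reach a grade goal. The point values, percentages, and the exact excuse rule are illustrative assumptions, not the precise gradebook setup described in the bullets.

```python
# Hypothetical illustration only: one possible reading of the quiz-excuse rule
# and of the "how much do I need on the test?" calculation. All numbers are
# made up.

def quizzes_that_count(quiz_percents, test_percent):
    """Keep only quizzes that meet or beat the later test score; quizzes that
    fall short are treated as practice and excused from the grade."""
    return [q for q in quiz_percents if q >= test_percent]


def points_needed_on_test(earned, possible, test_points, goal):
    """Points a student must earn on an upcoming test worth test_points to
    reach an overall percentage of goal (e.g., 0.85 for 85%)."""
    needed = goal * (possible + test_points) - earned
    # Can't need fewer than 0 points or more than the test is worth.
    return round(max(0.0, min(needed, float(test_points))), 1)


# Quizzes of 60%, 75%, and 92% before a test scored at 85%: only the 92% counts.
print(quizzes_that_count([0.60, 0.75, 0.92], 0.85))   # [0.92]

# 410 of 500 points so far, a 300-point test coming up, aiming for 85% overall.
print(points_needed_on_test(410, 500, 300, 0.85))     # 270.0
```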