Last year, when our class was in the thick of creating political campaign strategies for the election of 1800 (Thomas Jefferson vs. John Adams – our first bare-knuckled, let's-take-down-the-other-guy-no-matter-what campaign), Aly sidled up to me with a gleam in her eyes. Could her team create a Flipagram, embed their slogans, and write their campaign messages as a rap? I had never heard of Flipagram, and had no idea how to do any of the things she proposed, but I was game. Sure, I said, go for it! Twenty minutes later, Aly and her team went to the front of the classroom and blew us all away with their campaign. After all the clapping and whooping had ceased, I looked down at my rubric and realized that it could not capture the type of skill and learning we had just witnessed – I needed to find another way.
In Assessing Students’ Digital Writing: Protocols for Looking Closely, Troy Hicks and a team of forward-looking educators have given us lenses through which to appreciate and evaluate the type of digital creativity that students like Aly seem adept at, and most engaged with. Hicks writes:
In an age where digital is, we can no longer afford to look at student work in the ways we have in the past…both the processes and the products of writing continue to undergo change in the digital age; thus, it is crucial that teachers at all grade levels begin to initiate serious conversations about how writing is taught, how we value the process of writing, and how we pay attention to the assessments of students’ multimodal compositions.
Hicks and his NWP team of teachers used an inquiry process to view student digital writing. These “protocols” allowed teachers to examine and discuss student work at each stage of the writing process. The discussions, in turn, allowed each teacher to adjust and refine his or her own teaching to help each student create the best, most authentic piece possible. Assessment of digital writing looks and feels very different from what we as teachers are most used to (checklists, rubrics, etc.), so it is informative to read through these discussions in preparation for the work of our own students.
Each team member brought one student-created digital project to the inquiry process, and Hicks and his team evaluated these digital pieces through the following protocols:
- The artifact: the finished product.
- The context: how the finished product was conceived and its purpose.
- The substance: the overall content and the quality of its ideas.
- Process management and technique: the technical skills necessary to create the project.
- The habits of mind: the behaviors and attitudes of the student as he or she worked through problems, took risks, etc.
Through their discussions, team members found themselves asking new questions about their teaching practices and developing new ways to assess their students’ writing process. Each chapter is a case study of how a student initiated and then developed his or her project, and how each teacher responded to the learning steps along the way. It was interesting to see how open these teachers had to be to their students’ own problem-solving skills, and how the discussions with their project cohorts helped them arrive at more authentic assessments.
Julie Johnson, whose fourth grader Carson created a Wonderopolis-inspired project, shares this insight about constructing new ideas about assessment that she gained through this protocol inquiry:
I find that I am more keenly aware of the digital moves individual students are making. In addition, these close observations allow me to have deeper discussions and ask strategic questions as I build my assessment into our writing conferences and any other conversations we might have throughout the day.
Jeremy Hyler, whose seventh grader Lauren created a book trailer, writes about the shift in his thinking about crafting rubrics for digital writing assessment:
Originally, I created a rubric that was mostly based on specific attributes of the written book review and the digital book trailer, but not necessarily the quality of those attributes. Thus, in terms of having a rubric, I decided on creating a structure for students where they can work within guidelines, not a rubric. I feel a rubric doesn’t give the freedom for students to be creative, and it doesn’t allow them to explore different possibilities for meeting the criteria set before them. A rubric forces students to meet certain expectations, and there is no room for them to go “outside the box.”
And Bonnie Kaplan and Jack Zangerle, whose eighth grader Katie created a public service announcement, reassure us that encouraging our students to create multimodal digital pieces can be done without losing sight of academic content:
Katie’s early draft highlights the fact that though her work may be visually and technically impressive, it is still the content of the piece that had to take center stage. As we evaluate rich student work, we are keeping the content learning at the center. This speaks to the concerns of many “content” area teachers who hesitate to dive too deeply into authentic literacy experiences. The “stuff” of the piece still must remain central to the focus of our work with students.
Our students live in a world dominated by digital media, and so it is only natural that they want to engage and explore new ways of expressing themselves. Assessing Students’ Digital Writing: Protocols for Looking Closely is a thoughtful read for those of us struggling to allow this exploration to flourish in our writing workshop classrooms. This book, paired with Franki Sibberson and William Bass’ Digital Reading: What’s Essential in Grades 3–8, would be a wonderful book club choice for teachers to read together and discuss before creating assessments for their students. After all, don’t we want to encourage the creativity and enthusiasm of students such as Aly?
9 thoughts on “How to Assess Students’ Digital Writing – Ideas from Troy Hicks’ New Book”
It sounds like a whole new ballgame. Essentially, I assume, the basic rule applies that the writing makes sense and the main idea is backed up with examples and arguments. What are the major differences?
The major differences have to do with assessing the digital components, which are anchored in technical issues. Our kids are using new platforms to say what they used to say in essays and reports alone, so that aspect needs to be guided and also assessed.
Thank you for the information.
Tara, I loved how you were “game” and allowed your students to choose their own way of expressing their learning!! Kudos to you for taking that risk with your students!
The longer I have been in teaching, the easier it has become to let students lead and take risks. It’s what makes learning relevant and exciting for them.
Moving to “practical” and/or demonstrating real-life skills sounds like a perfect win for students. Love the idea of allowing students to have more creative choices of the product, context, substance, process management, and the habits of mind.
I think choice really does foster engagement, Fran. But even choice needs parameters and guidance from us. Always a balancing act, right?
Please consider linking this post to the DigiLit Sunday round-up here: https://reflectionsontheteche.wordpress.com/2015/10/11/digilit-sunday-choice/
I am concerned about the creativity of our students in this age of CC and testing. Digital literacy lends itself to a new kind of creativity. I haven’t thought about changing the way I assess these products. Thanks for getting me thinking and reflecting on assessment of digital literacy.
This is a wonderful book through which to learn how to change assessment stances for digital products, Margaret. It certainly helped me!