Wednesday, August 24, 2016

Breathing Fresh AIR into Lab Reports

One of the first posts I ever wrote for this blog was about using the PARCC rubric for Narrative and Analytic Writing to create a rubric that I could use for lab reports. My reasoning was that some of my students would experience PARCC tests and using this rubric would help cement it in their minds. Also, that rubric was developed by a team of people and I believed it to be stronger than what I had been using. 

I used my adapted rubric for one year, and Ohio gave the PARCC tests one time, but then our state legislature voted to abandon PARCC and write our own state tests. This happened during the summer and when it was time to start school, a new rubric for writing had not emerged, so I stuck with the adapted PARCC rubric last year. Now Ohio has developed rubrics to use with its newly designed assessments, so a friend and I adapted the Ohio AIR rubric for Explanatory Writing for this year's lab reports. Here is what we came up with:



In past years I provided students with a set of guidelines for writing lab reports, and we took a quick look at the rubric. Still, when I graded the first set of papers, some students had really missed the mark. This year I tried a different approach.

I provided my students with two sample reports from previous years. I chose one report that was a very strong example and one that was a very weak example. I asked the students to read the reports, referring to the rubric, and then, as a group, determine a grade for each report out of 20 points. I asked each group to report their scores and we listed their results on the whiteboard.

Sample 1, the strong report, was scored as an 18, 19, or 20 out of 20 by every lab group. The students agreed that it was well-written, contained all pertinent details, and could be regarded as an exemplar. Sample 2, the weak report, received a much wider range of scores, from 4 to 12 out of 20, so we looked at this one in greater detail. We talked through what the author did well and where she faltered. The students were much more critical than I was. I think I recorded this report as a 13 or 14 out of 20 when I graded it a few years ago.

Overall, this was a better approach to introducing lab reports than just reading the guidelines and glancing at the rubric. At least I hope it was. First lab reports from my classes are coming in today and tomorrow. I am hoping for much better first reads!

Sunday, August 21, 2016

Another Reason to Check Out Chalkup

In April I posted about my use of Chalkup to teach a PD course. This summer I used Chalkup again to teach a new section of that same class and I found another feature that I really liked - the collaborative discussion.

We chose a text that we wanted the participants to read and attached it to a collaborative discussion in Chalkup. We also posted some questions about the reading and vocabulary. Then participants read the article and, as they did, they commented on the text using some cool commenting features. You can attach a comment to a point you create, to text you select, or to an area of the text you select. Others can respond to your comments and create their own, all in the same document. The result looked something like this:


I like the idea of this tool to help with close reading and to gather initial ideas on a text before discussing it widely in class. Maybe students could read and comment or select text as evidence while they read. Then, after everyone is finished, a face-to-face discussion could follow. I also like that the ideas of the participants are still there for a closer look once a lesson is over.

Chalkup is a neat and free option for an online version of a class. With collaborative discussions, it moves even higher on my list of must-try tools!

Monday, August 15, 2016

Measure their Minds with Mentimeter

Last week I presented at my district's Blended Learning Conference. I had seen a blurb about Mentimeter, so I tried it out. I liked it a lot (details below) and will definitely use it again!

Mentimeter captured my interest right away because it offers question types that I haven't seen in a lot of other formative assessment tools. In addition to standard multiple-choice questions, Mentimeter offers word clouds, sliding scales, and a 2x2 matrix. I love a 2x2 matrix! Here are the question types:


After you log in to your account, you create a "presentation." The presentation consists of the questions you will ask. My presentation was just a quick poll at the beginning of my session about web tools we all use to bolster our own professional development. I started with the word cloud question you see above and followed up with the sliding scale question below:


At the session, it was very easy to launch. Click the name of the presentation and you are provided with a link and a code to share with your audience. Participants join by going to the link and entering the code. The presenter can choose whether the questions advance at the presenter's pace or the audience's pace, and whether or not to share the audience responses as they come in.

It's a little hard to see in the image above, but I liked an almost hidden feature of the sliding scale question. The numbers in the circles above represent the average score for each tool. When you hover over a particular tool, you can see a wave plot of how many people chose each of the values on the scale. I thought that was really slick!

Start to finish - from creating the questions to launching - took me only about 15 minutes. I spent more time thinking about and deciding among the cool question types than I did setting everything up. There is a free plan that allows for unlimited audience size (great for big groups), and the tool works on any device - smartphone, tablet, or computer. I thought Mentimeter was a fresh look at formative assessment. Definitely worth a look!