This is IDIGOntario's second post of the #9x9x25 Challenge.
When I signed up to be on the IDIG team I very vaguely said that I would like to write something about the “Front End Analysis” phase of Instructional Design. Also known as the “what are we doing and why” part.
If you do this part well, you can avoid making big mistakes down the road. You might even realize that you shouldn't do the project at all. You also tend to have that "come on, come on, let's get going!" feeling buzzing around you. I am feeling that right now as we prepare to try a new way of delivering Ontario Extend in January. But no matter how many angles you try to anticipate, something will surprise you when you implement it.
A good example of this came from the scholarly project I worked on to complete my Master's in Instructional Design. I created an instructional Alternate Reality Game (ARG). It was designed to help youth identify problem gambling behaviours and learn how to reduce their harm. I completed a lengthy front-end analysis in which I tried to anticipate who the learners would be and what needs I should meet to help them complete the game. I never considered that some kids might not be up for suspending their disbelief in what was meant to be a fun way to learn.
The first test went great, with a group of ninth grade students who were asked to participate and agreed of their own accord. They had fun and were successful in taking the story to its conclusion. The final test run, however, was a different story. Working with the program facilitator for the gambling awareness group, we chose to test the game on an entire class of (I think) 11th grade students at an "alternative" high school. I don't recall too much about the makeup of the class or the reasons they had enrolled in a "different" kind of high school. In general, you could say that the students were rightfully kind of pissed off about how their education was going so far.
They didn't want to pretend. They didn't want to suspend disbelief. They didn't give a damn about rescuing a fake dog. They completed the game activities, but it would probably have served them better to hand out a description of the harm reduction strategies and just have a frank discussion about how these things have affected their lives. I remember clearly the look one student gave me when he realized I was trying to trick him into playing along. That's when I knew that this program was not even close to the right thing to bring to them. It was utterly deflating.
The results of the test run were that yes, students reached the objectives. Learning was measured to have happened. But the feeling in the room was not the fun buzz I had been working toward in the back of my mind. It was the stark opposite.
I'm going way over 25 sentences by digging into that anecdote. My point is that I did not anticipate, at all, that this idea of learning via a game would resonate so poorly with these students. I didn't ask the right questions, or enough questions, in my front-end analysis. JR Dingwall's post, in which he did ask the right questions and helped bring about a great result, is what got me thinking about what questions to ask in the beginning.
So I ask you, what are the big questions you ask yourself and others when you first sit down to analyze a potential ID project? How can you avoid making something that leaves students feeling flat and misunderstood?
“Question?” flickr photo by spi516 https://flickr.com/photos/spi/2113651310 shared under a Creative Commons (BY-SA) license