Assessing assessment apathy
After three years at USI as a transfer student, I finally got caught by the Assessment Goblin. Only then did I find out the major field assessments are basically useless at reflecting a student's educational development.
Now, I know that I shouldn’t have bothered caring about the assessment, and neither should anyone else.
I used to give students guff for scoffing at assessment days. Of course they're important; the university needs to know how everyone is performing if it's going to report any sort of success to the room full of old white men who decide whether we get funding next year.
Then I learned the truth.
The big day came. I clocked out early from work knowing that, even though I was walking away from $62.50 in potential earnings at a job scheduled specifically so I could work on days when I don't have class, I would be helping the college provide quality education for future students. It was time to do the Goblin proud.
As it turns out, after three years of attentive studying and reading anything I could get my hands on, I would have failed the exam if there had been any, y'know, weight behind it.
I'm sure the company that sold the testing program to USI had an arsenal of figures and statistics ready to prove their test would accurately measure how well students retain information in their majors. There was probably even a pie chart.
Then their fancy testing system elected to slam an English senior, who’s never taken a single course designed to teach the intricacies of poetry, with an assessment that focused almost entirely on the intricacies of poetry.
Somewhere in cyberspace the Assessment Goblin reached his hands into a bin labeled "300-level poetry questions" and flung handful after handful at a man who's written two poems in the last decade, a man who took three minutes to remember the word "stanza" a week prior.
I was hit with questions on the intricacies of poem structure, the mechanics of how a stanza is organized and the names of techniques used to construct poems I'd never read before.
The questions were so bizarrely specific that I made a mental note of two of the more difficult ones and texted a poetry-nerd friend about them as I walked to my car. The response was the textual equivalent of a shrug.
The next day I asked anyone who would stop to listen how their assessment went. They either fell into the camp of people who tried and failed like I did, or they were among the masses who clicked randomly and finished the test within 20 minutes.
I envy the second group.
Here's a free tip for the university from an ex-education major: if you want to assess a student's development properly, you need something more targeted than what's being used now. There were questions on my test that felt like they existed purely because they sat in a bank of pre-written questions that came with the software.
Until that changes, students are going to fight the Goblin the only way we know how: answering randomly so we can go home and eat fast food while trying not to feel bad about a score on a meaningless test.
Feel free to go write a condescending letter to the editor explaining how I’m wrong.