As any reader of this blog over the past two years will know, I've developed a deep interest in the field of public education, and its relatives, public/social policy and political discourse. Maybe I should take it as a sign that I'm (finally!) a grown-up. Maybe it's a re-awakening of my poli-sci major past.
I managed to graduate with a B.A. in said field from Illinois's premier public university...without ever having taken a math class. It took me 15 years to realize what a mistake that was (UIUC revised its policy for students who graduated high school in 1993 or later). A strong foundation in mathematics is important in life--not simply for calculating insurance risks, or making a pampered living for oneself on the stock market, but for making the connections (or recognizing their lack) between research and reality. As John Ewing reports, and as Diane Ravitch reported in her book about NCLB, there is a problem with mathematics in this country.
I stopped trying in math when the work got too hard for my "smart" brain--to my great detriment as an adult. I'm trying to change that by recognizing that numbers aren't always meaningful without context. It seems that my fellow LAS grads often create and perpetuate falsehoods in their reporting, falsehoods that a better understanding of math, assessments, and standards might prevent. The more I read the words "studies show" or "research proves" in an article, the more I wonder whether that is really the case. In the newsfeed-happy world we live in, editors seem so desperate to be the first to break a story that they fail to check the facts or the data carefully before publishing. Twitter feeds and Reddit headlines that compress the information into even fewer characters only exacerbate the problem.
The truth is, research is not a quick process, and there really aren't any easy solutions to education reform. Last spring, I spent the better part of three weeks looking at data from a huge fundraising project in hopes of isolating trends and ideas that could inform future decision-making. You'll notice that I used the term data-informed, not data-driven. I have lately wondered if the interchangeability of these terms is a case of willful or accidental ignorance.
It seems to me that too much weight is placed on assessments and their relative value in elementary education. ISAT scores, which prospective parents routinely use to judge a school's merit, are really only one small piece of what should go into evaluating a school. Neither, of course, should parents treat the number or frequency of assessments as a sign that the school "teaches to the test." The real value of the data produced by standardized tests lies not in judging teachers or schools, but in tailoring teaching methods and learning opportunities to the students themselves. As our principal recently said, "If we don't know how students are learning, how can we accurately teach them?" (emphasis mine).