How reliable are report cards?

As owner and director of a private educational center, I often meet with students who are facing an entrance exam of some sort: the Secondary School Admission Test and the Independent School Entrance Exam for private day and boarding schools; the COOP and High School Placement Test for parochial schools; and the SAT and ACT for colleges and universities, among others. These are usually the first national exams most Connecticut public school students encounter. Parents often come to enroll their children in a test preparation course after they receive the results of the first attempt. They explain that their children earn A’s and B’s in their classes and perform at the “mastery” level on the state’s CMTs and CAPTs, so they were confident the children would score very high on the national exams. Then their children take the test. Too often these parents are devastated to find that the student who performs at the top of her class ranks only in the 30th or 40th percentile on the more comprehensive and challenging exams. I explain that these basic-level state curriculum exams bear no resemblance to the national tests. So why is the information from the school and state such an unreliable indicator of a student’s ability and performance?

Grades are subjective. Teachers award grades using a wide range of criteria, including homework assignments, projects, class participation, attendance, and behavior. In many cases, grades are inflated. A grade of C is supposed to represent average work, and unlike in the fictional Lake Wobegon, not all real students are above average. Yet a majority receive grades in the A or B range. Just look at the local newspaper to see how many students appear on the honor roll. In fact, a parent with a very capable child in an elementary school recently asked me why 30% of the students in her child’s class received A’s; she was reluctant to ask the teacher herself. I offered one personal experience: many years ago, as a principal evaluating a teacher’s performance, I asked why the vast majority of his students had such high grades. He explained that “that was what the parents wanted to see,” and that he would have to do some explaining if he gave a C. Unfortunately, this situation is not unique.

The state exams do not reflect student ability. Rather, they serve to illustrate a district’s adherence to state curriculum standards. The statistics allow state educators to identify pockets of success and failure and to compare performance across a variety of areas. And Connecticut is doing dreadfully in that arena! The National Assessment of Educational Progress (NAEP) released a report at the beginning of this month ranking Connecticut as having the largest gap of all 50 states between student performance in “poor” versus “non-poor” districts.

Private and parochial schools use national tests such as the Iowas and the ERBs to assess student performance on a national level. Many public schools participate in the Johns Hopkins Center for Talented Youth program, which requires students to qualify by taking a national exam (the SCAT in the lower grades or the SAT in grade 7 or 8). Yet school personnel must first recognize a student’s ability before they can recommend the student for participation. Because the state requires no national standardized exam, parents can also nominate their own children for the CTY program. For parents of public school students to get a real measure of their children’s performance nationally, they must be proactive and schedule testing themselves. How many parents have the knowledge, finances, or time to do this? The school system needs to institute guidelines for reliable and objective student assessment.
