Test Data Analysis: A Wasted Opportunity
David L. Gray

Public school districts spend millions of dollars annually to procure and administer standardized tests, but those responsible for interpreting and disseminating results unwittingly misuse or ignore a valuable instructional tool.
Test data are not used extensively because administrators are not trained adequately to analyze the information they receive from test agencies. The technical skills needed for data analysis can be developed through self-help sessions or refresher courses on tests and measurements. Yet too many school leaders remain deficient in this area. They have chosen the easiest method of dealing with standardized test scores: avoid discussing them.
Moreover, superintendents usually are no more attuned to test data analysis than their subordinates. They are not comfortable holding principals accountable for skills they themselves did not master as principals. Consequently, when strengths and weaknesses identified by standardized tests go unnoticed, the direction provided by the superintendent usually is cursory.
Because superintendents are responsible for test procedures within their districts, they should ensure the procedures include analyses to determine learning trends and effectiveness of textbooks and materials. Yet they rarely do.
The cycle of administering tests, receiving results, filing them, and dealing with more immediate concerns repeats itself.
Test data are not used extensively because the bureaucracy through which results are returned to local schools is not user-friendly. Analyzing scores may be one item on the list to accomplish, but other issues demand more immediate attention.
Selecting Methods
Superintendents are key to improving methods by which subordinates use standardized test results. They should conduct data reviews with principals and instructional leaders to establish instructional goals and objectives.
Someone from the superintendent's staff must understand the use of test data and be given complete authority for the district's test services program. A coordinated effort will include facilitating instructional improvement, establishing a single method by which data will be interpreted, and determining the format in which they will be disseminated.
Colleges of education responsible for training public school principals and teachers need to provide more direction in tests and measurements. A minimum of two graduate courses in this subject should be required for administrative certification and at least one course for teachers. The most reliable point at which to develop competence in interpreting test data is the entry level.
It is difficult to define a single method by which test analyses might be conducted. Most tests, however, are selected to measure cognitive development within an academic year. At a minimum, the testing process should compare each student's potential with his or her achievement and provide summaries for all students and grades.
Comparing potential to achievement is important. If standardized tests correlate with the curriculum, one should be able to determine whether students are achieving at levels consistent with their ability. Administrators should be able to extract from summary data the academic strengths and weaknesses of classes and grade levels. Those data may be used to develop schedules, allocate resources, and acquire materials to supplement content areas.
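Although the article predates desktop data tools, a minimal sketch can make the potential-versus-achievement comparison concrete. The sample records, the percentile-rank scale, and the 10-point flag threshold below are illustrative assumptions, not the author's prescribed method:

    # Minimal sketch: summarize the gap between measured ability ("potential")
    # and achievement by grade level. Scores are hypothetical national
    # percentile ranks; the -10 threshold for flagging a grade is arbitrary.
    from collections import defaultdict

    # Hypothetical records: (grade, ability percentile, achievement percentile)
    scores = [
        (3, 72, 58),
        (3, 55, 60),
        (4, 81, 79),
        (4, 48, 28),
    ]

    by_grade = defaultdict(list)
    for grade, ability, achievement in scores:
        by_grade[grade].append(achievement - ability)

    for grade in sorted(by_grade):
        gaps = by_grade[grade]
        mean_gap = sum(gaps) / len(gaps)
        # Flag grades where achievement runs well below measured ability.
        status = "review" if mean_gap < -10 else "on track"
        print(f"Grade {grade}: mean achievement-ability gap {mean_gap:+.1f} ({status})")

A grade flagged for review in a summary like this would prompt exactly the scheduling, resource, and materials decisions described above.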
Public Awareness
Once data analysis has taken place, administrators must deal with less tangible issues of standardized testing, such as public awareness. Principals and teachers must receive test results before they are disseminated to the public. Superintendents who reverse that order will create myriad problems.
"Keep it simple" is the watchword for presentations of test data. The public wants to know how its children performed and often is not conversant with test terminology.
Presentations to school staff, however, should enable them to understand how well students performed, where strengths and weaknesses lie, and what remediation is required. Faculty members should leave analysis meetings with suggestions for instructional improvement. Parent and faculty groups receive similar information, but faculties should discuss the results in greater detail.
Presenting test results to parents may alarm some administrators, but districts will be served well by making information available to those who have paid for it. A public discussion of data and an effort to improve what students learn will yield dividends.