A study of the impact of Alberta achievement tests on the teaching-learning process shows that the Alberta achievement testing program is unfair to students and teachers.

The study, conducted by University of Alberta professor Peter Calder, was released in June as students were writing the 1997 achievement tests.

Calder's research included an analysis of teachers' views, a survey of Grades 3, 6 and 9 teachers, and an analysis of the testing program itself. Calder contacted 90 randomly selected ATA representatives, who distributed an open-ended questionnaire to Grades 3, 6 and 9 teachers in their schools. These teachers were asked for their views, both positive and negative, about the achievement testing program. The responses were analyzed and seven themes were identified: focus of instruction, teacher evaluation, public pressure, norming issues, student stress, fairness and validity. These themes were used to develop a survey in which Calder contacted almost 200 teachers of Grades 3, 6 and 9 by telephone.

Teachers in Grades 3 and 6 said that achievement tests are unfair for their students; teachers in Grade 9 believe that the tests are fair. Teachers believe that the tests force them to narrow their teaching practices and create stress for students. Many teachers do not believe that parents of their students support the examinations. Teachers generally believe that the achievement testing program should be discontinued, but support for this view is highest in Grade 3 and reverses by Grade 9.

Calder noted two major concerns of teachers. There is strong opposition to the program's inclusion of students who have not learned the curriculum on which the tests are based. Teachers are also concerned about the misuse of test results, including unfair and inappropriate comparisons of teachers and schools. Calder's analysis of the testing program identifies a number of serious problems, though he also notes positive features. However, "in its present form, the Alberta achievement testing program requires significant change for it to meet acceptable standards," he concludes.

ATA President Bauni Mackay welcomed Calder's study and committed the Association to work toward improvements in the achievement testing program that will address members' key concerns. "The problems identified by Dr. Calder are serious and need to be remedied," the president noted. She called on the minister of education to address the program's flaws. She also called on the minister to establish a ministerial advisory committee, with representation from the profession, the general public and test experts, so that the minister can receive systematic advice on the program.

In order to work toward much-needed changes in the program, the Association agreed to name teachers, for the first time in almost a decade, to serve on the technical advisory committees. Mackay noted that "the Association will work with the minister, but we expect an end to practices that make achievement tests unfair to students and teachers."

Concerns with flawed Alberta achievement tests

University of Alberta professor Peter Calder's independent analysis of the provincial achievement testing program identifies a number of serious problems. Key concerns include the following:

  • The purpose of the testing program is unclear. The tests are used to do things the program was never designed to do. Initially designed for provincial monitoring, the tests have evolved to produce individual information about the student, to obtain information to improve instruction and to serve as a student's final examination. Tests have limitations and when they are stretched to fulfil too many purposes they lose validity.
  • Examination fairness is a serious problem. It is unethical for the province to require a student to be tested when the test does not reflect the student's instructional program.
  • The tests are a combination of criterion-referenced and norm-referenced approaches. This does not make sense; one can't have it both ways. With a criterion-referenced test, standards are established which do not change; it is possible for all students to meet the standards (in fact, that's the goal). With a norm-referenced test, scores are adjusted if the test is too easy or too hard. While the province contends that the tests are criterion-referenced, the final results are tailored. This defies conventional testing practices.
  • Achievement tests are well developed and are a valuable monitoring tool. However, especially at the elementary level, one-shot academic assessment is fraught with error. It is inappropriate for results to be reported without accompanying contextual information, which the department's information sheets on individual test results do not provide.
  • There is no need for tests to be scored and reported by a central authority. A more efficient program could be run with greater validity if scoring and reporting were left to teachers at the classroom level.
  • There is no need to test all students to determine the degree to which provincial standards are achieved. National and international tests use sampling. Student achievement of provincial standards still could be monitored at significantly reduced cost.
  • There is no ministerial advisory committee to receive advice on the purpose and nature of the achievement testing program. As a result, the general public and the profession have little input into achievement testing.
  • The ATA should direct its efforts toward making achievement testing as valid and fair an experience as possible.
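The sampling argument above can be made concrete with the standard sample-size formula for estimating a proportion. The figures below are a hypothetical sketch of the statistics involved, not the department's or Calder's methodology:

```python
import math

def sample_size(margin, z=1.96, p=0.5):
    """Smallest simple random sample needed to estimate the proportion of
    students meeting a standard to within +/- margin, at roughly 95 per cent
    confidence. Uses the conservative worst case p = 0.5."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# A random sample of about 1,068 students estimates a provincial pass rate
# to within 3 percentage points -- far fewer students than a census of
# every Grade 3, 6 and 9 classroom.
print(sample_size(0.03))  # 1068
print(sample_size(0.05))  # 385, if 5 points of error is acceptable
```

National and international assessments cited in the study use sampling designs of roughly this character, which is why province-wide monitoring does not, on its own, require testing every student.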

Socio-economic status strong predictor of performance

A study released by the ATA, Edmonton Public Schools and the Department of Education Student Evaluation Branch shows that socio-economic status (SES) accounts for about half of the variance in Edmonton Public's achievement test scores.

The study was conducted by a team headed by W. Todd Rogers, professor and director of the Centre for Research in Applied Measurement and Evaluation (CRAME) at the University of Alberta. It's the second study in a few months to focus on the importance of social factors as predictors of student performance.

At the school level, SES was by far the single most important factor accounting for the variance in student performance. At the student level, prior reading performance was key.

Association President Bauni Mackay welcomed the study. "It's extremely important to recognize that a number of factors affect student achievement. When media rank order schools on the basis of diploma examination or achievement test scores, what they are really doing is rank ordering the student population by family income. Such a practice is unfair to students and their parents and teachers."

A study released in June by Hugh Lytton and Michael Pyryt of the University of Calgary examined achievement test scores for 142 elementary schools in the Calgary Board of Education. Their analysis shows that social class factors, including income levels and unemployment, account for about 45 per cent of the variation in achievement test scores.

Facts about achievement tests

What are achievement tests and who writes them?

Achievement tests are administered annually in English language arts and English and French mathematics in Grade 3 and in English and French language arts, mathematics, science and social studies in Grades 6 and 9.

The province's goal is to have as many students as possible in Grades 3, 6 and 9 write provincial achievement tests. The Student Evaluation Regulation (40/89) requires all students registered in Grades 3, 6 and 9 and ungraded students in their third, sixth and ninth years of schooling to complete provincial achievement tests. This includes Francophone students, students in English as a second language (ESL) programs, the integrated occupational program (IOP) and special education programs; this also includes students with learning disabilities or physical disabilities.

What are the provincial expectations for students who write the tests and how are test results reported?

The province expects at least 85 per cent of students to reach the acceptable standard and at least 15 per cent of students to reach the standard of excellence. Provincial, school jurisdiction, school and individual reports are prepared and include information about the percentage of students meeting provincial standards.

Does the province tinker with the achievement test results or do they reflect actual test raw scores?

The province tinkers with the test results each year. If department officials determine that an achievement test was too hard, the raw score required to reach the standard is reduced; if officials determine that an achievement test was too easy, the raw score required to reach the standard is increased. In 1997, the acceptable standard for Grade 9 social studies increased to 53 per cent; the acceptable standard for Grade 6 mathematics dropped to 46 per cent. The purpose of this procedure is to maintain consistent standards from year to year.
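As a rough sketch of how such an adjustment works (all scores below are invented, not actual test data), a percentile-style cut score moves with the difficulty of the test form so that the pass rate stays put:

```python
# Hypothetical illustration of year-to-year cut-score adjustment: the raw
# score required to pass is chosen so that a comparable share of students
# reaches the standard, whether the form was easy or hard.

def adjusted_cut(raw_scores, target_pass_rate):
    """Return the raw score that passes roughly target_pass_rate of students."""
    ranked = sorted(raw_scores, reverse=True)
    index = max(int(target_pass_rate * len(ranked)) - 1, 0)
    return ranked[index]

easy_form = [55, 60, 64, 68, 72, 75, 78, 82, 86, 90]  # invented raw scores
hard_form = [30, 35, 39, 43, 47, 50, 54, 58, 62, 66]

# To hold the pass rate near 80 per cent, the cut rises on the easy form
# and falls on the hard form.
print(adjusted_cut(easy_form, 0.8))  # 64
print(adjusted_cut(hard_form, 0.8))  # 39
```

This is the tension Calder identifies: a cut score that moves with the score distribution behaves like a norm-referenced standard, even if the test is described as criterion-referenced.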

Is there an external standard against which to compare the structure and nature of Alberta's achievement testing program?

Yes. A set of principles about good testing has been established by test experts in cooperation with other members of the education community. The standards are reflected in the Principles for Fair Student Assessment Practices for Education in Canada.

Are achievement testing practices consistent with the Principles?

No. Achievement testing practices fall short of the Principles, especially with respect to clarification of purpose, the testing of Francophone, special education, IOP and ESL students, and the use of test results.

Influence of various factors on achievement test results

A study of Edmonton Public achievement test results headed by W. Todd Rogers assesses factors that might help to explain the variation in performance. Highlights include the following:

  • At the student level, approximately half of the variability in test scores in English language arts in Grades 3 and 6 is accounted for by reading performance at the end of the previous grade level. In Grade 3 mathematics, nearly 40 per cent of the variance is accounted for by prior reading performance measured at the end of Grade 2.
  • At the school level, approximately half of the variability in language arts performance at the Grade 3 level is accounted for by socio-economic status (SES), student enrolment and parent satisfaction; at Grade 6, variability is accounted for by SES, student satisfaction and the percentage of special needs students. Approximately 36 per cent of the variance in mathematics achievement is accounted for by SES and parent satisfaction.
  • By far, the strongest predictor of student performance on achievement tests is SES. The remaining variables in the analysis (gender, student enrolment, satisfaction levels of students, parents and teachers, percentage of special needs students, expenditures and school-generated funding) were less important predictors of variability in performance.
  • The full set of results reveals that about half of the variance in student performance on achievement tests is accounted for by SES.
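"Accounts for about half of the variance" refers to the R-squared statistic of a regression model. A minimal sketch with invented school-level data (not the study's data or model) shows how such a figure is computed:

```python
# Toy illustration of R squared, the "proportion of variance accounted for".
# The SES index and mean scores below are invented for this sketch.

def r_squared(x, y):
    """Proportion of variance in y explained by a least-squares line on x."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    ss_res = sum((yi - (slope * xi + intercept)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)
    return 1 - ss_res / ss_tot

ses_index = [1, 2, 3, 4, 5, 6, 7, 8]          # hypothetical school SES ranks
mean_score = [56, 50, 60, 54, 64, 58, 68, 62]  # hypothetical mean test scores

print(round(r_squared(ses_index, mean_score), 2))  # 0.47 -- roughly half
```

An R squared near 0.5, as in the Rogers study, means that knowing a school's SES alone removes about half of the uncertainty in predicting its average test score, which is why ranking schools by raw scores largely ranks them by family circumstances.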

A study of Calgary Public achievement test results conducted by Hugh Lytton and Michael Pyryt, released in June, reached similar conclusions:

  • Social class factors explain about 45 per cent of the variation in achievement test results.
  • The correlation between income level and achievement test scores is very strong. Comparing the highest income group of Calgary Public schools with the lowest income group shows an average income difference of $50,000 and a mean test score difference of 14 per cent.
  • Student body factors account for 6 to 11 per cent of the difference in test scores. The most important of these are the numbers of special needs and ESL students.
  • School factors account for 3 to 6 per cent of the difference in test scores. Significant factors include the principal's positive attitude to achievement tests and the years of experience of the teaching staff.