
Testing on the net: Assessment Resource Banks in mathematics and science

Cedric Croft
Abstract: 

The ARBs are computerised assessment resources that harness the power of information technology within a school context and give users the capacity to match assessment and teaching.


What are assessment resource banks?

The Assessment Resource Banks (ARBs) are computerised databases of assessment resources which reflect the current New Zealand curriculum statements in mathematics and science. Schools connected to the Internet can access them, and, by using the computerised classification, teachers can select and print off assessment material to match their curriculum and teaching objectives. The ARBs are being developed by the New Zealand Council for Educational Research (NZCER) under contract to the Ministry of Education. Panels of teachers and advisers have worked with NZCER staff to develop each assessment resource published in the ARB. Each resource has then been tested and trialled on students, with the result that schools using the resources can be assured of their validity. Currently available are 275 mathematics resources and 225 science resources. English will be introduced in mid-1998.


Structure of the computerised banks

Essentially, the ARBs are a selection of assessment material, arranged and classified in a manner that makes them accessible to users “on demand”. Previous experience with item banks has shown that the quality of the classification system often determines the usefulness of the banks for teachers. Millman and Arter (1984) note, “classification is the key that unlocks the item bank”. Unless a bank’s contents can be retrieved quickly and precisely, it will not be used. This is precisely where the ARBs’ “search engine” comes into its own, as it allows a tailored search to be undertaken by each user. Also, the ARBs contain a broader range of assessment material than an item bank, as they include multiple-choice items, brief constructed-response questions, longer constructed-response questions, and practical tasks.

As the structure of the banks and the associated search engine reflects the New Zealand Curriculum Framework, each resource is classified by learning strand, achievement objective, level, and process skill/integrating strand. Keywords are an additional method of searching. Each resource has keywords in an on-line dictionary. A search may be undertaken by one or more classification fields, a keyword, or a combination of these.
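The field-plus-keyword search described above can be sketched in a few lines of code. The following is a hypothetical illustration only: the record layout, field names, and sample resources are our own inventions, not the actual ARB schema, but the query logic mirrors the behaviour described, where any subset of the classification fields may be specified and unspecified fields match everything.

```python
# Hypothetical sketch of an ARB-style search. Each resource record carries
# classification fields plus a keyword set; a query may specify any
# combination of fields, keywords, or both. Sample data is illustrative.

RESOURCES = [
    {"id": "ME1001", "strand": "Measurement", "level": 4,
     "process": "Problem Solving", "type": "constructed-response",
     "keywords": {"area", "perimeter"}},
    {"id": "LW1001", "strand": "Living World", "level": 3,
     "process": "Investigating", "type": "practical task",
     "keywords": {"plants", "growth"}},
]

def search(resources, keywords=None, **fields):
    """Return resources matching every specified field; an unspecified
    field behaves like the 'All ...' default on the search screen."""
    results = []
    for r in resources:
        # Reject the record if any specified field fails to match.
        if any(r.get(f) != v for f, v in fields.items() if v is not None):
            continue
        # Keyword search: require at least one keyword in common.
        if keywords and not keywords & r["keywords"]:
            continue
        results.append(r)
    return results

print([r["id"] for r in search(RESOURCES, level=4)])              # ['ME1001']
print([r["id"] for r in search(RESOURCES, keywords={"plants"})])  # ['LW1001']
```

A search with no arguments returns the whole bank, matching the "access the results at any point" behaviour noted below.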

Multiple-choice items have traditionally outnumbered other types of assessment material in previously established banks, as they are easy and quick to mark objectively, and performance data may be more readily calculated. Prior to the ARB project, the experience of most New Zealand teachers with item banks had been largely with hard-copy collections of mathematics items. Banks of this type were regarded as less than “user-friendly”, mainly because the identification and retrieval of suitable material was tedious, the business of “cutting and pasting” material was cumbersome, and hard-copy banks were difficult to update regularly.

For the New Zealand ARBs, a resource which is broader than the traditional item bank has been developed, with a major aim of helping teachers assess important aspects of national curricula in mathematics, science, and more recently English. The focus is on making available a range of tasks which shift the emphasis from a purely objective approach to assessing educational achievement to one incorporating extended student responses in problem-solving contexts.

The focus during 1996 and 1997 has been on developing resources for school-based uses, though the eventual scope of the resources may need to be modified for national uses, depending on the (as yet unknown) requirements of government policy.

The search strategy

The strength of the ARBs’ search design (which has been fine-tuned progressively after trialling with teachers) is its flexibility (see figure 1).

As may be seen, a search may be undertaken by any single classification field or combination(s) of six classification fields (learning strand, achievement objective, process skill or integrating strand, resource type, keywords, level). The results of the search may also be accessed at any point.

[Figure 1]

Each classification field (except for keyword) is displayed on screen. Either all, some, or none of the classification fields may be specified. If the viewer does not wish to specify a particular field, the box is left as displayed (that is, All Process Types).

Other special features of the ARBs include:

•	background information on the ARB project and suggestions for school-based uses of resources;

•	detailed instructions on how to search for resources;

•	instructions on marking (using bookmark) and printing resources;

•	directions for cutting, pasting, and downloading resources to word processing software;

•	a registration page so schools can receive the passwords;

•	an electronic feedback form for users to complete;

•	frequently asked questions; and

•	“navigational buttons” and links to various parts of the search screens.

Performance data — national comparisons

Information about the difficulty level of each resource is available on its scoring guide. The data are obtained from trials on representative groups of between 150 and 200 students. The five descriptive statements and their corresponding difficulty levels are:

	Very easy	80 percent and above
	Easy	60 percent to 79 percent
	Moderate	40 percent to 59 percent
	Difficult	20 percent to 39 percent
	Very difficult	19 percent and below
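The banding above amounts to a simple mapping from a trial group’s percentage-correct figure to a descriptive label. A minimal sketch (the function name is ours, not part of the ARBs):

```python
# Map a trial group's percentage correct to the descriptive difficulty
# statement used on ARB scoring guides. Thresholds follow the table above.

def difficulty_label(percent_correct):
    if percent_correct >= 80:
        return "Very easy"
    if percent_correct >= 60:
        return "Easy"
    if percent_correct >= 40:
        return "Moderate"
    if percent_correct >= 20:
        return "Difficult"
    return "Very difficult"

print(difficulty_label(73))  # Easy
print(difficulty_label(19))  # Very difficult
```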

Initially, it was planned that this performance data would become part of the retrieval system. However, as each resource comprises multiple questions, and may cover a range of difficulty levels, searching for an individual level would not reflect the range of difficulty in the resources.

Diagnostic information about the performance of the trial group is now incorporated in some scoring guides. With the continuation of trial testing and resource development, the availability of diagnostic information will increase.

Many teachers have found the scoring guide information on the difficulty of each resource useful, an indication, perhaps, of teachers’ desire to relate the achievement of their own students to a broader picture of student performance. Also interesting is the reported use of the banks to help assess curriculum levels. The fact that one-third of respondents who have used the banks have done so in order to help assess curriculum level is, perhaps, indicative of growing teacher uptake of this role for the banks.

Uses by schools

The computerised ARBs are designed to be used for school-based assessment purposes. They are not intended to replace a school’s own assessment procedures, but rather to provide additional material. The ARBs provide material for six main school-based assessment purposes:

•	Formative assessment, which helps indicate how well individuals or groups have learned particular skills. Future teaching is then based on this analysis. Students’ performance on sets of resources is the primary type of formative assessment associated with the ARBs.

•	Diagnostic assessment, which assists teachers to isolate strengths and weaknesses in learning, and helps identify common misunderstandings and possible areas of future difficulty.

•	Summative assessment, which may follow a sustained block of teaching and is carried out to summarise achievement at a given point in time. This is generally more structured than formative assessment.

•	Monitoring performance over time. By repeated administration of selected resources over time, a school may develop a database of information to help monitor school-wide performance. Each resource incorporates a difficulty estimate within the scoring guide, which may provide a benchmark for monitoring performance against a sample of students beyond the school.

•	Pre- and post-tests. Teachers may select resources to assess levels of knowledge and understanding before a new phase of teaching. The same collection of resources, or a selection of similar ones, may be administered at the end of the teaching phase.

•	Exemplars for teacher-made assessment. The resources have been prepared by teachers, advisers, and New Zealand Council for Educational Research assessment staff working collaboratively, using generally accepted guidelines for test development. Teachers who need to prepare their own assessment materials may wish to adapt some of the approaches and ideas illustrated by the resources.
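The benchmarking idea behind the monitoring use can be made concrete with a short sketch. Everything here is hypothetical: the scores, the national estimate, and the function name are illustrative, but the calculation (a class’s mean score as a percentage of the maximum, set against the trial-sample difficulty estimate from the scoring guide) is the comparison described.

```python
# Illustrative sketch of monitoring school performance against the
# national trial sample. All names and numbers are hypothetical.

def facility(scores, max_score):
    """Mean score expressed as a percentage of the maximum possible score."""
    return 100.0 * sum(scores) / (len(scores) * max_score)

school_scores = [3, 4, 2, 4, 3, 1, 4, 2]  # one class on a resource marked out of 4
national_estimate = 55.0                   # percent correct in the trial sample

school_facility = facility(school_scores, max_score=4)
print(f"school {school_facility:.1f}% vs national {national_estimate:.1f}%")
```

Repeating this for the same resources in later years would build the school-wide database of results the text describes.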

[Figure 2]

[Figure 3]

Figure 2 shows an example of a mathematics resource and its scoring guide which can be accessed from the ARB.

Figure 3 shows a science resource (Living World 1001) which illustrates aspects of the ARBs in science, with the search engine functioning in a similar manner as for mathematics.

User responses to the ARB

A survey of users was undertaken by Strafford and Ansell (1997) as part of the on-going evaluation of the project. The findings point to directions for further development, and show the patterns of use of the ARBs in schools.

Which schools have registered with the ARBs?

Since the banks went on-line in March of 1997 there has been a steady increase in the number of registered users—from 187 in September 1997 to 369 in November 1997. Of these, 338 are teachers who work in 309 different schools (both primary and secondary). The majority of the other 31 users who are not based in schools are staff from colleges of education, teacher support services, and education consultants.

Comparison with Ministry database

A comparison has been made of the total schools registered by early November with Ministry of Education information on all schools in New Zealand. Of the school types registered for ARB access, secondary schools catering for years 7–15 are significantly over-represented: secondary schools make up 21 percent of ARB users, but comprise only 12 percent of all schools. The primary sector is under-represented by 8.6 percent compared to the total population of New Zealand schools. Some of this difference could be due to the greater proportion (but not the greater number) of secondary schools that have taken part in ARB trial testing to date. Secondary schools are also more likely to be connected to the Internet: a census of computer use in schools (Owens, 1996) showed that 28 percent of primary schools, 55 percent of composite schools, and 70 percent of secondary schools were connected to the Internet (though the majority of schools intend to be connected by the beginning of 1998).

For most decile groups, the proportion of registered users matches the proportion of those deciles in the total school population. However, registered ARB users from decile one schools are under-represented, and decile six users are over-represented (see table 1).

Minor urban schools (schools from areas with a population between 1,000 and 9,999) are over-represented by 5.8 percent compared to the total population of schools. In contrast, rural schools (schools in areas with a population under 1,000) are under-represented as ARB users by 11.7 percent compared to the total proportion of rural schools in New Zealand (see table 2).

Schools with roll sizes under 100 are under-represented in the population of ARB registered users, whereas there are more schools with a roll size of over 300 registered compared to the total population. The larger proportion of secondary registered users and the under-representation of rural schools may be an influence here.

Schools with 30 percent or more Maori students on their roll are under-represented as registered users. This may be partly due to the fact that the ARBs do not currently cater for full-immersion schools. Schools with 8–15 percent Maori on their roll are over-represented.

[Table 1]

[Table 2]

Which teachers within those schools are registered users of the ARB?

Teaching experience

The ARBs are being accessed primarily by educators with 15 or more years’ teaching experience. Over 70 percent have more than 15 years’ experience, while 21 percent have 8–15 years, and 9 percent have fewer than 7 years’ experience. Of those registered users who responded to the questionnaire, 30 percent were female and 70 percent male.

Position in the school of respondents

The greatest percentage of primary-sector respondents were those with management responsibilities (principals and deputy/assistant principals, as contrasted with syndicate leaders and scale A teachers). The greatest percentage of responses from secondary school users came from heads of department (HODs).

A quarter of the respondents did not have direct teaching responsibilities. The majority of the teaching undertaken by the respondents fell fairly evenly across the curriculum levels for which the ARBs are intended, that is, levels 3–6.


Barriers to ARB use

The survey also asked respondents to identify barriers to ARB use. Time was identified by most respondents, with 35 percent noting this as a major barrier. However, when the time taken to search the banks is compared with the time needed to write quality assessment resources, let alone trial and improve them, time as an isolated variable becomes an insignificant issue.

A quarter of teachers found the lack of ARB resources in the areas that they had searched to be a major barrier to full use, but half of the respondents found this only a slight barrier. Because we are progressively adding new resources to the banks (there were 295 resources accessible at the time of the survey, increasing to 475 at the time of writing), this will be less of an issue over time.

Just over a quarter of teachers in this survey found computer location within their school a slight barrier. Computer knowledge has become less of a barrier with only 18 percent in this survey compared to 38 percent in an earlier phase of the project. This change could be due to the different characteristics of the samples and the possibility that the teachers who are registered now may already have a good level of computer competency. A few teachers have addressed some of the barriers mentioned above by accessing the banks from home rather than from school.

Discussion of user survey

As already noted, the population of ARB users, as taken from the ARB-user database, generally corresponds to the total population of New Zealand schools. However, significant divergence occurs in regard to the over-representation of secondary schools, minor urban schools, decile 6 schools and schools with 8–15 percent Maori enrolment; and the under-representation of rural schools, decile 1 schools, and those with over 30 percent Maori enrolment.

It is not possible, at this stage, to arrive at a clear understanding of which, if any, of these confounding variables relating to under-representation is the major determinant of the pattern of uptake revealed. It would be of interest to investigate Internet access and other resourcing issues, to gain a clearer picture of the factors that affect registration levels.

Another set of confounding variables was found within our exploration of users of the banks. There is a clear majority of teachers with over 15 years’ teaching experience, teachers in management positions, and males. Again, experience suggests that these factors are intertwined. Teasing out which, if any, of these factors is the central thread is a question we can address only tentatively at this stage.

The primary determinant in terms of ARB registration appears to be position in the school. Principals, deputy principals, and HODs often have, at least nominally, more non-contact time than their scale A or assistant teacher colleagues. They also generally have the responsibility for keeping up to date with curriculum developments. Thus it should come as no surprise that educators with a management role are significantly more likely to be the staff members who are the registered users of the ARBs. In turn, educators with a management role are generally those with more years of teaching experience.

With reference to the disproportionate number of men registered as ARB users, it would be interesting to explore the relationship between this statistic and the numbers of men and women in management roles, and to explore other potentially related factors.

User comments and suggestions for improvement

Three main themes emerged from the open-ended questions included in the survey:

•	support for continued development of the banks,

•	intentions to increase usage and introduce the banks to colleagues, and

•	the need to increase the numbers of resources available.

As this is an ongoing project and resources are being steadily added to the banks, the desire for increased numbers of resources will be met over time. During 1997, for example, some 600 resources have been included in trials with 8,000 students from some 275 schools. Since the banks went on-line in March 1997, the number of resources has increased from 125 to 475. The provision of more resources should also help address the secondary barrier noted earlier, namely the lack of resources in the areas searched.

One other suggestion for improvement is to make it possible for users to select and print more than one resource at a time. This is being investigated—if suitable software becomes available the resultant changes may lead to a reduction of the impact of “lack of time” as a barrier to ARB use.

Even as the ARBs have been developed and are expanding, there has been very strong support from most teachers for their school-based uses. The ARBs harness the power of IT within a school context and give users the capacity to match assessment and teaching by careful choice of resources from the banks. As the ARBs become more widely used the common pool of available resources will help overcome the situation where many schools may be “reinventing the wheel” as far as assessment material for mathematics and science is concerned.

NOTES

CEDRIC CROFT is Chief Research Officer at the New Zealand Council for Educational Research, P O Box 3237, Wellington. Fax: 04 384 7933.

Members of the Assessment Resource Bank team include: Cedric Croft (leader), Sally Boyd, Gavin Brown, Karyn Dunn, Anne Gilbert, Chris Marston, Alex Neill, Gareth Rapson, Christina Smits, Ed Strafford, Rose-Ann Yianakis.

The ARB team would be pleased to hear from schools willing to trial mathematics, science, and English resources as they are being developed. Please contact Christina Smits.

In order to access the resources already available, register by connecting to the World Wide Web and typing the following address in the location or URL box: http://www.nzcer.org.nz/nzcer3/reg.htm

For further details of the development of the ARBs, see:

Croft, C. (1997, August). Assessment resource banks in mathematics and science. Paper presented at the annual conference of the Australasian Curriculum Assessment and Certification Authorities, Hobart, Tasmania.

Croft, C., Boyd, S., Dunn, K., & Neill, A. (1996, December). Resource banks in mathematics and science for school-based assessment. Symposium conducted at the annual conference of the New Zealand Association for Research in Education, Nelson.

Croft, C., & Dunn, K. (1997, December). Assessment resource banks: Past, present and future. Part A: Paper session. Paper presented at the annual conference of the New Zealand Association for Research in Education, Auckland.

Croft, C., Gilbert, A., Boyd, S., Burgon, J., Dunn, K., Burgess, L., & Reid, N. (1996). Assessment resource banks in mathematics and science: Transition-point assessment—part 2: Implementation trial. Wellington: New Zealand Council for Educational Research.

For more details on the survey of users, see:

Strafford, E., & Ansell, S. (1997, December). Assessment resource banks: Past, present and future. Part A: Paper session. Paper presented at the annual conference of the New Zealand Association for Research in Education, Auckland.

The importance of the classification system in resource banks is noted by:

Millman, J., & Arter, J. (1984). Issues in item banking. Journal of Educational Measurement, 21(4), 315–330.

For the recent census of computer use in schools, see:

Owens, J. (1996). A survey of computer use in New Zealand schools. The Research Bulletin, 7(November), 1–10.

Figures for the comparison between registered schools and others are from:

Ministry of Education. (1997). Directory of New Zealand schools and tertiary institutions 1997. Wellington: Ministry of Education, Data Management Unit.