Key competencies in the classroom

Author(s): 
Sally Boyd, Rose Hipkins, Rachel Dingle

This page provides information for educators to support the use of the NZCER teacher and student surveys, Key competencies in the classroom. It includes links to information about the development of the surveys, and ideas about how data from these surveys might be collected, analysed, and used in schools.


Introducing the Key competencies in the classroom surveys

NZCER has developed parallel teacher and student surveys to gather indicative data about the learning experiences and teaching practices that support students to strengthen the five key competencies described in the New Zealand Curriculum (Ministry of Education, 2007). We developed these surveys for our own research purposes and have refined them during several projects. Now some schools are asking if they can use them for their own purposes.

Any school can use these surveys but there are some important things you need to keep in mind. These surveys:

  • are overview tools. They explore some dimensions of each key competency, but it would be impossible to explore everything that could be relevant to each one. There is a trade-off between exploring the complexity of the key competencies and the diversity of practices relevant to each, and keeping the surveys a reasonable length.
  • are primarily focused on views about classroom practice. They do not explore how other school-related experiences, such as participating in a sports team or as a student leader, might also assist students to strengthen the key competencies.
  • gather perceptions, not data about actual activity. Different people will see a situation in different ways. It can be useful to compare different perspectives. There are teacher and student versions of the survey to help you do this.
  • are used to look at aggregated or group patterns — they are not about individual thoughts and practices. Responses should be kept confidential. Be careful about making generalisations — especially if you only have small numbers of responses (e.g. from staff in a small school).
  • are designed to support professional learning. Do use the surveys to generate learning conversations about what's happening in classrooms now and what could change.
  • are not designed to be used for accountability purposes. Don’t use them to make judgments about how well teachers are supporting students to strengthen the key competencies.

The teacher survey can be completed by primary or secondary teachers. The student survey is designed for Year 4-13 students.

Where have we used the surveys?

The first versions of these surveys were developed for use in the evaluation of the Curriculum Innovation Projects (Boyd et al., 2005). Over time we refined the surveys as we used them to explore classroom practice in these projects:

We adapted some individual items or sets of items for use in:

How were the items developed?

We analysed existing descriptions of the NZ key competencies and developed an idea of possible sub-components of each competency. Then we reviewed a number of research studies (and their related survey tools). These studies all broadly explored the types of learning opportunities and teaching practices that could support students to become “lifelong” 21st century learners. From this work we developed a list of experiences and practices we considered to have a good “fit” with each key competency and developed a list of parallel teacher and student items. These were peer reviewed. As we debated the essence of each key competency, and where to place particular practices, we found some overlaps. The placement of some items is subjective and depends on your interpretation of which key competency it fits best.

How do the items fit together?

We have found that understanding the complexity of the key competencies is an on-going journey. Over time, our collective understanding about the practices that are likely to strengthen the key competencies has evolved. We still see these surveys as a “work in progress” and we refine them each time we re-use them. We have used the survey datasets from our research projects to analyse the performance of each item, and each set of items. We have used this information to further reduce the number of items relating to each key competency. An analysis of a set of student data from a number of schools showed that groups of items do “hang together” but with some overlaps (see the comment above about deciding where to place items). We would expect this as the key competencies are inter-related, and people use them in combination.

The survey content and format

Both the teacher and the student surveys contain six sections. Five sections contain parallel student and teacher items that relate to each of the key competencies. In these sections we describe the same practice using student- and teacher-friendly language. One section in each survey gathers background information about the person completing the survey (e.g., their gender).

Teachers are asked to rate how often each practice occurs in their classes (on a 4-point scale), and how important they consider each practice to be (on a different 4-point scale). Students are only asked to rate how often each practice occurs in their classes. We used four-point rather than five-point scales so that people could not “sit on the fence” by choosing a middle “neutral” category.

Ideas for using the surveys appropriately

  • Conducting a “snap-shot” or “stocktake” of views about practice: A “snap-shot” or one-off administration of the surveys can collect data at a particular point in time, in one class, school, or a group of schools. This data can become the basis for professional discussions about how teachers are currently helping students strengthen the key competencies and for “next step” planning. The Normal School’s Key Competencies project is an example of this type of use.
  • Charting change over time in views about practice: Prior to, or at the start of, a professional learning initiative, baseline data can be gathered to develop a picture of school practice and to assist in setting priorities for the future. Follow-up data can then be collected at the end of the professional learning initiative (to ascertain if changes in views about practice have occurred), or at a later date (to ascertain if these changes are maintained over time). The Normal School’s EHSAS cluster data is an example of this type of use.
  • Exploring secondary school students’ views: There are a number of options for administration of the student survey for secondary-age students. Students can be asked to respond to the survey by thinking about all their classes. Alternatively the school can select a core learning area to focus on, such as English, science, or social studies.

Some important questions to consider about exploring change over time or exploring differences between groups

Is there a close fit between the content of the survey items and the types of experiences and practices that our initiative is expected to foster?

If we are looking for change, will there be enough time between the first and last administration of the surveys? (In general, a time frame of less than one year is too short for changes in teacher views and practices to become evident; it can take about two years for changes to show in the teacher data. It can take even longer for these changes to translate into classroom practice and therefore be clearly evident in student data.)

When would be the best time to carry out the surveys? (For comparisons, it is important to give the surveys to students at the same time each year. Students tend to express more positive views about school at the start of the year, and any change in their views over the course of the year will influence the data.)

How will we take different year levels or student groups into account? We suggest that this survey is best used to look at overall patterns. If you are comparing year or class groups you need to carefully consider the impact on your data of differences between class, year, or age cohorts, or other types of student groups. When comparing student groups over time it is helpful to ensure the groups are as similar as possible. Younger students tend to view school experiences more positively than older students and therefore often report that practices are happening more frequently. Over many studies we have noticed that students who identify as Pasifika tend to rate school experiences more positively than students from other groups. Because of these differences it is very important that care is taken in interpreting findings.

Ethics and confidentiality

Students and teachers are more likely to feel comfortable giving frank and honest views if they understand that their confidentiality is being respected. When students and teachers complete these surveys as part of research studies, the surveys are anonymous and we do not ask them to put their names on the surveys. We tell students: “The information you are giving is private. This is your chance to have a say.”

Tips on administering the surveys

The teacher survey

The teacher survey can be given to teachers to complete in their own time, and returned to a central collection point. Alternatively, the teacher survey can be completed during a staff professional learning session. During this session staff can also discuss the survey after filling it in.

The student survey

Tailoring the administration of the student surveys to the age group of students is likely to produce the best results. It is helpful if:

  • A teacher reads the survey instructions on the first page to students of all year levels and discusses any questions they might have.
  • For Year 4-6 students, the survey is best done as a group activity with the teacher reading out each statement. Following this, students can individually decide which option to select.
  • Year 7-8 and Year 9-13 students can be left to read the survey themselves, but can ask for assistance if needed. The survey can take Year 7 students about 25-30 minutes to complete on their own.

How could you analyse the results?

The best way to analyse this type of survey data is to count the responses to each question on the survey and report each count as a percentage of the total number of responses. For example, for each statement in the student survey we would count how many students said it happened “very often”, “often”, “sometimes”, or “hardly ever”, and then calculate each category’s percentage of the total number of students surveyed. However, this is a time-consuming job that needs to be done accurately. To help you achieve this we have created some data templates in Excel for you to fill in. These templates count the responses for you, work out the percentages, and put the results into a chart to help you use them.
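If you prefer to work outside the templates, the counting-and-percentage step they automate can be sketched in a few lines of Python. This is a minimal illustration, not the templates' actual logic: it assumes you have one rating per student for a given statement, using the student survey's four frequency categories.

```python
from collections import Counter

# The four frequency categories from the student survey.
CATEGORIES = ["very often", "often", "sometimes", "hardly ever"]

def percentage_breakdown(responses):
    """Count each category and express it as a percentage of all responses."""
    counts = Counter(responses)
    total = len(responses)
    return {cat: round(100 * counts.get(cat, 0) / total, 1) for cat in CATEGORIES}

# Hypothetical example: 20 students responding to one statement.
ratings = (["often"] * 8 + ["very often"] * 5 +
           ["sometimes"] * 5 + ["hardly ever"] * 2)
print(percentage_breakdown(ratings))
# → {'very often': 25.0, 'often': 40.0, 'sometimes': 25.0, 'hardly ever': 10.0}
```

The same breakdown would be repeated for every statement in the survey, which is why a prepared template saves so much time.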

Data templates

We have developed Excel spreadsheets that you are free to download and use to analyse the data you collect: one for student data, the other for teacher data.

Student data template [835KB Excel spreadsheet]
Teacher data template [3.1MB Excel spreadsheet]

Please note:

  • When looking for group patterns, the teacher survey needs to be completed by at least 8–14 teachers. If fewer teachers respond, the survey can instead be used as a prompt for professional discussion. (Note that when comparing percentages from a small number of respondents, each response, and therefore each person’s perception, produces a large shift in the percentages. For example, if you have only five respondents, each person accounts for 20 percent of the overall total.)
  • If there are fewer than 30–50 students in your school, one possibility is to have a discussion with students about the survey items and their experiences of these practices. With small numbers of students it is better to aggregate data (i.e., group the data from all students, or the data from a sub-group such as those in one year level.)
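The small-numbers caution above is simple arithmetic: with n respondents, a single person's answer moves a category's percentage by 100/n points. A quick sketch of how much one response "weighs" at different group sizes:

```python
def share_per_respondent(n):
    """Percentage of the overall total that one respondent's answer represents."""
    return 100 / n

# With five respondents each person is 20% of the total; with fifty, only 2%.
for n in (5, 10, 50):
    print(f"{n} respondents: one answer = {share_per_respondent(n):g}% of the total")
```

This is why an apparently large percentage change between two administrations of the survey can reflect nothing more than one or two people answering differently.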

Next steps

Over a number of research studies, we have found that these surveys provide valuable information that can be used as one starting point for professional discussions. The surveys can provide data that sometimes confirms and sometimes challenges educators’ views about classroom practice. To further explore the patterns that have emerged in the student data, some schools have conducted focus groups with targeted groups of students. During these sessions, teachers ask students about their interpretations of the data and for suggestions about what could be changed or prioritised.

Other information

For background reading and resources related to the key competencies, see:

References

Boyd, S., Bolstad, R., Cameron, M., Ferral, H., Hipkins, R., McDowall, S., & Waiti, P. (2005). Planning and managing change: Messages from the Curriculum Innovation Projects. Wellington: Ministry of Education.

Boyd, S., & Watson, V. (2006). Shifting the frame: Exploring integration of the Key Competencies at six Normal Schools. Wellington: New Zealand Council for Educational Research.

Dingle, R., & Boyd, S. (2009). A focus on opportunities to learn and student engagement. Paper presented at the AARE International Research Conference - A Capital Idea!, Canberra, 29 November-3 December.

Ministry of Education. (2007). The New Zealand curriculum. Wellington: Learning Media.

Further information:

Research studies
Analysis of a set of student data

Year published: 
2010
Publication type: 
Surveys
Publisher: 
NZCER