
MEASURING SCHOOL EFFECTIVENESS

In document SNAPSHOT OF HUNGARIAN EDUCATION 2014 (pages 110-113)

Teachers, Students, School


School effectiveness was measured using the site-level research database of the 2010-2013 National Assessment of Basic Competencies. The analysis was based on the school-level averages of the mathematics and text comprehension scores of 8th grade students in the year in question, together with the relevant information from the site-level background questionnaires regarding the social and school-specific factors affecting performance.5 Naturally, the authors realise that test results serve as only one indicator of the complex phenomenon of school performance; at the same time, we must also stress that well-designed assessments can provide important insights into effectiveness (see Nahalka 2015).

3 We would like to express our gratitude to the Hungarian Educational Authority for making their data files available to us for this research. We are also grateful to László Ostorics, the acting head of the Public Education Survey and Evaluation Department, for supplying us with a manuscript that includes the technical description of the National Assessment of Basic Competencies.

4 The online survey-based data collection was carried out within the framework of SROP priority project 21st Century School Education (Development and Coordination), Phase 2 (SROP 3.1.1-11/1–2012-0001).

The performance of schools was calculated using an added value approximation, as using absolute test results would obscure the differences between schools due to numerous confounding factors (e.g. the family and social background of students). There are various possible means of defining schools’ added value and calculating specific added value indicators. Most added value models control for two major factors: (1) students’ family background and the composition of the institution’s student body, and (2) previous test results. The literature contains discussions of various methods for creating added value models and of the explanatory variables used in such models.6

For the present analysis, schools’ added value was calculated using a linear regression model based on the ordinary least squares (OLS) method for each year between 2010 and 2013. First, an expected average performance was calculated for each school based on the composition of the student body and the previous mathematics and text comprehension scores, then the differences between the expected performance and the measured values – the so-called unstandardized residuals – were calculated. Thus, the educational added value of schools is the performance increase identified after taking into account the average student composition of the school – in order to eliminate artefacts arising from the effects of the family and the environment outside of the school – and the differences between average student performance at the two points in time – in order to eliminate artefacts arising from prior knowledge, native abilities and prior environmental factors.7
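The residual-based calculation described above can be sketched as follows. The variable names and the simulated school-level data are our own illustrations, not the study’s actual data or code; the essential steps – an OLS fit of current scores on student composition and prior scores, followed by taking the unstandardized residuals as added value – follow the description in the text.

```python
import numpy as np

# Simulated school-level data (illustrative only; the study uses real
# National Assessment of Basic Competencies site-level averages).
rng = np.random.default_rng(0)
n_schools = 200
composition = rng.normal(0.0, 1.0, n_schools)            # student-body composition index
prior_score = 1500 + 200 * rng.normal(size=n_schools)    # cohort average two years earlier
noise = rng.normal(0.0, 40.0, n_schools)
current_score = 300 + 0.8 * prior_score + 60 * composition + noise

# OLS: expected average performance given composition and prior scores
X = np.column_stack([np.ones(n_schools), composition, prior_score])
beta, *_ = np.linalg.lstsq(X, current_score, rcond=None)
expected = X @ beta

# Educational added value = unstandardized residual
# (positive: school outperforms expectation; negative: underperforms).
added_value = current_score - expected
```

With an intercept in the model, the residuals average to zero by construction, so added value is interpreted relative to the expectation for a school with that student body and that prior cohort performance.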

5 Here, we would only like to note that the National Assessment of Basic Competencies is based on tests of applied general knowledge, much like the PISA studies; however, it is grade-based and it is carried out each year with full coverage. The tests are based on a carefully established content framework that follows Hungarian and international assessment and evaluation trends. The National Assessment of Basic Competencies and the associated background questionnaires regarding students, school sites and institutions are described in Balázsi et al. (2014) and OH (2015). The national reports, the reports on maintainers, schools and school sites (FIT-jelentés) and other important information and background documents are available at https://www.kir.hu/okmfit/.

6 For detailed discussions of added value indicators and models, see the following: Gyökös (2014); Horn (2010, 2015); Kertesi–Kézdi (2004); OECD (2008); Kim–Lalancette (2013); Recommendations of… (2011).

7 The explanatory power of the linear models (the share of the variance in test results that they explain) varies between 45 and 60 percent in mathematics and between 60 and 75 percent in text comprehension over the years studied.

Thus, the average economic background of students8 was used when calculating each school’s educational added value, not the average family background index9. There were four main reasons for this decision:

1. The two indexes represent similar factors.

2. The data on the composition of schools’ student bodies is significantly more complete than the family background index data (~10% missing each year vs. ~40% missing each year).

3. There is a very strong positive correlation between the two indexes.10

4. The educational added values and the school rankings calculated based on the values are very similar.11
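The robustness checks behind points 3 and 4 – a Pearson correlation between the two indexes and a rank correlation between the resulting school orderings – can be sketched as follows. The data here is simulated and the variable names are our own; the real figures (r ≈ 0.8 between the indexes, r and rho above 0.9 between the rankings) come from the study’s footnotes.

```python
import numpy as np

# Simulated indexes for 500 school sites (illustrative only).
rng = np.random.default_rng(1)
economic_index = rng.normal(size=500)
# Family background index constructed to correlate strongly, but not
# perfectly, with the economic-composition index.
family_index = 0.8 * economic_index + 0.6 * rng.normal(size=500)

# Pearson correlation between the two indexes (point 3)
pearson_r = np.corrcoef(economic_index, family_index)[0, 1]

# Spearman's rho = Pearson correlation of the ranks (point 4, applied
# here to the indexes themselves; the study applies it to the rankings
# derived from the two added-value calculations).
ranks_a = economic_index.argsort().argsort()
ranks_b = family_index.argsort().argsort()
spearman_rho = np.corrcoef(ranks_a, ranks_b)[0, 1]
```

A high value on both measures indicates that substituting the more complete index changes neither the size of the added-value estimates nor the ordering of schools in any material way.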

The previous average scores of each school were calculated based on the average scores of the school’s students two years before, irrespective of whether the specific students tested had attended the same school two years prior. Therefore, the scores indicate how much the students of the school improved over the course of two years (EA, 2015).

The educational added value that the school grouping is based on was determined by averaging the annual educational added values of grade 8 for each area under study.

The 2013 data and at least two valid results from the 2010-2012 period were required; thus, the final average covers three or four years. The authors felt that it was important to use the results of multiple years because different classes of the same school may perform rather differently. Grade 8 was chosen because the test scores of the last grade may show the impact the school is having more clearly, and also because previous scores (measured two years prior) are available for 8th grade students. The school-site-level educational added value is considered the school’s collective result.
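The inclusion rule described above – a 2013 value plus at least two valid values from 2010-2012, averaged over the three or four available years – can be sketched as a small function. The function and field names are our own, not the study’s.

```python
def multi_year_added_value(values_by_year):
    """Average a school's annual added values under the study's rule.

    values_by_year: dict mapping year -> added value (None if missing).
    Returns the mean of the 3 or 4 usable years, or None if the school
    does not qualify (no 2013 value, or fewer than two valid values
    from 2010-2012).
    """
    if values_by_year.get(2013) is None:
        return None
    earlier = [values_by_year.get(y) for y in (2010, 2011, 2012)]
    earlier = [v for v in earlier if v is not None]
    if len(earlier) < 2:
        return None
    usable = earlier + [values_by_year[2013]]
    return sum(usable) / len(usable)

# A school missing one early year still qualifies, on a three-year average:
result = multi_year_added_value({2010: 12.0, 2011: None, 2012: 6.0, 2013: 9.0})
```

Averaging over several cohorts smooths out the class-to-class variation the authors mention, at the cost of excluding schools with too few valid measurements.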

Multi-year averages were ranked separately for mathematics and text comprehension, and the two rankings thus obtained were added together and the schools were re-ranked. Thus, the final ranking shows the combined performance in the two subjects studied; the top schools are those that showed especially high added value in both areas, while the bottom of the list contains the schools that showed very low educational added value.12
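The combined ranking step can be sketched as follows: rank schools in each subject separately, sum the two ranks per school, and re-rank on the sum. The four toy schools and all names here are our own illustration.

```python
# Toy multi-year added values for four schools (illustrative only).
schools = {
    "A": {"math_av": 25.0, "text_av": 18.0},
    "B": {"math_av": -5.0, "text_av": 30.0},
    "C": {"math_av": 10.0, "text_av": -12.0},
    "D": {"math_av": -20.0, "text_av": -8.0},
}

def ranks(key):
    # Rank 1 = highest added value in the given subject.
    ordered = sorted(schools, key=lambda s: schools[s][key], reverse=True)
    return {s: i + 1 for i, s in enumerate(ordered)}

math_rank = ranks("math_av")
text_rank = ranks("text_av")

# Sum of the two subject ranks; a lower sum means a better combined position.
combined = {s: math_rank[s] + text_rank[s] for s in schools}
final = sorted(schools, key=lambda s: combined[s])  # best to worst
```

Note that a school can only reach the top of the combined list by ranking high in both subjects; this is why footnote 12 flags schools with sharply diverging subject-level added values as deserving separate analysis.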

8 The index generated based on the composition of the student body of the school site is calculated based on the number of students with above average economic means, those with very poor economic means, those who receive regular child welfare benefits, those who are vulnerable, those who receive free or discounted meals at school, those who receive free textbooks, childcare benefits or social benefits, those whose parents are unemployed and those whose parents have a higher education diploma. Negative index values represent poorer conditions, while positive values correspond to better conditions. The index is compiled based on data from the school site’s background questionnaire, filled in by the director of the school site. On the calculation of the index, see EA (2013a, 2015).

9 The family background index is calculated based on the following information: the number of books in the home, the educational attainment of the parents, whether the family owns at least one computer, and whether the student owns any books. From 2013, the index also contains information on whether the student is considered multiply disadvantaged. Negative index values represent poorer conditions, while positive values correspond to better conditions. The index is compiled based on data from the student background questionnaire, filled in by the students at home with their parents. Regarding the calculation of the index and aggregation at the school site level, see EA (2013a, 2015).

10 The Pearson correlation coefficient is around 0.8 for each year (p<0.001).

11 The Pearson correlation coefficient and the Spearman’s rho (rank correlation) values were above 0.9 in every year in both subjects assessed (p<0.001).

12 The schools where the educational added value of the two subjects assessed diverges sharply deserve specific analysis. This study does not undertake that analysis due to space constraints.
