



9.1 PISA

The OECD's Programme for International Student Assessment (PISA) is a collaborative process, bringing together scientific expertise from the participating countries and steered jointly by their governments on the basis of shared, policy-driven interests. PISA aims to define each assessment domain not merely in terms of mastery of the school curriculum, but in terms of important knowledge and skills needed for full participation in society. PISA will span the decade to come and will enable countries to monitor, regularly and predictably, their progress in meeting key learning objectives. In terms of the age group covered, assessing young people near the end of their compulsory schooling provides a significant indication of the performance of education systems. PISA does not limit itself to assessing the knowledge and skills of students; it also asks students to report on their own, self-regulated learning, their motivation to learn and their preferences for different types of learning situations. PISA has global coverage: 32 countries participate, including 28 OECD countries plus Brazil, China, Latvia and the Russian Federation. The PISA 2000 Assessment of Reading, Mathematical and Scientific Literacy has been developed in terms of the content that students need to acquire, the processes that need to be performed, and the contexts in which knowledge and skills are applied.29 PISA will also assess students in 2003.


9.2 TIMSS

The Third International Mathematics and Science Study (known in the US as the Trends in International Mathematics and Science Study)30 assessed the mathematics and science performance of students at three different grade levels in 1995. TIMSS also collected information on schools, curricula, instruction, lessons, and the lives of teachers and students in order to understand the educational context in which mathematics and science learning takes place. The 1999 Third International Mathematics and Science Study-Repeat (TIMSS-R) was a successor to the 1995 TIMSS and focused on the mathematics and science achievement of eighth-grade students in participating nations. TIMSS 2003 will assess student achievement in mathematics and science at Grade 4 and Grade 8.


9.3 PIRLS
Thirty-five countries participated in PIRLS 2001, IEA's new Progress in International Reading Literacy Study at the fourth grade. With 150,000 students tested, PIRLS 2001 is the first in a planned 5-year cycle of international trend studies in reading literacy. PIRLS consists of a carefully constructed test assessing a range of reading comprehension strategies for two major reading purposes: literary and informational. PIRLS collected extensive information about home, school, and national influences on how well students learn to read. In addition, parents and caregivers completed questionnaires about their children's early literacy activities. PIRLS 2001 coincided with the ten-year anniversary of IEA's 1991 Reading Literacy Study and provided nine countries with an opportunity to replicate that study and obtain a measure of trends since 1991. The range of performance across the 35 countries was large. Sweden had the highest reading literacy achievement; Bulgaria, the Netherlands, and England also performed well. In all countries girls had significantly higher achievement than boys. Statistically significant gender differences favouring girls at each quartile were consistent across countries, with only a few exceptions (Italy and the United States at the upper quartile, France at the median, and Colombia and Morocco at the lower quartile). (p. 29)

Two other significant achievement studies
9.4 LAMP

UNESCO’s Institute for Statistics, in collaboration with others including UNICEF, is developing a new assessment tool for literacy called LAMP, the Literacy Assessment and Monitoring Programme.31 LAMP will sample a fairly small group of adults in each country and then project the results from the sample to the entire population, drawing on the statistical technique of synthetic estimation. Such a survey is needed because most current data on adult literacy in developing countries are not sufficiently reliable to serve the needs of national and international users. For example, the data generally rely either on individuals’ self-declaration of their literacy or on “proxy” indicators such as their educational level. LAMP will face many challenges, such as ensuring test questions are appropriate to local socio-cultural and linguistic circumstances, maintaining international comparability, and ensuring the transfer of knowledge. LAMP results will probably show literacy levels falling, because literacy will be assessed on the basis of a test rather than by self-reporting.
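The basic idea of synthetic estimation can be illustrated with a minimal sketch. The example below is purely hypothetical and is not LAMP's methodology, instrument or data: it assumes literacy rates measured for a few demographic groups in a tested sample, together with known population counts for those groups in a region, and combines them to estimate the region's overall literacy rate. The strength of the approach is also its limitation: accuracy depends on how well the group-level rates from the sample transfer to the target population.

```python
# Illustrative sketch only (hypothetical groups, rates and counts -- not LAMP data).
# Synthetic estimation: apply literacy rates measured for demographic groups in a
# tested sample to a region's known demographic composition, instead of testing
# everyone in that region.

sample_literacy_rate_by_group = {      # hypothetical rates from a tested national sample
    ("female", "no schooling"): 0.22,
    ("female", "primary"): 0.61,
    ("male", "no schooling"): 0.30,
    ("male", "primary"): 0.68,
}

region_population_by_group = {         # hypothetical census counts for one region
    ("female", "no schooling"): 40_000,
    ("female", "primary"): 55_000,
    ("male", "no schooling"): 35_000,
    ("male", "primary"): 60_000,
}

def synthetic_estimate(rates, counts):
    """Weight each group's sampled literacy rate by its share of the region's population."""
    total = sum(counts.values())
    return sum(rates[group] * n for group, n in counts.items()) / total

estimate = synthetic_estimate(sample_literacy_rate_by_group, region_population_by_group)
print(f"Estimated regional literacy rate: {estimate:.1%}")   # about 49.3% for these figures
```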


9.5 SITES

The IEA's Second Information Technology in Education Study (SITES), 1999-2002, addressed new questions about the effectiveness and impact of technological applications on schooling: Are our education systems measuring up to the innovative potential of ICT applications? To what extent are there gaps between objectives and educational reality? Which innovations exist, and what is the evidence of their effectiveness? The first module of the study focused primarily on the use of information and communication technology (ICT) in educational practice from an international comparative perspective, and was guided by several general questions, including: How, by whom, and to what extent is ICT used in education systems, and how does its use develop over time? What differences in ICT-related practices exist within and between education systems, and how can these differences be explained? Which innovative practices exist that may offer educational practitioners achievable new targets? The second module of the study is a qualitative study of innovative pedagogical practices that use ICT.32



10. Overall
On the basis of what we know from the results of achievement assessments, we can ensure that information is fed back into the education system to analyse exam performance in terms of gender, ethnic/language group membership, geographic location, and differential performance by curriculum area. We can set up steering committees to ensure that information relevant to particular groups of people (education managers and teachers, planners, parents and politicians) can be used practically.
Ultimately, those of us who are involved with assessment have to try to answer the following questions:

  • What forms of assessment are likely to have the greatest impact on students’ learning?

  • What kinds of learning do we wish to foster?

  • What steps are necessary to improve a system’s ability to deliver effective types of assessment?

  • How will the information derived from an assessment be used?



11. Advocacy for Learning Assessment
Given the challenges of establishing monitoring programs and the long-term obstacles (particularly ensuring funding) to maintaining them, it is helpful to have an advocacy strategy for monitoring learning achievement. Advocates need enthusiasm and determination to bring stakeholders together and to ensure that results are used to improve learning (Forster, 2002).
References
Al Nahr, T. (2000) Learning Achievement of Grade Four Elementary Students in Some Arab Countries: Regional Synthesis Report, UNESCO
Berryman, S. Priorities for Educational Reforms in the Middle East and North Africa http://www.worldbank.org/mdf/mdf1/priomena.htm
Billeh, V. International Assessment of Educational Progress: Jordan’s Experience http://www.worldbank.org/mdf/mdf1/assess.htm
Black, P. & Wiliam, D. 'Inside the Black Box: raising standards through classroom assessment', Phi Delta Kappan, July 1998, pp139-148.
Chisholm, L. et al A South African Curriculum for the Twenty-First Century Report of the Review Committee May 31, 2000
Curriculum Council of Western Australia, Draft Progress Maps, Health & Physical Education http://www.curriculum.wa.edu.au/ProgressMaps/health.html
Darling-Hammond, L. & Snyder, J. (2000) Authentic assessment of teaching in context, Teaching and Teacher Education, 16, 523-545.
Forster, M. & Masters, G. (1999) Paper and Pen, Assessment Resource Kit, Melbourne: ACER
Forster, M. & Masters, G. (1996) Performance, Assessment Resource Kit, Melbourne: ACER
Forster, M. & Masters, G. (2000) Portfolios, Assessment Resource Kit, Melbourne: ACER
Forster, M. & Masters, G. (2000) Products, Assessment Resource Kit, Melbourne: ACER
Forster, M. & Masters, G. (1996) Projects, Assessment Resource Kit, Melbourne: ACER
Forster, M. (2000) A Policy Maker’s Guide to International Achievement Studies Melbourne, ACER ISBN 0-86431-360-8
Forster, M. (2001) A Policy Maker’s Guide to System-wide Assessment Programs Melbourne, ACER ISBN 0-86431-359-4
Forster, M. (2002) National Monitoring of Learning Achievement in Developing Countries, UNICEF Education Section Working Paper
Gordon, D. (1992) One Teacher's Classroom Eleanor Curtain Publishing Melbourne, Australia
Greaney, V., Khandker, S.R., & Alam, M.(1999) Bangladesh: Assessing Basic Learning Skills The World Bank, Bangladesh
Hildebrand, G. M. (1996) Redefining Achievement, in Murphy, P. & Gipps, C. (Eds) Equity in the Classroom: Towards Effective Pedagogy for Girls and Boys, London: Falmer Press
Kagia, R. (2000). Gateways into Learning: Promoting the Conditions which Support Learning for All http://www1.worldbank.org/education/est/resources/Training%20and%20presentations
Kellaghan, Thomas & Greaney, Vincent (1996) Monitoring the Learning Outcomes of Education Systems World Bank ISBN: 0-8213-3734-3 SKU: 13734
Kellaghan, T. Using Assessment to Improve the Quality of Education (IWGE, 2000), http://www.unesco.org/iiep/eng/networks/iwge/recent.htm
Knowledge and Skills for Life: First Results from the OECD Programme for International Student Assessment (PISA) 2000, OECD. Order from the OECD Online Bookshop at www.oecd.org
Lagging Behind: A Report Card on Education in Latin America The Task Force on Education, Equity and Economic Competitiveness in Latin America and the Caribbean November 2001
Maguire, T. (1998) Quality Issues in Basic Education: Indicators, Learning Achievement Reports, and Monitoring Teaching/ Learning Processes UNICEF ROSA, (ROSA Report Number 31)
Marzano, R.J. (1998) Model of Standards Implementation: Implications for the Classroom Mid-continent Regional Educational Laboratory
Marzano, R., Pickering, D., & Pollock, J, (2001) Classroom Instruction That Works: research-based strategies for increasing student achievement Association for Supervision and Curriculum Development Alexandria, VA. http://www.ascd.org
Masters, G. & Forster, M. (1996) Progress Maps, Assessment Resource Kit, Melbourne: ACER
Masters, G. & Forster, M. (2000) Developmental Assessment, Assessment Resource Kit, Melbourne: ACER
Micklewright, J. Education, Inequality and Transition, UNICEF Innocenti Working Papers, Economic and Social Policy Series no.74, January 2000.
No Child Left Behind Issue Brief: A Guide to Standards-Based Assessment, adapted from A Policy Maker's Guide to Standards-Led Assessment by Robert L. Linn and Joan L. Herman, published jointly in February 1997 by ECS and the National Center for Research on Evaluation, Standards and Student Testing. www.ecs.org
Skills for Health: Skills-based health education, including life skills: An important component of a Child-Friendly/Health-Promoting School produced jointly by UNICEF, WHO, World Bank, UNFPA and other FRESH partners, final draft November 2002.
Sum, Andrew, Kirsch, Irwin & Taggart, Robert (2002) The Twin Challenges of Mediocrity and Inequality: Literacy in the US from an International Perspective, Policy Information Report, Educational Testing Service. Download at www.ets.org/research

Wiggins, G. & McTighe, J. (1998) Understanding by Design Association for Supervision and Curriculum Development http://www.ascd.org



Appendix One: Rubric for naming distinctions and judgements according to the six facets of understanding. (Wiggins & McTighe, 1998)

Each of the six facets of understanding is described at five levels, from the most to the least sophisticated.

Explanation

Sophisticated: an unusually thorough and inventive account (model, explanation); fully supported and justified; deep and broad; goes beyond the information given.

In-depth: an atypical and revealing account, going beyond what is obvious or what was explicitly taught; makes subtle connections; well supported by argument and evidence; novel thinking displayed.

Developed: an account that reflects some in-depth and personalized ideas; the student is making the work her own, going beyond the given; there is supported theory here, but insufficient evidence and argument.

Intuitive: an incomplete account but with apt and insightful ideas; extends and deepens some of what was learned; some “reading between the lines”; account has limited support/data or sweeping generalizations; there is a theory, but one with limited testing and evidence.

Naïve: a superficial account; more descriptive than analytical or creative; a fragmentary or sketchy account of facts/ideas or glib generalizations; a black-and-white account; less a theory than an unexamined hunch or borrowed idea.

Interpretation

Profound: a powerful and illuminating interpretation and analysis of the importance/meaning/significance; tells a rich and insightful story; provides a rich history or context; sees deeply and incisively any ironies in the different interpretations.

Revealing: a nuanced interpretation and analysis of the importance/meaning/significance; tells an insightful story; provides a telling history or context; sees subtle differences, levels, and ironies in diverse interpretations.

Perceptive: a helpful interpretation or analysis of the importance/meaning/significance; tells a clear and instructive story; provides a useful history or context; sees different levels of interpretation.

Interpreted: a plausible interpretation or analysis of the importance/meaning/significance; makes sense of a story; provides a history or context.

Literal: a simplistic or superficial reading; mechanical translation; a decoding with little or no interpretation; no sense of wider importance or significance; a restatement of what was taught or read.

Application

Masterful: fluent, flexible, and efficient; able to use knowledge and skill and adjust understandings well in novel, diverse, and difficult contexts.

Skilled: competent in using knowledge and skill and adapting understandings in a variety of appropriate and demanding contexts.

Able: able to perform well with knowledge and skill in a few key contexts, with a limited repertoire, flexibility, or adaptability to diverse contexts.

Apprentice: relies on a limited repertoire of routines; able to perform well in familiar or simple contexts, with perhaps some needed coaching; limited use of personal judgement and responsiveness to specifics of feedback/situation.

Novice: can perform only with coaching, or relies on highly scripted, singular “plug-in” (algorithmic and mechanical) skills, procedures, or approaches.

Perspective

Insightful: a penetrating and novel viewpoint; effectively critiques and encompasses other plausible perspectives; takes a long and dispassionate, critical view of the issues involved.

Thorough: a revealing and co-ordinated critical view; makes own view more plausible by considering the plausibility of other perspectives; makes apt criticisms, discriminations, and qualifications.

Considered: a reasonably critical and comprehensive look at all points of view in the context of one’s own; makes clear that there is plausibility to other points of view.

Aware: knows of different points of view and is somewhat able to place own view in perspective, but weak in considering the worth of each perspective or critiquing each perspective, especially one’s own; uncritical about tacit assumptions.

Uncritical: unaware of differing points of view; prone to overlook or ignore other perspectives; has difficulty imagining other ways of seeing things; prone to egocentric argument and personal criticisms.

Empathy

Mature: disposed and able to see and feel what others see and feel; unusually open to and willing to seek out the odd, alien, or different.

Sensitive: disposed to see and feel what others see and feel; open to the unfamiliar or different.

Aware: knows and feels that others see and feel differently; somewhat able to empathize with others; has difficulty making sense of odd or alien views.

Developing: has some capacity and self-discipline to “walk in another's shoes,” but is still primarily limited to one’s own reactions and attitudes; puzzled or put off by different feelings or attitudes.

Egocentric: has little or no empathy beyond intellectual awareness of others; sees things through own ideas and feelings; ignores or is threatened or puzzled by different feelings, attitudes, or views.

Self-Knowledge

Wise: deeply aware of the boundaries of one’s own and others’ understanding; able to recognize one’s prejudices and projections; has integrity; able and willing to act on what one understands.

Circumspect: aware of one’s ignorance and that of others; aware of one’s prejudices; knows the strengths and limits of one’s understanding.

Thoughtful: generally aware of what is and is not understood; aware of how prejudice and projection can occur without awareness and shape one’s views.

Unreflective: generally unaware of one’s specific ignorance; generally unaware of how subjective prejudgements colour understandings.

Innocent: completely unaware of the bounds of one’s understanding and of the role of projection and prejudice in opinions and attempts to understand.

Appendix Two:

An overview of the ways in which developing countries collect information about student achievement at a national level

Forster, M. (2002) National Monitoring of Learning Achievement in Developing Countries, UNICEF Education Section



No systematic national data collection on student learning

Use of national examinations or other proxy indicators

Regional testing or international agency testing project

National monitoring program

Angola; Burkina Faso; Burundi; Cameroon; Cape Verde; Central African Republic; Chad; Ethiopia; Equatorial Guinea; Gabon; Ghana; Guinea Bissau; Ivory Coast; Liberia; Mali; Mauritania; Rwanda; Sao Tome and Principe; South Africa (Republic of); Senegal; Sierra Leone; Somalia; Swaziland; Togo

Botswana; Cape Verde; Comoros Islands; Eritrea; Gambia; Guinea; Mauritius; Togo; Tanzania (Zanzibar); Zimbabwe

Kenya; Mali; Malawi; Mauritius; Mozambique; Namibia; Nigeria; Seychelles; South Africa; Swaziland; Tanzania (Mainland); Tanzania (Zanzibar); Uganda; Zambia; Zanzibar; Zimbabwe

Benin*; Congo*; Gambia; Lesotho; Madagascar*; Zaire (Congo, Democratic Republic)*; Zambia

Egypt

Algeria; Djibouti; Gaza Strip; Iran; Iraq; Jordan; Morocco; Oman; Saudi Arabia; Sudan; Tunisia; Yemen

Oman

Jordan; Lebanon*; Morocco*; Syria*; Tunisia

Afghanistan; India


Bhutan; Maldives; Nepal; Sri Lanka

Bangladesh; Maldives; Nepal; Pakistan; Sri Lanka




Myanmar; North Korea (Democratic People’s Republic)

Cambodia; China; Fiji; Indonesia; Lao People’s Democratic Republic; Papua New Guinea; Malaysia; Vietnam

Cambodia; China; East Timor; Mongolia

Philippines; Thailand



Barbados; Belize; Ecuador; Guyana; Haiti; Nicaragua; Panama

Argentina; Bolivia; Brazil; Chile; Colombia; Costa Rica; Cuba; Dominican Republic; Honduras; Mexico;

Paraguay; Peru; Venezuela



Brazil; Chile; Colombia; Costa Rica; El Salvador; Guatemala; Honduras; Jamaica; Mexico; Nicaragua; Paraguay; Peru; Uruguay; Venezuela

Albania; Belgrade (Fed Rep of Yugoslavia); TFYR Macedonia; Moldova; Pristina; Turkey#

Armenia; Azerbaijan; Bosnia & Herzegovina; Croatia; Georgia; Kazakhstan

Croatia

Mongolia;

Romania (under review)



* There is some uncertainty about whether these countries have ongoing national monitoring programs or whether one-off studies are being reported.

# Although Turkey has no institutionalised national monitoring program, it has conducted a number of one-off assessment projects.



ESTABLISHING NATIONAL PROGRAMS TO MONITOR LEARNING ACHIEVEMENT

Excerpt from Forster, M. (2002) National Monitoring of Learning Achievement in Developing Countries, UNICEF Education Section Working Paper


PROTOCOLS FOR THE USE OF DATA
Another way to ensure, at the planning stage, that monitoring programs will provide information that can be used to improve student learning is to develop a set of protocols for the use of student achievement data.
As with other aspects of a program, the particulars of individual countries will shape the development of protocols. Nevertheless, some generalisations provide a useful starting point for all countries. Below are five suggested protocols.