Prepared for the 2004 Summer Institute on Comprehensive Needs Assessment
Adapted for the 2005 Migrant National Conference
How to Prepare for the Case
To get the most out of the case study method, first carefully read and think about the case. While there is no single way to prepare for a case discussion, you might find the following strategies helpful:
BEFORE YOU ARRIVE AT THE INSTITUTE

Read the first few paragraphs, then go through the case almost as fast as you can turn the pages, asking yourself, “What broadly is the case about, and what types of information am I being given to analyze?”
Read the full case very carefully, underlining key facts as you go. Then ask yourself: “What are the basic problems this manager has to resolve?” Try hard to put yourself in the position of the manager in the case. Develop a sense of involvement in the manager’s problems.
AT THE INSTITUTE

Per instructions provided by the facilitator, re-orient yourself to the case by skimming it. Then go through the case again, evaluating the events in light of your notes from the earlier training on the comprehensive needs assessment process. Note the key problems on scratch paper.
Develop a set of recommendations supported by your analysis of the case data.
After completing your own analysis, compare your findings with those of your fellow participants at your table. The purpose of this discussion is not to develop a consensus or a “group” position; it is to help each member refine, adjust, and amplify his or her own thinking.
What Happens in the Case Discussion
In the Institute, the facilitator will let members discuss any aspect of the case they wish. However, it is the facilitator’s role to prod participants to examine all avenues of investigation fully and to lead you to consider areas you may have missed. Healthy debate and discussion are encouraged.
The facilitator may tell “war stories” from other settings that relate to the situation under discussion and will encourage participants to do likewise. Finally, the facilitator will summarize the discussion and draw out the useful lessons and observations which are inherent in the case problem and which emerge from the case discussion.
A typical request by participants at the end of a case discussion is: “What is the answer?” Let me emphasize here that the case method of learning does not provide one single best answer. Rather, several viable “answers” will be developed and supported by various participants within the total group. While the facilitator may suggest the pros and cons of various alternatives, what is more significant is that you know what you would do in a specific situation.
In July of 2004, Valerie Ortega was appointed as the new state director of migrant education for the State of Atlantis. Two weeks after her appointment, Valerie received a letter from the U.S. Office of Migrant Education inquiring about the progress the State had made in implementing a comprehensive needs assessment. The letter noted that Atlantis had been found out of compliance with this program requirement in March of 2003 during an on-site program review. Valerie was not sure what the State Education Agency (SEA) had accomplished or what this particular requirement entailed. After reviewing the regulations and guidance on this issue, Valerie set out to discover to what degree the State had completed its comprehensive needs assessment.
Background

Migrant Student Profile

Atlantis has identified 8,208 migrant students as eligible for the Title I, Part C, Migrant Education Program (MEP). Over 60 percent of the migrant students left Atlantis with their families to obtain seasonal agricultural work in other states during the 2003-2004 program year. Approximately 35 percent of Atlantis’ migrant students miss part of their schooling due to their migration.
Most of the migrant students reside in 40 of the State’s 250 school districts. Twenty-five (25) school districts serve the largest number of migrant children (and the most mobile); these districts are located in the more rural, southern part of the State. Migrant students who have “settled out” tend to reside in 15 more urban school districts located in the northern section of the State.
For the last three years, the State’s MEP allocation has remained stable (see Table 1). However, during the same period, the State’s migrant child population increased by about 200 students per year, from 7,800 to a high of 8,409 in 2001-2002. The migrant child population then dropped by 200 students in 2002-2003 (see Table 1). In FY 2002, the State’s subgrant formula (which now took into account migrant students with priority for service) shifted a significant amount of MEP funds from northern districts to those in the south. This shift of MEP funds created hard feelings among many local project staff in the northern part of the State, who felt their migrant students were just as needy as those in the south.
Table 1: 5 Year History of Atlantis MEP Grant Awards & Eligible Student Population
The academic performance of migrant students who were home-based in Atlantis was also disappointing. The percent of migrant students who were proficient or making substantial progress toward proficiency had not increased in three years. In fact, for some grades, the percent proficient had decreased. Similarly, almost half of the secondary migrant students continued to drop out without finishing high school.
Progress to Date

CNA Committee

In talking with her supervisor, Valerie discovered that the previous State MEP Director had established a Needs Assessment Committee in June of 2003 to guide the implementation of a statewide comprehensive needs assessment process. The committee was composed of local MEP staff from 15 school districts, roughly half from the southern part of the State and half from the north. The role of the committee was to design and implement the needs assessment. The committee worked as a “committee of the whole” on all tasks related to designing the needs assessment.
The members of the committee were given some materials to review on the needs assessment process, and the previous State Director provided an orientation that lasted one hour. While many members of the committee had years of experience with the MEP, most of the members were not well acquainted with the needs assessment process or with using data to make decisions.
Valerie contacted two committee members who operate large migrant education projects in their districts to talk with them about the process and to learn how much of the needs assessment had actually been completed. Both of these administrators are considered opinion leaders. Javier Lopez administers a migrant education program in the southern part of the State. Kathleen Sullivan is the local Title I director and also administers a migrant education program in a district in the northern section of the State. From these two local administrators, Valerie learned the following.
Starting in August 2003, the committee met every other month through April 2004. Each of the five meetings lasted one and one-half days. In chairing the committee meetings, the previous State MEP Director viewed her role as convening the committee and empowering it by delegating the tasks of agenda setting, meeting facilitation, and decision-making to the committee.
Concern Statements

The committee began its work by brainstorming what they believed to be the problems that migrant students face. The identification of “concerns” was started in the committee’s first meeting and was finalized by the end of its second meeting in October.
Initially, over 22 concerns were identified. The committee used the silent nominal group voting method to reach consensus on the top ten concerns. No other criterion was used, nor was there any discussion among the committee members since there were clearly eleven major concerns. The final list of high-priority concerns included:
We are concerned about the “mobility” of migrant students because they experience discontinuity in instruction that has negative effects on academic performance.
We are concerned about “quality of instructional delivery/teacher effectiveness” for migrant students because teachers may not effectively implement instructional strategies and classroom management techniques appropriate for supporting educational success for migrant students (ESL).
We are concerned about “quality of instructional delivery/teacher effectiveness” for migrant students because teachers may not adjust teaching practices to value the migrant culture and lifestyle.
We are concerned about “teacher-pupil ratio” for migrant students because a classroom’s teacher-pupil ratio may not allow a teacher to individualize instruction for migrant students when needed.
We are concerned about the “study habits” of migrant students because they often do not attend extra-help and make-up opportunities that are available.
We are concerned about “connection with school” because migrant students may not have sufficient interactions with caring adults or teachers.
We are concerned about “attendance” for migrant students because migrant students often have family and work obligations that prevent consistent school attendance that may lead to poor grades or incomplete courses.
We are concerned about “parent involvement in school activities” for migrant students because their parents may feel unwelcome at school, since the majority of the communication is “problem-related” (discipline) and events are not conducted in the parents’ native language or in a parent-friendly format.
We are concerned about “homework” for migrant students because, for elementary students, migrant parents may not have information about strategies to help children learn at home (reading); for secondary students, migrant parents may not have the information or resources (transportation to tutoring) to help their student complete homework.
We are concerned about “English language instruction” for migrant students because teachers may not link the student’s native language in a way that supports English language acquisition (use of cognates).
We are concerned about “procedures for credit accrual” for migrant students because there is no standardization for how students accrue credits from state to state, which may result in some students not earning full credit toward graduation (credits earned on a quarter system vs. credits earned on a fifteen-week system).
Need Indicators / Data Sources

In December, the committee began to work on creating need indicators that would help validate that their concerns actually existed. The committee reviewed each concern and discussed whether or not there was any existing data to create a need indicator. After much discussion, the committee identified six concerns for which data existed and then quickly crafted a need indicator for each concern. The concerns, need indicator(s), and data sources that the committee developed in its December meeting are listed in Table 2.
Table 2: Concerns, Need Indicators, and Data Sources

Concern: Study habits
Need indicator: The percent of migrant students who attend after-school extra-help tutoring sessions
Data source: Local project performance report data

Concern: English language instruction
Need indicator: The percent of students identified as LEP
Data source: LEP flag recorded on the State’s migrant student record system

Concern: Procedures for credit accrual
Need indicator: The number of States that have credit accrual agreements with Atlantis
Data source: Credit accrual agreements
Data Collection / Analysis

In February, the committee began to work on its data collection and analysis plan. The committee began this task by discussing how they would extract and compile data from each of the data sources. The committee also began to explore how they would determine a “comparison or benchmark” measure for each indicator in order to determine how large a gap there was on each need indicator (i.e., between what is and what ought to be). During this discussion, Javier Lopez and other members from the southern school districts began to express their unease that the selected need indicators were not focusing enough on the unique needs of mobile migrant students, or on the cumulative effect of mobility on either currently migrant or settled-out migrant students. Committee members from northern districts disagreed; they felt that the State had not focused enough on the quality of instruction and staffing of migrant education projects.
At the April 2004 meeting, the friendly and positive atmosphere of the committee began to unravel. Members from southern districts felt that the needs of “true” migrant children were being overlooked. The committee members who worked with migrant children residing in the north of the State felt that their projects were valued less and their concerns were being judged as less important.
The committee failed to make any progress in the April meeting. Members had begun to worry more about how the needs assessment might affect their projects and their funding than about taking an objective look at the needs of migrant children. Several members from districts in the north expressed their belief that their programs were effective and that all they needed was more funding and improved staff development for their classroom teachers. With the departure of the previous State MEP Director later in June, the majority of committee members felt that the process had perhaps been a waste of time.
What Went Wrong?

Conclusion
A number of problems in this process were apparent to Valerie Ortega. Valerie knew that a comprehensive needs assessment was essential to improving the effectiveness of the MEP resources and improving the academic achievement of migrant students. But, the current process had gone off track. Valerie knew that finding a way to re-engage the Needs Assessment Committee, and at the same time go back and correct the problems in the process, was a difficult but necessary task facing her as the new State MEP Director.
What were the major problems with the comprehensive needs assessment process as implemented to date?
What leadership strategies and actions would you suggest Valerie Ortega employ to deal with the current situation?