A CASE STUDY: AN ECOLOGICAL LEADERSHIP MODEL AND DATA-BASED DECISION-MAKING

by

KATHLEEN E. LUDWIG

A DISSERTATION

Presented to the Department of Educational Leadership and the Graduate School of the University of Oregon in partial fulfillment of the requirements for the degree of Doctor of Education

June 2008

University of Oregon Graduate School

Confirmation of Approval and Acceptance of Dissertation prepared by: Kathleen Ludwig

Title: "A Case Study: An Ecological Leadership Model and Data-Based Decision Making"

This dissertation has been accepted and approved in partial fulfillment of the requirements for the Doctor of Education degree in the Department of Educational Leadership by:

Diane Dunlap, Chairperson, Educational Leadership
Philip McCullum, Member, Educational Leadership
Leanne Ketterlin-Geller, Member, Educational Leadership
Jean Stockard, Outside Member, Planning, Public Policy & Mgmt

and Richard Linton, Vice President for Research and Graduate Studies/Dean of the Graduate School for the University of Oregon.

June 14, 2008

Original approval signatures are on file with the Graduate School and the University of Oregon Libraries.

© 2008 Kathleen E. Ludwig

An Abstract of the Dissertation of Kathleen E. Ludwig for the degree of Doctor of Education in the Department of Educational Leadership, to be taken June 2008

Title: A CASE STUDY: AN ECOLOGICAL LEADERSHIP MODEL AND DATA-BASED DECISION-MAKING

Approved: Dr. Diane Dunlap

This case study identified which of Baker and Richards' (2004) leadership models (compliance, performance, ecological) were used to make data-based decisions in six Oregon schools. Two elementary, two middle and two high schools in a suburban school district were selected. Typologies of each school's reported data Sources, Leadership, Processes and Impacts were developed. The results of the typologies were applied through pattern-matching to a Conceptual Model of Data-Based Schools developed by Hill (2004) in an earlier study. The study investigated (a) the similarities and differences in how the schools used the data they collected; (b) patterns that emerged indicating how data were used to inform decisions; and (c) the data-based leadership model (compliance, performance, ecological) evidenced at each school, school level and within the overall district.

Findings indicated consistent patterns of data-based practices across all six schools and placed each of them, as well as the overall district, on the continuum between the performance and ecological leadership models. School administrators reported an ecological set of beliefs to guide their site-based decisions; teachers reported a performance set of beliefs and practices in their classrooms. There was no significant difference attributed to school levels. The findings build on Hill's (2004) previous study, strengthen Baker and Richards' (2004) ecological leadership model, and add to the emerging literature on ecological leadership in schools. School leaders can use the model to identify current practices in data-based decision-making and share their findings with their staff in order to improve data practices and move along the continuum toward ongoing and continuous school improvement.

CURRICULUM VITAE

NAME OF AUTHOR: Kathleen E. Ludwig
GRADUATE AND UNDERGRADUATE SCHOOLS ATTENDED:

University of Oregon
Lewis and Clark College
Whitworth College

DEGREES AWARDED:

Doctor of Education in Educational Leadership, 2008, University of Oregon
Master of Education in Educational Administration, 2002, Lewis and Clark College
Bachelor of Arts in Education, 1991, Whitworth College

AREAS OF SPECIAL INTEREST:

Educational Leadership
Ecological Leadership
Women in Educational Leadership

PROFESSIONAL EXPERIENCE:

School Principal, Sunset Primary School, West Linn-Wilsonville School District, West Linn, Oregon, 2004-present
Instructional Coordinator, Stafford Primary School, West Linn-Wilsonville School District, West Linn, Oregon, 2000-2004
Teacher, Nancy Ryles Elementary School, Beaverton School District, Beaverton, Oregon, 1997-2000
Teacher, Tollgate Elementary School, Hopewell Valley Regional School District, Hopewell, New Jersey, 1993-1997
Teacher, Priest River Elementary School, Bonners Ferry School District, Sandpoint, Idaho, 1991-1992

AWARDS AND HONORS:

Golden Apple Award, Whitworth College, Washington, 1991
New Jersey Governor's Teacher Recognition Award, New Jersey, 1995
Golden Apple Award, Beaverton School District, Oregon, 1999

ACKNOWLEDGMENTS

First and foremost, I wish to thank the "Highland School District" and its teachers for their time and participation in this study. To Dr. Martha Hill, I offer my admiration and appreciation for her pioneering case study, which inspired my own investigation. Her hard work should be recognized. Thank you to Dr. Roger Woehl and Dr. Jane Stickney for their vision in creating and supporting a doctoral cohort at the University of Oregon, enabling me to have this opportunity in the first place. I wish to thank my colleagues in the Portland Metro Cohort, as we paved the way for other doctoral cohorts to follow and supported one another in achieving this important professional milestone. Specifically, a huge thank you to Christine Taylor from my cohort group. Together, we challenged and supported one another to get through each part of the journey, even calling each other up late at night to ask the silliest of questions.

I wish to express my sincere gratitude to Dr. Diane Dunlap, my advisor and committee Chair, for her assistance throughout this doctoral journey: the courses she taught, the comprehensive exams she proctored, and her superb assistance in the preparation of this manuscript. Thank you to the members of my committee: Dr. Leanne Ketterlin-Geller, Dr. Jean Stockard, Dr. Phil McCullum and Dr. Kate Dickson, for their insights and recommendations regarding my research. Special thanks to Edith Kotbagi and Robin Rokey for their expertise in statistical analysis and editing, respectively. To the staff of my own school, Sunset Primary School, I offer my thanks for their encouragement and support through this 5-year journey.

I wish to thank my family, particularly my parents, Ed and Audrey Carlson, who have always been a source of inspiration in my professional and personal life. They are great models of ecological leadership through their career work with adult literacy and health education in Pakistan. And finally, cheers to my husband, Don Ludwig, who also completes his doctorate this month and has been my consummate advocate and partner. Now we can finally go on that well-deserved vacation.

DEDICATION

To my husband, Don, for sharing the computer and my dreams.

TABLE OF CONTENTS
I. INTRODUCTION
   Historical Perspective of Data-Based Decision-Making
   Data-Based Decisions for School Improvement
   Study Design
   Significance of the Study

II. LITERATURE REVIEW
   Impacts of Data-Based Decisions
   Implementing Data-Based Decision-Making
   Research Implications
   Theoretical Framework for Data-Based Decision-Making

III. METHODOLOGY
   Purpose of the Study
   Research Design
   Research Questions
   Setting and Participants
   Procedures
   Validity and Reliability
   Data Analysis Methods
   Strengths and Limitations of Design
   Confidentiality

IV. RESULTS
   Data Collection
   Research Question 1
   Research Question 2

V. DATA ANALYSIS
   Sources
   Leadership
   Processes
   Impacts
   Research Question 3
   Summary

VI. CONCLUSIONS
   Purpose Statement
   Research Questions
   Summary of the Findings
   Limitations of This Study
   Recommendations for Further Study
   Recommendations for Practice
   Summary

APPENDICES
   A. HIGHLAND SCHOOL DISTRICT SDP GUIDELINES
   B. ADMINISTRATOR UNSTRUCTURED INTERVIEW PROTOCOL
   C. CODED TEACHER SURVEY INSTRUMENT
   D. TYPOLOGY OF THE COMPONENTS OF DATA-BASED SCHOOLS, HILL (2004), REVISED

REFERENCES

LIST OF TABLES

1. Baker and Richards' (2004) Model of Data-Based Decision-Making
2. A Comparison of the Differences and Similarities of the Replicated Study
3. Adams Teacher Characteristics
4. Johnson Teacher Characteristics
5. Cleveland Teacher Characteristics
6. Taft Teacher Characteristics
7. Roosevelt Teacher Characteristics
8. Hoover Teacher Characteristics
9. Demographic Comparison of Teacher Experience and Age (N = 180)
10. Comparison of Six Schools in Highland School District
11. Components of Data-Based Schools Aligned With Data-Collection Sources, Hill (2004)
12. Respondents to the Survey in the Six Highland Schools (N = 181)
13. Postcertification Coursework in Data Use or Statistics
14. School Data Use for School Improvement
15. Data Use by Classroom Teachers for Students and Classroom
16. Percentage of Teachers Regularly Involved in Decision-Making
17. Highland Schools Teacher Survey Leadership Summary
18. Leadership Positive Responses Summary
19. Teacher Survey of Processes Summary
20. Process Survey Summary: Agree and Disagree Only Responses
21. Processes Positive Responses Summary
22. Impacts of Data Summary
23. Impacts Positive Responses Summary
24. Summary of Teacher Responses to Impacts of Data Use
25. Baker & Richards' (2004) Model of Data-Based Decision-Making

LIST OF FIGURES

1. Barber's (2002) Trends of Educational Accountability and Professional Judgment
2. School Leadership in an Ecological Framework
3. Hill's (2004) Conceptual Model of Data-Based Schools
4. Leadership Percentage Positive Responses by School
5. Processes Average Positive Responses by School
6. Impacts Average Positive Responses by School

CHAPTER I

INTRODUCTION

School leaders acknowledge that the use of data to inform educational practices and improve student achievement has taken "center stage" in political arenas, in federal and state mandates, and in their own site-based decisions. It is not unusual to pick up an academic journal on school improvement and read the terms "data-based" or "data-driven" within the content.
Nor is it unusual for school-based improvement plans to include performance goals with data-based measurable outcomes. Using data to identify and report academic outcomes continues to be a challenging balance of "high-stakes" legislative policy and a school staff's interest in implementing an integrated school-improvement process.

Historical Perspective of Data-Based Decision-Making

Using data is not new to the field of education. Educators have always used forms of qualitative or quantitative measures, such as tests, exams, or essays, to measure student knowledge. Attendance, graduation rates and grades have served as other formal collections of data to report school effectiveness. But data remained primarily an in-house collection to be used and reported as the schools or individual teachers determined. From a positive perspective, this meant individual teachers and schools collected data to inform instruction or report progress linked to individual students. From a negative perspective, data have not historically been collected for systematic school-level improvements.

A Recent Chronology of Data Use in the United States

Trends in data usage during western educational reform can be traced back over the last four decades. Educational policy advisor Michael Barber (2002) depicts these trends chronologically as a four-quadrant cross-section of progression between national prescription and professional judgment, beginning with poor knowledge and ending with rich knowledge. See Figure 1 for the graphic descriptor of Barber's timeline.

[FIGURE 1. Barber's (2002) trends of educational accountability and professional judgment. The quadrants, spanning knowledge poor to knowledge rich and national prescription to professional judgment, are: 1970s, uninformed professional judgment; 1980s, uninformed prescription; 1990s, informed prescription; 2000s, informed professional judgment. Source: Adapted from From Good to Great: Large-Scale Reform in England (p. 11), by M. Barber, April 2002, paper presented at the Futures of Education Conference, Zurich, Switzerland.]

Barber (2002) explains that prior to and during the 1970s, educators operated largely as individuals within broad guidelines and used their own professional judgments to make decisions. This could be characterized as a "leave us alone to teach" era of "uninformed professional judgment" (p. 11). During the 1980s, governments began to take control of education with state-directed curriculum and assessment systems (Barber, 2002; Tyack & Cuban, 1995). Responding to the report A Nation at Risk, states in the U.S. began to establish educational laws and regulations to improve student achievement. The reform included more days and hours of schooling, more academic courses, and more standardized testing of student achievement (Tyack & Cuban, 1995). Essentially, this state-mandated model of reform was the impetus for bureaucratic control and externally imposed rules to control performance in schools. Data collection and accountability moved from an individual choice to a mandated statewide regulation. Educational judgment in the 1980s could be characterized as "uninformed and prescriptive." As the years progressed into the late 1980s, national goals and curriculum standards were coupled with school-site management. Teachers were told what to teach but not how to teach it. This decision was left to teachers and their principals under a "site-based" management model (Tyack & Cuban, 1995).
In the 1990s, state and federal government still controlled the educational agenda, but emerging research and studies began to influence and prompt new public policies (Barber, 2002). Barber describes this era as "informed prescription" (p. 11). These new policies placed additional external pressures on school administrators and teachers to be innovative and competitive (Tyack & Cuban, 1995). Private schools, charter schools and school-choice vouchers pressured public school districts to gather and report evidence of their own successful programs, strong curricula and student achievement (Holcomb, 1999; Tyack & Cuban, 1995). Public schools needed "proof" that they were making a difference.

The mid- to late 1990s experienced a wide sweep of educational studies proclaiming a need for school reform and the importance of data-based decisions to drive changes in practice and improvement in student achievement. In 1994, the Elementary and Secondary Education Act (ESEA) reauthorization required states to adopt or develop challenging curriculum content and performance standards. It required assessments that aligned with content standards and accountability systems in order to assess progress in raising student achievement. Not long after, a report by the National Educational Goals Panel (1996), followed by the Quality Counts '99 (Olson, 1999) report, established expectations and processes for school improvement. These expectations included setting measurable goals, collecting data, and reporting the results. States were to respond to schools' results with rewards or sanctions.

Barber (2002) described the 2000s as the era of "informed professional judgment." He proposed that control of education be returned to educators, but with requirements to be informed professionals, using evidence and research to justify and support decisions (p. 11). But the new millennium would not see a "return to educators" quite yet. The most recent federal legislation, No Child Left Behind (NCLB) of 2001, a reauthorization of the 1994 ESEA, brought the most dramatic and controversial mandate in the history of accountability and school improvement. NCLB outlined a more systematic and standardized gathering and reporting of student data at the school and district levels. And unlike any other legislation, NCLB added federal sanctions for schools and districts that failed to make adequate yearly progress (AYP). Schools and districts now had no choice but to pay serious attention to performance data in all subgroups and to set targeted goals for improvement. Since NCLB, the collection and use of school and student data have been fundamental to educational leaders in order to keep informed of a changing school environment as well as compliant with federal mandates. Schools no longer have time for "trial and error" experimentation in programs or curricula; instead, educators are now turning to data to inform immediate decisions and make the changes needed to improve data scores (Earl & Katz, 2006; Wayman, 2005).

Data Use and Legislation in Oregon

Like other states across the nation, Oregon complied with federal mandates and expectations for school accountability and reform. In 1991, the Oregon Education Act for the 21st Century was enacted; it was amended in 1995. The act included the following provisions to ensure school accountability:

1. Establishment of statewide standards for academic subjects.

2. Establishment of statewide assessments given in Grades 3, 5, 8, and 10.
3. Creation of certificates of mastery to award students who demonstrate proficiency in meeting state standards (Certificate of Initial Mastery) and also those who achieve a CIM as well as additional academic standards (Certificate of Advanced Mastery).

At the same time the state legislature was adopting school reform, Ballot Measure 5 was passed by voters. This measure shifted the majority (two thirds) of school funding responsibility from the local school districts to the state. With funding now the greater responsibility of the state, the legislature began to take more control and show greater interest in curricula, standards and assessment initiatives (McComb, 2002).

Oregon legislation on school accountability included (a) mandated school-district-improvement plans, (b) annual reports of school district and school performance, and (c) annual graded report cards for school districts and schools. The State Board of Education required school districts and schools to conduct self-evaluations and update their local district-improvement plans on a biennial basis. This self-evaluation process involved public representatives (often in the form of school-based Site Councils) to help set goals, identify actions to meet the goals, and then monitor and evaluate the progress toward the goals. The Superintendent of Public Instruction and the office of the Oregon Department of Education (ODE) collected and issued annual public reports containing information on student performance, student behavior and school characteristics. Using criteria set by the State Board of Education as a basis, ODE assigned a grade to each school for each of the three areas, summarized by an overall grade. If a school fell within the low or unacceptable performance classification in any category, the school filed a school-improvement plan with the Superintendent of Public Instruction and with the school district board and the 21st Century Schools Council for the school. Other data in the school performance report included attendance rates, school safety, dropout rates, school staff certification levels, special education enrollment, and enrollment in English language learner (ELL) courses.

To date, the Oregon Department of Education states that accountability has always been one of its core foundations, ensuring that schools are effectively and efficiently meeting the needs of each student. Annual school report cards, produced by the Oregon Department of Education, document each district's and school's attendance rate, state assessment participation and state assessment achievement scores. Using these three criteria as a basis, ODE rates each district by documenting whether it meets or fails to meet federal and state AYP standards; in addition, ODE labels each school as "Exceptional," "Strong," "Satisfactory" or "Unsatisfactory."

Data-Based Decisions for School Improvement

Decades ago, educators made school-improvement decisions based on their own professional training, classroom observations and anecdotal records. Today, teachers and administrators are expected to consider a variety of additional data to measure and document student achievement. Community patrons, district school boards and state departments expect school-improvement plans and reports to contain data-based decision processes that include both qualitative and quantitative measures. Public accountability aside, data use in education can certainly be an integral and critical component of improving teaching and learning.
Simply put, educators need to "know clearly, understand intricately and communicate effectively how their students benefit from attending school - in other words, the 'value added' of schooling" (Education Commission of the States, 2000, p. 1).

Defining Data-Based Decision-Making

Data-based decision-making can be defined as "the process of selecting, gathering, and analyzing data to address school improvement or student achievement problems and challenges and acting on those findings" (Streifer, 2002, p. 8). Data-based decision-making borrows much of its practice from the literature on program evaluation. Streifer cautions that any decision about actions in response to data analyses requires judgment and experience; decisions should not be solely "data based" or made by intuition. These decisions can be made by any member of the organization or groups of members and in consultation with those most affected.

Data Collection

Schools can become even more efficient and effective learning organizations if data play a more active role in the ongoing processes and decisions made by educators (Bernhardt, 1998; Lambert, 2003; Reeves, 2006; Senge, 2000; Zemelman, Daniels, & Hyde, 2005). But what data should schools collect? And how are data to be used effectively for student achievement and school improvement? Schools are "awash" with data and house large archives of information about individual students, groups of students, programs, finances, facilities, staff records, community demographics, and even trends and patterns of enrollment growth or decline. The sheer amount of data available can be overwhelming and confusing. There can be details and databases on every aspect of school improvement as well as excesses or abuses of data that have no relevance to school improvement. Knowing how to define and categorize good data is one of the first steps to data literacy; otherwise, decisions may be based only on opinions, individual perceptions, or limited observations (Earl & Katz, 2006). Educators need to know first what types of data are helpful to collect and how to use them effectively.

Types of Data for Collection

Dr. Victoria Bernhardt (1998), author of Data Analysis for Comprehensive Schoolwide Improvement, defines four major types or measures of essential data that schools should collect: (a) demographics, (b) perceptions, (c) student learning, and (d) school processes. Demographic data provide descriptive information about the school community, including enrollment, attendance, grade level, ethnicity, gender and native language. These data are part of the educational system over which we have no control; however, from them we can create a profile of the school and examine trends and patterns of the system. Perceptions data can be gathered through questionnaires, interviews, focus groups, and/or observations. Perceptions data are important for learning what other people believe, perceive or think about different topics or about what is happening in the school community. Student learning data give information about student performance on different measures. These may include standardized tests, norm- or criterion-referenced tests, teacher observations of abilities, or authentic assessments. Schools typically use a variety of student learning measurements separately, often neglecting to consider how they may be interrelated. School processes define what schools do to help students learn. School processes include programs, instruction and assessment strategies, and other classroom practices.
School personnel who plan to change the results that schools are getting must document these processes and align them with the results in order to determine how to make changes.

These four measures of data can be examined separately, separately over time, or in combination to answer deeper questions, such as "How have students of different ethnicities scored on standardized tests over the past 3 years?" Intersecting the measures can facilitate a deeper investigation of the data and help examine all facets of the school (Bernhardt, 2004; see also Bernhardt, 1998).

Technological Advances in Data Usage

Fortunately, advancements in technology have provided more efficient access to a wide variety of data. Years ago, technology was capable of housing data, but such archives involved limited or laborious access to composites of information (Streifer & Schumann, 2005). Today, electronic database software on personal computers can sort, merge, disaggregate and produce statistical reports on a multitude of variables efficiently and quickly. Computer software or vendor-purchased programs allow data to be streamlined through statistical programs that convert data from raw numbers to spreadsheets, graphs, charts, and comparison tables. Educators can now make decisions based on longitudinal or up-to-the-minute data that evaluate many variables at once. Personal computer software has become an essential tool for data-based decision-making (Wayman, 2005). To provide additional support, many school districts employ full-time "information technologists" or "data managers" who compile, disaggregate and distribute data to school boards, administrators and teachers (Salpeter, 2004; Streifer, 2002; Wayman, 2005).

In Oregon, all student assessment data are entered electronically into school districts' computerized data warehouses. When a student participates in an assessment during the school year, students, teachers and administrators have access to the results immediately. Every teacher, student or parent can access school data through a password or student identification number. The internet and web-based software have provided efficient and rapid ways of delivering, sorting and reporting data to individuals and groups.

However, these technological advances have not always made the process of data collection and management easier or more accurate. Budget constraints for new technology, technology malfunctions, or the continual updating of new software systems can frustrate data retrieval and financially drain many school districts. In 2006, contractual issues between the Oregon Department of Education (ODE) and the web-based software testing company resulted in the abrupt cessation of all statewide testing mid-year. ODE rapidly printed and distributed paper-pencil tests throughout the state to maintain NCLB compliance. The incident resulted in an "eleventh hour" administration of different tests to the remaining students. Subsequently, the state and school districts found themselves at a loss for using the year's data in any reliable manner.

Technological advances do not always translate into effective use of data by an organization. Nor do they ensure that all educators or stakeholders have the ability to respond with discernment to the data collected. It is just as important that school leaders understand when and how to use data as it is that they understand what data to collect.
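As an editorial illustration of the kind of disaggregation described above, the sketch below shows how a small extract from a data warehouse might be "intersected" in Bernhardt's sense to answer the earlier question about how students of different ethnicities have scored on standardized tests over 3 years. It is a minimal sketch only, with hypothetical column names (school_year, ethnicity, reading_score) and invented values; it is not drawn from the study, from any district system, or from Bernhardt's work.

```python
# Illustrative only: intersecting a demographic measure (ethnicity) with a
# student-learning measure (reading_score) across years. All names and
# values are hypothetical.
import pandas as pd

# One row per student per year, as a warehouse extract might provide.
records = pd.DataFrame({
    "school_year":   [2005, 2005, 2006, 2006, 2007, 2007],
    "ethnicity":     ["Group A", "Group B", "Group A", "Group B", "Group A", "Group B"],
    "reading_score": [218, 204, 223, 211, 230, 221],
})

# Disaggregate: mean standardized reading score for each ethnicity in each year.
trend = (
    records
    .groupby(["school_year", "ethnicity"])["reading_score"]
    .mean()
    .unstack("ethnicity")
)
print(trend)  # rows = years, columns = subgroups, cells = mean scores
```

The same grouping logic underlies the spreadsheet and vendor tools mentioned above; the point is simply that two of Bernhardt's measures, examined together over time, yield a subgroup trend that neither measure shows alone.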
Leadership and Data Use

According to Leading Learning Communities: Standards for What Principals Should Know and Be Able to Do, a publication by the National Association of Elementary School Principals (NAESP, 2002), there are six performance standards for principals. One of these standards includes expectations for data usage: "Use multiple sources of data as diagnostic tools to assess, identify, and apply instructional improvement" (p. 2). Strategies accompanying this standard encourage principals to

consider a variety of data sources to measure performance; analyze data using a variety of strategies; use data tools to identify barriers to success; design strategies for improvement and to plan daily instruction; benchmark successful schools with similar demographics to identify strategies for improving student achievement; and, create a school environment that is comfortable using data. (p. 7)

As these strategies imply, educational leaders have the task of managing thinking around the use of data. This thinking may still include skepticism about data, fears of misuse of data, limited use of data, overuse of data, and even inadequate understanding about using data in general. Identifying and evaluating how people within a system think about and use data are essential to keeping data in context, manageable, and useful. School leaders are expected to be intentional about implementing processes so that analyzing data becomes a habitual aspect of school work, leading to school improvement within a school culture "comfortable" using data.

So how are educational leaders making decisions regarding the data they collect? Federal regulations demand that data be collected and reported by states and school districts. States mandate that school districts write improvement plans with data-based outcome measures. School boards and local communities expect to see results from standardized tests documenting student academic achievement. District administrators expect to see school plans that include outcomes measured by data. And teachers are expected to use formative or summative assessments to measure and report students' academic progress back to their students, parents and administrators. Despite each of these external or internal expectations, how are principals and teachers actually collecting and analyzing data? Furthermore, how are they using the data? And what impact does the collection and use of data have on the organization? Schools, like other social systems, are not mechanistic; one small part that "isn't working" cannot simply be identified, extracted, treated and rehabilitated within the system. Instead, schools are better viewed as ecological systems in which a variety of people and subsystems flow in and out, constantly impacting and influencing the entire system (Bronfenbrenner, 1979; Goodlad, 2001). How do leaders and data, therefore, influence this ecological system?

Study Design

In this exploratory case study, data-based leadership models were identified in six schools within a high-performing school district in suburban Oregon to answer these questions about data use and leadership responsibility. The study was focused on these questions:

1. What were the similarities and differences in how the schools used the data they collected?

2. What patterns emerged that indicated how data were used to inform decision-making?
3. What data-based leadership model (compliance, performance, ecological) predominated at each school, at each school level (elementary, middle, high) and across the district?

This study was a multiple case study design, incorporating separate case studies of six schools within the same district: two elementary, two middle and two high schools. Each school was examined through surveys, interviews and document analysis, using a typology of indicators to identify how data were collected and used to inform decision-making. A data-based leadership model, informed by Baker and Richards' (2004) leadership continuum (compliance, performance, ecological), was identified for each school and the overall district. An earlier case study in North Carolina (Hill, 2004) examined five elementary schools using Baker and Richards' (2004) leadership continuum. Those schools were found to fall between the compliance and performance models on the continuum. The current study was a replication of Hill's work, with an expansion that included middle and high schools.

Significance of the Study

By replicating and expanding Hill's (2004) earlier work, this study adds to the research by (a) exploring and describing how a school district in Oregon, with its accountability standards, uses data to inform decision-making; and (b) expanding the study to include middle and high schools. This expansion allows for the comparison of data use between school levels (elementary, middle, high) and an overall K-12 leadership model analysis.

Teachers, administrators, and district personnel may find this study helpful in identifying how they are using data within their own organizations. School leaders may choose to evaluate their current leadership model along Baker and Richards' (2004) ecological leadership model continuum. Recommendations for staff development, systems analyses, and resource allocation may result. Additionally, the study will contribute to the body of research and literature examining schools' data-based decision-making within an ecological model of leadership.

The next chapter in this study includes a literature review, which identifies impacts of and barriers to data-based decisions; research implications; and the theoretical framework of an ecological leadership model for data-based decision-making. Chapter III describes the study's methodology, including the research design, descriptions of each site and the participants, and the data-collection process. Chapter IV reports the results for the first two research questions. In Chapter V, the results are analyzed and the third research question is answered. The final chapter summarizes the findings and offers recommendations for further research and changes in practice.

CHAPTER II

LITERATURE REVIEW

An emerging body of literature claims that schools which "take charge of change" by collecting and using organizational data have repeatedly shown an ability to be more effective and improve more rapidly than those that do not (Bernhardt, 1998, 2004; Blink, 2007; Earl & Katz, 2006; Protheroe, 2001; Reeves, 2006; Schmoker, 1996). A number of recent studies have demonstrated that schools which effectively use data actualize positive impacts on teaching and learning (Brunner et al., 2005; Chen, Heritage, & Lee, 2005; Walpole, Justice, & Invernizzi, 2004) and development toward a professional culture of inquiry (Chrispeels, Brown, & Castillo, 2000; Feldman & Tung, 2001; Huffman & Kalnin, 2003).
Impacts of Data-Based Decisions

Impacts on Teaching and Learning in the Classroom

The central goals of NCLB and school-improvement plans are to close achievement gaps and increase student achievement. As school teachers work toward these goals for all students, they find themselves re-evaluating their current practices and implementing those that connect research and practice and include data-based decisions (Walpole et al., 2004). Walpole et al. conducted a case study of an elementary school in a high-poverty, highly mobile area with a culturally diverse population. This school, like others around the nation, had committed to a school-wide literacy reform in response to NCLB and internal efforts to improve its literacy program. After 6 years of concerted effort, staff development and an emphasis on prevention-based instruction in the early grades, teachers and administrators began to see the success of their efforts. More at-risk students were identified early, in kindergarten, and each at-risk cohort group diminished in size during the year and into the following year. In other words, the achievement gap was closing and students were making academic progress. The study examined essential "ingredients" contributing to the school's progress. Two of the key ingredients were assessment-based (data) decision-making and ongoing data analysis. The teachers in the study agreed that important decisions about students' literacy levels and instruction had to be based on reliable, valid, and ongoing assessment data collection. Subsequently, data analysis was essential to the comprehensive reading reform in the school. It guided intervention decisions and informed program refinement. Teachers' understanding of literacy research and best practice was informed by the ongoing collection and analysis of data (Walpole et al., 2004).

Likewise, Chen et al. (2005) reported findings from an evaluation research study of a web-based decision-support tool, the Quality School Portfolio (QSP). Telephone interviews were conducted to gather data on users' experiences during their QSP training and their implementation of this tool. Results revealed that after teachers received some basic training centered around this user-friendly, data-analysis software program, they were able to identify more students with specific academic needs. The QSP helped the teachers create individual, longitudinal records for each student, including demographic information and assessment information. The records included the option to include data related to students' perceptions, interests, habits and opportunities to learn. By using these records, teachers could prepare and share data analyses of individual performance, group performance and school-wide performance with other schools. Teachers were able to disaggregate test results by subgroups of students and by subscales, making it easier to identify at-risk students. Results from the study suggest that technology-supported data "have the potential to increase the capacity of practitioners to use summative and formative assessments to identify needs and focus instructional planning" (Chen et al., 2005, p. 328). An additional outcome of the study indicated that using data in these schools reportedly promoted collaboration and shared planning among the educators.

A similar study by Brunner et al. (2005) provided teachers in 15 New York City public schools with training to use a web-based data-report system (Grow Network) to inform decisions about teaching and learning.
The 2-year study (despite changes in district and administrative restructuring) reported that teachers used the data system to meet their classes' and diverse students' academic needs. Teachers were able to use the collected data to target resources and provide additional support for students who could benefit from more specific instruction, thereby increasing student achievement.

Academic data can be collected by teachers through a concerted school-wide effort using curriculum-based measurement tools (Walpole et al., 2004) and through the use of web-based software systems for more sophisticated analysis of demographic and academic information (Brunner et al., 2005; Chen et al., 2005). Data can also be collected by school personnel through existing procedures to evaluate social behaviors, such as student behavior in schools, that impact academic performance (Irvin et al., 2006). Across four school districts, Irvin et al. surveyed users (n = 56) who routinely entered office discipline referrals (ODRs) into a school-wide information system (SWIS). The users included teachers, instructional assistants, principals, counselors and other classified staff. Questions in the survey asked about entry of ODR reports, helpfulness of the reports, decisions made using the reports, and impacts from these data-based decisions. Nearly all survey respondents rated the ODR report data as "increasing both efficiency and effectiveness of decision-making in schools" (p. 19). The researchers concluded that as more and different needs for data on student behavior emerge in schools, data-collection programs like SWIS are needed to help compile and manage the information so that it is useful for decision-making.

These studies demonstrated the positive impact that ongoing data collection and data analysis made in schools. Teachers and administrators were able to identify students' at-risk academic and social behaviors and implement interventions for performance improvement. Whether the school administrators would have made data-based decisions without NCLB mandates for performance improvement is unknown.

Impacts on a Professional Culture of Inquiry

Outcomes from several of the studies indicated that when data are used in a collaborative manner and center around a clear set of questions focused on student learning and achievement, schools begin to create a culture of inquiry (Chen et al., 2005; Walpole et al., 2004). Feldman and Tung's (2001) study revealed the same impact. The researchers examined six schools engaged in data-based inquiry and decision-making processes. The process in each school extended beyond individual classroom teachers using data to inform their practice to a systemic process that encompassed the whole school. Under this process, the school

analyzes data to generate challenge areas and then creates inquiry groups of teachers who generate hypotheses, collect and analyze more data, and return to the full staff with an action plan to address their challenge area. The cyclical nature of the process ensured flexibility and ongoing inquiry. (Feldman & Tung, 2001, p. 11)

Teachers from the six schools (one K-8, three middle and two high schools) either attended data-based inquiry or decision-making institute seminars and received "coaching" from on-site facilitators. A representative sample of teachers and administrators from each school were interviewed about the process, what they learned, and their perceptions of how the process was being used in their schools.
The researchers also attended meetings where the data-based decision-making processes were discussed. Results from the interviews and observations indicated that in schools where personnel implemented data-based inquiry more consistently, teachers reportedly became more reflective about their practice. The interviewees at these schools also reported that "their school culture had changed through 1) deprivatization of practice; and 2) building a more professional culture" (p. 16). Feldman and Tung (2001) concluded that "teachers must own the process, provide leadership for inquiry groups, facilitate meetings, push the thinking of others, and coordinate other aspects of the process" (p. 19). In this manner, school staff can create a culture that facilitates more professional dialogue and more reflective practice and engages one another in asking questions and seeking answers (Feldman & Tung, 2001).

In some states, such as California, school leadership teams (SLTs) have created this same type of professional culture of inquiry. SLTs in these schools are an aspect of school-based management that is teacher-led, focused on curriculum and instructional support, and an integral component of school reform (Chrispeels et al., 2000). These teams often receive training to learn to work together so they can improve teaching and learning at their schools with the goal of affecting student outcomes. Chrispeels et al. (2000) conducted a study of 142 California schools (71 elementary and 71 secondary) that had received a full year of SLT training. To understand which factors predicted effective SLTs, the authors analyzed survey data from each of the 142 schools. Using a path-analysis procedure, they tested a model that identified the relationships among the factors most likely to influence a team's ability to focus on teaching and learning. The strongest predictor of this focus was the team's use of data collected within the school to identify needs and guide future decisions. The findings demonstrated that data use positively impacted these SLTs and the decisions made in their schools. The more the teams learned about and used data, the more data were used to inform important decisions. Concurrently, student achievement improved.

A culture of inquiry can also be influenced by a diverse "team" of school stakeholders across several school districts (Huffman & Kalnin, 2003). The purpose of Huffman and Kalnin's study was to investigate the impact of a long-term collaborative inquiry project undertaken by members of several school district communities. Teams of educators from across Minnesota participated in a year-long series of workshops "designed to help teams engage in inquiry in their own schools" (p. 571). Eight teams of 4-8 people were created. These teams were composed of teachers, administrators, curriculum and assessment coordinators, and in some cases school board members and students. The inquiry project used the Third International Mathematics and Science Study (TIMSS) to identify questions about team members' own mathematics and science programs in their schools. Throughout the following year, teams were brought back together periodically to "analyze their data, develop action plans, and create long-term continuous improvement systems" (p. 571). The impact of this approach to encouraging data-based decision-making was documented through surveys and a focus group held with the participants. Overall, the participants responded positively to all aspects of the collaborative project.
Participants indicated that they felt empowered and less isolated as they used data to collaborate on student progress and instructional practice. Several teachers from the groups indicated that data helped them break the cycle of isolation and "get beyond their own classroom walls to discuss and debate school-wide issues" (p. 578).

These studies suggest that data-based decision-making processes produce positive results in terms of teaching and learning within a culture of inquiry. In an article citing several similar studies which included data-based decision-making, Noyce, Perda, and Traver (2000) concurred that in a data-driven culture, "there is a willingness to use numbers systematically to reveal important patterns and to answer focused questions about policy, methods, and outcomes" (p. 53). Collaborative inquiry that integrates teams of teachers, administrators, school board members, and parents positively influences teachers and engages them in a continuous improvement process as part of a data-driven culture. Noyce et al. (2000) claimed that data-based school cultures "do not arise in a vacuum"; rather, they are cultivated intentionally by the district, administration and teachers (p. 54). How these school personnel determined which areas of their school culture to "cultivate" deserves more exploration. To date, most studies report administrative and school-wide efforts to implement data-based decisions in response to regulatory mandates or academic performance goals.

Implementing Data-Based Decision-Making

Despite the literature and research regarding the positive impacts of data-based decisions, other studies have demonstrated that a number of teachers and administrators not only reported feeling inadequate regarding their use of data, but also cited a number of barriers to data usage in their own practices or schools (Bettesworth, 2006; Ingram, Louis, & Schroeder, 2004; Lachat & Smith, 2005). In response, these same studies and others recommended strong school-based leadership as an integral component in tackling barriers that thwart data-use proficiency, usage and subsequent school improvement (Armstrong & Anthes, 2001; Feldman & Tung, 2001; Lachat & Smith, 2005; Murnane, Sharkey, & Boudett, 2005).

Barriers to Data Use

Ingram et al. (2004) conducted a longitudinal study of nine high schools that revealed most teachers in the study were willing to use data but had significant concerns about the kind of information available and how to use it. It is important to note that the nine schools in the study, located across the United States and varying in demographics, were nominated as leading practitioners of Continuous Improvement (CI) practices. CI was defined as a set of practices and philosophies found in seven categories: (a) continuous improvement, (b) customer input/focus, (c) data-based decision-making, (d) studying and evaluating processes, (e) leadership, (f) systems thinking, and (g) training. Although the nine schools were selected as exemplars of CI practice, many of them were found to be limited in their implementation of CI. Barriers to establishing a school culture supportive of data-based decision-making were synthesized into three categories: cultural challenges, technical challenges and political challenges. Cultural challenges were those engendered by the way teachers thought about data and what they valued. For example, there may have been disagreement about what data were important or what kinds of data were meaningful.
Technical challenges that emerged from the study included the time needed to collect and analyze data, data made unavailable by limited tools, and the difficulty of measuring the type of data that teachers really wanted. Finally, because data have often been used politically, teachers felt significant mistrust about using them; some wanted to avoid data altogether (Ingram et al., 2004).

Similar challenges in effectively implementing data use were found in a study of five high schools in varying urban school districts (Lachat & Smith, 2005). Lachat and Smith's case study focused on data use in five low-performing urban high schools undergoing significant school reform. Over a period of 4 years, qualitative data were collected examining the process of data use and the factors that either supported or inhibited the use of data. These data included (a) school reform documents; (b) field-note documentation of data-analysis meetings and outcomes; (c) archival catalogues of data used by the schools; and (d) interviews with principals and other school administrators, teachers, school design teams and data teams. The case study findings uncovered four key factors that had an impact on data use: (a) data quality and data access, (b) capacity for data disaggregation, (c) collaborative use of data organized around a clear set of questions, and (d) leadership structures that enhanced data use. Among the findings, the researchers concluded that schools where leadership structures supported a collaborative school-wide use of data experienced greater progress. While each of the schools made varying progress, the common component in each school's reform plan was the use of data for continuous improvement, not just as an isolated activity (Lachat & Smith, 2005).

A common theme among studies that noted barriers to data usage was the role of school leadership in contributing to the barriers or working to tackle them (Feldman & Tung, 2001; Lachat & Smith, 2005). Bettesworth's (2006) study emerged from a review of the literature on school leaders' barriers to using data to make decisions, namely their "self-efficacy." This study involved educators (n = 76) enrolled in administrative training programs designed for those seeking leadership opportunities in their districts. An initial survey regarding data-use skill level allowed the group to be divided into two subgroups comparable in skill level. A treatment group (n = 31) received seminars and training modules that taught the participants how to use school data to make informed decisions regarding their instructional practices. The comparison group did not receive the seminars. Both groups received a posttest 6 months later, and several members of the treatment group were interviewed. Bettesworth's study revealed that the group which participated in the training reported feeling overwhelmingly better able to apply their knowledge of data analysis and interpretation. As the participants learned to analyze, interpret and make instructional decisions based on data analysis, their efficacy increased and there was improvement in their attitudes toward and ability to use data (Bettesworth, 2006).

Ingram et al. (2004), Lachat and Smith (2005), and Bettesworth (2006) highlighted the importance of teacher collaboration, data-analysis training, and self-efficacy in addressing barriers that contributed to a lagging understanding of how data could be used effectively toward school improvement.
Strong leadership was noted as a component of tackling these barriers.

Leadership and Data-Based Decision-Making

School leaders are charged with the responsibility for managing data and reporting it; therefore, their role encompasses tackling the barriers as well, even when the barriers include their own limitations. Bettesworth's (2006) study began from the premise that if school leaders could increase their self-efficacy regarding data use, then perhaps their ability to use data in their schools would increase. And it did. Conversely, a school leader's lack of commitment and support for data usage can impede its progress (Earl & Fullan, 2003; Murnane, Sharkey, & Boudett, 2005).

Murnane et al. (2005) conducted a study in Boston aimed at helping teams from 10 public schools tackle the barriers of time, expertise, support and understanding of the value of examining assessment data. This was accomplished through a series of workshops and a user-friendly web-based software program. Participants included cluster teams (typically one administrator and one to three teachers) from each school. The teams met for fourteen 2.5-hour sessions designed to help the participants learn about the conceptual use of data, identify central questions regarding the data, and design and implement strategies to answer the questions. During the course of the study, the researchers learned from the participants and the design of the study that "schools need (a) a process for engaging in conversations around teaching and learning, (b) an opportunity for support of analyses of data within their schools, and (c) leadership committed to the endeavor" (p. 269). This last finding was evident when teams from different schools experienced different results based on the attitudes and support of their principals. Teams with strong leadership support were able to make explicit links between data analysis and instruction. In contrast, at another school a team's insights "fell on barren ground because the school principal did not see data analysis as important" (p. 279). Murnane et al. (2005) concluded that in these schools, the teams' work was not likely to make a difference in school practice.

Fortunately, where school leadership embraced data-based decision-making, teachers saw positive impacts in their classrooms and across their schools (Armstrong & Anthes, 2001; Feldman & Tung, 2001; Lachat & Smith, 2005). Armstrong and Anthes' (2001) case study demonstrated that this impact can occur on a greater systemic level and influence entire school districts. The researchers conducted interviews in six school districts in five different states that had reputations as exemplary data users. The findings from the study indicated that each of the districts employed data-based decision-making processes. Each district collected a variety of data, used the data to improve student achievement, and expected both administrators and teachers throughout the district to use data as the basis for their decisions. The common characteristic across the districts was strong leadership and a supportive district-wide culture for using data for continuous improvement.

Research Implications

The body of research on data-based decision-making links school-wide efforts of data collection, data analysis and data-informed processes and procedures to school improvement.
Despite the identified barriers to making data integration occur more consistently and immediately across all schools, a number of studies demonstrated that in schools where strong leadership supported and valued data-based decision-making, such decision-making was more likely to take hold and make a positive difference.

This impact of an administrator's leadership mindset and practice regarding data-based decision-making deserves more research and examination. How does a school leader evaluate the current mindset and practices occurring in one's own school, or across a district? What leadership model facilitates data-based decision-making in a way that helps school leaders tackle the barriers and make cultural, technical and political changes?

Theoretical Framework for Data-Based Decision-Making

Earl (2005) presents the idea that an educational leader must learn to use and think about data less as a linear and mechanistic tool and more as though one were an artist.

Artists are always gathering and using data. They are constantly observing, investigating, and responding to colors, textures, and images. And they use their considerable interpretive talent and experience to draw the salient features of the foreground, emphasize important dimensions and communicate a mood and a message to the audience. (p. 8)

Earl (2005) explains that keeping this role or model in mind enables educators to become comfortable with using data. Leaders who make informed decisions need to "develop an inquiry habit of mind; become data literate; and create a culture of inquiry in their school community" (p. 8).

The Annenberg Institute for School Reform's (1998) study of 18 Challenge schools mentioned earlier moved beyond implementing data-based decision-making practices and processes to a place of "internalizing" the notion of being accountable. Commenting on this study, Rallis and MacMullen (2000) added that "too often schools or districts view accountability as public and external rather than a central component of their own practice" (p. 769). The authors proposed that schools find a way to combine internal and external accountability through a cycle of reflective inquiry. This process produces accountability and builds greater capacity. Inquiry-minded schools "perceive questioning, seeking data, reflection, and subsequent action as the steps that are necessary to improve performance and get things done in the school" (p. 770). A learning community that can create and sustain a culture of inquiry would be able to integrate data-based decisions on a broader and deeper level than schools that respond only to the mandates.

Becoming data literate and inquiry-minded are significant changes in practice that are consistent with the idea of professional learning organizations (Blink, 2007; Earl & Fullan, 2003; Earl & Katz, 2006). Peter Senge (1990) was the first to popularize the idea of thinking of organizations, like schools, as "learning organizations," whereby everyone in the organizational system shares knowledge and the organization has a professional culture of inquiry and ongoing learning. According to Senge, "learning organizations" construct knowledge from their individual and social experiences, beliefs, values, emotions, and will. Dynamic learning organizations do not seek to separate out or deliver knowledge; rather, knowledge is constructed via the disciplines of systems thinking, shared mental models, collaboration, reflection and inquiry.
This learning-organization construct emerges from a "systems thinking theory" of understanding (Senge, 1990).

Ecological Systems Theory

Whereas in the context of biology, "ecology" is the study of the habitats in which organisms live, many psychologists, social scientists and even business leaders have applied an ecological systems theory to understanding human behavior in social or organizational settings (Kelly, Ryan, Altman, & Stelzner, 2000; Petrides & Guiney, 2002; Wielkiewicz & Stelzner, 2005). Urie Bronfenbrenner (1979) began the groundwork for understanding human development within the context of an ecological framework. Bronfenbrenner's ecological systems theory and model (now referred to as the bioecological model) proposed that human development progresses throughout life and through interactions with others, other systems (both micro- and macrosystems), and the environment (as cited in Kazdin, 2000). Bronfenbrenner's (1979) ecological systems model demonstrated not only the interconnectedness of our world's natural systems of plants, animals, fresh water and so on, but it also explained the interconnectedness within cultures and cultural groups (Bowers, 1995). Later work on this ecological systems theory by Bronfenbrenner and Morris (1998) identified a theory of individual or organizational development that included five nested environmental systems ranging from "individual" characteristics to cultural influences. These systems included the chronosystem, macrosystem, exosystem, mesosystem and microsystem. In a recent paper presented to the American Education Research Association (AERA), Dunlap, Garrison, Hernandez, and Clott (2008) described an ecologically nested system within the framework of education. Examining the nested systems from the larger encompassing "circles" inward to the smaller "circles," the authors began with the chronosystem, which is not included as a nested "circle." The chronosystem is the interaction of all parts of the ecological system and its individuals over time. For example, in terms of schools and data, the technological advances of computer systems, databases, software programs, statistical analysis programs, and efficient access could be documented as they have influenced and related to the organization over time. In a model, the chronosystem may be depicted as a one-way arrow that represents a trajectory of time. The macrosystem, the outermost circle, describes the broad culture within which the organization exists and its influences on the organization. These influences may include race, ethnicity, society, culture, and values (Dunlap et al., 2008). Inside the macrosystem nests the exosystem: the legal and operational context within which the organization functions, including policies, procedures and laws. The mesosystem lies inside the exosystem and describes relationships between the microsystems. Identifying the relationships within the mesosystem depends on what the individual core of the organization is and the microsystems surrounding it. The microsystem is the system where the individual or individual unit spends most of its time. In an educational framework, the microsystem may be the school, the district and the surrounding community. To clarify, Figure 2 adapts the ecological models of Bronfenbrenner (1979) and Dunlap et al. (2008) into an ecological framework for school leadership.
Placing "leader" and "leadership team" in the center circle within the nested system identifies the individual or "individual team" as the core decision-makers within a leadership context. Since principals or superintendents rarely make decisions in 36 Mesosystem Relationships between the microsystems Individual Leader Leadership Team Chronosystem --------------------~~ FIGURE 2. School leadership in an ecological framework. Source: Adapted from "The Ecology of Developmental Processes" (p. 996), by U. Bronfenbrenner and P. A. Morris, 1998, in W. Damon and R. M. Lerner (Eds.), Handbook o..fChild Psychology: Vol. 1. Theory (5th ed.), New York: Wiley; and A Theoretical Frameworkfor Education Based on Bronfenbrenner's Model (p. 11), by D. Dunlap, A. Garrison, P. Hernandez., and A. Clott, 2008, a paper presented at the annual meeting of the American Education Research Association, New York. 37 isolation or autonomously, "leader" and "leadership team" are interchangeable when observers claim, for example, that "leadership" made the decision. A bidirectional arrow is included in Figure 2 to reflect the constant action and reaction ofthe systems to one another. The school leader or leadership team is influenced by the school, district and community as well as external policies, procedures and laws. At the same time, systems such as culture, race, society, and values impact the internal systems and are impacted by them. The model and framework designed by Bronfenbrenner (1979) and Dunlap et al. (2008) allow us to see how an organization such as a school and its "leadership" are ecological and therefore impacted by the closest of relationships to the most distant yet influential systems. Kelly et al. (2000) concurred that an ecological perspective "provides an understanding of the interrelationships of social structures and social processes of groups, organizations, and communities in which we live and work" (p. 133). The public school is one such ecological social structure. Schools as Ecological Systems Margaret Wheatley's work and studies of interconnected systems use the terminology "ecosystems" in her many examinations of organizations. She describes these ecosystems as self-organizing, simple and yet complex (Wheatley & Kellner- Rogers, 2003). Within ecological systems, the stability actually depends on the ability of the members to change, and should the system refuse new ideas, it becomes vulnerable 38 and self-destructs. In these systems, members support one another with information and respond to one another with intelligence and collaboration (Bowers, 1995; Goodlad, 2001; Wheatley & Kellner-Rogers, 2003). School systems are one of many types of ecological systems where members are connected to one another and their environment. And these members greatly influence one another. Goodlad (2001) describes this ecological model of schooling as one that is concerned primarily with interactions, relationships, and interdependencies within their defined domain or environment. This model does not avoid setting goals; rather, the criteria for evaluating goal achievement are broadened and the inquiry into school effectiveness includes gathering multiple layers of data. In an ecological framework for understanding schools, goals are established and evaluated both inside the system as well as reckoned with from outside the system. Questions about the efficacy of the system begin to transcend those that are set only by external mandates. 
Data on test scores provide only information about those test scores. They give no in-depth information about the more complex functioning and health of the system (Goodlad, 2001). When school systems are considered as ecological systems, questions about the efficacy of the school take on a broader scope. What constitutes a good learning environment (habitat) for teachers and students? What are the signs of a healthy school? What is a good school and how do we know when we have one? And just as important, how does one lead from an ecological perspective?

An Ecological Leadership Model for Data-Based Decision-Making

Baker and Richards (2004), authors of The Ecology of Educational Systems, concurred with the idea of the school as a learning organization with learning systems within it, and proposed that school environments become places where "information flows freely and feedback among the participants stimulates continued growth of the organization" (p. 5). The authors claimed that such environments were the essence of learning communities and were "ecologically self-aware." Combining the theoretical understanding of schools as "ecological systems" with a leadership model that integrated this "flowing process" of information, Baker and Richards (2004) developed an ecological leadership model integrating data-based decision-making. Baker and Richards' (2004) model of leadership was "ecological in perspective" and designed to reflect the pedagogical principles and practices of the organization. The theory supported the notion that "children can develop physically, emotionally, intellectually, ethically, and spiritually" within a system, and particularly within a school system (p. 4). Therefore, ecological schools should embrace data-based decisions that seek to nurture and develop children physically, emotionally, intellectually, ethically, and spiritually. This was a broader and more holistic imperative for the use of data. Guided by this theory, Baker and Richards' model included three progressive dimensions of organizational leadership behavior: managing for Compliance, managing for Performance, and leading for a dynamic and sustainable organizational Ecology. Table 1 describes the models of compliance, performance, and ecological data-based decision-making. As a culture moves along the continuum, the focus, analysis and evaluation of data within the model also progress along the continuum.

TABLE 1. Baker and Richards' (2004) Model of Data-Based Decision-Making

Compliance. Goal focus: regulations and standards. Analysis: compare data to the standards. Evaluation: data are collected for accountability measures.

Performance. Goal focus: input from stakeholders. Analysis: data are collected from multiple resources and performance of the organization is important. Evaluation: efficient and increased performance.

Ecological. Goal focus: ongoing and continuous growth and transformation. Analysis: data expose relationships and differences and are the basis for action. Evaluation: organizational learning occurs through the experiences and introspection of the process.

Note. Adapted from The Ecology of Educational Systems (p. 20), by B. Baker and C. Richards, 2004, Upper Saddle River, NJ: Pearson Education.

According to Baker and Richards (2004), managing for Compliance typically occurs in the form of rules and regulations. Both management style and learning behavior are focused on compliance with external expectations or mandates. In this type of dimension, knowledge is "received" and learning is "rule-based" (p. 19).
Performance management shifts from external expectations to internal control (Baker & Richards, 2004). Organizations that operate using performance management have concluded that compliance alone does not produce results. There is a shift to narrowly targeted aspects of the organization that need improvement. The environment 41 is less regulated but the vision is still narrow. In this type of dimension, knowledge is "skill-based" and learning is "need-to-know" based (p. 19). School leaders who operate with an ecological mindset understand that learning is reciprocal and collaborative for both the children and the adults in the organization (Baker & Richards, 2004). Administrators who practice ecological leadership also understand that the organization is unpredictable and incorporates many "agents" acting simultaneously. The leadership decentralizes control and acknowledges that all members-including the children-are active participants and contributors. An ecological school nurtures a reflective learning environment where knowledge is "constructed" and learning is "collaborative" (p. 19). Study Implications In schools today, data-based decisions are a mandated reality. Organizations could simply operate with a regulatory approach to management and collect data to meet standards. And yet, the literature and research has demonstrated that when schools reach beyond simply responding to federal and state mandates and begin using data proactively, they also begin to foster school improvement and a more collaborative professional culture. Teachers in these same schools reported evidence of improved student achievement, better identification of student needs, more targeted allocation of resources, and an emergence of a culture of inquiry. This movement indicates a shift in thinking regarding the collection and use of data. 42 Katz, Sutherland, and Earl (2005) describe this mindshift as an "evaluation habit of mind" (p. 2327). Katz et al. acknowledge that external mandates for using data can serve very much like extrinsic motivators which operate to encourage a set of processes and practices that may be absent. If the extrinsic motivator (mandate) is removed, there is the risk that the desirable set of behaviors will discontinue. It is important, therefore, for the organization to find the intrinsic value for the behavior in order to sustain growth and improvement. Katz et al. conclude that when organizations "take part of various evaluation activities (i.e., goal setting, data collection, data analysis, and so forth) [they] learn to view the organization in a different manner and begin to question basic assumptions and practices" (p. 2330). There exists a leadership challenge to use data not merely as a "surveillance activity" but as an integral component of school improvement (Earl & Katz, 2006). As long as data are used for compliance, the actions and energy expended are fragile and empty. Data must be used not because of external mandates, but based on intrinsic reasons for collecting and using it for internal improvement and transformational practice (Baker & Richards, 2004; Bernhardt, 2004; Earl & Fullan, 2003). In order to do so, school personnel must examine and evaluate how data and information flow through the school or district. How are data and information being used by teachers or administrators? Is there a fundamental "mindshift" towards ongoing inquiry that has taken place within the culture of a school? 
Is there a distinguishable leadership model as measured by Baker and Richards' (2004) continuum? There is little research exploring 43 how data-based decision-making practices are measured within an ecological framework. Hill's Study In 2004, Hill set out to address this gap in the research. Hill's study of five elementary schools in North Carolina used Baker and Richards' (2004) leadership continuum for the first time to evaluate whether schools were operating within a Compliance, Performance or Ecological model of data-based decision-making. Using her review of the literature as a basis, Hill (2004) developed a typology of indicators for data-based decision-making that was applied through case study analysis to each of the five schools. The typology included components summarized into four categories: Sources (of data), Leadership, Processes, and Impacts on decision-making. The typology provided an overview of how each component may be measured, the possible range of the component's existence and measurability, and the source or sources where evidence of the component may be located. This typology became the basis for analysis and comparison of the five schools in the study. Once Hill's (2004) typology was developed, archival documents, interviews and surveys were collected to create an in-depth profile of each school and its data collection, leadership, processes, and impacts of decision-making. By considering how each school made decisions using data, Hill identified the school along Baker and Richards' (2004) Compliance-Performance-Ecological continuum of leadership models. 44 Once the schools had an established profile, they were compared to each other and the district was provided with recommendations for helping schools move along the continuum. Findings in Hill's (2004) North Carolina study revealed that (a) personnel in the schools did not have a common language or expertise related to data, (b) the five elementary schools were predominantly situated on the continuum between compliance and performance, and (c) the schools were defined more by external standards and demands than the overall systemic health of the school or organization. Purpose of This Study Little is written about schools as ecological systems, and even less about ecological leadership as a model within school systems that must contend with data (Petrides & Guiney, 2002). This study will replicate and expand upon Hill's (2004) work, adding to the body of research on data-based decision-making within an ecological leadership model. By examining the data-based leadership practices within schools, this study and others like it are in essence "modeling the model" of a new approach to leadership within learning institutions. 45 CHAPTER III METHODOLOGY This study was an exploratory case study with a multiple, embedded case design. According to Yin (2003), case studies are used to "contribute to our knowledge of individual, group, organizational, social, political, and related phenomena" (p. 1). Case study research is notably the most challenging of social science research; yet, with a clear set of questions, a supporting research design, a protocol for data collection, and a strategy for data analysis, case study research can add robust and rich information to the body of knowledge of individual, group or societal phenomena (Yin, 2003). Purpose of the Study This study will contribute to the recent research examining schools' current data- based decision-making practices and leadership models. 
In addition, it will add to the literature on ecological leadership within school systems. There is very little research that examines a district's data-based decision-making leadership models and practices within the context of an ecological system. A recent case study in North Carolina by Hill (2004) examined several schools within an ecological leadership model.

Research Design

This was a replicated study of Hill's (2004) work, expanded to include middle and high schools. The data-collection processes were also slightly modified to account for the differences in the location of the study. Six schools within a school district in Oregon were examined to determine the use of data along Baker and Richards' (2004) leadership model continuum. The study involved two elementary, two middle and two high schools. As with multiple-case-study designs, the results of each site's data collection were not pooled across sites; rather, the data were part of the findings for each individual site. By expanding its scope to include a sample of elementary, middle and high schools, the study provided information about individual schools as well as comparative data between school levels. The addition of schools provided a more comprehensive K-12 data-based leadership model for the school district than did the Hill (2004) study, which examined only elementary schools. With Hill's (2004) typology as a guide, profiles of the two elementary, two middle and two high schools were created across data-based Sources, Leadership, Processes, and Impacts on decision-making. See Figure 3 for Hill's conceptual model. At each site, teachers were surveyed, administrators were interviewed, and archival documents were examined in order to collect information on data-based decision-making perceptions and processes. The cumulative data from each school's typology, surveys, interviews and collection of documents were used to analyze the schools along the continuum of the data-based leadership models of compliance-performance-ecological.

FIGURE 3. Hill's (2004) conceptual model of data-based schools. The model arrays Compliance Schools, Performance Schools, and Ecological Schools along a continuum (Compliance to Performance to Ecological) across four dimensions:

Sources. Compliance schools: data on demographics, perceptions, internal processes, and student learning are utilized toward meeting a specific goal within specified parameters, criteria, or standards. Performance schools: data on demographics, perceptions, internal processes, and student learning are collected and utilized to increase performance in target areas or overall. Ecological schools: data on demographics, perceptions, internal processes, and student learning, all based on the four intelligences (physical, emotional, mental, and spiritual), are collected continuously to inform growth and ongoing transformation.

Leadership. Compliance schools: data are collected and used for compliance to standards and established regulations. Performance schools: data are collected and shared in order to increase performance of specific areas or of the entire school. Ecological schools: data are collected on the external and internal environment as well as the physical, emotional, mental, and spiritual well-being of students on an ongoing basis to continually inform the school.

Processes. Compliance schools: processes for collection, dissemination, collaboration, and communication of data are used only when needed to meet specific standards or regulations. Performance schools: processes for collection, dissemination, collaboration, and communication of data are used to increase performance of specific areas or of the overall school. Ecological schools: processes for collection, dissemination, collaboration, and communication of data are part of the school culture.

Impacts. Compliance schools: standards and regulated requirements are met or not met. Performance schools: performance is increased or not increased based upon the results. Ecological schools: the school continually evolves and improves in the four domains of intelligence.

Source: Adapted from A case study in data-driven school improvement (p. 90), by M. G. Hill, 2004, unpublished doctoral dissertation, Teachers College, New York.

A comparative analysis of each of the schools revealed how individual data were collected and used for decision-making, and which leadership model predominated in each school, in each school level, and across the district. Table 2 demonstrates how the research design replicated and extended Hill's (2004) previous research.

Research Questions

This study's research questions replicated those used in Hill's (2004) original study and expanded them to include middle and high schools. These questions explored the same sets of beliefs and actions within each school regarding data collection and its use within decision-making:

1. What were the similarities and differences in how the schools used the data they collected?

2. What patterns emerged that indicated how data were used to inform decision-making?

3. What data-based leadership model (compliance, performance, ecological) predominated at each school, at each school level (primary, middle, high), and within the district (K-12)?

Answering these research questions added to the body of research on ecological leadership; examined the similarities and differences of data-based decision-making between school levels (elementary, middle, high); and provided a K-12 analysis of the school district's predominant data-based decision-making leadership model.

TABLE 2. A Comparison of the Differences and Similarities of the Replicated Study

Sites. Hill's (2004) case study (North Carolina): five elementary schools. This study (Oregon): six schools, two elementary, two middle, and two high.

Data Collection. Hill's (2004) case study: documentation unique to schools in Hill's study; archival records unique to schools in Hill's study; interviews; surveys. This study: documentation unique to schools in this study; archival records unique to schools in this study; interviews (same protocol and questions from Hill's study); surveys (same survey with modified answer options unique to schools in Oregon and middle and high school data).

Data Analysis. Hill's (2004) case study: Hill's Conceptual Model of Data-Based Schools; Hill's Typology of the Components in Data-Based Schools; Baker and Richards' theoretical framework of data-based leadership models. This study: Hill's Conceptual Model of Data-Based Schools; Hill's Typology of the Components in Data-Based Schools (modified to include middle and high school data and components unique to Oregon); Baker and Richards' theoretical framework of data-based leadership models.

Data Analysis Comparison. Hill's (2004) case study: differences and similarities between the five elementary schools; data-based leadership model within each school; overall general leadership model in place within the district. This study: differences and similarities between the same school levels and across school levels; data-based leadership model within each school; data-based leadership model within the same school level; overall general leadership model in place within the district.

Setting and Participants

The setting for this case study was Highland School District in an Oregon suburb.
The names of the towns, school district and individual schools within this study were altered to allow for confidentiality. At the time of the study, the Highland School District stretched across two towns and encompassed 42 square miles. The district was a pre-K-12 public school district with approximately 8,000 students. The district was well-supported by its community; its voting population consistently passed local bonds and levies even when surrounding districts were unable to do so. According to the district website, the Highland School District prided itself on progressive and substantive staff-development opportunities for its teaching staff. It was the district's belief that a well-educated staff served its students well. State report cards at the time of the study rated each school as "Strong" or "Exceptional." Data from Scholastic Aptitude Test (SAT) scores showed students in Highland School District as among the highest performers in Oregon. Approximately 90% of all its graduates went on to 2- and 4-year colleges. The town of Oakview, within Highland School District, had a relatively stable population size over the last decade; however, in the last few years it was beginning to show signs of growth as residents sold off family farmland or sectioned off large residential acreage for multihome dwellings. Three of the selected schools in the study (one elementary, one middle and one high) were within the Oakview town boundaries. Each school had a higher socioeconomic population than schools in the neighboring town of Belleview, the second of the district's towns. Each school also had a lower migrant and ELL population than in Belleview. None of the schools in Oakview were Title I schools. Belleview, on the other hand, had been growing rapidly. New subdivisions for homes and multihome dwellings continued to be built as the area continued to be developed within the urban growth boundary. The three schools located in Belleview (one elementary, one middle and one high) were comparable in size to those in the study located in Oakview. Two of the schools in Belleview, the elementary and the middle schools, were Title I schools and qualified for additional resources due to the socioeconomic status of the school community. Examining documents about each school revealed many similar vision themes and academic goals, especially among school levels. However, each school also described its own set of unique challenges or goals, which impacted decisions and plans. Despite these differences, the six schools operated under the same district policies and guidelines for curricular, budgetary and performance accountability expectations. Schools were staffed according to student enrollment; additional resources were provided to Title I schools and those with higher numbers of ELL students. Each school's administrators produced state-mandated School Development Plans (SDPs) describing their school's profile, improvement goals and action plans. The school administrators met on a regular basis, twice a month within school levels and once a month across school levels, to discuss, evaluate and plan district and site-based goals, processes, procedures and staff development. Participants interviewed and surveyed for the study included two central district administrators, six principals, four assistant principals (middle and high schools), two instructional coordinators (elementary schools), and a self-selected, majority sample of certified teaching staff from each school site.
Case Profile 1: Adams Elementary School

Adams Elementary School was one of seven elementary schools located in the town of Belleview in the Highland School District. It covered about 16 square miles in its attendance boundary. Adams opened in 1990 and by the end of the first year, 290 students were enrolled. By 2006-2007, Adams had approximately 590 students enrolled in kindergarten through fifth grade. The growth and transition rate at Adams was noticeable each year. During the 2005-2006 school year, Adams Elementary School identified 180 new students in the school population. This was approximately one third of the student body. By 2006-2007, thirteen percent of the students were English Language Learners. Approximately 23% qualified for free or reduced lunch. Thirty-nine percent lived in nearby apartment dwellings. According to the 2006-2007 Oregon state School Report Card, Adams received an Overall Rating of "Strong" based on academic achievement on state assessments, participation in state assessments, and school attendance. Adams Elementary School had a female principal and a female instructional coordinator. The principal had been leading the school for 11 years. She and the instructional coordinator had been a leadership team for 3 years. Prior to this position, the instructional coordinator was a teacher at Adams for 5 years. The role of the instructional coordinator at Adams (and elsewhere in the district) was that of curriculum coach to teachers, on-site Gifted Education coordinator, on-site ELL coordinator, and administrative support to the principal. The instructional coordinator was under a certified teaching contract and restricted in job description only from any evaluation or supervision of staff. Adams' certified staff included one administrator, one instructional coordinator and 36 certified teachers. The certified teachers represented 23 classroom teachers (K-5), 3 Title I teachers, 1 Speech/Language Pathologist, 1 Librarian, 2 Special Education teachers, 1 Counselor/Psychologist, 2 ELL teachers, 1 PE teacher and 2 Music teachers. One hundred percent of the classroom teachers met the Highly Qualified federal and state expectations. In this study, 22 teachers self-selected to respond to the survey, a response rate of 61% of the total teaching staff of 36. The teachers were informed that participation in the survey was voluntary and confidential. The respondents were the participants representing Adams Elementary School for the purposes of this study. Table 3 displays the years of teaching experience, years of experience in the district, and age of the participants (N = 22). Teachers who did not respond to these questions were not included in the mean index.

TABLE 3. Adams Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 10 teachers; 4-9 years, 6; 10-15 years, 4; 16-20 years, 2; 21-25 years, 0; 26-30 years, 0; 30+ years, 0; no response, 0. N = 22; mean index = 1.91.

Years in district (coded 1-7): 1-3 years, 14 teachers; 4-9 years, 5; 10-15 years, 3; 16-20 years, 0; 21-25 years, 0; 26-30 years, 0; 30+ years, 0; no response, 0. N = 22; mean index = 1.50.

Teacher age (coded 1-5): 21-30, 9 teachers; 31-40, 6; 41-50, 5; 51-60, 0. N = 21; mean index = 1.82.

The participants' average amount of teaching experience was approximately 4-9 years, with the majority of these teachers in the category of 1-3 years. The average period of employment within the district was 4-9 years, with a large majority of these teachers in the category of 1-3 years in the district.
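(For reference, the mean index values reported in Tables 3 through 8 appear to be the category codes averaged across respondents; assuming that interpretation, the Adams years-of-experience row works out to (10 × 1 + 6 × 2 + 4 × 3 + 2 × 4) / 22 = 42 / 22 ≈ 1.91, which matches the reported mean index.)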
The average age of the respondents was 31-40, with the majority of these teachers in the age range of 21-30.

Case Profile 2: Johnson Middle School

Johnson Middle School was one of the three middle schools (Grades 6-8) in the Highland School District. It was also located in the town of Belleview. At the time of the study, Johnson Middle School was 26 years old; it opened its doors in 1980. Two years ago, a $4.5 million remodeling project updated the existing school. Additional classrooms, technology spaces, and a stage were added. During 2006-2007, Johnson had approximately 668 students enrolled in school. Compared to the district at large, Johnson Middle School had a relatively diverse population, both ethnically and economically. Housing in this area was 60% owner-occupied and 40% renter-occupied, including rent-subsidized apartments across the road from the school. Nineteen percent of the student body were minority students; the largest minority group was Hispanic (12% of the student body). A revolving pattern of students transitioning in and out of the school was a documented concern at Johnson Middle School. During 2005-2006, the administrators reported a 24% turnover rate in student enrollment. In 2006-2007, more than 60% of those who enrolled after the September start date were ELL or special education students. According to the 2006-2007 School Report Card, Johnson received an Overall Rating of "Strong" based on academic achievement on state assessments, participation in state assessments, and school attendance. Johnson Middle School had a female principal and a male assistant principal. The principal had been leading the school for 4 years. Prior to this position, she was the assistant principal for a year at the same school. She and the current assistant principal had been a leadership team for 1 year. The current assistant principal was new to the district. Johnson Middle School's staff included two administrators and 48 certified teachers. The certified teachers included 25 classroom subject-area teachers, 5 Music and Arts teachers, 2 PE teachers, 1 school counselor, 1 psychologist, 3 Foreign Language teachers, 4 special education teachers, 2 program class teachers, 1 Librarian, 2 Speech/Language Pathologists, and 2 ELL teachers. Not all of the teachers worked full-time. All of the teachers met the Highly Qualified federal and state expectations. Thirty-four teachers responded to the survey, a response rate of 71% of the total teaching staff of 48. Teachers were informed that participation in the survey was voluntary and confidential. Table 4 displays the years of experience, years of experience in the district, and age of the participants (N = 34). Teachers who did not respond to these questions were not included in the mean index.

TABLE 4. Johnson Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 10 teachers; 4-9 years, 14; 10-15 years, 5; 16-20 years, 3; 21-25 years, 0; 26-30 years, 0. N = 33; mean index = 2.1818.

Years in district (coded 1-7): 1-3 years, 12 teachers; 4-9 years, 9; 10-15 years, 8; 16-20 years, 3; 21-25 years, 0; 26-30 years, 0. N = 33; mean index = 2.1515.

Teacher age (coded 1-5): 21-30, 11 teachers; 31-40, 11; 41-50, 4; 51-60, 5; 61+, 2. N = 33; mean index = 2.2727.

Johnson participants' average amount of teaching experience was approximately 4-9 years, with the majority of these teachers in the category of 4-9 years. The average years of their employment within the district was 4-9 years, with a majority of these teachers in the category of 1-3 years.
The average age of the respondents was 31-40, with the majority of these teachers in the age range of 21-40.

Case Profile 3: Cleveland High School

Cleveland High School was one of the two high schools in Highland School District, not including a small, new charter high school. Cleveland High School was located in the town of Belleview. It opened 10 years ago with 400 students in three grades. By 2003-2004, the high school had increased to 900 students in all four grades, 10th through 12th. In less than a decade, Cleveland High School had earned a reputation as one of the best high schools in Oregon. In 2006-2007, Cleveland had approximately 980 students enrolled in school. Similar to Johnson, its neighboring middle school, Cleveland High School had a more diverse population, both ethnically and economically, than the other schools in the district. According to the 2006-2007 School Report Card, Cleveland received an Overall Rating of "Strong" based on academic achievement on state assessments, participation in state assessments, and school attendance. Cleveland High School had a male principal and a female assistant principal. The school also had a second assistant principal responsible for student life and a full-time Athletic Director who also served as an administrator. The principal had been leading the school for 8 years. He and the current assistant principal had been a leadership team for 5 years. Prior to this position, the assistant principal was a Teacher on Special Assignment at the high school and an instructional coordinator at an elementary school. Cleveland High School's staff included these four administrators, an additional TOSA (Teacher on Special Assignment), and 56 certified teachers. Ninety-seven percent of the certified staff met the criteria of Highly Qualified Teachers. Not all of the teachers were full-time employees. Thirty-one teachers responded to the survey, a response rate of 55% of the total teaching staff of 56. Teachers were informed that participation in the survey was voluntary and confidential. Table 5 displays the years of experience, years of experience in the district, and age of the participants (N = 31). Teachers who did not respond to these questions were not included in the mean index. Cleveland participants' average amount of teaching experience was approximately 10-15 years, with the majority of these teachers in the category of 4-9 years. The average years of their employment within the district was 4-9 years, with a majority of these teachers in the category of 4-9 years. The average age of the respondents was 31-40, with the majority of these teachers in the age range of 31-40.

TABLE 5. Cleveland Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 5 teachers; 4-9 years, 13; 10-15 years, 7; 16-20 years, 3; 21-25 years, 2; 26-30 years, 0; 30+ years, 0. N = 31; mean index = 2.5806.

Years in district (coded 1-7): 1-3 years, 7 teachers; 4-9 years, 13; 10-15 years, 6; 16-20 years, 5; 21-25 years, 0; 26-30 years, 0; 30+ years, 0; no response, 0. N = 31; mean index = 2.903.

Teacher age (coded 1-5): 21-30, 6 teachers; 31-40, 11; 41-50, 9; 51-60, 5; 61+, 0; no response, 0. N = 31; mean index = 2.4194.

Case Profile 4: Taft Elementary School

Taft Elementary School was located in the town of Oakview. Taft was one of five elementary schools in Oakview and was the third oldest school in the district. While it had remained in its current location for over 100 years, it had experienced numerous level changes (middle school to elementary school) and remodels of the building over the years.
Most recently, the school was remodeled to enlarge and update its library and cafeteria due to the growing enrollment. At the time of the study, Taft was the second largest elementary school in the district with approximately 600 students enrolled in kindergarten through fifth grade. During the same year, Taft Elementary had 12% of its student body enrolled in the free/reduced meal program based on socioeconomic income. Almost 10% of its total students were minorities. According to the 2006-2007 School Report Card, Taft Elementary School received an Overall Rating of "Strong" based on academic achievement on state assessments, participation in state assessments, and school attendance. Taft Elementary School had a female principal and a male instructional coordinator. The principal had been leading the school for 12 years. She and the instructional coordinator had been a leadership team for 3 years. Prior to this position, the instructional coordinator was a teacher at Taft for 6 years. Taft's certified staff included one administrator, one instructional coordinator, and 34 certified teachers. The certified teachers included 24 classroom teachers (K-5), 1 Speech/Language Pathologist, 1 Librarian, 3 Special Education teachers, 1 Counselor, 1 Psychologist, 1 ELL teacher, 1 PE teacher and 1 Music teacher. One hundred percent of the classroom teachers met the Highly Qualified federal and state expectations. Twenty-three teachers responded to the survey, a response rate of 68% of the total teaching staff of 34. Teachers were informed that participation in the survey was voluntary and confidential. Table 6 displays the years of experience, years of experience in the district, and age of the participants (N = 23). Teachers who did not respond to these questions were not included in the mean index. Taft participants' average amount of teaching experience was approximately 4-9 years, with the majority of these teachers in the category of 4-9 years. The average years of their employment within the district was 4-9 years. The average age of the respondents was 31-40, with the majority of these teachers in the age range of 21-30.

TABLE 6. Taft Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 7 teachers; 4-9 years, 10; 10-15 years, 2; 16-20 years, 2; 21-25 years, 0; 26-30 years, 0. N = 23; mean index = 2.2609.

Years in district (coded 1-7): 1-3 years, 9 teachers; 4-9 years, 9; 10-15 years, 4; 16-20 years, 0; 21-25 years, 0; 26-30 years, 0; 30+ years, 0. N = 23; mean index = 1.8696.

Teacher age (coded 1-5): 21-30, 9 teachers; 31-40, 5; 41-50, 2; 51-60, 7; 61+, 0; no response, 0. N = 23; mean index = 2.3043.

Case Profile 5: Roosevelt Middle School

Roosevelt Middle School was one of two middle schools in the Highland School District located in the town of Oakview. Roosevelt Middle School opened its new building in a new location in 1999. Prior to this move, it was located in a smaller building that now serves as a renovated, early-childhood elementary school. Roosevelt's updated interior building design created a large common space and five smaller "houses" of grade-level classrooms. During 2006-2007, Roosevelt had approximately 660 students enrolled in school. At the time of the study, Roosevelt Middle School had a 13% minority population; the largest minority group was Asian (6% of the student body). Eight percent of the total student body was enrolled in the free/reduced lunch program based on socioeconomic income.
According to the 2006-2007 School Report Card, Roosevelt received an Overall Rating of "Strong" based on academic achievement on state assessments, participation in state assessments, and school attendance. Roosevelt had a female principal and a male assistant principal. This was the principal's first year in this position. Prior to this position, she was the assistant principal for 3 years at the same school. This was also the assistant principal's first year in his new position. Prior to this role, he was an assistant principal at Hoover High School in the same district. Roosevelt Middle School's staff included two administrators and 41 certified teachers. The certified teachers included 24 classroom subject-area teachers, 4 Music and Arts teachers, 2 PE teachers, 1 school counselor, 1 psychologist, 3 Foreign Language teachers, 3 special education teachers, 1 program class teacher, 1 Librarian, 1 Speech/Language Pathologist, and no ELL teachers. Ninety-six percent of the teaching staff met the Highly Qualified federal and state expectations. Not all of the teachers worked full-time at the school. Twenty-three teachers responded to the survey, a response rate of 56% of the total teaching staff of 41. Teachers were informed that participation in the survey was voluntary and confidential. Table 7 displays the years of experience, years of experience in the district, and age of the respondents (N = 23). Teachers who did not respond to these questions were not included in the mean index.

TABLE 7. Roosevelt Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 8 teachers; 4-9 years, 7; 10-15 years, 3; 16-20 years, 2; 21-25 years, 2; 26-30 years, 0; 30+ years, 0. N = 23; mean index = 2.3913.

Years in district (coded 1-7): 1-3 years, 10 teachers; 4-9 years, 8; 10-15 years, 2; 16-20 years, 2; 21-25 years, 0; 26-30 years, 0. N = 23; mean index = 2.0435.

Teacher age (coded 1-5): 21-30, 3 teachers; 31-40, 5; 41-50, 10; 51-60, 4; 61+, 0. N = 23; mean index = 2.7826.

Roosevelt participants' average amount of teaching experience was approximately 4-9 years, with the majority of these teachers in the category of 1-3 years. The average years of their employment within the district was 4-9 years, with a majority of these teachers in the category of 1-3 years. The average age of the respondents was 41-50, with the majority of these teachers in the age range of 41-50.

Case Profile 6: Hoover High School

Hoover High School was the larger of the two high schools in Highland School District and was located in Oakview. It was also one of the oldest schools in the district, dating back to the 1920s. The current building had received many facelifts. As it continued to increase in size, additions and remodels to the building were required. The previous year, the school opened a new common area, new performing arts center, and additional classroom space. During 2006-2007, Hoover had approximately 1,530 students enrolled in school. According to the 2006-2007 School Report Card, Hoover received an Overall Rating of "Exceptional" based on academic achievement on state assessments, participation in state assessments, and school attendance. Hoover High School had a female principal and a female assistant principal. The school had another full-time assistant principal of student services and a full-time Athletic Director who also served as an administrator. The principal had been leading the school for 8 years. Prior to this role, she was the principal of a middle school in the district. She and the current assistant principal had been a leadership team for 7 years.
Prior to her current position, the assistant principal was a Teacher on Special Assignment at the same high school. Hoover High School's staff included the four administrators, one additional TOSA (Teacher on Special Assignment), and 82 certified teachers. Ninety-seven percent of the certified staff met the federal and state criteria as Highly Qualified Teachers. Forty-eight teachers responded to the survey, a response rate of 58% of the total teaching staff of 83. Teachers were informed that participation in the survey was voluntary and confidential. Table 8 displays the years of experience, years of experience in the district, and age of the respondents (N = 48). Teachers who did not respond to these questions were not included in the mean index.

TABLE 8. Hoover Teacher Characteristics

Years of experience (coded 1-7): 1-3 years, 12 teachers; 4-9 years, 12; 10-15 years, 13; 16-20 years, 4; 21-25 years, 3; 26-30 years, 3; 30+ years, 0. N = 48; mean index = 2.7708.

Years in district (coded 1-7): 1-3 years, 12 teachers; 4-9 years, 14; 10-15 years, 10; 16-20 years, 5; 21-25 years, 3; 26-30 years, 2; 30+ years, 2. N = 47; mean index = 2.6383.

Teacher age (coded 1-5): 21-30, 12 teachers; 31-40, 13; 41-50, 11; 51-60, 11; 61+, 0. N = 48; mean index = 2.5000.

Hoover participants' average amount of teaching experience was approximately 10-15 years, with the slight majority of these teachers in the category of 10-15 years. The average years of their employment within the district was 10-15 years, with a majority of these teachers in the category of 4-9 years. The average age of the respondents was 31-40, with the slight majority of these teachers in the age range of 31-40.

District Site: Highland School District Central Administration

There were three administrators in the central administration department: one superintendent and two assistant superintendents. The current superintendent of Highland School District had been in this position for 14 years. Prior to leading Highland, he had worked in Washington as a superintendent. The central administration team included two assistant superintendents. One of the assistant superintendents helped to oversee the middle and high schools. He was new to this position this year, after serving as a principal and instructional coordinator in the district for 14 years. The second assistant superintendent had been in the district for a combined 30 years, as teacher, instructional coordinator, principal and assistant superintendent. She helped to oversee the elementary schools as well as the Curriculum and Instruction across all levels. This study included interviews with the superintendent and the second assistant superintendent.

Comparison of the Six Sites

A comparison of participants from each school site was conducted. The teachers were compared by average age, years of teaching, and years of employment in the district. Table 9 displays the results of the comparison.

TABLE 9. Demographic Comparison of Teacher Experience and Age (N = 180)

Adams Elementary (n = 22): average age 31-40; average years of experience at current level 4-9; average years in district 4-9.
Taft Elementary (n = 23): average age 31-40; average years of experience at current level 4-9; average years in district 4-9.
Johnson Middle (n = 33): average age 31-40; average years of experience at current level 4-9; average years in district 4-9.
Roosevelt Middle (n = 23): average age 41-50; average years of experience at current level 4-9; average years in district 4-9.
Cleveland High (n = 31): average age 31-40; average years of experience at current level 10-15; average years in district 4-9.
Hoover High (n = 48): average age 31-40; average years of experience at current level 10-15; average years in district 10-15.

Table 9 demonstrates that participants in the study from both elementary schools had comparable characteristics. They shared an average age of 31-40 years and teaching experience of 4-9 years.
The average of their years of employment in the district was also similar. The middle school participants at each school differed slightly only in their average age. Roosevelt's teaching staff had an older average age than Johnson did: 41-50 years and 31-40 years, respectively. Years of teaching experience and employment in the district were similar. The participants from the two high schools were also comparable in age and years of teaching experience. Hoover High School, however, had more participants who had taught in the district longer compared to teachers at Cleveland. All four elementary and middle schools' participants had comparable years of teaching experience. Both high schools had staff with more years of teaching experience on average than at the other two levels. Each of the schools was also compared in regard to location, size (student enrollment), percentage of ethnic groups, English Language Learners (ELL), and Socioeconomic Status (SES) based on qualifications for free and reduced lunch and Title I status. Table 10 summarizes the findings. The information was based on each school's 2006-2007 Oregon state report card, SchoolMaster database and School Development Plans (SDPs).

TABLE 10. Comparison of Six Schools in Highland School District (enrollment; percentage White, with the remaining ethnicity percentages as reported across the Black, Asian, Hispanic, and Other* categories; percentage ELL; percentage free or reduced lunch; Title I status)

Adams Elementary (Belleview): 590 students; 79% White (4, 13, 3); 12.6% ELL; 28% free or reduced lunch; Title I.
Taft Elementary (Oakview): 600 students; 88% White (6, 3, 2); 4.4% ELL; 14% free or reduced lunch; not Title I.
Johnson Middle (Belleview): 670 students; 77% White (4, 16, 2); 10.5% ELL; 27% free or reduced lunch; Title I.
Roosevelt Middle (Oakview): 660 students; 87% White (7, 2, 3); 2.0% ELL; 9% free or reduced lunch; not Title I.
Cleveland High (Belleview): 980 students; 84% White (3, 10, 2); 7.9% ELL; 17% free or reduced lunch; not Title I.
Hoover High (Oakview): 1,530 students; 91% White (4, 2, 2); 6.0% ELL; 5% free or reduced lunch; not Title I.
*Other indicated as Multiracial, Native American, or Not Provided.

The sizes of the elementary and middle schools were comparable. Enrollment at the two high schools differed; Cleveland High School was about two thirds the size of Hoover High School. Each school had a large majority of White students (between 77% and 91%). The English Language Learner (ELL) population was approximately one tenth of the total population at four of the schools. At Adams Elementary it was slightly higher (12.6%). At Hoover High School the ELL population was reported to be less than 1% of the school's entire enrollment. Regarding SES, there was some variability between the schools. The two schools with the higher percentages of students receiving free and reduced lunch, Adams and Johnson, were also both Title I schools. This meant they received federal funding for programs and instructional support. Cleveland and Hoover also differed in the proportion of their student population who received free and reduced lunch (17% and 5%, respectively). Cleveland High School, however, did not qualify as a Title I school in the district. The three schools in the study with the highest percentages of students qualifying for free and reduced lunch were also located in Belleview. These demographic differences, while not overwhelming, were notable and were taken into consideration during the study and addressed in the results.

Procedures

Since this was a replicated study, the same data-collection process and instruments used in Hill's (2004) study were implemented. Miles and Huberman (1994) note that "using the same instruments as in prior studies is the only way [to] converse across studies.... We need common instruments to build theory, to improve explanations or predictions, and to make recommendations about practice" (p. 35). Modest modifications to the survey were made to include middle and high school data sources.
These modifications were reviewed by two administrators (not within the study) to verify data sources collected at these levels; adjustments were made based on recommendations prior to the study. Data Collection Regarding data collection, Yin (2003) has outlined three principles that are important in conducting case studies. These include the use of multiple sources of evidence, establishing a case study database, and maintaining a chain of evidence. For this study, all three principles were applied. Documents and Archival Records The documents and archival records collected provided a description of each school's demographics, school culture, student achievement, school goals, processes and procedures. These data were gathered from the district's and schools' websites, the 71 district information database (SchoolMaster), each school's state Report Card, and school development plans. School Development Plans (SDP) are documents written by principals in conjunction with a school's Site Council. A Site Council is a representative group (about 8-15 members) of parents, teachers, classified employees and administrators. The role of a Site Council is to help write, monitor and evaluate the school development plan. The state of Oregon has specific criteria for data that are to be included in an SDP. Data from SDPs are entered into a combined Consolidated District Improvement Plan. Since each of these documents is a mandated, public document, they do have necessary and similar core elements. These elements include (a) demographic information, (b) reports of progress, (c) analysis of data, (d) new goals and action plans, and (d) staff development. Within these parameters, the Highland School District gives each school latitude in personalizing the SDP according to each school's unique community, goals and aspirations. No template or forms are given. See Appendix A for a copy of the Highland School District SDP guidelines. The result is a unique document in layout, length, narrative, and goals written by each school with similar themes and district initiatives woven throughout. In this study, the participant schools' SDPs were collected and used to look for evidence of data-based decision-making indicators. These were entered into each school's typology. Oregon school Report Cards are annual documents issued by the state providing a school's general demographics and student achievement data. The information from these multiple sources was included in each school's typology to identify indicators and patterns of data use. These data sources 72 provided information on each school's leadership models for using data, data processes and the impacts of data. Interviews The interview questions that were used for this study were taken from Hill's (2004) Administrator Unstructured Interview Protocol (see Appendix B). The design of the interview questions were categorized to provide information regarding (a) Leadership, (b) Data Sources, (c) Processes, and (d) Impacts on decision-making. The categories and questions aligned with the components of Hill's typology. The interview data were used to support or clarify other information gathered in the surveys, documents and archival records. Interviewees included the superintendent, assistant superintendent, six principals, four assistant principals and two instructional coordinators from the six sites. All of the interviews were taped and transcribed by the researcher. 
Responses were kept confidential by giving each interviewee a code-H (Highland), P (Principal/Vice-Principal/Instructional Coordinator) or S (Superintendent)-followed by a number. Quotations used in this study adhered to this code. Surveys The survey used in this study was developed by Hill (2004) and was also aligned with the research questions and the typology (see Appendix C). Hill's survey was 73 piloted in three elementary schools in two neighboring school districts prior to its use within Hill's study. Information from teacher reactions during the piloting stage was used to refine the instrument. For the purpose of this study, modest modifications were made to the survey. These included (a) changing the district and state assessment data to reflect those unique to the district and state of Oregon; (b) adding more data source items pertinent to middle and high schools as options for teachers to circle; and (c) transferring the survey to an electronic, secure web-based format for easy access, anonymity options and downloadable data retrieval. These modifications allowed the survey to be used across the K-12 spectrum and with an efficient access-retrieval format. The modifications to the content of the survey were reviewed for accuracy by two administrators outside of the study. Slight changes were made based on feedback received from the consulted administrators. The electronic version of the survey was tested by two volunteers (outside of the study) for ease of user access and completion. The electronic survey website was E-mailed to the teaching staff at each of the six schools. Case Study Database Yin's (2003) second principle for a strong case study design was the creation of a database. In this study, a database was set up for each of the six schools. Data collected from the documents and archival records were kept for each school as hard copies as well as entered electronically within each school's typology. And wherever possible, full 74 documents were collected from school development plans. Raw data from the surveys were downloaded into spreadsheets for statistical analysis and review. Transcripts from the interviews were typed verbatim and in full. All of these items were maintained in an electronic or hard copy filing-system database. The database included each school's data-based decision-making profile using Hill's typology. Maintaining a Chain of Evidence Maintaining a chain of evidence is often recommended to increase a study's reliability. It also allows for an external observer to follow the derivation of any evidence and follow the steps throughout the research process. Since this study was a replicated research design, there was already a well-documented research design and process established and outlined. Replication of a study already strengthens its reliability (Creswell, 2003; Yin, 2003). Nevertheless, for this study, a similar chain of evidence was maintained. Careful protocols, procedures and documentation of the study's process were followed. Utilizing Hill's Typology Hill's (2004) review of the literature revealed a need for a set of indicators to determine elements of a data-based school. According to Hill, "the research revealed that data-based decision-making is present in schools in various forms, different levels, and to varying degrees within the organization" (p. 73). In order to identify the forms, 75 levels and degrees, a typology was designed. 
This study used Hill's typology to examine the components of each data-based school: its Sources, Leadership, Processes, and Impacts on decision-making. Hill explains,

The typology provides an overview of how each component may be measured, the possible range of the component's existence and measurability, and the source or sources of information where evidence of the component may be located and studied in detail. This typology is the basis for analysis and comparison of the schools. (p. 95)

Table 11 demonstrates how Hill (2004) aligned the components of the typology with the data collected at each school.

TABLE 11. Components of Data-Based Schools Aligned With Data-Collection Sources, Hill (2004)

In Hill's table, each component below was aligned with the data-collection sources (documentation and archival records, interview, survey) from which evidence of that component was drawn.

Sources: demographics; perceptions; internal processes; student learning outcomes
Leadership: leadership; state and district support
Processes: process for decision-making; process for communication and dissemination of information; process for building and increasing organizational capacity; professional development process in data use
Impacts on decision-making: active interaction of sources and processes; individual and organizational learning occur; individuals, subgroups, and organization are strengthened through participation in the data-based process

Hill's (2004) typology included four sections:

1. Sources (variables), which include the following categories: (a) demographics, (b) perceptions, (c) school processes, and (d) student learning. These categories were gleaned from the literature suggesting that data-based schools include these four critical sources of data.

2. Leadership, which includes the following categories: (a) Mental Models and (b) Leadership Models. It is important to analyze both the school's mental model (how the school perceives and applies its data use in the decision-making process) and the actual data-based leadership model in practice within each school. Hill (2004) also included within this category seven literature-based Leadership Components in data-based schools, to be measured by the report of the participants within the study. The Leadership Components include (a) accountability standards, (b) communication, (c) dissemination of data, (d) local/district/state support, (e) principal support and advocacy, and (f) aligned curriculum and assessment.

3. Processes: This section analyzed each school along process categories ranging from the process for data collection to whether there was a process for building and increasing organizational capacity.

4. Impacts: As Hill (2004) states,

The ultimate outcome for any data-driven process is the impact that the process has on aspects or parts of the organization or on the organization as a whole. Schools are no exception, and when data play a vital role in the development of the school's work in part or in whole, the school gains new insights into itself, its effectiveness, and its direction for change. (p. 69)

This section of the typology measured whether data sources and participants noted impacts on decision-making within the organization. For the purpose of this study, Hill's (2004) typology was expanded to include indicators pertaining to middle and high schools and indicators unique to the state of Oregon. These adaptations did not significantly alter or misrepresent the purpose of the typology; rather, they added opportunities for more data collection to be analyzed and for the typology to be relevant to the region of this study.
This revised typology (see Appendix D) was used in the study.

Validity and Reliability

The quality of a social empirical research design such as this one can be judged on four commonly used tests: construct validity, internal validity, external validity, and reliability (Yin, 2003). The quality of this study was aligned with these same four tests of validity and reliability. According to Yin (2003), construct validity, which is often the most criticized element of case study research, involves "establishing correct operational measures for the concepts being studied" (pp. 34-35). Creswell (2003) described validity as a strength of case study design yet cautioned that the "credibility" or "authenticity" of findings can be questioned by the reader if measures are not taken to check the accuracy of findings. There are many strategies for doing so. Creswell (2003) and Yin (2003) recommended the use of multiple sources of evidence, establishing a chain of evidence, and having a peer debriefer to enhance accuracy. In this study, the first two of these tactics were employed.

In a case study, internal validity (establishing causal relationships) may be problematic because the researcher is making inferences (Creswell, 2003; Yin, 2003). For example, the researcher may "infer" that a particular event resulted from an earlier occurrence based on responses to interviews or documentation from a survey or records. Anticipating these inferences, a researcher may ask: Is the inference correct? Have all the rival explanations and possibilities been considered? Is the evidence convergent? In this study, the tactic used to address internal validity was pattern matching. Data collected from multiple sources were used to match themes and responses to indicators of Baker and Richards' (2004) ecological leadership continuum (compliance-performance-ecological) within Hill's (2004) model (see Figure 3).

External validity is often understood to address whether a study's findings can be generalized beyond the immediate study (Creswell, 2003; Yin, 2003). That framing does not fit case studies, which rely instead on analytical generalization: the investigator strives to generalize a particular set of results to a broader theory (Yin, 2003). Generalization is not automatic, but is tested by replicating the findings in repeated, different case studies. This multiple, embedded case study research design (replicating an earlier study) did follow a "replication logic," thereby strengthening its external validity.

Reliability simply asks whether a later investigator could follow the same procedures, conduct the same case study all over again, and arrive at the same findings. The goal is to minimize errors and biases (Creswell, 2003; Yin, 2003). Since this was a replicated study, care and attention to the same protocols and procedures from Hill's (2004) study were taken throughout the data-collection process. In addition, a case study database was maintained throughout the process.

Data Analysis Methods

Surveys

To gather further information about each school, surveys were sent to every certified teacher at each school site. Participation in the study was voluntary and a majority of teachers from each school responded to the study. The data from these sample pools were analyzed. Descriptive statistics from the survey included demographic percentages of respondents' characteristics.
A questionnaire portion of the survey, Questions 1-4, provided information about each teacher's age, teaching experience, and years in the district for comparison of respondents from each site. Questions 5 and 6 asked for a description of any course or seminar on data use and interpretation. Questions 14 and 15 asked respondents to identify all of the types of data sources that the school and the respondent used to make decisions about instruction and school improvement. Question 16 asked respondents to identify the number of staff members regularly involved in decision-making in the school. Question 33 was an open-ended essay response about the impacts of data-based decision-making observed at the site by the respondent. The remaining questions (7-13 and 16-32) used a Likert scale, with responses of "agree," "disagree," or "undecided," within the categories of Sources, Leadership, Processes, and Impacts on Decision-Making. Responses were totaled as positive and negative according to the coding of the survey, and undecided responses were eliminated. A mean index was calculated for the questions within each specified category by totaling the positive responses and dividing by the number of items. Chi-square tests were run on each question (after eliminating the undecided responses) to compare the six schools across different components. An index of the six schools was created for each area (Leadership, Processes, and Impacts) by determining the mean and standard deviation of the schools. An ANOVA (analysis of variance) was performed on each categorical index.

Documents and Archival Records

The data collected from the documents and archival records were entered into the typology of each school. Primary documents included each school's School Development Plans, which contained demographics on each school, reports on previous years' goals, and descriptions of current goals and action plans. (See Appendix A for the district's expectations for a school development plan.)

Interviews

Each interview was conducted face to face and tape recorded by the researcher. Questions were standard for each interview and followed a protocol. Each interview lasted about 40-60 minutes. The interview responses were transcribed verbatim. The questions in the interview were designed to gather information from each administrator about the school or district's mental model of data use, sources of data use, processes for data-based decision-making, and impacts on decision-making. Themes and patterns were noted and entered into each school's typology.

Data analysis included transcribing, coding, and sorting through each of the sources of data gathered: surveys, interviews, archival records, and documents. These data were entered into databases created for each school site. The data were analyzed at each school level and the findings were recorded. Data analysis followed the same procedures Hill (2004) used with these instruments and protocols. A cross-case analysis of all six schools and across each school level (elementary, middle, high) was performed, and the similarities, differences and patterns were noted. On the basis of individual school analyses and emerging patterns, each individual school was placed along the leadership model continuum (compliance-performance-ecological). An overall determination of the district's placement on the continuum was also made based on the similarities across the schools.
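The survey scoring and statistical comparisons described above can be illustrated with a brief computational sketch. The Python fragment below is not part of the study's instrumentation; the response counts are hypothetical and the scipy library is assumed. It shows how undecided responses are dropped, how a positive-response mean index is formed for a category, and how a chi-square test and a one-way ANOVA compare the six schools. Because agree/disagree counts across six schools form a 6 x 2 table, the chi-square test has 5 degrees of freedom.

```python
# Illustrative sketch only: hypothetical response counts, not the study's data.
from scipy.stats import chi2_contingency, f_oneway

SCHOOLS = ["Taft", "Adams", "Roosevelt", "Johnson", "Hoover", "Cleveland"]

# Hypothetical (agree, disagree, undecided) counts for three questions
# within one category, keyed by school pseudonym.
likert = {
    "Taft":      [(15, 5, 3), (10, 9, 4), (14, 6, 3)],
    "Adams":     [(16, 4, 2), (13, 6, 3), (14, 5, 3)],
    "Roosevelt": [(22, 1, 0), (19, 3, 1), (17, 4, 2)],
    "Johnson":   [(14, 16, 4), (11, 18, 5), (13, 15, 6)],
    "Hoover":    [(26, 18, 4), (17, 25, 6), (28, 16, 4)],
    "Cleveland": [(16, 12, 3), (14, 13, 4), (13, 14, 4)],
}

def positive_proportions(questions):
    """Per-question proportion of 'agree' after undecided responses are dropped."""
    return [a / (a + d) for a, d, _undecided in questions if (a + d) > 0]

def mean_index(questions):
    """Category index: total of the positive proportions divided by the item count."""
    props = positive_proportions(questions)
    return sum(props) / len(props)

for school in SCHOOLS:
    print(f"{school:10s} category index = {mean_index(likert[school]):.2f}")

# Chi-square on one question: agree/disagree counts across the six schools
# form a 6 x 2 table, so the test has 5 degrees of freedom.
table = [[likert[s][0][0], likert[s][0][1]] for s in SCHOOLS]
stat, p_value, dof, _expected = chi2_contingency(table)
print(f"chi-square = {stat:.2f}, df = {dof}, p = {p_value:.3f}")

# One-way ANOVA comparing the six schools on the per-question proportions
# that feed each school's categorical index.
f_stat, p_anova = f_oneway(*(positive_proportions(likert[s]) for s in SCHOOLS))
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
```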
82 Strengths and Limitations of Design A fundamental strength of this research design was the number of schools being used in the study (N = 6). A multiple-case design is often regarded as more robust than a single-case design (Creswell, 2003; Yin, 2003). Using multiple cases enables a theoretical "replication" study to take place. This means that each case study is predicted to have similar or contrasting results but for predictable reasons (Yin, 2003, p. 47). This research design included both quantitative and qualitative research methods, and triangulation of data, which also strengthened the design. Replicating and extending Hill's (2004) study strengthened the earlier work and further validated the instruments (typology, survey, interview protocol) as well as Baker and Richards' (2004) theory of data-based leadership models. Limitations included the researcher as an elementary school administrator in the district being studied. Although the researcher's school was not considered in the study, the primary collegial relationship could have influenced the teacher surveys and administrator interview responses. Control for these threats included assurances of anonymity in the surveys and the use of pseudonyms for the schools. "Negative-case sampling" method was employed to reduce researcher bias. The only other case study of this nature was Hill's (2004) study and the results of the study indicated that the schools were between compliance and performance. The results, analysis and findings of the data from that study were compared to the results in the current study. By doing so, the researcher could carefully check that the results in the current study were 83 different than those in Hill's. A professional statistician was employed to check the researcher's statistical accuracy. Statistical generalization of the study is a limitation, as this study was conducted in a suburban, high-performing school district. However, schools in urban settings or with low-performing standardized assessment scores would still be able to use elements of the study for their own school and district evaluation. Analytical generalization of the data-based leadership models can be applicable. Confidentiality In order to strengthen the study's validity and honor the request of the district chosen for the study, measures were taken to ensure confidentiality during the interviews and anonymity with the surveys. The names of the district and the schools were changed. Individuals interviewed were not identified. All documents and archival data used in the study are in the public domain; school names were changed. No student-specific data were used in the study. 84 CHAPTER IV RESULTS The purpose of this study was to explore data-based decision-making in six Oregon schools within the same district. This study replicated and expanded Hill's (2004) earlier investigation of elementary schools to include middle and high schools, allowing for comparison within levels and across school levels (elementary, middle, high) and also allowing for an overall K-12 investigation. A typology of indicators was designed to identify data sources, leadership behaviors, processes and impacts on decision-making. Documents, archival records, surveys and interviews were collected at each school site in order to explore the extent to which school personnel utilized data to make decisions. 
In this chapter, results of both the quantitative (surveys) and qualitative (interviews, documents) investigations of each case study school are reported, addressing the first two of the three research questions:

1. What were the similarities and differences in how the schools use the data they collect?

2. What patterns emerged that indicate how data were used to inform decision-making?

3. What data-based leadership model (compliance, performance, ecological) predominated at each school, at each school level (elementary, middle, high), and across the district?

Answering the first research question required a determination of (a) the kinds of data used in each school, (b) the leadership behaviors in making decisions, (c) the processes in place at each school for decision-making, and (d) the impacts from data use that the teachers and administrators reported. The second research question emerged from the investigation of the first research question. As the typologies of each school were developed, patterns emerged indicating how data were used at each school to inform decisions. The data results from addressing the first two questions provided the data for the analysis outlined in Chapter V and the information to answer the final research question.

Data Collection

To maintain the integrity of a replicated study, this research used the same research questions and followed the same data-collection processes and protocols as those found in Hill (2004). Variance from the earlier model included an expansion of the data collection to include middle and high schools and an adaptation of the survey to include data sources unique to the state of Oregon. Neither adaptation changed the essential model of the earlier study. The intent of this replicated study was to explore the same research questions within a different geographical location and to add to the body of literature regarding an ecological leadership model for data-based decision-making.

The quantitative component of the study was conducted through surveys of teachers at each of the six schools. The schools were purposefully selected to represent each level (elementary, middle, high). They were comparable in size. One school at each school level was also located in each of the two towns comprising the district (Belleview and Oakview). This made it possible to examine whether geographical location within the district contributed to any data-based decisions by members of the leadership team or the teaching staff. All six principals of the six schools agreed to participate in the research study. Time was provided by each principal for the researcher to meet with each school's teaching staff at a faculty meeting. The survey protocol was uniformly explained at each school site and teachers were assured that their participation was voluntary and confidential.

The survey was electronically designed using SurveyMonkey, a web-based, secure survey software company. The website link was E-mailed to each certified teacher at each school site. The final response rate was dependent on teachers reading the E-mail and deciding to participate in the study by clicking on the link and taking the survey. The response rate at each school site was over 50%, indicating that at least a majority of the teaching staff at each school provided a representative sample for each school unit. When reference is made to the total number of surveys in this study, N = 181.
In individual school case studies, it should be noted that N = the total number of respondents in that particular school unit of analysis. An open-ended question concluded the survey given to the teachers. These answers were kept confidential and coded for each respondent: H (Highland), T (Teacher), followed by a number. Quotations used within the study adhered to this code. Table 12 shows the distribution of the survey participants from all six schools.

TABLE 12. Respondents to the Survey in the Six Highland Schools (N = 181)

School          K-5    6-8    9-12   Specialists*   Total respondents (N)   Respondents (%)
Taft ES          21      -       -        2                  23                   68
Adams ES         17      -       -        5                  22                   61
Roosevelt MS      -     22       -        1                  23                   56
Johnson MS        -     32       -        2                  34                   71
Hoover HS         -      -      42        6                  48                   58
Cleveland HS      -      -      23        8                  31                   55

*Specialists include ELL, Special Education, Title I Reading, Counselor, and Psychologist.

Survey Questions 1 through 4 gathered brief demographic data about the participants, such as their current level or area of instruction, years of experience in teaching and in the district, and their age range. Survey Questions 5 and 6 asked participants if they had taken any coursework in statistics or data use and interpretation. The purpose of this question was to address the potential impact that a teacher's knowledge or understanding of data use could have on their answers. As noted in Table 13, the number of teachers who took coursework in data use was low: only about one third of the participants from each school.

TABLE 13. Postcertification Coursework in Data Use or Statistics

                                          Taft ES    Adams ES   Roosevelt MS   Johnson MS   Hoover HS   Cleveland HS
                                          (N = 23)   (N = 22)   (N = 23)       (N = 34)     (N = 48)    (N = 31)
Number of teachers (percentage)           9 (39%)    7 (32%)    8 (35%)        12 (35%)     13 (27%)    12 (39%)
Coursework taken at a higher
  education location                      8          5          8              12           13          12
Coursework taken at a
  school/district location                1          2          0              0            0           0

Taft Elementary and Cleveland High Schools had the highest percentage of participant teachers who had taken data-related coursework (39% at both schools). Approximately the same percentage of participant teachers at Adams Elementary School (32%), Roosevelt Middle School (35%), and Johnson Middle School (35%) had data-related coursework in their backgrounds. Hoover High School had the lowest proportion: about one fourth of the participants (27%). Most respondents indicated that their postcertification coursework occurred at a higher education location. Three respondents at the elementary schools indicated that their coursework was provided at their school, as arranged by the principal or district administration. The respondent at Taft Elementary indicated that the course was offered by her principal at a previous district in California. The two respondents at Adams Elementary indicated that district-led seminars in teaching strategies for ELL students and analyzing reading assessment data presented opportunities for understanding data development and usage.

Research Question 1

Research Question 1 asked, "What are the similarities and differences in how the schools use the data they collect?" Each school site was examined for similarities and differences in how they used the data they collected. The typology (as devised by Hill, 2004) categorizes information collected into data Sources, Leadership for data-based decision-making, data Processes, and data Impacts on decision-making. Similarities and differences were reported within these categories.
Data Sources In the surveys, teachers were asked to identify which Sources of data and information their school used to make decisions about (a) school improvement and (b) classroom instruction. The Sources of data were classified as demographic, perception, and student learning data. School Improvement The summary of the results for data used for school improvement is shown in Table 14. Percentages for each school were computed only for those sources of data used by at least half of the survey participants, with 50% being considered the point of "critical mass" for the purposes of this study. This mathematical protocol was used by Hill (2004) and was also used for the present study. TABLE 14. School Data Use for School Improvement Taft Adams Roosevelt Johnson Hoover Cleveland (N = 23) (N= 22) (N= 23) (N= 34) (N= 48) (N= 31) Total % N= 181 Rank Category Type of Data n % n % n % n % n % n % 4 Demographic Enrollment 12 .52 15 .68 15 .68 18 .53 33 .69 19 .61 .56 Demographic Attendance 16 .70 18 .82 19 .83 24 .71 37 .77 23 .74 .77 Demographic Ethnicity 11 .50 Demographic Gender Demographic Freelreduced lunch (SES) 19 .86 Demographic Gifted and talented 14 .61 16 .73 13 .57 2 Demographic Special education 15 .65 18 .82 18 .78 22 .65 27 .56 .64 6 Demographic Discipline referrals/action 19 .86 20 .87 26 .54 .52 Demographic Graduation rate/drop out rate Demographic Number/types of courses 29 .60 offered Demographic English Language 21 .95 20 .59 Learners Demographic Mobility rate \0 o TABLE 14 (Continued) Taft Adams Roosevelt Johnson Hoover Cleveland (N= 23) (N=22) (N= 23) (N= 34) (N= 48) (N= 31) Total % N= 181 Rank Category Type of Data n % n % n % n % n % n % Perception Parent surveys 12 .55 17 .74 30 .63 Perception Student surveys 16 .70 28 .58 5 Perception Staff surveys 19 .83 12 .55 17 .74 32 .67 .54 Perception Community surveys 3 Student Oregon state assessments 16 .70 17 .77 16 .70 23 .68 .60 Learning Student CIM/CAM data 32 .67 21 .68 Learning Student Standardized assessments 18 .82 Learning Student Analysis of student work 17 .74 17 .77 16 .70 Learning Student Scored student work 13 .57 19 .86 Learning samples Student Teacher-made assessments 14 .61 17 .77 15 .65 Learning '..0 ........ Rank Category Student Learning Student Learning Student Learning Student Learning Type of Data Teacher observations Summer School enrollment Transcripts Student GPAs TABLE 14. (Continued) - Taft Adams Roosevelt Johnson Hoover Cleveland (N= 23) (N= 22) (N= 23) (N= 34) (N= 48) (N= 31) Total % N= 181 n % n % n % n % n % n % 15 .65 17 .77 16 .70 \0 N 93 The results in Table 14 indicate that teachers at Adams Elementary used 16 different types of data for school improvement. Teachers at each of the other five schools reported fewer types of data usage: Roosevelt Middle (12); Taft Elementary (10); Hoover High (9); Johnson Middle (5); and Cleveland High (3). The teachers at the two middle schools reported the greatest variance of data use: Roosevelt (12) vs. Johnson (5). The high school teachers at both schools used the fewest data sources: Hoover (9) and Cleveland (3). Teachers at all six of the schools reported use of enrollment and attendance data. The teachers at the elementary and middle schools reported use of the Oregon state assessments, which were given in Grades 3-8. The teachers at the high schools reported use of the CIM/CAM (Certificate of Initial Mastery/Certificate of Advanced Mastery) results, which were standardized tests at the high school level. 
Only the teachers at the elementary schools reported use of scored student work samples as data sources. Hoover High was the only school where teachers reported number/types of courses offered as a data source used for school improvement. Adams Elementary was the only school where teachers reported using ethnicity, SES (socioeconomic status, calculated by the percentage of students receiving free/reduced lunch), and other standardized assessment data sources for school improvement. Adams Elementary and Johnson Middle were the only two schools whose teachers reported using ELL (English Language Learner) data. Teachers at Roosevelt Middle and Hoover High reported the most use of surveys (perception data) compared to the other four schools.

Table 14 also ranks the six most used data Sources by all of the six schools (N = 181). The ranking includes only those data sources where the combined averages of the six schools were over 50%. The data sources were ranked from 1 to 6, indicating most used (1) to least used (6). The most used data source was attendance (demographic), with an average of 77% of the teachers reporting its usage for school improvement. The second most used data source was special education (demographic), with an average of 64% participant use, even though fewer than half of the participants from Cleveland High (48%) indicated that they used it. Ranked third were Oregon state assessments (student learning), which were used by an average of 60% of the participants; both Hoover High and Cleveland High reported use of that data source below 50% (44% and 48%, respectively). Fourth was enrollment data (demographic), with an average of 56% reported usage across all six schools. Staff surveys (perception) were ranked fifth, with an average of 54% of the teachers reporting usage of this data source; fewer than half of the participants from Johnson Middle (32%) and Cleveland High (19%) reported usage. Regarding the reported use of staff surveys, a 64% difference was found between Taft Elementary (83%) and Cleveland High (19%). Finally, the sixth ranked data source was discipline/office referrals (demographic); an average of 52% of the participants indicated that this data source was used for school improvement. Taft Elementary (17%) and Johnson Middle (44%) reported this data source used by fewer than half of the participants. Regarding the use of discipline/office referral data for school improvement, a 69% reported difference was found between the two elementary schools, Taft (17%) and Adams (86%).

All three types of data (demographic, perception, student learning) were reported to be used by all six schools for school improvement. The results indicate that the primary sources of data usage for school improvement were in the demographic category. Enrollment, attendance, special education and discipline referral data comprised four of the six top ranked data sources. The teachers at the two elementary schools reported the use of four additional sources of student learning data: (a) analysis of student work, (b) scored student work samples, (c) teacher-made assessments, and (d) teacher observations. The teachers at Roosevelt Middle School also reported the use of three of these same student learning data sources. Archival data revealed that there were higher ELL populations in Belleview than in Oakview; therefore, regarding the reported use of ELL data for school improvement, Adams Elementary (95%) surpassed Johnson Middle (59%).
Adams Elementary teachers indicated that this was the highest ranked data source for their school. This indication matched both the goals in their school development plan and the interview information from the school's administrators. In their interviews and surveys, both principals and teachers mentioned that they used this data in terms of instruction and intervention programs. This finding was supported by the high percentages of student learning data usage also reported by this same group of teachers. 96 Classroom Instruction Table 15 summarizes the types of data Sources used by teachers for their own classroom and students. Again, Adams Elementary School teachers reported the highest number of different data sources (12) used. Teachers at Roosevelt Middle School reported the usage of 10 data sources. Teachers at the remaining four schools reported usage of comparably fewer data sources: Taft Elementary (6), Johnson Middle (6), Hoover High (5), and Cleveland High (3). Teachers at all six schools reported usage of analysis of student work (student learning) and teacher-made assessments (student learning) as data sources for their classroom instruction. Teacher observations (student learning) were used in five of the six schools. Cleveland was the only school where slightly fewer than half of the participants (45%) indicated usage ofthis data source. Two types of demographic data were reported as used at five of the six schools: attendance and special education. More than 50% of the participants in five ofthe schools (except Cleveland, 45%) used special education data. More than 50% of the participants in five of the schools (except Taft, 48%) used attendance data. Teachers at Adams Elementary and Roosevelt Middle Schools reported the most sources of data usage for classroom instruction. Teachers at both schools reported the use of gifted education (demographic), discipline/referrals (demographic), student surveys (perception), Oregon standard assessments (student learning), and scored TABLE 15. 
Data Use by Classroom Teachers for Students and Classroom Taft Adams Roosevelt Johnson Hoover Cleveland (N= 23) (N= 22) (N= 23) (N= 34) (N= 48) (N= 31) Total % N= 181 Rank Category Type of Data n % n % n % n % n % n % Demographic Enrollment 5 Demographic Attendance 14 .64 18 .78 18 .53 39 .81 20 .65 .66 Demographic Ethnicity Demographic Gender Demographic Freelreduced lunch (SES) Demographic Gifted and talented 14 .64 15 .65 4 Demographic Special education 18 .78 19 .86 19 .83 22 .65 19 .60 .67 Demographic Discipline referrals/action 14 .64 17 .74 Demographic Graduation rate/drop out rate Demographic Number/types of courses offered Demographic English Language 12 .52 19 .86 19 .56 Learners Demographic Mobility rate \0 -...J TABLE 15 (Continued) Taft Adams Roosevelt Johnson Hoover Cleveland (N= 23) (N= 22) (N= 23) (N= 34) (N= 48) (N= 31) Total % N= 181 Rank Category Type of Data n % n % n % n % n % n % Perception Parent surveys Perception Student surveys 11 .50 13 .57 Perception Staff surveys Perception Community surveys Student Oregon state assessments 13 .59 13 .57 Learning Student CIM/CAM data Learning Student Standardized assessments 14 .64 Learning 3 Student Analysis of student work 20 .87 18 .82 16 .70 20 .59 31 .65 19 .61 .69 Learning Student Scored student work 12 .52 15 .68 13 .57 Learning samples Student Teacher-made assessments 21 .91 18 .82 17 .74 23 .68 37 .77 17 .55 .73 Learning 1.0 00 Rank Category Type of Data 2 Student Teacher observations Learning Student Learning Student Learning Student Learning Summer School enrollment Transcripts Student GPAs ID ID 100 student work (student learning) data to inform their work with students. Adams Elementary was the only school where teachers reported usage of other standardized assessments (64%) as part of their data collection. Table 15 also ranks the five data Sources most used by all of the six schools' respondents (N = 181). The ranking includes only those data sources where the combined average of the respondents was over 50%. The data sources were ranked from 1 to 5, where 1= most used and 5 = least used. The highest ranked data source was teacher-made assessments (student learning) with an average of73% usage across the six schools. Ranked second was teacher observation (student learning) with 72% usage. Cleveland High was the only school with slightly fewer than half of the participants (45%) indicating usage of this data source. Analysis of student work (student learning) was ranked third across the schools (69%). Teachers at all six of the schools indicated usage of this data source within classrooms. Special education (demographic) was ranked fourth. Cleveland High was the only school with slightly fewer than half of the participants (45%) indicating usage of this data source. Finally, attendance (demographic) was ranked fifth (66%). Taft Elementary School was the only school with slightly fewer than half of the participants (48%) indicating usage of this data source. The results of Table 15 indicate that teachers primarily used student learning data to inform their work in their classroom and with students. This finding differs from the results ofthe previous question in the survey, which showed demographic data to be 101 the primary data source used for school improvement. The three highest ranked sources across all six schools were teacher-made assessments, teacher observations, and analysis of student work. 
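The "critical mass" convention applied in Tables 14 and 15 can be summarized in a short computational sketch. The Python fragment below is illustrative only: the counts are hypothetical, and the combined figure is computed here as the unweighted mean of the six school percentages, which may differ in detail from the aggregation used in the study. It shows how a school's cell is left blank when fewer than half of its respondents reported a source, and how only sources whose combined average exceeds 50% are ranked.

```python
# Illustrative sketch only: hypothetical counts, not the study's data.
SCHOOL_N = {"Taft": 23, "Adams": 22, "Roosevelt": 23,
            "Johnson": 34, "Hoover": 48, "Cleveland": 31}

# counts[source][school] = respondents reporting that data source.
counts = {
    "Attendance":        {"Taft": 15, "Adams": 17, "Roosevelt": 18,
                          "Johnson": 22, "Hoover": 35, "Cleveland": 21},
    "Student surveys":   {"Taft": 9, "Adams": 15, "Roosevelt": 16,
                          "Johnson": 12, "Hoover": 29, "Cleveland": 10},
    "Community surveys": {"Taft": 4, "Adams": 6, "Roosevelt": 5,
                          "Johnson": 7, "Hoover": 9, "Cleveland": 5},
}

def cell(source, school):
    """School percentage, blanked (None) below the 50% 'critical mass' point."""
    pct = counts[source][school] / SCHOOL_N[school]
    return round(pct, 2) if pct >= 0.50 else None

def combined(source):
    """Unweighted mean of the six school percentages, used for ranking."""
    pcts = [counts[source][s] / SCHOOL_N[s] for s in SCHOOL_N]
    return sum(pcts) / len(pcts)

# Only sources whose combined average exceeds 50% are ranked, most used first.
ranked = sorted((s for s in counts if combined(s) > 0.50),
                key=combined, reverse=True)
for rank, source in enumerate(ranked, start=1):
    row = {school: cell(source, school) for school in SCHOOL_N}
    print(rank, source, round(combined(source), 2), row)
```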
There did not appear to be any noticeable difference in the use of data sources for classroom instruction between levels (elementary, middle, high). Once again, teachers at Adams Elementary and Johnson Middle Schools indicated using ELL data for their work in classrooms. This result was consistent with their reported use of this data for school improvement, their school development plans and from what was reported in the administrators' interviews. The only reported perception data used to inform classroom work were student surveys at Adams Elementary and Roosevelt Middle schools. According to the teachers, parent surveys were not generally used. Community surveys were also not generally used by any of the six schools to inform either school improvement or classroom instruction. Demographic data on enrollment, ethnicity, gender and socioeconomic status were reportedly not high-priority factors for teachers to inform their classroom work with students. Overall, there were reportedly fewer data sources collected for decision- making within classrooms than for school improvement, as indicated by teachers from four of the six schools (excluding Adams and Roosevelt). Cleveland High School remained consistently lower than the other schools in its reported data use for both school improvement and classroom instruction (three sources each). 102 The examination of archival documents and records as well as information from the principals' interviews (including interviews of vice principals and instructional coordinators) revealed that staff at all six schools collected and reported data as required by state mandates. The administrators at each school acknowledged that external accountability was a reality. Since the state evaluated schools only on demographic and standardized state achievement data (student learning) contained in School Development Plans (SDP), each principal stated that there was a certain level of compliance operating in collecting and reporting these data. However, each principal noted that since they had the data, they worked with teachers to analyze it and make it useful for informing school-wide performance goals or other academic improvement initiatives. They added that teachers and administrators examined and analyzed state assessment data to inform them on how the school was doing in comparison to other schools and state and federal expectations. Each principal chorused the sentiment that the statewide data were only helpful for large-group data analysis or noting yearly trends and was not helpful for individual student analysis. They understood the political nature and ramifications of having "high" or "low" assessment scores and the risks of failing to meet the NCLB federal mandates. Since the school district was located in a high-income community where school ratings are widely viewed as an important aspect of residential location, the principals and teachers noted the added stress and emphasis that came from community expectations of each school. Several principals referred to this reality as a "balancing act" of doing what was right by external expectations and 103 also doing what was right by internal convictions. In each of the schools, several participant teachers agreed that state assessments were used for school improvement but were not used for making decisions on a daily basis in the classroom. This conflict regarding the use of state assessment data to inform classroom instruction could also be due to the nature of the assessment itself. 
Teacher survey responses and principal interviews revealed a frustration with the state assessment data and doubts about the ability of such data to be helpful in guiding instructional goals and practices. At the time of this study, the Oregon state assessments were multiple-choice and considered "formative." Students could take the assessment up to three times during the school year. Immediate results, in the form of a raw score, allowed the students to know how they had performed against the grade-level standard (Met, Exceeded, or Not Met). The score provided some numerical data, but teachers and principals at each of the schools commented that receiving a number in addition to a "Met" descriptor did not provide any more information than the descriptor alone. Teachers were not able to access substrands of data (due to limited test-response items and statistical constraints) in a reliable manner. Only large-group data (for example, 60% of our 4th graders met their grade-level math benchmark standard) were available. These data could be and were used to evaluate grade-level or school-wide progress. Teachers had varying opinions about how effective the data were in facilitating understanding of individual students' abilities and learning profiles. Both teachers and principals agreed that state assessment data were one form of data and should not be the only form of data used to evaluate a school's effectiveness. The use of multiple forms of data was preferred.

Using many different assessments, including teacher observations and input, have given us a more complete picture of the students so that we can provide interventions or modify instruction to meet the individual student needs. (HT210)

When I think of data I think about what we have in terms of academic [data], e.g., reading, and being very broad in terms of not letting any one tool be the sole indicator of a child's ability or performance. (RP01)

Each of the principals agreed that localized data (collected at the school site and by the teachers themselves) were the most useful in terms of identifying goals for instruction and school-wide or classroom improvement. The analysis of student work, teacher-made assessments, and teacher observations were the three forms of data reported by teachers as important for school-wide improvement, but only by the elementary schools and one middle school (Roosevelt). In the second middle school (Johnson) and both high schools, there was apparently no use of localized data to inform instruction and school-wide improvement. At these sites, there still appeared to be a reliance on statewide assessment data to inform school-wide improvement, despite any internal staff views about its effectiveness.

Summary

Demographic data were collected at each of the six schools regarding decisions about school improvement and classroom instruction. According to the teacher surveys across all six schools, attendance data and special education data were the most collected and used sources. Each of the six schools' SDPs included attendance and special education data within demographic descriptions, school report cards, and state assessment scores. Several schools' SDPs indicated goals for improvement of special education services for students. For example, at Cleveland High School, the principals noted a school-wide goal within the SDP entirely focused on "improving and expanding special education services" with six targeted subgoals.
Teachers at Adams Elementary and Johnson Middle Schools reported the importance of ELL data for making decisions both on a school-wide and classroom level. Both of these schools were located in Belleview. In each schools' SDPs and in the respective principals' interviews, ELL data were used for reporting demographic data and tracking students' academic progress (as an ethnic group or for qualifications of ELL services). The schools' SDP plans for each school indicated a growing number of ELL students each year, and principals reported an impact on teachers to learn and implement effective instruction for these students. There were no reported performance goals or plans indicated for the ELL students at these two schools. At Cleveland High School (also located in Belleview), the SDP revealed several specific goals targeted towards (a) expanding literacy opportunities for ELL students, (b) researching and further developing a high-quality ELL program, and (c) expanding course offerings and support for all ELL students. At Cleveland, 42% of the teachers indicated that ELL data were used to make decisions about school improvement; yet only 23% of the teachers indicated use of ELL data for classroom decisions. 106 The results of the surveys, SDPs and principal interviews revealed that perception data (surveys) were used at all six schools at varying percentages and mostly for school improvement rather than for classroom instruction. Teachers at four of the six schools (Adams Elementary, Taft Elementary, Roosevelt Middle and Hoover High) reported more usage of surveys than did the teachers at the other two schools (Johnson Middle and Cleveland High). At Cleveland High School, the principal reported the use of parent surveys to gather information regarding volunteering opportunities, school- based web-page effectiveness, and posthigh school plans for graduating seniors. The principal also indicated that staff and student perception data were gathered informally at meetings and in conversations. The vice principal "hoped" that with current technology (electronic surveys) there would be more collection of perception data from stakeholders. There was no mention of using surveys in Cleveland's SDP. At Johnson Middle School, the principals mentioned using surveys with students regarding bullying and with staff regarding special education modifications. Despite these responses gathered in the interviews, there was not a high enough response rate in the teacher surveys to indicate a regular use of perception data at either Johnson or Cleveland. _The principals at each ofthe four schools that did indicate a general use of surveys (Taft, Adams, Roosevelt and Hoover) emphasized the value of conducting surveys to gather data for school-wide decision-making. At Roosevelt Middle School, teachers indicated that parent surveys (74%), student surveys (70%) and staff surveys (74%) were used to inform decision-making at a school level. Student surveys (57%) 107 were used at a classroom level. Roosevelt's SDP and principals' interviews revealed the use of several surveys administered to parents, students and staff. In recent years, staff surveys have focused on teaming expectations and administration effectiveness. The second school that demonstrated a high use of perception data was Hoover High School. Teachers indicated that parent surveys (63%), student surveys (58%) and staff surveys (67%) were used to inform school-wide decisions. 
Principals' interviews revealed that student surveys were used on a regular basis to provide "voice" on topics (such as alcoholism, relationships or communication), opinions about grading practices, and post-high-school plans. Student panels were used to reveal feelings and stories about current practices at the school. One time we had a student panel sit in front of our entire staff and talk about grading (without teacher names). It was amazing. We had one teacher cry afterwards, realizing she had completely missed the mark about grading. The kids were very honest. (HP09) At Hoover, parent survey topics included post-high-school plans for students and the proposal for a school career center. The teaching staff at Hoover was also surveyed about the above topics as well as the effectiveness of the school's current administration. During 2005-2006, parents, teachers and students at Hoover were all surveyed about the school's effectiveness and function as a system, its practices and its procedures. Within the SDPs of each ofthe four schools that indicated a broad use of survey data, there was evidence of goals, inquiry and action plans in response to the perception data. Each of the school levels was examined for patterns of data Sources. During the interviews, a difference between levels regarding data collection was pointed out by one 108 of the high school principals. After having been a middle school administrator prior to her current position, she noted, I see how data are used differently in those levels [middle and high]. At the middle level, I used it and teachers used it more for individual instructional things for kids and programs. At the high school level ... they are content-driven people in many respects. It doesn't mean they don't care about it ... they have 150-200 kids in a day. You could spend the whole summer looking at the [individual] data from that. If there are individuals struggling, they'll talk to the individual. So the purpose of it [data], from the high school level ... is less about individuals and more about program and trends. (HP09) In the study, the comparison between school levels (elementary, middle, high) regarding data sources did not reveal any significant patterns. Leadership A common theme of collaborative decision-making was mentioned in each of the 14 administrator interviews. This leadership philosophy was also described by the superintendent: [In our district], the learning and leadership environment should be based on strong relationships embedded in collaborative decision-making processes with high expectations for effective results. Relationships are everything, especially in the world of public schools. We are a people business and worthy achievements are accomplished by people for people. The concepts and practices of leadership are at once dynamic and expansive while at the same time founded in enduring values and beliefs. Students of leadership must take time to retreat and think together; to read, study and dialogue about learning, and create a language of leadership that embodies the value of relationships, belief and trust in collaborative processes, and a belief that high quality results are important and attainable. (HS01) 109 According to the principals, leadership decisions were expected to occur in a collaborative manner and to include the voices of all stakeholders as necessary. This district expectation was also viewed as a state expectation. 
At the time of this study, the Oregon Department of Education expected schools to have a standard of shared leadership and an inclusion of data within their school development plans. The use of Site Councils was encouraged by the Oregon Department of Education to promote shared leadership by representative stakeholders in guiding school improvement. In Highland School District, as evidenced in the SDPs and principals' interviews, the configurations of internal teams and committees were different at each school, depending on level (elementary, middle, high), size of school, and the school administrator's preference. Differences were evident in the number and types of committees and their roles in decision-making. Differences were also evident between school levels, particularly when team configurations were in content-area departments at the middle and high school levels. The similarities rested in the underlying expectation or mandate assumed by each school, which was to have representative voices (teacher, parent, student, administrator) at Site Councils and representative voices (teachers, departments, administration) at all school-level committees and teams.

Table 16 displays the involvement in site-based decision-making at each of the schools, as documented from the teacher surveys. Teachers were asked to quantify the number (percentage) of staff members regularly involved in decision-making in their school. A code was used to determine the mean index of teacher involvement from each school: 1 = 1% to 25% teacher involvement; 2 = 26% to 50% teacher involvement; 3 = 51% to 75% teacher involvement; and 4 = 76% to 100% teacher involvement. Respondents who "skipped" this question were given a code of .5, which was calculated into the mean index. This method of reporting was used in Hill's (2004) study and applied in this study.

TABLE 16. Percentage of Teachers Regularly Involved in Decision-Making

                 Taft ES    Adams ES   Roosevelt MS   Johnson MS   Hoover HS   Cleveland HS
                 (N = 23)   (N = 22)   (N = 23)       (N = 34)     (N = 48)    (N = 31)
Mean index        2.24       1.98       2.89           1.69         1.67        1.73
% of teachers     26-50      26-50      51-75          26-50        26-50       26-50

Teachers in five of the six schools indicated that one fourth to half of the school staff were involved in making decisions. The principals in all six schools described internal processes for involving teachers in making decisions within their schools. Besides Site Councils and general staff meetings, the principals described some form of decision-making committee that represented either grade levels or content departments. These committees met on a regular basis, discussed academic concerns, analyzed data, and worked with the administrators to put plans into action. Three of the principals described this shared decision-making process:

Rarely do I make a decision by just us [administrative team]. There are maybe a handful of times ... or if it's 50-50, then I decide. Important decisions are made for us two ways: usually with the CCC or occasionally with the entire staff. Every department is in on it. We talk, they go out and get input, sometimes they make a decision or they get input and we come back and we make a decision. The CCC [Curriculum Coordinating Committee] always see the data before we present it in front of the entire staff. And we have data at every meeting ... hard or soft. And then they also collect and bring back data. So, it's the in and out, big/small, go out to the masses and back to the small-group system.
(HP09)

Teachers make a lot of decisions and teams make a lot of decisions, and then we have groups that meet, for example, our Character Education Committee that makes recommendations to the staff. In staff meetings we meet a lot in groups. We kind of go back and forth to try to keep continuity within the school and keep a variety of perspectives that go into decision-making. (HP03)

We try to have an open and collaborative process. The administrative team meets on Monday and the department chairs on Tuesday and counselors on Friday. Staff meetings are once a month. There's an open door policy [to attend any meeting]. We try to be visible. (HP11)

Roosevelt Middle School was the only school where teachers indicated that half to three fourths of its staff were involved in the decisions at their school, a higher percentage than reported at the other five schools. The survey results at Roosevelt mirrored the description from its principals about teachers' involvement in school-wide decisions. There were only two leadership teams within the building, but 20 teachers were involved across the two teams. This was approximately half of the 41 certified teachers on staff. One of the leadership teams delved into school-wide academic issues and conversations and communicated back and forth with their grade levels. They were responsible for conducting surveys and gathering information to help shape the direction of the school's academic progress. The second leadership team performed the same role with issues and conversations regarding the school culture and social-emotional needs of the students. The Roosevelt principal added the following comment:

Our school is very team oriented. Our school really functions based around two primary leadership teams. We meet twice a month and go over the different things. Sometimes they bring concerns to us; sometimes we bring concerns to them. Whenever possible, the decisions are made in a group forum or group format. (HP07)

The principals at each of the six schools stated that data collection was a shared responsibility in the school and that many people were responsible for collecting a variety of data. However, they reported that it was still the primary role of the principals and vice-principals (instructional coordinators) to bring data forward with a team or group of teachers or to urge that specific data be collected and used in order to make decisions about school improvement. This included formal standardized data as well as informal, localized data. Several principals contributed to this discussion:

We [principals] have a very key role in determining what we feel needs to be assessed and what are the appropriate ways of assessing it and gathering the data. (HP07)

The role [of the principal] would be facilitating the conversation, the coherency. Fitting the data with other things that you are doing to learn and inform. I'm actively teaching our staff on protocols to use and ways to talk about and analyze what we're doing. Having the conversations and leading the conversations. (HP05)

The IC [instructional coordinator] and I are key in the umbrella, overarching, in looking at the data. You need to be perceptive of staff and how you present the data and what opportunities you give to teachers to analyze it. (HP03)

[Our role] is about deciding what data need to be presented and what it does. If we don't think we're going to use it, then it doesn't make sense to burden the teachers with information that doesn't matter.
(HP10) The principal is usually not involved in the collecting of data but looks at the results and asks questions about the factors and where we need to go next and to move forward. (HP02) My role? What data are needed. I think of it in terms of questions. What data do teachers need to have, how should the data be presented in a way that is both informative and keeps the doors open as well as cause some positive disequilibrium to move forward. (HP01) According to the teachers at each school, there was a similar assumption and expectation that the principals would collect and bring forward data for school-wide improvement or be the most influential determiners of how data would be used. In the surveys, the teachers indicated that they collected and analyzed their own local data (student work, classroom assessments, observations). In several of the open-ended questions, teachers noted that larger data sets (ones that impacted more than their own classroom) were collected and analyzed by the administrators or by groups led by the administrators. One teacher appeared to summarize this observation of data collection criteria: Change in special education procedures is based on monitoring of data by the state. Change in school procedures are based on staff and community surveys. Change in educational practices are based on student performance on assessments. (HT102) Table 17 reveals the survey responses from teachers regarding Leadership for data-based decision-making. Teachers responded to this section of the survey with Agree, Disagree, or Undecided. The results in the table show the number and percentage of positive responses, the overall rank for each question by school and system, and the chi-square distribution. Chi-square distributions were computed on each question in the area of Leadership. If the chi-square was not valid for a particular question due to expected counts being less than 5, a hyphen (--) marks that question. The critical value for chi-square with 5 degrees of freedom was 11.07, p < .05.

TABLE 17. Highland Schools Teacher Survey Leadership Summary (n and proportion of positive responses; Taft N = 23, Adams N = 22, Roosevelt N = 23, Johnson N = 34, Hoover N = 48, Cleveland N = 31; overall N = 181)
Rank 6. Strong principal leadership in using data to improve classroom instruction: Taft 10 (.43); Adams 16 (.73); Roosevelt 16 (.70); Johnson 6 (.18); Hoover 18 (.38); Cleveland 12 (.39); overall .43; chi-square 23.063*
Rank 2. Strong principal leadership in using data to solve problems school-wide: Taft 15 (.65); Adams 16 (.73); Roosevelt 22 (.96); Johnson 14 (.41); Hoover 26 (.54); Cleveland 16 (.52); overall .60; chi-square 21.491*
Rank 5. Principal models good use of data, encourages staff to use data in making decisions to improve the school: Taft 14 (.61); Adams 16 (.73); Roosevelt 19 (.83); Johnson 11 (.32); Hoover 17 (.35); Cleveland 14 (.45); overall .50; chi-square 16.848*
Rank 8. Teachers are proficient at accessing and using data and information to solve problems in the classroom: Taft 9 (.39); Adams 13 (.59); Roosevelt 9 (.39); Johnson 10 (.29); Hoover 17 (.35); Cleveland 11 (.35); overall .38; chi-square 9.875*
Rank 3. Data support services from central administration to the school: Taft 14 (.61); Adams 14 (.64); Roosevelt 17 (.74); Johnson 19 (.56); Hoover 28 (.58); Cleveland 13 (.42); overall .58; chi-square --
Rank 7. Data support services from central administration to the teacher: Taft 10 (.43); Adams 13 (.59); Roosevelt 15 (.65); Johnson 10 (.29); Hoover 16 (.33); Cleveland 10 (.32); overall .41; chi-square 7.692
Rank 8. State data support for school-based decisions and improvement: Taft 10 (.43); Adams 8 (.36); Roosevelt 11 (.48); Johnson 14 (.41); Hoover 19 (.40); Cleveland 7 (.23); overall .38; chi-square 7.467
Rank 1. Teacher input into decisions in the school: Taft 14 (.61); Adams 18 (.82); Roosevelt 20 (.87); Johnson 12 (.35); Hoover 28 (.58); Cleveland 21 (.68); overall .62; chi-square --
Rank 4. Decisions are based upon multiple sources of data: Taft 17 (.74); Adams 15 (.68); Roosevelt 15 (.65); Johnson 13 (.38); Hoover 16 (.33); Cleveland 21 (.68); overall .54; chi-square 17.234*
Rank 7. SDP process included all stakeholders: Taft 6 (.26); Adams 8 (.36); Roosevelt 15 (.65); Johnson 13 (.38); Hoover 19 (.40); Cleveland 13 (.42); overall .41; chi-square 9.017*
Average total positive responses: Taft 12 (.52); Adams 14 (.62); Roosevelt 16 (.69); Johnson 12 (.36); Hoover 20 (.43); Cleveland 14 (.45); overall .49
School rank: Taft 3; Adams 2; Roosevelt 1; Johnson 6; Hoover 5; Cleveland 4
*p < .05. **p < .10.

Positive responses in the Leadership category were most evident at Adams Elementary and Roosevelt Middle Schools. These were also the two schools where teachers reported a higher number of data Sources collected for decision-making. Teachers at other schools reported data collection as a positive leadership model for improving their school; however, only teachers at Adams (73%) and Roosevelt (70%) agreed that there was strong leadership in using data to improve classroom instruction. Teachers from these two schools commented as follows: Using many different assessments, including teacher observations and input, have given us a more complete picture of the students so that we can provide interventions or modify instruction to meet the individual student needs. (HT210) Our administrators are good about bringing [data] to us in team or staff meetings. I have used data to make decisions about instruction. (HT301) In all six schools, the principals acknowledged that data collection and usage for decision-making was important; however, they added that data needed to be collected and applied within context and in meaningful ways. For example, several principals acknowledged that teachers collected their own data (the data that were meaningful to them), adding that it was still the primary responsibility of the principals to collect the data that teachers may not initiate or may not have access to and that impact grade levels, academic departments or the school. There are teachers that keep their own data on how their students are doing ... it's not to say no one does that. But it's the expectation that more data will come from the leadership and already sorted and what's relevant will be brought forward to look at and have conversations around as a whole staff or department or individuals. (HP09) I hope that we help people feel that they're a part of the data collection and it's not something that is being done to them, but it's a process that we're engaged in for school improvement. (HP04) The survey questions in Table 17 were also ranked in order of positive responses to leadership across the six schools. The two highest positive responses from teachers were (a) their input into decisions made in the school and (b) the principals' use of data to solve problems school-wide. Teachers from five of the six schools reported that both of these aspects were true for their schools. This supported the information found in the SDPs and the principals' interviews expecting teachers to be engaged in the process of making decisions in their schools and using data as an important element in solving problems or prompting conversations.
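The chi-square procedure used in Table 17 (and again in Tables 19 and 22) can be illustrated with a brief computational sketch. The sketch below is not the study's original analysis: the response counts are hypothetical, and collapsing "Disagree" and "Undecided" into a single "not agree" category is an assumption made only for illustration; a 6 x 2 school-by-response table is what yields the 5 degrees of freedom and the 11.07 critical value cited above.

```python
# A minimal sketch of a chi-square test across the six schools for one survey
# question, using scipy. The counts are hypothetical, and the "agree" vs.
# "not agree" grouping is an assumption made for illustration.
from scipy.stats import chi2, chi2_contingency

school_n = {"Taft": 23, "Adams": 22, "Roosevelt": 23,
            "Johnson": 34, "Hoover": 48, "Cleveland": 31}
agree = {"Taft": 12, "Adams": 15, "Roosevelt": 17,
         "Johnson": 8, "Hoover": 20, "Cleveland": 13}

# Build a 6 x 2 contingency table: agree vs. not agree (disagree + undecided).
table = [[agree[s], school_n[s] - agree[s]] for s in school_n]

stat, p_value, dof, expected = chi2_contingency(table)
critical = chi2.ppf(0.95, df=dof)       # 11.07 for 5 degrees of freedom
small_cells = (expected < 5).any()      # flags questions the text marks with --

print(f"chi-square = {stat:.3f}, df = {dof}, p = {p_value:.3f}")
print(f"critical value (p < .05) = {critical:.2f}; expected counts < 5: {small_cells}")
```

Under these assumptions, a statistic above the critical value would be reported as a significant difference in positive responses across the six schools.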
Roosevelt ranked highest of the six schools in these two aspects of leadership (87% and 96%, respectively). Teachers from Johnson Middle School (35% and 41%, respectively) reported a more negative response to these same two aspects of leadership than did the teachers at the other five schools. Three of the teachers from Johnson commented as follows: I feel like we use data but it seems chaotic and disorganized when we do this. (HT401) There is a lack of direction and decision-making on what actions to take to improve outcomes; ideas discussed by the entire staff with no action taken; input given without any decision-making power. (HT404) [We get] the results of state testing when comparing us to other middle schools. (HT405) This last comment may summarize the concerns and stresses that the teachers and principals at Johnson Middle School (and others) reported they felt when their school's state assessment scores did not compare well to others in the same district. In Johnson's SDP there was evidence of recent drops in state assessment scores, which had prompted multiple conversations at staff meetings and a review of the school's goals for improvement. The principal at Johnson mentioned some community concern about the recent data and some staff members feeling "ashamed and attacked by what the data say." In contradiction to the teachers' reports of low input into the school's decision-making process and the use of data to solve problems, Johnson's SDP and principals' interviews indicated several opportunities for teachers' input (such as staff, team and subject-alike meetings) and the usage of data throughout decision-making processes. Teachers from five of the six schools agreed that the central administrators appropriately provided data support to the school leaders. Only teachers from Cleveland High School (42%) had a slightly more negative response to this question than reported by the other participants. Teachers from two schools (Adams and Roosevelt) reported positively that the central office staff gave data support to the teachers. Principals from all six of the schools indicated that positive central administrative support came in the form of trust that the principals were using data and creating goals according to their schools' needs. Two of the high school principals mentioned accessing the district Information Technologist to help with some data analysis procedures. One of the principals concurred that the district leaders gave support and freedom for building leaders to "cultivate our people and culture" (HP10). I think there is great support. The tendency is that they recognize that all the schools are unique. And the question becomes, "How can I support you?" or often support is in "do what you need to do." Support is in not becoming an obstacle. Support is in a green light from the district level. (HP06) I think that some of the support from our district is to approach data in the way that makes sense for our school. (HP04) At none of the six schools did teachers report positively, overall, that the state provided support for school-based decisions and improvement. Examining the raw data for this question revealed a higher total number of "undecided" responses than positive responses across the six schools. A few teachers indicated approval of the current web-based format of the state assessment, which allowed immediate feedback to the students and the teachers.
It was great when TESA [Technology Enhanced Standardized Assessments] was up and working well for us. It was good for the kids to have immediate feedback on how they did. It was great for us as teachers to get the data in order to strengthen our curriculum instruction for the short term as well as the long term. (HT204) Looking at the students' statewide assessment helps me to plan lessons in response to where students struggle. (HT301) Each of the principals and superintendents reported that the state provided data to them from state standardized assessments and regulatory reports. None of the principals stated that they looked to the state for support regarding school improvement. One of the superintendents added the following comment: 120 I don't think the state does much analysis at all. School districts are viewed by the state as regulatory, which means they have to keep us compliant. It's all directed towards "those people that need regulation and we're going to use data to do that." It's not [about] using data to make a difference and make things better. So that's the level of support I need to give; support to make things better, not to regulate. (HS01) Each school's SDP and administrators reported multiple sources of data used at their sites to make decisions, including demographic data, perception data and a variety of student learning data. The principals at each of the schools reported collecting a wide variety of data to inform their decisions (formal data, informal data, anecdotal data, observational data, surveys). According to the surveys, teachers from four ofthe six schools agreed that decisions in their schools were based upon multiple sources of data. The teachers at Johnson Middle School (38%) and Hoover High School (33%) indicated otherwise. This low response from Hoover contradicts many of the open-ended responses in the same survey, which indicated various types of data used at their school. Five of the teachers gave examples of multiple sources of data used at their school: Revision of way to treat student attendance issues based on data from studies. (HT502). Improving the (special education) referral process based on data collection. (HT505). Course offerings, school climate and safety, student forum groups. (HT508) Test data to help us strengthen writing across the curriculum. (HT509) Surveys have given our school good info about various topics. (HT512) 121 Further examination of the raw data revealed that 20 teachers responded to this question with "undecided." This was close to half of the respondents. At Johnson Middle School, this same survey question also received a less than positive report from the teachers. In the Johnson SDP and principals' interviews, there were reports of demographic, perception (surveys) and student learning data used to guide decisions and school goals. However, unlike Hoover, the Johnson teachers' open-ended responses about data collection focused mostly on the use of state assessment scores. Five of the Johnson teachers commented about using state assessment data: [We have] long discussions about test scores and how they might benefit our teaching. (HT412) Tracking standardized test performance of students deemed at risk. (HT411) Stress increased based on test scores and school report cards. (HT404) Test scores are the only data we use. (HT407) Looking at state test scores and targeting the students who were close to meeting [benchmarks] to "step it up" to passing level. 
(HT410) The teachers at only one school (Adams, 59%) indicated that they were proficient at accessing and using data and information to solve problems in the classroom. This response was consistent with the school's other responses regarding strong leadership (73%) and encouragement of data use (73%). Several principals acknowledged that they could be more intentional in teaching staff (and themselves) how to collect and analyze data and in exploring better data-collection tools. Four principals made the following comments: I'm limited in my computer skills to do it efficiently and effectively. I feel deficient in certain levels of support. (HP12) I would not overall say that I or our school right now is using data effectively. We're very much learning how to use that. Definitely learning as we fly the plane. (HP05) I don't know enough about what's out there. Data.com is a way to collect data. SurveyMonkey has been amazing. Maybe there are other ways to gather information. (HP10) I want to get more training in [data use]. (HP08) The teachers at only one school (Roosevelt, 65%) agreed that the SDP process included all stakeholders. This result appears to contradict the reports in the SDPs, where each building's Site Council is referenced. According to the principals, there was representation from stakeholders within the process of the school's SDP. A closer look at the raw data revealed approximately the same number of "undecided" responses (72) across the six schools as "agree" responses (74) to this question. This may indicate that a large number of teachers in the six schools were unsure about the representation of stakeholders in the school development process, the voice of the stakeholders within the process, or how the process works in general. A contrast between the six schools regarding leadership and data-based decision-making was found in the responses from Johnson Middle School as compared to the other five schools. Only one question in the survey received a positive response from the teachers at Johnson. That question pertained to the data support that central administration provides to the school. According to the raw data from Johnson Middle School, four of the questions showed an approximately equal distribution of responses across the possible choices: "agree," "disagree" and "undecided." These questions addressed the following issues: staff being encouraged to use data, teachers being proficient at accessing data, teachers having input into decisions, and the SDP process including all stakeholders. The lowest average positive response to all of the questions among all of the schools came from Johnson Middle School. The positive response of only 18% to "strong principal leadership in using data to improve the classroom" was a contrast to the other schools' responses (43%, 73%, 70%, 38% and 39%). The principal at Johnson reported a leadership effort to support teachers in their data use while encouraging and raising staff morale regarding state assessment scores and teachers' best efforts in their work with students. In the Johnson SDP, statements reported the following challenges: (a) a high student turnover rate per year (19%), (b) growing enrollment that pushed the school close to capacity, (c) a lower attendance rate than the state target (the only one of the six schools to rate this low), and (d) lower student achievement scores than at the other two middle schools.
Despite these statements of challenge, the principal wrote, Teachers and staff members at [Johnson] are committed to our students and their families. It is typical for teachers to buy books and clothing for students, pay lunch bills, and to be invited to family gatherings. Teachers at [Johnson] work hard at being effective teachers and improving professionally. It is not unusual to have conversations about teaching at 7:30 p.m., and they are often on site on weekends. Teachers care deeply about the academic and social well-being of their students. The motivating factor for making changes in curriculum and instruction is seeing evidence that it makes a difference in the achievement and interest in learning of students. (HP05) 124 In her interview, the Johnson Middle School principal emphasized that looking only at state assessment data was not helpful for her staff, given the high student turnover rate. They had tried extrapolating growth data of specific students and cohorts of students who were at the middle school all 3 years. This had been somewhat helpful but introduced other confounding variables and factors. The principal concluded, The forms or types of data that we are really trying to look at are [centered] around what our kids are actually doing ... really looking at student work. (HP05) Significant leadership differences between levels (elementary, middle, high) were not apparent. There were more positive responses to leadership from the two elementary schools and Roosevelt Middle School (each ranked within the top three) than from the other middle school and the two high schools. Chi-square distributions were computed on each question in the area of Leadership. The results indicated significant differences (p < .05 or p < .10) in positive responses across the six schools to most of the questions. Two questions--data support services from central administration to the school and teacher input into decisions in the school---could not be computed due to expected counts being less than five. Examining the raw data for both ofthese questions revealed that teachers responded with a high number of "agrees" and a very low number of "disagrees." The null hypothesis was rejected for two ofthe questions: (a) data support services from central administration to the teacher and (b) state data support for school-based decisions and improvement. This negative response to data support from central office to the teachers' classroom was 125 consistent with teachers' responses that there was a strong infrastructure within each school and more of a direct impact internally rather than from central administration. The negative response to state support to schools was consistent with teachers' open- ended responses in the survey, as well as principals' and superintendents' interview responses. All participants reported that the state was removed from what occurred in the schools and only served to regulate mandatory requirements. Figure 4 depicts the average positive responses from each school in the area of Leadership. Roosevelt Middle and Adams Elementary Schools had the highest percentages of positive responses. Taft Elementary, Cleveland High and Hoover High clustered together (52%, 45%, 43%, respectively). Johnson Middle School had the lowest percentage of positive responses (36%). Table 18 displays the descriptive statistics of positive responses from each of the six schools. Each school's average positive responses, ranking and standard deviation are reported. 
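The summary statistics reported in Table 18 can be illustrated with a short sketch. This is not the study's code: the per-teacher scores below are hypothetical, and the assumption that the mean and standard deviation are computed over each teacher's proportion of positive Leadership responses is made only for illustration.

```python
# A minimal sketch of per-school descriptive statistics in the style of
# Table 18, assuming each teacher's score is the proportion of Leadership
# items answered "Agree." The scores below are hypothetical.
import statistics

scores = {
    "Taft":      [0.5, 0.6, 0.4, 0.7, 0.4],
    "Adams":     [0.7, 0.6, 0.8, 0.5, 0.5],
    "Roosevelt": [0.8, 0.7, 0.6, 0.9, 0.5],
    "Johnson":   [0.3, 0.4, 0.3, 0.2, 0.5],
    "Hoover":    [0.4, 0.5, 0.3, 0.4, 0.6],
    "Cleveland": [0.5, 0.4, 0.6, 0.3, 0.5],
}

summary = {
    school: (statistics.mean(vals), statistics.stdev(vals), len(vals))
    for school, vals in scores.items()
}

# Rank schools by mean positive response, highest first.
for rank, (school, (mean, sd, n)) in enumerate(
        sorted(summary.items(), key=lambda kv: kv[1][0], reverse=True), start=1):
    print(f"{school:10s} mean={mean:.2f} rank={rank} n={n} sd={sd:.3f}")
```

Sorting by the mean in this way produces the kind of school ranking shown in the table.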
The results from Table 17 and the descriptive statistics in Table 18 indicated that Johnson Middle School differed significantly from the other middle school, Roosevelt, and somewhat from each of the other four schools. An analysis of variance (ANOVA) test was conducted to examine mean differences in positive responses among the schools in the area of Leadership. The ANOVA revealed overall significant differences in the mean scores of the schools (F = 8.29, p < 0.001). T tests were conducted to examine variance among school responses more closely. The overall scores indicated significant differences between the teachers' responses at each school, except between the two high schools. The t test between Adams Elementary and Roosevelt Middle Schools indicated a significant difference at the .10 level.

FIGURE 4. Leadership percentage positive responses by school. [Bar chart of each school's average positive Leadership responses: TES, AES, RMS, JMS, HHS, CHS.]

TABLE 18. Leadership Positive Responses Summary (Mean, Rank, N, SD)
Taft Elementary: .52, 3, 23, .147
Adams Elementary: .62, 2, 22, .153
Roosevelt Middle: .69, 1, 23, .171
Johnson Middle: .35, 6, 34, .100
Hoover High: .43, 5, 48, .103
Cleveland High: .45, 4, 31, .145

Processes

None of the teachers or principals in the six schools reported an identifiable "formal" data process within their schools, nor were there comments by the administrators indicating a need to create formal processes. The two superintendents made the following comments: How are decisions made in the district? That is a mystery, isn't it!? Sometimes I feel like it's an emergence. There's just this flow of ideas all the time. And when something gathers enough understanding and energy then it just shoots like wildfire through the district. But it's a messy process. It is much like Margaret Wheatley describes, I think. (HS02) I don't know. When you find out, let me know! In this district, when we talk about how our decisions are made, they're made all different ways. Sometimes they are made without a lot of conversation; other times, when it impacts a lot of people and things aren't as clear cut, it takes a lot of conversation and collaboration. (HS01) The principals also described a less formal process of making decisions. It included collaboration with teachers and depended on the nature of the decision needing to be made. Decisions are usually a team effort and a lot of times made through the team reps. (HP08) Always collaboratively. Including more people gives ownership and understanding and responsibility to everyone. (HP02) I've also learned that the staff appreciate it when [the principal] and I make some decisions. They don't want all the decisions to be messy group decisions. So, we try to keep the big ones, the complex ones, to involve as much of the staff as possible. (HP04) Teachers make a lot of decisions and teams make a lot of decisions and then we have groups that meet. I really believe in cultivating informal leadership. (HP03) We try to have an open and collaborative process. The group that does the most, or has the most input, is the department chairs. We rotate that position every 2 years. Our philosophy is that anybody should have the skills to be the chair. We try to encourage everyone to participate. (HP11) Rarely do I [and the administrative team] make a decision that's just us. (HP09) There is no one answer ... it depends. There's a widening lens. Some things I will make a decision about.
Just about everything involves gathering information from teachers, teams, students, looking at it and then doing something with it. (HP05) The principals described working with teams, departments or committees to make decisions but did not consider this a "formal" process. They described weekly or monthly meeting times for grade-level or department representatives or committees. Some schools had separate committees that made decisions about academic affairs and other committees that attended to the culture and climate of the school. Even though the principals emphasized the importance of gathering "voice," input and data, it appeared that the principals still controlled which data were used and who ultimately determined the decision. Table 19 summarizes the survey responses from teachers on Processes for data-based decision-making. Teachers responded to this section of the survey by selecting Agree, Disagree, or Undecided. The results in the table show the number and percentage of positive responses, the overall rank for each question by school and system, and the chi-square distribution. Chi-square distributions were computed on each question in the area of Processes. The critical value for chi-square with 5 degrees of freedom was 11.07, p < .05.

TABLE 19. Teacher Survey of Processes Summary (n and proportion of positive responses; Taft N = 23, Adams N = 22, Roosevelt N = 23, Johnson N = 34, Hoover N = 48, Cleveland N = 31; overall N = 181)
Rank 4. School has a formal process in place for how decisions are made: Taft 4 (.17); Adams 7 (.32); Roosevelt 13 (.57); Johnson 1 (.03); Hoover 21 (.44); Cleveland 8 (.26); overall .30; chi-square 32.913*
Rank 2. The process for how decisions are made includes representation from all of the school's stakeholders: Taft 3 (.13); Adams 8 (.36); Roosevelt 13 (.57); Johnson 6 (.18); Hoover 18 (.38); Cleveland 12 (.39); overall .33; chi-square 19.285*
Rank 3. The school has a formal process in place for collection of data and information: Taft 4 (.17); Adams 10 (.45); Roosevelt 13 (.57); Johnson 6 (.18); Hoover 15 (.31); Cleveland 8 (.26); overall .31; chi-square 16.109*
Rank 1. The school has a formal process in place for communication and disseminating data and information throughout the school: Taft 6 (.26); Adams 10 (.45); Roosevelt 14 (.61); Johnson 7 (.21); Hoover 25 (.52); Cleveland 12 (.39); overall .41; chi-square 18.809*
Rank 5. The school has a formal process in place for staff members to learn and increase their skills and capacity with data and information: Taft 5 (.22); Adams 7 (.32); Roosevelt 10 (.43); Johnson 3 (.09); Hoover 14 (.29); Cleveland 6 (.19); overall .25; chi-square 13.685*
Rank 3. The school has a professional development process for staff members to increase their proficiency and expertise in using data: Taft 7 (.30); Adams 7 (.32); Roosevelt 11 (.48); Johnson 6 (.18); Hoover 16 (.33); Cleveland 10 (.32); overall .31; chi-square 10.345**
Average total positive responses: Taft 5 (.21); Adams 8 (.37); Roosevelt 12 (.54); Johnson 5 (.14); Hoover 18 (.38); Cleveland 9 (.29); overall .30
School rank: Taft 5; Adams 3; Roosevelt 1; Johnson 6; Hoover 2; Cleveland 4
*p < .05. **p < .10.

The results from this portion of the survey were consistent with the principals' descriptions of an informal, collaborative decision-making process in their schools. Most teachers (except at Roosevelt, 57%) did not agree, or were undecided, that a formal process was in place for how decisions were made in their schools. Across the six schools, there were a total of 54 "agree" responses, 59 "undecided" responses and 68 "disagree" responses to this question. Even though SDPs, principals and teachers could all describe the various committees and teams that helped make decisions in the school, according to the survey, teachers appeared to be equally conflicted about the formal process of how the decisions were ultimately made. Roosevelt Middle School teachers were more inclined to agree that there was a formal process in place.
Johnson Middle School, on the other hand, had only 1 teacher of the 34 respondents "agree" with this survey question; 27 teachers "disagreed" and 6 were "undecided." This difference between the two middle schools reflected the consistently different perspectives of the two schools' teachers towards the leaders within their schools and the processes for how decisions were made. According to the survey responses in the previous Leadership section, teachers reported that they had input into the decisions of the school; however, the results of the questions on Processes indicated that teachers did not agree that all decisions included representatives from all of the school's stakeholders. These results mirrored the Leadership survey results indicating that the SDP process did not include all stakeholders. An examination of all six schools' responses to this question showed that there were 60 "agrees," 53 "disagrees" and 68 "undecideds." Once again, there appeared to be a relatively equal number of teachers who were either not sure about the process for making decisions or not sure whether a decision process included all of the school's stakeholders. Teachers at Roosevelt Middle School (57%) and Johnson Middle School (18%) responded differently to this survey question, just as they did to the previous one. A second data analysis was conducted on the Processes survey responses from teachers due to the high number of "undecided" responses. The "undecided" responses were eliminated entirely, and the total numbers of positive and negative responses were summed across all of the questions asked of each school and converted into percentages. In other words, removing the "undecided" responses completely from the totals at each school allowed for a secondary analysis of the raw data. Table 20 reveals the results.

TABLE 20. Process Survey Summary: Agree and Disagree Only Responses (total responses to all six questions in the Processes portion of the survey, with "undecided" responses removed)
Taft: Agree 29 (.32); Disagree 61 (.68)
Adams: Agree 49 (.64); Disagree 28 (.36)
Roosevelt: Agree 74 (.68); Disagree 35 (.32)
Johnson: Agree 29 (.18); Disagree 130 (.82)
Hoover: Agree 109 (.54); Disagree 92 (.46)
Cleveland: Agree 56 (.50); Disagree 57 (.50)
Positive responses school rank: Taft 5; Adams 2; Roosevelt 1; Johnson 6; Hoover 3; Cleveland 4

This secondary analysis, whereby only definitive answers of "agree" or "disagree" were totaled, revealed that teachers from four of the six schools had a more positive perception of processes in place for data-based decisions. The teachers at Taft Elementary and Johnson Middle Schools continued to have more negative ("disagree") responses to processes for data-based decision-making. This second analysis made it apparent that teachers in these two schools were still conflicted about how to respond to questions regarding the process for data-based decision-making within their schools. In contrast to the teachers' survey responses, principals at each of the schools believed that teachers (as stakeholders) had considerable voice and input in the processes of decision-making. Their interviews provided examples of how teachers' opinions (surveys) and recommendations were a part of small- and large-group decision-making processes. One principal offered the following summary: I think [teachers] have more input than they might say they do.
They really do ... voices get to be heard. They have input on what courses they get to teach ... they decide in their departments. We'll tell them about the concerns or ask what they think and they'll come back and tell us with data. It's a team effort. (HPlO) Teachers from five of the six schools (excluding Roosevelt) could not agree that there was a formal process in place for the collection of data or for communicating and disseminating data throughout the school. Conversely, principals and SDPs described an active collection of a variety of data, and earlier Sources survey results indicated that demographic, perception and student learning data were collected school-wide and within classrooms to make decisions. Several principals mentioned times of the year when data were formally collected in order to monitor progress or report progress. There are formal ways ... fall, winter, spring ... that we use as a school and I make sure we use that data that way. (HPOl) We have data at every meeting ... hard or soft. And then they [teachers] also collect and bring back data. The other day, someone wanted to talk about our schedule ... and I said go get the data and talk to each of your departments. (HP09) 135 Formally, we try to do a fall-winter-spring [review]. [The Principal] sends out surveys at the end of each year. (HP02) Depending on what we're looking at. Daily collection or occasional, like surveys. (HP08) Our gut level, our ear to the floor, listening to the comments ofthe people and reacting to those incrementally has made a difference. (HP12) Some of the best information you get from individuals is one-on-one; it's more private and feels sometimes like a more safe venue for discussion. (HP06) A few principals and a superintendent talked about honoring "grass roots movements" from their teachers and having an open process to allow these types of decisions to be made. Both high schools' principals described their curriculum and course selection process as largely driven and initiated by the teachers and department chairs (representing their teachers) rather than by a top-down decision. I love doing that [having teachers decide on adopted curriculum] rather than "This is our program and we're all doing it." Because I think no one can be more passive-aggressive like an educator-teacher or administrator-we know how to do that well. (HP03) I sometimes think that our best decisions are always from our best teachers. You see the first blink of an idea. There's lots of stories around our district where teachers find good things. If we're careful enough in that process oftaking it from that spark to the whole, it's a good thing. (HS02) Even though communicating and disseminating data received only a 41 % favorable overall response from teachers across all six schools, it did rank as the highest positive Processes response. Principals described E-mails, weekly E-news, staff meetings, teams 136 meetings, committee meetings, and conversations as processes (formal and informal) for how information was communicated and disseminated. Both teachers and principals acknowledged that there were no formal processes or staff development opportunities in place for either member (teacher, administrator) to increase their skills or proficiency in handling and using data. 
The elementary principals mentioned that they met informally with classroom teachers or grade levels to examine and analyze students' content-area assessments (teacher-made and standardized) or to critically evaluate students' work to guide instruction. Both middle school principals mentioned recent formal workshops on bullying, which involved teaching teachers how to collect survey data from students and examine the results. Both high school principals mentioned training sessions on technology to help teachers learn to use databases such as grade books, spreadsheets, data software (Excel), and web pages for posting data. Many of the principals expressed enthusiasm for the new electronic web-based survey formats (for example, SurveyMonkey) because of the ease and efficiency in generating, collecting, analyzing and disseminating the data. One of the principals suggested that training sessions on how to gather and use data could be initiated in the district. Other administrators indicated that schools should spend their professional development time having conversations, especially conversations that included examining data, regarding their schools' academic or social progress. I don't think that we've ever focused on professional development for use of data. Well, maybe we have with some curriculum assessments. But that's a focus on how do you help this child. Teacher-made assessments, particularly around reading for quick diagnostic work, that would be the place, the diagnostic component, where we have more staff development. (HS01) We use data to determine what professional development activities we will have. (HP11) Chi-square distributions were computed on each question in the area of Processes and are also displayed in Table 19. The results indicated significant differences (p < .05 or p < .10) in positive responses across the six schools for all of the questions. Figure 5 demonstrates the average positive responses from each school. Adams Elementary, Hoover High and Cleveland High Schools clustered with similar overall average responses from teachers (37%, 38%, and 30%, respectively). Taft Elementary and Johnson Middle Schools had the lowest overall average positive responses (23% and 14%, respectively). There was a difference between the two middle schools.

FIGURE 5. Processes average positive responses by school. [Bar chart of each school's average positive Processes responses: TES, AES, RMS, JMS, HHS, CHS.]

Table 21 displays the descriptive statistics of positive responses from the teachers at each of the six schools. Each school's average positive response, ranking and standard deviation are displayed.

TABLE 21. Processes Positive Responses Summary (Mean, Rank, N, SD)
Taft Elementary: .21, 5, 23, .064
Adams Elementary: .37, 3, 22, .067
Roosevelt Middle: .54, 1, 23, .065
Johnson Middle: .14, 6, 34, .068
Hoover High: .38, 2, 48, .087
Cleveland High: .30, 4, 31, .078

The results from Tables 19 and 20, as well as the descriptive statistics in Table 21, indicated that the responses from Johnson Middle School differed from those of the other middle school, Roosevelt, and somewhat from those of each of the other four schools. An analysis of variance (ANOVA) was conducted to examine differences in positive responses among schools in the area of Processes for data-based decisions. The ANOVA revealed significant differences in the mean scores of the schools (F = 22.38, p < 0.001). T tests were conducted to examine variance among school responses more closely.
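The ANOVA and follow-up t tests used here (and in the Leadership and Impacts sections) can be illustrated with a brief sketch. This is not the study's analysis: the per-teacher scores below are hypothetical, and the use of scipy's f_oneway and ttest_ind functions is an assumption about tooling made only for illustration.

```python
# A minimal sketch of a one-way ANOVA across schools with pairwise follow-up
# t tests, using scipy. The per-teacher scores are hypothetical.
from itertools import combinations
from scipy.stats import f_oneway, ttest_ind

# Hypothetical proportions of positive Processes responses per teacher.
schools = {
    "Taft":      [0.2, 0.1, 0.3, 0.2, 0.25],
    "Adams":     [0.4, 0.3, 0.45, 0.35, 0.4],
    "Roosevelt": [0.6, 0.5, 0.55, 0.5, 0.45],
    "Johnson":   [0.1, 0.15, 0.2, 0.1, 0.15],
    "Hoover":    [0.4, 0.35, 0.3, 0.45, 0.4],
    "Cleveland": [0.3, 0.25, 0.35, 0.3, 0.2],
}

# One-way ANOVA across all six schools.
f_stat, p_value = f_oneway(*schools.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.4f}")

# Pairwise independent-samples t tests between schools.
for a, b in combinations(schools, 2):
    t_stat, p = ttest_ind(schools[a], schools[b])
    print(f"{a} vs {b}: t = {t_stat:.2f}, p = {p:.3f}")
```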
The overall scores indicated significant differences between each of the schools' responses, except between the responses from Adams Elementary and Hoover High.

Impacts

Teachers were more confident and positive in their reports of Impacts from data-based decisions than in their responses about decision-making Processes. The most noticeable impacts that teachers reported seeing in their schools were informed decisions about student placement, assistance and instruction and informed decisions about school procedures, policies and programs. Most teachers reported positively about the impacts of data on decisions they made in their classrooms and acknowledged that there was improvement in knowledge and expertise in using data in their schools. Table 22 summarizes the survey responses from teachers on Impacts of data-based decision-making. Teachers responded to the questions in this section of the survey by selecting Agree, Disagree, or Undecided. The results in the table show the number and percentage of positive responses, the overall rank for each question by school, and the chi-square distribution. Chi-square distributions were computed on each question. If the chi-square was not valid for a particular question due to expected counts being less than 5, a hyphen (--) marks that question. The critical value for chi-square with 5 degrees of freedom was 11.07, p < .05.

TABLE 22. Impacts of Data Summary (n and proportion of positive responses; Taft N = 23, Adams N = 22, Roosevelt N = 23, Johnson N = 34, Hoover N = 48, Cleveland N = 31; overall N = 181)
Rank 5. The active use of various forms of data has a strong impact on decision-making in my school: Taft 7 (.30); Adams 16 (.73); Roosevelt 15 (.65); Johnson 4 (.12); Hoover 13 (.27); Cleveland 10 (.32); overall .36; chi-square 28.35*
Rank 2. Teacher regularly uses data to make decisions about teaching and learning: Taft 16 (.70); Adams 19 (.86); Roosevelt 19 (.83); Johnson 17 (.50); Hoover 28 (.58); Cleveland 17 (.55); overall .64; chi-square 12.082*
Rank 2. Using data has had a positive impact on teaching performance: Taft 17 (.74); Adams 18 (.82); Roosevelt 18 (.78); Johnson 15 (.44); Hoover 30 (.63); Cleveland 17 (.55); overall .64; chi-square --
Rank 1. Knowledge and expertise in using data has improved as it has been used: Taft 17 (.74); Adams 19 (.86); Roosevelt 18 (.78); Johnson 22 (.65); Hoover 31 (.65); Cleveland 22 (.71); overall .71; chi-square --
Rank 4. The use of student test data in the school has produced positive outcomes as evidenced by our state assessment scores: Taft 8 (.35); Adams 11 (.50); Roosevelt 10 (.43); Johnson 2 (.06); Hoover 24 (.50); Cleveland 14 (.45); overall .38; chi-square --
Rank 3. The school has been strengthened as an organization by its use of data in decision-making: Taft 9 (.39); Adams 12 (.55); Roosevelt 14 (.61); Johnson 5 (.15); Hoover 22 (.46); Cleveland 11 (.35); overall .40; chi-square 29.23*
Average total positive responses: Taft 12 (.54); Adams 16 (.72); Roosevelt 16 (.68); Johnson 11 (.32); Hoover 25 (.51); Cleveland 15 (.49); overall .52
School rank: Taft 3; Adams 1; Roosevelt 2; Johnson 6; Hoover 4; Cleveland 5
*p < .05.

Across all six schools, teachers agreed that their knowledge and expertise in using data had improved. This average response was ranked the highest (71%) across all of the schools. Two questions tied for the second-highest ranking. The question regarding regular data use to make decisions about teaching and learning received an average 64% positive response across the schools. This response was consistent with teachers' earlier survey responses on data Sources, where they reported gathering and using data within their own classrooms. Teachers at Adams Elementary had the highest response (86%) to this survey question. Also ranking second highest, with an average of 64%, was the question about whether the use of data created a positive impact on teaching performance. Teachers from five of the six schools responded positively to this question.
Only teachers from Johnson Middle School (44%) had a less than favorable response. This was consistent with earlier negative comments made by several teachers at Johnson Middle School regarding the reported "stress" of using data within a "chaotic and disorganized" process. The remaining three questions in this portion of the survey received mixed responses from the teachers at each school, and teachers at only two schools responded favorably to them. The teachers at Adams Elementary (55%) and Roosevelt Middle (61%) agreed that their school had been strengthened as an organization by its use of data in decision-making. This particular question received positive reports from the principals in each of the six schools. Each of them was asked in their interviews, "Please share the impacts that you see as a result of using data to make decisions." The two common themes that resulted from their responses were (a) that their school programs, instruction and culture had been strengthened by using data (anecdotal, surveys, assessment) and (b) that their leadership credibility, which impacts the school culture, increased when they were able to use data in making decisions. Principals also reported that by using data collected from teachers and a variety of sources, there was an impact on decisions regarding curriculum changes, program initiations, the quality of professional conversations and decisions about professional development. They said that all of these decisions had served to strengthen the school as a system, its academic performance and its professional and interpersonal culture. Principals reported that the greatest impact from using data was a school culture where everyone was actively inquiring and using data to engage in conversations centered around teaching and learning. I think as administrators the biggest impact we can have on a school is a culture where everybody is invested in looking at something, gathering info about it, making decisions and moving forward on it. That's exciting and what's working right now in our school for sure. (HP07) I think impacts can be on two different levels: in the classroom with kids ... and in the school. (HP01) The teachers from Adams Elementary (73%) and Roosevelt Middle (65%) were the only ones who also responded positively about the impact that the active use of various forms of data had on decision-making in their schools. Teachers from these two schools commented about data-based impacts they had noticed in their school: Adjustments to our discipline policies, review of homework policies, review of grading policies. (HT301) Everything ... programs, tests, strategies. (HT306) Response to student need; administrator response to staff choices and decision-making. (HT312) Using many different assessments, including teacher observations and input, have given us a more complete picture of the students so that we can provide interventions or modify instruction to meet [their] needs. (HT210) Faster reaction to needs of the students. (HT203) The teachers from these two schools also reported the highest use of data Sources earlier in the survey. Additionally, they were the only two schools with positive reports from teachers (Adams, 73%, and Roosevelt, 70%) in the Leadership portion of the survey regarding strong principal leadership in using data to improve classroom instruction.
None of the teachers from the six schools agreed (above 50%) that the use of student test data in the schools had produced positive outcomes, as evidenced by the state assessment scores. Teachers at Adams Elementary (50%) and Roosevelt Middle (43%) were also equally divided between their "agree" and "disagree" responses to this question. Examining the raw data from each of the six schools indicated that a majority of teachers responded with "undecided" to this question. Sixty-nine teachers "agreed," 37 teachers "disagreed," and 75 teachers were "undecided." The examination of the 145 open-ended question at the end of the survey revealed this same conflict about whether there were any positive impacts from state assessment data. Some teachers responded positively to how the school has used state assessment data to implement interventions, to drive teacher professional goals, and to reveal areas for improvement or special programs needed for students. Others indicated their skepticism about the emphasis of standardized test data and wanted to keep that type of data in perspective with other data collected. Several teachers expressed clear dissatisfaction regarding the impact that state assessments made in terms of determining the success of a school, its programs, or its students' and teachers' overall performance. Looking at students' statewide assessment strands helps me to plan lessons in response to where students struggle. (HT301) Test scores have pointed to weaknesses in specific areas, such as reading. (HT 609) Reinforcement that our school's strengths can't be measured on standardized tests-so trust by colleagues and administration to keep teaching through best practices. (HT101) Tracking standardized test performance of students deemed at-risk. (HT411) Test data are not used in any meaningful way except to document general trends. It is not used to guide or in any way assist instruction. (HT515) Making choices and decisions based solely on state test scores is ludicrous.... On the other hand, it was great when TESA [Technology Enhanced State Assessments] was working well for us. It was great for us as teachers to get the data in order to strengthen our curriculum instruction for the short terms as well as the long term. (HT204) 146 I have seen drastic decisions made based on very specific data sources, but feel they have been overly reactionary and decidedly far-reaching. (HT2l2) Principals reported similar conflicts about the use of state test data and its impact on their schools. All of the principals acknowledged that they presented and reviewed their state test data results with their teachers. Most of them indicated that they used the data to help their teachers or Site Council identify areas where improvements could be made. Most of the principals mentioned that they considered it part of their leadership responsibility to put state assessment data in perspective and in context with other data collected. I think that has been my job to make the impact and to make the connection with "so what" [regarding test data]. (HP09) In response to data ... that's where leadership needs to step in and say, "What should we do with this?" and "How should we look at this?" (HP05) In the interviews, principals were asked about future impacts they would like to see in terms of data use. 
Their responses indicated that they wanted to make the following improvements: (a) collecting better and more efficient data in-house that is more closely linked to daily instruction and learning, (b) cultivating a culture of inquiry that uses data, and (c) building a systemic understanding of state test data as only one source and one indicator within a broad variety of indicators of success. Several principals and both superintendents agreed that while state test data were helpful for looking at trends or patterns (with groups or more broadly), they regretted the "reductionistic" tendency of such data when characterizing a single student or a specific school. I can talk about all the different ways we know our students and how we serve them, but I will draw the line at reducing a child's performance to a number and a teacher's performance to a number. So I guess it takes all of us as school leaders having the courage to not perpetuate something that doesn't serve children. (HP04) Whenever it's a reductionistic point of view, reduced from this huge data set with millions or thousands of data points down to a reduced word like "meets," "exceeds," or "does not" or schools are "exceptional, strong, satisfactory" ... that kind of reductionism really results from very poor analysis and in many cases can become a misuse of data. (HS02) To determine how teachers across the six schools reportedly felt about each survey question on Impacts of data-based decision-making, chi-square distributions were computed. The results indicated significant differences (p < .05) in positive responses across the six schools for three of the six questions: (a) the active use of data has a strong impact, (b) teachers regularly use data to make decisions, and (c) the school has been strengthened as an organization by its use of data in decision-making. The remaining three questions (using data has a positive impact on teaching performance, knowledge and expertise have improved, and the use of student data has produced positive outcomes evidenced by state scores) could not be computed due to expected counts being less than five. According to the raw data for each of these three questions, teachers responded with a high number of "agrees" and a very low number of "disagrees." Only Johnson Middle School teachers responded to the question about positive impacts being evidenced in state scores with a high number of "disagrees." Figure 6 demonstrates each school's average positive responses regarding impacts of data-based decisions. Adams Elementary and Roosevelt Middle Schools had the highest positive responses (72% and 68%, respectively). Taft Elementary, Hoover High and Cleveland High were clustered closely together (54%, 51%, and 49%, respectively). Johnson Middle School was the only school whose overall average positive response fell below 50% (32%). Once again, there was an observable difference in response patterns between the two middle schools.

FIGURE 6. Impacts average positive responses by school. [Bar chart of each school's average positive Impacts responses: TES, AES, RMS, JMS, HHS, CHS.]

Table 23 displays the descriptive statistics of positive responses from each of the six schools. Each school's average positive responses, ranking and standard deviation are shown.

TABLE 23. Impacts Positive Responses Summary (Mean, Rank, N, SD)
Taft Elementary: .54, 3, 23, .209
Adams Elementary: .72, 1, 22, .161
Roosevelt Middle: .68, 2, 23, .147
Johnson Middle: .32, 6, 34, .242
Hoover High: .51, 4, 48, .139
Cleveland High: .49, 5, 31, .143

An analysis of variance (ANOVA) test was conducted to examine differences in positive responses among the six schools regarding impacts of data-based decision-making. The ANOVA revealed overall significant differences in the mean scores of the teachers' responses between the schools (F = 5.55, p < 0.001). T tests were conducted to examine the variance more closely. The t tests indicated that the differences between Adams Elementary and Roosevelt Middle Schools were significant at the .10 level. There were no significant differences between Hoover High and Cleveland High teachers' responses. In the survey, teachers also responded to an open-ended question asking, "What specific impacts of data-based decision-making have you observed in your school?" Table 24 displays the results of this question. Each school's number of responses was counted and converted into a percentage. The written responses were then categorized into four themes and ranked in order of the greatest number of responses to those themes. The order of the themes was as follows: (a) informed decisions about student placement, assistance and instruction; (b) informed decisions about school procedures, policies and programs; (c) no effective process or impacts; and (d) increased student achievement.

TABLE 24. Summary of Teacher Responses to Impacts of Data Use (number of responses per theme; responding teachers/teachers surveyed in parentheses)
Taft (5/23, 22%): informed decisions about student placement, assistance and instruction, 3; informed decisions about school procedures, policies and programs, 2; no effective process or impacts, 0; increased student achievement, 0
Adams (15/22, 68%): informed decisions about student placement, assistance and instruction, 10; informed decisions about school procedures, policies and programs, 1; no effective process or impacts, 2; increased student achievement, 2
Roosevelt (12/23, 52%): informed decisions about student placement, assistance and instruction, 6; informed decisions about school procedures, policies and programs, 4; no effective process or impacts, 0; increased student achievement, 2
Johnson (12/34, 35%): informed decisions about student placement, assistance and instruction, 7; informed decisions about school procedures, policies and programs, 0; no effective process or impacts, 5; increased student achievement, 0
Hoover (15/48, 31%): informed decisions about student placement, assistance and instruction, 5; informed decisions about school procedures, policies and programs, 8; no effective process or impacts, 1; increased student achievement, 1
Cleveland (10/31, 32%): informed decisions about student placement, assistance and instruction, 7; informed decisions about school procedures, policies and programs, 1; no effective process or impacts, --; increased student achievement, --

Most teachers (38) from across the six schools responded that data use in their schools informed their decisions about student placement, assistance and instruction. This was consistent with principals' responses and the mention of data use within schools' SDPs. In their school plans, principals reported on a variety of ways that data were used to help students with academic and social needs.
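The tallies behind Table 24 can be illustrated with a small sketch. This assumes, for illustration only, that each open-ended response has already been hand-coded to a single theme; the coded responses below are hypothetical and do not reproduce the study's counts.

```python
# A minimal sketch of tallying hand-coded open-ended responses into the four
# themes of Table 24. The coded responses and school size are hypothetical.
from collections import Counter

THEMES = (
    "Informed decisions about student placement, assistance and instruction",
    "Informed decisions about school procedures, policies and programs",
    "No effective process or impacts",
    "Increased student achievement",
)

# Hypothetical coded responses for one school (theme index per response).
coded_responses = [0, 0, 1, 0, 2]
teachers_surveyed = 23

counts = Counter(coded_responses)
response_rate = len(coded_responses) / teachers_surveyed

print(f"Responding: {len(coded_responses)}/{teachers_surveyed} ({response_rate:.0%})")
for idx, theme in enumerate(THEMES):
    print(f"{theme}: {counts.get(idx, 0)}")
```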
Responses regarding data usage to make informed decisions about school procedures, policies and programs ranked second most common from teachers. This result was also consistent with principals' responses regarding their use of surveys to better understand the effectiveness of programs and procedures. Many principals mentioned that surveys were used to shape and change policies and practices in their school. These stories were reflected in the narratives and goals of the several schools' SDPs. There were also comments from teachers about lack of an effective process or observable impacts regarding data usage. A significant number ofthese comments (five), as seen in Table 24, came from Johnson Middle School. This was consistent with other negative responses from Johnson teachers in other portions ofthe survey, Processes and Impacts. In both of those sections of the survey, Johnson ranked last for 152 overall positive responses. Johnson Middle School teachers did not comment on school-wide procedures, policies or programs that were created or adjusted based on data. The principals at Johnson indicated that data have impacted staff conversations. Using data had strengthened their own skills in examining data and framing questions. The principal expressed a desire to "broaden the sources of data and look at a lot more of our work as sources of data." She added that recently many of her teachers reported that they felt "ashamed and attacked by what the data say." There appeared to be no significant similarities or differences between the school levels (elementary, middle, high) regarding impacts of data use on decision- making. School administrators and teachers reported noticeable impacts to their own data collection, processes and leadership decisions. Examining the average total positive responses of each school indicated that there was greater variability between the two elementary schools (52%, 73%) and the two middle schools (70%, 32%) than between the two high schools (52%, 48%). The two high schools had also scored more similarly across the two other portions of the survey (Leadership, Processes). In this study, three of the schools selected (one elementary, one middle, one high) were located in Belleview and the other three were located in Oakview. As described earlier, there were geographical and demographical differences between these two cities. Results of the surveys, interviews and SDPs showed only one bit of evidence that location may have impacted these schools in regard to data-based decision-making: the ELL data used by the school personnel located in Belleview. Teachers from the 153 elementary school (Adams) and middle school (Johnson) reported a high usage of ELL data as a source for data-based decisions in their school. The principal at Cleveland included a number of goals within the SDP that focused on improving ELL services at the school. At the time of the study, the city of Belleview had a higher minority population than Oakview. Otherwise, there was no observable pattern of Sources, Leadership, Processes or Impacts from the surveys, interviews or SDPs that suggested geographic location was a significant factor in data-based decision-making among the six schools. In review, the first research question examined the similarities and differences in how the teachers and administrators in the schools used the data they collected. 
This was done by creating a typology of each school based on responses from the survey data, interviews and on each school's School Development Plan. Results of these similarities and differences were noted, calculated and presented in the various tables and through the descriptive and analytical statistics. Examination of the similarities and differences revealed patterns indicating how data were used to inform decision-making. These patterns are reported in more detail in the following analyses of responses to the second and third research questions. Research Question 2 Research Question 2 asked, "What patterns emerged that implicated how data were used to inform decision-making?" 154 After the examination of each school's use of data-as reported in the teachers' surveys (Sources, Leadership, Processes and Impacts), each principal's interview and each school's SDP-several significant patterns emerged across the six schools. Sources of Data Used In the study at Highland School District, teachers mostly reported using local data (teacher-made assessments, observations, and analysis of student work) to make decisions in their classrooms. In both elementary schools, teachers also used scored student work samples. Principals acknowledged that teachers in their schools preferred to use diagnostic data that gave more in-depth information about how their individual students were performing. These statements were consistent with earlier positive responses by teachers that their schools used multiple sources of data. Teachers noted that they were primarily responsible for collecting their own daily or formative student assessment data to guide their instruction. They commented that they were provided results of state assessment data and met in various forums or meetings for reviewing individual or group performance. However, the teachers reported consistently that they did not consider state assessment data to be helpful or reliable in making decisions about teaching and learning. Larger group data sets (grade level or departmental) in the form of state assessments or vendor-created assessments were still collected primarily by the principals and brought forward to team or department meetings for conversation and inquiry. 155 Across each school, teachers and principals reported that results from state assessment data were not the focus in characterizing their schools or what drove their decisions for daily instruction in the classrooms. They were, however, the only form of assessments that were consistent across the schools and, therefore, ended up being the only data used in comparing the schools. Teachers responded that state assessments (at all levels) were used for making school-wide improvement decisions. Principals acknowledged that they did take the state assessment data seriously because this information was the data used by the state and media to measure and report each school's AYP and overall rating. The principals added that group trends or patterns revealed in the state assessment data were used to identify goals for improvement. According to the surveys and interviews, the school personnel actively used a variety of perception data for school-wide improvement-some more than others. These perception surveys were mostly initiated and generated by the principals or a group that the principal facilitated. The type, timing and distribution of the surveys were driven by the principals' goals and agendas as reported in their interviews or written in their SDPs. 
Teachers and principals described surveys given to parents, students and staff designed to elicit information about the school culture, the school as a system, and professional development needs. Teachers across all six schools did not report high usage of their own surveys to parents. Only the teachers at Adams Elementary and Roosevelt Middle indicated using student surveys. None of the teachers or principals reported using surveys with community members.

Teachers reported using a wider variety of data for school improvement than for classroom instruction. More demographic data and perception data were used when school-wide decisions were made. The teachers' responses to the use of student learning data revealed a pattern between the levels. The elementary schools differed from the middle and high schools regarding the use of student learning data. In the elementary schools, the teachers reported that student learning data (state assessment and localized, teacher-administered assessments) were used both for making school-wide decisions and for informing classroom instruction. This was not the case in the middle and high schools. In the middle and high schools, the teachers reported the use of state assessments for school improvement and the use of local (teacher-administered) assessments for classroom instruction. Elementary school teachers from the study cited examples of how both data sets initiated classroom and school-wide practices, such as pull-out intervention programs for struggling students (e.g., reading groups), behavioral support for students struggling socially, and additional curriculum resources.

There was a strong message reported by the administrators in the study (superintendents and principals) that there needed to be discretion and discernment regarding the collection and use of data. This message was supported by the superintendent:

Data and information always have to be put into context of what you know, what you're trying to accomplish, why you're trying to accomplish it, what value structures you have, and relationships. It's a human endeavor. It's contrary to a mechanistic way of thinking which takes information into pieces and parts. Humans don't work that way. Things don't work that way. They're systems. They're interconnected. You push on one little piece here and it changes something else. You have to look at data in a holistic way in the context of a lot of other things. (HS01)

Principals mentioned being (a) cautious about the types of data collected; (b) attentive to how constructive collected data may be to teachers and the school; (c) aware that there needed to be a "balance" of when, how much and why data were presented to staff; and (d) cognizant that data could be misused or misconstrued.

Leadership in Data Usage

In the Highland School District, leaders played a significant role in how data were used to make decisions. The philosophy of data-based decisions was consistently articulated by the administrative leaders interviewed for this study. There was a consistently reported belief that decisions were made involving data, but that data alone did not drive the decisions. As the superintendent stated, "I don't believe that rote analysis of data gives you answers. You have to fold in experience, intuition, knowledge, and context." Each school administrator acknowledged that he or she determined the extent to which data would be collected, analyzed and used as part of the school-wide decision-making process.
The principals reported using a shared leadership style in the form of a Site Council, academic department or teacher-based committee to help collect and analyze the data. There was also a reported expectation on the part of administrators that teachers should be involved in making decisions about curriculum, programs, student needs, school improvement and school culture. The administrators and teachers acknowledged that multiple sources of data (demographic, perception, student learning) were collected for school improvement at each school.

In the district, teachers generally believed that their principals exhibited strong leadership skills in using data to improve classrooms and to solve school-wide problems. Overall, there was a strong belief that teachers had input into decisions that were made in their schools and that staff were encouraged to use data in making decisions. These responses were mirrored in the principals' interviews and the School Development Plans (SDPs). Examples of how teachers provided input were through surveys, committee meetings, Site Councils and informal conversations with their administrators.

Although the statistical analysis of the teachers' surveys indicated generally positive perceptions of principal leadership across the district, there was notable variance among the schools. One school in particular, Johnson Middle School, demonstrated the lowest percentage of positive responses to almost every question regarding Leadership when compared to the other five schools. It was also observable that the two middle schools selected in the study had the highest (Roosevelt, 69%) and the lowest (Johnson, 36%) percentages of average positive responses to data-based decision-making leadership in their schools.

Processes

The superintendents' expectations of how data were to be used in the schools were evident in the School Development Plan guidelines that were given to each school principal (see Appendix A). The guidelines stated an expectation that principals craft SDPs with collective responsibility for the vision and mission of the district. The plans were to be "infused with the notions of Performance Character and Moral Character as a means to improving student learning" (see Appendix A). Within the content of the plans, there was a written expectation for analysis of data, which was to include student performance data, surveys, and student work samples. District initiatives that were woven through each school's plan were (a) Research and Inquiry, (b) Teaming, (c) Interdependence, (d) Wellness, and (e) Literacy. All of these initiatives were expected to be evident in an overarching challenge to "Excellence and Ethics." The guidelines included the following statement: "We know that student achievement and high quality learning will flourish in a healthy school and classroom culture, [that is] Performance and Moral Character" (see Appendix A).

These district guidelines and statements were echoed in many of the principals' interviews regarding processes for data-based decision-making. The principals mentioned using "teams" and "research and inquiry" as forums to integrate data at both the teacher and student level. Teachers and principals at each school identified a site-based infrastructure whereby teachers and teacher representatives had input into decisions that were made.
Principals talked about an informal process of decision-making that included teachers' voices in various forums (committees, teams, grade levels, departments, Site Councils, one-on-one conversations). Identifying a formal decision-making process in each school was difficult for teachers. There was a higher number of "undecided" responses in this portion of the survey than in any other.

Early in the survey, three teachers mentioned receiving some training in collecting or analyzing data. For one teacher, it occurred in another district. For two teachers, the training was to analyze data collected in the classroom with new curriculum. Forty-eight teachers indicated that they had experience or coursework in statistics or data analysis during their undergraduate or graduate studies. The other 57 teachers reported no formal training in collecting or analyzing data. The high school principals mentioned that their teachers were learning to use technology (Excel, spreadsheets) that could help with data analysis. Two vice-principals mentioned that upgraded and user-friendly technology tools had made collecting and disaggregating data much easier than in the past. Each superintendent and almost all of the principals talked about using survey data to determine next steps in professional development for teachers and their own data skills. However, the responses from the superintendents, principals and teachers did not indicate a formal plan for staff to learn and increase their data-analysis skills.

Impacts of Data Use

The teachers and principals reported consistently that data had impacted student learning and overall student success in school. Teachers reported that there had been a positive impact on their teaching performance and their own knowledge and expertise in using data. Generally they acknowledged that the use of data had positively impacted their school and strengthened the organization as a whole. There was some variance between the individual schools as to perceptions of how data use was strengthening their school, but overall there was positive statistical agreement that it had done so.

Both teachers and principals reported that the impacts of data-based decisions could be largely seen in the decisions made to support students. These impacts included (a) student placement in various programs, courses and classes; (b) special or general assistance students may need academically or socially; and (c) types of instruction (specially designed instruction, small groups, ELL, modifications and accommodations, etc.). There was also agreement by both groups that data-based decisions impacted school procedures, policies and programs. The principals and superintendents noted that their challenge as leaders was to discern which data to collect, efficient ways to collect and analyze this information, and how to "nudge" their teachers to collect their own data more consistently. Several principals criticized the limitations of and media emphasis on the state assessment data, and hoped that there would be an exploration of alternative, more "authentic" standardized assessments.

Across five of the six schools, there was an overall positive response by teachers and principals regarding the impact of data on teaching and learning. As mentioned earlier, there was a recurring pattern of variance between one particular school and the five others.
Johnson Middle School had an overall average positive response of 32% in the Impacts portion of the survey compared to Roosevelt Middle School (68%). As observed in each of the other portions of the survey, the teachers from Johnson Middle School consistently indicated a less than favorable perception of data-based decision-making within their school as compared to the other five schools.

Additional Patterns Observed

There were no significant patterns of data-based decisions (Sources, Leadership, Processes, Impacts) between the levels (elementary, middle, high). There was some variability between the elementary schools, between the middle schools and between the high schools. These differences included data sources specific to levels and leadership or departmental team compositions specific to levels. There was an isolated difference between the two elementary schools and the other schools regarding types of student learning data and whether they were used for school improvement or classroom instruction. Otherwise, no other patterns of similarities or differences between school levels were detected in the data collected.

The geographical location of the schools in Highland School District did not appear to be a significant factor in data-based decision-making at each school, other than the ELL data collection and goals described at the Belleview schools. There were reports of negativity from the teachers regarding data usage at Johnson Middle School during this study. However, there were no observable patterns of negativity reported by the teachers at Adams Elementary or Cleveland High Schools. All three of these schools were located in Belleview and shared the same demographic and geographic population of students.

Summary

Results from Questions 1 and 2 indicated an overall positive relationship between data-based decision-making and leadership behaviors (principal and teacher), formal or informal processes, and impacts on teaching and learning across five of the six schools. Reports from principals' interviews and archival documents revealed data-based actions regarding student performance, professional inquiry, and school systems (programmatically, systemically and culturally). Although there were clear mandates (federal and state) for data usage to meet external accountability, there was also a consistent message from the administrators that data were not to be used in isolation, as a sole measure of student success, or without the use of appropriate discretion with individual students. The results from the first two research questions led to the following chapter, Data Analysis, and the results of the third and final research question.

CHAPTER V

DATA ANALYSIS

This chapter includes an overview of the similarities and differences in the way participant schools collected and used data (Research Question 1, reported in Chapter IV); the patterns and trends which were observed (Research Question 2, reported in Chapter IV); and, therefore, what data-based leadership model was evident at each school, between levels (elementary, middle, high) and across the district (Research Question 3, reported in this chapter). The results of each school's survey, interviews and archival documents were compiled into a typology and analyzed using Baker and Richards' (2004) continuum of data-based school leadership behavior (compliance, performance, ecological).
The focus, analysis and evaluation of each data-based component (Sources, Leadership, Processes and Impacts) for each school were examined according to the school's progression along the continuum. Table 25 outlines Baker and Richards' (2004) theoretical framework, which was used for this analysis.

TABLE 25. Baker & Richards' (2004) Model of Data-Based Decision-Making

Goal: Compliance
Focus: Regulations and standards
Analysis: Compare data to the standards
Evaluation: Data are collected for accountability measures

Goal: Performance
Focus: Input from stakeholders
Analysis: Data are collected from multiple resources, and performance of the organization is important
Evaluation: Efficient and increased performance

Goal: Ecological
Focus: Ongoing and continuous growth and transformation
Analysis: Data expose relationships and differences, and constitute the basis for action
Evaluation: Organizational learning occurs through the experiences and introspection of the process

Sources

Baker and Richards (2004) and Bernhardt (1998, 2004) remind us that a variety of data must be collected for schools to make significant and targeted school improvement and for data to be the basis for action. These types of data include demographics, perceptions, and student learning. The results of the study revealed that personnel in each school collected a variety of data across all three types. However, there was a difference between the variety of data the principals collected and the variety of data the teachers collected.

At each school, principals reported collecting and analyzing demographic, student learning (state assessment or local assessment) and perception data in order to evaluate programs, policies and areas of performance for improvement. When the topic of state assessment data came up in each administrator's interview, there was a consistent response that it was collected, analyzed and considered important in understanding large-group achievement or trends in the school. Moreover, each principal added that state assessment data were not the sole form of data used in the school, nor should they be used as the sole form of data to evaluate a school's successes and areas for improvement.

Teachers from all six schools reported the use of a variety of demographic and student learning data to improve their school and their classroom instruction. Unlike the principals, the teachers at each school reported the use of local assessment data, not state assessment data, to inform their instruction. They reported the use of state assessment data for school improvement goals. State assessment data evidently comprised one data source that both administrators and teachers reportedly found themselves struggling to keep in the context of other assessments administered at a local level. In schools where the teachers reported greater negativity or conflict about the state assessments, there were parallel responses towards leadership decisions and school processes involving data use. It can be implied that these responses and reactions are linked.

According to Baker and Richards (2004), in order for a school to move along the continuum past the compliance and performance dimensions towards an ecological dimension, its leaders must use data for ongoing and continuous growth and transformation. The analysis of data sources should expose "relationships and differences" (Baker & Richards, 2004, p. 20). For a system to do this, perception data are integral.
Perception data are important in gleaning what other people believe, perceive or think about different topics or what is happening in the school community. They inform the organization about how its stakeholders think about it. This type of data informs an organization of its current status and how it can transform itself (Bernhardt, 1998).

All of the principals reported the use of perception data of some sort and of some stakeholder groups (surveys) as a data resource for gathering teacher, parent and student perceptions as part of ongoing school improvement efforts. However, only teachers in two schools reported using surveys for classroom instruction. Neither principals nor teachers reported using community surveys; however, in the interview and district archival documents the superintendent reported the use of community surveys on a periodic basis to elicit stakeholder opinion of district policies and long-range plans.

These results demonstrate that the administrators at all six schools understood the importance of all three types of data and made attempts to gather multiple sources of data, not just for compliance or performance, but for ongoing usage within their schools. However, not all of the teachers at each school used a variety of data for ongoing growth within their classrooms. Cultural, political, and technical barriers to data usage at the six schools mirrored those that were found in earlier studies (Ingram et al., 2004; Lachat & Smith, 2005). The questionable "value" of state assessment data to inform classroom instruction was an example of a cultural barrier that arose out of this study as well. An example of a political barrier was the conflict reported by the teachers at Johnson Middle School about data usage to compare and judge their school's performance scores with other schools. Feelings of self-efficacy linked to data use were found in Bettesworth's (2006) study and reported by teachers and principals in this study. Principals and teachers both mentioned a need for technological support when dealing with data.

In summary, data sources were collected and utilized to increase performance in target areas and, in some cases and in some schools, to inform continuous growth and ongoing transformation.

Leadership

The literature acknowledges the important role of leadership for effective data usage in schools (Armstrong & Anthes, 2001; Feldman & Tung, 2001; Lachat & Smith, 2005). Rather than relying on professional judgment alone, principals are expected to examine data when making decisions for school improvement (NAESP, 2002). Three levels of leadership were examined in this study: leadership at the state level, leadership at the central district administrative level (superintendents), and leadership at the site level (principals).

In response to questions about leadership support from the state, neither administrators nor teachers responded favorably. Superintendents and principals characterized the state as an agent of mandates and an organization that regulated compliance with those mandates. Several principals added that they considered the state a "bureaucratic organization" and that when they needed support, they looked to the central administrators for information and resources. The teachers' survey responses revealed similar opinions about support from the state. The chi-square test for this question rejected the null hypothesis.
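The chi-square results reported in this section refer to tests run on the distribution of teacher responses to individual survey questions. The sketch below is illustrative only: the response counts, the item wording and the even-split null hypothesis are assumptions made for the example, not the study's actual data or procedure.

```python
# Illustrative only: a chi-square goodness-of-fit test on hypothetical counts
# of teacher responses to a single survey item about state-level data support.
# The counts and the "evenly split" null hypothesis are invented for this sketch.
from scipy.stats import chisquare

observed = [12, 31, 62]   # e.g., agree, undecided, disagree (hypothetical counts)
expected = [35, 35, 35]   # null hypothesis: responses spread evenly across categories

chi2, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {chi2:.2f}, p = {p:.4f}")

# A p-value below the chosen alpha (e.g., 0.05) rejects the null hypothesis,
# i.e., the pattern of responses departs significantly from an even split.
```

Whether the study's own tests compared response categories within a question or responses across schools is not detailed here; the sketch simply shows the general form such a test can take.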
This negative response from teachers may be due to several factors: the conflict teachers reportedly felt over mandated state assessments in general, or a lack of knowledge about what types of support the state provides to school districts.

In response to questions about leadership support from central administration, principals reported being supported by the superintendents in terms of the trust extended to them in their site-based management of their schools. Two principals mentioned receiving support services from the district-level technology director when databases, software and advice were needed. There were mixed responses from teachers to this question. The chi-square test for this question also rejected the null hypothesis, meaning that a positive response to this question could not be reliably generalized across all six schools. However, the raw data revealed that teachers from two schools had responded favorably. It appeared that, depending on how data support services were accessed, teachers had different perspectives. The different responses could also have been due to the recent posting of building-based information technologists at each school who were available to help teachers deal with data-based questions or tools. Years ago these positions were located at the central office; at the time of the study, they were located within each school.

Questions about strong site-based leadership (principals) received favorable responses from teachers in five of the six schools. The teachers in the five schools agreed that there was strong leadership in using data to solve problems school-wide and that their principals modeled good uses of data. One school stood out as having an internal conflict about its principal's leadership in terms of using data to make decisions. It may be that, despite the principal's best leadership efforts, the external pressures of a high-performing school district and less favorable state assessment results had caused the teachers' reported feelings of frustration. Or there may have been internal dissatisfaction with the principal's leadership efforts in general, with the pattern of negative survey responses linked to these teachers' attitudes and beliefs about her. This area of contention did impact the analysis and the resulting position of this school on the continuum.

All of the principals concurred that it was one of their primary roles to seek out the data, analyze it, and bring it forward to the staff. Together, they would use it to make informed decisions within their schools. This expectation was similar to what the principals in Hill's (2004) study reported. All of the principals reported efforts to encourage shared decision-making among their staff and parent communities. Each principal described teams and committees of stakeholders involved in using data to make decisions. The teachers' surveys mirrored the principals' claims that they had input into decisions made at the school. However, the teachers at only one of the schools reported being proficient at accessing and using data for their classroom instruction. This finding, a lack of data proficiency, is similar to findings in other studies (Hill, 2004; Lachat & Smith, 2005; Ingram et al., 2004). According to the principals' responses to the question about professional development opportunities, there appeared to be no formal plans for training teachers how to gather and analyze data.
The principals and superintendents reported that teachers needed to gather, analyze and reflect on data and how they could inform their instruction, but lacked examples of specific training or workshops designed to assist them in knowing how to do that. Principals described teaming times, staff development days and committee meetings set aside to discuss data use or data results. The teachers' survey responses appeared to indicate that they themselves did not feel proficient (perhaps regardless of whether time was given) in accessing and using data.

For data to be an ongoing and important part of an ecological system, everyone must be involved in the collection of, analysis of, and reflection upon the data (Baker & Richards, 2004). According to one of the superintendents, this should also include the students:

Who should collect the data and information? Ultimately, the child. The child should be presenting themselves, scaffolding as they grow older. When you think of the senior in high school, she or he ought to be fully capable of presenting themselves as a learner, their accomplishments, their challenges, their next learning, their goals, their understanding of how they learn best, what conditions help them learn best, knowing their passions and how to follow them, being an inquirer and researcher. ... So, ultimately, the child should be able to do that. If you back that up throughout the system, then we should be gradually releasing that or engaging with that with the children. (HS02)

There was stronger evidence in two of the schools, Roosevelt and Hoover, that the inclusion of the student's "voice" (through forums) and involvement in presenting data about themselves (through surveys) was an ongoing part of the principals' and teachers' expectations.

In summary, the leaders in each of the six schools collected data not just for compliance with mandates in school development plans, but to increase performance in specific areas or in the entire school. Each of the school leaders also demonstrated efforts to collect data on the external and internal environment impacting the school. In some cases, the leaders made intentional efforts to include the teachers' and students' opinions about their well-being on an ongoing and continuous basis.

Processes

According to Baker and Richards (2004), as a school moves along the continuum, processes for collection, dissemination, collaboration, and communication of data should be part of a school's culture rather than serve only specific standards (compliance) or the increased performance of specific areas (performance). According to the principals and superintendents in the present study, there was no "formal" process in the district for gathering or using data. Each administrator did identify a process whereby structured teams, committees or groups worked with the principals to make data-based decisions. It can be implied, therefore, that the understanding of a process in this district was more of an "informal" one organized and facilitated by each principal's discretion and discernment. Most principals talked about having informal conversations with their teachers, having an "intuition," or hearing about a concern which then resulted in collecting data to gather more information. Another possible interpretation of this finding is that, in spite of the appearance of many people being involved in decision processes, the decisions are still being made by the principal or by someone the principal designates to make the decisions.
The superintendents' responses to these questions suggested that the organizational structure did not emphasize formal processes or policies: I don't know every decision [principals] make. I want to know that you're O.K. and accessible and things are working for you, that you can consult with me if you have a conflict. But on a day-to-day basis you're making decisions or you're working with people to make decisions. And there are probably decisions being made in your building that you don't even know about either. You're negotiating every day. You have to have some faith that decisions are made based on the values that we hold in this organization by whoever is making them. That the decisions are bound by some common sense of value and purpose in the mission that we hold. Not policy. But that we hold to what we truly value ... who we are. So that we say, this is the level of what we're dealing with and what is consistent with the values we hold. (RPOl) This set of beliefs, held by the administrators, of a more "informal" process reflects an organizational structure held together by shared values rather than set policies. According to the surveys, only the teachers at one school (Roosevelt) answered favorably to the survey questions regarding processes in their schools for decision- making. As mentioned in Chapter IV, the raw data revealed a larger number of "undecided" responses to these questions than to questions in other portions of the survey. Earlier survey responses indicated that teachers felt they had input into 174 decisions; however, according to these responses, they were in conflict over the way their input played out in the processes (formal or otherwise) that were in place. It is possible to interpret this finding as teachers being unused to identifying formal decision processes, or that the informal decision processes are successfully masking the formal process-that is, the principal makes final decisions. Both the administrators and the teachers identified "formal" processes (committees, teams, Site Councils), yet they didn't name them as such. This could also be due to the nature of shared and unshared power within systems, as mentioned earlier. It also reveals the nature of implicit (and explicit) systems. In other words, there may be implicit understandings of how decisions are made (processes) in each school or the district, which may come with their own set of issues in the decision-making process. Hill's (2004) study found a similar response to the questions about processes for data-based decisions. Teachers from four of the five schools in the study could not clearly define a formal process for their school improvement goals. A key finding revealed a lack of common language across the schools regarding data use and processes in dealing with data (Hill, 2004). In summary, there appears to be an ecological perspective reported by the principals and superintendents for how data-based decision processes are occurring in the district and in the schools. In other words, the administrators described the fluid and ongoing "bidirectionality" of influences between the nested "individual," "microsystems" and "macrosystems." However, there is some confusion about what is 175 happening at the level of the teachers and the classrooms. It appears that at the teacher level there is only clarity about data processes in place to meet standards or to increase performance within the classroom or in specific areas as written into school development plans. 
Impacts

According to the literature, the positive impacts of data-based decision-making are evident in school improvement and a school culture of inquiry (Brunner et al., 2005; Chen et al., 2005; Chrispeels et al., 2000; Feldman & Tung, 2001). The 18 Challenge schools in the Annenberg study (Annenberg Institute for School Reform, 1998) reportedly moved to a place of "internalizing" the notion of being accountable to data findings while also implementing a cycle of reflective inquiry that integrates data into the process of inquiry. Senge (1990) described the impacts of inquiry-minded schools as those where knowledge is constructed from individual and social experiences, beliefs, values, emotions, and will. Baker and Richards (2004) summarized the impacts of data on ecological schools as an evolution where all four domains of intelligence (physical, emotional, mental and spiritual) improve.

In this study, data from each of the schools were analyzed according to impacts perceived by teachers, reported by principals and written in the SDPs. Teachers across all six schools responded positively to using data in their classrooms and reported that their knowledge and expertise in using data improved as they used it. They did not, however, respond favorably to the question of whether state assessment scores were an indicator of positive outcomes from data usage. The teachers' conflict with this question could have been in response to many variables, e.g., political controversy over high-stakes standardized testing, the argument that only one set of data does not create change, or a varying reaction to the ways in which test data are presented to the staff by each school's administrator. One would assume that in a high-performing school district like Highland, where 80% to 90% of the students consistently "met" or "exceeded" grade-level benchmarks, teachers may have been apt to point to high state assessment scores as an indicator of the quality work they do with students. However, the responses to this survey question indicated more unease or conflict about the use of state assessment scores to reflect or characterize their work with students. It can be implied that the majority of teachers in this study use data, but do not believe that positive outcomes in their school are reflected in specific data, namely state assessment scores. It can also be implied that teachers may need more training in the development and uses of individual student-level data to be completely comfortable with reporting on their data uses in the classroom.

When asked in an open-ended question about the impacts of data use that teachers noticed in their schools, teachers responded primarily with examples that had to do with student instruction, placement, and assistance. Teachers in five of the six schools also made comments about impacts that informed decisions about school procedures, policies and programs. However, the teachers in one particular school, Johnson, did not respond favorably to the majority of questions in this portion of the survey. Open-ended responses from these teachers about impacts of data-based decisions in their school were only categorized as "informing student placement, assistance and instruction" or "no effective process or impacts." The respondents from this same school were also consistently negative towards the Leadership and Processes in place at the time.
There were no open-ended responses from teachers indicating that data-based decision-making had created a more "inquiry-minded" school culture. Open-ended responses about the impacts were more student-focused. A few responses implied a collective responsibility in examining data by using pronouns or nouns such as "we" or "our team" and "our school." Principals, however, reported that one of the impacts they noticed from using data was a school culture where everyone was actively inquiring and using data to engage in conversations about teaching and learning. This discrepancy in perceptions between principals and teachers may be a cultural barrier or a leadership factor. Principals also reported that using data impacted their decisions by giving them objectivity, details, intention, an ability to evaluate programs or systems, and credibility.

In summary, teachers' perceptions about the impacts of data-based decisions reflected a belief more focused on student performance outcomes. There was an indication in some schools that data also impacted the school as an organization. Principals demonstrated a more ecological set of beliefs about the impacts of data usage as a conduit for a culture of professional inquiry.

The analysis of the data from each school across the four components of data-based decision-making (Sources, Leadership, Processes and Impacts) leads to the third research question.

Research Question 3

Research Question 3 asked, "What data-based leadership model (compliance, performance, ecological) predominated at each school, at each school level (primary, middle, high), and within the district (K-12)?" The survey results, interview responses, and archival documents were compiled, analyzed and sorted into six typologies, one for each school. Through a pattern-matching process, these data were used to determine where each school placed along the leadership continuum of compliance-performance-ecological. Figure 3 shows the conceptual model (Hill, 2004) used to make the placement determination.

The data analysis revealed that the Highland schools fell somewhere between the performance and ecological models of the continuum. There was evidence that the staff from each school collected a variety of data sources (including perception) on a continual basis to inform school effectiveness. There was evidence of strong leadership aimed at performance increases in specific areas or throughout the entire school and, in some schools, on a broader level and an ongoing and continual basis. Whether formal or informal, processes were in place to increase performance within targeted areas or were in place as part of the school culture. Teachers across the six schools reported impacts that aligned closer to the performance model of the continuum with a focus on performance results. Principals, however, indicated that staff members were more engaged in professional inquiry due to data usage in their schools.

Differences between the six schools did impact where they were placed between the performance and ecological models on the continuum. Roosevelt Middle School and Hoover High School were placed closer to the ecological end of the continuum. Both of these schools reported a data focus on transformation of practices at the school site. They communicated in their interviews, SDPs and surveys that they collected a wide variety of data across a broad range of inquiry. Both schools cited surveys of students to inform physical and emotional well-being.
At Roosevelt, students gave "voice," through surveys, about bullying experiences, homework and course workload. At Hoover, students were given "voice" via regular student panels, or forums, and surveys on topics like drugs, alcohol, grading practices, and safety. In both schools, formal evaluation of the leadership and the school system was a routine practice conducted by staff, parents and students. Teachers had "voice" through many surveys as well as through their representative committees. Teachers were expected to gather both academic and character-based data on a regular basis and in response to inquiry. Teachers at both schools reported on data impacting 180 decisions about student instruction and assistance as well as decisions about the overall school system, procedures, policies and programs. Taft Elementary and Adams Elementary schools were placed midway between the performance and ecological models on the continuum. While the leaders at both schools operated with an ecological set of beliefs in their perspectives and approaches to data, teachers tended to see data as largely a performance indicator or a targeted aspect for improvement. There were annual systems in place for collecting a variety of data- e.g., student exit surveys and teacher surveys, as well as periodic data collection for targeted inquiry. Data collection still appeared to be largely initiated or required by the principals, and the processes for collecting and reflecting on data seemed unclear to the teachers. In the open-ended question about impacts of data use, teachers at Taft gave the fewest responses of all six schools. Teachers at Adams gave the most responses of the six schools, with a majority of responses directed at student-focused outcomes. There was little recognition from either of the two groups of teachers that data are an integral part of an improving school culture. Cleveland High School and Johnson Middle School were both closer to the performance model of the continuum but appeared to me to have potential to move towards an ecological leadership model. The principals at both schools reported beliefs and actions that aligned with an ecological model, but their teachers reported beliefs and actions within the performance model of the continuum. When asked about data Sources, Processes or Impacts, the teachers at these two schools identified mostly state 181 assessment data as the type of data collected and used to determine decisions about student placement, assistance and instruction and how the school was characterized overall. The principals at both schools reported using survey data to gather information about the school, but teachers did not mention or address these surveys in their comments. This suggests that either these surveys were primarily conducted by the principals or teachers assumed that the use of the surveys had little impact on the organization. There is potential for Cleveland and Johnson to move towards an ecological leadership model due to the district's value of systemic interdependence, teaming and collaboration. Johnson Middle School had the greatest variance from the other five schools in terms of Leadership, Processes, and Impacts. Based on the descriptive and statistical analyses of the survey responses from these teachers, there was a significant negative perception of the school's leaders during the time of this study. 
There is still potential for the leaders and teachers at Johnson to move towards an ecological model; however, to do so will involve dealing with the issues (apart from data-based decision-making) that appear to exist between them. There was no clear preference for leadership models between levels (elementary, middle, high). Both elementary schools were placed between the performance and ecological models. The two high schools were similar in some ways, but the leaders and teachers at Hoover High School reported more variety in the data they collected. The staff at Hoover appeared to have moved to a transformation of practices at their school 182 site. The two middle schools clearly had the greatest variance of all the schools. These results demonstrate that data-based decision-making in this district was not significantly influenced by size or level of school; rather, the predominant influence seems to be the leaders within the schools themselves. Overall, I would place the district between the performance and ecological leadership models along the continuum. Despite the external mandates for compliance and the community's pride in its high performance, the district superintendents and principals have chosen to establish a vision and make decisions that are not driven by federal or state mandates or by performance expectations. Central administrators expect school development plans to include time and resources devoted to all actions that address all four domains of intelligence: "research and inquiry," wellness, moral and performance character, and "teaming." The superintendents and principals in the district had an aligned set of beliefs about systems thinking and the potential roles that different kinds of data can play within a complex and adaptive organization. Each principal gave examples of data collection from parents, students and staff to inform its teaching and learning as well as the overall health and well-being of the children and the system. The teachers, however, appeared to be less clear about the district's beliefs about data-based decisions and less clear about how they themselves might be making use of different kinds of data within their classroom practices and connected to school and district-level practices. Many of them still defined data as state or local academic assessments used to compare school and district-level performance, judge performance, and meet specific 183 mandated goals. There appeared to be greater receptivity toward and trust in data that were initiated and collected internally at each school site and within each classroom. Teachers did not talk directly about how classroom-level data might be summarized in alternative ways to give different pictures of the school as a whole, of a particular content area, a particular teacher, or of the district as a whole. Summary While this study showed that the individual schools and the district as a whole fell between the performance and ecological leadership model, four of the six schools went beyond these expectations and established processes for collecting a variety of data on an ongoing and continuous basis to inform a broader understanding of how the school was functioning and meeting the needs of students (physically, emotionally, mentally, spiritually). These four schools were closer to the ecological model. The remaining two schools were closer to the performance model, but demonstrated indications and potential of moving along the continuum toward the more sophisticated model. 
One of the schools, Johnson Middle School, was clearly constrained in moving beyond a performance model by the teachers' perceptions of the leaders at the time of the study.

Across the district, the superintendents and principals appeared to have an ecological set of beliefs regarding how and why data should be collected. Many of them cited Wheatley and Senge in their responses to how school teams should think and work together. Many of them talked about professional literature they were reading and discussing with their staff that indicated an ecological understanding and determination for their school staffs to be ecologically minded. What appears to me to be keeping each of the schools from moving along the continuum is either a predominant set of beliefs still held by most of the teachers that performance data have greater value than other forms of data; or a lack of formal data training that could potentially strengthen teachers' skills in using data and their current set of beliefs; or a hesitancy by teachers to replace their experience-based knowledge with "data knowledge" that someone may rank higher in value.

In the final chapter, I discuss the conclusions of the study, limitations of the study, recommendations for further research, and recommendations for how this district, or other districts, can evolve along the continuum towards an ecological leadership model for data-based decision-making.

CHAPTER VI

CONCLUSIONS

In education, the growing trend in the United States towards increased accountability has meant, among other things, that school personnel are expected to use data to make decisions and report results. This trend toward increased accountability has come with federal and state mandates regarding standardized reporting on academic success, school safety and teacher quality. Most recently, the No Child Left Behind legislation added sanctions for schools that do not meet Adequate Yearly Progress in these same three criteria. These external mandates have had great influence on the work and decisions that school leaders and teachers must make in classrooms and in school improvement plans. It is increasingly important that school personnel use many forms of data effectively to meet school improvement goals. How data are used is determined by the beliefs, attitudes and actions of school leaders and leadership teams.

Purpose Statement

The purpose of this study was to explore how six schools in a suburb of Portland, Oregon, used the data they collected to make decisions. In this study, I used a model of ecological leadership and data-based decisions developed by Baker and Richards (2004). There is very little research that examines an ecological leadership model in a school system. The first case study that did so was conducted by Hill (2004). Hill examined five elementary schools in North Carolina and their data-based decision-making practices, using Baker and Richards' model. Hill's study was designed to provide schools with information on a data-based leadership model that would "assist them in determining their current status and increasing capacity over time" (Hill, 2004, p. 160). In order to do so, Hill developed a typology to identify Sources, Leadership, Processes and Impacts of data-based decisions in each school. Using the typology allowed this study to measure questions about data use, leadership, processes and impacts along a continuum of compliance-performance-ecological models of data-based leadership.
This study replicated Hill's research design and questions.

Research Questions

The study's research questions replicated those used in Hill's (2004) original study and expanded them to include middle and high schools. Regarding data collection and its use within decision-making, the same sets of beliefs and actions within each school were explored:

1. What were the similarities and differences in how the schools used the data they collected?

2. What patterns emerged that indicated how data were used to inform decision-making?

3. What data-based leadership model (compliance, performance, ecological) predominated at each school, at each school level (primary, middle, high), and within the district (K-12)?

Summary of the Findings

The school district and its six schools in the study were found to be predominantly between the performance and the ecological models along the leadership continuum. Two of the schools were closer to ecological; two were in between, and the last two were closer to performance. Of the two schools closer to the performance model, one school in particular demonstrated a pattern of conflicted responses towards the school leaders and how data were used. There did not appear to be any significant impact by school level (elementary, middle, high) on data use.

The school and district administrators were pivotal in determining the implementation and evolution of data-based decision-making at their schools and in the district. The principals consistently demonstrated ecological beliefs and behaviors towards data and a perspective that called for "balancing" external mandates with internal needs to make professional, informed, school- and classroom-based decisions. Teachers, on the other hand, were closer to the performance beliefs and behaviors regarding data usage. Teachers reported examples and opinions about data more focused on academic scores and accountability expectations. It was also apparent that teachers looked to their principals, and not the district or the state, for collection, analysis and direction regarding group data that impacted school improvement scores and goals. The teachers identified themselves as the key people for using data for instructional and classroom improvement.

A wide variety of data sources was consistently collected at each school. Perception data (surveys) and student-involved data (forums) were examples of data that included the four intelligences (physical, emotional, mental, and spiritual) as defined within the ecological leadership model. School personnel at each school indicated that these data were collected on an ongoing basis for the purposes of school and instructional transformation and not simply for compliance or performance scores.

Personnel in the schools did not always have a common language or understanding of formal processes for data-based decisions. It was unclear whether this was due to more informal processes being in place or a lack of formal processes. It may also have been an example of a district that operated with a flexible "system-thinking" behavior pattern that acknowledged and valued "grass-roots" and bidirectional decision-making. This latter opinion was suggested by both school superintendents and referred to by several principals.
Teachers mostly seemed puzzled by the question and were typically unable to name the formal decision process in place for the school or the district, even though they deferred to the principal on site and clearly understood the source of district, state, and federal mandates.

Five of the six schools identified positive impacts within their own classrooms and their schools due to data-based decisions. Again, the role of school-based leaders was influential in this determination. The one school where teachers reported a negative impact also consistently reported conflict about how data were used to make decisions in the school or to judge the performance of the school.

As Hill (2004) asserted, "If schools are truly 'accountable,' they must consider data other than learning outcomes when making decisions for overall school improvement" (p. 168). Baker and Richards' (2004) ecological leadership model helps school personnel to recognize that focusing only on academic learning outcomes holds schools within the compliance domain. Creating school improvement goals targeting performance moves schools along the continuum toward a greater understanding of how data can be used to make intentional decisions. However, a shared understanding that continuous data collection informs growth creates a professional school culture that is "inquiry minded" and, therefore, ecological. Generalizing the ecological leadership model to other schools, other districts, and even other organizations that use data to make decisions is possible and desirable, in that more effective and frequent use of many sources of data can increase the overall effectiveness of the organization.

Limitations of This Study

Internal Validity

The strength of this research design was in the use of pattern-matching for the analysis and in the replication and extension of a previous research study. This is considered one of the most desirable techniques for case study analysis (Yin, 2003). Multiple sources of data (archival documents, surveys, interviews) were collected and categorized within a typology developed by Hill (2004) in a previous study. The typology of indicators of data-based decisions was matched to a conceptual model of data-based schools developed by Hill (2004) based on Baker and Richards' (2004) ecological leadership theory.

Limitations to internal validity included possible researcher bias. The researcher in this study was an administrator in the school district where the study was conducted. Although the researcher's school was not included in the study, the researcher was well known in the district. This could have impacted the interviews and the teachers' surveys. As a control for the possibility that teachers' knowledge of the researcher might impact their survey responses, the surveys were created using a web-based program and sent electronically to each school's teaching staff. Teachers were assured that their participation was voluntary and their responses were anonymous. No one's name was indicated on returned surveys; only site locations were noted. Participants were assured that no responses would be identifiable to them. The school district and each school were given pseudonyms. Limitations of the survey also included the inability to further explore conflicts that appeared at a school site, or differences between participants' beliefs and behaviors, because most responses were on a Likert scale instead of open-ended questions or interviews.
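The Likert-scale responses referred to here were summarized throughout the study as each school's percentage of positive responses (for example, the 32% and 68% figures reported for the two middle schools). A minimal sketch of that kind of summary appears below; the responses are invented and serve only to illustrate the calculation, not the study's actual data handling.

```python
# Illustrative sketch with invented data: collapse Likert-scale survey responses
# into the percent-positive summaries used to compare schools in this study.
# School names are the study's pseudonyms; the responses themselves are made up.
import pandas as pd

responses = pd.DataFrame({
    "school":   ["Roosevelt", "Roosevelt", "Roosevelt", "Johnson", "Johnson", "Johnson"],
    "response": ["Agree", "Strongly Agree", "Undecided",
                 "Disagree", "Agree", "Strongly Disagree"],
})

# Count "Agree" and "Strongly Agree" as positive responses
positive = {"Agree", "Strongly Agree"}
responses["is_positive"] = responses["response"].isin(positive)

# Percentage of positive responses per school, rounded to whole percents
summary = (responses.groupby("school")["is_positive"].mean() * 100).round(0)
print(summary)
```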
It was more likely that the administrator interviews could have been affected by researcher bias, since they occurred one on one. As a control for researcher error or bias, each interview was taped and transcribed verbatim. A "negative-case sampling" method was employed to reduce researcher bias. The only other case study of this nature was Hill's (2004), and her findings indicated that the schools were between compliance and performance on the continuum. Analysis and findings of the data from that study were compared to the results in the current study. By doing so, the researcher carefully checked that the results in the current study, although different from those in Hill's, rested on similar decisions about particular behaviors and where they fell within the continuum.

It was also likely that interviewees could have responded according to what they believed the researcher wished to hear. This threat was addressed by assuring interviewees that their participation was voluntary and that their schools were given pseudonyms. In addition, separately interviewing both the principal and the vice-principal/instructional coordinator at each school reduced the possibility of researcher influence. Neither school administrator knew the questions in advance or knew how the other would answer the questions. Finally, collecting multiple sources of data allowed interview responses to be checked for accuracy against school development plan documentation and teacher survey responses.

Recommendations for future controls of internal validity may include using a peer reviewer as an impartial observer of the study. The peer reviewer would be another researcher not involved in the study. The role would involve periodically discussing the methods, results and analysis with the researcher and acting as a "devil's advocate" to challenge the researcher to prove the study's design and findings without bias. External audits could be another way to use outside experts to assess the quality of the study. Multiple data coders could also be used to achieve interrater reliability on the coding of interview responses.

Finally, the teachers' response rate to the surveys is a limitation to be noted. Each school did have a response rate higher than 50%, with a range of 55% to 71% across the schools. Having greater response percentages from each school would strengthen the internal validity and reliability of the study.

External Validity

The primary limitation of qualitative studies is typically noted to be in the limits of generalizability. However, case studies such as this one dealt with analytical generalization; that is, the researcher was "striving to generalize a particular set of results to some broader theory" (Yin, 2003, p. 37). Replications of case studies strengthen analytical generalizations even more. Since this was a replicated study based on Hill's (2004) work and since it included a multiple-embedded case study design (six schools), the results of each school were generalized to the ecological leadership model and thus can be considered more robust than the external validity found in most single-case designs. The research study also employed both quantitative and qualitative research methodology, which strengthened the design by collecting diverse types of data that provide a broader understanding of the research problem. Generalizability should be considered across people, settings, and times.
A limitation of this study was the nonrandom selection of the district, schools, and participants. The researcher selected the six schools because of their location (three in Oakview, three in Belleview) and their similar size and level (elementary, middle, high). Teachers were given the option of participating in the study; consequently, between 55% and 71% of each school's teaching staff participated. Employing a random sample of schools, increasing the number of schools in the study, and increasing the sample size of participants from each school would give the study greater external validity.

One of the criticisms of replicating Hill's (2004) methodology was the design limitation in the selection of participants. Only school administrators and teachers were included in the surveys and interviews. Student, parent, other staff, and community perceptions were not included in the study. One of the defining characteristics of an ecological school is evidence that the school gives particular consideration to students' physical, emotional, mental, and spiritual well-being. It would therefore seem important for students to be surveyed about the data Sources, Leadership, Processes and Impacts that they perceive are being used in their own schools. Such surveys could be appropriately designed for students at middle and high schools.

In this case study, and in Hill's (2004) case study, the settings were both suburban school districts. Since all schools are subject to state and federal accountability measures, generalizing this research and using the ecological leadership model for analysis in an urban or rural setting should not be problematic.

Understanding how data are used may also be affected by the passage of time. A decade ago, this study may not have seemed as imperative. A decade from now, this study may still be considered helpful to school leaders and educational policymakers, or it may no longer be seen as useful as mandates and practices change.

Recommendations for Further Study

The strength of a case study with a theoretical model is in its replication. Further studies using Baker and Richards' (2004) ecological leadership model and Hill's (2004) Conceptual Model of Data-Based Schools would strengthen the validity of both models. Conducting the study in more diverse locations, such as urban and rural areas, is recommended. This was the first case study to examine ecological leadership and data-based decision-making in middle and high schools. Further studies with K-12 student populations are also recommended, with a possible exploration of formative and summative data sources. Formative data sources (those collected throughout the year to inform instruction on a more regular basis) tend to be less politically and publicly sensitive than summative data sources.

As mentioned earlier, students' perceptions (from the middle and high schools) of data-based decisions may be included in further research. The definition of an ecological school is one that nurtures its students' well-being and integrates data collection into its school culture. Ultimately, the students should be collecting data about their own learning as well as data that pertain to the effectiveness of their school. Future studies should include students' perceptions as well as parents' and school board members' perceptions. These groups are integral stakeholders and contributors to decisions that are made in schools and districts.
Since this was a replicated study, Hill's (2004) Conceptual Model of Data-Driven Schools was used with its original integrity and design. At times in the study, this proved confusing as to whether schools were between models or whether they were actually between "stages." Reconceptualizing Hill's model around "stages" might allow the school or a future researcher to point directly to what the school can do next to move completely to the next stage. The model then becomes stages within a hierarchy of sorts: if an earlier stage is not completely present, or goes away at some point in the future, the school reverts to the thinking or actions of the previous stage and must contend with its practices in order to move forward again.

Lastly, the findings of this study point to expanding the current conceptualization of an ecological decision process to include the levels of organization, structure, and personnel, as seen in the extended Bronfenbrenner (1979) model. For example, using the different units of analysis implicit in the levels helped this research point to where a particular site employed the ecological model at one level of the organization but not at another. This interpretive tool could assist researchers and practitioners in directly pinpointing where change needs to occur. Ecological theory appears to be a robust theoretical framework for analyzing school practices from the micro- to the macrolevel.

Recommendations for Practice

Four of the six principals interviewed asked what would happen to the results of the study and whether their school would have the opportunity to examine the group data in order to learn from the findings. This question suggested that school leaders would find the ecological leadership model helpful in examining current practices. They might share the findings with their staff in order to stimulate discussion of how to improve data practices.

Many of the teachers and some of the administrators professed hesitancy in using some types of data appropriately. Professional development that explains how data can be collected, analyzed, and used to inform practice and improve schools could be planned in school districts, with authentic hands-on data and multiple sessions that allow participants to practice, share, coach one another, and engage in professional inquiry.

One of the key findings was that teachers found their own data most meaningful. This finding is supported by prior research and reinforces the recommendation for more training. It also points to finding a way to include individual assessment practices in the school's shared thinking and brainstorming with data. Hill's (2004) Conceptual Model of Data-Based Schools can be introduced to school leaders to assist in developing a common understanding of how data-based decisions can be made within an ecological leadership model. In the schools where teachers felt judged or conflicted about how performance data were used, this model could be the catalyst for reflective and candid conversations about how to move beliefs and attitudes along the continuum. A common language about data-based decisions and ecological schools might develop, and teachers might become more alert to opportunities for collecting data themselves and collaborating with school leaders in making data-based decisions.
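One simple way to make "moving along the continuum" concrete, in the spirit of the "stages" reconceptualization suggested above, is to record which model is evidenced at each level of the organization and name the next stage for any level that lags the others. The sketch below is only an illustration; the level names, classifications, and function are hypothetical and do not report the study's findings.

# Ordered continuum (earlier stage to later stage) from Baker and Richards (2004).
STAGES = ["compliance", "performance", "ecological"]

def next_step(by_level):
    """For each organizational level, report whether its evidenced model lags
    the most advanced level and, if so, what the next stage would be.

    by_level: dict mapping a level name to the model evidenced there
    (hypothetical classifications, not the study's findings)."""
    most_advanced = max(by_level.values(), key=STAGES.index)
    report = {}
    for level, stage in by_level.items():
        i = STAGES.index(stage)
        if stage == most_advanced or i == len(STAGES) - 1:
            report[level] = f"{stage} (no gap)"
        else:
            report[level] = f"{stage} -> work toward {STAGES[i + 1]}"
    return report

# Hypothetical site where leadership levels are classified as ecological
# but the classroom level is classified at the performance stage:
print(next_step({"district": "ecological",
                 "school leadership": "ecological",
                 "classroom": "performance"}))

A report of this kind points directly at the level of the organization where the next step on the continuum lies, which is the kind of pinpointing the extended Bronfenbrenner framing is meant to allow.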
As school personnel increase their own knowledge and capacity for using data to inform decisions, school improvement occurs (Bernhardt, 1998; Bettesworth, 2006). Additionally, as school personnel increase their understanding of how to use data in an ecological leadership capacity, data-based decisions could be made not only for compliance and performance reasons but also for the ongoing transformation of the school organization.

Summary

Educational policy advisor Michael Barber (2002) described the 2000s as the era of "informed professional judgment" (p. 11). He proposed that control of education be returned to educators, but with the requirement that they be informed professionals who use evidence and research, along with expertise, to justify and support decisions. The ecological leadership model for data-based decision-making provides a theoretical framework and recommendations for practices that support professionals' use of evidence and research to justify and support decisions. When asked about the future use of data that an administrator might like to see, one principal replied,

There's so much data out there that quoting data doesn't do anything productive. On a national level ... they just throw data back and forth. They don't engage in dialogue. What's the next step? How are we evolving so that we make important decisions? I guess it goes back to dialogue. Maybe that's the evolution of data use. (HP04)

We are in an era of "informed judgment" that also demands strong external accountability. The research questions in this study can be used to engage educators and policymakers in a dialogue that helps these schools, and all schools, evolve in their use of data for ongoing and continuous school improvement.

APPENDIX A
HIGHLAND SCHOOL DISTRICT SDP GUIDELINES

Highland School District
September 19, 2006
To: Principals and Directors
From: (Superintendents)
Re: The School Development Plan (content, form, format, timeline)

The School and Department Development Plans

We will be writing School Development Plans and Department Development Plans again this year. We want them to be drafted in October and completed by the end of November. The school board study session on November 13, 2006, will focus on School and Department Development Plans. Leadership teams should plan to be in attendance. We want to use this planning process to add coherence and strength to the district initiatives by coordinating with each school and department.

Research and Inquiry: Thoughtful inquiry and skillful study. Gathering and using information to pose powerful questions, analyze, synthesize and present new knowledge. Creative and critical thinking and the development of performance character.

Teaming: Collective responsibility for the success of each team and each member. A positive peer culture leading to elevated thinking and learning. Development of individual and group performance character.

Interdependence: Positive and meaningful connections between ideas, skills and disciplines. Development of moral character through consideration of complexity. Examination of the impacts and implications of action and interactions.

Wellness: Positive and meaningful learning opportunities that lead each student toward becoming a self-disciplined person who pursues a healthy lifestyle.

Literacy: Significant and meaningful connections between ideas and information. Developing the skills, knowledge, and disposition of a thinker, a reader and a writer. Becoming a discerning user of ideas and information.
School Development Plans

The school development planning process will help build collective responsibility for the mission of the School and District. The Plans will be infused with the notions of Performance Character and Moral Character as a means to improving student learning. The plans will demonstrate the school's contributions to action on District initiatives.

Department Development Plans

This year again, each of the support services departments will also prepare Department Development Plans. These plans follow the basic framework for School Development Plans and then adjust as needed for the unique needs of the departments. We want to infuse each Department Plan with the notions of Performance Character and Moral Character as a touchstone for producing our highest quality work.

Content of Development Plans

We use these plans as part of our update to the School Board. We use them with the State of Oregon in our Consolidated District Improvement Plan. Since they are public documents, it will be necessary that each School Development Plan has similar core elements. The School Development Plans are also required attachments when we apply for grants. Each plan should clearly include these sections:

1. Introduction: narrative description of the school, its culture and its context.
2. Report on the progress the school made on the goals from the past year.
3. Analysis of data (student performance, survey, student work samples, etc.) that leads you to the new and continuing goals for this year.
4. New or continuing school goals linked to board/district goals and performance indicators. How will you know you are making progress?
5. Action Plan: a list of strategies and actions planned to approach the goals.
6. Support and Staff Development: identification of resources you will use to support the development work.
7. Student Demographics: a description and demographic data about the students, including student achievement data over time.
8. Staff Demographics: a description of staff demographics, education, experience, highly qualified teacher status, and past participation in professional development.

Optional attachments: Any other school documents (data, survey results, themes, mission statements, philosophy statements, essential questions) to help paint a picture of the thinking and discussion of school development at your school.

Narrative Introduction to the Plan

Each principal will want to personalize the introduction to the School Development Plan for her/his own school, using a written style that best communicates the energy, commitment, and uniqueness of that school. At the same time, we will all include common elements so the plans represent our common attention to things that are important. The common process will recognize, too, the place of each school and department within the larger school community.

Accountability for Student Achievement

Development Plans describe the state of the school and the accomplishments to date. Most significantly, the plans express a consolidated commitment of the people at the school to be accountable for high levels of student achievement. The Development Plans are intentional expressions of our goals and action plans.

Building Upon the Past-Year Plan

Each school principal and leadership team has been preparing an SIP-SDP for each of the past several years. Those documents provide the baseline for this plan.
The basic School Development Plan framework can be used year to year with appropriate updating. Goals of any significance are usually multiyear goals. It is reasonable to continue a goal from the past year, noting continued progress and new emphases.

District Systems Link

The School Development Plan is an essential element of the district as a system, the district as a complex, dynamic whole. The School and Department Development Plans will be shared with the school board in written form and in a discussion forum. The systems question to think through is: How is our school using our learning from our past performance, from questions and experiences, and from the data to inform the improvement agenda?

Building Upon the District Consolidated Improvement Plan

Each school's School Development Plan is a companion to the Consolidated District Improvement Plan. The two together are required legal documents that must be on file at each school, at the district office, and available to the public. Our latest updated plan was registered with the Oregon Department of Education in October 2005.

The Continuing School Improvement Discussion

Each school leadership team will continue to meet periodically with (Superintendents) in School Development Meetings. We want to keep our connection strong. The next meetings will be in October/November 2006. These meetings will provide a time to look at the School Development Plan together. Please choose a time to meet and note it on the sign-up sheet at Leadership Forum, September 28, 2006.

Energizing the Effective Effort of Our Collaborative Learning Team

School development and continuous learning are at the heart of our leadership work. One purpose for the School Development Plans and School Development meetings is to provide sacred time to think together, to understand each school, and to give each school leader a time to enroll the district team in the service of your school agenda.

The Challenge to Excellence and Ethics

Fragmented attention to multiple agendas is confusing and counterproductive. We are looking for more clarity and coherence in purpose as we pursue school development. We understand more about the link between excellence and ethics after our work this past year. We know that student achievement and high quality learning will flourish in a healthy school and classroom culture. We are exploring the notions of Performance Character and Moral Character with the Eight Strengths of Character framework proposed by Tom and Matt in Good and Smart Schools. With this link we are asking how we will lead our teachers to examine the language, systems, and guidelines in our schools. How will we share power with students? How will we develop respectful schools and classrooms? How will we harness the power of collective responsibility and positive peer culture to increase the quality of learning and achieve success for all learners?

Highland School District
September 19, 2006
To: Site Councils, Teachers and Principals
Re: Developing Excellence and Ethics
Linking School Development Plans With Professional Growth Plans

Each school prepares an annual update of the School Development Plan, linking the goals of the school to the goals of the school district. The School Development Plan is designed with the collaborative thinking of the site council and the staff as a whole under the guidance of the principal. As always, the principal holds the primary responsibility for the school and for the school improvement agenda.
The principal engages the staff and the site council in collaborative thinking about the school, celebrating past accomplishments, highlighting present conditions, examining data, and gaining consensus about future aspirations for student learning. The theme of Excellence and Ethics compels schools to examine the link between academic excellence and character education, planning actions to strengthen the school culture for learning.

The people at each school are guided by the district vision and specific agendas encouraged by the district. This year each staff, site council, and principal is encouraged to define school goals that contribute to the three common district directions outlined on the following page. Additional goals may be added where needed.

Teachers develop individual Professional Growth Plans with goals to support the school and district directions. Guidelines for teachers' Professional Growth Plans are outlined in the Educators' Handbook for Professional Growth. These plans define professional development activities selected by the teacher and designed to increase the knowledge, skill, and effectiveness of the teacher. The School Development Plans and each teacher's Professional Growth Plan are significant elements contributing to an aggressive, sharply focused, and coherent school improvement effort. All these documents record the systems and processes for improvement that comprise the District Systems Model.

Highland School District
Common Directions for School Development Goals
September 2006

Personalized Education: Educating Individual Children in a Community of Excellence and Ethics

1. Personal and Academic Excellence: Performance Character and Student Performance
• All children and adults engaged in significant and meaningful learning.
• All children showing competence in reading, writing, science, and mathematics, growing toward, meeting and exceeding agreed-upon standards (CIM and district benchmark standards in reading, writing, speaking, science and mathematics).
• All children growing in strength as learners through the development of performance character.
The School Development Plan should have one goal directed toward improving student achievement in reading, writing, science and mathematics.

2. Educating the Whole Child: Ethics and Excellence
• Working toward a higher level of moral character: respect, responsibility, honesty, integrity, kindness, compassion and courage for all staff and students.
• Creating a community for learning that demonstrates collective responsibility for the success of each member of the community.
• Creating the conditions for understanding and developing performance character in each member of the community.
The School Development Plan should have one goal directed toward developing excellence and ethics within each classroom, the school, and surrounding the school.

3. Learning through Research and Inquiry, Teaming, and Interdependence
• Designing studies for inquiry around significant conceptual and essential questions.
• Helping students gain knowledge, create knowledge, and apply knowledge in an academic context.
• Using technologies to create the conditions for students to construct meaning and meet and exceed performance expectations.
The School Development Plan should have one goal that recognizes the challenge to implement a K-12 research and inquiry curriculum.
APPENDIX B
ADMINISTRATOR UNSTRUCTURED INTERVIEW PROTOCOL

Superintendent, Asst. Superintendent, Principals, Asst. Principals, Instr. Coordinators

Mental Model
1. What do you see as the overriding goal or purpose for collecting data and information?
2. Who collects the data and information?
3. What factors determine how the data and information are analyzed?
4. Who is responsible for analyzing the information collected?
5. Once data are analyzed, how is this information used?
6. What outcomes occur as the result of collecting and analyzing data and information?

Sources
1. What role do you (the administrator) assume in utilizing data?
2. How do you (the administrator) facilitate the staff in using data and information?
3. What level or levels of support come from the District and the State?
4. What various forms and types of data are used in the school?
5. Are there forms or types of data that you have not used in the past that you would like to access and utilize in the future? Why? How would you go about it?

Processes
1. How are decisions made within the school?
2. How often are data collected and used?
3. How do staff members collaborate and give input on decisions?
4. How is information shared and communicated among the staff?
5. What opportunities does the staff have for dialogue and personal reflection about information that is collected?
6. What types of professional development opportunities has the staff participated in on data use?

Impacts on Decision-Making
1. Please share the impacts that you (the administrator) see as the result of using data to make decisions.
2. What future impacts would you (the administrator) like to see in terms of data use?

APPENDIX C
CODED TEACHER SURVEY INSTRUMENT

Teacher Information on Sources, Leadership, Processes and Impacts of Data Use in Schools

I. DEMOGRAPHICS
Directions: Please complete the following:
1. What grade level or area do you teach? _
2. How many years have you taught at this current level (elementary, middle, high)? _
3. How many years have you taught in this district? _
4. Age: __ 21-30  __ 31-40  __ 41-50  __ 51-60  __ 61+
5. Since your initial teacher certificate was granted, have you completed a graduate level course, workshop or seminar in statistics or data use and interpretation? Yes  No
6. If your answer is "yes," please list or describe the course or courses, who offered the course, and the approximate date(s) of the course or courses.
Title of Course / Offered by / Date

II. LEADERSHIP
Directions: Read each of the following and respond with A = Agree, D = Disagree, or U = Undecided
7. My principal provides strong leadership and support in using data and information to influence and/or improve my classroom decisions. A D U
8. My principal provides strong leadership and support in using data and information to solve problems school-wide. A D U
9. My principal models the use of data and encourages the staff to use data to make decisions that improve our school. A D U
10. Teachers on my staff are proficient at accessing and using data and information to solve problems in their classrooms. A D U
11. Our school district does not provide data and other information and data support services to my school. A D U
12. Our school district provides data and other information and data support services that assist me in meeting the needs of my students. A D U
13. The State of Oregon does not provide data and information that help me and my school make decisions for school improvement. A D U

III. VARIABLES
Directions: Please circle all of the answers that apply.
14. My school regularly uses the following data and information to make decisions about school improvement.
Enrollment; Attendance; Ethnicity; Gender; Free and Reduced Lunch; % Gifted and Talented students; % of IEP students; Ethnic breakdown of IEP students; Discipline Referrals/Action; Graduation Rate; Drop out rate; Advanced courses offered; ELL Students; Mobility Rate (# students moving in and out of school); Parent Surveys; Student Surveys; Staff Surveys; Community Surveys; State Assessment Scores; CIM/CAM data; Analysis of student work; Student Work Samples; Teacher-Made Assessments; Authentic Assessments; Teacher Observations; Summer School Enrollment; DIBELS scores
15. I regularly use the following types of data and information to make decisions about my students and my classroom.
Enrollment; Attendance; Ethnicity; Gender; Free and Reduced Lunch; % Gifted and Talented students; % of IEP students; Ethnic breakdown of IEP students; Discipline Referrals/Action; Graduation Rate; Drop out rate; Advanced courses offered; ELL Students; Mobility Rate (# students moving in and out of school); Parent Surveys; Student Surveys; Staff Surveys; Community Surveys; State Assessment Scores; CIM/CAM data; Analysis of student work; Student Work Samples; Teacher-Made Assessments; Authentic Assessments; Teacher Observations; Summer School Enrollment; DIBELS scores

Directions: Please check the most appropriate answer.
16. The number of staff members regularly involved in decision-making in the school is
1-25%   26-50%   51-75%   76-100%

Directions: Answer the following with A = Agree, D = Disagree, or U = Undecided
17. I have input into the decisions in the school. A D U
18. Decisions are not based upon multiple sources of data. A D U
19. Our "School Development Plan" process did not involve representatives from all stakeholder groups in our school. A D U

IV. PROCESSES
20. Our school has a formal process in place for how decisions are made. A D U
21. Our process for how decisions are made includes representation from all of our school's stakeholders. A D U
22. Our school does not have a formal process in place for collecting information from many sources of data. A D U
23. Our school has a formal process in place for frequent collection of data and information. A D U
24. Our school has a formal process in place for communicating and disseminating data and information throughout the school. A D U
25. Our school has a formal process in place for staff members to learn and increase their skills and capacity to deal with data and information. A D U
26. Our school does not have a professional development process for staff members to increase their proficiency and expertise in using data. A D U

V. IMPACTS ON DECISION-MAKING
27. The active interaction of various forms of data and processes for using data has a strong impact on decision-making in my school. A D U
28. I do not regularly use data to make decisions about teaching and learning. A D U
29. Using data has had a positive impact on my teaching performance. A D U
30. As I have used data, my knowledge and expertise in using data have improved. A D U
31. The use of student test data in my school has produced positive outcomes as evidenced by our state assessment scores. A D U
32. My school has not been strengthened as an organization by its use of data in decision-making. A D U
33. What specific impacts of data-based decision-making have you observed in your school?
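When responses to an instrument like the one above are aggregated by school, the negatively worded statements (items 11, 13, 18, 19, 22, 26, 28, and 32) have to be reverse-coded so that an "Agree" always points in the same direction before tallies are compared. The sketch below is only an illustration of that bookkeeping; the function and the invented responses are assumptions for this example, not the scoring procedure used in the study.

from collections import Counter

# Items whose statements are negatively worded in the instrument above,
# so "Disagree" is the data-supportive answer (item numbers from Appendix C).
REVERSE_CODED = {11, 13, 18, 19, 22, 26, 28, 32}

def tally(responses):
    """Tally A/D/U responses per item, flipping A and D on reverse-coded
    items so every tally reads in the same direction.

    responses: list of (item_number, answer) pairs with answer in
    {"A", "D", "U"}; the example data below are invented."""
    flipped = {"A": "D", "D": "A", "U": "U"}
    tallies = {}
    for item, answer in responses:
        if item in REVERSE_CODED:
            answer = flipped[answer]
        tallies.setdefault(item, Counter())[answer] += 1
    return tallies

# Invented responses from three teachers to items 7, 11 and 28:
example = [(7, "A"), (7, "A"), (7, "U"),
           (11, "D"), (11, "D"), (11, "A"),
           (28, "D"), (28, "U"), (28, "D")]
for item, counts in sorted(tally(example).items()):
    print(item, dict(counts))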
APPENDIX D
TYPOLOGY OF THE COMPONENTS OF DATA-BASED SCHOOLS, HILL (2004), REVISED

Typology of the Components of Data-Driven Schools, Hill (2004), Revised

Enrollment. Sources: School Development Plan (SDP); SchoolMaster (school information system).
Attendance. How measured: Average Daily Attendance. Range: 0-177 days (%). Sources: School State Report Card.
Discipline Referrals and Types. How measured: Number and % of Suspensions and Expulsions. Range: None to % of Suspensions and Expulsions. Sources: SchoolMaster and School State Report Card.
Ethnicity. How measured: Number and % of students in ethnic categories. Range: White, Asian, Hispanic, Black, Other. Sources: SchoolMaster; SDP.
Exceptional Children. How measured: Number and % of students identified as Gifted and Talented; ethnic breakdown and % of students identified. Sources: SchoolMaster; SDP.
Gender. How measured: Number and % of students male and female. Range: Male or Female. Sources: SchoolMaster; SDP.
Socioeconomic Status. How measured: Number and % of students on Free and Reduced Lunch. Range: No Assistance to Partial Assistance to Free. Sources: SchoolMaster; SDP.
Language Proficiency. How measured: Number and % of ELL students. Range: No English to Some English to Proficient (Woodcock Munoz Levels I-V). Sources: SchoolMaster; SDP.
Mobility Rate. How measured: Number and % of students moving in or out of schools. Range: 0 moves to # moves. Sources: SchoolMaster; SDP.
Staff. How measured: Staff surveys. Range: 0 to a certain number per year; Disagree Strongly to Strongly Agree. Sources: School site surveys; Research Study Survey; SDP.
Parents. How measured: Parent surveys. Range: 0 to a certain number per year; Likert scale or open-ended questions. Sources: School site surveys; District surveys; SDP.
Students. How measured: Student surveys. Range: 0 to a certain number per year; Likert scale or open-ended questions. Sources: School site surveys; SDP.
Community. How measured: Number of volunteers, business partners, community agencies; involvement of PTSA/PTSOs. Range: 0 to a certain number per year; Support for the Schools to No Support for the Schools. Sources: District surveys; SDP.

Shared and Collaborative Decision-Making. How measured: Number and % of stakeholders involved in decision-making. Range: No Involvement to Very Involved. Sources: Research Study Survey; Research Study Interviews; SDP.
Multiple Sources of Data. How measured: Number of resources accessible and utilized; both quantitative and qualitative data. Range: Only a Few Sources to Multiple Sources of Data. Sources: Research Study Survey; Research Study Interviews; SDP; SchoolMaster.
School Development Plan. How measured: Plan is implemented, involving all stakeholders; implementation is documented by the district and school board. Range: Required. Sources: Research Study Survey; Research Study Interviews; SDP.

Student Learning

Standardized state test scores in Reading, Math, and Science (Grades 3-10). How measured: RIT score translated into category (DNM, M, E); percentile score. Range: Does Not Meet, Meets, Exceeds; 1st-99th percentile. Sources: Oregon State Assessment Data.
DIBELS (Dynamic Indicators of Basic Early Literacy Skills) assessment scores. How measured: Oral Reading Fluency rate/accuracy data. Range: 0 to personal best. Sources: DIBELS assessment.
State Work Samples collected and assessed by teachers (Grades 3-10). How measured: Work samples scored by teachers using a state-designed point-based rubric. Range: 0-6 point rubric measure. Sources: Work samples produced in class and collected by the teacher.
State Writing Proficiency Assessment (Grade 4). How measured: Writing sample scored by a panel of teachers using a Six Traits point-based rubric. Range: 0-6 point rubric measure. Sources: Writing scores returned to schools from the State.
Science Proficiency Assessment in Middle School. How measured: Science sample scored by a panel of teachers using a point-based rubric. Sources: Science scores returned to schools from the State.
CIM/CAM achievement. How measured: Student achievement of passing Grade 10 standards. Range: Does Not Meet, Meets, or Exceeds in all assessed content areas. Sources: Scores from the State.
Graduation rate. How measured: Completed requirements for graduation per state requirements. Range: Drops out, Does not graduate, Graduates. Sources: SchoolMaster; SDP.
Teacher Observations and Analysis of Student Work. How measured: Teacher report of observed and documented results. Range: Observable Progress to Progress Not Observed. Sources: Research Study Survey.

LEADERSHIP

Mental Model
Compliance. How measured: Focus, analysis, evaluation (Baker and Richards' model). Sources: Research Study Survey; Research Study Interviews; Examination of Processes.
Performance. How measured: Focus, analysis, evaluation (Baker and Richards' model). Sources: Research Study Survey; Research Study Interviews; Examination of Processes.
Ecological. How measured: Focus, analysis, evaluation (Baker and Richards' model). Sources: Research Study Survey; Research Study Interviews; Examination of Processes.

Compliance. How measured: Data use. Range: Complies with standards. Sources: Research Study Survey; Research Study Interviews; Examination of Processes and SDPs.
Performance. How measured: Data use. Range: Increased performance. Sources: Research Study Survey; Research Study Interviews; Examination of Processes and SDPs.
Ecological. How measured: Data use. Range: Improvement through organizational learning. Sources: Research Study Survey; Research Study Interviews; Examination of Processes and SDPs.

Accountability Standards; Communication; Dissemination of Data; Shared Decision-Making; Local, District, and State Support; Principal Support and Advocacy; Aligned Curriculum and Assessment. How measured: Reported by multiple sources. Range: Not observed to observed as reported. Sources: Research Study Survey; Research Study Interviews; SDPs.

PROCESSES

Process for Frequent Data Collection from Many Sources. How measured: Frequency of data collection; number and types of sources. Range: Infrequent data from few sources to frequent collection from multiple sources. Sources: Research Study Survey; Research Study Interviews; SDPs.
Process for Communication and Dissemination of Information. How measured: Formalized communication and dissemination channels. Range: No process to a well-established process for communication and dissemination. Sources: Research Study Survey; Research Study Interviews; SDPs.
Process for Collaborative and Shared Input. How measured: Formalized framework for collaboration and input; frequency of the collaboration; and amount of the input. Range: No process for collaboration and input to a well-established and functional process for collaboration and input. Sources: Research Study Survey; Research Study Interviews; SDPs.
Process for Building and Increasing Organizational Capacity. How measured: Formalized process in place, and evidence of increased capacity documented over time, for both individuals and the organization. Range: No process to a well-established process for building and increasing capacity. Sources: Research Study Survey; Research Study Interviews; SDPs.
Professional Development Process in Data Use. How measured: Offering and frequency of professional development and number of individuals participating. Range: No professional development to much professional development. Sources: Research Study Survey; Research Study Interviews; SDPs.

IMPACTS ON DECISION-MAKING

Active Interaction of Sources and Processes. How measured: Data sources and processes are in place and continuously interact. Range: No interaction to high interaction. Sources: Research Study Survey; Research Study Interviews; SDPs.
Individual and Organizational Learning Occur. How measured: Individual and organizational qualitative and quantitative data documenting planned change and growth over time. Range: No change and improvement to significant change and growth. Sources: Research Study Survey; Research Study Interviews; SDPs.
Individuals, Subgroups, and Organization Are Strengthened through Participation in the Data-Driven Process. How measured: Performance documentation; observation of the organization; individual and collective growth and development of expertise. Range: No noted growth of individuals, subgroups, or the organization to much growth. Sources: Research Study Survey; Research Study Interviews; SDPs.

REFERENCES

Annenberg Institute for School Reform. (1998, May). Using data for school improvement (Report on the Second Practitioners' Conference for Annenberg Challenge Sites). Providence, RI: Author.
Armstrong, J., & Anthes, K. (2001). How data can help: Putting information to work to raise student achievement. American School Board Journal, 188(11), 38-41.

Baker, B., & Richards, C. (2004). The ecology of educational systems. Upper Saddle River, NJ: Pearson Education.

Barber, M. (2002, April). From good to great: Large-scale reform in England. Paper presented at the Futures of Education Conference, Zurich, Switzerland.

Bernhardt, V. L. (1998). Data analysis for comprehensive schoolwide improvement. Larchmont, NY: Eye on Education.

Bernhardt, V. L. (2004). Using data to improve student learning in middle schools. New York: Eye on Education.

Bettesworth, L. (2006). Administrators' use of data to guide decision-making. Paper presented at the annual meeting of the American Educational Research Association, San Francisco.

Blink, R. (2007). Data-driven instructional leadership. New York: Eye on Education.

Bowers, C. A. (1995). Educating for an ecologically sustainable culture. Albany, NY: State University of New York Press.

Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design. Cambridge, MA: Harvard University Press.

Bronfenbrenner, U., & Morris, P. A. (1998). The ecology of developmental processes. In W. Damon & R. M. Lerner (Eds.), Handbook of child psychology: Vol. 1. Theory (5th ed., pp. 993-1028). New York: Wiley.

Brunner, C., Fasca, C., Heinze, J., Honey, M., Light, D., Mandinach, E., & Wexler, D. (2005). Linking data and learning: The Grow Network study. Journal of Education for Students Placed at Risk, 10(3), 241-267.

Chen, E., Heritage, M., & Lee, J. (2005). Identifying and monitoring students' learning needs with technology. Journal of Education for Students Placed at Risk, 10(3), 309-332.

Chrispeels, J. H. (1992). Purposeful restructuring: Creating a culture for learning and achievement in elementary schools. Washington, DC: Falmer Press.

Chrispeels, J. H., Brown, J. H., & Castillo, S. (2000). School leadership teams: Factors that influence their development and effectiveness. Advances in Research and Theories of School Management and Educational Policy, 4, 39-73.

Creswell, J. W. (2003). Research design: Qualitative, quantitative and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Dunlap, D., Garrison, A., Hernandez, P., & Clott, A. (2008). A theoretical framework for education based on Bronfenbrenner's model. Paper presented at the annual meeting of the American Education Research Association, New York.

Earl, L. (2005). From accounting to accountability: Harnessing data for school improvement. Retrieved December 18, 2006, from http://www.acer.edu.au/workshops/documents/Earl.pdf

Earl, L., & Fullan, M. (2003). Using data in leadership for learning. Cambridge Journal of Education, 33, 383-394.

Earl, L., & Katz, S. (2006). Leading schools in a data-rich world: Harnessing data for school improvement. Thousand Oaks, CA: Corwin Press.

Education Commission of the States. (2000). Informing practices and improving results with data-driven decisions (ECS Issue Paper). Retrieved October 30, 2006, from www.ecs.org

Elementary and Secondary Education Act. (1994). Retrieved March 9, 2008, from http://www.ed.gov/legislation/ESEA/toc.html

Feldman, J., & Tung, R. (2001, Summer). Using data-based inquiry and decision making to improve instruction. ERS Spectrum, 10-19.

Goodlad, J. I. (2001). An ecological version of accountability. Theory Into Practice, 17(5), 308-315.

Hill, M. G. (2004). A case study in data-driven school improvement. Unpublished doctoral dissertation, Teachers College, New York.

Holcomb, E. L. (1999). Getting excited about data: How to combine people, passion, and proof. Thousand Oaks, CA: Corwin Press.

Huffman, D., & Kalnin, J. (2003). Collaborative inquiry to make data-based decisions in schools. Teaching and Teacher Education, 19, 569-580.

Ingram, D., Louis, K., & Schroeder, R. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258-1287.

Irvin, L., Horner, R., Ingram, K., Todd, A., Sugai, G., Sampson, N., & Boland, J. (2006). Using office discipline referral data for decision making about student behavior in elementary and middle schools: An empirical evaluation of validity. Journal of Positive Behavior Interventions, 8(1), 10-23.

Katz, S., Sutherland, S., & Earl, L. (2005). Toward an evaluation habit of mind: Mapping the journey. Teachers College Record, 107(10), 2326-2350.

Kazdin, A. E. (Ed.). (2000). Encyclopedia of psychology (Vol. 3). New York: American Psychological Association and Oxford University Press.

Kelly, J. G., Ryan, A. M., Altman, B. E., & Stelzner, S. P. (2000). Understanding and changing social systems: An ecological view. In J. Rappaport & E. Seidman (Eds.), Handbook of community psychology (pp. 133-159). New York: Plenum.

Lachat, M., & Smith, S. (2005). Practices that support data use in urban high schools. Journal of Education for Students Placed at Risk, 10(3), 333-349.

Lambert, L. (2003). Leadership capacity for lasting school improvement. Alexandria, VA: Association for Supervision and Curriculum Development.

McComb, J. (2002). Oregon education reform issue brief, March 2002. Retrieved December 15, 2006, from www.leg.state.or.us

Miles, M., & Huberman, A. M. (1994). An expanded sourcebook: Qualitative data analysis (2nd ed.). Thousand Oaks, CA: Sage.

Murnane, R., Sharkey, N., & Boudett, K. (2005). Using student-assessment results to improve instruction: Lessons from a workshop. Journal of Education for Students Placed at Risk, 10(3), 269-280.

National Association of Elementary School Principals. (2002). Leading learning communities: Standards for what principals should know and be able to do. Alexandria, VA: Author.

National Commission on Excellence in Education. (1983). A nation at risk. Washington, DC: Author.

National Educational Goals Panel. (1996). The national educational goals report (Executive summary). Washington, DC: Author.

Nichols, B., & Singer, K. (2000). Developing data mentors. Educational Leadership, 57(5), 34-37.

No Child Left Behind Act, 20 U.S.C. (2001). Retrieved July 30, 2007, from http://www.ed.gov/nclb/accountability/ayp/testing.html

Noyce, P., Perda, D., & Traver, R. (2000). Creating data-driven schools. Educational Leadership, 57(3), 52-58.

Olson, L. (1999). Shining a spotlight on results [Electronic version]. Quality Counts '99, 18(17). Retrieved December 20, 2006, from www.edweek.org

Oregon Education Act for the 21st Century. (1995). Retrieved March 9, 2008, from http://www.leg.state.or.us/ors/329.html

Oregon Quality Education Commission. (2004). The Quality Education Model Commission report. Retrieved December 18, 2006, from http://www.ode.state.or.us/initiatives/qualityed/qecreport04.pdf

Petrides, L., & Guiney, S. Z. (2002). Knowledge management for school leaders: An ecological framework for thinking schools. Teachers College Record, 104(8), 1702-1717.

Protheroe, N. (2001). Improving teaching and learning with data-based decisions: Asking the right questions and acting on the answers [Electronic version]. Educational Research Service Spectrum, 19(3). Retrieved July 19, 2006, from http://www.ers.org/spectrum/sum01a.htm

Rallis, S., & MacMullen, M. (2000). Inquiry-minded schools: Opening doors for accountability. Phi Delta Kappan, 6, 766-773.

Reeves, D. (2006). The learning leader: How to focus school improvement for better results. Alexandria, VA: Association for Supervision and Curriculum Development.

Salpeter, J. (2004). Data: Mining with a mission. Technology and Learning, 24(8), 30-37.

Schmoker, M. (1996). Results: The key to continuous school improvement. Alexandria, VA: Association for Supervision and Curriculum Development.

Senge, P. (1990). The fifth discipline: The art and practice of the learning organization. New York: Doubleday.

Streifer, P. (2002). Using data to make better educational decisions. Lanham, MD: Scarecrow Press.

Streifer, P., & Schumann, J. (2005). Using data mining to identify actionable information: Breaking new ground in data-driven decision making. Journal of Education for Students Placed at Risk, 10(3), 281-293.

Tyack, D., & Cuban, L. (1995). Tinkering toward utopia. Cambridge, MA: Harvard University Press.

U.S. Department of Education. (2008). Policy statement on AYP. Retrieved January 1, 2008, from http://www.ed.gov/policy/elsec/guid/states/index.htm

Walpole, S., Justice, L., & Invernizzi, M. (2004). Closing the gap between research and practice: Case study of school-wide literacy reform. Reading and Writing Quarterly, 20, 261-283.

Wayman, J. (2005). Involving teachers in data-driven decision making: Using computer data systems to support teacher inquiry and reflection. Journal of Education for Students Placed at Risk, 10(3), 295-308.

Wheatley, M., & Kellner-Rogers, M. (2003). A simpler way. San Francisco: Berrett-Koehler.

Wielkiewicz, R. M., & Stelzner, S. P. (2005). An ecological perspective on leadership theory, research, and practice. Review of General Psychology, 9(4), 326-341.

Yin, R. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

Zemelman, S., Daniels, H., & Hyde, A. (2005). Best practice: Today's standards for teaching and learning in America's schools (3rd ed.). Portsmouth, NH: Heinemann.