For over 20 years, the University of Chicago Consortium on School Research (CCSR) has administered surveys to teachers, students, and principals in Chicago Public Schools. Because so many teachers and principals have made it a priority to fill out past surveys, we have learned a tremendous amount about the factors that lead to better outcomes for Chicago students.
The survey can play an important role in an individual school’s improvement efforts. Using results from past surveys, CCSR has demonstrated that there are five essential supports of successful schools. Schools that are strong in the majority of these elements are 10 times more likely to improve student learning, and much less likely to stagnate, than schools that are weak in them.
This survey was conducted by CCSR in collaboration with CPS. As is the case with all surveys, this survey used three stages to gather information and report results:
- Question development
- Survey administration
- Scoring survey results
Since the early 1990s, CCSR has developed specific processes to ensure accurate and reliable information is available on schools in Chicago. This year, these processes resulted in responses from over 150,000 students and over 19,000 teachers, with data reported back to over 640 schools. The sections below detail each stage of the survey.
The primary set of questions on the survey is derived from the research on the 5Essentials. All survey questions have been pretested. There are roughly 80 questions across the student and teacher surveys that focus on the 5Essentials. The remaining questions are related to ongoing research at CCSR and at CPS.
Students and teachers. All students in grades 6-12 and all teachers were asked to participate. In February, we estimated that there were 191,128 students who could participate.
We estimated that 23,526 teachers were eligible to participate. We received a list from CPS of all teachers in schools that are not charters. We attempted to contact all 101 charter schools to request a list of teachers. Of those 101 schools, 78 supplied this information.
After receiving some additional rosters from charter schools and including new teachers, we estimate that there were 23,880 teachers during administration. As with the students, response rates are based on the updated information.
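Since response rates are computed against the updated eligible counts rather than the February estimates, the calculation is straightforward. The short Python sketch below illustrates it using the approximate teacher figures reported in this document (the exact response count is stated only as "over 19,000," so the result is approximate):

```python
def response_rate(responses, eligible):
    """Response rate as a percentage of the updated eligible count."""
    return 100.0 * responses / eligible

# Approximate figures from this report: at least 19,000 teacher
# responses out of an estimated 23,880 eligible teachers at the
# time of administration.
print(round(response_rate(19000, 23880), 1))  # prints 79.6
```

Using the updated denominator matters: the same 19,000 responses against the original February list of 23,526 teachers would yield a slightly higher rate, so refreshing rosters keeps the reported rates honest.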
Timing. The student and teacher surveys were administered over the web from March 17 to April 25, during school hours at each school. Both surveys took, on average, about 30 minutes to complete.
Efforts to increase response rates before administration. Principals were asked to nominate a staff member to coordinate survey administration in that school. Materials to help coordinators manage the process were emailed to each of them.
Teachers were sent an email several days before administration informing them about the survey. On the day that the survey opened, teachers were sent invitations via email.
Efforts to increase response rates during administration. For both the student and teacher surveys, CCSR emailed principals and school coordinators weekly to encourage participation of students and teachers. In addition, network chiefs and their staff members were sent weekly response rate updates.
Scoring Survey Results
We score the survey using advanced techniques similar to those used by makers of national standardized tests. Using Rasch analysis, we combine data from a set of questions that are conceptually related to one another. This technique provides one score for the concept while taking into account that some questions are more difficult to agree with than others. For example, teachers were asked to rate the extent to which certain behaviors are a problem at their school in a teacher measure called Disorder and Crime. “Threats of violence toward teachers” is more difficult to endorse than “disorder in hallways” because it likely happens far less often.
Our scoring technique also accounts for differences between the response categories within a question. For example, it may be easier to move from strongly disagree to disagree than it is to move from agree to strongly agree. Rasch takes those differences into account for every question.
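To make these two ideas concrete (item difficulty, plus unequal steps between response categories), here is a minimal Python sketch of the rating-scale form of the Rasch model. This is an illustrative assumption of how category probabilities are computed, not CCSR's actual scoring code; in practice the difficulties, thresholds, and respondent levels are all estimated jointly from the survey data.

```python
import math

def rating_scale_probs(theta, difficulty, thresholds):
    """Rasch rating-scale model: probability of each response category.

    theta       -- respondent's level on the latent trait (in logits)
    difficulty  -- how hard the item is to endorse (in logits); a
                   harder item like "threats of violence toward
                   teachers" has a higher value than "disorder in
                   hallways"
    thresholds  -- step difficulties between adjacent categories,
                   e.g. strongly disagree -> disagree, disagree ->
                   agree, agree -> strongly agree

    Returns a list of len(thresholds) + 1 probabilities, one per
    response category, summing to 1.
    """
    # Each category's logit is the cumulative sum of
    # (theta - difficulty - tau) over the steps taken to reach it;
    # the lowest category has a logit of 0 by convention.
    logits = [0.0]
    total = 0.0
    for tau in thresholds:
        total += theta - difficulty - tau
        logits.append(total)
    denom = sum(math.exp(x) for x in logits)
    return [math.exp(x) / denom for x in logits]
```

Unequal `thresholds` are what let the model treat the step from agree to strongly agree as harder than the step from strongly disagree to disagree, and a higher `difficulty` shifts all of an item's responses toward the disagree end for the same respondent.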
For more technical details about Rasch analysis, click here.
For more information on how measure, essential, and 5Essential scores are calculated, see How Scores are Calculated.