Experimental Design
Since the enactment of the National Policy on Education in 1986, non-formal education centers (NFEs) have played an important role in India’s drive toward universal primary education. They have been the main instrument for expanding school access to children in remote and rural areas, and they have also been used to transition children who would otherwise not attend school into government schools. Several million children are enrolled in NFEs across India. Children of all ages may attend an NFE, though, in our sample, most are between seven and ten years of age. Nearly all of the children are illiterate when they enroll. In the setting of our study, the NFEs are open six hours a day and have about 20 students each. All students are taught in one classroom by a single teacher, who is recruited from the local community and has, on average, a tenth-grade education. Instruction focuses on basic Hindi and math skills. Because each school has only one teacher, the school is closed whenever the teacher is absent.
Seva Mandir runs about 150 NFEs in the tribal villages of Udaipur, Rajasthan.
Udaipur is a sparsely populated, hard-to-access region. Thus, it is difficult to regularly monitor the NFEs, and absenteeism is high. Before 2003, Seva Mandir relied on occasional visits to the schools, as well as reports by local village workers, to monitor teacher attendance, and then used bimonthly teacher meetings to talk to delinquent teachers. Given the high absence rate, they were aware that this level of supervision was insufficient. Therefore, starting in September 2003, Seva Mandir implemented an external monitoring and incentive program on an experimental basis. They chose 120 schools to participate, with 60 randomly selected schools serving as the treatment group and the remaining 60 as the comparison group. In the treatment schools, Seva Mandir gave each teacher a camera, along with instructions for one of the students to photograph the teacher and the other students at the start and end of each school day. The cameras had a tamper-proof date and time function that made it possible to precisely track each school’s openings and closings. Film rolls were collected, and payments were distributed, every two months at regularly scheduled teacher meetings. If a camera malfunctioned, teachers were instructed to call the program hotline within 48 hours; someone was then dispatched to replace the camera, and teachers were credited for the missing day.
At the start of the program, Seva Mandir’s monthly base salary for teachers was Rs. 1,000 ($23 at the real exchange rate, or about $160 at purchasing power parity) for at least 20 days of work per month. In the treatment schools, teachers received a Rs. 50 bonus ($1.15) for each day they attended beyond the 20 required days (holidays and training days, about 3 days per month on average, were automatically credited as working days), and they were fined Rs. 50 for each of the 20 days they missed. Seva Mandir defined a “valid” day as one in which the opening and closing photographs were separated by at least five hours and at least eight children were present in both photos. Due to ethical and political concerns, Seva Mandir capped the fine at Rs. 500. Thus, salaries ranged from Rs. 500 to Rs. 1,300 (or $11.50 to $29.50). In the 60 comparison schools, teachers were paid the flat rate of Rs. 1,000 and were reminded that regular attendance was required and that they could, in principle, be dismissed for poor attendance. No teacher was fired during the span of the evaluation, however.
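To make the pay rule concrete, the sketch below (our own illustration in Python; the function names and the representation of the photo records are not Seva Mandir’s) checks whether a day is “valid” and computes a teacher’s monthly salary under the incentive scheme described above:

```python
from datetime import datetime

def is_valid_day(opening, closing, children_open, children_close):
    """A 'valid' day: opening and closing photos at least five hours apart,
    with at least eight children present in both photos."""
    hours = (closing - opening).total_seconds() / 3600
    return hours >= 5 and min(children_open, children_close) >= 8

def monthly_salary(valid_days):
    """Rs. 1,000 base for 20 days, a Rs. 50 bonus per valid day beyond 20,
    a Rs. 50 fine per missed day, and the total fine capped at Rs. 500.
    Holidays and training days are assumed to already be counted in
    valid_days, since they were credited automatically."""
    base, rate, required, fine_cap = 1000, 50, 20, 500
    if valid_days >= required:
        return base + rate * (valid_days - required)
    return base - min(rate * (required - valid_days), fine_cap)

# Example: one school day's photo pair, then two salary calculations.
print(is_valid_day(datetime(2003, 10, 6, 8, 30),
                   datetime(2003, 10, 6, 14, 15), 12, 10))  # True (5.75 hours apart)
print(monthly_salary(26))  # 1300: six valid days beyond the required 20
print(monthly_salary(8))   # 500: the fine is capped, so pay never falls below Rs. 500
```

The cap on the fine is what keeps the salary floor at Rs. 500 regardless of how many of the 20 required days are missed.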
Vidhya Bhawan (a consortium of schools and teacher training institutes) and the Abdul Latif Jameel Poverty Action Lab (J-PAL) collected the data. We have two sources of attendance data. First, we collected data on teacher attendance through one random, unannounced visit per month to each school. By comparing the absence rates obtained from the random checks across the two types of schools, we can determine the program’s effect on absenteeism. Second, Seva Mandir provided us with access to the camera and payment data for the treatment schools. We also collected data on teacher and student activity during the random checks. For schools that were open during the visit, the enumerator noted the school activities: how many children were sitting in the classroom, whether anything was written on the blackboard, and whether the teacher was talking to the children. While these are crude measures of teacher performance, they were chosen because each could be easily observed before the teachers could adjust their behavior. In addition, the enumerator conducted a roll call, noted whether any of the absent children had left school or had enrolled in a government school, and updated the evaluation roster to include new children.

To determine whether child learning increased as a result of the program, the evaluation team, in collaboration with Seva Mandir, administered three basic competency exams to all children enrolled in the NFEs in August 2003: a pretest in August 2003, a mid-test in April 2004, and a post-test in September 2004. The pretest followed Seva Mandir’s usual testing protocol: children were given either a written exam (for those who could write) or an oral exam (for those who could not). For the mid-test and post-test, all children were given both the oral and the written exam; those unable to write earned a zero on the written section. The oral exam tested simple math skills (counting, one-digit addition, simple division) and basic Hindi vocabulary, while the written exam tested these competencies plus more complex math skills, the ability to construct sentences, and reading comprehension. Thus, the written exam tested both a child’s ability to write and his or her ability to handle material requiring a higher level of competency than the oral exam.
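As a minimal sketch of how the random-check data support the comparison described above, the following Python snippet computes the difference in absence rates between treatment and comparison schools. The file name and column names are hypothetical placeholders, not the actual evaluation dataset:

```python
import pandas as pd

# Hypothetical layout: one row per unannounced visit, with a school id,
# a treatment indicator (1 = treatment, 0 = comparison), and whether the
# school was open, i.e., the teacher was present.
visits = pd.read_csv("random_checks.csv")

# Absence rate by group, pooled over all monthly random checks.
absence = 1 - visits.groupby("treated")["school_open"].mean()

# The program effect on absenteeism is the treatment-comparison difference.
effect = absence.loc[1] - absence.loc[0]
print(f"Comparison absence rate: {absence.loc[0]:.3f}")
print(f"Treatment absence rate:  {absence.loc[1]:.3f}")
print(f"Estimated effect:        {effect:.3f}")
```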