Experimental Design
We set up the experiment in five (of around 180) employment agencies. The participating agencies are geographically dispersed: some are located in East Germany and others in West Germany, and both rural and urban areas are represented. Moreover, no other major trials or re-organizations were ongoing in these agencies during our trial period. A further criterion for the selection of agencies was to ensure a sufficiently large inflow into unemployment.
Eighty percent of the individuals who registered as unemployed between July 2012 and January 2013 in these five employment agencies were randomly selected to participate in our experiment. The participants were then randomly assigned to one of four treatment groups during their first meeting with the caseworker.
Our experimental protocol varies two aspects: the timing of the integration agreement (IA), and whether individuals were informed in advance about a future IA. In treatment group A, individuals were supposed to sign their IA during the first month of unemployment. In treatment groups B and C, individuals had to sign their IA in the third month of unemployment (if they were still unemployed). Individuals in group B additionally received a written announcement during their first meeting with the caseworker that they would have to sign an IA in the third month of unemployment. Besides the timing of the IA, this announcement contained a detailed description of the typical content of an IA: it stated that the IA would specify the type of support the unemployed person receives from the public employment service, details about future participation in active labor market policy (ALMP) programs, and job search requirements. It also stated that non-compliance with the content of an IA might lead to a sanction in the form of benefit cuts. In treatment group D, individuals signed their first IA in the sixth month of unemployment (conditional on still being unemployed).
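To summarize the design, the following Python snippet encodes the four treatment arms. It is purely illustrative: the dictionary and its field names are our own shorthand and are not part of the experimental software.

# Illustrative summary of the four treatment arms (our own encoding;
# field names such as ia_month and announced are hypothetical).
TREATMENT_GROUPS = {
    "A": {"ia_month": 1, "announced": False},  # IA in the first month
    "B": {"ia_month": 3, "announced": True},   # IA in the third month, announced in advance
    "C": {"ia_month": 3, "announced": False},  # IA in the third month, no announcement
    "D": {"ia_month": 6, "announced": False},  # IA in the sixth month
}
# In every group the IA is signed only if the person is still
# unemployed in the scheduled month.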
We designed instruction material and conducted instruction lessons with the team leaders of caseworker teams in the participating agencies before the project started. The team leaders, in turn, instructed the caseworkers using this material. The instruction material consisted of a presentation, a list of frequently asked questions, and a two-sided plastic slide summarizing the experimental design, which was meant to be placed on each caseworker's desk throughout the experiment. The presentation highlighted the importance of the research question and explained why it could only be answered by means of a randomized controlled trial. It defined the target group of the experiment: new entries into unemployment who had not been registered as unemployed during the quarter prior to their unemployment entry. Only individuals who were eligible for unemployment benefits were supposed to participate in the experiment, and individuals younger than 25 as well as disabled individuals were excluded. Furthermore, the material included verbal and graphical descriptions of the different treatment groups. An important part of the instructions was that, apart from the different timing of IAs, other elements of the placement process were not supposed to differ across treatment groups, and that in particular all groups should have the same access to instruments of active labor market policy.
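The target-group definition can be stated compactly as a filter. The sketch below is our own illustration of the stated criteria; the field names (last_unemployment_end, ui_benefit_eligible, and so on) are hypothetical and do not correspond to the PES software, and "last quarter" is approximated as 91 days.

from datetime import date, timedelta

def in_target_group(person, entry_date: date) -> bool:
    # Hypothetical field names; criteria as described above.
    no_recent_spell = (
        person.last_unemployment_end is None
        or entry_date - person.last_unemployment_end > timedelta(days=91)
    )
    return (
        no_recent_spell                 # new entry, no spell in the prior quarter
        and person.ui_benefit_eligible  # entitled to unemployment benefits
        and person.age >= 25            # under-25s excluded
        and not person.disabled         # disabled individuals excluded
    )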
Random assignment itself was based on a computer program that was developed by the PES for evaluation purposes. Before or during the meeting with a newly unemployed person, caseworkers had to open the program and enter the identification number, the name, and the date of birth of the unemployed person. After pressing a randomization button, the program immediately showed the corresponding result, which caseworkers then had to document in the usual placement software. In our analysis, we use the information stored by the randomization program. Besides the time and result of the randomization, the program also stored the anonymized identifiers of the unemployed and the caseworkers. Caseworkers were not able to manipulate the randomization, for example by re-running the randomization tool.
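The internals of the PES tool are not documented here, but its essential behavior can be sketched as follows. This is a minimal illustration assuming, for simplicity, equal assignment probabilities across the four arms (which the protocol does not specify) and a persistent store keyed by the anonymized identifier, so that re-running the tool cannot change an earlier draw.

import random

# Persisted in the real tool; in-memory here for illustration only.
_assignments: dict[str, str | None] = {}

def randomize(person_id: str) -> str | None:
    # Re-running returns the stored result, so the draw cannot be manipulated.
    if person_id in _assignments:
        return _assignments[person_id]
    # 80% of registrants enter the experiment; the rest are not selected.
    if random.random() < 0.80:
        group = random.choice(["A", "B", "C", "D"])  # equal shares assumed
    else:
        group = None
    _assignments[person_id] = group
    return group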
The analysis will be mainly based on administrative data containing information on UI benefit receipt, participation in active labor market policy, meetings with the caseworker, employment spells, and wages. In addition, we make use of a survey of the participants in our experiment, conducted one month after entry into UI, and of a survey of the caseworkers in the participating agencies, conducted one month before the RCT began. The caseworkers were interviewed a second time around five months after the start of the RCT. We additionally conducted a second survey of participants around seven months after entry into UI, but the response rate for this second wave turned out to be rather low. In our analysis, we will mainly focus on men rather than women, because female labor market histories are more strongly affected by parental leave spells, which are not recorded in the administrative data.