A field experiment on online job search assistance
Initial registration date
November 21, 2019
November 22, 2019 11:03 AM EST
Other Primary Investigator(s)
University of Adelaide
Additional Trial Information
Finding a job is a full-time job, and it requires basic skills to engage effectively with employers with vacancies. We design and experimentally evaluate the impact of a website we purposely created with useful job search resources: an annotated resume template, a cover letter template, and tips on how to look and apply for jobs. We informed about 2,000 job seekers in Australia about the website via low-cost communication channels and tracked their outcomes for almost two years. We find the website is effective at increasing job-finding rates, particularly among the most receptive group of job seekers aged 35-50, and among women within this age group, for whom job placements were respectively 6% and 8% higher than in the control group after three months. We also find a marginal improvement in the quality of job matches. We discuss policy implications for improving the targeting of employment programs and the potential of online government services.
Briscese, Guglielmo, Veronica Quinn and Giulio Zanella. 2019. "A field experiment on online job search assistance." AEA RCT Registry. November 22.
We test a low-cost and easily scalable online intervention to supplement job search skills among income support recipients. The intervention consists of user-friendly online resources to help unemployed job seekers find a job. Specifically, we developed a website with three webpages, each offering a basic resource needed when looking for work: (i) a resume template, (ii) a cover letter template, and (iii) tips on how to look and apply for jobs. The first two webpages contain a downloadable template, a brief explanation of how to customize it, and a short video from an employment advisor explaining how to use the resources effectively. We then notified around 2,000 job seekers about the existence of the website via offline and online channels. We left the website online for four months and tracked recipients' employment outcomes for nearly two years.
Intervention Start Date
Intervention End Date
Primary Outcomes (end points)
The primary outcomes of interest in our experiment are:
1) Found employment
2) Found own employment (i.e. the vacancy was not secured by an employment advisor)
3) Job retention rates over a period of 12 months before the trial and 22 months after the trial
Primary Outcomes (explanation)
Secondary Outcomes (end points)
The secondary outcomes of interest in our experiment are:
1) Gender difference across all primary outcomes
2) Age differences across all primary outcomes
3) Number of days needed to find a job
4) Job type, for the subset of the sample who found one (e.g. full-time, part-time)
Secondary Outcomes (explanation)
We partnered with an employment agency to select 22 of their job centres in the area of metropolitan Sydney, Australia. In order to minimize spillovers onto job seekers in the control group and because crucial aspects of the trial such as the presence of posters and the role of job advisors cannot be varied within job centers, the randomization was conducted at the level of job centers.
Job seekers who were registered at job centers assigned to treatment at the beginning of the trial, or who registered at these centers during the trial, constitute the treatment group, a total of 1,983 persons. The control group consists of 1,764 job seekers registered at job centers not assigned to treatment.
Experimental Design Details
Job seekers were randomised into treatment or control at the site level. Randomisation used a balance-check and re-randomisation procedure: we randomly allocated sites to control and treatment 1,000 times and then calculated the relative imbalance on each of the balance variables listed below. Relative imbalance was measured as the absolute value of the t-statistic from a simple regression with treatment allocation as the outcome variable and the balance variable as the sole predictor. For each allocation we took the largest t-statistic across the balance checks, and we chose the allocation with the smallest maximum imbalance. We balanced on the following variables, which may influence the treatment effect:
• Part-time or full-time status of the site (binary)
• East/West status of the site (binary)
• Caseload of wage-subsidy-eligible job seekers
• Caseload of job seekers at the site
• Single or composite site (i.e. where sites might share an employment advisor; binary)
• Average monthly job placements at the site
The unit of randomisation was the job centre.
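The balance-check and re-randomisation procedure described above can be sketched as follows. This is a minimal illustration using synthetic site-level data: the variable values, the random seed, and the helper functions are hypothetical, not the trial's actual data or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical site-level balance data for 22 job centres (illustrative
# values only, not the trial's actual figures).
n_sites = 22
sites = {
    "full_time": rng.integers(0, 2, n_sites).astype(float),
    "east": rng.integers(0, 2, n_sites).astype(float),
    "caseload_subsidy": rng.poisson(50, n_sites).astype(float),
    "caseload_total": rng.poisson(200, n_sites).astype(float),
    "composite": rng.integers(0, 2, n_sites).astype(float),
    "monthly_placements": rng.poisson(15, n_sites).astype(float),
}

def slope_tstat(y, x):
    """|t| of the slope in a simple OLS of y on x (computed from the
    Pearson correlation, which gives the same t-statistic)."""
    r = np.corrcoef(y, x)[0, 1]
    n = len(y)
    return abs(r) * np.sqrt((n - 2) / (1 - r**2))

def max_imbalance(treat):
    """Largest |t| across regressions of treatment allocation on each
    balance variable in turn."""
    return max(slope_tstat(treat, x) for x in sites.values())

# Draw 1,000 candidate 11/11 allocations and keep the one whose worst
# single-variable imbalance is smallest.
best_alloc, best_score = None, np.inf
for _ in range(1000):
    treat = np.zeros(n_sites)
    treat[rng.choice(n_sites, size=11, replace=False)] = 1.0
    score = max_imbalance(treat)
    if score < best_score:
        best_alloc, best_score = treat, score
```

The chosen allocation is the one minimising the maximum |t| over all balance variables, matching the "smallest maximum imbalance" criterion above.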
Was the treatment clustered?
Sample size: planned number of clusters
The trial was run on 22 clusters (job centres) randomly allocated between control and treatment
Sample size: planned number of observations
At the end of the trial, the total sample size was 3,746 unemployed job seekers.
Sample size (or number of clusters) by treatment arms
11 job centres in control and 11 job centres in treatment
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
The number of sites to be included in the trial and their size had already been determined before the power calculation was conducted. This was based on a joint decision with the senior management team of the job service provider, to ensure access to sites that would not undergo any other management change or intervention for the duration of the trial. We therefore took the sites and the randomisation as given, chose the trial duration needed to reach a sufficiently large sample, and calculated the power to detect a range of effect sizes.
We calculated power by taking the number of observations in the control and treatment, and deflating them by the design effect for a cluster randomised trial with unequal sizes of clusters. We first estimated the ICC and design effect given our baseline data, and then used these parameters to calculate the power to detect for any given effect size.
For total placements in one month, we used as our outcome variable the likelihood of being placed into employment, across all job seekers. Approximately 8% of the total caseload at each site was placed into employment each month based on historical data. We estimated that over one month the trial was powered at conventional levels to detect a 50% (or larger) increase in the total number of individuals placed into employment, which translates to a 4 percentage point increase in placements. Based on historical data, we estimated approximately 400 new job seekers to be enrolled into all sites each month, leading to a required trial duration of 4 months. We then calculated the minimum detectable effect sizes (given a power of 80%) for each possible duration of the trial, from one to twelve months in duration. This was achieved by adding 200 individuals to both control and treatment sites for each additional month.
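The calculation described above can be sketched as follows. The ICC, cluster-size coefficient of variation, and mean cluster size below are illustrative assumptions, not the trial's estimates; the 8% baseline monthly placement rate and the roughly 200 new job seekers per arm per month are taken from the text.

```python
import math
from statistics import NormalDist

def design_effect(mean_size, cv, icc):
    """Design effect for a cluster RCT with unequal cluster sizes,
    using the standard approximation 1 + ((cv^2 + 1) * m_bar - 1) * icc."""
    return 1 + ((cv**2 + 1) * mean_size - 1) * icc

def mde(n_per_arm, p0, deff, alpha=0.05, power=0.80):
    """Minimum detectable absolute change in a proportion at the given
    power, deflating the sample size by the design effect."""
    z = NormalDist()
    se = math.sqrt(2 * p0 * (1 - p0) / (n_per_arm / deff))
    return (z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) * se

# Assumed cluster structure (illustrative): mean size 90, CV 0.5, ICC 0.01.
deff = design_effect(mean_size=90, cv=0.5, icc=0.01)

# MDES for each possible trial duration from 1 to 12 months, adding
# ~200 job seekers per arm per month as in the text.
mdes_by_month = {m: mde(200 * m, p0=0.08, deff=deff) for m in range(1, 13)}
```

The MDES shrinks with each additional month of enrolment, which is how a target effect size maps to a required trial duration.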
For job placements, over three months we were powered to detect a 40% relative impact, an absolute increase (or decrease) in the placement rate of approximately 3.5 percentage points. Three months allowed a more reasonable MDES and was within the duration of the agreement we held with our partners. This effect size is in line with similar available studies (Altmann et al., 2018, "Learning about job search: A field experiment with job seekers in Germany"; Belot et al., 2015, "Providing Advice to Job Seekers at Low Cost: An Experimental Study on On-Line Advice"), and we concluded that we were sufficiently powered in this study. A clear trial end date also gave us a more controlled environment by limiting the risk that the partner provider would need to implement changes at any site in our sample.
INSTITUTIONAL REVIEW BOARDS (IRBs)
University of Adelaide Human Research Ethics Committee
IRB Approval Date
IRB Approval Number
Post Trial Information
Is the intervention completed?
Intervention Completion Date
August 17, 2017, 12:00 AM +00:00
Is data collection complete?
Data Collection Completion Date
March 01, 2019, 12:00 AM +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
22 clusters, of which 11 in treatment
Was attrition correlated with treatment status?
Final Sample Size: Total Number of Observations
3,746 job seekers
Final Sample Size (or Number of Clusters) by Treatment Arms
1,764 job seekers in control, and 1,983 in treatment
Is public data available?