Measuring the Unmeasured: Combining Technology and Survey Design to Filter Noise in Self-Reported Business Outcomes

Last registered on February 12, 2014

Pre-Trial

Trial Information

General Information

Title
Measuring the Unmeasured: Combining Technology and Survey Design to Filter Noise in Self-Reported Business Outcomes
RCT ID
AEARCTR-0000251
First published
February 12, 2014, 12:47 PM EST

First published corresponds to when the trial was first made public on the Registry after being reviewed.

Locations

Region

Primary Investigator

Affiliation
London Business School

Other Primary Investigator(s)

PI Affiliation
World Bank

Additional Trial Information

Status
Ongoing
Start date
2013-10-07
End date
2014-12-01
Secondary IDs
Abstract
Reducing noise and improving the measurement of business outcomes is critical to advancing research on private enterprise in low-income countries, particularly given the lack of record keeping (by entrepreneurs) and administrative data (by governments). To address many of these outstanding measurement challenges, we have constructed an electronic survey tool that brings together the benefits of the latest surveying technologies with improvements in survey design to increase the precision of profit and sales estimates obtained through research on micro enterprises in low-income countries. We have identified four sources of noise, both in the literature and through our own research, which we will address through our new electronic surveying approach (i.e., respondent, business, surveyor, and data-management factors). By focusing on survey design and technology, our novel approach represents a first attempt at reconciling the recommendations of de Mel et al. (2009) for increased attention to survey design with those of Fafchamps et al. (2012) for identifying ways to leverage an electronic survey tool beyond basic consistency checks.

Specifically, our new electronic survey tool is designed to improve the collection of data on three sections of a firm’s financials: (1) Money In; (2) Money Out; and (3) Money Left Over. Through an iterative process, the tool converges on more precise estimates of firm sales, firm costs, and firm profits. Moreover, the electronic tool offers certain advantages by default over a comparable paper survey tool.

To test whether our new approach does a better job of filtering out noise and increasing measurement precision, we will conduct a randomized controlled trial (RCT) to evaluate the effectiveness of a traditional paper survey tool compared to our new electronic survey tool. Our proposed study will include an independent sample of micro entrepreneurs from Ghana. The RCT will be carried out by the field research staff of Innovations for Poverty Action (IPA) in 2014. In Part 1, we will identify a sample of 600 micro entrepreneurs (50% female) operating their business in greater Accra. In Part 2, we will randomly assign half of the entrepreneurs into a treatment group (administered our new electronic survey tool) and half into a control group (administered a comparable paper survey tool). This field experiment will result in the construction of a novel dataset.
External Link(s)

Registration Citation

Citation
Anderson-Macdonald, Stephen and Bilal Zia. 2014. "Measuring the Unmeasured: Combining Technology and Survey Design to Filter Noise in Self-Reported Business Outcomes." AEA RCT Registry. February 12. https://doi.org/10.1257/rct.251-1.0
Former Citation
Anderson-Macdonald, Stephen and Bilal Zia. 2014. "Measuring the Unmeasured: Combining Technology and Survey Design to Filter Noise in Self-Reported Business Outcomes." AEA RCT Registry. February 12. https://www.socialscienceregistry.org/trials/251/history/1067
Sponsors & Partners

There is information in this trial unavailable to the public; it can be requested through the Registry.
Experimental Details

Interventions

Intervention(s)
Intervention Start Date
2013-10-07
Intervention End Date
2013-12-20

Primary Outcomes

Primary Outcomes (end points)
Our analysis will evaluate whether the new electronic surveying approach works better and, if so, how it actually helps to improve the precision of estimates. Outcomes to be examined include the coefficient of variation (CV) for each estimate, as well as the adjustment factor (AF) within each estimate (i.e. the extent to which an estimate changes from its pre-adjusted initial value to its post-adjusted final value). For example, we will compare the pre-adjusted estimate for monthly profits (straight, unaided recall) with the post-adjusted estimate for monthly profits (obtained through the steps in the electronic survey tool).

Importantly, we will be able to compare each post-adjusted estimate obtained using our electronic tool to multiple benchmarks:
(1) Within tool: CV of post-adjusted estimates (electronic tool) versus CV of pre-adjusted estimates (electronic tool).
(2) Between tools: CV of post-adjusted estimates (electronic tool) versus CV of post-adjusted estimates (paper tool).
(3) Between tools: AF of estimates (electronic tool) versus AF of estimates (paper tool).

In sum, we aim to examine whether sources of noise can be reduced and precision increased for profit and sales estimates of micro enterprises by using an electronic surveying approach that focuses on new triangulation, aggregation and adjustment processes in survey design.
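As a concrete illustration of the two headline metrics, the CV and AF defined above can be computed as follows. This is a minimal sketch: the profit figures are invented for illustration only, and the function names are ours, not the study's.

```python
from statistics import mean, stdev

def coefficient_of_variation(estimates):
    """CV = sample standard deviation divided by the mean, across respondents."""
    return stdev(estimates) / mean(estimates)

def adjustment_factor(pre, post):
    """AF = relative change from the pre-adjusted (unaided recall) value
    to the post-adjusted (tool-assisted) value, within one respondent."""
    return (post - pre) / pre

# Purely illustrative monthly-profit reports (not study data):
pre_adjusted  = [400, 950, 300, 1200, 500]   # straight, unaided recall
post_adjusted = [420, 800, 310, 1000, 520]   # after the tool's adjustment steps

cv_pre  = coefficient_of_variation(pre_adjusted)
cv_post = coefficient_of_variation(post_adjusted)
afs = [adjustment_factor(p, q) for p, q in zip(pre_adjusted, post_adjusted)]
print(round(cv_pre, 3), round(cv_post, 3), [round(a, 3) for a in afs])
```

A within-tool comparison of the kind described in benchmark (1) would then test whether `cv_post` is systematically lower than `cv_pre` across the sample.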
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
Our proposed study will include an independent sample of micro entrepreneurs from Ghana (n=600). The RCT will be carried out by the field research staff of Innovations for Poverty Action (IPA) in 2014. In Part 1, we will identify a sample of 600 micro entrepreneurs (50% female) operating their business in greater Accra. In Part 2, we will randomly assign half of the entrepreneurs into a treatment group (administered our new electronic survey tool) and half into a control group (administered a comparable paper survey tool).

Importantly, all members of the survey team will be trained for 3 weeks by IPA research managers. Each surveyor will administer either the electronic tool (treatment group) or the paper tool (control group) to participants on a randomized basis (supervised by IPA staff). We will control for individual surveyor fixed effects in our analysis. Thus, any differences between the two survey approaches (electronic versus paper) will be attributable to the survey design, and not to potential surveyor biases or training effects.
Experimental Design Details
Randomization Method
Randomization done in the office by computer (Stata do-file).
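The study's randomization was performed in a Stata do-file. A minimal Python sketch of the same design, individual-level assignment of 600 entrepreneurs into two equal arms with a fixed seed for reproducibility, might look like the following; the seed value and the absence of stratification are our assumptions, not details from the registration.

```python
import random

def assign_arms(entrepreneur_ids, seed=251):
    """Individual-level randomization into electronic (treatment) vs. paper
    (control) survey arms, half and half. The fixed seed makes the assignment
    reproducible, mirroring `set seed` in a Stata do-file; the value 251 is
    illustrative only."""
    rng = random.Random(seed)
    shuffled = list(entrepreneur_ids)
    rng.shuffle(shuffled)
    treatment = set(shuffled[: len(shuffled) // 2])
    return {i: ("electronic" if i in treatment else "paper")
            for i in entrepreneur_ids}

arms = assign_arms(range(600))
print(sum(1 for a in arms.values() if a == "electronic"))  # 300
```

Rerunning with the same seed reproduces the identical assignment, which is the property that makes an office-based, scripted randomization auditable.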
Randomization Unit
Individual entrepreneur (firm owner)
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
n/a
Sample size: planned number of observations
600 entrepreneurs (firm owners)
Sample size (or number of clusters) by treatment arms
300 entrepreneurs to be randomly assigned into treatment group (exposed to the new electronic surveying approach).
300 entrepreneurs to be randomly assigned into control group (exposed to the traditional paper surveying approach).
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Target effect size: 15% improvement in the coefficient of variation or adjustment factors.
Required sample size: 291 entrepreneurs per group (treatment, control).
Assumptions for power calculations:
- 80% power
- α = 0.05
- ρ = 0.50 (autocorrelation in firm outcomes)
- 1 pre-treatment survey round and 3 post-treatment survey rounds
- coefficient of variation (CV) of 1.0
- equal group sizes, e.g. treatment (n = 291) versus control (n = 291)
[Used the "sampsi" command in Stata for calculations.]
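The registered figure of 291 per arm can be reproduced from the stated assumptions with a normal-approximation two-sample power calculation combined with the Frison and Pocock repeated-measures (ANCOVA) variance factor for 1 baseline and 3 follow-up rounds. That sampsi applied exactly this method is our assumption; the sketch below only shows that the stated inputs are consistent with the stated sample size.

```python
from math import ceil
from statistics import NormalDist

z = NormalDist().inv_cdf  # standard normal quantile function

alpha, power = 0.05, 0.80
rho = 0.50            # autocorrelation in firm outcomes across rounds
effect = 0.15         # 15% improvement; with CV = 1.0, sigma equals the mean,
                      # so the standardized effect size delta/sigma is 0.15
m_pre, k_post = 1, 3  # baseline rounds and follow-up rounds

# Required n per arm for a single-round two-sample comparison of means:
n_single_round = 2 * ((z(1 - alpha / 2) + z(power)) / effect) ** 2

# Variance-reduction factor for ANCOVA with m_pre baseline and k_post
# follow-up measurements (Frison-Pocock formula):
factor = (1 + (k_post - 1) * rho) / k_post \
         - m_pre * rho**2 / (1 + (m_pre - 1) * rho)

n_per_arm = ceil(n_single_round * factor)
print(n_per_arm)  # 291, matching the registered figure
```

Without the baseline and repeated follow-ups the single-round requirement would be roughly 698 per arm, so the multi-round design cuts the required sample by more than half under these assumptions.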
IRB

Institutional Review Boards (IRBs)

IRB Name
London Business School Research Ethics Committee
IRB Approval Date
2013-07-01
IRB Approval Number
REC 76(a)

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public; it can be requested through the Registry.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials