Measuring Domestic Violence: A New Approach

Last registered on January 22, 2018

Pre-Trial

Trial Information

General Information

Title
Measuring Domestic Violence: A New Approach
RCT ID
AEARCTR-0000517
Initial registration date
September 29, 2014

First published
September 29, 2014, 8:41 PM EDT

Last updated
January 22, 2018, 6:07 PM EST

Locations

Region

Primary Investigator

Affiliation
University of Connecticut

Other Primary Investigator(s)

PI Affiliation
Inter-American Development Bank

Additional Trial Information

Status
Completed
Start date
2015-07-01
End date
2015-08-25
Secondary IDs
Abstract
The project designed and implemented list experiments to measure the incidence of intimate partner violence among women in Peru.
External Link(s)

Registration Citation

Citation
Aguero, Jorge and Veronica Frisancho. 2018. "Measuring Domestic Violence: A New Approach." AEA RCT Registry. January 22. https://doi.org/10.1257/rct.517-2.0
Former Citation
Aguero, Jorge and Veronica Frisancho. 2018. "Measuring Domestic Violence: A New Approach." AEA RCT Registry. January 22. https://www.socialscienceregistry.org/trials/517/history/25195
Sponsors & Partners

There is information in this trial unavailable to the public.
Experimental Details

Interventions

Intervention(s)
Domestic violence is a sensitive topic and, as such, we will find reticent respondents, i.e., respondents who give false answers with a nonzero probability when an honest answer could reveal that they have committed or been exposed to a sensitive act. We compare the protocols used by the Peruvian Demographic and Health Survey (DHS) against a new design. The DHS is a nationally representative survey that includes a module to measure domestic violence and has been used in Peru for the past 14 years. Thus, by relying on the DHS methods and questions, we will be able to frame our findings relative to well-established measures of violence nationwide.

Our alternative method tries to capture violence against women using list experiments, a technique sometimes used in political science (Blair and Imai 2012). In a list experiment, subjects are given a list of statements and asked how many of them are true. They are not asked which ones are true, so list experiments can provide a more private environment in which to answer questions on sensitive topics more honestly. In this setting, the control group receives a list of innocuous statements (e.g., "I had breakfast this morning"). The treatment group is read the same list plus one additional sensitive statement; in our case, this statement is about domestic violence (e.g., "My partner pushed me in the past 12 months"). The mean difference in responses between treatment and control is an estimate of the incidence of the sensitive item.
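For concreteness, the difference-in-means estimator can be sketched in a few lines; the arm sizes, item probabilities, and simulated responses below are illustrative placeholders, not the study's actual instrument or data.

```python
# Minimal sketch of the list-experiment difference-in-means estimator
# (in the spirit of Blair and Imai 2012). All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

n_control, n_treat = 650, 500              # hypothetical arm sizes
p_innocuous = [0.8, 0.5, 0.3, 0.6, 0.4]    # hypothetical rates for 5 innocuous items
p_sensitive = 0.20                         # hypothetical incidence of the sensitive item

# Each respondent reports only the COUNT of true statements, never which ones.
control_counts = rng.binomial(1, p_innocuous, size=(n_control, 5)).sum(axis=1)
treat_counts = (rng.binomial(1, p_innocuous, size=(n_treat, 5)).sum(axis=1)
                + rng.binomial(1, p_sensitive, size=n_treat))

# The difference in mean counts estimates the incidence of the sensitive item.
estimate = treat_counts.mean() - control_counts.mean()
se = np.sqrt(treat_counts.var(ddof=1) / n_treat
             + control_counts.var(ddof=1) / n_control)
print(f"estimated incidence = {estimate:.3f} (SE = {se:.3f})")
```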
Intervention Start Date
2015-07-01
Intervention End Date
2015-08-25

Primary Outcomes

Primary Outcomes (end points)
We will compare the incidence of domestic violence as measured by the DHS-style questions against that obtained from the list experiments. We will also explore whether the gradients of violence by socioeconomic status are similar under both types of measurement.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
We randomly assigned bank clients to two arms. In the first arm, the control group, women were asked about four sets of statements without the sensitive questions; each set contained five statements. In the second arm, the treatment group, each set had one extra statement. These additional (and sensitive) statements were about sexual and physical violence. As described above, participants reported how many statements were true, not which ones were true. To measure the incidence of violence in the control group, we use a DHS-style set of questions.
Experimental Design Details
Randomization Method
Two balls (one white and one blue) were placed in a black bag. The enumerator asked the bank client to draw one ball without looking. If the ball was blue, the woman was assigned to the treatment group; otherwise, she was assigned to the control group.
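In effect, this procedure gives each respondent an independent 50/50 chance of treatment. A minimal digital analogue (purely illustrative; the field protocol used the physical draw described above) is:

```python
# Digital analogue of the ball draw: each client independently has a
# 50/50 chance of drawing the blue (treatment) ball. Illustrative only.
import numpy as np

rng = np.random.default_rng(517)                 # arbitrary seed
n_clients = 1223                                 # women interviewed in the study
ball = rng.choice(["blue", "white"], size=n_clients)
assignment = np.where(ball == "blue", "treatment", "control")
print((assignment == "treatment").mean())        # share treated, close to 0.5
```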
Randomization Unit
As described above, the randomization was done at the individual level.
Was the treatment clustered?
No

Experiment Characteristics

Sample size: planned number of clusters
There are 112 banks. However, as described elsewhere, randomization was conducted at the individual level.
Sample size: planned number of observations
We expect to interview around 1800 women.
Sample size (or number of clusters) by treatment arms
Around 500 women will be in the treatment group and the rest in the control.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
Because the randomization takes place at the individual level, we will apply the list experiment with the sensitive questions to 500 women, and the rest will answer the shorter lists and the DHS-style questions about violence. This guarantees an MDE of 25% of a standard deviation with a power of 80%.
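For reference, the standard two-sample minimum detectable effect (in standard-deviation units) can be computed as sketched below. The registration does not spell out the outcome variance or allocation assumptions behind the 25% figure, so the sample sizes in the example call are placeholders rather than a reproduction of that calculation.

```python
# Standard two-sample MDE in standard-deviation units:
#   MDE = (z_{1 - alpha/2} + z_{power}) * sqrt(1/n_treat + 1/n_control)
from scipy.stats import norm

def mde_in_sd_units(n_treat, n_control, alpha=0.05, power=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return z * (1 / n_treat + 1 / n_control) ** 0.5

# Example call with placeholder arm sizes (not the study's exact inputs):
# mde_in_sd_units(n_treat=500, n_control=1300)
```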
IRB

Institutional Review Boards (IRBs)

IRB Name
University of Connecticut
IRB Approval Date
2015-06-19
IRB Approval Number
#H15-164: "Measuring Violence Against Women with Experimental Methods."
IRB Name
Institutional Review Board Services
IRB Approval Date
2014-07-28
IRB Approval Number
FINCA Peru: Women and Microfinance in Rural Peru
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
Yes
Intervention Completion Date
August 25, 2015, 12:00 +00:00
Data Collection Complete
Yes
Data Collection Completion Date
August 25, 2015, 12:00 +00:00
Final Sample Size: Number of Clusters (Unit of Randomization)
Final sample: 1078 women (randomization was done at the individual level)
Was attrition correlated with treatment status?
No
Final Sample Size: Total Number of Observations
From the total pool of 1873 clients in 112 village banks in ADRA's microcredit program in Lima, we first drop all clients under age 18 as well as all women above 65. This leaves us with a universe of 1776 clients. We draw 6 banks at random and exclude them from the study in order to pilot the instruments with their members. The remaining universe comprises 1690 clients in 106 banks. Finally, we work with all banks with monthly meetings scheduled during July 2015, which restricts the population of interest to 1562 women in 98 village banks. We targeted this restricted universe and were able to interview 1223 women between July 1 and August 25, 2015. Randomization of the treatment was done at the individual level and was conducted by the surveyor. The questionnaire was administered via tablets. Due to some initial complications with the software, we drop a few surveys that were incorrectly assigned the list experiment questions from both treatment arms, leaving a sample of 1078 valid surveys.
Final Sample Size (or Number of Clusters) by Treatment Arms
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Abstract
A growing literature seeks to identify policies that could reduce intimate partner violence. However, in the absence of reliable administrative records, this violence is often measured using self-reported data from health surveys. In this paper, an experiment is conducted comparing data from such surveys against a methodology that provides greater privacy to the respondent. Non-classical measurement error is identified in the health surveys: college-educated women, but not the less educated, underreport physical and sexual violence. The paper provides a low-cost solution to correct the bias in the estimation of causal effects under non-classical measurement error in the dependent variable.
Citation
Aguero, J. and V. Frisancho (2017). "Misreporting in Sensitive Health Behaviors and Its Impact on Treatment Effects: An Application to Intimate Partner Violence." IDB Working Paper IDB-WP-853.

Reports & Other Materials