The Impact of Personalization and SMS Reminders on the Uptake of Customer Satisfaction Surveys in Government Services Centers

Last registered on November 14, 2024

Pre-Trial

Trial Information

General Information

Title
The Impact of Personalization and SMS Reminders on the Uptake of Customer Satisfaction Surveys in Government Services Centers
RCT ID
AEARCTR-0014592
Initial registration date
October 21, 2024

First published
October 28, 2024, 12:57 PM EDT

Last updated
November 14, 2024, 8:17 AM EST

Locations

Region

Primary Investigator

Affiliation

Other Primary Investigator(s)

PI Affiliation
NL Partners
PI Affiliation
Boston Consulting Group

Additional Trial Information

Status
Ongoing
Start date
2024-10-06
End date
2024-11-30
Secondary IDs
Prior work
This trial does not extend or rely on any prior RCTs.
Abstract
Customer satisfaction surveys are essential tools for government agencies, providing a cost-effective and efficient method for gathering large-scale feedback on service quality. These surveys help identify customer pain points and enhance the overall user experience. However, low response rates present a significant challenge, impacting both the quality of the data collected and the depth of actionable insights that can be derived. An analysis of over 400,000 web-based surveys conducted across seven Government Service Centers (GSCs) from July 2023 to July 2024 illustrated this issue, revealing a low completion rate of approximately 9%. Over 80% of these responses were received within the first hour after the SMS invite was sent.

To address this challenge, a randomized controlled trial (RCT) was implemented to test the effectiveness of behavioral insights in increasing survey participation rates. The intervention employed personalized SMS invitations and timely reminders to encourage survey completion following visits to the GSCs. The study consisted of four treatment arms: a control group receiving the original generic SMS, a group receiving personalized SMS invites, another group receiving the original SMS with timely reminders, and a final group that combined both personalized invites and reminders.

The intervention will run for two weeks (10 working days) across four GSCs serving approximately 1,500 customers daily (60% of whom are first-time visitors). It aims to test and develop a scalable, cost-effective solution for enhancing customer feedback mechanisms in public service environments.
External Link(s)

Registration Citation

Citation
Makki, Fadi, Ali Osseiran and Nabil Saleh. 2024. "The Impact of Personalization and SMS Reminders on the Uptake of Customer Satisfaction Surveys in Government Services Centers." AEA RCT Registry. November 14. https://doi.org/10.1257/rct.14592-2.0
Experimental Details

Interventions

Intervention(s)
The intervention involved sending personalized SMS invitations to customers shortly after they received a service at one of four participating GSCs, inviting them to complete a web-based survey rating their experience at the center. The messages were customized to make them more relevant and engaging, and were sent in both Arabic and English to accommodate language preferences.

The intervention also included sending reminders the day after the initial SMS invite, scheduled for 7 PM. These reminders were sent to customers who had not yet completed the survey, prompting them again to participate. Like the original invitation, the reminders were personalized and delivered in both Arabic and English.

Treatment Arms

The trial utilized a randomized controlled design comprising four treatment arms:
- Control Group: Received the original generic SMS invitation.
- Personalized SMS Group: Received personalized SMS invitations tailored to their specific interactions with the government service.
- Original SMS + Reminder Group: Received the original SMS invitation along with a follow-up reminder sent the following day at 7 PM.
- Personalized SMS + Reminder Group: Received both the personalized SMS invitations and a follow-up reminder.
Intervention (Hidden)
Original SMS Invite:
“Dear, Please take some time to rate your experience at our Government Service Centre on this link: (link)”

Personalized SMS Invite:
“Greetings, You have received a service from (Government Entity) at (Government Services Centre). Kindly rate your experience via: (Link). It will not take more than 30 seconds. Thank you, Governmental Services Centers.”

SMS Reminder:
“Greetings, We still need your feedback to improve the quality of our services. Please fill out the survey sent to you when you requested a service from (Government Entity) at (Government Service Center). Thank you, Governmental Services Centers.”

The SMS reminders were sent the day after the initial invitation at 7 PM, and each customer received only one reminder. The survey link provided in the SMS invitations expired 10 days from the date the invite was sent.
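The reminder rule above (one reminder per customer, sent at 7 PM the day after the invite, only to non-completers) can be sketched as follows. This is an illustrative reconstruction, not the trial's actual delivery system, and the function names are assumptions:

```python
from datetime import datetime, time, timedelta

def reminder_time(invite_sent: datetime) -> datetime:
    """7 PM on the day after the invite was sent."""
    return datetime.combine(invite_sent.date() + timedelta(days=1), time(19, 0))

def needs_reminder(completed: bool, already_reminded: bool) -> bool:
    """Each non-completer receives at most one reminder."""
    return not completed and not already_reminded
```

For example, an invite sent on 2024-10-06 at 14:30 would yield a reminder scheduled for 2024-10-07 at 19:00.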
Intervention Start Date
2024-10-06
Intervention End Date
2024-10-27

Primary Outcomes

Primary Outcomes (end points)
Response Rate: The proportion of customers who complete the customer satisfaction survey within the 10-day period following the SMS invitation.
Primary Outcomes (explanation)

Secondary Outcomes

Secondary Outcomes (end points)
Response time: The average time taken by customers to respond to the survey after receiving the SMS invites.
Engagement rate: The proportion of customers who clicked on the customer satisfaction survey link.
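As a concrete illustration of how the three outcome measures relate, here is a minimal sketch computing response rate, engagement rate, and mean response time from toy invite records (the field names and values are invented for illustration; they are not the trial's actual data schema):

```python
# One dict per SMS invite (toy data, not from the trial).
records = [
    {"clicked": True,  "completed": True,  "response_min": 12.0},
    {"clicked": True,  "completed": False, "response_min": None},
    {"clicked": False, "completed": False, "response_min": None},
]

n = len(records)
response_rate = sum(r["completed"] for r in records) / n    # primary outcome
engagement_rate = sum(r["clicked"] for r in records) / n    # clicked the survey link
times = [r["response_min"] for r in records if r["response_min"] is not None]
mean_response_min = sum(times) / len(times)                 # among responders only
```

Note that response time is defined only over responders, so it conditions on the primary outcome rather than being independent of it.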
Secondary Outcomes (explanation)

Experimental Design

Experimental Design
The study employed a randomized controlled trial (RCT) with a 2×2 factorial design to evaluate the effectiveness of the personalized SMS invite and the follow-up reminder in increasing participation rates in the customer satisfaction surveys. This design allowed the effects of personalization and of reminders to be assessed separately as well as in combination.

Key Components of the Experimental Design

Treatment Arms:
The trial comprised four treatment arms:
- Control Group: Received the original generic SMS invitation.
- Personalized SMS Group: Received personalized SMS invitations tailored to their specific interactions with the government service.
- Original SMS + Reminder Group: Received the original SMS invitation along with a follow-up reminder sent the following day at 7 PM.
- Personalized SMS + Reminder Group: Received both the personalized SMS invitations and a follow-up reminder.

Randomization:
Customers visiting one of the four participating GSCs were randomly assigned to one of the four treatment arms using simple randomization: upon checking into a GSC to request a government service, each customer was assigned a random number between 1 and 4, with each number corresponding to a specific treatment arm.

To ensure consistency in assignment, customers were tracked using the local mobile numbers provided during check-in. This tracking mechanism allowed for repeat visitors to remain in the same intervention group throughout the trial period.
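A minimal sketch of this assignment logic, keyed on the check-in mobile number so that repeat visitors stay in their original arm. This is illustrative only; the arm labels and the lookup structure are assumptions, not the actual check-in system:

```python
import random

# Arms 1-4 as described above; the label strings are assumptions.
ARMS = {
    1: "control",
    2: "personalized",
    3: "original + reminder",
    4: "personalized + reminder",
}

assignments: dict[str, int] = {}  # mobile number -> arm number

def assign(mobile: str) -> str:
    """Draw a random arm on first check-in; repeat visitors keep their arm."""
    if mobile not in assignments:
        assignments[mobile] = random.randint(1, 4)
    return ARMS[assignments[mobile]]
```

Calling `assign` twice with the same mobile number returns the same arm, which is what makes the customer (rather than the visit) the unit of randomization.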

Sample Size:
The study aimed to include approximately 10,000 customers, with each treatment arm consisting of about 2,500 customers. This sample size was deemed sufficient to detect meaningful differences of 2 percentage points in response rates between the groups.

The trial anticipated collecting around 15,000 survey observations, which included responses from first-time visitors as well as repeat customers.
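Under a standard normal approximation for comparing two proportions (α = 0.05 two-sided and 80% power, conventional values assumed here rather than stated in the registration), the detectable difference implied by 2,500 customers per arm can be checked as follows:

```python
from statistics import NormalDist

p0 = 0.09  # baseline response rate reported above
n = 2500   # customers per arm
z_alpha = NormalDist().inv_cdf(1 - 0.05 / 2)  # ~1.96, two-sided alpha = 0.05
z_beta = NormalDist().inv_cdf(0.80)           # ~0.84, 80% power
# Normal-approximation MDE for a difference in proportions with equal arms,
# using the baseline variance in both arms for simplicity.
mde = (z_alpha + z_beta) * (2 * p0 * (1 - p0) / n) ** 0.5
print(round(mde, 3))  # ~0.023, i.e. roughly 2.3 percentage points
```

This lands between the 2-percentage-point figure above and the 2.5-point MDE reported under Experiment Characteristics; the exact value depends on the assumed power, alpha, and variance convention.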

Duration of the Trial:
The intervention will be implemented over a two-week period, with data collection concluding 10 days afterward, providing sufficient data to achieve the desired statistical power.

Data Collection:
Surveys were administered through web-based links sent via SMS. Each customer received a unique link to participate in the survey, which expired 10 days after the invitation was sent.

Outcome Measures:
The primary outcome measure was the completion rate of the customer satisfaction surveys.
Experimental Design Details
Randomization Method
A simple randomization method was employed to assign customers to one of the four intervention groups. Upon checking into a participating government services center, each customer was assigned a random number between 1 and 4, with each number representing a specific intervention group. Customers were tracked using the local mobile numbers they provided during check-in, ensuring that the same customer (mobile number) remained assigned to the same group if they visited multiple times during the intervention period.
Randomization Unit
Individual customer; some customers visited the centers more than once during the intervention period.
Was the treatment clustered?
Yes

Experiment Characteristics

Sample size: planned number of clusters
10,000 customers
Sample size: planned number of observations
15,000 surveys
Sample size (or number of clusters) by treatment arms
2,500 customers per group.
Minimum detectable effect size for main outcomes (accounting for sample design and clustering)
The minimum detectable effect size (MDE) is around 2.5 percentage points (a 27.7% relative increase), assuming a baseline response rate of 9% and, conservatively, an intra-cluster correlation (ICC) of 1.0.
IRB

Institutional Review Boards (IRBs)

IRB Name
Salus IRB
IRB Approval Date
2024-11-09
IRB Approval Number
24647
Analysis Plan

There is information in this trial unavailable to the public.

Post-Trial

Post Trial Information

Study Withdrawal

There is information in this trial unavailable to the public.

Intervention

Is the intervention completed?
No
Data Collection Complete
Data Publication

Data Publication

Is public data available?
No

Program Files

Program Files
Reports, Papers & Other Materials

Relevant Paper(s)

Reports & Other Materials