Co-operation or freeloading: What is the effect of conditional versus unconditional incentives in an SMS survey?

Written by: Alexandra Cronberg

Introduction

Gifts can be a tricky business. While they may stem from pure generosity and care, they often come with sticky strings. Just ask all the companies that tightly regulate the receipt of gifts from, say, potential clients or partners. Such are human relationships that obligation and reciprocity often govern behaviour and interactions, for better or worse.

In survey research we may draw on the same deep-seated human traits of obligation and reciprocity to get respondents to complete our questionnaires. We can do this by giving an unconditional gift, i.e. an incentive, in advance of asking for participation. Indeed, several studies[1] on postal surveys have shown that unconditional incentives do lead to higher response rates than giving a gift conditional upon completing the survey, which arguably treats the questionnaire more like a transactional exchange.

The use and administration of incentives is a particularly relevant issue for surveys using self-completion questionnaires, such as postal and SMS surveys: these data collection modes do not have the benefit of an interviewer who can coax respondents to take part, and therefore need to rely on incentives to a greater extent.

Now, the same studies showing that unconditional incentives in postal surveys lead to higher response have also shown that unconditional incentives are actually not cost efficient. This can be because letters go undelivered or reach ineligible respondents. Some respondents will also take the incentive, e.g. a voucher attached to an advance letter, without completing the questionnaire. Consequently, few postal surveys in practice administer incentives unconditionally.

With the increasing popularity of SMS surveys, it is pertinent to ask whether unconditional incentives have the same effect on SMS as on postal surveys, and whether they are cost efficient. In particular, SMS has the advantage over postal surveys that respondents can easily opt in, meaning cost efficiency may well be improved.

To answer these questions, Kantar Public carried out a small experimental study together with the British Council. Read on to find out the results.

This study

The study involved an SMS survey with an experimental design to test the effect of administering conditional versus unconditional incentives. The study also sought to test the feasibility more broadly of using SMS as a data collection mode to gather feedback and progress updates from British Council course participants, but that question is the topic for another blog post.

The survey was carried out among participants in a British Council teacher training course in Ethiopia, and the questionnaire comprised 16 questions. The sample consisted of 434 respondents with valid telephone numbers, who were randomly allocated into one of two groups, Group A and Group B. The initial message was successfully delivered to 390 respondents (Group A: 199; Group B: 191). Each group was administered the survey as shown in the diagram below.
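
As a concrete aside, random allocation of this kind takes only a couple of lines. The sketch below (Python, with made-up phone numbers) is not the actual allocation code used in the study, just the idea: shuffle the sample and split it down the middle.

    import random

    # Hypothetical sample of respondents with valid telephone numbers (n = 434)
    sample = [f"+2519{i:08d}" for i in range(434)]

    random.seed(42)          # fixed seed so the allocation is reproducible
    random.shuffle(sample)   # randomise the order

    half = len(sample) // 2
    group_a = sample[:half]  # incentive conditional on completing all 16 questions
    group_b = sample[half:]  # incentive sent unconditionally, straight after opt-in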

[Diagram: survey flow for Group A (conditional incentive) and Group B (unconditional incentive)]

At the beginning of fieldwork, respondents were sent a message alerting them to the survey. A day later they were sent another message asking them to participate. In order to participate, respondents were instructed to first opt in by responding to the message. For Group A, the questions were then sent out, followed by the incentive provided the respondent completed all 16 questions. For Group B, the incentive was sent immediately after the respondent opted in, followed by the questions. The incentive consisted of airtime worth 15 Ethiopian Birr, equivalent to 0.55 US dollars.
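
To make the difference between the two protocols concrete, here is a minimal sketch of the message flow. The helper send_sms is a hypothetical stand-in for whatever SMS gateway was actually used, and the opt-in and completion flags stand in for real respondent behaviour.

    def send_sms(number, text):
        # Stand-in for a real SMS gateway call
        print(f"to {number}: {text}")

    def run_survey(number, group, opted_in, completed_all_16):
        send_sms(number, "A short British Council survey is coming tomorrow.")      # alert
        send_sms(number, "Reply YES to take part and receive 15 Birr of airtime.")  # invitation
        if not opted_in:
            return  # no opt-in: no questions, no incentive
        if group == "B":
            send_sms(number, "15 Birr of airtime has been sent to you.")  # unconditional incentive
        send_sms(number, "Question 1 of 16: ...")  # the 16 questions then follow one by one
        if group == "A" and completed_all_16:
            send_sms(number, "15 Birr of airtime has been sent to you.")  # conditional incentive

    # A Group B respondent who opts in but abandons the questions keeps the airtime:
    run_survey("+251900000001", "B", opted_in=True, completed_all_16=False)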

Findings

The findings from the study suggest that offering the incentive in advance yields a slightly higher response rate compared to an incentive conditional on the respondent completing all the questions. As shown in the table below, among Group B, 25% completed all the questions whereas in Group A the equivalent figure was 21%.

These figures are broadly in line with response rates for surveys of this nature. That said, it is clear that response is still fairly low, even in Group B.

How does this affect cost efficiency? As mentioned above, one advantage of SMS surveys over postal ones is that respondents can easily opt in before any further message or incentive is sent to them. This means that unconditional incentives are only sent to respondents who have a valid telephone number and who are eligible, thus minimising loss. There is, however, still the potential issue of respondents taking the incentive without completing the questionnaire. This problem turned out to be quite a notable one in our SMS survey: among respondents who opted in, nearly half of Group B (48%) did not complete the questionnaire. In other words, a large share of respondents took the incentive but ditched the questionnaire. The equivalent proportion who opted in but failed to answer all questions was somewhat higher for Group A (56%). Yet the overall cost of the airtime incentives (and the cost per completed interview) was lower for Group A, since that design did not pay any freeloaders.

Putting monetary values to the incentives given to Groups A and B, the total cost for Group B was ETB 15 × 93 = ETB 1,395 (USD 59.30), equivalent to an average of ETB 29 per completed interview. For Group A, the cost was ETB 15 per completed interview, giving a total of ETB 15 × 42 = ETB 630 (USD 26.77). Consequently, we might conclude that cost efficiency is a major concern for SMS surveys too when incentives are administered unconditionally.
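
For the record, the arithmetic behind these figures can be reproduced from the numbers quoted above; note that the completed-interview counts (42 and 48) are backed out from the stated totals rather than given directly.

    # Figures quoted in the post
    incentive = 15                    # ETB per incentive
    delivered_a, delivered_b = 199, 191
    completes_a = 630 // incentive    # = 42 completed interviews in Group A
    opted_in_b = 93                   # Group B opt-ins, each paid on opting in
    completes_b = round(1395 / 29)    # = 48 completed interviews in Group B

    print(f"Response rate A: {completes_a / delivered_a:.0%}")                    # ~21%
    print(f"Response rate B: {completes_b / delivered_b:.0%}")                    # ~25%
    print(f"Opted in but did not finish, B: {1 - completes_b / opted_in_b:.0%}")  # ~48%
    print(f"Cost per complete, A: ETB {incentive}")                               # ETB 15
    print(f"Cost per complete, B: ETB {incentive * opted_in_b / completes_b:.0f}")  # ~ETB 29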

Table: Survey outcomes by group

                               Group A (conditional)   Group B (unconditional)
Initial message delivered      199                     191
Completed all 16 questions     42 (21%)                48 (25%)
Incentives paid out            42                      93
Incentive cost per complete    ETB 15                  ETB 29

Conclusion

Based on the results from this experimental SMS study among teachers in Ethiopia, we can see that unconditional incentives yielded a slightly higher response rate than incentives conditional upon completion of the questionnaire. This finding is in line with other studies, and reaffirms the view that drawing on respondents’ sense of obligation and reciprocity is more productive than treating survey participation as a transactional exchange.

That said, it is clear that a large share of respondents are not that bothered about reciprocity when faced with a free gift, even when first asked for their active participation. In this light, administering unconditional incentives in an SMS survey is arguably not cost efficient: the average cost of unconditional incentives per completed interview was nearly double that of the conditional alternative.

Hence, the sense of obligation and reciprocity may well be part of deep-seated human traits and behaviour, but it seems that in a context of technology and faceless interactions, many respondents will turn into freeloaders. Unfortunately for us social researchers, free airtime does not seem to come with sticky strings.

 

[1] See for example Simmons, E. and Wilmot, A., ‘Incentive payments on social surveys: a literature review’, Office for National Statistics, UK, 2004; and Abdulaziz, K., Brehaut, J., Taljaard, M., et al., ‘National survey of physicians to determine the effect of unconditional incentives on response rates of physician postal surveys’, BMJ Open 2015;5:e007166. doi:10.1136/bmjopen-2014-007166

2 thoughts on “Co-operation or freeloading: What is the effect of conditional versus unconditional incentives in an SMS survey?”

  1. Thank you for sharing an informative article. I am surprised by the findings, as I thought the conditional incentive would have been better. But I think the caveat is the amount of the incentive provided; maybe it was not “too attractive”.

    One more thing: no statistical test was performed to compare the two groups. A statistical test would help to solidify the findings.


    • Joel, thank you for your comments. Yes, it might be surprising that unconditional incentives are more effective at increasing response, but it is very much in line with what other studies have found too.

      We conducted another study on SMS surveys a little while ago which, among other things, tested what difference it makes to response rates if the (conditional) incentive is increased ($0.50 vs $1.25) – it didn’t actually make much of a difference. That said, we can’t be sure that we’d observe the same effects for unconditional incentives, or for notably higher amounts.

      Regarding statistical testing, I didn’t include the results because I wanted to keep the blog post non-technical. The difference in response rate is not statistically significant at the 5% level (Fisher’s exact test yields a p-value of 0.2), but it should also be noted that the sample size is rather small.
      Since the results are in line with other studies, I think we can still be confident that our findings are valid. That said, as mentioned in the blog post, the difference in response rates is quite slight. If we have the opportunity, it would be great to conduct the same experiment on a bigger sample – and also on one that is representative of the general population – in order to verify the results.
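
      For anyone who wants to reproduce the test, a sketch using scipy is below. Note that the 2×2 table entered here is inferred from the figures in the blog post (completes out of delivered messages), so the p-value it produces need not match the one quoted above exactly, which may rest on a slightly different base.

          from scipy.stats import fisher_exact

          # 2x2 table inferred from the post: rows = groups,
          # columns = [completed all 16 questions, did not]
          table = [[42, 199 - 42],   # Group A (conditional incentive)
                   [48, 191 - 48]]   # Group B (unconditional incentive)

          odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
          print(f"p = {p_value:.2f}")   # well above the 5% significance threshold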

