Incentive Effects on Nonresponse and Data Quality
Abstract – A considerable body of research shows that prepaid incentives increase response rates in mail surveys (e.g., Church 1993; Singer & Ye 2013). This increase can sometimes be attributed to greater participation by certain types of respondents who would have participated at lower rates without the incentive (Petrolia and Bhattacharjee 2009; Singer and Kulka 2002), thus impacting nonresponse bias in survey estimates (Groves 2006). However, most incentive studies focus on response rates; only a handful have assessed the impact of incentives on nonresponse bias (Groves and Peytcheva 2008). Even fewer studies have examined the effects of incentives on data quality. Those that have considered item nonresponse under $2 versus $5 incentives (Shaw et al. 2001), the number of words, comments, and short answers provided by respondents (James & Bolstein 1990), and completeness (Hansen 1980). This presentation will report the results of a prepaid cash incentive experiment ($0 vs. $1) conducted in the 2014 Nebraska Annual Social Indicators Survey (NASIS) (n=1,018). The NASIS is an annual omnibus mail survey of Nebraska adults with a sample drawn from the USPS Delivery Sequence File. First, response rates will be reported by experimental treatment. Analyses will also compare the demographic makeup of the completed samples for the incentivized versus non-incentivized treatments, and will compare both treatments to American Community Survey population estimates, in order to assess whether the incentive changed the composition of the completed sample in ways that would be expected to affect nonresponse bias. Primary demographics of interest include age, gender, race, education level, income, marital status, and presence of children in the home. Respondents and nonrespondents from each group will also be compared by region as a further measure of nonresponse bias.
Responses to the 2014 NASIS will also be compared by experimental group in order to assess whether the incentivized group responded differently from the non-incentivized group, controlling for demographics. The 2014 NASIS asked questions on a range of topics including natural resources, underage drinking, safety, vaccinations, the Affordable Care Act, and plant management. Next, survey responses will be analyzed to assess the incentive's effects, if any, on data quality. The two experimental groups will be compared on item nonresponse, the number of words provided to an open-ended question, the use of nonsubstantive answer options, primacy effects, and straight-lining.
Keywords – Incentive, nonresponse, data quality
Focus Statement – Incentive effects are frequently studied in terms of response rates. This presentation will also examine their effects on nonresponse bias and data quality.