64th ISI World Statistics Congress - Ottawa, Canada

Smart household budget surveys: How to involve respondents in smart data collection?

Author

Jelmer C. de Groot

Co-author

  • Barry Schouten

Conference

64th ISI World Statistics Congress - Ottawa, Canada

Format: IPS Abstract

Keywords: hbs, householdconsumption, ocr, smart

Session: IPS 190 - Household Expenditure Programs in Official Statistics

Thursday 20 July 2 p.m. - 3:40 p.m. (Canada/Eastern)

Abstract

Household budget/expenditure surveys contain many elements that make them well suited to the introduction of smart survey features. These surveys tend to be burdensome, both in time and in cognitive effort, and the information requested may not always be readily available to respondents. Smart surveys introduce features of smart devices such as internal storage and computing, internal sensors, linkage to external sensor systems, access to public online data and various forms of data donation. Some of these features, such as receipt scanning or uploading, advanced product search algorithms and data donation, are very promising. They are also challenging in terms of user interface design and of processing through text/image extraction methods. Besides these technical aspects, methodological uncertainties arise: what approach strategy fits best when implementing smart features in a survey?
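As an illustration of the text/image extraction step, the following is a minimal sketch of OCR on a scanned receipt, assuming the open-source pytesseract wrapper around the Tesseract engine; the actual pipeline used in the field test is not specified here.

    # Hypothetical sketch of OCR-based receipt text extraction; the
    # production pipeline of the field test is not described in this abstract.
    from PIL import Image
    import pytesseract

    def extract_receipt_lines(image_path: str) -> list[str]:
        """Run OCR on a receipt image and return its non-empty text lines."""
        raw = pytesseract.image_to_string(Image.open(image_path))
        return [line.strip() for line in raw.splitlines() if line.strip()]

    # Downstream steps would classify each line (product, price, total)
    # and feed the results back to the respondent for in-app confirmation.
    lines = extract_receipt_lines("receipt.jpg")
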
In the ESSnet Smart Surveys project, a large-scale field test was conducted to test various recruitment and motivation strategies. Randomized conditions such as mode of invitation, in-app feedback on OCR results and personalized insights were added. These conditions allow for an analysis of data quality trade-offs. Since the aim is to reduce perceived respondent burden, it is crucial to know where the boundaries of respondent involvement and motivation lie, and how these depend on design features.
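As a sketch of how such an experimental allocation could be implemented, the illustrative snippet below assigns households to randomized conditions; the factor names and levels are assumptions for illustration, not the actual design.

    # Hypothetical random assignment of sampled households to conditions.
    import random

    CONDITIONS = {
        "invitation_mode": ["letter", "email"],
        "ocr_feedback": ["in_app", "after_fieldwork"],
        "insights": ["direct", "delayed"],
    }

    def assign_conditions(rng: random.Random) -> dict[str, str]:
        """Draw one level per experimental factor for a single household."""
        return {factor: rng.choice(levels) for factor, levels in CONDITIONS.items()}

    rng = random.Random(20230720)  # fixed seed makes the allocation reproducible
    allocation = {hh: assign_conditions(rng) for hh in ("hh001", "hh002", "hh003")}
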
In the paper, we will discuss respondents’ motivation and involvement in a smart household budget survey as a function of approach strategy design choices. The main question for this pilot was: how can we involve respondents in the fieldwork, and how can we keep them motivated throughout their diary period?
In the field test, interviewer assistance was randomized: one sample received assistance and the other did not, so the latter serves as a control group. To quantify the differences, two main questions were asked (a sketch of the comparison follows the questions):
1) What is the impact of interviewer assistance on respondents’ participation, motivation and involvement?
2) What is the recommended role of interviewers?
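For question 1), participation in the two samples could be compared with a standard two-proportion test; the sketch below uses statsmodels with placeholder counts, since neither the test used nor the outcomes are reported in this abstract.

    # Hypothetical comparison of participation rates between the
    # interviewer-assisted sample and the control sample.
    from statsmodels.stats.proportion import proportions_ztest

    participated = [420, 310]   # assisted, control (placeholder counts)
    invited = [1000, 1000]      # sample sizes (placeholder counts)

    stat, pvalue = proportions_ztest(count=participated, nobs=invited)
    print(f"z = {stat:.2f}, p = {pvalue:.4f}")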

Another way to keep respondents motivated is to give them insights into their own expenses. This was varied by giving half of the sample direct insights into their expenses, whereas the other half had access to their insights page only after the diary period. The main question here was:
1) What is the impact of direct versus delayed insights on registration and completion rates?
Another smart feature, implemented on the back-end side, was the logging of paradata. With these paradata, respondents’ in-app behaviour was analysed to further quantify the effects described above.
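As a minimal sketch of such back-end logging, the snippet below records timestamped in-app events per respondent; the event names and schema are assumptions for illustration, not the actual paradata specification.

    # Hypothetical paradata logging of in-app events, plus a simple
    # per-respondent summary of involvement.
    import time
    from collections import Counter

    def log_event(log: list[dict], respondent_id: str, event: str) -> None:
        """Append a timestamped in-app event such as 'receipt_scanned'."""
        log.append({"respondent": respondent_id, "event": event, "ts": time.time()})

    events: list[dict] = []
    log_event(events, "r001", "app_opened")
    log_event(events, "r001", "receipt_scanned")
    log_event(events, "r001", "insights_viewed")

    # Event counts per respondent as a crude measure of in-app involvement.
    involvement = Counter(e["respondent"] for e in events)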