Written by Dr. John Ward and Dr. Alex Smajgl, project leaders of the MK28 project
There are a number of common pains and ailments experienced by practitioners conducting livelihood surveys, particularly in remote communities. Unplanned delays in questionnaire design and piloting, variations in interviewing techniques and enumerator behaviours, respondent fatigue and unreliable data entry are a few.
Our recent experience in Lao PDR’s Nam Xong river basin provided some palliative, symptomatic relief. Here we tested digital tablets as an alternative to paper-based surveys.
Generally, we have only one opportunity to conduct field surveys in the Mekong; they are expensive and resource intensive, so it’s far better to identify any problems, make all the mistakes and correct them during planning and piloting. Questionnaire design, enumerator training, piloting, revision, data entry and validation have previously taken 4-6 months. The same process took only 4 weeks with the digital tablets. And we saved about 20,000 pages of printing.
Where do surveys fit into our work as researchers?
Most of us working for the CGIAR Research Program on Water, Land and Ecosystems in the Mekong are working in complex and contested decision-scapes. We are exposed to the diverse and often competing objectives and values of regulatory agencies, academic disciplines, industries and factions of civil society. In addition, we are motivated by the impetus to balance development with ecosystem functions and livelihood needs.
Our focus on integrated research reflects the view that effective integration improves policy performance and agency accountability, and reduces institutional fragmentation, duplication and bureaucratic complexity. Community participation and an understanding of community motivations, entitlements, endowments, value orientations and capabilities are essential features of successful integration. Successful participation, in turn, is measured by how receptive and responsive decision makers are to the future needs and aspirations of those affected by natural resource decisions.
Baseline surveys of households are one important technique in the suite of approaches to inform natural resource planners of community capabilities, needs and aspirations. Our experience in the Nam Xong suggests that digital tablets and web based data software can make a valuable contribution to the ensemble of available research tools to elicit the needs, aspirations and capacities of communities.
Many livelihood surveys conducted in the Mekong region have been too fragmented for effective policy analysis at a level important to central agencies. Non-randomised survey results and small sample sizes have restricted analysis to individual provinces, districts or villages, creating a mosaic of disconnected information and often limiting interpretation to a coarse typology of households of low, medium and high livelihood status.
Data collated from these fragmented approaches are a necessary starting point for baseline livelihood assessments of communities, but they limit the statistical inference needed to design effective policy interventions to improve livelihoods, community resilience and poverty reduction among non-sampled households.
Livelihood surveys in the Mekong are expensive, and survey budgets are typically limited. Any initiative to reduce costs generally translates into 1) more coherent and reliable questionnaire design and piloting, 2) a larger sample size and 3) improved data entry and reliability. These are the three areas where the use of digital tablets in the Nam Xong had substantial advantages.
Our experience with tablet interviews in the Nam Xong
With our national partners, we co-developed a complex and extensive Nam Xong questionnaire to meet the project objectives. After a two-day training session with enumerators, we conducted 60 tablet-based pilot interviews in two villages close to Vientiane in a single day. SIM cards were installed in each tablet so the results of completed interviews were automatically uploaded, along with enumerator name, time of interview, and spatial coordinates.
We were able to revise the questionnaire online during lunch and download these revisions to the tablets. Enumerators were able to complete pilot interviews with the revised questionnaire in the afternoon. Real-time data access meant we could check data integrity, consult with individual enumerators and respondents if they were having difficulty understanding the questions, and check the time taken for individual interviews. Effective enumerator training, completed reliability checks and reduced interview times (compared to paper-based interviews) translate into reduced costs, which in turn allow a larger sample in the field. That means reduced sampling error and an improved degree of external validity. Reduced piloting costs meant more of the budget could be allocated to field interviews: we estimate the digital tablets increased the field sample size by about 20%.
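The kind of real-time integrity check described above can be automated against the uploaded records. The sketch below is purely illustrative: the field names, the upload format and the 15-minute threshold are our assumptions, not the schema of the actual survey software we used.

```python
# Illustrative sketch: flag suspiciously short interviews in uploaded
# records so supervisors can follow up with specific enumerators.
# Field names and the 15-minute threshold are assumptions for this example.
from datetime import datetime

def flag_short_interviews(records, min_minutes=15):
    """Return (enumerator, interview_id) pairs whose duration falls
    below a plausible minimum for the full questionnaire."""
    flagged = []
    for rec in records:
        start = datetime.fromisoformat(rec["start"])
        end = datetime.fromisoformat(rec["end"])
        duration = (end - start).total_seconds() / 60
        if duration < min_minutes:
            flagged.append((rec["enumerator"], rec["interview_id"]))
    return flagged

# Two hypothetical uploads: a normal 55-minute interview and a
# 7-minute one that warrants a call to the supervisor.
uploads = [
    {"interview_id": 1, "enumerator": "A",
     "start": "2014-06-10T09:00", "end": "2014-06-10T09:55"},
    {"interview_id": 2, "enumerator": "B",
     "start": "2014-06-10T09:05", "end": "2014-06-10T09:12"},
]
print(flag_short_interviews(uploads))  # interview 2 is flagged for follow-up
```

The same pattern extends to other checks we ran by eye in the field, such as comparing per-enumerator averages or spotting identical repeated answers.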
In the past, considerable time (up to 3 months), effort and pain have been spent validating and correcting paper-based data after the field survey has been completed. The typical delay of several weeks between field interviews and completed data entry makes it substantially harder to check responses with specific enumerators. And complex questionnaires carry a high probability of incorrectly entered data.
Tablet-based data are not infallible, especially with complicated questionnaires such as the Nam Xong example. However, real-time access meant we could constantly monitor results and immediately contact supervisors to correct evident problems. This was crucial for four of the enumerators, who initially struggled with the subjective well-being section of the questionnaire.
We flag a health warning: surveys using digital tablets are neither a general remedy nor a panacea for all survey ailments. Digital tablet questionnaire development still demands extensive and rigorous consultation, comprehensive field-testing and elaborated enumerator training. There is a substantial purchase outlay, and time is needed to design and code the questionnaire into the online software. The 20% increase in sample size saved around $6,000, compared with the $5,400 required to purchase the tablets and battery packs. Unlike paper, one has to worry about running out of battery power in the field, so additional battery packs are a prudent investment!
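The back-of-envelope comparison above can be written out explicitly. Both figures come from the paragraph itself; the point is simply that the outlay was roughly recovered within a single survey round.

```python
# Back-of-envelope comparison using the figures quoted above (USD).
tablet_outlay = 5400     # tablets plus battery packs
sampling_saving = 6000   # value of the ~20% larger field sample

net_benefit = sampling_saving - tablet_outlay
print(net_benefit)  # a modest $600 surplus on the first survey
```

Since the tablets are reusable, subsequent surveys would capture the full sampling saving without the purchase cost.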