The Journal of Extension

August 2013 // Volume 51 // Number 4 // Tools of the Trade // 4TOT1

Using iPads as a Data Collection Tool in Extension Programming Evaluation

Program evaluation is an important part of Extension, especially with the increased emphasis on metrics and accountability. Agents are often the point persons for evaluation data collection, and Web-based surveys are a commonly used tool. The iPad tablet with Internet access has the potential to be an effective survey tool. iPads were field tested at a grazing conference to assess their efficacy as a data collection tool. Participants with prior iPad experience found them easy to use, while some novice users had difficulty completing the survey. The high mobility of the iPads was a distinct advantage over laptops.

J. E. Rowntree
Assistant Professor
East Lansing, Michigan

R. R. Wittman
Graduate Student
East Lansing, Michigan

G. L. Lindquist
Extension Educator
Reed City, Michigan

M. R. Raven
East Lansing, Michigan

Michigan State University


Evaluating Extension programming is an increasingly vital component of reporting. Philosophically, program assessment is a critical part of the outreach process; however, increased emphasis from government and other agencies on program performance and accountability is an important driver (Radhakrishna & Martin, 1999). Concurrently, shrinking budgets and staff reductions, coupled with a growing diversity of programs, have increased personnel workloads. The agents who conduct the programming are also usually the point persons for collecting program evaluation data. Given agents' myriad existing duties, it is critical that innovative ways of collecting evaluation data be investigated and tested.

Surveys are especially useful for collecting information about clients' knowledge, attitudes, beliefs, and self-reported behaviors. Surveys can be distributed as handouts or administered by mail, telephone, the Web, or face-to-face. Each form has its advantages and disadvantages, with the context of the situation being an important factor in choosing the appropriate one (Taylor-Powell & Hermann, 2000). Dillman, Smyth, and Christian (2009) listed low cost, ease of data organization, and implementation time as advantages of Web-based surveys. A noted advantage is the capacity for complex surveys that use initial questions to predicate follow-up questions (e.g., a vegan would not see a question about his or her favorite meat). Advantages such as these, coupled with the increased availability of networked computers, have resulted in a growing trend toward Web-based surveys. One limitation has been the cost of computers and the need for a network. With the advent of the iPad, both the cost and the network requirement can be diminished.
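The branching behavior described above is often called skip logic or display logic in survey tools. The sketch below is a minimal illustration of the idea only, not any particular platform's API; the question wording and answer keys are hypothetical.

```python
# Minimal sketch of survey skip logic: follow-up questions are shown
# only when a condition on an earlier answer is met.

def next_questions(answers):
    """Return the follow-up questions a respondent should see,
    given the answers collected so far (a dict of question -> answer)."""
    questions = []
    # A vegan respondent is never asked about a favorite meat.
    if answers.get("diet") != "vegan":
        questions.append("What is your favorite meat?")
    # Livestock-management questions appear only for producers.
    if answers.get("owns_livestock") == "yes":
        questions.append("How many head do you manage?")
    return questions

print(next_questions({"diet": "vegan", "owns_livestock": "yes"}))
# -> ['How many head do you manage?']
```

A Web-based survey evaluates such conditions automatically as the respondent works through the instrument, which is what makes these complex designs practical.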

The first iPad went on sale April 3, 2010. The devices are highly portable and offer a multitude of uses, including Internet access. Within 2 years of the initial release, over 67 million iPads had been sold (Costello, 2012). Given this large number and the publicity surrounding Apple and the iPad, it is highly likely that many Extension clients are at least aware of the iPad and that some have used or own one.

iPad as a Survey Data Collection Tool

Surveys developed using Web-based technologies allow for sophisticated designs and features. For example, many Web-based surveys allow a client to use a slider button in response to a prompt. This type of feature elicits a more precise response than a standard Likert scale and is ideal for semantic differential scales. Also, collecting data via a Web-based survey removes the step of manually entering data from paper-based surveys, resulting in fewer mistakes, cleaner data, easier concatenation of datasets, and a savings in time.
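The data-handling advantage above (electronic responses need no re-keying and can be combined directly) can be sketched with a few lines of code that concatenate two exported response files. The file contents and column names here are hypothetical, standing in for the CSV exports a survey tool typically produces.

```python
import csv
import io

# Two hypothetical CSV exports from a Web survey tool, e.g. one per event.
export_a = "respondent,age,score\n1,45,4.2\n2,62,3.8\n"
export_b = "respondent,age,score\n3,51,4.7\n"

def concat_exports(*csv_texts):
    """Combine survey exports that share a header into one list of row dicts."""
    combined = []
    for text in csv_texts:
        combined.extend(csv.DictReader(io.StringIO(text)))
    return combined

rows = concat_exports(export_a, export_b)
print(len(rows))  # -> 3
```

Because every response is already digital and uniformly structured, merging datasets from multiple sessions is a one-step operation rather than a transcription task.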

Prior to the iPad, agents who wanted to use a Web-based survey to collect evaluation data onsite, during or at the conclusion of a program, had to provide both computers and network access. Even the size and weight of a normal laptop make it impractical to haul the minimum number needed to collect data efficiently. A typical thin-and-light laptop weighs 4 to 7 pounds (Kyrnin, 2012), roughly the weight of four iPads. Furthermore, iPads are extremely portable: a client could respond to a Web-based survey on an iPad while standing in a field, as long as there was a cellular connection.

Field Test Case Study

In 2012 a grazing conference was selected to field test iPads as a data collection tool for a Web-based survey of Midwestern livestock producers. The conference was held at a university with adequate network capabilities, so laptops could be used to back up the iPads. Both the laptops and the iPads (1st generation) were connected to the Internet via the WiFi available in the meeting area. Prior to the conference, the authors pilot tested the iPads with the Web-based survey, which was created with Qualtrics. The iPad's built-in Web browser, Safari, was used, and no problems were identified with either survey response or data collection in Qualtrics.

Demographics (Table 1) indicate the grazing conference attendees were diverse in age and education level. Respondents' ages ranged from 22 to 77, with a mean of 53. The highest degree obtained by respondents also varied, from less than a high school diploma to a doctorate. Nearly two-thirds of the respondents (63%) had at least a bachelor's degree.

Table 1.
Demographics of Grazing Conference Participants Who Participated in the Web-Based Survey

                             Frequency   Percent   Valid Percent
Highest Degree Completed
  < High school                    1       2.6          2.6
  High school                      8      21.1         21.1
  Some college                     5      13.2         13.2
  Bachelors                       13      34.2         34.2
  Masters                          9      23.7         23.7
  Doctorate                        2       5.3          5.3
  Total                           38     100.0        100.0
Gender
  Male                            31      81.6         83.8
  Female                           6      15.8         16.2
  Missing                          1       2.6
  Total                           38     100.0

                                Mean        SD        Range
Age                             52.9      13.5        22-77

Respondents using the iPad either had no issue with it or struggled to navigate the survey even after being shown how to use the device. Those who had no issues typically had prior iPad experience; those who struggled universally had never used one before. Several became frustrated and switched to a laptop to finish. It was also observed that some respondents with heavily calloused hands had difficulty getting the iPad to recognize their touch.

The increased mobility of the iPad was an advantage over the laptops. The researchers could easily take an iPad to potential respondents and ask them to complete the survey, rather than waiting for them to come to a stationary laptop. The iPads were especially convenient to deploy during meals, when the researchers could bring them to respondents at their tables as they finished eating. The longer battery life of the iPads also contributed to their mobility compared with the laptops.

Conclusion and Recommendations

While iPads are prevalent, many participants had never used one. Despite the device's quick learning curve, several initially struggled, and this must be considered when using an iPad to administer a Web-based survey. Even a momentary technical struggle while responding can frustrate respondents and jeopardize completion. Further research should be conducted to determine whether respondent demographics influence the efficacy of iPad surveys; age, education, and calloused fingers could all influence tablet use.

Despite these concerns, the iPad proved effective for collecting Web-based survey data. Portability was a definite asset: clients had flexibility in where they took the survey, and those with prior iPad experience found the survey easy to navigate. As tablet use grows, the documented issues should diminish.


References

Costello, S. (2012, April 25). What are iPad sales all time? [Web log message]. Retrieved from

Dillman, D. A., Smyth, J. D., & Christian, L. M. (2009). Internet, mail, and mixed-mode surveys: The tailored design method. Hoboken, NJ: John Wiley & Sons.

Kyrnin, M. (2012). Guide to laptop size and weight [Web log message]. Retrieved from

Radhakrishna, R., & Martin, M. (1999). Program evaluation and accountability training needs of Extension agents. Journal of Extension [On-line], 37(1), Article 3RIB1. Available at

Taylor-Powell, E., & Hermann, C. (2000). Collecting evaluation data: Surveys (University of Wisconsin Publication No. G3658-10). Retrieved from