December 2018 // Volume 56 // Number 7 // Tools of the Trade // 7TOT5
Plickers for Success: A Technological Tool for Advancement in Data Collection
Extension faculty and staff are constantly planning, implementing, and evaluating programming efforts. Data from participants are needed for assessing changes in knowledge, attitude, and behavior through summative and formative evaluation. However, collecting these data can be time-consuming and difficult to accomplish. A technological tool called Plickers makes data collection quicker and easier. The electronic application can be implemented in Extension settings through the use of a smart device to set up classes, add participants, build surveys, and export data.
Collection of data from program participants is an essential aspect of Extension work. Such data are needed to assess changes in knowledge, attitude, and behavior and are obtained through summative and formative evaluation. Common measures support evaluation endeavors in state and national programs in many areas of Extension (Diem, 2016; Payne & McDonald, 2012). Evaluations derived from common measures typically involve some form of a web-based or paper-based survey instrument. With funding partners desiring measured impact, Extension clientele are subject to survey fatigue, an "overexposure to the survey process" (Steeh, 1981, p. 53). For example, Porter, Whitcomb, and Weitzer (2004) found that administering multiple surveys in a year led to decreased response rates for the later surveys. Electronic tools may be an antidote to survey fatigue as they can be used for collecting data instantaneously while engaging participants in a psychomotor activity.
If used efficiently, technology can allow educators to engage learners in critical thinking, quickly check learners' understanding, and collect data as learners advance through their programs (Kudryavtsev, Krasny, Ferenz, & Babcock, 2007). One tool Extension educators can use to increase learners' active engagement with program content and gather formative assessment data in real time is Plickers, a free online audience response system used for collecting data from groups.
In implementing Project GROWL (Growing Real Opportunities for Work and Life in Agriculture)—a program focused on empowering teens to become influencers in their communities—we have used Plickers to collect baseline data from participants. Others working in Extension—educators, program evaluators, or researchers—could benefit from applying Plickers for various purposes as it is simple to use and requires minimal supplies.
Setting Up Plickers for Data Collection
Once a user has obtained a Plickers account, he or she uses features of the application to prepare for collecting and organizing data.
- Set up classes. This feature allows the user to create different groups of participants so that data may be organized by group.
- Assign participants. This feature allows the user to create a unique identification code for each participant. This code becomes associated with that person to make further data collection easier. These codes can be as simple as a participant's name or as complex as a code randomly generated to provide anonymity.
- Build surveys. This feature allows the user to create new surveys or import existing surveys.
- Export data. This feature allows the user to move data collected into a document for further analysis.
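The "assign participants" feature above allows codes "as complex as a code randomly generated to provide anonymity." Plickers does not generate such codes for the user, but a short script run outside the application can produce a batch of them before they are entered as participant identifiers. The sketch below is illustrative only; the six-character length and uppercase-plus-digits alphabet are arbitrary choices, not Plickers requirements.

```python
import secrets
import string

def generate_codes(n, length=6):
    """Generate n unique random alphanumeric codes to use as
    anonymous participant identifiers."""
    alphabet = string.ascii_uppercase + string.digits
    codes = set()
    # Loop until we have n distinct codes; collisions are simply retried.
    while len(codes) < n:
        codes.add("".join(secrets.choice(alphabet) for _ in range(length)))
    return sorted(codes)

# Example: codes for a class of 25 participants.
codes = generate_codes(25)
```

Using the `secrets` module rather than `random` makes the codes unpredictable, so a code cannot be linked back to a participant by guessing how it was generated.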
Using Plickers for Data Collection
With the Plickers application, the user develops an assessment instrument that has multiple-choice questions and prints a set of response cards for distribution to participants. The cards resemble QR codes, and each card is unique—no two cards are the same. A card can be turned four ways, with each orientation corresponding to one of four answer choices. The user can print response cards from the Plickers website for no additional cost.
The user presents the instrument to participants via the Plickers application, displaying one question at a time with its answer choices. As each participant determines an answer, he or she holds up the response card, turned in the direction that matches the desired answer choice. The user then captures and stores the responses by using a smart device camera to scan the room. The device alerts the user if a participant's response is missing or the scan was incomplete.
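The scheme described above—a unique card identity plus a physical rotation that encodes the answer—can be pictured with a small sketch. Plickers performs this decoding internally during the camera scan; the particular mapping of orientations to answer letters below is an assumption for illustration, not the application's documented behavior.

```python
# Assumed mapping for illustration only: which answer each card
# rotation (in degrees) represents. Plickers' real decoding is
# internal to the app and may differ.
ROTATION_TO_ANSWER = {0: "A", 90: "B", 180: "C", 270: "D"}

def decode(card_id, rotation_degrees):
    """Return (card identifier, answer letter) for one scanned card.

    The card's printed pattern identifies the participant; the
    orientation at which the card is held identifies the answer.
    """
    return card_id, ROTATION_TO_ANSWER[rotation_degrees % 360]
```

The key point the sketch captures is that one printed card carries all four possible answers, which is why participants need no device of their own.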
The response cards are numbered, but because the Plickers system does not require a user to set the unique identifier, data collection can be either anonymous or confidential. All data collected are stored in "classes," components of an organizational structure on the Plickers website. After data collection, the user can directly print the results or download the data into a Microsoft Excel file. The user can open, edit, close, delete, and analyze data at any time.
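Once the data are downloaded as described above, they can be summarized outside of Plickers with standard tools. The sketch below assumes a hypothetical export layout—one row per participant and one column per question—so the column names and structure are illustrative; the actual Plickers export may be organized differently.

```python
import csv
from collections import Counter
from io import StringIO

# Hypothetical export layout (illustrative only): one row per
# participant, one column per question.
sample_export = StringIO(
    "Participant,Q1,Q2\n"
    "P01,A,C\n"
    "P02,B,C\n"
    "P03,A,D\n"
)

def tally_responses(fileobj):
    """Count how many participants chose each answer, per question."""
    reader = csv.DictReader(fileobj)
    counts = {}
    for row in reader:
        for question, answer in row.items():
            if question == "Participant":
                continue  # skip the identifier column
            counts.setdefault(question, Counter())[answer] += 1
    return counts

tallies = tally_responses(sample_export)
```

A summary like this gives a quick per-question response distribution, which is often all that is needed for formative assessment during a program session.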
Benefits of Using Plickers
This electronic data collection tool is unique in several ways.
- By involving cards printed with codes, Plickers removes the need to supply all participants with technological devices for accessing an evaluation. The use of these codes translates to the need for fewer resources (batteries, Wi-Fi, charging stations, etc.) as compared to requirements for other technological data collection systems (Parmer, Parmer, & Struempler, 2012).
- The surveys can be projected for viewing by participant pools of any size, and the technology lets the user confirm that every participant has responded. If responses are required, the user can verify that all participants' responses are collected. This scenario represents an improvement over paper-and-pencil administrations, where someone may accidentally skip a response.
- There is no need to create individual accounts for multiple participants. Once a user has created an account, he or she can create multiple classes, or data collection groups, and assign participants to these groups. This functionality allows participants to respond without having to create their own accounts as is the case with some electronic audience response tools.
Plickers in Extension
In using Plickers for administering questionnaires, our Project GROWL team aimed to keep youths engaged in their nonformal education setting while collecting baseline data that would help inform the direction of the program. Our team found it effective to print the response cards, which print in a rectangular format, on cardstock and then cut them into squares to increase the confidentiality of responses. If a participant were to hold up an uncut rectangular card, other participants would have a better chance of determining the participant's response on the basis of the orientation of the card. After cutting the cards into squares, we laminate them to help keep the response codes from getting dirty, damaged, or destroyed.
We also add a small label with a participant number in the top left corner on the back of each square. We place the labels in this position so that the participant will know what side of the square is the top. The labels are small enough that others are not able to determine how a participant has responded. These participant numbers are chosen as unique identifiers for the participants enrolled in the program. Each participant is assigned one number, which can be transferred to different survey administrations throughout the year. Response cards are taken up after data collection for future use, and these labels help our team members redistribute the response cards to the correct participants and keep track of missing or damaged response cards.
As noted, we have used Plickers to collect baseline data from program participants, but the application can be used for various purposes. Therefore, Extension professionals will find Plickers to be a practical and flexible tool.
References
Diem, K. G. (2016). Program standards and expectations: Providing clarity, consistency, and focus. Journal of Extension, 54(4), Article 4TOT1. Available at: https://www.joe.org/joe/2016august/tt1.php
Kudryavtsev, A., Krasny, M., Ferenz, G., & Babcock, L. (2007). Use of computer technologies by educators in urban community science education programs. Journal of Extension, 45(5), Article 5FEA2. Available at: https://www.joe.org/joe/2007october/a2.php
Parmer, S. M., Parmer, G., & Struempler, B. (2012). Testing a new generation: Implementing clickers as an Extension data collection tool. Journal of Extension, 50(5), Article 5TOT5. Available at: https://www.joe.org/joe/2012october/tt5.php
Payne, P. B., & McDonald, D. A. (2012). Using common evaluation instruments across multi-state community programs: A pilot study. Journal of Extension, 50(4), Article 4RIB2. Available at: https://joe.org/joe/2012august/rb2.php
Porter, S. R., Whitcomb, M. E., & Weitzer, W. H. (2004). Multiple surveys of students and survey fatigue. New Directions for Institutional Research, 2004(121), 63–73.
Steeh, C. G. (1981). Trends in nonresponse rates, 1952–1979. Public Opinion Quarterly, 45(1), 40–57.