November 1984 // Volume 22 // Number 6 // Feature Articles // 6FEA1

Another Kind of Evaluation

Not all evaluation needs to involve numbers. A focused and systematic approach to qualitative evaluation is presented together with a discussion of its possible application by Extension staff.

M. F. Smith
Associate Professor and Program Evaluation Specialist
Cooperative Extension Service, IFAS
University of Florida-Gainesville

Yvonna S. Lincoln
Associate Professor of Higher Education
University of Kansas-Lawrence

The press for accountability, threatened reductions in government funding for all programs, and the increased desire of professionals to demonstrate the usefulness of services they perform have prompted a great interest in Extension for evaluation methods to assess needs, demonstrate program impact on targeted audiences, and document needed improvements. One approach of great interest is the collection and use of qualitative data. This article compares qualitative and quantitative data, identifies common qualitative methods, describes a step-by-step process for gathering and analyzing qualitative data, and discusses two applications of the process to Extension.

Qualitative vs. Quantitative Data

Qualitative data are different from quantitative in the depth, richness, and detail of information that's recorded.1 Qualitative data are comprised of detailed descriptions of people and events in natural settings; depth and understanding emerge from recording what people say in their own words and capturing their modifiers or qualifiers with carefully worded probe questions.

Quantitative data may also describe people and events in natural settings, but the difference is that activities, events, and people's experiences are classified into predetermined, standardized categories to which numerical values are then attached. The views of the world as seen by individual respondents may be lost.

The same question may sometimes be asked to elicit either qualitative or quantitative data. For example, a question about program impact could be asked this way for qualitative data:

"How has participating in this learn-by-mail course affected your grocery buying practices; that is, what do you now do that you did not do before you completed the course?"

For quantitative data, the question could be worded:

"Indicate which of the following steps you now practice as a result of having participated in this learn-by-mail course on grocery buying practices. (Circle numbers of all steps the course encouraged you to take.)"

  1. Plan meals for a period of days before shopping for food.
  2. Check the date on perishable items before purchasing.
  3. Compare the costs of name brands, store brands, and generic or no-brand products.
  4. Write out a shopping list before going to the grocery store.
  5. Check grocery store advertisements before shopping.
  6. Other:_____________________
  7. None of the above.

Qualitative Methods

Qualitative methods generally are those that have been variously termed anthropological, ethnographic, or field work-oriented. The most common techniques are interviewing, participant and nonparticipant observation, and content analysis of reports, records, and other pertinent documents. The strength of these methods is that they provide natural, in-context, "slice-of-life" data, usually in the ordinary and everyday language of those who are respondents. Since qualitative methods rarely assign numerical identifiers, responses tend to be reported as descriptions and verbal reports. Similarly, technical and evaluation reports constructed from primarily qualitative material are done in the form of case studies, rather than tabulated reports. The purpose of these case studies, like the purpose of qualitative methods in general, is to enhance understanding, to give the reader a vicarious experience of the entity being evaluated or reported on, and to add richness, dimension, and unique perspective or "groundedness" to the context.

In general, the use of qualitative methods has proceeded from a model of research and evaluation that stands in opposition to the model typically used by the hard or biological sciences (and later adopted by the social sciences). That model of research and evaluation, usually called naturalistic and/or responsive,2 depends on assumptions that aren't permitted by the older, classical model. For instance, it relies on the beliefs that causality in human organizations is difficult, if not impossible, to determine, and that what human beings experience is largely a result of the values, beliefs, attitudes, and frameworks they impose on situations to make sense of them.

However, it's not necessary to understand emergent models of research or evaluation to use qualitative methods. All that is necessary is a commitment to acquiring the skills (and there are many good texts on qualitative methods available3) and a commitment to doing evaluation that can be more descriptive and more natural than that typically resulting from quantitative analysis.

Steps for Gathering and Analyzing Data

Extension faculty have been using qualitative methods in their work for many years. In discussions with farmers, ranchers, homemakers, volunteers, subject-matter specialists, and others, faculty have relied on questioning and observation to help them deliver better programs. To make these data-gathering techniques better serve the ends of decision making requires moving to a systematic and purposeful approach to data gathering. Here are the steps in the process:

  1. Decide if the audience to receive the data will be more interested in unique and personal accounts of client reactions and perceptions of program accomplishments or in succinct, parsimonious, statistically conclusive accounts. If it's the former, qualitative data and methods are appropriate. If it's the latter, qualitative data may still be useful, but will serve only as support for the primarily quantitative evidence.

  2. Focus the evaluation by generating preliminary areas of concern. Make sure the information needs of all who will receive the report are considered. In some cases, specific open-ended questions may be written for mail questionnaires; in other cases, a list of topics to cover in an interview may be sufficient. (The key aspect of naturalistic inquiry is that as new data are acquired, new questions and topics may evolve. For that reason, interviewing is probably the most often used qualitative approach.)

  3. Decide on the best method to use; that is, how the data can best be acquired to answer the questions of the study. In some cases, content analysis of records, proposals, legislative reports, or other written documents may be adequate. In other cases, personal observation of events or program activities may suffice. But, more often than not, interviews with program participants and other key people in local communities will provide the convincing evidence of program impact or whatever the questions may involve.

  4. Plan a systematic procedure for implementing the method. For example, if personal interviews will be conducted, care should be exercised in selecting who's interviewed. Appropriate sampling techniques should be used rather than talking with whomever happens to be handy.

  5. Implement the method. Great care should be taken to record the content of exactly what people say and/or exactly what's printed in documents.

  6. Analyze and interpret data. One of the positive characteristics of qualitative data is also one of its negative characteristics: responses aren't standardized and are thus hard to summarize. Patton notes that analyzing qualitative data is a creative process and may be approached in different ways by different people.4 Here's a procedure very similar to what Patton uses that has worked well in analyzing interview data from Extension county program reviews:
    1. Make copies of all notes and save originals. You may want to refer to them later.
    2. Read through the notes several times until some sense emerges of what's there; that is, search for patterns or themes.
    3. In margins, write one- or two-word descriptors of what's included, with separations showing where topics start and stop.
    4. Make a list of topics (preset and emerging questions) and arrange them in some order.
    5. Cut up the pages, making separate piles of paragraphs and pages for each question. Arrange in some logical order. (This step can be speeded up by using a word processor instead of scissors and tape!)
    6. Determine if additional data are needed for any questions. Collect what's needed and repeat Step 3 for each question. Assign subtopics, if necessary, and repeat Step 5.
    7. Write a summary for each question.
    8. Select examples from notes to serve as representations of typical respondent experiences.

  7. Write the report. List the original questions of the study and new ones that emerged as the evaluation progressed. Describe the findings as succinctly as possible without destroying the naturalness of the data. Include as many typical program experiences as necessary to "paint" a total picture of the evaluated program.
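The cut-and-sort analysis described in Step 6 can be sketched in a few lines of code. This is only an illustration: the topic codes and excerpts below are invented, and a dictionary of "piles" stands in for the scissors-and-tape (or word processor) sorting the step describes.

```python
from collections import defaultdict

# Hypothetical interview excerpts, each already tagged with a one- or
# two-word margin descriptor (Step 3). Topics and text are invented.
excerpts = [
    ("meal planning", "I plan the week's meals on Sunday now."),
    ("store ads", "I check the ads before I go shopping."),
    ("meal planning", "We make a list from the menu before we shop."),
    ("brands", "I compare store brands with name brands on price."),
]

# Step 5: make a separate "pile" for each topic -- the word processor's
# version of cutting up the pages and sorting the pieces.
piles = defaultdict(list)
for topic, text in excerpts:
    piles[topic].append(text)

# Steps 7 and 8: summarize each topic and keep a representative quotation.
for topic in sorted(piles):
    quotes = piles[topic]
    print(f'{topic}: {len(quotes)} excerpt(s); e.g. "{quotes[0]}"')
```

The point of the sketch is simply that each response stays in the respondent's own words while the topical grouping makes a question-by-question summary possible.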

Applications to Extension

Needs Assessment

Two particularly appropriate situations in which to use qualitative methods are needs assessments and identifying unique impacts of programs. In needs assessment, qualitative and quantitative methods may be used together to produce a better study than either might have produced alone. For example, an agent might start by interviewing a representative list of key informants in a community, asking open-ended questions about the needs and problems of a particular client group. Data from these responses could then be used to develop a questionnaire with standardized categories of possible responses, worded in the users' own terms and stated in the context of that community's types of experiences.

The standardized questionnaire (yielding quantitative data) would allow a large number of responses to be quickly and easily summarized (and would verify the accuracy of the original qualitative data). "Grounding" the data collection instrument in users' words may increase the number of responses to a needs assessment and can result in programs tailored more closely to community-identified preferences.
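The two-stage design just described can be sketched as follows. The themes, categories, and responses are invented for illustration, and a simple frequency count stands in for the hand tallying an agent would actually do.

```python
from collections import Counter

# Stage 1: themes coded from open-ended key-informant interviews
# (all data here are hypothetical).
informant_themes = [
    "food costs", "child care", "food costs", "water quality",
    "food costs", "child care",
]

# The most frequently mentioned themes become the standardized
# questionnaire categories, worded in the community's own terms.
categories = [theme for theme, _ in Counter(informant_themes).most_common(3)]

# Stage 2: each questionnaire respondent checks the categories that
# apply; the standardized format lets many responses be summarized quickly.
responses = [
    {"food costs", "child care"},
    {"food costs"},
    {"water quality", "food costs"},
]
tally = Counter(c for r in responses for c in r if c in categories)
for category in categories:
    print(f"{category}: checked by {tally[category]} of {len(responses)}")
```

The qualitative stage supplies the categories; the quantitative stage makes the summary across many respondents fast and verifiable.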

Identifying Unique Program Impacts

Qualitative approaches are almost essential for identifying unique impacts of a program. We can ask participants to check which practices they implemented (on a quantitative data question) and draw conclusions about economic (and sometimes social) consequences. But the personal effect of implementing recommended changes can best be conveyed in the participants' own natural, spontaneous, and thoughtful words. Asking a sample of individuals to tell or write what the results of an Extension program were may uncover unintended and/or unanticipated benefits or harms. It can even reveal that the original goals and objectives were off target.

When carrying out a quantitative impact study, personal accounts at the extremes of whatever is being measured can make the quantitative data more meaningful and give a reader of the evaluation report a better understanding of how the program worked and what the real effects were, by showing the full range of possible responses.


Qualitative methods are gaining advocates in evaluation circles. We believe they have utility in Extension evaluations. However, because they're very time-consuming and present difficulties in analysis, they're not as appropriate for studies where a large number of individuals and/or sites must be assessed. Their most relevant contribution may be in combination with quantitative methods, the qualitative data augmenting and providing personalized references to broadly based, quantitatively gathered estimates of program impact.

Qualitative methods have been used by faculty in counties since Extension began. What's needed is a more focused and systematic approach and a careful matching of method with evaluation purpose and users.


  1. M. Q. Patton, Qualitative Evaluation (Beverly Hills, California: Sage Publications, Inc., 1980).
  2. E. G. Guba and Y. S. Lincoln, Effective Evaluation (San Francisco: Jossey-Bass, Inc., 1981).
  3. Note, for example, two books by M. Q. Patton, Qualitative Evaluation and Practical Evaluation (Beverly Hills, California: Sage Publications, Inc., 1983); Guba and Lincoln, Effective Evaluation; and David Hamilton and others, eds., Beyond the Numbers Game (Berkeley, California: McCutchan, 1977).
  4. Patton, Qualitative Evaluation.