Look this up
Nora Cate Schaeffer (Sociology Department, University of Wisconsin, Madison, Wisconsin 53706; email: Schaeffe@ssc.wisc.edu) and Stanley Presser (Sociology Department, University of Maryland, College Park, Maryland 20742; email: Spresser@socy.umd.edu)
Key Words: questionnaires, survey research, measurement, methodology, interviewing
Cite: The Science of Asking Questions. Annual Review of Sociology, Vol. 29: 65-88 (published August 2003). DOI: 10.1146/annurev.soc.29.110702.110112
Abstract:
Survey methodologists have drawn on and contributed to research by cognitive psychologists, conversation analysts, and others to lay a foundation for the science of asking questions. Our discussion of this work is structured around the decisions that must be made for two common types of inquiries: questions about events or behaviors and questions that ask for evaluations or attitudes. The issues we review for behaviors include definitions, reference periods, response dimensions, and response categories. The issues we review for attitudes include bipolar versus unipolar scales, number of categories, category labels, don't know filters, and acquiescence. We also review procedures for question testing and evaluation.
"Closed-ended questions dominate most interview schedules. Yet the almost exclusive use of this form did not arise because open-ended questions, its major competitor, proved to be weak indicators of public opinion. Instead, responses from open-ended questions proved more difficult and expensive to code and analyze than those from closed-ended questions. Although such practical concerns are important, the real task of survey researchers is to measure public opinion accurately. Using an experimental design, this article tests whether open-ended questions measure the important concerns of respondents—one of the long-claimed advantages of this format. The results, on balance, show that open-ended comments reflect such concerns, suggesting that pollsters may want to include more of these questions in their surveys of public opinion."
"This article presents concept mapping as an alternative method to existing code-based and word-based text analysis techniques for one type of qualitative text data—open-ended survey questions. It is argued that the concept mapping method offers a unique blending of the strengths of these approaches while minimizing some of their weaknesses. This method appears to be especially well suited for the type of text generated by open-ended questions as well for organizational research questions that are exploratory in nature, aimed at scale or interview question development, and/or developing conceptual coding schemes. A detailed example of concept mapping on open-ended survey data is presented. Reliability and validity issues associated with concept mapping are also discussed."
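The concept mapping described above pools similar open-ended responses into candidate concepts before coding. As a toy illustration only (the actual method uses participant card sorts plus multidimensional scaling and cluster analysis), the grouping step can be sketched as bag-of-words cosine similarity with greedy assignment; the threshold and the sample responses below are invented:

```python
# Minimal sketch of grouping similar open-ended responses into
# candidate concepts by lexical similarity. Illustrative only; real
# concept mapping relies on participant sorts, MDS, and clustering.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    shared = set(a) & set(b)
    num = sum(a[w] * b[w] for w in shared)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def group_responses(responses, threshold=0.4):
    """Greedy single-pass grouping: each response joins the first
    existing group whose seed response it resembles, else it starts
    a new group. Returns groups as lists of response indices."""
    bags = [Counter(r.lower().split()) for r in responses]
    groups, seeds = [], []
    for i, bag in enumerate(bags):
        for g, seed in enumerate(seeds):
            if cosine(bag, seed) >= threshold:
                groups[g].append(i)
                break
        else:
            groups.append([i])
            seeds.append(bag)
    return groups

responses = [
    "spam and junk email",
    "too much spam email",
    "privacy of personal data",
    "personal data privacy concerns",
]
print(group_responses(responses))  # → [[0, 1], [2, 3]]
```

The greedy pass is order-dependent; it only stands in for the exploratory "sorting" stage that concept mapping formalizes.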
"Open-ended questions are frequently used by survey researchers to measure public opinion. Some scholars, however, have doubts about how accurately these kinds of questions measure the views of the public. A chief concern is that the questions tap, in part, people's ability to articulate a response, not their underlying attitudes. This paper tests whether this concern is warranted. Using open-ended questions from the Center for Political Studies, I show that almost all people respond to open-ended questions. The few individuals who do not respond appear uninterested in the specific question posed, not unable to answer such questions in general. These findings should increase our confidence in work of scholars who have relied on open-ended questions."
"The results indicate that increasing the size of the answer box has little effect on early responders to the survey but substantially improved response quality among late responders. Including any sort of explanation or introduction that made response quality and length salient also improved response quality for both early and late responders. In addition to discussing these techniques, we also address the potential of the web survey mode to revitalize the use of open-ended questions in self-administered surveys."
Urša Reja, Katja Lozar Manfreda, Valentina Hlebec, and Vasja Vehovar
Abstract
Two quite different reasons for using open-ended as opposed to close-ended questions can be distinguished. One is to discover the responses that individuals give spontaneously; the other is to avoid the bias that may result from suggesting responses to individuals. However, open-ended questions also have disadvantages in comparison to close-ended ones, such as the need for extensive coding and larger item non-response. While this issue has already been well researched for traditional survey questionnaires, not much research has been devoted to it in recently used Web questionnaires. We therefore examine the differences between the open-ended and the close-ended question form in Web questionnaires by means of experiments within the large-scale RIS 2001 Web survey.
The question “What is the most important, critical problem the Internet is facing today?” was asked in an open-ended and two close-ended question forms in a split-ballot experiment. The results show that there were differences between question forms in univariate distributions, though no significant differences were found in the ranking of values. Close-ended questions in general yield higher percentages than the open-ended question for answers that are identical in both question forms. It seems that respondents restricted themselves with apparent ease to the alternatives offered on the close-ended forms, whereas on the open-ended question they produced a much more diverse set of answers. In addition, our results suggest that open-ended questions produce more missing data than close-ended ones. Moreover, there were more inadequate answers for the open-ended question. This suggests that open-ended questions should be worded more explicitly (at least for Web surveys, as a self-administered mode of data collection) than close-ended questions, which are more specified by their given response alternatives.
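The split-ballot comparison above asks whether the answer distribution from the closed-ended form differs from the distribution of coded open-ended answers. One standard way to test this is a chi-square test of homogeneity; the sketch below computes the statistic from first principles on invented counts (not the RIS 2001 data), using the well-known critical value 5.991 for alpha = 0.05 with 2 degrees of freedom:

```python
# Chi-square test of homogeneity between two question forms,
# computed by hand. The counts are hypothetical, for illustration.

def chi_square_homogeneity(table):
    """table: one row of category counts per question form.
    Returns the chi-square statistic and degrees of freedom."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    df = (len(table) - 1) * (len(table[0]) - 1)
    return stat, df

closed_form = [120, 60, 20]  # closed-ended split: three answer categories
open_form = [90, 70, 40]     # open-ended split, after coding responses
stat, df = chi_square_homogeneity([closed_form, open_form])
# Critical value for alpha = 0.05, df = 2 is 5.991
print(f"chi2 = {stat:.2f}, df = {df}, forms differ = {stat > 5.991}")
```

With these invented counts the two forms differ significantly, mirroring the abstract's finding of different univariate distributions across forms.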
Open and closed question types and when to use them.
A combination of open and closed questions is often used to identify and compare what respondents will state spontaneously and what they will choose when given categories of responses. For instance, the open question:
"What do you think are the major issues facing your organization?"
__________________________________
could be followed up with a checklist question:
"Open-ended questions should begin with words such as 'why' and 'how' or phrases such as 'What do you think about . . .' Open-ended questions should lead students to think analytically and critically. Ultimately, a good open-ended question should stir discussion and debate in the classroom, sparking enthusiasm and energy in your students."
"One of the basic measurement problems in survey research is the reliable coding of open-ended questions. A posteriori methods for improving coding reliability are distinguished from a priori methods. An a posteriori method has been shown to be of limited value for improving reliability for certain coding tasks. This article proposes and illustrates a multi-step, a priori procedure for generating coding categories for open-ended items. Preliminary evidence is presented indicating that this method may yield reliability levels far superior to those typically obtained in coding open-ended questions."
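The coding reliability that the abstract above is concerned with is typically quantified as chance-corrected agreement between independent coders. As a minimal sketch (not the article's procedure), Cohen's kappa for two coders can be computed from first principles; the coders' labels below are invented:

```python
# Cohen's kappa: chance-corrected agreement between two coders who
# independently assigned the same open-ended answers to categories.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """kappa = (observed agreement - chance agreement) / (1 - chance)."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    # Chance agreement: probability both coders pick the same category
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["econ", "econ", "health", "crime", "econ", "health"]
b = ["econ", "health", "health", "crime", "econ", "health"]
print(round(cohens_kappa(a, b), 3))  # → 0.739
```

Values near 1.0 indicate near-perfect agreement; the better-specified a priori coding categories are, the higher kappa tends to be.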
"Factual; Convergent; Divergent; Evaluative; and Combination"
"Inductive Thinking: identifying patterns within small details to form big ideas."
Resources related to asking great questions.
Updated on May 11, 2014
Created on Jan 20, 2014
Category: Schools & Education