Lecturer: Michael Gibbert.
Week 2 (23 - 27 August 2021).
Workshop objectives and content
The aim of this course is to help participants who intend to use, or are already using, qualitative methods for research purposes, either as a stand‐alone design or as a complement to quantitative methods. Students will learn about the values and limitations of qualitative approaches, gain an overview of the variety of traditions and methods, and learn hands-on how to conduct a qualitative study from its design through data collection and analysis all the way to publication. We will start the course by clarifying the ontological and epistemological foundations of qualitative research (vs. quantitative research) and outlining two of its key traditions (positivist vs. interpretivist stance) and their debates (e.g. validity/reliability vs. reflexivity/trustworthiness). We then zoom in on the interactive phases of a research project, including design (in particular designing case study research), data collection (employing multiple sources, in particular interview- and observation-based research), and data analysis (drawing on analytical methods widely used in the social sciences, such as grounded-theoretical coding and process vs. cross-sectional analysis).
The fundamental difference between qualitative and other (quantitative) methods is essentially a numerical one: in qualitative work there tend to be more variables of interest than data points, or, indeed, more data points than originally imagined variables of interest. This conundrum makes qualitative studies very "rich", but it has also drawn a good deal of criticism about their rigor, particularly in terms of validity, reliability, credibility, and trustworthiness. Perhaps most problematically, the qualitative research strategy is not blessed with the same reporting conventions as its quantitative counterparts. Both field studies (surveys without experimental controls) and lab studies (which, by definition, do include experimental controls) enjoy well-established quality checks, which can be outsourced to computer software and reported neatly in the final report.
Given these issues, we take a problem-oriented approach that looks across research stages at the question of how to craft methodologically sound qualitative studies so that they stand a good chance of publication in reputed journals. Specifically, we look into each stage of the research project: design (research question, case selection, etc.), data collection and analysis (employing multiple sources of data, including surveys, interviews, archival data, and participant observation, as well as techniques such as pattern matching), and reporting and write-up (issues of rhetoric and style). At every stage of the process, the emphasis is on how to resolve rigor issues and on the different strategies for enhancing the quality of qualitative methods. In the process, we pay close attention to idiographic (commonly also called interpretivist, or even constructivist) perspectives on top of nomothetic (aka deductive or positivist) approaches. The objective is to look across ideological bases for doing qualitative research and appreciate their commonalities, which are often hidden behind apparent differences. Again, the ultimate objective is to get one’s research accepted by the editorial board of a journal or the thesis commission (or both), and doing qualitative work more rigorously (whether in an interpretivist or positivist sense) helps.
Students will be required to do the ‘theory’ readings (see the day-by-day schedule below) in preparation for each class and will work hands-on in small groups with specific methodological approaches (the ‘application readings’) during class, always with an eye on their own (PhD or other research) projects. The emphasis in the group work is on learning from the best (and sometimes, apparently ‘worst’) practices published in reputable journals. USI PhD students are asked to prepare (within five working days after completion of the course) a post-course assignment of 2500-3000 words:
Conduct a critical appraisal of the methodology of an empirical article using a qualitative approach: what research design does it deploy, what data collection methods are used, and how is the data analysis conducted? What was done well, and what could have been done better?
Are you convinced by the methodological choices? How could the methodology have been further improved? What different methodological choices could have been made?
What elements of the qualitative approach and the qualitative methods used can be of guidance for your own PhD project? How would you use them?
Eisenhardt, K.M. (1989). Building theories from case study research. Academy of Management Review, 14 (4), 532-550.
Geddes, B. (1990). How the Cases You Choose Affect the Answers You Get: Selection Bias in Comparative Politics. Political Analysis, 2, 131-150.
Gerring, J. (2004). What is a Case Study and What Is It Good for? American Political Science Review, 98(2), 341-354.
Gerring, J. (2007). Techniques for choosing cases. In J. Gerring, Case Study Research (pp. 97-108, i.e. from ‘extreme case’ up to and including ‘deviant case’).
Gibbert, M., & Ruigrok, W. (2010). The What and How of case study rigor: Three strategies based on published work. Organizational Research Methods, 13(4), 710-737.
Gibbert, M., Nair, L. B., Weiss, M., & Hoegl, M. (2020). Using outliers for theory building. Organizational Research Methods, 1-10.
Gibbert, M., Ruigrok, W., & Wicki, B. (2008). What passes as a rigorous case study? Strategic Management Journal, 29, 1465-1474.
Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2013). Seeking qualitative rigor in inductive research: Notes on the Gioia methodology. Organizational Research Methods, 16(1), 15-31.
Gioia, D. A., Price, K. N., Hamilton, A. L., & Thomas, J. B. (2010). Forging an identity: An insider-outsider study of processes involved in the formation of organizational identity. Administrative Science Quarterly, 55(1), 1-46.
Hamel, J. (1993). Case Study Methods. London, UK: Sage. Chapters 1-2.
Hoorani, B. H., Nair, L. B., & Gibbert, M. (2019). Designing for impact: The effect of rigor and case study design on citations of qualitative case studies in management. Scientometrics, 121(1), 285-306.
Ketokivi, M., & Mantere, S. (2010). Two strategies for inductive reasoning in organizational research. Academy of Management Review, 35(2), 315‐333.
Langley, A. (1999). Strategies for theorizing from process data. Academy of Management Review, 24(4), 691-710.
Pratt, M. (2008). Fitting oval pegs into round holes. Organizational Research Methods, 11(3), 481-509.
Sex, writhes, and videotape. The Economist, May 3rd, 2014.
Slater, D., & Ziblatt, D. (2013). The Enduring Indispensability of the Controlled Comparison. Comparative Political Studies. Advance online publication. doi: 10.1177/0010414012472469.
Sullivan, C. J. (2011). The utility of the deviant case in the development of criminological theory. Criminology, 49(3), 905-920.
The birdmuda triangle. The Economist, February 2nd, 2013.
Welch, C., & Piekkari, R. (2017). How should we (not) judge the ‘quality’ of qualitative research? A re-assessment of current evaluative criteria in International Business. Journal of World Business, 52(5), 714-725.