Susanne Friese, December 11, 2023

Embracing Digital Transformation: The Evolution of Qualitative Research from Tape Recorders to AI


The landscape of qualitative research has witnessed profound transformations, evolving from the days of notetaking to leveraging the sophisticated capabilities of generative AI. This journey reflects not just technological advancements but also multiple paradigm shifts in how researchers approach, analyze, and interpret qualitative data.

The Early Days

Before the portable tape recorder became available, qualitative research relied entirely on analog methods. Researchers depended heavily on memory and manual notetaking, and it was common practice to withdraw temporarily (to the restroom, for instance) to jot down observations or conversations before the details faded. The absence of portable recording devices forced researchers to recall conversations from memory, a method fraught with challenges to accuracy and completeness.

Researcher taking notes after a day of field work (image by DALL·E)

Tape Recorders: A Technological Leap

The advent of tape recorders in the late 1960s marked a significant milestone. For the first time, researchers could capture real-time conversations, leading to more accurate and comprehensive data collection.

Philips cassette recorder, 1968

However, this innovation was not without its detractors. Critics like Jack Douglas feared that research might become an artifact of technology, with researchers potentially defining their settings and interests based on the capabilities of their recording devices.

“Recording devices are the technological invention with probably the greatest potential use in field research. Because of this potential, they pose a strong temptation to the beginning researcher, one that can easily lead them astray. Some researchers have been almost transfixed by recording devices so that they come to define their research settings and their theoretical interests in society as a whole in terms of these devices.” (Jack Douglas, 1976:36)

There surely were advantages. The tape recorder was a very efficient device for data gathering: it was easy to use, the cost was low, and instead of being busy taking notes, one could concentrate on conducting the interview. However, it also introduced new challenges, such as the need for transcription.

The Transcript – A Necessary Evolution

The cassette recorder, while a significant step forward, came with its own set of issues. Fragile tapes and the tedious process of searching through recordings made transcripts a necessity. Preparing transcripts was time-consuming, and errors crept in due to fatigue or carelessness.

Transcription could be outsourced to a third party, but an outsider was not familiar with the subject of the research, and you could never be sure how they interpreted the data while transcribing. Even the use of punctuation can change how one reads and interprets the data.

Transcription also leads to a loss of information such as intonation, laughter, and silence. Various transcription styles, like the Jeffersonian transcript or the score transcript, emerged to mitigate these losses; a Jeffersonian transcript, for instance, uses dedicated symbols to mark the length of pauses, overlapping speech, and stretched sounds.

Different forms of transcripts were developed

Transcripts offered undeniable benefits, providing a more accurate representation of events and conversations, and capturing hesitations, restarts, and interruptions that were otherwise lost in note-based summaries.

The necessity of preparing transcripts brought about another change: the accustomed ways of analyzing data were no longer sufficient to deal with all the detail:

"Instead of briefly summarizing fieldnotes and remarks gathered by interviewing, or even simply referring to these data, many… accounts cited them word for word, which forced researchers to construct finer categories of analysis and to explain their interpretation of remarks and behavior in more detail." (Chapoulie, 1987:270)

This coincided with another technological advancement:

Qualitative Data Analysis Software - A New Era of Data Analysis

By the mid-1980s, the introduction of Computer-Assisted Qualitative Data Analysis Software (QDA software), like The ETHNOGRAPH and NUD*IST, and later MAXQDA, ATLAS.ti, and many others, revolutionized data analysis. These tools streamlined data organization, coding, and analysis. Initial concerns about them included trading resolution for scope, potential distancing of researchers from the data, access and cost issues, learning curves, and the risk of prioritizing form over content.

Contrary to the fear that researchers would amass an excessive volume of data, studies revealed that the average number of interviews per study remained relatively modest. If you consult the literature, you will find a consensus among most authors that saturation is typically reached with approximately 12 interviews, especially when dealing with a homogeneous sample (Guest, 2006).

Concerns about software distancing researchers from their data persist to this day. See the paper by Jackson et al. (2018), "The Walking Dead Genealogy: Unsubstantiated Criticisms of Qualitative Data Analysis Software (QDAS) and the Failure to Put Them to Rest," to trace the origins of this fear and assess how well it is substantiated.

Richards and Richards (1991), in contrast, recognized early on that the advent of QDA software brought about a methodological revolution, confronting researchers with new requirements, emerging realities, and available options that they needed, and had the privilege, to adapt to – and this as early as the 1990s.

Not all of this was done automatically by the software, but the software greatly facilitated data analysis. It had a profound impact on the research process, with a newfound emphasis on coding and the development of category systems. This shift fundamentally altered how researchers engaged with their data, allowing them to move seamlessly between textual material and coded data, as the sketch below illustrates.
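For readers who have never used such a package, here is a minimal sketch of that core idea in Python. It is illustrative only, under the assumption of a simple in-memory code system; it does not describe how any particular QDA product is implemented, and all names in it are hypothetical.

```python
# Illustrative sketch: codes attached to text segments, so the analyst can
# move back and forth between textual material and coded data.
# This mirrors the general idea of QDA coding, not any specific product.
from dataclasses import dataclass

@dataclass
class Quotation:
    document: str  # source document name
    start: int     # character offset where the coded segment begins
    end: int       # character offset where the coded segment ends
    text: str      # the coded text segment itself

# The code system: category name -> list of coded segments.
code_system: dict[str, list[Quotation]] = {}

def code_segment(code: str, document: str, text: str, start: int) -> None:
    """Attach a code to a text segment - the basic act of coding."""
    code_system.setdefault(code, []).append(
        Quotation(document, start, start + len(text), text))

def quotations_for(code: str) -> list[Quotation]:
    """Retrieve every segment carrying a code - the move back from the
    code system to the textual material."""
    return code_system.get(code, [])

# Hypothetical example usage:
code_segment("work-life balance", "interview_01.txt",
             "I never manage to switch off in the evenings.", 1042)
for q in quotations_for("work-life balance"):
    print(f"{q.document} [{q.start}-{q.end}]: {q.text}")
```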

Generative AI - The New Frontier

The qualitative research landscape is on the cusp of another transformation with the rise of generative AI tools like ChatGPT and others. Once again, we see promises of more efficiency, speed, the ability to analyze vast amounts of data, higher accuracy, and reliability.

I have already addressed and refuted certain promises, particularly concerning the integration of AI into QDA software tools. While some features have found favor with users, who employ AI as an assistant for idea generation or as a quality-control mechanism, it is worth noting that, instead of expediting the analysis, this integration often adds time. Although the capability to generate summaries may appear to be a time-saver, it frequently falls short on accuracy and reliability.

The hope that AI can take over the labor-intensive process of coding has not been fulfilled, at least not in its current implementations. This does not mean that the integration of AI into existing CAQDAS tools is useless; the issue lies more with the promises attached to it. It can be very useful to call upon the AI assistant if one feels stuck, wants to verify an idea, or wants to check whether one has overlooked something.

With the advent of new technologies, there are invariably accompanying concerns. Users have raised valid issues regarding ethics, inherent biases, data security, accuracy, and the overall reliability of these innovations. It is widely recognized that Large Language Models (LLMs) carry certain biases due to how they are trained. AI-assisted data analysis typically requires transmitting the data through an API to an external server, which raises anxieties about potential privacy violations, particularly with sensitive data. Many Europeans harbor reservations about data storage on U.S. servers, fueled by apprehensions about non-compliance with GDPR standards. While some of these fears are well-founded, others stem from misinformation and a lack of comprehensive understanding.

As for model bias, it is beneficial that the issue is being discussed, as this enables researchers to account for it when interpreting results derived from AI tools. This awareness also reminds all qualitative researchers of the inherent biases in their own work. Why expect AI to deliver unbiased outcomes when our analyses, influenced by our gender, subjective backgrounds, attitudes, and experiences, are themselves biased? We, as researchers, bring these personal elements into our analysis and have learned the importance of reflecting on them. When employing AI as a tool, a similar level of reflection is imperative.

We need to move beyond seeking 'a faster horse,' metaphorically speaking. This means reevaluating traditional methods of data analysis that have prevailed for the past four decades and being open to innovative, perhaps radically different, approaches in our research methodology.

The advent of QDA software has markedly transformed analysis methodologies. Maybe it is time to reevaluate why we analyze qualitative data the way we do.

Re-thinking Analysis

As we look to the future, it's crucial to think outside the box. The question isn't just whether AI can make analysis faster or more efficient, but whether we can reimagine the process of qualitative analysis itself.

In German, we say: "to look beyond the edge of the plate" (image by DALL·E)

Concerns often arise about the 'black box' nature of AI – the opacity surrounding how analysis is generated. These concerns are particularly pronounced when there's an expectation that AI will conduct the analysis independently, delivering instantaneous results. Instead, it's more constructive to view AI as a collaborative tool, an assistant with which one can interact.

Initial experiments with Large Language Models (LLMs) such as ChatGPT, along with platforms like AI-LYZE, Coloop, and QInsights by Qeludra, indicate promising new avenues for engaging with qualitative data. By conceptualizing the AI assistant as an entity you can "converse" with, its mysterious 'black box' nature diminishes. This interaction allows you to probe deeper into the results presented, introduce fresh perspectives, and even engage in discussions with the AI. A significant advantage of this approach is the AI's capability to retain all your data, providing an efficient way to locate the specific information you need.
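To make this concrete, here is a minimal sketch of such a conversation over a single interview transcript. It assumes the OpenAI Python client as one possible backend; the file name, model choice, and prompts are hypothetical illustrations and do not describe how any of the tools named above actually work.

```python
# A minimal sketch of "conversing" with an AI assistant about a transcript.
# Assumes the OpenAI Python client (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable; model name, file name, and prompts are
# hypothetical choices for illustration.
from openai import OpenAI

client = OpenAI()

# Load one interview transcript; a real project would manage many documents.
with open("interview_01.txt", encoding="utf-8") as f:
    transcript = f.read()

# The running message history is what turns a one-shot query into a dialogue:
# you can probe results, push back, and introduce fresh perspectives.
messages = [
    {"role": "system",
     "content": "You are a qualitative research assistant. Base every answer "
                "strictly on the transcript provided and quote supporting passages."},
    {"role": "user", "content": f"Transcript:\n{transcript}"},
]

def ask(question: str) -> str:
    """Send one analytic question and keep the exchange in the history."""
    messages.append({"role": "user", "content": question})
    response = client.chat.completions.create(model="gpt-4o", messages=messages)
    answer = response.choices[0].message.content
    messages.append({"role": "assistant", "content": answer})
    return answer

print(ask("Which concerns does the interviewee raise about their workplace?"))
print(ask("You mentioned workload. Where in the transcript is that supported?"))
```

Because the full message history is re-sent on every turn, the assistant "remembers" the transcript throughout the session, which is what makes the follow-up question possible.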

The introduction of the tape recorder revolutionized data capture by eliminating the need for researchers to rely solely on memory for documentation. However, this innovation brought a new challenge: managing the voluminous data produced through recordings, which required transcription. Researchers found themselves inundated with hundreds of pages of material, far beyond what could be effectively remembered or manually handled. This is where QDA software emerged as a pivotal solution, offering an efficient means to organize, analyze, and manage this extensive data.

Now, AI serves as a powerful agent, capable of remembering extensive data on our behalf. Our role transforms into one of inquiry and interaction: posing questions, raising issues, and presenting ideas to the AI. We then engage with the AI, seeking feedback and discussing these concepts, effectively utilizing the AI as a dynamic tool for exploration and analysis. If you are interested, take a look at this post, where I delve deeper into this topic: Ethical and Responsible Use of AI for Qualitative Analysis

For now, I encourage you to secure a spot on our early access list. Space for the beta test group is limited, but if you act swiftly, you'll be among the first we invite to use the app when it launches.

To register for early access, click here.

You can watch the full presentation below or here:

Literature

Agar, M. (1991). The right brain strikes back. In: N.G. Fielding & R.M. Lee (Eds.). Using Computers in Qualitative Research, pp. 181-194. London: SAGE.

Chapoulie, J.M. (1987). Everett C. Hughes and the development of fieldwork in sociology. Urban Life, 15: 259-98.

Davidson, C. (2009). Transcription: Imperatives for qualitative research. International Journal of Qualitative Methods, 8 (2), 36-52.

Douglas, J. (1976). Investigative Social Research. London: SAGE.

Evers, J. C. (2010). From the Past into the Future. How Technological Developments Change Our Ways of Data Collection, Transcription and Analysis. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research, 12(1). https://doi.org/10.17169/fqs-12.1.1636

Fasick, F. A. (1977). Some uses of untranscribed tape recordings in survey research. The Public Opinion Quarterly, 41 (4), 549-552.

Fielding, N.G. and Lee, R.M. (1991). Using Computers in Qualitative Research. London: SAGE.

Fielding, N.G. and Lee, R.M. (1998). Computer Analysis and Qualitative Research. London: SAGE.

Gao, J., Choo, K.T.W., Cao, J., Lee, R.K.-W. and Perrault, S. (2023). CoAIcoder: Examining the Effectiveness of AI-assisted Human-to-Human Collaboration in Qualitative Analysis. arXiv:2304.05560 [cs.HC]. https://doi.org/10.48550/arXiv.2304.05560

Guest, G., Bunce, A., & Johnson, L. (2006). How Many Interviews Are Enough?: An Experiment with Data Saturation and Variability. Field Methods, 18(1), 59-82. https://doi.org/10.1177/1525822X05279903

Jackson, K., Paulus, T., & Woolf, N. H. (2018). The Walking Dead Genealogy: Unsubstantiated Criticisms of Qualitative Data Analysis Software (QDAS) and the Failure to Put Them to Rest. The Qualitative Report, 23(13), 74-91. https://doi.org/10.46743/2160-3715/2018.3096

Lapadat, J.C. and Lindsay, A.C. (1999). Transcription in research and practice: From standardization of techniques to interpretive positionings. Qualitative Inquiry, 5 (1), 64-86.

Mondada, L. (2007). Commentary: Transcript variations and the indexicality of transcribing practices. Discourse Studies, 9 (6), 809-821.

Ochs, E. (1979). Transcription as theory. In E. Ochs & B.B. Schieffelin (Eds.). Developmental pragmatics, pp. 43-72. New York, NY: Academic Press.

Richards, L. and Richards, T. (1991). The transformation of qualitative method: Computational paradigms and research processes. In: G. Fielding & R.M. Lee (Eds.). Using Computers in Qualitative Research, pp. 38-53. London: SAGE

Seidel, J. (1991). Method and madness in the application of computer technology to qualitative data analysis. In: N.G. Fielding & R.M. Lee (Eds.). Using Computers in Qualitative Research, pp. 107-116. London: SAGE.

Tessier, S. (2012). From field notes, to transcripts, to tape recordings: Evolution or combination? International Journal of Qualitative Methods, 11 (4), 446-460.

Zhang, H., Wu, C., Xie, J., Lyu, Y., Cai, J. and Carroll, J.M. (2023). Redefining Qualitative Analysis in the AI Era: Utilizing ChatGPT for Efficient Thematic Analysis. arXiv:2309.10771 [cs.HC]. https://doi.org/10.48550/arXiv.2309.10771
