Information Research, vol. 18 no. 1, March 2013


Qualitative interpretative categorisation for efficient data analysis in a mixed methods information behaviour study


Peter Stokes
Anglia Ruskin University, Peterborough, PE2 9PW, UK
Christine Urquhart
Aberystwyth University, Department of Information Studies, Aberystwyth, SY23 3AS, UK


Abstract
Introduction. This paper discusses the development of qualitative interpretative categorisation, a method of data analysis suitable for mixed methods information behaviour research.
Method. The data analysis approach draws mainly on the work of Burnard, Sandelowski, and Miles and Huberman. The qualitative data were gathered from interviews (n=11) with nursing students at one site in the UK.
Analysis. A start list of categories (from the research questions, and the quantitative findings) was modified through data reduction and clustering, with dendrograms used for data display and further interrogation of the data.
Results. Of the seven initial categories, one was removed and another added. The title of one category was changed to reflect its new meaning. All categories were fully redefined.
Conclusions. The method proposed offers a systematic approach to integrating qualitative data into a predominately quantitative mixed methods study.


Introduction

Mixed methods research has its advocates (Creswell and Plano-Clark 2011; Johnson et al. 2007; Morgan 2007) and detractors (Giddings 2006; Symonds and Gorard 2008), but the approach has been widely used (Bryman 2008; Lipscombe 2008). Morgan (2007) describes mixed methods as a 'pragmatic' approach because, during research design, data collection and data analysis, 'it is impossible to operate in either an exclusively theory or data-driven fashion' (Morgan 2007: 71). Teddlie and Johnson (2009) concur that pragmatism is the philosophical partner for mixed methods, embracing and synthesising ideas from both sides (quantitative and qualitative). Patton (2002) agrees that pragmatism, that is, being adaptable and creative, is a valid approach, as gathering the most relevant information outweighs methodological purity. The pragmatic approach relies on abductive reasoning that moves back and forth between deductive (quantitative) and inductive (qualitative) reasoning. Pragmatism is pluralist: it accepts a variety of competing interests and forms of knowledge, allowing knowledge to be evaluated according to whether it works in relation to a particular goal (Cornish and Gillespie 2009), focusing on what practical difference can be made and on whether any theory or idea succeeds in accomplishing a desired effect (Baert 2005). By using both quantitative and qualitative methods in a pragmatic approach, a deeper, richer understanding of the information seeking process could be attained. This is in line with the definition of mixed methods as offering 'a powerful third paradigm choice that often will provide the most informative, complete, balanced, and useful research results' (Johnson et al. 2007: 129).

This paper discusses the development of an efficient approach to qualitative analysis that fits within a mixed methods study of information behaviour. The findings illustrate how qualitative interpretative categorisation offers an efficient framework for analysing the qualitative component of a mixed methods study that is primarily quantitative in approach.

In many information behaviour studies the mixed methods approach may simply take the form of a questionnaire survey (mainly quantitative) with some open-ended questions included for qualitative analysis, the aim being to provide confirmation and explanation; for example, to clarify the reasons searchers had for the strategies and changes made as their searches progressed on various types of task (Kim 2009). Mason and Robinson (2011) used some open-ended questions in an online survey of the information behaviour of artists and designers; the quantitative and qualitative elements operated simultaneously, and their approach was theoretically based (on previous research on artists). Other examples of a largely quantitative approach, with qualitative elements providing explanation, include the research by Xie and Joo (2010), who used qualitative diary data to illuminate reasons for using, or not using, various Web search tactics in a primarily quantitative study of the frequency of their use. Kwon (2008) used validated instruments for critical thinking and library anxiety for the quantitative component of a study examining the relationships between the two. Student essays were analysed qualitatively to explain how these relationships developed, and how emotions (and critical thinking disposition) affected how students coped with library anxiety. The qualitative and quantitative approaches are integrated closely in such research, as the rationale for a longitudinal study of student information behaviour explains (Rowley and Urquhart 2007). The model produced in the latter project required findings from both the quantitative and qualitative components of the research; the qualitative data were certainly necessary in providing the justification and explanation (Urquhart and Rowley 2007). The quantitative element also provides the background context for other mixed methods studies in information behaviour that may be primarily interpretive/constructivist (e.g., Williamson 2008).

More unusually, quantitative analysis may be possible on a large qualitative data set. This happened with Urquhart and Rowley (2007), where complementary quantitative data collection was used to refine the information seeking model. Lu (2010) lacked a strong available theoretical framework for children's coping and information seeking in daily life, and therefore used qualitative methods to collect and analyse the data. The size of the data set (641 interviewees) allowed some quantitative analysis to help understand the effect of, for example, gender and grade level, as well as some of the emergent factors.

When reviewing the previous decade of information behaviour research for the Information Seeking in Context (ISIC) conferences, Vakkari (2008) noted a trend towards varied, but generally descriptive, qualitative research. Where (more unusually) qualitative and quantitative research were combined, this usually took the form of qualitative research being used to derive factors and categories, which were then subjected to some quantitative analysis. Pragmatism, as a general philosophy, might be predominant in the type of questions asked and the approach used. Theory and model testing are more difficult if there is no suitable theory to use. Theory may, of course, also be generated through qualitative research, via grounded theory, but the rigour with which grounded theory is applied seems very mixed in studies of information behaviour (González-Teruel and Abad-García 2012). A study of reviews of information behaviour research (Urquhart 2011) testifies to the wide range of inquiry paradigms used, and the difficulty, sometimes, of identifying which are used, and why. Despite the increasing representation of qualitative research at the ISIC conferences, Urquhart noted that in the general literature on information behaviour, cognitive (and quantitative) approaches are still comparatively common.

Deciding on the most appropriate mixed methods approach for an information behaviour research study may depend on the research question (Teddlie and Tashakkori 2009). Given the occasional lack of transparency about operating paradigms in information behaviour research, it seems sensible at least to state which paradigm is the main operating paradigm for the mixed methods study (Crotty 1998). Next, decisions should be made about the data collection methods and their timing, particularly the relationship between qualitative and quantitative data collection. The third step involves decisions about the synthesis and integration of the data collected. This mixed methods study aimed to produce a profile of information seeking behaviour in nursing students, using a concurrent embedded design, in which one data set plays a complementary role in a study based primarily on the other data set.

In this study, the main operating paradigm was post-positivist. The qualitative data collection and analysis served to check and critique both the findings of the quantitative research and the assumptions made within it. The literature suggested that several types of factor were important, but the study aimed to go beyond identifying one factor as more important than another: the aim was to provide a profile, with the qualitative research bringing meaning, a human face, to the profile through the integration of the quantitative and qualitative data.

During the process of collecting and analysing the data, a particular technique, termed qualitative interpretative categorisation, was developed for the mixed methods research project (see outline in Stokes 2008). This paper provides a brief synopsis of the research project before discussing the methodology used for the qualitative data analysis.

Background

Phase one, the quantitative component of the study, used a questionnaire containing validated research tools, as well as a set of questions based on Foster's (2005) non-linear model of information seeking, which predicts that individuals search using a range of different methods in a non-linear process. The aim of the questionnaire was to determine whether personality, self-efficacy or learning style affects the information seeking behaviour of nursing students (n=194). Phase two gathered qualitative interview data examining the information needs and seeking processes of a smaller sample (n=11) of nursing students.

Ideally, the interview schedule would have been designed in detail once the findings from the quantitative component had been analysed, but in practice this was impossible, as the ethical review for the study required that an interview schedule be developed before quantitative data collection. Sampling was also constrained: time pressures meant that all participants were approached at the same time for both parts of the study.

Initial findings from part one, which addressed the quantitative research questions on whether personality, self-efficacy or learning style affects the information seeking behaviour of nursing students, have already been reported (Stokes and Urquhart 2011).

The aim of the qualitative part of the study was to investigate the search processes of nursing students, answering two research questions: why do users search the way they do (RQ1), and what are the preferred methods of information seeking (RQ2)?

It should be emphasised that the word search here refers to the strategy taken rather than to the mechanics of information system use. The interviews were not concerned with which particular resources were used and in what order; the emphasis was on why students might feel more comfortable with certain sources, what their information seeking routines might be, and how they felt about these routines throughout the process of doing a search, or set of searches, for a particular project assignment. The types of information behaviour studied were initially assumed to follow Kuhlthau's (1991) stages of initiation, selection, exploration, formulation, collection and presentation, but we later relabelled the final stages as 'collection and evaluation' and 'presentation and ending' to describe the emphasis of the questions better. In this context, seeking is focused on the identification of sources.

The final phase of the study (not presented here) links the two previous phases and addresses a further research question.

A concurrent, embedded study allows either the comparison of the two sets of data or for them to reside side by side as 'two different pictures that provide an overall composite assessment of the problem' (Creswell 2009: 214). This latter approach is taken in this study, as Creswell suggests that this type of study can address different research questions.

Development of methodology

Introduction

The qualitative analysis pragmatically synthesises elements of the work of Burnard (1991), Miles and Huberman (1994) and Sandelowski (2000, 2010) with the critical incident technique (Flanagan 1954), to provide an overall technical framework of analysis termed qualitative interpretative categorisation. The framework is adaptable to different situations and research requirements and can be applied neatly when time pressures are a factor. All four of these research approaches are described below; Table 2 summarises the contribution of each, and Figure 1 shows the iteration involved in the stages of data analysis.

Critical incident technique

The critical incident technique is a systematic, inductive method that involves collecting descriptions of events and behaviours. Once collected, these are grouped and analysed using contextual, content or thematic analysis (Aveyard and Neale 2009). The critical incident technique was originally developed by Flanagan (1954) as part of the US Army Air Forces Psychology Program to describe successful and unsuccessful bombing missions. An incident is 'any observable human activity that is sufficiently complete in itself to permit inferences and predictions to be made about the person performing the act' (Flanagan 1954: 327). To be critical,

'an incident must occur in a situation where the purpose or intent of the act seems fairly clear to the observer and where its consequences are sufficiently definite to leave little doubt concerning its effects' (Flanagan 1954: 327).

Critical incidents can be recorded by a variety of methods, but 'the face-to-face interview format is the most satisfactory data collection method for insuring that all the necessary details are supplied' (Kemppainen 2000: 1265). In the late 1990s the JISC Usage Surveys: Trends in Electronic Information Services (JUSTEIS) project used the method to examine the uptake of electronic information sources (Urquhart et al. 2004; Urquhart et al. 2003), and it has been used in medical research (Bradley 1992). The critical incident technique is suited to nursing research because it relies on reflection, and interviewing is aligned with the oral culture of nursing practice (Schluter et al. 2008); it is therefore not surprising that it has been well used in this field. The Establishing the Value of Information to Nursing Continuing Education (EVINCE) project (Urquhart and Davies 1997) used the critical incident technique to examine patterns of information need amongst hospital and community nurses, and the technique has been used to capture the experiences of nurses from differing areas (Keatinge 2002; Perry 1997). Elsewhere it has been used to determine the quality of nursing care from both the nurse and patient perspective (Norman et al. 1992; Redfern and Norman 1999) and to explore the spiritual needs of patients (Narayanasamy and Owens 2001). Within nursing education it has recently been used to explore student nurses' perceptions of language awareness (Irvine et al. 2008) and the meaning of empowerment for nursing students within their clinical areas (Bradbury-Jones et al. 2007).

According to Flanagan (1954) there are five steps involved in conducting a critical incident technique investigation:

Stage 1: Give a clear statement of what is being investigated.
Stage 2: Specify inclusion criteria.
Stage 3: Collect data.
Stage 4: Analyse the data.
Stage 5: Interpret the data.

In this study a clear statement is given at the start of the interview, outlining the aspects under exploration, with supplementary probes used, where necessary, to elicit additional information if the participant appears hesitant or a response lacks detail. Because the incident is chosen by the interviewee, and is based on real events, areas not considered by the interviewer can be explored. Chell (2004) also notes that, because incidents are critical, recall is aided, and that the technique provides a focus which the researcher can probe and on which the interviewee can concentrate. This research used semi-structured rather than structured interviews, as this allows questions not listed in the guide to be asked (Bryman 2008).

One recently graduated nurse agreed to take part in a pilot study to test the face validity of the interview schedule. No changes to the schedule were deemed necessary, but the pilot did enable the researcher to shape supplementary questions and probes for the interviews.

Each interview took place at the same site and in the same location, at different times, with durations ranging from just over 15 minutes to almost 25 minutes. The interviews were tape recorded, listened to once, and transcribed in full into NVivo 8 software for analysis.

Qualitative interpretative categorisation – the main steps

Qualitative interpretative categorisation data analysis steps were then followed to identify underlying themes in relation to the objectives identified at the outset and the qualitative research questions. The justification for the stages involved is discussed below.

Burnard's (1991) stage-by-stage method of data analysis for semi-structured interviews was used as a base. His method assumes that semi-structured interviews are recorded in full and that the whole recording is transcribed. Although many commentators advocate peer reliability checks on coding as a method of enhancing rigour (Cohen and Crabtree 2008; Tobin and Begley 2004), it can be argued that, as the analysis is necessarily interpretive, different individuals will inevitably interpret the dataset differently. Other researchers suggest that providing sufficient detail of the theoretical and analytical decision making, and representing as much of the data as possible, is enough for reader verification (Chenail 1995; Constas 1992; Horsburgh 2003; Koch 2006). As Piper states, 'verification hinges on the reader being able to see how the text was constructed and not on shared interpretation' (Piper 2004: 156). For a team project, two people could share the coding; where only one person is responsible for coding, data display methods help to explain the decisions made on description and interpretation.
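Where two people do share the coding, their agreement on category assignments can be quantified. The sketch below is a hypothetical illustration only (this study used a single coder, and the label sequences are invented): Cohen's kappa compares observed agreement with the agreement expected by chance.

    # Illustrative sketch: Cohen's kappa for two coders assigning
    # categories to the same chunks of data. Labels are invented.
    from collections import Counter

    coder_a = ["Sources used", "Confidence", "Relevancy",
               "Confidence", "Sources used"]
    coder_b = ["Sources used", "Confidence", "Confidence",
               "Confidence", "Sources used"]

    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

    # Agreement expected by chance, from each coder's marginal frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2

    kappa = (observed - expected) / (1 - expected)
    print(f"observed {observed:.2f}, expected {expected:.2f}, kappa {kappa:.2f}")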

Miles and Huberman's (1994) qualitative analysis approach provided a framework for the processes of coding and data display. This approach focuses on three components that take place concurrently throughout the analysis: data reduction, data display, and conclusion drawing and verification.

As there were specific research questions to address, a priori categories were developed to encourage the researcher to look out for particular aspects within the data, providing an initial focus. Miles and Huberman (1994) advocate a provisional start list of categories generated from the research questions, a strategy often used in the health sciences (Creswell 2009) and supported elsewhere (Constas 1992; Dey 1993).

The start list is precisely that: it is not considered a final list until the research project is completed, acting rather to allow groupings to develop or become evident throughout the coding process. Data that do not fit are left temporarily free; further coding may then identify a category for a free chunk of data, thus modifying the start list table (these mechanics are sketched below, after Table 1). This type of data analysis is similar in intent to template analysis (King 2004), which uses (as the name suggests) a template of hierarchical categories that are amended, refined or deleted as the analysis takes place. Template analysis, however, differs from qualitative interpretative categorisation in that it is restricted to the data analysis stage and the template contains more than just the highest-order categories. In addition (as will be described later) it does not make use of any data display techniques other than the template itself.

The start list of a priori categories (Table 1) was generated from the two research questions and from the interview schedule, which was itself derived from the research questions.
Research question 1 (RQ1): Why do users search the way they do?
Research question 2 (RQ2): What are the preferred methods of information seeking?
The interview questions are listed below. For IQ6 the additional probes are given as they are the only reference to critiquing.
Interview question 1 (IQ1): CRITICAL INCIDENT TECHNIQUE
Please tell me about one entire project from a title or area through to completion: Please tell me about the activities and places that you look as you progress through a literature search. By all means take a moment to think back to where you were and who you consulted about this.
Interview question 2 (IQ2): Do you feel that you changed the way you search from the beginning of your search and as you move through? How did your priorities change?
Interview question 3 (IQ3): How does this search compare with other searches you have done before or after this time? If you searched differently in the past, why do you think you changed? If you now search differently – why have you subsequently changed?
Interview question 4 (IQ4): Where would you look for information? Who would you ask – and why?
Interview question 5 (IQ5): How do you identify new or useful information sources?
Interview question 6 (IQ6): When are you satisfied that you have enough information and can therefore move on to a new question, activity or different way of searching? How do you judge when enough is enough? (Probes: quantity, match with perceived needs for assignment, expectations of number of references, try to tease out how any interpolation is done, any critique of the information in the items retrieved, putting it all together.)


Table 1: Development of category start list
Name of category (start list) | Rationale for category | Relates to
Amount of information | What is the amount of information a student needs before moving on to something else, or starting the assignment? | RQ1, IQ1, IQ6
Confidence | Are students more or less confident now than before? | RQ1, IQ2, IQ3
Critiquing | Do students critique articles or take them at face value? (Academic tutors within health faculties emphasise critical appraisal of literature, and selection of quality evidence.) | RQ1, RQ2, IQ6
Relevancy | How do students determine what is or is not relevant? | RQ1, RQ2, IQ5
Satisfaction with searching | Why are students either satisfied or dissatisfied with their search? | RQ1, RQ2, IQ6
Searching techniques | What techniques do students employ in their searches? | RQ1, RQ2, IQ1, IQ2, IQ4
Sources used | What sources are used to find information? | RQ1, RQ2, IQ1, IQ4, IQ5
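The start list mechanics described above can be sketched in a few lines of Python. This is purely an illustrative toy, not part of the study's procedure: the category names come from Table 1, while the coded chunks and the helper function are invented.

    # Illustrative sketch of start-list coding: chunks that fit an a priori
    # category are filed under it; chunks that do not fit are left 'free'
    # until later coding suggests a home for them.
    from typing import Optional

    start_list = {category: [] for category in [
        "Amount of information", "Confidence", "Critiquing", "Relevancy",
        "Satisfaction with searching", "Searching techniques", "Sources used",
    ]}
    free_chunks = []  # data that does not (yet) fit any category

    def code_chunk(text: str, category: Optional[str]) -> None:
        """File a chunk under a category, or leave it free if none fits."""
        if category in start_list:
            start_list[category].append(text)
        else:
            free_chunks.append(text)

    code_chunk("I always start with the same database", "Sources used")
    code_chunk("I went back and searched on a new topic", None)  # no home yet

    # Later coding identifies a new category, modifying the start list
    # (compare Table 3), and the free chunk can be re-filed.
    start_list["Revision of searching"] = []
    code_chunk(free_chunks.pop(), "Revision of searching")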

Ideally, the qualitative analysis would be conducted after the quantitative analysis, allowing time for the researcher to detach mentally from the first set of findings and preventing the first set of results from intruding on the subsequent analysis. In team projects, those responsible for the qualitative component would not be involved in the analysis of the quantitative survey.

Classifying data into discrete groups using hierarchical cluster analytic techniques is suited to research where the number of groups is not certain; the exploratory nature of the technique allows relationships and principles between the groups to emerge through the research (Beckstead 2002). Using a priori categories also lends itself to the development of taxonomic structures (Bradley et al. 2007) that classify multifaceted phenomena according to a set of conceptual domains. Dendrograms can be used to cluster concepts from the specific to the general, a process that highlights not only the relationships between concepts but also how they have been grouped. Whilst usually computer generated from content analysis and depicting quantitative results (Beckstead 2002), a dendrogram can also be used purely as a means of depicting a hierarchical display. This clustering can form part of the data display component of the analysis. The data reduction component in Miles and Huberman's (1994) strategy is similar in technique to Burnard's (1991) stages three to five, in which categories are developed and streamlined. There are other methods of data display, of course, and post-it notes or labelled cards might be used if software is not easily available.
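Computer-generated dendrograms of the kind Beckstead (2002) describes can also be produced outside specialist qualitative software. The sketch below is a hypothetical illustration, not the study's procedure: it assumes a binary matrix recording which codes occur in which transcripts, clusters the codes by Jaccard distance with average linkage, and plots the result.

    # Illustrative sketch: hierarchical clustering of codes across transcripts.
    # Code names and the presence/absence matrix are invented.
    import numpy as np
    from scipy.spatial.distance import pdist
    from scipy.cluster.hierarchy import linkage, dendrogram
    import matplotlib.pyplot as plt

    codes = ["keyword selection", "asking peers", "library catalogue",
             "quality of references", "enough information"]
    # Rows: codes; columns: six hypothetical transcripts (1 = code present).
    presence = np.array([
        [1, 1, 0, 1, 0, 1],
        [1, 0, 0, 1, 0, 0],
        [0, 1, 1, 0, 1, 1],
        [0, 1, 1, 1, 1, 1],
        [1, 0, 1, 1, 0, 1],
    ], dtype=bool)

    # Jaccard distance between codes, then average-linkage clustering.
    tree = linkage(pdist(presence, metric="jaccard"), method="average")
    dendrogram(tree, labels=codes, orientation="right")
    plt.tight_layout()
    plt.show()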

The data analysis emphasised qualitative description, also termed 'thematic surveys' (Sandelowski 2010; Sandelowski and Barroso 2003). This type of study entails the presentation of the facts in everyday language, whilst allowing a level of interpretation of the data, and is amenable to obtaining straight and largely unadorned answers to questions of relevance to practitioners (Sandelowski 2000). Sandelowski suggests that data collection in this method (usually via semi-structured interviews or focus groups) is directed toward discovering the 'who, what, and where, of events and experiences' (Sandelowski 2000: 338), with content analysis the preferred technique. It is also suggested that a 'targeted event' be employed as a focus, and that pre-existing codes (or a framework for analysing the data) can be used as long as they are modified during the course of the analysis (Sandelowski 2010). In this case the critical incident technique provided the 'targeted event'. Again, the development and modification of codes reflects both Burnard's (1991) and Miles and Huberman's (1994) techniques of data reduction and clustering. The outcome of the study should then be presented in descriptive summaries.

Table 2 illustrates the fit between the three main contributors to the data collection and analysis methodology for the qualitative interpretative categorisation framework. The later stages (six to fourteen) of Burnard (1991) are included in the table, but Burnard's main contribution was to the earlier stages of data collection and analysis (up to stage five); the later stages seem more appropriate for research projects that are purely qualitative.


Table 2: Qualitative interpretative categorisation framework in relation to Burnard (1991), Miles and Huberman (1994) and Sandelowski (2000, 2010)
Qualitative interpretative categorisation | Burnard | Miles and Huberman | Sandelowski
Initial categorisation from research questions for start list | — | Pre-existing codes may be used as a start list | Pre-existing codes may be used as a start list
Note taking and close reading | Close reading and note taking (stage one); open coding (stage two) | — | Modification of codes during analysis: reading and reflection
Reflection on reading, organising data into themes according to start list; filling out categories with themes, reorganisation | Immersion in the data (stage three); developing broader categories (stages four and five): words and phrases grouped together (reduced) | Data reduction | Identification of patterns
Checking coherence of themes within revised categories; clustering | Developing broader categories (stages four and five) | Data reduction and display | Checking description is coherent and comprehensive
Development of dendrograms via data clustering | Guarding against bias (stage six); establishing categories cover all aspects (stage seven) | Conclusion (tentative) | Minimal interpretation, recourse to theory not essential
Assembly of final set of categories and dendrograms | Recoding as necessary (stages eight and nine); individual analysis (stage ten); validity check (stage eleven); organisation and writing up (stages twelve to fourteen) | Conclusion and verification | Checking that summary meets the needs of the audience

The complete framework can be shown schematically (Figure 1), using Creswell's (2009) data analysis process overview (the seven boxes on the left-hand side of the diagram) as a base. Figure 1 emphasises the reflexive and interactive nature of the coding, data reduction and data display processes.

Figure 1: Qualitative interpretative categorisation in relation to Creswell's (2009: 185) seven-step data analysis overview.

Creswell also suggests that his general framework should be blended with the specific research strategy steps. The blending of Creswell's general data analysis framework with the qualitative interpretative categorisation process provides a complete technical framework for the qualitative research analysis undertaken in this study. The starting point for data collection is the use of the critical incident technique.

Category development - discussion

The first steps of the coding emphasised close reading and faithful description of the events in the transcripts, in line with Burnard's (1991) open coding principles (Table 2). The first steps also attempted to map some of the newly assigned codes to the initial start list of categories, but this was a reflexive and interactive process (as indicated by Sandelowski (2000); see Table 2). It became clear early in the analysis that a separate category for 'searching differently' was required, and a new category, 'revision of searching', was therefore created. This category drew some data from the 'searching techniques' category, which was becoming overloaded with separate search strategies. Changes in the search process as the search progresses remained within that category, but it was renamed to the broader 'search strategy' to indicate that the data within included both the initial search and the follow-up. In addition, the 'critiquing' category was removed, as it generally received a yes/no response and did not yield any significant further information. These changes are outlined in Table 3. The interview transcripts were checked to ensure that the interviewees were not using different words to describe the phenomenon.


Table 3: Category development
Initial start list | Final categories
Amount of information | Amount of information
Confidence | Confidence
Critiquing | —
Relevancy | Pertinence
— | Revision of searching
Satisfaction with searching | Satisfaction with searching
Searching techniques | Search strategy
Sources used | Sources used

Within the search strategy category, an initial sub-category of problems with searching turned out to concern mainly keyword searching, and could therefore be included within satisfaction with searching, which already had a sub-category of keywords. It became the sub-category keyword selection within the dissatisfaction element of that category. Excerpts from the transcripts include: 'I think a problem I did encounter was the keyword focuses' and 'Sometimes if you put in keywords and it doesn't… it comes up with something totally different you think "Oh that's not for me" so you close it down and click on the next one'.

In addition, relevancy became a sub-category within the category pertinence, to indicate better the precise bearing of a retrieved document in relation to the information need. Relevancy might seem the obvious term to an information professional, but close reading of the transcripts indicated that the students perceived their information seeking priorities differently.

Creation and utilisation of nodes took place throughout, but it was the use of dendrograms that enabled the nodes to be grouped as each transcript was analysed; tentative groups could be confirmed through the analysis of further transcripts. Dendrograms can be generated within NVivo, and are often used as a cluster analysis technique to explore patterns within the data: sources or nodes that appear close together in the cluster analysis diagram are more similar than those that are far apart. In the example in Figure 2, the dendrogram was organised chronologically, as this suited the emerging pattern within the data.
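The basis of such clustering can be illustrated simply: if each node is treated as the set of transcript passages coded at it, a pairwise similarity measure such as the Jaccard coefficient places nodes that code many of the same passages close together. (NVivo's own calculation is not documented here; the node names and passage identifiers below are invented for illustration.)

    # Illustrative sketch: coding similarity between nodes as Jaccard
    # similarity over sets of coded passage identifiers.
    def jaccard(a: set, b: set) -> float:
        """Size of the intersection over the size of the union."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    # Each node maps to the transcript passages coded at it (invented ids).
    nodes = {
        "few quality references at start": {"T1:p3", "T2:p1", "T5:p4"},
        "plenty of references at end":     {"T1:p9", "T2:p1", "T7:p2"},
        "one reference per 100 words":     {"T3:p6", "T7:p2"},
    }

    names = list(nodes)
    for i, first in enumerate(names):
        for second in names[i + 1:]:
            print(f"{first} / {second}: "
                  f"{jaccard(nodes[first], nodes[second]):.2f}")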

The sets of results were then laid out in separate categories, each with a dendrogram indicating the nodal structure within the category. The initial start list of categories required minimal amendment, with only three significant alterations (Table 3). The themes (nodes) within each category provide a much fuller description of its content.

Example dendrogram: amount of information

The amount of information category contains nodes pertaining to the students' perception of the quantity of information needed for an assignment, with nodes grouped into sub-categories of at the start, the initial stages, and at the end (Figure 2). This category differs from revision of searching, which covers searching for different information or topics, or a different method of searching.

Figure 2: Dendrogram showing the nodal structure within the category for 'amount of information'. Codes closer in proximity are more similar.

What became clear from the interviews was the difference between the amount of information needed at the start of the assignment and at the end: an emphasis on getting only a few quality references to get going, but a desire for plenty of references, more quantity, by the end. There was also a general feeling of needing one reference per 100 words.

Current analysis of the seven categories has enabled the development of an information seeking process model based on Byström and Järvelin's (1995) task-based model, which, on reflection, was a better fit for the qualitative data than the Kuhlthau (1991) model. It is envisaged that the information seeking profiles being re-evaluated as part of the mixing stage of the study will feed into this model to create an information seeking framework.

Discussion

So why bother blending techniques? What is the rationale? Why not use a pre-existing research framework from one of the traditional types of qualitative research, for example grounded theory or phenomenology?

Simply put, these methods do not fit; indeed, no single type of qualitative method fits this research, and there is little point trying to 'shoehorn' one into line. Both grounded theory and phenomenology contain aspects that are either not possible here or irrelevant. For example, doing grounded theory properly requires theoretical sampling, which was not possible because of the demands of ethical review. Equally, as phenomenology is concerned with the 'lived experience', it would focus on how students felt about information seeking rather than on what they actually did. The research here blends, from a pragmatic methodological perspective, various well-known approaches to data collection and analysis, to generate a new framework that can be applied in situations where the researcher is constrained by time pressures, or simply wants a pragmatic, rather than a highly interpretive, approach. This research is also pragmatic in that the evidence should help the design of information literacy programmes for nursing students.

Our research could be considered to fall within the compass of generic qualitative research, which exhibits the characteristics of qualitative endeavour while either combining several methodologies or approaches, or claiming no methodological viewpoint at all (Caelli et al. 2003). It could also be described as basic interpretive research, in which data are 'inductively analysed to identify recurring patterns or common themes that cut across the data' (Merriam 2002: 7). Having no defined boundary for the type of research method reflects Sandelowski's (2010) view that efforts to define and generalise do not capture the variations in the actual practice of methods, and that there is no perfect execution of any method: methods are always accommodated to the real world of research practice and are therefore reinvented. As Patton states, 'because each qualitative study is unique, the analytical approach will be unique' (Patton 2002: 433). Having no clear canonical path also allows flexibility in applying methods to answer research questions appropriately (Chamberlain 2000), preventing methodolatry, the overemphasis on selecting and describing methods that overtakes the story being told (Janesick 2000). Avis (2003) goes further, suggesting that methodological theory can be overemphasised to the detriment of the research process, and that method should not be used to justify the production of evidence in a way that 'closes off critical scrutiny of the evidence by locating it as internal to a particular methodological theory' (Avis 2003: 1004). This is in line with Miles and Huberman's pragmatic statement that any method that produces 'clear, verifiable, credible meanings from a set of qualitative data – is grist to our mill' (Miles and Huberman 1994: 3).

For many, the world of qualitative research may seem very complex. One prominent qualitative researcher has described the field as 'unhelpfully fragmented and incoherent' (Atkinson 2005). Atkinson stresses the value of formal analysis, 'preserving and respecting the different layers of action and representation through which cultures are enacted and social action is performed' (Atkinson 2005: para. 20). In this research, the intention was to reach a fuller understanding of the information seeking of nursing students. That means trying to see the situation from the perspective of the student, as well as appreciating where the assumptions of those designing an information literacy programme come from. For example, we had assumed that students might talk about evaluation of the information obtained and the development of their critical thinking, as many information literacy models, and the students' academic tutors, assume. However, the interviews revealed that students were often concerned with the weight of evidence in a different way: finding enough justification for their arguments. There are layers of action (students doing assignments, academics designing programmes of study, librarians designing and conducting information literacy programmes), and many activities and professional (or other) assumptions associated with these sets of actions. It is important to stress that, even where the name of a category did not change between the start list and the final list, the final category is much more closely defined than the initial one, which was, simply, a rather flat label. To cite Dey (1993: 102):

The meaning of a category is therefore bound up on the one hand with the bits of data to which it is assigned, and on the other hand with the ideas it expresses. These ideas may be rather vague at the outset of the analysis. The meaning of a category is something that evolves during the analysis, as we make more and more decisions about which bits of data can or cannot be assigned to the category.

It is possible that dendrograms might be more difficult to manage manually with a larger number of interview transcripts, but there are software programs that can help in creating a workable dendrogram from larger datasets. For information behaviour research, and when working with a strong structure such as the narrative underpinning the critical incident technique, there seems less risk of additional complexity.

Conclusions

This paper has set out the rationale and justification for the development of the qualitative interpretative categorisation method, which can be used pragmatically in a range of situations. It is particularly well suited to concurrent mixed methods studies where the researcher is constrained by time pressures, or where the research is the responsibility of a team. The method therefore offers a systematic way of dealing with interview data obtained alongside survey data in research done by practitioners. Using this approach encourages proper interrogation and questioning of the data, avoiding the risk of fitting the data to preconceived assumptions.

Some researchers may prefer the integrity of an approach developed by one researcher, and might criticise the qualitative interpretative categorisation as a 'pick and mix' approach. In some ways that is what it is, but the intention was to develop a systematic approach to data analysis that was informed by previous critiques of qualitative research. The approach emphasises the data display and data reduction possibilities of dendrograms. Such visual methods make the generation of categories a more transparent process, and this is helpful in many situations where the information behaviour research findings need to be explained to practitioners.

Acknowledgements

The overall research project received funding from the Arts and Humanities Research Council.

About the authors

Peter Stokes is a Subject Librarian at Anglia Ruskin University, Peterborough, UK. He received his Masters in Health Information Management from Aberystwyth University and is currently undertaking a PhD at the same university. He can be contacted at: peter.stokes@anglia.ac.uk
Christine Urquhart has directed several studies of information seeking and use in the health sector and also co-directed a longitudinal study of the impact of electronic information services on the information behaviour of students and staff in UK higher and further education. She also prepares systematic reviews for the Cochrane Collaboration, principally the Effective Practice and Organization of Care group and is a co-author of reviews on nursing record systems and telemedicine. She was Director of Research in the Department at Aberystwyth for several years and established the training programme for doctoral students. She can be contacted at cju@aber.ac.uk

References
  • Atkinson, P. (2005). Qualitative research – unity and diversity. Forum: Qualitative Social Research (FQS), 6(3) Art. 26, Retrieved 8 September 2012 from http://www.qualitative-research.net/index.php/fqs/article/view/4/9 (Archived by WebCite® at http://www.webcitation.org/6DpmFLuvl)
  • Aveyard, H. & Neale, J. (2009). Critical incident technique. In J. Neale (Ed.), Research methods for health and social care. Basingstoke: Palgrave Macmillan.
  • Avis, M. (2003). Do we need methodological theory to do qualitative research? Qualitative Health Research, 13(7), 998-1004.
  • Baert, P. (2005). Philosophy of the social sciences. Cambridge: Polity.
  • Beckstead, J. W. (2002). Using hierarchical cluster analysis in nursing research. Western Journal of Nursing Research, 24(3), 307-319.
  • Bradley, C. P. (1992). Uncomfortable prescribing decisions: a critical incident study. British Medical Journal, 304(6822), 294-296.
  • Bradley, E. H., Curry, L.A. & Devers, K. J. (2007). Qualitative data analysis for health services research: developing taxonomy, themes, and theory. Health Services Research, 42(4), 1758-1772.
  • Bryman, A. (2008). Social research methods (3rd ed.). Oxford: Oxford University Press.
  • Burnard, P. (1991). A method of analysing interview transcripts in qualitative research. Nurse Education Today, 11(6), 461-466.
  • Byström, K. & Järvelin, K. (1995). Task complexity affects information seeking and use. Information Processing & Management, 31(2), 191-213.
  • Caelli, K., Ray, L. & Mill, J. (2003). 'Clear as mud': toward greater clarity in generic qualitative research. International Journal of Qualitative Methods, 2(2), Article 1. Retrieved 7 April 2012 from http://www.ualberta.ca/~iiqm/backissues/2_2/pdf/caellietal.pdf (Archived by WebCite® at http://www.webcitation.org/6DpmuduPg)
  • Chamberlain, K. (2000). Methodolatry and qualitative health research. Journal of Health Psychology, 5(3), 285-296.
  • Chell, E. (2004). Critical incident technique. In C. Cassell & G. Symon (Eds.), Essential guide to qualitative methods in organizational research (pp. 45-60). London: Sage Publications.
  • Chenail, R. J. (1995). Presenting qualitative data. The Qualitative Report, 2(3). Retrieved 22 April 2012 from http://www.nova.edu/ssss/QR/QR2-3/presenting.html (Archived by WebCite® at http://www.webcitation.org/6DppIUJif)
  • Cohen, D. & Crabtree, B. (2008). Evaluative criteria for qualitative research in health care: controversies and recommendations. The Annals of Family Medicine, 6(4), 331.
  • Constas, M. A. (1992). Qualitative analysis as a public event: the documentation of category development procedures. American Educational Research Journal, 29(2), 253-266.
  • Cornish, F. & Gillespie, A. (2009). A pragmatist approach to the problem of knowledge in health psychology. Journal of Health Psychology, 14(6), 800.
  • Creswell, J. W. (2009). Research design: qualitative, quantitative, and mixed methods approaches (3rd ed.). Los Angeles: Sage Publications.
  • Creswell, J. W. & Plano-Clark, V. L. (2011). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks: Sage Publications.
  • Crotty, M. (1998). The foundations of social research: meaning and perspective in the research process. London: Sage Publications.
  • Dey, I. (1993). Qualitative data analysis: a user friendly guide for social scientists. London: Routledge.
  • Flanagan, J. C. (1954). The critical incident technique. Psychological Bulletin, 51(4), 327-358.
  • Foster, A. (2005). A non-linear model of information seeking behaviour. Information Research, 10(2), paper 222. Retrieved 14 April 2012 from http://informationr.net/ir/10-2/paper222.html (Archived by WebCite® at http://www.webcitation.org/5cm6ah7Ps)
  • Giddings, L. S. (2006). Mixed-methods research: positivism dressed in drag? Journal of Research in Nursing, 11(3), 195-203.
  • González-Teruel, A. & Abad-García, M. F. (2012). Grounded theory for generating theory in the study of behavior. Library & Information Science Research, 34(1), 31-36.
  • Horsburgh, D. (2003). Evaluation of qualitative research. Journal of Clinical Nursing, 12(2), 307-312.
  • Irvine, F. E., Roberts, G. W., Tranter, S., Williams, L. & Jones, P. (2008). Using critical incident technique to explore student nurses' perceptions of language awareness. Nurse Education Today, 28(1), 39-47.
  • Janesick, V. J. (2000). The choreography of qualitative research design: minutes, improvisations, and crystallization. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 379-399). Thousand Oaks: Sage Publications.
  • Johnson, R. B., Onwuegbuzie, A. J. & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112-133.
  • Keatinge, D. (2002). Versatility and flexibility: attributes of the critical incident technique in nursing research. Nursing & Health Sciences, 4(1-2), 33-39.
  • Kemppainen, J. K. (2000). The critical incident technique and nursing care quality research. Journal of Advanced Nursing, 32(5), 1264-1271.
  • Kim, J. (2009). Describing and predicting information-seeking behavior on the Web. Journal of the American Society for Information Science and Technology, 60(4), 679-693.
  • King, N. (2004). Using templates in the thematic analysis of text. In C. Cassell & G. Symon (Eds.), Essential guide to qualitative methods in organizational research (pp. 256-270). London: Sage Publications.
  • Koch, T. (2006). Establishing rigour in qualitative research: the decision trail. Journal of Advanced Nursing, 53(1), 91-100.
  • Kuhlthau, C. (1991). Inside the search process: information seeking from the user's perspective. Journal of the American Society for Information Science, 42(5), 361-371.
  • Kwon, N. (2008). A mixed-methods investigation of the relationship between critical thinking and library anxiety among undergraduate students in their information search process. College & Research Libraries, 69(2), 117-131.
  • Lipscombe, M. (2008). Mixed method nursing studies: a critical realist critique. Nursing Philosophy, 9(1), 32-45.
  • Lu, Y. L. (2010). Children's information seeking in coping with daily-life problems: An investigation of fifth- and sixth-grade students. Library & Information Science Research, 32(1), 77-88.
  • Mason, H. & Robinson, L. (2011). The information-related behaviour of emerging artists and designers: inspiration and guidance for new practitioners. Journal of Documentation, 67(1), 159-180.
  • Merriam, S. B. (2002). Introduction to qualitative research. In S. B. Merriam (Ed.), Qualitative research in practice: examples for discussion and analysis (pp. 3-17). San Francisco: Jossey-Bass.
  • Miles, M. B. & Huberman, A. M. (1994). Qualitative data analysis: an expanded sourcebook (2nd ed.). Thousand Oaks: Sage Publications.
  • Morgan, D. L. (2007). Paradigms lost and pragmatism regained: methodological implications of combining qualitative and quantitative methods. Journal of Mixed Methods Research, 1(1), 48-76.
  • Narayanasamy, A. & Owens, J. (2001). A critical incident study of nurses' responses to the spiritual needs of their patients. Journal of Advanced Nursing, 33(4), 446-455.
  • Norman, I., Redfern, S., Tomalin, D. & Oliver, S. (1992). Developing Flanagan's critical incident technique to elicit indicators of high and low quality nursing care from patients and their nurses. Journal of Advanced Nursing, 17(5), 590-600.
  • Patton, M. Q. (2002). Qualitative research & evaluation methods (3rd ed.). Thousand Oaks: Sage Publications.
  • Perry, L. (1997). Critical incidents, crucial issues: insights into the working lives of registered nurses. Journal of Clinical Nursing, 6(2), 131-137.
  • Piper, S. (2004). An interpretive inquiry testing the relationship between health promotion theory and nursing practice. Anglia Ruskin University, Cambridge.
  • Redfern, S. & Norman, I. (1999). Quality of nursing care perceived by patients and their nurses: an application of the critical incident technique. Part 2. Journal of Clinical Nursing, 8(4), 414-421.
  • Rowley, J. & Urquhart, C. (2007). Understanding student information behavior in relation to electronic information services: lessons from longitudinal monitoring and evaluation, Part I. Journal of the American Society for Information Science and Technology, 58(8), 1162-1174.
  • Sandelowski, M. (2000). Whatever happened to qualitative description? Research in Nursing & Health, 23(4), 334-340.
  • Sandelowski, M. (2010). What's in a name? Qualitative description revisited. Research in Nursing & Health, 33(1), 77-84.
  • Sandelowski, M. & Barroso, J. (2003). Classifying the findings in qualitative studies. Qualitative Health Research, 13(7), 905-923.
  • Schluter, J., Seaton, P. & Chaboyer, W. (2008). Critical incident technique: a user's guide for nurse researchers. Journal of Advanced Nursing, 61(1), 107-114.
  • Stokes, P. (2008). Developing an information seeking behaviour profile for nursing and midwifery students. Information Research, 13(4), paper wks13. Retrieved 12 May 2012 from http://InformationR.net/ir/13-4/wks13.html (Archived by WebCite® at http://www.webcitation.org/6Dk4SMmFI)
  • Stokes, P. & Urquhart, C. (2011). Profiling information behaviour of nursing students: part 1: quantitative findings. Journal of Documentation, 67(6), 908-932.
  • Symonds, J. E. & Gorard, S. (2008). The death of mixed methods: research labels and their casualties. Paper presented at the British Educational Research Association Annual Conference, Heriot-Watt University, Edinburgh, September 3-6, 2008. Retrieved 13 February 2013 from http://www.leeds.ac.uk/educol/documents/174130.pdf (Archived by WebCite® at http://www.webcitation.org/6EPOnBm6t)
  • Teddlie, C. & Johnson, R. B. (2009). Methodological thought since the 20th century. In C. Teddlie & A. Tashakkori (Eds.), Foundations of mixed methods research: integrating quantitative and qualitative approaches in the social and behavioral sciences. Los Angeles: Sage Publications.
  • Teddlie, C. & Tashakkori, A. (Eds). (2009). Foundations of mixed methods research: integrating quantitative and qualitative approaches in the social and behavioral sciences. Los Angeles: Sage Publications.
  • Tobin, G. & Begley, C. (2004). Methodological rigour within a qualitative framework. Journal of Advanced Nursing, 48(4), 388-396.
  • Urquhart, C. (2011). Meta-synthesis of research on information seeking behaviour. Information Research, 16(1). Retrieved 24 September 2012 from http://informationr.net/ir/16-1/paper455.html (Archived by WebCite® at http://www.webcitation.org/6DppUk7Wr)
  • Urquhart, C. & Davies, R. (1997). EVINCE: the value of information in developing nursing knowledge and competence. Health Libraries Review, 14(2), 61-72.
  • Urquhart, C. & Rowley, J. (2007). Understanding student information behavior in relation to electronic information services: lessons from longitudinal monitoring and evaluation, Part 2. Journal of the American Society for Information Science and Technology, 58(8), 1188-1197.
  • Urquhart, C., Armstrong, C., Spink, S., Thomas, R., Yeoman, A., Fenton, R., et al. (2004). JISC user behaviour monitoring and evaluation framework. JUSTEIS: JISC usage surveys - trends in electronic information services. Final report - 2003/2004 cycle five report. Retrieved 10 May 2010 from http://cadair.aber.ac.uk/dspace/bitstream/2160/239/1/JUSTEIS%20cycle%205%20report.pdf (Archived by WebCite® at http://www.webcitation.org/6DppgNp5F)
  • Urquhart, C., Light, A., Thomas, R., Barker, A., Yeoman, A., Cooper, J., et al. (2003). Critical incident technique and explicitation interviewing in studies of information behavior. Library & Information Science Research, 25(1), 63-88.
  • Vakkari, P. (2008). Trends and approaches in information behaviour research. Information Research, 13(4), paper 361. Retrieved 13 May 2012 from http://informationr.net/ir/13-4/paper361.html (Archived by WebCite® at http://www.webcitation.org/5vAdbysDG)
  • Williamson, K. (2008). Where information is paramount: a mixed methods, multi-disciplinary investigation of Australian online investors. Information Research, 13(4). Retrieved 14 May 2012 from http://informationr.net/ir/13-4/paper365.html (Archived by WebCite® at http://www.webcitation.org/6Dpps8l7O)
  • Xie, I. & Joo, S. (2010). Transitions in search tactics during the web-based search process. Journal of the American Society for Information Science and Technology, 61(11), 2188-2206.
How to cite this paper

Stokes, P. & Urquhart, C. (2013). "Qualitative interpretative categorisation for efficient data analysis in a mixed methods information behaviour study" Information Research, 18(1) paper 555. [Available at http://InformationR.net/ir/18-1/paper555.html]
