Original Article

“Clinical Doctors as Qualitative Researchers”: QDAS Factors Informing Hospital Research Policy

P James

Citation

P James. “Clinical Doctors as Qualitative Researchers”: QDAS Factors Informing Hospital Research Policy. The Internet Journal of Medical Informatics. 2012 Volume 6 Number 2.

Abstract


Background: This paper explores the everyday experiences and research rationale of clinical doctors within a purposefully selected single institution, a Thai group of private hospitals. An article search for research conducted by clinical doctors revealed that little is written about their experiences, or about their engagement in research practices using qualitative software that lead to pragmatic research solutions for clinically identified problems or issues underpinned by hospital research policy. Methods: The paper utilises a qualitative approach, targeting a Thai private hospital general medical department and using a small semi-structured questionnaire to create a focused element of context and flexibility. Twelve doctors were randomly chosen from the 22 who were available and willing to share their needs and experiences, and a focus group was used as the data-collecting vehicle. This paper addresses issues raised when doctors use qualitative software to assist in their research process while conducting research at a private hospital in Bangkok. Results: The 3 developed research questions were mapped to 9 major themes, underpinned by 27 sub-themes. The presented qualitative data outcomes highlight the various experiences, perspectives, opportunities and challenges that Thai clinical doctors perceive they face. Together, these illustrate the diversity of research practice opinion of a small group of Thai clinical doctors when engaging more effectively in qualitative evidence-based research practices using QDAS software. Conclusions: The impact of this research suggests that the perceived complexities of qualitative software can be mitigated with appropriate training, specialist guidance, and appropriate hospital management engagement in building equitable research capacity. This will also give greater support to clinical doctors engaging in qualitative evidence-based research, leading to more effective primary care solutions.


Introduction

The role of Qualitative Data Analysis Software (QDAS) in qualitative research has become an imperative, as the application and use of qualitative research methods gain greater popularity (Fielding and Lee, 2002), whilst the availability and complexity (Ulin, Robinson and Tolley, 2005) of such software packages have increased. The process of qualitative data analysis is a difficult skill (Broom, 2005) for qualitative researchers to develop whilst attempting to engage with and enhance methodological rigour through the application and use of technological tools (Meyrick, 2006). However, QDAS packages are still not accepted as a mainstream qualitative research tool, as a debate persists in the literature over the usefulness of such software in relation to the philosophical stance of the analyst (Catterall and Maclaran, 1998), as well as the age, computer literacy, and experience of the qualitative researcher (Mangabeira, Lee and Fielding, 2004). Research activity utilising computer applications to handle unstructured, qualitative data has produced a huge variety of QDAS packages (around 50 different packages at present), such as Atlas-Ti (v6) and, more recently, NVivo (v9). To make full use of a QDAS system, appropriate research questions need to be developed that lead to a clearly defined methodology and/or data-analytic strategy before data analysis begins. These packages were developed to provide functions such as word searching, data storage and retrieval, data coding, memoing, graphical mapping, hierarchical theme/tree building, concept building, and reflexive report writing (based on Peace, 2000; and QSR International, 2010).

In many ways, QDAS packages are considered relatively fast in their ability to process large amounts of data (media such as documents, video, photographs and audio) (Morse and Richards, 2002). They can reflect the versatility in research approaches to working with data more interactively (Kearns, 2000) for deeper, contextualised investigation (Bassett, 2004), compared with traditional qualitative analysis using manual card and paper techniques, whilst leaving a visible and recoverable audit trail (St. John and Johnson, 2000) to support methodological rigour (Dey, 1993). As more and more researchers report using QDAS packages, their usage appears to have revolutionised the way methodological and analytical work is carried out in qualitative research. However, the decision whether or not to use a QDAS package is based on the individual researcher’s requirements, as well as the researcher’s skills and experience with software and technology (Webb, 1999). Nevertheless, using such packages does not automatically create the methodological notion of qualitative analysis of the generated data, nor does it, by default, increase the robustness or rigour of the qualitative research method utilised. These packages can be considered a tool for the researcher to use. This raises the first research question: in what ways are QDAS packages used to help “doctors as qualitative researchers” develop answers to their research questions?

What is Qualitative Data Analysis (QDA)?

Data analysis is a systematic search for meaning (Hatch, 2002) articulated through the visible use of robust qualitative research methods and tools. How the resultant analysis becomes defined as meticulously engaged depends on the methodological treatment utilised and made visible by the qualitative researcher, thus contributing towards methodological validity and reliability. Consequently, the qualitative researcher faces the task of somehow reducing the often huge amount of collected qualitative data (Lee and Esterhuizen, 2000) into a form in which it can be examined for overt and less explicit patterns and hidden relationships. This task can seem onerous, especially when the amount of data collected is perceived as somewhat daunting; anything over 10,000 words is today sufficient for the qualitative researcher to consider the use of QDAS packages. QDA is the central function of the use of QDAS packages, where its general principles and application are concerned with utilising a qualitative approach when considering qualitatively derived data. The qualitative researcher must then make some decisions about the sort of coding scheme to use – a set of markers, tags or labels representing conceptually engaged categories into which to sort the available data and relate it to the literature for future examination, analysis and comment. Such an approach ensures that all data collected has been captured through this process and provides appropriate means to show characterised methodological validity.
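The notion of a coding scheme can be made concrete with a small sketch. The snippet below is a toy illustration only, not how any QDAS package works internally; the extracts are drawn from respondent quotes reported later in this paper, and the keyword cues merely stand in for the researcher's interpretive judgement.

```python
from collections import defaultdict

# Data segments to be coded: (speaker, extract). Extracts are taken from
# the respondent quotes reported later in this paper.
segments = [
    ("D3", "we aren't really trained in qualitative research methods"),
    ("D7", "it is overwhelming dealing with all that data on your own"),
    ("D11", "if only we did get better training"),
]

# A coding scheme: conceptual labels mapped to (hypothetical) keyword cues.
# In real QDA the analyst, not a keyword list, decides what a segment means.
coding_scheme = {
    "training_needs": ["trained", "training"],
    "workload": ["overwhelming", "data"],
}

# Sort each segment into every category whose cues it matches.
codebook = defaultdict(list)
for speaker, text in segments:
    for code, cues in coding_scheme.items():
        if any(cue in text for cue in cues):
            codebook[code].append(speaker)

print(dict(codebook))  # each code now indexes the segments it tags
```

The point is only that a coding scheme is a many-to-many mapping between labels and data segments; retrieval ("show me everything coded training_needs") then falls out of that mapping for free.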

Using the Constant Comparison Method of Analysis in QDAS Packages

Constant comparison analysis (Glaser and Strauss, 1967) is likely the most commonly contemplated methodological approach to the analysis of qualitative data. The term “coding” is commonly used when referring to this type of analysis (Miles and Huberman, 1994), which results from thick, rich descriptions that are seemingly situationalised and contextual in nature (Onwuegbuzie and Leech, 2004). It is often preceded by a 3-stage process of open, axial and selective coding that seeks to underpin the constant comparison method (Patton, 2002), moving from an open-coding process to constant comparison, which draws upon finely tuned theoretical and cyclic analytic researcher skill building and development (Ryan and Bernard, 2003). The constant comparison method is often undertaken deductively (codes are identified prior to any analysis (Miles and Huberman, 1994) and then looked for in the available data – an a priori process), but it can also be conducted inductively (codes emerge out of the data) or abductively (codes emerge from the application of a continuing cyclic and iterative process), which adds another demonstration of validity for qualitative research outcomes. QDAS packages allow the researcher to code data directly and in real time, splitting the data into more manageable and visible components and identifying or naming these segments by assigning themes and located sub-themes. The text data can also be coded and recoded dynamically (Strauss and Corbin, 1998) and easily into any new emergent concepts, categories, or identified themes, and in some cases help develop appropriate models as the analysis progresses; the new coding categories can be compared (cross-referenced) with respect to other coded responses or tabled questions (Bazeley, 2006).
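The constant-comparison loop itself can be sketched in a few lines. This is a deliberately naive illustration under stated assumptions: word-overlap similarity and an arbitrary threshold stand in for the analyst's theoretical judgement, which no software can replace.

```python
def similarity(a: str, b: str) -> float:
    """Naive word-overlap (Jaccard) similarity between two segments."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / max(len(wa | wb), 1)

def constant_comparison(segments, threshold=0.2):
    """Compare each new segment with existing categories: join the best
    match if it is close enough, otherwise seed a new emergent category."""
    categories = []
    for seg in segments:
        best, best_score = None, 0.0
        for cat in categories:
            score = max(similarity(seg, m) for m in cat["members"])
            if score > best_score:
                best, best_score = cat, score
        if best is not None and best_score >= threshold:
            best["members"].append(seg)            # constant comparison: merge
        else:
            categories.append({"members": [seg]})  # new category emerges
    return categories

cats = constant_comparison([
    "training is needed",
    "more training needed",
    "software costs money",
])
print([c["members"] for c in cats])
```

Starting with an empty category list, as here, corresponds to the inductive mode described above; seeding `categories` before the loop would correspond to the deductive (a priori) mode.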
The use of QDAS packages appears to speed up this research process, which becomes more flexible as a result, as the software helps the researcher ask questions to “hear” the data (Rubin and Rubin, 2005); but the research process, the analysis, and the thinking about the implications of the analysis in terms of the literature are driven by the researcher and not the software. The software package thus becomes the research artist’s canvas. What is often overlooked is the usefulness of QDAS packages such as NVivo in helping develop a more constructive and robust engagement with the literature review. Consequently, it seems a logical and more dynamic possibility to release the power of QDAS packages when researching the context of research projects whilst also using the QDAS package within its methodological operations. In this way, researchers will be able to test and retest the assumptions associated with the contextual literature as well as the contextually generated data. It is proposed, therefore, that connecting both context and generated data with the QDAS technology cements a more rigorous outcome for the research project. For example, the NVivo (v9) package gives the researcher the ability to engage with bibliographic data and EndNote in ways that make it very easy to sustain a logical sharpness in the literature development within each ongoing research project. Using this type of software package therefore appears to link the literature to the methodology and on to the outcomes, which can be retested against the literature in building a more expressive theoretical development. This raises the second research question: when QDAS packages are used, how effective are they in helping the “doctors as qualitative researchers” realise the “truth” and/or the full meaning behind the data?

When choosing a QDAS package, the researcher’s style of working with the available data is paramount, and the package needs to be flexible enough to allow the researcher to interrogate the generated data and develop the required analysis in a natural way. It is the type of qualitative analysis associated with the requirements of the research questions that dictates which package is more suitable to use (Williams, Mason and Renold, 2004). Consequently, the choice of package can dictate the type of analysis to be performed, and care needs to be taken in the final choice of software package. In some circumstances more than one may be used (Mangabeira, Lee and Fielding, 2004), as the data analysis and subsequent literature engagement may force different approaches that lead to different software package treatments. In essence, for most researchers one package could be all that is needed; in other research circumstances, multi-package engagement would need to be utilised, as the specific and on-going research orientation demands different data treatments that can only be achieved through multi-package use. One very important and evident problem in using QDAS packages is how to display and model the patterns and relationships found in the data. Unfortunately, many researchers claim they have used these types of packages for the data analysis but fail to show specifically how their structural/theoretical propositions have been arrived at as a direct result of an engagement with the software and its corresponding analysis. This is mirrored to some extent by Glaser (2004), for example, who states that …currently it appears to be very popular in QDA research for substantive and methodological papers to label QDA as GT for the rhetorical legitimating. Consequently, some qualitative researchers appear to use the wrong methodology, and possibly the wrong type of software, to conduct their data analysis. This obvious issue does little to substantiate methodological rigour and deflects from the engaged and proper use of qualitative software packages. This raises the third research question: in what ways does the use of QDAS packages help “doctors as qualitative researchers” substantiate the methodological rigour required?

Methodology

To consider closely the QDAS issues involved in the support of research methodology, this empirical paper employed an interpretive approach. It used a semi-structured questionnaire administered through a focus group, as is now common practice for such enquiries (Krueger, 1994). This provides an appropriate element of context and flexibility (Cassell and Symon, 2004). Given the lack of purposeful research in the area of software use and qualitative methods in health in Thailand, this methodology is seen as appropriate for generating contextual data supporting the purpose of underpinning enriched theory development (Cayla and Eckhardt, 2007) and informing professional practice (Brown, Reynolds and Brenman, 1994).

The population for this study were all doctors who have conducted qualitative research as part of their professional development at a medical facility in a Thai private hospital in Bangkok, Thailand (derived from Carman, 1990; and Glaser, 2004) and the resultant sample frame was based on convenience sampling (after Harrel and Fors, 1995). The criteria of theoretical purpose and relevance (Glaser and Strauss, 1967) were applied to the identified population.

Consequently, all doctors were included in the population frame as individuals (material objects) that form the focus of the investigation (Bryman and Burgess, 1994) - resulting in 22 available research informants, who could speak English (second degree and trained overseas) and could account for their views in terms of the research orientation (Morse, 1994). The departments involved were Clinical Pharmacology; Diabetes and Endocrinology; Anaesthesia; Cardiovascular Medicine; Immunology; and Infectious Diseases (not all departments in the hospital were conducting research projects – those that did are included here). Each doctor was given a number and, using a random number approach, a group of twelve (12) was chosen (Onwuegbuzie and Leech, 2005a; Carrese, Mullaney and Faden, 2002). Twelve was chosen as this is the number considered optimum for a focus group (Vaughn, Schumm, and Sinagub, 1996; Johnson and Christensen, 2004) in order to provide expected levels of interaction. This was considered an appropriate sample size for this qualitative research orientation, as it is driven by the need to uncover all the main variants on a research conception (Kember and Kwan, 2000). Levine and Zimmerman (1996) suggest that a further important consideration in using the focus group was that this method innately acknowledges participants as experts and is designed to obtain perceptions for a defined research purpose in a positive, interactive environment (Krueger, 1994), as well as its ability to link with other qualitative methods effectively (Vaughn, Schumm, and Sinagub, 1996). This supports the notion that all included doctors were sufficiently experienced to provide appropriate opinion and context for the research orientation. The focus group was conducted in English and audio recorded for future analysis. The focus group interview took approximately one and a half hours and was later transcribed verbatim.
The conduct of the interviews followed a similar process to that used by Gray and Wilcox (1995), where the group was asked a small set of prepared questions modified through ancillary questioning (probes and follow-ups), in the same way as Balshem (1991).

The focus group outcome was initially coded manually, using Acrobat, according to themes and sub-themes that 'surfaced' from the interview dialogue, using a form of wide open-coding derived from Glaser (1992a) and Strauss and Corbin (1990) to construct the first-stage analysis. This treatment was also reinforced through deep and surface approaches (Gerbic and Stacey, 2005) and extended through thematic analysis conducted using the NVivo qualitative software package (Walsh, White, and Young, 2008), while using a mind-mapping sequencer to test connections and residual legacies. In total, 392 specific codes were developed, which in turn fitted into 9 main themes and 27 sub-themes. In this way, no portion of the focus group dialogue was left uncoded and the outcome represented the respondents' shared views and perspectives through an evolving coding sequence (Buston, 1999). Various themes were detected and acknowledged whilst using the NVivo qualitative software package, as well as from the application of manual coding and the constant comparative approach, where new instances were constantly compared to those already encountered in a theme until the derived category was saturated. This triple form of interrogation was an attempt to increase the validity of the choice of both key themes and sub-themes through a triangulation process. NVivo was further used to explore these sub-themes by helping to pull together each of them from across the interviews (Harwood and Garry, 2003). It was thus possible to capture the respondents' comments on each supported sub-theme and place them together for further consideration and analysis.
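The roll-up from individual coded extractions to theme and sub-theme counts (here, 392 codes into 9 main themes and 27 sub-themes) is mechanically simple. The sketch below uses invented data purely to show the counting pattern; it is not the study's actual codebook.

```python
from collections import Counter

# Hypothetical coded extractions: one (theme, sub_theme) pair per reference.
coded_refs = [
    ("Building Research Designs", "dynamic design"),
    ("Building Research Designs", "starting point"),
    ("Training Requirements", "methods training"),
    ("Coding Data", "coding anxiety"),
    ("Coding Data", "coding anxiety"),
]

# Reference counts per main theme and per (theme, sub-theme) pair.
theme_counts = Counter(theme for theme, _ in coded_refs)
sub_theme_counts = Counter(coded_refs)

print(theme_counts.most_common())
print(sub_theme_counts.most_common(1))
```

Tables 2 and 3 in this paper report exactly this kind of tally: reference counts per theme, with heavier-referenced elements read as carrying greater weight in the focus group.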

Presentation of Framework Outcomes

The research questions were mapped to the 9 generated major themes, and are supported by 27 sub-themes, as indicated in Table 1 - Major Themes, Research Informing Notions, and Informing Hospital Research Policy. The major themes are further discussed below. The outcomes of this research inquiry in terms of the 9 major themes, and the total number of references for these themes, are further indicated in Table 2. Figure 1 presents a model of the discussed main outcomes. These are further reduced to connect each major theme to a significant sub-theme where additional analysis was conducted, which revealed 4 key areas of outcomes (Figure 2).

Figure 1
Table 1 - Summary of Research Questions and Major Themes that Inform Hospital Research Policy

Discussion

The style adopted for reporting and illustrating the data is partially influenced by Gonzalez (2008), Carpenter (2008), and Daniels et al. (2007), and is formulated below, focusing on the main themes raised. The subsequent sub-themes are presented in Table 2. The discussion format used in this paper reflects the respondents' voice in a streamlined and articulated approach. Table 2, below, also shows the breadth of coded respondent illustrations/extractions used in the reporting of this research.

Figure 2
Table 2 - Main-theme observations (other sub-themes are considered of less persuasive significance)

Building Research Designs

One approach adopted by doctors was to use the software to build a research design as they went along. For example, one respondent (D5) suggested that ...the software has become powerful enough to start building as soon as the data becomes available. Another respondent (D11) further indicated that ...I see patterns almost as the data gets transcribed. This firmly suggests that the software isn't just a tool used in a passive way, but a living tool that transforms ideas and issues raised throughout the research engagement. In many ways, this has the effect of supporting the development of a dynamic research design that continues throughout the research loop. Another respondent (D9) stated that ...the software was an enormous help. I couldn't think of using qualitative methodology without it. In many respects, the building aspect of the research design appears to be one of the more difficult tasks to perform. For example, one respondent (D8) indicated that ...starting is possibly the biggest problem, but this software gives me some confidence. This was followed by another respondent (D6) who implied that ...the research design starts with the research questions and guides me through to this point. It is here that data meaning and discovery start to fashion the structure that helps to create the basis for research outcomes.

Test and Retest Question Issues

Since the questions have been underpinned from the literature using the research loop, there is still the need to test the derived data from respondents. In this respect, one respondent (D3) suggested that …it is just as important to check the questions and recheck these, as it is to build description thickness. Another respondent (D6) indicated that …the question shouldn’t be seen as finalized (at least in meaning) and therefore the testing of questions through the categories-themes is part of the language of rigour. However, another respondent (D12) highlighted that …there is a loose tension between the asked project questions and the resulting data. Not all data captured can be truly directly as a result of a specific question (for example, prompts)…

Visualise Research Needs

Visualising the research needs appeared to be a surprise factor, as the connection between research needs and research outcomes is often blurred. However, one respondent (D1) suggested …of course it has now become a monetary issue to ensure that research output relates to research needs. Another respondent (D8) indicated that …often today, we are expected to match government requirements or our funding is cut. This is reinforced by another respondent (D10) who stated that …there is now the need to be specific about what research we do, so that departmental requirements are focused on. This suggests that doctors are now moderated on the type of research they carry out – and this appears to be what they think supports government policy.

Training Requirements

Most doctors appeared to believe that they were not experienced in qualitative research methods and needed some form of training. This aspect is supported by one respondent (D3) who stated emphatically …we aren’t really trained in qualitative research methods. We pick it up as we go along. Another respondent indicated that …oh, yes indeed. We need the views of our patients, but we don’t expect to get close to them. No… In overall support of training, another respondent (D11) suggested that …if only we did get better training or that government shouldn’t always take notice of research that is only about numbers. In many respects, doctors appeared to agree that more training was necessary but did not know what sort of training or methodologies could help them. Further, government policy appeared to be a barrier to the development of qualitative research in health services, because support for this type of methodology was not expected by the doctors. For example, one respondent (D6) indicated support for this: …our research is validated by a panel who always asks for proven hypothesis.

Coding Data

Doctors who had used software for qualitative research tended to support the notion that software made research data easier to analyse. As one respondent (D2) indicated …coding is difficult. I tend to miss a lot of the crucial meaning. But I try. Another respondent (D7) suggested that …I like working with others, as it is overwhelming dealing with all that data on your own. Coding data appears to be the one step that is seen as central to the qualitative researcher’s task. As one respondent (D11) indicated …I know I need to code, but when I look at ways of doing this I get a little anxious. You know what I mean? It would appear that Code-based Theory Builders (those using NVivo v7-v9) have become the main QDAS treatment available, allowing in one form or another: thematic coding; language assessment (the terminology used in the data); the comparison and occurrence of words and phrases; quantitative content information from the data; and mapping tools to see relationships between developed themes and sub-themes, tags and categories.

Modeling Data

Modeling data is an art that is difficult for many researchers. It refers to the data contextualized through code and cemented together under a category or theme. As one respondent (D9) highlighted …sometimes its easy to develop a huge number of themes that don’t link and keep them like that. Other times, they are saying all the same things. It would seem, however, that few respondents were experienced with modeling data. For example, one respondent (D6) suggested that …it isn’t always clear to me how to use any kind of modeling. I can see the benefits, but the data – well, there’s just too much isn’t there? From the responses, it would appear that no respondent had attempted modeling data to any extent; this may reflect the respondents' inexperience in carrying out qualitative research, or that insufficient training has been provided to them. Within the context of this paper, it would seem that this unresolved issue consequently leaves the whole research process somewhat weakened.

Software Features

It would appear that doctors find that the qualitative software takes too long to learn and does not really benefit the patient in terms of medical outcomes. As one respondent (D4) stated …it really isn’t a method to support or deny policy, but to understand individual or small groups of patients’ needs better. Another respondent (D12) quickly supported this as …we need the research, but nobody takes any notice. So we tend to focus on more quantitative ways to address policy and management directions. Another respondent (D1) indicated that …the software is all so different. How am I expected to know which one to use, never mind how to use it properly?

Transparency of audit trail

It was recognised by many respondents that the audit trail was an important aspect of qualitative rigour. As one respondent (D7) indicated …quality is important to us. We have a quality system in place. So, it should also be the same for research. This was supported by another respondent (D5) who stated …oh yes. We need to show where we came from and what the research outcomes are based on. Yes, it’s very important to show this… Another respondent (D3) suggested …it’s difficult to achieve so that managers accept it. It is also more difficult for the researcher because I have to do more work when using qualitative research than if I was to use figures. It was clear from respondents that linkages through the research process were difficult to show, but they still had to try to show them. As one respondent (D9) stated …each research decision has to be shown. Each output has to be shown. Sometimes this is too difficult even though you may have the right idea.

Ethical Standards

Overall, ethical standards appeared to over-ride all research decisions. As one respondent (D8) stated …we have to be so secure and methodical. It is much more difficult to get a qualitative research project approved than a quantitative one. I think it’s easier to use Math than words, but that’s me… Another respondent (D12) further indicated …whatever research we do we have to abide by ethical rules. Qualitative research just makes it harder to comply. But it’s worth it in the end. This aspect of knowing and following hospital ethical regulations introduces an element of rigour outside of each research project. When asked about the rules that were followed to govern research, the typical answer, as depicted by one respondent (D10), was …we follow or we get into trouble – it’s that simple.

The three research questions (above) therefore surround the application and implications of using software in qualitative research. This is modelled in Figure 1, below.

Figure 3
Figure 1 – Research Outcomes – Main Themes

As can be seen from Figure 1, the doctors’ research views and research questions are mapped together to illustrate how mind-mapping software can show the linkages and respondent extractions. It is also noticeable from Figure 1 that the left side relates to the questions and software, while the right side relates to the doctors’ views about managing data and its interpretation whilst reflecting present hospital policies. Out of 392 coding extractions, Figure 1 underpins the major outcomes as: Q1 – 14; Q2 – 8; and Q3 – 7, which are considered appropriate and context sensitive.

What are the Implications for Hospital Research Policy?

Table 3, below, shows a summary of the main themes, sub-themes and the more useful coded extractions, totaling 197. This does not mean that more extractions are more important; but in the context of a focus group, more time and greater consideration was given to these elements, and they are therefore deemed to be of greater significance in terms of the outcomes for the respondents.

Figure 4
Table 3 - Summary of Main, Sub-themes and More Important Coded Extractions

For developing hospital research policy therefore, this paper will now address each main theme in terms of the 4 elements (personal, administrative, research tools and research process) as contained in the following model:

Figure 5
Figure 2 – Hospital Research Policy Considerations

Policy Implications for each highlighted area from Figure 2 are discussed below.

Administrative - When developing research policy, the hospital appears to focus on quasi-revenue-generating schemes that seek to better not only the lives of its patients but those of other patients elsewhere, and to reduce health risks (Mitchell et al., 1999). Naturally, much of the funding-financial association is through external means (Medina, Lavado and Cabrera, 2005) - mostly personal contacts through the medical supply system - and this puts immense pressure on hospital staff to follow not only ethical standards but also the pre-determined process requirements (Nooteboom, 1999) - publication; copyright; ownership outcomes; research and development; good clinical practice; clinical trials agreements etc. - affecting their ability to develop, construct and publicise their research outcomes through a seemingly rigorous empirical and ethical design (Denzin and Lincoln, 2005b). Consequently, the research orientation (divided into three major areas – personal or group research; student research practices; and clinical trials of medicinal products or commercialisation research) has a dramatic effect on the efficacy of the arrangements to produce trustworthy outcomes that can help underpin hospital research policy (Malhotra, Gosain and El Sawy, 2005), and has been primarily associated with the functionalist paradigm (Stowell and Mingers, 1997) or production approach (Averch, 1989) dealing with organisational innovation and competitiveness (Porter, 1990). Clinical trials research poses threats, as contracts are considered expensive and impossible to manage effectively because they are complex and variable (Nooteboom, 1999); although the need for transparency gives hospital researchers help in following research protocols. As such, this provides hospital research policy an easier and perhaps more effective way for research to be initiated in the hospital, as well as providing much-needed extra income/revenue and organisational competitiveness (McAdam and Keogh, 2004) through research.
This stresses the importance of a research interdependence process (Coughlan, Stern and El-Ansary, 2001) and of research innovation (Roehrich, 2004) between formal clinical trial research and the personal research practices and activities involved in multi-disciplinary hospital-wide research programmes.

For example, when a qualitative method is used to help visualise research needs, the process and outcomes may not match what would be expected under a quantitative orientation. Training, specifically additional training in qualitative methods, may be an added burden on hospital research engagement if the costs associated with this form of methodology cannot be offset. However, because of the bureaucratic culture of the hospital in general, and the need to follow quality management protocols, the audit trail should be very transparent, and this should help engineer greater acceptance of qualitative outcomes in a financially oriented culture whilst building in research-oriented best practices (Cooper, 1999).

Research training is not taken as seriously as expected, and those doctors already engaging in research have the tools and experience to conduct it, as most research is non-mission-critical (Kostoff, 1993c). For mission-critical clinical trials, however, experience and research methodology knowledge appear to provide confidence in managing these large-capacity processes. It is probably the lack of appropriate research experience among doctors that reduces the hospital's capability to conduct more mainstream research - through treatment, supportive care or prevention trials (Shweta, Vidhi and Satyaendra, 2007).

Research Tools – Software features appear to be an issue. More software features typically mean a higher purchase price and licensing cost, and the more features there are, the more likely there is an additional cost in training and learning time. However, better tools (Loewe and Dominiquini, 2006) that facilitate research (Kandampully, 2002) may also mean more effective engagement in the research process, and therefore better-quality data and analysis (Pissarra and Jesuino, 2005), resulting in more effective research policy engagement for the hospital. This translates research into practice through performance examination, emergency preparedness, improved organisational effectiveness and enhanced care practices.

Research Process – Difficulties in building appropriate research designs and controlling multi-research-oriented processes often result in a narrow range of research methodologies being used, along with other mechanistic methods of data collection (Pope and Mays, 2006). This, however, may be directly related to the research culture (Ahmed, 1998) and the way in which it is utilised (Read, 2000). Consequently, “doctors as researchers” have become accustomed to these narrow orientations and steer away from perceived riskier (but more effective) research methodologies. For example, if hospital policy requires specified financial requirements to be met, a methodology perceived as not providing substantive evidence for those requirements will not be used - irrespective of its usefulness in getting at the “truth”. However, the learning organisation seeks out new knowledge and innovations (Pavitt, 2002) through appropriate research practices (Subramanian and Youndt, 2005), and hospital research policy is therefore an important and strategic mechanism to operationalise (Brennan and Dooley, 2005). Testing and retesting research questions while visibly implementing research policy appears a difficult task, but a necessary one for building research capacity (Campbell et al., 1999). Using and applying qualitative software in research often requires data to be coded and modelled. Respondents stated that QDAS packages can cause the loss of the richness of qualitative data (Gilbert, 2002), have a significant learning curve (Blank, 2004) and present a technological barrier, as the software may reduce control over the data analysis (Catterall and Maclaran, 1998). It was also suggested that QDAS packages encourage a mechanistic approach and create directional bias (Welsh, 2002) in the analysis and presentation of data (David and Sutton, 2004).
The role of the QDAS package is therefore seen by some doctors as imposing a “framework of operation”, which some qualitative researchers may perceive as a mechanical influence on the way they conduct qualitative analysis and apply qualitative techniques (Richards, 2002), and on the ways of presenting the relationships among the developed codes, and between codes and linked text, through modelling data. For example, some QDAS packages allow only the representation of non-hierarchical relationships, while others emphasise the relationship between adopted codes and cases rather than between adopted codes and portions of targeted text (Rubin and Rubin, 2005). This suggests that “doctors as researchers” need to be aware of the package's coding influence and of how the package creates boundaries around the researcher's conduct of the analysis. Otherwise, some researchers may limit their models visually and reduce their conceptually engaged categories as they become lost in the package's technical constraints. This is where the power of mind-mapping tools offers doctors an easier path to understanding the relationships between developed themes. No QDAS package should therefore be considered apolitical; each has its own demonstrated bias of which researchers need to be aware.

Personal - Ethical standards appear to be very high on the list of hospital researchers' behaviours. In essence, this means greater intolerance of processes that move out of line, and the research process is therefore likely to be considered more trustworthy. Further, in a qualitative setting, delivering outcomes may help enrich hospital research policy by creating diversified research processes. Respondents appear to share their research (Kazanjian, Drazin and Glynn, 2000) in order to disseminate outcomes, in line with Kok, Jongedijk and Troost (2003), but may not do so as widely as is expected in other industries (Gibbert, Leibold and Probst, 2002). However, only selective research information needs to be released to staff (von Krogh, Nonaka and Aben, 2001), and always according to the research rationale, protocols and ethical standpoints. Of further importance is the motivation of staff (Rivas and Gobeli, 2005) to engage in research, which can be tested by the management style (Hyland and Beckett, 2005) and by management's adoption and support of appropriate research-oriented policies as part of the whole corporate strategy (Balogun and Hailey, 2004).

Possible Benefits of the Application of QDAS to Doctors' Research

When mapping the benefits of one QDAS package against the researcher's data needs, the data always took precedence - corresponding to the methodological orientation. In this way, the QDAS package can be seen as an instrument that helps “doctors as qualitative researchers” analyse data streams more effectively (Welsh, 2002) and reduce data into more manageable segments through categorisation and hierarchical relationship building (Atherton and Elsmore, 2007). Notably, the data indicated that the original format of the data available for analysis will often determine which QDAS package is more useful. Inevitably, compromises are made: the data are transformed and revised to fit each particular package's operating parameters, as each software package ultimately provides a different generalised framework that “doctors as qualitative researchers” need to learn and accommodate methodologically. The choice of QDAS package also depends on how the researcher expects to undertake the data analysis (Weitzman, 2000), since the qualitative researcher remains the main tool for analysis regardless of whether QDAS is employed in the data analysis (Denzin and Lincoln, 2005b) or elsewhere in the project. Consequently, a doctor's approach is bound to treat the software package as a tool.

One of the first benefits is that using software speeds up analysis tasks considerably. This was recognised by many doctors: utilising QDAS packages to search for important phrases, create and code text segments, cross-reference memos to text and codes, and create cross-references between parts of the text can be carried out efficiently and swiftly. This did not seem to compromise the feeling of keeping close to the data (contrary to Barry, 1998), as many qualitative researchers require, because the QDAS package does not carry out any kind of independent qualitative analysis. Over time and with appropriate experience, reliance on hand-built and manually adopted schemes of qualitative analysis will retreat, and more functional and flexible QDAS packages will take their place.
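As a rough illustration of why such mechanical tasks are fast to automate, the sketch below auto-codes every sentence containing a search phrase. The auto_code helper and the transcript snippets are hypothetical, invented for illustration; real QDAS packages perform phrase search and coding interactively, and the researcher still decides what the coded segments mean.

```python
# Toy phrase search and auto-coding over interview transcripts.
# Hypothetical helper -- not a feature of any specific QDAS package.
import re

def auto_code(transcripts, phrase, code_name):
    """Return coded segments: each sentence containing `phrase`
    (case-insensitive), tagged with the document id and code name."""
    segments = []
    for doc_id, text in transcripts.items():
        # Naive sentence split on terminal punctuation.
        for sentence in re.split(r"(?<=[.!?])\s+", text):
            if phrase.lower() in sentence.lower():
                segments.append(
                    {"doc": doc_id, "code": code_name, "text": sentence}
                )
    return segments

# Invented example transcripts.
transcripts = {
    "doctor_01": "Training takes time. The software cost is a concern.",
    "doctor_02": "With training, the audit trail became transparent.",
}

hits = auto_code(transcripts, "training", "Research training")
```

Here `hits` contains one coded segment from each transcript: the matching sentence, which document it came from, and the code applied, i.e. the raw material a researcher would then review rather than a finished analysis.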

Other major benefits of using QDAS are interactivity and collaboration (Asensio, 2000), which enhance flexibility throughout the research process (Lewins and Silver, 2007). These allow the researcher to focus on the meaning of the data streams through digital convergence (Brown, 2002), whilst providing adequate transparency through an audit trail (St. John and Johnson, 2000).

Possible Limitations Associated with the Use of QDAS

There is, however, nothing more unequal than the equal treatment of QDAS packages across different qualitative approaches. This appears to be more of an issue for novice or first-time qualitative researchers, who often lack a critical perspective (Mangabeira, Lee and Fielding, 2004) on the effect of QDAS packages on qualitative processes and methodologies.

Further, it would appear to many doctors that preliminary structuring of the research arena - based on the outcomes of a literature review - is often tenuous, signalling inadequate research design and inadequate attention to the literature, and leading to poorly structured analytic outcomes that are supported neither by the literature nor by the adopted analysis techniques. In this way, “doctors as qualitative researchers” are drawn into thinking that their methodological process is robust and secure. To mitigate this, coding should be continuously reviewed and rechecked (Loxley, 2001), and planned and supported using mind-mapping techniques. QDAS applications appear to doctors to separate the real data from the reporting, where quotations from interviews are often used as a substitute for analysis (Gibbs, 2002) and to impart a ‘gloss of rigour’ to reporting qualitative research outcomes. With some QDAS packages, the standing research model is effectively nullified by losing the details of its past form, from which the new, revised model was created. The use of mind-mapping techniques, and of packages such as NVivo, reduces such issues by providing connection and visual tracking through each new model development as the research progresses.

There was some disagreement at a theoretical level as to whether the use of QDAS packages may make qualitative data more quantitative in nature. However, Hwang (2008) indicated that this may simply be an attempt to make the research process more effective through transparency, enhancing the reliability of the data analysis processes using the visual and deductive elements of various QDAS packages. This outcome supports the notion of applying ethical standards and utilising the transparency of the audit trail.

Other QDAS dilemmas raised by doctors include the pacing of data collection, the volume of data, the procedure and rigour of data analysis, the framing of the ensuing analysis and the software product itself (Glaser, 2001). However, the qualitative researcher should always remain in control of the data analysed, and some QDAS packages provide more effective ways for this to happen. Fear, lack of knowledge, when to end, when to use other methods, ease of use, cross-case thematic analysis, software complexity, availability and price, features and flexibility, triangulation, speed, and the presentation of result outcomes were each depicted in different ways. Close attention to the effect of each QDAS package on similar data sets may help determine the rigour of the adopted qualitative research method.

Conclusion

The use of QDAS packages in qualitative research is gaining ground, and for some “doctors as qualitative researchers” they offer speed and flexibility in assessing and analysing large volumes of generated data. QDAS packages are currently utilised by small numbers of qualitative researchers who attempt to use technology to explore and make sense of qualitative data. However, this is about to change, as QDAS packages such as NVivo have now been developed to a point where their usefulness and technical capabilities can enhance qualitative researchers' primary investigation, methodology and data analysis. The application of QDAS packages by “doctors as qualitative researchers”, and its influence on hospital research policy, has gone beyond data analysis alone, provided doctors follow ethical guidelines in conducting research. This will also give greater support to clinical doctors engaging in qualitative evidence-based research, leading to more effective primary care solutions.

References

r-0. Ahmed PK: Culture and climate for innovation. European Journal of Innovation Management 1998, 1:30-43.
r-1. Asensio M: Choosing NVivo to support phenomenographic research in networked learning. Proceedings of a symposium conducted at the meeting of the Second International on Networked Learning, 2000 April, Lancaster, England.
r-2. Atherton A, Elsmore P: Structuring qualitative enquiry in management and organization research: A dialogue on the merits of using software for qualitative data analysis. Qualitative Research in Organizations and Management 2007, 2(1):62-77.
r-3. Averch H: Exploring the cost-efficiency of basic research funding in chemistry. Research Policy 1989, 18:165-172.
r-4. Balshem M: Cancer, control and causality: Talking about cancer in a working class community. American Ethnologist 1991, 18:152-172.
r-5. Balogun J, Hailey VH: Exploring Strategic Change (2nd Edn). Essex: FT, Prentice Hall, UK; 2004.
r-6. Barry CA: Choosing qualitative data analysis software: Atlas.ti and Nudist compared. Sociological Research Online 1998, 3.
r-7. [http://www.socresonline.org.uk/socresonline/3/3/4.html]
r-8. Bassett R: Qualitative data analysis software: Addressing the debates. Journal of Management Systems 2004, 15:33-39.
r-9. Bazeley P: The contribution of computer software to integrating qualitative and quantitative data and analyses. Research in the Schools 2006, 13(1):64-74.
r-10. Blank G: Teaching qualitative data analysis to graduate students. Social Science Computer Review 2004, 22(2):187-196.
r-11. Brennan A, Dooley L: Networked creativity: a structured management framework for stimulating innovation. Technovation 2005, 25(12):1388-1399.
r-12. Broom A: Using Qualitative Interviews in CAM Research: A Guide to Study design, Data Collection, and Data Analysis. Complementary Therapies in Medicine 2005, 13(1):65-73.
r-13. Brown D: Going digital and staying qualitative: Some alternative strategies for digitizing the qualitative research process. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research 2002, 3(2).
r-14. [http://www.qualitativeresearch.net/fqs-texte/2-02/2-02brown-e.htm]
r-15. Brown LK, Reynolds LA, Brenman AJ: Out of focus: Children’s conceptions of AIDS. Journal of Health Education 1994, 25(4):204–208.
r-16. Bryman A, Burgess RG: Analyzing Qualitative Data, London: Routledge, UK; 1994.
r-17. Buston K: NUD*IST in action: its use and its usefulness in a study of chronic illness in young people. In: Bryman A, Burgess RG (editors). Qualitative Research. Vol. 3, Analysis and Interpretation of Qualitative Data. London: Sage, UK; 1999:183-202.
r-20. Campbell SM, Roland M, Bentley E, Dowell J, Hassall K, Pooley J, Price, H: Research capacity in UK primary care. British Journal of General Practice 1999, 49(449):967-970.
r-21. Carman JM: Consumer Perceptions of Service Quality: An Assessment of the SERVQUAL Dimensions. Journal of Retailing 1990, 66(1):33-55.
r-22. Carpenter J: Metaphors in Qualitative Research: Shedding Light or Casting Shadows? Research in Nursing & Health 2008, 31(3):274–282.
r-23. Carrese JA, Mullaney JL, Faden RR: Planning for death but not serious future illness: Qualitative study of household elderly patients. British Medical Journal 2002, 325(7356):125-130.
r-24. Cassell C, Symon G: Essential Guide to Qualitative Methods in Organizational Research, London: Sage, UK; 2004.
r-25. Catterall M, Maclaran P: Using computer software for the analysis of qualitative market research data. Journal of the Market Research Society 1998, 40(3):207-222.
r-26. Cayla J, Eckhardt GM: Asian brands without borders: regional opportunities and challenges. International Marketing Review 2007, 24(4):444-456.
r-27. Cooper RG: From experience – the invisible success factors in product innovation. Journal of Product Innovation Management 1999, 16(2):115-133.
r-28. Coughlan AT, Stern LW, El-Ansary ID: Marketing Channels, Prentice-Hall, Englewood Cliffs, NJ, US; 2001.
r-29. Daniels et al: The Successful Resolution of Armed Hostage/Barricade Events in Schools: A Qualitative Analysis. Psychology in the Schools 2007, 44(6):601-613.
r-30. David M, Sutton CD: Social Research: the basics, London: Sage, UK; 2004.
r-31. Denzin NK, Lincoln YS: Introduction: The discipline and practice of qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage Handbook of Qualitative Research. (3rd ed.). Thousand Oaks, CA: Sage, US; 2005b.
r-32. Dey I: Qualitative Data Analysis: A user-friendly guide for social scientists. London: Routledge, UK; 1993.
r-33. Fielding NG, Lee, RM: New patterns in the adoption and use of qualitative software. Field Methods 2002, 14(2):197-216.
r-34. Gerbic P, Stacey E: A purposive approach to content analysis: Designing analytical frameworks. The Internet and Higher Education 2005, 8(1):45-59.
r-35. Gibbert M, Leibold M, Probst G: Five styles of customer knowledge management, and how smart companies use them to create value. European Management Journal 2002, 20(5):459–469.
r-36. Gibbs GR: Qualitative data analysis: Explorations with NVivo. Buckingham, England: Open University Press, UK; 2002.
r-37. Gilbert LS: Going the distance: ‘closeness’ in qualitative data analysis software, International Journal of Social Research Methodology 2002, 5(3):215-228.
r-38. Glaser BG: Basics of grounded theory analysis: Emergence vs. forcing. Mill Valley, CA: Sociology Press, US; 1992a.
r-39. Glaser BG: The Grounded Theory Perspective: conceptualization contrasted with description. Mill Valley, California: The Sociology Press, US; 2001.
r-40. Glaser BG: Remodeling Grounded Theory. The Grounded Theory Review: An International Journal 2004, 4(1):1-24.
r-41. Glaser BG, Strauss AL: The Discovery of Grounded Theory: Strategies for qualitative research. Chicago: Aldine, US; 1967.
r-42. Gonzalez C: Conceptions of, and approaches to, teaching online: a study of lecturers teaching postgraduate distance courses. Higher Education 2008, 57(3):299-314.
r-43. Gray J, Wilcox B: Good Schools, Bad Schools, Open School Press, UK; 1995.
r-44. Harrel GD, Fors MF: Marketing services to satisfy internal customers. Logistics Information Management 1995, 8(4):22-27.
r-45. Harwood TG, Garry T: An overview of content analysis. The Marketing Review 2003, 3(4): 479-498.
r-46. Hatch JA: Doing qualitative research in education settings. Albany: SUNY Press, US; 2002.
r-47. Hwang S: Utilizing qualitative data analysis software: A review of Atlas.ti. Social Science Computer Review 2008, 26(4):519-527.
r-48. Hyland P, Beckett R: Engendering an innovative culture and maintaining operational balance. Journal of Small Business and Enterprise Development 2005, 12(3):336-352.
r-49. Johnson RB, Christensen LB: Educational research: Quantitative, qualitative, and mixed approaches. Boston, MA: Allyn and Bacon, US; 2004.
r-50. Kandampully J: Innovation as the core competency of a service organisation: the role of technology, knowledge and networks. European Journal of Innovation Management 2002, 5(1):18-26.
r-51. Kazanjian R.K, Drazin R, Glynn MA: Creativity and technological learning: the roles of organization architecture and crisis in large-scale projects. Journal of Engineering and Technology Management 2000, 17(3–4):273–298.
r-52. Kearns RA: Being There: Research Through Observing and Participating. In Qualitative Research Methods in Human Geography. Edited by Hay I. Oxford: Oxford University Press, UK; 2000:103-121.
r-53. Kember D, Kwan K: Lecturers’ approaches to teaching and their relationship to conceptions of good teaching. Instructional Science 2000, 28(5):469–490.
r-54. Kok G, Jongedijk S, Troost J: Insights from KPMG’s European Knowledge Management Survey 2002/2003. Amsterdam: KPMG Knowledge Advisory Services, The Netherlands; 2003.
r-55. Kostoff RN: Semi-quantitative methods for research impact assessment. Technological Forecasting and Social Change 1993c, 44(3):231-244.
r-56. von Krogh G, Nonaka I, Aben M: Making the most of your company’s knowledge: a strategic framework. Long Range Planning 2001, 34(4):421-439.
r-57. Krueger R: Focus Groups: A Practical Guide for Applied Research. Thousand Oaks: Sage, CA, US; 1994.
r-58. Lee RM, Esterhuizen L: Computer software and qualitative analysis: Trends, issues, and responses. International Journal of Social Research Methodology 2000, 3(3):231-243.
r-59. Levine IS, Zimmerman JD: Using qualitative data to inform public policy: Evaluating Choose to De-Fuse. American Journal of Orthopsychiatry 1996, 66(3):363-377.
r-60. Lewins A, Silver C: Using Software in Qualitative Research: A step-by-step guide. London: Sage Publications, UK; 2007.
r-61. Loewe P, Dominiquini J: Overcoming the barriers to effective innovation. Strategy and Leadership 2006, 34(1):24-31.
r-62. Loxley W: Drowning in words? Using NUDIST to assist in the analysis of long interview transcripts from young injecting drug users. Addiction Research and Theory 2001, 9(6):557-573.
r-63. Malhotra A, Gosain S, El Sawy OA: Absorptive capacity configurations in supply chains: gearing for partner-enabled market knowledge creation. Mis Quarterly 2005, 29(1):145-187.
r-64. Mangabeira WC, Lee RM, Fielding NG: Computers and qualitative research: Adoption, use, and representation. Social Science Computer Review 2004, 22(2):167-178.
r-65. McAdam R, Keogh W: Transitioning towards creativity and innovation measurement in SMEs. Creativity and Innovation Management 2004, 13(2):126-139.
r-66. Medina CC, Lavado CA, Cabrera RV: Characteristics of innovative companies: a case study of companies in different sectors. Creativity and Innovation Management 2005, 14(3):272-287.
r-67. Meyrick J: What is Good Qualitative Research? A First Step towards a Comprehensive Approach to Judging Rigour/Quality. Journal of Health Psychology 2006, 11(5):799-808.
r-68. Miles MB, Huberman AM: Qualitative Data Analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage, US; 1994.
r-69. Mitchell V, Davies F, Moutinho L, Vassos V: Using neural networks to understand service risk in the holiday product. Journal of Business Research 1999, 46(2):167-180.
r-70. Morse JM: Designing funded qualitative research. In N. K. Denzin and Y. S. Lincoln (Eds.), Handbook of qualitative research. Thousand Oaks, CA: Sage, US; 1994:220-235.
r-71. Morse JM, Richards L: Read me first for a user’s guide to qualitative methods. Thousand Oaks, CA: Sage, US; 2002.
r-72. Nooteboom B: Inter-Firm Alliances: Analysis and Design, Routledge, London, UK; 1999.
r-73. Onwuegbuzie AJ, Leech NL: Enhancing the interpretation of “significant” findings: The role of mixed methods research. The Qualitative Report 2004, 9:770-792. [http://www.nova.edu/ssss/QR/QR9–4/ onwuegbuzie.pdf]
r-74. Onwuegbuzie AJ, Leech NL: Taking the “Q” out of research: Teaching research methodology courses without the divide between quantitative and qualitative paradigms. Quality & Quantity: International Journal of Methodology 2005a, 39(3):267-296.
r-75. Patton MQ: Qualitative Research and Evaluation Methods. (3rd ed.). Thousand Oaks, CA: Sage Publications, US; 2002.
r-76. Peace R: Computers, Qualitative Data and Geographic Research. In Qualitative Research Methods in Human Geography. Edited by Hay I. Oxford: Oxford University Press, UK; 2000:144-160.
r-77. Pavitt K: Innovating routines in the business firm: what corporate tasks should they be. Industrial and Corporate Change 2002, 11(1):117-133.
r-78. Pissarra J, Jesuino JC: Idea generation through computer-mediated communication: the effects of anonymity. Journal of Managerial Psychology 2005, 20(3-4):275-291.
r-79. Pope C, Mays N: Qualitative methods in health research. In: Pope C, Mays N (editors). Qualitative Research in Health Care (3rd edition). Malden (MA): Blackwell Publications/BMJ Books, US; 2006.
r-80. Porter ME: The Competitive Advantages of Nations. London, MacMillan Press, UK; 1990.
r-81. QSR International. Introducing NVivo 9, 2010.
r-82. [http://www.qsrinternational.com/products_nvivo.aspx]
r-83. Read A: Determinants of successful organisational innovation: a review of current research. Journal of Management Practice 2000, 3(1):95-119.
r-84. Richards T: An intellectual history of NUD*IST and NVivo. International Journal of Social Research Methodology 2002, 5(3):199-214.
r-85. Rivas R, Gobeli DH: Accelerating innovation at Hewlett-Packard. Research Technology Management 2005, 48(1):32-39.
r-86. Roehrich G: Consumer innovativeness: concepts and measurements. Journal of Business Research 2004, 57(6):671-677.
r-87. Rubin HJ, Rubin IS: Qualitative interviewing: The art of hearing data. (2nd ed.). Thousand Oaks, CA: Sage, US; 2005.
r-88. Ryan GW, Bernard HR: Data management and analysis methods. In N. K. Denzin & Y. S. Lincoln (Eds.) Collecting and interpreting qualitative materials. Thousand Oaks, CA: Sage Publications, US; 2003.
r-89. Shweta K, Vidhi D, Satyaendra S: An overview of challenges and dire need of clinical trials. Calicut Medical Journal 2007, 5(3), e2.
r-90. Stowell F, Mingers J: Information systems: An emerging discipline? - Introduction. In J. Mingers & F. Stowell (Eds.), Information systems: An emerging discipline? London: McGraw-Hill, UK; 1997:1-15.
r-91. St. John W, Johnson P: The pros and cons of data analysis software for qualitative research. Journal of Nursing Scholarship 2000, 32(4):393-397.
r-92. Strauss A, Corbin J: Basics of qualitative research. Newbury Park, CA: Sage, US; 1990.
r-93. Strauss A, Corbin J: Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. (2nd ed.). Thousand Oaks, CA: Sage, US; 1998.
r-94. Subramaniam M, Youndt MA: The influence of intellectual capital on the types of innovative capabilities. Academy of Management Journal 2005, 43(3):450-463.
r-95. Ulin PR, Robinson ET, Tolley EE: Qualitative methods in public: A field guide for applied research. San Francisco: Jossey-Bass, US; 2005.
r-96. Vaughn S, Schumm JS, Sinagub J: Focus group interviews in education and psychology. Sage: London, UK; 1996.
r-97. Walsh SP, White KM, Young RM: Over-connected? A qualitative exploration of the relationship between Australian youth and their mobile phones. Journal of Adolescence 2008, 31:77-92.
r-98. Webb C: Analysing Qualitative Data: Computerized and Other Approaches. Journal of Advanced Nursing 1999, 29(2):323-330.
r-99. Weitzman EA: Software and Qualitative Research. In N. K. Denzin & Y.S. Lincoln (Eds.), Handbook of Qualitative Research (803-820). Thousand Oaks, CA: Sage, US; 2000.
r-100. Welsh E: Dealing with data: Using NVivo in the qualitative data analysis process. Forum Qualitative Sozialforschung / Forum: Qualitative Social Research 2002, 3(2), May. [http://www.qualitative-research.net/fqs-texte/2-02/2-02welsh-e.htm]
r-101. Williams M, Mason B, Renold E: Using computers in qualitative research: A review of software packages. Building Research Capacity 2004, February, 7:4-7.

Author Information

Paul TJ James
Graduate School, Qualitative Studies - Health Care, Bangkok University

© 2013 Internet Scientific Publications, LLC. All rights reserved.