International Journal of Education
ISSN  1948-5476
2010, Vol.  2, No. 2: E13



Describing and Illustrating Data Analysis in Mixed Research



Julie P. Combs (Associate Professor) & Anthony J. Onwuegbuzie (Professor)
Department of Educational Leadership and Counseling
Sam Houston State University, USA
E-mail: tonyonwuegbuzie@aol.com

Abstract

In  this  methodological  paper,  the  authors  propose  a  tool  that  brings  together  various quantitative  and   qualitative  data  analysis  (i.e.,  mixed  analysis)  techniques  into  one meta-framework to assist mixed researchers (who use qualitative and quantitative approaches within  the  same  study)  in  the  data  analysis   phase  of  mixed  research  studies.  A meta-framework for mixed analysis techniques is described, which  incorporates 13 criteria that methodologists have used to create their mixed analysis typologies. In  particular, a heuristic example is used with the aid of screenshots to illustrate how one can utilize several of these data analysis techniques to conduct mixed analyses.

Keywords:  Mixed  research,  Mixed  methods  research,  Quantitative  research,  Qualitative research, Mixed analysis, Analysis screenshots



1. Mixed Research

Mixed research, the third methodological paradigm—alongside qualitative and quantitative research—involves “mix[ing] or combin[ing] quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study” (Johnson & Onwuegbuzie,
2004, p. 17). Because of its complexity relative to qualitative and quantitative research, one of the more challenging steps in the mixed research process is that of analyzing data. Mixed researchers  have  to  be  competent  in  utilizing  quantitative  and  qualitative  data  analysis techniques or employ team members (i.e., co-researchers) who can conduct several types of analyses.  To  assist  mixed  researchers,  Onwuegbuzie  and  Combs  (2010)  developed  an inclusive framework for mixed analyses. In the first section of this article, we describe their inclusive framework. In the second part, we provide a heuristic example to illustrate, using screenshots, how one can utilize this framework to conduct mixed analyses.
2. Meta-Framework for Mixed Analysis Techniques
Since Greene, Caracelli, and Graham’s (1989) seminal article a little more than 20 years ago, several mixed analysis techniques have emerged. In particular, there have been numerous articles (e.g., Bazeley, 1999, 2003, 2006; Caracelli & Greene, 1993; Chi, 1997; Datta, 2001; Greene, 2008; Happ, DeVito Dabbs, Tate, Hricik, & Erlen, 2006; Jang, McDougall, Pollon,
& Russell, 2008; Lee & Greene, 2007; Li, Marquart, & Zercher, 2000; Onwuegbuzie, 2003; Onwuegbuzie & Collins, 2009; Onwuegbuzie & Combs, 2009a; Onwuegbuzie & Dickinson,
2008; Onwuegbuzie & Leech, 2004, 2006; Onwuegbuzie, Slate, Leech, & Collins, 2007,
2009; Onwuegbuzie & Teddlie, 2003; Sandelowski, 2000, 2001; Teddlie, Tashakkori, & Johnson, 2008; West & Tulloch, 2001) and chapters in seminal mixed research books (e.g., Bazeley, 2009; Creswell & Plano Clark, 2007, 2010; Greene, 2007; Johnson & Christensen, 2008; Rao & Wolcock, 2003; Tashakkori & Teddlie, 1998; Teddlie & Tashakkori, 2009; Todd, Nerlich, McKeown, & Clarke, 2004). These articles and book chapters have been instrumental in providing mixed analysis strategies for mixed researchers. However, these strategies typically have been presented in an isolated manner as standalone techniques with little or no interaction with other mixed analysis techniques. Indeed, as Greene (2008) observed, to date, despite the extensiveness of the field of mixed analysis, “this work has not yet cohered into a widely accepted framework or set of ideas” (p. 14). As such, it is clear that an integrated, interactive framework is needed that provides mixed researchers with a map of the mixed analytical landscape.
In developing their inclusive and interactive framework, Onwuegbuzie and Combs (2010) used classical content analysis (Berelson, 1952) to review mixed research articles in which authors developed typologies for mixed analysis strategies (e.g., Bazeley, 1999, 2003, 2006,
2009; Caracelli & Greene, 1993; Chi, 1997; Creswell & Plano Clark, 2007, 2010; Datta, 2001; Greene, 2007, 2008; Greene et al., 1989; Happ et al., 2006; Li et al., 2000; Onwuegbuzie,
2003; Onwuegbuzie, Collins, & Leech, in press; Onwuegbuzie & Dickinson, 2008; Onwuegbuzie & Leech, 2004; Onwuegbuzie et al., 2007, 2009; Onwuegbuzie & Teddlie,
2003; Sandelowski, 2000, 2001; Tashakkori & Teddlie, 1998; Teddlie & Tashakkori, 2009; Teddlie et al., 2008; West & Tulloch, 2001). Their analysis revealed the following 13 criteria
that the aforementioned authors have used to create their mixed analysis typologies:




1.  rationale/purpose for conducting the mixed analysis
2.  philosophy underpinning the mixed analysis
3.  number of data types that will be analyzed
4.  number of data analysis types that will be used
5.  time sequence of the mixed analysis
6.  level of interaction between quantitative and qualitative analyses
7.  priority of analytical components
8.  number of analytical phases
9.  link to other design components
10. phase of the research process when all analysis decisions are made
11. type of generalization
12. analysis orientation
13. cross-over nature of analysis
2.1 Criterion 1: Rationale/Purpose for Conducting the Mixed Analysis

Greene et al. (1989) conceptualized a typology for mixed methods purposes/designs that involves the following five purposes: triangulation, complementarity, development, initiation, and  expansion.  Applying  these  to  mixed  analysis  decisions,  when  triangulation  is  the rationale for conducting the mixed analysis, the researcher would compare findings from the qualitative data with the quantitative results. If complementarity is noted as the purpose for the mixed analysis, then the researcher would seek elaboration, illustration, enhancement, and clarification of the findings from one analytical strand (e.g., qualitative) with results from the other analytical strand (e.g., quantitative). When development is identified as the purpose, then the researcher would use the results from one analytical strand to help inform the other analytical  strand.  With  initiation  as  a  rationale  for  performing  a  mixed  analysis,  the researcher would look for paradoxes and contradictions that emerge when findings from the two  analytical  strands  are  compared.  Such  contradictions  might  lead  to  new  research questions. Finally, with expansion as a purpose, the researcher would attempt to expand the breadth and range of a study by using multiple analytical strands for different study phases.

2.2 Criterion 2: Philosophy Underpinning the Mixed Analysis

In mixed research, researchers from all paradigmatic traditions potentially can utilize both quantitative and qualitative analyses (Bazeley, 2009), depending on their research questions. As such, philosophical assumptions and stances can play a role in the analytical decisions made. Onwuegbuzie et al. (in press) identified the following 12 philosophical belief systems that characterize mixed research: pragmatism-of-the-middle philosophy (Johnson & Onwuegbuzie, 2004), pragmatism-of-the-right philosophy (Rescher, 2000), pragmatism-of-the-left philosophy (Maxcy, 2003), the anti-conflationist philosophy (Roberts, 2002), the critical realist orientation (McEvoy & Richards, 2006), the dialectical stance (Greene, 2008; Greene & Caracelli, 1997), the complementary strengths stance (Morse, 2003), the transformative-emancipatory stance (Mertens, 2003), the a-paradigmatic stance (Reichardt & Cook, 1979), the substantive theory stance (Chen, 2006), the communities of practice stance (Denscombe, 2008), and, most recently, dialectical pragmatism (Johnson, 2009). Philosophical belief systems influence the mixed analysis strategies used. (For additional information about mixed
methods paradigms/worldviews, see Onwuegbuzie et al., in press; Onwuegbuzie, Johnson, & Collins, 2009.)

2.3 Criterion 3: Number of Data Types That Will Be Analyzed

Mixed data analysis can involve both qualitative and quantitative data (Creswell & Plano Clark,  2007,  2010). Conversely,  mixed  analysis  can  occur  with  just  one  data  type (Onwuegbuzie et al., 2007). For example, according to Onwuegbuzie et al., if the data type is qualitative then the first phase of the mixed analysis would be qualitative and in the second phase, data would be converted into a quantitative form or quantitized (i.e., transformed into numerical codes that can be analyzed statistically; Miles & Huberman, 1994; Tashakkori & Teddlie, 1998). Conversely, quantitative data, after being subjected to a quantitative analysis, can then be qualitized (i.e., transformed into narrative data that can be analyzed qualitatively; Tashakkori & Teddlie, 1998).
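To make these conversions concrete, the following sketch (in Python with pandas, using entirely hypothetical statements, scores, and cut-points) illustrates one simple way to quantitize qualitative theme codes into binary indicators and to qualitize numeric scores into narrative categories; it illustrates the general idea rather than a prescribed procedure.

```python
import pandas as pd

# Hypothetical unitized interview statements and the theme each was assigned to.
statements = pd.DataFrame({
    "participant": [1, 1, 2, 3, 3, 3],
    "theme": ["anxiety", "coping", "anxiety", "coping", "coping", "anxiety"],
})

# Quantitizing: convert qualitative theme codes into 0/1 indicators per participant,
# which can then be analyzed statistically (e.g., frequencies, correlations).
quantitized = pd.crosstab(statements["participant"], statements["theme"]).clip(upper=1)
print(quantitized)

# Qualitizing: convert numeric scores into narrative categories (a simple profile),
# which can then be interpreted alongside the qualitative data.
scores = pd.Series({1: 82, 2: 45, 3: 67}, name="anxiety_percentile")
profiles = pd.cut(scores, bins=[0, 33, 66, 100], labels=["low", "moderate", "high"])
print(profiles)
```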

2.4 Criterion 4: Number of Data Analysis Types That Will Be Used

A mixed analysis requires at least one qualitative analysis and at least one quantitative analysis (Creswell & Tashakkori, 2007). Therefore, an additional question for mixed methods researchers to consider is how many qualitative analyses and quantitative analyses are needed in the study.

2.5 Criterion 5: Time Sequence of the Mixed Analysis

The qualitative and quantitative analyses can be conducted in chronological order, or sequentially (i.e., sequential mixed analysis), or they can be conducted in no set chronological order, or concurrently (i.e., concurrent mixed analysis). When concurrent mixed analyses are used, the analytical strands do not occur in any chronological order (Tashakkori & Teddlie, 1998). Rather, either analytical type can occur first because the two sets of analyses are functionally independent. Several options exist for sequential mixed analyses (Teddlie & Tashakkori, 2009). The qualitative analysis phase can be conducted first and then used to inform the subsequent quantitative analysis phase (i.e., sequential qualitative-quantitative analysis), or the quantitative analysis phase can be conducted first, which then informs the subsequent qualitative analysis phase (i.e., sequential quantitative-qualitative analysis). In addition, the qualitative and quantitative analyses can occur sequentially in more than two phases (i.e., iterative sequential mixed analysis; Teddlie & Tashakkori, 2009).

2.6 Criterion 6: Level of Interaction between Quantitative and Qualitative Analyses

Another component in mixed analyses decisions involves the point at which the various analysis strands interact. Parallel mixed analysis is likely the most common mixed analysis technique (Teddlie & Tashakkori, 2009), which involves two separate processes, for example, a quantitative analysis of quantitative data and a  qualitative analysis of qualitative data. According  to  Teddlie  and  Tashakkori  (2009),  “Although  the  two  sets  of  analyses  are independent, each provides an understanding of the phenomenon under investigation. These understandings are linked, combined, or integrated into meta-inferences” (p. 266).



2.7 Criterion 7: Priority of Analytical Components

Another dimension to consider when conducting a mixed analysis is the priority or emphasis given to the analytical strands. Specifically, the qualitative and quantitative strands can have equal priority (i.e., equal status) with respect to addressing the research question(s), or one analytical strand can have a higher priority than the other strand (i.e., dominant status) (cf. Morse, 2003).

2.8 Criterion 8: Number of Analytical Phases

Mixed analyses can be phase-based in nature. For example, Greene (2007, p. 155) identified the following four phases of analysis: (a) data transformation, (b) data correlation and comparison, (c) analysis for inquiry conclusions and inferences, and (d) utilization of one methodological tradition within the analysis of data from another tradition. Another phase-based typology, presented by Onwuegbuzie and Teddlie (2003), is a seven-step process for mixed data analysis: (a) data reduction, (b) data display, (c) data transformation, (d) data correlation, (e) data consolidation, (f) data comparison, and (g) data integration. Thus, whether or not to use a phase-based analytical approach is another consideration for mixed researchers.

2.9 Criterion 9: Link to Other Design Components

Mixed analyses can be design-based, wherein the analyses are linked directly to the mixed research designs for the study. Teddlie and Tashakkori (2009) developed a typology that contains the following six techniques: (a) parallel mixed data analysis, which is linked to parallel mixed designs; (b) conversion mixed data analysis, which is linked to conversion mixed designs; (c) sequential mixed data analysis, which is linked to sequential mixed designs; (d) multilevel mixed data analysis; (e) fully integrated mixed data analysis, which is linked to fully integrated designs; and (f) application of the analytical techniques of one tradition to the other. According to Creswell and Plano Clark (2007), “The type of data analysis will vary depending on the type of mixed design used” (p. 135). These authors link four analysis techniques to their four major mixed methods designs (for more information, see Creswell & Plano Clark, 2010).

2.10 Criterion 10: Phase of the Research Process When All Analysis Decisions are Made

Decisions about the mixed analysis of a study can be made a priori, a posteriori, or iteratively. A priori decisions are more likely to occur in quantitative-dominant mixed analyses, whereas a posteriori decisions are more likely to occur in qualitative-dominant mixed analyses (cf. Johnson, Onwuegbuzie, & Turner, 2007). Mixed analysis decisions that are made iteratively are those in which some analytic decisions are made a priori, whereas the remaining analytic decisions are emergent. Iterative analytic decisions represent the most common decisions in mixed research.

2.11 Criterion 11: Type of Generalization

The type of generalizations pertinent to the study can inform the mixed analysis design. Onwuegbuzie, Slate, et al. (2009) have identified five major types of generalizations that researchers can make, as follows: (a) external (statistical) generalizations (i.e., making generalizations, inferences, or predictions on data obtained from a representative statistical [i.e., optimally random] sample to the population from which the sample was drawn); (b) internal (statistical) generalizations (i.e., making generalizations, inferences, or predictions on data obtained from one or more representative or elite participants [e.g., key informants, politically important cases, sub-sample members]); (c) analytic generalizations (i.e., “applied to wider theory on the basis of how selected cases ‘fit’ with general constructs”; Curtis, Gesler, Smith, & Washburn, 2000, p. 1002); (d) case-to-case transfer (i.e., making generalizations or inferences from one case to another [similar] case; Firestone, 1993; Kennedy, 1979; Miles & Huberman, 1994); and (e) naturalistic generalization (i.e., the readers of the article make generalizations entirely, or at least in part, from their personal or vicarious experiences [Stake, 2005], such that meanings arise from personal experience and are adapted and reified by repeated encounter [Stake, 1980; Stake & Trumbull, 1982]). These researchers assert that mixed analysis involves data analysis that yields one or more of these five types of generalizations, and have named this the fundamental principle of data analysis.

2.12 Criterion 12: Analysis Orientation

Analysis orientation, conceptualized by Onwuegbuzie, Slate, et al. (2009) and extending the work of Ragin (1989), is a typology for classifying mixed analysis techniques. The qualitative and quantitative analyses can be any combination of the following: case-oriented analyses, variable-oriented analyses, and process/experience-oriented analyses. Case-oriented analyses focus on the selected case(s) to analyze and to interpret the meanings, experiences, perceptions, or beliefs of one or more individuals. Because case-oriented analyses aid in understanding phenomena pertaining to one or relatively few cases, they are more often used in qualitative analyses; however, case-oriented analyses can be used for any number of cases in quantitative research with techniques such as single-subject analyses and descriptive analyses. Variable-oriented analyses are used to identify relationships among constructs (i.e., variables) and tend to yield external generalizations. Thus, variable-oriented analyses tend to be applied to quantitative analyses, although small samples also can be used to explore relationships among variables via qualitative analyses. Finally, process/experience-oriented analyses are used to evaluate processes or experiences relating to one or more cases over time, with processes tending to be associated with variables and experiences tending to be associated with cases.
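As a rough illustration of this distinction (not drawn from the studies cited above), the following Python sketch with hypothetical data contrasts a variable-oriented analysis, which examines relationships among constructs across cases, with a case-oriented analysis, which examines a single case's profile in depth.

```python
import pandas as pd

# Hypothetical data set: four cases measured on two constructs.
data = pd.DataFrame(
    {"anxiety": [70, 40, 85, 55], "coping": [30, 65, 25, 50]},
    index=["Case A", "Case B", "Case C", "Case D"],
)

# Variable-oriented analysis: relationships among constructs across all cases.
print(data["anxiety"].corr(data["coping"]))

# Case-oriented analysis: an in-depth look at a single case's profile.
print(data.loc["Case A"])
```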

2.13 Criterion 13: Cross-Over Nature of Analysis

Another criterion to consider when making decisions about mixed analyses is the degree to which a cross-over analysis will be used. Cross-over mixed analysis (Onwuegbuzie & Combs,
2010) is an extension of Greene’s (2007) “broad analytic concept” (p. 153) of “using aspects of the analytic  framework of one methodological tradition in the analysis of data from another tradition” (p. 155). Cross-over mixed analysis involves using one or more analysis types  associated  with  one  tradition  to  analyze  data  associated  with  a  different  tradition
(Onwuegbuzie & Combs, 2010). For example, using visual displays to analyze qualitative
data (Greene,  2007)  and  using  effect  sizes  in  qualitative  analyses  (cf.  Onwuegbuzie  & Teddlie, 2003) are both types of cross-over mixed analyses.
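As an illustration of the second example (effect sizes in qualitative analyses), the following Python sketch computes a simple frequency-based index from a hypothetical quantitized theme matrix: the proportion of participants who contributed at least one statement under each theme. The data and variable names are invented, and this is only one of several effect-size-like indices that have been proposed for qualitative data.

```python
import pandas as pd

# Hypothetical binary (quantitized) theme matrix: rows are participants, columns are themes.
themes = pd.DataFrame(
    {"lack_of_understanding": [1, 0, 1, 1], "writing_anxiety": [0, 0, 1, 0]},
    index=["P1", "P2", "P3", "P4"],
)

# A frequency-based effect size for each theme: the proportion of participants
# who contributed at least one statement classified under that theme.
prevalence = themes.mean()
print(prevalence)  # lack_of_understanding 0.75, writing_anxiety 0.25
```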

3. Heuristic Example: A Step-by-Step Guide to the Mixed Analysis Process

The following mixed research study (Onwuegbuzie & Combs, 2009b) provides an example of how one can utilize the Onwuegbuzie and Combs’ (2010) 13-criteria meta-framework for mixed analysis techniques to guide the mixed analysis process.

3.1 Research Questions and Context of the Study

The study was conducted to examine the role that coping strategies play in the context of graduate students’ learning of statistics. Specifically, the following research questions were addressed: (a) What is the relationship between statistics anxiety and coping strategies? (Quantitative Research Question) and (b) To what extent does the relationship between statistics anxiety and coping strategies manifest itself in statistics classrooms? (Mixed Research Question). The study involved two phases: a quantitative phase (Phase 1) and an embedded qualitative phase (Phase 2), with sequential mixing of qualitative and quantitative data. Because the participants in the quantitative and qualitative phases represented master’s and doctoral students from two different institutions, and the quantitative Phase 1 informed the qualitative Phase 2, the mixed research sampling design used was a Sequential Design using Parallel Samples (Onwuegbuzie & Collins, 2007).

In the initial quantitative phase of the study (Phase 1: Survey Sample), 115 graduate students enrolled  in  an   introductory-level,  quantitative-based  educational  research  course  were administered the Statistics Anxiety Rating Scale (STARS; Cruise & Wilkins, 1980) and the Coping Strategies Inventory for Statistics (CSIS;  Jarrell & Burry, 1989). In the embedded qualitative phase (Phase 2: Focus Group Sample), 17 doctoral students were interviewed and asked about the role that coping strategies played in both the formation and  alleviation of statistics anxiety. In addition, these students during Phase 2 completed the STARS and CSIS. In  Phase  1  of  the  study,  the  major  analytical  procedure  involved  canonical  correlation analysis, which is a multivariate analysis technique used to examine the relationship between two sets of measures when each set contains two or more variables or subscales. As such, the canonical  correlation  analysis  was  utilized  to  identify  a  combination  of  coping  strategy dimensions that might predict a combination of statistics anxiety dimensions.

In Phase 2, quantitative analyses were used to compare participants’ scores from each of the two phases of  the  study. In addition, focus group interviews were conducted to explore students’ experiences with the  statistics course. The qualitative data were used to identify themes pertaining to anxiety and coping strategies, and then were compared to the STARS and CSIS using both a cross-case and within-case analysis.

3.2 Mixed Analysis Design

A two-level embedded mixed research design was utilized in the current study, which was designed to examine the role that statistics anxiety and coping strategies play in the context of graduate students’ learning of statistics. The study represented a fully mixed sequential design. This design, which incorporated dialectical pragmatist assumptions and stances (i.e., Criterion 2, philosophical underpinning), involved mixing qualitative and quantitative approaches at several stages, including the data analysis stages. Both phases were given approximately equal weight (i.e., Criterion 7, priority of analytical components). Phase 1 generated quantitative data and Phase 2 generated both quantitative and qualitative data (i.e., Criterion 3, number of data types), and the analysis of data at Phase 1 informed the analysis of data at Phase 2 (i.e., Criterion 6, level of interaction). In addition, within Phase 2 (the embedded qualitative phase), the analysis of the quantitative data (i.e., STARS, CSIS) had lower priority than the analysis of qualitative data (i.e., Criterion 7, priority of analytical components) and informed the analysis of qualitative data (i.e., interviews; Criterion 6, level of interaction). Phase 2 of the study was embedded because it contained the collection and analysis of both qualitative and quantitative data (i.e., Criterion 4, number of data analysis types). The analyses in Phases 1 and 2 were conducted sequentially (i.e., Criterion 5, time sequence of the mixed analysis). Phase 2 utilized cross-over mixed analysis techniques in which quantitative data (i.e., STARS, CSIS) were qualitized (i.e., narrative profile formation) and qualitative data were quantitized (e.g., effect sizes), and the quantitative and qualitative data were correlated (i.e., Criterion 13, cross-over nature of analysis). The mixed analysis framework was neither design-based nor phase-based (i.e., Criterion 8, number of analytical phases; Criterion 9, link to other design components). The rationale for conducting the mixed analysis, based on Greene et al.’s (1989) framework, encompassed complementarity, initiation, triangulation, development, and expansion (i.e., Criterion 1, purpose for conducting the mixed analysis). Mixed analysis decisions occurred iteratively (i.e., Criterion 10, phase of the research process when analysis decisions are made).

Because Phase 1 involved investigation of the relationship between statistics anxiety and coping strategies using a large sample, it yielded a variable-oriented analysis (i.e., Criterion
12, analysis orientation) that led to external statistical generalizations (i.e., Criterion 11, type of generalization). In contrast, Phase 2 yielded both a variable- and case-oriented analysis because Phase 2 involved the assessment of the relationship between statistics anxiety and coping strategies using a relatively small sample (i.e., Criterion 12, analysis orientation) that led to analytic generalizations (i.e., Criterion 11, type of generalization).

3.3 Mixed Analysis: Step-by-Step

In Onwuegbuzie and Combs’ (2009b) study, several levels of mixed analysis can be found. Phase 1 involved a quantitative analysis of quantitative data (i.e., descriptive and inferential statistics) and Phase 2 involved a qualitative analysis of qualitative data (constant comparison analysis of focus group interview data). The  researchers could have conducted the mixed analysis with these two steps: (a) quantitative analysis of quantitative data (Phase 1), and (b) qualitative analysis of qualitative data (Phase 2). However, they conducted additional mixed analyses  in  Phase  2  that  yielded  the  embedded  qualitative  phase.  Within  this  phase,  in addition to a quantitative analysis of quantitative data (i.e., STARS, CSIS), and a qualitative analysis of  qualitative data (i.e., focus group interview data), they conducted a qualitative analysis of quantitative data (i.e., used STARS and CSIS to compare with interview data) and a quantitative analysis of qualitative data  (e.g., within each focus group they conducted a
micro-interlocutor analysis in which they documented the number of times each person spoke, who talked first, and frequency counts for themes; Onwuegbuzie, Dickinson, Leech, & Zoran,
2009). Thus, this study demonstrates various combinations of quantitative and qualitative analysis. In the following sections, each step of the mixed analyses will be explained.
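As a minimal sketch of the kind of counts a micro-interlocutor analysis might involve (the data below are hypothetical, not taken from the study), the following Python snippet tallies speaking turns and identifies who spoke first.

```python
import pandas as pd

# Hypothetical speaking turns from one focus group, listed in order.
turns = pd.DataFrame({
    "order": [1, 2, 3, 4, 5, 6],
    "participant": ["Moderator", "P2", "P5", "P2", "P7", "P2"],
})

# Counts of how often each person spoke, and who spoke first: two of the simple
# quantitative summaries a micro-interlocutor analysis can document.
print(turns["participant"].value_counts())
print(turns.sort_values("order")["participant"].iloc[0])
```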

3.4 Study Phase 1: Survey Sample

3.4.1 Step 1: Quantitative analysis of quantitative data, descriptive statistics

Students’ scores (n = 115) on the STARS and CSIS were entered into SPSS. Descriptive statistics (i.e., mean, standard deviation) were computed for the six subscales of the STARS and the two subscales of the  CSIS. In addition, median percentile rank equivalent scores (MPRES) were calculated, as developed by Onwuegbuzie (2004), by comparing the median anxiety scores obtained in the study to the percentile rank norms reported by the developers of the STARS (i.e., Cruise, Cash, & Bolton, 1985). Thus, a MPRES of 81 for the subscale, worth of statistics, indicates that at least 50% of the present sample scored higher than did
81% of the norm group on this dimension (Onwuegbuzie, 2004). The finding that the MPRES ranged from 62 to 81 indicated that the participants in the quantitative phase represented a moderate-to-high statistics-anxious group.
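The following Python sketch illustrates the arithmetic behind these descriptive statistics and an MPRES-style lookup. The scores and the norm table below are hypothetical; an actual MPRES would be derived from the published STARS norms.

```python
import pandas as pd

# Hypothetical scores on one STARS subscale (e.g., worth of statistics) for a sample.
worth = pd.Series([38, 44, 51, 55, 57, 60, 62, 63, 66, 70])

# Descriptive statistics of the kind reported in Step 1.
print(worth.mean(), worth.std(), worth.median())

# Median percentile rank equivalent score (MPRES): locate the sample median in a
# (hypothetical) norm table that maps raw subscale scores to percentile ranks.
norm_table = pd.Series({40: 50, 50: 65, 58: 81, 70: 92})  # raw score -> percentile rank

def mpres(median_score: float, norms: pd.Series) -> float:
    """Percentile rank of the largest normed raw score at or below the sample median."""
    eligible = norms[norms.index <= median_score]
    return float(eligible.iloc[-1]) if not eligible.empty else 0.0

# A sample median of 58.5 falls at the 81st percentile of the hypothetical norm group.
print(mpres(worth.median(), norm_table))
```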

3.4.2 Step 2: Quantitative analysis of quantitative data, inferential statistics

A canonical analysis was conducted to determine the relationships between the six STARS subscales and the two CSIS subscales. Onwuegbuzie and Combs (2009b) determined that the first canonical function was both statistically significant and practically significant, with the first canonical correlation (Rc1 = .60) contributing 35.9% (i.e., Rc1²) to the shared variance. The standardized canonical function coefficients were examined and conclusions were drawn about the contributions of the statistics anxiety variable cluster and the coping strategies cluster. Thus, there was a multivariate relationship between statistics anxiety and coping strategies, wherein examination-taking coping strategies represented a much more important predictor of statistics anxiety than did study coping strategies, and interpretation anxiety made the most substantial contribution to the multivariate relationship among the six anxiety dimensions.
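For readers who want to see the mechanics, the sketch below runs a canonical correlation analysis on simulated data using scikit-learn's CCA estimator. The data, the induced association, and the number of components are all hypothetical; the snippet only shows how a first canonical correlation and its squared value (shared variance) can be obtained.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(0)
n = 115
# Hypothetical data: six anxiety subscales and two coping subscales per student.
anxiety = rng.normal(size=(n, 6))
coping = 0.5 * anxiety[:, :2] + rng.normal(size=(n, 2))  # built-in association, for illustration

# Canonical correlation analysis: find weighted combinations of each variable set
# that are maximally correlated with each other.
cca = CCA(n_components=2)
anxiety_scores, coping_scores = cca.fit_transform(anxiety, coping)

# First canonical correlation (Rc1) and the shared variance it contributes (Rc1 squared).
rc1 = np.corrcoef(anxiety_scores[:, 0], coping_scores[:, 0])[0, 1]
print(rc1, rc1 ** 2)
```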

3.5 Study Phase 2: Focus Group Sample

3.5.1 Step 3: Quantitative analysis of quantitative data, descriptive statistics

Data from the STARS and CSIS were entered into SPSS. In this step, descriptive statistics (i.e.,  mean,  standard  deviation,  MPRES)  were  used  to  analyze  the  scores  of  the  17 participants on the STARS and CSIS.

3.5.2 Step 4: Quantitative analysis of quantitative data, inferential statistics

Additional quantitative analyses were conducted to compare the students in Phase 1 of the study to those in Phase 2 of the study. For example, t tests were conducted to compare the levels of statistics anxiety and  coping strategies of participants in Phase 1 (i.e., Survey Sample) and Phase 2 (i.e., Focus Group Sample).
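A minimal sketch of such a comparison, assuming hypothetical total scores for the two samples and a Welch-type independent-samples t test, is shown below.

```python
from scipy import stats

# Hypothetical total statistics-anxiety scores for the two samples.
phase1_scores = [142, 150, 137, 160, 155, 148, 139, 151]  # Survey Sample (Phase 1)
phase2_scores = [158, 162, 149, 171, 166, 154]            # Focus Group Sample (Phase 2)

# Independent-samples t test comparing the two groups' anxiety levels.
t_stat, p_value = stats.ttest_ind(phase1_scores, phase2_scores, equal_var=False)
print(t_stat, p_value)
```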



3.5.3 Step 5: Qualitative analysis of qualitative data, method of constant comparison, and quantitative analysis of qualitative data


Figure 1. Excel spreadsheet of focus group interviews. The number in the order column represents the order of the comments during the interview (e.g., the moderator spoke first). The participant column was  used to identify the focus group participant. The comments column represents the words that were spoken, audible sounds (e.g., laughter), silence, and nonverbal behaviors (nodding).

The focus group interviews were transcribed and typed into a Word document. Then, the transcript was imported into an Excel spreadsheet. As shown in Figure 1, each row of the spreadsheet contained the order of the comments in the overall interview (e.g., who spoke first, second), the participants’ identification number, and the participants’ comments. Nonverbal behaviors, which were noted by the interview moderator and assistant moderator, were described in parentheses following the comments.

Next, statements were unitized and each unit represented a significant statement (Glaser & Strauss, 1967),  with each statement providing evidence of anxiety related to the statistics course or evidence of coping  strategies used in the statistics course. When participants’ comments contained multiple units, new rows were added in the spreadsheet and the divided comments were indicated by use of ellipses, as shown in  Figure 2. Using the method of constant comparison (Glaser & Strauss, 1967), Onwuegbuzie and Combs (2009b) compared statements to each other and labeled similar clusters with the same code. Initially, statements
were coded as either “anxiety” or “coping”, using a “1” to indicate the presence of the code and a “0” to indicate that the comment was not related to the code (i.e., quantitized). Next, using the sort function in Excel, statements related to the code of anxiety were sorted and grouped together, as shown in Figure 3. Then, anxiety statements were read and codes were developed based on similar comments. Using the method of constant comparison, codes were collapsed and refined. The same process  was  used with the statements related to coping strategies. More specifically, each significant statement was linked to a formulated meaning and to a theme. The six resulting themes related to the students’ anxieties in  the statistics course  were  lack  of  understanding,  class  anxiety,  multiple  responsibilities,  performance expectations, prior experiences, and writing anxiety, as shown in Table 1.

Table 1. Description of Emerging Themes for Anxiety from Statistics Course

Theme: Lack of understanding
Description: Anxious from a lack of understanding about statistics
Significant statement example: “I found myself using words that I did not know what they really meant.”

Theme: Class anxiety
Description: Anxious while participating in the statistics class
Significant statement example: “Just felt like from the minute we walked into the room we had to be ready and listening because it goes so fast.”

Theme: Multiple responsibilities
Description: Anxious from balancing multiple responsibilities in and out of the class
Significant statement example: “What got in my way is that I had a whole other life and you needed to just have a life for statistics; I had a whole other job, it was too much.”

Theme: Performance expectations
Description: Anxious about performance, assessment, and expectations from self or others
Significant statement example: “I must be an idiot since I don’t know how to do this, so trying to balance what we should be as a graduate students and maybe what is asking too much of us.”

Theme: Prior experiences
Description: Anxious due to prior experiences or lack of experiences with statistics
Significant statement example: “I’d never taken a statistics class before and I knew it was one of my weak areas.”

Theme: Writing anxiety
Description: Anxious about writing
Significant statement example: “For me, it was the writing. SPSS was not hard. Writing was hard.”


Table 2. Description of Emerging Themes for Coping Strategies Used in Statistics Course

Theme: Peer support
Description: Asks for and receives help from other peers and collaborates with others
Significant statement example: “For me, one of the biggest advantages I saw right then was being in the cohort because you really utilized that cohort, I could call Aretha, and another student, [we] emailed all the time.”

Theme: Professor support
Description: Asks for and receives help from the professor
Significant statement example: “He was very accessible I thought outside of class which was helpful because as those questions come up, you’d shoot him an email and within hours or a day you’d have a response.”

Theme: Personal management
Description: Manages self with organizational tools, routines, and self-care
Significant statement example: “Taking notes, that was very stressful. I was so worried that I wasn’t going to get everything and when I got the digital recorder, I didn’t panic if I missed something.”

Theme: Class structure
Description: Utilizes the resources provided in the course
Significant statement example: “The way the course was presented is we had an example paper, we had a step by step routine in how to do it, and um an assignment page.”

Theme: Study skills
Description: Applies skills such as listening, correcting errors, and seeking additional resources
Significant statement example: “I would try to go back and see the errors I had made on the papers, what were those words that weren’t supposed to be used.”


Figure 2. Excel spreadsheet of focus group interview, showing how comments were unitized into significant statements (Glaser & Strauss, 1967), with each statement providing evidence of anxiety related to the statistics course or evidence of coping strategies used in the statistics course. When participants’ comments contained multiple units, new rows were added in the spreadsheet and the divided comments were indicated by use of ellipses. For example, note the sixth comment, made by Participant 5. This comment was divided into two units. A row was added and the order was renumbered to indicate this addition, which was necessary to return to the original order after subsequent sorting processes.


Figure 3. Excel spreadsheet of focus group interview, showing how comments were coded as anxiety or coping. The moderator’s comments were not coded.
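The spreadsheet workflow described in Step 5 (unitized statements, 0/1 codes, and sorting by code) can also be mirrored in a data frame. The sketch below uses Python with pandas and entirely invented statements and codes to show the same quantitize-then-sort logic.

```python
import pandas as pd

# Hypothetical unitized transcript: order of comment, speaker, text, and 0/1 codes
# (1 = statement provides evidence of the code, 0 = it does not), i.e., quantitized.
units = pd.DataFrame({
    "order": [1, 2, 3, 4, 5],
    "participant": ["Moderator", "P3", "P3", "P7", "P2"],
    "comment": [
        "Tell us about the course.",
        "The pace made me panic.",
        "...so I emailed a classmate for help.",
        "I reread my notes every night.",
        "I never knew what the terms meant.",
    ],
    "anxiety": [0, 1, 0, 0, 1],
    "coping": [0, 0, 1, 1, 0],
})

# Analogue of the Excel sort: group the anxiety-coded statements together so that
# similar statements can be compared and collapsed into themes (constant comparison).
anxiety_statements = units[units["anxiety"] == 1].sort_values("order")
print(anxiety_statements[["participant", "comment"]])

# Frequency counts per code: one simple quantitative analysis of the qualitative data.
print(units[["anxiety", "coping"]].sum())
```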

3.6 Within-Case Analysis

3.6.1 Step 6: Qualitative analysis of quantitative data

To conduct the within-case analysis, Phase 2 participants’ scores on the STARS and CSIS were ranked from highest to lowest. Two students were selected who had high statistics anxiety scores and low coping strategy scores, and two students were selected who displayed lower levels of statistics anxiety and higher levels of coping strategies. These students also were selected based on the number of significant statements shared during the interview. After these four key informants were identified, their scores and resulting percentiles on the STARS and CSIS were subjected to a narrative profile analysis. Specifically, the STARS and CSIS scores were qualitized by comparing them to normative data (i.e., normative profiles; Tashakkori & Teddlie, 1998), and these normative profiles provided more richness to the qualitative data (i.e., complementarity, development, expansion; Greene et al., 1989).
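A simplified sketch of this step, with hypothetical STARS/CSIS scores and arbitrary cut-points for the narrative profile labels, is shown below; the actual study qualitized scores against published normative data rather than the thresholds used here.

```python
import pandas as pd

# Hypothetical Phase 2 scores: statistics anxiety (STARS) and coping strategies (CSIS).
scores = pd.DataFrame(
    {"stars": [180, 145, 200, 120, 165], "csis": [40, 62, 35, 70, 50]},
    index=["P1", "P2", "P3", "P4", "P5"],
)

# Rank participants from highest to lowest anxiety to locate extreme cases.
ranked = scores.sort_values("stars", ascending=False)
print(ranked.head(2))  # candidates with high anxiety and (here) low coping
print(ranked.tail(2))  # candidates with lower anxiety and higher coping

# Qualitizing: translate each score pair into a narrative profile label that can be
# read alongside the interview data (arbitrary cut-points, for illustration only).
def profile(row: pd.Series) -> str:
    anxiety = "high anxiety" if row["stars"] >= 160 else "lower anxiety"
    coping = "low coping" if row["csis"] <= 50 else "higher coping"
    return f"{anxiety}, {coping}"

scores["narrative_profile"] = scores.apply(profile, axis=1)
print(scores)
```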

3.6.2 Step 7: Quantitative analysis of quantitative data, descriptive statistics

STARS and CSIS scores of the four key informants were compared, using MPRES, to those of the participants in Phase 1.



3.6.3 Step 8: Comparing qualitative analysis of qualitative data with quantitative analysis of quantitative data
These four participants’ comments were sorted in Excel by participant, as shown in Figure 4. The participant comments were compared (i.e., data comparison) to the STARS and CSIS scores to see if their comments supported (i.e., triangulation; Greene et al., 1989) or refuted (i.e., initiation; Greene et al., 1989) their scores on the STARS and CSIS.


Figure 4. Excel spreadsheet of focus group interview, showing how comments were sorted by participant for use in the within-case analysis.
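One way to think about this comparison is as a simple consistency check between each key informant's scores and the pattern of their coded statements. The sketch below is hypothetical (invented counts and an arbitrary rule), intended only to show how agreement (triangulation) and disagreement (initiation) might be flagged for closer reading.

```python
import pandas as pd

# Hypothetical inputs: each key informant's STARS percentile and counts of coded statements.
informants = pd.DataFrame(
    {
        "stars_percentile": [90, 20],
        "n_anxiety_statements": [7, 1],
        "n_coping_statements": [2, 6],
    },
    index=["P3", "P4"],
)

# Simple consistency rule: a high anxiety percentile should co-occur with more anxiety
# statements than coping statements; mismatches flag cases for closer qualitative reading.
informants["consistent"] = (
    (informants["stars_percentile"] >= 50)
    == (informants["n_anxiety_statements"] >= informants["n_coping_statements"])
)
print(informants)
```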

3.7 Cross-Case Analysis

3.7.1 Step 9: Combining qualitative and quantitative data

To conduct the cross-case analysis, several of Miles and Huberman’s (1994) visual displays were utilized. For example, a case-ordered descriptive meta-matrix was used in which the participants were ordered by both the STARS and the CSIS, and this ordering was compared to their qualitative statements stemming from the focus group interviews. Each of the six anxiety themes was quantitized; that is, for each focus group participant, a theme was coded as a “1” to indicate that a statement made by the participant was classified as representing the
theme, and was coded as a “0” otherwise. Each person’s ranking of the STARS and the profile of “1”s and “0”s pertaining to the anxiety themes were compared to his/her statements pertaining to anxiety and patterns  were noted. This procedure was repeated for the coping themes.

In addition, an antecedents matrix was used in which the outcome variables (i.e., STARS, all anxiety statements and themes) were displayed alongside the potential antecedents (i.e., CSIS and coping statements  and  themes) to determine the role that coping strategies played in moderating  levels  of  statistics  anxiety  across  all  the  participants  and  as  a  function  of demographic variables. This display revealed several links among the sets of variables.
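A bare-bones sketch of a case-ordered descriptive meta-matrix, built in Python from hypothetical scores and quantitized themes, is shown below: cases are ordered by their anxiety score and displayed alongside their theme profiles so that score-theme patterns can be inspected.

```python
import pandas as pd

# Hypothetical building blocks: participants' STARS scores and quantitized anxiety themes.
stars = pd.Series({"P1": 200, "P2": 120, "P3": 180, "P4": 145}, name="stars")
themes = pd.DataFrame(
    {
        "lack_of_understanding": [1, 0, 1, 0],
        "class_anxiety": [1, 0, 0, 1],
        "writing_anxiety": [0, 0, 1, 0],
    },
    index=["P1", "P2", "P3", "P4"],
)

# Case-ordered descriptive meta-matrix: order cases by anxiety score (highest first)
# and attach each case's theme profile for side-by-side inspection.
meta_matrix = themes.join(stars).sort_values("stars", ascending=False)
print(meta_matrix)
```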

4. Interpretation of Findings

Based on the nine steps outlining the mixed analysis for the study, Onwuegbuzie and Combs (2009b)  surmised  that  the  six  statistics  anxiety  themes  and  five  coping  strategy  themes support   the   contention   that   both   statistics   anxiety   and   coping   strategies   represent multidimensional  constructs.  Further,  according  to  these  authors,  the  findings  from  the within-case analyses and cross-case analyses provided  support for the quantitative results, helping to confirm a multivariate relationship between levels of statistics anxiety and coping strategies  (triangulation),  as  well  as  providing  information  about  the  nature  of   this relationship (i.e., complementarity, development, and expansion) and about participants for whom  the   multivariate  relationship  was  weak  or  unclear  (i.e.,  initiation).  Also,  the within-case analyses and cross-case analyses helped the researchers identify specific coping strategies  that  reduced  statistics  anxiety  levels.  Thus,  the  authors  concluded  that  taken together,  the  quantitative  and  qualitative  findings  suggest   that  interventions  aimed  at increasing coping strategies might help to reduce levels of statistics anxiety.

5. Summary and Conclusions

In this article, we presented an inclusive, interactive framework for mixed analyses using the
13 criteria that were identified by Onwuegbuzie and Combs (2010) after they reviewed the extant literature of mixed analysis strategies. A heuristic example was used to highlight the various decisions made by Onwuegbuzie and Combs (2009b) in their mixed research study concerning statistics anxiety and coping strategies of graduate students enrolled in a statistics course.  This  heuristic  example  showed  the  utility  of  Onwuegbuzie  and  Combs’  (2010)
13-Criteria Meta-Framework for Mixed Analysis Techniques, which is summarized in Table
3. By using this framework, Onwuegbuzie and Combs (2009b) were able to design and undertake a more  comprehensive, coherent, and interactive analysis than otherwise would have  been  the  case,  thereby  yielding  a  more  rigorous  mixed  research  study.  As  such, Onwuegbuzie and Combs’ (2009b) study adds  incremental validity to the mixed analysis meta-framework, consistent with the call of Greene (2008) for “a widely accepted framework or set of ideas” (p. 14). We hope that other mixed researchers will use this meta-framework to design their mixed analyses, and assess the utility and limitations of the meta-framework for themselves. By documenting their use of this meta-framework, as we have accomplished in the present  article, a body of evidence can be built that either provides support for the meta-framework or offers direction for improvement.



Table 3. Summary of Onwuegbuzie and Combs’ (2010) 13-Criteria Meta-Framework for Mixed Analysis Techniques Used by Onwuegbuzie and Combs (2009b)

Criterion 1, rationale/purpose for conducting the mixed analysis: Involved complementarity, initiation, triangulation, development, and expansion (Greene, Caracelli, & Graham, 1989).

Criterion 2, philosophy underpinning the mixed analysis: Involved dialectical pragmatist assumptions and stances (Johnson, 2009).

Criterion 3, number of data types that will be analyzed: Collected both quantitative and qualitative data (Creswell & Plano Clark, 2007, 2010).

Criterion 4, number of data analysis types that will be used: Utilized both qualitative analysis and quantitative analysis (Creswell & Tashakkori, 2007; Onwuegbuzie, Slate, Leech, & Collins, 2007, 2009; Onwuegbuzie & Teddlie, 2003).

Criterion 5, time sequence of the mixed analysis: Involved sequential analysis (Tashakkori & Teddlie, 1998; Teddlie & Tashakkori, 2009).

Criterion 6, level of interaction between quantitative and qualitative analyses: Analyzed data at Phase 1 that informed the analysis of data at Phase 2 (Teddlie & Tashakkori, 2009).

Criterion 7, priority of analytical components: Conducted qualitative and quantitative analyses at approximately equal weight (Johnson, Onwuegbuzie, & Turner, 2007; Morse, 2003).

Criterion 8, number of analytical phases: Not linked directly to any phases of the mixed analysis (Greene, 2007; Onwuegbuzie & Teddlie, 2003).

Criterion 9, link to other design components: Not linked directly to any mixed research designs (Creswell & Plano Clark, 2010; Teddlie & Tashakkori, 2009).

Criterion 10, phase of the research process when all analysis decisions are made: Made mixed analysis decisions iteratively (Johnson, Onwuegbuzie, & Turner, 2007).

Criterion 11, type of generalization: Made external statistical generalizations based on Phase 1 analyses and analytic generalizations based on Phase 2 analyses (Onwuegbuzie, Slate, Leech, & Collins, 2009).

Criterion 12, analysis orientation: Involved variable-oriented analysis at Phase 1 and a variable- and case-oriented analysis at Phase 2 (Onwuegbuzie, Slate, Leech, & Collins, 2009).

Criterion 13, cross-over nature of analysis: Qualitized (i.e., narrative profile formation; Tashakkori & Teddlie, 1998) the quantitative data (i.e., STARS, CSIS), quantitized the qualitative data (e.g., effect sizes; Onwuegbuzie, 2003; Onwuegbuzie & Teddlie, 2003), and correlated the quantitative and qualitative data (Onwuegbuzie & Combs, 2010).





References

Bazeley,  P.  (1999).  The  bricoleur  with  a  computer:  Piecing  together  qualitative  and quantitative data. Qualitative Health Research, 9, 279-287. doi:10.1177/104973299129121749.

Bazeley,  P.  (2003).  Computerized  data  analysis  for  mixed  methods  research.  In  A. Tashakkori  &  C.  Teddlie  (Eds.),  Handbook  of  mixed  methods  in  social  and  behavioral research (pp. 385-422). Thousand Oaks, CA: Sage.

Bazeley, P. (2006). The contribution of computer software to integrating qualitative and quantitative data and analyses. Research in the Schools, 13(1), 64-74.

Bazeley, P. (2009). Mixed methods data analysis. In S. Andrew & E. J. Halcomb (Eds.), Mixed methods research for nursing and the health sciences (pp. 84-118). Chichester, UK: Wiley-Blackwell.

Berelson, B. (1952). Content analysis in communication research. New York, NY: Free
Press.

Caracelli,  V.  W.,  &  Greene,  J.  C.  (1993).  Data  analysis  strategies  for  mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 15, 195-207. doi:10.2307/1164421.

Chen, H. T. (2006). A theory-driven evaluation perspective on mixed methods research.
Research in the Schools, 13(1), 75-83.

Chi, M. T. H. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The
Journal of the Learning Sciences, 6, 271-315. doi:10.1207/s15327809jls0603_1

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage.

Creswell, J. W., & Plano Clark, V. L. (2010). Designing and conducting mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.

Creswell,  J.  W.,  &  Tashakkori,  A.  (2007).  Developing  publishable  mixed  methods manuscripts. Journal of Mixed Methods Research, 1, 107-111. doi:10.1177/1558689806298644.

Cruise, R. J., & Wilkins, E. M. (1980). STARS: Statistical Anxiety Rating Scale. Unpublished manuscript, Andrews University, Berrien Springs, MI.

Cruise, R. J., Cash, R. W., & Bolton, D. L. (1985, August). Development and validation of an instrument  to  measure  statistical  anxiety.  Paper  presented  at  the  annual  meeting  of  the Statistical Education Section. Proceedings of the American Statistical Association, Las Vegas, NV.

Curtis, S., Gesler, W., Smith, G., & Washburn, S. (2000). Approaches to sampling and case selection in  qualitative research: Examples in the geography of health. Social Science and Medicine, 50, 1001-1014. doi:10.1016/j.jas.2007.02.013.


Datta, L.  (2001).  The  wheelbarrow,  the  mosaic,  and  the  double  helix:  Challenges  and strategies for  successfully carrying out mixed methods evaluation. Evaluation Journal of Australia, 1(2), 33-40.

Denscombe,  M.  (2008).  Communities  of  practice:  A  research  paradigm  for  the  mixed methods approach. Journal of Mixed Methods Research, 2, 270-283. doi:10.1177/1558689808316807

Firestone, W. A. (1993). Alternative arguments for generalizing from data, as applied to qualitative research. Educational Researcher, 22(4), 16-23. doi:10.3102/0013189X022004016.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago, IL: Aldine.
Greene, J. C. (2007). Mixed methods in social inquiry. San Francisco, CA: Jossey-Bass.

Greene, J. C. (2008). Is mixed methods social inquiry a distinctive methodology? Journal of Mixed Methods Research, 2, 7-22. doi:10.1177/1558689807309969

Greene, J. C., & Caracelli, V. J. (1997). Defining and describing the paradigm issue in mixed-method   evaluation.  In  J.  C.  Greene  &  V.  J.  Caracelli  (Eds.),  Advances  in mixed-method  evaluation:  The  challenges  and  benefits  of  integrating  diverse  paradigms (New Directions for Evaluation, No. 74, pp. 5-17). San Francisco, CA: Jossey-Bass.

Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11, 255-274. doi:10.2307/1163620

Happ, M. B., DeVito Dabbs, D. A., Tate, J., Hricik, A., & Erlen, J. (2006). Exemplars of mixed  methods  data  combination  and  analysis.  Nursing  Research,  55(2,  Supplement  1), S43-S49. doi:10.1097/00006199-200603001-00008

Jang, E. E., McDougall, D. E., Pollon, D., & Russell, M. (2008). Integrative mixed methods data analytic strategies in research on school success in challenging environments. Journal of Mixed Methods Research, 2, 221-247. doi:10.1177/1558689808315323

Jarrell, M. G., & Burry, J. A. (1989, November). Coping Strategies Inventory for Statistics. Paper presented at the annual meeting of the Mid-South Educational Research Association, Little Rock, AR.

Johnson,  R.  B.  (2009).  Toward  a  more  inclusive  “Scientific  Research  in  Education.”
Educational Researcher, 38, 449-457. doi:10.3102/0013189X09344429

Johnson, R. B., & Christensen, L. B. (2008). Educational research: Quantitative, qualitative, and mixed approaches (3rd ed.). Thousand Oaks, CA: Sage.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.




Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1, 112-133. doi:10.1177/1558689806298224

Kennedy,  M.  (1979).  Generalizing  from  single  case  studies.  Evaluation  Quarterly,  3,
661-678.

Lee, Y-j., & Greene, J. C. (2007). The predictive validity of an ESL placement test: A mixed methods approach. Journal of Mixed Methods Research, 1, 366-389. doi:10.1177/1558689807306148

Li, S., Marquart, J. M., & Zercher, C. (2000). Conceptual issues and analytical strategies in mixed-method  studies of preschool inclusion. Journal of Early Intervention, 23, 116-132. doi:10.1177/105381510002300206

Maxcy, S. J. (2003). Pragmatic threads in mixed methods research in the social sciences: The search for  multiple modes of inquiry and the end of the philosophy of formalism. In A. Tashakkori  &  C.  Teddlie  (Eds.),  Handbook  of  mixed  methods  in  social  and  behavioral research (pp. 51-89). Thousand Oaks, CA: Sage.

Maxwell, J.  A.  (2004,  April).  Realism  as  a  stance  for  mixed  methods  research.  Paper presented at the  annual meeting of the American Educational Research Association, San Diego, CA.

McEvoy, P., & Richards, D. (2006). A critical realist rationale for using a combination of quantitative and qualitative methods. Journal of Research in Nursing, 11, 66-78.

Mertens, D. (2003). Mixed methods and the politics of human research: The transformative-emancipatory perspective. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 135-164). Thousand Oaks, CA: Sage.

Miles,  M.  B.,  &  Huberman,  A.  M.  (1994).  Qualitative  data  analysis:  An  expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage.

Morse, J. M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori  &  C.  Teddlie  (Eds.),  Handbook  of  mixed  methods  in  social  and  behavioral research (pp. 189-208). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J. (2003). Effect sizes in qualitative research: A prolegomenon. Quality & Quantity: International Journal of Methodology, 37, 393-409.

Onwuegbuzie, A. J. (2004). Academic procrastination and statistics anxiety. Assessment & Evaluation in Higher Education, 29, 3-18.

Onwuegbuzie, A. J., & Collins, K. M. T. (2007). A typology of mixed methods sampling designs in  social science research. The Qualitative Report, 12, 281-316. Retrieved from http://www.nova.edu/ssss/QR/QR12-2/onwuegbuzie2.pdf

Onwuegbuzie, A. J., & Collins, K. M. T. (2009, March). An innovative method for analyzing themes in mixed research: Introducing chi-square automatic interaction detection (CHAID).
Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Onwuegbuzie, A. J., & Combs, J. P. (2009a). An innovative method for analyzing themes in mixed   research:  Introducing  mixed  thematic-exploratory  factor  analyses.  Manuscript submitted for publication.

Onwuegbuzie, A. J., & Combs, J. P. (2009b, April). The relationship between statistics anxiety  and  coping  strategies  among  graduate  students:  A  mixed  research  study.  Paper presented at the annual  meeting of the American Educational Research Association, San Diego, CA.

Onwuegbuzie, A. J., & Combs, J. P. (2010). Emergent data analysis techniques in mixed methods  research: A synthesis. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (3rd ed., pp. 397-430). Thousand Oaks, CA: Sage.

Onwuegbuzie, A. J., & Dickinson, W. B. (2008). Mixed methods analysis and information visualization:   Graphical  display  for  effective  communication  of  research  results.  The Qualitative Report, 13, 204-225. [Online] Available: http://www.nova.edu/ssss/QR/QR13-2/onwuegbuzie.pdf

Onwuegbuzie, A. J., & Leech, N. L. (2004). Enhancing the interpretation of “significant” findings: The role of mixed methods research. The Qualitative Report, 9, 770-792. [Online] Available: http://www.nova.edu/ssss/QR/QR9-4/onwuegbuzie.pdf

Onwuegbuzie, A. J., & Leech, N. L. (2006). Linking research questions to mixed methods data   analysis   procedures.   The   Qualitative   Report,   11,   474-498.   [Online]   Available: http://www.nova.edu/ssss/QR/QR11-3/onwuegbuzie.pdf

Onwuegbuzie, A. J., & Teddlie, C. (2003). A framework for analyzing data in mixed methods research. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 351-383). Thousand Oaks, CA: Sage.

Onwuegbuzie, A.  J.,  Collins,  K.  M.  T.,  &  Leech,  N.  L.  (in  press).  Mixed  research:  A
step-by-step guide. New York, NY: Taylor & Francis.

Onwuegbuzie, A. J., Dickinson, W. B., Leech, N. L., & Zoran, A. G. (2009). Toward more rigor in focus  group research: A new framework for collecting and analyzing focus group data. International Journal of Qualitative Methods, 8(3), 1-21.

Onwuegbuzie, A. J., Johnson, R. B., & Collins, K. M. T. (2009). A call for mixed analysis: A philosophical framework for combining qualitative and quantitative. International Journal of Multiple Research Approaches, 3, 114-139.

Onwuegbuzie, A. J., Slate, J. R., Leech, N. L., & Collins, K. M. T. (2007). Conducting mixed analyses: A  general typology. International Journal of Multiple Research Approaches, 1,
4-17.




Onwuegbuzie, A. J., Slate, J. R., Leech, N. L., & Collins, K. M. T. (2009). Mixed data analysis:  Advanced  integration  techniques.  International  Journal  of  Multiple  Research Approaches, 3, 13-33.

Ragin, C. C. (1989). The comparative method: Moving beyond qualitative and quantitative strategies. Berkeley, CA: University of California Press.

Rao, V.,  &  Wolcock,  M.  (2003).  Integrating  qualitative  and  quantitative  approaches  in program evaluation. In F. J. Bourguignon & L. Pereira de Silva (Eds.), Evaluating the poverty and distribution impact of  economic policies (pp. 165-190). New York, NY: The World Bank.

Reichardt, C. S., & Cook, T. D. (1979). Beyond qualitative versus quantitative methods. In T. D.  Cook  &  C.  S.  Reichardt  (Eds.),  Qualitative  and  quantitative  methods  in  evaluation research (pp. 7-32). Thousand Oaks, CA: Sage.

Rescher, N. (2000). Realistic pragmatism: An introduction to pragmatic philosophy. Albany, NY: State University of New York Press.

Roberts, A. (2002). A principled complementarity of method: In defence of methodological eclecticism and the qualitative-quantitative debate. The Qualitative Report, 7(3). [Online] Available: www.nova.edu/ssss/QR/QR7-3/roberts.html

Sandelowski, M. (2000). Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies. Research in Nursing Health, 23, 246-255. doi:10.1002/1098-240X(200006)23:3<246::AID-NUR9>3.3.CO;2-8

Sandelowski, M. (2001). Real qualitative researchers don't count: The use of numbers in qualitative research. Research in Nursing and Health, 24, 230-240.

Stake, R. E. (1980). The case study method in social enquiry. In H. Simons (Ed.), Towards a science of the  singular (pp. 62-75). CARE Occasional Publications No. 10. Norwich, UK: Center for Applied Research in Education, University of East Anglia.

Stake, R. E. (2005). Qualitative case studies. In N. K. Denzin & Y. S. Lincoln (Eds.), The
Sage handbook of qualitative research (3rd ed., pp. 443-466). Thousand Oaks, CA: Sage.

Stake, R.  E.,  &  Trumbull,  D.  J.  (1982).  Naturalistic  generalizations. Review  Journal  of
Philosophy and Social Science, 7, 3-12.

Tashakkori, A.,  &  Teddlie,  C.  (1998).  Mixed  methodology:  Combining  qualitative  and quantitative approaches. Applied Social Research Methods Series (Vol. 46). Thousand Oaks, CA: Sage.

Teddlie, C., & Tashakkori, A. (2009). Foundations of mixed methods research: Integrating quantitative and qualitative techniques in the social and behavioral sciences. Thousand Oaks, CA: Sage.



Teddlie, C., Tashakkori, A., & Johnson, R. B. (2008). Emergent techniques in the gathering and analysis of mixed methods data. In S. N. Hesse-Biber & P. Leavy (Eds.), Handbook of emergent methods (pp. 389-414). New York, NY: The Guilford Press.

Todd, Z., Nerlich, B., McKeown, S., & Clarke, D. D. (2004). Mixing methods in psychology: The integration of qualitative and quantitative methods in theory and practice. New York, NY: Psychology Press.

West, E., & Tulloch, M. (2001, May). Qualitising quantitative data: Should we do it, and if so, how? Paper  presented at the annual meeting of the Association for Social Research, Wollongong, New South Wales, Australia.

