Unpacking Assumptions in Research Synthesis: A Critical Construct Synthesis Approach

Jennifer R. Wolgemuth1, Tyler Hicks2, and Vonzell Agosto1
1University of South Florida, Tampa, FL
2University of Kansas, Lawrence, KS

Educational Researcher, Vol. 46, No. 3, pp. 131–139
DOI: 10.3102/0013189X17703946
© 2017 AERA. April 2017.

Research syntheses in education, particularly meta-analyses and best-evidence syntheses, identify evidence-based practices by combining findings across studies whose constructs are similar enough to warrant comparison. Yet constructs come preloaded with social, historical, political, and cultural assumptions that anticipate how research problems are framed and solutions formulated. The information research syntheses provide is therefore incomplete when the assumptions underlying constructs are not critically understood. We describe and demonstrate a new systematic review method, critical construct synthesis (CCS), to unpack assumptions in research synthesis and to show how other framings of educational problems are made possible when the constructs excluded through methodological elimination decisions are taken into consideration.

Keywords: critical theory; educational policy; disability studies; meta-analysis; qualitative research

Research syntheses in education, like meta-analyses (Glass, 1976) and best-evidence syntheses (Slavin, 1986), are conducted to identify evidence-based practices.1 They do so by combining findings across empirical studies whose constructs are sufficiently similar to warrant comparison, which makes constructs essential to framing research. As the building blocks of theory, constructs are characterized by how they link abstractions to observed phenomena given the social, historical, political, and cultural assumptions at work in conceptualizing them (Watt & Van Den Berg, 2002). In research, constructs carry disciplinary assumptions about what effects matter and for whom (e.g., Eisenhart & DeHaan, 2005; Latour & Woolgar, 1979/1986). For instance, familiar constructs studied in education, such as "high achieving," "disabled," and "career ready," come preloaded with assumptions about students and what effects might matter for them.

Research syntheses are conducted in hermeneutic circles (Skrtic, 1991); the constructs included in and excluded from them anticipate how problems will be framed and solutions formulated. For example, research syntheses that include medical accounts of the construct "disability" invoke medical solutions, while syntheses that include social accounts of disability invoke social solutions. As such, answers returned to the research synthesis question, "What works to improve employment for people with disabilities?" will depend on how "disability" is first formulated. Making explicit the social, political, historical, and/or cultural accounts of "disability" can reveal the hermeneutic circle in which a research synthesis is involved and suggest possibilities for including alternative accounts of constructs.

Although many narrative (Petticrew & Roberts, 2006) and systematic review methods—such as metanarrative (Greenhalgh, Macfarlane, Bate, Kyriakidou, & Peacock, 2005), metastudy (Paterson, Thorne, Canam, & Jillings, 2001), and critical interpretive synthesis (CIS; Dixon-Woods et al., 2006)—can be drawn upon to critique constructs, none fully answers our call, through a systematic process, to understand how constructs and methodological elimination decisions frame the results of research syntheses. In Table 1, we clarify research synthesis as our object of critique. We characterize and differentiate research syntheses, such as meta-analysis and best-evidence synthesis, from other systematic reviews that do not include methodological elimination as an integral part of their study screening process. We also show how neither traditional nor qualitative systematic review methods include a systematic process to critically analyze constructs in existing research syntheses. Interrogating constructs in research synthesis is needed (we argue) in the field of education, where governments and educationalists privilege these syntheses to answer questions about what works in policy and practice (Donmoyer, 2012; National Research Council, 2002). To support dialogue and decision making that is more fully informed, contextualized, and critical, inquiry must also examine the social, historical, political, and/or cultural assumptions underlying constructs.

In this article we describe and demonstrate a new systematic
review method, critical construct synthesis (hereafter CCS). A CCS
explores and critiques constructs included and excluded from
research synthesis, particularly through the process of screening out studies that fail to meet methodological standards for providing best or quality evidence. It shows how reexamining research syntheses in
light of constructs identified in methodologically excluded literature
may open up possibilities for reframing educational problems.

Exclusion, Constructs, and Critique in Research Syntheses

In broad terms, the aim of any research synthesis is to summarize and evaluate research and knowledge on a topic. With the introduction of meta-analysis, research syntheses became tools for quantifying an intervention's effect by pooling estimates across studies (Glass, 1976). As a quality control measure, reviews often restrict research syntheses to studies passing exacting standards of methodological rigor. The U.S. Department of Education's What Works Clearinghouse (WWC; 2014), for example, reviews only well-designed quasi-experimental studies, single-case designs, and randomized experiments. Yet excluding studies not up to methodological par conflicts with the idea of reviewing the full knowledge base. Glass (2000), for one, has remained "staunchly committed to the idea that meta-analyses must deal with all studies, good, bad, and indifferent" (para. 33). Restricting research syntheses to studies that meet methodological standards may be an effective tool for establishing evidence-based practices, but choosing only well-designed studies does not inoculate research syntheses from the influence of constructs and the assumptions they carry.

Table 1
Types and Characteristics of Literature Reviews

For each type of literature review, the four values in parentheses indicate whether the review (1) uses a systematic process for reviewing, (2) includes all primary literature, (3) includes critical analysis of primary literature, and (4) includes critical analysis of existing research syntheses.

Traditional review: A review that provides an overview of literature on a topic. Does not use systematic review methods. (No; Yes; No; No)

Critical review: A review of literature that critically examines primary literature. Does not use systematic review methods. (No; Yes; Yes; No)

Systematic reviews:

Research synthesis, best-evidence synthesis: A systematic review that synthesizes only studies that meet predetermined methodological criteria. (Yes; No; No; No)

Research synthesis, meta-analysis: A systematic review that summarizes studies through statistical comparison of findings. (Yes; No[a]; No; No)

Research synthesis, narrative review: A systematic review that summarizes studies narratively, rather than by statistical comparison of findings. (Yes; No[a]; No; No)

Metanarrative: A systematic review that seeks to show various ways researchers have understood a heterogeneous topic area, often over time. (Yes; Yes; No; No)

Metastudy: A systematic review that aims to generate new insights into phenomena through an analysis of theory, methods, and findings of qualitative research. (Yes; No[b]; Yes; No)

Critical interpretive synthesis: A systematic review that aims to develop an interpretive model of a phenomenon from existing literature. (Yes; No[c]; Yes; No)

Critical construct synthesis: A systematic review that aims to critique the constructs in literature included in and excluded from an existing research synthesis. (Yes; Yes; Yes; Yes)

Note. Literature reviews relevant to the object of critique and methods in this article. For different and more exhaustive typologies of literature reviews, see, for example, Petticrew and Roberts (2009); Gough, Thomas, and Oliver (2012); and Kastner, Antony, Soobiah, Straus, and Tricco (2016).
[a] Some meta-analyses and narrative reviews do not exclude quantitative studies based on the quality of their designs (Cooper & Hedges, 2009). They do, however, exclude qualitative and conceptual literature.
[b] Metastudy does not exclude qualitative studies based on the quality of their designs (Paterson, Thorne, Canam, & Jillings, 2001). It does, however, exclude quantitative and conceptual literature.
[c] Critical interpretive synthesis includes quantitative and qualitative studies but excludes those deemed "fatally flawed" (Dixon-Woods et al., 2006).

Under previous positivist notions of social science inquiry, con-
structs, thought of as latent variables, were operationally defined in
order to sanitize them of tacit assumptions (Phillips & Burbules,
2000). Such treatment rested on a hard distinction between theoreti-
cal constructs, which were loaded with assumptions, and empirical
constructs, which were taken to be assumption free. But the distinc-
tion between these two types of constructs failed to hold after Kuhn
(1962) showed that scientific communities work within prevailing
frameworks (paradigms), which are central to empirical claims.
Despite advantages that may accrue from operationally defining
constructs in the process of a research synthesis, doing so does not
isolate constructs from their social, political, historical, and cultural
contexts. The identification of underlying assumptions in the
research synthesis process is not a warrant for “anything-goes” rela-
tivism or a call to increase objectivity. Rather it supports the position
that methodological decisions about what research designs to include
in a research synthesis, and constructs that inform and emanate from
those decisions, ought to be unpacked and subject to critique.

The recognition and use of critique in reviews is not new. Cooper
(1985) noted early on that the purpose of conducting a review could
be critical to show how previous conclusions were unwarranted
based on “the literature’s incommensurability with the reviewers’
theoretical stance and/or criteria for methodological validity” (p.
10). Similarly, Petticrew and Roberts (2006) described critical
reviews as aiming to critique methods and results of primary litera-
ture but without “using the formalized approach of a systematic
review” (p. 41).

A host of qualitative systematic review methods now employ
more formalized approaches to critique primary literature in the
review process. For example, the metasynthesis phase of metastudy
explores how various theoretically informed analytic options influ-
ence research findings (Paterson et al., 2001). Dixon-Woods and
colleagues (2006) also characterize CIS as not just summarizing the
literature’s data but also tracing the sociopolitical origins of
entrenched constructs. However, CCS stands out from these sys-
tematic review methods because it takes research synthesis as its
object of critique and shows how research synthesis results are sensi-
tive to constructs in the literature it includes.

Introducing CCS

Before demonstrating our use of CCS, we describe it in terms
of its philosophy of inquiry, methodology, and methods.

Philosophies of inquiry are attempts to provide consistent answers
to a constellation of questions, such as What is the notion of
inquiry? What should count as evidence in educational inquiry?
(Biesta, 2015). Methodology delineates what the inquiry entails
and provides justifications for how it will be conducted within the philosophies of inquiry in which it is embedded. Methods describes the procedures scholars follow while inquiring within the overall methodology.

Philosophy of Inquiry

We conceptualize CCS as philosophically dexterous, transferable
among different paradigms of inquiry within the critical tradi-
tion. By critique, we mean a sociopolitical analysis that highlights
how power and ideology operate to structure and stratify society, to
marginalize, oppress, and limit possibility. Below we illustrate this
dexterity with two stances: critical realism (Bhaskar, 1998), informed by Marxism, and poststructuralism (e.g., Foucault, 1980). Although
these stances diverge in many respects, they can be situated within
the critical tradition and thus supply the intellectual resources
needed for CCS.

Critical realism. Realism, including critical realism, holds that
the deep structure of reality is external to human cognition
(Gorski, 2013; House, 1991). Although reality may be multi-
layered (Bhaskar, 1998), realism values constructs that map onto
external reality (Sider, 2014). The construct “human being,” for
example, may directly map onto reality’s structure (Armstrong,
1978), and if so, any full cognition of reality would require the
inclusion of that construct to be adequate. It would not be a
proper object of criticism. If “human being” does not map onto
reality, however, then it can be the object of criticism and aban-
doned, revised, or refined to meet social needs. Constructs such
as “disability” may fit this latter category (Searle, 1997). Empiri-
cal inquiry can determine if such constructs actually function to
legitimate social oppression and so neutralize them. A dose of criticality is thus injected into realism when advocates of realism
recognize the need to unmask hegemonic constructs wrongly
assumed to map onto reality, hence, critical realism (Bhaskar,
1988, 1993).

Following Ian Hacking (2002, 2004), critical realism pos-
its a dynamic relationship between reality and dominant ide-
ology, which creates a looping effect. For instance, when
researchers classify a subset of schoolchildren as “high achiev-
ers,” this dynamic relationship is reflected when schoolchil-
dren accept, resist, or redefine this identity option. Their
response can reinforce or discourage researchers from using
the construct or adapting it. The “high achievers” construct
thus evolves due to a feedback loop between labelers and
labeled. Looping makes vocabulary in education different
from vocabulary in natural science, wherein no such interac-
tion between labeling and labeler exists (e.g., rocks do not
contest or admire geologists’ labels of them). A critical real-
ist–informed CCS might evaluate the impact of looping
effects on research syntheses to provide emancipatory expla-
nations of findings. For example, one might wonder if the
suspect construct of “disability” harmfully delimited the pos-
sible findings of a research synthesis.


Poststructuralism. In contrast to critical realism, poststructuralism holds that reality is entirely ideological. Poststructuralism sees the boundaries between prevailing ideologies and reality as blurred (Peters & Burbules, 2000); one can never step outside of ideology to "objectively" define reality according to neutral facts. No external or objective structure (like the
looping effect) exists to distinguish between what is real and
what is defined by a construct. A poststructuralist analysis, for
example, would critique the “looping effect” as producing a
distinction between “labeler” and “labeled,” rather than reflect-
ing a “real” process.

Drawing on Foucault’s (1980) description of power and
knowledge (power/knowledge), a poststructuralist analysis for-
mulates ideology as laden with power. The role of power in lever-
aging some ideologies above others (e.g., neoliberalism) takes
many forms and is often masked in liberal democracies (Giroux,
2004). But the incessant flux and productive work of power
means no ideology ultimately prevails over the others or even
remains stable over the long haul (Foucault, 1980). Ideological
disagreements are not resolvable by evidence but through power.
A poststructuralist-informed CCS might seek to disrupt prevail-
ing ideologies in research syntheses by showing that the con-
structs studied are not inevitable, natural, or immutable but
instead contestable as they manifest through fallible social and
historical discourse.


Methodology

CCS draws on two types of well-established methodologies in
qualitative research. The first methodology is the qualitative sys-
tematic review (Hannes & Macaitis, 2012), most closely resem-
bling the processes of a CIS (Dixon-Woods et al., 2006). CIS
involves an iterative and reflexive approach to synthesizing a
body of literature on a topic (Dixon-Woods et al., 2006). Our
CCS, although similar in approach, differs from a CIS in its aim.
The aim of CCS is not to generate an interpretive synthesizing
argument across the literature as is done in a CIS but to under-
stand how the results of research syntheses are sensitive to the
constructs included and excluded from review.

The second methodology is any “construct analysis” approach
to analyze latent meaning in text. For example, constructs might
be analyzed using techniques from ethnographic content analy-
sis (Altheide & Schneider, 2013), semiotic analysis (e.g., Barthes,
1964), and/or (critical) discourse analysis (e.g., Gee, 2010).
These analytic techniques assist researchers in interrogating con-
structs and their assumptions in research syntheses and stem
from the broad swath of the critical tradition.


Methods

CCS involves multiple steps: (1) Identify a research synthesis of
interest. (2) Focus on a bundle of constructs encompassed in the
topic of interest (e.g., kinds of people, outcomes, interventions,
or contexts). (3) Replicate the synthesis’ literature search strat-
egy. (4) Screen gathered articles to eliminate works not on topic.
The first four steps largely replicate the original synthesis,
whereas the next steps are unique to CCS: (5) Divide the remainder into distinct sets: literature included in and excluded from the original synthesis. Importantly, the excluded set should
from the original synthesis. Importantly, the excluded set should
include conceptual and empirical research originally deemed not
up to par. (6) Develop coding sheets that track key characteris-
tics of construct formulation (e.g., questions posed, key terms
defined, answers given). (7) Code both sets of literature, included
and excluded. (8) Resolve any coding discrepancies. (9) Proceed
to analyze the constructs of interest in both sets of articles.
Comparison of the two sets can identify what assumptions
underlying the constructs informed the findings of the original
research synthesis and how findings were sensitive to them.
These steps define the method, but it need not be a linear process. Researchers may cycle back and forth between the steps as needed.

Demonstrating CCS

Below we demonstrate how we proceeded through eight steps of
CCS to explore the constructs “work” and “autism” in literature
on employment for youth with autism (Wolgemuth et al., 2016)
from a research synthesis on postschool transition for youth with
disabilities (Cobb et al., 2013).

Step 1: Identify a Research Synthesis of Interest

The primary aim of CCS is to critically explore constructs in an
existing research synthesis. Therefore, in the first step, the
research team selects a research synthesis of interest. Reasons for
selecting a particular synthesis will vary, but it is important that
the search strategy used in the original research synthesis be pub-
lished or otherwise known so that it can be replicated.

We chose to consider a research synthesis based on the first
author’s participation in a U.S. Department of Education–
contracted systematic review of literature on postsecondary out-
comes for youth with disabilities (Cobb et al., 2013) that used
WWC guidelines (with some modifications) to screen primary
literature. The aim of the research synthesis was to identify effec-
tive programs and strategies that support students with disabilities
in the United States to transition from high school. Reflecting on
the synthesis and its processes, the first author worried that of the
738 studies passing the abstract screening process, only 16 met
WWC standards (with reservations). She wondered about assump-
tions the constructs “disability,” “employment,” “independent liv-
ing,” and “postsecondary education” advanced in the final report
and what understandings might have been enabled in a more
inclusive review. To explore these questions, she brought together
a team of researchers to study a subset of literature in the research
synthesis: literature on autism and employment.

Step 2: Focus on Constructs That Encompass a Topic of Interest

In Step 2, the review team determines the constructs it wants to
explore, including its rationale for doing so. The first author
noted that three of the 16 studies that met standards for inclu-
sion in the research synthesis were about work for people with
disabilities, but none was specifically about “work” for people

ApRIl 2017 135

with “autism,” although two included participants with autism
diagnoses. She wondered what possible understandings of work
and autism the overall report and its literature base privileged
and excluded.

For our CCS, we focused on the constructs of “work” (con-
ceptualized as both an intervention and an outcome construct in
the research synthesis) and “autism.” We chose to make these
constructs focal given a global increase in the prevalence of
autism diagnoses (Elsabbagh et al., 2012) and reported lower
employment and compensation of people with autism in the
United States as compared to other disability categories
(Wehman et al., 2014). We argued that although prior literature had examined the construction of autism in fiction (e.g., Hacking, 2009), its construction in the academic literature had not been investigated.

Step 3: Replicate the Literature Search Strategy

In Step 3, the review team identifies the original search strategy;
determines, based on the constructs it wishes to explore, whether
terms need to be added and/or eliminated; and replicates the
search using the original databases.

The research synthesis reported the key terms and Boolean operators used to search the literature: disability (e.g., autis* OR Asperger*) AND population (e.g., adolescent OR youth) AND outcome (e.g., work OR job OR employment) AND program (e.g., work experience OR supported employment) AND research design (e.g., RCT OR quasi-experimental). Because the original synthesis sought to synthesize literature for three postschool outcomes
(employment, higher education, and independent living) for all
youth with disabilities, we replicated the search for our CCS
using only terms that would yield literature on work for youth
with autism. We eliminated key terms in disability and outcome
and program that were not associated with autism and employ-
ment to generate an initial set of primary literature germane to
our review. That is, we removed disability key terms, such as
intellectual disability; program terms, such as independent living;
and outcome terms, such as higher education.

Although the original synthesis had a broader scope with
regard to disability, intervention, and outcome, the methodolo-
gies it sought were only those with potential to meet WWC evi-
dence standards (e.g., single-case designs, quasi-experimental
designs, and randomized controlled trials). We therefore added
key terms in research design, whose original set included only
terms associated with quantitative studies, to capture conceptual
and qualitative work (e.g., qualitative, commentary). We ran our
literature search using the same databases (e.g., ERIC, PsycINFO,
Medline) as the original review, which yielded 13,076 sources.
Also duplicating the original review, we limited our search to
only research published in peer-reviewed journals. We recognize
this as a source of publication bias that is a limitation of the
original review and a delimitation to the scope of our critique.
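As a rough illustration, a search string of the kind described above can be assembled programmatically. The term lists below are abbreviated examples taken from the description in the text, not the original review's complete search sets.

```python
# Sketch: assembling a Boolean search string of the form described above.
# Term lists are illustrative excerpts, not the review's complete sets.
disability_terms = ["autis*", "Asperger*"]
population_terms = ["adolescent", "youth"]
outcome_terms = ["work", "job", "employment"]
program_terms = ['"work experience"', '"supported employment"']
# Design terms broadened to capture conceptual and qualitative work.
design_terms = ["RCT", "quasi-experimental", "qualitative", "commentary"]

def or_group(terms):
    # Join a list of terms into a parenthesized OR group.
    return "(" + " OR ".join(terms) + ")"

query = " AND ".join(or_group(t) for t in [
    disability_terms, population_terms, outcome_terms,
    program_terms, design_terms,
])
print(query)
# (autis* OR Asperger*) AND (adolescent OR youth) AND ...
```

Replicating a search then reduces to swapping term groups in or out (e.g., dropping independent living from the program group) while keeping the overall AND-of-OR-groups structure fixed.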

Step 4: Screen Gathered Articles to Eliminate Works Off Topic

In Step 4, the review team screens the articles identified in the
search strategy following the original review parameters and, if

applicable, narrowing those parameters based on the constructs
the team chose to explore.

We began screening the 13,076 articles by first identifying
only those that contained the words autism or Autism Spectrum
Disorder or ASD or Asperger’s Syndrome and work or employment
or vocation or job or career either in their titles or abstracts. We
chose to conduct this initial screen in order to more efficiently
eliminate articles that were not of interest to our review and to
yield a more manageable set of article abstracts to screen for a
team of four researchers. This process left us with 2,738 articles.
Consistent with the original review, we then screened the
abstracts to identify articles that discussed work training or
employment that did or could occur during high school. Also
consistent, we defined (a) work as employment, vocation, or par-
ticipation in paid or unpaid labor; (b) transition programs as
including career training, career therapy, or counseling; and (c)
youth as people between the ages of 13 and 22. We included lit-
erature about adults when it offered retrospective examinations
of their work experiences as youth. As is typical in this stage of
the review process, we erred on the side of inclusion. The abstract
screen process resulted in 252 sources for which we conducted a full-text screen. The full-text screen yielded 62 articles for extraction.

Step 5: Develop an Extraction Pro Forma

In Step 5, the review team develops a pro forma to extract key
information from the articles and capture all statements in the
articles about the constructs of interest. Dixon-Woods and col-
leagues (2006) reported that the pro forma they developed for
their CIS was ultimately impractical to use, especially for large
documents, and wondered at the utility of formal data extraction
for interpretive syntheses. Given the focused nature of CCS on
specific constructs and the inclusion of only published articles in
the original review, we felt a pro forma would help us identify
features of articles and specific sections for later analysis.

Our pro forma, summarized in Table 2, included summary
information about the article, including its purpose, methods,
population studied, findings, and conclusions. It also included
a section about methodology to enable us to analyze potential
connections between methodology and constructions of autism
and work. The remainder of the pro forma focused on the con-
structs of interest (autism, work, and the worker with autism).
We used the pro forma to record positive and negative state-
ments/definitions of autism, work, and the worker with autism.
By positive and negative, we did not mean “good” and “bad” but
meant statements about what autism/work is (positive) and what autism/work is not (negative). We also included spaces for notes on our initial impressions of the constructs and other thoughts or concerns.
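As a sketch, the pro forma can be thought of as one structured record per article. The field names below paraphrase the description here (and Table 2) and are illustrative, not the authors' actual instrument.

```python
# Illustrative record structure for an extraction pro forma of this kind.
# Field names paraphrase the pro forma summarized in Table 2; they are
# a hypothetical rendering, not the original instrument.
from dataclasses import dataclass, field

@dataclass
class ProForma:
    study_code: str
    citation: str
    methodology: str = ""   # e.g., postpositivist, interpretivist, pragmatist
    autism_positive: list = field(default_factory=list)  # what autism "is"
    autism_negative: list = field(default_factory=list)  # what autism "is not"
    work_positive: list = field(default_factory=list)    # what work "is"
    work_negative: list = field(default_factory=list)    # what work "is not"
    notes: str = ""         # initial impressions, other thoughts or concerns

record = ProForma(study_code="S01", citation="Author (Year)")
record.autism_positive.append(
    "Students on the autism spectrum are good visual learners.")
```

Structuring extractions this way makes it easy for pairs of coders to fill records independently and then diff them when reconciling discrepancies.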

Step 6: Use the Pro Forma to Extract Information

In Step 6, the research team uses the pro forma to extract rele-
vant information about the articles and statements about the
constructs of interest. This process will likely be iterative with
Step 5; that is, several team members can use the first draft of the
pro forma to extract one study, come together and discuss their


findings, identify commonalities, and potentially revise the pro
forma to better suit their research aims. Step 6 is also likely to
result in further elimination of articles based on the inclusion/
exclusion criteria in Step 4.

We used our pro forma to extract information from the 62
articles that passed the full-text screen. We extracted the first
eight articles as a team and met twice to share our extractions,
reconcile differences, and discuss the pro forma. These initial
extractions resulted in some revisions to the pro forma (e.g., we
clarified what we meant by positive and negative). Once we felt
confident in our process, we extracted the remaining 54 articles
in pairs. The pairs conducted their extractions individually and
then met to reconcile differences and create a final pro forma.
During this time, we met biweekly as a team to discuss our prog-
ress and any questions or concerns. This process also resulted in
the elimination of an additional 45 articles that did not meet the
aims of the original review (e.g., employment studies conducted
in postsecondary, instead of secondary, settings). A final set of 17
primary sources was included in our CCS.

Step 7: Analyze the Pro Forma and Full Texts

Up to this point, we have described CCS as more or less system-
atic, adopting the steps and processes common to many system-
atic reviews. In Step 7, the research team conducts the analysis of
the constructs and their underlying assumptions, using both the
pro forma and full texts. This includes identifying and compar-
ing articles that were and were not included in the original syn-
thesis. As noted above, this step is as variable as the approaches that exist for analyzing text and its latent meanings, and the
analysis technique selected will reflect the overarching philoso-
phy (e.g., critical realist, poststructural) and the aim of the CCS.
Theoretical literature in the field (in our case, disability studies)
is engaged to think through the analysis and interpret findings.
Reflexivity takes on a heightened importance during this step as
the team reflects on its own assumptions about the constructs
and the CCS process.

Our analysis involved a close reading and textual analysis of
the articles and extractions, seeking to understand how they
depicted autism, work, and the worker with autism. The philoso-
phy undergirding our CCS was broadly critical, and therefore we
used several of Gee's (2010) discourse analysis tools, including the significance-building tool and the identity-building tool, to critique what the articles built up or lessened as significant and the identities they made possible and prevented. Working with
these tools, we categorized autism and work in a matrix with
two spectra: from simple to complex and from asset to deficit.
We developed this strategy based on Gee’s (2000) view that dis-
cursive “identities can be placed on a continuum in terms of
how active or passive one is in ‘recruiting’ them” (p. 104). In
our analysis, a simple construction ignored intersections of
identities (e.g., autism, class, race) in a unidimensional and
easy-to-follow narrative of autism and work. A complex con-
struction included multiple dimensions and intersections of
autism and work—depicting them in a more tentative, detailed,
and/or multiperspectival narrative. Deficit constructions
depicted autism or work in negative terms (e.g., students [situ-
ated] on the autism spectrum are lacking in social skills),
whereas asset constructions depicted autism or work more posi-
tively (e.g., students [situated] on the autism spectrum are good
visual learners).
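The two-spectra matrix can be pictured as placing each coded construction into a cell defined by its position on the simple–complex and deficit–asset axes. A toy sketch, with hypothetical codings:

```python
# Toy sketch of the two-axis coding matrix (simple-complex, deficit-asset).
# The codings below are hypothetical examples, not our actual study codes.
from collections import defaultdict

codings = {
    "article_01": ("simple", "deficit"),   # e.g., autism as a skills deficit
    "article_02": ("simple", "asset"),     # e.g., work as straightforwardly good
    "article_03": ("complex", "asset"),    # e.g., intersectional, positive account
}

# Group articles by matrix cell to look for a common story per cell.
matrix = defaultdict(list)
for article, cell in codings.items():
    matrix[cell].append(article)
```

Grouping by cell in this way is what enables the next analytic move: asking whether the articles sharing a cell build a common story (e.g., the intervention story) and, if so, how.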

Next, to understand what the constructions of autism and
work said about the worker with autism, we grouped articles that
seemed to put forward similar constructions of autism and work
(e.g., simple deficit accounts of autism, simple asset accounts of
work) and asked if there was a common story being built and, if
so, how. This process resulted in two major stories and variants. The first, the intervention story, identified autism as a problem for which people on the autism spectrum needed treatment to render them useful to society as workers. Work in the intervention
story was usually presented as a set of discrete skills or tasks.
Complex stories, in contrast, invited positive accounts of autism
and broader notions of work that problematized the intervention story. Connecting research methodologies to constructs, we noted that only two of the 17 primary sources in our CCS met standards for inclusion in the original research synthesis based on their designs (quasi-experiment and single subject). Both were coded as intervention stories, which discussed the impact of interventions on observable and/or measurable outcomes and drew on behaviorist principles to conduct the interventions.

Table 2
Summary of Worker-With-Autism CCS Pro Forma

Summary information: Study code, citation, empirical or nonempirical, peer-reviewed or practitioner journal, author's field/discipline/occupation, population, major construct/theory investigated, genre/design of study, purpose of the work/study, nature of sample/group under discussion (total number, age range, sex, ethnicity, education, other characteristics), description of the research approach/intervention (setting and work, design and procedures), major findings, conclusion, discussion, implications, suggestions for future research

Methodology: Methodologists, theory of method (e.g., postpositivist, interpretivist, pragmatist), role/voice of researcher, role/voice of participant

Construction of autism: Positive statements/definitions of autism, negative statements/definitions of autism, comments about the way autism is constructed

Construction of work: Positive statements/definitions of work, negative statements/definitions of work, comments about the way work is constructed

Construction of the worker with autism: Positive statements/definitions of the role of work for people with autism at work, negative statements/definitions of the role of work for people with autism at work, comments about the role of work for the person with autism

Other thoughts or concerns: Concerns about the method or writing, other thoughts

Note. CCS = critical construct synthesis. Full pro forma available at

We read the articles and conducted the extractions individually and in pairs, and the process of coming to and interpreting the two stories occurred during biweekly team meetings over the course of several months. During this time, we read widely about the social construction of autism, critiques of neoliberalism, and discourses of social science. On the basis of our readings, and consistent with our critical orientation, we interpreted the constructions of autism and work, and the two stories, in terms of broader sociopolitical (e.g., neoliberal, disability) and academic (e.g., postpositivist) discourses. In particular, we relied on critical disability literature that emphasized the social construction of disability alongside neoliberal accounts of work, literature that critiques describing people with disabilities in terms of their productivity (e.g., McKenzie, 2013).

Reflexivity. Throughout the abstract screen, extraction, and analysis processes and into the writing phase, we discussed and reflected on our individual and collective assumptions about work and autism. This reflection informed the aim of our inquiry, how we approached our analysis, and the terms we chose to use in our write-up. We became aware of the ways our writing and talk constructed particular versions of autism and work. Through reflective conversations, we became clear about the constructions we wanted to privilege: critical, discursive ones that would challenge essentialist, humanist, and neoliberal accounts of disability and work. For example, we began our conversations using “people-first” language (i.e., students with autism) to place the person ahead of the disability. However, we became concerned this humanist language was not aligned with our readings of the disability studies literature that theorizes autism as a social construction produced in systems of power. We wanted our conversations and write-up to suggest that autism is a constructed condition that shapes presumptions and identities about who one is/can become. Therefore, we adopted the phrase (situated) on the autism spectrum to emphasize the constructive power of language, relationships, and labels, such as autism.

Step 8: Write Up the CCS Report

The final step in CCS is the report write-up. Here researchers decide on the structure, style, and voice of their report. Should the manuscript be written more traditionally or experimentally? Should it include or seek to downplay the research team’s perspectives and experiences conducting the review? Should it be written in first person or third? As in most research dissemination deliberations, these decisions will be made in light of the anticipated venue (e.g., brief report, academic journal) and audience (e.g., policymakers, other researchers).

Our CCS write-up was first a conference paper presented at the American Educational Research Association. Like many synthesis write-ups, it was lengthy, including three tables describing the features of all 17 primary sources and an appended pro forma. Because we hoped to reach those who study autism and might be open, if not sympathetic, to a critical approach, we targeted a prominent disability studies journal for publication. This meant our 60-page manuscript had to be cut in half. We deleted most of the tables and the pro forma and made them available as supplemental material; we also cut descriptions of our methodology and worked elsewhere to streamline our introduction, findings, and conclusions. The result was that the final manuscript we submitted looked less like a traditional systematic review write-up, which often involves in-depth descriptions of methods and included primary literature (Petticrew & Roberts, 2006). Still, we chose to write up our CCS fairly conventionally, including introduction, methods, results, and discussion sections, to facilitate navigation. Whether this is the best way to approach the CCS write-up is something we continue to wonder about, especially in light of the connections CCS can illuminate between the norms of academic writing and the kinds of understandings write-ups produce. In the conclusion section of our CCS write-up, we recommended academics experiment with writing in ways that engender less restrictive and more positive accounts of work for youth situated on the autism spectrum. We suggested CCS reviewers might also play with form and content in their write-ups to attend to the constructions of constructs they enable.


Conclusion

Critically examining constructs in scholarly literature is important for understanding underlying assumptions about what counts as good education and for whom. Empirical evidence derived from experimental research can provide information about “what works” but always underdetermines the answer in the final analysis. Practitioners, policymakers, and scholars need more than this information to make sense of what works in education (Donmoyer, 2012). We argue a broader understanding is needed, an understanding that is aware of the productive power of research, the hermeneutic circles in which research is produced, and the possibilities of reframing educational problems and their solutions. We introduce CCS as a methodology for unpacking constructs in research syntheses. We do so to promote a “better” research synthesis, one that does not take constructs at face value and takes seriously the ways in which review methodologies (inclusion and exclusion decisions) construct and privilege some accounts over others.

Inclusionary and exclusionary methodological decisions are grounded in disciplinary and sociopolitical assumptions about what should count as “valid” research. Findings produced by any research synthesis are constituted within these assumptions, and exploring excluded evidence reveals the implications of those decisions, showing what might be found and known under a different set of assumptions about what evidence counts. CCS reveals the implications of these methodological elimination decisions by comparing assumptions about constructs in included and excluded literature. It asks: What methodologies entail particular understandings to the exclusion of others? What might be thought, concluded, or recommended differently? How might problems and solutions be reframed?

In the example CCS, we noted that only two of our 17 primary sources were eligible for inclusion in the original research synthesis based on their designs (quasi-experiment and single case). These were both coded in our CCS as intervention stories that explored the impact of interventions on observable or measurable outcomes and drew on behaviorism to design and conduct the interventions (Wolgemuth et al., 2016). The intervention stories depicted autism in deficit terms and work narrowly as a set of tasks for hourly pay. Absent from the research synthesis, and found in our CCS, were primary sources that discussed autism as a form of neurodiversity or in strengths-based terms. Also absent from the research synthesis and found in our CCS were primary sources that discussed work as an individual right, a career, activism, or unpaid labor. We worry about the limited and rather bleak understandings of people with disabilities and their life possibilities enabled by a research synthesis of studies that met criteria for inclusion. We worry, alongside others (cf. Van Cleave, 2012) critical of “scientifically based research,” that “narrow definitions of research or science [in research syntheses] might trivialize rather than enrich our understanding of education policy and practice” (Feuer, Towne, & Shavelson, 2002, p. 4).

The aim of CCS is not to demonstrate that research syntheses should not be conducted or that all systematic reviews should be as inclusive as possible. Instead, the aim is to empirically trace and reveal the limitations of exclusionary decisions in order to inform a more tentative, critical, and contextualized understanding of the terms and conclusions produced in research syntheses. Constructs are indispensable to scholarly inquiry, but using them without understanding both their history and the work they do may produce unnecessarily limited understandings on which to base policy and practice decisions. Through CCS, we can better understand connections between research designs, the constructs we use, and their potential effects, with the aim of revealing how research synthesis might not yield the best ethics: optimistic accounts of people that reframe their “problems” and open up possibilities for their lives.

Notes

1. Following Cooper and Hedges (2009), we use the term research synthesis to refer to systematic reviews that “attempt to integrate empirical [quantitative] research for the purpose of creating generalizations” (p. 6). We describe critical construct synthesis as a systematic review method particularly well suited to interrogate constructs in research syntheses that exclude primary literature based on methodological criteria.


References

Altheide, D., & Schneider, C. J. (2013). Qualitative media analysis (2nd ed.). Newbury Park, CA: Sage.

Armstrong, D. M. (1978). Universals and scientific realism: Vol. 1. Nominalism and realism. Cambridge, UK: Cambridge University Press.

Barthes, R. (1964). Elements of semiology. New York, NY: Hill & Wang.

Bhaskar, R. (1993). Dialectic: The pulse of freedom. New York, NY: Verso.

Bhaskar, R. (1998). The possibility of naturalism: A philosophical critique of the contemporary human sciences (3rd ed.). New York, NY: Routledge.

Biesta, G. (2015). On the two cultures of educational research, and how
we might move ahead: Reconsidering the ontology, axiology and

praxeology of education. European Educational Research Journal,
14(1), 11–22.

Cobb, R. B., Lipscomb, S., Wolgemuth, J. R., Schulte, T., Veliquette,
A., Alwell, M., . . . Weinberg, A. (2013). Improving postsecond-
ary outcomes for transition-age students with disabilities: An evidence
review (NCEE 2013-4011). Washington, DC: National Center
for Education Evaluation and Regional Assistance, Institute of
Education Sciences, U.S. Department of Education.

Cooper, H. M. (1985). A taxonomy of literature reviews. Paper presented
at the Annual Meeting of the American Educational Research
Association, Chicago, IL. (ERIC Document Reproduction Service
No. ED254541)

Cooper, H., & Hedges, L.V. (2009). Research synthesis as a scientific
process. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The
handbook of research synthesis and meta-analysis (pp. 3–16). New
York, NY: Russell Sage Foundation.

Dixon-Woods, M., Cavers, D., Agarwal, S., Annandale, E., Arthur, A.,
Harvey, J., . . . Sutton, A. J. (2006). Conducting a critical interpre-
tive synthesis of the literature on access to healthcare by vulnerable
groups. BMC Medical Research Methodology, 6, 35.

Donmoyer, R. (2012). Can qualitative researchers answer policymak-
ers’ what-works question? Qualitative Inquiry, 18, 662–673.

Eisenhart, M., & DeHaan, R. L. (2005). Doctoral preparation of scientifi-
cally based education researchers. Educational Researcher, 34(4), 3–13.

Elsabbagh, M., Divan, G., Koh, Y., Kim, Y., Kauchali, S., Marcin, C., . . . Fombonne, E. (2012). Global prevalence of autism and other pervasive developmental disorders. Autism Research, 5(3), 160–179.
Feuer, M. J., Towne, L., & Shavelson, R. J. (2002). Scientific culture
and educational research. Educational Researcher, 31(8), 4–14.

Foucault, M. (1980). Power/knowledge: Selected interviews and other
writings (1972–77) (C. Gordon, Ed.). Essex, UK: Harvester.

Gee, J. P. (2000). Identity as an analytic lens for research in education. Review of Research in Education, 25(1), 99–125.

Gee, J. P. (2010). How to do discourse analysis: A toolkit. New York, NY: Routledge.

Giroux, H. A. (2004). The terror of neoliberalism: Authoritarianism and
the eclipse of democracy. Boulder, CO: Paradigm.

Glass, G. V. (1976). Primary, secondary, and meta-analysis of research.
Educational Researcher, 5(10), 3–8.

Glass, G. V. (2000). Meta-analysis at 25. Retrieved from http://www

Gorski, P.S. (2013). What is critical realism? And why should you care?
Contemporary Sociology, 42(5), 658–670.

Gough, D., Thomas, J., & Oliver, S. (2012). Clarifying differences
between review designs and methods. Systematic Reviews, 1, 28.
Retrieved from

Greenhalgh, T., Macfarlane, R. G., Bate, P., Kyriakidou, O., & Peacock, R. (2004). Storylines of research in diffusion of innovation: A meta-narrative approach to systematic review. Social Science & Medicine, 61, 417–430.

Hacking, I. (2002). Historical ontology. Cambridge, MA: Harvard
University Press.

Hacking, I. (2004). Between Michel Foucault and Erving Goffman:
Between discourse in the abstract and face-to-face interaction.
Economy & Society, 33(3), 277–302.

Hacking, I. (2009). How we have been learning to talk about autism: A
role for stories. Metaphilosophy, 40, 499–516.

Hannes, K., & Macaitis, K. (2012). A move to more systematic
and transparent approaches in qualitative evidence synthesis:
Update on a review of published papers. Qualitative Research,
12, 402–442.


House, E. R. (1991). Realism in education. Educational Researcher, 20.

Kastner, M., Antony, J., Soobiah, C., Straus, S. E., & Tricco, A. C. (2016). Conceptual recommendations for selecting the most appropriate knowledge synthesis method to answer research questions related to complex evidence. Journal of Clinical Epidemiology, 73, 43–49.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL: University of Chicago Press.

Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts. Princeton, NJ: Princeton University Press. (Original work published 1979)

McKenzie, J. A. (2013). Models of intellectual disability: Towards
a perspective of (poss)ability. Journal of Intellectual Disability
Research, 57(4), 370–379.

National Research Council. (2002). Scientific research in education
(R. J. Shavelson & L. Towne, Eds.). Washington, DC: National
Academy Press.

Paterson, B. L., Thorne, S. E., Canam, C., & Jillings, C. (2001). Meta-
study of qualitative health research. Thousand Oaks, CA: Sage.

Peters, M. A., & Burbules, N. C. (2000). Poststructuralism and educational research. Lanham, MD: Rowman & Littlefield.

Petticrew, M., & Roberts, H. (2006). Systematic reviews in the social
sciences: A practical guide. Malden, MA: Blackwell.

Phillips, D. C., & Burbules, N. C. (2000). Postpositivism and educational research. Lanham, MD: Rowman & Littlefield.

Searle, J. R. (1997). The construction of social reality. New York, NY:
Free Press.

Sider, T. (2014). Writing the book of the world. New York, NY: Oxford
University Press.

Skrtic, T. M. (1991). Behind special education: A critical analysis of pro-
fessional culture and school organization. Denver, CO: Love.

Slavin, R. E. (1986). Best-evidence synthesis: An alternative to meta-analytic and traditional reviews. Educational Researcher, 15(9), 5–11.
Van Cleave, J. (2012). Scientifically based research in education as a regime of truth: An analysis using Foucault’s genealogy and governmentality (Unpublished doctoral dissertation). University of Georgia, Athens. Available from ProQuest Digital Dissertations and Theses database. (Accession No. gua4088456)

Watt, J. H., & Van Den Berg, S. (2002). Elements of scientific theories:
Concepts and definitions. Research methods for communication sci-
ence. Boston, MA: Allyn and Bacon.

Wehman, P., Schall, C., Carr, S., Targett, P., West, M., & Cifu, G. (2014). Transition from school to adulthood for youth with autism spectrum disorder: What we know and what we need to know. Journal of Disability Policy Studies, 25(1), 30–40.

What Works Clearinghouse. (2014). What Works Clearinghouse: Procedures and standards handbook (Version 3.0). Washington, DC: Institute of Education Sciences, U.S. Department of Education.

Wolgemuth, J. R., Agosto, V., Lam, Y. H., Riley, M., Hicks, T. A., &
Jones, R. (2016). Storying transition-to-work for/and youth on the
autism spectrum in the United States: A critical construct synthesis
of academic literature. Disability & Society, 31(6), 777–797.


JENNIFER R. WOLGEMUTH, PhD, is an assistant professor at the University of South Florida, 4202 E. Fowler Ave., Tampa, FL 33620; [email protected]. Her research focuses on the ethics and validity of social science research, with attention to its (unintended) impacts on participants, researchers, and research audiences.

TYLER HICKS, PhD, is a postdoctoral researcher at the University of Kansas, 1122 West Campus Rd., Lawrence, KS 66045; [email protected]. His research focuses on issues in critical realism, Bayesian methodology, and inclusive school reform.

VONZELL AGOSTO, PhD, is an associate professor of curriculum studies at the University of South Florida, 4202 E. Fowler Ave., Tampa, FL 33620; [email protected]. Her research agenda focuses on curriculum leadership as anti-oppressive education, with an emphasis on race, gender, and dis/ability.

Manuscript submitted June 28, 2016
Revisions received January 16, 2017, and March 3, 2017

Accepted March 4, 2017
