LAKOMSKI, Gabriele (1991) Beyond Paradigms: Coherentism and Holism in Research

INTRODUCTION

GABRIELE LAKOMSKI

The question of how best to research problems in the field of education continues to occupy a central place in the work of educational writers. When the use of qualitative versus quantitative methods in educational research is discussed, the debate commonly turns on the issue of whether methods for studies in education should be borrowed from the physical or the interpretive social sciences. In the first case, researchers argue that empirical educational research requires quantitative-statistical methods to guarantee the objectivity, i.e., scientific nature, of results. On the other hand, proponents of qualitative methods point out that exclusive attention to observable, social-educational phenomena is too restrictive since it does not capture such inner phenomena as people's intentions, reasons, and their tacit knowledge, which provide the "real" basis for social action. Hence, qualitative researchers, while not necessarily rejecting the view that educational research is also explanatory and predictive, emphasize that their goal is an alternative one which seeks to make explicit hidden meanings. From their perspective, educational phenomena, studied in the scientific, i.e., "positivist" mode, are at best distorted and at worst trivialized since educational issues with their "inner" dimension have merely been described in their "surface" features.

While the qualitative-quantitative debate overtly emphasizes research methods, it also presumes the existence of two different paradigms which are believed to be underwritten by different epistemologies. The scientific paradigm is said to be that of positivism, an umbrella term for a number of foundational theories of knowledge. Its alternative, the interpretive, humanistic, or hermeneutic paradigm, is premised on the notion of verstehen. Although there appears to be general agreement on the existence and validity of both paradigms, opinions differ on how to appraise such co-existence. Either they are complementary, on the grounds that it takes more than one methodology - with its attendant epistemology - to answer relevant questions in education, or they are oppositional. In the first view, increased attention to the practical procedures of both forms of research is urged, and fundamental, paradigmatic, and epistemological differences tend to be de-emphasized. According to the second view, however, these calls for the blending of methods obfuscate the basic incompatibility of both paradigms. In this case, the possibility of reconciliation or synthesis is denied, and the superiority of one or the other paradigm is advocated: they remain oppositional.

Recently, there has emerged in the literature a third alternative on how to appraise the co-existence of these two paradigms, the unity thesis, which denies epistemological diversity and argues that the paradigms view sanctioning it is false. Assuming a non-foundational, coherentist theory of knowledge, the unity thesis argues that since foundational theories of knowledge outrun their own resources to justify what they claim are secure foundations, the epistemological foundations presumed by the quantitative and the qualitative paradigms are consequently also unjustified. Furthermore, the very conception of Kuhnian paradigms which underwrites the distinction is shown to be incoherent, a result widely accepted in philosophy of science. The unity thesis rejects the view that knowledge can be partitioned, believing it instead to be all of a piece. It consequently disagrees with the paradigms view that different research methods can be grouped under incommensurable paradigms.

The most important and far-reaching consequence of the unity thesis is that it can provide an answer to what is a fundamental problem for the other two views. If one assumes epistemological diversity, as is the case in both forms of the diversity thesis, then it is impossible rationally to integrate, render coherent, or even compare the findings produced in either research tradition. In other words, growth in knowledge cannot take place in the field of education. The enormous advantage of the unity thesis is that - subject to its view of knowledge as non-foundational - it can offer criteria and standards of justification for judging the respective merits of both traditions. They can in fact be brought into a productive relation with one another by comparing them according to coherence criteria which are routinely employed when judging scientific theories. Knowledge in education grows by working out how much is shared between theories - the development of touchstone - and how much is not, and by making the differences as clear as possible so that the nature of disagreements can be determined, and possibly ruled out. In addition, for the unity thesis to work, it is not required that researchers give up their commitments to whatever paradigm they adhere to, although some might come to do just that when the checking of theories is concluded. It is initially sufficient for them to be willing to engage in defending their accounts by using such commonly accepted constraints on good theorizing as coherence, explanatory power, comprehensiveness, and so on.

It is suggested that the unity thesis is the best available explanation of the quantitative-qualitative distinction in educational research. The solution it offers to the debate is novel, epistemologically sound, and practically superior since - in emphasizing coherence and holism - it promotes the growth of knowledge in the field and allows for methodological diversity.

Since Walker and Evers first presented their unity thesis in its most systematic and comprehensive form in Keeves' Educational Research Methodology, Measurement and Evaluation: An International Handbook, its growing importance and acceptance are increasingly documented in educational research. It was given prominence in a recent symposium between Husen and Keeves which appeared in the Spring 1988 issue of Interchange under the title of "A Symposium on Educational Research: Unity or Complementarity?" In addition, Keeves (1988, p. xvii), as editor of the International Handbook, argues, referring to the ideas presented in the Walker-Evers thesis, that a guiding theme of the volume is "that there is a unity in the field that arises both from the epistemological bases of inquiry into educational problems ... as well as a coherence that arises from recent changes in social theory." Furthermore, there are available systematic applications of non-foundational epistemic justification and coherence criteria in a number of fields. To mention just some more recent examples: Evers and Lakomski's (1991) Knowing Educational Administration is the first full-blown coherentist account in educational administration; Thagard's (1988) Computational Philosophy of Science develops a coherentist justification for the choice of scientific theories; and Goldman's (1988) book Moral Knowledge applies coherentist criteria to ethics.

The chapters collected in this Special Issue are a further contribution to this important theoretical development. They specifically explore applications and implications of the unity thesis in discrete areas of educational research. This is not to say that there is equal acceptance of the unity thesis by all contributors. But, given that they are united in their opposition to the relativist stance which currently characterizes much of educational research, there is general acceptance of the importance of scientific realism, naturalism, and coherentist justification. Together, these philosophical strands serve to eschew dualisms and thus further the creation of unity in the study and ultimate solution of educational problems.


CHAPTER 1

COHERENCE AND REDUCTION: IMPLICATIONS FOR EDUCATIONAL INQUIRY

JAMES C. WALKER

Faculty of Education, University of Canberra, Australia

Abstract

This chapter outlines the consequences for educational inquiry of the view that human knowledge is a "seamless web", and that one major path to progress lies in the search for coherence between theories and traditions of inquiry. An exciting example comes from recent work in philosophy and cognitive science proposing a theory of the mind-brain which promises to unite, through intertheoretic reduction, cultural, psychological and neuroscientific understandings of teaching and learning. This supports some appropriate medium to long term strategies for educational research, such as interdisciplinary specialization and the co-development of educational theories.

Introduction

Educational theory and research, while having their own characteristic interests and problems, are influenced by developments in other disciplines, especially philosophy, psychology and the social sciences. Over recent decades the form of influence has become increasingly epistemological, as debates about validity and reliability, different research paradigms, the organization of knowledge in the curriculum and the nature of teaching and learning have thrown up different views about what we can know, how we can come to know it and why our claims to knowledge are credible or justified.

Historical studies notwithstanding, educational inquiry tends to focus on the present, because of its applied nature and pressures from current professional practice and public interest. These pressures can make it hard to take a long term view of where theory and research should be headed. Yet choices made now, about current projects, affect our future directions and medium and long term options. More importantly, the conceptualization and organization of educational inquiry as a whole, the relations between its internal specializations, and the relations between them and the social and natural sciences, as well as educational policy and practice, need to be considered from more than a short term perspective.

In this chapter I wish to consider some general philosophical issues whose bearing on educational inquiry (Evers, 1987b; Walker, 1985b; Walker & Evers, 1984), although in some respects long term, is closely related to the choices of the present. The issues themselves are not new. My aim in raising them is to draw attention to a strand of contemporary philosophy which as yet has not been widely considered among educational inquirers, although some of its intellectual antecedents have been considered and, in fact, widely rejected. It is the materialist or physicalist quest for a unified science within which educational inquiry, along with the social sciences generally, would sit coherently with the natural sciences. This is presented as a working hypothesis about the conditions for progress in the growth of knowledge, not for any other, ideological, reasons.

One aspect of this strand of philosophy has had some currency in educational inquiry: the view that all knowledge is of a piece, a seamless web. This "coherentist" epistemology is due largely to the work of Quine. The particular way this coherentism is developed in recent philosophy is even more novel in the educational field. The epistemology is specified through a reductionist view of human nature exemplified in recent restatements of physicalism by philosophers such as Clifford Hooker, Stephen Stich, and Patricia and Paul Churchland.

The physicalism developed in the 1940s and 1950s by the logical empiricists was strongly associated, not just with their aspirations for a unified science, but also with their interest in the history and philosophy of science, especially physics (Nagel, 1961; Oppenheim & Putnam, 1958). Contemporary physicalism, focusing sharply on theories of the mind-brain, is associated with the growth of cognitive science, notably the overlapping work in cognitive neurobiology, cognitive psychology and artificial intelligence.

The unificationist quest runs against several popular beliefs: that there is a fundamental epistemic divide between the social and natural sciences; that on each side of this divide there are different modes of explanation and understanding, including distinct views of causation; that human learning and behavior are not reducible to physical entities or relations; and that a materialist view of human persons and culture is inconsistent with the moral aspect of human life, rooted in freedom of choice and responsibility for actions. The quest for a unified science of education, then, conflicts with deeply held epistemological, ontological and ethical beliefs. The challenge is to demonstrate in principle, and if possible in practice, that the dualisms underpinning these beliefs are neither necessary nor indeed our best options for the future (Walker, 1985a).

The Coherence of Natural Knowledge

Traditionally, epistemologists sought to show our knowledge could be certain, reliable or warrantable by locating it on secure foundations. For classical empiricism the foundations were observations or "sense data", for classical rationalism the intuitions of pure reason. Modern foundationalists offer more sophisticated variations on these themes. If by sound methodology (such as valid inference) we could show our beliefs to be based on these foundations, our claims to knowledge would be justified.

Foundationalism has lost support for many reasons, not least the impossibility of identifying foundations without some prior theoretical framework, itself contentious in that it is not an instance of the foundational category. This is clear, for example, in the now almost uncontroversial doctrine of the theory-ladenness of observation.


The recognition that there is no one simple way of interpreting experimental results undermines foundationalism. Neither confirmation nor disconfirmation will do on its own. Taking up Duhem's (1954) argument that hypotheses cannot be tested one by one in isolation from the whole theoretical networks in which they are embedded nor from the theoretical frameworks of research methodology, Quine (1953, 1960) points out that neither experiment, logic, nor method alone can tell us which hypotheses should be saved, revised or rejected. According to the Quine-Duhem thesis, what is being tested in research is always a global theory or whole theoretical network. We move from foundationalism to holism. We recognize that competition between theories and hypotheses cannot be resolved simply by appeals to authoritative empirical evidence or logic, both of which embody fallible theoretical commitments.

A consequence of foundationalism's failure and the move to epistemological holism is the recognition that there is no "first philosophy", no way prior to knowing itself of demonstrating any warrant for our claims to knowledge, no Archimedean point outside our body of knowledge from which we can judge it. Quine (1960, p. 3) invokes Neurath's metaphor: "Science is like a boat, which we rebuild plank by plank while staying afloat in it. The philosopher and the scientist are in the same boat." Quine's (1969) response to this holistic outcome is the naturalizing of epistemology. We study knowledge itself as a natural phenomenon, and learn from the successes and failures of attempts to acquire it. The epistemic quest is understood as a problem solving process conducted by natural organisms in the natural world. (For the physicalist, this includes the social world.) Epistemology becomes continuous with natural science, viewed as one evolving whole.

What we can know will depend on what kind of creature we are, especially upon our cognitive capacities, and upon the natural contingencies determining what we can learn through addressing the problems in ourselves and our environment. The natural world is a real world; it is not constituted by our knowledge, although we and our knowledge are part of it; our naturalism is epistemologically realist. Inquiry is a search for "facts of the matter" (Quine, 1977), notwithstanding that facts are perceived through theoretical lenses and represented through theoretical frameworks. Holism, naturalism, pragmatism and realism become the crucial features of our account of knowledge.

Coherence is easier to espouse than achieve. Given the theory-ladenness of observation and the absence of foundations, we are left with the prospect of choosing between competing theories with no recourse to an authoritative arbiter to prescribe a choice. Facing this, some theorists have quickly succumbed to a debilitating relativism, in which there is no prospect of rational choice. The naturalism of the scientific realist, however, cautions against such a move. What is needed is an account of theory competition in which the naturalizing of epistemology guides us in selecting our best option.

One epistemological tradition influential among social scientists takes us halfway there. Popper's "evolutionary epistemology" presents theory competition as a process analogous to the selective elimination of biological evolution (Popper, 1972). Theories compete by addressing shared problems, developing as "theory-series" (Lakatos, 1970) by a process of trial and error. Just as biological organisms - including whole species - survive or die according to their inbuilt capacities to solve the problems imposed upon them by their environment, so human theories, including science, are assessable in terms of their problem solving power, and competing theories can be judged as relatively more or less powerful. The most systematic attempt to apply such an evolutionary

[...]

carefully, for example, to the ways in which the brain most effectively conducts its cognitive enterprise, and to the fundamental features of the natural world as understood through the physical sciences. Whereas in practice we may assume such systematic coherence to be established in certain cases (there are no serious doubts about the coherence of chemistry with physics), in other cases, notably relating to theories of the mind and of learning, there is room for reservation: hence the present argument for a long term perspective.

Respect for the intellectual or practical problem solving power of a theory implies a pragmatic coherence theory of evidence, but not necessarily of truth. On this score, while we endorse the earlier pragmatist (e.g., Deweyan) emphasis on problem solving and epistemological holism, we reject its instrumentalist theory of truth. A Quinean pragmatic realism employs the coherence tests of the superempirical virtues to sort out the merits of competing theories, and permits us to use the resources of the resulting preferred theory to spell out its relation to the world, to specify what constitutes, in its case, truth as a correspondence relation between the sentences of the theory and the facts of the matter. A coherence theory of evidence is distinct from and compatible with a correspondence theory of truth (Quine, 1970). Epistemological pragmatism and coherentism are partners with scientific realism (Evers, 1987b).

All this implies some conceptual and methodological common ground between theories competing in any given field of knowledge (e.g., educational psychology), and between (at least prima facie) non-competing theories in different fields (e.g., psychology and neurobiology). Minimally, all theories are amenable to application of the criteria of superempirical virtue; maximally, given the reductionism expounded in the next section, there will be, when coherence obtains, the possibility in principle of intertheoretic reduction in which one theory can be explained within the framework of another. Whereas long term reducibility to physical theory is a desirable outcome in educational inquiry, an important short and medium term goal (on which the long term achievement will depend) is a program of intra- and interfield theoretical co-evolution (P.S. Churchland, 1986, pp. 361-365). This program will involve attempts both at construction of explicit cross conceptualizations and at the devising of research methods and designs which enable comparison and integration of findings. Running through this program is the attempt to discover and construct what, borrowing a term from Lakatos (1970), we have called "touchstone theory" (Walker, 1985c; Walker & Evers, 1982, 1988).

Touchstone is an intertheoretic set of concepts and methods and the empirical evidence they generate and incorporate, geared to processing information and solving problems in the natural world. It embraces logical as well as experimental practices. It provides the resources for theory comparison and competition and for intertheoretically generated theory refinement. It is what produces algorithmic and heuristic coherence or unified problem solving power. As such, unlike the epistemic items posited by foundational epistemologies, it is not fixed; as theories under consideration develop and change, so may touchstone. It is part of our theory of the world, our web of belief, and may be corrected. Touchstone is relative to the overall development of our epistemic whole, and changes as the development of theories generates new overlaps or displaces old ones. Whether explicit or implicit, touchstone varies synchronically (if and when theories are being compared) and diachronically (through evolution of theories). The point to stress is that since criteria for evaluation are themselves embedded within theories, intertheoretic evaluative comparison depends on criteria being shared between theories. As shared theory, then, touchstone gives field specific and interfield general content to the coherentist account of the superempirical virtues.

The notion of touchstone theory needs to be understood in relation to the nature of theory competition. Given that theories are understood as proposed solutions to problems, competing theories are alternative projected solutions to the same problem or problems. For T1 to be in competition with T2, the theories must be addressing at least one common problem. Granted some variability in their respective formulations of the problem(s), if there is genuine competition there will be some semantical and methodological commonality, and therefore implicitly at least commonality of theoretical structure.

The interrelation of touchstone problems and touchstone between competing theories provides the basis for identifying and distinguishing between fields of study, or disciplines (Walker, 1985b; Walker & Evers, 1982). The recognition of closeness of competition (with the limiting case being no competition - i.e., an unchallenged theory or body of theory), naturalistically determined by the closeness of the range of problems addressed, leads to the social-epistemic constitution of a more or less coherent discipline. The looser the constitution, and the weaker the actual competition, the harder it is to draw boundaries, which change as theories and their touchstone develop. Moreover, the unevenness of theoretical development means that changes in one field, or even a cluster of disciplines, may not be absorbed by intellectual relatives for some time, or not at all, the cultural lag inhibiting epistemic progress - a problem for applied fields such as educational research.

Competition, however, is not always patent, nor recognized by proponents of competing theories. A common metatheoretical issue, evident not least in debates over the mind-brain, is whether a competition is under way. Theories previously thought compatible may be discovered to be incompatible. This typically emerges from the recognition that there is indeed touchstone which can be brought to bear on the respective theories; that theories of the mind and theories of the brain, for example, are addressing the same set of problems. The discovery may be prompted by developments within the fields housing the theories, or from discoveries elsewhere in our web of belief. The debate may well proceed in terms of the adequacy of respective problem formulations. A constructive stance towards the epistemic enterprise will not rule out the desirability of discovering new touchstone and, with it, hitherto undetected competition. This applies particularly to fields such as educational inquiry which are under pressure to focus on the short term and are not well geared to keep an eye on intellectual developments across our global body of knowledge.

In their home territories and with an eye on their near neighbors, however, educational researchers, like other social scientists, are increasingly recognizing not just the desirability of rapprochement between adherents of different research traditions (as in the quantitative/qualitative debate) but the methodological need for more coherent and if possible integrated first order methodology within and across traditions. Educational researchers have adopted multimethod approaches, particularly in sociological studies, emphasizing triangulation across data, investigators, theories and methods (Denzin, 1978). Although less widespread, perhaps, there is a similar awareness in psychological research, especially since Campbell and Fiske (1959) (who coined the term "triangulation") first advocated multitrait/multimethod analysis in educational testing and measurement, moved by the epistemological observation that construct validity requires both convergent and discriminant validity. In the broader sweep of things, it can be argued that a qualitatively conceived and researched framework is a fundamental precondition for quantitative research. In the spirit of Quine's comment that science is self-conscious commonsense, Campbell, discussing action research and program evaluation, concludes that "quantitative knowing depends on qualitative knowing in going beyond it." Both the quantitative and the qualitative traditions need "cross validating additions".

More than that, I have sought to remind my quantitative colleagues that in the successful laboratory sciences, quantification both builds upon and is cross-validated by the scientist's pervasive qualitative knowledge. The conditions of mass-produced quantitative social science in program evaluation are such that much of this qualitative base is apt to be lost. If we are to be truly scientific, we must reestablish this qualitative grounding of the quantitative in action research [Campbell, 1988, p. 376].
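The Campbell and Fiske point about convergent and discriminant validity can be made concrete with a small sketch. The example below is mine rather than Campbell and Fiske's: the traits ("engagement", "achievement"), the methods ("survey", "observation"), and the simulated scores are all hypothetical, and the correlations merely illustrate what a fragment of a multitrait-multimethod comparison might look like.

```python
# A minimal, illustrative sketch (not from the chapter) of the Campbell-Fiske
# idea that construct validity requires both convergent and discriminant
# evidence. Traits, methods, and scores are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_students = 200

# Two hypothetical traits, each measured by two hypothetical methods.
engagement = rng.normal(size=n_students)
achievement = rng.normal(size=n_students)

scores = {
    ("engagement", "survey"):       engagement + 0.5 * rng.normal(size=n_students),
    ("engagement", "observation"):  engagement + 0.5 * rng.normal(size=n_students),
    ("achievement", "survey"):      achievement + 0.5 * rng.normal(size=n_students),
    ("achievement", "observation"): achievement + 0.5 * rng.normal(size=n_students),
}

def r(a, b):
    """Pearson correlation between two measured score vectors."""
    return float(np.corrcoef(scores[a], scores[b])[0, 1])

# Convergent evidence: the same trait measured by different methods
# should correlate highly.
convergent = r(("engagement", "survey"), ("engagement", "observation"))

# Discriminant evidence: different traits should correlate weakly,
# even when measured by the same method.
discriminant = r(("engagement", "survey"), ("achievement", "survey"))

print(f"convergent (same trait, different method):   {convergent:.2f}")
print(f"discriminant (different trait, same method): {discriminant:.2f}")
# On the Campbell-Fiske heuristic, the construct is supported only when the
# convergent correlations clearly exceed the discriminant ones.
```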

It is well, too, to remember that educational policy makers and practitioners constantly put together information of various kinds from diverse sources and produced by various methods within a variety of theoretical frameworks. Their focus on practical problems gives them a working context in which to find or construct the touchstone they need. However much they might individually or in groups favor one kind of research over another, practitioners and policy makers do not readily assume incommensurability between approaches. Similar attention by researchers to the problem to be solved, rather than preoccupation with the method or theory to be defended, is necessary to construct touchstone and to achieve the flexibility and open-mindedness required for a unified approach to educational theory and research.

The epistemic and social specialization and attendant fragmentation of educational inquiry - which affects modes of publication, conferences, appointments, funding and the training of researchers as well as the conduct of particular research projects - tends to work against the development of touchstone and a co-evolutionary framework. The need for interdisciplinary work stands out, not just collaboration between workers in different disciplines but, as Campbell has pointed out, for "interdisciplinary specialists" who can work in "interdisciplinary space". Reflecting on his own efforts in this mode, he points out the obvious - but academically scandalous - implication: the interdisciplinary specialist will have to comment on many fields in which they are not competent. Owning up to such incompetence (and the present essay is another example), Campbell dissipates the scandal by recalling Quine's observation that both learning new languages and conceptual innovation are processes of trial and social correction. The brave interdisciplinary specialist needs sympathetic support.

... the process of exploring this interdisciplinary niche requires corrective critical responses from those whose areas overlap ... you have to be willing to keep up the conversation with those who speak your language imperfectly, patiently correcting their misconceptions while still encouraging their efforts. It is at this point where our collective process so often fails, where bold explorers of interdisciplinary space get no response, critical or otherwise, from the disciplines they overlap and end in paranoid isolation if they persist at all. Proud scholars who refuse to talk with those who do not speak their language competently are neglecting their duty to ... collective omniscience [Campbell, 1988, p. 439].

Notwithstanding heartening breakthroughs, epistemic isolationism not only flourishes, but is a social phenomenon which has had some insistent theoretical backing in modern epistemology. Clearly, our touchstone coherentism is incompatible with such antiholist epistemologies as the doctrines of incommensurability between theories (Kuhn, 1970) and between forms of knowledge (Hirst, 1966). Arguments against these positions have been presented elsewhere (against incommensurability of theories or "paradigms": Walker, 1985b; Walker & Evers, 1988; against incommensurability of forms of knowledge: Evers, 1987a; Evers & Walker, 1983). Incommensurabilist doctrines proclaim relativisms which outrun the doctrines' explanatory resources. Moreover, they are self-contradictory. To argue incommensurability presumes that some account (chiefly semantic) of a given theory can be provided showing lack of common ground with another theory. But this account must itself be given in terms of some theory, if not the second theory, then some third theory articulating semantically with the other two. In other words, to argue incommensurability on semantic grounds (Feyerabend, 1975) requires some semantic touchstone. The same goes for logical and methodological arguments. This does not mean, of course, that all research methods can be integrated at the substantive or first order level. It does mean, to the extent that inquiry is progressing, that they can be rationally compared at the superempirical level, and integrated into a broader, coherent framework. The relativity of touchstone, and of alternative theories, is not a radical atomized relativity; it is relativity within a developing epistemic whole which is embedded in the natural world.

Touchstone is relative, but far from thereby innocuous. Running through and unifying all touchstone are the coherence constraints imposed by our naturalism, a naturalism which forces theories into a testing relation with a real world, with the facts of the matter. Therefrom comes its bite. As natural beings in the natural world, with survival needs and evolutionary and cultural constitutions, we are not free arbitrarily to dispose of a given theory we are employing to order our behavior and solve our problems, unless we have an alternative theory or give up on the job. The exigencies of real practice, including epistemic practice (e.g., science or teaching), demand theoretic response, and where there are alternatives, choice. To make the choice, we have no way of deciding (short of random response, which in itself makes certain theoretical assumptions) other than by comparing the alternatives according to criteria with which they are logically compatible or which they share by way of logical commitment. Coherentism links a theory of epistemic progress with our knowledge of the natural, or physical, world. Epistemic progress is a natural process of which, if our naturalistic epistemology is sound, a physical account is possible.

Reductionism

As presented here, physicalism is foremost an epistemological theory flowing from an acceptance of strong coherence requirements, chiefly that psychology, social science and educational inquiry be shown to be at least logically consistent with and preferably reducible to physical theory. Hence theories of mental states, if correct, will be reducible to neuroscience. The judgment in favor of physical theory derives from historical and pragmatic considerations. In short, physical theory is our most powerful body of theory, with by far the most sophisticated account of any relation to the real world so far provided by any body of theory. This general claim has been powerfully argued by P.M. Churchland (1979).

As such, our physicalism is to be distinguished from another, antireductionist, position prevalent in cognitive science - functionalism - which can be interpreted physicalistically. For functionalism, psychological theories of mental states are emergent from but not reducible to neuroscience, since cognition involves semantic and logical representations, whereas the study of neural structures deals only in causal relations. Psychological states are functions of the causal role they play in the brain. Just as the functional states of a computer are realized in electrical circuits, so mental states are realized in the structures of the brain. Their emergent functions, however, cannot be explained neurally (Dennett, 1978; Fodor, 1975; Pylyshyn, 1984). As against this espousal of the autonomy of psychology, the coherentist maintains that reduction to physical theory is possible, and that to adopt it as a working hypothesis is the more constructive alternative.

On the coevolutionary picture of science, psychology and neuroscience should each be vulnerable to disconfirmation and revision at any level by the discoveries of the other. And when inquiries converge on a subject matter, as for example they do on learning, memory, attention and perception, each should be open to the discoveries of the other. The isolation of psychology from the disconfirmatory evidence of neuroscience would be a mistake, because in general it is such susceptibility that keeps a science honest. Short run isolation of a science while it works up a head of steam is one thing, but isolation in the long run, isolation in principle, is quite another. ... The unity of science is advanced as a working hypothesis ... because theoretical coherence is the "principal criterion of belief-worthiness for epistemic units of all sizes from sentences on up" [P.M. Churchland, 1980]. Once a theory is exempt from having to cohere with the rest of science, its confirmation ledger is suspect and its credibility plummets [P.S. Churchland, 1986, p. 376].

Our contention, then, is that the unity of science implies the reducibility of psychology and the social sciences to physical theory. But what is meant by "reduction to physical theory"? As P.S. Churchland comments, the word "reduction" has "a bewildering variety of uses, many of which have connotations of insult and abuse" (P.S. Churchland, 1986, p. 278). Setting these aside, the most important point to make is that reduction is a relation between theories, not between entities. Above all, we should be clear that the physicalist reductionist is not saying that culture is reduced to behavior, nor the mind to the brain, but that theories of culture are reduced to theories of behavior, that theories of the mind are reduced to theories of the brain. In other words, reduction is an epistemological, not an ontological, relation - though it has ontological implications. It concerns relations between different sets of claims about what we know and what we can know. For example, the theory of optics might be claimed to reduce to the theory of electromagnetic radiation. So far as the human mind and human culture are concerned, then, the question is whether some theory of mental states is reducible to a theory of the working of neuronal ensembles, or whether a theory of symbolic relations and representations is reducible to a set of dispositions to behavior.

Intertheoretic reduction leads to two kinds of outcome generally considered advances in science: explanatory unification and ontological simplification. The simultaneous achievement of unity and simplicity, both dimensions of coherence, is a clear advance. Here reduction can shade into another epistemically progressive step: the elimination of one theory by another. Consider the fate of the caloric theory of heat, superseded by the kinetic theory. The latter theory does not identify caloric with molecular kinetic energy; its implication is that there is no such thing as caloric. The same goes for the phlogiston theory of combustion and the demonic possession theory of nervous disorders.

Reduction is the tightest instance of theoretic co-evolution, and often the unexpected outcome of it. The immediate methodological injunction of the reductionist is to set up a co-evolutionary framework. Even when reduction does not eventuate, this is an epistemically progressive step. Speaking generally and in disciplinary terms, it is clear enough that the histories of physics and chemistry, astronomy and dynamics, the theory of infectious disease and microbiology, to name but a few, demonstrate the benefits of co-evolution, as "discoveries at one level often provoke further experiments and further corrections at the other level, which in turn provoke questions, corrections and ideas for new explorations" (P.S. Churchland, 1986, pp. 363-364).

The contemporary physicalist account of intertheoretic reduction differs from its logical empiricist predecessor in denying that the phenomena identified by the reduced theory must be thoroughly correlated with those in the reducing theory. Logical empiricists used various devices, especially bridge principles connecting the two sets of phenomena, to achieve this correlation. Hooker (1981) points out that the history of science provides no clear examples of this happening in practice. Rather, what is reduced is not the old theory, but a corrected version of it, with the limiting case being elimination. Whereas virtually nothing of the caloric theory of heat survived, and a certain amount of classical dynamics, most of the theory of optics lives on. Reductions, in other words, may be more or less "smooth", or more or less "bumpy". This unevenness of reduction is a further demonstration of the need for insisting that it is not phenomena, but theory, which is reduced - a point of significance for the prospect of reducing mental to neural states and processes. It is also a point implied by scientific realism: the world does not change just because our theories do, notwithstanding the effect people's theories can have in their interaction with phenomena of all kinds.

Intertheoretic reduction has an explanatory function. The new, reducing theory, by advancing on the old, will explain the successes and failures of the old, preserving its successes within a richer framework of understanding. This is illustrated vividly by Hooker (1981, p. 49) in his account of thermodynamics and statistical mechanics.

As we have seen, the process of reduction is not necessarily an all-at-once affair. It occurs through the co-evolutionary development of theories, in which extensions and corrections are made to more than one theory as inquiry proceeds. P.S. Churchland points out that this is already happening in the theoretic interanimation of neurobiologists, psychologists and neurologists in research on memory, attention and learning, with the possibility remaining open of behavioral observations from psychology reducing to neurobiological hypotheses, but meantime both contributing to the wider program of understanding how the human information processing, storage and retrieval system works. The field has a strikingly interdisciplinary character. It "has in the last twenty-odd years become a classical exhibit of productive research on a nervous system capacity at many levels at once" (P.S. Churchland, 1986, p. 268). Hence, granted that it would be premature for physicalists to claim imminent reductions across the board, it would be equally presumptuous for a believer in the autonomy of psychology to reject theoretical co-evolution and the possibility of a reductionist outcome.

it would be simply boneheaded for a cognitive psychologist working on learning and memory to refuse to care about animal models, pathway research, clinical cases, imprinting in chicks, and habituation in Aplysia. We simply don't know remotely enough yet to know what is not relevant [P.S. Churchland, 1986, p. 373].

And, of course, science is always unfinished, even in cases of reduction. What is reduced is one theory of phenomena as currently understood to another theory of phenomena as currently understood. Since our epistemology is of a piece with our theory of the natural world, and not a form of first philosophy, it too is a candidate for continuing evolution, in concert with the rest of our web of belief, as well as for intertheoretic reduction.

intertheoretic reduction.It is from this perspective that we should understand the evolution of human

knowledge from commonsense or “folk” theory to science. We have moved from a

folk to a scientific physics, from a folk to a scientific biology. At present, however,

much educational inquiry is conducted within the framework of folk psychology and

sociology, describing and explaining human behavior in terms of beliefs and desires,

hopes, disappointments and fears. Intentions are explained by reference to belief-desire

combinations, and beliefs by reference to perceptions and inferences. Countless everyday

platitudes - generalizations about human nature and behavior - hang together as a

theory of internal states and their interaction with the environment, especially other

people. The crucial question is whether this body of theory is reducible to a scientificaccount. The considerable problems standing in the way of such a reduction - not

least the intentionality of basic concepts such as “belief” and “desire” (P.M. Churchland,

1981) - has led contemporary physicalists to be sceptical about the prospects of such

a reduction. Stich (1986) argues that the concept of “belief” has no place in cognitive

science, P.M. Churchland that the probability of elimination of folk psychology is high,

given the success of an explanatory neuroscience (P.M. Churchland 1988, pp. 46-47)

and P.S. Churchland that since folk psychology is so clearly amenable to scientific

improvement, “what will eventually reduce to neuroscience are generalizations of

scientific psychology that have evolved a long way from the home ‘truths’ of extant

folk psychology” (P.S. Churchland, 1986, p. 312),

Given our present dependence on folk psychology, it is helpful to approach the issue from the other end of the epistemic scale, and ask what prospects there might be for developing, not just an account of psychological theory, but of all theory, in neurological terms. If there is a way forward - and an answer to functionalism - here, we may be able to develop a co-evolutionary theoretical framework within which the desired refinements or eliminations of folk theory emerge. There are signs that this is a promising strategy.

The first point to make is that it is possible to show how theories can be represented naturalistically in neural terms, how a fully naturalized epistemology might be possible. So far, in presenting a coherentist epistemology, we have spoken of our knowledge as "theories", "hypotheses", and "claims", with the implication that these are stated symbolically, for example in sentences. But if in our reductionist account knowledge is in fact brain states, what does it mean to say that sentences exist in the brain, if indeed they do? In short, how can we reduce a representational account of knowledge to a physical account of brain processes, and how can we relate "knowing that", or even "knowing how", to a causal theory of learning? Apparently we need a non-symbolic, non-sentential account of knowledge.

Drawing extensively on cognitive science, particularly cognitive neurobiology, P.M. Churchland argues that such an account is in prospect. (His ideas have been given an initial educational application in Evers [1990a].) Churchland points out that recent work in neurobiology, such as the accounts of cognition in studies of parallel distributed processing, is enabling us to understand knowledge as biological information processing, in particular, pattern recognition. Salient instances are evident in the development of models of neural networks by researchers in artificial intelligence, which have tended to supersede emphases on program writing as ways of understanding human information processing. Such artificial neural networks model salient features of the brain's neuronal organization. Here I shall not duplicate the account of Churchland's "connectionist" theory provided by Lakomski in her contribution to the present volume (p. 537), which should be consulted before reading on. In particular, an understanding of the notion of a prototype vector is presumed in the following exposition.

Evers (1990b) has pointed out that there could be a vast dividend for educational theory if learning and cognition are matters of pattern recognition, pattern association and pattern processing. To date the bulk of our theorizing about learning and cognition has been conditioned by our linguistically (and so sententially) driven theories of language as serial and logico-rational. This leads to a focus on serial algorithms in teaching mathematics and science, for example. Yet we know that there are people who can multiply large numbers correctly in an instant and cannot be using serial algorithms. Consider idiots savants. (See the movie Rain Man.) Since the brain is an enormous pattern associator, it is possible that such people have hooked into its "machine language" for "direct processing". The truly extraordinary possibility is that we might one day be able to devise a pedagogy that allowed people's learning to access more directly the actual pattern processing features of brains, or restructure curricula to reflect key patterns in knowledge so that learning within and between traditional subjects is driven by considerations of pattern association, with logical structure being a vital but derivative feature.
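To make the pattern-association picture more concrete, the following sketch is offered as an illustration only; it is not drawn from Churchland, Evers, or Lakomski. The prototype labels, the vectors, and the cosine-activation rule are hypothetical stand-ins for what a trained connectionist network would acquire through learning, but they show the basic idea: recognition as assimilation of an input to the most strongly activated stored pattern rather than the application of a serial rule.

```python
# A minimal, illustrative sketch of prototype-vector recognition.
# All prototypes and inputs are hypothetical; a real connectionist network
# would learn its prototype vectors from training, not have them hand-coded.
import numpy as np

# Hypothetical prototype vectors a trained network might settle into.
prototypes = {
    "pattern_A": np.array([1.0, 1.0, 0.0, 0.0]),
    "pattern_B": np.array([0.0, 0.0, 1.0, 1.0]),
}

def recognize(stimulus: np.ndarray) -> str:
    """Return the label of the prototype most strongly 'activated' by the input,
    using cosine similarity as a stand-in for network activation."""
    activations = {
        label: float(stimulus @ proto) /
               (np.linalg.norm(stimulus) * np.linalg.norm(proto))
        for label, proto in prototypes.items()
    }
    return max(activations, key=activations.get)

# A noisy, partial input is still assimilated to the nearest prototype -
# the recognizer "fills in" the unperceived aspects of the case.
noisy_input = np.array([0.9, 0.7, 0.1, 0.0])
print(recognize(noisy_input))  # -> "pattern_A"
```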

Or consider the nonsentential learning processes characteristic of much, probably most, cultural transmission, and therefore of informal - and (embeddedly) formal - education. Typically, these are theorized mentalistically by anthropologists and qualitative educational researchers as symbol systems, in the manner of symbolic interactionism, semiotics, and so on. The mentalistic predicates used to describe cultural patterns are even less convincing devices than the sophisticated apparatuses of linguistics for explaining learning.

The connectionist account provides Churchland with a way of naturalizing explanation. First, activated prototype vectors constitute not just the creature's recognition but its understanding of the objective situation, which is reflected in its behavior.

Explanatory understanding consists in the activation of a specific prototype vector in a well trained network. It consists in the apprehension of the problematic case as an instance of a general type, a type for which the creature has a detailed and well informed representation. Such a representation allows the creature to anticipate aspects of the case so far unperceived, and to deploy practical techniques appropriate to the case at hand [P.M. Churchland, 1989, p. 210].

This enables us to account for depth and breadth of understanding by reference to the degree of experience, practice and training, even though different individuals may understand and classify a situation in the same basic way. More importantly, it enables us to provide a unified theory of explanation.

One prominent fact, ill addressed by any existing account of explanation, is the variety of different types of explanation. We have causal explanations, functional explanations, moral explanations, derivational explanations, and so forth. Despite some procrustean analytical attempts, no one of these seems to be the basic type to which all of the others can be assimilated. On the prototype activation model, however, we can unify them all in the following way. Explanatory understanding


is the same thing in all these cases: what differs is the character of the prototype that is activated

[P.M. Churchland, 1989, p. 212].
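A toy rendering may help fix the idea of prototype activation in the passage just quoted; it is my illustration, not Churchland's, and the features and categories are invented. The point being modeled is that activating the best-matching prototype lets the system fill in aspects of the case that have not yet been perceived.

    import numpy as np

    # Illustrative sketch: prototype vectors over four invented features
    # [barks, purrs, has_fur, lays_eggs]; each vector summarizes a learned general type.
    prototypes = {
        "dog":  np.array([0.9, 0.0, 1.0, 0.0]),
        "cat":  np.array([0.0, 0.9, 1.0, 0.0]),
        "bird": np.array([0.0, 0.0, 0.0, 0.9]),
    }

    def activate(observed):
        """Apprehend a partial case (feature index -> value) as of a general type:
        return the nearest prototype, which supplies the unobserved features."""
        def distance(p):
            return sum((p[i] - v) ** 2 for i, v in observed.items())
        label = min(prototypes, key=lambda k: distance(prototypes[k]))
        return label, prototypes[label]

    # Only "purrs" is observed; the activated prototype anticipates "has_fur" as well.
    print(activate({1: 1.0}))   # -> ('cat', array([0. , 0.9, 1. , 0. ]))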

Churchland's account is not just descriptive but normative as well. It addresses coherence criteria, in particular simplicity - in supplying a reductive account of explanation - and explanatory power - in explaining the physical basis for all types of explanation. Thus

contemporary physicalism is not open to the charge leveled against the logical empiricists,

that they were hamstrung by a commitment to reducing all forms of explanation to one:

the deductive nomological model (Margolis, 1978; for a rebuttal see P.M. Churchland,

1980). Churchland’s reduction of all forms of explanation enables us to take a more

relaxed approach than the logical empiricists: we take whatever generalizations we

can get, and test them out against our coherentist criteria. Debates among educational

inquirers about the compatibility or incompatibility of logico-rational explanation and

interpretive understanding will need to be seen in a new light, a light which should encourage efforts to provide a unified account not only at the microphysical but at the

sentential-theoretical level. It is unlikely that if forms of explanation can be harmonized

at the former level they are incompatible at the latter. Good heart is given to those, like

Campbell, who assert their intellectual interdependence.

Reasons for scepticism about reduction abound. We have so far concentrated on

abstract epistemological discussion and on the individual mind-brain. Granted, however,

that education is a social and cultural as well as an individual process, it is as well to

consider the bearing of our coherentist and reductionist naturalism on our understanding

of culture and society. P.M. Churchland has considered two particular sceptical arguments

advanced from a cultural perspective. They emphasize the importance and complexity ofcultural contexts in our understanding of human nature, and the variability and plasticity

of human consciousness and behavior. First, it could be argued that in focusing on brain

structure and processes, our naturalism is limited to the microscopic level, whereas

much of what constitutes consciousness and behavior derives from the relations between

individual humans, and from cultural practices and institutions. Hence a reductionist

naturalism is explanatorily limited and deficient. Second, human beings are as varied

as the range of human cultures and the scope for human plasticity that cultures permit.

Part of this plasticity is to be explained by the reality of human self-determination,

creative activity and reflexive action. Moreover, cultures evolve, generating conditions

for further human variation.

In contending that his naturalist connectionism - the network theory - can explain

these facts, P.M. Churchland shows that each argument rests on a misunderstanding

of the naturalist position. Human plasticity and the determination of consciousness

by the cultural surround are essential components of naturalism (P.M. Churchland,

1979). Furthermore, he points out that the network theory, interestingly, shares some

basic tenets with prominent anti-naturalist philosophy. There is touchstone with the

continental tradition’s insistence on the non-propositional or non-sentential character

of most human knowledge and the emphasis on human agency in continental and recent

analytic philosophy - notions shared widely in the social sciences and in educational

theory and research.

With the network theory of the brain, naturalistic reductionists can explain the

plasticity of human nature by delineating the sustaining underlying mechanisms, the

dimensions on which change is possible and the forces driving changes in cognitive


configuration. Variation is possible within the matrix, whose weights "determine what features in the world one responds to, which values one embraces, and which range of behaviors one commands" (P.M. Churchland, 1989, p. 131). The sheer number of neurons and synaptic connections in the brain yields a possible number of cognitive configurations well in excess of the total number of elementary particles in the universe. This is more than enough to explain human plasticity.
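The arithmetic behind this claim is easy to check with round, assumed figures; the neuron and synapse counts below are conventional order-of-magnitude estimates, not numbers taken from the text.

    import math

    # Assumed round numbers: ~1e11 neurons with ~1e3 synapses each gives ~1e14 weights.
    # Even if each weight could take only 10 distinguishable values, the number of
    # distinct configurations is 10 ** 1e14, i.e. 1 followed by a hundred trillion zeros.
    synapses          = 1e14
    values_per_weight = 10

    log10_configurations = synapses * math.log10(values_per_weight)   # = 1e14

    print(f"log10(possible configurations) ~ {log10_configurations:.0f}")
    print("log10(elementary particles in the observable universe) ~ 80")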

The cultural embedding argument emphasizes that explanations for human behavior

must include cultural features. We respond not just to light, sound and heat, but to

meaningful language, culturally significant facial expressions, moral judgments and

obligations, social customs, and so on - features of great subtlety and complexity. In

response, Churchland reiterates the powerful capacity of trained networks to recognize

and represent abstract and subtle features. What is important is the training, that there

is an appropriate “teacher” to shape representations of and responses to the cultural

environment. Examples are to hand of how networks can be trained to recognize visual,

logical and linguistic patterns.

It should therefore come as no surprise that a human infant comes to recognize and respond to

cultural features that resist definition in terms of notions like mass, charge, length and so forth,

because the most dominant “teacher” in the local environment is the culture into which the infant

is born. The set of weights that constitutes a child’s developing consciousness is continually being

shaped by the linguistic, conceptual and social surround. The developing brain comes to reflect the

elements and structure of that surround in great detail, for that is what networks do. What shapes

them is the stimuli they typically receive, and the subsequent corrections in their responses to which

they are typically subject. Small wonder that we become attuned to the categories of the culture that

raises us [P.M. Churchland, 1989, p. 133].

It would be to miss the point to say that this concedes that the explanation of human

behavior must rely more on features of the cultural than of the microphysical level. To

pit the cultural against the physical is to substitute a false dichotomy for the relevant

distinction, which is between simple, context free features and complex, highly context

dependent features. The point is that it has already been demonstrated how a system

of physical elements can represent and respond to complex contexts, including cultural

contexts, and learn through so doing, its cognitive configurations developing in complex

and subtle ways through adjustment of weights. Naturalistic reduction explains how this

can be so.
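The "subsequent corrections" mentioned in the quoted passage can be sketched as simple error-driven weight adjustment. The fragment below is a minimal illustration under assumed values (a delta rule with invented stimulus and response dimensions), not a model of any result reported here: the environment acts as teacher, and each discrepancy between the network's response and the expected response nudges the weights toward the surrounding norm.

    import numpy as np

    # Minimal sketch of shaping-by-correction (delta rule); all values are invented.
    rng  = np.random.default_rng(0)
    W    = rng.normal(scale=0.1, size=(2, 3))    # 3 stimulus features -> 2 responses
    rate = 0.1

    for _ in range(200):                          # repeated exposure and correction
        x        = rng.normal(size=3)             # a stimulus from the "surround"
        expected = np.array([x[0] + x[1], x[2]])  # the culturally expected response
        error    = expected - W @ x               # the teacher's correction signal
        W       += rate * np.outer(error, x)      # adjust weights toward the norm

    print(np.round(W, 2))   # settles near [[1, 1, 0], [0, 0, 1]], the expected mapping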

Conclusion

If the drift of this essay is sound, educational researchers, especially as a community

with traditions, practices and organizations, should give some priority to

two sets of epistemic-social relations. First, there should be a sustained emphasis on

strengthening relations between the various specializations and schools of thought in

education itself. Second, there is the exciting prospect of strengthened links with

basic research in cognitive science, and participation in the lively discussion of the

philosophical and methodological issues which are emerging. As basic research throws

up more discoveries with potential practical applications, educational inquirers have

much to contribute to our global knowledge of learning.

Our first task is to fasten onto existing practices of co-evolutionary research,


and to foster, through institutionalized practices as well as cultural change, a co-

evolutionary framework which will be recognized and respected by educational inquirers

generally. One part of this task could well involve taking up Campbell’s notion of the

interdisciplinary specialist, providing for the production of such people in our research

institutions and organizations and for the presentation of their contributions in our

conferences and journals. Closer contact with interdisciplinary specialists in basic fields

such as cognitive science would facilitate this.

To underpin these efforts intellectually we need a reflexive naturalistic epistemology,

with the unity of science as a constructive working hypothesis, uniting work on several

fronts simultaneously, contextualizing educational research within the total scientific

enterprise, and fostering a rehabilitated regard for science itself.

References

Campbell, D. T. (1988). Methodology and epistemology for social science. Chicago: University of Chicago Press.
Campbell, D. T. & Fiske, D. W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105.
Churchland, P. M. (1979). Scientific realism and the plasticity of mind. Cambridge: Cambridge University Press.
Churchland, P. M. (1980). Joseph Margolis: Persons and minds: The prospects of a non-reductive materialism. Dialogue, 19, 461-479.
Churchland, P. M. (1981). Eliminative materialism and the propositional attitudes. Journal of Philosophy, 78(2).
Churchland, P. M. (1985). The ontological status of observables: In praise of the superempirical virtues. In P. M. Churchland & C. A. Hooker (Eds.), Images of science. Chicago: University of Chicago Press.
Churchland, P. M. (1988). Matter and consciousness: A contemporary introduction to the philosophy of mind. Cambridge, MA: MIT Press.
Churchland, P. M. (1989). A neurocomputational perspective: The nature of mind and the structure of science. Cambridge, MA: MIT Press.
Churchland, P. S. (1982). Mind-brain reduction: New light from philosophy of science. Neuroscience, 7(5), 1041-1047.
Churchland, P. S. (1986). Neurophilosophy: Toward a unified science of the mind-brain. Cambridge, MA: MIT Press.
Dennett, D. C. (1978). Brainstorms: Philosophical essays on mind and psychology. Montgomery, VT: Bradford Books.
Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods (Second edition). New York: McGraw-Hill.
Duhem, P. (1954). The aim and structure of physical theory (Translated from the French second edition (1914) by P. P. Wiener; first edition 1906). Princeton: Princeton University Press.
Evers, C. W. (1987a). Epistemology and the structure of educational theory: Reflections on the O'Connor-Hirst debate. Journal of Philosophy of Education, 21(1), 3-13.
Evers, C. W. (1987b). Naturalism and philosophy of education. Educational Philosophy and Theory, 19(2), 11-21.
Evers, C. W. (1990a). Educating the brain. Educational Philosophy and Theory, 22(2), 65-80.
Evers, C. W. (1990b). Personal communication.
Evers, C. W. & Walker, J. C. (1983). Knowledge, partitioned sets and extensionality. Journal of Philosophy of Education, 17(2), 155-170.
Feyerabend, P. K. (1975). Against method: Outline of an anarchistic theory of knowledge. London: NLB.
Fodor, J. A. (1975). The language of thought. New York: Crowell.
Hirst, P. H. (1966). Educational theory. In J. W. Tibble (Ed.), The study of education. London: Routledge and Kegan Paul.
Hooker, C. A. (1981). Towards a general theory of reduction. 1. Historical and scientific setting; 2. Identity in reduction; 3. Cross categorial reduction. Dialogue, 20, 38-59; 201-236; 496-529.
Kuhn, T. S. (1970). The structure of scientific revolutions (Second edition). Chicago: University of Chicago Press.


Lakatos, I. (1970). Falsification and the methodology of scientific research programmes. In I. Lakatos & A. Musgrave (Eds.), Criticism and the growth of knowledge. Cambridge: Cambridge University Press.
Margolis, J. (1978). Persons and minds: The prospects of a non-reductive materialism (Synthese Library, Vol. 125; Boston Studies in the Philosophy of Science, Vol. 57). Dordrecht: Reidel.
Nagel, E. (1961). The structure of science. New York: Harcourt, Brace & World.
Oppenheim, P. & Putnam, H. (1958). Unity of science as a working hypothesis. In H. Feigl (Ed.), Minnesota studies in the philosophy of science, 2, 3-36.
Popper, K. R. (1972). Objective knowledge: An evolutionary approach. Oxford: Oxford University Press.
Popper, K. R. (1977). Part 1 of K. R. Popper & J. C. Eccles, The self and its brain. Berlin: Springer-International.
Pylyshyn, Z. (1984). Computation and cognition. Cambridge, MA: MIT Press.
Quine, W. V. (1953). Two dogmas of empiricism. In W. V. Quine (1961), From a logical point of view. Cambridge, MA: Harvard University Press.
Quine, W. V. (1960). Word and object. Cambridge, MA: MIT Press.
Quine, W. V. (1969). Epistemology naturalized. In W. V. Quine, Ontological relativity and other essays. New York: Columbia University Press.
Quine, W. V. (1970). Philosophy of logic. Englewood Cliffs, NJ: Prentice-Hall.
Quine, W. V. (1977). Facts of the matter. In R. W. Shahan & K. R. Merrill (Eds.), American philosophy. Norman: University of Oklahoma Press.
Quine, W. V. & Ullian, J. S. (1978). The web of belief (Revised edition). New York: Random House.
Stich, S. P. (1986). From folk psychology to cognitive science: The case against belief. Cambridge, MA: MIT Press.
Walker, J. C. (1985a). Materialist pragmatism and sociology of education. British Journal of Sociology of Education, 6(1), 55-74.
Walker, J. C. (1985b). Philosophy and the study of education: A critique of the commonsense consensus. The Australian Journal of Education, 29(2), 101-114.
Walker, J. C. (1985c). The philosopher's touchstone: Towards pragmatic unity in educational studies. Journal of Philosophy of Education, 19(2), 181-198.
Walker, J. C. & Evers, C. W. (1982). Epistemology and justifying the curriculum of educational studies. British Journal of Educational Studies, 30(2), 213-229.
Walker, J. C. & Evers, C. W. (1984). Towards a materialist pragmatist philosophy of education. Education Research and Perspectives, 11(1), 23-33.
Walker, J. C. & Evers, C. W. (1988). The epistemological unity of educational research. In J. P. Keeves (Ed.), Educational research methodology, measurement and evaluation: An international handbook. Oxford: Pergamon Press.

Biography

Professor James Walker is Dean of the Faculty of Education in the University

of Canberra, Australia. He has written numerous articles in the philosophy of

education, and has published in curriculum theory, educational policy and educational administration, and research methodology. His work in educational ethnography, a five-year study of transition from high school, is published in Louts and Legends and related

articles. His current research interests include democratic philosophy of education, and

the relation between knowledge and practice in professional education.


CHAPTER 2

TOWARDS A COHERENTIST THEORY OF VALIDITY

COLIN W. EVERS

School of Graduate Studies, Faculty of Education, Monash University, Clayton, Victoria,

3168, Australia

Abstract

Theorizing about validity has for a long time been dominated by epistemological assumptions

associated with logical empiricism. In the 1960s these assumptions were systematically

challenged by a paradigms construal of scientific knowledge. More recent educational

research methodology has been much influenced by this paradigms perspective and its associated epistemological relativism and pluralism, and there is now considerable debate

about how to understand the familiar terms of educational research appraisal within such

an epistemological setting. This chapter offers a coherentist epistemological perspective on

some features of this debate concerned with understanding validity.

Although subject to differing demands of practice, theory of research and theory

of validity have been linked by shared epistemological assumptions, especially those

assumptions that derive from the period of dominance of philosophy of science in

epistemology. In terms of its impact on educational studies, we may take this period as

beginning with logical empiricism in the late 1940s, through the paradigms era arising out

of the work of Kuhn and still dominating educational studies today, to recent attempts to

apply coherence theories of knowledge and justification to educational theory building

and adjudication.

In what follows, I shall trace some of these epistemological influences on ways of

understanding the validity of tests, experiments, and inquiry procedures. Although I

think the arguments mounted by Kuhn (1962), Hanson (1958), Feyerabend (1962), and

others against logical empiricism are decisive, I regard the resulting paradigms construal

of research and methodology in education as mistaken. These arguments are more a

reductio ad absurdum of narrow empiricism than a prospective methodology in their own

right. Instead, I shall defend a coherence theory of justification and elaborate some of its

consequences for understanding research and validity. Finally, if both logical empiricism

and paradigms theory are false, then any successful applications of methodologies based

on those epistemologies must be drawing on coherence justification somewhere. I argue

that this turns out to be the case, implicitly in Cronbach’s work on validity, early and

late, and explicitly in Campbell’s more recent work on theory of research.



Validity and Logical Empiricism

In 1954, a joint committee of the American Psychological Association, American Educational Research Association, and the National Council on Measurements Used in Education, produced a set of Technical Recommendations for Psychological Tests and

Diagnostic Techniques. The “essential principle” behind the document is that

‘a test manual should carry information sufficient to enable any qualified user to make sound

judgments regarding the usefulness and interpretation of the test’ (Technical Recommendations,

1954, p. 202).

Since the usefulness of a test and its manual is partly a function of the degree to which

a test achieves its aims, the question of test validity is of prime importance. Roughly

speaking, validity in this context is a matter of the extent to which a test (or instrument,

or procedure) measures what it purports to measure. More generally, it is concerned with the soundness of the inferences that can be made from test scores, or results.

The Technical Recommendations identifies four aims of tests and therefore four clusters

of possible inferences, or types of validity: content validity, which aims to measure

present performance by sampling an identified universe of performance; predictive

validity, concerned with future performance; concurrent validity, like predictive validity

but matched in the present rather than the future against some outside criterion; and

construct validity, where the trait or quality being measured is itself defined in terms

of the test. In the first revision of these recommendations, the 1966 Standards for

Educational and Psychological Tests and Manuals, predictive validity and concurrent

validity were collapsed into what was called “criterion-related validity”, thus yielding

three types of validity (Standards, 1966, pp. 12-13).
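In the criterion-related cases the logic is simple enough to show with invented numbers: the validity coefficient is just the correlation between test scores and the outside criterion, gathered later (predictive) or at the same time (concurrent). The scores below are made up purely for illustration.

    import numpy as np

    # Invented scores for ten test-takers and an external criterion (e.g. later performance).
    test_scores = np.array([12, 15,  9, 20, 17, 11, 14, 18, 10, 16])
    criterion   = np.array([50, 61, 43, 75, 70, 48, 58, 72, 45, 66])

    # The criterion-related validity coefficient is the Pearson correlation between them.
    validity_coefficient = np.corrcoef(test_scores, criterion)[0, 1]
    print(round(validity_coefficient, 2))   # close to 1 for these invented data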

Focusing for a moment on tests, the interesting epistemological question is how we

can ever know whether our inferences from scores are sound. Strictly speaking, for a

score to count as a score, it must exist under some description. But descriptions are

comprised of words (or other symbolic tokens) which in turn must be meaningful in

order to sustain inferences. Ordinarily, this does not pose problems since most of the

words we use in everyday discourse are defined contextually, in terms of other words.

However, logical empiricism places severe restrictions on the adequacy of definitions.

Contextual definition is certainly part of the story, but eventually for meanings to be

known there must be some correspondence between some words and empirical evidence,

or observations (Feigl, 1950). Ostensive definition will do, but it seems to work best for

words with the most modest inferential connections. On the other hand, words that are

the richest in inferential structure, that are embedded in the most central parts of a

theory or theoretical context, seem to be least obviously connected to experience.

Empiricism’s compromise between inferential richness, or theoreticity, and empirical

adequacy is operational definition. Thus the theoretical term “length” would be

defined in terms of the sequence of observable operations used to carry out a certain

measurement procedure. Of course, if descriptions of operations are also theoretical

then we need to repeat the process until we reach observations sufficient for empirical

meaning. This is the problem with tests, which can actually be regarded as operational

definitions of scores. In the case of concurrent and predictive validity the empirical

content of the theoretical terms describing the scores is given by stipulating some

antecedently meaningful criterion. In the case of content validity, we are presumably

Page 23: LAKOMSKI, Gabriele (1991) Beyond Paradigms; Coherentism and Holism in Research

7/27/2019 LAKOMSKI, Gabriele (1991) Beyond Paradigms; Coherentism and Holism in Research

http://slidepdf.com/reader/full/lakomski-gabriele-1991-beyond-paradigms-coherentism-and-holism-in-research 23/97

Beyond Paradigms 523

dealing with just some subset of an antecedently meaningful universe of examples. (See

Kerlinger, 1964, pp. 444-449).

Even with these apparently simple cases, we now know that there are epistemological

difficulties. Take again the example of defining "length". Presumably the operation takes place at some particular time and place using some particular set of singularly specified apparatus, including a rule_1. Must every act of measurement use this particular rule_1 on penalty of yielding a different definition? If the answer is yes, then we are looking at different, non-equivalent, definitions of length for every rule. If the answer is no, then we must have some way of specifying an equivalence class on rules that preserves sameness of operational definition. Something sufficient to permit us to say that rule_1 = rule_2 = ... = rule_n would do nicely. However, this amounts to the task of giving an

operational definition of “same length”. Such a task cannot be done for an indefinite n

unless we make use of some notion of “standard rule”. Standard rules do exist, of course,

but the considerations that go into their selection, namely those that will give generality

over time, place, and circumstance to the measurement of length, have long since outrun

the meagre resources of operational definition (Hempel, 1966, pp. 93-94).
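The difficulty can be put in a toy form; everything below is invented for illustration and is not a procedure discussed in the text. Each rule yields its own operationally defined quantity, and nothing inside either definition licenses treating the two as measurements of one and the same magnitude.

    # Toy rendering of the point about operational definition; all names and numbers invented.
    def length_by_rule_1(obj):
        return obj["ticks_on_rule_1"] * 1.000     # metres per tick on rule 1

    def length_by_rule_2(obj):
        return obj["ticks_on_rule_2"] * 0.998     # a slightly different procedure

    rod = {"ticks_on_rule_1": 2.0, "ticks_on_rule_2": 2.004}

    # Declaring the two procedures equivalent is an extra, theory-laden hypothesis about
    # which differences between rules are irrelevant; neither definition supplies it.
    print(length_by_rule_1(rod), length_by_rule_2(rod))   # 2.0 and approximately 2.0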

The problem here is quite general, and has been noted by both Popper (1963,

pp. 44-45) and Quine (1957, p. 231). Namely, there is no such thing as a class of

similar objects. (For a proof of this theorem, see Watanabe, 1969, pp. 376-379.) As

Popper insists, similarity is always similarity-for-us. Since some similarity groupings

are essential for theorizing, given the weak naturalistic constraint that we have finite

learning capacities, operational definitions will reflect a prior theoretical decision to

group operations according to some weighting of features or saliences. In another

context, Popper calls these weightings “hypotheses”, and we can follow this usage

here. However, what this argument implies is a form of semantic holism. Observations

do not correspond one to one with theoretical terms to be defined, but instead distribute

their empirical content across the entire network of prior hypotheses and their inferential

contexts. Quine reaches this conclusion in his classic paper “Two Dogmas of Empiricism”

(1951). The upshot is that the epistemological demand for knowledge of the empirical

meaning of a term always outruns the resources posited for operational definition,

however simple the term. An absence of disagreement over the representativeness of

samples for content validity, or criteria with which predictive or concurrently tested

scores may be correlated is not a waiving of theoreticity so much as an indication of shared, or touchstone, theory (Walker & Evers, 1988).

For construct validity, the Technical Recommendations document (1954) is less

sanguine; theory is acknowledged to intrude from the beginning:

To examine construct validity requires both logical and empirical attack. Essentially, in studies of construct validity we are validating the theory underlying the test. The validation procedure involves two steps. First, the investigator inquires: From this theory, what predictions would we make regarding the variation of scores from person to person or occasion to occasion? Second, he gathers data to confirm these predictions [p. 214].

The big advantage of construct validity, if it can be made to work, is that it promises

a way out of what Hempel (1965) calls the theoretician's dilemma. Essentially, this is another artifact of maintaining a sharp distinction between theory and observation. As

we noted earlier, where empiricist demands of definition can be met, theoretical terms

are invariably uninteresting. Where they enjoy extensive intertheoretic connections and


enter into a wide range of deductive relations, they are hard to define. If we could

validate a whole theory, that is, show that the theory describes what it purports to

describe, we can have both empirical adequacy and the kind of inferential richness

needed to develop fine-grained concepts suitable for the social sciences. An added bonus would be that troubles with the first three (or perhaps two) types of validity

which, because of the intrusion of theory, become species of construct validity, might

admit of resolution.

Can whole theories be usefully validated? In their important paper on construct

validity, Cronbach and Meehl (1955) take up the challenge. They begin the task

of specifying the logic of construct validation by setting out their philosophy: “The

philosophy of science which we believe does most justice to actual scientific practice

will now be briefly and dogmatically set forth” (p. 78). Not surprisingly, they offer a

version of logical empiricism. They define a nomological network as a theory comprised

of an interlocking network of laws. A network relates observables to each other, to

theoretical constructs, and constructs to each other. To count as science a construct

must figure in a network some of whose laws involve observables, and so on (Cronbach

& Meehl, 1955, pp. 78-79). Validating a theory boils down to demonstrating that the

network is warranted by empirical evidence. To counter the . . .

"toughminded", who fear that allowing construct validation opens the door to unconfirmable test claims, the answer is that unless the network makes contact with observations, and exhibits explicit, public steps of inference, construct validation cannot be claimed [Cronbach & Meehl, 1955, p. 79].

As one might expect, worries over the relationship between evidence and meaning,

with an attendant shift to holism, apply equally to the justification of theories. The

business of exhibiting explicit, public steps of inference that were also epistemologically

compelling, came under great pressure in the 1960s and eventually led to the demise of

logical empiricism.

Epistemology and Logical Empiricism

For assessing its merits as an epistemology it is useful to see logical empiricism as an

example of foundationalism. Generally speaking, foundational justification proceeds first by identifying an epistemically privileged subset of knowledge claims and then by arguing

that this subset somehow warrants all other justified knowledge claims. An early version

of foundationalism, which I would call “strict foundationalism”, was championed by the

empiricist philosopher David Hume in the eighteenth century, wherein knowledge was

reckoned as justified only if it was deducible from the privileged subset of sensory

experiences. With the arrow of deducibility going from a finite number of singular

sensory impressions Hume had no trouble showing that no general or lawlike empirical

claims are ever justified. To go from a finite set to an infinite set some principle of

induction is required. But the principle of induction must itself be warranted. We may

indeed have foundational evidence for such a principle - perhaps in the past it has always held - but we need the same principle to deduce that it will hold for future,

or unobserved, cases beyond the finite range of foundations. And such an argument is

circular.

This difficulty is known as the problem of induction and it renders problematical


all attempts to make unrestricted empirical generalizations from a finite observation

base. In Campbell and Stanley’s (1963) discussion of factors jeopardizing the validity

of experimental and quasi-experimental designs for research, they note its effect on

external validity by entering a caveat:

This caveat introduces some painful problems in the science of induction. The problems are painful

because of a recurrent reluctance to accept Hume’s truism that induction or generalization is never

fully justified logically. Whereas the problems of internal validity are solvable within the limits of the

logic of probability statistics, the problems of external validity are not logically solvable in any neat,

conclusive way. Generalization always turns out to involve extrapolation into a realm not represented

in one’s sample. Such extrapolation is made by assuming one knows the relevant laws [p. 171.

Three initial comments need to be made here. First, the external validity of

experiments is being contrasted with their internal validity. Basically, internal validity

is concerned with whether an experiment is significant in the production of some anticipated outcome, whereas external validity is concerned with the generalizability

of an experimentally produced effect. Second, from an epistemological point of view,

the justification of external validity for experiments is the same as the justification of

construct validity for tests. As Cronbach and Meehl (1955, p. 89) note, "the investigation

of a test’s construct validity is not essentially different from the general scientific

procedures for developing and confirming theories.” And finally, one might expect

the ubiquity of theory to blur the distinction between internal and external validity.

Logical empiricists, and Vienna Circle positivists before them, had a partial response

to Hume’s argument; namely, to alter the direction of deduction between knowledge

and its foundations. It is knowledge claims, grouped systematically into theories, or

networks, that imply privileged foundations, in this case observation reports, rather

than vice versa. For this sort of broad foundationalism, the relation of justification

between observation and theory is testability, where testability is thought to involve

two components. Observation reports that match those which may be deduced from

a theory are said to confirm the theory, and observation reports which fail to match

expectations falsify, or disconfirm, it. A theory may thus be regarded as validated to

the extent that it has been subject to many tests which have confirmed, but in no way

disconfirmed, it.

Of course, in practice the testing of theories is more complex, but it is the complexity

of practice that ultimately tells against logical empiricism. Consider confirmation and the

problem of induction. As there is only ever a finite number of confirming observations,

theories will always be radically underdetermined by empirical evidence. We can fit an

arbitrary number of curves to a finite set of data points. Under these conditions the

notion of inductive support fails to have purchase as it is not clear which empirically

adequate but distinct theory is being supported. For example, Newtonian mechanics

enjoyed several hundred years of accumulated confirmations, yet it ultimately failed

to be validated not just on fine matters of detail but right through to its most central

theoretical categories. For all the evidence that confirmed it also confirmed relativity

theory. Cronbach and Meehl (1955, p. 87) are aware of the problem but end up running

together both cumulative inferential support and radical falsification: “Confidence in a

theory is increased as more relevant evidence confirms it, but it is always possible that

tomorrow’s investigation will render the theory obsolete.” There is also a puzzle over

what counts as relevant confirming evidence. In rigorous formulations of testability,


the deductive relations between theory and observation are defined in terms of the

truth functional material conditional. Since it is the truth values of sentences that are

semantically important, a non-black non-raven, for example a leaf, logically can confirm

the hypothesis that all ravens are black (Hempel, 1965).
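The underdetermination point made above - that an arbitrary number of curves can be fitted to a finite set of data points - can be seen directly in a few lines. The data and the rival "theories" below are invented for illustration only.

    import numpy as np

    # Five invented observations, and two theories that agree on every one of them.
    xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    ys = np.array([1.2, 2.1, 2.9, 4.2, 4.8])

    p = np.polyfit(xs, ys, deg=4)                  # one curve through all five points

    def theory_1(x):
        return np.polyval(p, x)

    def theory_2(x):
        # Add a term that vanishes at every observed point, so the rival is
        # empirically indistinguishable on the data gathered so far.
        return theory_1(x) + 0.5 * np.prod([x - xi for xi in xs], axis=0)

    print(np.round(theory_1(xs), 3))               # reproduces the observations
    print(np.round(theory_2(xs), 3))               # reproduces them equally well
    print(theory_1(5.0), theory_2(5.0))            # yet the theories disagree at x = 5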

Campbell, who had read Popper (1959) and Hanson (1958), was aware of these

problems and hedged against them in Experimental and Quasi-Experimental Designs for Research (see also Campbell, 1984). There, the important evidential relation was

falsification, not confirmation: “The task of theory-testing data collection is therefore

predominantly one of rejecting inadequate hypotheses” (Campbell & Stanley, 1963, p.

35). They claim that technically speaking, hypotheses are never confirmed; rather they

are “probed” by the results of experiments. On this account, the chief strategy behind

successful experimental design is to limit the number of plausible rival hypotheses about

the role of an experiment in producing some particular result. So, randomization was

thought to render implausible some eight alternative hypotheses that threatened the

internal validity of experiments. Let us suppose, for the moment, that this is so, although

we can note that in a later work Cook and Campbell (1979) demonstrate a complexity

even with internal validity by producing some threats not amenable to randomization.

The complexity of the social world, together with a paucity of true generalizations,

would make a similar methodology for external validity very difficult. This is because

whole theories, or networks, of hypotheses imply observations, and falsification

distributes its bad news only disjunctively across a network. As Quine (1951, p.

43) has claimed, and as Campbell (1986a, p. 508) has later acknowledged, we can

hold true any claim, come what may, if we are prepared to make drastic enough

revisions elsewhere in the network. The Duhem-Quine thesis, as this result is often

called, renders exceedingly problematical any purported evidential relationship between

a hypothesis and falsifying observations.
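Written schematically (this is a standard textbook rendering of the thesis, not a formula taken from the text), the point is that a test prediction follows only from a hypothesis together with a network of auxiliary assumptions, so a failed prediction refutes only the conjunction:

\[
(H \land A_1 \land A_2 \land \cdots \land A_n) \rightarrow O, \qquad
\lnot O \;\therefore\; \lnot H \lor \lnot A_1 \lor \cdots \lor \lnot A_n .
\]

The negative evidence is thus distributed disjunctively: consistency can always be restored by giving up some auxiliary assumption rather than the hypothesis itself.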

One response is to note that the thesis fails to distinguish plausible from implausible

hypotheses. If falsification is avoided only by invoking implausible rival hypotheses then

it is as good as falsification outright. However, note also that plausibility is not an intrinsic

property somehow embedded in some hypotheses rather than others. Plausibility is

an epistemic notion and is therefore imputed relative to the prior assumption of

some theory. In short, however bad the observational news may be for Newtonian mechanics, from that perspective, time dilation, mass increases due to velocity, and curved space-time are just implausible.

Cronbach and Meehl (1955) do not use explicit plausibility judgments as a device

to limit the range of construct validity threatening alternatives arising from negative

evidence. Their advice is more diffuse:

The choice among alternatives, like any strategic decision, is a gamble as to which course of action

is the best investment of effort. Is it wise to modify the theory? That depends on how well the

system is confirmed by prior data, and how well the modifications fit available observations. Is it

worthwhile to modify the test in the hope that it will fit the construct? That depends on how much

evidence there is to support the hope, and also on how much it is worth to the investigator’s

ego to salvage the test. The choice among alternatives is a matter of research planning [p. 84].

When thinking about gambling, investigators’ egos, and the nature of research

planning, remember that their paper was supposed to yield a research plan for making

explicit, public steps of inference necessary for claiming the validity of constructs.


So far we have canvassed two familiar problems with logical empiricism’s account of the

justificatory relationship between theory and empirical evidence - underdetermination

problems with classical confirmation theory and complexity of test problems with

falsification. The final problem I want to raise challenges the whole point of foundational justification. Recall that foundationalism requires the identification of an epistemically

privileged subset of knowledge claims from which others derive their warrant. But what

guides the choice of such a subset? In the case of Hume’s classical strict foundationalism,

choice of foundation is guided by a theory of the powers of the human mind, notably

a theory of learning and cognition. Learning is occasioned by the receipt of sensory

impressions and cognition is partly a matter of the logical manipulation of these

impressions (Hooker, 1975). The trouble with such a theory is that it is not known

non-inferentially; it is not part of the foundations. Indeed it cannot be because it makes

general empirical claims about human learning, thus requiring inferential justification.

But if the selection of privileged knowledge claims depends on the use of non-privileged

theory, the structure of foundational justification collapses.

This argument also applies to broad foundationalism. For the choice of observational

evidence to test theories reflects theoretical beliefs about knowledge acquisition by

humans. By the same token, Campbell and Stanley’s plausibility judgments and

Cronbach and Meehl’s strategic decisions are likewise theory-laden. The correct solution

to this problem in my view, one defended by Quine (1969) and adopted with increasing

systematicity by Campbell, is to naturalize epistemology. If epistemology presupposes

theories of human learning and cognition why not just use the best theories available,

theories from science - psychology or cognitive neurobiology - rather than a priori, or armchair, theories? This sounds circular because the notion of "best" being employed

here is epistemic, so it will be one of the challenges of an alternative, non-foundationalist

epistemology to show that the circularity is not vicious.

Paradigms of Validity

All of the difficulties with logical empiricism we have canvassed have concerned

problems over the relationship between theory and empirical evidence. It would appear

that the matter of theory justification is not settled by evidence, however comprehensive.One conclusion drawn by a number of philosophers of science, for example Kuhn and

Feyerabend, is that if all the evidence there is for a theory is empirical evidence, and

if empirical evidence can never be adequate for rational theory adjudication, then so

much the worse for the enterprise of rational theory adjudication. This is especially the

case where alternative theories are comprehensive enough to contain, or entail, theory

specific criteria for theory choice. What Kuhn calls “paradigms” provide a good example

of this:

In learning a paradigm the scientist acquires theory, methods, and standards together, usually in

an inextricable mixture. Therefore when paradigms change, there are usually significant shifts in the criteria determining the legitimacy both of problems and of proposed solutions ... That observation ... provides our first explicit indication of why the choice between competing paradigms regularly raises questions that cannot be resolved by the criteria of normal science ... [scientists] will

invariably talk through each other when debating the relative merits of their respective paradigms. In

the partially circular arguments that regularly result, each paradigm will be shown to satisfy more or


less the criteria that it dictates for itself and to fall short of a few of those dictated by its opponent

[Kuhn, 1962, pp. 109-110].

Here we have an argument for the incommensurability of paradigms, for epistemic

relativism between fairly comprehensive theories. Moreover, belief in paradigm specific

epistemologies has a direct relevance for research methodology. Orthodox logical

empiricist theorizing about research designs, validity, and reliability, for example, is

akin to “normal science”. But there are other, alternative ways of conceiving research

and inquiry. The question of which is best cannot rationally be decided because the

epistemic notion of “best” is relative to each paradigm.

Lincoln and Guba (1985) offer a detailed version of the paradigms thesis of educational

research as part of their defense of naturalistic inquiry. (Note that this is a different

sense of “naturalism” to that employed by Dewey, Quine, or Campbell, when they

speak of epistemology.) Thus, consider their discussion of what it takes to establish the trustworthiness of an inquiry. "Trustworthiness" is an epistemic notion to do with the

warrantability of an inquiry’s findings or inferences. As such, standards of justification

will be paradigm specific. To demonstrate this, they consider answers to the following

four research questions:

(1) How do we show the “truth” of the findings of a particular inquiry?

(2) To what extent are these findings applicable to other contexts?

(3) Can the inquiry be replicated?

(4) How can we establish that the results are independent of researcher biases and

perspectives? (See Lincoln & Guba, 1985, p. 290.)

Within the logical empiricist paradigm - what they call positivism - we have four

familiar answers. Establishing internal validity is crucial for the first; external validity

matters for the second; reliability is what the third is all about; and the fourth concerns

objectivity. But this cluster of answers draws on a common set of epistemological and

metaphysical assumptions. For example, that there is a world “out there” which can be

known which corresponds to true claims, that the knower can be separated from the

known, that events are relatively separable and independent, that different events have

different causes, and that what happens in the world can be known in a way free from

value assumptions (Lincoln & Guba, 1985, p. 28).

However, using arguments from underdetermination of theory, complexity of tests,

and theory-ladenness of observation, Lincoln and Guba both challenge the truth

of logical empiricism and maintain that its epistemological standards are distinct,

indeed orthogonal to some alternatives. Within the paradigm of naturalistic inquiry,

the above four questions would be answered as follows: establishing credibility, not

internal validity, is vital for the first; the second is a matter of transferability, not

external validity; the third requires a case for dependability, not reliability; and the

last involves confirmability, not objectivity (Lincoln & Guba, 1985, pp. 301-327). There

is, of course, a detailed epistemological story to be told about the adequacy conditions

for establishing each of these criteria for naturalistic warrantability. Suffice it to note that

these are supposed to be relative to the naturalistic inquiry paradigm and not the logical

empiricist paradigm, which is simply inapplicable - being a different paradigm. The

upshot is that the traditional notions of validity appear to have integrity, or definition,

only within logical empiricism.


One puzzle for the paradigms approach is whether, for example, an ethnographic

study can be said to describe something “out there”, whether the non-positivist study

can be said to correspond with how the world really is. If logical empiricism has a

mortgage on correspondence truth, realism, and objectivity, then in what sense might

the inferences and findings of non-positivist research paradigms be regarded as “true”?

If we accept the relativism of the paradigms thesis then there is an equivocation over

the word “true”. Its various senses would be paradigm relative. This might satisfy those

researchers who are prepared to acquiesce in subjectivism, but it creates difficulties for

critics of logical empiricism who want to argue for social change or improvement - for

example defenders of critical theory, feminist research, or action research. These critics

want to say that there are realities “out there” that are oppressive independently of how

victims mistakenly see matters, that limit human potential regardless of how ideologically

content we may feel with our lot, that need to be changed. (See, for example, Foster,

1986; Bates, 1983, in the critical theory tradition.) On this view, radical subjectivism

becomes a political stance in de facto support of an existing distribution and exercise

of power. However, since logical empiricism is also thought to be part of the political

problem, where are the solutions?

Lather (1986, p. 65) describes this dilemma as being caught between a rock and a

soft place; between the “unquestionable need for trustworthiness in data generated by

alternative paradigms and . . . the positivist claim to neutrality and objectivity.” She

proposes a solution that is aimed at reconceptualizing validity within a postpositivist

framework of interpreted data and researcher commitment. Essentially, validity of

research, interpretation, and coordinating background theory, is a matter of ensuring

the presence of self-correcting research procedures and practices. Examples include

triangulation of methods, expanding construct validity to include accounts of how data

figured in the transformation of theory, requiring face validity to reflect participants' reactions to inquiry, and proposing guidelines for catalytic validity to require "that

respondents gain self-understanding and, ideally, self-determination through research

participation” (Lather, 1986, p. 67).

Various questions of detail could be asked about each of these proposals. However,

the puzzle for me is the point of the exercise, which appears to be aimed at removing

researcher bias through the provision of self-correcting research. For if positivism has a

lien on objectivity, then presumably "bias" and "correction" are also paradigm relative. The trouble is, these terms are clearest within the rejected logical empiricist paradigm.

But in the absence of a world “out there” that can be known by inquirers willing to

use some methods rather than others, it is difficult to know what to count as bias and

correction. On the other hand, if a clear meaning is established in some other paradigm,

it is difficult to see why there is a dilemma at all over the rock and soft place. Why should

one research paradigm be obliged to meet the epistemological demands of another? Yet

the assumption of some such obligation appears to lie behind not just Lather’s discussion,

but Lincoln and Guba’s as well.

Consider, for example, the methodological virtue of triangulation. This is only an

epistemological virtue relative to the kind of inferences that are thought to be sanctioned if agreement occurs. But again, to ask the more basic question, why should agreement

be epistemologically more desirable than disagreement? I can suggest some metaphysical

baggage that would link agreement in triangulation with trustworthiness of inference, but

it will be uncongenial to paradigms theorists. It is, in a word: realism. The supposition of


a real world in complex causal interaction with physical, thinking, acting, interpreting,

inferring humans (and whatever equally real causally discriminating apparatus they may

be using) certainly provides the basis for an economical account of the conditions under

which triangulation can be an epistemic virtue.
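A small simulation can illustrate that realist rationale; the methods, biases, and noise levels below are invented. Three procedures that are individually noisy and differently biased, but all causally coupled to the same real quantity, end up agreeing with one another far more closely than their individual scatter would suggest, and that convergence is what triangulation exploits.

    import numpy as np

    # Toy simulation with invented parameters: three independent, imperfect methods
    # (say, survey, observation, documents) all causally tracking one real quantity.
    rng = np.random.default_rng(1)
    real_quantity = 7.0

    def method(bias, noise, n=50):
        """An imperfect procedure that nevertheless responds to the real quantity."""
        return real_quantity + bias + rng.normal(scale=noise, size=n)

    survey      = method(bias=+0.3, noise=1.0)
    observation = method(bias=-0.2, noise=0.8)
    documents   = method(bias=+0.1, noise=1.2)

    # Because each method responds to the same underlying state of the world, their
    # estimates converge; remove that common cause and there is nothing to converge on.
    print([round(m.mean(), 2) for m in (survey, observation, documents)])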

The structure of this argument is known as “inference to the best explanation”

(BonJour, 1985). Evidence for realism does not lie in some kind of direct sensory

experience of the furniture, or ontology, of the world; not even for the strictest

empiricist. Arguments for underdetermination, empirical test complexity, and theory-

ladenness tell decisively here. As Quine has remarked:

What are given in sensation are variformed and varicoloured visual patches, varitextured and

varitemperatured tactual feels, and an assortment of tones, tastes, smells, and other odds and

ends; desks are no more to be found among these data than molecules [Quine, 1960, p. 250].

Yet the notion of “best” is still epistemic. Moreover, even if the realist hypothesis

is rejected as a way to defend the trustworthiness of certain inquiry procedures, any

argued replacement is “better” in an equally epistemic sense. This suggests that other

epistemological criteria are being employed, criteria that can be common to different

paradigms.

Coherence Justification

Let us explore the possibility of touchstone theory choice, and hence validity criteria

by re-examining our earlier arguments against logical empiricism. In each case the

structure of the argument amounted to showing that for some epistemological feature - objectivity, falsification, confirmation - the demands of the feature outran the

resources of empirical evidence. This argument structure is not saying that logical

empiricism is inadequate because it is not warranted by the data. Rather, the argument

is that logical empiricism is incoherent: it makes claims it cannot satisfy on its own

suppositions. But if this is the nature of the argument, then it will sustain another

conclusion; namely, there is more to evidence than empirical evidence. The reductio

argument against narrow empiricism only goes through if we suppose an equally narrow

view of evidence. However, if incoherence can function as evidence for the inadequacy of logical empiricism, then it ought to be applicable as a standard of evidence for the

adjudication of other theories. Churchland makes this point in a general way:

Since there is no way of conceiving or representing “the empirical facts” that is completely independent

of speculative assumptions, and since we will occasionally confront theoretical alternatives on a scale so

comprehensive that we must also choose between competing modes of conceiving what the empirical

facts before us are, then the epistemic choice between these global alternatives cannot be made

by comparing the extent to which they are adequate to some common touchstone, “the empirical

facts”. In such a case, the choice must be made on the comparative global virtues of the two global

alternatives, Tl-plus-the-observational-evidence-therein-construed, versus T2-plus-the-observational-

evidence-therein-(differently)-construed. That is, it must be made on superempirical grounds such

as relative coherence, simplicity, and explanatory unity [Churchland, 1985, pp. 41-42].

Churchland’s point, together with the fact that paradigms theorists use the incoherence

of empiricism to argue that it is not correspondence true, suggests that the notion of

coherence evidence is compatible with correspondence truth. The strategy would be


to let coherence criteria grind out their story of which is the most warranted theory,

and then assume the existence of all objects presumed by that theory as constituting

the nature of the world - what the theory matches up with, or corresponds to. This

would even permit strong relativists to say that research is really (and correspondence truly) paradigmatic in its structure. Such is the payoff from relocating certain narrow

empiricist notions of correspondence, realism, and evidence into a non-foundational,

coherentist context of epistemic justification.

Using this account of justification will permit us to solve a number of difficulties

raised earlier; for example the circularity problem created over what constitutes the best

theory of knowledge acquisition when these are embedded in the very epistemologies

under dispute. Requiring the theories to be coherent will force choices. It will also

select reflexiveness since a good epistemology should leave its subject matter and

itself learnable, and learnability will include the correction of error. Furthermore,

there will be a premium on naturalistic epistemologies if our most coherent theories

of humans reckon them to be part of the natural order. There are advantages of

parsimony in seeing knowledge as part of the natural order too, and in elaborating

accounts to include social factors in the production and distribution of knowledge. In

this way, a coherence theory of justification, with its emphasis on consistency, simplicity,

comprehensiveness, and explanatory unity, can function as a self-reflexive touchstone to

winnow rival epistemological alternatives without circularity.

Interestingly, Campbell defends the main claims being made here. His support of

naturalistic epistemology, in particular evolutionary epistemology, is long standing.

However, more recently, responding to Quine’s arguments, he has defended the

combination of coherence justification and correspondence truth: “My position is to

accept the correspondence meaning of truth and goal of science and to acknowledge

coherence as the major but still fallible symptom of truth” (Campbell, 1977, p. 445).

Needless to say, in drawing so heavily on scientific theory for epistemological details,

this view coheres well, not with naive realism, but with scientific realism. Those who wish to produce a non-realist account of why agreement in triangulation is an epistemic virtue would need to make it cohere with some view about the reality,

or otherwise, of knowing subjects and their environment.

Earlier, in the discussion about operational definition, we saw that empirical

evidence distributes its evidential support holistically. Nevertheless, there must be some provisional set of antecedent hypotheses that permit some observations to

count as more salient for the learning of causally contiguous (one word) sentences.

There must be some entering point for infant language learning to be possible. This

epistemological constraint of learnability will not save operational definition, but it

will create difficulties for the kind of semantic holism needed to support the thesis

that paradigms are incommensurable. For incommensurability trades on the idea that

the meaning of a term is a matter of its conceptual role in a theory. Systematically

different theories of leadership, intelligence, and education, for example, will leave

different uses of these terms orthographically identical but semantically distinct, so that

speakers will talk past each other. Now from an epistemological point of view, learning the meaning of these terms requires a prior mastery of the theoretical network necessary

to provide conceptual roles. But theoretical networks are comprised of terms which in

turn have to be learned. So a regress threatens to make theories unlearnable. The correct

response to this problem is not to compromise semantic holism by reinstating a sharp

theory/observation distinction. Rather, the learnability argument is best construed as

evidence for the existence of a certain amount of (shifting, provisional) touchstone

theory in the processing of experiences (Churchland, 1979, pp. 75-80; Papineau, 1979;

Walker & Evers, 1982, 1988). One might expect naturalistic learning theories to throw

some light on the nature of this touchstone in humans at the perceptual level.

The ubiquity of theory and the methodological appeal to touchstone coherence criteria

of theory justification suggest that varieties of validity are ultimately different kinds of

construct validity, and validating constructs is a matter of developing a most coherent

global theory. If this is so, it will explain the diffuseness that was first implicit and is

now more explicit in validity theory. Consider Cronbach’s recent reflections:

The positivists’ distinction between theory and observation can no longer be sustained and there is no hope of developing in the short run the “nomological networks” we once grandly envisioned. . . . Our best strategy is probably contextualism. . . . In brief, one offers a generalization and then tries to locate the boundaries within which it holds. As the structure ultimately becomes clumsy, someone will integrate most of the information into a more graceful one. For scientists, this is a reminder that knowledge evolves slowly and indirectly, that one can be prideful about contributing to the advance without the hubris of insisting that one has the “correct” theory. For practical testers, this warns that an instructive program of construct validation - strong or weak - is unlikely to reach the closure needed to defend a test that is already under fire [Cronbach, 1988, p. 14].

Messick (1988, p. 42), elaborating and defending the unified view of validity contained in the 1985 Standards for Educational and Psychological Testing, identifies four bases for

test validity: (1) plausibility of interpretations as inductive summaries of evidence, (2)

the value implications of test interpretations, (3) the relevance of scores for particular

applications, and (4) the social consequences of proposed uses of tests. For Messick (1988) . . .

the heart of the unified view of validity is that appropriateness, meaningfulness, and usefulness

of score-based inferences are inseparable and that the unifying force is empirically grounded construct

interpretation [p. 35].

It would follow that the global theory to be validated, or shown more coherent than

rivals, would include ethical theory, as well as a theory of society and social causation.

Having seen earlier that Campbell accepted objections to logical empiricism and

embraced coherentist justification, I should note a qualification with regard to his

recent views on the distinction between internal and external validity. Campbell

(1986b) redraws this distinction as “local molar causal validity” and the “principle

of proximal similarity”. Roughly speaking, the former involves no generalization -

although Campbell qualifies this as an “exaggeration” - the latter does. The former

invites us to “back up from the current overemphasis on theory first” (Campbell, 1986b,

p. 70). Crucially, “in the new contrast, external and construct validities involve theory.

Local molar causal validity does not” (Campbell, 1986b, p. 76). The puzzle is that

Campbell knows that the whole enterprise of identifying local molar causes is laden with

theory - theory containing general terms. Identifying caused outcomes involves the use

of descriptions and terms that imply such outcomes are kinds. It is as though he is trying

to draw the observation/theory distinction at the internal/external validity level. Yet for

inferential purposes, the strongest distinction possible here is that between plausible and

implausible theories, and that requires a coherentist argument. Some remarks he makes

in another context on a parallel distinction - between the analytic and the synthetic -

may clarify this puzzle:

Thus while usually convinced by Quine when I read him, insofar as I can understand him, I have not really stabilized the Gestalt switch which is required to abandon the analytic-synthetic distinction [Campbell, 1977, p. 44].

So far, I have argued that the process of validation in general is coherentist and realist

rather than empiricist or paradigmatic, and that a coherence theory of justification is

compatible with a correspondence theory of truth. I want to conclude with some brief

remarks about judging hypotheses plausible, since that is of concern to both Cronbach

and Campbell. Let us suppose that we have two hypotheses up for consideration. Since

plausibility is relative to the quality of presumed background theory, let us locate each

within a theory, T1 and T2. Now whatever empirical tests we devise for adjudicating between T1 and T2 - confirmation perhaps, or falsification - they will be insufficient.

We need to look at superempirical theoretical virtues. Work on these is at an early stage

in epistemology so it is not possible to give a rigorous account of their application. But

they can be made intuitively clear. For example, we should prefer T1 to T2 if T1 is simpler

than T2. If we construe simplicity in terms of number of assumptions, or axioms, then it

will be a virtue in this sense. Any theory can account for any phenomenon if we are

willing to just add assumptions. Indeed, without some premium on simplicity, the notion

of explanation loses its point if we can just add an assumption for every phenomenon

to be explained. We should prefer T1 to T2 if T1 explains more than T2. We should prefer T1 to T2 if T1 is free from contradiction and T2 is not. Anything can be derived from a theory that contains a contradiction. We should prefer T1 to T2 if T1 squares better with what we already have reason to believe. We should prefer T1 to T2 if T1

leaves behind fewer difficult unanswered questions. One question for each theory is how it can ever be learned, or acquired. Coherence justification does not leave justified

theories floating above the evidence. Rather, the theory must cohere with some account

of how it could be learned (presumably from experience if naturalistic epistemology is

sound; but note that any epistemology will have to be self-referential, or reflexive) (see

Williams, 1980). Applying this condition, we may conclude that Tl coheres better with

an epistemology we have reason to believe, than T2. Herein lies the basis for plausibility

judgments among hypotheses.
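To make the comparative use of these virtues concrete, the following is a minimal, purely illustrative sketch in Python. The criterion names, the crude 0-10 ratings, and the equal weighting are assumptions introduced here for illustration only; nothing of the sort is proposed in the text. The point is simply that coherentist adjudication aggregates several virtues at once instead of appealing to empirical fit alone.

# Toy comparison of two theories on "superempirical" virtues.
# Criteria, ratings, and equal weighting are invented for illustration.
CRITERIA = ["simplicity", "explanatory_breadth", "consistency",
            "fit_with_background_beliefs", "fewer_open_questions", "learnability"]

def coherence_score(theory):
    # theory: dict mapping each criterion to a rough 0-10 rating
    return sum(theory.get(c, 0) for c in CRITERIA)

T1 = {"simplicity": 7, "explanatory_breadth": 8, "consistency": 10,
      "fit_with_background_beliefs": 6, "fewer_open_questions": 5, "learnability": 8}
T2 = {"simplicity": 9, "explanatory_breadth": 4, "consistency": 10,
      "fit_with_background_beliefs": 7, "fewer_open_questions": 4, "learnability": 8}

preferred = "T1" if coherence_score(T1) >= coherence_score(T2) else "T2"
print(preferred, coherence_score(T1), coherence_score(T2))

Any such additive scoring of course badly oversimplifies the global and often controversial character of coherence judgments discussed below; it is offered only as a picture of multi-criterion adjudication.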

There are other formulations of, and variations on, the above conditions which are

based on the work of Lycan (1988). For example, Evers and Lakomski (1991) have

undertaken a systematic application of coherence justification to competing theories of educational administration. And Thagard (1988) has attempted to explain the growth

of scientific knowledge, especially the replacement of one theory by another, using

coherence criteria. However, because of the global nature of coherence justification,

theory adjudication is a complex and often controversial business, even when simplified

by the acceptance of a great deal of shared background or touchstone theory. Judgments

of validity will thus likewise be controversial or provisional, mirroring the epistemic

status of the theories such judgments are about. Note that it is not the term “validity”

which is here being used equivocally; it is embedded within a coherentist realist

epistemology. Rather the uncertainty is over what knowledge, or theory, is most

likely to be true. But the answer to this question will depend in turn on improving our understanding of the epistemology of theory choice, with gains for the coherentist

accruing to naturalistic approaches to human knowledge acquisition.

Perhaps the most interesting recent work on the task of naturalizing coherence

justification is in the area of cognitive neurobiological modeling. That most powerful

of epistemic engines, or learning systems, the human brain, appears to learn by

applying holistic soft constraints on global neural network representations of knowledge,

suggesting there is much to be learned about knowledge and its justification from

neuroscience (Churchland, 1986; Thagard, 1989). Here, however, we need to look atnew ways of representing knowledge and its dynamics - not the familiar sentential

representations, but in terms of neural network geometries and connection strengths

between neurons. Much naturalistic epistemology in this vein is occurring in the study

of parallel distributed processes - the mathematical modeling of neural networks

(Rumelhart & McClelland, 1986; Muller & Reinhardt, 1990. For an introduction to

these ideas in education, see Evers, 1990).
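For readers unfamiliar with this style of representation, the following toy sketch in Python (an invented example, not drawn from the works just cited) shows the basic idea: what the system "knows" is carried entirely in its connection strengths, and learning consists in adjusting those strengths rather than in adding or deleting sentences.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A tiny 2-2-1 feedforward network. All of its "knowledge" lives in these
# connection strengths (weights); there are no stored sentences.
w_hidden = [[0.5, -0.4], [0.3, 0.8]]   # input -> hidden weights
w_out = [0.7, -0.6]                    # hidden -> output weights

def forward(x):
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    output = sigmoid(sum(w * h for w, h in zip(w_out, hidden)))
    return output, hidden

# One crude learning step: nudge the output weights to reduce the error on a
# single example (a bare-bones version of the delta rule).
def learn_step(x, target, rate=0.5):
    global w_out
    output, hidden = forward(x)
    error = target - output
    w_out = [w + rate * error * h for w, h in zip(w_out, hidden)]

before, _ = forward([1.0, 0.0])
learn_step([1.0, 0.0], target=1.0)
after, _ = forward([1.0, 0.0])
print(before, after)   # the response drifts toward the target as weights change

Even this caricature makes the contrast with sentential representation visible: the "content" of the network is a point in a space of connection weights, which is why its dynamics are better described geometrically than logically.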

Having recognized the limits of logical empiricism and more recently paradigms

theory, philosophers, methodologists, and cognitive scientists now have the opportunity

to study the microstructure of cognition, the fine details of epistemically progressive

global and local belief change. However, this is only the beginning of a full coherentist

account of theory validation. To steal a line from Donald Davidson, it’s nice to know

that we won’t run out of work.

References

Bates, R. (1983). Educational administration and the management of knowledge. Geelong: Deakin University Press.
BonJour, L. (1985). The structure of empirical knowledge. Cambridge, MA: Harvard University Press.
Campbell, D. T. (1977). Descriptive epistemology: psychological, sociological and evolutionary. The William James Lectures, Harvard University. Cited as reprinted in Campbell (1988).
Campbell, D. T. (1984). Can we be scientific in applied social science? In R. F. Connor, D. G. Altman, & C. Jackson (Eds.), Evaluation Studies Review Annual (pp. 26-48). Cited as reprinted in Campbell (1988).
Campbell, D. T. (1986a). Science's social system of validity-enhancing collective belief change and the problems of the social sciences. In D. W. Fiske & R. A. Schweder (Eds.), Metatheory in social science: Pluralism and subjectivities (pp. 108-135). Chicago: University of Chicago Press. Cited as reprinted in Campbell (1988).
Campbell, D. T. (1986b). Relabeling internal and external validity for applied social scientists. In W. M. K. Trochim (Ed.), Advances in quasi-experimental design and analysis (pp. 67-77). San Francisco: Jossey-Bass.
Campbell, D. T. (1988). Methodology and epistemology for social science. Chicago: University of Chicago Press.
Campbell, D. T. & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
Churchland, P. M. (1979). Scientific realism and the plasticity of mind. Cambridge: Cambridge University Press.
Churchland, P. M. (1985). The ontological status of observables: In praise of superempirical virtues. In P. M. Churchland & C. A. Hooker (Eds.), Images of science. Chicago: University of Chicago Press.
Churchland, P. S. (1986). Neurophilosophy. Cambridge, MA: M.I.T. Press.
Cook, T. D. & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago: Rand McNally.
Cronbach, L. J. (1988). Five perspectives on the validity argument. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 3-17). Hillsdale, NJ: Lawrence Erlbaum.
Cronbach, L. J. & Meehl, P. E. (1955). Construct validity in psychological tests. Psychological Bulletin, 52, 281-302. Cited as reprinted in C. I. Chase & H. G. Ludlow (Eds.) (1966). Readings in educational and psychological measurement. New York: Houghton Mifflin.
Evers, C. W. (1990). Educating the brain. Educational Philosophy and Theory, 22(2), 65-80.
Evers, C. W. & Lakomski, G. (1991). Knowing educational administration. Oxford: Pergamon Press.
Feigl, H. (1950). Existential hypotheses. Philosophy of Science, 17(1), 35-62.
Feyerabend, P. K. (1962). Explanation, reduction, and empiricism. Minnesota Studies in the Philosophy of Science, Vol. 3. Minneapolis: University of Minnesota Press.

Foster, W. (1986). Paradigms and promises. Buffalo, NY: Prometheus Books.
Hanson, N. R. (1958). Patterns of discovery. Cambridge: Cambridge University Press.
Hempel, C. (1965). Aspects of scientific explanation. New York: Free Press.
Hempel, C. (1966). Philosophy of natural science. Englewood Cliffs: Prentice-Hall.
Hooker, C. A. (1975). Philosophy and meta-philosophy of science: empiricism, Popperianism and realism. Synthese, 32, 177-231.
Kerlinger, F. N. (1964). Foundations of behavioral research. New York: Holt, Rinehart and Winston.
Kuhn, T. (1962). The structure of scientific revolutions. Chicago: University of Chicago Press.
Lather, P. (1986). Issues of validity in openly ideological research: Between a rock and a soft place. Interchange, 17(4), 63-84.
Lincoln, Y. & Guba, E. (1985). Naturalistic inquiry. Beverly Hills: Sage Publications.
Lycan, W. G. (1988). Judgement and justification. Cambridge: Cambridge University Press.
Messick, S. (1988). The once and future issues of validity: Assessing the meaning and consequences of measurement. In H. Wainer & H. I. Braun (Eds.), Test validity (pp. 33-45). Hillsdale, NJ: Lawrence Erlbaum.
Muller, B. & Reinhardt, J. (1990). Neural networks: An introduction. Berlin: Springer.
Papineau, D. (1979). Theory and meaning. Oxford: Clarendon Press.
Popper, K. R. (1959). The logic of scientific discovery. London: Hutchinson.
Popper, K. R. (1963). Conjectures and refutations. London: Routledge and Kegan Paul.
Quine, W. V. (1953). Two dogmas of empiricism. In W. V. Quine (1961), From a logical point of view (pp. 20-46). Cambridge, MA: Harvard University Press.
Quine, W. V. (1957). The scope and language of science. In W. V. Quine (1976), The ways of paradox and other essays (pp. 228-245) (second edition, enlarged). Cambridge, MA: Harvard University Press.
Quine, W. V. (1960). Posits and reality. In W. V. Quine (1976), The ways of paradox and other essays (pp. 69-90). New York: Columbia University Press.
Quine, W. V. (1969). Epistemology naturalized. In W. V. Quine, Ontological relativity and other essays (pp. 69-90). New York: Columbia University Press.
Rumelhart, D. E. & McClelland, J. L. (Eds.) (1986). Parallel distributed processing, Vols 1 and 2. Cambridge, MA: M.I.T. Press.
Standards for educational and psychological tests and manuals (1966). Washington, DC: American Psychological Association.
Technical recommendations for psychological tests and diagnostic techniques (1954). Psychological Bulletin, 51, 201-238.
Thagard, P. (1988). Computational philosophy of science. Cambridge, MA: M.I.T. Press.
Thagard, P. (1989). Explanatory coherence. Behavioral and Brain Sciences, 12, 435-502.
Watanabe, S. (1969). Knowing and guessing. New York: John Wiley.
Walker, J. C. & Evers, C. W. (1982). Epistemology and justifying the curriculum of educational studies. British Journal of Educational Studies, 30(2), 213-229.
Walker, J. C. & Evers, C. W. (1988). The epistemological unity of educational research. In J. P. Keeves (Ed.), Educational research, methodology and measurement: An international handbook (pp. 28-36). Oxford: Pergamon Press.
Williams, M. (1980). Coherence, justification, and truth. Review of Metaphysics, 34(2), 243-272.

Biography

Colin W. Evers holds degrees in mathematics, philosophy, and education, and teaches

in the School of Graduate Studies, Faculty of Education, Monash University, Australia.

His teaching and research interests are in philosophy of education, educational research

methodology, and educational administration.

CHAPTER 3

POLICY ANALYSIS: PRACTICAL REASON OR EMPIRICAL SCIENCE?

GABRIELE LAKOMSKI

School of Education, The University of Melbourne, Parkville, Victoria 3052, Australia

Abstract

This chapter examines William Dunn's transactional model of argument, which utilizes practical reason as its methodological framework for policy decision-making. It is argued that this formal model, offered as an alternative to scientific policy analysis, fails on account of its assumption of knowledge and its acquisition as stipulated by the folk psychological sentential view of learning. Since recent empirical evidence does not support this view, a more defensible methodological framework for policy analysis is suggested, based on the causal account of knowledge, i.e., the way real brains acquire and process information.

The Methodological Dispute in Policy Analysis

Methodological concerns have played a central role in public and educational policy

analysis for as long as these disciplines have existed as discrete fields of inquiry and

professional practices (Boyd, 1988; Garson, 1986; Mitchell, 1984). Amongst the many

problems still awaiting solutions none is more persistent than that of rationally selecting

“best policy” under conditions of imperfect knowledge and uncertainty. Traditionally,

Lasswellian policy science (Lasswell, 1951; Lerner & Lasswell, 1951), or the synoptic tradition, sought to attack the problem by taking a global, historical view of the policy context, and to study the conditions of social change, with systems analysis as its preferred theoretical framework and a narrow empiricism as its methodology (Garson, 1986, p. 538).

While equally mindful of the complexity of social life, defenders of the anti-synoptic

tradition argued for an incremental, piecemeal approach to policy analysis (Braybrooke

& Lindblom, 1963), defended pluralism as its theoretical framework, and emphasized

case and contextual studies as appropriate methodologies. The major concern of

anti-synoptic writers was, and continues to be, however, the alleged value-neutrality of Lasswellian policy science, a concern which has subsequently led them to question the status of the field as a science, as well as the role of policy analysis in a democratic society.

One consequence of this development is that some analysts, motivated in part by

reports of the negligible and even negative impact of policy research (e.g., Dunn, Mitroff, & Deutsch, 1981) simply declared the science question dead: “Resolved: Policy analysis is

not a science, is not scientific; indeed, scientific status is an inappropriate goal for policy

analysis” (Landsbergen & Bozeman, 1987, p. 625). Another consequence of the currently

prevailing anti-science mood is the rise of alternative models of policy analysis which

align themselves directly with postpositivist, interpretive social science (e.g., Callahan & Jennings, 1983; Jennings, 1983, 1987). Specifically, these anti-synoptic writers argue for

the inclusion of political and ethical values, as well as the social context of policy analysis

generally (e.g., Dunn, 1983; Fischer & Forester, 1987; MacRae, 1976; Wildavsky, 1979)

and advocate that naturalistic approaches be adopted as the most appropriate to deal

with these issues (e.g., Lincoln & Guba, 1986). Combining many of the concerns of

both the older anti-synoptic tradition as well as its contemporary interpretive expression

is the policy analysis model suggested by Dunn.

Dunn's transactional model of argument (Dunn, 1981, 1982), based on Toulmin's

practical logic (Toulmin, 1958), is a recent and explicit attempt at providing a

procedural model for rationally enabling policy choice. It represents a special case

of the methodological paradigm which largely characterizes the newer approaches to

policy analysis, that of practical reason with its emphasis on rhetoric, persuasion, and

legal reasoning. Since policy makers and analysts are meaning constructing agents, so

the argument goes, it is the epistemological feature of intentionality which provides

the real grounds to understand social complexity and thus facilitates superior policy

making. Unlike the “statistical empiricism” of Lasswellian policy science, the alternative

methodological proposal of practical reason is believed to be antithetical to empirical

scientific methodology and to lend itself neither to the formulation of laws, nor to

causal explanation. It is normative, not descriptive, more concerned with the adequacy

and cogency of policy arguments than their truth.

But the claim of practical reason to represent human understanding in its essence

is a matter of the soundness of the epistemology underwriting it. Following Quine's

(1969; Quine & Ullian, 1978) “epistemology naturalized” argument, what makes for

the soundness of an epistemology is that it be learnable; that is, its embedded theory

of the mind must present an adequate account of human cognition and learning. An

epistemology, then, is only as valid as its implied theory of learning, and there is no

principled distinction to be made between the former and the latter. Epistemology

is continuous with natural science (see Walker & Evers, 1982 who introduced this

argument into [philosophy of] education; Evers, 1987; Walker & Evers, 1988). In the

present context, this means that the account of learning embedded in practical reason

must be able to explain how we have come to know about such abstract objects as

purposes and beliefs which are taken to function in the explanation of behavior.

It is the purpose of this chapter to argue that the epistemology underlying Dunn’s

procedural model, the classical Justified True Belief (JTB) account, is unsound by way

of arguing that its implied view of learning and cognition, characterized by the “folk

psychological” (Stich, 1983) propositional attitudes, is mistaken. Showing this requires

demonstrating first that folk psychology is an empirical theory, and that it subsequently

can be assessed in the manner of all theories (P.M. Churchland, 1981, 1988a, 1989b;

P.S. Churchland, 1989). When evaluated as such, folk psychology's view of knowledge turns out to be sentential. But according to the neurosciences whose business it is

to explain brain functioning, there is no support for the assumption that the brain

functions primarily as a linear “sentence-cruncher”, to use P.S. Churchland’s (1989)

colorful phrase. The consequence for Dunn’s model is, then, that policy choice as an

example of intelligent, rational action has little to do with overt linguistic behavior,

regardless of the form this takes. If so, it would seem advisable to hold the folk

psychological categories of belief and desire at arm's length and concentrate on how

we in fact process information as a precondition for understanding policy choice as a

specific example of cognitive action.

Policy Analysis as Reasoned Argument

In his introduction to the “Symposium on Social Values and Public Policy” (Dunn,

1980-81), Dunn states that the policy sciences need to be based on a perspective which is

neither value-neutral nor value-committed. The policy sciences must not be value-neutral

since value-neutrality would deprive policy of political significance and take it back to

the canons of positivism and the “empirico-analytic sciences”. To be value-committed,

on the other hand, is equally undesirable because this would deprive the policy sciences of “reasoned ethical discourse”, which is Dunn's central concern. In his view, policy

knowledge can only grow through open critical discourse which, in turn, presupposes

rules or standards of assessment that enable the participants in policy making to examine

rival ethical claims without anyone thus dominating the outcome (Dunn, 1980-81, p.

519). These rules or standards are spelt out in the transactional model of argument, which is Dunn's proposal for operationalizing practical reason, directed towards establishing the adequacy and cogency of claims to knowledge rather than their truth (Dunn, 1981, 1982).

Following Toulmin, there are no context-independent criteria of judging the merits of an

argument, and this is as it should be, according to Dunn, given that policy analysis, as a

practical science, is carried out in specific contexts.

Dunn offers his model as a reply and an alternative to the kind of social-scientific

experimentation advocated by Campbell (1969) in his “experimenting society”. Given

that reforms are “symbolically mediated and purposive social processes”, Dunn believes

they are more akin to arguments than to quasi-experimentation, or scientific experimentation generally, which directs questions to nature directly. The strength of the

transactional model, as Dunn sees it, consists in broadening the range of standards

thought suitable for challenging and assessing knowledge claims, including a number

of specific tests which help determine their adequacy, cogency and relevance. These tests also have the function of “plausible rival hypotheses” to the knowledge claims

under examination. The model which best represents the way in which agents settle

competing claims in social contexts is that of jurisprudential reasoning. Its standards

include, amongst others, rules for making valid causal inferences. As a consequence,

and unlike the replication of experiments, argumentation leads “toward a pragmatic and

dialectical conception of truth . . . Knowledge is no longer based on deductive certainty

or empirical correspondence, but on the relative adequacy of knowledge claims which

are embedded in ongoing social processes” (Dunn, 1982, p. 94).

Of central importance in the transactional model is the distinction between analytic and substantial arguments which Dunn takes over from Toulmin (1958). Toulmin argues that formal logic is too narrow to be of any use in the practical assessment of arguments.

Our claims to knowledge are sound when our supporting arguments are adequate. What

is to count as an adequate argument is dependent upon the field from which it is taken:

“validity is an intra-field, not an inter-field notion” (Toulmin, 1958, p. 255).

The merits of a procedural schema, based on the above described assumptions and

proposed to guarantee rational policy choice if followed conscientiously, are said to

be such that it allows (1) for the visual representation of the systematic structures

of competing arguments; (2) for the critical analysis of the frames of reference or ideologies of those contesting knowledge claims; and (3) for the development of “truth”

and “utility” tests by means of which knowledge claims can be tested. Finally, and most

importantly, it allows for democratic knowledge creation via the critical exchange and

challenge of knowledge claims in communicative action. The model is composed of

six elements which are data (D), claim (C), warrant (W), backing (B), rebuttal (R),

and qualifier (Q). The first three elements parallel those of the classical syllogism, to

be supplemented by the second set. Backing (B) denotes additional data, claims or

arguments introduced in case the warrant is in doubt (Dunn, 1982, p. 96). Warrants

provide reasons for the acceptance of a claim (Dunn, 1981, p. 42). Rebuttal (R)

specifies conditions under which the adequacy or cogency of a knowledge claim can

be challenged. Policy claims and rebuttals together “form the substance of policy issues,

that is, disagreements among different segments of the community about alternative

courses of government action” (Dunn, 1981, p. 42). Qualifier (Q) serves to indicate the

degree of cogency or force of a claim.
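Purely to display the structure of the schema, the six elements can be pictured as a simple record. The Python rendering below, including the field names and the sample wording (anticipating the single-sex schooling example discussed later in this chapter), is an illustration added here; it is not Dunn's own formalization.

from dataclasses import dataclass, field
from typing import List

@dataclass
class PolicyArgument:
    data: str                  # D: policy-relevant information
    claim: str                 # C: the course of action advocated
    warrant: str               # W: the reason licensing the step from D to C
    backing: List[str] = field(default_factory=list)    # B: support for W if it is doubted
    rebuttals: List[str] = field(default_factory=list)  # R: conditions challenging the claim
    qualifier: str = "possibly"                          # Q: degree of cogency claimed

# Illustrative instance only; the wording is invented for the example.
argument = PolicyArgument(
    data="Girls achieve better results in single-sex classrooms (Gill, 1988).",
    claim="Girls should be taught in single-sex classrooms.",
    warrant="Separating girls and boys caused the higher achievement.",
    backing=["Equality of opportunity demands whatever raises girls' achievement."],
    rebuttals=["Short-term academic gains may not outweigh longer-term socialization costs."],
    qualifier="definitely")
print(argument.qualifier, "-", argument.claim)

Nothing in this rendering, of course, settles how competing instances of the schema are to be adjudicated, which is precisely the difficulty pursued below.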

Following the definition suggested by Weiss and Bucuvalas (1980), Dunn (1982)

considers truth tests to be “decision points concerning evidence; the grounds for

accepting or rejecting truth claims include . . . empirical as well as formal rational tests”

(p. 100). Relevance or utility tests, on the other hand, “are decision points concerning the

delineation of an appropriate domain of inquiry or action”, a definition Dunn expands to

include “the explicit or implicit purposes of knowledge claimants or their challengers”

(p. 101). Truth tests which appraise the adequacy of a knowledge claim and challenge

its causal assumptions are considered more problematic. Dunn suggests a classification

of truth tests which takes account of alternative modes of explanation (von Wright),

different knowledge-constitutive interests (Habermas), and competing standards for

assessing ethical claims (MacRae). Specifically, Dunn (1982, p. 105) proposes that

there should be different truth tests for different “knowledge transactions”, presumably

based on the assumption that there are different kinds of knowledge to be transacted

which have their corresponding purposes. These are empirico-analytic, concerned with logical consistency of laws, etc. and/or their correspondence to observed regularities; interpretive, directed toward human purposes, reasons, and motives; pragmatic, denoting effective past action; authoritative, relating to the well-established status of those producing knowledge as well as its general acceptance, or the use of approved methods; lastly, critical, in the Habermasian sense of liberating human actors from unexamined

doctrines. According to Dunn, we learn what counts as an adequate argument under

specific conditions in specific professions; knowledge is fallible and corrigible. This is

quite in keeping with Toulmin’s epistemological prescription to ignore scepticism and

moderate our ambitions (Toulmin, 1958, p. 248).

Applying the Procedure

It is easy to see why Toulmin’s applied logic seems attractive to policy writers

such as Dunn (and also Mason & Mitroff, 1980-81) since here we appear to have

a sound rationale for including the ordinary arguments of practitioners and those

of the policy experts and scientists, thus satisfying the democratic ideal central to the

anti-synoptic tradition. The model is all the more attractive since it is also combined

with an assurance that the transactional model is not only context-sensitive, but also value-critical and intersubjectively valid. That is, it avoids the problems associated with

straight-out partisan approaches without having to compromise on the right (democratic)

values. Furthermore, we are also presented with a formal set of procedures by means of

which we could settle any kind of claim or argument and thus guarantee policy choice.

But can we?

Dunn acknowledges that stakeholders, subject to their different worldviews,

ideologies, and frames of reference, are bound to challenge the presuppositions

used to back up a warrant, that is, the reasons provided for a claim. He does not

consider such challenges a disadvantage, but points out that in the empirico-analytic

and hermeneutic sciences such questioning is not even possible, and it is this special

feature which makes his model unique. While the potential for publicly challenging

knowledge claims may be considered an advantage (accepting for the sake of the

argument Dunn’s claim regarding the alleged inability of both natural science and

hermeneutics to challenge assumptions), whether or not it is of value depends on

the model’s ability to settle the challenge, to determine what is to count as the

more adequate argument, given its own theoretical resources. Since policy analysis

in Dunn's view aims “to produce and transform policy-relevant information that may be utilized in political settings to resolve policy problems” (Dunn, 1981, p. 35), the

model’s claims stand or fall with the accuracy of its account of knowledge production

and the practical performance it is said to guarantee.

Some problems are readily apparent. It is a commonplace to note that in the

empirical world, power, learning, and linguistic differences are distributed unevenly,

disadvantaging some from the outset, and that gender, race, as well as age play a

significant role in who gets to talk in social situations, and who is listened to, and

who is included in the deliberation process in the first place. In addition, the implied

assumption that all are committed to rational debate, that all share the same willingness

to declare their ideologies or standpoints honestly, and - by implication - that none

are thereby disadvantaged, is quite unrealistic.

For his model to work, Dunn must presume that power and inequality are suspended in order for “good will” to prevail; that our standards and levels of learning and policy

relevant knowledge are even; that we are all equally capable of putting our points of

view; and that gender differences, race and age are irrelevant; in other words, that

we live in a perfectly rational world. Here Dunn’s model resembles Habermas’ ideal

speech situation which suffers from similar problems (Lakomski, 1988). Central to the

assumption of perfect rationality is the belief that human agents are in fact capable

of “knowing their own minds”: their beliefs, thoughts and intentions which comprise

self-knowledge, and secondly, that this self-knowledge is fundamentally linguistic. Only

then is it reasonable to assume that rational deliberation could work at all. But before we

consider the relation between beliefs and behavior directly, let us first examine whether the decision procedure suggested by Dunn permits a policy choice to be made following

its own prescriptions. Consider the following example.

According to recent research (Gill, 1988), girls achieve better academically when

they are in single-sex rather than co-educational classrooms (D). Therefore, we might

Page 42: LAKOMSKI, Gabriele (1991) Beyond Paradigms; Coherentism and Holism in Research

7/27/2019 LAKOMSKI, Gabriele (1991) Beyond Paradigms; Coherentism and Holism in Research

http://slidepdf.com/reader/full/lakomski-gabriele-1991-beyond-paradigms-coherentism-and-holism-in-research 42/97

542 G. LAKOMSKI

conclude that it is definitely better (Q) to educate girls in single-sex classrooms or

even schools (C). Our warrant or reason for (C) is that the separation of boys and

girls for purposes of academic instruction caused higher academic achievements for

girls than had been the case when they were taught in co-educational settings. We could further back our warrant by saying that equality of opportunity is a fundamental

principle which demands single-sex classrooms in the interests of academic achievement

of females. We could further argue that whatever increases the academic achievements

of any underprivileged group ought to be done, females being one such group. Recall

Dunn’s point that the empirical evidence is rarely in doubt, but that the underlying

reasons and assumptions are challengeable. Suppose, then, that a group of parents

opposes the claim that it is better to educate girls in single-sex classrooms on the

assumption that short-term gains (higher academic achievements) do not outweigh

longer-term social consequences such as presumed inadequate socialization. The parent

group thus advocates co-education, rather than single-sex education, on the assumption

that learning together is more important in terms of ultimately overcoming sexism than is

segregation, even if such segregation yields better academic performance for the girls in

the short term. We thus have two competing claims and must choose between competing

policy prescriptions. How do we settle the conflict?

On Dunn’s own account, the relevance and cogency of any person’s or group’s

knowledge claim is always context-dependent. In the present example, we have two

different “contexts” in terms of two different feminist theories. Following the logic of

Dunn’s model, what is relevant and cogent to a separatist feminist policy maker may not

at all be relevant and cogent to a parent of a different feminist persuasion who believes

that separatism is a mistaken strategy for overcoming sexism and its practices. On their

own accounts, each theory is equally valid. But since Dunn is interested in producing

and transforming “policy-relevant information that may be utilized in political settings to

resolve policy problems” the question of which claim is the more “adequate” must be

settled. This is to be done by truth tests, by introducing “different sets of assumptions

and underlying presuppositions” (Dunn, 1982, p. 103) which might challenge a claim’s

causal assumptions. But truth tests themselves have, as we saw earlier, their own

relevant contexts. These are ultimately the different kinds of knowledge presumed

to underlie the different purposes for “knowledge transactions”. But if truth tests are

thus tied to their own respective spheres of influence, then, by definition, there are infinitely many, equally true arguments or theories. But since not all claims can be

true, and if we cannot decide between true and false claims, we cannot decide the

status of any claim. Knowledge is impossible and we end up with incommensurable

theories. For if competing arguments, as well as relevance and cogency tests, are

dependent on their contexts, and if there are specific truth tests for specific types of

“knowledge transaction” based on different kinds of knowledge, then the standards of

appraisal cannot possibly serve to arbitrate between competing claims. Both arguments

and standards of appraisal are based on their respective knowledge foundations, and

there is in principle no difference between the arguments to be assessed and that which

is supposed to be doing the assessing. Not only can competing claims not be settled since each is restricted by its own epistemic sphere of influence (the principle of “intra-field

validity”), but claims within the same epistemic community face difficulties in terms

of the support or evidence they themselves can marshal since whatever warrant or

backing may be brought up is only as valid as the presumed conception of knowledge

underwriting them, and its truth or validity cannot be presumed a priori. This means in

the present example that no amount of new reasons or arguments offered, either for or

against the above feminist claims, could in principle settle the issue of which educational

policy to adopt. As a consequence, the transactional model of argument makes policy deliberation both relativist and arbitrary. Given the example presented here, efforts to overcome sexism and its practices become a matter of subjective preference which can

hardly rate as a satisfactory policy prescription. The inability to make a choice is the

direct result of the model’s epistemological assumptions which are made more explicit

in the following section.

The Regress of Reasons

As a procedure of justification, Dunn's model is faced with the threat of an infinite regress of reasons, characteristic of the JTB account of knowledge (Armstrong, 1981;

Williams, 1977, 1980), which in Evers and Lakomski’s (1991) definition is expressed as

follows:

Person x knows that p, where p is some particular claim to knowledge, if and only if

(i) p is true

(ii) x believes that p, and

(iii) x is justified in believing that p for the reason q (p. 5).
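Rendered compactly (the notation is introduced here for convenience and is not Evers and Lakomski's own), the definition and the regress it invites can be written as:

\[
K_x(p) \;\equiv\; p \,\wedge\, B_x(p) \,\wedge\, J_x(p, q),
\qquad
J_x(p, q) \;\Rightarrow\; K_x(q) \;\Rightarrow\; J_x(q, r) \;\Rightarrow\; K_x(r) \;\Rightarrow\; \cdots
\]

where K, B, and J abbreviate "knows", "believes", and "is justified in believing . . . for the reason".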

Displaying the justificatory structure of JTB shows how neatly Dunn's decision model parallels it. Of particular significance in the transactional model, as well as the JTB

account, is condition (iii) because for q to be a justifying reason, it must itself be an

item of knowledge, just as backings and warrants must be to support an argument. But

if q is already an item of knowledge, then JTB’s definition is circular since it already

contains an appeal to knowledge in the definiens. But more importantly, an infinite

regress threatens since q can only be known by virtue of knowing something else. Faced

with the problem of circularity as well as the infinite regress, the solution has been sought

in the distinction between derived and immediate knowledge. JTB, then, is an account of

derived knowledge, of claims known in virtue of being implied by further knowledge. The

chain of implications is not infinite, however, as it stops at knowledge that is immediate

or underived. It follows on this view that all our derived knowledge must rest on, or be

derivable from, some foundation of immediate knowledge, traditionally the so-called

sense-data, first person sensory reports, and observation statements. Dunn’s acceptance

of a plurality of foundations is a variation on the theme that knowledge is believed to be

in need of foundations, albeit different ones. This multi-foundationalist view has strong

similarities with the paradigms' view (Walker & Evers, 1988), although Dunn does not

employ that concept.

In line with many interpretivist social scientists and philosophers who followed Kuhn’s

and Feyerabend’s lead in raising decisive objections against logical empiricism, Dunn

accepts that (social) reality is a matter of interpretation (with physical reality being a

matter of direct observation), and that (social) theories are always underdetermined by

data or evidence (Dunn, 1982, p. 104). As a consequence, he also accepts that no theory

can ever be justified evidentially which leads to the conclusion that all theories are equally

plausible or acceptable. There is thus no such thing as a true theory in the social sciences,

and objectivity becomes unobtainable. But this conclusion only follows if evidence equals

empirical evidence only. In addition to the fact that theories are always underdetermined

by the available evidence, we also know that what is taken as empirical evidence is always theory-laden. So in comparing theories we cannot simply rely on empirical evidence as

the neutral arbiter, as our “touchstone” (Walker & Evers, 1988). Since we are now in

the business of actually comparing theories (observation being as theoretical as anything

else), and since we know what makes for better or worse theories, the solution is that

we use their traditional virtues, i.e., relative coherence, simplicity, and explanatory unity

- P.M. Churchland’s (1985) “superempirical virtues” - to argue for a global assessment

of theories which includes empirical evidence as one criterion amongst several. On this

account, then, the best theory is one which coheres best with our interpreted experience

(BonJour, 1985; Evers & Lakomski, 1991, pp. 37-44; Williams, 1980). Additionally,

Dunn’s model relies on a theory of meaning in which terms are entirely determined by

their embedding conceptual framework, an impossible position to hold. For if meaning

were entirely determined by its conceptual role, then we could never have learnt either

of the feminist theories mentioned above. Nor could we have learnt about such abstract

objects as intentions, thoughts, and reasons. Theories are complex networks of sentences

which we do not learn all at once. We must have begun by learning simple terms or

expressions first, but in order for us to understand these and their role in the theory,

knowledge of the whole theory is already presupposed or we would not be able to assign

them the function we do. Here we see that JTB promises more than it can deliver given

the theoretical tools it has at its disposal. In presuming learning capacities - a theory of

the mind - which postulate antecedents to learning which are themselves unlearnable,

it comes out unknowable on its own account: it fails to be self-referential.

The Theoretical Character of Belief and Desire

In specifying conditions for claims to count as knowledge, an epistemology implicitly

presumes a theory of the mind. What we can know crucially depends on the requisite

perceptual and cognitive capacities which we developed as a species. And we have

to learn from infancy onwards what these capacities are. We do not just “have” observational judgments; they have to be learnt, beginning with the prior learning of

making complex perceptual discriminations, as Campbell (1974) and Popper (1981), for

instance, have long argued. We also have to learn the requisite linguistic or propositional

system within which we constitute our beliefs, desires, and thoughts. Since we first have

to acquire that, it cannot itself be the medium of learning, leading to the conclusion that

there has to be a type of learning prior to that involving the manipulation of sentences

(P.M. Churchland, 1989b, p. 155). And here our commonsense understanding of

ourselves fails to provide an answer. In order to demonstrate where folk psychology falls

short, it is necessary to make explicit what has been implied in the preceding discussion,

namely, that it is an empirical theory, and one which - on the current evidence - might need to be replaced. Here the Churchlands' work provides the most telling arguments.

The propositional attitudes, so called because they express a distinct attitude toward a specific proposition (P.M. Churchland, 1988b, p. 63), i.e., the thought that (gum trees are lovely); the belief that (humans act rationally); the fear that (the Middle East crisis

will turn into war) have long been taken to play a causal role in human behavior

and are central to folk psychology. Human conscious intelligence, according to folk

psychology, is said to consist of the rational manipulation of these propositions by

means of deductive inference. But rather than being the “sort of super-causal logical

relation” (P.M. Churchland, 1988a, p. 213) they are alleged to be, the propositional

attitudes display relationships typical of all theoretical explanations, as can be seen

by comparing them to the logical pattern formed by physical laws. If the latter can

be characterized by the “numerical attitudes”, both sets can be described as follows

(P.M. Churchland, 1988b, p. 64; 1981, pp. 70-71):

Propositional attitudes                Numerical attitudes
. . . believes that p                  . . . has a length of n
. . . suspects that p                  . . . has a kinetic energy of n

Either “attitude” can be completed by putting a proposition in place of p or a number

in place of n. Only then do we have a determinate predicate. The logical relations

holding between numbers and numerical atti tudes also hold between propositions and

propositional atti tudes. Most importantly, where these relations hold universally, we are

in a position to state laws - for the latter set of relations as much as for the former,

numerical ones. In other words, we utilize the abstract relations which hold in the domain

of certain abstract objects such as numbers, vectors, or propositions in order to help us

state the empirical regularities between real states and objects, and that includes those

between various kinds of mental states. Summing up, the full-blooded intentional idiom,

contrary to popular opinion, possesses the same complex logical structure as the rest of

our scientific theories.

An important consequence of folk psychology’s empirical character is that it might

actually be false. This possibility, however, seems quite counterintuitive because we

appear to get by rather well by relying on its categories in terms of explaining and

predicting everyday behavior with a fair degree of success. Even if folk psychology

were false, would it matter for our practices, including those which comprise “policy

analysis”? And how would we then explain our behavior? These are difficult questions

which cannot be answered at present. But to see that they are real, and that much

depends on the answers, let us consider some of the fundamental problems for folk psychology which lend support to the claim that it might be a false empirical theory.

Problems for the Sentential Paradigm

In the following section, I shall briefly note two major problems of the view that our

cognitive activity consists in the manipulation of sentences (for the original definition

of the sentential view see P.S. Churchland, 1980, 1989. See also Hacking's 1975 account of the reasons for “lingualism's” favored philosophical status). The first is what Stich

(1983, p. 214) and P.S. Churchland (1989, p. 388 onwards) call the “infralinguistic

catastrophe”: much intelligent behavior is displayed by organisms who do not have any

overtly linguistic capacity, and that includes behavior by human infants, adults suffering

from left-hemisphere lesions, as well as deaf mutes (see P.S. Churchland, 1983). To

explain this, one could of course argue that this behavior is not really “cognitive” in

the required sense. But this would beg the question since it is already assumed that

intelligent behavior is identical with linguistic behavior. Secondly, while granting that the

above is intelligent behavior although it does not show itself linguistically, one prominent

suggestion maintains that it is based on a “thought language”, Mentalese, in which

organisms devoid of overt linguistic behavior reason and solve problems (Fodor, 1975).

The assumption that organisms possess such a language of thought which, according to

Fodor, is a proper language with all relevant characteristics seems far-fetched, and its

existence in principle impossible to ascertain since it is radically unlike the linguistic

capacity we display as adults. Its categories can be reached only via those of our

language, making the point of independently establishing them moot. (For a fuller

discussion of Mentalese see P.S. Churchland, 1980; also Stich, 1983, pp. 187-197.) But

the more interesting and important issue here is that of how infants learn our language

if Mentalese is assumed to be the template. If a child acquires, for example, German

by matching a German word with the appropriate word in Mentalese, the child can only

learn those German words for which there are words in Mentalese. If matching is the

process, then one must conclude that the child does not in fact learn any new concepts

at all in addition to those pre-existing in Mentalese. More absurd still is the further

implication that all the new concepts gained through scientific discoveries must have

already been present in Mentalese from the beginning, terms such as quarks, electrons,

atoms, neurons, etc. (P.S. Churchland, 1980). The existence of a “language of thought”,

then, would seem to stretch the bounds of credibility too far. Given these problems

which have been noted albeit briefly, it is more reasonable to assume that cognitive

activity does not equal “sentence-crunching”, an assumption which, inter alia, enjoys

the theoretical virtue of simplicity.

The second problem for the sentential view relates to that of accessing knowledge.

For an organism to survive, it is essential that it have speedy access to knowledge

relevant to respond to “the four F’s: feeding, fleeing, fighting, and reproducing”

(P.S. Churchland, 1987, p. 548). This goes as much for the cat’s recognition of a mouse

as food as it does for a driver’s recognizing a red light as signalling “stop”. Essential

in either situation is that the relevant stimuli are recognized instantly. But how does the

relevant information processing system know which bits of information to call up? What is its mechanism of sorting so that the correct response follows? The problem is far from

innocent, notwithstanding the fact that we manage to respond by and large correctly

and instantly to such stimuli in our everyday commerce with the world. Since this is

so, calling up our mental store of sentences/beliefs in order to find those which may

apply in a specific situation, and systematically eliminating those which do not by the

relevant logical rules, would present a sure recipe for evolutionary disaster since such

sorting would require a lot of time. Besides, the sentential/belief view presumes that we

do have immediate access to our mental states, an assumption not borne out by research

in social and cognitive psychology (for relevant research see Nisbett & Wilson, 1977).

Borrowing Hanson's "there is more to seeing than meets the eye-ball", we may now say that there is more to cognition than is expressed in sentences. But if the structures of

knowledge are not sentences, then what are they? And if the propositional attitudes do

not really explain what goes on inside our heads, and hence explain our behavior, what

kind of an explanation of behavior could we possibly advance?


Neural Networks and Nonsentential Representation

The beginnings of an answer to both questions are suggested by computational

neuroscience (for a brief overview see Sejnowski, Koch, & Churchland, 1988) in what is termed "Connectionism": the mind/brain's capacity for the parallel distributed

processing of information, or PDP models of brain functioning (for an introduction

see P.M. Churchland, 1988, Ch. 7; P.S. Churchland, 1989, Ch. 10; a full account is

Rumelhart & McClelland, 1986. A first application of the network account in education

is given by Evers, 1990). Secondly, given the PDP account, a most promising alternative

to the folk psychological sentential explanation of behavior is the prototype activation

model developed by P.M. Churchland (1989a). In its simplest form this means that

instead of calling up endless lists of sentences, i.e., rules or laws, the mind/brain “calls

up” a prototype of some cognitive situation (P.M. Churchland, 1988, pp. 217-218).

Churchland acknowledges that the idea of prototypes is neither new nor uncontroversial.

In the present context, it poses the particularly difficult problem of how it is represented

in cognitive creatures such as ourselves if it is not represented linguistically. Here recent

research into the functional properties of neural networks provides an answer. These

are artificial networks which have been constructed to simulate essential features of the

neuronal organization of the brain. What is so remarkable about them is that they have

been extremely successful in learning assigned tasks such as differentiating mine from

rock echoes, recognizing complex visual features, and transforming written text into

speech, NETtalk, to mention just some (P.M. Churchland, 1988, p. 156 onwards).

A simple network has three specific architectural features. It possesses (1) (neuron-

like) processing units, (2) connections between these units, and (3) connection weights

which are the differential strengths of connections between the processing units

(P.S. Churchland, 1987, pp. 550-553; P.M. Churchland, 1988, pp. 156-165). The means

of communication between the processing units are signals such as (neuronal) firing

rate which are numerical rather than symbolic. The bottom, input, layer of units are

something like sensory units directly receptive to environmental input; then follows an

intervening layer, the so-called “hidden units”, and finally the output layer. (In real

brains there are between five and fifty intervening layers.) Each bottom unit emits an

output through its own “axon”. The strength of this output is a function of the unit’s

level of stimulation. Each axon branches out into a number of terminal branches and

sends a copy of that output to each and every “hidden unit”. The bottom units thus

make a variety of synaptic connections with each of the intervening units. The strength

each connection possesses is called its weight. What happens is that the set of activity

levels (input vector) induced by stimulating the input units is transported upward to

the hidden units, changing in the process by the influence of the output function of

the bottom cells, the pre-existing pattern of synaptic weights and the summing activity

within each of the hidden units. A pattern of activations is thus produced at both the

input level and another one at the hidden unit level. What kind of pattern results from

this activity, for a given input, depends entirely on the configuration of synaptic weights

which hit the hidden units (P.M. Churchland, 1989a). The process characterizing activity from bottom to hidden layers is repeated from hidden to top layers. The output vector,

then, similarly is the result of the activation pattern generated by the hidden units. This

description makes it clear that the most important characteristic of networks is their

complete interconnectivity. It is possible to construct model networks with whatever


number of input units, hidden units, and output units we wish. And we can also begin to

appreciate the processing power embedded in the (modest) two-tier arrangement. Most

important of all: the synaptic weights in the overall system can be modified in order to obtain the vector-to-vector transformation we want; that is, networks can be trained to learn, as the example of Sejnowski and Rosenberg's (1987) NETtalk shows. The most remarkable

thing to remember is that networks are not given any rules, laws, or generalizations

prior to or during the learning process. The crucial factor in learning is the value of the

synaptic connection weights which, in turn, are determined by a learning algorithm called "back-propagation of error" or the generalized delta rule (P.M. Churchland, 1988b, p.

159). Basically, the strategy exploited by the algorithm is “the calculated error between

the actual values of the processing units in the output layer and the desired values, which

are provided by a training signal” (P.S. Churchland, 1987, p. 551). The error signal of

the output layer is fed back to the input layer and is used to adjust each weight in the

network. This process is done countless times (via programming a conventional computer

to act as “teacher”) by feeding the network with diverse examples of Fs, for instance,

and by the network looking for a configuration of weights which will turn the neurons

at the hidden level into a set of complex feature detectors to which the output units will

respond in their turn. In being thus trained up the network slowly learns as the weights

are adjusted after each back propagation of the previous error signal. Learning thus

consists in minimizing the mean squared error over the training set of words, or Fs, or

whatever else was fed into the system. In this manner, the network actually generates

a set of internal representations for the relevant features to be recognized.
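To make the training procedure just described concrete, the following is a minimal toy network in Python of the three-layer kind described above, trained by back-propagation of error (the generalized delta rule) to minimize the mean squared error over a small training set. It is an illustrative sketch under assumptions of my own: the task (exclusive-or), the learning rate, and all names are invented for the example and bear no relation to NETtalk or to any particular system discussed in the text.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # Toy training set (invented for illustration): exclusive-or input/output pairs.
    inputs  = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    targets = np.array([[0.], [1.], [1.], [0.]])

    # Connection weights and biases, initialized to small random values.
    w_hid, b_hid = rng.normal(scale=0.5, size=(2, 4)), np.zeros(4)   # input -> hidden
    w_out, b_out = rng.normal(scale=0.5, size=(4, 1)), np.zeros(1)   # hidden -> output
    lr = 0.5

    for _ in range(20000):
        # Forward pass: the input vector is transformed into a pattern of
        # activation at the hidden layer, and again at the output layer.
        hidden = sigmoid(inputs @ w_hid + b_hid)
        output = sigmoid(hidden @ w_out + b_out)

        # Error signal: desired values (the "training signal") minus actual output.
        error = targets - output

        # Back-propagation of error (generalized delta rule): the error signal is
        # propagated back through the network and every weight is adjusted.
        d_out = error * output * (1 - output)
        d_hid = (d_out @ w_out.T) * hidden * (1 - hidden)
        w_out += lr * hidden.T @ d_out
        b_out += lr * d_out.sum(axis=0)
        w_hid += lr * inputs.T @ d_hid
        b_hid += lr * d_hid.sum(axis=0)

    # Learning has consisted in minimizing the mean squared error over the training
    # set; the trained weights embody the network's internal representations.
    print("mean squared error:", float(np.mean(error ** 2)))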

Policy Analysis as Naturalized Science

The most important results of studying the functions of artificial neural networks and

PDP for our purposes are (1) that learning is not a matter of the linear processing of symbols and rules but is rather a global, or network, affair, and (2) that consequently, representations in the hidden units are not symbols either but are neuronal patterns of activation. Although neural network modeling is relatively young, leaving many questions still to be answered, it does suggest that human understanding and the grounds for action are not to be sought in sets of stored generalizations such as the propositional attitudes but are rather located in one or more prototypes as these were earlier described.

If natural language appears to be no more than “a surface abstraction of much richer,

more generalized information processes in the cortex, a convenient condensation fed

to the tongue and hand for social purposes” (Hooker, 1975, p. 217), what follows

for policy analysis? The first result is that reliance on folk psychological categories

as explanatory of policy behavior, since mistaken, ought to be given up. We might

need to continue using them since there is as yet no other medium of representation,

but this communicative function must not be confused with its purported explanatory

function. Indeed, what the preceding thumbnail sketch of brain functioning does show is just why commonsense or folk psychology is as successful as it is: it allows us to recognize

problematic situations instantly because of the work done at the sublinguistic, neuronal

level, i.e., given appropriate prototypes. Another consequence is that we can concede

that a good policy analyst, as has been observed many times in the literature, may well


be distinguished by his or her “insight” or “intuition” since such expressions can now be

seen as place-holders for the complex computational processes of the brain. We know

indeed more than we can tell, and we are beginning to explain why this is so.

Will knowledge of causal, neurological detail shape and change the practice of policy analysis in view of the fact that natural language is the only medium we have

for communication? Initially perhaps not. We appear to have to continue using an

idiom not up to the task of representing our mental processes, or representing them

falsely. But what we learn from the development of neural networks, PDP and the

prototype activation model is that there is an alternative, empirical account of how

cognitive creatures in fact acquire knowledge of their environment. And if this is the

causal story, then it follows that this will eventually include policy relevant knowledge

as a possibly quite specific configuration of synaptic connection weights. We learn even

more specifically that such knowledge generation is global given the network character

of brain functioning, and that any policy decision made is the result not of rational

deliberation or practical reason but of the successful relevant prototype being activated.

And we may comment in passing that the distinction between analytic and substantive

arguments, believed to be all important by Dunn (and Toulmin), simply evaporates. It

is of no relevance in the neurological story of information processing.

How does the brain learn to recognize and subsume a specific problematic situation

under a relevant policy prototype? Unlike an artificial network which is trained up by

a "teacher", for us, our global environment steps in as the "teacher". In one sense,

this amounts to saying that a policy analyst, for instance, needs to gather as many and

as varied policy experiences as possible (leaving unspecified for the present what such

an experience might be) since successful subsumption of policy relevant information is

predicated upon already existing prototypes. “Goodness” of policy choice would then

be a function of how well trained a relevant neural network is, i.e., it is a function of

the richness of existing prototypes.
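As a rough computational gloss on what "subsuming a situation under a prototype" might involve, the sketch below (in Python) stores a few hidden-layer activation patterns as prototypes and classifies a new input by the nearest stored pattern. This is only a schematic rendering of the prototype activation idea under assumptions of my own; the weights, prototypes, and distance measure are invented for the illustration and should not be read as Churchland's model or as anything proposed in the text.

    import numpy as np

    def hidden_activation(x, w_hidden):
        """Pattern of activation across the hidden units for input vector x."""
        return 1.0 / (1.0 + np.exp(-(x @ w_hidden)))

    def activate_prototype(x, w_hidden, prototypes):
        """Return the stored prototype whose hidden-layer pattern is closest
        (Euclidean distance) to the pattern produced by input x."""
        pattern = hidden_activation(x, w_hidden)
        return min(prototypes, key=lambda name: np.linalg.norm(pattern - prototypes[name]))

    # Invented example: a weight matrix standing in for a trained network,
    # and two stored prototype activation patterns.
    rng = np.random.default_rng(1)
    w_hidden = rng.normal(size=(3, 4))
    prototypes = {
        "routine case": hidden_activation(np.array([0.1, 0.2, 0.1]), w_hidden),
        "crisis case":  hidden_activation(np.array([0.9, 0.8, 0.9]), w_hidden),
    }

    # A new, somewhat noisy situation is recognized by prototype activation.
    new_situation = np.array([0.85, 0.75, 0.95])
    print(activate_prototype(new_situation, w_hidden, prototypes))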

This raises the question of what function such sentential structures as our best scientific

theories have in relation to the practice of policy analysis in view of the fact that we

appear to be learning without the aid of sentential structures. The short answer is that

theories, even those we judge true by our coherentist standards, do not seem necessary

in order to learn a complex practice (Evers, 1990). This is not to say that true theories are

irrelevant to learning, only that since networks do not partition the inputs they receive into categories such as "true" or "false", or "entailment", or "implication", they do not

possess any kind of privileged position. Since, on the other hand, true scientific theories

do explain the way the world is, they must have some functional importance. But this is

a big question in need of research. Tentative and speculative as these suggestions have

been, and counterintuitive as the elimination of folk psychology might appear, the way

forward for policy analysis lies in the more detailed, empirical work of the neurosciences

since they hold the key for the explanation of all cognitive behavior including that of

deciding which policy option is best.

References

Armstrong, D. M. (1981). Belief, truth and knowledge. Cambridge: Cambridge University Press.
BonJour, L. (1985). The structure of empirical knowledge. Cambridge, MA: Harvard University Press.


Boyd, W. L. (1988). Policy analysis, educational policy, and management: Through a glass darkly? In N. J. Boyan (Ed.), Handbook of research on educational administration. New York: Longman.
Braybrooke, D. & Lindblom, C. E. (1963). A strategy for decision. London: Collier-Macmillan.
Campbell, D. T. (1969). Reforms as experiments. American Psychologist, 24, 409-429.
Campbell, D. T. (1974). Evolutionary epistemology. In P. A. Schilpp (Ed.), The philosophy of Karl Popper. La Salle, IL: Open Court.

Churchland, P. M. (1981). Eliminative materialism and the propositional attitudes. The Journal of Philosophy, LXXVIII(2), 67-90.
Churchland, P. M. (1985). The ontological status of observables: In praise of the superempirical virtues. In P. M. Churchland & C. A. Hooker (Eds.), Images of science. Chicago: University of Chicago Press.
Churchland, P. M. (1988a). Folk psychology and the explanation of human behaviour. The Aristotelian Society, Supplementary Volume LXII, 209-221.
Churchland, P. M. (1988b). Matter and consciousness (Revised edition). Cambridge, MA: M.I.T. Press.
Churchland, P. M. (1989a). On the nature of explanation: A PDP approach. In P. M. Churchland, A neurocomputational perspective: The nature of mind and the structure of science. Cambridge, MA: M.I.T. Press.
Churchland, P. M. (1989b). On the nature of theories: A neurocomputational perspective. In P. M. Churchland, A neurocomputational perspective: The nature of mind and the structure of science. Cambridge, MA: M.I.T. Press.
Churchland, P. S. (1980). Language, thought, and information processing. Nous, 14, 147-170.
Churchland, P. S. (1983). Consciousness: Transmutation of a concept. Pacific Philosophical Quarterly, 64, 80-95.
Churchland, P. S. (1987). Epistemology in the age of neuroscience. The Journal of Philosophy, 84(10), 544-553.
Churchland, P. S. (1989). Neurophilosophy: Toward a unified science of the mind/brain. Cambridge, MA: M.I.T. Press.

Dunn, W. N. (Ed.) (1980-81). Symposium on social values and public policy. Policy Studies Journal, 9(4).
Dunn, W. N. (1981). Public policy analysis: An introduction. Englewood Cliffs, NJ: Prentice-Hall.
Dunn, W. N. (1982). Reforms as arguments. In E. R. House, S. Mathison, J. A. Pearsol, & H. Preskill (Eds.), Evaluation studies review annual, Vol. 7. Beverly Hills, CA: Sage.
Dunn, W. N. (Ed.) (1983). Values, ethics, and the practice of policy analysis. Lexington, MA: D. C. Heath.
Dunn, W. N., Mitroff, I. I., & Deutsch, S. J. (1981). The obsolescence of evaluation research. Evaluation and Program Planning, 4, 207-218.

Evers, C. W. (1987). Naturalism and philosophy of education. Educational Philosophy and Theory, 19(2), 11-21.
Evers, C. W. (1990). Educating the brain. Educational Philosophy and Theory, 22(2), 65-80.
Evers, C. W. & Lakomski, G. (1991). Knowing educational administration. Oxford: Pergamon Press.
Fischer, F. & Forester, J. (1987). Confronting values in policy analysis. Beverly Hills, CA: Sage.
Fodor, J. A. (1975). The language of thought. New York: Crowell. (Paperback edition (1979) Cambridge, MA: Harvard University Press)

Garson, G. D. (1986). From policy science to policy analysis: A quarter century of progress. In Dunn, W. N. (Ed.), Symposium on social values and public policy, Policy Studies Journal, 9(4), 535-544.
Gill, J. (1988). Which way to school?: A review of the evidence on the single sex versus coeducation debate and an annotated bibliography of the research. Canberra: Commonwealth Schools Commission.
Hacking, I. (1975). Why does language matter to philosophy? Cambridge: Cambridge University Press.
Hooker, C. A. (1975). Philosophy and metaphilosophy of science: Empiricism, Popperianism and realism. Synthese, 32, 177-231.
Jennings, B. (1983). Interpretive social science and policy analysis. In Callahan, D. & Jennings, B. (Eds.), Ethics, the social sciences, and policy analysis. New York: Plenum.
Jennings, B. (1987). Policy analysis: Science, advocacy, or counsel. In S. Nagel (Ed.), Research in public policy analysis and management. Greenwich, CT: JAI Press.
Lakomski, G. (1988). Critical theory. In J. P. Keeves (Ed.), Educational research, methodology, and measurement: An international handbook. Oxford: Pergamon Press.
Landsbergen, D. & Bozeman, B. (1987). Credibility logic and policy analysis. Knowledge: Creation, Diffusion, Utilization, 8(4), 625-649.
Lasswell, H. D. (1951). The policy orientation. In Lerner, D. & Lasswell, H. D. (Eds.), The policy sciences. Stanford: Stanford University Press.
Lerner, D. & Lasswell, H. D. (Eds.) (1951). The policy sciences. Stanford: Stanford University Press.


Lincoln, Y. S. & Guba, E. G. (1986). Research, evaluation, and policy analysis: Heuristics for disciplined inquiry. Policy Studies Review, 5(3), 546-565.
MacRae, D. (1976). The social function of social science. New Haven: Yale University Press.
Mason, R. O. & Mitroff, I. I. (1980-81). Policy analysis as argument. In Dunn, W. N. (Ed.), Symposium on social values and public policy. Policy Studies Journal, 9(4), 579-584.
Mitchell, D. E. (1984). Educational policy analysis: The state of the art. Educational Administration Quarterly, 20(3), 129-160.
Nisbett, R. E. & Wilson, T. D. (1977). Telling more than we can know: Verbal reports on mental processes. Psychological Review, 84(3), 231-259.
Popper, K. R. (1981). Objective knowledge: An evolutionary approach (Revised edition reprinted with corrections and a new appendix 2). Oxford: The Clarendon Press.
Quine, W. V. O. (1969). Epistemology naturalized. In Quine, W. V. O., Ontological relativity and other essays. New York: Columbia University Press.
Quine, W. V. O. & Ullian, J. S. (1978). The web of belief (Second edition). New York: Random House.
Rumelhart, D. E. & McClelland, J. L. (1986). Parallel distributed processing: Explorations in the microstructure of cognition. Vol. 1: Foundations. Cambridge, MA: M.I.T. Press.
Sejnowski, T. J. & Rosenberg, C. R. (1987). Parallel networks that learn to pronounce English text. Complex Systems, 1, 145-168.
Sejnowski, T. J., Koch, C., & Churchland, P. S. (1988). Computational neuroscience. Science, 241, 1299-1306.
Stich, S. (1983). From folk psychology to cognitive science: The case against belief. Cambridge, MA: M.I.T. Press.
Toulmin, S. (1958). The uses of argument. Cambridge: Cambridge University Press.
Walker, J. C. & Evers, C. W. (1982). Epistemology and justifying the curriculum of educational studies. British Journal of Educational Studies, 30(2), 213-229.
Walker, J. C. & Evers, C. W. (1988). The epistemological unity of educational research. In Keeves, J. P. (Ed.), Educational research, methodology, and measurement. Oxford: Pergamon Press.
Wildavsky, A. (1979). Speaking truth to power: The art and craft of policy analysis. Boston: Little, Brown.
Weiss, C. H. & Bucuvalas, J. (1980). Truth tests and utility tests: Decision makers' frames of reference for social science research. American Sociological Review, 45, 302-312.
Williams, M. (1977). Groundless belief. Oxford: Blackwell.
Williams, M. (1980). Coherence, justification and truth. Review of Metaphysics, 34(2), 243-272.

Biography

Gabriele Lakomski is a senior lecturer in the Institute of Education, The University of Melbourne, Australia. Her most recent publication is C. W. Evers and G. Lakomski (1991) Knowing Educational Administration, published by Pergamon Press. She serves on a number of editorial boards including Curriculum Inquiry and the Journal of Educational Administration and is Section Editor of the second edition of the International Encyclopedia of Education. She writes about philosophical problems in educational (administration) research, research methodology, educational policy, and feminist issues.


CHAPTER 4

HERMENEUTICS: A THREAT TO SCIENTIFIC SOCIAL SCIENCE?

DENIS C. PHILLIPS

Professor of Education and Philosophy, Stanford University, California, U.S.A.

Abstract

In the past few decades, numerous commentators who have been influenced by the

Continental hermeneutical tradition have argued that the social sciences, and related

applied fields such as educational research, are closer in certain logical or epistemological

respects to the humanities than to the natural sciences. This case is outlined; and then it is argued that, although many of the points made by hermeneuticists are both sound and

important, the conclusions that are drawn go too far. Two sets of questionable conclusions

are dissected: (a) those embodying claims about the epistemology of the social sciences

and related fields; and (b) those that describe the nature and limits of the social sciences

and educational research.

Introduction

Considering only the last three hundred years, from about the time of Vico onwards, a

massive literature on hermeneutics has accumulated. But in the past two decades there

has been a veritable avalanche of material - a poor academician can be driven to the

edge of bankruptcy trying to keep pace with the new books.

Unfortunately this recent material is more a repository of enthusiasm than of

enlightenment. There are differing accounts of the nature of the key issues, although

what comes shining through is the fact that hermeneuticists manage to reach (via

difficult and sometimes nearly impenetrable prose) some far-reaching and important

conclusions about the nature of the social sciences (and, by default, about related

areas such as empirical educational research). To the skeptical eye, the literature is

full of claims, but the arguments are left sketchy or unclear (or both); and there is a dearth of concrete examples - Stegmüller (1988, p. 109) laments that "analysis of

examples is totally absent”. To add insult to injury, some writers (without much by

way of supporting argumentation) extend the scope of hermeneutics - so that, like

the Scarlet Pimpernel, hermeneutical issues are claimed to be everywhere.

The following discussion will attempt to bring some order to this complex domain.

First, there will be a distillation of the hermeneuticist case, especially in the form in

which it is advanced as a criticism of traditional empirical social science and related

553


fields; then the discussion will focus upon the resulting claims that are made. In general,

the center of interest will be the image held by hermeneuticists of both social science

and research in applied fields such as education, and their epistemologies.

However, there is one more preliminary matter. Throughout the discussion the terms

“hermeneutical” and “interpretive” will be used as synonyms; their use will be varied

simply as a stylistic device to maintain the reader’s interest. One word derives from

Greek, the other from Latin; but they mean the same and even, in their classical usages,

refer to the same winged messenger of the gods (whose function was to communicate

the wishes of the deities in a form that mere mortals could understand).

The Interpretivist Case

The interpretivist case runs as follows, although it must be stressed that what follows

is a general account and of course individuals disagree about many of the details.

According to interpretivists, physical scientists deal with objects by explaining their

behavior either in terms of external forces or in terms of inner processes that result from

their physico-chemical microstructure. Notions of force, energy, causation, and natural

law are central; and the methods by which knowledge is built up are observational and

experimental. The underlying epistemological premise is a form of empiricism (many

interpretivists would say it is a form of logical positivism). Until very recently, social

science (especially in the U.S.A.) has proceeded by mimicking the physical science

approach - behaviorism in psychology being one example among many, though also the most notorious.

On the other hand, hermeneuticists would argue, humans are not mere physical

objects; people are impelled by ideas, knowledge, and hopes and desires. They harbor

intentions. And these things depend upon the use of symbols, as in language; as Gadamer

puts it (1977, p. 29), “language is not only an object in our hands, it is the reservoir

of tradition and the medium in and through which we exist and perceive our world.”

Symbols and language, of course, are impossible without societies. Furthermore, many

actions undertaken by individuals are actually constituted by public meanings, socially

adopted rules, conventions, and the like; thus, to take a fairly trivial example, one cannot

understand the game of tennis, let alone play it, unless one understands the rules and conventions that define the valid activities of the game (a serve that is "in", an acceptable

placement of a return of the ball, etc.). But activities as diverse as participating in a

dinner party, consulting a physician, writing a philosophical paper, and giving evidence

in court, are no less constituted by socio-cultural rules and conventions.

It seems to follow from all this that to explain the actions of a person (as opposed to

the behavior of a physical object), an investigator must uncover the understandings of

the actor - how the actor interprets the situation he or she is in, how the mores and

beliefs of the society in which the actor is located are influential, what the actor sees

as being the possible responses that are open (given the social beliefs the actor holds),

and the symbolic meaning of the forms of behavior that are open to the actor in that particular setting. As the social scientist Bauman put it (1978, p. 12):

Men and women do what they do on purpose. Social phenomena, since they are ultimately acts of men

and women, demand to be understood in a different way than by mere explaining. Understanding them


must contain an element missing from the explaining of natural phenomena: the retrieval of purpose, of intention, of the unique configuration of thoughts and feelings which preceded a social phenomenon and found its only manifestation, imperfect and incomplete, in the observable consequences of action. To understand a human act, therefore, was to grasp the meaning with which the actor's intention invested it; a task, as could easily be seen, essentially different from that of natural science.

This is the heart of the hermeneuticist or interpretivist position, and it is this to which

reference is being made by these labels in the subsequent discussion.

It should be noted in passing that a rift occurs at this point between several “schools”

of interpretivists. Some believe that it is necessary - as a corollary of the points

outlined above - to pursue the subjective understandings of actors; these can be labeled

“phenomenologically oriented hermeneuticists”. Others eschew this subjective approach

and focus instead upon the public meanings together with the observable actions of, and

interactions between, people in social settings. There is an accompanying disagreement,

therefore, about the methods of inquiry that are appropriate. It is probably true to say that mainstream American social scientists tend to look askance at methods that smack

of subjectivism; and philosophers in the English-speaking world have a similar attitude

towards theories of meaning that focus upon the “pictures”, ideas, or intentions internal

to the individual. The dominant contemporary philosophical approach to meaning

focuses instead on the public realm, on how people operate with language - a view

that has clear implications for methodology. Although dominant in North America, this

latter orientation is not confined to it (see, for example, Apel, 1977, p. 301). However,

it is not the purpose of the present paper to pursue issues concerning the methodology of

interpretive studies; the focus here is unabashedly the theoretical arguments and claims

made on behalf of the importance of hermeneutics for the social sciences.

To return to the exposition of the interpretivist case: human action, according to

the general position being expounded here, is a type of text (albeit an unwritten

one) - for a text is nothing more than a collection of symbols expressing meaning

(or even layers of meaning), although this meaning itself may be expressed in terms of

metaphors or complex cultural symbols. Hence it is possible to use the discipline that has

developed over many centuries to interpret texts - the discipline of hermeneutics, with

its central notion of the hermeneutic circle - to interpret and throw light upon human

action. This extension of hermeneutics to cover the non-written realm began with the

nineteenth-century figures Schleiermacher and Dilthey (see Palmer, 1969, Part II); but

it reached its apogee with the work of Taylor (1977) and Ricoeur in the early 1970s. Thus, in his essay "The Model of the Text: Meaningful Action Considered as Text" first

published in 1971, Ricoeur wrote (1977, p. 316):

Now my hypothesis is this: if there are specific problems which are raised by the interpretation of texts because they are texts and not spoken language, and if these problems are the ones which constitute hermeneutics as such, then the human sciences may be said to be hermeneutical (1) inasmuch as their object displays some of the features constitutive of a text as text, and (2) inasmuch as their methodology develops the same kind of procedures as those of . . . text interpretation.

Many in the hermeneutics camp have gone on to point out that human societies are

full of the "objectifications" of meaning (as Gadamer, Betti and others term it) - not only written texts, but social institutions, practices and rituals, and physical artifacts.

Hermeneuticists generally go further than this, however, and stress that interpreters

who are attempting to grasp the meaning of an actor or to grasp meaning that has


been objectified in some way, have their own understandings shaped by the fact that

they themselves are members of a particular culture at a particular historical moment.

Interpretation, in other words, is not an act in which a “disembodied” investigator is

trying to decipher the (pre-established) meaning of a culturally and historically situated

actor or institution; rather, the interpreter, too, must become hermeneutically aware of

his or her own historicity (or “preunderstanding”, as some writers term it). As Linge puts

it, in his editor's introduction to Gadamer's Philosophical Hermeneutics (1977, p. xiv):

This methodological alienation of the knower from his own historicity is precisely the focus of

Gadamer’s criticism. Is it the case, Gadamer asks, that the knower can leave his immediate

situation in the present merely by adopting an [interpretive] attitude? An ideal of understanding

that asks us to overcome our own present is intelligible only on the assumption that our own

historicity is an accidental factor. But if it is an ontological rather than a merely accidental and

subjective condition, then the knower’s own present situation is already constitutively involved in

the process of understanding.

Interpretivists sometimes use examples such as the following: in the physical sciences,

the behavior of objects is explained by bringing to bear physical laws - such as when the

orbit of a planet is explained by deducing its behavior from Newton’s or Kepler’s laws

(together with a statement of the initial conditions). On the other hand, the action of

Julius Caesar in crossing the Rubicon is not explained by bringing it under a law - for

there are no laws of nature pertinent to the voluntary actions of Roman generals standing

on the banks of particular rivers such as the Rubicon; rather, Caesar’s action is explained

in terms of his intentions, and in terms of the symbolic importance of that particular river

(which marked the border between divisions of the Roman empire). Caesar’s action was

not the product of laws of nature (despite the fact that his body was a physical object), but

it was voluntary - a result of his consciously reaching the decision to carry out a revolt (a

revolt being, of course, a social phenomenon). Furthermore, our attempts to understand

Caesar’s action are mediated by the historical/cultural milieu in which we, as interpreters,

are located; so, as hermeneuticists, we are struggling to understand ourselves at the very

same time that we are struggling to understand Caesar.

Some Far-Reaching Conclusions

Before the discussion proceeds it should be acknowledged that there is much in the

interpretivist position, as just outlined, that is compelling. Humans are not mere physical

objects; and to understand or explain why a person has acted in a particular manner,

the meaning (or meanings) of the action have to be uncovered - and to do this the

roles of language and of social symbolisms and values have to be taken into account.

Furthermore, it is clear that every society contains many “objectifications” of meaning

in its rituals, symbolisms, institutions, and so forth. (These words are being written in

the U.S.A. on July 4th amid the festivities, which serves to drive the point home.)

What shall be disputed are some of the very wide-ranging conclusions about research

in education and the social sciences that are drawn by hermeneuticists, conclusions

that stray well past what is warranted by the preceding position. These questionable

conclusions can be clustered into two major groups.


Epistemological Conclusions


The first set of wide-ranging conclusions can be introduced via reference to Charles

Taylor. In his now classic essay "Interpretation and the Sciences of Man" first published in 1971, Taylor reveals himself to be a powerful spokesman for the view that the

epistemological foundations of empirical science are an unsatisfactory base on which

to erect a “science of man”. Taylor refers disparagingly to the ingredients that make

up the “epistemological bias” of empirical social science, and he writes that

many, including myself, would like to argue that these notions about the sciences of man are

sterile, that we cannot come to understand important dimensions of human life within the bounds

set by this epistemological orientation [Taylor, 1977, p. 106].

Along what seems to be similar lines, Macdonald and Pettit argue in their Semantics

and Social Science that the epistemology of the social sciences is close to that of

the humanities: “Social science, insofar as its concern is the explanation of human

behaviour, begins to look like a discipline which belongs with the humanities rather

than the sciences” (Macdonald & Pettit, 1981, p. 104). This is a view which must

come as something of a shock to empirical educational researchers; and the shock

is exacerbated by the fact that Macdonald and Pettit are not alone. Thus, somewhat

less pithily, Rabinow and Sullivan assert, in their Interpretive Social Science: A Reader

(1979, p. 13), that

Interpretive social science has developed as the alternative to earlier logical empiricism as well as

the later systems approaches, including structuralism, within the human sciences. It must continue to develop in opposition to and as a criticism of these tendencies. Here interpretive social science

reveals itself as a response to the crisis of the human sciences that is constructive in the profound

sense of establishing a connection between what is studied, the means of investigation, and the ends

informing the investigators. But at the same time it initiates a process of recovery and reappropriation

of the richness of meaning found in the symbolic contexts of all areas of culture.

So, then, the first set of wide-ranging conclusions that are drawn are epistemological;

and yet detailed epistemological arguments are in short supply in this literature - for

example, it has not been shown in any detailed way how it is that hermeneuticists

actually know, that is, how the products of their interpretive endeavors are warranted.

(Once again Stegmüller's remark comes to mind; he states that philosophers of science customarily support their claims about the epistemology of science by detailed analyses of examples of scientific work, but hermeneuticists do not do the same in their own

fields.) The issues here will be taken up again later.

A different but obviously closely related form in which the epistemological claims

surface is in terms of the relation between the human sciences, the natural sciences, and

the humanities. The issue can be phrased as a question: is hermeneutical or interpretive

social science really a science, or is it a branch of the humanities? As Connolly and

Keutner put it in the introduction to their edited volume Hermeneutics versus Science? (1988, p. 1), "do the hermeneutical disciplines . . . differ in some important way from the natural sciences, i.e., are those disciplines "autonomous"?" And Schutz put it extremely clearly when he wrote (1962, p. 34):

There will be hardly any issue among social scientists that the object of the social sciences is human

behavior, its forms, its organization, and its products. There will be, however, different opinions


about whether this behavior should be studied in the same manner in which the natural scientist

studies his object or whether the goal of the social sciences is the explanation of the “social reality”

as experienced by man living his everyday life within the social world.

A spatial analogy might help clarify this second form taken by the epistemological

claims of the hermeneuticists. (This is meant only as a preliminary heuristic device;

obviously it is hard to locate specific theorists precisely, for their thought is usually

complex and defies simple accurate categorization.) The humanities, the social sciences,

and the natural sciences can be visualized as arranged - in that order - along a

continuum. With respect to this continuum, several schools of thought exist: (1)

Some scholars have wanted to drive a wedge between the humanities and the rest,

by insisting upon the “autonomy” of the humanities; typically, this has been done by

stressing the nature of the humanities as interpretive disciplines, in which hermeneutics

(and especially the hermeneutic circle) has a central position (see Stegmüller, 1988). (2)

Others have hammered at the same wedge, by insisting that the sciences are demarcated

from the humanities by having a logical character accurately described by the logical

positivists. (3) A number of scholars have wanted to remove the wedge entirely. One

group has tried to do this by insisting that all inquiry, to be genuine inquiry aimed at

producing warranted knowledge, must have the same underlying epistemology; usually,

the epistemology of science is taken as the model. On some readings, Dewey, and

perhaps Popper, belong to this group. (It should be stressed that in taking science as

the paradigm case of knowledge, these thinkers are not necessarily advocating a narrow

positivistic view of knowledge; in fact both Dewey and Popper have a fairly liberal view

of the nature of science - a topic on which there shall be more discussion later.) (4) A different group has wanted to remove the wedge entirely by stressing that all knowledge

contains a strong interpretive element. Heidegger and Gadamer, according to some

of their remarks, ought tentatively to be classified as members of this group. Thus

Gadamer (1977, p. 38) writes that “Hermeneutical reflection fulfills the function that

is accomplished in all bringing of something to conscious awareness. Because it does,

it can and must manifest itself in all our modern fields of knowledge, and especially

science.” (5) Others, in particular writers such as Taylor (1977), Macdonald and Pettit

(1981), and Dilthey (1976) wish at l east to drive a wedge into the continuum between

the natural sciences and the social sciences, so that the social sciences end up being

grouped with the humanities. Typically, as discussed earlier, the argument is that thesocial sciences, like the humanities, must give a central place to interpretive methods.

It can be seen, therefore, that there is a degree of overlap between the views of those

in groups (4) and (5); but (4) offers a somewhat more radical position than (5).

Conclusions Concerning the Nature of Social Science

The interpretivist case as outlined earlier also embodies within it certain views about

the nature and purpose of the various social and human sciences. Humans live in

societies, and societies are rife with objectifications of meaning; and it is with the

elucidation of these that the social sciences are centrally concerned. As Dilthey (1976, p. 192) put it,

Here the concept of the human studies is completed. Their range is identical with that of

understanding and understanding consistently has the objectification of life as its subject-matter.


Thus the range of the human studies is determined by the objectification of life in the external

world. Mind can only understand what it has created. Everything on which man has actively

impressed his stamp forms the subject-matter of the human studies.

But is it altogether clear that Dilthey is right? And even if the answer to this is in

the affirmative, does it follow that the central methods of the “human studies” must be

hermeneutical?

Skeptical Commentary

These two groups of far-reaching conclusions both require careful scrutiny. There

is some overlap between them, of course, so the discussion of each cannot be kept absolutely water-tight. It will make sense to build up to the central issue concerning

epistemology, so for want of a better arrangement the discussion will proceed in reverse

order.

Commentary: The Nature of Social Science

In the view of the interpretivists, the social sciences or “human studies” (together

with related fields such as educational research) are almost entirely concerned with meaningful human action together with the objectifications of meaning that are to be

found in human societies. (Dilthey, 1976, pp. 163-7, did allow that a study of nature

was also relevant, insofar as natural events are frequently the stimuli for human action,

and form the focus of mankind’s attempts to develop knowledge.) But the fact of the

matter is that the social sciences are not so centrally concerned with hermeneutical

matters as has been supposed by supporters of the interpretivist position. To make this

case, it need not be denied that some sort of interpretive activity is required in some of

the social sciences; the point is that there is much else besides.

(1) In general, it may be true that the social sciences study phenomena that are social;

and social phenomena, as the interpretivists claim, are constituted by the use of language

and by other symbolic interaction - and thus cry out for hermeneutical.analysis. But

the “in general” marks an important caveat. The qualification is required because there

are many social sciences and they do not constitute a “natural kind”; the category is

human-made and is of necessity somewhat vague. The point is that there are some

social sciences where hermeneutical activity does not appear to be central - witness

various branches of psychology, and much of economics.

According to some accounts psychology is a member of the social (and certainly of

the human) sciences; and it is clear that psychology includes within its domain the

study of mechanisms, such as the cognitive and emotional ones, that underlie individual

human performance. Mechanisms like these can be studied in a manner that is as little

hermeneutical as is, say, biological research. Cases in point are the use, by cognitive

psychologists, of nonsense syllable experiments, and designs that focus on performance

on non-verbal tests such as Raven’s “Progressive Matrices”.


And then there is economics, which is usually regarded as a clearcut member of the

social sciences - yet much of it can hardly be claimed to be hermeneutical. Some

branches of this “dismal science” certainly study the effects of social choices (as in

market phenomena), but these choices are conceptualized as being the mathematical

aggregate of individual choices. And it is crucial to note that, in general, the individual

is treated in the manner of an “ideal type” in physics: the individual is presumed to

be fully rational, fully knowledgeable, and to have a clear prioritization of needs and

desires. Mathematical modeling plays an important role here, but not hermeneutics. (All

that this adds up to is merely that economics is not, in essence, a descriptive discipline

which aims to discover by historical and interpretive methods why individuals make the

economic choices that they do.)

(2) However, even in those social sciences that do focus upon social phenomena - cultural anthropology, political science, and sociology are typical cases - there

is something more to study than human actions driven by motives, reasons, and

socially-determined understandings and interactions. Human actions have consequences

(both intended and unintended), and the study of these might not always require

hermeneutical interpretation. Theorists such as Popper place a great deal of emphasis

on the unintended consequences of human behavior; indeed, these consequences are

seen as a major driving force in history and are part of the reason that it is impossible

accurately to predict the future. Popper writes, in italics no less, that “only a minority of

social institutions are consciously designed while the vast majority have just “grown”, as

the undesigned results of human actions” (1961, p. 65). A little later (p. 67) he stresses

the “unavoidable unwanted consequences of any reform.” Sometimes Popper uses a

simple economics example to illustrate his point: a person who decides to buy a house

does not want the market suddenly to go up, but it will be the unintended consequence

of his or her entry into the housing market that prices indeed will rise. The study of

the laws of economics, and of how much the price of a commodity will change as the

number of individuals in the market changes, seems to be a non-hermeneutical scientific

activity. (The broader implications of Popper’s insight here will be discussed shortly.)

Consider a non-economic example: a political party in power in a country might

adopt a foreign policy for a set of reasons that requires interpretive understanding;

but unintended consequences of this policy might be that citizens resident overseas

have to return, gasoline shortages might break out as a consequence of disruption of overseas supplies, and there could as a result be a rise in the unemployment rate, which

in turn might differentially affect members of minority groups, leading to race riots and

the eventual overthrow of the party in power! All of these things can be documented,

correlated, and studied without use of hermeneutical methods. (This is not to deny, of

course, that some of these issues could be studied, for other purposes, using interpretive

methods. The point is that they also can be studied, and are studied in the social sciences,

using non-interpretive methods.)

At this point, if not earlier, an objection is likely to surface: The supporter of

hermeneutics is likely to protest that, contrary to the claim I have made, of course

all these research activities inescapably do require the use of interpretive methods! It is hard to resist the conclusion that, in making this counter-claim, hermeneuticists

have changed the meaning of the key term involved. Perhaps the point can be made

in terms of a distinction between a weak (and almost trivial) sense of “hermeneutics”

or “interpretation”, and a strong sense. In the weak sense, all endeavors that use the


medium of language involve interpretation-from following the directions in a recipe, to

understanding an advanced lecture in an academic speciality, the language of the writer

or speaker must be comprehended. In this weak sense, hermeneutics is like the Scarlet

Pimpernel. Unfortunately, though, this is a useless point for advancing the interpretivists' strong case; for it does not follow that because physicists or chemists or sociologists

use (and must understand) language, the epistemology of their disciplines is somehow

suspect or weaker than they thought it to be, or that they must use strong hermeneutic

methods as their core methodology. The strong hermeneutical program arises, not simply

because of the universal human use of language, but because of special problems within

this umbrella - the problem of understanding written records of human thought or

action (or other objectifications of these things, such as monuments or social practices

or rituals) from ages or cultures that are different from the interpreter’s own. Indeed,

the strong hermeneutical program only makes sense on the assumption that interpreters

do (in general) understand their own culture (or at least their own sub-culture) and their

own language; for otherwise there would be descent into a self-referential nightmare in

which an interpreter might not understand (in the weak sense) his or her own actions or

writings or thoughts of a prior moment (not to mention the fact that children would not

understand - and therefore could not talk with, or learn from - their parents)! If this

was the case, of course, all inquiry would instantly grind to a halt, and humans would

be ossified at the social level of sea anemones (if, indeed, these organisms can be said

to be ossified).

(3) To pick up the main thread of the argument: even where the center of attention

in a social science is an issue that clearly involves interpretation (in the strong sense),

there are many related issues that are non-hermeneutical (in this sense). For example,

members of a population might vote in a surprising way at an election, and their

actions may require culturally-sensitive interpretation (in the strong sense) in order to

be understood. (This is the sort of thing that is done, or that is attempted, by “TV

experts” on election night.) But other issues arise in understanding elections - such

things as the influence of the weather on the turnout of voters, the party preferences

of younger versus older voters, and the turnout of members of various ethnic groups.

To gather information on matters such as these, no strong hermeneutical activity has

to be engaged in. Certainly on some matters, the voters might have to be asked for

information (for example, in an exit poll of young voters to see which candidates they voted for), but what takes place here is quite unlike the strong hermeneutical activity

carried out by literary experts interpreting the meaning of Hamlet’s soliloquy or by

historians trying to understand some action of Julius Caesar.

(4) Finally, it should be noted that in many sciences different levels of phenomena

are distinguished - as when physical scientists distinguish between the sub-atomic level,

the atomic level, the molecular level, and so on. The relationship between such levels

is a highly debated matter: can phenomena at one level be “reduced to” (i.e., explained

in terms of) phenomena and laws at a “lower” level? Although the issues here are

exceedingly complex, it seems clear that explanatory principles used at one level do
not always apply at higher or lower levels. The same holds true in the social sciences and educational research; and it seems that

supporters of the interpretivist position (that is, of course, the strong position) would

be wise to consider the possibility that the use of hermeneutics might be appropriate at

some levels but not at others - leaving at least some phenomena to be dealt with by


non-hermeneutic social inquiry. It has already been seen that economics is an example

where the focus of attention is often (at least) on the group or aggregate level rather
than on the level of the individual human actors - and at the aggregate level there
seems to be a place for non-hermeneutical activity. Schelling (1978, Chapter 1) gives a simple example that can be adapted here for

illustrative purposes. (Artistic license has been exercised, and a different moral has

been drawn from the one Schelling highlights - he is interested in the question of the

fruitful co-ordination of the individual and group levels.) When audience members enter

large lecture halls, they seat themselves according to their own individual preferences.

That is, the choice of seating is an individual action, and the sitter’s knowledge, beliefs,

desires, and so forth may all play a role; and the choice of seating may also be a symbolic

act, such as one of defiance. Furthermore, a person’s choice is affected by the choices

made by people who entered the hall earlier. To understand why an individual chose a

particular seat, some sort of interpretive inquiry might be appropriate. And yet, if one leaves the individual level of analysis and moves to a “higher” level - the level where

audiences in halls rather than individuals become the “unit of analysis” - then it might

be apparent that there is a generalizable pattern to the filling of halls, the knowledge of

which could be helpful to designers of lecture halls, safety experts, and so on. And to

discover this pattern, no hermeneutical methods might have to be used; indeed, it can

be put even more strongly - hermeneutical methods could hinder the discovery of the

pattern rather than help. (Schelling argues, quite rightly, that the motives of individuals

might have to be considered if any attempt is going to be made to change future seating

patterns; but if the aim is not to change the pattern but to use knowledge of it in future

planning, then understanding the “micromotives” is not necessary for the comprehension

of the pattern in the “macrobehavior”.)

Another way to phrase the point just made is that not all of the patterns that are found

at the macrolevel in society are “objectifications” of meaning. And reference to Popper

can bolster the point: his argument that often what are most important in social affairs

are the unintended consequences of action is, in effect, making the point that there are

aspects of society that are not objectifications of meaning (for, by definition, unintended

consequences do not embody anyone’s intentions or meanings). Hermeneuticists who

assume that all social phenomena are objectifications have a distorted view of social

phenomena - and at the very least they owe us an argument to justify their position.

If they concede the point, they still owe us a discussion of the criteria that can be used

to distinguish those phenomena that are objectifications of meaning from those that are

not (a debt which, up to the present, they have seemed reluctant to discharge).

The conclusion that must be reached, then, is that although many social science

inquiries need to involve (strong) interpretive methods, many - very many - do

not have to. For it appears that the image of social science held by interpretivists is

too narrow; it is a view that is colored and limited by their own enthusiasms. This

conclusion is as far-reaching as those reached by the interpretivists, and it has important

implications: it weakens the remaining set of conclusions of the interpretivists. Given that

their view of social science is recognized as unduly narrow, it becomes more difficult to insert a wedge between social science (as it really is) and the natural sciences and thereby

to group the social sciences with the humanities; and thus it becomes more difficult to

sustain an epistemological onslaught. But it is to this remaining broad set of far-reaching

epistemological conclusions of the interpretivists that the discussion now must turn.


Commentary: The Epistemological Conclusions

There are two elements that require discussion here. In the first place, hermeneuticists

often attack the epistemology of traditional social science, which they regard as crudely empiricist, or worse, as a form of positivism. Second, there is the matter

of the epistemology of (strong) hermeneutical social science itself - that is, what do

interpretivists want to put in place of the present “inadequate” epistemology, and is their

alternative adequate?

To deal with first things first: clearly it is a travesty to regard all of mainstream

social science, even just in the U.S.A., as being neo-behaviorist in spirit. (What of the

recent developments in cognitive science, social psychology, ethnomethodology and

anthropology, linguistics, political science, organizational theory, etc.?) In the late

twentieth century it is abundantly clear that neither the natural nor the social sciences

have to be viewed as being based on logical positivism; an image of science has emerged

over the past few decades according to which it is a more open and more speculative

endeavor than had previously been thought (see the discussion of the work of Kuhn,

Lakatos, Popper, Feyerabend and others in Phillips, 1987, Part 1). It is not stretching

the truth to suggest that when hermeneuticists attack the epistemology of mainstream

social science, what they have in mind is what Popper has called “misguided naturalism”

(Popper, in Adorno et al., 1976, pp. 90-91). In effect they are attacking what by now

is recognized widely to be a straw man.

At least one alternative analysis of the epistemology of science has been offered by

Popper himself; hermeneuticists such as Taylor do not discuss it, for it seems immune

from the sort of charges they offer of positivism. (This is not to say that Popper has all

of the answers, or even some; his work remains a source of controversy - but in some

respects it is clearly an advance over positivism.) Popper denies that human knowledge

(including, of course, scientific knowledge) is certain by virtue of the fact that it is erected

upon unshakeable foundations. His books develop the case for a non-foundationalist

epistemology - see, for example, his Logic of Scientific Discovery, Conjectures and
Refutations, and Objective Knowledge - although it should be stressed that Popper

is not alone among twentieth-century epistemologists in regarding foundationalism as

outdated. (For an example of a psychologist who holds this epistemology, see Weimer,

1979.) That is, Popper and many others do not approach the problem of knowledge

in terms of seeking the “rock-bottom” and indubitable foundations upon which the

certain knowledge of science (and of everyday life, so far as it has certain knowledge)

is built by a process of induction. Instead, these luminaries stress that no knowledge is

unshakeably certain, and that there are no absolutely sound foundations for knowledge.

Human knowledge is speculative, it projects tentatively into the future; whatever reason

we have to believe the things we do believe, it is not because they are based on absolutely

sound foundations! Our beliefs, and the considerations that led us to hold them, are

always open to the possibility of revision. (Popper has offered an account of the issues

surrounding the “rationality of scientific belief” that arise here, but it is not clear that

his resolution of the problems is acceptable. See Newton-Smith, 1981.)

Acceptance of a non-foundationalist approach to epistemology, in which all knowledge

is regarded as tentative, has an additional virtue: it allows a softening of the predicament

highlighted by Gadamer (who did not see it as a predicament so much as a too-often

neglected fact of life), namely, the fact that interpretations must inevitably emerge


(Betti, 1980, especially pp. 68-69). The application of this distinction to literature leads

to some controversies - Fish, for example, would deny that Shakespeare’s intentions

(the “meaning” of Hamlet) are important, for what is relevant in literature is what readers
can impose or construct for themselves (the “significance”, which of course Fish would prefer to label as “meaning”). However, even in the contentious realm of literary theory

the distinction itself is useful and provides terminology that serves to highlight the issues

at stake.

In those areas of the social sciences where the strong hermeneutical program seems

appropriate, even greater light is shed by Betti’s distinction. Consider a person in a social

setting who performs some act that draws the attention of a social scientist. (The study

of individual human action, it will be recalled from the preceding discussion, is one area

where hermeneutical methods do seem appropriate in the social sciences.) Betti would

have us, in effect, recognize two sets of issues: (a) what did the actor intend (that is,

what was the meaning of the act), and (b) what can the social scientist or interpreter

say about the act (that is, what is the significance of the act). This is not to say that on

every occasion both of these matters are of interest, but both are possible concerns. Of

course Betti is not alone in pointing to these two things; the interpretivist Winch (most

notably in his well-known dispute with Jarvie) also makes this point - and Winch argued

that the issue having priority was the identification or description of the act, which by

conceptual necessity involves the determination of the actor’s intentions. Only after the

act has been identified, Winch suggested, might the social scientist be able to go on and

say something about it in terms of his or her own disciplinary perspective (assuming that

this second phase is relevant to the particular inquiry) (Winch, 1970, pp. 249-259; see

also Winch, 1958).

What can be said, then, about the epistemological underpinnings of this two-stage

interpretive process? The second stage is less problematical, relatively speaking. After

an act or document is interpreted as being an instance of X (for example, an expression

of jealousy), then it is relatively straightforward to judge if it falls within the domain

of some theory T. To be an acceptable theory, T would need to have a warrant that

is appropriate - if T is a theory of literary criticism, then it would need whatever

warrant is required for reputable status within that field, whereas if T is from sociology

or economics then different types of warrants would be appropriate. Within the social

sciences, there is a degree of agreement - although it is far from universal - about such matters as whether a theory T is well-warranted (or if not, why not), and whether
phenomenon X falls within the domain of T. (This sounds simple enough on the surface,

but of course there are many complexities; these, however, are subject to lively debate

and investigation within the traditional academic domains. Whether or not one judges

the epistemological program sketched here to be reasonable depends upon whether one

regards epistemology as a total field as viable, or as dead. If the reader judges it to be

dead, there is not much more to be said, except that this essay should have ceased being

of relevance long ago!)

The epistemological difficulties of the second phase pale into insignificance, however,

when compared to the problems faced by the first. How does an interpreter know that he or she has correctly identified the intentions of an actor (or has understood the meaning

that has been objectified in some social institution)? Neither Winch, Betti, Gadamer,

Dilthey, Taylor nor the rest of the hermeneutical horde has made much headway here,

although many of them certainly espouse the ideal of settling on correct interpretations.


Betti is an illuminating figure here, for he explicitly wants to establish an “objective”

position; he writes (1980, p. 57) of “the demand for objectivity: the interpreter’s

reconstruction of the meaning contained in meaning-full forms has to correspond to

their meaning-content as closely as possible”, and this requires “honest subordination”

(i.e., subordination of the interpreter to the “other” whose meaning is being deciphered).

Betti criticizes Gadamer's book Truth and Method on the ground that (unintentionally) it

undermines the quest for objectivity, which Gadamer also espouses. Yet the best that

Betti can do himself is to argue that objectivity arises through the strenuous subjective

efforts of the interpreter intuitively or empathetically to understand the meaning of the

other! This hardly seems an adequate means to achieve the goal he set out in the form

of a methodological canon which he labels, somewhat grandiosely, as “the canon of the

hermeneutical autonomy of the subject”:

By this we mean that meaning-full forms have to be regarded as autonomous, and have to be understood in accordance with their own logic of development, their intended connections, and in

their necessity, coherence and conclusiveness; they should be judged in relation to the standards

immanent in the original intention: the intention, that is, which the created forms should correspond

to from the point of view of the author. [Betti, 1980, p. 58].

From the point of view of the present writer, this canon is fine, but the epistemological

resources with which Betti wants to operationalize it are, to say the least, deficient.

A case can be made - although it can only be sketched here - that for the purposes

of social science, meanings and intentions can be investigated using traditional scientific

methods. That is, it can be argued that there is no epistemological difference in kind

between gaining knowledge about the other objects of science and gaining knowledge about meanings and intentions. Many branches of science can provide cases where the

objects of interest are not directly observable or measurable, but where their presence

(and their nature) is inferred from what is observable. This process is hypothetical, and

it is not guaranteed to be successful; but it is self-corrective - by a bootstrapping

process involving testing and elimination of errors (which is itself a tentative business),

the warrants for the claims that are made about such objects become stronger (though,

many would argue, never so strong that matters become completely settled). Again,

what is sauce for the scientific goose is sauce for the hermeneutic gander: intentions and

meanings can be investigated in the same way. Tentative hypotheses can be checked,

if somewhat indirectly; empirical evidence can have a bearing on hermeneutical issues;

and hermeneuticists can - and do - use the hypothetico-deductive method common

across the sciences (Follesdal, 1979).

Conclusion

The net conclusion is that although there are some areas of social science and

educational research where the strong hermeneutical program is important, these

neither exhaust the scope of the social sciences and educational research nor offer

any serious grounds on which to hold that the social sciences and related applied fields

are more closely allied with the humanities than with the natural sciences. Those who

hold the contrary view, and claim that there is a similarity in kind with the humanities,

or that empirical social science is completely misconceived, need to offer more detailed

arguments and examples.


Nevertheless, the sometimes exaggerated claims of the hermeneuticists have served

a very useful purpose: these claims have forced the adherents of traditional “pure” and

“applied” social science to broaden their view of the nature of persons - instead of

treating people on a par with inanimate objects they have been forced to regard persons as actors located within social and historical webs of meaning. And this constitutes a

watershed. (For an example of how “traditional” researchers have come to accommodate

the hermeneutical position, see Gage, 1989. Critics of Gage’s earlier works have often

labeled him as a neo-positivist.)

Acknowledgements - The author has profited greatly from conversations with Deborah Kerdeman,

although she does not fully endorse some of what is written herein. Henry Alexander, Ron Glass, Ray

McDermott, Gabriele Lakomski, Colin Evers, and Harvey Siegel have also given helpful feedback on

earlier versions.

References

Apel, K.-G. (1977). The a priori of communication and the foundation of the humanities. In F. Dallmayr & T. McCarthy (Eds.), Understanding and social inquiry. Notre Dame, IN: University of Notre Dame Press.
Bauman, Z. (1978). Hermeneutics and social science. New York: Columbia University Press.
Betti, E. (1980). Hermeneutics as the general methodology of the Geisteswissenschaften. In J. Bleicher (Ed.), Contemporary hermeneutics. London: Routledge.
Connolly, J. & Keutner, T. (Eds.) (1988). Hermeneutics versus science? Notre Dame, IN: University of Notre Dame Press.
Dilthey, W. (1976). Dilthey: Selected writings. Cambridge: Cambridge University Press.
Fish, S. (1980). Is there a text in this class? Cambridge, MA: Harvard University Press.
Follesdal, D. (1979). Hermeneutics and the hypothetico-deductive method. Dialectica, 33, 319-336.
Gadamer, H.-G. (1977). Philosophical hermeneutics (Tr. D. Linge). Berkeley: University of California Press.
Gage, N. L. (1989). The paradigm wars and their aftermath. Educational Researcher, 18, 4-10.
Macdonald, G. & Pettit, P. (1981). Semantics and social science. London: Routledge.
Newton-Smith, W. (1981). The rationality of science. London: Routledge.
Palmer, R. (1969). Hermeneutics. Evanston, IL: Northwestern University Press.
Phillips, D. C. (1987). Philosophy, science and social inquiry. Oxford: Pergamon Press.
Popper, K. (1961). The poverty of historicism. London: Routledge.
Popper, K. (1976). The logic of the social sciences. In T. Adorno et al. (Eds.), The positivist dispute in German sociology. London: Heinemann.
Rabinow, P. & Sullivan, W. (Eds.) (1979). Interpretive social science: A reader. Berkeley: University of California Press.
Ricoeur, P. (1977). The model of the text. In F. Dallmayr & T. McCarthy (Eds.), Understanding and social inquiry. Notre Dame, IN: University of Notre Dame Press.
Schelling, T. (1978). Micromotives and macrobehavior. New York: Norton.
Schutz, A. (1962). The problem of social reality: Collected papers 1. The Hague: Martinus Nijhoff.
Stegmüller, W. (1988). Walther von der Vogelweide's lyric of dream love and quasar 3C 273. In J. Connolly & T. Keutner (Eds.), Hermeneutics versus science? Notre Dame, IN: University of Notre Dame Press.
Taylor, C. (1977). Interpretation and the sciences of man. In F. Dallmayr & T. McCarthy (Eds.), Understanding and social inquiry. Notre Dame, IN: University of Notre Dame Press.
Weimer, W. (1979). Notes on the methodology of scientific research. New Jersey: Lawrence Erlbaum.
Winch, P. (1958). The idea of a social science. London: Routledge.
Winch, P. (1970). Comment. In R. Borger & F. Cioffi (Eds.), Explanation in the behavioral sciences. Cambridge: Cambridge University Press.


Biography

D. C. Phillips, an Australian, moved to Stanford University in California in 1974,

where he is currently Professor of Education and Philosophy, and Chairperson of

the Program in Research and Evaluation Methods in Education. He is the author

or co-author of a number of books including Philosophy, Science, and Social Inquiry
(1987), Visions of Childhood (1986), Perspectives on Learning (1985), and Toward
Reform of Program Evaluation (1980). He is co-editor of the 1991 NSSE Yearbook
Evaluation and Education: At Quarter-Century. During 1990-91 he was President of the

Philosophy of Education Society.


CHAPTER 5

META-ANALYSIS, METHODOLOGY AND RESEARCH

INTEGRATION

BRIAN D. HAIG

Department of Psychology, University of Canterbury, Christchurch, New Zealand

Abstract

This chapter presents a critique of meta-analysis by focusing on its underlying rationale.

It is argued that drawing a distinction between scientific and evaluative inquiry, where

meta-analysis is depicted as a methodology for quantitatively combining the results of

evaluation research, presupposes an unacceptable and largely empiricist view of both

forms of inquiry. Against prominent advocates of meta-analysis, it is claimed that a tenable

view of scientific research will be inherently evaluative. Additionally, it is argued that

the alleged incommensurability of theoretical frameworks does not prevent inter-theoretic

communication and integration, and that Glass’ meta-analytic rationale offers a deficient

account of the nature of methodology and policy research. It is contended further, that the

inherent deficiencies of Fisherian outcome studies render meta-analysis a worthless research

exercise. As a result it is concluded that serious attempts to integrate worthwhile knowledge

claims cannot be achieved statistically through the meta-analyses of outcome studies, but will

instead require “qualitative”, densely reasoned efforts to construct postulational and global

theories. Finally, the chapter gives some attention to a neglected perspective on research

integration by outlining an explanatory coherentist theory of justification and showing how

this can be implemented procedurally in social science research.

Introduction

The recent adoption and widespread use of meta-analysis procedures to integrate the

results of outcome studies in many areas within the social sciences stands as one of the

most striking research developments of the last decade. Meta-analysis is an approach to

data analysis that involves the quantitative analysis of data analyses of extant empirical

studies. Hence the term “meta-analysis” coined by Glass (1976). Meta-analysis is

concerned with the statistical analyses of the results of data analyses from many

individual studies in a given domain for the purposes of integrating or synthesizing

those research findings.

Meta-analysis comes in a variety of forms (Bangert-Drowns, 1986), but in its simplest

form meta-analysis requires computation of the average effect size for a group of studies.

For Glass, the effect size measure is the standard score obtained by subtracting the mean

of the control group from that of the treatment group and dividing this difference by the

standard deviation of the control group. This is done for each of the relevant dependent


variables in each study. The effect sizes are then summed and divided by the total number

of effects to obtain the average effect size.
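In symbols, the procedure just described can be restated as follows (a minimal sketch; the subscript i, which indexes the N effects pooled across studies, and the symbols for the treatment and control group means and the control-group standard deviation are notation introduced here only for illustration):

\Delta_i = \frac{\bar{X}_{T,i} - \bar{X}_{C,i}}{s_{C,i}}, \qquad \bar{\Delta} = \frac{1}{N} \sum_{i=1}^{N} \Delta_i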

In the face of burgeoning and fragmented research literatures displaying conflicting

results, meta-analysis is offered as a rigorous and objective alternative to the customary unsatisfactory integration methods of narrative literature reviews and vote-counting of

significance test outcomes. Narrative reviews have been criticized as casual, severely

selective and unable to portray accumulated knowledge (Light & Smith, 1971), while

vote-taking from box-score tallies of significance test outcomes has been faulted for

its failure to acknowledge the methodological asymmetry between confirmation and
refutation, and for its bias in favour of large-sample studies for which the significant
outcomes are largely a function of statistical power (Meehl, 1978).

Despite the claimed advantages of meta-analysis, a number of different types of

criticism have been leveled against it. For example, Slavin (1984) has argued that the

actual use of meta-analysis procedures in education constitutes a retrograde step in the art of research integration. Others (e.g., Bruno & Ellett, 1988; Cook & Leviton, 1980;

Erwin, 1984) have pointed out serious methodological limitations of the approach; while

at a meta-theoretical level Eysenck (1984) has argued that the meta-analytic enterprise

is unscientific, and constitutes “an abuse of research integration”.

However, to date evaluations of meta-analysis and its applications have shown little

regard for the accompanying rationale provided by Glass (Glass, 1972; Glass & Kliegl,

1983). Glass himself rightly claims that many misunderstand meta-analyses of outcome

research because they fail to take cognizance of the rationale. This failure is offered by

Glass as the reason for the widespread misunderstanding of Smith, Glass, and Miller’s

(1980) original meta-analysis of psychotherapy outcome studies.

My purpose in this chapter is to provide a critique of meta-analysis by focusing

on the conception of inquiry embodied in its underlying rationale. It is argued that

drawing a distinction between scientific and evaluative inquiry, where meta-analysis is

depicted as a procedure for combining the results of evaluation research, presupposes

an untenable and essentially empiricist view of both types of inquiry. Additionally, it is

claimed that there are a number of deficiencies in Glass’ perspective on methodology,

and that meta-analysis is premised on a deficient account of policy research. Finally,

some general suggestions are made about desirable future directions for our research

integration efforts. These involve adopting a coherentist theory of justification to help

construct and evaluate postulational and global theories.

Scientific and Evaluative Inquiry

The core of the rationale for Glassian meta-analysis involves drawing a distinction

between scientific and evaluative inquiry. Glass’ (1972) position is that researchers

as scientists are concerned to satisfy their curiosity by seeking truthful conclusions in

the form of theories comprising explanatory laws. By contrast, evaluators undertake

research on behalf of a client which is aimed at producing useful decisions based on descriptive determinations of the worth of particular products or programs. For Glass
the meta-analysis of outcome studies properly involves the integration of the products
of evaluative research only.

Glass differentiates scientific from evaluative inquiry in respect of nine basic contrasts.


I shall critically consider the most important of these contrasts, arguing that none of

them plausibly distinguish scientific and evaluative inquiry. This will serve to undermine

seriously an essential part of the rationale he provides for meta-analyzing outcome

studies, and at the same time help provide a more defensible characterization of the

research process.

Motivation of the Inquirer

According to Glass scientific inquiry is undertaken largely to satisfy the curiosity of

the researcher, and to this end will involve the construction of theories. By contrast,

the researcher’s basic concern in conducting evaluative inquiry is to help solve a client’s

practical problem. But trying to distinguish between the two types of inquiry in this way won't do. For

one thing, science, being a human activity, is pursued for a multiplicity of reasons, both

personal and epistemic. Also, given that science aims at valuable truth, then the use of

a theory to help solve a practical problem counts as an epistemic virtue and contributes

to the overall excellence of that theory. Further, it is desirable to conceive of scientific

inquiry itself as a problem oriented endeavor where the concern is to formulate better

ill-structured problems so that we might solve them (Haig, 1987; Nickles, 1981).

Laws and the Particular

Glass briefly invokes the popular distinction between nomothetic and idiographic

research to differentiate further scientific and evaluative inquiry. For him scientific

inquiry involves the search for laws understood as statements of relationship among

variables or phenomena, whereas evaluation just involves the description of the value,

or values, of a particular thing.

This contrast between nomothetic and idiographic forms of inquiry is clearly based

on the widely held view that causal laws are universal, or generally applicable, empirical

regularities. However, it is more defensible to think of causal laws as the causally

necessary activity of generative mechanisms rather than their conditions of activation

or expressions of effect (Bhaskar, 1978). On this view it is a contingent matter whether

the mechanisms happen to be in a closed system like an experiment in which they can

produce empirical regularities. A law does not cease to exist in an open system just

because its empirical manifestations are absent. It is just that these latter are typically

altered or checked by the work of other causal mechanisms in an open system.

Now, by taking causal laws to involve the natural necessity of causal mechanisms

rather than the generality of empirical regularities, we can effectively dismiss Glass’

use of the popular distinction between nomothetic and idiographic inquiry for wrongly

holding causal laws and claims about the particular to be alternatives. Because laws arecentrally a matter of necessity, nomothetic inquiry can be either idiographic or universal

in nature. A science of the particular is a perfectly proper project, as for example in

the ethogenic study of individual lives using autobiographical methods. Moreover, to

endorse a study of the particular is not to foreclose the possibility that the future


comparative study of individual lives may well reveal deep structural universals (cf.

De Waele & Harre, 1979).

The Role of Explanation

According to Glass science involves the continual search for subsurface explanations of

surface phenomena. Evaluative inquiry, on the other hand, does not seek explanations.

“A fully proper and useful evaluation can be conducted without producing an explanation

of why the product or program being evaluated is good or bad or how it operates to

produce its effects . . . [It] is usually enough for the evaluator to know that something

attendant upon the [product or program] is responsible for the valued outcomes.”

(Glass, 1972, pp. 5-6). Glass' position seems to be that, even though program

treatments can be causally responsible for their measured outcomes, it matters little

that knowledge of this gleaned from evaluation studies does not tell us how programs

produce their effects, because such knowledge is not needed for policy action.

Glass is surely correct in asserting that scientists are centrally concerned with the

construction of causal theories to explain phenomena, for this is the normal way in

which they achieve understanding of what they study. However, he is wrong to insist that

proper evaluations can deliberately ignore knowledge of underlying causal mechanisms.

The reason for this is that the effective implementation and alteration of social programs

requires knowledge of the relevant causal mechanisms involved (Gottfredson, 1984),

and strategic intervention in respect of these is the most effective way to bring about

social change. I grant that orthodox realism is wrong to insist that the relevant causal

mechanisms will always be unobserved, but it is the case that appeal to knowledge of

covert mechanisms will frequently be required for understanding and change.

Truth and Social Utility

This is probably the major contrast for Glass. He asserts that scientific inquiry

characteristically attempts to assess the truth of knowledge claims, whereas evaluative

inquiry attempts to gauge the worth of things. Glass takes truth to comprise the empirical validation and logical consistency of knowledge claims, while worth is understood as

social utility. He acknowledges that truth is highly valued and worthwhile, but, this

point aside, he insists that the contrast between truth and utility effectively helps to

distinguish science from evaluation.

It is clear that Glass’ position amounts to drawing a sharp fact/value distinction in

which theoretical knowledge is claimed to be value free. But I shall show in a moment

that this is a distinction that cannot be sustained. At the same time, it should be

appreciated that Glass fails to make an important epistemic distinction that is crucial

to a satisfactory understanding of science: in identifying truth with empirical adequacy

and logical coherence Glass has conflated the epistemic notions of truth and acceptance. Truth is best understood as (causal) correspondence with reality where it functions as

a guiding ideal for science. As such it is a highly valued, though unattained, goal that

helps us make sense of science as an attempt to represent and intervene in the world.

However, truth is only accessible indirectly by way of the various criteria we use to


evaluate and accept theories (Hooker, 1987). Empirical adequacy and logical coherence

are in fact two such criteria. They do not constitute truth itself, but instead function as

surrogates for truth.

By separating fact from value Glass clearly believes that the theoretical knowledge of science is free from value commitments. But this is definitely not the case. As just noted,

excellence of theory is determined by a multiplicity of related epistemic criteria. These

typically include explanatory power, existential depth, internal and external consistency,

fertility and practical application. Criteria such as these are in effect the good-making

features of theories, and as such they provide us with the means for judging theories to

be of value. The employment of such a range of values is an expression of the idea that

realist science pursues theoretically interesting or valuable truth, not truth simpliciter.

Contrary to Glass’ view of the matter, scientific inquiry deliberately strives to marry the

true and the good.

The point being made here is not just that science is influenced by value commitments,

but that it is actually constituted by values. That values are an inextricable part of science

can be seen dramatically when we realize that science is helping us turn our world into an

artifact which, because it involves the realization of human designs, necessarily combines

fact and value (Hooker, 1987). The old empiricist fact/value bifurcation has never been

true to science, and therefore cannot be used as a sound basis for distinguishing between

scientific and evaluative inquiry.

None of Glass’ major contrasts plausibly differentiate scientific and evaluative inquiry.

The post-positivist sketch presented here takes science to be a value-laden, problem-
oriented human endeavor that tries to construct valuable causal explanatory theories of both particulars and universals. This is a view of science that emphatically rejects Glass'

empiricist view of both scientific and evaluative research. As we shall see later, it is a view

of science that assigns little importance to research integration through meta-analysis.

The Nature of Methodology

Methodology as Empirical

An important part of the rationale for Glassian meta-analysis involves adopting a conception of methodology as a substantive empirical discipline. According to Glass,

critics have often misunderstood meta-analysis because they have failed to appreciate

that it embodies a methodology of this sort. He claims that for any given empirical

domain a methodology combines with an object field and a taxonomy to give that

domain its basic structure. None of these three components are given to us a priori

as a product of logic. Instead, they are chosen for both arbitrary and historical reasons.

Methodologies, for instance, are selected and developed partly as a response to the

structure and pragmatic needs of society. In this regard, Fisherian experiments are said

to embody principles that grow out of the demand for control made by a technological
society. Methodological assumptions are not established a priori. Rather, they are genuinely refutable conjectures. Indeed, it should be emphasized that one of the main

functions of Glassian meta-analysis is to undertake the empirical investigation of such

assumptions as part of its own object field.

Glass believes that the most serious criticisms of meta-analysis lose their force when


they are examined from the standpoint of empirical methodology. The two major

criticisms he examines concern the quality of studies included in meta-analyses, and the

incommensurability involved in combining such studies. I shall consider these problems

in turn and suggest that Glass’ view of each is far from satisfactory.

The Quality of Study Problem

Glass has strongly criticized the traditional practice of excluding from the review

process all those studies deemed to be methodologically unsound. He objects that

judgments of exclusion frequently reflect the particular subjective biases of the

researchers involved and that such judgments are made a priori involving the treatment

of methodological principles as dogma. With his empirical approach to methodology, Glass adopts liberal criteria for the inclusion of studies in a meta-analysis. Faced with

the predictable “garbage in, garbage out” complaint, Glass defends the inclusion of

methodologically poor studies on the grounds that it is an empirical question whether

threats to internal validity affect the results of studies. He contends that meta-analysis

can answer this question by ascertaining the relation between these threats and effect

sizes.

However, as Erwin (1984) has correctly noted, Smith et al.'s (1983) meta-analysis

of the psychotherapy outcome literature arbitrarily excludes all studies with non-

experimental controls. Having criticized other reviewers for the a priori exclusion of

“bad” studies, Smith et al. fail to give non-experimental studies a chance to justify

empirically their inclusion in the meta-analysis.

More serious than this inconsistency is the point noted by Erwin that Glass and

his co-researchers cannot justifiably assume that a randomized experiment with two

treatments is sufficient to conclude that a psychotherapeutic treatment causally produced

beneficial effects. This is because such experiments generally are incapable of ruling

out all plausible rivals to the hypothesis under test. This methodological limitation

of outcome experiments is but part of a deeper worry I have that such experiments

are vitiated by further serious methodological shortcomings. I believe that Fisherian

experiments are broken-backed and that, because such experiments dominate meta-

analyses, there will not be a sufficient number of well controlled studies to determine

how methodological quality affects results. Indeed, I believe we are led to the conclusion

that the meta-analysis of Fisherian outcome studies is not really a worthwhile research

exercise.

Fisherian experiments are characterized by their employment of the procedure of

randomization to control for unwanted nuisance variables, the use of analysis of variance

to partition causes, and the appeal to statistical significance tests to help evaluate the

causal hypotheses in question. Unfortunately, none of these three features does its
intended job, thus seriously undermining the Fisherian approach to experimentation.

The procedure of randomization is employed in Fisherian experiments for two reasons:

it provides a justification for significance tests by guaranteeing the correct probability

distribution under the null hypothesis; and, as just noted, it is introduced in order to

control the influence of nuisance variables. But neither of these reasons is plausible. As

I shall note shortly, Fisherian significance tests are flawed and should be abandoned.


This removes one of the reasons for appealing to randomization. Also, as Urbach

(1985) has argued, randomization is both unworkable and unnecessary. Randomization

is unworkable because an infinite number of randomizations will be needed to match

the possible sources of error; not all possible influences are randomized out by a single randomization. Randomization is unnecessary because we can identify the most potent

nuisance variables and match groups on these. (Here I ignore the fact that matching

may introduce its own problems; cf. Meehl, 1970).

Fisherian experiments make use of analysis of variance (ANOVA) procedures in order

to fathom the relative contributions that various factors make to producing an outcome.

However, ANOVA is seriously deficient as an analysis of causes, and is often a poor

guide to causal structure in the social sciences. ANOVA is frequently used as a method

for trying to discern the separate causes of variation, but as Lewontin (1974) and Sober

(1984) have shown, partitioning by ANOVA cannot separate causes that occur in an

interactive world. Rather than provide us with an analysis of separate causes, ANOVA,

with its linear model, provides us with a tautological partitioning of total variance among

observations into main effects and interactions of various orders. To turn ANOVA’s

linear model into a contingent one relating the values of variables we would need general

statements about functions. But the ANOVA model is a local analysis giving results

dependent on the distributions of the particular populations sampled. As Lewontin puts

it, ANOVA confuses the local analysis of variance with the global analysis of causes.
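Lewontin's point can be illustrated with the partition that the linear model delivers for a balanced two-factor design (a textbook identity, stated here only for illustration; the factor labels A and B are arbitrary):

SS_{\mathrm{Total}} = SS_{A} + SS_{B} + SS_{A \times B} + SS_{\mathrm{Error}}

The identity holds by algebra for whatever particular populations happen to have been sampled, which is why the partition is local and tautological rather than a contingent claim about the separate causal contributions of A and B.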

As mentioned earlier, significance testing gets incorporated into the Fisherian

conception of experimentation in order to evaluate empirically the hypotheses under

test. Unfortunately, significance tests have a number of shortcomings which have long

been discussed in the literature (cf. Oakes, 1986). The basic problem with significance

tests is that they provide very weak support for a hypothesis or theory. It is acknowledged

by many professional statisticians that the point-null hypothesis is (quasi) always false in

the social sciences. This being the case, reasonable sample size makes the achievement

of statistical significance the likely outcome of an experiment. Even if we make the

achievement of statistical significance contingent upon the rejection of a directional null

hypothesis, we still have about a fifty-fifty chance of obtaining a statistically significant

result with a “good” experiment (Meehl, 1967).
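A rough sketch of the arithmetic behind this point, using the standard large-sample normal approximation for a two-group comparison (the symbols n, \delta and \alpha are introduced here only for illustration): if the true standardized difference is some non-zero \delta, then the power of a two-sided test at level \alpha with n subjects per group is approximately

\Phi\!\left( |\delta| \sqrt{n/2} - z_{1-\alpha/2} \right)

which tends to 1 as n grows, so that with any non-null effect and a reasonable sample size statistical significance becomes the expected outcome; and if the researcher's directional prediction is no better than chance with respect to the sign of \delta, the probability of a significant result in the predicted direction approaches the fifty-fifty figure just mentioned.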

Glass and Kliegl (1983) express deep dissatisfaction with the practice of outcome

research, particularly in the area of psychotherapy. Nonetheless, they still think meta-analysis can winnow out something of value from such studies. In this section of
the chapter I have argued that the methodological flaws inherent in Fisherian outcome
studies make them irredeemable.

The Problem of Incommensurability

Research integration through meta-analysis has also been criticized for lumping

together studies that are fundamentally incomparable. Glass handles this criticism by

drawing a distinction between practical and theoretical commensurability. He claims the

more common criticism - that meta-analysis mistakenly mixes “apples and oranges”

- has to do with practical commensurability; and that, by comparison with theoretical

commensurability, it poses little problem. For Glass theoretical commensurability “. . . is


a long-standing point of debate in the philosophy of science, and the best that can be

said of progress toward the solution of the problem is that there has been little” (Glass

& Kliegl, 1983, p. 39).

However, Glass' treatment of commensurability is grossly inadequate. In separating practical from theoretical commensurability, he implausibly presupposes a strong

observation/theory distinction. But post-positivist philosophy of science makes it

abundantly clear that such a distinction is not to be had. Observation statements are

theory laden and are assembled into networks with other classes of theoretical statements

to form scientific theories. This holds for both observational and postulational theories.

As a result, any meaning variance will hold, not only across different postulational

theories, but also across different observational theories such as outcome studies. There

is only theoretical commensurability.

Glass quickly lays the spectre of theoretical incommensurability by declaring that

there has been minimal philosophical progress on this issue (really, a cluster of issues).

He does not say anything about the (different) views of Kuhn (1970) and Feyerabend

(1978) on this problem. Nor does he indicate that there are a number of ways of coping

with incommensurability. For one thing, it is a common understanding of Kuhn’s position

that incommensurability means that rival theories cannot be comparatively appraised,

whereas Kuhn only meant to suggest that the appraisal cannot be effected by a neutral

set of procedures and facts. Further, the real problem here is incommensurability of

meaning, which is said to arise because a term gets its meaning from its place in a

theory, with the result that it will mean something different for competing theories.

It is commensurability of meaning that Feyerabend despairs of achieving when we are

confronted with the task of comparing rival theories.

In maintaining that the meaning of a scientific term is determined entirely by its role

in the network of statements that comprise the theory, proponents of incommensurability

such as Feyerabend adopt an implausibly extreme view of conceptual role semantics.

Not only does such a view of meaning fail to square with the history of science (pace

Feyerabend; see Nersessian, 1984), but it makes incommensurable theories unlearnable.

As Walker and Evers (1988) point out, we cannot possibly understand any part of

a theory unless we properly comprehend the entire theory, but such comprehension

itself is impossible for us until we have learned the various parts. For conceptual role

semantics to be viable, it must permit the meaning of a term to be determined only partly by its conceptual role in a theory. When this is the case we can avoid the unattractive

consequence of incommensurability that rival theories are not comparable in terms of

some shared criteria.

A different way of avoiding the problem of meaning incommensurability involves

adopting a theory of meaning that shifts its focus from the idea of meaning to that

of reference. Putnam (1979) has developed one such promising theory which provides

for continuity of reference while admitting incommensurability of meaning. Although

not without its problems, this theory does concern itself with “. . . meaning that is
pretty natural for a wide range of linguistic practices, and which does not invite talk of
incommensurability. It is the kind of theory that scientific realists about entities need” (Hacking, 1983, p. 91).

Yet a different strategy for avoiding the problem of incommensurability has been

fashioned by Nersessian (1984). She believes that the post-positivist tendency to

conceptualize and solve the problem of incommensurability in terms of the philosophy


of language does not square with meaning change in actual scientific practice. From

a cognitive-historical examination of meaning changes in various electromagnetic field

theories that span different paradigms she concludes that, while the meaning of

“electromagnetic field” differs significantly across theories, each concept shares part of the meaning of its predecessors. This suggests that there is meaning variance, but

with a significant amount of commensurability.

It would appear, then, that philosophers have made sufficient progress in solving

the problem of meaning commensurability not to despair of constructing, evaluating

and integrating different and possibly competing theories. There is no need to seek a

meta-analytic haven from the spectre of incommensurability.

Methodology as Theory

As noted above, Glass rejects the influential conception of methodology as an a priori

enterprise and maintains it is an empirical endeavor. I believe he is right to criticize

a priori methodology, but wrong to suggest that methodology is just an empirical

enterprise. Viewing methodology as a priori knowledge is dubious because the notion of

a priori knowledge is itself highly questionable. The a priori categories of analytic truth,

synthetic a priori truth, logical truth and mathematical truth have all been subjected to

serious criticism within philosophy (e.g., Haack, 1974; Kitcher, 1983; Quine, 1953). But

while one can accept Glass’ claim that methodological statements are genuinely refutable

conjectures, it does not follow that methodological assertions are evaluated solely on

empirical grounds. The reason for this is that, in science, procedural knowledge, no less

than substantive knowledge, has the status of warranted conjectural theory and that,

broadly speaking, both kinds of knowledge are validated using the methods of science

(cf. Walker & Evers, 1988). Because substantive scientific theories are underdetermined

by the relevant data, they are additionally evaluated on superempirical dimensions such

as explanatory power, systemic worth and fruitfulness. We should not expect it to be

any different with our methodological theories. To be sure, empirical evidence will have

some bearing on assessing the soundness of methodological rules, but these assessments

will be highly inconclusive without invoking appropriate superempirical criteria. I have

already argued that the deeply flawed nature of Fisherian outcome studies prevents Glass

from being able to carry out sensibly his empirical scrutiny of the merits of those studies

deemed methodologically suspect on a priori grounds. Additionally, I note that, where

Glass admonishes researchers for engaging in what he thinks are a priori methodological

debates, it may very well be the case that some of the disputes are really a posteriori

inter-theory debates about contingent matters.

Glass and his associates (Glass, McGaw & Smith, 1981; Smith et al., 1980) have

repeatedly emphasized that meta-analysis recommends itself over traditional review

procedures because of its objectivity. This is said to be achieved by proscribing judgment

strategies in meta-analysis which will prevent biases entering into the results it produces.

Here Glass appears to saddle himself with the untenable empiricist doctrine of logicism, which maintains that knowledge claims can be produced by making use of data and logic

only. However, logicism has been shown to fail (e.g., Maxwell, 1975) precisely because

human judgments have been found to be an important component in the production

of worthwhile knowledge. In fact, appeal to the various superempirical criteria in


appraising theories is one striking way in which human judgments necessarily enter

into the knowledge production process.

Meta-analysis and Policy

Glass and Kliegl (1983) maintain it is naive to believe that rational policy decisions

must be based on relevant knowledge from well established theories. They echo Meehl’s

(1978) judgment that “most so-called “theories” in the soft areas of psychology (clinical,

counseling, social, personality, community, and school psychology) are scientifically

unimpressive and technologically worthless” (p. 806). However, by reinterpreting

such theories as the modest products of evaluative research, and submitting them to

meta-analysis where appropriate, Glass and Kliegl believe that useful knowledge can be

provided for decision makers. In this way they believe they can overcome researchers’

habitual tendency to engage in “partisan squabbles and theoretical hot-dogging when

attempting to inform policy makers” (p. 35). However, meta-analysis has generally

failed in its attempt to establish clear judgments of pragmatic worth for policy makers.

Different meta-analyses in the same subject area have often produced different results.

For example, the constructive replication of the initial Smith et al. (1980) meta-analysis of psychotherapy outcome studies by Prioleau, Murdock and Brody (1983) produced

discrepant conclusions. Because meta-analyses are unavoidably replete with human

judgments over which researchers will differ, it is only to be expected that they will

be unable to provide clients with unambiguous messages.
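A small illustrative sketch may make the point concrete. The following Python fragment uses wholly hypothetical study figures and the familiar Glassian effect size (the treatment-control mean difference scaled by the control-group standard deviation); the only thing that separates the two hypothetical reviews is a judgment about which studies count as rigorous enough to include, yet their summary conclusions diverge noticeably.

```python
# Toy illustration: two meta-analyses of the same (hypothetical) literature that
# differ only in a judgment-laden inclusion rule, and so reach discrepant summaries.
from statistics import mean

# Hypothetical outcome studies: (label, treatment mean, control mean, control SD, rated rigor)
studies = [
    ("A", 54.0, 50.0, 10.0, "high"),
    ("B", 52.0, 51.0, 8.0, "low"),
    ("C", 61.0, 50.0, 12.0, "low"),
    ("D", 50.5, 50.0, 9.0, "high"),
    ("E", 57.0, 49.0, 11.0, "low"),
]

def glass_delta(treatment_mean, control_mean, control_sd):
    """Glass's effect size: the treatment-control mean difference scaled by the control SD."""
    return (treatment_mean - control_mean) / control_sd

def summary_effect(included):
    """Unweighted mean effect size over the included studies."""
    return mean(glass_delta(t, c, sd) for _, t, c, sd, _ in included)

reviewer_1 = studies                                  # includes every study
reviewer_2 = [s for s in studies if s[4] == "high"]   # admits only "rigorous" studies

print(f"Reviewer 1 (all studies):     mean effect size = {summary_effect(reviewer_1):.2f}")
print(f"Reviewer 2 ('rigorous' only): mean effect size = {summary_effect(reviewer_2):.2f}")
```

Nothing in the arithmetic adjudicates between the two inclusion rules; that is exactly where the contested human judgment resides.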

Also concerning policy, it is worth noting briefly that Glass and Kliegl make

inappropriate use of Habermas. They claim “Habermas (1971) argued convincingly that

the knowledge-constitutive interests that determine, in part, the selection of a certain

methodology for science can be derived from the structure and pragmatic needs of the

society in which the science exists” (p. 35). However, Habermas’ (1971) critical-theoretic

analysis of cognitive interests relies uncritically on an inappropriate empiricist theory

of science (a point Habermas himself now concedes). Relatedly, Habermas’ insistence

that our knowledge-constitutive interests are somehow transcendental and a priori is

implausible, and clearly should be anathema to Glass and Kliegl.

This point aside, it is important to stress that methodologies and social institutions do relate to each other in mutually supporting ways. As part of an empiricist conception

of inquiry meta-analysis helps to serve as a prop for our extant social institutions by

providing them with the conceptual resources that help maintain the status quo. One way

meta-analysis reinforces the status quo stems from its acceptance of, and reliance upon,

fragmented literatures spawned by narrow specialist research. An appreciation of the

need for action seldom eventuates from fragmented knowledge. With its characteristic

focus on outcome studies, meta-analysis is simply not capable of giving us broad,

coherent perspectives.

Meta-analysis further reinforces the status quo by restricting its attention to outcome

studies which focus on phenomenal appearances and refrain from considering underlying causes. This willingness to stop short of attempting to tell decent causal stories

contributes to a general inability to regard educational programs and social institutions

more generally as structurally problematic, and results in an absence of knowledge of the

relevant causes which would be the objects of strategic social change.


A third way in which meta-analysis reinforces the status quo stems from the fact that

it is not critically aim-oriented. By willingly accepting clients’ goals, evaluation research

employs meta-analysis as part of an instrumental rationality concerned to devise and

follow efficient means to clients' ends. As such, meta-analytic methodology affords us neither the inclination nor the ability to challenge the goals of clients, programs or social

institutions.

Glass and Kliegl’s (1983) remarks on research policy are consistent with the prevailing

view, which arises from a coupling of empiricist epistemology and liberal political

theory (cf. Unger, 1975). According to this view research policy is an essentially

political reaction to the demands of pressure groups and to the “externalities” of

economic activity. It operates, therefore, as an expedient and instrumentally rational

endeavor. What is needed, by contrast, is a prudent research policy that accommodates

the cognitive dynamics of good science. Such a research policy would be critically

aim-oriented, cognizant of the considerable time often needed to produce well-developed theories, and mindful of the fact that the application of mature theories often involves

further and specific basic mediating research. Of course, a satisfactory realization of

science transformed along these lines would depend crucially upon changes to our

specifically scientific and broader social institutions.

Theory and Integration

It is important to realize that meta-analysis does not integrate research findings in

the constructive sense of combining and systematizing parts into larger wholes. Rather,

it functions as a reductionist enterprise, and in two related ways: it either ignores

explanatory theories that purportedly refer to hidden causal mechanisms, or it factors

out the specifically postulational component of such theories (thus helping to present

the empiricist picture of strict cumulative progress). Clearly, serious attempts to integrate

worthwhile knowledge cannot be achieved statistically through meta-analysis, but instead

will require “qualitative” densely reasoned efforts to construct suitable theories. For, it

is well-structured theories that are the real bearers of significant integrated knowledge.

With its empiricist “anti-theory” bias, meta-analysis neglects two of the most important

vehicles we have for conveying worthwhile knowledge: global theories and postulational

theories.

Global Theories

Cook and Campbell (1979) claim modern philosophers of science such as Popper,

Kuhn and Feyerabend have exaggerated the role of comprehensive theory in advancing

scientific knowledge. However, I believe such an emphasis is part of a general

post-positivist attempt to rectify empiricism’s tendency to concentrate on small-scale

observational theories as cognitive entities separated from other parts of the scientific endeavor such as methodology and metaphysics. The idea of a global theory, as

developed by Hooker (1987), suggestively articulates the structure that fits a good

number of our scientific theories. Global theories (e.g., quantum theory, evolutionary

theory, radical behaviorism) are large-scale theories that are typically composed of a


partial world view, a methodology, and a theory of instruments. They also specify

what is observable, provide a language for data reports, and will often make use of

additional theories deemed necessary for their application. Good global theories exhibit

the systemic feature of integrating their cognitive components into tight conceptual structures. They are, therefore, important realizations of integrated knowledge.

Interestingly, radical behaviorism stands as one of our best examples of a well-

structured global theory. It contains a world view comprising a deterministic, physicalistic

and monistic metaphysics; it also includes an ethic of persons denied freedom, dignity

and moral responsibility and a vision of an ideal society in Skinner’s novel Walden

Two. Its methodology comprises an inductive account of method and an instrumentalist

conception of theories, while its theory of instruments justifies experimental use of

the conditioning chamber and cumulative recorder. Radical behaviorism also specifies

that operant and respondent behavior and their controlling stimuli are the legitimate

observables, and that the language for data reports about such observables is constrained

by an operationally defined terminology. Finally, radical behaviorism makes some appeal

to additional theories in its applications, as for example, the theory of placebos in

evaluating behavior therapy. Although Skinner’s brand of behaviorism will strike

many as unacceptably empiricist, it does, nevertheless, deserve to be regarded as a

well-structured global theory with a high degree of internal coherence. What it does

lack, however, is the important virtue of explanatory coherence, a notion which I will

come to shortly.

Clearly, one better diagnostic alternative to evaluating the quality of outcome studies

through meta-analysis would involve attempts to assess how effectively global our extant

theories are. I expect that many of them would turn out to be minimally global with the

various relevant components being related by little more than conjunction. Attempting

to make such theories effectively global would help us fathom the extent to which the

conjuncts mutually cohere. I have already indicated that I think Fisherian experimental

methodology does not consistently square with the interactive nature of the social reality

it helps us investigate. That we have failed to appreciate this point stems in good part

from the fact that our methodological and ontological commitments largely remain

uncoupled in weakly structured global theories.

Postulational Theories

Postulational, or deep-structural, theories are those theories that purport to refer

to hidden generative mechanisms. Much of the world’s furniture is hidden from our

view; so, if we want to know how things operate rather than settle for an account of

their surface features, then we must fashion deep-structural or postulational theories.

These are the theories that realism advocates and that we need to understand and

effect change. Postulational theories recommend themselves to us because they have

more epistemic virtues than theories about observables, including outcome studies.

According to empiricism, empirical adequacy, or factual support, is the mark of a

good theory. However, scientific realism seeks to determine the general excellence of

theories, and to this end will invoke the various superempirical criteria that have been

mentioned above. Of these, explanatory power, internal and external consistency and


systemic worth in particular, combine to give us some measure of a theory’s explanatory

coherence. In the next section I want to take up the idea that explanatory coherence is

central to theory justification.

Integration as Explanatory Coherence

Foundationalism is the historically influential theory of epistemic justification that

claims we are justified in believing those theories that are appropriately related to a

privileged source. Empiricist researchers characteristically take observational data to be

that source for the empirical sciences and rely almost exclusively on empirical adequacy

as the measure of good theory. Foundationalist epistemologies are now widely rejected

and coherentist justification has emerged as an attractive alternative (BonJour, 1985;

Williams, 1977). Briefly, coherentism maintains that a belief is justified in virtue of

its coherence with other accepted beliefs. One contemporary version of coherentism,

explanationism, asserts that coherence is determined by explanatory relations and

that all justification aims at maximizing the explanatory coherence of belief systems

(Lycan, 1988).

Today the major challenge to explanationism comes from reliabilism, which asserts

that a belief is justified to the extent that it is acquired by reliable methods (Goldman,

1986). However, I believe with Thagard (1989) that both explanationism and reliabilism

can fruitfully be combined within a broad coherentist theory of justification and that such

a theory has positive application in empirical research contexts. In this final section of

the chapter I want to show how such a theory of justification can operate in social science

research. The attainment of explanatory coherence is an important, but neglected part

of research integration which simultaneously contributes to the unity and justification

of knowledge.

Retroductive Method

In order to show how explanatory coherence can be implemented at a methodological

level, I need to outline the theory of scientific method that serves as my orienting framework. This theory of method is essentially retroductive in character and claims

that science often does, and should, proceed through a number of phases: regulated

by a developing problem comprising a set of empirical and conceptual constraints

(Haig, 1987; Nickles, 1981), relevant observed data are obtained, and are then

analyzed for potentially interesting patterns (Tukey, 1980). Once established, these

data patterns are explained by postulating the existence of an underlying causal

mechanism, through a retroductive, theoretical reasoning process (Curd, 1980). From

a positive judgment of the initial plausibility of such an existential hypothesis, attempts

are then made to elaborate on the nature of that mechanism, often by way of constructing

plausible models from an appropriate and familiar source (Harré, 1976). Partly because theories are underdetermined by the relevant data, subsequent attempts to test the

developed theory against its competitors will not in general be decisive (cf. Harding,

1976) and theory appraisal will have to be undertaken on dimensions in addition

to empirical adequacy. To repeat, these superempirical features will include initial


plausibility, fertility, explanatory depth, unifying power, internal and external coherence

and practical efficacy (Churchland, 1985; Kuhn, 1977). It should be noted that this

retroductive account of method gives explicit attention to, and assigns major importance

to, the three contexts of theory generation, theory development and theory appraisal.

Coherence Justification and Theory Generation

In indicating how a broad conception of coherence justification can operate in the

prosecution of social science research I shall largely confine myself to the retroductive

method’s context of theory generation. As I have just noted, the basic goal in this

context is to generate plausible theory to explain significant data patterns. The data

analysis phase itself is appropriately viewed as a two-stage compound affair where the

patterns thrown up by exploratory data analysis are critically checked through the use of

confirmatory data analysis procedures (Tukey, 1980). Exploratory data analysis involves

descriptive, and frequently quantitative, detective work designed to reveal patterns or

structure in the data sets under scrutiny, such data often being displayed visually in

(semi-) graphical form. It is important that researchers give extended attention to this

exploratory phase, because, in securing a heavy information yield from our data, they

are more likely to throw up the provocative data patterns that occasion the need for

explanatory theory.

Although carefully conducted exploratory analyses may, in addition to suggesting data

patterns, actually carry some measure of validity (thus calling into question the standard

exploratory/confirmatory contrast), it will normally be appropriate to check on our

emergent data patterns through use of confirmatory data analysis procedures. Computer

intensive resampling methods such as the jackknife, the bootstrap and cross-validation

(Diaconis & Efron, 1983; Efron & Gong, 1983) constitute one important set of

confirmatory procedures that fits well within a coherentist framework. By exploiting

the massive computational power of the modern high-speed computer, these methods

free us from the restrictive assumptions of normal statistical theory and permit us to

gauge the reliability of chosen statistics by making something like a billion calculations

on, say, 50 data points. The jackknife, for example, is an important attempt to establish

the accuracy of a computed estimate of some quantity of interest such as a mean, or a

standard deviation, or a correlation. It proceeds by removing one observation at a time

from the original data set and recalculating the statistic of interest for each of the reduced

data sets. The variability of the statistic across all the truncated data sets can then be

described, giving us an empirically obtained measure of the reliability or stability of

the original estimate.
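Because the jackknife is described procedurally here, a minimal sketch may help fix the idea. The Python fragment below uses illustrative data and the sample mean as the statistic of interest; it simply follows the leave-one-out recipe just outlined and reports the spread of the recomputed estimates as a standard error for the original estimate.

```python
# Minimal jackknife sketch: delete one observation at a time, recompute the chosen
# statistic on each reduced data set, and summarize the variability of those
# recomputed values as a standard error for the original estimate.
import math

def jackknife_standard_error(data, statistic):
    """Estimate the standard error of statistic(data) by leave-one-out resampling."""
    n = len(data)
    leave_one_out = [statistic(data[:i] + data[i + 1:]) for i in range(n)]
    mean_loo = sum(leave_one_out) / n
    # Standard jackknife variance: ((n - 1) / n) * sum of squared deviations.
    variance = (n - 1) / n * sum((v - mean_loo) ** 2 for v in leave_one_out)
    return math.sqrt(variance)

sample = [12.1, 9.8, 11.4, 10.2, 13.0, 9.5, 10.9, 11.7]   # illustrative data only
sample_mean = sum(sample) / len(sample)

print(f"Original estimate of the mean: {sample_mean:.2f}")
print(f"Jackknife standard error:      {jackknife_standard_error(sample, lambda d: sum(d) / len(d)):.2f}")
```

The bootstrap and cross-validation mentioned alongside it proceed in the same spirit, substituting resampling with replacement, or repeated data-splitting, for leave-one-out deletion.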

The important point to be made about reliability checks like those afforded by the

jackknife is that they can be made part of a coherentist approach to justification, where

the coherence is provided by consistency of test outcomes. Here the reliability checks

on our suggested data patterns constitute a validating strategy. Our willingness to accept the results of such data analyses is in accord with what Thagard (1989) calls “the principle

of data priority”. This principle asserts that statements about observational data have a

degree of acceptability on their own. Such claims are not indubitable, but they do stand

by themselves better than claims justified solely in terms of what they explain. What


justifies the provisional acceptance of data statements is that they have been achieved

by reliable methods. What justifies our provisional belief in the patterns thrown up by

exploratory data analysis is their reliabilist confirmation through use of computer based

consistency tests.

It is important to stress here that the justification of belief in such patterns is

heuristic and forward looking: we provisionally accept such beliefs for purposes of

constructing a theory that will explain the data patterns, but our knowledge of the

data patterns will receive the further required justification if and when they enter

into the explanatory relations of the theory that contains them. According to our

coherentism which links explanationism and reliabilism, data statements are not fully

acceptable without being linked to plausible explanatory propositions. Here I should

state that, even though reliabilism embraces the principle of data priority, it is not a

modest foundationalist position. This is because foundationalism insists that justificatory

relations are uni-directional (going from basic propositions to nonbasic propositions),

whereas the strategy of explanatory justification just outlined is bi-directional and

mutually enhancing.

The analysis and confirmation of striking data patterns provides a natural stimulus to

the generation of new theory which helps explain why the data pattern as they do. Now,

the type of ampliative inference involved in the generation of new explanatory theory

is not inductive, but retroductive. Retroductive reasoning may be characterized briefly

as follows: some observations are encountered which are surprising because they don’t

follow from any accepted hypothesis; we come to notice that those observations would

follow as a matter of course from the truth of a new hypothesis in conjunction with

accepted auxiliary claims; we therefore conclude that the new hypothesis is plausible

and thus deserves to be seriously entertained and further investigated. This typical

depiction of retroductive reasoning focuses on its logical form and, as such, is of

limited value unless it is conjoined with a set of regulative principles which enable

us to view retroduction as a pattern of inference, not just to any explanations, but to

the most plausible explanations.
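The inference pattern just characterized can be set out schematically as follows, with C standing for the surprising observations, H for the newly conjectured hypothesis, and A for the accepted auxiliary claims; this is simply a bare rendering of the verbal description above.

```latex
% Schematic form of the retroductive inference described in the text.
\begin{itemize}
  \item[(1)] $C$ is observed, and $C$ is surprising: it does not follow from any accepted hypothesis.
  \item[(2)] If $H$ were true then, in conjunction with the accepted auxiliary claims $A$, $C$ would follow as a matter of course.
  \item[(3)] Hence $H$ is initially plausible, and deserves to be seriously entertained and further investigated.
\end{itemize}
```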

These regulative principles, which function as constraints within our developing research

problem, will be variously empirical, metaphysical and methodological in nature. In

judging the initial plausibility of our new theories we are in effect making prospective

judgments about their pursuit-worthiness, and we do this in the first instance by fathoming whether our theories are the products of sound retroductive reasoning.

Here I want to press the point that explanatory coherence is central to the judgments of

the initial plausibility of our retroductively obtained theories. According to the program

of explanatory coherence that I have adopted, the explanatory component of a theory

coheres with the data component if the former explains the latter. In this regard, our

explanatory theories will normally be expected to explain more than one data pattern or

piece of empirical evidence. A theory which explains two different data patterns or types

of data is said to be consilient, and a theory which explains three classes of facts is judged

more consilient, and so on (Thagard, 1989). In this way consilience captures the idea of

explanatory breadth, which is an important aspect of explanatory coherence. In fact, the explanatory coherence account of justification I have adopted employs the notion

of consilience to capture the important methodological principle that a theory is better

supported, the wider the variety of evidence for it. And, in general, the more consilient

a theory is, the greater its initial plausibility will be.
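As a toy illustration of consilience as explanatory breadth (a hypothetical scoring scheme offered for exposition only, not Thagard's own ECHO program), one can picture candidate theories being ranked by the number of distinct classes of evidence they explain.

```python
# Toy illustration: consilience as explanatory breadth, scored by counting the
# distinct evidence classes a candidate theory explains. Hypothetical example only.
evidence_classes = {
    "E1": "pattern in achievement outcome data",
    "E2": "pattern in classroom observation data",
    "E3": "pattern in teacher self-report data",
}

# Which evidence classes each (hypothetical) candidate theory explains.
explains = {
    "Theory A": {"E1"},
    "Theory B": {"E1", "E2"},
    "Theory C": {"E1", "E2", "E3"},
}

def consilience(theory):
    """Count the distinct evidence classes the theory explains."""
    return len(explains[theory])

# Rank the candidates by explanatory breadth.
for theory in sorted(explains, key=consilience, reverse=True):
    print(f"{theory}: explains {consilience(theory)} of {len(evidence_classes)} evidence classes")
```

On this crude picture the third theory is the most consilient and, other things being equal, would be judged the most initially plausible.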


In showing how explanatory coherence justification applies in the context of theory

generation, I have ignored the contexts where theories are developed and subsequently

subjected to concerted appraisal. It should be made clear, however, that Thagard’s

(1989) program for implementing judgments of explanatory coherence works through inference to the best explanation as it applies to the evaluation of mature theories.

“Inference to the best explanation” is a useful expression that covers the protean

inference involved in the multi-criterial appraisal of mature theories. By focusing on

theory generation my concern has been with inference to one (or more) plausible

explanations, not inference to the best explanation. Finally in this section I note that, to

be consilient, a theory must be postulational. That is, it must invoke causal mechanisms

in order to have a chance of explaining the various data displays it attempts to unify.

Because empiricist observational theories seldom embrace the causal mechanisms that do

the crucial explanatory work, they are generally unable to achieve an acceptable level of

consilience. Outcome studies, meta-analyzed or not, also suffer from this deficiency.

Conclusion

The currently popular practice of meta-analyzing outcome studies has been called into

question in this chapter. My critique has focused on Glassian meta-analysis because,

in addition to being widely used in educational and psychological research, it boasts

a better developed methodology than other approaches to meta-analysis. However,

Glass’ methodology is deficient in many respects, principally because it embraces

an unacceptable empiricist conception of both science and evaluation. I have also suggested that the poverty of Fisherian outcome studies renders their meta-analyses, in whatever form, a pointless research endeavor. The alternative thesis of this paper is

that well-structured theories, rather than meta-analyzed domains, are the real bearers

of significant integrated knowledge. Consistent with Walker and Evers’ unity of research

thesis, I have sought to show how coherence justification can fruitfully be applied in

the course of generating empirically-based explanatory theories. The brand of scientific

realist philosophy adopted in this chapter takes explanation and unity to be cardinal

epistemic goals. With this in mind I have recommended linking these twin virtues in an

explanatory coherentist theory of knowledge justification.

References

Bangert-Drowns, R. L. (1986). Review of developments in meta-analytic method. Psychological Bulletin, 99, 388-399.
Bhaskar, R. (1978). A realist theory of science (Second edition). Sussex: Harvester Press.
BonJour, L. (1985). The structure of empirical knowledge. Cambridge, MA: Harvard University Press.
Bruno, J. E. & Ellett, F. S. (1988). A core analysis of meta-analysis. Quality and Quantity, 22, 111-126.
Churchland, P. M. (1985). The ontological status of observables: In praise of superempirical virtues. In P. M. Churchland & C. A. Hooker (Eds.), Images of science (pp. 35-47). Chicago: University of Chicago Press.
Cook, T. D. & Campbell, D. T. (1979). Quasi-experimentation. Chicago: Rand McNally.
Cook, T. D. & Leviton, L. C. (1980). Reviewing the literature: A comparison of traditional methods with meta-analysis. Journal of Personality, 48, 449-472.
Curd, M. V. (1980). The logic of discovery: An analysis of three approaches. In T. Nickles (Ed.), Scientific discovery, logic and rationality (pp. 201-219). Dordrecht: Reidel.


De Waele, J.-P. & Harré, R. (1979). Autobiography as a psychological method. In G. P. Ginsburg (Ed.), Emerging strategies in social psychological research (pp. 177-209). Chichester: Wiley.
Diaconis, P. & Efron, B. (1983). Computer-intensive methods in statistics. Scientific American, 248.
Efron, B. & Gong, G. (1983). A leisurely look at the bootstrap, the jackknife, and cross-validation. American Statistician, 37, 36-48.
Erwin, E. (1984). Establishing causal connections: Meta-analysis and psychotherapy. Midwest Studies in Philosophy, Vol. 9 (pp. 421-436). Minneapolis: University of Minnesota Press.
Eysenck, H. J. (1984). Meta-analysis: An abuse of research integration. Special Education, 18, 41-59.
Feyerabend, P. K. (1978). Against method. London: Verso.
Glass, G. V. (1972). The wisdom of scientific inquiry on education. Journal of Research in Science Teaching, 9, 3-18.
Glass, G. V. (1976). Primary, secondary and meta-analysis of research. Educational Researcher, 5, 3-8.
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills: Sage.
Glass, G. V. & Kliegl, R. M. (1983). An apology for research integration in the study of psychotherapy. Journal of Consulting and Clinical Psychology, 51, 28-41.
Goldman, A. I. (1986). Epistemology and cognition. Cambridge, MA: Harvard University Press.
Gottfredson, G. D. (1984). A theory-ridden approach to programme evaluation. American Psychologist, 39, 1101-1112.
Haack, S. (1974). Deviant logic. Cambridge: Cambridge University Press.
Habermas, J. (1971). Knowledge and human interests. Boston: Beacon Press.
Hacking, I. (1983). Representing and intervening. Cambridge: Cambridge University Press.
Haig, B. D. (1987). Scientific problems and the conduct of research. Educational Philosophy and Theory, 19, 22-32.
Harding, S. G. (Ed.) (1976). Can theories be refuted? Dordrecht: Reidel.
Harré, R. (1976). The constructive role of models. In L. Collins (Ed.), The use of models in the social sciences (pp. 16-43). London: Tavistock.
Hooker, C. A. (1987). A realistic theory of science. Albany: State University of New York Press.
Kitcher, P. (1983). The nature of mathematical knowledge. Oxford: Oxford University Press.
Kuhn, T. S. (1970). The structure of scientific revolutions (Second edition). Chicago: University of Chicago Press.
Kuhn, T. S. (1977). The essential tension. Chicago: University of Chicago Press.
Lewontin, R. C. (1974). The analysis of variance and the analysis of causes. American Journal of Human Genetics, 26, 400-411.
Light, R. J. & Smith, P. V. (1971). Accumulating evidence: Procedures for resolving contradictions among different research studies. Harvard Educational Review, 41, 429-471.
Lycan, W. G. (1988). Judgement and justification. Cambridge: Cambridge University Press.
Maxwell, G. (1975). Induction and empiricism: A Bayesian-frequentist alternative. Minnesota Studies in the Philosophy of Science, Vol. 6 (pp. 106-165). Minneapolis: University of Minnesota Press.
Meehl, P. E. (1967). Theory-testing in psychology and physics: A methodological paradox. Philosophy of Science, 34, 103-115.
Meehl, P. E. (1970). Nuisance variables and the ex post facto design. Minnesota Studies in the Philosophy of Science, Vol. 4 (pp. 373-402). Minneapolis: University of Minnesota Press.
Meehl, P. E. (1978). Theoretical risks and tabular asterisks: Sir Karl, Sir Ronald, and the slow progress of soft psychology. Journal of Consulting and Clinical Psychology, 46, 806-834.
Nersessian, N. J. (1984). Faraday to Einstein: Constructing meaning in scientific theories. Dordrecht: Martinus Nijhoff.
Nickles, T. (1981). What is a problem that we might solve it? Synthese, 47, 85-118.
Oakes, M. (1986). Statistical inference. Chichester: Wiley.
Prioleau, L., Murdock, M., & Brody, N. (1983). An analysis of psychotherapy versus placebo studies. The Behavioral and Brain Sciences, 6, 275-310.
Putnam, H. (1979). Philosophical papers, Vol. 2. Cambridge: Cambridge University Press.
Quine, W. V. (1953). From a logical point of view. Cambridge, MA: Harvard University Press.
Slavin, R. E. (1984). Meta-analysis in education: How has it been used? Educational Researcher, 13, 6-15.
Smith, M. L., Glass, G. V., & Miller, T. I. (1980). The benefits of psychotherapy. Baltimore: Johns Hopkins University Press.
Sober, E. (1984). The nature of selection. Cambridge, MA: MIT Press.
Thagard, P. (1989). Explanatory coherence. The Behavioral and Brain Sciences, 12, 435-502.
Tukey, J. W. (1980). We need both exploratory and confirmatory. American Statistician, 34, 23-25.
Unger, R. M. (1975). Knowledge and politics. New York: The Free Press.
Urbach, P. (1985). Randomization and the design of experiments. Philosophy of Science, 52, 256-273.


Walker, J. C. & Evers, C. W. (1988). The epistemological unity of educational research. In J. P. Keeves (Ed.), Educational research, methodology and measurement: An international handbook (pp. 28-36). Oxford: Pergamon Press.
Williams, M. (1977). Groundless belief. Oxford: Blackwell.

Biography

Brian D. Haig is Senior Lecturer in the Department of Psychology at the University

of Canterbury, New Zealand. His research interests cover Research Methodology,

Science Education, and Philosophical Psychology. His work appears in journals such

as Educational Philosophy and Theory, the Journal of Research in Science Teaching, Philosophy of Social Science, and the Bulletin of Peace Proposals.


D. T. CAMPBELL

other. I shall come back to this later. What I first want to note is that there are also

hermeneutic doctrines of method, and that these doctrines overlap heavily with the

actual practices of the physical sciences, and with doctrines of method and epistemology

in non-foundationalist post-positivist (but science-admiring) philosophers of science such

as Quine and Hesse.

To make this point, one must distinguish between validity-seeking hermeneutics and

what I will call “ontologically nihilistic” hermeneutics. Let me characterize these (with

a confession that I joust best with over-clarified straw men or, as Weber called them,

“ideal types”). The ontologically nihilist hermeneutists worship creative novelty in

interpretation as an end in itself, banning disputation as to which are more valid.

The validity-seeking hermeneutists regard such creative novelty as a means to better

interpretations, to be achieved by disputation within an interpretive community, using

coherence arguments and other hermeneutic principles. The hermeneutic ontological

nihilists argue with great sophistication that the concept of truth is incoherent,

and on this basis have concluded that the goal of truth, and argumentation about

comparative plausibility of competing interpretations, should be given up. I do not

mean to ridicule this epistemology. Traditional empiricist/logicist philosophers of

science are also in agreement that one cannot compare beliefs with the referents

of those beliefs (e.g., the Ding an Sich) as a truth test. Instead, one must compare

beliefs with beliefs (perhaps presuming privileged status for perceptually generated

beliefs). This also holds for non-foundationalist post-positivists such as Quine and

Popper, who advocate a correspondence meaning for the concept of “truth”, but

who also emphasize the unavailability of correspondence as a truth test for specific

beliefs (e.g., Quine/Duhem equivocality). They also concede to the skeptics that for

referential descriptive knowledge, the requirements of “fully justified” and “known-to-be-true” belief cannot be met. But they, and the validity-seeking hermeneutists, advocate

continuing the presumptive, fallible, dialectical, search for validity, a search which also

characterizes ordinary perception and learning.

As validity-seeking hermeneutists, I identify Schleiermacher, Dilthey, Weber,

Habermas (I have found a few paragraphs and a footnote in his 1983, p. 251-61, and

note 8, particularly useful), and Geertz (1973, 1983). From my frustrating samplings,

I have judged Gadamer and Ricoeur as, in net, ontological nihilists, but am not

prepared to cite chapter and verse. (Perhaps this is an ideal type with no occupants, not even the “paradigms theorists” of education and the social sciences.)

The hermeneutic circle. The methodological tactic most recurrently cited is the

“hermeneutic circle” (or cycle, or spiral), which I have tried to make sense of as

“part-whole iteration” (Campbell, 1988, pp. 478, 505-507), developed in the fallibilist,

conjectural domain of deciphering ancient texts for which no dependable dictionaries

were already available. A guess as to the purpose and meaning of the whole guides

guesses as to the interpretation of specific words, which guesses, if tentatively trusted,

lead to a revision of the guess at the whole, which again leads to revisions of translations

of the parts, etc. The non-foundationalism of this process is dramatized when an

interpretative community decides that a particular word is a copyist's error, and that they, the interpretive community, know better the original author's intention than does the

copyist’s text, or even that the author made a slip of the pen in a hypothetical “first” text.

The hermeneutic circle, and all hermeneutic methods, are specific forms of coherence-

maximizing strategies, properly heralded in the present essays as the new consensus


principle of mainstream Anglo-American post-positivist philosophy of science. I believe

that it has emerged from a dialectical process of mutual criticism within that tradition,

uninfluenced by continental philosophy. But the continental heirs of Hegel had this

emphasis first.

It was through long ago conversations with my philosopher friend at Northwestern,

Errol Harris, that I learned the Hegelian emphasis on coherence, and recognized it as

what Quine (in the last 15 paragraphs of “Two Dogmas”, 1951; Campbell manuscript,

1990) and I (1966) were employing. But for Harris’s Hegel it also went beyond a strategy

for knowing to a definition of “truth” with the contrast pair being a coherence definition

of truth versus a correspondence definition. For those cited by Evers and Walker (see

also Lehrer, 1974, 1990), the contrast pair is coherentism versus foundationalism. While

it is the foundationalists who most clearly represent a correspondence definition of

truth (and perhaps even the foolish faith that it can be implemented as a truth test

for some specific beliefs), it seems to me that our coherentist tradition is compatible

with a correspondence goal of truth (and hence a correspondence definition of the

meaning of the term “truth”) even though our epistemology assures us that we will

never meet it. Certainly defenders of Tarski’s version of the correspondence definition,

such as Popper and Quine, have never assumed its availability as a truth-test for specific

beliefs. We accept a quasi-Hegelian surrogate goal of increasing coherence even if we

regard this as merely our best available symptom of truth (particularly if the cumulated

evidence has been produced by scientific disputation involving competing hypotheses

about “reality”. Similarly for van Fraassen’s “empirical adequacy” as a goal [Paller &

Campbell, 1989]).

I feel sure that in that vast literature there are many other hermeneutic principles, but

I do not recollect encountering such a list. (This is a request for help.) I will provide a

few more here. Congruent with Evers’ observations, I will embed these in texts from

Quine. But I regard these Quine quotes as brilliant ambivalent outbursts from his faith

in logic (a faith which logic itself has undermined), and feel sure that, overall, they are

fully consistent with the older, validity-seeking hermeneutics. And where I differ from

Quine (the 10 per cent of Quine which I have read), I feel that the direction of my

difference is in a quasi-hermeneutic direction. (Not being erudite about hermeneutics, I

am in danger of over-identifying myself with it. One hermeneutic tendency I clearly reject

is a “text-foundationalism”, a tendency to limit one's interpretative goals and permissible

resources to a specific text, excluding beliefs about the author, his audience, and his

times. I am using the term metaphorically for principles in validity-seeking scholarship,

shared by humanities and sciences.)

Omnifallibilist trust. Our predicament as knowers is such that, in improving the validity

of our beliefs, we have no other strategy available than to trust the great bulk of our

beliefs while we revise a small subset of them. But none of our beliefs are foundational.

All are potentially open to revision. What substitutes for foundations is the bulk of our

other relevant beliefs, none of them individually foundational. Our goal in revisions is

to increase coherence.

Quine's repeated reference to Neurath's boat (or raft) which must be repaired while afloat at sea epitomizes this perspective. “Our boat stays afloat because at

each alteration we keep the bulk of it intact as a going concern” (Quine, 1960, p.

3). And from the precious last 15 paragraphs of “Two Dogmas”, some illustrative

fragments: “Our natural tendency [is, and should be] to disturb the total system as


little as possible. ” “No statement is immune to revision.” “Even . . . the logical law

of the excluded middle.” I have epitomized this Quinean message as the “doubt-trust

ratio” (Campbell, 1988, pp. 363-365, 477-482) estimating usually to be 99% trust to 1%

doubt (1988, p. 318), although dropping to 85% or 90% trust in a scientific revolution in astronomy or physics (1988, p. 482). (The slogan “omnifallibilist trust” has been

introduced in Cook & Campbell, 1986.)

Pattern matching. All of our observations are both fallible in principle and have some

inevitable imperfection in practice. The beliefs and theories we relate to our observations

are incomplete and oversimplified. The impressive degree of congruence between them

found in the best of science cannot have been achieved by a perfect, foundational

relationship at any point. Instead it is achieved by a pattern matching, spreading the

fringe of imperfection over all of the presumed points of contact between observation

and belief or theory (Campbell, 1966). Identification by “feature detection” employs a

local pattern matching, and does not negate this universal hermeneutic principle.

Increasing correspondence with increasing scope. In the hermeneutics of translations, a

word in isolation is less confidently read than a word embedded in a sentence. Still better,

if it be embodied in a paragraph, or a book or a literature. All words are equivocal,

polysemous, indexical to some extent. Context is needed, and the larger the context the

better. From recent Bible translation comes the example of a Hebrew word which occurs

only once in the Pentateuch, and whose reading has been under dispute. In the last 50

years, there have been discovered cuneiform libraries of a pre-Hebraic closely related

Semitic language, Ugaritic, in which a cognate word is frequently enough used to have

provided enough context so that the scholars have achieved a working consensus on its

interpretations. This consensus is now being used to achieve a consensus on the orphan

word in the Biblical text.

To Quine is attributed a “holism” which amounts to total suspension of consensus

formation on the relative validity of beliefs. This is both criticized, and used to legitimate

an ontologically-nihilistic relativism. I find this a wrong reading. Here is the crucial

paragraph from “Two Dogmas” (Quine, 1951, p. 39):

Russell's concept of definition in use was . . . an advance over the impossible term-by-term empiricism

of Locke and Hume. The statement, rather than the term, came with Russell to be recognized as

the unit accountable to an empiricist critique. But what I am now urging is that even in taking the

statement as unit we have drawn our grid too finely. The unit of empirical significance is the whole of science.

My gloss on this, which Quine accepts, is as follows: in practice, science is never going

to be mapped in an apodictic or foundational way to empiricism, nor to observations

(whether in sense data terms or a language of ordinary objects) nor to “stimulations”.

Nonetheless the fallible and underdetermined match between science and the empirical

steadily increases as science advances, and as broader samples of science are employed

in the context of interpreting specific scientific statements. The empirical accountability

of science improves as one moves from “term by term” to “statement by statement”,

i.e., the “statement as the unit of accountability” was an improvement over the

“term-by-term empiricism”, but still not an encompassing-enough unit. Larger units

of science (theories for specific domains, multi-domain integrated theories, etc.) would

be still better and, by metaphoric extension, “the whole of science” best of all. This

metaphoric extension refers (a la Peirce’s conceptualization of truth) to “the whole of


an eventually perfected, totally comprehensive, integrated science”, i.e., to something

Quine does not expect ever to be completely realized, and certainly not now available

to the “more thorough pragmatist”. That metaphoric asymptote: “the whole of science” is

overly emphasized by Quine's critics. What is important is the directionality it dramatizes: increasing correspondence with increasing scope. This principle applies whether that

correspondence be interpreted as between scientific beliefs and “stimulations” or

between beliefs and “observations of objects and events”. Indeed, it is a very general

principle of cognition: the stimulation of a single retinal cell is nearly totally equivocal as

to correspondence with an object posit. A pattern of stimulation of a hundred thousand

retinal cells may leave such little equivocality that only a skeptical philosopher would

notice. The normative rule might be made explicit as: to improve the fit of belief to [stimulations] [observations] [the external world], expand the scope of both beliefs and [observations].

Partial, proximal revision. The web of belief, or the coherence of texts, is not so

tightly related that revision of one part requires the revision of all. We do, and should,

revise those nearest in the web of belief, nearest in terms of implication steps. I do

not have space here to document Quine’s recognition and normative recommendation

of piecemeal revision in those 15 paragraphs, but will share with those interested my

unpublished 1990 “Exegesis”. I have tried to capture this aspect of his recommended

strategy under the concept of “the ramification-extinction of plausible rival hypotheses”

(1988, pp. 518-519). That is, we consider the extended implications in our network of

beliefs for each of the rival interpretations of a specific set of data, and find “implausible”

and hence “rule out” those hypotheses whose ramifications are discordant with other

beliefs beyond the immediate focus which we are inclined to trust.

Fallibilist privileging of observations and core. The above hermeneutic principles, and

the “principle of charity” to follow are, I believe, clearly compatible with the validity-

seeking hermeneutic tradition, as well as being Quine’s recommendations as to how to

proceed in science. The present one may be specific to science. All through those last 15

paragraphs of “Two Dogmas” are references to a “periphery” of empirical observations,

stubborn or “recalcitrant” ones which motivate readjustments in the “interior”. While

none of these are infallible (we may interpret them away as hallucinations or the meter

being out of calibration, etc.), it is clear that Quine, as an empiricist, advises us to be

more reluctant to discount them than the theories which relate them. While the textual

evidence is less clear, I believe that he regards very central beliefs, such as the belief in knowable order or the laws of logic, as also relatively privileged. Normatively, we should be

less willing to revise these (periphery and core) than the intermediate theoretical beliefs.

I suspect that the coherence theorists here assembled share this privileging of center and

periphery, and recommend putting most of the coherence-improving readjustments into

the intermediate theoretical beliefs.

The Principle of Charity. This hermeneutic principle advises us to attribute to the

author of the text a shared humanity. We prefer that tentative translation which makes

most of his or her beliefs “true” or “rational”, qualified by consideration of the

different environment and historical period in which the author lived. Especially clearly in Habermas, we privilege the assumption that the author is trying to communicate, and

is honest (except for partisan biases we too share, or corrupting social predicaments

that preclude an “ideal speech community”). This particular name for this principle

has been popularized by Quine’s great hermeneutics of translation, Word and Object


(1960) (he attributes it to Wilson). It is a mainstay of Donald Davidson’s philosophy,

but he mistakenly denies any frame-of-reference relativity, attributing to all others his

own perfect rationality and implicitly the same background beliefs. Mary Hesse (1980,

pp. xviii, 160,161) has specifically identified the principle of charity with the hermeneutic

tradition.

We can envisage physical scientists as part of a hermeneutic community using all of

the principles but “charity” in their interpretation of laboratory data about inanimate

objects. They do, however, use the principle of charity in their efforts to understand the

cryptic and elliptical writings of their fellow scientists. We social scientists too should

apply this principle to the writings of our fellow scientists studying human social behavior.

We should apply it also in trying to understand what the hermeneutists reviewed by

Phillips are trying to say. I go further and join the hermeneutists in recommending that

we systematically extend this principle to those fellow humans we study. Would Denis

Phillips disagree? I believe that we can devise experiments and data collection procedures

to test the principle of charity in this application. By speaking about the meanings of

objects in our shared experiential space we can induce in fellow humans behavioral

dispositions which we, as behavioral researchers, might not be able to distinguish

from non-verbally induced conditioned responses. And if we asked “what have you

learned” after the conditioned-response experiments, might not human subjects reply in

terms of meanings about objects in their experiential environments, rather than reciting

which muscle to twitch after which sense-receptor stimulation? (See Campbell, 1963,

1988, pp. 94-146, for more detail on this perspective.) Tolman applied the principle

of charity to his white rats. He, and Watson’s renegade student Lashley, produced the

research which Merleau-Ponty (1963) cites against behaviorism, but wrongly arguing

against doing psychology in a natural-scientific perspective. In contrast, I (1953) have

summarized this same literature as a series of crucial experiments showing that most

animals, and most of one sample of humans, learn about objects in the world rather

than which muscles to contract when. Cannot we similarly operationally “delineate”

(not “define”) the claims about human beings Phillips’ hermeneutists have made, and

put them to scientific test?

There is another convergence between Quine and the “paradigms theorists” or

“ontologically-nihilist hermeneutists” which must be noted. Quine has a slogan about

the radical translation situation: “There is no truth of the matter.” Not only is it

possible that we will not be able to choose between two translations, we may be wrong

in assuming that there is a singular correct translation to strive for. Similarly for theories

of physics. We should anticipate the possibility of irresolvable rivalry between complete

theories. (One can get into his writings on such issues in his recent, 100 page Pursuit of

Truth, 1990.) But note that for Quine, these are asymptotic projections of fundamental

“indeterminacies” or “underdeterminations” (he uses both terms, we might prefer the

latter). Overall, he clearly recommends that we proceed with the task of trying to find

empirical predictions on which our competing hypotheses of translation or scientific

theories differ. He believes that we have made progress toward resolving these in

the past in both fields. But as a scrupulous logician, he wants us to remember that

completion of the task of eliminating all rival theories is not guaranteed, and that proof

that we have done so is logically impossible.

For me (and possibly for Quine), there is a double-hermeneutics for the translation

situation which gives a special justification to the “no truth of the matter” slogan.


The conditions of language learning for children, and for adults as new words are

introduced into the language, preclude perfect consensus on word meanings within

the core linguistic community. Ostensive illustrations are essential, but these are

far from definit ive, instead are equivocal as both Quine and Wittgenstein have noted.The ostensive-instance-sets of the language learner are never sufficient to insure

conversion to identical meanings (Campbell, 1973). This means for me that Quine

should acknowledge a fourth indeterminacy, in an individual’s learning of the meanings

of the words, to recognize that this too (although very well done) is also an indeterminate,

underdetermined, hermeneutic process. (Instead, his renewed emphasis on linking of

“stimulations” to “holophrastic sentences” in Pursuit of Truth, seems to me too

much like the earlier atomistic foundationalism of “sense data”, which even the

logical positivists themselves gave up for “protocol sentences in the ordinary object

language. “)

Quine’s setting-out these limitations to complete knowledge, this support for the

technical arguments of the skeptics, is not accompanied by a paralyzed defeatism

about the achievement of improved belief or unified theory. Instead, he describes

(particularly in those last 15 paragraphs of “Two Dogmas”) how we do, and how we

should, proceed in spite of our predicament as knowers. He is an advocate of science’s

past achievements and future prospects. This, as I see it, is akin to the stance of the

validity-seeking hermeneutic tradition.

Paradigm, Linguistic, and Cultural Solipsisms. The hermeneutic processes of knowing so far described support a relativism of cultures, languages, and paradigms. Of course, if groups start out with differing sets of beliefs they will be prone to making different coherence-enhancing belief revisions, and will doubt the validity and rationality of others and the consensuses of other believing communities. In science, for example, Britain's most eminent physicist of the 1870s, Kelvin, could not exclude his Christian faith from the coherence set he used in evaluating Darwin. Scrupulously using the very best physics of the day, he proved that the earth was too young for the processes Darwin described to have taken place. Since then, the limited interpretive horizons provided by physics, geology, and biology have changed, and the most devoutly Christian among the most eminent physicists no longer reject Darwin in their coherence-maximizing efforts.

Anthropologists place Kelvin and Darwin in "the same" culture. If there was this much perspectival relativism in spite of their overwhelmingly shared, culturally given belief-sets, how much more should be expected, and is indeed found, when anthropologists have studied more exotic cultures. (Up to my limited investment, I am a card-carrying cultural relativist [Campbell, 1972; Segall, Campbell, & Herskovits, 1966].) Now, had these anthropologists been complete incommensurabilists akin to the stereotype of the "paradigms theorists" controverted by the present essays, they would loyally have reported that each exotic culture was utterly incomprehensible. Instead, they uniformly reported that the longer they lived with and studied an exotic people, the more humanly reasonable those people seemed. And they convinced readers of the lesson of cultural relativism by plausibly presenting exotic world views in comprehensible and sympathetic terms.

Kuhn himself, the paradigmatic "paradigms" theorist, has been clear in recent writings (1976, 1983, 1990) that he never intended to deny that members of one paradigm could learn another, nor to deny that in the history of science they had regularly done so, and had made reasonable (if unproven) choices between paradigms they comprehended. (One would, of course, expect first-paradigm biases, akin to the phonetic, grammatical, and semantic "accents" of second-language learners.) Kuhn in his original presentation (1962) also acknowledged sub-specialty and individual differences in the "shared" paradigm.

There are perhaps some in the relativist, social-constructionist sociology of science ("sociologists of scientific knowledge", in the "sociology of knowledge" tradition) who approach total paradigm incommensurability, or "paradigm solipsism". They endorse a linguistic solipsism, an exaggerated version of the Whorf-Sapir-Cassirer-Borges hypothesis that our ordinary entification of the perceptual world into objects and events is totally determined by the arbitrary categories our language provides us. I have added Borges to the list in honor of Collins' (1985) excellent book, which uses a Borges story as an introduction. In contrast, I (1973, 1989) have argued (and have cited Quine in at least partial support) that the perceptual reification of external objects as existing independently of our perception of them is shared by many animals and is developmentally and phylogenetically prior to language learning. Language learning gets started with equivocal ostensions, which would not work were our perceptual reifications of middle-sized objects in the visual-tactual field not highly similar from person to person; and because of this ostension requirement, the "way the world is" gets to edit the sorts of words that can usefully become socially shared. (My version of this pre-linguistic perceptual background differs from Quine's in that I add to his list a fifth indeterminacy, in the pre-linguistic reidentification of an "object" as "the same".)

These "ostensionable" (Campbell & Paller, 1989) objects and events are available to both the ingroup child and the anthropologist (Horton, 1982) and make possible some degree of exotic language learning. They do not provide foundational, definitive translation. In my own best hermeneutic achievement (1964), one of our comprehension checks was a Müller-Lyer illusion figure with a 700% discrepancy. If the other-culture respondents reported a different choice on this item than would the anthropologist, we scored them as not understanding. If they reported differently on the Müller-Lyer items with -5% to +50% discrepancy (and were Guttman-scale consistent), we scored them as perceiving differently. In spite of this profound epistemological equivocality, we plausibly claimed to find cultural differences in perception, and reported on the direction and amount of difference. (Had the differences in perception been profound, we could not have confirmed that we were communicating!) Campbell and Paller (1989) have noted that it is "ostensionables" at this level that are called for in the "demonstration" of results in the ideology of the scientific revolution.
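To make the shape of that decision rule concrete, the following fragment is a minimal sketch in Python, under invented response codings and a deliberately crude Guttman-consistency check; it illustrates the logic just described, not the 1964 scoring procedure itself.

    def guttman_consistent(choices):
        # On items ordered from smallest to largest discrepancy, a scalable
        # respondent should switch from "illusion" to "no illusion" at most once.
        switches = sum(1 for a, b in zip(choices, choices[1:]) if a != b)
        return switches <= 1

    def classify(check_choice, test_choices, reference_choices):
        # check_choice: response to the 700%-discrepancy comprehension-check figure
        #   (coded 1 if the objectively longer segment is chosen, 0 otherwise).
        # test_choices / reference_choices: responses on the -5% to +50% items,
        #   coded 1 where the illusion is reported, ordered by discrepancy.
        if check_choice != 1:
            return "not understanding (failed comprehension check)"
        if not guttman_consistent(test_choices):
            return "unscorable (not Guttman-consistent)"
        if test_choices != reference_choices:
            return "perceiving differently"
        return "perceiving as the researcher does"

    # A respondent who passes the check but reports the illusion on more of the
    # borderline items than the researcher would is scored as perceiving differently.
    print(classify(1, [1, 1, 1, 0, 0], [1, 0, 0, 0, 0]))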

From this hermeneutic perspective, it is unlikely that any of the educational research "paradigms theorists" are claiming total incomprehensibility across paradigms. We and they are from the same culture, including the same social-science subculture. We share enough "horizon of interpretation", or "interpretive framework", to achieve meaningful disagreement, operationally illustratable (without, of course, definitional operations). We must enter into hermeneutic, dialectical disputation with them on rival descriptive claims, even if these are only imperfectly translated.

In the above, I have approved of, and built upon, Phillips' contribution, but by shifting to hermeneutic methodology rather than assumptions about human nature, I have identified the coherentism shared by all four essays with the hermeneutic tradition, and have joined Walker and Evers in denying paradigm incommensurability between us and the "paradigms theorists". I have affirmed, over-elaborated, and qualified the relationship to Quine which Evers and Walker recognize that I share with them. I have echoed Evers on the epistemological relevance of children's language learning. What remains for these last few paragraphs is a set of more specific comments on the papers.

In the course of affirming what he and I share, Evers quite reasonably chides me for trying to draw an observation/theory distinction at the level of internal/external validity. His point is well-taken, and the issue is one I am prepared to reverse myself on, but for the present I will treat it as a messy loose end for future resolution. I suspect that some version of the fact-theory distinction will survive in our post-positivist consensus. Stegmüller (1976), in formalizing Kuhn, makes such a distinction, in that the "anomalies" and retained "facts" are "laden" with theories other than the one under test. While I have emphasized (1987) that the neural connections involved in vision embody "theoretical" anticipations as to objects in the world, and object permanence, such use of "theory" leaves us needing another term to distinguish such pan-human unconscious theorizing from the explicit theories of science. Quine's periphery/interior distinction discussed above, and my emphasis on the hermeneutic usefulness of "ostensionable" objects, support such a distinction. But I recommend to all that Evers' call for revision of my point of view be taken very seriously.

Lakomski, Walker, and Evers show a great respect for the neurologizing of epistemology by Churchland, Churchland, and Stich, and for the parallel distributed processing approach to artificial intelligence of Rumelhart and McClelland. I, of course, endorse the coherentism they cite this group as sharing. I applaud that aspect of P.D.P. which sets out to achieve object recognition in spite of differing peripheral "stimulations" on each exposure, and the talk of "patterns" or "configurations" in this process. But I reject P.D.P.'s programmatic peripheralism, and its militant avoidance of simulating conscious experiences in addition to overt responses. (Edelman, 1987, has a version of P.D.P. much superior in this regard.) If "folk psychology" be interpreted only as that Anglo-American philosophical tradition of epitomizing beliefs (and hence that subset, "knowledge") as sentences or propositions, I, too, reject it. ("Maps" are a better metaphor than sentences.) But if they want to reject the folk concept of beliefs in non-sentential form, and folk concepts such as goals, purposes, and perceived objects, I vigorously disagree. My hermeneutic coherentism requires fallibilist continuity with folk knowing, and with folk concepts about the knowing process (Campbell, 1988, Chapter 14). In this, I reject the overall program of "eliminative materialism" shared by Churchland, Churchland, and Stich.

Haig's explanatory coherentist theory of knowledge justification is one I am happy to endorse. But Glass' achievements, and the Fisherian tradition of random assignment to treatments, fit well within the coherence framework. In Whiggish (but plausible) retrospect, all the talk about plausible rival hypotheses or threats to validity in the quasi-experimental design tradition can be seen as relative coherence arguments, devoid of logical or empirical "proof", made on human discretionary bases. Random assignment to treatments greatly reduces the plausibility of some rival hypotheses, often otherwise very plausible ones. Moreover, we are not likely ever to achieve such complete coherence that the possibility of the results of small experiments being due to "mere chance" can be ruled out. The "bootstrapping" methods Haig favors share Fisher's world view. (Before modern computers, assumptions about normal distributions and sampling from infinite universes greatly simplified the presumptive estimates of plausibility.) Were the diverse experiments Glass pools totally incommensurable, the consistent findings Glass discovers would not plausibly have been found. Our coherentist perspective explains how we can use research results from others with differing theories and epistemologies. Thus Haig can use Glass' results without his epistemology.
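The point about randomization and residual chance admits of a toy illustration. The following sketch, with arbitrary sample sizes and distributions of my own invention, shows random assignment producing no systematic pre-treatment bias while still leaving sizable chance imbalances in small experiments:

    import random
    import statistics

    random.seed(1)

    def observed_difference(n_per_group):
        # One randomized "experiment" with no true treatment effect: the outcome
        # is simply a nuisance variable split at random into two groups.
        outcomes = [random.gauss(0, 1) for _ in range(2 * n_per_group)]
        random.shuffle(outcomes)
        treated, control = outcomes[:n_per_group], outcomes[n_per_group:]
        return statistics.mean(treated) - statistics.mean(control)

    for n in (10, 100, 1000):
        diffs = [observed_difference(n) for _ in range(2000)]
        bias = statistics.mean(diffs)                      # about zero: no systematic bias
        typical = statistics.mean(abs(d) for d in diffs)   # shrinks as n grows
        print(f"n per group = {n:4d}: mean difference {bias:+.3f}, "
              f"typical |difference| {typical:.3f}")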

Along with Haig, I believe that coherentists can also frequently find quasi-experiments validly interpretable, and that the whole context of many threats to validity (or nuisance variables) must be considered together, rather than singly. My students will be surprised to learn that I can even allow the possibility of unbiased matching. But I must reject Haig's statement (p. 575) that "Randomization is unnecessary because we can identify the most potent nuisance variables and match groups on these." Because all the nuisance variables will be measured with partial unreliability and partial invalidity (or irrelevance), where one really needs matching to produce plausible pre-treatment equivalence one is bound, for these reasons, to undermatch. (My best teaching on these issues is in Campbell & Boruch, 1975; but see also Campbell & Stanley, 1963/66, for the threat of "regression artifacts", and Cook & Campbell, 1979, Chapter 4 by Charles Reichardt, and Chapter 7.) LISREL measurement models are addressed to this problem (and are much better than covariance adjustment or a beta weight for the treatment as a dummy variable), but usually have to settle for implausible co-measures of a common latent variable. But I do agree with Haig that significance tests control for only one threat to validity, and are often mistaken as controlling for all.
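A toy simulation can illustrate the undermatching argument as well. Under invented numbers (a one-unit group difference on the true nuisance variable, and a measurement reliability of .5), matching on the fallibly measured score leaves roughly half of the true difference in place, so a null treatment would masquerade as an effect; this is only a sketch of the regression-artifact logic, not an analysis of any actual study.

    import random
    import statistics

    random.seed(7)

    def draw_unit(true_mean, reliability):
        # True nuisance score, plus measurement error sized so that
        # var(true) / var(observed) equals the stated reliability.
        true = random.gauss(true_mean, 1)
        error_sd = ((1 - reliability) / reliability) ** 0.5
        return true, true + random.gauss(0, error_sd)

    def residual_gap(n=50000, reliability=0.5, group_gap=1.0, band=(0.4, 0.6)):
        treated = [draw_unit(group_gap, reliability) for _ in range(n)]
        control = [draw_unit(0.0, reliability) for _ in range(n)]
        # Naive "matching": keep only units whose OBSERVED score falls in a
        # narrow common band, then compare the groups on the TRUE score.
        t_true = [t for t, obs in treated if band[0] <= obs <= band[1]]
        c_true = [t for t, obs in control if band[0] <= obs <= band[1]]
        return statistics.mean(t_true) - statistics.mean(c_true)

    # With reliability .5, roughly half of the one-unit group difference survives
    # the matching, and would masquerade as a treatment effect if outcomes track
    # the true nuisance variable.
    print(f"true-score gap after matching: {residual_gap():+.2f}")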

But as I indicated at the beginning, these are minor notes on a set of essays foreshadowing the post-positivist consensus on a coherentist philosophy of science. I am happy to join in recommending this perspective to all educational researchers.

References

Campbell, D. T. (1953). Operational delineation of "what is learned" via the transposition experiment. Psychological Review, 61, 167-174.

Campbell, D. T. (1963). Social attitudes and other acquired behavioral dispositions. In S. Koch (Ed.), Psychology: A study of a science, Volume 6 (pp. 94-172). New York: McGraw-Hill. Reprinted in D. T. Campbell, 1988.

Campbell, D. T. (1964). Distinguishing differences of perception from failures of communication in cross-cultural studies. In F. S. C. Northrop & H. H. Livingstone (Eds.), Cross-cultural understanding: Epistemology in anthropology (pp. 308-336). New York: Harper & Row.

Campbell, D. T. (1966). Pattern matching as an essential in distal knowing. In K. R. Hammond (Ed.), The psychology of Egon Brunswik (pp. 81-106). New York: Holt, Rinehart and Winston. Reprinted in H. Kornblith (Ed.) (1985), Naturalizing epistemology (pp. 49-70). Cambridge, MA: MIT Press.

Campbell, D. T. (1972). Herskovits, cultural relativism, and metascience. In M. J. Herskovits, Cultural relativism (pp. v-xxiii). New York: Random House.

Campbell, D. T. (1973). Ostensive instances and entitativity in language learning. In W. Gray & N. D. Rizzo (Eds.), Unity through diversity (pp. 1043-1057). New York: Gordon & Breach.

Campbell, D. T. (1987). Neurological embodiments of belief and the gaps in the fit of phenomena to noumena. In A. Shimony & D. Nails (Eds.), Naturalistic epistemology: A symposium of two decades (pp. 165-192). Dordrecht, Holland: D. Reidel Publishing.

Campbell, D. T. (1988). (E. S. Overman, Ed.) Methodology and epistemology for social science: Selected papers. Chicago, IL: University of Chicago Press.

Campbell, D. T. (1989). Models of language learning and their implications for social constructionist analyses of scientific belief. In S. L. Fuller, M. DeMey, T. Shinn, & S. Woolgar (Eds.), The cognitive turn (pp. 153-158). Boston, MA: Kluwer Academic Publishers.

Campbell, D. T. (1990). Exegesis on fifteen famous paragraphs from Quine. Unpublished manuscript.

Campbell, D. T. & Boruch, R. F. (1975). Making the case for randomized assignment to treatments by considering the alternatives: Six ways in which quasi-experimental evaluations in compensatory education tend to underestimate effects. In C. A. Bennett & A. Lumsdaine (Eds.), Evaluation and experiments: Some critical issues in assessing social programs (pp. 195-296). New York: Academic Press.


Campbell, D. T. & Paller, B. T. (1989). Extending evolutionary epistemology to "justifying" scientific beliefs: A sociological rapprochement with a fallibilist perceptual foundationalism? In K. Hahlweg & C. A. Hooker (Eds.), Issues in evolutionary epistemology (pp. 231-257). Albany, NY: State University of New York Press.

Campbell, D. T. & Stanley, J. C. (1963/66). Experimental and quasi-experimental designs for research on teaching. In N. L. Gage (Ed.), Handbook of research on teaching (pp. 171-246). Chicago, IL: Rand McNally. Reprinted as (1966) Experimental and quasi-experimental designs for research. Chicago, IL: Rand McNally.

Collins, H. (1985). Changing order: Replication and induction in scientific practice. Beverly Hills, CA: Sage Publications.

Cook, T. D. & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago, IL: Rand McNally.

Cook, T. D. & Campbell, D. T. (1986). The causal assumptions of quasi-experimental practice. Synthese, 68, 141-180.

Edelman, G. M. (1987). Neural Darwinism: The theory of neuronal group selection. New York: Basic Books.

Geertz, C. (1973). The interpretation of cultures. New York: Basic Books.

Geertz, C. (1983). Local knowledge. New York: Basic Books.

Habermas, J. (1983). Interpretive social science vs. hermeneuticism. In N. Haan, R. N. Bellah, P. Rabinow, & W. M. Sullivan (Eds.), Social science as moral inquiry. New York: Columbia University Press.

Hesse, M. (1980). Revolutions and reconstructions in the philosophy of science. Bloomington, IN: Indiana University Press.

Horton, R. (1982). Tradition and modernity revisited. In M. Hollis & S. Lukes (Eds.), Rationality and relativism (pp. 201-260). Cambridge, MA: MIT Press.

Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL: University of Chicago Press.

Kuhn, T. S. (1976). Theory-change as structure-change: Comments on the Sneed formalism. Erkenntnis, 10, 179-199.

Kuhn, T. S. (1983). Commensurability, comparability, communicability. In P. D. Asquith & T. Nickles (Eds.), PSA 1982, Volume 2. East Lansing, MI: Philosophy of Science Association.

Kuhn, T. S. (1991). The road since structure. In A. Fine, M. Forbes, & L. Wessels (Eds.), PSA 1990, Volume 2. East Lansing, MI: Philosophy of Science Association.

Lehrer, K. (1974). Knowledge. Oxford: Clarendon Press.

Lehrer, K. (1990). Theory of knowledge. Boulder, CO: Westview Press.

Merleau-Ponty, M. (1963). The structure of behavior. Boston, MA: Beacon Press.

Paller, B. T. & Campbell, D. T. (1989). Maxwell and van Fraassen on observability, reality, and justification. In M. L. Maxwell & C. W. Savage (Eds.), Science, mind and psychology: Essays in honor of Grover Maxwell (pp. 99-132). Lanham, MD: University Press of America.

Quine, W. V. (1951). Two dogmas of empiricism. Philosophical Review, 60, 20-43. Reprinted in W. V. Quine (1963), From a logical point of view (pp. 20-46). New York: Harper Torchbooks.

Quine, W. V. (1960). Word and object. New York: Wiley.

Quine, W. V. (1990). Pursuit of truth. Cambridge, MA: Harvard University Press.

Segall, M. H., Campbell, D. T., & Herskovits, M. J. (1966). The influence of culture on visual perception. Indianapolis, IN: Bobbs-Merrill.

Stegmüller, W. (1976). The structure and dynamics of theories. New York: Springer-Verlag.