my philosophy

Criteria of Truth

In epistemology, criteria of truth (or tests of truth) are standards and rules used to judge the accuracy of statements and claims. They are tools of verification. Understanding a philosophy's criteria of truth is fundamental to a clear evaluation of that philosophy. This necessity is driven by the varying, and conflicting, claims of different philosophies. The rules of logic have no ability to distinguish truth on their own. An individual must determine what standards distinguish truth from falsehood. Not all criteria are equally valid. Some standards are sufficient, while others are questionable. The criteria listed represent those most commonly used by scholars and the general public. Jonathan Dolhenty states there seem to be only three functional, effective tests of truth: the correspondence, coherence and pragmatic theories of truth.

Authority

The opinions of those with significant experience, extensive training or an advanced degree are often considered a form of proof. Their knowledge and familiarity within a given field or area of knowledge command respect and allow their statements to serve as criteria of truth. A person may not simply declare themselves an authority, but must be properly qualified. Despite the wide respect given to expert testimony, it is not an infallible criterion; for example, multiple authorities may conflict in their claims and conclusions.

Coherence

Coherence refers to a consistent and overarching explanation for all facts. To be coherent, all pertinent facts must be arranged in a consistent and cohesive fashion as an integrated whole. The theory which most effectively reconciles all facts in this fashion may be considered most likely to be true. Coherence is potentially the most effective test of truth because it most adequately addresses all elements. The main limitation lies not in the standard itself, but in the human inability to acquire all facts of an experience. Only an omniscient mind could be aware of all of the relevant information. A scholar must accept this limitation and accept as true the most coherent explanation for the available facts. Coherence is difficult to dispute as a criterion of truth, since arguing against coherence is validating incoherence, which is inherently illogical.

Consensus Gentium

Some view opinions held by all people to be valid criteria of truth. According to consensus gentium, the universal consent of all mankind, all humans holding a distinct belief proves it is true. There is some value in the criterion if it means innate truth, such as the laws of logic and mathematics. If it merely means agreement, as in a unanimous vote, its value is questionable. For example, general assent once held that the earth was flat and that the sun revolved about the earth.[6]

Mere Consistency

Mere consistency is when correct statements do not contradict one another, but are not necessarily related. Accordingly, an individual is consistent if he does not contradict himself. It is inadequate as a criterion because it treats facts in an isolated fashion without true cohesion and integration; nevertheless it remains a necessary condition for the truth of any argument, owing to the law of noncontradiction. The value of a proof largely lies in its ability to reconcile individual facts into a coherent whole.

Strict Consistency

Strict consistency is when claims are connected in such a fashion that one statement follows from another. Formal logic and mathematical rules are exemplars of rigorous consistency. An example would be: if all A's are B's and all B's are C's, then all A's are C's. While this standard is of high value, it is limited. For example, the premises are a priori (or self-apparent), requiring another test of truth to employ this criterion. Additionally, strict consistency may produce results lacking coherence and completeness. While a philosophical system may demonstrate rigorous consistency with the facts it considers, all facts must be taken into consideration for an adequate criterion of truth, regardless of their detriment to any given system.

Correspondence

Correspondence is quite simply when a claim corresponds with its object. For example, the claim that the White House is in Washington, D.C. is true if the White House is actually located in Washington. Correspondence is held by many philosophers to be the most valid of the criteria of truth. An idea which corresponds to its object is indeed true, but determining whether the correspondence is perfect requires additional tests of truth. This indicates that correspondence is a perfectly valid definition of truth, but is not of itself a valid criterion of truth. An additional test beyond this "definition" is required to determine the precise degree of similarity between what is posited and what exists in objective reality.

Custom

Most people consciously or unknowingly employ custom as a criterion of truth, based on the assumption that doing what is customary will prevent error. It is particularly applied in the determination of moral truth and reflected in the statement "when in Rome, do as the Romans do". People stick closely to the principle of custom when they use common vernacular, wear common fashions and so forth; essentially, when they do what is popular. Custom is not considered a serious, or valid, test of truth. For example, public opinion polls do not determine truth.[9]

Emotions

Many people allow feelings to determine judgment, often in the face of contrary evidence or without even attempting to collect evidence and facts. They are implicitly accepting emotions as a criterion of truth. Most people will admit that feelings are not an adequate test of truth. For example, a seasoned businessman will put aside his emotions and search for the best available facts when making an investment. Similarly, scholars are trained to put aside such subjective judgments when evaluating knowledge.[10]

Instinct

The existence of distinct instincts has long been debated. Proponents of instinct argue that we eat because of hunger, drink because of thirst, and so forth. Some have even argued for the existence of God on this basis, arguing that the object of every instinct has a referent in reality: the counterpart of hunger is food; of thirst, liquid; of the sex drive, a mate.
Instincts are not accepted as a reliable test because they are most often indistinct, variable and difficult to define. Additionally, universal instincts are so few that they offer little to the greater body of philosophy as a criterion.[11]

Intuition

Intuition is an assumed truth with an unknown, or possibly unexamined, source. It is a judgment that is not dependent on a rational examination of the facts. It is usually experienced as a sudden sensation and/or rush of thoughts that feel "right". Many persons experience intuitive epiphanies which later prove to be true. Scholars have sometimes come upon valid theories and proofs while daydreaming or otherwise mentally occupied with something bearing no apparent relationship to the truth they seek to reveal. Intuition is at best a source of truths, rather than a criterion with which to evaluate them. Intuitive knowledge requires testing by means of other criteria of truth in order to confirm its accuracy.

Majority Rule

Majority rule is a statistical method of accepting assertions and proposals. In democratic systems, majority rule is used to determine group decisions, particularly those relating to personal morality and social behavior. Some systems divided into several oppositional factions may depend on mere plurality. While majority rule may make for a good democratic system, it is a poor determinant of truth, subject to the criticisms of the broad version of consensus gentium.

Naïve Realism

Naïve realism posits that only that which is directly observable by the human senses is true. First-hand observation determines the truth or falsity of a given statement. Naïve realism is an insufficient criterion of truth. A host of natural phenomena are demonstrably real but not observable by the unaided senses. For example, naïve realism would deny the existence of sounds beyond the range of human hearing and the existence of X-rays. Similarly, there are a number of sense experiments which show a disconnect between the perceived sensation and the reality of its cause.

Pragmatic

To the pragmatist, if an idea works then it must be true. The consequences of applying a concept reveal its truth value upon examination of the results. The full meaning of an idea is self-apparent in its application. For example, the therapeutic value and effect of penicillin in relation to infections is proven in its administration. Although pragmatism is considered a valuable criterion, it must be used with caution and reservation, due to its potential for false positives. For example, a doctor may prescribe a patient medication for an illness, but it could later turn out that a placebo is equally effective. Thus, untrue concepts could appear to be working, contrary to the purpose of the pragmatic test. However, it has validity as a test, particularly in the form William Ernest Hocking called "Negative Pragmatism". In essence, this states that ideas that do not work cannot possibly be true, though ideas which do work may or may not be true.
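Hocking's point is, in effect, the asymmetry between a conditional and its converse. In the notation below (added here only for clarity; the symbols T for "the idea is true" and W for "the idea works when applied" are not part of the original text), if truth is assumed to imply workability, then failure to work refutes an idea by modus tollens, while working leaves its truth undecided:

$$ (T \rightarrow W) \;\vdash\; (\neg W \rightarrow \neg T), \qquad (T \rightarrow W) \;\nvdash\; (W \rightarrow T) $$

On this reading, the pragmatic test can rule ideas out, but it cannot, by itself, rule them in.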
Revelation

The principal distinction between intuition and revelation is that revelation has an assumed source: God (or another higher power). Revelation may be defined as truth emanating from God. Many religions fundamentally rely on revelation as a test of truth. This criterion is subject to the same criticisms as intuition. It may be a valid reference of truth for an individual, but it is inadequate for providing a coherent proof of the knowledge to others.

Time

Time is a criterion commonly appealed to in debate, often referred to as "the test of time". This criterion posits that over time erroneous beliefs and logical errors will be revealed, while if a belief is true, the mere passage of time cannot adversely affect its validity. Time is an inadequate test of truth, since it is subject to flaws similar to those of custom and tradition (which are simply specific variations of the time factor). Many demonstrably false beliefs have endured for centuries and even millennia. It is commonly rejected as a valid criterion: for example, most people will not convert to another faith simply because the other religion is centuries (or even millennia) older than their current beliefs.

Tradition

Tradition, closely related to custom, is the standard holding that what has been believed for generations is true. Those accepting tradition argue that ideas which have gained the loyalty of multiple generations possess a measure of credibility. Tradition suffers from many of the same failings as custom. It is possible for falsehoods to be passed down from generation to generation, since tradition generally emphasizes repetition over critical evaluation.

Rationalism

In epistemology, rationalism is the view that "regards reason as the chief source and test of knowledge"[1] or "any view appealing to reason as a source of knowledge or justification".[2] More formally, rationalism is defined as a methodology or a theory "in which the criterion of the truth is not sensory but intellectual and deductive".[3] Rationalists believe reality has an intrinsically logical structure. Because of this, rationalists argue that certain truths exist and that the intellect can directly grasp these truths. That is to say, rationalists assert that certain rational principles exist in logic, mathematics, ethics, and metaphysics that are so fundamentally true that denying them causes one to fall into contradiction. Rationalists have such high confidence in reason that proof and physical evidence are unnecessary to ascertain truth; in other words, "there are significant ways in which our concepts and knowledge are gained independently of sense experience".[4] Because of this belief, empiricism is one of rationalism's greatest rivals.

Different degrees of emphasis on this method or theory lead to a range of rationalist standpoints, from the moderate position "that reason has precedence over other ways of acquiring knowledge" to the more extreme position that reason is "the unique path to knowledge".[5] Given a pre-modern understanding of reason, rationalism is identical to philosophy, the Socratic life of inquiry, or the zetetic (skeptical) clear interpretation of authority (open to the underlying or essential cause of things as they appear to our sense of certainty). In recent decades, Leo Strauss sought to revive "Classical Political Rationalism" as a discipline that understands the task of reasoning not as foundational but as maieutic. Rationalism should not be confused with rationality, nor with rationalization.

Philosophical Usage

Rationalism is often contrasted with empiricism. Taken very broadly, these views are not mutually exclusive, since a philosopher can be both rationalist and empiricist.[2] Taken to extremes, the empiricist view holds that all ideas come to us a posteriori, that is to say, through experience; either through the external senses or through such inner sensations as pain and gratification. The empiricist essentially believes that knowledge is based on or derived directly from experience. The rationalist believes we come to knowledge a priori, through the use of logic, and thus independently of sensory experience. In other words, as Galen Strawson once wrote, "you can see that it is true just lying on your couch. You don't have to get up off your couch and go outside and examine the way things are in the physical world. You don't have to do any science."[9] Between both philosophies, the issue at hand is the fundamental source of human knowledge and the proper techniques for verifying what we think we know. Whereas both philosophies fall under the umbrella of epistemology, their argument lies in the understanding of the warrant, which falls under the wider epistemic umbrella of the theory of justification.

Theory of Justification

The theory of justification is the part of epistemology that attempts to understand the justification of propositions and beliefs. Epistemologists are concerned with various epistemic features of belief, which include the ideas of justification, warrant, rationality, and probability. Of these four terms, the one that has been most widely used and discussed by the early 21st century is "warrant".
Loosely speaking, justification is the reason that someone (probably) holds a belief. If A makes a claim, and B then casts doubt on it, A's next move would normally be to provide justification. The precise method one uses to provide justification is where the lines are drawn between rationalism and empiricism (among other philosophical views). Much of the debate in these fields is focused on analyzing the nature of knowledge and how it relates to connected notions such as truth, belief, and justification.

Theses of Rationalism

At its core, rationalism consists of three basic claims. For one to consider oneself a rationalist, one must adopt at least one of these three claims: the Intuition/Deduction thesis, the Innate Knowledge thesis, or the Innate Concept thesis. In addition, rationalists can choose to adopt the claims of the Indispensability of Reason and/or the Superiority of Reason, although one can be a rationalist without adopting either thesis.

The Intuition/Deduction Thesis

Rationale: "Some propositions in a particular subject area, S, are knowable by us by intuition alone; still others are knowable by being deduced from intuited propositions."[10]

Generally speaking, intuition is a priori knowledge or experiential belief characterized by its immediacy; a form of rational insight. We simply "see" something in such a way as to give us a warranted belief. Beyond that, the nature of intuition is hotly debated. In the same way, generally speaking, deduction is the process of reasoning from one or more general premises to reach a logically certain conclusion. Using valid arguments, we can deduce from intuited premises. For example, combining the two, we can intuit that the number three is prime and that it is greater than two, and then deduce from this knowledge that there is a prime number greater than two. Thus, it can be said that intuition and deduction combine to provide us with a priori knowledge; we gained this knowledge independently of sense experience.
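To make the structure of that example explicit, here is the deduction in standard first-order notation (the notation is a modern convenience added here, not part of the original text):

$$\mathrm{Prime}(3), \quad 3 > 2 \;\;\vdash\;\; \exists n\,\bigl(\mathrm{Prime}(n) \wedge n > 2\bigr)$$

The step is an instance of existential introduction: from claims about the particular number three, one concludes that some number is a prime greater than two. The two premises are taken to be known by intuition alone, and the rule of inference appeals to no sense experience, so the conclusion counts as a priori on the rationalist's account.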
Empiricists such as David Hume have been willing to accept this thesis for describing the relationships among our own concepts.[10] In this sense, empiricists argue that we are allowed to intuit and deduce truths from knowledge that has been obtained a posteriori.

By injecting different subjects into the Intuition/Deduction thesis, we are able to generate different arguments. Most rationalists agree that mathematics is knowable by applying intuition and deduction. Some go further and include ethical truths in the category of things knowable by intuition and deduction. Furthermore, some rationalists also claim metaphysics is knowable under this thesis. In addition to varying the subjects, rationalists sometimes vary the strength of their claims by adjusting their understanding of the warrant. Some rationalists understand warranted beliefs to be beyond even the slightest doubt; others are more conservative and understand the warrant to be belief beyond a reasonable doubt. Rationalists also have different understandings and claims involving the connection between intuition and truth. Some rationalists claim intuition is infallible and that anything we intuit to be true is as such. More contemporary rationalists accept that intuition is not always a source of certain knowledge, thus allowing for the possibility of a deceiver who might cause the rationalist to intuit a false proposition, in the same way a third party could cause the rationalist to have perceptions of nonexistent objects. Naturally, the more subjects rationalists claim are knowable by the Intuition/Deduction thesis, the more certain they are of their warranted beliefs; and the more strictly they adhere to the infallibility of intuition, the more controversial their truths or claims and the more radical their rationalism.[10]

To argue in favor of this thesis, Gottfried Wilhelm Leibniz, a prominent German philosopher, says, "The senses, although they are necessary for all our actual knowledge, are not sufficient to give us the whole of it, since the senses never give anything but instances, that is to say particular or individual truths. Now all the instances which confirm a general truth, however numerous they may be, are not sufficient to establish the universal necessity of this same truth, for it does not follow that what happened before will happen in the same way again. From which it appears that necessary truths, such as we find in pure mathematics, and particularly in arithmetic and geometry, must have principles whose proof does not depend on instances, nor consequently on the testimony of the senses, although without the senses it would never have occurred to us to think of them."

The Innate Knowledge Thesis

Rationale: "We have knowledge of some truths in a particular subject area, S, as part of our rational nature."[12]

The Innate Knowledge thesis is similar to the Intuition/Deduction thesis in that both theses claim knowledge is gained a priori. The two theses go their separate ways when describing how that knowledge is gained. As the name and the rationale suggest, the Innate Knowledge thesis claims knowledge is simply part of our rational nature. Experiences can trigger a process that allows this knowledge to come into our consciousness, but the experiences don't provide us with the knowledge itself. The knowledge has been with us from the beginning, and experience simply brings it into focus, in the same way a photographer can bring the background of a picture into focus by changing the aperture of the lens. The background was always there, just not in focus.

This thesis targets a problem with the nature of inquiry originally postulated by Plato in the Meno. Here, Plato asks about inquiry: how do we gain knowledge of a theorem in geometry? We inquire into the matter. Yet knowledge by inquiry seems impossible.[13] In other words, "If we already have the knowledge, there is no place for inquiry. If we lack the knowledge, we don't know what we are seeking and cannot recognize it when we find it. Either way we cannot gain knowledge of the theorem by inquiry. Yet, we do know some theorems."[12] The Innate Knowledge thesis offers a solution to this paradox. By claiming that knowledge is already with us, either consciously or unconsciously, a rationalist claims we don't really "learn" things in the traditional usage of the word, but rather that we simply bring to light what we already know.

The Innate Concept Thesis

Rationale: "We have some of the concepts we employ in a particular subject area, S, as part of our rational nature."[14]

Similarly to the Innate Knowledge thesis, the Innate Concept thesis suggests that some concepts are simply part of our rational nature. These concepts are a priori in nature, and sense experience is irrelevant to determining their nature (though sense experience can help bring the concepts to our conscious mind). Some philosophers, such as John Locke (who is considered one of the most influential thinkers of the Enlightenment and an empiricist), argue that the Innate Knowledge thesis and the Innate Concept thesis are one and the same.[15] Other philosophers, such as Peter Carruthers, argue that the two theses are distinct from one another. As with the other theses covered under rationalism's umbrella, the more types and the greater number of concepts a philosopher claims to be innate, the more controversial and radical their position: "the more a concept seems removed from experience and the mental operations we can perform on experience the more plausibly it may be claimed to be innate. Since we do not experience perfect triangles but do experience pains, our concept of the former is a more promising candidate for being innate than our concept of the latter."[14]

In his book Meditations on First Philosophy,[16] René Descartes postulates three classifications for our ideas when he says, "Among my ideas, some appear to be innate, some to be adventitious, and others to have been invented by me. My understanding of what a thing is, what truth is, and what thought is, seems to derive simply from my own nature. But my hearing a noise, as I do now, or seeing the sun, or feeling the fire, comes from things which are located outside me, or so I have hitherto judged. Lastly, sirens, hippogriffs and the like are my own invention."[17]

Adventitious ideas are those concepts that we gain through sense experiences, ideas such as the sensation of heat, because they originate from outside sources, transmitting their own likeness rather than something else, and are something we simply cannot will away. Ideas invented by us, such as those found in mythology, legends, and fairy tales, are created by us from other ideas we possess. Lastly, innate ideas, such as our idea of perfection, are those ideas we have as a result of mental processes that are beyond what experience can directly or indirectly provide.

Gottfried Wilhelm Leibniz defends the idea of innate concepts by suggesting the mind plays a role in determining the nature of concepts. To explain this, he likens the mind to a block of marble in the New Essays on Human Understanding: "This is why I have taken as an illustration a block of veined marble, rather than a wholly uniform block or blank tablets, that is to say what is called tabula rasa in the language of the philosophers. For if the soul were like those blank tablets, truths would be in us in the same way as the figure of Hercules is in a block of marble, when the marble is completely indifferent whether it receives this or some other figure. But if there were veins in the stone which marked out the figure of Hercules rather than other figures, this stone would be more determined thereto, and Hercules would be as it were in some manner innate in it, although labour would be needed to uncover the veins, and to clear them by polishing, and by cutting away what prevents them from appearing.
It is in this way that ideas and truths are innate in us, like natural inclinations and dispositions, natural habits or potentialities, and not like activities, although these potentialities are always accompanied by some activities which correspond to them, though they are often imperceptible."

The Other Two Theses

The three aforementioned theses of Intuition/Deduction, Innate Knowledge, and Innate Concept are the cornerstones of rationalism. To be considered a rationalist, one must adopt at least one of those three claims. The following two theses are traditionally adopted by rationalists, but they aren't essential to the rationalist's position.

The Indispensability of Reason thesis has the following rationale: "The knowledge we gain in subject area S by intuition and deduction, as well as the ideas and instances of knowledge in S that are innate to us, could not have been gained by us through sense experience."[1] In short, this thesis claims that experience cannot provide what we gain from reason.

The Superiority of Reason thesis has the following rationale: "The knowledge we gain in subject area S by intuition and deduction or have innately is superior to any knowledge gained by sense experience."[1] In other words, this thesis claims reason is superior to experience as a source of knowledge.

In addition to these claims, rationalists often adopt similar stances on other aspects of philosophy. Most rationalists reject skepticism for the areas of knowledge they claim are knowable a priori. Naturally, if one claims that some truths are innately known to us, one must reject skepticism in relation to those truths. Especially for rationalists who adopt the Intuition/Deduction thesis, the idea of epistemic foundationalism tends to crop up. This is the view that we know some truths without basing our belief in them on any others, and that we then use this foundational knowledge to know more truths.

Background

It is difficult to identify a major figure in the history of rationalism, or even a major period of rational thought, before the Enlightenment. One of the primary reasons for this is that humans have the ability to know information they otherwise shouldn't know, primarily in the field of mathematics; every philosopher has acknowledged this to some degree or another. Secondly, it is the nature of philosophical thought to obtain knowledge and information through the use of our rational faculties instead of coming to knowledge by mystical revelation. Since the Enlightenment, rationalism has usually been associated with the introduction of mathematical methods into philosophy, as seen in the works of Descartes, Leibniz, and Spinoza.[3] This is commonly called continental rationalism, because it was predominant in the continental schools of Europe, whereas in Britain empiricism dominated. Even then, the distinction between rationalists and empiricists was drawn at a later period and would not have been recognized by the philosophers involved. Also, the distinction between the two philosophies is not as clear-cut as is sometimes suggested; for example, Descartes and Locke have similar views about the nature of human ideas.[4]

Proponents of some varieties of rationalism argue that, starting with foundational basic principles, like the axioms of geometry, one could deductively derive the rest of all possible knowledge.
The philosophers who held this view most clearly were Baruch Spinoza and Gottfried Leibniz, whose attempts to grapple with the epistemological and metaphysical problems raised by Descartes led to a development of the fundamental approach of rationalism. Both Spinoza and Leibniz asserted that, in principle, all knowledge, including scientific knowledge, could be gained through the use of reason alone, though they both observed that this was not possible in practice for human beings except in specific areas such as mathematics. On the other hand, Leibniz admitted in his book Monadology that "we are all mere Empirics in three fourths of our actions."

History

Rationalist philosophy from antiquity

Because of the complicated nature of rationalist thinking, the nature of philosophy, and the understanding that humans are aware of knowledge available only through the use of rational thought, many of the great philosophers from antiquity laid down the foundations of rationalism, though they themselves weren't rationalists as we understand the concept today.

Pythagoras (570–495 BCE)

Pythagoras was one of the first Western philosophers to stress rationalist insight.[19] He is often revered as a great mathematician, mystic and scientist, but he is best known for the Pythagorean theorem, which bears his name, and for discovering the mathematical relationship between the lengths of vibrating strings and the pitches of the notes they produce. Pythagoras "believed these harmonies reflected the ultimate nature of reality. He summed up the implied metaphysical rationalism in the words 'All is number'. It is probable that he had caught the rationalist's vision, later seen by Galileo (1564–1642), of a world governed throughout by mathematically formulable laws".[19] It has been said that he was the first man to call himself a philosopher, or lover of wisdom.

Plato (427–347 BCE)

Plato also held rational insight to a very high standard, as is seen in works such as the Meno and The Republic. Plato taught the Theory of Forms (or the Theory of Ideas),[21][22][23] which asserts that non-material abstract (but substantial) forms (or ideas), and not the material world of change known to us through sensation, possess the highest and most fundamental kind of reality.[24] Plato's forms are accessible only to reason and not to sense.[19] In fact, it is said that Plato admired reason, especially in geometry, so highly that he had the phrase "Let no one ignorant of geometry enter" inscribed over the door to his academy.

Aristotle (384–322 BCE)

Aristotle has a process of reasoning similar to Plato's, though he ultimately disagreed with the specifics of Plato's forms. Aristotle's great contribution to rationalist thinking comes from his use of syllogistic logic. Aristotle defines a syllogism as "a discourse in which certain (specific) things having been supposed, something different from the things supposed results of necessity because these things are so."[26] Despite this very general definition, Aristotle limits himself in his work Prior Analytics to categorical syllogisms, which consist of three categorical propositions.[27] These included categorical modal syllogisms. (A modern rendering of the basic categorical pattern is sketched at the end of this section.)

After Aristotle

Though the three great Greek philosophers disagreed with one another on specific points, they all agreed that rational thought could bring to light knowledge that was self-evident: information that humans otherwise couldn't know without the use of reason. After Aristotle's death, Western rationalistic thought was generally characterized by its application to theology, as in the works of the Islamic philosopher Avicenna and the Jewish philosopher and theologian Maimonides. One notable event in the Western timeline was the philosophy of St. Thomas Aquinas, who attempted to merge Greek rationalism and Christian revelation in the thirteenth century.
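As an illustration of the syllogistic pattern referred to above (and of the "all A's are B's" example given earlier under Strict Consistency), the classic first-figure syllogism traditionally called Barbara can be written in modern quantifier notation; this rendering is a later convention used here for clarity, not Aristotle's own formalism:

$$\frac{\forall x\,\bigl(A(x) \rightarrow B(x)\bigr) \qquad \forall x\,\bigl(B(x) \rightarrow C(x)\bigr)}{\forall x\,\bigl(A(x) \rightarrow C(x)\bigr)}$$

Read: if every A is a B and every B is a C, then every A is a C. The conclusion "results of necessity" in Aristotle's sense because any interpretation of A, B and C that makes both premises true also makes the conclusion true, regardless of what the terms stand for.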

Modern rationalism

René Descartes (1596–1650)

Descartes was the first of the modern rationalists and has been dubbed the "Father of Modern Philosophy". Much subsequent Western philosophy is a response to his writings,[29][30][31] which are studied closely to this day. Descartes thought that only knowledge of eternal truths (including the truths of mathematics, and the epistemological and metaphysical foundations of the sciences) could be attained by reason alone; other knowledge, the knowledge of physics, required experience of the world, aided by the scientific method. He also argued that although dreams appear as real as sense experience, these dreams cannot provide persons with knowledge. Also, since conscious sense experience can be the cause of illusions, sense experience itself can be doubted. As a result, Descartes deduced that a rational pursuit of truth should doubt every belief about reality. He elaborated these beliefs in such works as Discourse on Method, Meditations on First Philosophy, and Principles of Philosophy. Descartes developed a method to attain truths according to which nothing that cannot be recognised by the intellect (or reason) can be classified as knowledge. These truths are gained "without any sensory experience," according to Descartes. Truths that are attained by reason are broken down into elements that intuition can grasp, which, through a purely deductive process, will result in clear truths about reality.

Descartes therefore argued, as a result of his method, that reason alone determined knowledge, and that this could be done independently of the senses. For instance, his famous dictum, cogito ergo sum or "I think, therefore I am", is a conclusion reached a priori, i.e., prior to any kind of experience on the matter. The simple meaning is that doubting one's existence, in and of itself, proves that an "I" exists to do the thinking. In other words, doubting one's own doubting is absurd.[32] This was, for Descartes, an irrefutable principle upon which to ground all other forms of knowledge. Descartes posited a metaphysical dualism, distinguishing between the substance of the human body ("res extensa") and the mind or soul ("res cogitans"). This crucial distinction would be left unresolved and lead to what is known as the mind-body problem, since the two substances in the Cartesian system are independent of each other and irreducible.

Baruch Spinoza (1632–1677)

The philosophy of Baruch Spinoza is a systematic, logical, rational philosophy developed in seventeenth-century Europe.[33][34][35] Spinoza's philosophy is a system of ideas constructed upon basic building blocks with an internal consistency with which he tried to answer life's major questions, and in which he proposed that "God exists only philosophically."[35][36] He was heavily influenced by thinkers such as Descartes,[37] Euclid[36] and Thomas Hobbes,[37] as well as theologians in the Jewish philosophical tradition such as Maimonides.[37] But his work was in many respects a departure from the Judeo-Christian tradition. Many of Spinoza's ideas continue to vex thinkers today, and many of his principles, particularly regarding the emotions, have implications for modern approaches to psychology.
Even top thinkers have found Spinoza's "geometrical method"[35] difficult to comprehend: Goethe admitted that he "could not really understand what Spinoza was on about most of the time."[35] His magnum opus, Ethics, contains unresolved obscurities and has a forbidding mathematical structure modeled on Euclid's geometry.[36] Spinoza's philosophy attracted believers such as Albert Einstein[38] and much intellectual attention.

Gottfried Leibniz (1646–1716)

Leibniz was the last of the great rationalists and contributed heavily to other fields such as metaphysics, epistemology, logic, mathematics, physics, jurisprudence, and the philosophy of religion; he is also considered to be one of the last "universal geniuses".[44] He did not develop his system, however, independently of these advances. Leibniz rejected Cartesian dualism and denied the existence of a material world. In Leibniz's view there are infinitely many simple substances, which he called "monads" (possibly taking the term from the work of Anne Conway). Leibniz developed his theory of monads in response to both Descartes and Spinoza, because the rejection of their visions forced him to arrive at his own solution. Monads are the fundamental unit of reality, according to Leibniz, constituting both inanimate and animate objects. These units of reality represent the universe, though they are not subject to the laws of causality or space (which he called "well-founded phenomena"). Leibniz therefore introduced his principle of pre-established harmony to account for apparent causality in the world.

Immanuel Kant (1724–1804)

Kant is one of the central figures of modern philosophy, and he set the terms with which all subsequent thinkers have had to grapple. He argued that human perception structures natural laws, and that reason is the source of morality. His thought continues to hold a major influence on contemporary thought, especially in fields such as metaphysics, epistemology, ethics, political philosophy, and aesthetics.[45] Kant named his branch of epistemology Transcendental Idealism, and he first laid out these views in his famous work The Critique of Pure Reason. In it he argued that there were fundamental problems with both rationalist and empiricist dogma. To the rationalists he argued, broadly, that pure reason is flawed when it goes beyond its limits and claims to know those things that are necessarily beyond the realm of all possible experience: the existence of God, free will, and the immortality of the human soul. Kant referred to these objects as "the thing in itself" and went on to argue that their status as objects beyond all possible experience by definition means we cannot know them. To the empiricists he argued that while it is correct that experience is fundamentally necessary for human knowledge, reason is necessary for processing that experience into coherent thought. He therefore concluded that both reason and experience are necessary for human knowledge. In the same way, Kant also argued that it was wrong to regard thought as mere analysis. In Kant's view, a priori concepts do exist, but if they are to lead to the amplification of knowledge, they must be brought into relation with empirical data.

Empiricism

Empiricism is a theory which states that knowledge comes only or primarily from sensory experience.[1] One of several views in epistemology, the study of human knowledge, along with rationalism and skepticism, empiricism emphasizes the role of experience and evidence, especially sensory experience, in the formation of ideas, over the notion of innate ideas or traditions;[2] empiricists may argue, however, that traditions (or customs) arise due to relations of previous sense experiences.[3] Empiricism in the philosophy of science emphasizes evidence, especially as discovered in experiments. It is a fundamental part of the scientific method that all hypotheses and theories must be tested against observations of the natural world rather than resting solely on a priori reasoning, intuition, or revelation. Empiricism, often used by natural scientists, asserts that "knowledge is based on experience" and that "knowledge is tentative and probabilistic, subject to continued revision and falsification."[4] One of its epistemological tenets is that sensory experience creates knowledge. The scientific method, including experiments and validated measurement tools, guides empirical research.

Etymology

The English term "empirical" derives from the Greek word ἐμπειρία (empeiria), which is cognate with, and translates to, the Latin experientia, from which we derive the word "experience" and the related "experiment". The term was used by the Empiric school of ancient Greek medical practitioners, who rejected the three doctrines of the Dogmatic school, preferring to rely on the observation of "phenomena".[5]

History

Background

A central concept in science and the scientific method is that it must be empirically based on the evidence of the senses. Both natural and social sciences use working hypotheses that are testable by observation and experiment. The term semi-empirical is sometimes used to describe theoretical methods that make use of basic axioms, established scientific laws, and previous experimental results in order to engage in reasoned model building and theoretical inquiry. Philosophical empiricists hold no knowledge to be properly inferred or deduced unless it is derived from one's sense-based experience.[6] This view is commonly contrasted with rationalism, which states that knowledge may be derived from reason independently of the senses. For example, John Locke held that some knowledge (e.g. knowledge of God's existence) could be arrived at through intuition and reasoning alone. Similarly Robert Boyle, a prominent advocate of the experimental method, held that we have innate ideas.[7][8] The main continental rationalists (Descartes, Spinoza, and Leibniz) were also advocates of the empirical "scientific method".[9][10]

Early Empiricism

The notion of tabula rasa ("clean slate" or "blank tablet") connotes a view of the mind as an originally blank or empty recorder (Locke used the words "white paper") on which experience leaves marks. This denies that humans have innate ideas. The image dates back to Aristotle: "What the mind (nous) thinks must be in it in the same sense as letters are on a tablet (grammateion) which bears no actual writing (grammenon); this is just what happens in the case of the mind" (Aristotle, On the Soul, 3.4.430a1). Aristotle's explanation of how this was possible was not strictly empiricist in a modern sense, but rather based on his theory of potentiality and actuality, and experience of sense perceptions still requires the help of the active nous.
These notions contrasted with Platonic notions of the human mind as an entity that pre-existed somewhere in the heavens, before being sent down to join a body on Earth (see Plato's Phaedo and Apology, as well as others). Aristotle was considered to give a more important position to sense perception than Plato, and commentators in the Middle Ages summarized one of his positions as "nihil in intellectu nisi prius fuerit in sensu" (Latin for "nothing in the intellect without first being in the senses").

During the Middle Ages Aristotle's theory of tabula rasa was developed by Islamic philosophers, starting with Al-Farabi, developing into an elaborate theory by Avicenna[11] and demonstrated as a thought experiment by Ibn Tufail.[12] For Avicenna (Ibn Sina), for example, the tabula rasa is a pure potentiality that is actualized through education, and knowledge is attained through "empirical familiarity with objects in this world from which one abstracts universal concepts", developed through a "syllogistic method of reasoning in which observations lead to propositional statements which when compounded lead to further abstract concepts." The intellect itself develops from a material intellect (al-'aql al-hayulani), which is a potentiality "that can acquire knowledge to the active intellect (al-'aql al-fa'il), the state of the human intellect in conjunction with the perfect source of knowledge".[11] So the immaterial "active intellect", separate from any individual person, is still essential for understanding to occur.

In the 12th century CE the Andalusian Muslim philosopher and novelist Abu Bakr Ibn Tufail (known as "Abubacer" or "Ebn Tophail" in the West) included the theory of tabula rasa as a thought experiment in his Arabic philosophical novel, Hayy ibn Yaqdhan, in which he depicted the development of the mind of a feral child "from a tabula rasa to that of an adult, in complete isolation from society" on a desert island, through experience alone. The Latin translation of his philosophical novel, entitled Philosophus Autodidactus, published by Edward Pococke the Younger in 1671, had an influence on John Locke's formulation of tabula rasa in An Essay Concerning Human Understanding.[12] A similar Islamic theological novel, Theologus Autodidactus, was written by the Arab theologian and physician Ibn al-Nafis in the 13th century. It also dealt with the theme of empiricism through the story of a feral child on a desert island, but departed from its predecessor by depicting the development of the protagonist's mind through contact with society rather than in isolation from society.[13]

During the 13th century Thomas Aquinas adopted into scholasticism the Aristotelian position that the senses are essential to the mind. Bonaventure (1221–1274), one of Aquinas' strongest intellectual opponents, offered some of the strongest arguments in favour of the Platonic idea of the mind.

Renaissance Italy

In the late Renaissance various writers began to question the medieval and classical understanding of knowledge acquisition in a more fundamental way. In political and historical writing Niccolò Machiavelli and his friend Francesco Guicciardini initiated a new realistic style of writing.
Machiavelli in particular was scornful of writers on politics who judged everything in comparison to mental ideals and demanded that people should study the "effectual truth" instead. Their contemporary, Leonardo da Vinci (1452–1519), said:[14] "If you find from your own experience that something is a fact and it contradicts what some authority has written down, then you must abandon the authority and base your reasoning on your own findings."

The decidedly anti-Aristotelian and anti-clerical music theorist Vincenzo Galilei (c. 1520–1591), father of Galileo and the inventor of monody, made use of the method in successfully solving musical problems: firstly, of tuning, such as the relationship of pitch to string tension and mass in stringed instruments, and to volume of air in wind instruments; and secondly of composition, by his various suggestions to composers in his Dialogo della musica antica e moderna (Florence, 1581). The Italian word he used for "experiment" was esperienza. It is known that he was the essential pedagogical influence upon the young Galileo, his eldest son (cf. Coelho, ed., Music and Science in the Age of Galileo Galilei), arguably one of the most influential empiricists in history. Vincenzo, through his tuning research, found the underlying truth at the heart of the misunderstood myth of "Pythagoras' hammers" (the square of the numbers concerned yielded those musical intervals, not the actual numbers, as believed), and through this and other discoveries that demonstrated the fallibility of traditional authorities, a radically empirical attitude developed, passed on to Galileo, which regarded "experience and demonstration" as the sine qua non of valid rational enquiry.
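The tuning result can be stated compactly using the modern relation for a stretched string (Mersenne's law, formulated after Vincenzo Galilei's experiments and used here only to illustrate the point about squared ratios): for a string of length L and linear density μ under tension T, the frequency f satisfies

$$f = \frac{1}{2L}\sqrt{\frac{T}{\mu}} \qquad\Longrightarrow\qquad \frac{f_1}{f_2} = \sqrt{\frac{T_1}{T_2}} \quad \text{(same length and material)}$$

so raising a string by an octave (frequency ratio 2:1) requires quadrupling its tension, and a perfect fifth (3:2) requires a tension ratio of 9:4. The interval numbers reappear squared in the tensions, which is the discrepancy behind the hammer myth that Galilei's experiments exposed.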
British Empiricism

British empiricism, though it was not a term used at the time, derives from the 17th-century period of early modern philosophy and modern science. The term became useful in order to describe differences perceived between two of its founders: Francis Bacon, described as an empiricist, and René Descartes, who is described as a rationalist. Thomas Hobbes and Baruch Spinoza, in the next generation, are often also described as an empiricist and a rationalist respectively. John Locke, George Berkeley, and David Hume were the primary exponents of empiricism in the 18th-century Enlightenment, with Locke being the person who is normally known as the founder of empiricism as such.

In response to the early-to-mid-17th-century "continental rationalism", John Locke (1632–1704) proposed in An Essay Concerning Human Understanding (1689) a very influential view wherein the only knowledge humans can have is a posteriori, i.e., based upon experience. Locke is famously attributed with holding the proposition that the human mind is a tabula rasa, a "blank tablet", in Locke's words "white paper", on which the experiences derived from sense impressions as a person's life proceeds are written. There are two sources of our ideas: sensation and reflection. In both cases, a distinction is made between simple and complex ideas. The former are unanalysable, and are broken down into primary and secondary qualities. Primary qualities are essential for the object in question to be what it is. Without specific primary qualities, an object would not be what it is. For example, an apple is an apple because of the arrangement of its atomic structure. If an apple were structured differently, it would cease to be an apple. Secondary qualities are the sensory information we can perceive from its primary qualities. For example, an apple can be perceived in various colours, sizes, and textures, but it is still identified as an apple. Therefore its primary qualities dictate what the object essentially is, while its secondary qualities define its attributes. Complex ideas combine simple ones, and divide into substances, modes, and relations. According to Locke, our knowledge of things is a perception of ideas that are in accordance or discordance with each other, which is very different from the quest for certainty of Descartes.

A generation later, the Irish Anglican bishop George Berkeley (1685–1753) determined that Locke's view immediately opened a door that would lead to eventual atheism. In response to Locke, he put forth in his Treatise Concerning the Principles of Human Knowledge (1710) an important challenge to empiricism in which things only exist either as a result of their being perceived, or by virtue of the fact that they are an entity doing the perceiving. (For Berkeley, God fills in for humans by doing the perceiving whenever humans are not around to do it.) In his text Alciphron, Berkeley maintained that any order humans may see in nature is the language or handwriting of God.[15] Berkeley's approach to empiricism would later come to be called subjective idealism.[16][17]

The Scottish philosopher David Hume (1711–1776) responded to Berkeley's criticisms of Locke, as well as other differences between early modern philosophers, and moved empiricism to a new level of skepticism. Hume argued, in keeping with the empiricist view, that all knowledge derives from sense experience, but he accepted that this has implications not normally acceptable to philosophers. He wrote, for example, "Mr. Locke divides all arguments into demonstrative and probable. In this view, we must say, that it is only probable all men must die, or that the sun will rise to-morrow."[18] And, "Mr. Locke, in his chapter of power, says that, finding from experience, that there are several new productions in nature, and concluding that there must somewhere be a power capable of producing them, we arrive at last by this reasoning at the idea of power. But no reasoning can ever give us a new, original, simple idea; as this philosopher himself confesses. This, therefore, can never be the origin of that idea."[19]

Hume divided all of human knowledge into two categories: relations of ideas and matters of fact (see also Kant's analytic-synthetic distinction). Mathematical and logical propositions (e.g. "that the square of the hypotenuse is equal to the sum of the squares of the two sides") are examples of the first, while propositions involving some contingent observation of the world (e.g. "the sun rises in the East") are examples of the second. All of people's "ideas", in turn, are derived from their "impressions". For Hume, an "impression" corresponds roughly with what we call a sensation. To remember or to imagine such impressions is to have an "idea". Ideas are therefore the faint copies of sensations.

Hume maintained that all knowledge, even the most basic beliefs about the natural world, cannot be conclusively established by reason. Rather, he maintained, our beliefs are more a result of accumulated habits, developed in response to accumulated sense experiences. Among his many arguments, Hume also added another important slant to the debate about scientific method: that of the problem of induction.
Hume argued that it requires inductive reasoning to arrive at the premises for the principle of inductive reasoning, and therefore the justification for inductive reasoning is a circular argument.[20] Among Hume's conclusions regarding the problem of induction is that there is no certainty that the future will resemble the past. Thus, as a simple instance posed by Hume, we cannot know with certainty by inductive reasoning that the sun will continue to rise in the East, but instead come to expect it to do so because it has repeatedly done so in the past.[20] Hume concluded that such things as belief in an external world and belief in the existence of the self were not rationally justifiable. According to Hume these beliefs were to be accepted nonetheless because of their profound basis in instinct and custom. Hume's lasting legacy, however, was the doubt that his skeptical arguments cast on the legitimacy of inductive reasoning, allowing many skeptics who followed to cast similar doubt.

Phenomenalism

Most of Hume's followers have disagreed with his conclusion that belief in an external world is rationally unjustifiable, contending that Hume's own principles implicitly contained the rational justification for such a belief, that is, beyond being content to let the issue rest on human instinct, custom and habit.[21] According to an extreme empiricist theory known as phenomenalism, anticipated by the arguments of both Hume and George Berkeley, a physical object is a kind of construction out of our experiences.[22] Phenomenalism is the view that physical objects, properties, and events (whatever is physical) are reducible to mental objects, properties, and events. Ultimately, only mental objects, properties, and events exist; hence the closely related term subjective idealism. By the phenomenalistic line of thinking, to have a visual experience of a real physical thing is to have an experience of a certain kind of group of experiences. This type of set of experiences possesses a constancy and coherence that is lacking in the set of experiences of which hallucinations, for example, are a part. As John Stuart Mill put it in the mid-19th century, matter is the "permanent possibility of sensation".[23]

Mill's empiricism went a significant step beyond Hume in still another respect: in maintaining that induction is necessary for all meaningful knowledge, including mathematics. As summarized by D. W. Hamlyn: "[Mill] claimed that mathematical truths were merely very highly confirmed generalizations from experience; mathematical inference, generally conceived as deductive [and a priori] in nature, Mill set down as founded on induction. Thus, in Mill's philosophy there was no real place for knowledge based on relations of ideas. In his view logical and mathematical necessity is psychological; we are merely unable to conceive any other possibilities than those that logical and mathematical propositions assert. This is perhaps the most extreme version of empiricism known, but it has not found many defenders."[17] Mill's empiricism thus held that knowledge of any kind is not from direct experience but an inductive inference from direct experience.[24] The problems other philosophers have had with Mill's position center around the following issues. Firstly, Mill's formulation encounters difficulty when it describes what direct experience is by differentiating only between actual and possible sensations.
This misses some key discussion concerning the conditions under which such "groups of permanent possibilities of sensation" might exist in the first place. Berkeley put God in that gap; the phenomenalists, including Mill, essentially left the question unanswered. In the end, lacking an acknowledgement of an aspect of "reality" that goes beyond mere "possibilities of sensation", such a position leads to a version of subjective idealism. Questions of how floor beams continue to support a floor while unobserved, how trees continue to grow while unobserved and untouched by human hands, etc., remain unanswered, and perhaps unanswerable in these terms.[17][25] Secondly, Mill's formulation leaves open the unsettling possibility that the "gap-filling entities are purely possibilities and not actualities at all".[25] Thirdly, Mill's position, by calling mathematics merely another species of inductive inference, misapprehends mathematics. It fails to fully consider the structure and method of mathematical science, the products of which are arrived at through an internally consistent deductive set of procedures which do not, either today or at the time Mill wrote, fall under the agreed meaning of induction.[17][25][26]

The phenomenalist phase of post-Humean empiricism ended by the 1940s, for by that time it had become obvious that statements about physical things could not be translated into statements about actual and possible sense data.[27] If a physical-object statement is to be translatable into a sense-data statement, the former must be at least deducible from the latter. But it came to be realized that there is no finite set of statements about actual and possible sense-data from which we can deduce even a single physical-object statement. Remember that the translating or paraphrasing statement must be couched in terms of normal observers in normal conditions of observation. There is, however, no finite set of statements that are couched in purely sensory terms and can express the satisfaction of the condition of the presence of a normal observer. According to phenomenalism, to say that a normal observer is present is to make the hypothetical statement that, were a doctor to inspect the observer, the observer would appear to the doctor to be normal. But, of course, the doctor himself must be a normal observer. If we are to specify this doctor's normality in sensory terms, we must make reference to a second doctor who, when inspecting the sense organs of the first doctor, would himself have to have the sense data a normal observer has when inspecting the sense organs of a subject who is a normal observer. And if we are to specify in sensory terms that the second doctor is a normal observer, we must refer to a third doctor, and so on (also see the third man).[28][29]

Logical Empiricism

Logical empiricism (aka logical positivism or neopositivism) was an early 20th-century attempt to synthesize the essential ideas of British empiricism (e.g. a strong emphasis on sensory experience as the basis for knowledge) with certain insights from mathematical logic that had been developed by Gottlob Frege and Ludwig Wittgenstein. Some of the key figures in this movement were Otto Neurath, Moritz Schlick and the rest of the Vienna Circle, along with A.J. Ayer, Rudolf Carnap and Hans Reichenbach. The neopositivists subscribed to a notion of philosophy as the conceptual clarification of the methods, insights and discoveries of the sciences.
They saw in the logical symbolism elaborated by Frege (d. 1925) and Bertrand Russell (1872–1970) a powerful instrument that could rationally reconstruct all scientific discourse into an ideal, logically perfect language, free of the ambiguities and deformations of natural language, which in their view gave rise to metaphysical pseudoproblems and other conceptual confusions. By combining Frege's thesis that all mathematical truths are logical with the early Wittgenstein's idea that all logical truths are mere linguistic tautologies, they arrived at a twofold classification of all propositions: the analytic (a priori) and the synthetic (a posteriori).[30] On this basis, they formulated a strong principle of demarcation between sentences that have sense and those that do not: the so-called verification principle. Any sentence that is not purely logical, or is unverifiable, is devoid of meaning. As a result, most metaphysical, ethical, aesthetic and other traditional philosophical problems came to be considered pseudoproblems.[31]

In the extreme empiricism of the neopositivists (at least before the 1930s), any genuinely synthetic assertion must be reducible to an ultimate assertion (or set of ultimate assertions) that expresses direct observations or perceptions. In later years, Carnap and Neurath abandoned this sort of phenomenalism in favor of a rational reconstruction of knowledge into the language of an objective spatio-temporal physics. That is, instead of translating sentences about physical objects into sense-data, such sentences were to be translated into so-called protocol sentences, for example, "X at location Y and at time T observes such and such."[32]

The central theses of logical positivism (verificationism, the analytic-synthetic distinction, reductionism, etc.) came under sharp attack after World War II by thinkers such as Nelson Goodman, W.V. Quine, Hilary Putnam, Karl Popper, and Richard Rorty. By the late 1960s, it had become evident to most philosophers that the movement had pretty much run its course, though its influence is still significant among contemporary analytic philosophers such as Michael Dummett and other anti-realists.

Pragmatism

In the late 19th and early 20th century, several forms of pragmatic philosophy arose. The ideas of pragmatism, in its various forms, developed mainly from discussions that took place while Charles Sanders Peirce and William James were both at Harvard in the 1870s. James popularized the term "pragmatism", giving Peirce full credit for its patrimony, but Peirce later demurred from the tangents that the movement was taking, and redubbed what he regarded as the original idea with the name of "pragmaticism". Along with its pragmatic theory of truth, this perspective integrates the basic insights of empirical (experience-based) and rational (concept-based) thinking.

Charles Peirce (1839–1914) was highly influential in laying the groundwork for today's empirical scientific method. Although Peirce severely criticized many elements of Descartes' peculiar brand of rationalism, he did not reject rationalism outright. Indeed, he concurred with the main ideas of rationalism, most importantly the idea that rational concepts can be meaningful and the idea that rational concepts necessarily go beyond the data given by empirical observation. In later years he even emphasized the concept-driven side of the then ongoing debate between strict empiricism and strict rationalism, in part to counterbalance the excesses to which some of his cohorts had taken pragmatism under the "data-driven" strict-empiricist view.
Among Peirce's major contributions was to place inductive reasoning and deductive reasoning in a complementary rather than competitive mode, the latter of which had been the primary trend among the educated since David Hume wrote a century before. To this, Peirce added the concept of abductive reasoning. The combined three forms of reasoning serve as a primary conceptual foundation for the empirically based scientific method today. Peirce's approach "presupposes that (1) the objects of knowledge are real things, (2) the characters (properties) of real things do not depend on our perceptions of them, and (3) everyone who has sufficient experience of real things will agree on the truth about them. According to Peirce's doctrine of fallibilism, the conclusions of science are always tentative. The rationality of the scientific method does not depend on the certainty of its conclusions, but on its self-corrective character: by continued application of the method science can detect and correct its own mistakes, and thus eventually lead to the discovery of truth".[33]

In his Harvard "Lectures on Pragmatism" (1903), Peirce enumerated what he called the "three cotary propositions of pragmatism" (L. cos, cotis, whetstone), saying that they "put the edge on the maxim of pragmatism". First among these he listed the peripatetic-thomist observation mentioned above, but he further observed that this link between sensory perception and intellectual conception is a two-way street. That is, it can be taken to say that whatever we find in the intellect is also incipiently in the senses. Hence, if theories are theory-laden, then so are the senses, and perception itself can be seen as a species of abductive inference, its difference being that it is beyond control and hence beyond critique; in a word, incorrigible. This in no way conflicts with the fallibility and revisability of scientific concepts, since it is only the immediate percept in its unique individuality or "thisness" (what the Scholastics called its haecceity) that stands beyond control and correction. Scientific concepts, on the other hand, are general in nature, and transient sensations do in another sense find correction within them. This notion of perception as abduction has received periodic revivals in artificial intelligence and cognitive science research, most recently for instance with the work of Irvin Rock on indirect perception.[34][35]

Around the beginning of the 20th century, William James (1842–1910) coined the term "radical empiricism" to describe an offshoot of his form of pragmatism, which he argued could be dealt with separately from his pragmatism, though in fact the two concepts are intertwined in James's published lectures. James maintained that the empirically observed "directly apprehended universe needs ... no extraneous trans-empirical connective support",[36] by which he meant to rule out the perception that there can be any value added by seeking supernatural explanations for natural phenomena. James's "radical empiricism" is thus not radical in the context of the term "empiricism", but is instead fairly consistent with the modern use of the term "empirical". (His method of argument in arriving at this view, however, still readily encounters debate within philosophy even today.)

John Dewey (1859–1952) modified James' pragmatism to form a theory known as instrumentalism. The role of sense experience in Dewey's theory is crucial, in that he saw experience as a unified totality of things through which everything else is interrelated.
Dewey's basic thought, in accordance with empiricism, was that reality is determined by past experience. Therefore, humans adapt their past experiences of things to perform experiments upon and test the pragmatic values of such experience. The value of such experience is measured experientially and scientifically, and the results of such tests generate ideas that serve as instruments for future experimentation,[37] in the physical sciences as in ethics.[38] Thus, ideas in Dewey's system retain their empiricist flavour in that they are only known a posteriori.