
Research Article: Philosophical Analysis of the Meaning and Nature of Entropy and Negative Entropy Theories

Kun Wu,1,2 Qiong Nan,2 and Tianqi Wu1,2

1 International Center for Philosophy of Information of Xi'an Jiaotong University, Xi'an 710049, China
2 School of Humanities and Social Science of Xi'an Jiaotong University, Xi'an 710049, China

Correspondence should be addressed to Tianqi Wu; tianqi1262016@126.com

Received 16 June 2020; Revised 20 July 2020; Accepted 29 July 2020; Published 20 August 2020

Academic Editor: Giacomo Innocenti

Copyright © 2020 Kun Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The interpretation of entropy and negative entropy theories on the nature of information is the first scientific paradigm on the interpretation of the nature of information at the beginning of the rise of contemporary information science. The information entropy or negative entropy theories are relative measurements of the structuralization degree of the system at a specific level, which have certain characteristics of relativity, functionality, and quantification. Although the concepts of entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, the meanings of these concepts are unified in essence, and the differences between them are generated when the same kind of concepts is applied to the study of different directions of the same kind of phenomena: either to the static structuralization degree of the system or to the change of the dynamic structuralization degree of the system. Since entropy and negative entropy theories only measure the relationship of structural differences at a specific level of the system from a formal aspect, they are not aimed at the information itself; rather, they are aimed only at the structuralization characteristics of the information carrier itself at a specific level. Because of this, it is impossible to deduce the general nature of information directly from such theories.

1. Introduction

Starting from the works of Claude Elwood Shannon ([1] (pages 379–423)) and Wiener [2] in the middle of the 20th century, the question of what information is has always been a major theoretical issue of concern to philosophers and scientists. According to some relevant statistics, so far, in the fields of information science, systems science, self-organization theory, complexity theory [3], physics, life science, numerous related interdisciplinary science and technology disciplines [4], as well as the related theories of different philosophical schools, no less than a hundred standard and nonstandard formulations on the nature of information have been put forward from different levels and perspectives, but no universally accepted explanation has yet been found.

In truth, if we categorize the existing information concepts into different levels, clarify their origins, and identify their true meanings, then the information concepts that appear divergent on the surface will prove less complicated and diverse.

On this basis, it is not very difficult to reveal the general nature of information.

This paper is one of a series of papers in which we attempt to do so. Its content specifically analyzes the meaning and nature of the entropy and negative entropy theories related to the discussion of the nature of information.

The interpretation of entropy and negative entropy theories on the nature of information is the first scientific paradigm on the interpretation of the nature of information at the beginning of the rise of contemporary information science, a paradigm which started in the 1940s and has developed to this day. Based on the basic ideas and methods of this scientific paradigm, many theoretical results, as well as a series of technical and applied achievements, have been produced.


For example, Arellano-Valle et al. [5] have applied the research method of generalized skew-normal (GSN) negentropy to the complexity research of fish condition factor time series.

Since this interpretation method was proposed in the context of the mutual entanglement and correspondence between the two concepts of entropy and negative entropy, it is necessary to start with the interpretation of these two concepts and their related theories in order to clarify the specific meaning and nature of this interpretation method.

2. Entropy Theory

At the beginning, the concept of entropy was proposed in the interpretation of the second law of thermodynamics. At that time, the concept of entropy was not linked to the concept of information.

2.1. Clausius Entropy and "Principle of Entropy Increase". In 1850, the German physicist Rudolf Clausius proposed the second law of thermodynamics. In 1865, he put forward the concept of "entropy" and accordingly expressed the second law of thermodynamics as "the principle of entropy increase".

Since Clausius only grasped the concept of entropy in the sense of "transformation" and "change" (as can be seen from his intentional choice of the Greek word τροπή, "trope", meaning "transformation", to which he added the prefix "en-" to correspond to the word "energy", constituting "Entropia" with the same meaning as "transformation content" [6]; German: Entropie), he only pointed out the "entropy increase phenomenon" and that the state function of entropy can determine the direction and limitation of a physical process in the macro state. However, the absolute value of the entropy of a physical system, and the more general creative meaning and value of entropy, were not clearly defined and explained by Clausius, which made the Clausius "entropy" a little mysterious and speculative. Accordingly, academic circles also metaphorically call Clausius entropy the "Clausius demon" or "entropy demon".

2.2. Boltzmann's Statistical Physical Entropy. In 1877, the Austrian physicist Ludwig Boltzmann gave a probabilistic explanation of the physical meaning of entropy and the principle of entropy increase, using statistical methods from the perspective of molecular kinematics ([7] (page 34)). He pointed out that an isolated system must evolve from a macro state containing a small number of micro states to a macro state containing a large number of micro states; it must inevitably evolve from a state with an uneven probability distribution of its micro states to a state with an even probability distribution of its micro states.

The outstanding feature of Boltzmann's work is not only that it introduces a probabilistic method, which provides a feasible solution for the calculation of the absolute value of the system entropy, but also that it reveals the general creative significance and value of the concept of entropy through this calculation. That is to say, what entropy describes is not the existing mode and state of the general mass and energy of the system but the mode and state of the organization, matching, and distribution of that mass and energy. This links entropy with the fabric order and order structure of system elements. For a thermodynamic system, entropy is a measurement of the way of distribution, fabric order, and structural arrangement state of the thermodynamic molecules in the system. Likewise, a change in entropy (entropy increase or decrease) implies a change in the mode and status of the composition, matching, and distribution of system elements. Boltzmann's work not only unveiled the mystery of the concept of entropy and the principle of entropy increase but also revealed that it is the introduction of the concept of entropy that shifted the scientific vision from the study of the mass and energy of general objects to the study of the structure, relationship, and evolution direction of general objects. In addition, Boltzmann's work also provided a scientific basis for the generalized development of the entropy concept and entropy theory. If the system elements no longer mean just molecules in the sense of thermodynamics, then the structure and relationship of system elements, and the entropy representing them, can obtain a more universal character. This is why the concept of entropy has remained attractive for more than a hundred years; it has been linked to concepts such as "information" and "negative entropy", has penetrated widely across many disciplines, and has triggered the formation of a series of emerging marginal, interdisciplinary, and comprehensive disciplines.

Boltzmann's statistical entropy clearly overcomes many limitations of Clausius entropy and accordingly presents three advantages. First, it can measure the entropy value of the system itself; second, it directly correlates the entropy value with the number of micro states of the system and their respective probabilities of occurrence, rather than indirectly measuring the entropy change of the system through the intermediate links of heat changes and their effects on the system, as proposed by Clausius entropy. The second advantage also leads to a third, namely, that Boltzmann entropy inherently contains the measurement of the system entropy value formulated by factors other than heat. Because of these three advantages, Boltzmann entropy can be easily extended to generalized entropy. As long as the Boltzmann entropy is not restricted to the fabric mode of the molecular system, it can obtain the broadest universal character.

Since the value of entropy is directly related to the number of micro states of the system and their respective probabilities of occurrence, this means that the more micro states maintaining the macro state of the system and the more even the probabilities of occurrence of the micro states, the greater the entropy of the system. The greater the entropy of the system, the greater the degree of disorder, chaos, and uncertainty of the system structure. Thus, in the most general sense, the entropy value is regarded as a measure of the disorder, chaos, and uncertainty of the system structure.
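As a concrete illustration, the following Python sketch (our own, not from the cited texts; Boltzmann's constant is set to 1 and natural logarithms are assumed) computes the statistical entropy of a distribution over micro states and shows that it is maximal when the micro states are equiprobable:

```python
import math

def statistical_entropy(probs):
    """Gibbs/Boltzmann entropy S = -k * sum(p_i * ln p_i), with k = 1 here.
    For n equiprobable micro states this reduces to S = ln n (Boltzmann's S = k ln W)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

print(statistical_entropy([0.25] * 4))                # ln 4 ≈ 1.386: even distribution, maximal disorder
print(statistical_entropy([0.85, 0.05, 0.05, 0.05]))  # ≈ 0.59: uneven distribution, more ordered
```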

2.3. Shannon's "Entropy of Information Sources". The relevant theory that directly relates the two concepts of entropy and information originated from Shannon's "entropy of information sources" theory, which was proposed more than 80 years after the emergence of Clausius' principle of entropy increase.

Shannon's communication information theory was founded in 1948 under the direct enlightenment of Boltzmann's statistical entropy theory. Shannon used two things from Boltzmann's statistical entropy theory: one is the statistical method, and the other is the entropy formula. In Shannon's view, an information source is a set system capable of generating a set of random messages, each with its own probability of occurrence. Based on this, a mathematical formula for measuring the information quantity generated by the information source was proposed, titled the "entropy of the information source" ([8] (page 8)). Shannon's information theory is actually a theory of information entropy, which can also be seen as a theory of entropy.
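A short Python sketch (our own, with a made-up four-message source) may help fix the idea of the "entropy of the information source" as a probability-weighted measure over the source's message set:

```python
import math

def source_entropy(message_probs):
    """Shannon's 'entropy of the information source':
    H = -sum(P_i * log2 P_i), in bits per message."""
    return -sum(p * math.log2(p) for p in message_probs if p > 0)

# A hypothetical source emitting four messages with these probabilities:
probs = [0.5, 0.25, 0.125, 0.125]
print(source_entropy(probs))  # 1.75 bits: the source's average uncertainty per message
```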

In fact, the calculation of Shannon's information, like the calculation of physical entropy, reveals the mutual formulations and relations between certain macro and micro states. If we consider the aggregate of all the messages that a source may send as a macro performance of the characteristics of the source, then each message event sent by the source constitutes a micro performance corresponding to this macro performance. It is the comprehensive effect of all these micro performances that reflects the macro characteristics of the source. If every distribution of molecules in a physical system is regarded as a possible message event, then entropy becomes a measure of information in the sense of Shannon.

This inherent unity of physical entropy and information entropy illustrates the truth that both entropy and information quantity are measures of a certain state of a material system itself: the generalization of the concept of physical entropy to the field of communication is the uncertainty of the information source, and the embodiment of the concept of information source uncertainty in a molecular physical system is the physical entropy.

In fact, Shannon himself was very clear about the true meaning of the information quantity formula he put forward. He emphasized: "The quantity \(H = -\sum_i P_i \log P_i\) plays an important role in information theory as a measure of information, choice and uncertainty. The formula of H is the same as the so-called formula of entropy in statistical mechanics" ([8] (page 7)). We noticed that many scholars always quote two sentences when evaluating the nature of Shannon's information quantity: "Information is something used to eliminate random uncertainty" and "Information is eliminated uncertainty". It is claimed that both of these sentences were spoken by Shannon himself in his article "A Mathematical Theory of Communication", and these two sentences have now become the classic definition of Shannon's information quantity. However, after verbatim verification, we did not find such a discourse in Shannon's "A Mathematical Theory of Communication" [1]. On the contrary, what we found was that he repeatedly emphasized that what his information quantity measures is the uncertainty of the information generated by the information source, and that it is the "entropy of the information source" [1].

In fact, the relationship between "uncertainty" and "information" in these two sentences can be traced back to the British empiricist school: the philosophers Locke [9] and Hume [10]. Hume once made it clear that more information can be provided by choosing from greater possibilities. It can be said that this is the source of inspiration for the two sentences above.

Warren Weaver (1894–1978), a well-known American scientist and social activist, wrote about Shannon's information quantity with detailed comments in the book "The Mathematical Theory of Communication" [11], coauthored with Shannon in 1949. He emphasized three levels of communication issues: technical issues, semantic issues, and utility issues. He believed that although Shannon's information quantity focuses only on technical issues, this does not mean that the engineering and technical aspects of communication have nothing to do with semantic and utility issues. In his related comments, he particularly emphasized that "information is a measure of one's freedom of choice in selecting a message" ([12] (pages 614–616, 619)). In this way, according to Weaver's evaluation, the interpretation of information quantity in communications cannot be merely limited to the "entropy of the information source", as it can also be related to issues of meaning and utility, as well as the subjective activities of the person's selection and reception of information. From this, we noticed that Shannon emphasized that his information quantity is "a measure of information, choice and uncertainty" and is "how much 'the possibility of choice' is involved in the choice of events", or a way of measuring how uncertain the result of a choice is ([8] (page 7)). The term "choice" has already been used in his theory. Any kind of "choice" cannot be a purely objective activity of the information source itself and cannot be separated from the corresponding activities of the person as the subject. In this regard, in the stipulation of Shannon's information quantity, there are inevitably some factors such as the meaning of the message and its influence on the receiver. Inferring from this, the argument that Shannon's information quantity "is used to eliminate random uncertainty" and "is the eliminated uncertainty" is not completely false, but if these are imposed on him without careful discrimination, it may misrepresent his original intention.

In addition, we also noticed that in Shannon's information theory, the quantitative calculation of information quantity is based on the difference between the probability that a message actually occurs and the probability that it may occur, that is,

\[ H_{\text{Shannon}} = \log\left(\frac{P_2}{P_1}\right) = \log(P_2) - \log(P_1), \tag{1} \]

where \(P_1\) and \(P_2\) are the a priori and a posteriori probabilities, respectively. It can also be concluded from this that Shannon's information quantity is "something used to eliminate random uncertainty" and "the eliminated uncertainty".

According to the above discussion, Shannon's formula of information entropy, \(H = -\sum_i P_i \log P_i\), can be interpreted in multiple senses.

It can be a measure of the randomness of the messages sent by the source; a measure of the a priori uncertainty of the messages generated by the source; a measure of the ability of a source to send information; a measure of the uncertainty of choosing among multiple messages; a measure of the average information quantity (average eliminated uncertainty) carried by each message; or a measure of the average degree to which the information sink's uncertainty is changed.

These meanings can be roughly divided into two categories: those relative to the characteristics of the information source itself, and those relative to the characteristics of the information source changing the state of the information sink. If it is aimed at the characteristics of the information source itself, the information entropy formula can be regarded as a measure of the entropy value of the information source itself. This is what Shannon called "the entropy of the information source" and the "measure of uncertainty" of the information generated by the source. If it is aimed at the characteristics of the information source changing the state of the information sink, it is an inference made by some later scholars based on the possible nature of Shannon's information quantity: that Shannon's information quantity is "something to eliminate random uncertainty" and "is eliminated uncertainty". According to the latter explanation, this kind of information quantity is no longer "entropy" but has the meaning and value of "negative entropy", which is opposite to "entropy" and eliminates "entropy", that is, eliminates uncertainty. It is from this starting point that we can assert that although Shannon's "entropy of information sources" theory measures the uncertainty of the information generated by information sources, this theory has paved the way for related theories using negative entropy to explain information.

3. Negentropy Theory

3.1. Schrödinger's "Life Feeds on Negative Entropy". In the field of general science, the scientist who first proposed the concept of negative entropy in a relationship corresponding to the concept of entropy was Erwin Schrödinger, a well-known Austrian physicist and one of the founders of quantum mechanics [13]. In 1944, in What is Life?, he wrote the famous saying that "life feeds on negative entropy" and considered negative entropy, "entropy with negative sign", to be a measure of order. He wrote: "How does the living organism avoid decline to equilibrium? Obviously, it is by eating, drinking, breathing, and (plant) assimilation. The special term is 'metabolism', meaning change or exchange. Here comes a question: what to exchange? At first it was undoubtedly referring to the exchange of matter. But it is absurd to think that it is essentially the exchange of matter. Any atom of nitrogen, oxygen, sulfur, etc. in the organism is the same as the atom of the same kind in the environment. What advantage can be brought by exchanging them? Later, some people said that we live on energy. In fact, this is ridiculous, because the energy contained in an adult organism is fixed, just like the matter it contains. Since the value of one calorie in the body is the same as one calorie outside the body, it is really hard to understand the usefulness of pure exchange. What precious thing in our food can save us from death? This is easy to answer. A living organism is constantly generating entropy (or it can be said that it is increasing positive entropy) and gradually approaching the dangerous state of maximum entropy, namely, death. The only way to get rid of death, and to live, is to continuously draw negative entropy from the environment. We will soon understand that negative entropy is very positive. Organisms live on negative entropy. Or, to be clearer, the essence of metabolism is to enable an organism to successfully eliminate all the entropy that it has to generate when it is alive. 'Life feeds on negative entropy', just as a living organism attracts a string of negative entropy to offset the increase in entropy it generates in life, so as to maintain itself at a stable and low entropy level. The way an organism stabilizes itself at a highly ordered level (equivalent to a fairly low level of entropy) is indeed to continuously draw order from the surrounding environment. In fact, as far as higher animals are concerned, it is a fact that people have known for a long time that they live entirely on absorbed order: in the matter they take as food, of varying degrees of complexity, the state of matter is extremely orderly. After consuming these foods, animals excrete the greatly degraded things" ([14] (pages 69–70, 72)). From these statements by Schrödinger, we have realized very clearly that organisms do not devour food, moisture, and air for the purpose of obtaining matter and energy. What the organism really needs to absorb from the environment are "negative entropy", "order", "orderliness", "organization", and so on. The concepts of "negative entropy", "order", "orderliness", and "organization" discussed in communication and control theory and in some related theories developed later are interlinked with the functional interpretation of "information" in the most general sense. In this regard, "life feeds on negative entropy" can be interpreted as "life feeds on information".

The negative entropy theory of life proposed by Schrödinger actually opened up a research direction for information theory, which is to study the entropy change of the system when it is open, instead of working only under the condition of isolated systems like the second law of thermodynamics. The openness of the system to the environment, and the fact that the system and the environment maintain a certain degree of exchange of matter, energy, and information, are the basic conditions on which all negative entropy theories are established.

Starting from Schrödinger's work, "negative entropy" acquired the nature of opposition to the concept of "entropy". If entropy describes the degree of disorder, chaos, and uncertainty of a system, then negative entropy describes the degree of order, organization, and certainty. From the perspective of the functionality of relative effect, negative entropy is the elimination of entropy and the elimination of uncertainty. From this, we can more clearly grasp and understand the basic perspective and nature of a series of related formulations and interpretations of the concept of information made in the subsequent development of information science.

3.2. Wiener's "Information Is Negative Entropy". Almost at the same time as Shannon founded his theory of communication information entropy, Wiener, an American mathematician and the founder of cybernetics, also proposed the theory of information negative entropy in the process of establishing cybernetics by integrating the theories of communication and automatic control. In his book "Cybernetics" [2], published in 1948, he independently presented Wiener's formula, which differs by only one minus sign from Shannon's information quantity formula. He wrote: "The information quantity is the negative number of the logarithm of a quantity that can be regarded as a probability, which is essentially negative entropy" ([15] (pages 11, 65)). From this, we can also reasonably answer the question of why Wiener's information formula and Shannon's formula differ by a negative sign: it is because the former measures "negative entropy" while the latter measures "entropy".

Perhaps an analysis from the perspective of the differences in cognitive methods can help us find the root of the difference between the information quantity of Shannon and that of Wiener ([16] (pages 33–34)).

We know that in the field of mechanical communication, the number of message primitives sent by a source and the probability of sending each message are predetermined, and the information sink is fully aware of this determination. The sink's a priori estimation of the uncertainty of the message sent by the source is also derived from this predetermination. In this way, the uncertainty about what kind of message the source sends can be considered both as a feature of the sink's estimation of the information source's state and as a feature of the source itself. The difference of a minus sign between Shannon's and Wiener's information quantity formulas can be regarded as the result of examination from these two different perspectives.

The information quantity of communication can be deduced and calculated according to the principle of relativity of interaction and mutual stipulation between the source and sink. This leads to the stipulations of "a priori probability" and "a posteriori probability".

If the information quantity formula is deduced from the perspective of the state characteristics of the source itself, according to the principle of Shannon, then the contribution of the prior probability to the information quantity is reversed, because it provides the "uncertainty" of the source as estimated by the sink, and its direction is opposite to that of the source's own characteristics. The posterior probability contributes positively to the information quantity, because it provides the information state of the source that actually occurs at the moment, and its direction is consistent with that of the source's own state characteristics. The expression in logarithmic form is

\[ H_{\text{Shannon}} = \log\left(\frac{P_2}{P_1}\right). \tag{2} \]

If, like Wiener, the information formula is derived from the perspective of the sink's understanding of the source, then the contribution of the prior probability to the information quantity is positive and, on the contrary, the contribution of the posterior probability is reversed. Thus, Wiener's formula of information quantity differs by a minus sign from the Shannon formula:

\[ H_{\text{Wiener}} = \log\left(\frac{P_1}{P_2}\right) = \log(P_1) - \log(P_2). \tag{3} \]

The fact that the information quantity formulas of Shannon and Wiener can be deduced from the two opposite angles and directions of the mutual interaction and determination of the information source and sink indicates that the difference of a negative sign between these two formulas has a profound root in epistemology. To a certain extent, this reflects the difference and unity of philosophical ontology and epistemological methods. Regrettably, this has not been clearly noticed in the past.
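To make the sign convention concrete, the following Python sketch (our own illustration, not from the original texts; base-2 logarithms are an assumption, since the formulas above do not fix a base) computes both quantities for a message whose prior probability P1 is updated to the posterior probability P2:

```python
import math

def shannon_quantity(p_prior, p_posterior):
    """Equations (1)/(2): H_Shannon = log2(P2 / P1), from the source's perspective."""
    return math.log2(p_posterior / p_prior)

def wiener_quantity(p_prior, p_posterior):
    """Equation (3): H_Wiener = log2(P1 / P2) = -H_Shannon, from the sink's perspective."""
    return math.log2(p_prior / p_posterior)

# A message first expected with probability 1/8 actually arrives (posterior = 1).
p1, p2 = 1 / 8, 1.0
print(shannon_quantity(p1, p2))  # 3.0 bits
print(wiener_quantity(p1, p2))   # -3.0 bits: same magnitude, opposite sign
```

Whatever base is chosen, the two quantities are exact negatives of each other, which is the "one minus sign" difference discussed above.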

It should be said that Wiener's thinking is the same as Schrödinger's. Schrödinger's negative entropy of life is used to calculate the ability to resist the spontaneous entropy increase in the living body, while Wiener's information quantity is used to calculate the amount of new knowledge brought to the receiver by a message. The two have two basic points in common: ① the system is open, and ② it can eliminate its own chaos by means of the environment. Here, what Wiener's information quantity calculates is exactly what Schrödinger's negative entropy calculates. It is no wonder that Wiener repeatedly emphasized the idea that information quantity is negative entropy. Again, we see that the crux of the problem lies not in the names of the concepts used but in the kind of problems that these concepts are used to explore.

3.3. Negative Entropy Flow of Prigogine. While theories of entropy, information, and negative entropy were being applied and developed in more and more disciplines, classical thermodynamics, which takes entropy theory and the principle of entropy increase as its basic characteristics, was also developing constantly. This development finally broke through the limitations that these basic characteristics had imposed on classical thermodynamics itself.

The Brussels school, represented by the Belgian physicist and chemist Prigogine [17], reunderstood the second law of thermodynamics on the basis of a series of experiments and proposed the famous negative entropy theory of dissipative structure theory in the 1960s [18]. It pointed out that the principle of entropy increase holds only in isolated systems. For an open system, two factors must be considered: the external entropy flow caused by the exchange between the system and the environment, and the entropy generation within the system. Based on this, Prigogine proposed a generalized second law of thermodynamics, which is applicable to both open systems and isolated systems.

Prigogine pointed out that the entropy change of a system is caused by two factors: one is the entropy exchanged between the system and the environment during their interaction (\(d_e S\), the external entropy flow), and the other is the entropy generated spontaneously within the system (\(d_i S\), the internal entropy change).

For an isolated system, since there is no exchange of matter and energy between the system and the environment, there can be no exchange of entropy. Therefore, in an isolated system, \(d_e S = 0\), so \(dS = d_i S \ge 0\). This is the (narrow-sense) second law of thermodynamics proposed by Clausius.

For an open system, because of the exchange of matter and energy between the system and the environment, there is at the same time an exchange of entropy, and the total entropy change is \(dS = d_i S + d_e S\). Therefore, in an open system, the total entropy change of the system will show a complex scenario. When the external entropy flow is negative and its absolute value is greater than that of the internal entropy change, the system will move towards order along the direction of decreasing entropy. It can be said that Clausius's second law of thermodynamics is just a special case of the generalized second law of thermodynamics, holding in isolated systems.
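In code, Prigogine's bookkeeping reduces to a one-line balance. The following Python sketch (our own illustration, with arbitrary numbers) shows how a sufficiently negative external entropy flow yields a net entropy decrease, while an isolated system recovers the classical second law:

```python
def total_entropy_change(d_i, d_e):
    """Generalized balance for an open system: dS = d_iS + d_eS.
    d_iS >= 0 always (internal entropy production); d_eS may take any sign."""
    assert d_i >= 0, "internal entropy production is never negative"
    return d_i + d_e

# Isolated system: d_eS = 0, so dS = d_iS >= 0 (Clausius's narrow-sense law).
print(total_entropy_change(0.3, 0.0))   # 0.3

# Open system with d_eS < 0 and |d_eS| > d_iS: the system moves towards order.
print(total_entropy_change(0.3, -0.5))  # -0.2
```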

It is the generalized second law of thermodynamics proposed by Prigogine that reveals the inevitability of the orderly evolution of a system along the direction of decreasing entropy under a suitable and open background. In dissipative structure theory, the system introduces negative entropy flow from the outside to resist the increase of internal entropy, which is completely consistent with the basic ideas of Schrödinger's "negative entropy theory of life" and Wiener's "negative entropy theory of information". However, dissipative structure theory has extended the functional scope of negative entropy into general physical and chemical systems. The essence of this expansion is to bring the entropy, negative entropy, and information theories into the all-embracing objective world, since every system follows the general laws of physics and chemistry.

4. Philosophical Interpretation of the Significance and Nature of Information Entropy and Information Negative Entropy Theories

In the analysis of the related traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts have the same meaning and mutually formulated properties.

Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the fabric mode of the micro states of a system. It can reveal the degree of disorder of the system's organization from a specific level and angle, as a quantitative measurement at that specific level. Boltzmann's statistical physical entropy and Shannon's "entropy of information sources" are both established in this sense.

As for the concept of negative entropy, it can be formulated in two different senses in the related general theories: one is the degree to which the organization mode of a system's structure deviates from the standard value (the maximum entropy value) relative to the same system, and the other is the degree to which the entropy value of a system decreases in the process of the change of the system's organization mode.

If a formal description of the fabric mode of a system is needed, two quantities are required: the number of possible micro states of the system, and the probability that each micro state may occur. If \(A = \{a_1, a_2, \ldots, a_n\}\) is a set representing the possible micro states of the system and \(P = \{p_1, p_2, \ldots, p_n\}\) is a set representing the probability of occurrence of each micro state, then the organization mode of the formal structure of the system (\(M\)) can be expressed by the matrix

\[ M = \begin{bmatrix} A \\ P \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & a_3 & \cdots & a_n \\ p_1 & p_2 & p_3 & \cdots & p_n \end{bmatrix}. \tag{4} \]

The organization mode of the system's structure described by this matrix may be in one of two extreme circumstances: one is the state of maximum entropy, in which \(p_1 = p_2 = \cdots = p_n = 1/n\) and \(s_{\max} = \log n\); the other is the state of minimum entropy, in which \(p_1 = 1\), \(p_2 = p_3 = \cdots = p_n = 0\), and \(s = \log 1 = 0\).

If we take the case of \(s_{\max}\) as the standard value to which the organization mode of the system's structure is referenced, then every case in which the system entropy is less than \(s_{\max}\) can be regarded as a deviation from this standard value. What causes this deviation? To what extent does it deviate? Obviously, there should be a concept to specify this factor and a calculation to measure the extent of the deviation. A very natural idea is that the effect of this factor is the opposite of entropy, that is, negative entropy, and that the calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we get the following formula for negative entropy ([19] (pages 67–74)):

\[ \text{negative entropy} = s_{\max} - s. \tag{5} \]

Obviously, this formula has two extreme circumstances: when \(s = s_{\max}\), the negative entropy value of the system is 0; when \(s = 0\), the negative entropy value of the system is maximal, equal to \(s_{\max}\).
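The deviation measure of equation (5) is straightforward to compute. The following Python sketch (our own illustration; natural logarithms are an assumption, as the text does not fix a base) reproduces both extreme circumstances:

```python
import math

def entropy(probs):
    """s = -sum(p_i * log p_i); zero-probability micro states contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def negentropy(probs):
    """Equation (5): negative entropy = s_max - s, where s_max = log n."""
    return math.log(len(probs)) - entropy(probs)

print(negentropy([0.25, 0.25, 0.25, 0.25]))  # ≈ 0.0: s = s_max, no deviation
print(negentropy([1.0, 0.0, 0.0, 0.0]))      # ≈ 1.386 = log 4: maximal order
```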

Negative entropy can be specified not only in the sense that the entropy value of the organization mode of a specific system deviates from the standard value (the maximum entropy) but also in the sense that the entropy value decreases in the process of the change of the organization mode of the system. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structure, and so forth are, in essence, all defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease, and entropy (Clausius entropy) as a measure of the degree of entropy increase, are not measures of the system's absolute negative entropy or entropy value; they are measures of some kind of "change" or "transformation", that is, measures of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy", Wiener's "how much new information is given to us by hindsight", or Prigogine's factors that resist the spontaneous entropy increase in the system, all are developed from the perspective of the relative functions that lead to changes in the organization mode (degree of uncertainty) of a system. Just as the entropy increase effect does not simply depend on how much heat is absorbed by the system but also on the relative degree of change that the heat brings to the organization mode of the system's original structure, the entropy decrease effect also does not simply depend on what kind of message the system receives, or what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs, but also on the relative degree of change (in the degree of uncertainty) that the message, mass, or energy brings to the organization mode of the original structure of the system. This brings up a very interesting phenomenon: the same mass or energy, or the same message, acting on systems in different structural states, will play very different roles for different receivers; it may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or not work at all (maintaining the original structural mode unchanged and the original cognitive state unchanged). This is why the Clausius entropy increase formula has 1/T as the integral factor and Wiener's information formula has the prior probability as the reference factor.

Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kinds of concepts are applied to the research of different directions of the same kind of phenomena.

It is reasonable to distinguish the formulations of these concepts into two categories: one is a formulation given in a static sense, and the other is a formulation given in a dynamic sense. In this way, we can clearly see that the ambiguous interpretation of these concepts is often caused by the confusion of these two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the standard entropy value of the system pointed out above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro state of a certain system and the extent to which this degree of indeterminacy deviates from the maximum possible degree of indeterminacy. This can be clearly seen from the previous comparative interpretation of the statistical entropy formula and the information entropy formula, as well as from our explanation of "negative entropy = \(s_{\max} - s\)". Some texts believe that the statistical entropy formula calculates the "entropy (change) of the system in a certain process" while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20] (pages 20–27)); this statement is incorrect. Here we also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the standard value of the system, they are still just quantitative concepts, and none of them can precisely define the general nature of the abstract meaning of entropy, negative entropy, and information. In terms of methodology, the definition of the abstract general nature of such concepts is not a task of these specific sciences but only a philosophical subject. If concepts are to be used accurately, we should replace them with concepts such as the quantity of entropy, the quantity of negative entropy, or the quantity of information.

The dynamic formulations of the concepts of entropy and information are developed in two directions: one is the direction of entropy increase, based on the second law of thermodynamics, and the other is the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of the system.

A very interesting fact is that research on the dynamics of entropy and information came earlier than research on their statics: Clausius was already quantifying entropy changes when people did not yet really understand what entropy was.

Various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine all share a common idea: the system can draw external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of external entropy (information) flow by the amount of change in entropy (information) within the system caused by it. Because the external entropy (information) flow may cause an entropy decrease effect within the system, the quantity of this external entropy (information) flow can be measured by the degree of the entropy decrease effect it causes within the system; moreover, it is simultaneously and relatively defined as negative entropy.

It seems that the function \(-\sum_i P_i \log P_i\) has some unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located. Dynamically, the change of the function value indicates the change of that uncertainty. This change is caused by a change in the value of \(n\), which indicates the number of micro states of the system, and by a change in the probability distribution \(P_i\). In general, an increase in the value of \(n\) and a tendency of the \(P_i\) values towards equalization result in a process of entropy increase, while a decrease in the value of \(n\) and a tendency of the \(P_i\) values towards differentiation result in a process of entropy decrease. As for the general idea that information and entropy are regarded as opposites, it is more like an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about the quantity of entropy, and at the same time we have every reason to regard them as theories about the quantity of information. Based on this, we can establish a kind of generalized entropy theory, or a generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories.

Firstly, the information entropy or negative entropy theories are relative measurements of the structuralization degree at a specific level of the system, and thus have the characteristic of relativity. The information entropy theory measures the relationship between the diversity of structural organization modes and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of structural organization modes and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory, the concept of information is defined and explained in a special sense as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy". However, these two statements only emphasize the role of information for the receiver, which is a functional definition of information from a specific perspective. Such a definition does not reveal what information is; at most, it emphasizes, from a specific perspective, the role of information for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy or negative entropy theories only measure the state of a certain aspect of the system, and the degree of state change in that aspect, by a certain method of calculating amounts. In this regard, the information entropy or negative entropy theories have the property of a specifically defined quantitative characteristic.

Given these characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is, in essence, only a technical quantitative processing method for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that, as early as 1928, Hartley (1890–1970), an American communications expert, pointed out in the article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21] (page 535)). This is also an acquired definition of information recognized and expressed by people in their daily lives and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, and it is formulated in the sense of whether the message can bring new content to the receiver. Obviously, such a definition is also relative and functional and cannot serve as an essential explanation of what information is.

Usually, people regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they did not seriously conduct discrimination and analysis, but arbitrarily extended an explanation that Wiener made only in the sense of the quantitative description of relative, functionalized technical processing to the general universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realized by technology, not to reveal the general nature of information at all. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22] (pages 622–633)).

In fact, Wiener himself was very clear in what sense his "information is negative entropy" was used, because when he put forward this statement, he also made a corresponding discussion of the general nature of information. He has two influential formulations. One is: "Information is information, not matter or energy. No materialism which does not admit this can survive at the present day" ([15] (page 133)). The second is: "Information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ([23] (page 4)).

Obviously, the first sentence of Wiener emphasizes the ontological status of information. Although in this sentence he failed to positively define the nature of information, he correctly emphasized the independent value and significance of information compared with matter (mass) and energy, and he also put forward a warning about those materialist theories that fail to make a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information based on what we "exchange with the outside world". Since it is an "exchange", there should be both in and out. In this way, there is information not only within our subject but also in the external environment. In this regard, the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying that "information is information, not matter or energy", which Wiener emphasizes.

It is regrettable that, for a long time, Wiener's clear warning to philosophy did not attract the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but a unified information science has also not been established, because the establishment of a unified information science must be based on the general theory of the philosophy of information.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are one-sided even in the sense of functional definition. In the real world, the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase uncertainty; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder caused by the disease in his body; but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play the role of eliminating uncertainty or of negative entropy. In some cases, it may play the opposite role, increasing uncertainty or entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II", tells a parable of a terrifying rumor. It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone". Zeng Zi's mother did not believe it and said, "I know my son; he would not kill anyone". She continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone". Zeng Zi's mother still did not believe it, said "He would not kill anyone", and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone", Zeng Zi's mother finally could not sit still; she put down her work and fled across the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Is it entropy increase or entropy decrease? Is uncertainty increased or eliminated?

Also, if we generalize the functional definition of "information is the eliminated uncertainty", we will see some very ridiculous scenarios. In a book published as early as 1987, Wu wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content but cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger', the definition of information cannot be 'eliminated uncertainty'" ([24] (page 8)).

Finally, here comes the most essential aspect that should be emphasized: what entropy and negative entropy theories measure is only the relationship of structural differences at specific levels of the system, from the aspect of form; this does not aim at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such a theory. It is no wonder that some western scholars have clearly and reasonably pointed out that "Information theory deals with the carrier of information, symbols and signals, not information itself" and "Information theory does not deal with the information itself but the carrier of the information" ([25] (page 150)).

Since the calculation of the quantity of entropy andnegative entropy involves the probability distribution of themicro states of the system being measured it is reasonablethat relevant viewpoints such as the degree of orderly ordisorderly organization (order) of the system ldquodegree ofvariationrdquo ldquodifferences and constraintsrdquo ldquosymmetrybreakingrdquo ldquodifference that makes a differencerdquo ldquoform andstructurerdquo and ldquostate of thingsrdquo are directly derived from thetheory of entropy and negative entropy Since related viewssuch as these are directly deduced from the theories aboutthe quantity of entropy and negative entropy it is alsoimpossible to obtain the formulation of the general nature ofinformation through them

Obviously to reveal the essence of information weshould not just focus on the differential relationship of thecarrier forms but we must understand the contents ofrelevant properties characteristics existing modes andstates of the things itself presented by the information

In an article published as early as 1986 Wu wrote thefollowing sentences ldquoinformation is the formulation ofsomething itself displayed in another that alienated bysomething itself it is the indirect existence of somethingitself which exist in other things Information is theformulation of something revealed in the relationship be-tween something and other things Something is informa-tion when it displays itself as internal formulation in anexternal relationship which is expressed in the form ofexternalization of the characteristics of the objectrdquo ([26](page 19))

Based on the content of information and the dynamicmechanism of natural occurrence of information Wu onceclearly defined information as follows ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation of the existing mode and status of matter(direct being)rdquo in a paper entitled ldquoPhilosophical Classifi-cation of Information Formsrdquo which was published in 1984([27] (page 33)) In 2019 Wu expanded the definition ofinformation that was only restricted to the level of philo-sophical ontology based on the historical evolution of in-formation forms classified by him ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation and re-manifestation of the existing modeand status of matter (direct being) as well as the subjectivegrasp and creation of information by the subject of cognitionand practice including the cultural world that has beencreatedrdquo ([28] (page 143))

Complexity 9

)e relevant discussion in this paper was not to negatethe great success of entropy and negative entropy theories inphysics communication and information science andtechnology artificial intelligence science and technology lifescience technology and other related science and technologyfields )e main purpose of the article was to reveal thespecific properties of the entropy and negative entropytheories )at is what those theories reveal are only thequantitative formulations of the static or dynamic relativedifference in the formal structure of the information carrierSuch a provision does not involve the essence of the in-formation itself )is scenario also stipulates many com-parative interpretations of the nature of information basedon entropy and negative entropy theories which are alsoimpossible to guide us to truly grasp and understand thenature of information In addition from the perspective ofmethodology entropy and negative entropy theories focusonly on the relationship between the material structures ofthe information carrier the method used is still that ofdealing with material phenomena and relationships Al-though the corresponding material structure processingmethod is still technically feasible and successful since thematerial relationships between information and its carrierstructure are corresponding to each other it is necessary toemphasize that since the theories and methods of entropyand negative entropy are not directly concerning the exis-tence mode of information itself as well as the meaning andvalue of information to truly reveal the nature of infor-mation and the fundamental difference between it andmaterial phenomena we need to find another way which isthe research level and research method based on a morecomprehensive and general meta science or meta philosophyand focusing on the existence mode of information itself andits meaning and value


For example, Arellano-Valle et al. [5] have applied the research method of generalized skew-normal (GSN) negentropy to the complexity research of fish condition factor time series.

Since this interpretation method was proposed in the context of the mutual entanglement and correspondence between the two concepts of entropy and negative entropy, it is necessary to begin with the interpretation of these two concepts and their related theories in order to clarify the specific meaning and nature of this interpretation method.

2. Entropy Theory

The concept of entropy was first proposed in the interpretation of the second law of thermodynamics. At that time, it was not yet linked to the concept of information.

2.1. Clausius Entropy and the "Principle of Entropy Increase". In 1850, the German physicist Rudolf Clausius proposed the second law of thermodynamics. In 1865, he put forward the concept of "entropy" and accordingly expressed the second law of thermodynamics as "the principle of entropy increase".

Since Clausius grasped the concept of entropy only in the sense of "transformation" and "change" (as can be seen from his intentional choice of the Greek word τροπή, "trope," meaning "transformation," to which he added a prefix to correspond to the word "energy" and form "Entropie" in German, with the meaning of "transformation content" [6]), he pointed out only the "entropy increase phenomenon" and the fact that the state function of entropy can determine the direction and limit of a physical process at the macro level. However, the absolute value of the entropy of a physical system, as well as the more general creative meaning and value of entropy, was not clearly defined or explained by Clausius, which made Clausius's "entropy" somewhat mysterious and speculative. For this reason, academic circles have also metaphorically called Clausius entropy the "Clausius demon" or "entropy demon".

2.2. Boltzmann's Statistical Physical Entropy. In 1877, the Austrian physicist Ludwig Boltzmann gave a probabilistic explanation of the physical meaning of entropy and the principle of entropy increase, using statistical methods from the perspective of molecular kinematics ([7] (page 34)). He pointed out that an isolated system must evolve from a macro state containing a small number of micro states to a macro state containing a large number of micro states; it must inevitably evolve from a state with an uneven probability distribution over its micro states to a state with an even probability distribution over them.

The outstanding feature of Boltzmann's work is not only that it introduces a probabilistic method, which provides a feasible solution for calculating the absolute value of the system entropy, but also that through this calculation it reveals the general creative significance and value of the concept of entropy. That is to say, what entropy describes is not the existing mode and state of the general mass and energy of the system, but the mode and state of the organization, matching, and distribution of that mass and energy. This links entropy with the fabric order and ordered structure of system elements. For a thermodynamic system, entropy is a measurement of the way of distribution, fabric order, and structural arrangement state of the thermodynamic molecules in the system. Likewise, a change in entropy (entropy increase or decrease) implies a change in the mode and status of the composition, matching, and distribution of system elements. Boltzmann's work not only unveiled the mystery of the concept of entropy and the principle of entropy increase but also revealed that it is the introduction of the concept of entropy that shifted the scientific vision from the study of the mass and energy of general objects to the study of their structure, relationships, and direction of evolution. In addition, Boltzmann's work provided a scientific basis for the generalized development of the entropy concept and entropy theory. If the system elements no longer mean just molecules in the thermodynamic sense, then the structure and relationships of system elements, and the entropy representing them, can obtain a more universal character. This is why the concept of entropy has remained attractive for more than a hundred years and has been linked to concepts such as "information" and "negative entropy". It has penetrated widely, spanned many disciplines, and triggered the formation of a series of emerging marginal, interdisciplinary, and comprehensive disciplines.

Boltzmann's statistical entropy clearly overcomes many limitations of Clausius entropy and accordingly presents three advantages. First, it can measure the entropy value of the system itself. Second, it directly correlates the entropy value with the number of micro states of the system and their respective probabilities of occurrence, rather than indirectly measuring the entropy change of the system through the intermediate links of heat changes and their effects on the system, as Clausius entropy does. The second advantage also leads to a third one: Boltzmann entropy inherently contains the measurement of the system entropy value as formulated by factors other than heat. Because of these three advantages, Boltzmann entropy can easily be extended to a generalized entropy. As long as Boltzmann entropy is not restricted to the fabric mode of a molecular system, it can obtain the broadest universal character.

Since the value of entropy is directly related to the number of micro states of the system and their respective probabilities of occurrence, the more micro states maintain the macro state of the system, and the more even the probabilities of occurrence of these micro states, the greater the entropy of the system. The greater the entropy of the system, the greater the degree of disorder, chaos, and uncertainty of the system structure. Thus, in the most general sense, the entropy value is regarded as a measure of the disorder, chaos, and uncertainty of the system structure.
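To make this relationship concrete, the following short numerical sketch (our illustration, not part of the original paper; the probability values are arbitrary) computes the generalized entropy -Σ p_i log p_i in Python for an even and an uneven distribution over the same four micro states; the even distribution yields the maximum value log n, the uneven one a smaller value:

import math

def entropy(p):
    # Generalized (Gibbs/Shannon-style) entropy -sum(p_i * log p_i), in nats.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
even = [1 / n] * n             # all micro states equally probable
uneven = [0.7, 0.1, 0.1, 0.1]  # probability concentrated on one micro state

print(entropy(even))    # 1.386... = log(4), the maximum for n = 4
print(entropy(uneven))  # 0.940..., lower: the system is more ordered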

2.3. Shannon's "Entropy of Information Sources". The relevant theory that directly relates the two concepts of entropy and information originated from Shannon's theory of the "entropy of information sources," which was proposed more than 80 years after the emergence of Clausius's principle of entropy increase.

Shannon's communication information theory was founded in 1948 under the direct enlightenment of Boltzmann's statistical entropy theory. Shannon used two things from Boltzmann's theory: one is the statistical method, and the other is the entropy formula. In Shannon's view, an information source is a set system capable of generating a set of random messages, each with its own probability of occurrence. On this basis, a mathematical formula for measuring the information quantity generated by the information source was proposed and titled the "entropy of the information source" ([8] (page 8)). Shannon's information theory is actually a theory of information entropy, which can also be seen as a theory of entropy.

In fact, the calculation of Shannon's information, like the calculation of physical entropy, reveals the mutual formulations and relations between certain macro and micro states. If we consider the aggregate of all the messages that a source may send as a macro performance of the characteristics of the source, then each message event sent by the source constitutes a micro performance corresponding to this macro performance. It is the comprehensive effect of all these micro performances that reflects the macro characteristics of the source. If every distribution of molecules in a physical system is regarded as a possible message event, then entropy becomes a measure of information in Shannon's sense.

This inherent unity of physical entropy and information entropy illustrates the truth that both entropy and information quantity are measures of a certain state of a material system itself: the popularization of the concept of physical entropy in the field of communication is the uncertainty of the information source, and the embodiment of the concept of information source uncertainty in a molecular physical system is the physical entropy.

In fact, Shannon himself was very clear about the true meaning of the information quantity formula he put forward. He emphasized: "The quantity $H = -\sum_i P_i \log P_i$ plays an important role in information theory as a measure of information, choice and uncertainty. The formula of H is the same as the so-called formula of entropy in statistical mechanics" ([8] (page 7)). We noticed that many scholars always quote two sentences when evaluating the nature of Shannon's information quantity: "Information is something used to eliminate random uncertainty" and "Information is eliminated uncertainty". It is claimed that both of these sentences were spoken by Shannon himself in his article "A Mathematical Theory of Communication," and the two sentences have now become the classic definition of Shannon's information quantity. However, after verbatim verification, we did not find such a discourse in Shannon's "A Mathematical Theory of Communication" [1]. On the contrary, what we found was that he repeatedly emphasized that what his information quantity measures is the uncertainty of the information generated by the information source, that is, the "entropy of the information source" [1].

In fact, the relationship between "uncertainty" and "information" in these two sentences can be traced back to the British empiricist philosophers Locke [9] and Hume [10]. Hume once made it clear that more information can be provided by choosing from among greater possibilities. It can be said that this is the source of inspiration for the two sentences above.

Warren Weaver (1894–1978), a well-known American scientist and social activist, wrote about Shannon's information quantity with detailed comments in the book "The Mathematical Theory of Communication" [11], coauthored with Shannon in 1949. He emphasized three levels of communication issues: technical issues, semantic issues, and utility issues. He believed that although Shannon's information quantity focuses only on technical issues, this does not mean that the engineering and technical aspects of communication have nothing to do with semantic and utility issues. In his related comments, he particularly emphasized that "information is a measure of one's freedom of choice in selecting a message" ([12] (pages 614–616, 619)). In this way, according to Weaver's evaluation, the interpretation of information quantity in communications cannot be limited merely to the "entropy of the information source," as it can also be related to issues of meaning and utility, as well as to the subjective activities of the person selecting and receiving information. From this we noticed that Shannon emphasized that his information quantity is "a measure of information, choice and uncertainty" and concerns "how much 'the possibility of choice' is involved in the choice of events," or finding a way to measure how much uncertainty there is in the outcome of choices ([8] (page 7)). The term "choice" was thus already used in his theory. Any kind of "choice" cannot be a purely objective activity of the information source itself and cannot be separated from the corresponding activities of the person as the subject. In this regard, in the stipulation of Shannon's information quantity, there are inevitably some factors such as the meaning of the message and its influence on the receiver. Inferring from this, the argument that Shannon's information quantity "is used to eliminate random uncertainty" and "is the eliminated uncertainty" is not completely false; but if these claims are imposed on him without a second thought and without differentiation, they may sabotage his original intention.

In addition, we also noticed that in Shannon's information theory, the quantitative calculation of the information quantity is based on the difference between the probability that a message actually occurs and the probability that it may occur; that is,

$$H_{\mathrm{Shannon}} = \log\left(\frac{P_2}{P_1}\right) = \log P_2 - \log P_1, \qquad (1)$$

where $P_1$ and $P_2$ are the a priori and a posteriori probabilities, respectively. It can also be concluded from this that Shannon's information quantity is "something used to eliminate random uncertainty" and "the eliminated uncertainty".
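As a minimal worked example of formula (1) (ours, not the paper's; the probabilities are arbitrary, and the logarithm is taken to base 2 so that the result is in bits), suppose a message identifies with certainty one of eight alternatives that were a priori equally likely:

import math

def information_gain(p_prior, p_posterior):
    # Formula (1): log(P2 / P1), computed here in bits (base-2 logarithm).
    return math.log2(p_posterior / p_prior)

# The a priori probability of the message is 1/8; after reception it is 1.
print(information_gain(1 / 8, 1.0))  # 3.0 bits of uncertainty eliminated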

According to the above discussion, Shannon's formula of information entropy, $H = -\sum_i P_i \log P_i$, can be interpreted in multiple senses.

It can be a measure of the randomness of the messages sent by the source; a measure of the a priori uncertainty of the messages generated by the source; a measure of the ability of a source to send information; a measure of the uncertainty of choosing among multiple messages; a measure of the average information quantity (average eliminated uncertainty) carried by each message; or a measure of the average degree to which the uncertainty of the information sink is changed.

These meanings can be roughly divided into two categories: those relative to the characteristics of the information source itself, and those relative to the source's capacity to change the state of the information sink. If it is aimed at the characteristics of the information source itself, the information entropy formula can be regarded as a measure of the entropy value of the information source itself. This is what Shannon called "the entropy of the information source" and the "measure of uncertainty" of the information generated by the source. If it is aimed at the characteristics of the information source changing the state of the information sink, it is an inference made by some later scholars based on the possible nature of Shannon's information quantity: that Shannon's information quantity is "something to eliminate random uncertainty" and "is eliminated uncertainty". According to the latter explanation, this kind of information quantity is no longer "entropy" but has the meaning and value of "negative entropy," which is opposite to "entropy" and eliminates "entropy," or eliminates uncertainty. It is from this starting point that we can assert that although Shannon's "entropy of information sources" theory measures the uncertainty of information generated by information sources, this theory has paved the way for related theories using negative entropy to explain information.

3. Negentropy Theory

3.1. Schrödinger's "Life Feeds on Negative Entropy". In the field of general science, the scientist who first proposed the concept of negative entropy, in a relationship corresponding to the concept of entropy, was Erwin Schrödinger, the well-known Austrian physicist and one of the founders of quantum mechanics [13]. In 1944, in What is Life?, he wrote the famous saying that "life feeds on negative entropy" and considered negative entropy, "entropy with a negative sign," to be a measure of order. He wrote: "How does the living organism avoid decline to equilibrium? Obviously, it is by eating, drinking, breathing and (plant) assimilation. The special term is 'metabolism', meaning change or exchange. Here comes a question: what to exchange? At first it was undoubtedly referring to the exchange of matter. But it is absurd to think that it's essentially the exchange of matter. Any atom of nitrogen, oxygen, sulfur, etc. in the organism is the same as an atom of the same kind in the environment. What advantage could there be in exchanging them? Later some people said that we live on energy. In fact, this is ridiculous, because the energy contained in an adult organism is fixed, just like the matter it contains. Since the value of one calorie in the body is the same as one calorie outside the body, it is really hard to understand the usefulness of pure exchange. What precious thing in our food can save us from death? This is easy to answer. A living organism is constantly generating entropy (or, it can be said, increasing positive entropy) and gradually approaching the dangerous state of maximum entropy, namely death. The only way to get rid of death, and to live, is to continuously draw negative entropy from the environment. We will soon understand that negative entropy is very positive. Organisms live on negative entropy. Or, to be clearer, the essence of metabolism is to enable an organism to successfully eliminate all the entropy that it has to generate while it is alive. 'Life feeds on negative entropy,' just as a living organism attracts a string of negative entropy to offset the increase in entropy it generates in life, so as to maintain itself at a stable and low entropy level. The way an organism stabilizes itself at a highly ordered level (equivalent to a fairly low level of entropy) is indeed to continuously draw order from the surrounding environment. In fact, as far as higher animals are concerned, it is a fact that people have long known that they live entirely on the order they absorb: in the organisms they take as food, of varying degrees of complexity, the state of matter is extremely orderly. After consuming these foods, animals excrete greatly degraded things" ([14] (pages 69–70, 72)). From these statements by Schrödinger, we can see very clearly that organisms do not devour food, moisture, and air for the purpose of obtaining matter and energy; what an organism really needs to absorb from the environment is "negative entropy," "order," "orderliness," "organization," and so on. The concepts of "negative entropy," "order," "orderliness," and "organization" discussed in communication theory, control theory, and some related theories developed later are interlinked with the functional interpretation of "information" in its most general sense. In this regard, "life feeds on negative entropy" can be interpreted as "life feeds on information".

The negative entropy theory of life proposed by Schrödinger actually opened up a new research direction for information theory, which is to study the entropy change of a system when it is open, instead of working only under the condition of isolated systems, as the second law of thermodynamics does. The openness of the system to the environment, and the fact that the system and the environment maintain a certain degree of exchange of matter, energy, and information, are the basic conditions on which all negative entropy theories are established.

Starting from Schrödinger's work, "negative entropy" acquired the character of opposing the concept of "entropy". If entropy describes the degree of disorder, chaos, and uncertainty of a system, then negative entropy describes its degree of order, organization, and certainty. From the perspective of the functionality of relative effect, negative entropy is the elimination of entropy and the elimination of uncertainty. From this we can more clearly grasp and understand the basic perspective and nature of a series of related formulations and interpretations of the concept of information made in the subsequent development of information science.

3.2. Wiener's "Information Is Negative Entropy". Almost at the same time as Shannon founded his theory of communication information entropy, Wiener, the American mathematician and founder of cybernetics, also proposed the theory of information as negative entropy in the process of establishing cybernetics by integrating the theories of communication and automatic control. In his book Cybernetics [2], published in 1948, he independently presented Wiener's formula, which is only one minus sign away from Shannon's information quantity formula. He wrote: "The information quantity is the negative number of the logarithm of a quantity that can be regarded as a probability, which is essentially negative entropy" ([15] (pages 11, 65)). From this we can also reasonably answer the question: why do Wiener's information formula and Shannon's formula differ by a negative sign? It is because the former measures "negative entropy" while the latter measures "entropy".

Perhaps an analysis from the perspective of the differences in cognitive methods can help us find the root cause of the difference between the information quantities of Shannon and Wiener ([16] (pages 33–34)).

We know that in the field of mechanical communication, the number of message primitives sent by a source and the probability of sending each message are predetermined, and the information sink is fully aware of this determination. The a priori estimation by the sink of the uncertainty of the message sent by the source is also derived from this predetermination. In this way, the uncertainty about what kind of message the source sends can be considered both as a feature of the sink's estimation of the information source's state and as a feature of the source itself. The difference of a minus sign between Shannon's and Wiener's information quantity formulas can be regarded as the result of examination from these two different perspectives.

The information quantity of communication can be deduced and calculated according to the principle of the relativity of interaction and the mutual stipulation between the source and the sink. This leads to the stipulations of "a priori probability" and "a posteriori probability".

If the information quantity formula is deduced from the perspective of the state characteristics of the source itself, according to Shannon's principle, then the contribution of the prior probability to the information quantity is reversed, because it provides the "uncertainty" of the source as estimated by the sink, and its direction is opposite to the direction of the source's own characteristics. The posterior probability contributes positively to the information quantity, because it provides the information state of the source that actually occurs at the moment, and its direction is consistent with the direction of the source's own state characteristics. The expression as a logarithmic function is

$$H_{\mathrm{Shannon}} = \log\left(\frac{P_2}{P_1}\right). \qquad (2)$$

If, like Wiener, the information formula is derived from the perspective of the sink's understanding of the source, then the contribution of the prior probability to the information quantity is positive and, on the contrary, the contribution of the posterior probability is reversed. Thus, Wiener's formula of information quantity will differ from Shannon's formula by a minus sign:

$$H_{\mathrm{Wiener}} = \log\left(\frac{P_1}{P_2}\right) = \log P_1 - \log P_2. \qquad (3)$$

The fact that the information quantity formulas of Shannon and Wiener can be deduced from the two opposite angles and directions of the mutual interaction and determination of the information source and sink indicates that the difference of a negative sign between these two formulas has a profound root in epistemology. To a certain extent, this reflects the difference and unity of philosophical ontology and epistemological methods. Regrettably, this has not been clearly noticed in the past.
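A small numerical sketch (again our illustration, under the same prior/posterior reading and with arbitrary probabilities) shows that formulas (2) and (3), computed from the same pair of probabilities, differ only in sign:

import math

def h_shannon(p1, p2):
    # Formula (2): log(P2 / P1), from the standpoint of the source itself.
    return math.log2(p2 / p1)

def h_wiener(p1, p2):
    # Formula (3): log(P1 / P2), from the standpoint of the sink's knowledge.
    return math.log2(p1 / p2)

p1, p2 = 0.25, 1.0        # a priori and a posteriori probabilities of a message
print(h_shannon(p1, p2))  #  2.0
print(h_wiener(p1, p2))   # -2.0: the same magnitude with the opposite sign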

It should be said that Wiener's thinking is the same as Schrödinger's. Schrödinger's negative entropy of life is used to calculate the ability to resist the spontaneous entropy increase in the living body, while Wiener's information quantity is used to calculate the amount of new knowledge brought to the receiver by a message. Both share two basic points: ① the system is open, and ② it can eliminate its own chaos by means of the environment. Here, what Wiener's information quantity calculates is exactly what Schrödinger's negative entropy calculates. It is no wonder that Wiener repeatedly emphasized the idea that the information quantity is negative entropy. Again we see that the crux of the problem lies not in the names of the concepts used but in the kind of problems that these concepts are used to explore.

3.3. Prigogine's Negative Entropy Flow. While theories of entropy, information, and negative entropy were being applied and developed in more and more disciplines, classical thermodynamics, whose basic characteristics are entropy theory and the principle of entropy increase, was also developing constantly. This development finally broke through the limitations that those basic characteristics had imposed on classical thermodynamics itself.

The Brussels school, represented by the Belgian physicist and chemist Prigogine [17], reinterpreted the second law of thermodynamics on the basis of a series of experiments and, in the 1960s, proposed the famous negative entropy theory of dissipative structure theory [18]. It pointed out that the principle of entropy increase holds only in isolated systems. For an open system, two factors must be considered: the external entropy flow caused by the exchange between the system and the environment, and the entropy generation within the system. On this basis, Prigogine proposed a generalized second law of thermodynamics that is applicable to both open and isolated systems.

Prigogine pointed out that the entropy change of a system is caused by two factors: one is the entropy exchanged between the system and the environment during their interaction ($d_eS$, the external entropy flow), and the other is the entropy generated spontaneously within the system ($d_iS$, the internal entropy change).

For an isolated system, since there is no exchange of matter and energy between the system and the environment, there can be no exchange of entropy. Therefore, in an isolated system, $d_eS = 0$, so $dS = d_iS \ge 0$. This is the second law of thermodynamics (in the narrow sense) proposed by Clausius.

For an open system, there is at the same time an exchange of entropy, because of the exchange of matter and energy between the system and the environment. Therefore, in an open system, the total entropy change of the system, $dS = d_eS + d_iS$, will show a complex scenario. When the external entropy flow is negative and its absolute value is greater than the absolute value of the internal entropy change, the system will move towards order along the direction of decreasing entropy. It can be said that Clausius's second law of thermodynamics is just the special case of the generalized second law of thermodynamics for an isolated system.

It is the generalized second law of thermodynamics proposed by Prigogine that reveals the inevitability of an orderly evolution of a system along the direction of decreasing entropy under a suitable and open background. In dissipative structure theory, the system introduces negative entropy flow from the outside to resist the increase of internal entropy, which is completely consistent with the basic ideas of Schrödinger's "negative entropy theory of life" and Wiener's "negative entropy theory of information". However, dissipative structure theory has extended the scope of the function of negative entropy into general physical and chemical systems. The essence of this expansion is to bring the entropy, negative entropy, and information theories into the all-embracing objective world, since every system follows the general laws of physics and chemistry.
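The bookkeeping of this generalized second law can be sketched as follows (our illustration; the numerical values of the external entropy flow deS and the internal entropy production diS are arbitrary):

def total_entropy_change(deS, diS):
    # Generalized second law: dS = deS + diS, where diS >= 0 always holds.
    assert diS >= 0, "internal entropy production is never negative"
    return deS + diS

print(total_entropy_change(0.0, 0.5))   # isolated system: dS = diS >= 0
print(total_entropy_change(0.3, 0.5))   # open system, positive inflow: dS > 0
print(total_entropy_change(-0.8, 0.5))  # open system, |deS| > diS: dS < 0,
                                        # the system moves towards order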

4. Philosophical Interpretation of the Significance and Nature of Information Entropy and Information Negative Entropy Theories

In the related traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts are unified in meaning and have mutually formulated properties.

Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the fabric mode of the micro states of a system. It can reveal the degree of disorder of the system's organization from a specific level and angle, as a quantitative measurement at that specific level. Boltzmann's statistical physical entropy and Shannon's "entropy of information sources" are both established in this sense.

As for the concept of negative entropy, it can be formulated in two different senses in the related general theories: one is the degree to which the organization mode of a system's structure deviates from a standard value (the maximum entropy value) relative to the same system, and the other is the degree to which the entropy value of a system decreases in the process of change of the system's organization mode.

If a formal description of the fabric mode of a system is needed, two quantities are required: one is the number of possible micro states of the system, and the other is the probability that each micro state may occur. If $A = \{a_1, a_2, \ldots, a_n\}$ is a set representing the possible micro states of the system, and $P = \{p_1, p_2, \ldots, p_n\}$ is a set representing the probabilities of occurrence of these micro states, then the organization mode of the formal structure of the system ($M$) can be expressed by a matrix as follows:

$$M = \begin{bmatrix} A \\ P \end{bmatrix} = \begin{bmatrix} a_1 & a_2 & a_3 & \cdots & a_n \\ p_1 & p_2 & p_3 & \cdots & p_n \end{bmatrix}. \qquad (4)$$

The organization mode of the system's structure described by this matrix may be in one of two extreme circumstances: one is the state of maximum entropy, in which case $p_1 = p_2 = \cdots = p_n = 1/n$ and $s_{\max} = \log n$; the other is the state of minimum entropy, in which case $p_1 = 1$, $p_2 = p_3 = \cdots = p_n = 0$, and $s = \log 1 = 0$.

If we take the case of $s_{\max}$ as the standard value to which the organization mode of the system's structure should be referred, then all cases where the system entropy is less than $s_{\max}$ can be regarded as deviations from this standard value. What causes this deviation? To what extent does it deviate? Obviously, there should be a concept to specify this factor, and there should be a calculation to measure the extent of this deviation. A very natural idea is that the effect of this factor is the opposite of entropy, which is negative entropy, and that this calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we can obtain the following calculation formula for negative entropy ([19] (pages 67–74)):

$$\text{negative entropy} = s_{\max} - s. \qquad (5)$$

Obviously, this formula has two extreme circumstances: one is that when $s = s_{\max}$, the negative entropy value of the system is 0; the other is that when $s = 0$, the negative entropy value of the system is at its maximum, which is equal to $s_{\max}$.
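Combining the matrix description (4) with formula (5), a brief sketch (ours; the distributions are arbitrary examples) computes the actual entropy s, the standard value s_max = log n, and the negative entropy as their difference, reproducing the two extreme circumstances just described:

import math

def entropy(p):
    # Actual entropy s = -sum(p_i * log p_i) of the fabric mode, in nats.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def negentropy(p):
    # Formula (5): negative entropy = s_max - s, with s_max = log n.
    return math.log(len(p)) - entropy(p)

print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0: s = s_max
print(negentropy([1.0, 0.0, 0.0, 0.0]))      # 1.386... = log(4): the maximum
print(negentropy([0.7, 0.1, 0.1, 0.1]))      # 0.445...: an intermediate deviation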

Negative entropy can be specified not only in the sense that the entropy value of the organization mode of a specific system deviates from the standard value (the maximum entropy) but also in the sense that the entropy value decreases in the process of change of the organization mode of the system. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structures, and so forth are, in essence, all defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease, and entropy (Clausius entropy) as a measure of the degree of entropy increase, are not measures of the system's absolute negative entropy or entropy value; they are measures of a kind of "change" or "transformation," that is, measures of relative quantity. Schrödinger's "life feeds on negative entropy," Wiener's "how much new information is given to us by hindsight," and Prigogine's factors that resist the spontaneous entropy increase in the system are all developed from the perspective of the relative functions that lead to changes in the organization mode (degree of uncertainty) of the system. Just as the entropy increase effect does not simply depend on how much heat is absorbed by the system but also on the relative degree of change that the heat brings to the organization mode of the system's original structure, so the entropy decrease effect does not simply depend on what kind of message the system receives, or what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs, but also on the relative degree of change (in the degree of uncertainty) that the message, mass, or energy brings to the organization mode of the original structure of the system. This brings up a very interesting phenomenon: the same mass or energy, or the same message, acting on systems in different structural states, will play very different roles for different receivers; it may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or not work at all (maintaining the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy increase formula has 1/T as its integral factor and Wiener's information formula has the prior probability as its reference factor.

Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kind of concepts is applied to the study of different directions of the same kind of phenomena.

It is reasonable to distinguish the formulations of these concepts into two categories: one given in a static sense and the other given in a dynamic sense. In this way, we can clearly see that ambiguous interpretations of these concepts are often caused by the confusion of these two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the standard entropy value of the system mentioned above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro states of a certain system and the extent to which this degree of indeterminacy deviates from the maximum possible degree of indeterminacy. This can be clearly seen from the previous comparative interpretation of the statistical entropy formula and the information entropy formula, as well as from our explanation of "negative entropy $= s_{\max} - s$". Some texts believe that the statistical entropy formula calculates the "entropy (change) of the system in a certain process," while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20] (pages 20–27)); this statement is incorrect. Here we also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the standard value of the system, they are all still merely quantitative concepts, and none of them can precisely define the abstract general nature of entropy, negative entropy, and information. In terms of methodology, the definition of the abstract general nature of such concepts is not a task for these specific sciences but only a philosophical subject. If concepts are to be used accurately, we should instead speak of the quantity of entropy, of negative entropy, or of information.

The dynamic formulations of the concepts of entropy and information have developed in two directions: one is the direction of entropy increase, based on the second law of thermodynamics; the other is the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of the system.

A very interesting fact is that research on the dynamics of entropy and information came earlier than research on its statics: Clausius was already quantifying the changes of entropy when people did not yet really understand what entropy was.

The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine all share a common idea: the system can draw an external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of external entropy (information) flow by the amount of change in entropy (information) within the system that it causes. Because the external entropy (information) flow may cause an entropy decrease effect within the system, the quantity of this external flow can be measured by the degree of the entropy decrease effect it causes within the system; moreover, it is simultaneously, and relatively, defined as negative entropy.

It seems that the function $\sum_i P_i \log P_i$ has a unique value. In the static sense, its absolute value indicates the uncertainty of the micro states in which the system may be located. In the dynamic sense, the change of the function's value indicates the change of that uncertainty. This change is caused by a change in the value of $n$, which indicates the number of micro states of the system, and by a change in the probability distribution $P_i$. In general, an increase in the value of $n$ and a tendency of the $P_i$ values towards equilibrium result in a process of entropy increase, while a decrease in the value of $n$ and a tendency of the $P_i$ values towards differentiation result in a process of entropy decrease. As for the general idea that information and entropy are regarded as opposites, it is more of an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are all measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about the quantity of entropy, and at the same time we have every reason to regard them as theories about the quantity of information. On this basis, we can establish a kind of generalized entropy theory, or generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories.

Firstly, the information entropy or negative entropy theories are relative measurements of the degree of structuralization at a specific level of a system, and so have a certain characteristic of relativity. Information entropy theory measures the relationship between the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of structural organization methods and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory the concept of information is defined and explained, in a special sense, as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy". However, these two statements emphasize only the role of information for the receiver; they are functional definitions of information from a specific perspective. Such a definition does not reveal what information is; at most, it emphasizes, from a specific perspective, the role of information for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy or negative entropy theories only measure, by a certain method of calculating amounts, the state of a certain aspect of the system and the degree of change of that state. In this regard, the information entropy or negative entropy theories have the property of a specifically defined quantitative characteristic.

In view of these characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is in essence only a technical, quantitative processing method for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that as early as 1928, Hartley (1890–1970), an American communications expert, pointed out in the article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21] (page 535)). This is also the acquired definition of information recognized and expressed by people in their daily lives and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, and it is formulated in the sense of whether a message can bring new content to the receiver. Obviously, such a definition is also relative and functional and cannot serve as an essential explanation of what information is.

Usually, people regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously conducted discrimination and analysis but have extended, at will, an explanation that Wiener made only in the sense of a relative, functional, quantitative description for technical processing to the general, universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realized by technology, not to reveal the general nature of information at all. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22] (pages 622–633)).

In fact, Wiener himself was very clear about the sense in which his "information is negative entropy" is used, because when he put forward this statement he also made a corresponding discussion of the general nature of information. He has two influential formulations. One is: "Information is information, not matter or energy. No materialism which does not admit this can survive in the present day" ([15] (page 133)). The second is: "Information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ([23] (page 4)).

Obviously, Wiener's first sentence emphasizes the ontological status of information. Although in this sentence he failed to define the nature of information correctly from the positive side, he correctly emphasized the independent value and significance of information compared with matter (mass) and energy, and he also put forward a warning about materialist theories that fail to give a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world". Since it is an "exchange," there should be both in and out. In this way, there is information not only within our subject but also in the external environment. In this regard, the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying that "information is information, not matter or energy," which Wiener emphasizes.

It is regrettable that for a long time Wiener's clear warning to philosophy did not attract the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but a unified information science has also not been established, because the establishment of a unified information science must be based on the general theory of the philosophy of information.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are single-faceted even in the sense of a functional definition. In the real world, the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase it; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder caused by the disease in his body; but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play the role of eliminating uncertainty or of negative entropy. In some cases, it may play the opposite role, which is to increase uncertainty or entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II," tells the parable of a "terrifying rumor". It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother did not believe it and said, "I know my son; he will not kill people," and continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother still did not believe it, said, "He won't kill anyone," and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone," Zeng Zi's mother finally could not sit still; she put down her work and fled across the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Is it entropy increase or decrease? Is uncertainty increased or eliminated?

Also, if we generalize the functional definition of "information is the eliminated uncertainty," we will see some very ridiculous scenarios. In a book published as early as 1987, Wu once wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content, but cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger,' the definition of information cannot be 'eliminated uncertainty'" ([24] (page 8)).

Finally here comes the most essential aspect that shouldbe emphasized that is the relationship of structural dif-ferences at specific levels of the system measured only fromthe aspect of the form which does not aim at the infor-mation itself but merely aims at the structural characteristicsof the information carrier itself Because of this it is im-possible to deduce the general nature of information directlyfrom such a theory It is no wonder that some westernscholars have clearly and reasonably pointed out that ldquoIn-formation theory deals with the carrier of informationsymbols and signals not information itselfrdquo and ldquoInfor-mation theory does not deal with the information itself butthe carrier of the informationrdquo ([25] (page 150))

Since the calculation of the quantity of entropy andnegative entropy involves the probability distribution of themicro states of the system being measured it is reasonablethat relevant viewpoints such as the degree of orderly ordisorderly organization (order) of the system ldquodegree ofvariationrdquo ldquodifferences and constraintsrdquo ldquosymmetrybreakingrdquo ldquodifference that makes a differencerdquo ldquoform andstructurerdquo and ldquostate of thingsrdquo are directly derived from thetheory of entropy and negative entropy Since related viewssuch as these are directly deduced from the theories aboutthe quantity of entropy and negative entropy it is alsoimpossible to obtain the formulation of the general nature ofinformation through them

Obviously to reveal the essence of information weshould not just focus on the differential relationship of thecarrier forms but we must understand the contents ofrelevant properties characteristics existing modes andstates of the things itself presented by the information

In an article published as early as 1986 Wu wrote thefollowing sentences ldquoinformation is the formulation ofsomething itself displayed in another that alienated bysomething itself it is the indirect existence of somethingitself which exist in other things Information is theformulation of something revealed in the relationship be-tween something and other things Something is informa-tion when it displays itself as internal formulation in anexternal relationship which is expressed in the form ofexternalization of the characteristics of the objectrdquo ([26](page 19))

Based on the content of information and the dynamicmechanism of natural occurrence of information Wu onceclearly defined information as follows ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation of the existing mode and status of matter(direct being)rdquo in a paper entitled ldquoPhilosophical Classifi-cation of Information Formsrdquo which was published in 1984([27] (page 33)) In 2019 Wu expanded the definition ofinformation that was only restricted to the level of philo-sophical ontology based on the historical evolution of in-formation forms classified by him ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation and re-manifestation of the existing modeand status of matter (direct being) as well as the subjectivegrasp and creation of information by the subject of cognitionand practice including the cultural world that has beencreatedrdquo ([28] (page 143))

Complexity 9

)e relevant discussion in this paper was not to negatethe great success of entropy and negative entropy theories inphysics communication and information science andtechnology artificial intelligence science and technology lifescience technology and other related science and technologyfields )e main purpose of the article was to reveal thespecific properties of the entropy and negative entropytheories )at is what those theories reveal are only thequantitative formulations of the static or dynamic relativedifference in the formal structure of the information carrierSuch a provision does not involve the essence of the in-formation itself )is scenario also stipulates many com-parative interpretations of the nature of information basedon entropy and negative entropy theories which are alsoimpossible to guide us to truly grasp and understand thenature of information In addition from the perspective ofmethodology entropy and negative entropy theories focusonly on the relationship between the material structures ofthe information carrier the method used is still that ofdealing with material phenomena and relationships Al-though the corresponding material structure processingmethod is still technically feasible and successful since thematerial relationships between information and its carrierstructure are corresponding to each other it is necessary toemphasize that since the theories and methods of entropyand negative entropy are not directly concerning the exis-tence mode of information itself as well as the meaning andvalue of information to truly reveal the nature of infor-mation and the fundamental difference between it andmaterial phenomena we need to find another way which isthe research level and research method based on a morecomprehensive and general meta science or meta philosophyand focusing on the existence mode of information itself andits meaning and value

Data Availability

)e data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

)e authors declare no conflicts of interest

Authorsrsquo Contributions

KunWu is the host of the project Qiong Nan and TianqiWuare both participants in the project

Acknowledgments

)is article was funded by a major project of the NationalSocial Science Foundation of China )e History PresentSituation and Future of Information Philosophy (ProjectApproval no 18ZDA027)

References

[1] C Shannon ldquo)e mathematical theory of communicationrdquoBellsystem Technical Journal vol 27 pp P379ndashP423 1948

[2] NWiener Cybernetics Or Control and Communication in theAnimal and the Machine Technology PressJohn Wiley ampSons New York NY USA 1948

[3] B Castellani and R Rajaram ldquoPast the power law complexsystems and the limiting law of restricted diversityrdquo Com-plexity vol 21 no S2 pp 99ndash112 2016

[4] J E Contreras-Reyes ldquoRenyi entropy and complexity mea-sure for skew-Gaussian distributions and related familiesrdquoPhysica A Statistical Mechanics and its Applications vol 433pp 84ndash91 2015

[5] R Arellano-Valle J Contreras-Reyes and M StehlıkldquoGeneralized skew-normal negentropy and its application tofish condition factor time seriesrdquo Entropy vol 19 no 10p 528 2017

[6] R Clausius ldquoUber verschiedene fur die Anwendung bequemeFormen der Hauptgleichungen der mechanischenWarmetheorierdquo Abhandlungen Uber Die MechanischeWarmetheorie vol 2 p 34 1867

[7] L BoltzmannVorlesungen uber Gastheorie vol II Leipzig JABarth translated together with Volume I by SG BrushUniversity of California Press Berkeley CA USA Lectureson Gas )eory University of California Press Berkeley CAUSA 1964

[8] C Shannon Fe Mathematical Feory of Communication inFeoretical Foundations of Information Feory ShanghaiScience and Technology Compilation Museum ShanghaiChina 1965

[9] J Locke An Essay Concerning Human UnderstandingW Yolton Ed Dutton New York NY USA 1961

[10] D Hume A Treatise of Human Nature L A Selby-Bigge EdClarendon Press Oxford UK 1896

[11] C Shannon and W Weaver Fe Mathematical Feory ofCommunication University of Illinois Press Urbana IL USA1949

[12] P Yuanzheng and L Jianhua Selected Compilation of ClassicalDocuments of System Feory Classical Documents of SystemFeory Cybernetics and Information Feory vol 614ndash616Beijing Pragmatic Press Beijing China 1989

[13] E SchrodingerWhat is Life Fe Physical Aspect of the LivingCell Cambridge University Press Cambridge UK 1994

[14] E SchrodingerWhat is life L Laiou and L Liaofu Eds vol69ndash70 Changsha Hunan Science and Technology PressChangsha China 2003

[15] N Wiener Cybernetics H Jiren Ed Beijing Science PressBeijing China 1963

[16] K Wu ldquo)e difference and unity of information quantity for-mulas of Shannon and Wiener from the perspective of phi-losophyrdquoYanji Journal of YanbianUniversity vol Z1 pp 33-341987

[17] I Prigogine Etude Fermodynamique des Phenomenes Irre-versibles Dunod Paris France 1947

[18] I Prigogine and I Stengers Order Out of Chaos Manrsquos NewDialogue with Nature Bantam New York NY USA 1984

[19] K Wu ldquoAnalysis of the scientific meaning of several conceptsrelated to entropyrdquo Journal of Dialectics of Nature vol 5pp 67ndash74 1996

[20] C Tielin Entropy Increase and Negative Entropy Increase andProposal of Fe Law of Conservation of Entropy QuantityBeijing Studies in Dialectics of Nature Beijing China 1992

[21] R V L Hartley ldquoTransmission of informationrdquo Bell SystemTechnical Journal vol 535 no 7 1928

[22] S Brier ldquoFinding an information concept suited for a uni-versal theory of informationrdquo Progress in Biophysics andMolecular Biology vol 119 no 3 pp 622ndash633 2015

10 Complexity

[23] N Wiener Selected Works of Wiener Z Ren Ed ShanghaiShanghai Translation Press Shanghai China 1978

[24] K Wu and L Qi Introduction to Philosophical InformationShaanxi Shaanxi Peoplersquos Press Shaanxi China 1987

[25] L Floridi Guide to the Philosophy of Computing and Infor-mation (I) L Gang Ed Beijing Commercial Press BeijingChina 2010

[26] K Wu ldquoOn in-itself informationrdquo Shanghai AcademicMonthly vol 19 1986

[27] K Wu ldquoPhilosophical classification of information formsrdquoBeijing Potential Science vol 33 no 3 1984

[28] K Wu W Jian and W Tianqi An Introduction to thePhilosophy of Information Xirsquoan Jiaotong University PressXirsquoan China 2019

Complexity 11

Shannon's communication information theory was founded in 1948, some 80 years after the emergence of Clausius's principle of entropy increase, under the direct enlightenment of Boltzmann's statistical entropy theory. Shannon took two things from Boltzmann's statistical entropy theory: the statistical method and the entropy formula. In Shannon's view, an information source is a set system capable of generating a set of random messages, each with its own probability of occurrence. On this basis, he proposed a mathematical formula for measuring the information quantity generated by the source and titled it the "entropy of the information source" ([8] (page 8)). Shannon's information theory is actually a theory of information entropy, which can equally be seen as a theory of entropy.

In fact, the calculation of Shannon's information, like the calculation of physical entropy, reveals the mutual formulations and relations between certain macro and micro states. If we regard the aggregate of all the messages that a source may send as a macro performance of the characteristics of the source, then each message event sent by the source constitutes a micro performance corresponding to this macro performance. It is the comprehensive effect of all these micro performances that reflects the macro characteristics of the source. If every distribution of molecules in the physical system is regarded as a possible message event, then entropy becomes a measure of information in the sense of Shannon.

This inherent unity of physical entropy and information entropy illustrates a truth: both entropy and information quantity are measures of a certain state of a material system itself. The generalization of the concept of physical entropy to the field of communication yields the uncertainty of the information source, and the embodiment of the concept of source uncertainty in a molecular-physical system is physical entropy.

In fact, Shannon himself was very clear about the true meaning of the information quantity formula he put forward. He emphasized: "The quantity H = −ΣPi log Pi plays an important role in information theory as a measure of information, choice and uncertainty. The formula of H is the same as the so-called formula of entropy in statistical mechanics" ([8] (page 7)). We noticed that many scholars always quote two sentences when evaluating the nature of Shannon's information quantity: "Information is something used to eliminate random uncertainty" and "Information is eliminated uncertainty". It is claimed that both of these sentences were spoken by Shannon himself in his article "A Mathematical Theory of Communication", and these two sentences have now become the classic definition of Shannon's information quantity. However, after verbatim verification, we did not find such a discourse in Shannon's "A Mathematical Theory of Communication" [1]. On the contrary, what we found was that he repeatedly emphasized that what his information quantity measures is the uncertainty of the information generated by the information source; it is the "entropy of the information source" [1].
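To make the quantity concrete, here is a minimal Python sketch (our illustration, not part of the original text; the message probabilities are hypothetical) of the source entropy H = −ΣPi log2 Pi:

```python
import math

def source_entropy(probabilities):
    """Shannon's 'entropy of the information source': H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical source emitting four messages with unequal probabilities.
print(source_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
print(source_entropy([0.25] * 4))                 # 2.0 bits: the uniform, maximum-entropy case
```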

In fact, the relationship between "uncertainty" and "information" in these two sentences can be traced back to the British empiricist school: the philosophers Locke [9] and Hume [10]. Hume once made it clear that more information can be provided by choosing from among greater possibilities. It can be said that this is the source of inspiration for the two sentences above.

Warren Weaver (1894–1978), a well-known American scientist and social activist, wrote about Shannon's information quantity with detailed comments in the book "The Mathematical Theory of Communication" [11], coauthored with Shannon in 1949. He emphasized three levels of communication issues: technical issues, semantic issues, and utility issues. He believed that although Shannon's information quantity focuses only on technical issues, this does not mean that the engineering and technical aspects of communication have nothing to do with semantic and utility issues. In his comments he particularly emphasized that "information is a measure of one's freedom of choice in selecting a message" ([12] (pages 614–616, 619)). In this way, according to Weaver's evaluation, the interpretation of information quantity in communications cannot be merely limited to the "entropy of the information source", as it can also be related to issues of meaning and utility, as well as to the subjective activities of the person selecting and receiving information. From this we notice that Shannon emphasized that his information quantity is "a measure of information, choice and uncertainty", measuring how much "possibility of choice" is involved in the choice of events, or seeking a way to measure the uncertainty of the outcome of choices ([8] (page 7)). The term "choice" was thus already in use in his theory. Any kind of "choice" cannot be a purely objective activity of the information source itself and cannot be separated from the corresponding activities of the person as the subject. In this regard, the stipulation of Shannon's information quantity inevitably involves factors such as the meaning of the message and its influence on the receiver. Inferring from this, the argument that Shannon's information quantity "is used to eliminate random uncertainty" and "is the eliminated uncertainty" is not completely false; but if these claims are imposed on him without a second thought and without differentiation, they may sabotage his original intention.

In addition, we also noticed that in Shannon's information theory, the quantitative calculation of information quantity is based on the difference between the probability that a message actually occurs and the probability that it may occur; that is,

H_Shannon = log(P2/P1) = log(P2) − log(P1),  (1)

where P1 and P2 are the a priori and a posteriori probabilities, respectively. It can also be concluded from this that Shannon's information quantity is "something used to eliminate random uncertainty" and "the eliminated uncertainty".
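A hedged worked example of formula (1) (ours; the numbers are hypothetical): suppose a source can emit eight equiprobable messages, so before reception a given message has a priori probability P1 = 1/8, and after reception its a posteriori probability is P2 = 1. Then

$$H_{\mathrm{Shannon}} = \log_2\frac{P_2}{P_1} = \log_2\frac{1}{1/8} = 3\ \text{bits},$$

that is, three binary units of uncertainty are eliminated by receiving the message.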

According to the above discussion, Shannon's formula of information entropy, H = −ΣPi log Pi, can be interpreted in multiple senses. It can be a measure of the randomness of the message sent by the source; a measure of the a priori uncertainty of the message generated by the source; a measure of the ability of a source to send information; a measure of the uncertainty of choosing among multiple messages; a measure of the average information quantity (average eliminated uncertainty) carried by each message; or a measure of the average degree to which the uncertainty of the information sink is changed.

These meanings can be roughly divided into two categories: those relative to the characteristics of the information source itself, and those relative to how the information source changes the state of the information sink. If it is aimed at the characteristics of the information source itself, the information entropy formula can be regarded as a measure of the entropy value of the source itself. This is what Shannon called "the entropy of the information source" and the "measure of uncertainty" of the information generated by the source. If it is aimed at how the information source changes the state of the information sink, it is an inference made by some later scholars from the possible nature of Shannon's information quantity: that Shannon's information quantity is "something to eliminate random uncertainty" and "is eliminated uncertainty". According to the latter explanation, this kind of information quantity is no longer "entropy" but has the meaning and value of "negative entropy", which is opposite to "entropy" and eliminates "entropy", that is, eliminates uncertainty. It is from this starting point that we can assert that although Shannon's "entropy of the information source" theory measures the uncertainty of the information generated by the source, it paved the way for the later theories that use negative entropy to explain information.

3. Negentropy Theory

3.1. Schrödinger's "Life Feeds on Negative Entropy". In the field of general science, the scientist who first proposed the concept of negative entropy in correspondence with the concept of entropy was Erwin Schrödinger, the well-known Austrian physicist and one of the founders of quantum mechanics [13]. In 1944, he wrote in What is Life? the famous saying that "life feeds on negative entropy" and considered negative entropy, "entropy with a negative sign", to be a measure of order. He wrote: "How does the living organism avoid decline to equilibrium? Obviously, it is by eating, drinking, breathing and (in the case of plants) assimilation. The special term is 'metabolism', meaning change or exchange. Here comes a question: what to exchange? At first it was undoubtedly taken to mean the exchange of matter. But it is absurd to think that it is essentially the exchange of matter. Any atom of nitrogen, oxygen, sulfur, etc. in the organism is the same as an atom of the same kind in the environment; what advantage could there be in exchanging them? Later some people said that we live on energy. In fact, this is ridiculous, because the energy contained in an adult organism is as fixed as the matter it contains. Since the value of one calorie in the body is the same as that of one calorie outside the body, it is really hard to understand the usefulness of pure exchange. What precious thing in our food, then, can save us from death? This is easy to answer. A living organism is constantly generating entropy (or, one can say, increasing positive entropy) and thus gradually approaching the dangerous state of maximum entropy, namely death. The only way to escape death, to stay alive, is to continuously draw negative entropy from the environment. We will soon understand that negative entropy is something very positive. Organisms live on negative entropy. Or, to be clearer, the essence of metabolism is that the organism succeeds in ridding itself of all the entropy it has to generate while alive. 'Life feeds on negative entropy', just as a living organism attracts a stream of negative entropy to offset the entropy increase it generates in life, so as to maintain itself at a stable and low entropy level. The way an organism stabilizes itself at a highly ordered level (equivalent to a fairly low level of entropy) is indeed to continuously draw order from the surrounding environment. In fact, as far as higher animals are concerned, it has long been known that they live entirely on absorbed order: in the foodstuffs they take in, matter of varying degrees of complexity is in an extremely orderly state. After consuming these foods, animals excrete greatly degraded things" ([14] (pages 69–70, 72)). From these statements by Schrödinger we see very clearly that organisms do not devour food, moisture and air for the purpose of obtaining matter and energy; what the organism really needs to absorb from the environment is "negative entropy": "order", "orderliness", "organization", and so on. The concepts of "negative entropy", "order", "orderliness" and "organization" discussed in communication theory, control theory and some later related theories are interlinked with the functional interpretation of "information" in the most general sense. In this regard, "life feeds on negative entropy" can be interpreted as "life feeds on information".

The negative entropy theory of life proposed by Schrödinger actually opened up a new research direction for information theory: to study the entropy change of a system that is open, instead of working only under the condition of isolated systems, as the second law of thermodynamics does. The openness of the system to the environment, and the fact that system and environment maintain a certain exchange of matter, energy and information, are the basic conditions on which all negative entropy theories are established.

Starting from Schrödinger's work, "negative entropy" acquired its character as the opposite of the concept of "entropy". If entropy describes the degree of disorder, chaos and uncertainty of a system, then negative entropy describes its degree of order, organization and certainty. From the perspective of relative functional effect, negative entropy is the elimination of entropy, the elimination of uncertainty. From this we can more clearly grasp and understand the basic perspective and nature of a series of related formulations and interpretations of the concept of information made in the subsequent development of information science.

3.2. Wiener's "Information Is Negative Entropy". Almost at the same time as Shannon founded his theory of communication information entropy, Wiener, the American mathematician and founder of cybernetics, proposed the theory of information as negative entropy in the process of establishing cybernetics by integrating the theory of communication and automatic control. In his book Cybernetics [2], published in 1948, he independently presented Wiener's formula, which differs from Shannon's information quantity formula by only a minus sign. He wrote: "The information quantity is the negative of the logarithm of a quantity that can be regarded as a probability, which is essentially negative entropy" ([15] (pages 11, 65)). From this we can also reasonably answer the question: why do Wiener's information formula and Shannon's formula differ by a minus sign? It is because the former measures "negative entropy" while the latter measures "entropy".

Perhaps an analysis from the perspective of the differences in cognitive methods can help us find the root of the difference between the information quantity of Shannon and that of Wiener ([16] (pages 33–34)).

We know that in the field of mechanical communication, the number of message primitives a source can send and the probability of sending each message are predetermined, and the information sink is fully aware of this determination. The sink's a priori estimate of the uncertainty of the message sent by the source is also derived from this predetermination. In this way, the uncertainty about which message the source sends can be considered both as a feature of the sink's estimation of the state of the source and as a feature of the source itself. The minus-sign difference between Shannon's and Wiener's information quantity formulas can be regarded as the result of examining these two different perspectives.

The information quantity of communication can be deduced and calculated according to the principle of the relativity of interaction and mutual stipulation between source and sink. This leads to the stipulations of "a priori probability" and "a posteriori probability".

If the information quantity formula is deduced from the perspective of the state characteristics of the source itself, according to Shannon's principle, then the contribution of the prior probability to the information quantity is reversed, because it provides the "uncertainty" of the source as estimated by the sink, and its direction is opposite to that of the source's own characteristics. The posterior probability contributes positively to the information quantity, because it provides the state of the source that actually occurs at the moment, and its direction is consistent with that of the source's own state characteristics. Expressed as a logarithmic function, this is

H_Shannon = log(P2/P1).  (2)

If, like Wiener, the information formula is derived from the perspective of the sink's understanding of the source, then the contribution of the prior probability to the information quantity is positive and, on the contrary, the contribution of the posterior probability is reversed. Thus Wiener's formula for the information quantity differs from Shannon's formula by a minus sign:

H_Wiener = log(P1/P2) = log(P1) − log(P2).  (3)
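A minimal numerical sketch (our own, with hypothetical probabilities) of how formulas (2) and (3) differ only by sign:

```python
import math

def h_shannon(p1, p2):
    """Formula (2): log of the posterior/prior probability ratio."""
    return math.log2(p2 / p1)

def h_wiener(p1, p2):
    """Formula (3): the inverted ratio, i.e., the negative-entropy reading."""
    return math.log2(p1 / p2)

p1, p2 = 1 / 8, 1.0  # a priori and a posteriori probabilities of one message
print(h_shannon(p1, p2))  # 3.0
print(h_wiener(p1, p2))   # -3.0: equal magnitude, opposite sign
```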

The fact that the information quantity formulas of Shannon and Wiener can be deduced from the two opposite angles of the mutual interaction and mutual determination of source and sink indicates that the minus-sign difference between these two formulas has a profound root in epistemology. To a certain extent, it reflects the difference and unity of philosophical ontology and epistemological method. Regrettably, this has not been clearly noticed in the past.

It should be said that Wiener's thinking is the same as Schrödinger's. Schrödinger's negative entropy of life is used to calculate the ability to resist the spontaneous entropy increase within the living body, while Wiener's information quantity is used to calculate the amount of new knowledge a message brings to the receiver. The two share two basic points: ① the system is open, and ② its own chaos can be eliminated by the environment. Here, what Wiener's information quantity calculates is exactly what Schrödinger's negative entropy calculates. It is no wonder that Wiener repeatedly emphasized the idea that information quantity is negative entropy. Again we see that the crux of the problem lies not in the names of the concepts used but in the kind of problems these concepts are used to explore.

3.3. Negative Entropy Flow of Prigogine. While theories of entropy, information and negative entropy were being applied and developed in more and more disciplines, classical thermodynamics, which takes entropy theory and the entropy increase principle as its basic characteristics, was also developing constantly. This development finally broke through the limitations that its own basic characteristics had imposed on classical thermodynamics.

The Brussels school, represented by the Belgian physicist and chemist Prigogine [17], reinterpreted the second law of thermodynamics on the basis of a series of experiments and proposed the famous negative entropy theory of dissipative structures in the 1960s [18]. It pointed out that the principle of entropy increase holds only in isolated systems. For an open system, two factors must be considered: the external entropy flow caused by the exchange between the system and the environment, and the entropy generated within the system. On this basis, Prigogine proposed a generalized second law of thermodynamics applicable to both open and isolated systems.

Prigogine pointed out that the entropy change of a system is caused by two factors: the entropy exchanged between the system and the environment during their interaction (deS, the external entropy flow) and the entropy generated spontaneously within the system (diS, the internal entropy change).

For an isolated system, since there is no exchange of matter and energy between the system and the environment, there can be no exchange of entropy. Therefore, in an isolated system, deS = 0, so dS = diS ≥ 0. This is the second law of thermodynamics in the narrow sense, as proposed by Clausius.

For an open system, because of the exchange of matter and energy between the system and the environment, there is at the same time an exchange of entropy. Therefore, in an open system, the total entropy change presents a complex scenario. When the external entropy flow is negative and its absolute value is greater than the absolute value of the internal entropy change, the system will move toward order in the direction of decreasing entropy. It can be said that Clausius's second law of thermodynamics is just the special case of the generalized second law of thermodynamics for an isolated system.
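In compact notation (our summary of the statements above, using the standard symbols deS and diS):

$$dS = d_eS + d_iS, \qquad d_iS \ge 0.$$

For an isolated system, $d_eS = 0$, so $dS \ge 0$ (the Clausius case); for an open system with $d_eS < 0$ and $|d_eS| > d_iS$, we get $dS < 0$, and the system evolves toward order.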

It is the generalized second law of thermodynamics proposed by Prigogine that reveals the inevitability of the orderly evolution of a system in the direction of decreasing entropy under a suitable open background. In dissipative structure theory, the system introduces negative entropy flow from outside to resist the increase of internal entropy, which is completely consistent with the basic ideas of Schrödinger's "negative entropy theory of life" and Wiener's "negative entropy theory of information". However, dissipative structure theory extended the scope of negative entropy to general physical and chemical systems. The essence of this expansion is to bring the theories of entropy, negative entropy and information into the all-embracing objective world, since every system follows the general laws of physics and chemistry.

4. Philosophical Interpretation of the Significance and Nature of Information Entropy and Information Negative Entropy Theories

In the traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts have interlinked meanings and mutually stipulating properties.

Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the organization mode of the micro states of a system. It can reveal, from a specific level and angle, and through quantitative measurement at that level, the degree of disorder of the system's organization. Boltzmann's statistical physical entropy and Shannon's "entropy of the information source" are both established in this sense.

As for the concept of negative entropy, it can be formulated in two different senses in the related general theories: one is the degree to which the organization mode of a system's structure deviates from a standard value (the maximum entropy value) for that same system, and the other is the degree to which the entropy value of a system decreases in the process of change of its organization mode.

If a formal description of the organization mode of a system is needed, two quantities are required: the number of possible micro states of the system, and the probability with which each micro state may occur. If A = {a1, a2, ..., an} is the set of possible micro states of the system and P = {p1, p2, ..., pn} is the set of probabilities of occurrence of each micro state, then the organization mode of the formal structure of the system (M) can be expressed by a matrix as follows:

M = [A; P] = [a1 a2 a3 ... an; p1 p2 p3 ... pn],  (4)

where the first row lists the micro states and the second row their probabilities.

The organization mode of the system's structure described by this matrix may lie at two extremes: one is the state of maximum entropy, in which case p1 = p2 = ... = pn = 1/n and smax = log n; the other is the state of minimum entropy, in which case p1 = 1, p2 = p3 = ... = pn = 0, and s = log 1 = 0.

If we take the case of smax as the standard value to which the organization mode of the system's structure is referred, then every case in which the system's entropy is less than smax can be regarded as a deviation from this standard value. What causes this deviation, and to what extent does it occur? Obviously, there should be a concept to designate this factor and a calculation to measure the extent of the deviation. A very natural idea is that the effect of this factor is the opposite of entropy, namely negative entropy, and that the calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we obtain the following formula for negative entropy ([19] (pages 67–74)):

negative entropy = smax − s.  (5)

Obviously, this formula has two extreme cases: when s = smax, the negative entropy value of the system is 0; when s = 0, the negative entropy value of the system is maximal and equal to smax.
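A small Python sketch of formula (5) and its two extreme cases (our illustration; the distributions are hypothetical):

```python
import math

def entropy(probs):
    """s = -sum(p * log2(p)); the base of the logarithm only fixes the unit."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Formula (5): deviation of the actual entropy s from s_max = log2(n)."""
    return math.log2(len(probs)) - entropy(probs)

print(negentropy([0.25] * 4))            # 0.0: uniform distribution, s = s_max
print(negentropy([1.0, 0.0, 0.0, 0.0]))  # 2.0: fully ordered, s = 0, negentropy = s_max
print(negentropy([0.7, 0.1, 0.1, 0.1]))  # ~0.64: between the two extremes
```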

Negative entropy can be specified not only in the sense that the entropy value of the organization mode of a specific system deviates from the standard value (the maximum entropy) but also in the sense that the entropy value decreases in the process of change of the system's organization mode. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structures, and so forth are all, in essence, defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease, and entropy (Clausius entropy) as a measure of the degree of entropy increase, are not measures of the system's absolute negative entropy or entropy value; they are measures of a certain kind of "change" or "transformation", that is, measures of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy", Wiener's "how much new information hindsight gives us", or Prigogine's factors that resist the spontaneous entropy increase within the system, all are developed from the perspective of the relative effects that lead to changes in the organization mode (degree of uncertainty) of the system. Just as the entropy increase effect does not simply depend on how much heat the system absorbs but also on the relative degree of change the heat brings to the organization mode of the system's original structure, so the entropy decrease effect does not simply depend on what kind of message the system receives, or what kind of mass or energy with a certain entropy or negative entropy value the system absorbs, but also on the relative degree of change (in the degree of uncertainty) that the message, mass or energy brings to the organization mode of the system's original structure. This brings up a very interesting phenomenon: the same mass or energy, or the same message, acting on systems in different structural states, will play very different roles for different receivers; it may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or have no effect at all (leaving the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy increase formula has 1/T as its integrating factor and Wiener's information formula has the prior probability as its reference factor.

Although concepts such as entropy, negative entropy, information, entropy increase and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kind of concepts is applied to the study of different directions of the same kind of phenomena.

It is reasonable to divide the formulations of these concepts into two categories: formulations given in a static sense and formulations given in a dynamic sense. In this way we can clearly see that ambiguous interpretations of these concepts are often caused by confusing these two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the system's standard entropy value discussed above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of such a formulation is to calculate the degree of indeterminacy (uncertainty) of the micro state of a certain system, and the extent to which this degree of indeterminacy deviates from the maximum possible degree of indeterminacy. This can be clearly seen from the earlier comparative interpretation of the statistical entropy formula and the information entropy formula, as well as from our explanation of "negative entropy = smax − s". Some texts hold that the statistical entropy formula calculates "the entropy (change) of the system in a certain process" while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20] (pages 20–27)); this statement is incorrect. Here we also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the system's standard value, each is still just a quantitative concept, and none of them can precisely define the general nature, in the abstract sense, of entropy, negative entropy and information. In terms of methodology, defining the abstract general nature of such concepts is not a task for these specific sciences but a philosophical subject. If concepts are to be used accurately, we should speak instead of the quantity of entropy, of negative entropy, or of information.

The dynamic formulations of the concepts of entropy and information are developed in two directions: the direction of entropy increase, based on the second law of thermodynamics, and the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of the system.

A very interesting fact is that research on the dynamics of entropy and information came earlier than research on their statics: Clausius was already quantifying entropy changes when people did not yet really understand what entropy was.

The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the other direction, opposite to the second law of thermodynamics. Schrödinger, Wiener and Prigogine share a common idea: the system can take in an external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of external entropy (information) flow by the amount of change in entropy (information) within the system that it causes. Because the external entropy (information) flow may produce an entropy decrease effect within the system, the quantity of this external flow can be measured by the degree of the entropy decrease effect it causes, and it is at the same time, and relatively, defined as negative entropy.

It seems that the function H = −ΣPi log Pi has a unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located; dynamically, the change of the function value indicates the change of that uncertainty. This change is caused by a change in the value of n, the number of micro states of the system, and by a change in the probability distribution Pi. In general, an increase in the value of n and a tendency of the Pi values toward equilibrium result in a process of entropy increase, while a decrease in the value of n and a tendency of the Pi values toward differentiation result in a process of entropy decrease. As for the general idea that information and entropy are opposites, it is more of an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are all measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information and negative entropy as theories about entropy quantity, and at the same time every reason to regard them as theories about information quantity. On this basis, we can establish a kind of generalized entropy theory, or generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.
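A brief sketch (ours; the distributions are made up) of the dynamic reading just described: the value of H rises as the Pi tend toward equilibrium or as n grows, and falls as the Pi differentiate:

```python
import math

def H(probs):
    """H = -sum(p * log2(p)): uncertainty of the system's micro state."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(H([0.9, 0.05, 0.05]))  # ~0.57 bits: differentiated Pi, low uncertainty
print(H([1/3, 1/3, 1/3]))    # ~1.58 bits: equilibrium Pi, maximum for n = 3
print(H([0.25] * 4))         # 2.0 bits: n enlarged to 4, entropy increases further
```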

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy and information negative entropy theories.

Firstly, the information entropy and negative entropy theories are relative measurements of the degree of structuralization at a specific level of the system, and thus have a certain character of relativity. The information entropy theory measures the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of structural organization modes and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory the concept of information is defined and explained, in a special sense, as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy". However, these two statements only emphasize the role of information for the receiver, which is a functional definition of information from a specific perspective. Such a definition does not reveal what information is; at most it emphasizes, from a specific perspective, the role of information for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy and negative entropy theories only measure the state of a certain aspect of the system, and the degree of change of that state, by a certain method of quantitative calculation. In this regard, the information entropy and negative entropy theories have a specifically defined quantitative character.

In terms of these characteristics of relativity, functionality and quantification, the theory of information entropy or negative entropy is, in essence, only a technical method of quantitative processing for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that as early as 1928, Hartley (1890–1970), an American communications expert, pointed out in the article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21] (page 535)). This is also the received definition of information recognized and expressed by people in daily life and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, being formulated in terms of whether a message can bring new content to the receiver. Equally obviously, such a definition is also relative and functional and cannot serve as an essential explanation of what information is.

People usually regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously undertaken discrimination and analysis, but have extended at will an explanation Wiener made only as a relative, functional, quantitative description for technical processing to the general universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realizable by technology, not to reveal the general nature of information at all. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22] (pages 622–633)).

In fact, Wiener himself was very clear in what sense his "information is negative entropy" was to be used, because when he put forward this statement he also made a corresponding discussion of the general nature of information. He has two influential formulations. One is: "Information is information, not matter or energy. No materialism which does not admit this can survive at the present day" ([15] (page 133)). The second is: "Information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ([23] (page 4)).

Obviously, Wiener's first sentence emphasizes the ontological status of information. Although in this sentence he failed to define the nature of information correctly from the positive side, he correctly emphasized the independent value and significance of information as compared with matter (mass) and energy, and he issued a warning to any materialism that fails to give a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world". Since it is "exchange", there must be both in and out; in this way, there is information not only within our subject but also in the external environment. In this regard, the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying Wiener emphasizes, that "information is information, not matter or energy".

It is regrettable that for a long time Wiener's clear warning to philosophy did not attract the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but unified information science has not been established either, because the establishment of a unified information science must be based on a general theory of information philosophy.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are single-faceted even as functional definitions, because in the real world the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase it; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder the disease has caused in his body; but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play the role of eliminating uncertainty, of negative entropy. In some cases it may play the opposite role, increasing uncertainty, that is, entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II", tells a parable of a terrifying rumor. It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone". Zeng Zi's mother did not believe it, saying, "I know my son; he would not kill anyone", and continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone". Zeng Zi's mother still did not believe it, said, "He would not kill anyone", and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone", Zeng Zi's mother finally could not sit still; she put down her work and fled over the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Does it increase or decrease entropy? Is uncertainty increased or eliminated?

Also, if we generalize the functional definition "information is the eliminated uncertainty", we arrive at some very ridiculous scenarios. In a book published as early as 1987, Wu wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content and cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger', the definition of information cannot be 'eliminated uncertainty'" ([24] (page 8)).

Finally, we come to the most essential aspect to be emphasized: what these theories measure, from the aspect of form alone, is the relationship of structural differences at specific levels of the system; they are aimed not at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such theories. It is no wonder that some western scholars have clearly and reasonably pointed out that "information theory deals with the carrier of information, symbols and signals, not information itself" and that "information theory does not deal with the information itself but with the carrier of the information" ([25] (page 150)).

Since the calculation of the quantity of entropy and negative entropy involves the probability distribution of the micro states of the system being measured, it is natural that related viewpoints, such as the degree of orderly or disorderly organization (order) of the system, "degree of variation", "differences and constraints", "symmetry breaking", "difference that makes a difference", "form and structure", and "state of things", are directly derived from the theory of entropy and negative entropy. And since such views are directly deduced from theories about the quantity of entropy and negative entropy, it is likewise impossible to obtain a formulation of the general nature of information through them.

Obviously, to reveal the essence of information, we should not focus merely on the differential relationships of carrier forms; we must grasp the contents presented by the information: the relevant properties, characteristics, modes of existence and states of the things themselves.

In an article published as early as 1986, Wu wrote: "Information is the formulation of something itself displayed in another, alienated from the thing itself; it is the indirect existence of something itself, which exists in other things. Information is the formulation of something revealed in the relationship between it and other things. Something is information when it displays itself as internal formulation in an external relationship, which is expressed in the form of the externalization of the characteristics of the object" ([26] (page 19)).

Based on the content of information and the dynamic mechanism of its natural occurrence, Wu clearly defined information in a paper entitled "Philosophical Classification of Information Forms", published in 1984: "Information is a philosophical category indicating indirect being. It is the self-manifestation of the existing mode and status of matter (direct being)" ([27] (page 33)). In 2019, Wu expanded this definition, which had been restricted to the level of philosophical ontology, on the basis of his classification of the historical evolution of information forms: "Information is a philosophical category indicating indirect being. It is the self-manifestation and re-manifestation of the existing mode and status of matter (direct being), as well as the subjective grasp and creation of information by the subject of cognition and practice, including the cultural world that has been created" ([28] (page 143)).


The discussion in this paper is not meant to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence science and technology, life science and technology, and other related fields. Its main purpose is to reveal the specific properties of the entropy and negative entropy theories: what those theories reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a stipulation does not touch the essence of information itself. It likewise conditions the many comparative interpretations of the nature of information based on entropy and negative entropy theories, which are equally unable to guide us to a true grasp and understanding of the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. This material-structure processing method remains technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other. It must be emphasized, however, that since the theories and methods of entropy and negative entropy do not directly concern the mode of existence of information itself, or the meaning and value of information, then in order truly to reveal the nature of information and its fundamental difference from material phenomena, we need to find another way: a research level and research method based on a more comprehensive and general meta-science or meta-philosophy, focusing on the mode of existence of information itself and on its meaning and value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Authorsrsquo Contributions

Kun Wu is the host of the project. Qiong Nan and Tianqi Wu are both participants in the project.

Acknowledgments

)is article was funded by a major project of the NationalSocial Science Foundation of China )e History PresentSituation and Future of Information Philosophy (ProjectApproval no 18ZDA027)

References

[1] C Shannon ldquo)e mathematical theory of communicationrdquoBellsystem Technical Journal vol 27 pp P379ndashP423 1948

[2] NWiener Cybernetics Or Control and Communication in theAnimal and the Machine Technology PressJohn Wiley ampSons New York NY USA 1948

[3] B Castellani and R Rajaram ldquoPast the power law complexsystems and the limiting law of restricted diversityrdquo Com-plexity vol 21 no S2 pp 99ndash112 2016

[4] J E Contreras-Reyes ldquoRenyi entropy and complexity mea-sure for skew-Gaussian distributions and related familiesrdquoPhysica A Statistical Mechanics and its Applications vol 433pp 84ndash91 2015

[5] R Arellano-Valle J Contreras-Reyes and M StehlıkldquoGeneralized skew-normal negentropy and its application tofish condition factor time seriesrdquo Entropy vol 19 no 10p 528 2017

[6] R Clausius ldquoUber verschiedene fur die Anwendung bequemeFormen der Hauptgleichungen der mechanischenWarmetheorierdquo Abhandlungen Uber Die MechanischeWarmetheorie vol 2 p 34 1867

[7] L BoltzmannVorlesungen uber Gastheorie vol II Leipzig JABarth translated together with Volume I by SG BrushUniversity of California Press Berkeley CA USA Lectureson Gas )eory University of California Press Berkeley CAUSA 1964

[8] C Shannon Fe Mathematical Feory of Communication inFeoretical Foundations of Information Feory ShanghaiScience and Technology Compilation Museum ShanghaiChina 1965

[9] J Locke An Essay Concerning Human UnderstandingW Yolton Ed Dutton New York NY USA 1961

[10] D Hume A Treatise of Human Nature L A Selby-Bigge EdClarendon Press Oxford UK 1896

[11] C Shannon and W Weaver Fe Mathematical Feory ofCommunication University of Illinois Press Urbana IL USA1949

[12] P Yuanzheng and L Jianhua Selected Compilation of ClassicalDocuments of System Feory Classical Documents of SystemFeory Cybernetics and Information Feory vol 614ndash616Beijing Pragmatic Press Beijing China 1989

[13] E SchrodingerWhat is Life Fe Physical Aspect of the LivingCell Cambridge University Press Cambridge UK 1994

[14] E SchrodingerWhat is life L Laiou and L Liaofu Eds vol69ndash70 Changsha Hunan Science and Technology PressChangsha China 2003

[15] N Wiener Cybernetics H Jiren Ed Beijing Science PressBeijing China 1963

[16] K Wu ldquo)e difference and unity of information quantity for-mulas of Shannon and Wiener from the perspective of phi-losophyrdquoYanji Journal of YanbianUniversity vol Z1 pp 33-341987

[17] I Prigogine Etude Fermodynamique des Phenomenes Irre-versibles Dunod Paris France 1947

[18] I Prigogine and I Stengers Order Out of Chaos Manrsquos NewDialogue with Nature Bantam New York NY USA 1984

[19] K Wu ldquoAnalysis of the scientific meaning of several conceptsrelated to entropyrdquo Journal of Dialectics of Nature vol 5pp 67ndash74 1996

[20] C Tielin Entropy Increase and Negative Entropy Increase andProposal of Fe Law of Conservation of Entropy QuantityBeijing Studies in Dialectics of Nature Beijing China 1992

[21] R V L Hartley ldquoTransmission of informationrdquo Bell SystemTechnical Journal vol 535 no 7 1928

[22] S Brier ldquoFinding an information concept suited for a uni-versal theory of informationrdquo Progress in Biophysics andMolecular Biology vol 119 no 3 pp 622ndash633 2015

10 Complexity

[23] N Wiener Selected Works of Wiener Z Ren Ed ShanghaiShanghai Translation Press Shanghai China 1978

[24] K Wu and L Qi Introduction to Philosophical InformationShaanxi Shaanxi Peoplersquos Press Shaanxi China 1987

[25] L Floridi Guide to the Philosophy of Computing and Infor-mation (I) L Gang Ed Beijing Commercial Press BeijingChina 2010

[26] K Wu ldquoOn in-itself informationrdquo Shanghai AcademicMonthly vol 19 1986

[27] K Wu ldquoPhilosophical classification of information formsrdquoBeijing Potential Science vol 33 no 3 1984

[28] K Wu W Jian and W Tianqi An Introduction to thePhilosophy of Information Xirsquoan Jiaotong University PressXirsquoan China 2019

Complexity 11

of a source to send information, a measure of the uncertainty of choosing among multiple messages, a measure of the average information quantity (the average eliminated uncertainty) carried by each message, or a measure of the average degree to which the uncertainty of the information sink is changed.

These meanings can be roughly divided into two categories: either relative to the characteristics of the information source, or relative to the characteristics of the information source changing the state of the information sink. If it is aimed at the characteristics of the information source itself, the information entropy formula can be regarded as a measure of the entropy value of the information source itself. This is what Shannon called "the entropy of the information source" and the "measure of uncertainty" of the information generated by the source. If it is aimed at the characteristics of the information source changing the state of the information sink, it is an inference made by some later scholars from the possible nature of Shannon's information quantity: Shannon's information quantity is "something that eliminates random uncertainty" and "is the eliminated uncertainty." According to the latter explanation, this kind of information quantity is no longer "entropy" but has the meaning and value of "negative entropy," which is opposite to "entropy" and eliminates "entropy," that is, eliminates uncertainty. It is from this starting point that we can assert that, although Shannon's "entropy of information sources" theory measures the uncertainty of the information generated by information sources, it has paved the way for the related theories that use negative entropy to explain information.
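To make the quantitative side of this discussion concrete, the following sketch (our illustration, not part of the original argument) computes Shannon's source entropy H = −Σ pᵢ log₂ pᵢ for a source choosing among a set of messages; the function name and the sample probabilities are our assumptions, and we use base-2 logarithms (bits), which the text leaves unspecified:

```python
import math

def source_entropy(probs):
    """Shannon entropy of a source, H = -sum(p * log2 p), in bits.
    States with p == 0 contribute nothing (the limit of p*log p as p -> 0 is 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A source choosing among 4 messages with unequal probabilities...
print(source_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
# ...is less "uncertain" than a uniform source over the same 4 messages.
print(source_entropy([0.25] * 4))                 # 2.0 bits, the maximum log2(4)
```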

3. Negentropy Theory

3.1. Schrödinger's "Life Feeds on Negative Entropy". In the field of general science, the scientist who first proposed the concept of negative entropy, in a relationship corresponding to the concept of entropy, was Erwin Schrödinger, the well-known Austrian physicist and one of the founders of quantum mechanics [13]. In 1944, he wrote in What is Life? the famous saying that "life feeds on negative entropy" and considered negative entropy, "entropy with a negative sign," to be a measure of order. He wrote: "How does the living organism avoid decline to equilibrium? Obviously, it is by eating, drinking, breathing and (plant) assimilation. The special term is 'metabolism', meaning change or exchange. Here comes a question: what to exchange? At first it was undoubtedly referring to the exchange of matter. But it is absurd to think that it is essentially the exchange of matter. Any atom of nitrogen, oxygen, sulfur, etc. in the organism is the same as an atom of the same kind in the environment. What advantage could be brought by exchanging them? Later some people said that we live on energy. In fact, this is ridiculous, because the energy contained in an adult organism is fixed, just like the matter it contains. Since the value of one calorie in the body is the same as one calorie outside the body, it is really hard to understand the usefulness of pure exchange. What precious thing in our food can save us from death? This is easy to answer. A living organism is constantly generating entropy (or, it can be said, increasing positive entropy) and gradually approaching the dangerous state of maximum entropy, namely death. The only way to get rid of death, to live, is to continuously draw negative entropy from the environment. We will soon understand that negative entropy is something very positive. Organisms live on negative entropy. Or, to be clearer, the essence of metabolism is to enable an organism to successfully eliminate all the entropy that it has to generate while it is alive. 'Life feeds on negative entropy,' just as a living organism attracts a string of negative entropy to offset the entropy increase it generates in life, so as to maintain itself at a stable and low entropy level. The way an organism stabilizes itself at a highly ordered level (equivalent to a fairly low level of entropy) is indeed to continuously draw order from the surrounding environment. In fact, as far as higher animals are concerned, it is a fact that people have long known that they live entirely on absorbed order: in the organisms they take as food, of varying degrees of complexity, the state of matter is extremely orderly. After consuming these foods, animals excrete greatly degraded things" ([14] (pages 69–70, 72)). From these statements by Schrödinger, we can see very clearly that organisms do not devour food, moisture, and air for the purpose of obtaining matter and energy. What the organism really needs to absorb from the environment is "negative entropy," "order," "orderliness," "organization," and so on. The concepts of "negative entropy," "order," "orderliness," and "organization" discussed in communication theory, control theory, and some related theories developed later are interlinked with the functional interpretation of "information" in the most general sense. In this regard, "life feeds on negative entropy" can be interpreted as "life feeds on information."

The negative entropy theory of life proposed by Schrödinger actually opened up a new research direction for information theory, which is to study the entropy change of a system when it is open, instead of working only under the condition of isolated systems, as the second law of thermodynamics does. The openness of the system to the environment, and the fact that the system and the environment maintain a certain degree of exchange of matter, energy, and information, are the basic conditions on which all negative entropy theories are established.

Starting from Schrödinger's work, "negative entropy" acquired the nature of being opposed to the concept of "entropy": if entropy describes the degree of disorder, chaos, and uncertainty of a system, then negative entropy describes its degree of order, organization, and certainty. From the perspective of functionality and relative effect, negative entropy is the elimination of entropy and the elimination of uncertainty. From this we can more clearly grasp and understand the basic perspective and nature of a series of related formulations and interpretations of the concept of information made in the subsequent development of information science.

3.2. Wiener's "Information Is Negative Entropy". Almost at the same time as Shannon founded his theory of communication information entropy, Wiener, the American mathematician and founder of cybernetics, also proposed the theory of information negative entropy in the process of establishing cybernetics by integrating communication theory and automatic control theory. In his book Cybernetics [2], published in 1948, he independently presented Wiener's formula, which is only one minus sign away from Shannon's information quantity formula. He wrote: "The information quantity is the negative of the logarithm of a quantity that can be regarded as a probability, which is essentially negative entropy" ([15] (pages 11, 65)). From this we can also reasonably answer the question: why do Wiener's information formula and Shannon's formula differ by a negative sign? It is because the former measures "negative entropy" while the latter measures "entropy."

Perhaps an analysis from the perspective of the differences in cognitive methods can help us find the root cause of the difference between the information quantity of Shannon and that of Wiener ([16] (pages 33–34)).

We know that, in the field of mechanical communication, the number of primitives of the messages sent by a source and the probability of sending each message are predetermined, and the information sink is fully aware of this determination. The a priori estimation by the sink of the uncertainty of the message sent by the source is also derived from this predetermination. In this way, the uncertainty of what kind of message the source sends can be considered both as a feature of the sink's estimation of the state of the information source and as a feature of the source itself. The difference of a minus sign between Shannon's and Wiener's information quantity formulas can be regarded as the result of examining it from these two different perspectives.

The information quantity of communication can be deduced and calculated according to the principle of relativity of the interaction and mutual stipulation between source and sink. This leads to the stipulation of "a priori probability" and "a posteriori probability."

If the information quantity formula is deduced from the perspective of the state characteristics of the source itself, according to the principle of Shannon, then the contribution of the prior probability P1 to the information quantity is reversed, because it provides the "uncertainty" of the source as estimated by the sink, and its direction is opposite to the direction of the source's own characteristics. The posterior probability P2 contributes positively to the information quantity, because it provides the information state of the source that actually occurs at the moment, and its direction is consistent with the direction of the source's own state characteristics. Expressed as a logarithmic function,

H_Shannon = log(P2/P1). (2)

If, like Wiener, the information formula is derived from the perspective of the sink's understanding of the source, then the contribution of the prior probability to the information quantity is positive and, on the contrary, the contribution of the posterior probability is reversed. Thus Wiener's information quantity formula differs from Shannon's by a minus sign:

H_Wiener = log(P1/P2) = log(P1) − log(P2). (3)
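As a numerical illustration of equations (2) and (3) (our sketch; the function names and the chosen probabilities are assumptions for demonstration), suppose the sink initially considers eight messages equally likely (P1 = 1/8) and, after reception, is certain of one message (P2 = 1):

```python
import math

def h_shannon(p_prior, p_posterior):
    """Information quantity from the source's side: log(P2/P1), as in equation (2)."""
    return math.log2(p_posterior / p_prior)

def h_wiener(p_prior, p_posterior):
    """Information quantity from the sink's side: log(P1/P2), as in equation (3)."""
    return math.log2(p_prior / p_posterior)

# Eight equally likely messages before reception; certainty afterwards.
print(h_shannon(1/8, 1.0))  # 3.0
print(h_wiener(1/8, 1.0))   # -3.0: the same magnitude with the opposite sign
```

The two results differ only in sign, which is exactly the minus-sign difference the text traces to the two opposite perspectives of source and sink.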

The fact that the information quantity formulas of Shannon and Wiener can be deduced from the two opposite angles and directions of the mutual interaction and determination of information source and sink indicates that the difference of a negative sign between these two formulas has a profound root in epistemology. This reflects, to a certain extent and significance, the difference and unity of philosophical ontology and epistemological methods. Regrettably, this had not been clearly noticed in the past.

It should be said that Wiener's thinking is the same as Schrödinger's. Schrödinger's negative entropy of life is used to calculate the ability to resist the spontaneous entropy increase in the living body, while the information quantity of Wiener is used to calculate the amount of new knowledge brought to the receiver by a message. Both have two basic points in common: ① the system is open, and ② it can eliminate its own chaos by means of the environment. Here, what Wiener's information quantity calculates is exactly what Schrödinger's negative entropy calculates. It is no wonder that Wiener repeatedly emphasized the idea that the information quantity is negative entropy. Again we see that the crux of the problem lies not in the names of the concepts used but in the kind of problems that these concepts are used to explore.

3.3. Negative Entropy Flow of Prigogine. Under the circumstances in which theories of entropy, information, and negative entropy were being applied and developed in more and more disciplines, classical thermodynamics, which takes entropy theory and the entropy increase principle as its basic characteristics, was also developing constantly. This development finally broke through the limitations that those basic characteristics had imposed on classical thermodynamics itself.

The Brussels school, represented by the Belgian physicist and chemist Prigogine [17], reinterpreted the second law of thermodynamics on the basis of a series of experiments and proposed the famous negative entropy theory of dissipative structure theory in the 1960s [18]. It pointed out that the principle of the increase of entropy holds only in isolated systems. For an open system, two factors must be considered: the external entropy flow caused by the exchange between the system and the environment, and the entropy generation within the system. Based on this, Prigogine proposed a generalized second law of thermodynamics, applicable to both open systems and isolated systems.

Prigogine pointed out that the entropy change of a system is caused by two factors: one is the entropy exchanged between the system and the environment during their interaction (deS, the external entropy flow), and the other is the entropy generated spontaneously within the system (diS, the internal entropy change).

For an isolated system, since there is no exchange of matter and energy between the system and the environment, there can be no exchange of entropy. Therefore, in an isolated system, deS = 0, so dS = diS ≥ 0. This is the second law of thermodynamics (in the narrow sense) proposed by Clausius.

For an open system, because of the exchange of matter and energy between the system and the environment, there is at the same time an exchange of entropy. Therefore, in an open system, the total entropy change dS = deS + diS can show a complex scenario. When the external entropy flow is negative, and its absolute value is greater than the absolute value of the internal entropy change, the system will move towards order along the direction of decreasing entropy. It can be said that Clausius's second law of thermodynamics is just the special case of the generalized second law of thermodynamics in an isolated system.
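A minimal bookkeeping sketch of the generalized second law, dS = deS + diS with diS ≥ 0, may make the two cases concrete; the function name and the numerical values below are our own illustrative assumptions:

```python
def total_entropy_change(d_e_s, d_i_s):
    """Generalized second law bookkeeping: dS = deS + diS.
    Internal entropy generation diS is never negative."""
    if d_i_s < 0:
        raise ValueError("internal entropy generation diS cannot be negative")
    return d_e_s + d_i_s

# Isolated system: deS = 0, so dS = diS >= 0 (the narrow Clausius case).
print(total_entropy_change(0.0, 0.4))   # 0.4
# Open system with a negative external entropy flow whose magnitude
# exceeds the internal generation: dS < 0, the system moves towards order.
print(total_entropy_change(-0.7, 0.4))  # -0.3
```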

It is the generalized second law of thermodynamics proposed by Prigogine that reveals the inevitability of an orderly evolution of a system along the direction of decreasing entropy under a suitable, open background. In dissipative structure theory, the system introduces negative entropy flow from the outside to resist the increase of internal entropy, which is completely consistent with the basic ideas of Schrödinger's "negative entropy theory of life" and Wiener's "negative entropy theory of information." However, dissipative structure theory extended the scope of application of negative entropy to general physical and chemical systems. The essence of this expansion is to bring the theories of entropy, negative entropy, and information into the all-embracing objective world, since every system follows the general laws of physics and chemistry.

4. Philosophical Interpretation of the Significance and Nature of Information Entropy and Information Negative Entropy Theories

In the analysis of the related traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts have the same meaning and mutually stipulating properties.

Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the fabric mode of the micro states of a system. It can reveal the degree of disorder of the system's organization from a specific level and angle, as a quantitative measurement at that level. Boltzmann's statistical physical entropy and Shannon's "entropy of information sources" are both established in this sense.

As for the concept of negative entropy, it can be formulated in two different senses in the related general theories: one is the degree to which the organization mode of a system's structure deviates from a standard value (the maximum entropy value); the other is the degree to which the entropy value of a system decreases in the process of change of the system's organization mode.

If a formal description of the fabric mode of a system is needed, two quantities are required: one is the number of possible micro states of the system, and the other is the probability that each micro state may occur. If A = {a1, a2, ..., an} is the set of possible micro states of the system and P = {p1, p2, ..., pn} is the set of probabilities of the occurrence of each micro state, then the organization mode of the formal structure of the system (M) can be expressed by a matrix whose first row lists the micro states and whose second row lists their probabilities:

M = [A; P] = [a1 a2 a3 ... an; p1 p2 p3 ... pn]. (4)

The organization mode of the system's structure described by this matrix may be in two extreme circumstances: one is the state of maximum entropy, in which case p1 = p2 = ... = pn = 1/n and smax = log n; the other is the state of minimum entropy, in which case p1 = 1, p2 = p3 = ... = pn = 0, and s = log 1 = 0.

If we take the case of smax as the standard value to which the organization mode of the system's structure is referenced, then every case in which the system entropy is less than smax can be regarded as a deviation from this standard value. What causes this deviation? To what extent does it deviate? Obviously, there should be a concept to specify this factor and a calculation to measure the extent of the deviation. A very natural idea is that the effect of this factor is the opposite of entropy, that is, negative entropy, and that the calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we can get the following calculation formula for negative entropy ([19] (pages 67–74)):

negative entropy = smax − s. (5)

Obviously, this formula has two extreme circumstances: when s = smax, the negative entropy value of the system is 0; when s = 0, the negative entropy value of the system is at its maximum, equal to smax.
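The following sketch (ours; the names and sample distributions are illustrative assumptions) implements equation (5) directly, taking smax = log n for a system with n micro states and base-2 logarithms:

```python
import math

def entropy(probs):
    """s = -sum(p * log2 p); zero-probability states contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Negative entropy per equation (5): smax - s, with smax = log2(n)."""
    s_max = math.log2(len(probs))
    return s_max - entropy(probs)

print(negentropy([0.25, 0.25, 0.25, 0.25]))  # 0.0: maximum-entropy state, no deviation
print(negentropy([1.0, 0.0, 0.0, 0.0]))      # 2.0 = log2(4): minimum-entropy state
print(negentropy([0.7, 0.1, 0.1, 0.1]))      # ~0.64: between the two extremes
```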

Negative entropy can be specified not only in the sense that the entropy value of the organization mode of a specific system deviates from the standard value (the maximum entropy) but also in the sense that the entropy value decreases in the process of change of the organization mode of the system. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structure, and so forth are, in essence, all defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease, and entropy (Clausius entropy) as a measure of the degree of entropy increase, are not measures of a system's absolute negative entropy or entropy value; they are measures of some kind of "change" or "transformation," that is, measures of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy," Wiener's "how much new information is given to us by hindsight," or Prigogine's factors that resist the spontaneous entropy increase in a system, all are developed from the perspective of the relative functions that lead to changes in the organization mode (degree of uncertainty) of a system. Just as the entropy increase effect does not simply depend on how much heat is absorbed by the system but also on the relative degree of change that the heat brings to the organization mode of the system's original structure, so the entropy decrease effect does not simply depend on what kind of message the system receives, or what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs, but also on the relative degree of change (in the degree of uncertainty) that the message, mass, or energy brings to the organization mode of the original structure of the system. This brings up a very interesting phenomenon: the same mass or energy, or the same message, acting on systems in different structural states, will play very different roles for different receivers. It may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or not work at all (maintaining the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy increase formula has 1/T as its integral factor and Wiener's information formula has the prior probability as its reference factor.

Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kind of concepts is applied to the study of different directions of the same kind of phenomena.

It is reasonable to distinguish the formulations of these concepts into two categories: one given in a static sense and the other given in a dynamic sense. In this way we can clearly see that the ambiguous interpretation of these concepts is often caused by the confusion of these two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the standard entropy value of a system pointed out above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro state of a certain system, and the extent to which this degree deviates from the maximum possible degree of indeterminacy (uncertainty). This can be clearly seen from the previous comparative interpretation of the statistical entropy formula and the information entropy formula, as well as from our explanation of "negative entropy = smax − s." Some texts believe that the statistical entropy formula calculates the "entropy (change) of the system in a certain process" while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20] (pages 20–27)); this statement is incorrect. Here we also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the standard value of the system, each is still just a quantitative concept, and none of them can precisely define the abstract, general nature of entropy, negative entropy, and information. In terms of methodology, the definition of the abstract general nature of such concepts is not a task for these specific sciences but a philosophical subject. If the concepts are to be used accurately, we should replace them with the concepts of the quantity of entropy, of negative entropy, or of information.

The dynamic formulations of the concepts of entropy and information are developed in two directions: one is the direction of entropy increase, based on the second law of thermodynamics; the other is the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of a system.

A very interesting fact is that the research on the dynamics of entropy and information came earlier than the research on its statics: Clausius was already quantifying its changes when people did not yet really understand what entropy was.

The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine all share a common idea: the system can input external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of external entropy (information) flow by the amount of change in entropy (information) within the system that it causes. Because the external entropy (information) flow may cause an entropy decrease effect within the system, the quantity of this external flow can be measured by the degree of the entropy decrease effect it causes within the system; at the same time, and relatively, it is defined as negative entropy.

It seems that the function Σpᵢ log pᵢ has some unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located. Dynamically, the change of the function's value indicates the change of that uncertainty. This change is caused by a change in the value of n, which indicates the number of micro states of the system, and by a change in the probability distribution pᵢ. In general, an increase in the value of n and a tendency of the pᵢ values towards equalization result in a process of entropy increase, while a decrease in the value of n and a tendency of the pᵢ values towards differentiation result in a process of entropy decrease. As for the general idea that information and entropy are regarded as opposites, it is more of an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are all measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about the quantity of entropy, and at the same time we have every reason to regard them as theories about the quantity of information. On this basis, we can establish a kind of generalized entropy theory, or generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.
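A small numerical sketch (ours, with illustrative distributions) shows the two dynamic directions just described: equalization of the pᵢ over more micro states raises the value of the entropy function, while differentiation of the distribution lowers it:

```python
import math

def entropy(probs):
    """-sum(p * log2 p) over the nonzero probabilities, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equalization over more micro states: entropy increase.
print(entropy([0.9, 0.1]))                # ~0.47 bits
print(entropy([0.5, 0.5]))                # 1.0 bit
print(entropy([0.25] * 4))                # 2.0 bits
# Differentiation of the distribution: entropy decrease.
print(entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
```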

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories.

Firstly, the information entropy or negative entropy theories are relative measurements of the degree of structuralization at a specific level of a system, and so have a certain characteristic of relativity. The information entropy theory measures the relationship between the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of structural organization modes and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory the concept of information is defined and explained in a special sense, as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy." However, these two statements only emphasize the role of information for the receiver, which is a functional definition of information from a specific perspective. Such a definition does not reveal what information is; at most, it emphasizes, from a specific perspective, the role that information plays for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy or negative entropy theories only measure the state of a certain aspect of a system, and the degree of change of that state, by a certain method of calculating amounts. In this regard, the information entropy or negative entropy theories have the property of a specifically defined quantitative characteristic.

Given these characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is, in essence, only a technical, quantitative processing method for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that, as early as 1928, Hartley (1890–1970), an American communications expert, pointed out in the article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21] (page 535)). This is also the acquired definition of information recognized and expressed by people in their daily lives and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, and it is formulated in the sense of whether a message can bring new content to the receiver. Obviously, such a definition is also relative and functional and cannot serve as an essential explanation of what information is.

Usually, people regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously conducted discrimination and analysis, but have arbitrarily extended an explanation that Wiener made only in the sense of a quantitative description for relative, functional technical processing to the general, universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realized by technology, not a revelation of the general nature of information. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22] (pages 622–633)).

In fact, Wiener himself was very clear in what sense his "information is negative entropy" was used, because when he put forward this statement he also made corresponding remarks on the general nature of information. He has two influential formulations. One is: "Information is information, not matter or energy. No materialism which does not admit this can survive at the present day" ([15] (page 133)). The second is: "Information is the name of the content that we exchange with the external world in the process of adapting to it and making our adaptation felt by the external world" ([23] (page 4)).

Obviously, Wiener's first sentence emphasizes the ontological status of information. Although in this sentence he failed to define the nature of information correctly from the positive side, he rightly emphasized the independent value and significance of information in comparison with matter (mass) and energy, and he also put forward a warning about any materialism that fails to give a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world." Since it is an "exchange," there should be something coming in and something going out. In this way, there is information not only within our subject but also in the external environment. In this regard, the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying that "information is information, not matter or energy" which Wiener emphasizes.

It is regrettable that, for a long time, Wiener's clear warning to philosophy did not attract the attention of philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but a unified information science has not been established, because the establishment of a unified information science must be based on the general theory of the philosophy of information.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are one-sided even as functional definitions. In the real world, the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase uncertainty; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder caused by the disease in his body; but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play the role of eliminating uncertainty, the role of negative entropy. In some cases it may play the opposite role, increasing uncertainty or entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II," tells the parable of a terrifying rumor. It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother did not believe it and said, "I know my son; he will not kill people," and continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother still did not believe it, said "He won't kill anyone," and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone," Zeng Zi's mother could no longer sit still; she put down her work and fled over the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Is it entropy increase or entropy decrease? Is uncertainty increased or eliminated?

Also, if we generalize the functional definition of "information is the eliminated uncertainty," we will see some very ridiculous scenarios. In a book published as early as 1987, Wu wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content, and cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'the eliminated hunger', the definition of information cannot be 'the eliminated uncertainty'" ([24] (page 8)).

Finally, here comes the most essential aspect to be emphasized: what these theories measure is the relationship of structural differences at specific levels of a system, considered only from the aspect of form; this does not aim at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such a theory. It is no wonder that some western scholars have clearly and reasonably pointed out that "information theory deals with the carrier of information, symbols and signals, not information itself" and that "information theory does not deal with the information itself but the carrier of the information" ([25] (page 150)).

Since the calculation of the quantity of entropy and negative entropy involves the probability distribution of the micro states of the system being measured, it is natural that related viewpoints, such as the degree of orderly or disorderly organization (order) of the system, "degree of variation," "differences and constraints," "symmetry breaking," "difference that makes a difference," "form and structure," and "state of things," are directly derived from the theories of entropy and negative entropy. Since such views are directly deduced from theories about the quantity of entropy and negative entropy, it is likewise impossible to obtain a formulation of the general nature of information through them.

Obviously, to reveal the essence of information, we should not just focus on the differential relationships of carrier forms; we must understand the contents of the relevant properties, characteristics, existing modes, and states of the things themselves presented by the information.

In an article published as early as 1986, Wu wrote the following sentences: "Information is the formulation of something itself displayed in another that is alienated by something itself; it is the indirect existence of something itself, which exists in other things. Information is the formulation of something revealed in the relationship between something and other things. Something is information when it displays itself as internal formulation in an external relationship, which is expressed in the form of the externalization of the characteristics of the object" ([26] (page 19)).

Based on the content of information and the dynamic mechanism of the natural occurrence of information, Wu once clearly defined information as follows: "Information is a philosophical category indicating indirect being. It is the self-manifestation of the existing mode and status of matter (direct being)," in a paper entitled "Philosophical Classification of Information Forms," published in 1984 ([27] (page 33)). In 2019, Wu expanded this definition of information, which had been restricted to the level of philosophical ontology, on the basis of the historical evolution of the information forms he had classified: "Information is a philosophical category indicating indirect being. It is the self-manifestation and re-manifestation of the existing mode and status of matter (direct being), as well as the subjective grasp and creation of information by the subject of cognition and practice, including the cultural world that has been created" ([28] (page 143)).


The relevant discussion in this paper is not intended to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence science and technology, life science and technology, and other related fields of science and technology. The main purpose of the article is to reveal the specific properties of the entropy and negative entropy theories: what those theories reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a stipulation does not involve the essence of the information itself. This also means that the many comparative interpretations of the nature of information based on entropy and negative entropy theories are likewise unable to guide us to truly grasp and understand the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. Although the corresponding material-structure processing method is technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other, it is necessary to emphasize that, because the theories and methods of entropy and negative entropy do not directly concern the existence mode of information itself, or the meaning and value of information, to truly reveal the nature of information and its fundamental difference from material phenomena we need to find another way: a research level and research method based on a more comprehensive and general meta-science or meta-philosophy, focusing on the existence mode of information itself and on its meaning and value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Authors' Contributions

Kun Wu is the host of the project; Qiong Nan and Tianqi Wu are both participants in the project.

Acknowledgments

This article was funded by a major project of the National Social Science Foundation of China, "The History, Present Situation, and Future of Information Philosophy" (Project Approval no. 18ZDA027).

References

[1] C. Shannon, "The mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379–423, 1948.

[2] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, Technology Press/John Wiley & Sons, New York, NY, USA, 1948.

[3] B. Castellani and R. Rajaram, "Past the power law: complex systems and the limiting law of restricted diversity," Complexity, vol. 21, no. S2, pp. 99–112, 2016.

[4] J. E. Contreras-Reyes, "Rényi entropy and complexity measure for skew-Gaussian distributions and related families," Physica A: Statistical Mechanics and its Applications, vol. 433, pp. 84–91, 2015.

[5] R. Arellano-Valle, J. Contreras-Reyes, and M. Stehlík, "Generalized skew-normal negentropy and its application to fish condition factor time series," Entropy, vol. 19, no. 10, p. 528, 2017.

[6] R. Clausius, "Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie," in Abhandlungen über die mechanische Wärmetheorie, vol. 2, p. 34, 1867.

[7] L. Boltzmann, Vorlesungen über Gastheorie, vol. II, J. A. Barth, Leipzig; translated together with Volume I by S. G. Brush as Lectures on Gas Theory, University of California Press, Berkeley, CA, USA, 1964.

[8] C. Shannon, "The mathematical theory of communication," in Theoretical Foundations of Information Theory, Shanghai Science and Technology Compilation Museum, Shanghai, China, 1965.

[9] J. Locke, An Essay Concerning Human Understanding, W. Yolton, Ed., Dutton, New York, NY, USA, 1961.

[10] D. Hume, A Treatise of Human Nature, L. A. Selby-Bigge, Ed., Clarendon Press, Oxford, UK, 1896.

[11] C. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, USA, 1949.

[12] P. Yuanzheng and L. Jianhua, Eds., Selected Compilation of Classical Documents of System Theory, Cybernetics and Information Theory, pp. 614–616, Pragmatic Press, Beijing, China, 1989.

[13] E. Schrödinger, What is Life? The Physical Aspect of the Living Cell, Cambridge University Press, Cambridge, UK, 1994.

[14] E. Schrödinger, What is Life?, L. Laiou and L. Liaofu, Eds., pp. 69–70, Hunan Science and Technology Press, Changsha, China, 2003.

[15] N. Wiener, Cybernetics, H. Jiren, Ed., Science Press, Beijing, China, 1963.

[16] K. Wu, "The difference and unity of information quantity formulas of Shannon and Wiener from the perspective of philosophy," Journal of Yanbian University, vol. Z1, pp. 33–34, 1987.

[17] I. Prigogine, Étude Thermodynamique des Phénomènes Irréversibles, Dunod, Paris, France, 1947.

[18] I. Prigogine and I. Stengers, Order Out of Chaos: Man's New Dialogue with Nature, Bantam, New York, NY, USA, 1984.

[19] K. Wu, "Analysis of the scientific meaning of several concepts related to entropy," Journal of Dialectics of Nature, vol. 5, pp. 67–74, 1996.

[20] C. Tielin, "Entropy increase and negative entropy increase and the proposal of the law of conservation of entropy quantity," Studies in Dialectics of Nature, Beijing, China, 1992.

[21] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, pp. 535–563, 1928.

[22] S. Brier, "Finding an information concept suited for a universal theory of information," Progress in Biophysics and Molecular Biology, vol. 119, no. 3, pp. 622–633, 2015.

[23] N. Wiener, Selected Works of Wiener, Z. Ren, Ed., Shanghai Translation Press, Shanghai, China, 1978.

[24] K. Wu and L. Qi, Introduction to Philosophical Information, Shaanxi People's Press, Shaanxi, China, 1987.

[25] L. Floridi, Guide to the Philosophy of Computing and Information (I), L. Gang, Ed., Commercial Press, Beijing, China, 2010.

[26] K. Wu, "On in-itself information," Academic Monthly, Shanghai, vol. 19, 1986.

[27] K. Wu, "Philosophical classification of information forms," Potential Science, Beijing, vol. 33, no. 3, 1984.

[28] K. Wu, W. Jian, and W. Tianqi, An Introduction to the Philosophy of Information, Xi'an Jiaotong University Press, Xi'an, China, 2019.


proposed the theory of information negative entropy in theprocess of establishing cybernetics by integrating the theoryof communication and automatic control In his bookldquoCyberneticsrdquo [2] published in 1948 he independentlypresented Wienerrsquos formula which is only one minus signaway from Shannonrsquos information quantity formula Hewrote ldquo)e information quantity is the negative number ofthe logarithm of a quantity that can be regarded as aprobability which is essentially negative entropyrdquo ([15](pages 11 65)) From this we can also reasonably answer thequestion why do Wienerrsquos information formula andShannonrsquos formula differ by a negative sign It is because theformer measures ldquonegative entropyrdquo while the latter mea-sures ldquoentropyrdquo

Perhaps the analysis from the perspective of the dif-ferences in cognitive methods can help us find the root thatcauses the difference between the information quantity ofShannon and that of Wiener ([16] (pages 33ndash34))

We know that in the field of mechanical communica-tion the number of primitives sent of the messages by asource and the probability of sending each message arepredetermined and the information sink is fully aware of thisdetermination )e a priori estimation by the sink of theuncertainty of the message sent by the source is also derivedfrom this predetermination In this way the uncertainty ofwhat kind of message the source sends can be consideredboth as a feature of the estimation of the information sourcestate by the sink and as a feature of the source itself )edifference of minus sign between Shannonrsquos and Wienerrsquosinformation quantity formulas can be regarded as the resultof their examination from these two different perspectives

)e information quantity of communication can bededuced and calculated according to the principle of rela-tivity of interaction and mutual stipulation between thesource and sink )is leads to the stipulation of ldquoa prioriprobabilityrdquo and ldquoa posteriori probabilityrdquo

If the information quantity formula is deduced from theperspective of the state characteristics of the source itselfaccording to the principle of Shannon then the contributionof the prior probability to the information quantity is re-versed because it provides the ldquouncertaintyrdquo of the sourceestimated by the sink and its direction is opposite to thedirection of the sourcersquos own characteristics )e posteriorprobability contributes positively to the informationquantity because it provides the information state itself ofthe source that actually occurs at the moment and its di-rection is consistent with the direction of the sourcersquos ownstate characteristics )e expression in logarithmic functionis

HShannon logP2P1

1113874 1113875 (2)

If like Wiener the information formula is derived fromthe perspective of the understanding of the source by thesink then the contribution of the prior probability to theinformation quantity is positive On the contrary thecontribution of the posterior probability to the informationquantity is reversed )us in Wienerrsquos formula the formula

of information quantity will be a minus sign different fromthe Shannon formula

HWiener logP1P2

1113874 1113875

log(P1 ) minus log(P2 )

(3)

It is indicated in the fact that the information quantityformulas of Shannon and Wiener can be deduced from thetwo opposite angles and directions of mutual interactionsand determination of the information source and sink thatthe difference of a negative sign between these two formulashas a profound root in epistemology )is reflects the dif-ference and unity of philosophical ontology and episte-mological methods to a certain extent and significanceRegrettably this has not been clearly noticed in the past

It should be said that Wienerrsquos thinking is the same asSchrodingerrsquos Schrodingerrsquos negative entropy of life is usedto calculate the ability to resist the spontaneous entropyincrease in the living body while he information quantity ofWiener is used to calculate the amount of new knowledgebrought to the receiver by the message Both have two basicpoints in common ① the system is open and ② it caneliminate its own chaos by the environment Here whatWienerrsquos information quantity calculates is exactly whatSchrodingerrsquos negative entropy calculates It is no wonderthat Wiener has repeatedly emphasized the idea that theinformation quantity is negative entropy Again we see thatthe crux of the problem lies not in the names of the conceptsused but in the kind of problems that these concepts are usedto explore

33 Negative Entropy Flow of Prigogine Under the cir-cumstances that some theories such as entropy informationand negative entropy have been applied and developed inmore and more discipline theories the classical thermo-dynamics which takes entropy theory and entropy increaseprinciple as its basic characteristics is also developingconstantly )is development finally broke through thelimitations brought by the basic characteristics of classicalthermodynamics to itself

)e Brussels school represented by Belgian physicist andchemist Prigogine [17] reunderstood the second law ofthermodynamics based on a series of experiments andproposed the famous negative entropy theory of dissipativestructure theory in the 1960s [18] It pointed out that theprinciple of increase of entropy only holds in isolated sys-tems For an open system two factors which are the externalentropy flow caused by the exchange between the system andthe environment and the entropy generation within thesystem must be considered Based on this Prigogine pro-posed a generalized second law of thermodynamics which isapplicable to both open systems and isolated systems

Prigogine pointed out that the entropy change of asystem is caused by two factors One factor is the entropyexchanged between the system and the environment duringthe interaction (deS external entropy flow) and the other is

Complexity 5

the entropy generated spontaneously within the system (disinternal entropy change)

For an isolated system since there is no exchange ofmatter and energy between the system and the environmentit is impossible to have the exchange of entropy )ereforein an isolated system deS 0 so dS disge 0 It is the secondlaw of thermodynamics (narrow sense) proposed byClausius

For an open system there is an exchange of entropy atthe same time because of the exchange of matter and energybetween the system and the environment )erefore in anopen system the total entropy change of the system willshow a complex scenario When the external entropy flow isnegative and the absolute value of the external entropy flowis greater than the absolute value of the internal entropychange the system will move towards order along the di-rection of entropy decreasing It can be said that Clausiusrsquossecond law of thermodynamics is just a special case of thegeneralized second law of thermodynamics in an isolatedsystem

It is the generalized second law of thermodynamicsproposed by Prigogine that reveals the inevitability of anorderly evolution of the system along the direction of en-tropy decreasing under a suitable and open background Indissipative structure theory the system introduces negativeentropy flow from the outside to resist the increase of in-ternal entropy which is completely consistent with the basicideas of Schrodingerrsquos ldquonegative entropy theory of liferdquo andWienerrsquos ldquonegative entropy theory of informationrdquo How-ever dissipative structure theory has extended the functionscope of negative entropy into general physical and chemicalsystems )e essence of this expansion is to bring the en-tropy negative entropy and information theories into theall-embracing objective world since every system follows thegeneral laws of physics and chemistry

4 Philosophical Interpretation of theSignificance and Nature of InformationEntropy and Information NegativeEntropy Theories

In the traditional literature, entropy and negative entropy are two concepts that correspond to each other with opposite meanings. However, if we study them further, we will see that the two concepts have the same meaning and mutually defining properties.

Generally speaking, the concept of entropy is a measure of the degree of uncertainty of the fabric mode of the micro states of a system. It can reveal the degree of disorder of the system's organization from a specific level and angle, through quantitative measurement at that level. Boltzmann's statistical physical entropy and Shannon's "entropy of information sources" are both established in this sense.
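For reference, the two measures just named have their standard textbook forms (k_B is the Boltzmann constant and W the number of equally probable micro states; these displays are added here for the reader's convenience and are not numbered equations of the original):

    S = k_B ln W  (Boltzmann),        H = −Σ_i p_i log p_i  (Shannon).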

As for the concept of negative entropy, it can be formulated in two different senses in the relevant general theories: one is the degree to which the organization mode of a system's structure deviates from a standard value (the maximum entropy value) relative to that same system, and the other is the degree to which the entropy value of a system decreases in the process of change of its organization mode.

If a formal description of the fabric mode of a system is needed, two quantities are required: one is the number of possible micro states of the system, and the other is the probability that each micro state may occur. If A = {a_1, a_2, ..., a_n} is a set representing the possible micro states of the system and P = {p_1, p_2, ..., p_n} is a set representing the probability of occurrence of each micro state, then the organization mode of the formal structure of the system (M) can be expressed by a matrix as follows:

M = \begin{bmatrix} A \\ P \end{bmatrix}
  = \begin{bmatrix} a_1 & a_2 & a_3 & \cdots & a_n \\
                    p_1 & p_2 & p_3 & \cdots & p_n \end{bmatrix}.  (4)

The organization mode of the system's structure described by this matrix may be in two extreme circumstances: one is the state of maximum entropy, in which case p_1 = p_2 = ... = p_n = 1/n and s_max = log n; the other is the state of minimum entropy, in which case p_1 = 1, p_2 = p_3 = ... = p_n = 0, and s = log 1 = 0.
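As a minimal numerical sketch of these two extremes (our own illustration; the function name shannon_entropy is not from the original):

```python
import math

def shannon_entropy(probs):
    # s = -sum(p * log p), in natural-log units; terms with p = 0 contribute 0
    return -sum(p * math.log(p) for p in probs if p > 0)

n = 8
uniform = [1.0 / n] * n               # maximum entropy: p_i = 1/n
degenerate = [1.0] + [0.0] * (n - 1)  # minimum entropy: p_1 = 1, rest 0

print(shannon_entropy(uniform))      # log n = log 8, approx. 2.079
print(shannon_entropy(degenerate))   # log 1 = 0
```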

If we take the case of s_max as the standard value to which the organization mode of the system's structure is referred, then every case in which the system entropy is less than s_max can be regarded as a deviation from this standard value. What causes this deviation? To what extent does it deviate? Obviously, there should be a concept to name this factor and a calculation to measure the extent of the deviation. A very natural idea is that the effect of this factor is the opposite of entropy, that is, negative entropy. The calculation should be the difference between the standard entropy value and the actual entropy value. Based on this idea, we get the following formula for negative entropy ([19], pages 67–74):

negative entropy = s_max − s.  (5)

Obviously, this formula has two extreme circumstances: when s = s_max, the negative entropy value of the system is 0; when s = 0, the negative entropy value of the system is at its maximum, equal to s_max.
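Formula (5) can be computed directly, continuing the sketch above under the same assumptions (negative_entropy is our name, not the authors'):

```python
import math

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

def negative_entropy(probs):
    # formula (5): deviation of the actual entropy from the maximum, s_max - s
    s_max = math.log(len(probs))
    return s_max - shannon_entropy(probs)

print(negative_entropy([0.25] * 4))            # 0.0: s = s_max, no deviation
print(negative_entropy([1.0, 0.0, 0.0, 0.0]))  # log 4, approx. 1.386: maximum
```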

Negative entropy can be specified not only in the sense that the entropy value of a specific system's organization mode deviates from the standard value (the maximum entropy) but also in the sense that the entropy value decreases in the process of change of the system's organization mode. Schrödinger's negative entropy theory of life, Wiener's negative entropy theory of information, Prigogine's negative entropy theory of dissipative structure, and so forth are, in essence, all defined in the sense of entropy decrease. Negative entropy as a measure of the degree of entropy decrease, and entropy (Clausius entropy) as a measure of the degree of entropy increase, are not measures of the system's absolute negative entropy or entropy value; they are measures of some kind of quantity of "change" or "transformation", that is, measures of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy", Wiener's "how much new information is given to us by hindsight", or Prigogine's factors that resist the spontaneous entropy increase in the system, all are developed from the perspective of the relative effects that lead to changes in the organization mode (degree of uncertainty) of a system. Just as the entropy increase effect does not depend simply on how much heat is absorbed by the system but also on the relative degree of change that this heat brings to the organization mode of the system's original structure, so the entropy decrease effect does not depend simply on what kind of message the system receives, or what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs, but also on the relative degree of change (in the degree of uncertainty) that the message, mass, or energy brings to the organization mode of the system's original structure. This brings up a very interesting phenomenon: the same mass, energy, or message, acting on systems in different structural states, will play very different roles for different receivers. It may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or not work at all (leaving the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy formula has 1/T as its integrating factor and Wiener's information formula has a prior probability as its reference factor.
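The receiver-relativity just described can be made concrete with a small sketch of our own. We model the arrival of a message as Bayesian conditioning; the numbers, the two-hypothesis setup, and the names update, receiver_a, and receiver_b are illustrative assumptions, not the authors' formalism. The prior plays the role of the "reference factor" mentioned above:

```python
import math

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

def update(prior, likelihood):
    # Bayesian conditioning: posterior proportional to prior * likelihood
    post = [pr * lk for pr, lk in zip(prior, likelihood)]
    z = sum(post)
    return [x / z for x in post]

message = [0.1, 0.9]       # likelihood of the same message under two hypotheses

receiver_a = [0.5, 0.5]    # undecided receiver
receiver_b = [0.99, 0.01]  # receiver already convinced of hypothesis 1

for prior in (receiver_a, receiver_b):
    post = update(prior, message)
    print(round(shannon_entropy(prior), 3), "->", round(shannon_entropy(post), 3))

# receiver_a: 0.693 -> 0.325  (uncertainty reduced: the message acts as negative entropy)
# receiver_b: 0.056 -> 0.286  (uncertainty increased: the same message acts as entropy)
```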

Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kind of concepts is applied to the study of different directions of the same kind of phenomena.

It is reasonable to divide the formulations of these concepts into two categories: one given in a static sense and the other given in a dynamic sense. In this way, we can clearly see that the ambiguous interpretation of these concepts is often caused by the confusion of these two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the system's standard entropy value discussed above are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro states of a given system and the extent to which this degree of indeterminacy deviates from the maximum possible degree. This can be clearly seen in the earlier comparative interpretation of the statistical entropy formula and the information entropy formula, as well as in our explanation of "negative entropy = s_max − s". Some texts hold that the statistical entropy formula calculates the "entropy (change) of the system in a certain process" while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20], pages 20–27); this statement is incorrect. We also want to emphasize one point here: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the system's standard value, each is still just a quantitative concept, and none of them can precisely define the general nature of entropy, negative entropy, and information in the abstract sense. In terms of methodology, defining the abstract general nature of such concepts is not a task of these specific sciences but a philosophical subject. If concepts are to be used accurately, we should speak instead of the quantity of entropy, of negative entropy, or of information.

The dynamic formulations of the concepts of entropy and information developed in two directions: one is the direction of entropy increase, based on the second law of thermodynamics; the other is the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of a system.

A very interesting fact is that research on the dynamics of entropy and information came earlier than research on its statics: Clausius was already quantifying entropy changes when people did not yet really understand what entropy was.

The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine share a common idea: the system can draw an external entropy (information) flow from the environment to resist the increasing trend of entropy within the system. They measure the amount of this external entropy (information) flow by the amount of change in entropy (information) that it causes within the system. Because the external entropy (information) flow may cause an entropy decrease effect within the system, the quantity of this external flow can be measured by the degree of the entropy decrease effect it causes, and it is thereby, relatively, defined as negative entropy.

It seems that the function Σ p_i log p_i has a unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located. Dynamically, the change of the function value indicates the change of that uncertainty. This change is caused by a change in the value of n, the number of micro states of the system, and by a change in the probability distribution of the p_i. In general, an increase in n and a tendency of the p_i toward equalization result in a process of entropy increase, while a decrease in n and a tendency of the p_i toward differentiation result in a process of entropy decrease. As for the general idea that information and entropy are opposites, it is more of an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are all measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about entropy quantity, and at the same time we have every reason to regard them as theories about information quantity. On this basis, we can establish a kind of generalized entropy theory, or a generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories.

Firstly, the information entropy or negative entropy theories are relative measurements of the degree of structuralization at a specific level of the system, and thus have the characteristic of relativity. The information entropy theory measures the relationship between the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of modes of structural organization and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory the concept of information is defined and explained in a special sense, as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy". However, these two statements only emphasize the role of information for the receiver, which is a functional definition of information from a specific perspective. Such a definition does not reveal what information is; at most, it emphasizes, from a specific perspective, the role of information for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy or negative entropy theories only measure the state of a certain aspect of the system, and the degree of state change in that aspect, by a certain method of calculating amounts. In this regard, the information entropy or negative entropy theories have the property of a specifically defined quantitative characteristic.

Given these characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is in essence only a technical, quantitative processing method for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that, as early as 1928, the American communications expert Hartley (1890–1970) pointed out in his article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21], page 535). This is also the acquired definition of information recognized and expressed by people in daily life and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, being formulated in terms of whether a message can bring new content to the receiver. Such a definition is likewise relative and functional and cannot serve as an essential explanation of what information is.

Usually, people regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously conducted discrimination and analysis, but have arbitrarily extended to the general universal scope an explanation that Wiener made only as a quantitative description, a relative functionalization for technical processing. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realizable by technology, not to reveal the general nature of information at all. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22], pages 622–633).

In fact, Wiener himself was very clear in what sense his "information is negative entropy" was used, because when he put forward this statement he also made a corresponding discussion of the general nature of information. He has two influential formulations. One is: "information is information, not matter or energy. No materialism which does not admit this can survive at the present day" ([15], page 133). The second is: "information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ([23], page 4).

Obviously, Wiener's first sentence emphasizes the ontological status of information. Although in this sentence he failed to define the nature of information correctly from the positive side, he correctly emphasized the independent value and significance of information in comparison with matter (mass) and energy, and he put forward a warning about those materialist theories that fail to give a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world". Since it is an "exchange", there must be something coming in and something going out. In this way, there is information not only within the subject but also in the external environment. In this regard, the corresponding doctrines of objective information and subjective information should be valid. This also shows the true charm of the saying that Wiener emphasizes: "information is information, not matter or energy".

It is regrettable that, for a long time, Wiener's clear warning to philosophy did not attract the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but a unified information science has not been established either, because the establishment of a unified information science must be based on the general theory of the philosophy of information.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are one-sided as functional definitions. In the real world, the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase uncertainty; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he should take medicine to eliminate the disorder caused by the disease in his body; but what happens if he takes the wrong medicine? Obviously, the medicine will provide him with the corresponding information, but this information does not always play the role of eliminating uncertainty or of negative entropy. In some cases, it may play the opposite role, increasing uncertainty or entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II", tells the parable of a terrifying rumor. It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother did not believe it and said, "I know my son; he will not kill anyone," and continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother still did not believe it, said, "He won't kill anyone," and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone," Zeng Zi's mother finally could not sit still; she put down her work and fled over the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Is it entropy increase or entropy decrease? Is uncertainty increased or eliminated?

Also, if we generalize the functional definition "information is the eliminated uncertainty", we will see some very ridiculous scenarios. In a book published as early as 1987, Wu wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content and cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger', the definition of information cannot be 'eliminated uncertainty'" ([24], page 8).

Finally, here comes the most essential aspect to be emphasized: what these theories measure is the relationship of structural differences at specific levels of the system, considered only from the aspect of form; this does not aim at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such a theory. It is no wonder that some western scholars have clearly and reasonably pointed out that "information theory deals with the carrier of information, symbols and signals, not information itself" and that "information theory does not deal with the information itself but the carrier of the information" ([25], page 150).

Since the calculation of the quantity of entropy and negative entropy involves the probability distribution of the micro states of the system being measured, it is natural that related viewpoints, such as the degree of orderly or disorderly organization (order) of the system, "degree of variation", "differences and constraints", "symmetry breaking", "difference that makes a difference", "form and structure", and "state of things", are directly derived from the theory of entropy and negative entropy. Since such views are directly deduced from theories about the quantity of entropy and negative entropy, it is likewise impossible to obtain a formulation of the general nature of information through them.

Obviously, to reveal the essence of information, we should not focus only on the differential relationships of carrier forms; we must grasp the contents of the relevant properties, characteristics, modes of existence, and states of the things themselves that the information presents.

In an article published as early as 1986, Wu wrote the following: "Information is the formulation of something itself displayed in another that is alienated by something itself; it is the indirect existence of something itself, which exists in other things. Information is the formulation of something revealed in the relationship between it and other things. Something is information when it displays itself as an internal formulation in an external relationship, which is expressed in the form of the externalization of the characteristics of the object" ([26], page 19).

Based on the content of information and the dynamic mechanism of the natural occurrence of information, Wu clearly defined information as follows, in a paper entitled "Philosophical Classification of Information Forms" published in 1984: "Information is a philosophical category indicating indirect being. It is the self-manifestation of the existing mode and status of matter (direct being)" ([27], page 33). In 2019, based on the historical evolution of the information forms he had classified, Wu expanded this definition, which had been restricted to the level of philosophical ontology: "Information is a philosophical category indicating indirect being. It is the self-manifestation and re-manifestation of the existing mode and status of matter (direct being), as well as the subjective grasp and creation of information by the subject of cognition and practice, including the cultural world that has been created" ([28], page 143).


The discussion in this paper is not intended to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence science and technology, life science and technology, and other related fields. The main purpose of the article is to reveal the specific properties of the entropy and negative entropy theories: what those theories reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a stipulation does not involve the essence of information itself. This circumstance also conditions the many comparative interpretations of the nature of information based on entropy and negative entropy theories, which likewise cannot guide us to truly grasp and understand the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. Although the corresponding material-structure processing methods remain technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other, it must be emphasized that the theories and methods of entropy and negative entropy do not directly concern the existence mode of information itself, nor the meaning and value of information. To truly reveal the nature of information, and the fundamental difference between it and material phenomena, we need to find another way: a research level and research method grounded in a more comprehensive and general meta-science or meta-philosophy, one that focuses on the existence mode of information itself and on its meaning and value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Authors' Contributions

Kun Wu is the host of the project. Qiong Nan and Tianqi Wu are both participants in the project.

Acknowledgments

This article was funded by a major project of the National Social Science Foundation of China, "The History, Present Situation and Future of Information Philosophy" (Project Approval no. 18ZDA027).

References

[1] C. Shannon, "The mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379–423, 1948.

[2] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, Technology Press/John Wiley & Sons, New York, NY, USA, 1948.

[3] B. Castellani and R. Rajaram, "Past the power law: complex systems and the limiting law of restricted diversity," Complexity, vol. 21, no. S2, pp. 99–112, 2016.

[4] J. E. Contreras-Reyes, "Rényi entropy and complexity measure for skew-Gaussian distributions and related families," Physica A: Statistical Mechanics and its Applications, vol. 433, pp. 84–91, 2015.

[5] R. Arellano-Valle, J. Contreras-Reyes, and M. Stehlík, "Generalized skew-normal negentropy and its application to fish condition factor time series," Entropy, vol. 19, no. 10, p. 528, 2017.

[6] R. Clausius, "Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie," Abhandlungen über die mechanische Wärmetheorie, vol. 2, p. 34, 1867.

[7] L. Boltzmann, Vorlesungen über Gastheorie, vol. II, J. A. Barth, Leipzig; translated together with Volume I by S. G. Brush as Lectures on Gas Theory, University of California Press, Berkeley, CA, USA, 1964.

[8] C. Shannon, The Mathematical Theory of Communication, in Theoretical Foundations of Information Theory, Shanghai Science and Technology Compilation Museum, Shanghai, China, 1965.

[9] J. Locke, An Essay Concerning Human Understanding, J. W. Yolton, Ed., Dutton, New York, NY, USA, 1961.

[10] D. Hume, A Treatise of Human Nature, L. A. Selby-Bigge, Ed., Clarendon Press, Oxford, UK, 1896.

[11] C. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, USA, 1949.

[12] P. Yuanzheng and L. Jianhua, Selected Compilation of Classical Documents of System Theory: Classical Documents of System Theory, Cybernetics and Information Theory, pp. 614–616, Pragmatic Press, Beijing, China, 1989.

[13] E. Schrödinger, What is Life? The Physical Aspect of the Living Cell, Cambridge University Press, Cambridge, UK, 1994.

[14] E. Schrödinger, What is Life?, L. Laiou and L. Liaofu, Eds., pp. 69–70, Hunan Science and Technology Press, Changsha, China, 2003.

[15] N. Wiener, Cybernetics, H. Jiren, Ed., Science Press, Beijing, China, 1963.

[16] K. Wu, "The difference and unity of information quantity formulas of Shannon and Wiener from the perspective of philosophy," Journal of Yanbian University, Yanji, vol. Z1, pp. 33-34, 1987.

[17] I. Prigogine, Étude Thermodynamique des Phénomènes Irréversibles, Dunod, Paris, France, 1947.

[18] I. Prigogine and I. Stengers, Order Out of Chaos: Man's New Dialogue with Nature, Bantam, New York, NY, USA, 1984.

[19] K. Wu, "Analysis of the scientific meaning of several concepts related to entropy," Journal of Dialectics of Nature, vol. 5, pp. 67–74, 1996.

[20] C. Tielin, Entropy Increase and Negative Entropy Increase and Proposal of the Law of Conservation of Entropy Quantity, Studies in Dialectics of Nature, Beijing, China, 1992.

[21] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, pp. 535–563, 1928.

[22] S. Brier, "Finding an information concept suited for a universal theory of information," Progress in Biophysics and Molecular Biology, vol. 119, no. 3, pp. 622–633, 2015.


[23] N. Wiener, Selected Works of Wiener, Z. Ren, Ed., Shanghai Translation Press, Shanghai, China, 1978.

[24] K. Wu and L. Qi, Introduction to Philosophical Information, Shaanxi People's Press, Shaanxi, China, 1987.

[25] L. Floridi, Guide to the Philosophy of Computing and Information (I), L. Gang, Ed., Commercial Press, Beijing, China, 2010.

[26] K. Wu, "On in-itself information," Academic Monthly, Shanghai, vol. 19, 1986.

[27] K. Wu, "Philosophical classification of information forms," Potential Science, Beijing, vol. 33, no. 3, 1984.

[28] K. Wu, W. Jian, and W. Tianqi, An Introduction to the Philosophy of Information, Xi'an Jiaotong University Press, Xi'an, China, 2019.


[22] S Brier ldquoFinding an information concept suited for a uni-versal theory of informationrdquo Progress in Biophysics andMolecular Biology vol 119 no 3 pp 622ndash633 2015

10 Complexity

[23] N Wiener Selected Works of Wiener Z Ren Ed ShanghaiShanghai Translation Press Shanghai China 1978

[24] K Wu and L Qi Introduction to Philosophical InformationShaanxi Shaanxi Peoplersquos Press Shaanxi China 1987

[25] L Floridi Guide to the Philosophy of Computing and Infor-mation (I) L Gang Ed Beijing Commercial Press BeijingChina 2010

[26] K Wu ldquoOn in-itself informationrdquo Shanghai AcademicMonthly vol 19 1986

[27] K Wu ldquoPhilosophical classification of information formsrdquoBeijing Potential Science vol 33 no 3 1984

[28] K Wu W Jian and W Tianqi An Introduction to thePhilosophy of Information Xirsquoan Jiaotong University PressXirsquoan China 2019

Complexity 11

"change" or "transformation", which is a measure of relative quantity. Whether it is Schrödinger's "life feeds on negative entropy", Wiener's "how much new information is given to us by hindsight", or Prigogine's factors that resist the spontaneous entropy increase in the system, all are developed from the perspective of relative functions that lead to changes in the organization mode (degree of uncertainty) of the system. Just as the entropy increase effect does not simply depend on how much heat the system absorbs, but also on the relative degree of change that the heat brings to the organization mode of the system's original structure, so the entropy decrease effect does not simply depend on what kind of message the system receives, or on what kind of mass or energy with a certain value of entropy or negative entropy the system absorbs, but also on the relative degree of change (degree of uncertainty) that the message, mass, or energy brings to the organization mode of the system's original structure. This brings up a very interesting phenomenon. The same mass, energy, or message, acting on systems in different structural states, will play very different roles for different receivers: it may lead to entropy increase or entropy decrease, add new information, cause ideological disorder, or have no effect at all (leaving the original structural mode and the original cognitive state unchanged). This is why the Clausius entropy formula has 1/T as its integrating factor and Wiener's information formula takes a prior probability as its reference factor, as the sketch below illustrates.
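As a minimal illustration of this receiver-relativity, the following Python sketch (the distributions are hypothetical, chosen only for the example) sends the same message to two receivers whose prior probability distributions differ, and computes the resulting change in Shannon entropy. For one receiver the message reduces uncertainty; for the other, already more certain than the message allows, it increases uncertainty.

```python
import math

def shannon_entropy(p):
    """H(p) = -sum(p_i * log2 p_i), in bits; the uncertainty of a receiver's state."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Hypothetical receivers: same four possible states, different prior beliefs.
priors = {
    "uncertain receiver": [0.25, 0.25, 0.25, 0.25],  # maximal prior uncertainty
    "confident receiver": [0.85, 0.05, 0.05, 0.05],  # already close to certainty
}

# The same message drives both receivers to the same posterior distribution.
posterior = [0.70, 0.10, 0.10, 0.10]

for name, prior in priors.items():
    eliminated = shannon_entropy(prior) - shannon_entropy(posterior)
    print(f"{name}: uncertainty eliminated = {eliminated:+.3f} bits")

# uncertain receiver: uncertainty eliminated = +0.643 bits (entropy decrease)
# confident receiver: uncertainty eliminated = -0.509 bits (entropy increase)
```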

Although concepts such as entropy, negative entropy, information, entropy increase, and entropy decrease often have very different specific meanings in different disciplines and for different scientists, these concepts are essentially consistent in nature, because they all study the same kind of phenomena in a unified sense; the differences between them emerge when the same kinds of concepts are applied to the study of different directions of the same kind of phenomena.

It is reasonable to divide the formulations of these concepts into two categories: one given in a static sense and the other in a dynamic sense. Viewed this way, it becomes clear that the ambiguous interpretation of these concepts is often caused by confusing the two types of formulations.

In essence, Boltzmann's statistical entropy, Shannon's information entropy, and the negative entropy indicating the degree of deviation from the standard entropy value of the system, as pointed out above, are all quantitative formulations of entropy (information entropy) and negative entropy (information) in the sense of the static state and absolute value of the system. The basic meaning of this formulation is to calculate the degree of indeterminacy (uncertainty) of the micro states of a given system, and the extent to which this degree of indeterminacy deviates from the maximum possible degree of indeterminacy. This can be clearly seen from the previous comparative interpretation of the statistical entropy formula and the information entropy formula, as well as in our explanation of "negative entropy = S_max − S". Some texts hold that the statistical entropy formula calculates the "entropy (change) of the system in a certain process" while Shannon's information quantity calculates "the information quantity (change) of the system in a certain process" ([20] (pages 20–27)); this statement is incorrect. We also want to emphasize one point: whether it is Boltzmann's statistical entropy, Shannon's information entropy, or the negative entropy indicating the degree of deviation from the standard value of the system, each is still just a quantitative concept, and none of them can precisely define the general nature of the abstract meaning of entropy, negative entropy, and information. In terms of methodology, defining the abstract general nature of such concepts is not a task of these specific sciences but a philosophical subject. If the concepts are to be used accurately, we should speak instead of the quantity of entropy, negative entropy, or information. A static computation of these quantities is sketched below.
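To make the static reading concrete, the following sketch (the micro-state distribution is hypothetical) computes the absolute entropy S of a system, the maximum possible entropy S_max = log2 n over its n micro states, and the negative entropy S_max − S as the deviation from maximal indeterminacy.

```python
import math

def entropy_bits(p):
    """Shannon/Boltzmann-style entropy of a distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]   # micro-state probabilities of a system (hypothetical)
n = len(p)
s = entropy_bits(p)              # static entropy: current indeterminacy
s_max = math.log2(n)             # maximum entropy: uniform distribution over n states
negentropy = s_max - s           # deviation from maximal indeterminacy

print(f"S = {s:.3f} bits, S_max = {s_max:.3f} bits, negentropy = {negentropy:.3f} bits")
# S = 1.750 bits, S_max = 2.000 bits, negentropy = 0.250 bits
```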

The dynamic formulations of the concepts of entropy and information are developed in two directions: one is the direction of entropy increase, based on the second law of thermodynamics; the other is the direction of entropy decrease, within the framework of the various negative entropy theories constructed in the sense of resisting the entropy increase of the system.

Interestingly, research on the dynamics of entropy and information came earlier than research on its statics: Clausius was already quantifying changes in entropy before people really understood what entropy was.
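Indeed, Clausius's definition fixes only the change of entropy between two equilibrium states, never an absolute value. In modern notation (a standard textbook formulation, not a formula from this paper):

\[ \Delta S = S_B - S_A = \int_A^B \frac{\delta Q_{\mathrm{rev}}}{T}, \]

where the reversible heat \(\delta Q_{\mathrm{rev}}\) is weighted by the integrating factor 1/T noted above; what S itself "is" remains undefined by the formula.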

The various forms of negative entropy theory are dynamic measures of changes in information (entropy) from the direction opposite to the second law of thermodynamics. Schrödinger, Wiener, and Prigogine share a common idea: a system can take in an external entropy (information) flow from its environment to resist the increasing trend of entropy within the system. They measure the amount of this external flow by the change in entropy (information) it causes within the system. Because the external entropy (information) flow can produce an entropy decrease effect within the system, its quantity can be measured by the degree of that entropy decrease effect, and it is at the same time relatively defined as negative entropy.
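This shared idea can be stated compactly with the standard entropy balance of nonequilibrium thermodynamics (a textbook formulation, reproduced here only for orientation):

\[ \mathrm{d}S = \mathrm{d}_i S + \mathrm{d}_e S, \qquad \mathrm{d}_i S \ge 0, \]

where \(\mathrm{d}_i S\) is the entropy produced inside the system and \(\mathrm{d}_e S\) is the entropy exchanged with the environment. The system's entropy can fall only when \(\mathrm{d}_e S < -\mathrm{d}_i S\); this negative exchange term is precisely the external "negative entropy" flow that is measured by the internal entropy decrease it produces.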

It seems that the function −Σ P_i log P_i has a unique value. In the static state, its absolute value indicates the uncertainty of the micro state in which the system is located; dynamically, the change of the function value indicates the change of that uncertainty. This change is caused by a change in n, the number of micro states of the system, and by a change in the probability distribution P_i. In general, an increase in n together with a tendency of the P_i toward equalization yields a process of entropy increase, while a decrease in n together with a tendency of the P_i toward differentiation yields a process of entropy decrease, as the short sketch below shows.
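A two-line check of both directions, with hypothetical before/after distributions:

```python
import math

def H(p):
    """Shannon entropy -sum(p_i * log2 p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Entropy increase: n grows from 2 to 4 and the P_i level out toward equilibrium.
print(f"{H([0.25, 0.25, 0.25, 0.25]) - H([0.7, 0.3]):+.3f} bits")  # +1.119

# Entropy decrease: n shrinks from 4 to 2 and the P_i differentiate.
print(f"{H([0.9, 0.1]) - H([0.25, 0.25, 0.25, 0.25]):+.3f} bits")  # -1.531
```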

As for the general idea that information and entropy are simply opposites, it is more of an artificial formulation. The statement that the entropy decrease effect of the system is caused by external information is equivalent to the statement that the system's entropy decrease effect is caused by the entropy flow introduced from the external environment. Prigogine uses external entropy flow, Schrödinger uses negative entropy, and Wiener uses both information and negative entropy; in fact, they are all measuring the quantity of the same type of change in the same process. We have every reason to regard the various theories of entropy, information, and negative entropy as theories about the quantity of entropy, and equally every reason to regard them as theories about the quantity of information. On this basis, we can establish a kind of generalized entropy theory, or generalized information theory, to unify the discussions of the quantity of entropy and information that have been and are being carried out in different disciplines.

5. Comments and Conclusions

At this point, we are able to evaluate and summarize the nature of the relevant information entropy or information negative entropy theories.

Firstly, the information entropy or negative entropy theories are relative measurements of the degree of structuralization at a specific level of the system, and thus have a certain character of relativity. The information entropy theory measures the relationship between the diversity of structural organization and the degree of uncertainty difference at a specific level of the system, while the negative entropy theory of information measures the diversity of modes of structural organization and the degree of uncertainty reduction at a specific level of the system.

Secondly, we notice that in the general theory the concept of information is defined and explained, in a special sense, as negative entropy. There are two related statements in communication and control theory: "information is the eliminated uncertainty" and "information is negative entropy". However, these statements only emphasize the role of information for the receiver; they are functional definitions of information from a specific perspective. Such a definition does not reveal what information is; at most it emphasizes, from a specific perspective, what information does for the receiver. It is impossible to reveal the nature of information from such an interpretation.

Moreover, the information entropy or negative entropy theories only measure the state of a certain aspect of the system, and the degree of state change in that aspect, by a particular method of calculating amounts. In this regard, these theories have a specifically defined quantitative character.

Given these characteristics of relativity, functionality, and quantification, the theory of information entropy or negative entropy is in essence only a technical method of quantitative processing for mechanical communication and control processes, not a theory about the nature of information.

It is necessary to mention here that as early as 1928, Hartley (1890–1970), an American communications expert, pointed out in the article "Transmission of Information" that "information refers to the message with new content and knowledge" ([21] (page 535)). This is also the commonly accepted definition of information recognized and expressed by people in daily life and in the general literature. Obviously, this definition is in line with the meanings of "information is the eliminated uncertainty" and "information is negative entropy" mentioned earlier, since it is formulated in terms of whether the message can bring new content to the receiver. Equally obviously, such a definition is also relative and functional and cannot serve as an essential explanation of what information is.

People usually regard "information is negative entropy" as Wiener's standard definition of information and interpret the general nature of information from it. However, they have not seriously conducted discrimination and analysis, but have arbitrarily extended an explanation Wiener made only as a quantitative description, for the relative functionalization of technical processing, to a general, universal scope. In fact, Wiener's statement that "information is negative entropy" is just a practical interpretation of communication and control information from the perspective of technical processing, using the existing calculation methods of entropy, and is only a kind of measure of the amount of practical information. What he sought was only a method of quantitative processing realized by technology, not a revelation of the general nature of information. By the same token, the statement that "information is the eliminated uncertainty" focuses only on a quantitative processing method realized by technology. As some scholars have pointed out in interdisciplinary research on information, "Wiener's types of mathematical definitions of information related to mathematical or physical concepts of negative entropy cannot adequately encompass the experiential embodied pragmatic semantic meaningful content of ordinary sign games of living systems and the language games of embodied conscious humans" ([22] (pages 622–633)).

In fact, Wiener himself was very clear about the sense in which his "information is negative entropy" was used, because when he put forward this statement he also discussed the general nature of information. He has two influential formulations. One is "information is information, not matter or energy. No materialism which does not admit this can survive at the present day" ([15] (page 133)). The second is "information is the name of the content that we exchange with the external world in the process of adapting to it and making this adaptation felt by the external world" ([23] (page 4)).

Obviously, Wiener's first sentence emphasizes the ontological status of information. Although in this sentence he failed to positively define the nature of information, he correctly emphasized the independent value and significance of information relative to matter (mass) and energy, and he also issued a warning about any materialist theory that fails to give a reasonable interpretation of the ontological status of information.

Wiener's second sentence further emphasizes the need to clarify the general nature of information. Instead of simply focusing on the form of the information carrier or the function of the information, we should grasp information on the basis of what we "exchange with the outside world". Since it is an "exchange", there must be both input and output. In this way, there is information not only within the subject but also in the external environment, and in this regard the corresponding doctrines of objective information and subjective information should be valid. This also shows the true force of the saying Wiener emphasizes, that "information is information, not matter or energy".

It is regrettable that, for a long time, Wiener's clear warning to philosophy did not attract the attention of more philosophers and scientists. Not only has the revolutionary value of information for the development of philosophy not been clearly revealed, but a unified information science has not been established either, because the establishment of a unified information science must be based on the general theory of the philosophy of information.

In addition, we should also note that the statements "information is the eliminated uncertainty" and "information is negative entropy" are one-sided even as functional definitions, because in the real world the role of information is multifaceted and multilayered: it can not only eliminate uncertainty but also increase it; it can play the role of negative entropy as well as the role of entropy. For example, when a person is sick, he takes medicine to eliminate the disorder the disease causes in his body; but what happens if he takes the wrong medicine? Obviously the medicine provides him with corresponding information, but this information does not always eliminate uncertainty or act as negative entropy. In some cases it plays the opposite role, increasing uncertainty or entropy.

An ancient Chinese text, "Stratagems of the Warring States: Qin Stratagem II", tells the parable of a terrifying rumor. It is said that Zeng Zi's mother was weaving at home when a neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother did not believe it, saying, "I know my son; he would not kill anyone," and continued to weave calmly. After a while, another neighbor came to tell her, "Zeng Zi has killed someone." Zeng Zi's mother still did not believe it, said, "He would not kill anyone," and continued to weave. However, when a third neighbor came to tell her, "Zeng Zi has killed someone," Zeng Zi's mother could no longer sit still; she put down her work and fled over the wall.

In this parable, what effect does the information that "Zeng Zi has killed someone" have on his mother? Is it entropy or negative entropy? Does it produce entropy increase or entropy decrease? Is uncertainty increased or eliminated?

Moreover, if we generalize the functional definition "information is the eliminated uncertainty", we arrive at some very ridiculous scenarios. In a book published as early as 1987, Wu wrote: "The role of information is fundamentally different from what information itself is. The nature of information can only be sought from the inner basis of its own content and cannot be formulated simply by its effect on a certain aspect of the sink. Just as the definition of food cannot be 'eliminated hunger', the definition of information cannot be 'eliminated uncertainty'" ([24] (page 8)).

Finally, we come to the most essential point to be emphasized: what these theories measure is the relationship of structural differences at specific levels of the system, considered only from the aspect of form; this is aimed not at the information itself but merely at the structural characteristics of the information carrier. Because of this, it is impossible to deduce the general nature of information directly from such a theory. It is no wonder that some western scholars have clearly and reasonably pointed out that "information theory deals with the carrier of information, symbols and signals, not information itself" and that "information theory does not deal with the information itself but the carrier of the information" ([25] (page 150)).

Since the calculation of the quantity of entropy and negative entropy involves the probability distribution of the micro states of the system being measured, it is natural that related viewpoints, such as the degree of orderly or disorderly organization (order) of the system, "degree of variation", "differences and constraints", "symmetry breaking", "difference that makes a difference", "form and structure", and "state of things", are directly derived from the theory of entropy and negative entropy. And since such views are directly deduced from theories about the quantity of entropy and negative entropy, it is likewise impossible to obtain a formulation of the general nature of information through them.

Obviously, to reveal the essence of information, we should not focus only on the differential relationships of the carrier's forms; we must understand the content presented by the information: the relevant properties, characteristics, existing modes, and states of the things themselves.

In an article published as early as 1986, Wu wrote the following: "Information is the formulation of something itself displayed in another, alienated from something itself; it is the indirect existence of something itself, which exists in other things. Information is the formulation of something revealed in the relationship between something and other things. Something is information when it displays itself as internal formulation in an external relationship, which is expressed in the form of externalization of the characteristics of the object" ([26] (page 19)).

Based on the content of information and the dynamic mechanism of the natural occurrence of information, Wu clearly defined information in a paper entitled "Philosophical Classification of Information Forms", published in 1984: "Information is a philosophical category indicating indirect being. It is the self-manifestation of the existing mode and status of matter (direct being)" ([27] (page 33)). In 2019, Wu expanded this definition, which had been restricted to the level of philosophical ontology, on the basis of the historical evolution of the information forms he had classified: "Information is a philosophical category indicating indirect being. It is the self-manifestation and re-manifestation of the existing mode and status of matter (direct being), as well as the subjective grasp and creation of information by the subject of cognition and practice, including the cultural world that has been created" ([28] (page 143)).


The discussion in this paper is not meant to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence science and technology, life science and technology, and other related fields. Its main purpose is to reveal the specific properties of those theories: what they reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a specification does not touch the essence of information itself, and it likewise means that the many comparative interpretations of the nature of information based on entropy and negative entropy theories cannot guide us to truly grasp and understand the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. Although this material-structure processing method remains technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other, it must be emphasized that the theories and methods of entropy and negative entropy do not directly concern the existence mode of information itself, nor the meaning and value of information. To truly reveal the nature of information and its fundamental difference from material phenomena, we need to find another way: a research level and research method grounded in a more comprehensive and general meta-science or meta-philosophy, focused on the existence mode of information itself and on its meaning and value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Authors' Contributions

Kun Wu is the host of the project; Qiong Nan and Tianqi Wu are both participants in the project.

Acknowledgments

This article was funded by a major project of the National Social Science Foundation of China, "The History, Present Situation and Future of Information Philosophy" (Project Approval no. 18ZDA027).

References

[1] C. Shannon, "The mathematical theory of communication," Bell System Technical Journal, vol. 27, pp. 379–423, 1948.

[2] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, Technology Press/John Wiley & Sons, New York, NY, USA, 1948.

[3] B. Castellani and R. Rajaram, "Past the power law: complex systems and the limiting law of restricted diversity," Complexity, vol. 21, no. S2, pp. 99–112, 2016.

[4] J. E. Contreras-Reyes, "Rényi entropy and complexity measure for skew-Gaussian distributions and related families," Physica A: Statistical Mechanics and its Applications, vol. 433, pp. 84–91, 2015.

[5] R. Arellano-Valle, J. Contreras-Reyes, and M. Stehlík, "Generalized skew-normal negentropy and its application to fish condition factor time series," Entropy, vol. 19, no. 10, p. 528, 2017.

[6] R. Clausius, "Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie," Abhandlungen über die mechanische Wärmetheorie, vol. 2, p. 34, 1867.

[7] L. Boltzmann, Vorlesungen über Gastheorie, vol. II, J. A. Barth, Leipzig, Germany; translated together with Volume I by S. G. Brush as Lectures on Gas Theory, University of California Press, Berkeley, CA, USA, 1964.

[8] C. Shannon, The Mathematical Theory of Communication, in Theoretical Foundations of Information Theory, Shanghai Science and Technology Compilation Museum, Shanghai, China, 1965.

[9] J. Locke, An Essay Concerning Human Understanding, W. Yolton, Ed., Dutton, New York, NY, USA, 1961.

[10] D. Hume, A Treatise of Human Nature, L. A. Selby-Bigge, Ed., Clarendon Press, Oxford, UK, 1896.

[11] C. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, USA, 1949.

[12] P. Yuanzheng and L. Jianhua, Selected Compilation of Classical Documents of System Theory, Cybernetics and Information Theory, pp. 614–616, Pragmatic Press, Beijing, China, 1989.

[13] E. Schrödinger, What is Life? The Physical Aspect of the Living Cell, Cambridge University Press, Cambridge, UK, 1994.

[14] E. Schrödinger, What is Life?, L. Laiou and L. Liaofu, Eds., pp. 69–70, Hunan Science and Technology Press, Changsha, China, 2003.

[15] N. Wiener, Cybernetics, H. Jiren, Ed., Science Press, Beijing, China, 1963.

[16] K. Wu, "The difference and unity of information quantity formulas of Shannon and Wiener from the perspective of philosophy," Journal of Yanbian University, vol. Z1, pp. 33–34, 1987.

[17] I. Prigogine, Étude Thermodynamique des Phénomènes Irréversibles, Dunod, Paris, France, 1947.

[18] I. Prigogine and I. Stengers, Order Out of Chaos: Man's New Dialogue with Nature, Bantam, New York, NY, USA, 1984.

[19] K. Wu, "Analysis of the scientific meaning of several concepts related to entropy," Journal of Dialectics of Nature, vol. 5, pp. 67–74, 1996.

[20] C. Tielin, Entropy Increase and Negative Entropy Increase and Proposal of the Law of Conservation of Entropy Quantity, Studies in Dialectics of Nature, Beijing, China, 1992.

[21] R. V. L. Hartley, "Transmission of information," Bell System Technical Journal, vol. 7, pp. 535–563, 1928.

[22] S. Brier, "Finding an information concept suited for a universal theory of information," Progress in Biophysics and Molecular Biology, vol. 119, no. 3, pp. 622–633, 2015.

[23] N. Wiener, Selected Works of Wiener, Z. Ren, Ed., Shanghai Translation Press, Shanghai, China, 1978.

[24] K. Wu and L. Qi, Introduction to Philosophical Information, Shaanxi People's Press, Shaanxi, China, 1987.

[25] L. Floridi, Guide to the Philosophy of Computing and Information (I), L. Gang, Ed., Commercial Press, Beijing, China, 2010.

[26] K. Wu, "On in-itself information," Academic Monthly, Shanghai, vol. 19, 1986.

[27] K. Wu, "Philosophical classification of information forms," Potential Science, Beijing, vol. 33, no. 3, 1984.

[28] K. Wu, W. Jian, and W. Tianqi, An Introduction to the Philosophy of Information, Xi'an Jiaotong University Press, Xi'an, China, 2019.

Complexity 11

entropy decrease effect of the system is caused by the ex-ternal information is equivalent to the statement that thesystemrsquos entropy decrease effect is caused by the entropyflow introduced by the external environment Prigogine usesexternal entropy flow Schrodinger uses negative entropyand Wiener uses both information and negative entropy infact they are measuring the quantity of the same type ofchange in the same process We have every reason to regardthe various theories of entropy information and negativeentropy as theories about entropy quantity and at the sametime we have all the reasons to regard them as theories aboutinformation quantity Based on this we can establish a kindof generalized entropy theory or a generalized informationtheory to unify the discussions of the quantity of entropy andinformation that have been carried out and are ongoing indifferent disciplines

5 Comments and Conclusions

At this point we are able to evaluate and summarize thenature of the relevant information entropy or informationnegative entropy theories

Firstly the information entropy or negative entropytheories are the relative measurement for the structurali-zation degree at a specific level of the system which hascertain characteristics of relativity )e information entropytheory measures the relationship between the diversity ofstructural organization and the degree of uncertainty dif-ference at a specific level of the system while the negativeentropy theory of information measures the diversity ofstructural organization methods and the degree of uncer-tainty reduction at a specific level of the system

Secondly we notice that in the general theory theconcept of information is defined and explained in a specialsense as negative entropy )ese are two related statementsin communication and control theory ldquoinformation is theeliminated uncertaintyrdquo and ldquoinformation is negative en-tropyrdquo However these two statements only emphasize therole of information to the receiver which is the functionaldefinition of information from a specific perspective )isdefinition does not reveal what the information is At mostit only emphasizes from a specific perspective the role ofinformation to the receiver It is impossible to reveal thenature of information from such an interpretation

Moreover the information entropy or negative entropytheories are only to measure the state of a certain aspect ofthe system and the degree of state change in that aspect by acertain calculation method of amount In this regard theinformation entropy or negative entropy theories have theproperty of specifically defined quantitative characteristic

In terms of such characteristics of relativity function-ality and quantification the theory of information entropyor negative entropy is only a technical quantitative pro-cessing method for mechanical communication and con-trolling processes in essence not a theory about the nature ofinformation

It is necessary to mention here that as early as 1928Hartley (1890ndash1970) an American communications expertpointed out in the article ldquoTransmission of Informationrdquo that

ldquoInformation refers to the message with new content andknowledgerdquo ([21] (page 535)) )is is also an acquireddefinition of information recognized and expressed bypeople in their daily life and in general literature Obviouslythis definition is in line with the meanings of ldquoinformation isthe eliminated uncertaintyrdquo and ldquoinformation is negativeentropyrdquo mentioned earlier and it is formulated in the senseof whether the message can bring new content to the re-ceiver Obviously such a definition is also relative andfunctional and cannot be used as an essential explanation ofwhat information is

Usually people always regard the ldquoinformation is neg-ative entropyrdquo as the standard definition of information byWiener and interpret the general nature of information fromthis However they did not seriously conduct discriminationand analysis but extended the explanation made by Wieneronly in the sense of the quantitative description of therelative functionalization of technical processing to thegeneral universal scope at will In factWienerrsquos statement onldquoinformation is negative entropyrdquo is just a practical inter-pretation of communication and control information fromthe perspective of technical processing by using the existingcalculationmethods of entropy and is only a kind of measureof the amount of practical information What he seeks isonly a method of quantitative processing realized by tech-nology but not to reveal the general nature of information atall By the same token the statement that ldquoinformation is theeliminated uncertaintyrdquo focuses only on a quantitativeprocessing method realized by technology As some scholarspointed out in the interdisciplinary research on informationldquoWienerrsquos types of mathematical definitions of informationrelated to mathematical or physical concepts of negativeentropy cannot adequately encompass the experientialembodied pragmatic semantic meaningful content of or-dinary sign games of living systems and the language gamesof embodied conscious humansrdquo ([22] (pages 622ndash633))

In fact Wiener himself is very clear in what sense hisldquoinformation is negative entropyrdquo is used because when heput forward this statement he also made a correspondingdiscussion on the general nature of information He has twoinfluential formulations One is ldquoinformation is informationnot matter or energy No materialism which does not admitthis can survive in the present dayrdquo ([15] (page 133)) )esecond is ldquoinformation is the name of the content that weexchange with the external world in the process of adaptingto it and making this adaptation felt by the external worldrdquo([23] (page 4))

Obviously the first sentence of Wiener emphasizes theontological status of information Although he failed tocorrectly define the nature of information from the positiveside in this sentence he correctly emphasized the inde-pendent value and significance of information comparedwith matter (quality) and energy and he also put forward awarning about the relevant materialism theory that failed tomake a reasonable interpretation of the ontological status ofinformation

Wienerrsquos second sentence further emphasizes the need toclarify the general nature of information Instead of simplyfocusing on the form of the information carrier or the

8 Complexity

function of the information we should grasp the infor-mation based on what we ldquoexchange with the outside worldrdquoSince it is ldquoexchangerdquo there should be in and out In this waythere is information not only within our subject but also inthe external environment In this regard the correspondingdoctrines of objective information and subjective infor-mation should be valid)is also shows the true charm of thesaying that ldquoinformation is information not matter or en-ergyrdquo which Wiener emphasizes

It is regrettable that for a long time Wienerrsquos clearwarning to philosophy has not attracted the attention ofmore philosophers and scientists Not only has the revo-lutionary value of information for the development ofphilosophy not been clearly revealed but also unified in-formation science has not been established because theestablishment of unified information science must be basedon the general theory of information philosophy

In addition we should also note that the statementsldquoinformation is the eliminated uncertaintyrdquo and ldquoinforma-tion is negative entropyrdquo are also single-faceted in the senseof functional definition Because in the real world the roleof information is multifaceted and multilayered it can notonly eliminate uncertainty but also increase uncertainty itcan play the role of negative entropy as well as the role ofentropy For example when a person is sick he should takemedicine to eliminate the disorder caused by the disease inhis body but what happens if he takes the wrong medicineObviously the medicine will provide him with the corre-sponding information but this information does not alwaysplay a role in eliminating uncertainty or negative entropy Insome cases it may play the opposite role which is to increaseuncertainty or entropy

An ancient Chinese literature ldquoStratagems of the War-ring States Qin Stratagem IIrdquo tells a parable of ldquoterrifyingrumorrdquo It was said that Zeng Zirsquos mother was weaving athome and a neighbor came to tell her that ldquoZeng Zi haskilled someonerdquo Zeng Zirsquos mother did not believe and saidldquoI know my son he will not kill peoplerdquo She continued toweave calmly After a while another neighbor came to tellher that ldquoZeng Zi has killed someonerdquo Zeng Zirsquos mother stilldid not believe it and said ldquoHe wonrsquot kill anyonerdquo andcontinued to weave However when the third neighborcame to tell her ldquoZeng Zi has killed someonerdquo finally ZengZirsquos mother could not sit still and she put down her workand fled across the wall

In this parable what effect does the information thatldquoZeng Zi has killed someonerdquo have on his mother Is itentropy or negative entropy Is it entropy increase or de-crease Is uncertainty increased or eliminated

Also if we generalize the functional definition of ldquoin-formation is the eliminated uncertaintyrdquo then we will seesome very ridiculous scenarios In a book published as early as1987 Wu once wrote ldquo)e role of information is funda-mentally different from what information itself is )e natureof information can only be sought from the inner basis of itsown content but cannot be formulated simply by its effect ona certain aspect of the sink Just as the definition of foodcannot be lsquoeliminated hungerrsquo the definition of informationcannot be lsquoeliminated uncertaintyrsquordquo ([24] (page 8))

Finally here comes the most essential aspect that shouldbe emphasized that is the relationship of structural dif-ferences at specific levels of the system measured only fromthe aspect of the form which does not aim at the infor-mation itself but merely aims at the structural characteristicsof the information carrier itself Because of this it is im-possible to deduce the general nature of information directlyfrom such a theory It is no wonder that some westernscholars have clearly and reasonably pointed out that ldquoIn-formation theory deals with the carrier of informationsymbols and signals not information itselfrdquo and ldquoInfor-mation theory does not deal with the information itself butthe carrier of the informationrdquo ([25] (page 150))

Since the calculation of the quantity of entropy andnegative entropy involves the probability distribution of themicro states of the system being measured it is reasonablethat relevant viewpoints such as the degree of orderly ordisorderly organization (order) of the system ldquodegree ofvariationrdquo ldquodifferences and constraintsrdquo ldquosymmetrybreakingrdquo ldquodifference that makes a differencerdquo ldquoform andstructurerdquo and ldquostate of thingsrdquo are directly derived from thetheory of entropy and negative entropy Since related viewssuch as these are directly deduced from the theories aboutthe quantity of entropy and negative entropy it is alsoimpossible to obtain the formulation of the general nature ofinformation through them

Obviously to reveal the essence of information weshould not just focus on the differential relationship of thecarrier forms but we must understand the contents ofrelevant properties characteristics existing modes andstates of the things itself presented by the information

In an article published as early as 1986 Wu wrote thefollowing sentences ldquoinformation is the formulation ofsomething itself displayed in another that alienated bysomething itself it is the indirect existence of somethingitself which exist in other things Information is theformulation of something revealed in the relationship be-tween something and other things Something is informa-tion when it displays itself as internal formulation in anexternal relationship which is expressed in the form ofexternalization of the characteristics of the objectrdquo ([26](page 19))

Based on the content of information and the dynamicmechanism of natural occurrence of information Wu onceclearly defined information as follows ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation of the existing mode and status of matter(direct being)rdquo in a paper entitled ldquoPhilosophical Classifi-cation of Information Formsrdquo which was published in 1984([27] (page 33)) In 2019 Wu expanded the definition ofinformation that was only restricted to the level of philo-sophical ontology based on the historical evolution of in-formation forms classified by him ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation and re-manifestation of the existing modeand status of matter (direct being) as well as the subjectivegrasp and creation of information by the subject of cognitionand practice including the cultural world that has beencreatedrdquo ([28] (page 143))

Complexity 9

)e relevant discussion in this paper was not to negatethe great success of entropy and negative entropy theories inphysics communication and information science andtechnology artificial intelligence science and technology lifescience technology and other related science and technologyfields )e main purpose of the article was to reveal thespecific properties of the entropy and negative entropytheories )at is what those theories reveal are only thequantitative formulations of the static or dynamic relativedifference in the formal structure of the information carrierSuch a provision does not involve the essence of the in-formation itself )is scenario also stipulates many com-parative interpretations of the nature of information basedon entropy and negative entropy theories which are alsoimpossible to guide us to truly grasp and understand thenature of information In addition from the perspective ofmethodology entropy and negative entropy theories focusonly on the relationship between the material structures ofthe information carrier the method used is still that ofdealing with material phenomena and relationships Al-though the corresponding material structure processingmethod is still technically feasible and successful since thematerial relationships between information and its carrierstructure are corresponding to each other it is necessary toemphasize that since the theories and methods of entropyand negative entropy are not directly concerning the exis-tence mode of information itself as well as the meaning andvalue of information to truly reveal the nature of infor-mation and the fundamental difference between it andmaterial phenomena we need to find another way which isthe research level and research method based on a morecomprehensive and general meta science or meta philosophyand focusing on the existence mode of information itself andits meaning and value

Data Availability

)e data used to support the findings of this study areavailable from the corresponding author upon request

Conflicts of Interest

)e authors declare no conflicts of interest

Authorsrsquo Contributions

KunWu is the host of the project Qiong Nan and TianqiWuare both participants in the project

Acknowledgments

)is article was funded by a major project of the NationalSocial Science Foundation of China )e History PresentSituation and Future of Information Philosophy (ProjectApproval no 18ZDA027)

References

[1] C Shannon ldquo)e mathematical theory of communicationrdquoBellsystem Technical Journal vol 27 pp P379ndashP423 1948

[2] NWiener Cybernetics Or Control and Communication in theAnimal and the Machine Technology PressJohn Wiley ampSons New York NY USA 1948

[3] B Castellani and R Rajaram ldquoPast the power law complexsystems and the limiting law of restricted diversityrdquo Com-plexity vol 21 no S2 pp 99ndash112 2016

[4] J E Contreras-Reyes ldquoRenyi entropy and complexity mea-sure for skew-Gaussian distributions and related familiesrdquoPhysica A Statistical Mechanics and its Applications vol 433pp 84ndash91 2015

[5] R Arellano-Valle J Contreras-Reyes and M StehlıkldquoGeneralized skew-normal negentropy and its application tofish condition factor time seriesrdquo Entropy vol 19 no 10p 528 2017

[6] R Clausius ldquoUber verschiedene fur die Anwendung bequemeFormen der Hauptgleichungen der mechanischenWarmetheorierdquo Abhandlungen Uber Die MechanischeWarmetheorie vol 2 p 34 1867

[7] L BoltzmannVorlesungen uber Gastheorie vol II Leipzig JABarth translated together with Volume I by SG BrushUniversity of California Press Berkeley CA USA Lectureson Gas )eory University of California Press Berkeley CAUSA 1964

[8] C Shannon Fe Mathematical Feory of Communication inFeoretical Foundations of Information Feory ShanghaiScience and Technology Compilation Museum ShanghaiChina 1965

[9] J Locke An Essay Concerning Human UnderstandingW Yolton Ed Dutton New York NY USA 1961

[10] D Hume A Treatise of Human Nature L A Selby-Bigge EdClarendon Press Oxford UK 1896

[11] C Shannon and W Weaver Fe Mathematical Feory ofCommunication University of Illinois Press Urbana IL USA1949

[12] P Yuanzheng and L Jianhua Selected Compilation of ClassicalDocuments of System Feory Classical Documents of SystemFeory Cybernetics and Information Feory vol 614ndash616Beijing Pragmatic Press Beijing China 1989

[13] E SchrodingerWhat is Life Fe Physical Aspect of the LivingCell Cambridge University Press Cambridge UK 1994

[14] E SchrodingerWhat is life L Laiou and L Liaofu Eds vol69ndash70 Changsha Hunan Science and Technology PressChangsha China 2003

[15] N Wiener Cybernetics H Jiren Ed Beijing Science PressBeijing China 1963

[16] K Wu ldquo)e difference and unity of information quantity for-mulas of Shannon and Wiener from the perspective of phi-losophyrdquoYanji Journal of YanbianUniversity vol Z1 pp 33-341987

[17] I Prigogine Etude Fermodynamique des Phenomenes Irre-versibles Dunod Paris France 1947

[18] I Prigogine and I Stengers Order Out of Chaos Manrsquos NewDialogue with Nature Bantam New York NY USA 1984

[19] K Wu ldquoAnalysis of the scientific meaning of several conceptsrelated to entropyrdquo Journal of Dialectics of Nature vol 5pp 67ndash74 1996

[20] C Tielin Entropy Increase and Negative Entropy Increase andProposal of Fe Law of Conservation of Entropy QuantityBeijing Studies in Dialectics of Nature Beijing China 1992

[21] R V L Hartley ldquoTransmission of informationrdquo Bell SystemTechnical Journal vol 535 no 7 1928

[22] S Brier ldquoFinding an information concept suited for a uni-versal theory of informationrdquo Progress in Biophysics andMolecular Biology vol 119 no 3 pp 622ndash633 2015

10 Complexity

[23] N Wiener Selected Works of Wiener Z Ren Ed ShanghaiShanghai Translation Press Shanghai China 1978

[24] K Wu and L Qi Introduction to Philosophical InformationShaanxi Shaanxi Peoplersquos Press Shaanxi China 1987

[25] L Floridi Guide to the Philosophy of Computing and Infor-mation (I) L Gang Ed Beijing Commercial Press BeijingChina 2010

[26] K Wu ldquoOn in-itself informationrdquo Shanghai AcademicMonthly vol 19 1986

[27] K Wu ldquoPhilosophical classification of information formsrdquoBeijing Potential Science vol 33 no 3 1984

[28] K Wu W Jian and W Tianqi An Introduction to thePhilosophy of Information Xirsquoan Jiaotong University PressXirsquoan China 2019

Complexity 11

function of the information we should grasp the infor-mation based on what we ldquoexchange with the outside worldrdquoSince it is ldquoexchangerdquo there should be in and out In this waythere is information not only within our subject but also inthe external environment In this regard the correspondingdoctrines of objective information and subjective infor-mation should be valid)is also shows the true charm of thesaying that ldquoinformation is information not matter or en-ergyrdquo which Wiener emphasizes

It is regrettable that for a long time Wienerrsquos clearwarning to philosophy has not attracted the attention ofmore philosophers and scientists Not only has the revo-lutionary value of information for the development ofphilosophy not been clearly revealed but also unified in-formation science has not been established because theestablishment of unified information science must be basedon the general theory of information philosophy

In addition we should also note that the statementsldquoinformation is the eliminated uncertaintyrdquo and ldquoinforma-tion is negative entropyrdquo are also single-faceted in the senseof functional definition Because in the real world the roleof information is multifaceted and multilayered it can notonly eliminate uncertainty but also increase uncertainty itcan play the role of negative entropy as well as the role ofentropy For example when a person is sick he should takemedicine to eliminate the disorder caused by the disease inhis body but what happens if he takes the wrong medicineObviously the medicine will provide him with the corre-sponding information but this information does not alwaysplay a role in eliminating uncertainty or negative entropy Insome cases it may play the opposite role which is to increaseuncertainty or entropy

An ancient Chinese literature ldquoStratagems of the War-ring States Qin Stratagem IIrdquo tells a parable of ldquoterrifyingrumorrdquo It was said that Zeng Zirsquos mother was weaving athome and a neighbor came to tell her that ldquoZeng Zi haskilled someonerdquo Zeng Zirsquos mother did not believe and saidldquoI know my son he will not kill peoplerdquo She continued toweave calmly After a while another neighbor came to tellher that ldquoZeng Zi has killed someonerdquo Zeng Zirsquos mother stilldid not believe it and said ldquoHe wonrsquot kill anyonerdquo andcontinued to weave However when the third neighborcame to tell her ldquoZeng Zi has killed someonerdquo finally ZengZirsquos mother could not sit still and she put down her workand fled across the wall

In this parable what effect does the information thatldquoZeng Zi has killed someonerdquo have on his mother Is itentropy or negative entropy Is it entropy increase or de-crease Is uncertainty increased or eliminated

Also if we generalize the functional definition of ldquoin-formation is the eliminated uncertaintyrdquo then we will seesome very ridiculous scenarios In a book published as early as1987 Wu once wrote ldquo)e role of information is funda-mentally different from what information itself is )e natureof information can only be sought from the inner basis of itsown content but cannot be formulated simply by its effect ona certain aspect of the sink Just as the definition of foodcannot be lsquoeliminated hungerrsquo the definition of informationcannot be lsquoeliminated uncertaintyrsquordquo ([24] (page 8))

Finally here comes the most essential aspect that shouldbe emphasized that is the relationship of structural dif-ferences at specific levels of the system measured only fromthe aspect of the form which does not aim at the infor-mation itself but merely aims at the structural characteristicsof the information carrier itself Because of this it is im-possible to deduce the general nature of information directlyfrom such a theory It is no wonder that some westernscholars have clearly and reasonably pointed out that ldquoIn-formation theory deals with the carrier of informationsymbols and signals not information itselfrdquo and ldquoInfor-mation theory does not deal with the information itself butthe carrier of the informationrdquo ([25] (page 150))

Since the calculation of the quantity of entropy andnegative entropy involves the probability distribution of themicro states of the system being measured it is reasonablethat relevant viewpoints such as the degree of orderly ordisorderly organization (order) of the system ldquodegree ofvariationrdquo ldquodifferences and constraintsrdquo ldquosymmetrybreakingrdquo ldquodifference that makes a differencerdquo ldquoform andstructurerdquo and ldquostate of thingsrdquo are directly derived from thetheory of entropy and negative entropy Since related viewssuch as these are directly deduced from the theories aboutthe quantity of entropy and negative entropy it is alsoimpossible to obtain the formulation of the general nature ofinformation through them

Obviously to reveal the essence of information weshould not just focus on the differential relationship of thecarrier forms but we must understand the contents ofrelevant properties characteristics existing modes andstates of the things itself presented by the information

In an article published as early as 1986 Wu wrote thefollowing sentences ldquoinformation is the formulation ofsomething itself displayed in another that alienated bysomething itself it is the indirect existence of somethingitself which exist in other things Information is theformulation of something revealed in the relationship be-tween something and other things Something is informa-tion when it displays itself as internal formulation in anexternal relationship which is expressed in the form ofexternalization of the characteristics of the objectrdquo ([26](page 19))

Based on the content of information and the dynamicmechanism of natural occurrence of information Wu onceclearly defined information as follows ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation of the existing mode and status of matter(direct being)rdquo in a paper entitled ldquoPhilosophical Classifi-cation of Information Formsrdquo which was published in 1984([27] (page 33)) In 2019 Wu expanded the definition ofinformation that was only restricted to the level of philo-sophical ontology based on the historical evolution of in-formation forms classified by him ldquoInformation is aphilosophical category indicating indirect being It is theself-manifestation and re-manifestation of the existing modeand status of matter (direct being) as well as the subjectivegrasp and creation of information by the subject of cognitionand practice including the cultural world that has beencreatedrdquo ([28] (page 143))


The discussion in this paper was not meant to negate the great success of entropy and negative entropy theories in physics, communication and information science and technology, artificial intelligence, life science and technology, and other related fields. The main purpose of the article was to reveal the specific properties of the entropy and negative entropy theories: what those theories reveal are only quantitative formulations of the static or dynamic relative differences in the formal structure of the information carrier. Such a provision does not touch the essence of the information itself. This also explains why the many comparative interpretations of the nature of information based on entropy and negative entropy theories are equally unable to guide us to a true grasp and understanding of the nature of information. In addition, from the perspective of methodology, entropy and negative entropy theories focus only on the relationships between the material structures of the information carrier; the method used is still that of dealing with material phenomena and relationships. Although this material-structure processing method remains technically feasible and successful, since the material relationships between information and its carrier structure correspond to each other, it must be emphasized that the theories and methods of entropy and negative entropy do not directly concern the existence mode of information itself, nor the meaning and value of information. To truly reveal the nature of information and its fundamental difference from material phenomena, we need to find another way: a research level and research method grounded in a more comprehensive and general metascience or metaphilosophy, one that focuses on the existence mode of information itself and on its meaning and value.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Authors’ Contributions

Kun Wu is the host of the project; Qiong Nan and Tianqi Wu are both participants in the project.

Acknowledgments

This article was funded by a major project of the National Social Science Foundation of China, “The History, Present Situation, and Future of Information Philosophy” (Project Approval no. 18ZDA027).

References

[1] C. Shannon, “The mathematical theory of communication,” Bell System Technical Journal, vol. 27, pp. 379–423, 1948.

[2] N. Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, Technology Press/John Wiley & Sons, New York, NY, USA, 1948.

[3] B. Castellani and R. Rajaram, “Past the power law: complex systems and the limiting law of restricted diversity,” Complexity, vol. 21, no. S2, pp. 99–112, 2016.

[4] J. E. Contreras-Reyes, “Rényi entropy and complexity measure for skew-Gaussian distributions and related families,” Physica A: Statistical Mechanics and its Applications, vol. 433, pp. 84–91, 2015.

[5] R. Arellano-Valle, J. Contreras-Reyes, and M. Stehlík, “Generalized skew-normal negentropy and its application to fish condition factor time series,” Entropy, vol. 19, no. 10, p. 528, 2017.

[6] R. Clausius, “Über verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie,” Abhandlungen über die mechanische Wärmetheorie, vol. 2, p. 34, 1867.

[7] L. Boltzmann, Vorlesungen über Gastheorie, vol. II, J. A. Barth, Leipzig; translated together with Volume I by S. G. Brush as Lectures on Gas Theory, University of California Press, Berkeley, CA, USA, 1964.

[8] C. Shannon, The Mathematical Theory of Communication, in Theoretical Foundations of Information Theory, Shanghai Science and Technology Compilation Museum, Shanghai, China, 1965.

[9] J. Locke, An Essay Concerning Human Understanding, W. Yolton, Ed., Dutton, New York, NY, USA, 1961.

[10] D. Hume, A Treatise of Human Nature, L. A. Selby-Bigge, Ed., Clarendon Press, Oxford, UK, 1896.

[11] C. Shannon and W. Weaver, The Mathematical Theory of Communication, University of Illinois Press, Urbana, IL, USA, 1949.

[12] P. Yuanzheng and L. Jianhua, Selected Compilation of Classical Documents of System Theory, Cybernetics, and Information Theory, pp. 614–616, Pragmatic Press, Beijing, China, 1989.

[13] E. Schrödinger, What is Life? The Physical Aspect of the Living Cell, Cambridge University Press, Cambridge, UK, 1994.

[14] E. Schrödinger, What is Life?, L. Laiou and L. Liaofu, Eds., pp. 69–70, Hunan Science and Technology Press, Changsha, China, 2003.

[15] N. Wiener, Cybernetics, H. Jiren, Ed., Science Press, Beijing, China, 1963.

[16] K. Wu, “The difference and unity of information quantity formulas of Shannon and Wiener from the perspective of philosophy,” Journal of Yanbian University, vol. Z1, pp. 33-34, Yanji, China, 1987.

[17] I. Prigogine, Étude Thermodynamique des Phénomènes Irréversibles, Dunod, Paris, France, 1947.

[18] I. Prigogine and I. Stengers, Order Out of Chaos: Man’s New Dialogue with Nature, Bantam, New York, NY, USA, 1984.

[19] K. Wu, “Analysis of the scientific meaning of several concepts related to entropy,” Journal of Dialectics of Nature, vol. 5, pp. 67–74, 1996.

[20] C. Tielin, Entropy Increase and Negative Entropy Increase and the Proposal of the Law of Conservation of Entropy Quantity, Studies in Dialectics of Nature, Beijing, China, 1992.

[21] R. V. L. Hartley, “Transmission of information,” Bell System Technical Journal, vol. 7, no. 3, pp. 535–563, 1928.

[22] S. Brier, “Finding an information concept suited for a universal theory of information,” Progress in Biophysics and Molecular Biology, vol. 119, no. 3, pp. 622–633, 2015.


[23] N. Wiener, Selected Works of Wiener, Z. Ren, Ed., Shanghai Translation Press, Shanghai, China, 1978.

[24] K. Wu and L. Qi, Introduction to Philosophical Information, Shaanxi People’s Press, Shaanxi, China, 1987.

[25] L. Floridi, Guide to the Philosophy of Computing and Information (I), L. Gang, Ed., Commercial Press, Beijing, China, 2010.

[26] K. Wu, “On in-itself information,” Academic Monthly, vol. 19, Shanghai, China, 1986.

[27] K. Wu, “Philosophical classification of information forms,” Potential Science, no. 3, p. 33, Beijing, China, 1984.

[28] K. Wu, W. Jian, and W. Tianqi, An Introduction to the Philosophy of Information, Xi’an Jiaotong University Press, Xi’an, China, 2019.
