One Problem – One Thousand Faces (IS4IS video script)


ONE PROBLEM – ONE THOUSAND FACES (Bridging Philosophy and Science via “Core Informatics”)

SCRIPT: Hi, this is Marcus. This technical talk covers material I presented at the 2015 summit of the International Society for Information Studies. It covers parts of a paper that is even more detailed; links to the full paper are given at the end.

To introduce this talk: philosophy and science often appear as two opposed views, but this ignores their shared origin as Natural Philosophy. Natural Philosophy arose in early models of Earthly cause-and-effect, and today cause-and-effect remains the foundation of all philosophy and science. In creating an “informational edifice,” our strength in asking philosophic questions and finding scientific answers has produced such informational abundance that philosophy and science now seem separate – divided by the One Thousand Faces noted in the title. But despite our gains, explanatory gaps still linger, such as: “How does life arise from physics?”, “How does consciousness emerge from life?”, “What is intelligence?”, and so on.

To help address such issues, this talk revisits our interest in cause-and-effect to infer a new line of inquiry. It reframes matters in terms of information science, or “core informatics,” so that new models may arise. Such a foundational informatics may enhance our philosophic and scientific views, as both are information processes.

To begin, when we speak of science we target a clear description and explanation of events. But to cover what lies beneath science, a more basic view is needed. Core informatics thus studies the dynamic presentation and processing of all natural events, from which scientific rules arise. This means that on the surface many types of information exist, and scientific information is only a fraction of that larger informational pool. Moreover, several basic informational issues lie beyond classic scientific thought, as with the explanatory gaps noted earlier.
In studying the presentation and processing of natural events – or what I call information dynamics, the reading of information – a plain view can speed progress. For example, in information technology (IT) a hard disk drive (HDD) offers a simple informational model and can help resolve some of those issues. From this HDD start, this analysis explores ways in which those issues ARE and ARE NOT covered by current thinking on information.

2:47, slide 4 start (+9)

To begin this analysis, imagine an HDD in a computer before an operating system or applications are installed; before letters, photos, and music are loaded – that is to say, an HDD without useful content. Inversely, we can also imagine an HDD where useful content does exist. This points to a key question: “What is the central difference between those two informational states?” What is the difference between useless and useful content? A direct physical study of an HDD’s before-and-after states shows no real difference, even in the minute Bits. Despite suggestions from physicist John Wheeler, no informational “it from Bit” is instantly obvious. Only when several Bits are shown in a specific order, and where that order


drives a specific function, does useful content arise. This order and “meaning” are conveyed via metadata, usually recorded on track 0 of an HDD. Without metadata, HDD content is not possible. In starting with this routine view of metadata we lessen the issues we must confront. First, metadata offer a firm model of information – as the presentation of: 1) elemental data, 2) with a recurrent logical order, 3) that drives specific “meaning” or functions. Metadata thus answer the question “What is information?” Second, metadata speak to the symbol grounding problem by showing data and symbols in functionally grounded roles. Third, while Chalmers’s Hard Problem is not directly resolved, its answer only requires that we name a material basis for metadata, which I cover shortly.

Metadata interpretations of HDD Bits sustain all steps of using a computer and a wide range of useful outputs. This simple-to-complex informational scale is, in fact, a key benefit of ASCII metadata. That informational scale affirms that many styles of information exist, but this variety then stirs confusion when studying information, since naming the correct point for starting one’s analysis is unclear. To avoid debate on which of the One Thousand Faces is most central to studying information, a reductive view is needed. Philosopher Immanuel Kant states that to examine what is essentially “an experience” we must consider what exists prior to that experience – an a priori model is required.

5:24, slide 6 start (+9, +9)

Further, as with information, it is vital to see that many styles of metadata exist. DNA, the alphabet, a map, or a computer program are all types of metadata. Metadata variety also makes defining a model of information difficult, unless we focus on core elements. Opportunely, HDD metadata lend themselves to an a priori view. I therefore explore metadata in detail to develop an a priori model with core informational elements. But first, I raise a certain hurdle.
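The HDD point above – that physically identical Bits only become content once metadata assign them an order and a function – can be sketched in a few lines of Python. The byte values and the three interpretations below are illustrative assumptions of my own, not examples from the talk:

```python
# The same raw Bits on "disk": physically identical under any inspection.
raw = bytes([0x48, 0x69, 0x21])  # three bytes, no meaning yet

# Interpretation 1: metadata says "ASCII text".
as_text = raw.decode("ascii")           # 'Hi!'

# Interpretation 2: metadata says "a list of unsigned integers".
as_ints = list(raw)                     # [72, 105, 33]

# Interpretation 3: metadata says "one big-endian 24-bit number".
as_number = int.from_bytes(raw, "big")  # 4745505

print(as_text, as_ints, as_number)
```

The bytes never change; only the metadata reading does, which is what turns "useless" Bits into useful content.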
In attempting a scientific model our terms must be exact. That clarity is what makes science, Science. But when we look at definitions for metadata we often see an off-hand notion of “data about data.” This simple framing has many flaws, but foremost is that no rationale is given for how “data” might differ from “data.” If we seek a better concept, Bateson’s “difference that makes a difference” is similar and more specific. But Bateson also saw that his own terms did not cover enough and stated that “differences themselves must be differentiated” – but with no more detail. Further, cyberneticist Søren Brier notes that “to whom or to what” it makes a difference is never noted – or information subjectivity. To this I add that “to what end” it makes a difference is not named – or information objectivity; marking gaps in Bateson’s view of information. Still, Bateson’s notion of differentiated differences is central to a reductive view, since metadata are chiefly a record of differentiated differences. I thus explore metadata to see what lies beneath and to infer an a priori model. This should also help to realize Bateson’s deeper aim of naming a Necessary Unity, which is also a goal of this analysis. I begin this reconstruction of metadata by using the most basic view possible, that of binary digits or Bits.

7:19, slide 9 start (+26 = 9, 9, 8)


To start, I ask you to imagine a binary cosmos without metadata; no awareness exists of what these Bits might mean; we begin with an a priori state. In this binary flatland only two states are possible. If we could observe those states we would see random events as disorder, and regular traits as a type of generic order. This mirrors the randomness of cosmic background radiation, and the natural order of stars, planets, life, and so on. Further, it reflects Shannon’s view of noise and signal. We call these two states many things, but I label one side generic entropy and the other recurrent material order.

Beyond this simple duality, we may also see transitions occurring between the two states. Those transitions then indicate a crude “random versus recurrent” functioning – but with no details. Thus, primal duality, WITH crude functional roles, presents a useful difference for an entity that can observe and exploit that three-part role. In fact, any agent-driven attempt to read and exploit that role adds useful details, via practical outcomes. I label this useful difference “delta Z.” Following Bateson’s logic, delta Z presents a functional difference that may make a Darwinian difference. This means that the material or reproductive gains afforded by an entity’s useful sensate differentiations outstrip Chalmers’s Hard Problem: they answer all questions on the nature of useful experiences in terms of enduring life or likely death. Further, delta Z names the one shared trait common to any material aspect with a fixed identity – that of material resilience.

I call this core dualistic-triune role delta Z for many reasons, but most notably because Shannon and Weaver name signal entropy “Level A” and assume a prior existing message. This means a “prior to Level A” role must be explained – which delta Z covers. Finally, while delta Z is close to an a priori view it is no longer a priori, since we are now discussing things experienced as disorder and order.
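Shannon’s noise/signal duality invoked above can be made concrete with a per-symbol entropy calculation. This is a minimal sketch of my own, assuming first-order (symbol-frequency) entropy; the sample bit strings are invented for illustration:

```python
import math
from collections import Counter

def shannon_entropy(bits):
    """First-order Shannon entropy (bits per symbol) of a binary string."""
    counts = Counter(bits)
    n = len(bits)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random_like = "1101001011010001"  # mixed run: equal 0s and 1s, irregular order
ordered     = "1111111100000000"  # same symbol counts, but a regular pattern
constant    = "1111111111111111"  # no differences at all

print(shannon_entropy(random_like))  # 1.0 (maximal symbol uncertainty)
print(shannon_entropy(ordered))      # also 1.0
print(shannon_entropy(constant))     # 0.0 (no difference, no signal)
```

Notably, the irregular and the regular strings score identically, because symbol frequencies ignore arrangement – which echoes the point that something “prior to Level A,” like the delta Z duality, is still needed to account for order itself.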
9:46, slide 11 start (+41)

To resume this metadata reconstruction, I return to the binary cosmos and study its order, since delta Z tells us the ordered side is more useful. From an uninformed start, without metadata, how do we make sense of those Bits? We may experience this cosmos in many ways, but which way has functional significance? Functions are found only in selecting one of many likely relationships, attempting a specific role, and then recording the results – where that record later informs us. Trial-and-error thus mediates value as failure or success in selected roles, from many material possibilities. It refines delta Z’s crude functioning in an ensuing layer of differentiated roles. Also, if memory is NOT linked to that mediation, trial-and-error must be wastefully re-done for every similar situation, to create similar results.

This presence of differentiated functional roles marks evidence of self-sustaining systems that benefit from and advance those roles. The manner in which a role arises thus speaks to the nature of a system and its traits. If a function arises purely via recurring ad hoc material properties, it is a materially direct role that helps to constitute delta Z’s order. If a function arises as a trial-and-error parsing and probing of material traits, and a record of outputs, it is a mediated role. Mediation denotes a NEW order of indirect logic, beyond purely material roles. We often link mediated logic to living entities, where the roles are then validated or negated via natural selection.


A core difference between direct and indirect logic is the number of traits an entity must resolve to differentiate a function. Complex or ambiguous traits thus demand greater trial-and-error and more sensate skill. As a simple example, gold and platinum rarely bond with other elements, while potassium and sodium are highly reactive. As such, gold and platinum have few useful material possibilities and are more direct in their material roles than are potassium and sodium. Similar examples are possible for all physical, mechanical, and biological systems in the cosmos, which suggests a continuum of roles. Further, direct-versus-indirect comparisons imply a range of energy needs across diverse systems, which pushes the modeling of systems beyond thermodynamic views and instead invokes many forms of energy. Lastly, mixed or hybrid roles inevitably fill the space between direct and mediated systems, again implying a continuum of roles rather than pristine systems. I label this continuous sensate mediation of functional roles Empiric Entropy, or delta E, and see it as a subset of delta Z’s order.

Between delta Z and delta E, likenesses and differences arise. Both have two object terms, 1s and 0s, but the Bit relationships differ greatly. Externally, delta Z’s Bit roles are infinite and delta E’s are finite, which indicates a drop in generic entropy. This entropic drop means that uncertainty is lessened, but not removed; so delta E can still appear rather noisy. To model this entropic drop, the differences beneath delta Z and delta E must themselves be differentiated.

13:15, slide 15 start (+1:18)

I explain these subordinate object differences, or delta O, and subject differences, or delta S, by returning to the ASCII metadata. I call your attention to the binary end of the table. To model delta O and delta S I use just two places, not the usual eight, for the most basic view possible.
Thus, delta O has two object terms and delta S shows four two-term values; anything else is noise. Delta O and delta S jointly convey Shannon’s model of information, or Signal Entropy, but to highlight useful data they are shown apart. Here, we see that delta O, delta E, and delta Z have identical object terms, 1s and 0s, but the object roles change each time. This means that shifts in delta S alone mark entropic drops, or useful data, and merit special attention. Delta S’s role is noted in the literature, first in 1977 as Gibson’s affordances, and later as Kauffman’s more literal adjacent possibilities. Other names are possible for delta S, but they all point to diverse ways of invoking its role.

To expand on shifts in delta S, I hold delta O constant and show delta S being variable in its terms, instead of with four unique pairs. Here, shifts in delta S are erratic and preclude useful data; akin to Empiric Entropy. Any type of irregularity evokes the noisy side of delta Z, regardless of how well-ordered the Bits may otherwise be. As a second case, I may assert a cosmos made of only 1s or only 0s. Now, a lack of object differences allows no shift at all in delta S, which bars useful data and marks the polar opposite of noise. For a third case, I might claim that ONLY delta O is important and ignore delta S – typical of material monism. Here, useful


data are not possible, since a naked 1 or naked 0 cannot support any functional role or interactive context. All three cases lie beyond what we experience as sensible data. Thus, an a priori view is again seen. This means that useful data must surpass all three cases, such that: 1) data are recurrent, 2) data mark differences, 3) data note a specific role or context.

15:53, slide 18 start (+1:27)

For added contrast, I NOW show a case where new object terms DO arise. Shifts in delta O disrupt all prior roles, which incites noise – an increase in generic entropy, versus delta S’s earlier entropic drop. But if the new terms show a recurrent role, vis-à-vis delta S, NEW useful prospects arise. Imposing new object terms on a system thus incites a fourth rule, as information emergence: if new objects meet delta S criteria, shifts in scale are possible. These mediated entropic increases grow the total number of functional possibilities, as also seen in ASCII metadata. Further, functional expansion in the face of generic entropy suggests a style of entropic mimicry in systems that absorb new objects. This model thus shows how delta O and delta S join to afford useful data across informational levels.

But information emergence incites a key question: “How do new object terms arise?” Here, recombination affords the presentation of new object terms, by direct and indirect means. Natural recombination grows delta S’s potential in three ways – shifts in capacity, additive shifts, and reductive shifts – that jointly add complexity. I label this natural expansion delta Q, and see it holding aspects of delta O and delta S, yet standing apart. To enlarge further on delta O and delta S, I resume the metadata reconstruction with a case where SOME metadata exist. Let’s say trial-and-error tells us eight-Bit strings are best for working with a binary cosmos.
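The two-place model above – delta O as the object terms, delta S as the four two-term values, and information emergence when a new object term is admitted – can be enumerated directly. This is my own sketch; the hypothetical third term "2" simply stands in for any newly recombined object:

```python
from itertools import product

# delta O: the two object terms of the binary cosmos.
delta_O = ("0", "1")

# delta S: with two places, exactly four two-term values exist.
delta_S = ["".join(p) for p in product(delta_O, repeat=2)]
print(delta_S)  # ['00', '01', '10', '11']

# Information emergence (delta Q): admitting a new object term
# expands the space of relational values, here from 4 to 9.
delta_O_new = ("0", "1", "2")  # hypothetical third object term
delta_S_new = ["".join(p) for p in product(delta_O_new, repeat=2)]
print(len(delta_S), "->", len(delta_S_new))  # 4 -> 9
```

The count grows as the square of the object terms here only because two places are used; the point is simply that new recurrent objects multiply delta S’s adjacent possibilities.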
17:36, start slide 22 (+1:28)

This empiric result sets a new object value of “delta O8.” In turn, this selection of 8-Bit strings demands a new subjective order. This latest entropic drop, the ordering of 8-Bit strings, I label Aesthetic Entropy, or delta A8. The superscript 8 indicates that delta A8 is JUST ONE of many empiric possibilities from delta E. This also means that delta A8 entails delta O8 and delta S8, where delta S8 is the final useful ordering of Bits in each string, shown already as ASCII metadata. This last entropic drop entails a shift in subject and object roles I call subject-object selection, or selection dynamics. Within an objectively uncertain cosmos, the subjective fixing of object roles presents new subjective demands, to fix objective results. This reductive selection of roles across informational levels I label delta X, in contrast to delta Q’s natural expansion. This mingling of subject-object roles also clouds the study of information, and further stresses the value of an a priori approach. Within subject-object selection, the delta S8 ordering of Bits into metadata is a human construct. As such, the HDD and binary cosmos examples now fail as general models and can go no further.
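The delta O8 selection – that the 8-Bit string width itself is an empiric, subjective choice imposed on the Bits – can be illustrated by parsing one bit stream under two widths. The stream and the comparison width are my own illustrative assumptions:

```python
# 24 Bits with no inherent grouping: 'H', 'D', 'D' in ASCII, if 8-Bit
# strings are chosen.
bitstream = "010010000100010001000100"

def parse(bits, width):
    """Group a bit string into fixed-width object terms (integers)."""
    return [int(bits[i:i + width], 2) for i in range(0, len(bits), width)]

# The empirically selected width (delta O8), plus ASCII metadata
# (delta S8), yields a useful reading:
print("".join(chr(v) for v in parse(bitstream, 8)))  # 'HDD'

# A different object selection scrambles that function entirely:
print(parse(bitstream, 6))  # [18, 4, 17, 4]
```

Nothing in the Bits themselves prefers one width; the useful reading exists only relative to the selected delta O8 convention, which is the subject-object selection point above.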


Instead of naming a general view, we project a subjective role onto material Bits to assign ASCII functions. The systemic projection of subjective functions onto objective elements defines every symbol set and all manner of memory and metadata. Further, those projections underlie the formation of all engineered goods, works of art, and so on.

19:32, start slide 25 (+1:33)

Whether a projection is materially direct, instinctually driven, or arises by other mediated means matters little. Those projections are how every system extends itself into the cosmos and creates an identity, which is then gauged by natural selection. Further, to mistake any one projection for a general model – for example, in the case of humans – would be to commit an anthropic error. This issue of systemic projection should halt any further study, since the analysis is human based. But as we hold an a priori ideal, the chance of naming a general model seems likely if we guard against human bias.

With this risk in mind, to now attempt a general model, imagine an undifferentiated cosmos. If the Big Bang and cosmic background radiation are valid concepts, this undifferentiated state is the pre-photonic plasma of a cosmos a few hundred thousand years old. The passage of more time brings on cooling and basic material differentiation. This emergence of reflexively ordered matter marks the arrival of delta Z. While binary terms are shown here, tying this view to the Standard Model in physics is easy. Use leptons and quarks for object terms, in place of the 1s and 0s, and show delta S as bosons in relational roles. This also supports the earlier dualistic-triune structure that fills the cosmos, seen in the structure of atoms, in DNA’s base pairs and triplet codons, and even as space-time. This dualistic-triune role bolsters the chance of naming a general information model, and is expanded upon in the links given at the end.
To continue this general ontogeny, more time yields new differentiated states that later produce complexity, and then the indirect habits of sentient life, which marks the arrival of Empiric Entropy, or delta E. From early Empiric Entropy, by means of selection dynamics, functional roles then refine further as niche formation and predator-prey roles. Each evolving role extends distinct Aesthetic values as unique tracks of usefully differentiated differences, which eventually presents the vast evolutionary tree we see today. Within that unfolding sentience, human evolution arises abstractly, with information as a key adaptive means. Humans have no purposeful Aesthetic tools like fangs and claws. Instead, we construct informationally derived tools to exploit our environs. Tool-based Aesthetics then sustain a vast informational edifice, the limits of which we forever test, marking a unique human sensibility.

To summarize, this talk offers a reductive functional view of information. The value of this delta-based, a priori analysis is that it accepts new events, rather than being disrupted by them. It entails natural semantic and emergent roles and facets that other views often omit. This natural informatics, or “thinking like nature,” thus implies a path for realizing leaps in our informational roles and technologies. For a deeper look at these and related issues, I invite you to view the material available through the links shown here. Thank you for your attention.

