Cognitive Theoretic Model of the Universe - CTMU

A related discussion: Here I discuss in detail the logical equivalence between SMN and an SCSPL operating on a self-excited circuit. This discussion covers the basic algorithmic processes that underlie SMN and leads on to a description of the full non-ergodic SMN algorithm. This allows for complete flexibility within the empirical context: both the structure and behaviour of systems are able to evolve dynamically. See SMN and SCSPL for more information.


Below are some informative quotes from "The Cognitive-Theoretic Model of the Universe: A New Kind of Reality Theory" (C.M. Langan, http://www.ctmu.net), along with some brief comments.

About metaphysical theories in general:
The goal of providing a scientific model and mechanism for the evolution of complex systems ultimately requires a supporting theory of reality of which perception itself is the model (or theory-to-universe mapping). Where information is the abstract currency of perception, such a theory must incorporate the theory of information while extending the information concept to incorporate reflexive self-processing in order to achieve an intrinsic (self-contained) description of reality. This extension is associated with a limiting formulation of model theory identifying mental and physical reality, resulting in a reflexively self-generating, self-modeling theory of reality identical to this universe on the syntactic level.

Regarding idioms:
Everything that can be described or conceived, including every structure or process or law, is isomorphic to a description or definition and therefore qualifies as a language, and every sentient creature constantly affirms the linguistic structure of nature by exploiting syntactic isomorphism to perceive, conceptualize and refer to it. Even cognition and perception are languages based on what Kant might have called “phenomenal syntax”. With logic and mathematics counted among its most fundamental syntactic ingredients, language defines the very structure of information. This is more than an empirical truth; it is a rational and scientific necessity.

By the nature of its derivation, this theory, the Cognitive Theoretic Model of the Universe or CTMU, can be regarded as a supertautological reality-theoretic extension of logic. Uniting the theory of reality with an advanced form of computational language theory, the CTMU describes reality as a Self-Configuring Self-Processing Language or SCSPL, a reflexive intrinsic language characterized not only by self-reference and recursive self-definition, but full self-configuration and self-execution (reflexive read-write functionality). SCSPL reality embodies a dual-aspect monism consisting of infocognition, self-transducing information residing in self-recognizing SCSPL elements called syntactic operators.

Also see these few quotes regarding the Word of God, which is a virtual-reality generative 'utterance' or projection that manifests and sustains every aspect of manifest existence.

Information as the foundation:
Bucking the traditional physical reductionism of the hard sciences, complexity theory has given rise to a new trend, informational reductionism, which holds that the basis of reality is not matter and energy, but information.

Unfortunately, this new form of reductionism is as problematic as the old one. As mathematician David Berlinski writes regarding the material and informational aspects of DNA: “We quite know what DNA is: it is a macromolecule and so a material object. We quite know what it achieves: apparently everything. Are the two sides of this equation in balance?” More generally, Berlinski observes that since the information embodied in a string of DNA or protein cannot affect the material dynamic of reality without being read by a material transducer, information is meaningless without matter.
Here he assumes that information always rests in material configurations. But matter itself arises within a transcendent computational space, so transcendent information precedes matter, which then acts as a medium for empirical information. Information is simply discernible difference: transcendent information is discerned by the transcendent computational process (which underlies the functioning of the SCSPL), and empirical information is discerned by empirical systems within the simulation.
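To make this notion concrete, here is a toy sketch of my own in Python (an illustration only, not drawn from the CTMU paper) in which the information one state carries relative to another is pictured simply as the set of positions at which the two states discernibly differ:

    # Toy rendering of "information is discernible difference": the information
    # one state carries relative to another is the set of positions where they differ.
    def discernible_difference(state_a: str, state_b: str) -> set:
        if len(state_a) != len(state_b):
            raise ValueError("states must have equal length")
        return {i for i, (a, b) in enumerate(zip(state_a, state_b)) if a != b}

    print(discernible_difference("10110", "10011"))  # {2, 4} - two discernible differences
    print(discernible_difference("10110", "10110"))  # set() - no difference, no information

Identical states yield no discernible difference and hence, on this toy reading, carry no information relative to one another.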

In his own way Langan then argues toward this same conclusion, but in different words. He proposes that reality is a transcendent information process that he describes as analogous to the concept 'language'. This is essentially what I mean by the concept 'idiom'.

Regarding the 'language' or idiom:
Moreover, it is profoundly reflexive and self-contained with respect to configuration, execution and read-write operations. Only the few and the daring have been willing to consider how this might work…to ask where in reality the laws might reside, how they might be expressed and implemented, why and how they came to be, and how their consistency and universality are maintained. Although these questions are clearly of great scientific interest, science alone is logically inadequate to answer them; a new explanatory framework is required...

On a note of forbearance, there has always been comfort in the belief that the standard hybrid empirical-mathematical methods of physics and cosmology will ultimately suffice to reveal the true heart of nature. However, there have been numerous signals that it may be time to try a new approach... which might be colorfully rendered as “reality theory is wedded to language theory and they beget a synthesis”, has the advantage that it leaves the current picture of reality virtually intact. It merely creates a logical mirror image of the current picture ... and attempts to extract meaningful implications. Science as we now know it is thereby changed but little in return for what may, if fate smiles upon us, turn out to be vast gains in depth, significance and explanatory power.

In the search for ever deeper and broader explanations, science has reached the point at which it can no longer deny the existence of intractable conceptual difficulties devolving to the explanatory inadequacies of its fundamental conceptual models of reality. This has spawned a new discipline known as reality theory, the study of the nature of reality in its broadest sense. The overall goal of reality theory is to provide new models and new paradigms in terms of which reality can be understood, and the consistency of science restored as it deepens and expands in scope.

Mainstream reality theory counts among its hotter foci the interpretation of quantum theory and its reconciliation with classical physics, the study of subjective consciousness and its relationship to objective material reality, the reconciliation of science and mathematics, complexity theory, cosmology, and related branches of science, mathematics, philosophy and theology. But in an integrated sense, it is currently in an exploratory mode, being occupied with the search for a general conceptual framework in which to develop a more specific theory and model of reality capable of resolving the paradoxes and conceptual inconsistencies plaguing its various fields of interest...
This work is also tied in with "reality theory".

Regarding finite discrete approaches:
as science in general has become more enamored of and dependent on computer simulation as an experimental tool, the traditional continuum model of classical physics has gradually lost ground to a new class of models to which the concepts of information and computation are essential. Called “discrete models”, they depict reality in terms of bits, quanta, quantum events, computational operations and other discrete, recursively-related units. Whereas continuum models are based on the notion of a continuum, a unified extensible whole with one or more distance parameters that can be infinitely subdivided in such a way that any two distinct points are separated by an infinite number of intermediate points, discrete models are distinguished by realistic acknowledgement of the fact that it is impossible to describe or define a change or separation in any way that does not involve a sudden finite jump in some parameter.

It is not only computers that make discrete models attractive but also quantum physics, which indicates that continuum models are idealised abstractions of the underlying discrete dynamics. But Langan rejects discrete models because he conceives of them as bound to material computers and thus unable to take part in the dynamics that underlie matter itself, i.e.
Unfortunately, the advantages of discrete models, which are receiving increasingly serious consideration from the scientific and philosophical communities, are outweighed by certain basic deficiencies. Not only do they exhibit scaling and nonlocality problems associated with their “display hardware”, but they are inadequate by themselves to generate the conceptual infrastructure required to explain the medium, device or array in which they evolve, or their initial states and state-transition programming. Moreover, they remain anchored in materialism, objectivism and Cartesian dualism, each of which has proven obstructive to the development of a comprehensive explanation of reality. Materialism arbitrarily excludes the possibility that reality has a meaningful nonmaterial aspect, objectivism arbitrarily excludes the possibility that reality has a meaningful subjective aspect, and although Cartesian dualism technically excludes neither, it arbitrarily denies that the mental and material, or subjective and objective, sides of reality share common substance.

I disagree that discrete processes are inadequate. Process physics shows that computational models are not "inadequate by themselves to generate the conceptual infrastructure required to explain the medium", and the resulting context is discrete, as verified by quantum physics. Furthermore, I strongly agree with his comments on materialism, objectivism and Cartesian dualism; however, discrete processes are not in any way bound to them. Discrete processes do not depend on material computers; the material universe (world of appearances) itself emerges from the underlying dynamics of a discrete computational process, and in this sense they are not "anchored in materialism".

But he says, regarding claims of computational processes:
such claims exhibit an unmistakable irony: classical reality is precisely that on which information and computation are defined! Like classical reality itself, a well-defined entity unable to account for its own genesis, information and computation are well-defined and non-self-generative aspects of reality as it is observationally presented to us at an advanced stage of its existence. So they invite the same questions as does classical reality: how, and by what, were they originally defined and generated? Without an answer to this question, little can be gained by replacing one kind of reality with the other.

I propose that the empirical universe is a simulation and there is a transcendent computational space, but he claims that the empirical universe is all that exists. This comment has been countered by the following (received by email):
Empirical/physical reality is not the foundation in the CTMU model; it is a part of the self-simulation of SCSPL reality. Page 42 of Langan's A New Kind of Reality Theory paper states "in effect, the universe becomes a 'self-simulation' running inside its own contents".

By 'empirical' I do not mean 'physical'; I mean that which is a product of experience. I claim that the underlying causal foundation of the universe is 'transcendent' in that it is that which implements the process of experience and thereby cannot itself be experienced. In that respect we cannot ever directly experience the transcendent causal foundation; we can only intuit it, make inferences, build models, derive empirical implications and test these against observation. That is the general approach of my work.

Evidently Langan speaks of the one reality without distinguishing between empirical and transcendent, but the SCSPL IS a transcendent aspect of his one reality and the resulting universe IS an empirical aspect of that one reality, regardless of whether he calls them such. Furthermore, his SCSPL is itself a computational process, yet he argues against computational processes because he conceives of them as bound to physical 'computers'. This aspect of his work is a little empiricentric, I feel. I speak of two contexts: I use 'empirical' to refer to that which is a construct of experiences within the simulation, i.e. all that we can know through experience, whereas I use 'transcendent' to refer to that which constitutes the actual 'machinery' that computes the simulation and which underlies every experience within the simulation. As for the question "how, and by what, were they originally defined and generated?", we know that we cannot know, because we inhabit different information spaces (just as a computer program can only know its host computer from 'within'); that is a natural end to our ability to know.

He then goes on to a detailed discussion of "intrinsic self-determinacy" and "telic feedback", I think to explain the a priori being of things in a world (he begins with the assumption of the world and the positing of things in that world). It could just be a confusion of terminology, but I did not follow it; these are extremely subtle subjects. The transcendent context is real; it is the absolute reality, but it is not at all like the apparent 'reality' that we know. It has no objects in space; it is more like a computational space with data objects and functional objects, which integrate to implement the SCSPL processor that manifests the empirical experiential universe.

With his concepts of "intrinsic self-determinacy" and "telic feedback" I think he is trying to explain the coherent 'inner' aspect or computational functioning of his primitive entities without the use of a transcendent computational process. He tries to begin "in the world" but the world is a product of a transcendent cause.

A system that evolves by means of telic recursion ... is not merely computational, but protocomputational. That is, its primary level of processing configures its secondary (computational and informational) level of processing by telic recursion.
Here he hints at the transcendent level. What he calls 'protocomputation' or "its primary level of processing" I call transcendent computation, and what he calls the "secondary (computational and informational) level of processing" I call empirical computation.
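The distinction can be loosely pictured in code. The following sketch is my own analogy only (it is neither Langan's formalism nor the actual SMN algorithm): a 'primary' process configures the state-transition rule that a 'secondary', ordinary computational process then executes.

    # Loose analogy only: the primary (proto/transcendent) level configures the
    # rule set; the secondary (empirical, computational) level merely runs it.
    def primary_level(seed: int):
        """Configure the secondary level by deriving a state-transition rule."""
        def transition(state: int) -> int:
            return (state * seed + 1) % 16   # the rule is a product of configuration
        return transition

    def secondary_level(transition, initial_state: int, steps: int) -> list:
        """Ordinary computation: iterate the configured rule over a state."""
        states = [initial_state]
        for _ in range(steps):
            states.append(transition(states[-1]))
        return states

    rule = primary_level(seed=5)          # primary level fixes the dynamics
    print(secondary_level(rule, 3, 6))    # [3, 0, 1, 6, 15, 12, 13]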

Where language consists of information and information has linguistic structure, the Principle of Linguistic Reducibility implies that information is as fundamental as language. Insofar as we cannot understand reality except in theoretical (linguistic, informational) terms, this permits us to cast reality as a “self-processing language”, or self-defining, self-explaining, self-modeling theory-universe ensemble, without fear of being proven wrong by some alternate theoretical reduction. However, the linguistic reduction of reality is superficially macroscopic. Just as a perfectly self-contained language must be self-processing (for lack of anything external to process it), so must the information of which it consists. This leads to the concept of self-processing information, and ultimately to a microscopic (quantum) theory of information.

It is easy to show that information is self-processing. Structure is attributive; the parts of any structure possess attributes that position them or otherwise define them relative to other parts. To be meaningful and thus informative, information must have structure; therefore, information must possess attributes. Attributive relationships, intrinsic or otherwise, must conform to the logical rules that govern attribution, i.e. to an attributive logical syntax incorporating the propositional and predicate calculi. So information can exist only in conjunction with attributive logical syntax. Because it necessarily incorporates attributive syntax, it has enough native self-processing capacity to maintain its intrinsic structure, which is precisely what it must do to qualify as “informational”.
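The gist of 'attributive structure' can be illustrated with a toy example of my own (far simpler than the attributive logical syntax Langan has in mind): a record's attributes only count as structured information relative to a schema of rules, i.e. a syntax, that every attribution must satisfy.

    # Toy illustration: attributes are only meaningful relative to an attributive
    # syntax -- here a schema of rules that each attribution must satisfy.
    schema = {
        "mass":     lambda v: isinstance(v, (int, float)) and v >= 0,
        "position": lambda v: isinstance(v, tuple) and len(v) == 3,
    }

    def well_formed(record: dict) -> bool:
        """A record counts as structured information only if every attribute obeys the schema."""
        return all(key in schema and schema[key](value) for key, value in record.items())

    print(well_formed({"mass": 2.5, "position": (0.0, 1.0, 0.0)}))  # True
    print(well_formed({"mass": -1, "position": "nowhere"}))          # False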

Because cognition and generic information transduction are identical up to isomorphism - after all, cognition is just the specific form of information processing that occurs in a mind - information processing can be described as “generalized cognition”, and the coincidence of information and processor can be referred to as infocognition. Reality thus consists of a single “substance”, infocognition, with two aspects corresponding to transduction and being transduced. Describing reality as infocognition thus amounts to (infocognitive) dual aspect monism. Where infocognition equals the distributed generalized self-perception and self-cognition of reality, infocognitive monism implies a stratified form of “panpsychism” in which at least three levels of self-cognition can be distinguished with respect to scope, power and coherence: global, agentive and subordinate.

Ultimately, the conceptual shift from information to self-transducing information requires extensions of information-intensive theories including the theories of information, computation and cybernetics. The problem stems from the fact that as it is understood in these fields, information is a limited concept based on an engineering model in which the existence of senders, receivers, messages, channels and transmissive media is already conveniently given, complete with all of the structural and dynamical laws required to make them work together.
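The 'engineering model' being criticised here can be made explicit with a minimal sketch of my own (a Shannon-style setup, not anything from the CTMU): sender, message, channel and receiver are all simply given in advance, which is exactly the presupposition Langan objects to.

    # Minimal Shannon-style setup: sender, message, channel and receiver are all
    # presupposed by the model rather than explained by it.
    import random

    def noisy_channel(bits, flip_probability=0.1):
        """The transmissive medium, taken as given."""
        return [b ^ (random.random() < flip_probability) for b in bits]

    message  = [1, 0, 1, 1, 0, 0, 1]      # the sender's message, taken as given
    received = noisy_channel(message)     # the receiver's copy, possibly corrupted
    errors   = sum(a != b for a, b in zip(message, received))
    print(f"received {received}, {errors} bit(s) flipped in transit")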

By the Principle of Linguistic Reducibility, reality is a language. Because it is self-contained with respect to processing as well as configuration, it is a Self-Configuring Self-Processing Language or SCSPL whose general spatiotemporal structure is hologically replicated everywhere within it as self-transductive syntax. This reduces the generative phase of reality, including physical cosmogony, to the generative grammar of SCSPL. This reality-generative grammar is called G grammar, and the MU form, being the most general or prior form of reality, is its basis. By the Principle of Infocognitive Monism and the hology of MU, SCSPL consists of MU-configured infocognition, and G grammar describes the generation and transformation of this universal constituent.

SCSPL is not an ordinary language, and G grammar is not an ordinary generative grammar. The reasons come down to the inherent limitations of computational language theory. In standard computation theory, a language consists of the set of strings accepted by a given automaton or class of automata; e.g., a language L is called “regular” if there is a finite-state automaton that accepts it. However, this approach is inadequate for SCSPL. First, it centers on computation, a general type of information processing associated with an abstract automaton, the Turing machine or “universal computer”, that could never have generated the informational structure of the real universe. Being an informational and metainformational (syntactic) construct, the universal computer can itself account for the genesis of neither syntax nor information. Second, unlike ordinary languages, the reality-language cannot rely on an external mind or automaton or preexisting hardware substrate for recognition and processing. Since any processor real enough to recognize and process reality is necessarily a part of reality, the language-processor distinction is without ontological force.
At the end here he is stridently empiricentric, stating that there is only one level of reality and that this is a criterion of any theory at all... This need for self-creation requires some intricate conceptual footwork, whereas the transcendent/empirical approach is straightforward. One's experience of the simulation need not be the foundation of one's explanation of the origin of the simulation; the simulation is a perceptual product of the simulator. I say, rather than look only within the simulation, look beyond it as well; there is much that can be known about the transcendent context, and it helps make sense of the empirical context.
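As an aside on the computation-theoretic framing in the quote above: the claim that a language is 'regular' if some finite-state automaton accepts it is standard textbook material, and a minimal example (nothing CTMU-specific, purely for readers unfamiliar with the term) is a deterministic finite automaton accepting the regular language of binary strings containing an even number of 1s.

    # Standard example of the quoted claim: a language is "regular" if some
    # finite-state automaton accepts it.  This DFA accepts binary strings
    # containing an even number of 1s.
    TRANSITIONS = {
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd",  "0"): "odd",  ("odd",  "1"): "even",
    }

    def accepts(string: str, start="even", accepting=("even",)) -> bool:
        state = start
        for symbol in string:
            state = TRANSITIONS[(state, symbol)]
        return state in accepting

    print(accepts("1010"))   # True  (two 1s)
    print(accepts("10110"))  # False (three 1s)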

Thus, while ordinary discrete models of reality rely heavily on the language-processor distinction, SCSPL incurs no such debt. For example, cellular automaton models typically distinguish between a spatial array, the informational objects existing therein, and the distributed set of temporal state-transition rules by which the array and its contents are regulated. In contrast, SCSPL regards language and processor as aspects of an underlying infocognitive unity. By conspansive (ectomorphism-endomorphism) duality, SCSPL objects contain space and time in as real a sense as that in which spacetime contains the objects, resulting in a partial identification of space, time and matter. SCSPL is more than a reflexive programming language endowed with the capacity for computational self-execution; it is a protocomputational entity capable of telic recursion, and thus of generating its own informational and syntactic structure and dynamics.
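The cellular-automaton structure described in the quote can be made concrete with a minimal 1D example of my own (Rule 110, chosen purely for illustration): the spatial array, the informational contents placed in it, and the state-transition rule are all specified separately, which is precisely the language-processor distinction that SCSPL is said to avoid.

    # Minimal 1D cellular automaton: the spatial array, its informational
    # contents and the state-transition rule (Rule 110) are specified separately.
    RULE_110 = {(1,1,1): 0, (1,1,0): 1, (1,0,1): 1, (1,0,0): 0,
                (0,1,1): 1, (0,1,0): 1, (0,0,1): 1, (0,0,0): 0}

    def step(cells):
        """Apply the rule to every cell, using a fixed zero boundary."""
        padded = [0] + cells + [0]
        return [RULE_110[(padded[i - 1], padded[i], padded[i + 1])]
                for i in range(1, len(padded) - 1)]

    array = [0] * 15 + [1]               # the array together with its initial contents
    for _ in range(5):
        print("".join(".#"[c] for c in array))
        array = step(array)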

Whereas ordinary computational models are informational and syntactic in character, the protocomputational nature of SCSPL requires a generalization of information and syntax. With respect to the origin or ultimate nature of perceptual reality, explanation is a reductive/inductive process that regressively unbinds constraints in order to lay bare those of highest priority and generality. This process eventually leads to the most basic intelligible descriptor that can be formulated, beyond which lies only the unintelligible.

The Self-Excited Circuit, the informational logic loop through which physics engenders observer participation, which engenders information, which engenders physics, is a tight characterization of SCSPL…so tight that it would be difficult if not impossible to replace SCSPL with anything else and neither violate nor fall short of Wheeler’s description. SCSPL is logical in construction, has a loop-like dynamic, and creates information and syntax, including the laws of physics, through telic recursion generated by agent-level syntactic operators whose acts of observer-participation are essential to the self-configuration of the Participatory Universe. These acts are linked by telic recursion to the generalized cognitive-perceptual interactions of quantum-level syntactic operators, the minimal events comprising the fabric of spacetime.

Anything that is self evident proves (or evidences) itself, and any construct that is implicated in its own proof is tautological. Indeed, insofar as observers are real, perception amounts to reality tautologically perceiving itself. The logical ramifications of this statement are developed in the supertautological CTMU, according to which the model in question coincides logically and geometrically, syntactically and informationally, with the process of generating the model, i.e. with generalized cognition and perception. Information thus coincides with information transduction, and reality is a tautological self-interpretative process evolving through SCSPL grammar.

www.Anandavala.info