
### From Big to Meta Data or conversely

Building Technical Capacity for Earth Science Data and Information in Disaster Planning, Response, Management and Awareness

(How) is Data complexity/level-up to Metadata achievable?

Paths/Organization to achieve such a goal, framing research results

The example of the ‘Discinnet process’

Toward a global scale interdisciplinary experimentation

The Big to Meta (semantic) data issue

Some epistemological background assumptions

Important experimental results and interpretations

Presentation of the ‘Discinnet process’

On-line demonstration

Immediate impact on research/knowledge pace

Interdisciplinary model for research fields trajectories

Proposition for a global scale related experimentation

Expected general epistemological model confirmation

Consequences for the Earth-science question posed to the conference

Conclusion

Q&A

ESIPFED - July 12th 2013 - PJ

Summary

Metadata have meaning, scope, usefulness, persistence, as opposed to ‘Big’

However our investigation instruments produce ever tinier, simpler, bigger data

E.g. simple Laws from Statistical ‘Physics’ up to Information/complexity theory

CC, Languages and Math/Category oppose reasoning to computing (Dwk,..Lng)

We relate complex /Meta data strictly to future/subject, Big to past/objects

But highlight the major issue of ‘incommensurability’ of big to meta transitions

The search for symmetries, highest value for purpose and simulating ability

The concept of Oracle and OTM as good reference for Metadata implementing

Relation to P/NP, to the FCP, to time and to complexity (example TSV 615->616)

Big data are numerous, scattered, simple objects, Meta dense, singular, complex

We relate strictly Big object data to past and Meta subjective to future

The issue is about ‘our’ future predictability and optimal drive into it


1 – The Big to Metadata issue

A promising start is the widest spatial-scale boundary gap: universe vs. quanta

- However still wider is the complexity versus most-simple ladder, with:
- The 10^(10^123) universe (neg)entropy problem (Penrose)
- As compared to the 10^120 cosmological constant problem (n°1)
- There are still other ‘Big’ order levels than these objectal/entropy-field and mass-energy-field issues, to which we propose solutions aside from the project presented here on research forecasting, since they are incommensurable: for instance meaning, renormalization, life..
- In such domains, Jane Goodall (March 2013): “we don’t live solely in the present as intelligent animals such as chimpanzees and elephants.. It is human language that makes the whole difference. With it we can discuss, exchange ideas, learn from the past and project ourselves into the future.” [last point to be debated]
- This led, for instance, Heylighen, following Chomsky’s natural/formal language distinction, to the project of “A structural language for the foundations of physics”, which is also part of this project, skipping for now the constructivist vs. positivist issue.
- In summary, the main question concerns the conditions making Big to Meta (B2M, or D2K) possible, perhaps only as K2D2K?, and the science-information management and organizations allowing or, conversely, preventing such futures


2 – Epistemological / measurable background

Context and goal of this presentation and proposition

- Research > $ 1 trillion/yr is insufficiently funded and yet too risky and long term
- How to accelerate/secure Knowledge discovery and proofing?
- Hence get scientific, predictable hence reproducible new objects, i.e. technologies?
- Using results from own research, physics, linguistics/C.C…
- Own research about the FCP principle as fundamental pattern/process/place (reality), whereby ‘Any phenomenon/field is an FCP occurrence’ with a definite (St, F) lapse, and where securing F is our ‘targeted result’, but F is a more complex ‘phenomenal’ space
- Physics : discussion about the maximal framework of complex universe vs quantum
- CC / Computational complexity (natural-to-formal linguistics and category theory)

In “The Knowledge Complexity of Interactive Proof Systems”, Goldwasser, Micali and Rackoff ask “How much knowledge should be communicated for proving a theorem T?” and have the goal to “give a computational complexity measure of knowledge and measure the amount of additional knowledge contained in proofs.”

They then start from the NP class as a “very successful formalization of the notion of theorem-proving procedure” and use the Cook-Levin definition of an ‘NP proof-system’ as a 2-communicating Turing machine (a prover and a verifier), with the prover deterministic exponential time (EXP) and the verifier polynomial time (P)
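The prover/verifier split can be made concrete: for a CNF formula, the prover’s communicated ‘knowledge’ is a candidate assignment, and the verifier checks it in one polynomial-time pass. A minimal sketch (ours, not from the talk; the signed-integer clause encoding is an assumption):

```python
# Minimal NP-style verifier sketch (illustrative). A CNF formula is a
# list of clauses; each clause is a list of signed variable indices
# (positive = plain literal, negative = negated literal). The prover's
# "certificate" is a truth assignment; the verifier checks it in a
# single polynomial-time pass over the clauses.

def verify_cnf(clauses, assignment):
    """Return True iff `assignment` (dict var -> bool) satisfies every clause."""
    for clause in clauses:
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            return False  # one falsified clause refutes the certificate
    return True

# (x1 or not x2) and (x2 or x3)
clauses = [[1, -2], [2, 3]]
print(verify_cnf(clauses, {1: True, 2: True, 3: False}))   # True
print(verify_cnf(clauses, {1: False, 2: True, 3: False}))  # False
```

The asymmetry is the point: finding the certificate may take exponential time, but checking it is polynomial.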


2.1 – Epistemological / measurable background

Positivist and ‘Pure Reason’ epistemologies*, models vs. atoms

- Evolutionist and/or Constructivist, toward best new worlds

In all cases there comes a better, next, new, more knowledgeable world, but the evolutionist view ‘looks’ more N(P) and the constructivist more oracled; both aim for the best, while the status of the human (author, actor, automaton) and of the artificial varies

- Kuhn basic model as epistemological ‘case’ to be observed
- Propositions that:
- ‘Discinnet process’ optimally models a ‘research field’ as object of research
- May therefore precisely enough represent and predict their behavior
- Therefore might allow Epistemology to move from Philosophy to Science
- FCP as one of the epistemological models to be tested

* There is a holistic perspective on Kant’s epistemology (see exhibits)


2.2 – Epistemological backgrounds and goals

What we have at hand for measurement are distances, axes, projectors, vector lattices, elementary shapes, of which we show the usefulness:

- let’s consider an infinite-dimensional lattice, even H

- yet compare with an exponential compact object such as the circle

or an egg.. and work on the dynamics implied, for instance, by series of non-local harmonics

Importance of experimental results in space, from cosmology to particles, and now getting into Quantum computing from quantum measurement

Importance of the types and numbers of dimensions or of ‘Dimension’

For instance the role of the Wick rotation (used by Hawking for the ‘no-boundary’ universe) in our proposed FCP framework, with subjective closure as future

And some Big to Meta data Grammar models with 4 levels, theorems

(and the general FCP pattern from Euclid axioms to Category theory)

In the next pages we establish, with the benchmark NP-referential 3-SAT (3CNF) vs. P 2-SAT (one dimension less), then the transition to IP and MIP/QMIP, the optimal closure (entangled/oracle) of scientific knowledge by Discinnet
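The 3-SAT vs. 2-SAT gap invoked here is standard: 2-SAT is decidable in polynomial time via its implication graph, while 3-SAT is the NP-complete benchmark. A sketch of the implication-graph test (our illustration; plain reachability is used instead of the linear-time SCC algorithm, for brevity):

```python
# 2-SAT in polynomial time via the implication graph (standard
# technique, illustrative sketch). Each clause (a or b) yields the
# implications (not a -> b) and (not b -> a); the formula is
# unsatisfiable iff some variable x reaches not x AND not x reaches x.
# Simple O(V*E) reachability is enough for a demo.

def two_sat(n_vars, clauses):
    """clauses: list of pairs of signed ints (1-based vars). True iff satisfiable."""
    nodes = [v for x in range(1, n_vars + 1) for v in (x, -x)]
    graph = {v: set() for v in nodes}
    for a, b in clauses:
        graph[-a].add(b)   # not a -> b
        graph[-b].add(a)   # not b -> a

    def reaches(src, dst):
        seen, stack = {src}, [src]
        while stack:
            u = stack.pop()
            if u == dst:
                return True
            for w in graph[u] - seen:
                seen.add(w)
                stack.append(w)
        return False

    return not any(reaches(x, -x) and reaches(-x, x)
                   for x in range(1, n_vars + 1))

print(two_sat(2, [(1, 2), (-1, 2), (1, -2)]))  # True (x1 = x2 = True works)
print(two_sat(1, [(1, 1), (-1, -1)]))          # False (x1 and not x1)
```

No analogous polynomial trick is known for three-literal clauses: that extra literal per clause is the “one dimension more” of the slide.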


2.3 – Important experimental issues

3 – Big to Metadata main figures


Figure elaborated with Dr. Sören Auer (Bonn U.) while preparing a common European research program presentation, 2010

[Figure: Big-to-Meta data landscape relating researchers (< millions, ‘Big’ authors) and communities/networks (several tens); experimental VERY BIG / TOO BIG DATA (-> trillions), with an OPEN DATA LINKAGE ISSUE; research projects (limited) as META DATA via the Discinnet process; research papers/docs (-> billions?); problem: fast cumulative growth.]

Funders and researchers alike expect efficient organization/process

Currently they combine semantic/natural language with somehow isolated recourse to experiments, categories, alphabets, even meaning

Tools are available:

- for procedures, grammars, semantic versus formal (algorithms) levels
- about hierarchies and then ordering and numbering (linear)
- Then about spatial/vector/matrix lattice through sets of projectors

These are particularly:

Gc ≡ (VN, VT, Pr, St): the Chomsky project for grammar, with the End outside

Yet examples of productions in Physics (d → NN), Chemistry (Na + Cl → NaCl)

The Chomsky project was to account for both natural/semantics and formal

Notice that the End is outside and related to the Productions (hence types 0 to 3)

We will see/assert that the position of the author/observer/oracle is the key

It explains the nature of the differences between P, NP, IP, MIP

A key is the NP equivalence between Non-deterministic and ‘Oracle before/out’
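The productions invoked above (a → b, as in Na + Cl → NaCl) can be made concrete as string rewriting; a toy sketch (ours; the second rule is purely hypothetical, added for illustration):

```python
# Toy string-rewriting sketch of grammar "productions" (our
# illustration). A production rewrites a substring; repeatedly applying
# productions derives new strings, in the spirit of Pr in
# G = (VN, VT, Pr, St).

PRODUCTIONS = [
    ("Na + Cl", "NaCl"),   # the chemistry example from the slide
    ("H + H", "H2"),       # hypothetical extra rule for illustration
]

def derive(s, productions, max_steps=10):
    """Apply the first applicable production until none applies."""
    for _ in range(max_steps):
        for lhs, rhs in productions:
            if lhs in s:
                s = s.replace(lhs, rhs, 1)
                break
        else:
            break  # no production applies: the derivation has ended
    return s

print(derive("Na + Cl", PRODUCTIONS))  # NaCl
```

The halting condition (no rule applies) is the “End” of the derivation; in Chomsky’s scheme that End sits outside the grammar tuple, which is the point the slide makes.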


3.1 – Need of a Metagrammar for Metadata

3.11 – Some stuff to be represented

Major Complexity levels and proposed Big (V) to Meta (A/Pk) equivalences:

BIG = P(TIME) ⊆ NP ⊆ IP = QIP = PSPACE = NSPACE ⊆ MIP = NEXP = pb.OTM → META

Plus new results with Quantum Multi-prover Interactive Protocols (QMIP)

. such as [Ito, Vidick, 2012] (Verifier = experimenter, Provers = devices)

. [BFK 2010] QMIP = MIP* and “the entire power of quantum information in MIP* is captured by the shared entanglement, not quantum information.”

The MIP = pb.OTM = NEXP equality and the ‘prior entanglement’ prove that Provers (us, common sense, researchers’ ‘shared’ meanings) commonly and collaboratively have access to future (‘targeted’, ‘expected’) results

And both the probabilistic frame of the IP and MIP classes and procedures and Alice/Bob split information tests to V, reflect the scientific process

Example of a universality: the random-clause ratio lim N→∞ of M/N is an ‘Oracle before’


3.2 – Results from Complexity theory

3.3 – Computational Complexity hierarchy

Observe a closing on:

PSPACE = NSPACE (= IP)

Which we link to fundamental space as an FCP field property (formal, with for instance the Planck cell), therefore allowing holistic metadata/meaning to be effectively shared from the Big data/lattice (continuum/computational); see also Sethna on universality/scale invariance

Pursuing with the simple Turing-machine grammar model, where we highlight the ‘F inside’:

Gfa ≡ (K, Σ, δ, (q0, F)) (linear, finite automaton, linked to F within K)

NP referential 3-SAT(3CNF) opposed to P 2-SAT, which has one dimension less.

In 3-CNF – the benchmark! – the number N of variables must grow to maintain Satisfiability as M clauses are drawn at random (which is where stats/‘Big data’ operate)


3.4 – Some fundamental underlying experimental results

Aspect experiments / Penrose support Formal/“non-local realism”

- Aspect experiments prove “formal realism”
- We can also replicate photon and other particle split
- This again proves that N dimensions are the deep background
- The scheme at left is from Larrabee 1993

After the Turing centenary, let’s focus on his Oracles

Concept introduced in 1939 => Oracle Turing machines

Used in many (most?) theorems (including Cook’s foundational one)

The quest for physical oracles is ongoing, but here is the CC representation
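A CC representation of an oracle machine can be sketched as an algorithm parameterized by a black-box decision oracle. The classic search-to-decision self-reduction below recovers a satisfying assignment from a SAT decision oracle (illustration ours; the brute-force oracle merely stands in for the ‘instantaneous’ oracle call):

```python
# Oracle-machine sketch: an algorithm that consults a black-box SAT
# decision oracle. Any oracle with the same interface could be plugged
# in; here the oracle is brute force, standing in for the
# "instantaneous" oracle of the slides.
import itertools

def brute_force_oracle(n_vars, clauses):
    """Decision oracle: is the CNF (signed-int clauses) satisfiable?"""
    for bits in itertools.product([False, True], repeat=n_vars):
        a = dict(enumerate(bits, start=1))
        if all(any(a[abs(l)] == (l > 0) for l in c) for c in clauses):
            return True
    return False

def find_assignment(n_vars, clauses, oracle):
    """Search-to-decision self-reduction: n oracle calls fix the variables."""
    if not oracle(n_vars, clauses):
        return None
    fixed = []
    for x in range(1, n_vars + 1):
        # try x = True by adding the unit clause [x]
        if oracle(n_vars, clauses + fixed + [[x]]):
            fixed.append([x])
        else:
            fixed.append([-x])
    return {x: fixed[x - 1] == [x] for x in range(1, n_vars + 1)}

clauses = [[1, -2], [-1, 2], [2, 3]]
print(find_assignment(3, clauses, brute_force_oracle))  # {1: True, 2: True, 3: True}
```

The machine itself only ever asks yes/no questions; all the ‘knowledge’ sits behind the oracle interface, which is exactly the separation the slides exploit.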


3.5 – Time as Complexity density from CC

We produce it by folding the infinite discrete oracle tape into first a square (or later cubic) lattice, then a polygonal, hence polynomial, one corresponding to the P/NP complexity classes, to end with a spherical one, as the oracle is defined as ‘instantaneous’, or otherwise EXP vs. POLY TIME

Polygon/polynomial with n symmetries, while e^(it) is infinitely symmetrical; but the infinite process also has to be included, hence S²

3.6 – Epistemic status

We tend to see objects fundamentally ‘in’ space, actually 3D unique background/universe

This is not what QM and other experiments tell and what shared knowledge implies

FCP provides another picture of the world (see AIP references, the last as #1456)

With a wealth of FCP complex coarse-grained histories/objects interwoven and closing

Therefore a cluster of common-sense projected intentions seeks its next oracle/closure

Wick rotation: it exchanges the Lorentzian (local GR) versus the Euclidean signature

Euclidean apparent objective 3D shape, including Universe

3.7 – Epistemic closure to be experimented

- Recall the constructivist versus positivist debate, i.e. invention versus discovery: to what extent do we create an extended nature, to the point of it becoming unnatural, even with tests/failures and successes (a ‘practical reason’ issue about limits to social & Earth testing)
- A main proposition is that Absolute Time/Future FCP Complexity is closing, hence the overall outcome is determined, but:
- the paths are not, and some may be much more harmful than others,
- moreover the time compression is not innocuous
- if so, invention belongs to the path and discovery to the next boundary
- This could seem quite philosophical, foreign to our practical purpose, but this is what the ‘physical boundary’ (assumed so and indeed much experimented) yields, says Penrose:
- “.. The projector set provides refinements for, or alterations to, coarse-graining ‘boxes’, as in classical phase space and this accounts for the term ‘coarse-grained history’.. [objects]
- “A consistent set of coarse-grained histories is called maximally refined if one cannot insert another projector set (..) without destroying its consistency..” [hence closing]
- A key fact is that the unavoidable role of the ‘observers’ is the striking puzzling result of QM experiments, with non-local formal clustering such as entangled pair is basic

4 – Presentation of the ‘Discinnet process’

Applying the FCP principle, the ‘Discinnet process’ is assumed to be the most efficient way to represent research fields, study their behavior, model it as clusters of community researchers’ projected intentions versus results and presumably predict their trajectories, hence space (appropriate) and time of their outcomes.

To outsiders, research would appear to live in a world of terms, wholly semantic, when it has to come down to a world of measure, quantized.

This happens through research projects, from where terms (Meta) are projected onto simpler but more numerous measurement ‘Big’ data axes

Projectors from projects do matter

They may cascade the terms down to measurement hierarchies, ending with the basic MKS grammar base.

New dimensions are how complexity, hence the future, comes to terms; and conversely how the future, as terms, may be consistently projected and shown consistent with the past (resilient) within the set of already realized clauses in phenomena, the world of existing objects
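Read as plain linear algebra, projecting a ‘Meta’ term onto measurement axes is a set of dot products; a minimal sketch with made-up axes (our illustration, not the Discinnet implementation):

```python
# Minimal linear-projection sketch (our illustration): a "meta" vector
# describing a project is projected onto unit measurement axes ("Big"
# data dimensions), cascading a term down to MKS-style coordinates.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def project(vector, axes):
    """Coordinates of `vector` on each (unit) axis: one dot product per axis."""
    return [dot(vector, axis) for axis in axes]

# hypothetical 3-D "term" and two orthonormal measurement axes
term = [2.0, 1.0, 3.0]
axes = [[1.0, 0.0, 0.0],   # e.g. a length-like axis
        [0.0, 1.0, 0.0]]   # e.g. a mass-like axis
print(project(term, axes))  # [2.0, 1.0]
```

The third component is lost in the projection: fewer axes than dimensions is exactly the Meta-to-Big reduction the slide describes.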

4.1 – ‘Discinnet process’ symbolism

State of the Art

Projected into K-space

from anticipated or targeted results (Shaping)

With available Experimental Protocol-path (Real)

‘In progress’, hence collaborative, financed, reduced uncertainty, closed future (Implementing)

Measured results (Achieved)

Published

This slicing is therefore not arbitrary but derives directly from the FCP framework

4.2 – Theoretical elementary research field representations

A researcher may select at least one most significant targeted result, and the related status will be derived from the DiscInNet process analysis

Example of a ‘Cluster’ of State-of-the-Art (published) significant experimental results

1 – State of the Art (each red star plots the distinctive contribution of a paper)

2 – State of the Art + Proposition of a new research program

3 – State of the Art + Proposition of new program + Visualization of the full set of comparable projects with same advancement status in the world or given areas

- Example of visualization of two statuses among the five:
- STATE OF THE ART: mapped ‘published’ papers/red stars
- ONGOING RESEARCH: a worldwide representation of the main goals of most long-term research projects at the ‘Shaping’ level (blue quarters of a pie)

For Researchers as for Observers of the field it is particularly useful to be able to plot the distinctive position/contribution of (2) as compared to other statuses in the world (3)!


The trend seems to be toward a splitting over the long term


4.3 – Research fields are ‘E’* Science maximal/meta data

Without DiscInNet

With DiscInNet

The ‘Discinnet process’ models epistemology’s transition from philosophy to science inasmuch as it maps and conveys fields into objects of research through appropriate space/metrics

* Through Discinnet process the data to metadata interaction is both spatial and very earthly


4.4 – Discinnet Synoptic

5 – On-line demonstration

If a researcher summarizes a 200-page report in one figure to get funding, isn’t that a ‘proving’ use case?

- Renowned senior researchers
- Mark and comment on research projects much more efficiently

Evaluations

A : Excellent

B : Fair

C : Questionable

D : Spam

O : No comment

5 – On-line demonstration

This is collaborative

Microalgae biofuels

Aviation biofuel

Vegetal Biofuels

Plant-based biofuels

Oleaginous plant-based biofuels

Wood-based biofuels

Microalgae harvesting

GMO Biofuels

Crop-based biofuels

Microalgae processing

Biotechnology for Biofuels

Biofuels socio-economic impact

Biofuels from waste

6 – Immediate impact on research (S)pace

- Researchers get immediately visualized for their ‘discriminating results’
- Another researcher, engineer, funder or other observer may compare papers and projects in minutes rather than weeks
- Papers from the past that will never be accessible may be immediately located
- Projects for the future may be positioned anonymously at global scale
- Independent sets of theoretical results imply that a few ‘dishonest’ provers don’t matter
- Research fields and their interactions become a new kind of research object
- The related organization is not in place, while it is in other domains
- The transitions and matching between theories, protocols, experimentations, results, financing cycles, are systematized
- Status of research progress with theoretical goal or design/protocol to be financed or ‘in progress’ or ‘just achieved’ pre-publishing is immediately seen
- No risk (and yet uncertainty much reduced, in fact ‘framed’!): this is about ‘what’ and not so much about ‘how’, which remains private or even anonymous

7 – Interdisciplinary model for research field trajectories

- We take results summarized in section 3 as theoretical proofs of the Discinnet process
- However this has to be experimentally ratified and for this purpose implemented
- We also take them as supporting the FCP principle, model and procedure, as already applied for interdisciplinary research-field interactive support by the Discinnet process
- This is particularly related to research fields being scientific through replicable outcomes
- Recall that the universality – to be refuted! – is well exemplified by:
- Oracles (end, cf. [AIV,..]) unavoidable (‘non-relativizing’) but objects coupling (Cat theory)
- and for instance Penrose U/’R’ Objective Reduction (decoherence) or I (from (M)IP)
- similar conclusions in chemistry, bio, etc…, Information Systems management (Monod)
- FCP points to ways to solve the interdisciplinary explosion, with the <A I O> taken as the time redefined as <F I St>, with F as the final-time Complexity level with the Proving power to emulate any St, hence as their closure; examples being surfaces, functions, end categories
- Therefore we will systematize this in future interdisciplinary versions
- Research field trajectory then relies on the FCP type of connection between two fields

8 – Meta to Big to Meta data Interactive Experiment

We will implement and test an Interdisciplinary model on up to 5 pilot cases (fields)

FEEL FREE TO PARTICIPATE AND JOIN THE GOVERNANCE OF THIS SERIES OF CLUSTERS AND INTERCLUSTERS TO COVER MAJOR RESEARCH AREAS

Practically, we will bring the solution to a not-for-profit or foundation in North America, with open interdisciplinary governance from research and appropriation by networks

The current version is on a dedicated server and a lot still has to be done before a comprehensive and fully interdisciplinary system, but it already works

Our goal remains to get support to strengthen our research program, while the research institutions met so far have expressed interest, provided they have access to the maximal global cluster’s ongoing content and can use it maximally internally, whether on their servers or a limited cloud

Therefore we have decided that the only condition would be that all versions/platforms should at least share ‘published’ (red stars) USP

Then we are ready to put it and the next ones on U.S. server(s), at least under a not-for-profit corp. umbrella, with governance to be decided and further open licensing options through this body

As a result, research groups may either decide to use it on such a server, or get it for limited networks, or participate in governance and the construction of next versions, as drafted on the next page.


8.1 – Propositions

For whom: the immediate main beneficiaries should be doctoral schools, because a thesis, with a successful state of the art and community feedback from inception, is where a career may take off or crash

Postdocs, Data Scientists/community managers, researchers, funders

With whom: we would like to set up this governance with/by/from an interdisciplinary scientific federation… ESIP is broad and would be great

To whom: we have to convince universities, labs, editors everywhere, but would like to do this in an organized fashion

By whom (bringing, discussing, proving, explaining): initially ‘us’ (Discinnet)

Communicating deeply enough the program requires some amount of time in explaining, sharing, training, assistance and integrating remarks

This can be done under specific programs of 3/5 days depending on depth and next steps, hopefully with and within an appropriate frame

Next and much more ambitious goal is to set a governance with specific advantages first for governing body then for contributors to governance

At the international level we think that it could participate in the RDA project


8.2 – Experimental proposition (next)

Truth be told, we would have a funding issue if we had to do all this for free

So here is the idea we have come up with:

- For those willing to use it for free the fact is that… it is free, but no deep assistance
- For those that could contribute somehow then we have to bring value (not new)

We are putting in place a non-profit corporation in the US to handle maximal rights

To initiate this governance it appears that a budget to support itself would be appropriate, such as $1,500 for a field/lab, $3,500 for a department/center and $7,500 for a university or a foundation (the ‘naïve?’ base goal being to reach $250K per annum, or better, $500K), to be used by the I4 governance, then DIN

For our own part we bring related training and support, for instance 100 man days (half budget) to make it fully shared with US organizations either involved in the governance and/or willing to use a version of their own.

These services could come on top or through licensing to US based I4 non-profit corp.

Full budget would allow more ambitious research sharing through governing body including interconnection with other tools, database, Drupal, etc..

For instance we appreciate support already received from Indiana and the RDA goals


8.3 – Experimental proposition (next)

Once established with FCP that:

- Paths to the future are going to close to next FCP oracle
- But the wandering may be long and even growing
- It is important to test solutions to avoid risk of implosion
- FCP as one of the epistemological models to be tested
- With one question at least: is FCP merely an interpretation or (as it would be when applied as the ‘Discinnet process’) is it also a fruitful heuristic/procedure for Discovery?
- In Discinnet, FCP foundational on the role of dimensions / background independence
- Case 1 : a suggested contribution to Electromagnetism versus Gravitation GR gap, both being different basic phenomena hence different spaces after Q. vacuum**
- Case 2 : similar type of interpretation for a diversity of cosmological issues
- Case 3 (heuristics) : it suggests a path for equation or rather procedure of living
- Case 4 : equally applied in other fields of research, maths to social sciences


9 – Some Epistemological testing: the FCP case

9.1 – Epistemic status of projectors, provers, oracles

- Applications of the FCP principle (pattern/process)
- QM framework is the widest Meta to Big Data framework (renormalization)
- “Data ontological” : split between Density Matrix and Protocol and Oracle
- Future is Complexity but lies within common reach when potentially objective (technology)
- In “Relativizing versus Non-relativizing Techniques”, [AIV92] conclude that oracles may not even be avoidable, and that ‘Normal maths’ (Peano arithmetic) should make an O function explicit
- The power of the QMIP and MIP* classes has, for instance, a scheme with Verifier = Experimenter where the Research field/objects are the Provers (devices), but these may be malicious or dishonest
- On the other hand we may take Researchers (fields) as Provers and the experimented object and protocol as Verifier, since these are rather polynomially bounded objects/categories
- This sets the goal and path to the interdisciplinary predictive operational organization: currently the categorical objects/key words (‘ontologies’) and dimensions are not yet interactive
- Future is Complexity but lies within common reach when potentially objective: this makes research field trajectories predictable, since these future objects are past elsewhere
- For comprehensive (interdisciplinary) trajectory prediction we will have to link related positions and closure capacity from different levels of oracles, since some fields are already objects (with their oracles) harvested by others still in a coarse-graining phase.

9.2 – (Absolute) Time as Complexity density


We draw from Penrose a case to use as an example of a ‘natural’ ‘OR’ oracle

This technical oracle is, for instance, what is targeted through quantum computing or cryptography

Penrose pinpoints the fact that the entanglement event is recovered when one entangled particle is measured, implying a leap backward of 5 years

We interpret this as the fact that the more complex entanglement technicality and its intermediary Riemann sphere remain in fact ahead in time of both isolated particles

In other words, the more complex theory and its oracle automatically lie in the future of its experimental proof.

A striking consequence of Aspect’s replicable EPR experiments

P. Journeau, Richeact - I4OpenResearch

9.3 – Interdisciplinary model using FCP / Complexity

Recall that IP introduced the Interactive protocol, from where MIP makes Discinnet collaborative, MIP standing for Multi-Prover (honest or not) Interactive Protocol (V, P1, …, Pk), which fits perfectly with the natural sciences, even constructed (in fact always constructed), whereby a common research field (of research/meaning at first) V is tested by and tests the Provers (depending whether your stance is more transcendental modelist or empirical realist)

To St(art) with IP: the difference from NP was the split between two interacting machines, P outside of V, that is to say the (Final (oracle), St) couple becoming embedded within the IP class

We take IP = PSPACE = NSPACE very seriously, i.e. as an equivalent GR definition of space-time! This is an example of the closure of a notion of space with a ‘real’ oracle, here Euclidean (mapping) as defining the form of – or rather ‘with’ – usual (potentially gravitational) space.

Entropy vs. Negentropy (complexity)

Question asked by R. Penrose, but otherwise by C. Thiercelin in “Does the empire of the meaning belong to the empire of nature”, a question related to the Epistemological considerations going to be tested

« Is the whole universe an ‘isolated system’?»

9.4 – Status of constructed fields/objects versus ‘Nature’

Steps such as the inflaton, light, mass (Higgs field), cosmological constant, life, etc… And then the fast-growing ‘Earth’(s) transformation transitions in construction…?

Or the ‘OR’ issue (U/R) by Penrose versus ‘Consciousness ‘and other Interpretations?

What this proves is that the best data-oriented organization optimally, and at all scales, collaboratively involves human authors, the sole bearers of the meta-meaning: this is clearly already being put in place

However, it has to integrate considerably more the most appropriate interactions between them, experimental data and docs/texts.

Currently, even in the most advanced collaborative tools/platforms, this is still far too scattered, far too semantic, foreign to their assumed common experimental/verification/confrontation space or… battlefield!


10 – Applying to Complexity, Climate, B2M organization

Assuming research fields are comprised within the boundaries of QM renormalization and related complexity, hence between the highest complexity classes’ oracles, from where they undergo objective reduction by projection onto an N-dimensional lattice, the Discinnet process wholly and optimally represents their entire dynamics.


11 – Conclusion

P. Journeau, Richeact – Discinnet Labs

0 – A Background of the Discinnet process

Through single-Prover/researcher (IP class) versus Multi-Prover Interactive Protocol (MIP) collaborative design, the complexity degree changes the direction of time

- Two examples of orientation:
- From ‘target’ (Abstract) to ‘domain’ measurement-observed Data: see how the MIP class relates Oracle to process
- From Data to Authorship (Insight/Knowledge)

What some fields of research tell – 1.1: Category theory

It was still debated thirty years ago whether Set theory or Category theory is the more foundational for mathematics: solved in favor of categories, back to Kant.

A Category is defined in [AL91] as:

- “A collection Ob_C of objects a, b, …
- A collection Mor_C of morphisms (arrows)
- Two operations dom, cod, assigning to each arrow f two objects respectively called domain (source) and codomain (target) of f
- An operation id assigning to each object b a morphism id_b (the identity of b) such that dom(id_b) = cod(id_b) = b
- An operation “°” (composition) assigning to each pair f, g of arrows with dom(f) = cod(g) an arrow f ° g such that dom(f ° g) = dom(g) and cod(f ° g) = cod(f)
- Identity and composition, moreover, must satisfy the following conditions: identity law… associative law…”

The book then presents some “common categories”: the category of sets, with functions as morphisms, the topological spaces, with continuous functions, the vector spaces, with linear transformations, the groups, with group homomorphisms and the partially ordered sets, with monotone functions.
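The first of these common categories, finite sets with functions as morphisms, can be spelled out directly from the [AL91] definition, including the dom/cod bookkeeping and the identity law (sketch ours):

```python
# Sketch of the category of finite sets per the [AL91] definition:
# morphisms carry explicit dom/cod, composition f ° g is defined when
# dom(f) = cod(g), and the identity law can be checked concretely.

class Morphism:
    def __init__(self, dom, cod, mapping):
        assert set(mapping) == dom and set(mapping.values()) <= cod
        self.dom, self.cod, self.mapping = dom, cod, dict(mapping)

    def __call__(self, x):
        return self.mapping[x]

    def compose(self, g):  # self ° g, requires dom(self) = cod(g)
        assert self.dom == g.cod
        return Morphism(g.dom, self.cod, {x: self(g(x)) for x in g.dom})

def identity(obj):
    return Morphism(obj, obj, {x: x for x in obj})

A, B = {1, 2}, {"a", "b", "c"}
f = Morphism(A, B, {1: "a", 2: "c"})

# identity law: f ° id_A = f and id_B ° f = f
assert f.compose(identity(A)).mapping == f.mapping
assert identity(B).compose(f).mapping == f.mapping
print(f.compose(identity(A)).mapping)  # {1: 'a', 2: 'c'}
```

Note the orientation the slide stresses: every morphism knows its source (dom) and target (cod), so composition is only defined when those domains line up.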

We put in bold some major points, especially for our FCP conclusion that (meta)data are oriented, with the role of ‘source’ and ‘target’ domains and ‘morphisms’ between them.

FCP Metamodel

Exhibit 1 – Teachings from related fields /1

(a) This document reuses elements from recent or pending communications, namely CCCA’12, Foundations of Science, Toulouse Space KM conf. 2012

What fields of research tell – 1.2: Linguistics/Formal languages

Related to Category theory, with objects as strings based upon some alphabet

Turing added the concept of oracle, which we link to natural vs. formal languages

Chomsky’s universal grammar (meta)model (which worked practically for formal languages) follows:

Gc ≡ (VN, VT, Pr, St)

In this definition VN represents “variables”, VT “terminals” – corresponding to words for objects – Pr “productions” and St the “Start symbol”, belonging to VN.

Both VN and VT are finite subsets of V*, itself defined as “the set of all sentences (or strings) composed of symbols of V, including the empty sentence”, hence countably infinite, V being some alphabet.
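The countability of V* can be made concrete: for a finite alphabet V, the strings of V* can be listed one by one in length-then-lexicographic order. The generator below is our sketch, not from the slides.

```python
from itertools import count, product

def enumerate_strings(V):
    """Yield every sentence of V* (including the empty one) exactly once,
    in length-then-lexicographic order: a witness that V* is countable."""
    for n in count(0):                       # lengths 0, 1, 2, ...
        for chars in product(sorted(V), repeat=n):
            yield "".join(chars)

gen = enumerate_strings({"a", "b"})
first = [next(gen) for _ in range(7)]
assert first == ["", "a", "b", "aa", "ab", "ba", "bb"]
```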

Pr Productions are transitions or processes such as denoted in any a → b (Morphisms)

St marks the Input; the process (algorithm) eventually ends with an Output, decision or target if ‘Accepted’, whether Authorized by human halting or Automated if predictable.

There are four levels of formal languages or, equivalently, of decision problems. The least powerful is the ‘regular’ or type-3 or finite-automaton (fa) language, embedded into the type-2 ‘context-free language’ (cfl) class; that class is in turn embedded into the broader type-1 ‘context-sensitive languages’ (csl), which are decidable (recursive), themselves within the most general (oracle-free) recursively enumerable (re) type 0.
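A minimal sketch (ours, not from the slides) of the Gc ≡ (VN, VT, Pr, St) format above: a type-2 (context-free) grammar for the classic language {a^n b^n}, which no type-3 (finite-automaton) grammar can generate.

```python
import random

# Illustrative grammar in the (VN, VT, Pr, St) format of the slide;
# the concrete language {a^n b^n} is our choice of example.
VN = {"S"}
VT = {"a", "b"}
Pr = {"S": [["a", "S", "b"], []]}       # S -> a S b | (empty sentence)
St = "S"

def derive(symbol, rng, depth=0):
    """Expand one symbol into a terminal string by random productions."""
    if symbol in VT:
        return symbol
    # depth cap keeps the sketch terminating; [] is the empty production
    prod = Pr[symbol][1] if depth > 20 else rng.choice(Pr[symbol])
    return "".join(derive(s, rng, depth + 1) for s in prod)

rng = random.Random(0)
samples = {derive(St, rng) for _ in range(50)}
# every derived sentence has the a^n b^n shape
assert all(s == "a" * (len(s) // 2) + "b" * (len(s) // 2) for s in samples)
assert len(samples) > 1
```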

We propose that these (meta)(data) models derive from the FCP universal model presented next.


Exhibit 1 – Teachings from related fields /2

What fields of research tell: 1.3: Turing machines to semantics

A) From the poorest level (the previous finite automaton) to Computers/Computational Complexity

Turing Machines TM (types 1 to 3) bear their own/predictable/de(finite) targets F:

Gfa ≡ (K, Σ, δ, (q0, F)) (linear, finite automaton, linked to F within K)

K and Σ for TM (computers) wholly relate to Chomsky’s grammar models; besides, the equivalences between Church’s, Turing’s and Chomsky’s models were demonstrated.

We encompass the oriented (q0, F) within general time A: the target domain for F is all the more limited for fa and, on the contrary, all the more (meta) powerful when there is a change of cardinality or, in Computational Complexity, of complexity class.
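The Gfa tuple can be spelled out on a toy case. Since the slide's Greek letters were lost in transcription, the alphabet and transition map below follow the standard (K, Σ, δ, q0, F) convention, and the concrete language (strings over {a, b} with an even number of a's) is our own example.

```python
# Toy finite automaton: every component of the (K, Sigma, delta, q0, F)
# tuple spelled out; the language choice is illustrative only.
K = {"even", "odd"}                       # states
Sigma = {"a", "b"}                        # input alphabet
delta = {("even", "a"): "odd", ("odd", "a"): "even",
         ("even", "b"): "even", ("odd", "b"): "odd"}
q0 = "even"                               # start state (the input side)
F = {"even"}                              # accepting targets, F within K

def accepts(word):
    q = q0
    for c in word:
        q = delta[(q, c)]                 # one deterministic step per symbol
    return q in F                         # the run's final target decides

assert accepts("") and accepts("abba") and not accepts("ab")
```

Note how the run is oriented from q0 toward F, the point the slide makes about (q0, F) as an orientation in time.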

(see particularly interesting case of the P to NP to IP to MIP transitions next pages)

B) From most powerful computing to semantic space

- Semantic (b) spaces (or rather semantic to syntactic) link languages to meaning
- Meaning is the Author level nurturing semantic (contextual) space
- The Authority space is therefore the utmost meta-level, yet follows Kojève’s (c) somehow-FCP definition of authority, which doesn’t require force (typical of scientific authority)
- Turing/CC oracles definitions/theorems bring useful measurement for semantics (A)


Exhibit 1 – Teachings from related fields /3

b C. Mouton, Induction du sens de mots à partir de multiples espaces sémantiques, RECITAL 2009

c A. Kojève, La notion d’autorité, 1942

What fields of research tell: 1.4: Foundations of Physics

There are several types of bridges between so-called formal and physical domains

However, the most basic physical data model is time-space as a de(formed) field, whether continuous (Einstein’s GR proposition) or a finite discrete quanta lattice

G ≡ (, , , r) or G ≡ (, , , r)

These are spherical coordinates, but in the second definition we replace the second angle by a change

Observe that the first definition has two angular-type dimensions and only one spatial one

The second definition better exhibits the dynamics of perturbative theory and contributes to bridging the gap between physics and the grammar’s Pr Productions/Transitions/Morphisms a → b

A typical transition representation, from Feynman graphs to nuclear and chemical reactions

In both cases the angular part fits the formal content, whether for instance as curvature for gravitational data or as other typical angles of physical to chemical properties.

We also generalize this through a representation of oracles that is easy to derive from classical typewriting

Now the most interesting case is the extended definition of time, related to authority as complexity level: the experimenter asks questions and expects answers from measurement apparatus through reducing projections (Penrose)

Time, oriented, measures the complexity difference as well as the related number of steps

We propose that these (meta)(data) models are a derivation of the FCP universal model,

which may relate (cf. AIP #1446, 2012) the NEXP power of entangled particles, as compared to separated ones, to a general mechanism


Exhibit 1 – Teachings from related fields /4

Exhibit 1 – First conclusions and applications /6

What fields of research tell: Theory about research trajectories

We derive some conclusions from Computational Complexity results

Particularly from the applicability of the Practically Predictive Data class P (recursive, hence reusable/predictive, as effective Polynomial-time algorithms) up to the Knowledge-‘hard’ NP (Non-deterministic) class, particularly NP-complete, which was pushed to its limits through the IP and MIP classes.

- The NP class (Non-deterministic Polynomial time), defined as a solution brought ‘at once’ and then verified in polynomial time, was split into Provers vs. Verifier

- IP stands for Interactive Protocol between all-powerful Provers (human, i.e. (semantic) oracles) and a polynomially bounded Verifier: it is interesting to see how this applies through the process presented on slide 3.

- MIP is Multi-Prover Interactive Protocol, shown to have NEXP power level

[FOR92] concludes about the demonstrated difference between IP and MIP:

- "this model differs from one-prover interactive protocol in that the oracle must be set ahead of time while in an Interactive Protocol the prover may let his future answers depend on previous ones”
- We conclude conversely that MIP indeed has superior predictive power, see slide 3
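The Prover/Verifier split can be simulated at toy scale. The sketch below is ours (a standard textbook illustration, not the Discinnet process): the single-prover graph-non-isomorphism protocol, where an unbounded Prover answers a randomized Verifier's hidden coin and wins every round exactly when the two graphs really differ.

```python
from itertools import permutations
import random

def relabel(graph, perm):
    # apply a vertex relabeling to an undirected edge set
    return frozenset(frozenset({perm[u], perm[v]}) for u, v in graph)

def isomorphic(g, h, n):
    # unbounded-Prover step: brute force over all n! relabelings of g
    return any(relabel(g, p) == h for p in permutations(range(n)))

def run_protocol(G0, G1, n, k, rng):
    """k rounds: the Verifier hides a random relabeling of a secretly
    chosen graph; the Prover must name which graph was chosen."""
    wins = 0
    for _ in range(k):
        b = rng.randrange(2)                          # Verifier's secret coin
        hidden = relabel([G0, G1][b], rng.sample(range(n), n))
        guess = 0 if isomorphic(G0, hidden, n) else 1  # Prover's answer
        wins += (guess == b)
    return wins

triangle = frozenset({frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 2})})
path = frozenset({frozenset({0, 1}), frozenset({1, 2})})
rng = random.Random(1)
# Completeness: on non-isomorphic inputs the honest Prover wins every round.
assert run_protocol(triangle, path, 3, 20, rng) == 20
# Soundness: on an isomorphic pair the Prover only wins about half the rounds.
assert run_protocol(path, relabel(path, [2, 0, 1]), 3, 40, rng) < 40
```

Each failed round halves a cheating Prover's chances, which is where bounds of the 2^−n kind on the next pages come from.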

- (Meta)data are (utmost) Authorized as goals/targets or Abstracted (and this is well modeled/quantifiable within the MIP class and as Time)
- (Meta)data are processed paths more than particles (from input to output), hence are always Implemented (in new meta from previous meta) using concepts/oracles resulting from other fields as objects (as components), with (or Penrose ‘OR’) => Spontaneous change
- (Meta)data have structure as (in)formation, syntactic/Linguistic
- (Meta)data have some extension (length r, Numbered)

A ‘co-necessity’ model presented as FCP, denoted (A, L, S, N), implemented in Discinnet for the widest types of languages/structures (iv)

Examples of applications to other types of metadata model?

- Category and Formal theories by singularizing distance or replacing usual physical distance by other distances, whether related or not
- Physics data models conversely admit limited levels of complexity

(somewhat as we strictly relate Time to FCP complexity, and the physical phases are the most ancient, simplest ones)


2 – Principles of French (FCP) Discinnet model

iv Also see F. Heylighen, A Structural Language for the Foundations of Physics, I.J.G.S. 18, 1990

P. Journeau Richeact - I4OpenResearch

Transitive closure relations

Immerman [14] shows several equivalences:

CSL ⊇ (FO + positive TC) = NL (Non-deterministic LOGSPACE)

FO is First-Order logic (over N) and TC, ‘Transitive Closure’, the transitive closure of relations

(SO + TC) = PSPACE = NSPACE (specific to CC, but comparable)

SO is Second-Order logic (functions over N) and PSPACE is Polynomial Space, with NSPACE its non-deterministic counterpart

Recalling the first levels of the computational complexity hierarchy:

L ⊆ NL ⊆ AL = P ⊆ NP ⊆ AP = PSPACE = NSPACE

Where P is Polynomial Time and NP is Non-deterministic Polynomial Time, BUT ‘Linear Time’

Alternation of spaces and times is comparable to the quantifiers of the hierarchy, but also to U/R in physics: the relations above show the polynomial-time zone bracketed by the CSL (algorithms) = (FO + TC) zone at LOGSPACE level and the (SO + TC) zone at PSPACE level, related to the FCP principle

Immerman also produces the following theorems:

P = (FO + LFP) = (FO + ATC)

Where LFP stands for Least Fixed Point and ATC for the Alternating Transitive Closure operator
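The TC and LFP operators can be made concrete: the following sketch (ours) computes the transitive closure of a finite binary relation as a least fixed point, iterating until nothing changes.

```python
def transitive_closure(edges):
    """TC as a least fixed point: iterate R := R ∪ (R ; E) until stable."""
    closure = set(edges)
    while True:
        # compose the current closure with the base relation
        step = {(a, d) for a, b in closure for c, d in edges if b == c}
        if step <= closure:               # least fixed point reached
            return closure
        closure |= step

E = {(1, 2), (2, 3), (3, 4)}
assert transitive_closure(E) == {(1, 2), (2, 3), (3, 4),
                                 (1, 3), (2, 4), (1, 4)}
```

The monotone iteration is exactly what the LFP operator adds to first-order logic: each pass only ever adds pairs, so a least fixed point exists and is reached in finitely many steps on a finite domain.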

Capacity of the model to be more predictive through metadata:

On the one hand we have demonstrations from Computational Complexity

- More precisely MIP, the Multi-Prover Interactive Protocol, N-EXPonential time efficient, is:
- If x ∈ L then Pr(A1, …, Ak and V on x accept) > 1 − 2^−n (where n is ‘sufficiently large’ relative to |x|)
- If x ∉ L then Pr(A’1, …, A’k and V on x accept) < 2^−n
- [FOR92] emphasizes the equivalence with the Probabilistic Oracle TM definition:
- If x ∈ L then ∃O such that M^O accepts x with probability > 1 − 1/p(|x|) (1)
- If x ∉ L then ∀O, M^O accepts x with probability < 1/p(|x|) for all polynomials p and x sufficiently large
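Gaps like 1 − 2^−n versus 2^−n are typically reached by amplification. The arithmetic sketch below is ours (standard textbook material, not from [FOR92]): majority voting over independent rounds drives a constant per-round error exponentially down.

```python
from math import comb

def majority_error(k, p_err):
    """Probability that at least k/2 of k independent rounds err,
    i.e. that a majority vote over the rounds gives the wrong answer."""
    return sum(comb(k, i) * p_err**i * (1 - p_err)**(k - i)
               for i in range((k + 1) // 2, k + 1))

# a single round erring with probability 1/3, repeated and majority-voted
e5, e25, e101 = (majority_error(k, 1/3) for k in (5, 25, 101))
assert e5 > e25 > e101          # the error shrinks as rounds are added
assert e101 < 2 ** -5           # already far below a 2^-n style bound
```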

Now in order to prove the predictive power of FCP model processed through Discinnet we will show that it uniquely embeds MIP needs:

- Provers project ‘discriminating results’ to answer the Verifier, sharing them
- A limited number of Provers suffices if the Verifier is well enough collaborative
- It proves that Metadata produced in such a process have a unique predictive power
- Hence they are indeed meta, i.e. more (FCP) complex, i.e. further in the future
- There are already compelling examples at the quantum level; see also the example on the next page


Exhibit 2 – Principles of (FCP) Discinnet model /1

The main Computational Complexity references used here are:

- J. Hopcroft & J. Ullman, Formal Languages and their Relation to Automata, Addison-Wesley, 1969
- S. Arora, B. Barak, Computational Complexity, a modern approach, Cambridge U. P., 2009
- M. Mézard, Optimization and Physics: On the satisfiability of random Boolean formulas, arXiv:cond-mat/0212448v2
- N. Immerman, Languages that capture complexity classes, SIAM J. of Computing, 1987
- A. Asperti, G. Longo, Foundations of computing series, MIT press, 1991
- S. Arora, R. Impagliazzo, U. Vazirani : Relativizing versus non-relativizing techniques : the role of local checkability,
- L. Fortnow, Oracles, Proofs and Checking, 1992
- Some of the main Physics/Cosmology references used here are:
- F. Heylighen, A Structural Language for the Foundations of Physics, International Journal of General Systems 18, 1990
- A. Aspect, J. Dalibard & G. Roger, Experimental test of Bell’s inequalities using time-varying analyzers, Physical Review Letters 49, n°25, 1982
- R. Penrose, The Road to Reality, Alfred Knopf, NY, 2004
- R. Penrose, The Question of the Cosmic Censorship, J. Astrophys. Astr., 1999, 20, 233-248
- J. Polonyi, Dynamical breakdown of time reversal invariance and causality, arXiv:1109.2228v2
- M. Gasperini, Elements of String Cosmology, Cambridge U. P., 2007
- Some intermediary results may be found in our previous papers:
- P. Journeau, Evolution of the Concept of Dimension, American Institute of Physics CP905, 2007
- P. Journeau, New concepts of dimensions and consequences, AIP n°1018, 2008
- P. Journeau, Emergence of dimensions in cosmology, New Advances in Physics vol 4 n°4, 2010
- P. Journeau, Evolution of the concept of dimension and potential impacts in physics, AIP n°1456

P. Journeau Richeact - Discinnet Labs

Main References

- T. Larrabee, Y. Tsuji, Evidence for a satisfiability threshold for random 3CNF formulas, UCSC-CRL- 92-42
- R. Monasson, R. Zecchina, Entropy of the k-SATisfiability problem, Physical Review letters 76-21, 1996
- J. Wang, discussion on “implementing a quantum oracle” issue, FFP10, Perth, 2009
- P. Journeau, Une épistémologie générale pour l’auto-orientation, AO2011, Polytechnique, Palaiseau, 2011
- F. Heylighen, The Self-organization of Time and Causality: steps towards understanding the ultimate origin, Vrije Universiteit, 2008
- J. Manchak, On the Possibility of Supertasks in General Relativity,
- J. Meek, Independence of P vs. NP in regards to oracle relativizations, arXiv:0805.2170v6
- T. Baker, J. Gill & R. Solovay, Relativizations of the P=?NP question, SIAM Journal of Computing, 1975
- N. Immerman, Expressibility as a complexity measure, YALEU/DCS TR 538, 1987
- G. Dowek, The physical Church thesis as an explanation of the Galileo thesis, 2010
- M. Gasperini, Elements of String Cosmology, Cambridge U. P., 2007
- A. Montanari, F. Ricci-Tersenghi, G. Semerjian, Clusters of solutions and replica symmetry breaking in random k-satisfiability, arXiv:0802.3627v2 [cond-mat.dis-nn], 2008
- C. Thiercelin, L’Empire du sens fait-il partie de l’empire de la nature, Critique, n. 612, Paris, 1998
- B. Russell, On the notion of Cause, Proceedings, Aristotelian Society, 13, 1912 – 1913 / Scientia, 13, 1913
- J. Sethna, Entropy, Order Parameters and Complexity, Oxford University Press 2008
- J. M. Hombert et al., Aux origines des langues et du langage, Fayard, Paris, 2005
- L. Fortnow, The Role of Relativization in Complexity Theory, Chicago University, 1992
- S. Mertens, M. Mézard, R Zecchina, Threshold Values of Random K-Sat from the cavity Method, DOI 10.1002/rsa 20090
- N. Immerman, Nondeterministic Space is Closed under Complementation, SIAM 17:5, 1988
- E. Allender, M. Loui & K. Regan, Complexity Classes, Rutgers, 1995
- A. Chandra, D. Kozen & L. Stockmeyer, Alternation, ACM, 1981
- A. Albrecht et al., Findings of the Joint Dark Energy Mission Figure of Merit Science Working Group, astro-ph.IM, 2009
- M. Tegmark, Dimensionless constant, cosmology and other dark matters, ArXiv : astro-ph/0511774v3, 2006
