
From Big to Meta Data or conversely

  I4OpenResearch. Building Technical Capacity for Earth Science Data and Information in Disaster Planning, Response, Management and Awareness. (How) is Data complexity/level-up to Metadata achievable? Paths/organization to achieve such a goal, framing research results.


Presentation Transcript


  1. I4OpenResearch. Building Technical Capacity for Earth Science Data and Information in Disaster Planning, Response, Management and Awareness. (How) is Data complexity/level-up to Metadata achievable? Paths/organization to achieve such a goal, framing research results. The example of the ‘Discinnet process’. Toward a global-scale interdisciplinary experimentation. From Big to Meta Data or conversely

  2. Summary (ESIPFED - July 12th 2013 - PJ) • The Big to Meta (semantic) data issue • Some epistemological background assumptions • Important experimental results and interpretations • Presentation of the ‘Discinnet process’ • On-line demonstration • Immediate impact on research/knowledge pace • Interdisciplinary model for research fields trajectories • Proposition for a global-scale related experimentation • Expected general epistemological model confirmation • Consequence for the earth science question to the conference • Conclusion • Q&A

  3. 1 – The Big to Metadata issue • Metadata have meaning, scope, usefulness, persistence, as opposed to ‘Big’ • However our investigation instruments produce ever tinier, simpler, bigger data • E.g. simple laws from statistical ‘Physics’ up to information/complexity theory • CC, Languages and Math/Category theory oppose reasoning to computing (Dwk,..Lng) • We relate complex/Meta data strictly to future/subject, Big to past/objects • But highlight the major issue of ‘incommensurability’ of big-to-meta transitions • The search for symmetries, highest value for purpose and simulating ability • The concept of Oracle and OTM as good reference for Metadata implementing • Relation to P/NP, to the FCP, to time and to complexity (example TSV 615->616) • Big data are numerous, scattered, simple objects; Meta dense, singular, complex • The issue is about ‘our’ future predictability and optimal drive into it

  4. 2 – Epistemological / measurable background • A promising start is the widest spatial-scale boundary gap, universe vs. quanta • However still wider is the complexity-versus-most-simple ladder, with: • The 10^(10^123) universe (neg)entropy problem (Penrose) • As compared to the 10^120 cosmological constant problem (n°1) • There are still other ‘Big’ order levels than these objectal/entropy-field and mass-energy-field issues – to which we propose solutions aside from the project presented here on research forecast – since incommensurable, such as meaning, renormalization, life.. • In such domains, Jane Goodall, March 2013: “we don’t live solely in the present as intelligent animals such as chimpanzees and elephants.. It is human language that makes the whole difference. With it we can discuss, exchange ideas, learn from the past and project ourselves into the future.” [last point to be debated] • This led for instance Heylighen, after Chomsky’s natural/formal language distinction, to the project of “A structural language for the foundations of physics”, which is also part of this project, skipping for now the constructivist vs. positivist issue. • As a summary, the main question is on the conditions making Big to Meta, or B2M or D2K, possible, perhaps only as K2D2K?, and on the science information management and organizations allowing or conversely preventing such futures

  5. 2.1 – Epistemological / measurable background Context and goal of this presentation and proposition • Research (> $1 trillion/yr) is insufficiently funded and yet too risky and long-term • How to accelerate/secure knowledge discovery and proofing? • Hence get scientific, predictable, hence reproducible new objects, i.e. technologies? • Using results from our own research, physics, linguistics/C.C… • Our own research is about the FCP principle as fundamental pattern/process/place (reality), whereby ‘Any phenomenon/field is an FCP occurrence’ with a definite (St, F) lapse, where securing F is our ‘targeted result’ but F is a more complex ‘phenomenal’ space • Physics: discussion about the maximal framework of complex universe vs. quantum • CC/Computational complexity (natural to formal linguistics and category theory). In “The Knowledge Complexity of Interactive Proof Systems”, Goldwasser, Micali and Rackoff ask “How much knowledge should be communicated for proving a theorem T?” and have the goal to “give a computational complexity measure of knowledge and measure the amount of additional knowledge contained in proofs.” They then start from the NP class as a ‘very successful formalization of the notion of theorem-proving procedure’ and use the Cook-Levin definition of an ‘NP proof-system’ as two communicating Turing machines, a prover and a verifier, with the prover deterministic exponential time (EXP) and the verifier polynomial time (P)
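The prover/verifier split for NP described above can be made concrete with a short sketch (ours, not from the slides): for SAT, the prover's entire message is a candidate assignment (the certificate), and the verifier checks it in time linear in the formula size.

```python
# A minimal sketch of the NP prover/verifier split, assuming a CNF
# formula encoded as a list of clauses, each clause a list of signed
# variable indices (positive = x_i, negative = NOT x_i). The prover
# sends only the certificate; the verifier runs in polynomial time.

def verify(clauses, assignment):
    """Polynomial-time verifier: accept iff every clause has a true literal."""
    def literal_true(lit):
        value = assignment[abs(lit)]
        return value if lit > 0 else not value
    return all(any(literal_true(lit) for lit in clause) for clause in clauses)

# (x1 OR x2 OR NOT x3) AND (NOT x1 OR x3 OR x4)
formula = [[1, 2, -3], [-1, 3, 4]]
certificate = {1: True, 2: False, 3: True, 4: False}
print(verify(formula, certificate))  # True
```

Note that finding the certificate may take the prover exponential time, while checking it stays polynomial: exactly the asymmetry the quoted definition captures.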

  6. 2.2 – Epistemological backgrounds and goals • Positivist and ‘Pure Reason’ epistemologies*, models vs. atoms • Evolutionist and/or Constructivist, toward best new worlds. In all cases there comes a better, next, new, more knowledgeable world, but the evolutionist view ‘looks’ more N(P) and the Constructivist more oracled, yet both aim for the best, and the status of the human (author, actor, automaton) and of the artificial varies • Kuhn’s basic model as an epistemological ‘case’ to be observed • Propositions that: • the ‘Discinnet process’ optimally models a ‘research field’ as an object of research • it may therefore precisely enough represent and predict their behavior • it therefore might allow Epistemology to move from Philosophy to Science • FCP as one of the epistemological models to be tested * There is a holistic perspective on Kant’s epistemology (see exhibits)

  7. 2.3 – Important experimental issues What we have at hand for measurement are distances, axes, projectors, vector lattices, elementary shapes, of which we show the usefulness: - let’s see the infinite-dimensional lattice, even H - yet compare with an exponential compact object such as a circle or an egg.. and work on the dynamics implied, for instance, by series of non-local harmonics. Importance of experimental results in space, from cosmology to particles, and now getting into quantum computing from quantum measurement. Importance of the types and numbers of dimensions, or of ‘Dimension’. For instance the role of the Wick rotation (used by Hawking for the ‘no-boundary’ universe) in our proposed FCP framework with subjective closure as future. And some Big to Meta data grammar models with 4 levels, theorems (and the general FCP pattern from Euclid’s axioms to Category theory). In the next pages we establish, with the benchmark NP referential 3-SAT (CNF) vs. P 2-SAT (one dimension less), then the transition to IP and MIP / QMIP, the optimal closure (entangled/oracle) of scientific knowledge by Discinnet

  8. 3 – Big to Metadata main figures. Figure elaborated with Dr. Sören Auer (Bonn U.) while preparing a common European research program presentation, 2010. [Figure: researchers (< millions) as ‘Big’ authors, with communities and networks (several tens); experimental VERY BIG / TOO BIG DATA (-> trillions); the OPEN DATA LINKAGE ISSUE; research projects (limited) as META DATA via the Discinnet process; research papers/docs (-> billions?); problem: fast cumulative growth]

  9. 3.1 – Need of a Metagrammar for Metadata Funders and researchers alike expect an efficient organization/process. Currently they combine semantic/natural language with a somewhat isolated recourse to experiments, categories, alphabets, even meaning. Tools are available: • for procedures, grammars, semantic versus formal (algorithmic) levels • about hierarchies and then ordering and numbering (linear) • then about spatial/vector/matrix lattices through sets of projectors. These are particularly: G ≡ (VN, VT, Pr, St), the Chomsky project for grammar, with the End outside. Yet there are examples of productions in Physics (d -> NN) and Chemistry (Na + Cl -> NaCl). The Chomsky project was to account for both natural/semantic and formal languages. Notice that the End is outside and related to the Productions (hence types 0 to 3). We will see/assert that the position of the author/observer/oracle is the key. It explains the nature of the differences between P, NP, IP, MIP. A key is the NP equivalence between ‘Non-deterministic’ and ‘Oracle before/out’

  10. 3.11 – Some stuff to be represented

  11. 3.2 – Results from Complexity theory Major complexity levels and proposed Big (V) to Meta (A/Pk) equivalences: BIG = P(TIME) ⊆ NP ⊆ IP = QIP = PSPACE = NSPACE ⊆ MIP = NEXP = pb.OTM = META. Plus new results with Quantum Multi-prover Interactive Protocols (QMIP), such as [Ito, Vidick, 2012] (Verifier = experimenter, Provers = devices), and [BFK 2010]: QMIP = MIP* and “the entire power of quantum information in MIP* is captured by the shared entanglement, not quantum information.” The MIP = pb.OTM = NEXP equality and the ‘prior entanglement’ prove that Provers (us, common sense, researchers’ ‘shared’ meanings) commonly and collaboratively have access to future (‘targeted’, ‘expected’) results. Both the probabilistic frame of the IP and MIP classes and procedures, and the Alice/Bob split-information tests to V, reflect the scientific process. Example of a universality: α = rand. lim(N->∞) M/N is an ‘Oracle before’
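The probabilistic-verifier frame of IP/MIP invoked above can be illustrated with a standard textbook example (our choice, not from the slides): Freivalds' check, in which a verifier accepts a claimed matrix product by testing it against random vectors instead of recomputing the product.

```python
# A standard toy illustration of a randomized verifier: accept the
# claim C = A*B by comparing A*(B*r) with C*r for random 0/1 vectors r,
# which costs O(n^2) per trial instead of O(n^3) for recomputation.
import random

def freivalds_check(A, B, C, trials=20, rng=random):
    """Probabilistic verifier; a wrong C survives one trial with prob. <= 1/2."""
    n = len(A)
    for _ in range(trials):
        r = [rng.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False  # certainly wrong: a witness vector was found
    return True  # accepted; wrong with probability <= 2**-trials

A, B = [[1, 2], [3, 4]], [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]
C_bad = [[19, 22], [43, 51]]
print(freivalds_check(A, B, C_good), freivalds_check(A, B, C_bad))
```

The dishonest claim is rejected with probability 1 - 2^-20, without the verifier ever doing the prover's work: the same economy the slide attributes to interactive protocols.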

  12. 3.3 – Computational Complexity hierarchy Observe a closing on: PSPACE = NSPACE (= IP), which we link to fundamental space as an FCP field property, with formal and for instance Planck cells, therefore allowing holistic metadata/meaning effective sharing from the Big data/lattice (continuum/computational); see also Sethna on universality/scale invariance

  13. 3.4 – Some fundamental underlying experimental results Pursuing with the simple Turing machine grammar model where we highlight the ‘F inside’: Gfa ≡ (K, Σ, δ, (q0, F)) (linear, finite automaton, linked to F within K). The NP referential is 3-SAT (3CNF), opposed to P 2-SAT, which has one dimension less. In 3-CNF – the benchmark! – the number N of variables must grow to maintain satisfiability as M clauses are drawn at random (which is where stats/‘Big data’ operate). Aspect experiments / Penrose support formal/“non-local realism” • Aspect experiments prove “formal realism” • We can also replicate photon and other particle splits • This again proves that N dimensions are the deep background • The scheme at left is from Larrabee 1993
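The random 3-SAT point above, that satisfiability of M random clauses over N variables hinges on the ratio M/N (with a sharp threshold near M/N ≈ 4.27), can be checked empirically with a brute-force sketch; this is our toy illustration, workable only for tiny N since it enumerates all 2^N assignments.

```python
# Empirical look at the random 3-SAT threshold: satisfiability rate of
# random 3-CNF formulas at clause-to-variable ratios below, near, and
# above the critical value. Exponential brute force, so N stays tiny.
import itertools
import random

def is_satisfiable(clauses, n):
    """Brute-force SAT over all 2^n assignments of variables 1..n."""
    for bits in itertools.product([False, True], repeat=n):
        assignment = {i + 1: bits[i] for i in range(n)}
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

def random_3cnf(n, m, rng):
    """m random clauses of 3 distinct variables, each sign chosen at random."""
    return [[v if rng.random() < 0.5 else -v
             for v in rng.sample(range(1, n + 1), 3)]
            for _ in range(m)]

rng = random.Random(0)
n = 10
for ratio in (2.0, 4.3, 6.0):
    m = int(ratio * n)
    sat = sum(is_satisfiable(random_3cnf(n, m, rng), n) for _ in range(10))
    print(f"M/N = {ratio}: {sat}/10 random formulas satisfiable")
```

At the low ratio almost every formula is satisfiable, at the high ratio almost none is, and near 4.3 the outcomes are mixed: the ‘Oracle before’ limit α = lim M/N of the previous slide seen as an experiment.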

  14. 3.5 – Time as Complexity density from CC After the Turing centenary, let’s focus on his Oracles. The concept was introduced in 1939 => Oracle Turing machines. Used in many (most?) theorems (including Cook’s foundational one). The quest for physical oracles is going on, but here is the CC representation. We produce it by folding the infinite discrete oracle tape into first a square (or later cubic) lattice, then polygonal hence polynomial ones corresponding to the P/N C. classes, to End with a spherical one, as the oracle is defined as ‘instantaneous’, or otherwise EXP vs. POLY TIME. A polygon/polynomial has n symmetries while e^iθ is infinitely symmetric, but the infinite process also has to be included, hence S2
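The oracle-machine idea can be caricatured in a few lines (our illustration; the names and the toy oracle are ours, not a real SAT solver): an "oracle machine" is just a procedure that may call a black-box decision function in a single step, and its complexity is measured relative to that oracle.

```python
# A minimal sketch of relativized computation: the machine below runs in
# time polynomial in its input *counting each oracle query as one step*,
# whatever the oracle itself would cost to compute.
def decides_with_oracle(formula_pairs, sat_oracle):
    """Toy machine with a SAT oracle: for each pair of formulas, report
    whether exactly one of the two is satisfiable."""
    return [sat_oracle(a) != sat_oracle(b) for a, b in formula_pairs]

# A hypothetical stand-in oracle for demonstration only: a "formula" is
# a Python boolean expression in x, brute-forced over x in {0, 1}.
def toy_sat_oracle(expr):
    return any(eval(expr, {"x": x}) for x in (0, 1))

print(decides_with_oracle([("x", "x and not x")], toy_sat_oracle))  # [True]
```

Swapping in a different oracle changes what the same machine can decide, which is the sense in which the slide treats the oracle as an "instantaneous" endpoint outside the machine's own tape.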

  15. 3.6 – Epistemic status We tend to see objects fundamentally ‘in’ space, actually a 3D unique background/universe. This is not what QM and other experiments tell us, nor what shared knowledge implies. FCP provides another picture of the world (see AIP references, the last as #1456), with a wealth of FCP complex coarse-grained histories/objects interwoven and closing. Therefore a cluster of common-sense projected intentions seeks its next oracle/closure. The Wick rotation τ = it exchanges the Lorentz (local GR) versus Euclidean pictures. Euclidean apparent objective 3D shape, including the Universe

  16. 3.7 – Epistemic closure to be experimented • Recall the constructivist versus positivist debate, i.e. invention versus discovery: to what extent do we create an extended nature, to the point of it becoming unnatural even with tests/failures and successes (a ‘practical reason’ issue about limits to social & earth testing) • A main proposition is that Absolute Time/Future ≡ FCP Complexity closing, hence the overall outcome is determined, but: • the paths are not, and some may be much more harmful than others, • moreover the time compression is not innocuous • if so, invention belongs to the path and discovery to the next boundary • This could seem quite philosophical, foreign to our practical purpose, but this is what the ‘physical boundary’ (assumed so and indeed much experimented) yields, says Penrose: • “.. The projector set provides refinements for, or alterations to, coarse-graining ‘boxes’, as in classical phase space, and this accounts for the term ‘coarse-grained history’..” [objects] • “A consistent set of coarse-grained histories is called maximally refined if one cannot insert another projector set (..) without destroying its consistency..” [hence closing] • A key fact is that the unavoidable role of the ‘observers’ is the striking, puzzling result of QM experiments, where non-local formal clustering such as the entangled pair is basic

  17. 4 – Presentation of the ‘Discinnet process’ Applying the FCP principle, the ‘Discinnet process’ is assumed to be the most efficient way to represent research fields, study their behavior, model it as clusters of community researchers’ projected intentions versus results, and presumably predict their trajectories, hence the (appropriate) space and time of their outcomes. To outsiders, research would appear to live in a world of terms, wholly semantic, when it has to come down to a world of measure, quantized. This happens through research projects, from where terms (Meta) are projected onto less complex but more numerous measurement ‘Big’ data axes. Projectors from projects do matter. They may cascade the terms down the measurement hierarchies to end with the basic MKS grammar base. New dimensions are how complexity, hence the future, comes to terms, and conversely how the future, as terms, may be consistently projected and shown consistent with the past (resilient) within the set of already realized clauses in phenomena, the world of existing objects

  18. 4.1 – ‘Discinnet process’ symbolism • State of the Art • Projected into K-space from anticipated or targeted results (Shaping) • With available Experimental Protocol-path (Real) • ‘In-progress’, hence collaborative, financed, reduced uncertainty, closed future (Implementing) • Measured results (Achieved) • Published. This slicing is therefore not arbitrary but derives directly from the FCP framework

  19. 4.2 – Theoretical elementary research field representations A researcher may select at least one most significant targeted result, and the related status will be derived from Discinnet process analysis. Example of a ‘Cluster’ of State-of-the-Art (published) significant experimental results: 1 – State of the Art (each red star plots the distinctive contribution of a paper) 2 – State of the Art + Proposition of a new research program 3 – State of the Art + Proposition of a new program + Visualization of the full set of comparable projects with the same advancement status in the world or given areas • Example of visualization of two statuses among the five: • STATE OF THE ART: mapped ‘published’ papers/red stars • ONGOING RESEARCH: a worldwide representation of most long-term research projects’ performance main goals, ‘Shaping’-level projects (blue quarters of a pie). For Researchers as for Observers of the field it is particularly useful to be able to plot the distinctive position/contribution of (2) as compared to other statuses in the world (3)! The trend seems to be about a splitting over the long term

  20. 4.3 – Research fields are ‘E’* Science maximal/meta data. Without DiscInNet / With DiscInNet. The ‘Discinnet process’ models epistemology’s transition from philosophy to science inasmuch as it maps and conveys fields into objects of research through appropriate space/metrics. * Through the Discinnet process the data-to-metadata interaction is both spatial and very earthly

  21. 4.4 – Discinnet Synoptic

  22. 5 – On-line demonstration If a researcher summarizes a 200-page report in one figure to get funding, isn’t that a ‘proving’ use case?

  23. 5 – On-line demonstration

  24. 5 – On-line demonstration

  25. 5 – On-line demonstration

  26. Cluster Peer Committees • Renowned senior researchers • Mark and comment on research projects much more efficiently • Project marking/ranking in a much democratized way. Evaluations: A: Excellent, B: Fair, C: Questionable, D: Spam, O: No comment

  27. 5 – On-line demonstration

  28. 5 – On-line demonstration This is collaborative

  29. 5 – On-line demonstration This is collaborative. Example clusters: Microalgae biofuels, Aviation biofuel, Vegetal biofuels, Plant-based biofuels, Oleaginous plant-based biofuels, Wood-based biofuels, Microalgae harvesting, GMO biofuels, Crop-based biofuels, Microalgae processing, Biotechnology for Biofuels, Biofuels socio-economic impact, Biofuels from waste

  30. 6 – Immediate impact on research (S)pace • Researchers get their ‘discriminating results’ immediately visualized • Another researcher, engineer, funder or other observer may compare papers and projects in minutes rather than weeks • Papers from the past that would otherwise never be accessed may be immediately located • Projects for the future may be positioned anonymously at global scale • Independent sets of theoretical results imply that a few ‘dishonest’ actors don’t matter • Research fields and their interactions become a new kind of research object • The related organization is not in place, while it is in other domains • The transitions and matching between theories, protocols, experimentations, results and financing cycles are systematized • The status of research progress, with a theoretical goal, or a design/protocol to be financed, or ‘in progress’, or ‘just achieved’ pre-publishing, is immediately seen • No risk (and yet uncertainty much reduced, in fact ‘framed’!): this is about ‘what’ and not so much about ‘how’, which remains private or even anonymous

  31. 7 – Interdisciplinary model for research fields trajectories • We take the results summarized in section 3 as theoretical proofs of the Discinnet process • However this has to be experimentally ratified and for this purpose implemented • We also take them as supporting the FCP principle, model and procedure, as already applied for interdisciplinary research fields’ interactive support by the Discinnet process • This is particularly related to research fields being scientific through replicable outcomes • Recall that the universality – to be refuted! – is well exemplified by: • Oracles (end, cf. [AIV,..]) unavoidable (‘non-relativizing’) but objects coupling (Category theory) • and for instance Penrose U/’R’ Objective Reduction (decoherence) or I (from (M)IP) • similar conclusions in chemistry, biology, etc…, Information Systems management (Monod) • FCP points to ways to solve the interdisciplinary explosion with <A I O> taken as the time redefined as <F I St>, with F as final time ≡ Complexity level with the proving power to emulate any St, hence as their closure, examples being surfaces, functions, end categories • Therefore we will systematize this in future interdisciplinary versions • Research field trajectory then relies on the FCP type of connection between two fields

  32. 8 – Meta to Big to Meta data Interactive Experiment We will implement and test an Interdisciplinary model on up to 5 pilot cases (fields). FEEL FREE TO PARTICIPATE AND JOIN THE GOVERNANCE OF THIS SERIES OF CLUSTERS AND INTERCLUSTERS TO COVER MAJOR RESEARCH AREAS. Practically, we will bring the solution to a not-for-profit or foundation in North America with open interdisciplinary governance from research and appropriation by networks

  33. 8.1 – Propositions The current version is on a dedicated server and a lot still has to be done before it is a comprehensive and fully interdisciplinary system, but it already works. Our goal remains to get support to strengthen our research program, while the research institutions met so far have expressed interest, under the provision of having access to the maximal global cluster’s ongoing content and yet being able to use it maximally internally, whether on their servers or a limited cloud. Therefore we have decided that the only condition would be that all versions/platforms should at least share ‘published’ (red stars) USP. Then we are ready to put it, and next versions, on U.S. server(s), at least under a not-for-profit corp. umbrella with governance to be decided and further open licensing options through this body. As a result, research groups may either decide to use it on such a server, or get it for limited networks, or participate in the governance and the construction of next versions as drafted on the next page.

  34. 8.2 – Experimental proposition (next) For whom: the immediate main beneficiaries should be doctoral schools, because a thesis, with a successful state of the art and community feedback from inception, is where a career may take off or crash; also postdocs, data scientists/community managers, researchers, funders. With whom: we would like to set up this governance with/by/from an interdisciplinary scientific federation… ESIP is broad and would be great. To whom: we have to convince universities, labs, editors everywhere, but would like to do this in an organized fashion. By whom (bringing, discussing, proving, explaining): initially ‘us’ (Discinnet). Communicating the program deeply enough requires some amount of time in explaining, sharing, training, assistance and integrating remarks. This can be done under specific programs of 3–5 days depending on depth and next steps, hopefully with and within an appropriate frame. The next and much more ambitious goal is to set up a governance with specific advantages, first for the governing body, then for contributors to governance. At the international level we think that it could participate in the RDA project

  35. 8.3 – Experimental proposition (next) Truth be told, we would have a funding issue if we had to do all this for free. So here is the idea we have come up with: • For those willing to use it for free, the fact is that… it is free, but with no deep assistance • For those that could contribute somehow, we have to bring value (nothing new). We are putting in place a non-profit corporation in the US to handle maximal rights. To initiate this governance it appears that a budget to support itself would be appropriate, such as $1,500 for a field/lab, $3,500 for a department/center and $7,500 for a university or a foundation (the ‘naïve?’ base goal being to reach $250K per annum, or better, $500K), to be used by I4 governance then DIN. For our own part we bring related training and support, for instance 100 man-days (half the budget) to make it fully shared with US organizations either involved in the governance and/or willing to use a version of their own. These services could come on top, or through licensing to the US-based I4 non-profit corp. The full budget would allow more ambitious research sharing through the governing body, including interconnection with other tools, databases, Drupal, etc.. For instance we appreciate the support already received from Indiana and the RDA goals

  36. 9 – Some Epistemological testing: the FCP case Once established with FCP that: • Paths to the future are going to close to the next FCP oracle • But the wandering may be long and even growing • It is important to test solutions to avoid the risk of implosion • FCP as one of the epistemological models to be tested • With one question at least: is FCP merely an interpretation or (as it would be when applied as the ‘Discinnet process’) is it also a fruitful heuristic/procedure for Discovery? • In Discinnet, FCP is foundational on the role of dimensions / background independence • Case 1: a suggested contribution to the Electromagnetism versus Gravitation (GR) gap, both being different basic phenomena hence different spaces after the Q. vacuum** • Case 2: a similar type of interpretation for a diversity of cosmological issues • Case 3 (heuristics): it suggests a path for an equation, or rather procedure, of living • Case 4: equally applied in other fields of research, maths to social sciences

  37. 9.1 – Epistemic status of projectors, provers, oracles • Applications of the FCP principle (pattern/process) • The QM framework is the widest Meta to Big Data framework (renormalization) • “Data ontological”: split between Density Matrix and Protocol and Oracle • Future is Complexity but lies within common reach when potentially objective (technology) • In “Relativizing versus Non-relativizing Techniques”, [AIV92] conclude that oracles may not even be avoidable and that ‘normal maths’ (Peano arithmetic) should make an O function explicit • The QMIP and MIP* classes’ power has for instance a scheme with Verifier = Experimenter, and research fields/objects as the Provers (devices), but these may be malicious or dishonest • On the other hand we may take Researchers (fields) as Provers and the experimented object and protocol as Verifier, since these are rather polynomially bounded objects/categories • This sets the goal and path to the interdisciplinary predictive operational organization: currently the categorical objects/key words (‘ontologies’), dimensions, are not yet interactive • Future is Complexity but lies within common reach when potentially objective: this makes research field trajectories predictable since these future objects are past elsewhere • For comprehensive (interdisciplinary) trajectory prediction we will have to link related positions and closure capacity from different levels of oracles, since some fields are already objects (with their oracles) harvested by others still in their coarse-graining phase.

  38. 9.2 – (Absolute) Time as Complexity density We draw from Penrose a case to use as an example of a ‘natural’ ‘OR’ oracle. This technical oracle is for instance what is targeted through quantum computing or cryptography. Penrose pinpoints the fact that the entanglement is even recovered when one entangled particle is measured, meaning a leap backward of 5 years. We interpret this as the fact that the more complex entanglement technicality and its intermediary Riemann sphere remain in fact ahead in time of both isolated particles. In other words, the more complex theory and its oracle automatically lie in the future of its experimental proof. A striking consequence of Aspect’s replicable EPR experiments

  39. 9.3 – Interdisciplinary model using FCP / Complexity Recall that IP introduced the Interactive protocol, from which MIP makes it collaborative (Discinnet), MIP standing for Multi-Prover (honest or not) Interactive Protocol (V, P1,…Pk), which fits perfectly with natural sciences, even constructed (in fact always constructed), whereby a common research field (of research/meaning at first) V is tested by and tests the Provers (depending whether your stance is more transcendental modelist or empirical realist). To St(art) with IP, the difference with NP was the split between two interacting machines, P out of V, that is to say the (Final (oracle), St) couple becoming embedded within the IP class. We take IP = PSPACE = NSPACE very seriously, i.e. an equivalent GR definition of space-time! This is an example of closure of a notion of space with a ‘real’ oracle, here Euclidean (mapping), as defining the form of – or rather ‘with’ – usual (potentially gravitational) space.

  40. Entropy vs. Negentropy (complexity) “Is the whole universe an ‘isolated system’?” A question asked by R. Penrose, and otherwise by C. Thiercelin in “Does the empire of meaning belong to the empire of nature?”, a question related to the epistemological considerations about to be tested

  41. 9.4 – Status of constructed fields/objects versus ‘Nature’ Steps such as the inflaton, light, mass (Higgs field), the cosmological constant, life, etc.. And then the fast-growing ‘Earth’(s) transformation transitions in construction…? Or the ‘OR’ issue (U/R) by Penrose versus ‘Consciousness’ and other interpretations?

  42. 10 – Applying to Complexity, Climate, B2M organization What this proves is that the best data-oriented organization optimally, and at all scales, puts human authors – sole bearers of the meta-meaning – into collaborative interaction: this is clearly already being put in place. However it has to integrate considerably more of the most appropriate interactions between them, experimental data and docs/texts. Currently, even in the most advanced collaborative tools/platforms, this is still far too scattered, far too semantic, foreign to their assumed common experimental/verification/confrontation space or… battlefield!

  43. 11 – Conclusion Assuming research fields are comprised within the boundaries of QM renormalization and related complexity, hence between the highest complexity classes’ oracles, from where they undergo objective reduction by projection on an N-dimensional lattice, the Discinnet process wholly and optimally represents their entire dynamics.


  45. 0 – A Background of the Discinnet process (P. Journeau, Richeact - Discinnet Labs) Through single-Prover/researcher (IP class) versus Multi-Prover Interactive Protocol (MIP) collaborative design, the complexity degree changes the direction of time • Two examples of orientation: • From ‘target’ (Abstract) to ‘domain’ measurement observed Data: see how the MIP class relates Oracle to process • From Data to Authorship (Insight/Knowledge)

  46. FCP Metamodel Exhibit 1 – Teachings from related fields /1 (I4OpenResearch) What some fields of research tell: 1.1: Category theory. Still debated thirty years ago, whether Set theory or Category theory is the most foundational to mathematics: solved in favor of categories, back to Kant. A Category is defined in [AL91] as: • “A collection ObC of objects a, b. • A collection MorC of morphisms (arrows) • Two operations dom, cod, assigning to each arrow f two objects respectively called domain (source) and codomain (target) of f • An operation id assigning to each object b a morphism id_b (the identity of b) such that dom(id_b) = cod(id_b) = b • An operation “°” (composition) assigning to each pair f, g of arrows with dom(f) = cod(g) an arrow f ° g such that dom(f ° g) = dom(g) and cod(f ° g) = cod(f) • Identity and composition, moreover, must satisfy the following conditions: identity law… associative law…” The book then presents some “common categories”: the category of sets, with functions as morphisms; topological spaces, with continuous functions; vector spaces, with linear transformations; groups, with group homomorphisms; and partially ordered sets, with monotone functions. We put in bold some major points, especially for our FCP conclusion that (meta)data are oriented, with the role of ‘source’ and ‘target’ domains and ‘morphisms’ between them. [Note: this document reuses elements from recent or pending communications, namely CCCA’12, Foundations of Science, Toulouse Space KM conf. 2012]
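The [AL91] definition quoted above transcribes almost line by line into code (our sketch; the class and names are ours): morphisms carry explicit source and target, every object has an identity, and composition f ° g is defined only when dom(f) = cod(g).

```python
# A minimal sketch of a category's data: morphisms with dom/cod, an
# identity per object, and partial composition guarded by the
# dom(f) == cod(g) side condition from the quoted definition.
class Morphism:
    def __init__(self, name, dom, cod):
        self.name, self.dom, self.cod = name, dom, cod

    def compose(self, g):
        """Return self ° g, defined only when dom(self) == cod(g)."""
        if self.dom != g.cod:
            raise ValueError("dom(f) must equal cod(g)")
        return Morphism(f"{self.name}°{g.name}", g.dom, self.cod)

def identity(obj):
    """The identity morphism id_b with dom(id_b) = cod(id_b) = b."""
    return Morphism(f"id_{obj}", obj, obj)

f = Morphism("f", "a", "b")   # f : a -> b
g = Morphism("g", "b", "c")   # g : b -> c
h = g.compose(f)              # g ° f : a -> c
print(h.dom, h.cod)           # a c
```

The orientation the slide emphasizes for (meta)data is exactly this dom/cod pair: a morphism is unusable without knowing which domain it leaves and which target it reaches.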

  47. I4OpenResearch What fields of research tell: 1.2: Linguistics/Formal languages Related to Category theory, with objects as strings over some alphabet. Turing added the concept of oracle, which we link to natural vs. formal languages. Chomsky’s universal grammar (meta)model (which worked practically for formal languages) follows: Gc ≡ (VN, VT, Pr, St) In this definition VN represents “variables”, VT “terminals” – corresponding to words for objects – Pr “productions” and St the “Start symbol”, belonging to VN. Both VN and VT are finite subsets of V*, itself defined as “the set of all sentences (or strings) composed of symbols of V, including the empty sentence”, hence countably infinite, V being some alphabet. Pr Productions are transitions or processes denoted a → b (Morphisms). St marks the Input, and the process (algorithm) eventually ends with an Output or decision or target if ‘Accepted’, whether Authorized by human halting or Automated if predictable. There are four levels of formal languages, or equivalently of decision problems, of which the least powerful is the ‘regular’ or type-3 or finite-automaton (fa) language, embedded in the type-2 ‘context-free language’ (cfl) type, itself less powerful than and embedded in the broader type-1 ‘context-sensitive languages’ (csl), or recursive (all algorithms), itself within the most general (without oracles) recursively enumerable (re) type 0. We propose that these (meta)(data)models derive from the next FCP universal model. FCP Metamodel Exhibit 1 – Teachings from related fields /2
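The grammar tuple Gc ≡ (VN, VT, Pr, St) can be sketched with a classic type-2 (context-free) example, S → aSb | ε, generating aⁿbⁿ. The sample grammar and the `derive` helper are illustrative assumptions, not taken from the slides.

```python
# Toy instance of Chomsky's tuple G = (VN, VT, Pr, St):
# the context-free grammar S -> aSb | epsilon, generating a^n b^n.
VN = {"S"}                          # variables
VT = {"a", "b"}                     # terminals
Pr = {"S": [["a", "S", "b"], []]}   # productions (second alternative is epsilon)
St = "S"                            # start symbol

def derive(depth):
    """Apply S -> aSb `depth` times starting from St, then S -> epsilon."""
    sentential = [St]
    for _ in range(depth):
        i = sentential.index("S")
        sentential[i:i + 1] = Pr["S"][0]   # one production step a -> b
    i = sentential.index("S")
    sentential[i:i + 1] = Pr["S"][1]       # terminate with the epsilon production
    return "".join(sentential)

print(derive(3))  # aaabbb
```

Each production step is exactly a transition a → b in the slide's sense: the derivation is an oriented path from the Start symbol to an ‘Accepted’ terminal string.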

  48. I4OpenResearch What fields of research tell: 1.3: Turing machines to semantics • From the poorest level (the previous finite automaton) to Computers/Computational Complexity Turing Machines TM (types 1 to 3) bear their own/predictable/de(finite) targets F: Gfa ≡ (K, Σ, δ, q0, F) (linear, finite automaton, linked to F within K). K and Σ for TM (computers) wholly relate to Chomsky’s grammar models; besides, the equivalences between Church’s, Turing’s and Chomsky’s models were demonstrated. We encompass the oriented (q0, F) within general time A) The target domain for F is all the more limited for fa and, on the contrary, all the more (meta) powerful when there is a change of cardinality or, in Computational Complexity, of complexity class (see the particularly interesting case of the P to NP to IP to MIP transitions on the next pages). B) From most powerful computing to semantic space • Semantic(b) spaces (or rather semantic-to-syntactic) link languages to meaning • Meaning is the Author level nurturing the semantic (contextual) space • The Authority space is therefore the utmost meta-level, yet following somewhat Kojève’s(c) FCP definition of authority, which doesn’t require force (typical of scientific authority) • Turing/CC oracle definitions/theorems bring useful measurement for semantics (A) FCP Metamodel Exhibit 1 – Teachings from related fields /3 b C. Mouton, Induction du sens de mots à partir de multiples espaces sémantiques (Inducing word senses from multiple semantic spaces), Recital 2009 c A. Kojève, La notion d’autorité (The Notion of Authority), 1942
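The finite-automaton tuple Gfa ≡ (K, Σ, δ, q0, F) can be sketched directly; the particular machine below (accepting binary strings with an even number of 1s) is an illustrative assumption, not from the source.

```python
# Sketch of the tuple (K, Sigma, delta, q0, F) for a deterministic
# finite automaton: the least powerful (type-3) class from the slide.
# The example machine accepts binary strings with an even number of 1s.
K = {"even", "odd"}              # states
Sigma = {"0", "1"}               # alphabet
delta = {                        # transition function delta: K x Sigma -> K
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}
q0 = "even"                      # start state (the Input end of the orientation)
F = {"even"}                     # accepting states (the target F within K)

def accepts(word):
    """Run the automaton; 'Accepted' iff the final state lies in F."""
    q = q0
    for symbol in word:
        q = delta[(q, symbol)]
    return q in F

print(accepts("1001"), accepts("10"))
```

The orientation (q0, F) the slide emphasizes is explicit here: every run is a path from the start state toward (or away from) the finite target set F inside K.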

  49. I4OpenResearch What fields of research tell: 1.4: Foundations of Physics There are several types of bridges between the so-called formal and physical domains. However, the most basic physical data model is time-space as a (de)formed field, whether continuous (Einstein’s GR proposition) or a finite discrete quanta lattice: G ≡ (t, θ, φ, r) or G ≡ (t, θ, Δ, r) These are spherical coordinates, but in the second definition we replace the second angle by change. Observe that the first definition has two angular-type dimensions and only one spatial. The second definition better exhibits the dynamics of perturbative theory and contributes to bridging the gap between physics and the grammar’s Pr Productions/Transitions/Morphisms a → b, a typical transition representation, from Feynman graphs to nuclear and chemical reactions. In both cases the angular θ fits with the formal content, whether for instance as curvature for gravitational data or other typical angles of physical to chemical properties. We also generalize this through a representation of oracles easy to derive from classical typewriting. Now the most interesting case is the extended definition of time, related to authority as complexity level: the experimenter asks questions and expects answers from the measurement apparatus through reducing projections (Penrose). The time-oriented Δ measures the complexity difference as well as the related number of steps. We propose that this (meta)data model is a derivation of the FCP universal model, which may relate (cf. AIP # 1446, 2012) the NEXP power of entangled particles, as compared to separated ones, to a general mechanism. FCP Metamodel Exhibit 1 – Teachings from related fields /4

  50. I4OpenResearch FCP Metamodel Exhibit 1 – First conclusions and applications /6 What fields of research tell: Theory about research trajectories We derive some conclusions from computational complexity results, particularly from the applicability of the Practically Predictive Data class P – recursive (reusable/predictive) as effective algorithms (polynomial time) – to the Knowledge-‘hard’ NP (Non-deterministic) class, particularly NP-complete, which was pushed to its limits through the IP and MIP classes. - The NP class (Non-deterministic Polynomial time), defined as a solution brought ‘at once’ then verified in polynomial time, was split into Prover vs. Verifier. - IP stands for Interactive Protocol between all-powerful Provers (human, i.e. (semantic) oracles) and a polynomially bounded Verifier: it is interesting to see how this applies through the process presented on slide 3. - MIP is the Multi-Prover Interactive Protocol, shown to have NEXP power level. [FOR92] concludes about the demonstrated difference of IP versus MIP: • "this model differs from one-prover interactive protocol in that the oracle must be set ahead of time while in an Interactive Protocol the prover may let his future answers depend on previous ones” • We conclude conversely that MIP indeed has superior predictive power; see slide 3.
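The Prover/Verifier split described above can be sketched with a standard NP-style toy: an unbounded Prover searches for a certificate "at once", and the bounded Verifier checks it in polynomial time. SUBSET-SUM stands in for the NP-complete problem; this is an illustrative assumption, not the Discinnet protocol or the [FOR92] construction itself.

```python
# Toy Prover/Verifier split for an NP problem (SUBSET-SUM):
# the Prover does an unbounded (exponential) search for a certificate,
# the Verifier checks the certificate in polynomial time.
from itertools import combinations

def prover(nums, target):
    """Unbounded search: return a subset summing to target, or None."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return subset
    return None

def verifier(nums, target, certificate):
    """Polynomial-time check of the Prover's claimed certificate."""
    return (certificate is not None
            and all(x in nums for x in certificate)
            and sum(certificate) == target)

cert = prover([3, 7, 12, 5], 15)
print(verifier([3, 7, 12, 5], 15, cert))
```

In IP the Verifier may interleave questions with the Prover's answers; in MIP, as the [FOR92] quotation notes, the multi-prover oracle must be fixed ahead of time — the toy above shows only the basic one-shot NP split on which those refinements build.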
