
Vertical Integration Across Biological Scales


Presentation Transcript


  1. Vertical Integration Across Biological Scales
  A New Framework for the Systematic Integration of Models in Systems Biology
  University College London CoMPLEX DTI Beacon Project

  2. Talk Outline
  • The need for a new computational framework in systems biology
  • Main features of our suggested framework
  • An example: Neta’s sleeping and feeding patterns
  • The framework proposal in detail
  • Next steps
  (Items 3 and 4 only if time permits)

  3. Status of Biological Modelling
  • Cornucopia of models
  • Different paradigms
  • Different assumptions
  • Based on different experimental evidence
  • Created in ‘foreign’ languages
  • Documented without standardisation
  • “Manual integration”, often at the code level
  • No repository of models

  4. Two Main Challenges
  • The processing challenge
    • Model integration
    • Tool support
    • Visualization
    • Automatic model generation
  • The information management challenge
    • Cataloguing existing models
    • Linking to experimental results and previous modelling results

  5. Requirements Survey
  • Diversity of biological processes to be modelled
  • Biological organization
  • Different modelling schemes
    • ODEs, stochastic, process algebra
    • Many different tools available for each scheme
  • An iterative modelling process
    • Model verification: one should be able to use the model to make predictions, which are later tested by further experiments
    • The model is then modified in light of the new experimental results

  6. Requirements Survey – Conclusion
  The main features of the new modelling approach should be:
  • Modularity – probably through componentization
    • Modularity also supports gradual, piecewise development
  • Heterogeneity – integration of models created in different schemes
  • Support for meaningful integration
    • Care for semantics (probably through the use of ontologies)
    • Representation and observance of assumptions and constraints

  7. Suggested Component-Based Middleware Framework
  • Consists of a component middleware and a set of supporting services
    • Context/interpretation repository, model repository
  • Models are executed in their original language, using their native environment, in a distributed manner
  • Instantiated models are exposed as components with well-defined interfaces
  • A new language enables one to create composite models
  • An orchestrator executes a composite model by calling the different engines to execute the submodels it is composed of and integrating the computational results
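
To make the “components with well-defined interfaces” idea concrete, here is a minimal sketch in Python (illustrative only; the framework itself is language-neutral and none of these names come from the project) of the standard face an instantiated model might show to the rest of the middleware:

    from abc import ABC, abstractmethod

    class ModelComponent(ABC):
        """Standard interface of an instantiated model, whatever engine runs it natively."""

        @abstractmethod
        def provided_interfaces(self):
            """Names of the quantities this instance can compute, given the context it was created with."""

        @abstractmethod
        def evaluate(self, interface, inputs):
            """Ask the native engine to compute one provided interface; inputs and outputs in the standard format."""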

  8. Proposed Middleware Architecture (diagram): an Orchestrator and a Compounder communicate, through engine wrappers, with the native engines running the submodels – C++ (running model A), Xppaut (running model B) and Mathematica (running model C) – supported by a Context Service and an Interpretation Service.

  9. Neta’s Sleeping and Feeding Patterns

  10. Neta’s Sleeping and Feeding Patterns – A Model
  Assume:
  • x(t) – Neta’s sleepiness at any given time (1 – very alert, 10 – very sleepy)
  • y(t) – Neta’s hunger level at any given time (1 – bottle rejected in disgust, 10 – has been screaming for a bottle for the last 10 minutes)
  • z(t) – Neta’s interest level in the activity currently provided to her (1 – not interested, up to 10 – very interested)
  • a and b are parameters of the model: a is Neta’s daily nutritional requirement and b is Neta’s daily sleep requirement.
  Then:
  dx/dt = b*x(t) - y(t) - z(t)
  dy/dt = -x(t) + a*y(t) + z(t)
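
As an aside (not part of the original slides): the toy model above can be handed to any ODE engine; a minimal Python/SciPy sketch, with made-up parameter values, initial conditions and interest-level curve z(t), might look like this:

    import numpy as np
    from scipy.integrate import solve_ivp

    a, b = 0.1, 0.2                              # assumed daily nutrition / sleep parameters
    z = lambda t: 5.0 + 4.0 * np.sin(t / 3.0)    # assumed interest-level input z(t)

    def neta(t, state):
        x, y = state                             # sleepiness, hunger
        return [b * x - y - z(t),                # dx/dt
                -x + a * y + z(t)]               # dy/dt

    sol = solve_ivp(neta, (0.0, 24.0), [3.0, 5.0])   # one day, assumed initial state
    print(sol.y[:, -1])                          # sleepiness and hunger at the end of the day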

  11. One Model, Different Interfaces
  • A single model can support a diverse range of functionality, or interfaces
  • Each possible interface has its own required context:
    “If you provide me with …, I will be able to tell you ….”
  • Thus each model’s metadata should specify a context/provided-interface table
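
A hypothetical sketch of such metadata (the rows below are made up for illustration and are not the actual table on the next slide): each entry maps a context – the quantities a caller must supply – to the interfaces the instance can then provide:

    # context (what the caller must supply) -> interfaces the instance can then provide
    CONTEXT_INTERFACE_TABLE = {
        frozenset({"a", "b", "z(t) track"}): ["x(t)", "y(t)"],
        frozenset({"a", "b"}):               ["steady-state x", "steady-state y"],
    }

    def provided_interfaces(supplied):
        """Return the interfaces available for a given set of supplied quantities."""
        return CONTEXT_INTERFACE_TABLE.get(frozenset(supplied), [])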

  12. The context/provided interface table for Neta’s model:

  13. Proprietary and Standard Engines
  • Models are interpreted using engines; the engine actually implements the interfaces (functionality) according to the model specification
  • An engine can be seen as a model instance factory
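
A sketch of the factory idea (class and method names are assumptions for illustration, not the framework’s actual API): the engine takes a model specification plus a context and returns an instance that supports exactly the interfaces that context makes computable:

    class Engine:
        """A native engine (e.g. Xppaut or Mathematica) viewed as a model instance factory."""

        def __init__(self, name):
            self.name = name

        def instantiate(self, model_spec, context):
            provided = model_spec["context_table"][frozenset(context)]
            return ModelInstance(self, model_spec, context, provided)

    class ModelInstance:
        """One instantiated model: a fixed context plus the interfaces it supports."""

        def __init__(self, engine, model_spec, context, provided):
            self.engine = engine
            self.model_spec = model_spec
            self.context = context
            self.provided = provided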

  14. Engine – A Model Instance Factory (diagram): given Context 1 (a, b, and a polytrack for z(t)), the model instance factory for O. Margo’s Neta model creates model instance ‘yellow’, which supports the ‘yellow’ set of methods.

  15. Engine – A Model Instance Factory (diagram): given Context 2 (a, b only), the same factory creates model instance ‘green’, which supports the ‘green’ set of methods.

  16. Proprietary & Standard Engines • How does one exploit a model’s functionality - How does one communicate with the model? • Usually this is done in a proprietary, specific way, provided and specified by the engine on which the model is interpreted. • But we would like this communication, or model usage, to be performed in a standard way, in order to enable model integration • The solution: Use wrappers. • Wrappers also expose the Engine’s Model Instance Factory functionality in a standard way • We have already written such wrappers for Xppaut and Mathematica.

  17. Standard Engines and Engine Wrappers (diagram): the proprietary engine exposes something like Very_weird_call1 (input and output in proprietary format); the wrapper surrounds it and offers the outside world Standard_call1 (input and output in standard format), so that the two together behave as a standard engine.

  18. The Orchestrator
  • The orchestrator is an engine that can interpret composite models
  • It is mainly a coordinator, or a workflow execution service
  • It may run models either sequentially or concurrently
  • It may use a compounder to solve concurrently two or more models which form a feedback loop
  • Composite models are specified in the composite models language – BeCMolla (Beacon Composite Models Language)
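
A sketch of the coordination logic (the composite-model representation used here is an assumption for illustration, not actual BeCMolla): the orchestrator walks the composite model, runs stand-alone submodels through their wrappers, and hands mutually dependent submodels to the compounder:

    def orchestrate(composite_model, wrappers, compounder):
        results = {}
        for step in composite_model["steps"]:
            if step["kind"] == "single":        # one submodel, run on its own engine
                wrapper = wrappers[step["model"]]
                results[step["model"]] = wrapper.standard_call(step["inputs"](results))
            else:                               # feedback loop: solve the coupled submodels together
                coupled = [wrappers[name] for name in step["models"]]
                results.update(compounder.solve(coupled, step["inputs"](results)))
        return results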

  19. A Simple Example (1)
  Assume we also have:
  • Liat’s Neta model, depicting Neta’s interest level in the current activity:
    dz/dt = C(i) - k*(a*x(t-t0) + y(t-t0))
  • The NHS standard feeding calculator: when provided with Neta’s age and weight, it gives Neta’s daily feeding requirement (a) and hours of sleep required (b)

  20. A Simple Example (2)
  • We can now create a composite model to predict Neta’s feeding, sleeping, and interest level patterns
  • This composite model would provide the following functionality:

  21. A Simple Example (3)
  This model is composed, and will be solved, as follows: the NHS model is run first, and then Ofer’s and Liat’s models are solved together using a compounder (diagram: the NHS model feeding its parameters into the coupled Ofer/Liat pair).
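
To illustrate what the compounder has to do in this example (constants and initial values are made up, and the delay t0 is simply set to zero to keep the sketch short): Ofer’s and Liat’s models each need the other’s output, so they are merged into a single state vector and integrated together, with a and b taken from the NHS model’s earlier output:

    from scipy.integrate import solve_ivp

    a, b = 0.1, 0.2          # produced by the NHS model in the first step (assumed values)
    C_i, k = 6.0, 0.05       # Liat's model constants (assumed), with t0 = 0

    def compounded(t, state):
        x, y, z = state                      # Ofer's x, y and Liat's z in one state vector
        return [b * x - y - z,               # dx/dt  (Ofer)
                -x + a * y + z,              # dy/dt  (Ofer)
                C_i - k * (a * x + y)]       # dz/dt  (Liat, delay ignored)

    sol = solve_ivp(compounded, (0.0, 24.0), [3.0, 5.0, 5.0])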

  22. A Simple Example (4) (diagram): the middleware architecture of slide 8 again, now executing the composite model – the Orchestrator and Compounder call, through engine wrappers, the native engines (C++ running model A, Xppaut running model B, Mathematica running model C), with the Context and Interpretation Services supplying the required contexts.

  23. Where Now?
  • We need to define the ‘model components’ interface language
  • Select the underlying middleware architecture we would like to use:
    • Web Services?
    • Web Services + (OGSA | BPEL | XLANG | W3C Choreography)?
    • Java objects?
    • CCA?
    • SBW?
  • Devise the numerical algorithms which can be used to integrate models together
  • Actually build a framework (an implementation)
