Collaborative Design & Data Exchange Leveraging SOA

Presentation Transcript


  1. Collaborative Design & Data Exchange Leveraging SOA Henri van den Bulk PDE2009

  2. Agenda • Vision • Current Environment • Role of SOA • Conceptual Architecture • Data Exchange Framework

  3. How TIBCO Helps You Integrate Your Assets • Gives people the ability to observe and understand operations and opportunities: Business Activity Monitoring, Complex Event Processing, Portal, Rich Client • Connects applications and assets across your extended enterprise: Application Integration, B2B Integration, Data Integration, Mainframe Integration • Enables the coordinated and adaptable execution of activities and transactions: Modeling, Execution, Analytics • Provides the foundation for manageable service-oriented and event-driven architecture: Messaging, Monitoring and Management, Service Deployment Platform • Standards-based common environment, secure and scalable

  4. Vision Provide a consistent interface, based on industry accepted standards, for delivering acceptance data and interacting with program data that resides within the environment.

  5. Background • Organizations are increasingly focused on the role of systems integrator, relying on partners and suppliers to provide significant portions of the product designs • Partners are required to provide data packages based on their contracts • Such as CAD models, products, parts, requirements, etc. • Based on partner-conducted studies, a general recommendation emerged to leverage industry standards for data delivery and integration, specifically ISO 10303 (STEP).

  6. Problem Statement • Delivery of data packages • Need for a system interface for delivery of data • Delivery of large data sets, e.g. CAD • Tracking of data deliveries • Verification of data delivery • Approval routing of acceptance data and partner notification • Dissemination of data • Collaboration / Interoperability • Ability to work in a geographically distributed environment with different stakeholders • Interoperability between systems • Provide access to program data

  7. Current Environment • Design data resides in different sources inside and outside of the environment • Data is delivered manually, either as a file or by manual upload into different systems • No standard external interfaces

  8. Integrated Collaborative Environment A web-centric environment used by industry, academia and government for sharing, collaborating on, integrating, accessing and controlling the management information and product data that define all of the products. • Single source of access to program data • Integrated systems that support the environment • Accessibility for all stakeholders

  9. Major Components • Product Life Cycle Management – enables access to all sources of authoritative program data resident in multiple data sources • Programmatic Management – management of all programmatic data (cost, schedule, performance, risk), as well as a single view into information that allows users to develop earned value metrics and risk mitigation plans • Project Collaboration – application for sharing and teaming among all ICE constituents that provides access control, discussion forums, resource management and project reporting • Process Automation (Workflow) – automated process management ensuring consistency across programs • Visualization – web-centric capability providing 2D and 3D visual collaboration, mock-up, prototyping, review and study as part of the product development process

  10. Dx Objectives • Establish the data exchange standards to be used for data delivery and interaction with data • Provide services to allow for delivery of data in automated processes that support large data file delivery • Enable collaboration between partners • Provide process management to support variations in acceptance processes • Establish data dissemination framework • Provide security across the capabilities

  11. Role of SOA in this domain • Focus on wrap and re-use of existing capabilities in the infrastructure • Enable Business Process Management (BPM) across the Product Life Cycle • Uniform access to PLM data • Dissemination of key Business Events • Support business and process change, create organizational agility

  12. PLM Architecture Components for SOA • Product Life Cycle Management: Engineering Change Management, Product Data Management, Simulation Data Management, Manufacturing Data Management, Service Data Management • Business Process Management (BPM) & Complex Event Processing (CEP) • Data Integration and Master Data Management (MDM) • Application Integration

  13. Architecture Components • Business Process Management (BPM) – Management of processes across the lifecycle, capabilities and layers of design • Complex Event Processing (CEP) – Provides the capability to correlate events from different sources and stages of the PLM process. For example, anomalies occurring in different areas of the process, taken together, might indicate a design flaw or defect (a correlation sketch follows below) • Data Integration and Master Data Management (MDM) – Manage Products, Parts and other PLM objects • Application Integration – Provides connectivity to the underlying systems (PDM, Requirements Management, etc.)
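
The CEP bullet above can be made concrete with a small correlation rule. The sketch below is plain Java with hypothetical event and class names (no specific CEP engine is implied): it flags a possible design flaw when anomalies for the same part arrive from two or more distinct process stages within a time window.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

/** Hypothetical anomaly event raised by one stage of the PLM process. */
record AnomalyEvent(String partNumber, String stage, long timestampMillis) {}

/**
 * Minimal correlation sketch: anomalies for the same part from two or more
 * distinct stages (e.g. simulation and manufacturing) inside the window are
 * flagged as a possible design flaw.
 */
class AnomalyCorrelator {
    private static final long WINDOW_MILLIS = 24 * 60 * 60 * 1000L; // 24-hour window, purely illustrative
    private final Map<String, List<AnomalyEvent>> historyByPart = new ConcurrentHashMap<>();

    void onEvent(AnomalyEvent event) {
        List<AnomalyEvent> history =
                historyByPart.computeIfAbsent(event.partNumber(), k -> new ArrayList<>());
        history.add(event);
        // Drop anomalies that have fallen out of the correlation window.
        history.removeIf(e -> event.timestampMillis() - e.timestampMillis() > WINDOW_MILLIS);

        Set<String> stages = new HashSet<>();
        for (AnomalyEvent e : history) {
            stages.add(e.stage());
        }
        if (stages.size() >= 2) {
            System.out.println("Possible design flaw on part " + event.partNumber()
                    + ": correlated anomalies from stages " + stages);
        }
    }
}
```

In a real deployment the correlator would subscribe to anomaly topics on the bus rather than being called directly.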

  14. Service Model and Properties [Diagram: a Service sits between Consumer and Provider; its properties are Standardized, Abstracted, Composable, Modular, Virtualized and Loosely Coupled.]

  15. Principles of SOA • Standardization is something that is just not talked about by many people. Forget reuse, this is a programmer's view of the world. Standardization is a business matter and relates to how an enterprise manages its business processes, its data and its application portfolio. Standardization is required because the business demands consistency, not because IT desires reuse. • Abstraction is the most powerful tool in the agility toolbox. A small amount of work to generalize a service specification can allow the service to support many different contexts; for example, product, channel or geographic data and rules may be abstracted to allow a common capability to be used in a consistent manner across an enterprise or ecosystem, or to support future business change with minimal or no effort (a small sketch follows below). • Composability is again a hugely powerful technique that takes advantage of the fractal nature of SOA, allowing hierarchies or assemblies to be constructed from more common, standardized services at the lower layers that become increasingly specialized at the higher layers. • Modularity is a concept that can be implemented at many levels in an SOA. Relative dependency and modularity should be determined in the business model and applied to the business processes, services and components. In the early stages an architect should be looking to reduce dependency so that the horizon of change can be predictable, measured and minimized. But as the portfolio becomes more widely based on service interfaces that make the underlying applications more transparent, there will be many opportunities to componentize at all levels of the architecture, with considerable benefits of increased agility and reduced cost. • Virtualization is an important part of the SOA. The basic service concept, with or without web services, provides a high level of transparency of the underlying resources, provided the loose coupling has been properly implemented and no design or platform dependencies are established by the service consumer or provider. The virtualization then gives the provider and consumer the opportunity to act independently and to have different life and upgrade cycles, with consequent increased agility and responsiveness to change.
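
To make the abstraction principle tangible, the hedged Java sketch below shows one generalized reference-data lookup covering several contexts (product, geography, channel) instead of a separate hard-coded service per context. All names are illustrative assumptions, not an actual project API.

```java
/**
 * Abstraction sketch: a single generalized lookup service supports multiple
 * business contexts, so new contexts can be added without new interfaces.
 * Interface and class names are hypothetical.
 */
interface ReferenceDataService {
    /** Returns the rule or reference value for the given context and key. */
    String lookup(String context, String key);
}

class InMemoryReferenceDataService implements ReferenceDataService {
    @Override
    public String lookup(String context, String key) {
        // A real provider would consult an MDM or rules repository; a switch stands in here.
        return switch (context) {
            case "product"   -> "rule for product " + key;
            case "geography" -> "rule for region " + key;
            case "channel"   -> "rule for channel " + key;
            default          -> "default rule for " + key;
        };
    }
}
```

Because consumers depend only on the generalized contract, adding a new context is a provider-side change with no impact on existing callers.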

  16. SOA Challenge [Diagram contrasting an SOA application with a .NET/J2EE application across a Provision / Customer / Order / FF / Manufacture flow.]

  17. Leverage ESB [Diagram: service consumers (Web 2.0 composite and AJAX rich internet applications, core business processes) reach virtualized services over an Enterprise Service Bus, under governance (service registry & policy management); developers/producers expose ecosystem systems such as Program / Project Management, Risk Management, Requirements Management, PDM, EVM and partner systems.]

  18. ESB Capabilities • Ubiquitous Mediation Layer • Separation between Business Logic and Transport / Technology • Transport Bridging • Eventing • Content-Based Routing (a sketch follows below) • Service Virtualization • Service Scaling, Horizontal / Vertical • Policy Management • Ensures loose coupling of services • Messaging Based • Orchestration and Choreography
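
Content-based routing is the ESB capability that inspects a message and picks its destination, keeping producers decoupled from concrete endpoints. The sketch below is product-neutral plain Java; the message fields, rule shapes and endpoint names are assumptions for illustration only.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Predicate;

/** Hypothetical bus message: headers used for routing plus an opaque payload. */
record BusMessage(Map<String, String> headers, String payload) {}

/**
 * Content-based router sketch: each rule pairs a predicate over the message with
 * the logical endpoint that should receive it; the first match wins, and
 * unmatched messages fall through to a dead-letter endpoint.
 */
class ContentBasedRouter {
    private final Map<Predicate<BusMessage>, String> routes = new LinkedHashMap<>();

    void addRoute(Predicate<BusMessage> matches, String endpoint) {
        routes.put(matches, endpoint);
    }

    String route(BusMessage message) {
        for (Map.Entry<Predicate<BusMessage>, String> rule : routes.entrySet()) {
            if (rule.getKey().test(message)) {
                return rule.getValue();
            }
        }
        return "dead-letter";
    }

    public static void main(String[] args) {
        ContentBasedRouter router = new ContentBasedRouter();
        // Route large CAD deliveries to the managed file transfer channel, everything else to PDM import.
        router.addRoute(m -> "CAD".equals(m.headers().get("dataType")), "mft.cad.delivery");
        router.addRoute(m -> true, "pdm.import.service");

        BusMessage cad = new BusMessage(Map.of("dataType", "CAD"), "<payload reference>");
        System.out.println(router.route(cad)); // prints mft.cad.delivery
    }
}
```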

  19. Architecture Views The SOA architecture is based on industry best practices and leverages the OASIS Reference Model for SOA. The model describes different architecture views: • Business view – high-level business requirements and utilization • Infrastructure view – concepts from a system infrastructure perspective • Metadata view – concepts for metadata

  20. Business View (SOA Reference Model) • Business Function: a service executes (encompasses) a useful business function, e.g. Close Risk • Business Event: an incident occurring in the business environment which warrants some action from the business, e.g. Baseline of Requirements

  21. Infrastructure View

  22. Metadata View

  23. Possible Data Exchange Mechanisms • Manual – Either provide a data package by file or upload the package into a tool manually. For example, taking a package and putting it into the PDM system • File Transfer – Delivery of a package in a tool-specific format using standard file transfer mechanisms. The format is then used for importing / exporting data. The format can carry structured or unstructured data, but is specific to that tool • Service Oriented – Provide standardized interfaces that can accept and route information independent of format (an interface sketch follows below)
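
As a rough illustration of the service-oriented option, the Java contract below accepts a delivery together with the metadata needed to validate and route it, independent of the payload format. The operation and type names are hypothetical, not the project's published interface.

```java
/** Metadata that travels with every delivery so it can be validated, tracked and routed. */
record DeliveryManifest(String partner, String contractId, String dataType, String format) {}

/** Illustrative, format-independent delivery contract (names are assumptions). */
interface DataDeliveryService {

    /** Registers a delivery and returns a tracking identifier for later status queries. */
    String submitDelivery(DeliveryManifest manifest, byte[] payload);

    /** Reports the current state of a delivery: received, validated, approved, rejected, and so on. */
    String getDeliveryStatus(String trackingId);
}
```

In practice such a contract would be exposed as a service behind the ESB, with large CAD payloads handed off to a managed file transfer channel rather than passed inline.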

  24. Conceptual Architecture [Diagram: partners deliver data either asynchronously (mass data exchange through a partner data delivery UI, a file-based drop box, and data validation and verification with review) or synchronously (service-based partner systems / UI calling PLM / PLCS services), over a connectivity and interoperability layer (ESB/MFT) that links the data exchange services, a data distribution system, and back-end systems such as PDM, requirements management, program / portfolio management and others.]

  25. Data Exchange Framework The framework separates the exchange into distinct components: Conversation, Partner Agreement, Message Envelope, Security, and Transport Protocol

  26. Components of a Data Exchange • Data Exchange Agreement - This is a specific agreement between partners. It refers to the particular conversation, message structure, transport protocol, and security attributes that the partners choose for their communications. Depending on the business protocol, this may also include technical details such as certificate files and the URL for HTTPS transport (a configuration sketch follows below) • Conversation - This includes certain communication options. Depending on the business protocol, these may include transaction types such as notify and synchronous or asynchronous request-response, as well as options such as time-outs, retries, and exception handling • Message Envelope - Depending on the business protocol, this may include MIME, S/MIME, XML, or OWL. Each business protocol must provide a message envelope to carry the message body. This envelope and message are then wrapped in an envelope provided by the transport protocol • Transport Protocol - Depending on the business protocol, this may include HTTP, HTTPS, (S)FTP, or SMTP • Security - Defines the policies that govern the previously mentioned areas of a data exchange. Depending on the business protocol, this may include authentication, access control, non-repudiation, and encryption
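
One way to see how these components fit together is as a single configuration object that both partners sign off on. The Java sketch below is only illustrative; the record names, fields and example values are assumptions rather than a defined schema.

```java
import java.net.URI;
import java.time.Duration;

/** Conversation options agreed between partners (illustrative subset). */
record Conversation(String pattern,        // e.g. "notify", "sync-request-response", "async-request-response"
                    Duration timeout,
                    int maxRetries) {}

/** Envelope and transport choices for the exchange. */
record Channel(String envelope,            // e.g. "S/MIME", "XML"
               String transportProtocol,   // e.g. "HTTPS", "SFTP"
               URI endpoint) {}

/** Security attributes governing the exchange. */
record SecurityPolicy(String authentication, boolean nonRepudiation, boolean encrypted) {}

/** The data exchange agreement binds the other components to a specific pair of partners. */
record DataExchangeAgreement(String producer,
                             String consumer,
                             Conversation conversation,
                             Channel channel,
                             SecurityPolicy security) {}
```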

  27. Common Data Model (payload of envelope) • The data being delivered and its metadata need to be consistent so that validation and dissemination can be performed • Data is contained as the payload of the envelope; each message carries metadata that describes the delivery (an envelope sketch follows below) • The CDM needs to be based on standards to ensure interoperability
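
A minimal sketch of such an envelope is shown below: delivery metadata that every message carries, plus the payload expressed in the common data model (for instance a STEP/PLCS DEX file). The field names are assumptions, not a published schema.

```java
import java.time.Instant;
import java.util.List;

/** Illustrative delivery metadata carried by every exchange message. */
record DeliveryMetadata(String deliveryId,
                        String partner,
                        String contractItem,
                        Instant deliveredAt,
                        String schema,      // e.g. "ISO 10303-239 / PLCS DEX"
                        String checksum) {}

/** Envelope pairing the metadata with one or more payload parts in the common data model. */
record ExchangeMessage(DeliveryMetadata metadata, List<byte[]> payloadParts) {}
```

Keeping the metadata separate from the payload lets validation and dissemination services act on a delivery without parsing the potentially very large payload itself.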

  28. Why use Standards? Using industry standards provides the following benefits: • Reduced likelihood of mistakes • Easier to communicate and generate ideas – a large number of people working together function best when they fluently speak a common language (a shared set of standards) • Facilitates data interchange and the representation and management of information • Lower costs and increased efficiency • Higher acceptance among partners

  29. Which Standards Apply? • ISO STEP AP203 Configuration Controlled Design • ISO STEP AP233 Systems Engineering Data Exchange • ISO STEP AP239 Product Life Cycle Support (PLCS) • OASIS PLCS Data Exchange Specifications (DEX), based on the Product Life Cycle Support (PLCS) standard (ISO 10303-239) • Product Lifecycle Management (PLM) Services (OMG standard)

  30. Challenges with Standards • The focus is generic, which can result in a loss of tool-specific capabilities and of the richness of tool-specific functionality • Some parts are left open to interpretation • Flexibility can cause compatibility issues • Who drives the standards? • Keeping standards current with emerging technologies (e.g. OWL) • How do they apply to specific industries?

  31. Challenges Ahead • Determine the delta between the standards and business needs • Consistent Master Data Management • How do you perform approvals via PLM services for data changes? • What have been your findings around establishing exchanges using the standards?
