Service-Oriented Computing: Service Discovery and Composition


Service Discovery

  • UDDI

    • Discovery based on WSDL information

  • WS-Discovery

    • Provide an interface for service discovery

    • Define a multicast discovery protocol

    • Limitations: No service liveness information, limited service description

  • Universal Plug and Play (UPnP)

    • Enables dynamic networking of intelligent appliances, wireless devices, and PCs

    • Not exactly web services, but is another form of standardization for physical systems

WS-Discovery

  • Extends UDDI to make it distributed

  • WS-Discovery multicast message types:

    • Hello: Sent by a Target Service when it joins a network

    • Bye: Sent by a Target Service when it leaves a network

    • Probe: Sent by a Client searching for a Target Service

      • Search by Type and/or Scope

    • Resolve: Sent by a Client searching for a Target Service by name

      • Already know the target service by name, but may not know the communication details

  • Response uni-cast message types:

    • Probe Match: a Target Service matches a “Probe”

    • Resolve Match: a Target Service matches a “Resolve”
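As a concrete sketch of the message types above, a Probe can be built as a SOAP envelope. The namespace URIs and the multicast endpoint (UDP 239.255.255.250:3702) follow the 2005 version of the specification; the service type string here is a made-up example:

```python
import uuid
import xml.etree.ElementTree as ET

# Namespace URIs from the 2005 WS-Discovery spec (assumed version).
SOAP = "http://www.w3.org/2003/05/soap-envelope"
WSA = "http://schemas.xmlsoap.org/ws/2004/08/addressing"
WSD = "http://schemas.xmlsoap.org/ws/2005/04/discovery"

def build_probe(service_type: str) -> bytes:
    """Build a minimal SOAP Probe envelope searching by Type."""
    env = ET.Element(f"{{{SOAP}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP}}}Header")
    ET.SubElement(header, f"{{{WSA}}}Action").text = WSD + "/Probe"
    ET.SubElement(header, f"{{{WSA}}}MessageID").text = "urn:uuid:" + str(uuid.uuid4())
    body = ET.SubElement(env, f"{{{SOAP}}}Body")
    probe = ET.SubElement(body, f"{{{WSD}}}Probe")
    # Search by Type (a Scope element could be added the same way).
    ET.SubElement(probe, f"{{{WSD}}}Types").text = service_type
    return ET.tostring(env)

# A client would send this over UDP multicast to 239.255.255.250:3702
# and then wait for unicast ProbeMatch responses from matching services.
msg = build_probe("printer:PrintService")
```

A matching Target Service answers with a unicast ProbeMatch carrying its endpoint reference and transport addresses.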

UPnP

  • Devices

    • When entering a network, register to the control point

    • When leaving the network, inform the control point

  • Control point

    • When entering the network, search for devices

UPnP

  • Device specification

    • Device name, vendor name, URL for the device, etc.

    • Commands, actions, parameters, etc.

    • Current state information

  • Interactions

    • Control: Control point sends commands/actions (in XML) to activate the device

    • Event:

      • Control point can request device to send updates when some variables (specified in the event) are updated

      • Device can accept the request and respond with an event duration

    • Presentation

      • If the device has a URL, the control point can request and fetch it

      • Some devices can be controlled through the URL interface

UPnP

  • Compared to UDDI

    • Similar, but has an additional interaction feature

    • Lacks semantics; hard to compose the devices together to achieve a client goal

  • Can we wrap devices into high-level services and use the OWL-S technologies to add semantics to devices?

Semantic Web Service Discovery

  • Semantic search template

    • User’s service requirements

  • Semantic web service descriptions

  • A similarity based matching scheme (just an example)

    • The search template is matched against a set of candidate Web services in a registry, and match scores are calculated

      • Overall similarity = weighted average of syntactic similarity and functional similarity (normalized sum of operation similarities)

      • Syntactic similarity = weighted average of name similarity and description similarity

      • Operation similarity = weighted average of syntactic similarity, conceptual similarity, and I/O similarity

      • … (further decompose the similarity definitions)
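The weighted-average decomposition above can be sketched directly. The weights here (0.6/0.4, etc.) are illustrative assumptions, not values from the scheme; each input score is assumed to lie in [0, 1]:

```python
def weighted_avg(scores, weights):
    """Weighted average of similarity scores in [0, 1]."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def syntactic_similarity(name_sim, desc_sim, w_name=0.6, w_desc=0.4):
    # Weighted average of name similarity and description similarity.
    return weighted_avg([name_sim, desc_sim], [w_name, w_desc])

def operation_similarity(syn_sim, concept_sim, io_sim, weights=(0.3, 0.4, 0.3)):
    # Weighted average of syntactic, conceptual, and I/O similarity.
    return weighted_avg([syn_sim, concept_sim, io_sim], list(weights))

def overall_similarity(syn_sim, op_sims, w_syn=0.4, w_func=0.6):
    # Functional similarity = normalized sum (mean) of operation similarities.
    functional = sum(op_sims) / len(op_sims)
    return weighted_avg([syn_sim, functional], [w_syn, w_func])
```

The search template's requirements supply the left-hand inputs; each candidate service in the registry is scored and the candidates are ranked by overall similarity.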

Semantic Web Service Discovery

  • Example search template

Semantic Web Service Discovery

  • Example candidate service

Semantic Web Service Discovery

  • Similarity definitions

[Figure omitted: similarity definitions, illustrating how a desired service name is compared with the names of several candidate services, and how a desired operation is compared with all the operations of a candidate service, to find the best match]

Service Composition

  • How to put services together to achieve the desired goal

    • Same old problem

    • Design patterns have been shown to be effective in software design

    • Many industrial efforts on SOA design patterns

    • What is a pattern

      • "A solution to a problem in a context"?

      • Each pattern describes a problem which occurs over and over again ... and then describes the core of the solution to that problem, in such a way that you can use this solution over and over again

  • Research

    • Pattern based research considers how to specify patterns, i.e., how to specify the problem, solution, effects

    • The Semantic Web service composition community considers AI planning techniques for composition reasoning

AI Planning for Service Composition

  • Planning

    • move(x,y)

      • Pre-condition: clear(x) and clear(y)

      • Effect: on(x,y) and clear(x)

      • Delete effects: on(x,?), clear(y)

    • Always

      • clear(table)

      • Does not conflict with on(x, table)

Initial state: clear(a), clear(c) [block configuration figure omitted]

Goal state: [figure omitted]
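The move(x, y) action above can be given executable STRIPS-style semantics. States are modeled as sets of ground facts; the block configuration (a on b, b and c on the table) is an assumed example, since the original figure is lost:

```python
# Illustrative STRIPS-style semantics for move(x, y).
# "table" is treated as always clear, as the slide notes.

def is_clear(state, b):
    return b == "table" or ("clear", b) in state

def apply_move(state, x, y):
    """Apply move(x, y): put block x on y, if the preconditions hold."""
    assert is_clear(state, x) and is_clear(state, y) and x != y
    new = set(state)
    # Remember x's current support before deleting on(x, ?).
    support = next((f[2] for f in state if f[:2] == ("on", x)), None)
    # Delete effects: on(x, ?) and clear(y).
    new -= {f for f in new if f[:2] == ("on", x)}
    new.discard(("clear", y))
    # Add effects: on(x, y) and clear(x); x's old support becomes clear.
    new |= {("on", x, y), ("clear", x)}
    if support is not None and support != "table":
        new.add(("clear", support))
    return new

# Initial state from the slide: clear(a), clear(c), with an assumed
# configuration: a on b, b and c on the table.
init = {("on", "a", "b"), ("on", "b", "table"), ("on", "c", "table"),
        ("clear", "a"), ("clear", "c")}
after = apply_move(init, "a", "c")
```

A planner searches over such state transitions until a state satisfying the goal facts is reached.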
AI Planning for Service Composition

[Figure omitted: the planner's search tree, expanding successor states step by step from the initial state]

  • Planning is different from other search algorithms, which are generally based on a quantitative measure

  • A planner involves state computation and maintenance

AI Planning for Service Composition

  • Map semantic web to the planning domain

    • Definitions for web services

      • Syntactical definition: I/O parameters

      • Semantic definition: pre-condition and effects

        • Supported in OWL-S and WSMO

    • Map services to actions in planning domain

      • Pre-condition/effects of the services become the pre-condition/effects of the actions

      • I/O definitions are translated to the pre-condition/effects

    • Map the problem to the planning domain

      • Define the goal for the problem

      • Define the initial facts
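The mapping above can be sketched as a small translation step. The input dictionary stands in for a parsed OWL-S/WSMO description, and the `have(...)` facts encoding I/O availability are a modeling assumption, as are the service name and parameters:

```python
from dataclasses import dataclass, field

@dataclass
class Action:
    """A planning-domain action derived from a service description."""
    name: str
    preconditions: set = field(default_factory=set)
    effects: set = field(default_factory=set)

def service_to_action(svc: dict) -> Action:
    """Translate a (hypothetical) parsed service profile into an action:
    inputs become have(param) preconditions, outputs become have(param)
    effects, alongside the declared pre-conditions and effects."""
    pre = set(svc.get("preconditions", []))
    pre |= {f"have({p})" for p in svc.get("inputs", [])}
    eff = set(svc.get("effects", []))
    eff |= {f"have({p})" for p in svc.get("outputs", [])}
    return Action(svc["name"], pre, eff)

# Hypothetical example service.
flight = service_to_action({
    "name": "BookFlight",
    "inputs": ["creditCard", "itinerary"],
    "outputs": ["ticket"],
    "preconditions": ["validCard(creditCard)"],
    "effects": ["booked(itinerary)"],
})
```

The goal and initial facts of the composition problem are then expressed over the same fact vocabulary, and any classical planner can search over the resulting action set.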

AI Planning for Service Composition

  • Limitations of traditional planners for service composition

    • Atomic actions with deterministic effects, only able to generate sequential plans

      • Conditional, iterative planning: Construct a plan with branches, taking all possible nondeterministic effects and contingencies into account

    • Complete knowledge of the world, full observability

      • Conformant planning: Find a plan which works in any initial situation or incomplete knowledge

      • Contingency planning: Consider all possible nondeterministic effects or replan when unexpected situation occurs

AI Planning for Service Composition

  • General limitations for service composition

    • Pre-conditions and effects, initial states, and goals are mostly simple conjunctions of propositions

      • Can real-world web services be easily specified based on these?

      • Has been a core problem in SE for 20+ years!!! Will it work now?

    • Scaling issues

      • There may be thousands of services each with multiple ports

      • Even worse in cyber-physical systems

        • A lot of devices with similar functionalities

      • Solutions

        • Hierarchical search: First use keyword based search to filter out unlikely actions, then use planner to explore the possible actions

        • Service ontology: Categorize services and specify service relations using an ontology

AI Planning for Service Composition

  • General limitations for service composition

    • Assume a static and finite set of actions

      • In SE, a problem can be decomposed and the corresponding components then found

      • Research work trying to handle partial planning with missing actions

    • Interaction with users

      • The planner had better be mixed-initiative

    • Knowledge engineering issues

      • Efficiently and effectively interacting with XML based information

      • This should be the simplest problem among the many issues

QoS in Service Composition

  • QoS (quality of service)

    • Nonfunctional properties to be satisfied

    • E.g., availability, performance, price, reliability

  • Service composition with QoS considerations

    • Find the best service to meet user QoS requirements

    • Need to specify client QoS requirements

      • First, need to define what QoS is (the properties of concern)

    • Need to know the QoS properties of the services

      • For a single service, QoS properties can be measured

      • For a composite service, how to derive the properties of the composed service?

        • For some properties, property aggregation can be very difficult

        • Also, need to understand the interaction behaviors among services

    • Decision making: which services to select?

QoS in Service Composition

  • Need to have

    • A formal process to do this systematically, from QoS specification to negotiation, to finalize the selection

    • An agreement between the involved entities to ensure that the negotiated QoS terms are exercised

       → Service Level Agreement (SLA)

  • WS-Agreement

    • Provides the specification standards for SLA between the client and the service providers

    • Dynamically established and dynamically managed QoS

    • Use QoS ontology for QoS specifications, negotiation, and management

WS-Agreement

[Figure omitted: the WS-Agreement layered architecture]

  • Negotiation Layer

  • Agreement Layer

  • Service Layer

    • Factory: creates the instance

    • Application instance
WS-Agreement

  • Negotiation layer

    • Provides a Web service-based generic negotiation capability

    • Newly added (the original specification had only two layers)

    • Negotiation state transitions [figure omitted]

WS-Agreement

[Figure omitted: structure of an agreement document]

  • Context

    • Agreement initiator, responder, expiration time, etc.

  • Service Terms

    • Identify the specific services to be provided

  • Guarantee Terms

    • The service levels that the parties are agreeing upon

    • Can be used for monitoring

WS-Agreement

[Figure omitted: anatomy of a guarantee term, annotated with a qualifying condition, an OWL ontology, and assessment intervals]

  • A guarantee term has a scope, e.g., an operation of the service

  • A guarantee term may have a collection of service level objectives (SLOs)

    • E.g., responseTime < 2 seconds

  • A guarantee term may have a qualifying condition for the SLOs to hold

    • E.g., numRequests < 100

  • Business values may be associated with each guarantee term; they include importance, confidence, penalty, and reward

    • E.g., penalty of 5 USD

WS-Agreement – Agreement Schema

WS-Agreement lacks a formalism for requirement specification; this can make the negotiation and selection process difficult. SWAPS provides these specifics.

Agreement Reasoning

  • Semantic WS-Agreement Partner Selection (SWAPS)

    • WS-Agreement

    • Temporal Concepts: time.owl

      • OWL version of time

      • Example concepts: seconds, dayOfWeek, ends

    • QoS ontology

      • E.g., Ont-Qos (IBM)

      • E.g., QoS ontology in METEOR-S

      • Example concepts: responseTime, failurePerDay

    • Domain Ontology

      • Represent the domain knowledge

    • Semantics of predicate rules, such as <, =, etc.

  • User defined rules

    • Allow users to customize matchmaking declaratively

Example Domain Rules

  • Consumer:

    • Requirement: Availability is greater than 95%

  • Provider:

    • Mean time to recover (MTTR) = 5 minutes

    • Mean time between failures (MTBF) = 15 hours

  • Domain rule:

    • Availability = MTBF / (MTBF + MTTR)

  • Reasoning

    • Availability of the provider = 99.4%.
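This reasoning step can be checked numerically with the domain rule from the slide (all durations converted to a common unit):

```python
def availability(mtbf_minutes: float, mttr_minutes: float) -> float:
    """Domain rule: availability = MTBF / (MTBF + MTTR)."""
    return mtbf_minutes / (mtbf_minutes + mttr_minutes)

# Provider from the example: MTBF = 15 hours, MTTR = 5 minutes.
a = availability(15 * 60, 5)        # ≈ 0.994, i.e. about 99.4%
satisfies_requirement = a > 0.95    # consumer asked for availability > 95%
```

The reasoner thus derives a property (availability) that the provider never published directly, which is exactly what the domain rule enables.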

Agreement Reasoning

  • An alternative alt1 is a suitable match for alt2 if:

    • (∀Gi) such that Gi ∈ alt1 ∧ requirement(alt1, Gi) ⇒

      (∃Gj) such that Gj ∈ alt2 ∧ capability(alt2, Gj) ∧

      scope(Gi) = scope(Gj) ∧ obligation(Gi) = obligation(Gj) ∧

      satisfies(Gj, Gi)

      • G: a term in the agreement

      • requirement(alt, G): True if G is a requirement of alt

      • capability(alt, G): True if G is a capability of alt

      • scope(G): The scope (the service operation) of G

      • obligation(G): The obligated party of G

      • satisfies(Gj, Gi): True if the SLO of Gj is equivalent to or stronger than the SLO of Gi
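The matching condition above translates almost literally into code. The term representation (dictionaries with `scope`, `obligation`, and a `(metric, bound)` SLO) and the simplified `satisfies` check — a tighter upper bound on the same metric entails a looser one — are illustrative assumptions:

```python
def satisfies(gj, gi):
    """Simplified SLO entailment: a tighter upper bound on the same
    metric satisfies a looser one, e.g. responseTime < 9 satisfies
    responseTime < 14. Real SLO reasoning needs the rule semantics."""
    (metric_j, bound_j), (metric_i, bound_i) = gj["slo"], gi["slo"]
    return metric_j == metric_i and bound_j <= bound_i

def suitable_match(alt1_requirements, alt2_capabilities):
    """alt1 matches alt2 if every requirement term Gi of alt1 is covered
    by some capability term Gj of alt2 with the same scope and obligated
    party, whose SLO satisfies Gi's (the formula above)."""
    return all(
        any(gj["scope"] == gi["scope"] and
            gj["obligation"] == gi["obligation"] and
            satisfies(gj, gi)
            for gj in alt2_capabilities)
        for gi in alt1_requirements)

# Hypothetical terms: the provider promises responseTime < 9 s on op1,
# the consumer requires responseTime < 14 s on op1.
gi = {"scope": "op1", "obligation": "provider", "slo": ("responseTime", 14)}
gj = {"scope": "op1", "obligation": "provider", "slo": ("responseTime", 9)}
```

Note the quantifier structure: the universal over requirement terms becomes `all(...)`, the existential over capability terms becomes `any(...)`.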

Agreement Reasoning

[Figure omitted: example match between requirement and capability guarantee terms, including: responseTime < 14 s (QC: day of week = weekday; penalty: 15 USD) vs. 99% of responseTimes < 14 s; failurePerWeek < 7 (penalty: 10 USD) vs. failurePerWeek < 10; transmitTime < 4 s (QC: maxNumUsers < 1000; penalty: 1 USD); processTime < 5 s (QC: numRequests < 500; penalty: 1 USD); failurePerWeek < 7 (penalty: 2 USD)]
Agreement Reasoning

[Figure omitted: the same match refined with a domain-specific rule, responseTime = processTime + transmitTime, which combines the transmitTime < 4 s and processTime < 5 s terms into responseTime < 9 s (QC: maxNumUsers < 1000, numRequests < 500; penalty: 1 USD); the other terms — responseTime < 14 s (QC: day of week = weekday; penalty: 15 USD) vs. 99% of responseTimes < 14 s; failurePerWeek < 7 (penalty: 10 USD) vs. failurePerWeek < 10; failurePerWeek < 7 (penalty: 2 USD) — are unchanged]
Agreement Reasoning

  • Problem

    • There may be no matching providers

    • Specification can be optimization based

      • E.g., minimize responseTime

       → Choose the provider that yields the best fit

  • How to combine different terms

    • Some non-exact-matching terms, each with a satisfaction level

    • Users define a weighted function to combine them

    • Multi-objective optimization

  • When there are a large number of choices

    • Use some filtering mechanisms first

    • E.g., consider one term first

Agreement Reasoning

  • Problem

    • Only consider selection of a single service

    • Requirements are frequently specified as end-to-end requirements

    • E.g., the response time of the composite service is < 1 sec

      • But the response times are specified only for the atomic services

  • How to aggregate the properties

    • For some QoS aspects, this can be very difficult

  • When there are a large number of choices

    • Each customer required service may have many providers

    • Combination can be extensive

       → Use efficient search algorithms to find the best match

    • Linear programming, genetic algorithms, etc. have been used

QoS Property Aggregation

  • Time

    • Generally is the simplest property to handle

    • Sum for sequential composition, Max for parallel composition, Min for choices

    • Is this accurate?

      • Some works consider multi-tier queuing network for time estimation

    • What about communication cost?

      • Keeping a table of pair-wise communication costs? O(N²) space

      • Where to maintain the cost?

      • Infeasible if each service may be able to compose with many other services

  • Cost

    • Sum for sequential/parallel composition, Min for choices
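The aggregation rules for time and cost can be applied recursively over a workflow tree. The tuple-based workflow encoding (`seq`/`par`/`choice` nodes over service leaves) and the example QoS values are illustrative assumptions:

```python
# Aggregate a QoS property over a workflow tree whose leaves are
# services with known QoS and whose internal nodes are seq/par/choice.

def aggregate(node, prop):
    kind = node[0]
    if kind == "svc":
        return node[1][prop]
    children = [aggregate(child, prop) for child in node[1:]]
    if prop == "time":
        # Sum for sequential, max for parallel, min for choices.
        return {"seq": sum, "par": max, "choice": min}[kind](children)
    if prop == "cost":
        # Sum for sequential/parallel composition, min for choices.
        return min(children) if kind == "choice" else sum(children)
    raise ValueError(f"no aggregation rule for {prop!r}")

# Hypothetical workflow: one service followed by two in parallel.
wf = ("seq",
      ("svc", {"time": 2, "cost": 1}),
      ("par",
       ("svc", {"time": 3, "cost": 2}),
       ("svc", {"time": 5, "cost": 4})))
```

For this tree, aggregate time is 2 + max(3, 5) and aggregate cost is 1 + (2 + 4) — which illustrates the slide's question of accuracy: the rules ignore communication cost entirely.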

QoS Property Aggregation

  • Power consumption

    • Sum for sequential composition, Min for choices

    • Is this accurate? What about cumulative heat, etc.?

    • Complicated aggregation for parallel composition

  • Availability

    • MTTF / (MTTF+MTTR)

      • Mean time to failure (MTTF)

      • Mean time to repair (MTTR)

    • How to aggregate the availability of the software, hardware, and communication?

    • Large versus small MTTF & MTTR

      • Pacemaker A: MTTF = 1 year, MTTR = 5 min, A ≈ 0.99999

      • Pacemaker B: MTTF = 3 hours, MTTR = 0.1 sec, A ≈ 0.99999

      • Need to consider request frequency and failure consequences

QoS Property Aggregation

  • Reliability

    • Software reliability

      • A: reliability = 1, B: reliability = 0.99

      • A+B: reliability = ?

        • A always invokes B’s incorrect path

        • A never invokes B’s incorrect path

      • Need to consider operational profile for software reliability estimation

        • Determine the probability of each type of input

          • Mostly partitioned based on a group of program paths

        • Determine the reliability according to the probability of failure for each input type (testing results) and probability of the input

      • So, aggregation needs to consider mapping of the operational profiles

        • Use A’s operational profile to estimate B’s operational profile

    • System reliability

      • How to combine software and hardware reliability?

QoS Property Aggregation

  • Quality

    • Difficult to measure quality even for a single service

    • Use fuzzy logic to express quality

    • How to compose them?

  • Security

    • Focus on policy based composition, not quantitative measures

      • No established method to quantitatively measure security yet

      • Too many factors

        • Policy issues, attacks, etc.

    • Generally only consider the policies, but not the attacks

      • Even for policies, mostly focus on the policies for individual services, not the information flow in the composite service

    • Attacks may compromise the system, making policies useless

QoS Property Aggregation

  • Simulation approach

    • Obtain the aggregate QoS properties based on simulation

      • Time: Rapidly compose the system and obtain the cumulative time

      • Reliability: Run test cases with the actual composite service

    • If there are many different combinations, simulation of each case is infeasible

Search for a Composition Solution

  • Genetic algorithm

    • Selection

      • E.g., Roulette-wheel selector

      • Probability of selecting Si = E(Si) / Σk E(Sk)

    • Mutation, Crossover
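The roulette-wheel selector above picks each candidate with probability proportional to its fitness E(Si). A minimal sketch (the fitness values in the usage are made up):

```python
import random

def roulette_select(population, fitness, rng=random):
    """Pick Si with probability E(Si) / sum over k of E(Sk)."""
    total = sum(fitness(s) for s in population)
    r = rng.uniform(0, total)
    acc = 0.0
    for s in population:
        acc += fitness(s)
        if r <= acc:
            return s
    return population[-1]  # guard against floating-point round-off

# Hypothetical fitness values: "a" should be chosen ~9x as often as "b".
fit = {"a": 9.0, "b": 1.0}
chosen = roulette_select(["a", "b"], fit.get)
```

Mutation and crossover then perturb and recombine the selected candidates to produce the next generation.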

Search for a Composition Solution

  • Multi-objective problem and Pareto-optimal solutions

    • Separate considerations of the objectives

    • Comparison problem (which is better)

    • Dominating solution (one is definitely better than the other)

    • Pareto-optimal solutions: non-dominated solutions
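Dominance and the Pareto front can be computed directly. Here solutions are tuples of objective values to be minimized; the two-objective example (responseTime, cost) is an illustrative assumption:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective (lower is
    better here) and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def pareto_front(solutions):
    """Keep only the non-dominated (Pareto-optimal) solutions."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# Hypothetical objective vectors: (responseTime, cost), both minimized.
front = pareto_front([(2, 9), (3, 3), (5, 5), (4, 2)])
```

Here (5, 5) is dominated by (3, 3), while the remaining three solutions are mutually incomparable, so all three survive.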

Search for a Composition Solution

  • Problem

    • Services may be QoS reconfigurable

      • E.g., a security service may offer many different security configurations, providing security and time tradeoffs

      • E.g., an AI search algorithm may offer different thresholds, providing quality and time tradeoffs

    • Need to consider not only which provider, but also the specific configurations of the provider services

    • This greatly increases the search space

  • Consider the paper

    • QoS-Reconfigurable Web Services and Compositions for High-Assurance Systems

Search for a Composition Solution

  • Example: Media security gateway for VoIP

Search for a Composition Solution

  • Example:

    • Reconfigurable SRTP service:

      • Implement secure and real-time audio packets transmission

      • Configurable parameters: Different encryption key sizes, encryption algorithms, authentication key sizes, and authentication algorithms

    • Reconfigurable AMR codec service

      • Implement the AMR codec standards

      • Configurable parameters: Different encoding (compression) rates

      • Tradeoffs between voice quality and communication/execution times

    • Reconfigurable Packetization service

      • Put audio frames into packets and add error correction and retransmission control codes

      • Configurable parameters: Different error correction codes, retransmission windows, etc.

      • Tradeoffs between the packet loss rate (which impacts voice quality) and the communication/execution times

Search for a Composition Solution

  • Compositional approach

    • First find the Pareto-optimal solutions within the services

    • Compose the Pareto-optimal solutions

      • Compose Pareto-optimal solutions as the initial population

        • Needed when the PO-solutions of the components do not guarantee PO-solutions of the composed system

      • Only compose Pareto-optimal solutions as the final solutions

        • Much more efficient due to the greatly reduced search space

Search for a Composition Solution

[Figure omitted: QoS-driven service composition across multiple service domains, e.g., a swarm of robots offering life-detection and terrain-search services; the QoS properties of the composed system derive from the domains' services]

  • Problem: independent admission control in each domain

  • Execution environment in each domain:

    • Each service runs on a VM

    • How to allocate cores to VMs (or other resources to services)

    • Manage resources and power such that:

      • QoS goals of all clients are best satisfied

      • Power consumption is minimized (e.g., via dynamic voltage scaling)

  • Need to perform admission control to ensure that the admitted workloads will receive the agreed-upon levels of QoS and will not violate power constraints

  • Correspondingly, service composition should consider:

    • The execution environment (e.g., admission control and energy-awareness)

    • Dynamic adaptation (e.g., selecting configurable services rather than those that provide the most satisfactory QoS for the time being)

    • System QoS prediction from the QoS of individual constituent services

    • All of the above may involve many potential candidates → efficiency concerns

Search for a Composition Solution

  • Major issues:

    • Each domain requires admission control in order to achieve the guaranteed service agreement

    • But during the service selection time, the execution environment (whether the service can be admitted) is not specifically considered

    • If we do consider admission at the selection time:

      • There may be many matching services

      • Interacting with every matching service to determine feasibility can be prohibitively expensive

    • When considering composition, this gets even worse

Search for a Composition Solution

  • Reference: Service Composition for Real-Time Assurance

  • Multi-level agreement approach

    • First: find a set of top-choice compositions based on published provider information, without considering admission

      • When selecting the top choices, ensure that no single service is common to all of them; otherwise, if admission for that service fails, no composition remains valid

    • Then check admission and proceed with the agreements

      • When to proceed with an agreement? In a composite service, if one of the selected atomic services cannot admit the customer, then all the agreements should be voided

      • If a provider agrees to provide a service and reserves resources for customer x, and because of this denies another customer y, what can be done when customer x later revokes all the agreements?

      • Need a pre-agreement phase

Search for a Composition Solution

  • Multi-level agreement approach

    • In the first level, use simple QoS estimation methods

      • May have a large number of compositions to be estimated

      • Time: simply add together

      • Reliability: simply multiply together

      • Security: only evaluate important policies

    • In the higher levels: Predict QoS with more sophisticated methods

      • Time: consider queuing analysis

      • Reliability: consider sophisticated mapping of operational profiles

    • In the final stage

      • Simulation to validate the correctness of the composition (and estimation)
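The first-level screen above — add times, multiply reliabilities — can be sketched in a few lines. Multiplying reliabilities assumes independent failures and a sequential composition; the service values are made-up examples:

```python
import math

def first_level_estimate(services):
    """Fast first-level screen for a sequential composition:
    sum the response times, multiply the reliabilities
    (assuming independent failures)."""
    time = sum(s["time"] for s in services)
    reliability = math.prod(s["reliability"] for s in services)
    return time, reliability

# Hypothetical atomic services in a candidate composition.
t, r = first_level_estimate([
    {"time": 0.2, "reliability": 0.999},
    {"time": 0.5, "reliability": 0.995},
])
```

Because these estimates are cheap, they can be run over a large number of candidate compositions; only the survivors move on to queuing analysis, operational-profile mapping, and finally simulation.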

QoS in Service Composition

  • Decision algorithms

    • Have been the focus of many papers

    • But they are not the major difficulty in QoS-driven service composition

  • QoS property aggregation

    • Highly challenging

  • Admission issues

    • No perfect solution

    • A multi-level approach would help

    • Not always going for the best choice also helps diversify the selections of different customers