Cloud Based Analytics for Cloud Based Applications

András Pataricza1, Imre Kocsis1, Zsolt Kocsis2 et al.

1Dept. of Measurement and Information Systems, BME, Hungary

2IBM CAS Budapest, Hungary

{pataric,ikocsis}@mit.bme.hu

ICA CON 2012, April 20, 2012

Clouds for demanding applications?

Standard infrastructure vs. a demanding application?

Clouds for demanding applications?

  • Extra-functional requirements: throughput, timeliness, availability
  • "Small problems" have a high impact (soft real time)
  • Example domains: Virtual Desktop Infrastructure, telecommunications

Experimental setup

N.B.: VMware R&D published a similar setup (March 2012)

IT EDA (exploratory data analysis) is Big Data!

Which factors determine the QoS?

Hypervisor (host + VMs), OS, application, ...

IT EDA is Big Data!

High availability, rare faults

Rare events require fine granularity AND a long horizon

Searching for outliers
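For illustration, searching for such outliers in fine-grained monitoring data can be sketched with a robust z-score filter; the metric name, file name and threshold below are assumptions, not taken from the talk.

```python
import pandas as pd

def find_outliers(series: pd.Series, threshold: float = 3.5) -> pd.Series:
    """Flag rare, extreme samples using a robust (median/MAD-based) z-score.

    Robust statistics matter here: with high availability almost every
    sample is 'normal', so mean/std would be dominated by the bulk.
    """
    median = series.median()
    mad = (series - median).abs().median()
    if mad == 0:
        return pd.Series(False, index=series.index)
    robust_z = 0.6745 * (series - median) / mad
    return robust_z.abs() > threshold

# Hypothetical usage: 'cpu_ready_ms' is an assumed column name for a
# fine-grained hypervisor metric collected over a long horizon.
# df = pd.read_csv("monitoring.csv", parse_dates=["timestamp"])
# rare = df[find_outliers(df["cpu_ready_ms"])]
```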

Rare events: a lot of sand, a few pellets

The data is typically sand: gold mining ≠ data mining

Visual analytics = causal insight

Computing power use = CPU use × CPU clock rate (const.)

The relationship should be purely proportional.

Correlation coefficient: 0.99998477434137

Well visible in the plot, but numerically suppressed.

Origin???
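A minimal synthetic sketch of this effect: an anomaly that is obvious in a scatter plot can leave the Pearson correlation coefficient essentially untouched. The data and the clock-rate constant below are made up for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: computing power use should be CPU use times a
# constant clock rate, i.e. purely proportional.
cpu_use = rng.uniform(0.0, 1.0, 10_000)
clock_rate_ghz = 2.4  # assumed constant
power_use = cpu_use * clock_rate_ghz

# Inject a handful of anomalous samples (e.g. a measurement artifact).
power_use[:20] = 0.0

r = np.corrcoef(cpu_use, power_use)[0, 1]
print(f"Pearson correlation: {r:.6f}")  # still very high despite the artifact

# A scatter plot of cpu_use vs. power_use makes the deviating points
# immediately visible, while the single correlation number suppresses them:
# import matplotlib.pyplot as plt
# plt.scatter(cpu_use, power_use, s=2); plt.show()
```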

Visual analytics

Noisy…

High frequency components dominate

But they correlate (93%!)

YOU DON’T SEE IT
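The converse effect can be sketched as well: correlating only the high-frequency components of two metrics can expose a strong numerical relationship, like the 93% quoted above, that is invisible to the eye. The first-difference high-pass filter and the column names below are assumptions for illustration.

```python
import numpy as np
import pandas as pd

def high_frequency_correlation(a: pd.Series, b: pd.Series) -> float:
    """Correlate the high-frequency parts of two metrics.

    First differencing acts as a crude high-pass filter: slow trends are
    removed and only the fast, noise-like fluctuations remain.
    """
    da = a.diff().dropna()
    db = b.diff().dropna()
    return float(np.corrcoef(da, db)[0, 1])

# Hypothetical usage with two assumed metric columns:
# r_hf = high_frequency_correlation(df["vm1_cpu"], df["vm2_cpu"])
# print(f"high-frequency correlation: {r_hf:.2f}")
```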

Impacts of resource sharing?

Self-induced

Parasitic influence

Short transient faults – long recovery

As if you unplug your desktop for a second...

An 8 sec platform overload leads to a 30 sec service outage and a 120 sec SLA violation.

Deterministic (?!) run-time in the public cloud...

Performance outages cannot be masked by overcapacity.

Variance can be tolerated by overcapacity.
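One possible, assumption-laden reading of "variance tolerable by overcapacity" as a sizing rule: provision enough headroom to cover a high percentile of the observed demand or run time rather than its mean. The percentile and file name below are illustrative choices, not from the talk.

```python
import numpy as np

def overcapacity_factor(samples: np.ndarray, percentile: float = 99.0) -> float:
    """Headroom needed so the given percentile of observed demand (or run
    time) still fits: p-th percentile divided by the mean."""
    return float(np.percentile(samples, percentile) / np.mean(samples))

# Hypothetical usage with measured per-request run times (seconds):
# run_times = np.loadtxt("runtimes.txt")
# print(f"provision ~{overcapacity_factor(run_times):.1f}x the average capacity")
```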

The noisy neighbour problem

[Diagram: a tenant VM and a noisy neighbour VM share a hypervisor, separated only by a logical "fence".]

The mystery shopper concept
  • Basic logic as with benchmarks, but...
  • Metric requirements:
    • the same interference sensitivities as the service
    • the same resource sensitivities as the service
    • representative of the service types
  • Runtime requirements:
    • non-intrusiveness (instead of saturation)
    • long-running (to catch rare events)
    • (low specific impact on the service)

Not trivially feasible...

but everything else is impossible.

Example: short computation bursts that sample the CPU available for longer computations.
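A minimal sketch of such a probe under the stated requirements (non-intrusive, long-running): it fires short, fixed-size computation bursts at a low rate and records how long each burst takes; stretched burst times indicate less available CPU. All sizes and periods below are assumed values.

```python
import time

BURST_ITERATIONS = 200_000   # short, fixed amount of work (assumed size)
PROBE_PERIOD_S = 5.0         # long pauses keep the probe non-intrusive

def cpu_burst() -> float:
    """Run a short, deterministic computation and return its wall-clock time."""
    start = time.perf_counter()
    acc = 0
    for i in range(BURST_ITERATIONS):
        acc += i * i
    return time.perf_counter() - start

def mystery_shopper(samples: int = 10) -> list[float]:
    """Sample available CPU over time; burst durations stretch when the
    platform is overloaded or a noisy neighbour steals cycles."""
    durations = []
    for _ in range(samples):
        durations.append(cpu_burst())
        time.sleep(PROBE_PERIOD_S)  # idle most of the time
    return durations

if __name__ == "__main__":
    print(mystery_shopper(samples=3))
```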

Indirect platform & QoS observability

1. Connect
2. Observe
3. Infer (qualitatively)

Works without the application!

The "classic" approach: deploy, run/test, observe, analyze.

Observability problems (if present) are bypassed.
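The "infer (qualitatively)" step could, for example, map probe timings onto a coarse platform state relative to a baseline measured on an unloaded system; the threshold and state labels below are assumptions for illustration.

```python
import statistics

def qualify(burst_durations: list[float], baseline_s: float,
            degraded_factor: float = 1.5) -> str:
    """Map raw probe timings to a qualitative platform state.

    'degraded' here simply means the median burst took noticeably longer
    than the baseline measured on an unloaded platform.
    """
    median = statistics.median(burst_durations)
    return "degraded" if median > degraded_factor * baseline_s else "normal"

# Hypothetical usage, combined with the mystery_shopper() sketch above:
# baseline = statistics.median(mystery_shopper(samples=20))  # calibration run
# print(qualify(mystery_shopper(samples=5), baseline))
```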

Mystery shopper & service QoS

[Timeline diagram: main application vs. mystery shopper; a VM-internal fault or a noisy-neighbour fault leading to application failure; fast detection; reaction-time window.]

Summary
  • Technical
    • SLA coverage needed for all aspects
    • Missing guarantees can be (somewhat) compensated
      • Cheap computing power -> redundancy
        • "Double" autonomic computing
        • Cloud level – provider
        • Application level – user
  • Methodology
    • Visual exploratory data analysis for insight
    • Algorithmic analysis for proofs and evaluation
    • Fault-tolerance design patterns revisited
      • Cheap redundancy in the cloud