Presentation Transcript
slide1

Study material for the course

Metode Penelitian Interdisiplin Kajian Lingkungan (Interdisciplinary Research Methods in Environmental Studies)

OPERATIONS RESEARCH

IN ENVIRONMENTAL STUDIES

Abstracted by

Smno.psl.ppsub.okt2012

slide2

Optimization (mathematics)

In mathematics, the term optimization, or mathematical programming, refers to the study of problems in which one seeks to minimize or maximize a real function by systematically choosing the values of real or integer variables from within an allowed set.

This problem can be represented in the following way

Given: a function f : A → R from some set A to the real numbers

Sought: an element x0 in A such that f(x0) ≤ f(x) for all x in A ("minimization") or such that f(x0) ≥ f(x) for all x in A ("maximization").

Such a formulation is called an optimization problem or a mathematical programming problem (a term not directly related to computer programming, but still in use, for example, in linear programming).
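To make this abstract formulation concrete, the following minimal Python sketch (not part of the original slides; the objective function and the finite feasible set A are invented for illustration) searches a small set A for the element x0 that minimizes f.

```python
# Illustration of "find x0 in A such that f(x0) <= f(x) for all x in A".
# Both f and A below are hypothetical examples.

def f(x):
    return (x - 2) ** 2 + 1      # objective (cost) function f : A -> R

A = [-3, -1, 0, 1, 2, 5]         # a small finite feasible set A

x0 = min(A, key=f)               # feasible solution with the smallest objective value
print(x0, f(x0))                 # prints: 2 1
```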

slide3

Many real-world and theoretical problems may be modeled in this general framework.

Typically, A is some subset of the Euclidean space Rⁿ, often specified by a set of constraints, equalities or inequalities that the members of A have to satisfy. The elements of A are called feasible solutions.

The function f is called an objective function, or cost function.

A feasible solution that minimizes (or maximizes, if that is the goal) the objective function is called an optimal solution.

slide4

The domain A of f is called the search space, while the elements of A are called candidate solutions or feasible solutions.

Generally, when the feasible region or the objective function of the problem does not present convexity, there may be several local minima and maxima, where a local minimum x* is defined as a point for which there exists some δ > 0 so that for all x such that

‖x − x*‖ ≤ δ,

the expression

f(x*) ≤ f(x)

holds; that is to say, on some region around x* all of the function values are greater than or equal to the value at that point. Local maxima are defined similarly.

slide5

A large number of algorithms proposed for solving non-convex problems – including the majority of commercially available solvers – are not capable of distinguishing between locally optimal solutions and rigorously (globally) optimal solutions, and will treat the former as actual solutions to the original problem.

The branch of applied mathematics and numerical analysis that is concerned with the development of deterministic algorithms that are capable of guaranteeing convergence in finite time to the actual optimal solution of a non-convex problem is called global optimization.

slide6

Notation

Optimization problems are often expressed with special notation. Here are some examples:

min_{x ∈ R} x² + 1

This asks for the minimum value for the objective function x² + 1, where x ranges over the real numbers R. The minimum value in this case is 1, occurring at x = 0.

max_{x ∈ R} 2x

This asks for the maximum value for the objective function 2x, where x ranges over the reals. In this case, there is no such maximum as the objective function is unbounded, so the answer is "infinity" or "undefined".

slide7

argmin_{x ∈ [−∞, −1]} x² + 1

This asks for the value (or values) of x in the interval [−∞, −1] that minimizes (or minimize) the objective function x² + 1. (The actual minimum value of that function does not matter.) In this case, the answer is x = −1.

argmax_{x ∈ [−5, 5], y ∈ R} x·cos(y)

This asks for the (x, y) pair (or pairs) that maximizes (or maximize) the value of the objective function x·cos(y), with the added constraint that x lies in the interval [−5, 5].

(Again, the actual maximum value of the expression does not matter.)

In this case, the solutions are the pairs of the form (5, 2πk) and (−5, (2k + 1)π), where k ranges over all integers.
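These notation examples can be checked numerically. The short Python sketch below (not from the original slides) uses SciPy, assuming it is available, to reproduce the first and third examples; the finite lower bound −1e6 stands in for −∞.

```python
from scipy.optimize import minimize_scalar

f = lambda x: x**2 + 1

# Minimum over all real x: value 1 at x = 0.
res_free = minimize_scalar(f)
print(res_free.x, res_free.fun)               # approximately 0.0 and 1.0

# Argmin over (-inf, -1]: approximated by a bounded search on [-1e6, -1].
res_bounded = minimize_scalar(f, bounds=(-1e6, -1), method="bounded")
print(res_bounded.x)                          # approximately -1.0
```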

slide8

Major subfields

Linear programming studies the case in which the objective function f is linear and the set A is specified using only linear equalities and inequalities. Such a set is called a polyhedron, or a polytope if it is bounded (a small worked example appears below).

Integer programming studies linear programs in which some or all variables are constrained to take on integer values.

Quadratic programming allows the objective function to have quadratic terms, while the set A must be specified with linear equalities and inequalities.
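As a worked illustration of the linear programming case described above, the sketch below (not from the original slides) solves a small invented LP with SciPy's linprog; all coefficients are assumptions chosen only for illustration.

```python
from scipy.optimize import linprog

# Hypothetical LP: maximize x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0.
# linprog minimizes, so the objective is negated.
c = [-1, -2]
A_ub = [[1, 1],       # x + y  <= 4
        [1, 3]]       # x + 3y <= 6
b_ub = [4, 6]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)    # optimal point (3, 1) with objective value 5
```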

slide9

Nonlinear programming studies the general case in which the objective function or the constraints or both contain nonlinear parts.

Convex programming studies the case when the objective function is convex and the constraints, if any, form a convex set. This can be viewed as a particular case of nonlinear programming or as a generalization of linear or convex quadratic programming.

Semidefinite programming (SDP) is a subfield of convex optimization where the underlying variables are semidefinite matrices. It is a generalization of linear and convex quadratic programming.

Second-order cone programming (SOCP).

slide10

Stochastic programming studies the case in which some of the constraints or parameters depend on random variables.

Robust programming is, like stochastic programming, an attempt to capture uncertainty in the data underlying the optimization problem. This is not done through the use of random variables; instead, the problem is solved taking into account inaccuracies in the input data.

Combinatorial optimization is concerned with problems where the set of feasible solutions is discrete or can be reduced to a discrete one.

slide11

Infinite-dimensional optimization studies the case when the set of feasible solutions is a subset of an infinite-dimensional space, such as a space of functions.

Constraint satisfaction studies the case in which the objective function f is constant (this is used in artificial intelligence, particularly in automated reasoning).

Disjunctive programming is used where at least one constraint must be satisfied but not all. It is of particular use in scheduling.

slide12

In a number of subfields, the techniques are designed primarily for optimization in dynamic contexts (that is, decision making over time):

Calculus of variations seeks to optimize an objective defined over many points in time, by considering how the objective function changes if there is a small change in the choice path.

Optimal control theory is a generalization of the calculus of variations.

Dynamic programming studies the case in which the optimization strategy is based on splitting the problem into smaller subproblems. The equation that relates these subproblems is called the Bellman equation.
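To illustrate the Bellman-equation idea, here is a minimal dynamic-programming sketch (not from the original slides; the graph, its costs, and the goal node are invented): the value of a node is the minimum, over its outgoing edges, of the edge cost plus the value of the successor.

```python
from functools import lru_cache

# Hypothetical directed acyclic graph with edge costs; "D" is the goal node.
edges = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 2, "D": 5},
    "C": {"D": 1},
    "D": {},
}

@lru_cache(maxsize=None)
def value(node):
    # Bellman recursion: V(goal) = 0, V(n) = min over successors of [cost + V(successor)].
    if node == "D":
        return 0
    return min(cost + value(nxt) for nxt, cost in edges[node].items())

print(value("A"))    # 4, realised by the path A -> B -> C -> D
```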

slide13

Techniques

For twice-differentiable functions, unconstrained problems can be solved by finding the points where the gradient of the objective function is zero (that is, the stationary points) and using the Hessian matrix to classify the type of each point.

If the Hessian is positive definite, the point is a local minimum; if it is negative definite, a local maximum; and if it is indefinite, some kind of saddle point.
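As a small illustration of this classification rule (not from the original slides), the sketch below inspects the eigenvalues of the Hessian of the hypothetical function f(x, y) = x² − y² at its stationary point (0, 0).

```python
import numpy as np

# Hessian of f(x, y) = x**2 - y**2 at the stationary point (0, 0); constant in this example.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

eigvals = np.linalg.eigvalsh(H)
if np.all(eigvals > 0):
    kind = "local minimum"          # positive definite Hessian
elif np.all(eigvals < 0):
    kind = "local maximum"          # negative definite Hessian
else:
    kind = "saddle point"           # indefinite Hessian
print(kind)                          # prints: saddle point
```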

However, existence of derivatives is not always assumed and many methods were devised for specific situations.

The basic classes of methods, based on smoothness of the objective function, are:

Combinatorial methods

Derivative-free methods

First-order methods

Second-order methods

slide14

Actual methods falling somewhere among the categories above include:

Gradient descent, also known as steepest descent (or steepest ascent, for maximization); a minimal sketch appears after this list

Nelder–Mead method, also known as the amoeba method

Subgradient method: similar to the gradient method, for cases where no gradients are available

Simplex method

Ellipsoid method

Bundle methods

Newton's method

Quasi-Newton methods

Interior point methods

Conjugate gradient method
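The following is the gradient-descent sketch referred to above (not from the original slides): a fixed step size and a hand-coded gradient for the hypothetical objective f(x, y) = (x − 1)² + 4y², both chosen only for illustration.

```python
# Gradient of the hypothetical objective f(x, y) = (x - 1)**2 + 4 * y**2.
def grad(x, y):
    return 2 * (x - 1), 8 * y

x, y = 5.0, 3.0          # arbitrary starting point
step = 0.1               # fixed step size (assumed)
for _ in range(200):
    gx, gy = grad(x, y)
    x, y = x - step * gx, y - step * gy

print(round(x, 4), round(y, 4))    # converges toward the minimizer (1, 0)
```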

slide15

Line search: a technique for one-dimensional optimization, usually used as a subroutine within other, more general techniques.

If the objective function is convex over the region of interest, then any local minimum will also be a global minimum.

There exist robust, fast numerical techniques for optimizing twice differentiable convex functions.

Constrained problems can often be transformed into unconstrained problems with the help of Lagrange multipliers.
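As a hedged worked example of the Lagrange-multiplier idea (not from the original slides; the objective and constraint are invented), the sketch below forms the Lagrangian for maximizing x·y subject to x + y = 10 and solves the resulting unconstrained stationarity conditions with SymPy.

```python
import sympy as sp

# Maximize x * y subject to x + y = 10 by introducing a multiplier lam and
# solving grad L = 0 for the Lagrangian L(x, y, lam) = x*y - lam*(x + y - 10).
x, y, lam = sp.symbols("x y lam")
L = x * y - lam * (x + y - 10)

solution = sp.solve([sp.diff(L, v) for v in (x, y, lam)], (x, y, lam), dict=True)
print(solution)    # [{x: 5, y: 5, lam: 5}] -> constrained optimum at x = y = 5
```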

slide16

Here are a few other popular methods:

Hill climbing

Simulated annealing (a minimal sketch follows this list)

Quantum annealing

Tabu search

Beam search

Genetic algorithms

Ant colony optimization

Evolution strategy

Stochastic tunneling

Differential evolution

Particle swarm optimization

Harmony search
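The simulated-annealing sketch promised above follows; it is not from the original slides, and the objective function, starting point, move size, and cooling schedule are all illustrative assumptions.

```python
import math
import random

# Minimize the hypothetical objective f(x) = x**2 + 10*sin(x), which has several local minima.
def f(x):
    return x * x + 10 * math.sin(x)

random.seed(0)
x = 10.0                  # arbitrary starting point
best = x
temperature = 10.0
for _ in range(5000):
    candidate = x + random.uniform(-1, 1)              # random local move
    delta = f(candidate) - f(x)
    # Always accept improvements; accept worse moves with temperature-dependent probability.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    if f(x) < f(best):
        best = x
    temperature *= 0.999                               # geometric cooling schedule

print(round(best, 3), round(f(best), 3))    # typically lands near the global minimum at x ≈ -1.3
```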

slide17

Uses

Problems in rigid body dynamics (in particular articulated rigid body dynamics) often require mathematical programming techniques, since rigid body dynamics can be viewed as attempting to solve an ordinary differential equation on a constraint manifold;

the constraints are various nonlinear geometric constraints such as "these two points must always coincide", "this surface must not penetrate any other", or "this point must always lie somewhere on this curve".

Also, the problem of computing contact forces can be addressed by solving a linear complementarity problem, which can also be viewed as a QP (quadratic programming) problem.

slide18

Many design problems can also be expressed as optimization programs.

This application is called design optimization. One recent and growing subset of this field is multidisciplinary design optimization, which, while useful in many problems, has in particular been applied to aerospace engineering problems.

Mainstream economics also relies heavily on mathematical programming.

Consumers and firms are assumed to maximize their utility or profit. Agents are also most frequently assumed to be risk-averse, and therefore to wish to minimize whatever risk they might be exposed to.

Asset prices are also explained using optimization, though the underlying theory is more complicated than simple utility or profit optimization.

Trade theory also uses optimization to explain trade patterns between nations.

Another field that uses optimization techniques extensively is operations research.

slide19

History

The first optimization technique, which is known as steepest descent, goes back to Gauss.

Historically, the first term to be introduced was linear programming, which was invented by George Dantzig in the 1940s.

The term programming in this context does not refer to computer programming (although computers are nowadays used extensively to solve mathematical problems).

Instead, the term comes from the use of program by the United States military to refer to proposed training and logistics schedules, which were the problems that Dantzig was studying at the time.

(Additionally, later on, the use of the term "programming" was apparently important for receiving government funding, as it was associated with high-technology research areas that were considered important.)

slide20

Operations research

Operations research or operational research (OR) is an interdisciplinary science which uses scientific methods such as mathematical modeling, statistics, and algorithms to aid decision making in complex real-world problems concerned with the coordination and execution of operations within an organization.

The nature of the organization is immaterial. The ultimate aim of applying this science is to derive, scientifically, a best possible solution to a problem, one that improves or optimizes the performance of the organization.

slide21

The terms operations research and management science are often used synonymously.

When a distinction is drawn, management science generally implies a closer relationship to the problems of business management.

Operations research also closely relates to Industrial engineering.

Industrial engineering takes more of an engineering point of view, and industrial engineers typically consider OR techniques to be a major part of their toolset.

slide22

Some of the primary tools used by operations researchers are statistics, optimization, stochastics, queueing theory, game theory, graph theory, decision analysis, and simulation. Because of the computational nature of these fields, OR also has ties to computer science, and operations researchers regularly use custom-written or off-the-shelf software.

Operations research is distinguished by its ability to look at and improve an entire system, rather than concentrating only on specific elements (though this is often done as well). An operations researcher faced with a new problem is expected to determine which techniques are most appropriate given the nature of the system, the goals for improvement, and constraints on time and computing power. For this and other reasons, the human element of OR is vital. Like any other tools, OR techniques cannot solve problems by themselves.

slide23

Scope of operations research

A few examples of applications in which operations research is currently used include:

1. designing the layout of a factory for efficient flow of materials
2. constructing a telecommunications network at low cost while still guaranteeing QoS (quality of service) or QoE (quality of experience) if particular connections become very busy or get damaged
3. road traffic management and 'one way' street allocations, i.e. allocation problems
4. determining the routes of school buses (or city buses) so that as few buses as possible are needed
slide24

5. designing the layout of a computer chip to reduce manufacturing time (thereby reducing cost)
6. managing the flow of raw materials and products in a supply chain based on uncertain demand for the finished products
7. efficient messaging and customer response tactics
8. roboticizing or automating human-driven operations processes
9. globalizing operations processes in order to take advantage of cheaper materials, labor, land or other productivity inputs
10. managing freight transportation and delivery systems (examples: LTL shipping, intermodal freight transport)
slide25

11. Scheduling:

  • personnel staffing
  • manufacturing steps
  • project tasks
  • network data traffic: these are known as queuing models or queuing systems
  • sports events and their television coverage

12. blending of raw materials in oil refineries

Operations research is also used extensively in government where evidence-based policy is used.
slide26

History

The modern field of operations research arose during World War II.

Scientists in the United Kingdom (including Patrick Blackett, Cecil Gordon, C. H. Waddington, Owen Wansbrough-Jones and Frank Yates) and in the United States (George Dantzig) looked for ways to make better decisions in such areas as logistics and training schedules.

After the war it began to be applied to similar problems in industry.

slide27

World War II

Blackett's team made a number of crucial analyses which aided the war effort.

Britain introduced the convoy system to reduce shipping losses, but while the principle of using warships to accompany merchant ships was generally accepted, it was unclear whether it was better for convoys to be small or large.

Convoys travel at the speed of the slowest member, so small convoys can travel faster.

It was also argued that small convoys would be harder for German U-boats to detect.

slide28

On the other hand, large convoys could deploy more warships against an attacker.

Blackett's staff clearly showed that:

1. Large convoys were more efficient

2. The probability of detection by U-boat was statistically unrelated to the size of the convoy

3. Slow convoys were at greater risk (though, considered overall, large convoys were still to be preferred)

slide29

In another piece of work, Blackett's team analysed a report of a survey carried out by RAF Bomber Command.

For the survey, Bomber Command inspected all bombers returning from bombing raids over Germany over a particular period.

All damage inflicted by German air defenses was noted and the recommendation was given that armour be added in the most heavily damaged areas.

Their suggestion to remove some of the crew, so that an aircraft loss would result in fewer personnel losses, was rejected by RAF command.

slide30

Blackett's team instead made the surprising and counter-intuitive recommendation that the armour be placed in the areas which were completely untouched by damage, according to the survey.

They reasoned that the survey was biased, since it only included aircraft that successfully came back from Germany.

The untouched areas were probably vital areas, which if hit would result in the loss of the aircraft.

slide31

When the Germans organised their air defences into the Kammhuber Line, it was realised that if the RAF bombers were to fly in a bomber stream they could overwhelm the night fighters who flew in individual cells directed to their targets by ground controllers.

It was then a matter of calculating the statistical loss from collisions against the statistical loss from night fighters to calculate how close the bombers should fly to minimise RAF losses.

slide32

It is known as "operational research" in the United Kingdom ("operational analysis" within the UK military and UK Ministry of Defence, where OR stands for "Operational Requirement") and as "operations research" in most other English-speaking countries, though OR is a common abbreviation everywhere.

With expanded techniques and growing awareness, OR is no longer limited to only operations, and the proliferation of computer data collection has relieved analysts of much of the more mundane research.

But the OR analyst must still know how a system operates, and learn to perform even more sophisticated research than ever before. In every sense the name OR still applies, more than a half century later.

slide33

The profession of operations research

Societies

The International Federation of Operational Research Societies (IFORS) is an umbrella organization for operations research societies worldwide.

Significant among these are the Institute for Operations Research and the Management Sciences (INFORMS) and the Operational Research Society (ORS).

EURO is the Association of European Operational Research Societies.

CORS is the Canadian Operational Research Society.

slide34

ASOR is the Australian Society for Operations Research.

MORS is the Military Operations Research Society, based in the United States since 1966 with the objective of enhancing the quality and usefulness of military operations research analysis in support of defense decisions.

ORSNZ is the Operations Research Society of New Zealand.

ORSP is the Operations Research Society of the Philippines.

ORSI is the Operational Research Society of India.

ORSSA is the Operations Research Society of South Africa.

slide35

Automated planning and scheduling

Automated planning and scheduling is a branch of artificial intelligence that concerns the realisation of strategies or action sequences, typically for execution by intelligent agents, autonomous robots and unmanned vehicles.

Unlike classical control and classification problems, the solutions are complex, unknown and have to be discovered and optimised in multidimensional space.

slide36

In known environments with available models planning can be done offline.

Solutions can be found and evaluated prior to execution.

In dynamically unknown environments the strategy often needs to be revised online. Models and policies need to be adapted.

Solutions usually resort to iterative trial and error processes commonly seen in artificial intelligence.

These include dynamic programming, reinforcement learning and combinatorial optimization.

slide37

A typical planner takes three inputs:

a description of the initial state of the world,

a description of the desired goal, and

a set of possible actions,

all encoded in a formal language such as STRIPS.

The planner produces a sequence of actions that lead from the initial state to a state meeting the goal.
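To make these three inputs and the resulting plan concrete, here is a hedged Python sketch (not from the original slides) of a tiny STRIPS-style forward-search planner; the domain, in which a robot moves between two rooms and picks up a box, and every name in it are invented for illustration.

```python
from collections import deque

# STRIPS-style actions: precondition facts, facts added, facts deleted (all hypothetical).
ACTIONS = {
    "move(A,B)":   {"pre": {"at(A)"},              "add": {"at(B)"},        "del": {"at(A)"}},
    "move(B,A)":   {"pre": {"at(B)"},              "add": {"at(A)"},        "del": {"at(B)"}},
    "pick(box,B)": {"pre": {"at(B)", "box-at(B)"}, "add": {"holding(box)"}, "del": {"box-at(B)"}},
}

def plan(initial, goal):
    """Breadth-first forward search from the initial state to any state containing the goal."""
    frontier = deque([(frozenset(initial), [])])
    visited = {frozenset(initial)}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:                               # every goal fact holds
            return steps
        for name, action in ACTIONS.items():
            if action["pre"] <= state:                  # action applicable in this state
                nxt = frozenset((state - action["del"]) | action["add"])
                if nxt not in visited:
                    visited.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None                                         # no plan exists

print(plan({"at(A)", "box-at(B)"}, {"holding(box)"}))   # ['move(A,B)', 'pick(box,B)']
```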

An alternative language for describing planning problems is that of hierarchical task networks, in which a set of tasks is given, and each task can be either realized by a primitive action or decomposed into a set of other tasks.

slide38

The difficulty of planning is dependent on the simplifying assumptions employed, e.g. atomic time, deterministic time, complete observability, etc.

Classical planners make all these assumptions and have been studied most fully.

Some popular techniques include:

Forward chaining and backward chaining state-space search, possibly enhanced by the use of relationships among conditions (see graphplan) or by heuristics synthesized from the problem; search through plan space; and translation to propositional satisfiability (satplan).

slide39

If the assumption of determinism is dropped and a probabilistic model of uncertainty is adopted, then this leads to the problem of policy generation for a Markov decision process (MDP) or, in the general case, a partially observable Markov decision process (POMDP).
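For the MDP case, value iteration is one standard way to generate a policy. The sketch below is not from the original slides; the two-state MDP, its transition probabilities, rewards, and discount factor are all invented for illustration.

```python
# Value iteration on a tiny hypothetical MDP.
# transitions[state][action] = list of (probability, next_state, reward).
transitions = {
    "s0": {"stay": [(1.0, "s0", 0.0)],
           "go":   [(0.8, "s1", 5.0), (0.2, "s0", 0.0)]},
    "s1": {"stay": [(1.0, "s1", 1.0)],
           "go":   [(1.0, "s0", 0.0)]},
}
gamma = 0.9                                   # discount factor (assumed)

V = {s: 0.0 for s in transitions}
for _ in range(100):                          # enough sweeps to converge on this tiny example
    V = {s: max(sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in transitions[s].values())
         for s in transitions}

# Greedy policy with respect to the converged value function.
policy = {s: max(transitions[s],
                 key=lambda a: sum(p * (r + gamma * V[s2]) for p, s2, r in transitions[s][a]))
          for s in transitions}
print(V, policy)
```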

slide40

Balanced scorecard

In 1992, Robert S. Kaplan and David Norton introduced the balanced scorecard, a concept for measuring a company's activities in terms of its vision and strategies, to give managers a comprehensive view of the performance of a business. The key new element is focusing not only on financial outcomes but also on the human issues that drive those outcomes, so that organizations focus on the future and act in their long-term best interest. The strategic management system forces managers to focus on the important performance metrics that drive success. It balances a financial perspective with customer, process, and employee perspectives. Measures are often indicators of future performance.

slide41

Since the original concept was introduced, balanced scorecards have become a fertile field of theory and research, and many practitioners have diverged from the original Kaplan & Norton articles. Kaplan & Norton themselves revisited the scorecard with the benefit of a decade's experience since the original article.

Implementing the scorecard typically includes four processes:

Translating the vision into operational goals;

Communicating the vision and linking it to individual performance;

Business planning;

Feedback and learning, and adjusting the strategy accordingly.

slide42

A comprehensive view of business performance

The Balanced Scorecard is simply a concise report featuring a set of measures that relate to the performance of an organization. By associating each measure with one or more expected values (targets), managers of the organization can be alerted when organizational performance is failing to meet their expectations. The challenge with the Balanced Scorecard, ever since it was popularized by a 1992 article in the Harvard Business Review, has been deciding which measures to choose.

slide43

From the outset, the Balanced Scorecard has been promoted as a tool to help organizations monitor the implementation of organizational strategy.

The earliest Balanced Scorecards comprised simple tables broken into four sections - typically these 'perspectives' were labeled "Financial", "Customer", "Internal Business Processes", and "Learning & Growth".

Designing the Balanced Scorecard simply required picking five or six good measures for each perspective.

Many writers have since suggested alternative headings for these perspectives, and also suggested using either additional or fewer perspectives: these suggestions being triggered by a belief that 'better' designs will come from use of different headings.

slide44

The major design challenge faced with this type of Balanced Scorecard is justifying the choice of measures made - "of all the measures you could have chosen, why did you choose these...?" is a commonly asked question (and, using this type of design process, a hard one to answer).

If users are not confident that the measures within the Balanced Scorecard are well chosen, they will have less confidence in the information it provides.

Although less common, these early style Balanced Scorecards are still designed and used today.

slide45

The early style Balanced Scorecards are hard to design in a way that builds confidence that they are well designed. Because of this, many are abandoned soon after completion.

In the mid 1990s an improved design method emerged. In the new method, selection of measures was based on a set of 'strategic objectives' plotted on a 'strategic linkage model' or 'strategy map'.

With this modified approach, the strategic objectives are typically distributed across a similar set of 'perspectives' as is found in the earlier designs, but the design question becomes slightly more abstract.

slide46

Managers have to identify the five or six goals they have within each of the perspectives, and then demonstrate some inter-linking between them by plotting causal links on the diagram.

Having reached some consensus about the objectives and how they inter-relate, the Balanced Scorecard's measures are chosen by picking suitable measures for each objective.

This type of approach provides greater contextual justification for the measures chosen, and is generally easier for managers to work through. This style of Balanced Scorecard has been the most common type for the last ten years or so.

slide47

Several design issues still remain with this modified approach to Balanced Scorecard design, but it has been much more successful than the design approach it supersedes.

Since the late 1990s, various improved versions of Balanced Scorecard design methods have emerged - examples being the Performance Prism, Results Based Management and the Third Generation Balanced Scorecard.

These more advanced design methods seek to solve some of the remaining design issues - in particular issues relating to the design of sets of Balanced Scorecards to use across an organization, and in setting targets for the measures selected.

slide48

Many books and articles on Balanced Scorecard topics confuse the design process elements and the Balanced Scorecard itself: in particular, it is common for people to refer to a 'strategic linkage model' or 'strategy map' as being a Balanced Scorecard.

Balanced Scorecard is a performance management tool: although it helps focus managers' attention on strategic issues and the management of the implementation of strategy, it is important to remember that Balanced Scorecard itself has no role in the formation of strategy.

Balanced Scorecard can comfortably co-exist with strategic planning systems and other tools.

slide49

Actual usage of the balanced scorecard

Kaplan and Norton found that companies are using the scorecard to:

Clarify and update budgets

Identify and align strategic initiatives

Conduct periodic performance reviews to learn about and improve strategy.

In 1997, Kurtzman found that 64 percent of the companies questioned were measuring performance from a number of perspectives in a similar way to the balanced scorecard.

slide50

Balanced scorecards have been implemented by government agencies, military units, corporate units and corporations as a whole, nonprofits, and schools;

many sample scorecards can be found via Web searches, though adapting one organization's scorecard to another is generally not advised by theorists, who believe that much of the benefit of the scorecard comes from the implementation method.

slide51

Comparison to Applied Information Economics

A criticism of balanced scorecard is that the scores are not based on any proven economic or financial theory and have no basis in the decision sciences.

The process is entirely subjective and makes no provision to assess quantities like risk and economic value in a way that is actuarially or economically well-founded.

Positive responses from users of balanced scorecard may merely be a type of placebo effect.

There are no empirical studies linking the use of balanced scorecard to better decision making or improved financial performance of companies.

slide52

Applied Information Economics (AIE) has been researched as an alternative to Balanced Scorecards.

In 2000, the Federal CIO Council commissioned a study [1] to compare the two methods by funding studies in side-by-side projects in two different agencies.

The Dept. of Veterans Affairs used AIE and the US Dept. of Agriculture applied balanced scorecard.

The resulting report found that while AIE was much more sophisticated, AIE actually took slightly less time to utilize. AIE was also more likely to generate findings that were newsworthy to the organization while the users of balanced scorecard felt it simply documented their inputs and offered no other particular insight. However, balanced scorecard is still much more widely used than AIE.