"Review of major types of uncertainty in fisheries modeling and how to deal with them"

Randall M. Peterman
School of Resource and Environmental Management (REM), Simon Fraser University, Burnaby, British Columbia, Canada

National Ecosystem Modeling Workshop II, Annapolis, Maryland, 25-27 August 2009


Outline

• Five sources of uncertainty
  - Problems they create
  - What scientists have done
• Adapting those approaches for ecosystem modelling
• Recommendations


My background

Flow diagram: single-species stock assessments → uncertainties considered → general risk-assessment methods → scientific advice, including risk communication → decision makers and stakeholders (risk management). The same chain is now being extended to multi-species ecosystem models (Impressive!!) and the uncertainties they must consider.


Purposes of ecosystem models (from NEMoW I)

1. Improve conceptual understanding

2. Provide broad strategic advice

3. Provide specific tactical advice

Uncertainties are pervasive ...


Sources of uncertainty

1. Natural variability
2. Observation error (bias and imprecision)
3. Structural complexity
   Result (1-3): parameter uncertainty
4. Outcome uncertainty (deviation from target)
   Result (1-4): imperfect forecasts of the system's dynamics
5. Inadequate communication among scientists, decision makers, and stakeholders
   Result (1-5): poorly informed decisions


Uncertainties → risks

• Economic risks (industry)
• Social risks (coastal communities)
• Biological risks (ecosystems)

Risk: the magnitude of a variable/event and the probability of that magnitude occurring


Sensitivity analyses across:

1. Which components to include
2. Structural forms of relationships
3. Parameter values
4. Management objectives
5. Environmental conditions
6. Management options

• Focus:
  - Which parts most affect management decisions?
  - Which parts are highest priority for more data?

(An illustrative sketch of such a factorial sensitivity analysis follows.)

2008 mutton snapper, U.S. South Atlantic & Gulf of Mexico

Figure: stock-status plot with an overfishing axis (F / F30%) and an overfished axis (SSB / SSBF30%).


Sources of uncertainty

1. Natural variability
2. Observation error
3. Unclear structure of fishery system
4. Outcome uncertainty
5. Inadequate communication


What scientists have done to deal with ...

1. Natural variability

1. Simulate it stochastically (see the sketch below)
2. Make parameters a function of age, size, density, ...
3. Include other components (static or dynamic):
   - Predators, prey, competitors
   - Bycatch/discards
   - Environmental variables
...
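As a concrete, hypothetical illustration of approach 1, the sketch below simulates a single stock with a Ricker spawner-recruit relationship and lognormal process error to represent natural variability; all parameter values are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not from the talk): stochastic
# Ricker spawner-recruit simulation with lognormal process error.

def simulate_ricker(a=1.5, b=0.001, sigma=0.6, h=0.3, n_years=100, s0=500.0, seed=1):
    """Simulate spawners under a Ricker model with process error and a fixed harvest rate h."""
    rng = np.random.default_rng(seed)
    spawners = np.empty(n_years)
    spawners[0] = s0
    for t in range(1, n_years):
        s = spawners[t - 1]
        # Ricker recruitment with multiplicative lognormal process error
        recruits = s * np.exp(a - b * s + rng.normal(0.0, sigma))
        spawners[t] = (1.0 - h) * recruits   # escapement after harvest
    return spawners

trajectory = simulate_ricker()
print(trajectory[:5])
```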


Sources of uncertainty

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication


What scientists have done to deal with ...

2. Observation error

1. Assume a % of total variance is due to observation error
2. Conduct sensitivity analyses
3. Use hierarchical models that "pool" information to help "average out" annual observation error (see the sketch below)
   - Jerome Fiechter et al., using hierarchical Bayesian models on NEMURO (NPZD-based)
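The pooling idea in approach 3 can be illustrated with a much simpler stand-in than the hierarchical Bayesian analyses cited here: an empirical, random-effects style shrinkage of noisy stock-specific estimates toward the across-stock mean. The numbers and variance values below are invented for illustration only.

```python
import numpy as np

# Minimal sketch (illustrative, not the cited hierarchical Bayesian analysis):
# random-effects style "pooling" across stocks, shrinking each noisy
# stock-specific estimate toward the across-stock mean.

stock_estimates = np.array([1.8, 0.4, 1.1, 2.3, 0.9])   # noisy per-stock estimates (invented)
obs_var = 0.30                                            # within-stock (observation) variance, assumed
among_var = max(stock_estimates.var(ddof=1) - obs_var, 0.0)  # crude among-stock variance

shrinkage = among_var / (among_var + obs_var)             # weight on each stock's own estimate
grand_mean = stock_estimates.mean()
pooled = grand_mean + shrinkage * (stock_estimates - grand_mean)

print(f"grand mean: {grand_mean:.2f}, shrinkage weight: {shrinkage:.2f}")
print("pooled estimates:", np.round(pooled, 2))
```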


Figure: pink salmon stocks, numbered from north (Alaska) to south (B.C., Washington); for each stock, g_i, the change in salmon productivity, loge(R/S), per °C increase in summer sea-surface temperature, estimated by separate single-stock analyses and by a multi-stock, mixed-effects model.

Mueter et al. (2002a)


2. Observation error ... (continued)

4. Separately estimate natural variation and observation error
   -- Errors-in-variables models
   -- State-space models
   -- Kalman filter

Example 1: tracking a nonstationary productivity parameter (Ricker a value)


Figure: simulated 100-year trajectories of the "true" productivity parameter used in the simulation test, including low, high, and decreasing scenarios.


Simulation test

Figure: productivity (Ricker a parameter) over 100 years: the "true" trajectory compared with estimates from the standard method and from the Kalman filter.

• Kalman filter with a random-walk system equation was best across all types of nonstationarity (see the sketch below)

Peterman et al. (2000)
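A minimal sketch of the Kalman-filter approach described above, assuming the observation equation log(R_t/S_t) = a_t − b·S_t + v_t and a random-walk system equation for a_t. The variances and the example data are illustrative assumptions, not values from Peterman et al. (2000).

```python
import numpy as np

# Minimal sketch: univariate Kalman filter with a random-walk system equation
# for a time-varying Ricker productivity parameter a_t.
# Observation: y_t = log(R_t / S_t) = a_t - b * S_t + v_t,  v_t ~ N(0, R_var)
# System:      a_t = a_{t-1} + w_t,                          w_t ~ N(0, Q_var)
# b, Q_var, R_var are assumed known here; in practice they are estimated.

def kalman_ricker_a(y, S, b, Q_var, R_var, a0=1.0, P0=1.0):
    """Return filtered estimates of a_t and their variances."""
    n = len(y)
    a_filt, P_filt = np.empty(n), np.empty(n)
    a_prev, P_prev = a0, P0
    for t in range(n):
        # Predict (random walk: mean unchanged, variance grows by Q_var)
        a_pred = a_prev
        P_pred = P_prev + Q_var
        # Update with observation y_t (predicted observation is a_pred - b*S_t)
        innov = y[t] - (a_pred - b * S[t])
        K = P_pred / (P_pred + R_var)        # Kalman gain
        a_prev = a_pred + K * innov
        P_prev = (1.0 - K) * P_pred
        a_filt[t], P_filt[t] = a_prev, P_prev
    return a_filt, P_filt

# Tiny synthetic example (invented numbers)
S = np.array([200.0, 350.0, 500.0, 420.0, 300.0])
y = np.log(np.array([600.0, 800.0, 900.0, 700.0, 550.0]) / S)
a_est, a_var = kalman_ricker_a(y, S, b=0.001, Q_var=0.01, R_var=0.2)
print(np.round(a_est, 2))
```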


2. Observation error ... (continued)

Example 2 of observation error and natural variation

Simplest possible model: the spawner-recruit relationship

Su and Peterman (2009, in prep.)
- Used an operating model to determine statistical properties of various parameter-estimation schemes:
  -- Bias
  -- Precision
  -- Coverage probabilities (accuracy of the estimated width of a parameter's probability interval)


Test performance of an estimator

Flow diagram: user-specified "true" underlying parameter values ("What if ...?") → operating model (a simulator to test methods) → generate "observed data" from natural variation and observation error → parameters estimated → compare "true" and estimated values; repeated for 200 trials. (A sketch of this procedure follows.)
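A compact sketch of this estimator-testing procedure, with invented "true" parameter values and error levels: generate data from an operating model with both process and observation error, refit a standard Ricker regression that ignores observation error, and summarize the relative bias over 200 trials.

```python
import numpy as np

# Minimal sketch (illustrative assumptions, not Su and Peterman's code):
# use an operating model to test a standard Ricker estimator.

rng = np.random.default_rng(42)
TRUE_A, TRUE_B = 2.0, 0.002          # "what if ...?" true parameter values
SIGMA_PROC, SIGMA_OBS = 0.4, 0.4     # process and observation error SDs
N_YEARS, N_TRIALS = 30, 200

def one_trial():
    # Operating model: generate true spawners/recruits with process error
    S_true = rng.uniform(100.0, 800.0, N_YEARS)
    R_true = S_true * np.exp(TRUE_A - TRUE_B * S_true +
                             rng.normal(0.0, SIGMA_PROC, N_YEARS))
    # Observation model: lognormal observation error on both series
    S_obs = S_true * np.exp(rng.normal(0.0, SIGMA_OBS, N_YEARS))
    R_obs = R_true * np.exp(rng.normal(0.0, SIGMA_OBS, N_YEARS))
    # Standard Ricker estimator: regression of log(R/S) on S,
    # ignoring observation error (that is the point of the test)
    slope, intercept = np.polyfit(S_obs, np.log(R_obs / S_obs), 1)
    return intercept                  # estimate of a

a_hats = np.array([one_trial() for _ in range(N_TRIALS)])
rel_bias = 100.0 * (a_hats.mean() - TRUE_A) / TRUE_A
print(f"% relative bias in a over {N_TRIALS} trials: {rel_bias:.1f}%")
```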


Figure: % relative bias in the Ricker a parameter (true a = 2) for four estimators (extended Kalman filter, errors-in-variables, Bayesian state-space, standard Ricker), across harvest-rate histories (low, variable, high) and across the proportion of total variance due to measurement error (0.25 vs. 0.75).

• Results also change with the true a


Results for 95% coverage probabilities

- Uncertainty in the estimated a is too narrow (overconfident) for all 4 estimation methods
  (Figure: estimated vs. actual probability distributions of the Ricker a parameter)
- Trade-off between bias and variance (Adkison 2009, Ecol. Applic. 19:198)


Recommendation

• Test parameter-estimation methods before applying them (Hilborn and Walters 1992)
• Use results with humility, caution
  - Parameter estimates for ecosystem models may inadvertently be quite biased!


Sources of uncertainty

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication


What scientists have done to deal with ...

3. Unclear structure of fishery system

1. Choose a single "best" model among alternatives
   1a. Informally
   1b. Formally, using a model selection criterion (AICc) (see the sketch below)

Caution!!
- Not appropriate for giving management advice
- Asymmetric loss functions
  (Walters and Martell 2004, p. 101)
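For reference, a small sketch of how AICc-based formal selection works, using hypothetical log-likelihoods and model names; the caution above still applies when the result is used for management advice.

```python
import math

# Minimal sketch (hypothetical models and numbers): small-sample Akaike
# information criterion (AICc) for comparing candidate model structures.
# log_lik is the maximized log-likelihood, k the number of parameters,
# n the number of observations.

def aicc(log_lik: float, k: int, n: int) -> float:
    aic = -2.0 * log_lik + 2.0 * k
    return aic + (2.0 * k * (k + 1)) / (n - k - 1)   # small-sample correction

# Three hypothetical model structures fit to the same n = 30 data points
candidates = {"Ricker": (-41.2, 3), "Ricker + covariate": (-39.8, 4), "Beverton-Holt": (-42.5, 3)}
scores = {name: aicc(ll, k, n=30) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
print(scores, "formally 'best':", best)   # but see the caution on this slide
```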


Asymmetric loss: Which case is preferred?

Figure: SSB/SSBmsy (roughly 0.2 to 1.0) for species A and B under Case 1 and Case 2.


Fraser River Early Stuart sockeye salmon: best "management-adjustment" model (H, T, Q, T+Q)

Figure: which model is chosen as "best" changes with the preference ratio (spawning objective favoured vs. harvest objective favoured), i.e., with whether the loss function is asymmetric toward spawning, symmetric, or asymmetric toward harvest.

Recommendation
• To develop appropriate indicators, ecosystem scientists should understand asymmetry in managers' objectives, especially given many species.

Cummings (2009)


What scientists have done to deal with ...

3. Unclear structure of fishery system

1. Choose a single "best" model among alternatives
...
   1c. Adaptive management experiment
       - Sainsbury et al. in Australia

More commonly, we have to consider a range of alternative models ...


3. Unclear structure of fishery system ... (cont'd.)

2. Retain multiple models; conduct sensitivity analyses
   2a. Analyze the models separately

Figure: Eastern Scotian Shelf cod (fishery closed in the mid-1990s); SSB (thousands of tonnes) estimated by VPA, stock-synthesis, and delay-difference models under different M values and F*1000 (R. Mohn 2009).


3. Unclear structure of fishery system ... (cont'd.)

2. Retain multiple models; conduct sensitivity analyses
   2a. Analyze separately
   2b. Combine predictions from alternative models
       - Unweighted model averaging
       - Weighted with AIC weights or posterior probabilities, then calculate expected values of indicators

• But weighting assumes managers use expected-value objectives
  - Many use mini-max objectives (i.e., choose the action with the lowest chance of a worst-case outcome) (see the sketch below)
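A small sketch contrasting the two summaries mentioned above, using invented AICc scores and model predictions: Akaike weights give an expected-value indicator, while a mini-max style summary looks at the probability of the worst-case outcome.

```python
import numpy as np

# Minimal sketch (hypothetical numbers): combine predictions from alternative
# models with AIC weights, then contrast an expected-value summary with a
# mini-max style summary (probability of the worst-case outcome).

aicc_scores = np.array([210.3, 212.1, 215.8])        # three alternative models (assumed)
ssb_pred    = np.array([0.90, 0.55, 0.20])           # each model's predicted SSB/SSBtarget under action A

delta = aicc_scores - aicc_scores.min()
weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()   # Akaike weights

expected_ssb = np.sum(weights * ssb_pred)             # expected-value indicator
limit_ref_point = 0.4
p_below_limit = np.sum(weights[ssb_pred < limit_ref_point])   # chance of worst-case outcome

print(f"Akaike weights: {np.round(weights, 2)}")
print(f"Expected SSB/SSBtarget: {expected_ssb:.2f}")
print(f"Probability SSB falls below the limit reference point: {p_below_limit:.2f}")
```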


Figure: probability distributions of SSB/SSBtarget under management action A, showing the limit reference point, the expected SSB (weighted average across models), and a low-probability worst-case outcome; under a mini-max objective the manager would choose the action with the lowest probability of that worst-case outcome.


Recommendation

• Ecosystem scientists should work iteratively with managers to find the most useful indicators to reflect management objectives.


3. Unclear structure of fishery system ... (cont'd.)

2. Retain multiple models; conduct sensitivity analyses
...
   2c. Evaluate alternative ecosystem assessment models by using an operating model to determine their statistical properties
       (e.g., Fulton et al. 2005 re: community indicators)


3. Unclear structure of fishery system ... (cont'd.)

2. Retain multiple models; conduct sensitivity analyses
...
   2d. Evaluate alternative ecosystem assessment models within closed-loop simulation (MSE) to determine robust management strategies across a range of operating models

Caution!!!! Elaborated upon later.


3. Unclear structure of fishery system ... (cont'd.)

Recommendation

• Ecosystem scientists should compare management advice from multiple models.

• Models are "sketches" of real systems, not mirrors

- Only essential features


Appropriate ecosystem model "sketches"?

ESAM, MRM, GADGET, SEAPODYM, EwE, Atlantis, ...?

• "A model should be as simple as possible, but no simpler than necessary" [and no more complex either!]
  - Morgan and Henrion (1990)

Appropriate model complexity depends on:
- Type of questions/advice (Plagányi 2007)
- Knowledge and data

Figure: effectiveness and predictive power (low to high) vs. model complexity (low to high); the "adaptive radiation" of ecosystem models (Fulton et al. 2003, others).


Recommendation:

How ecosystem scientists can deal with structural uncertainty ... (continued)

• Build multiple (nested) models of a given system
  - Which model is best for the questions?
  - Yodzis (1998) could omit 44% of interactions
• Conduct closed-loop management strategy evaluations (MSEs) across a wide range of hypothesized operating models of the aquatic ecosystem
  - "Best practice"
    -- Plagányi (2007)
    -- Tivoli meeting (FAO 2008)
    -- NEMoW I report (Townsend et al. 2008)


Sources of uncertainty

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication


What scientists have done to deal with ...

4. Outcome uncertainty

1. Empirically estimate it (historical deviations from targets)


"Outcome uncertainty"

Figure: Early Stuart sockeye salmon, B.C. (1986-2003); realized vs. target harvest rate plotted against the forecast of adults (millions). Outcome uncertainty here was both imprecise and biased. Holt and Peterman (2006)


2. Add outcome uncertainty as a stochastic process (see the sketch below)

3. Conduct sensitivity analyses on the nature of outcome uncertainty
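A minimal sketch of approach 2, treating outcome uncertainty as a stochastic (and possibly biased) deviation of the realized harvest rate from the target; the standard deviation and bias values are assumptions for illustration.

```python
import numpy as np

# Minimal sketch (assumed parameter values): represent outcome uncertainty as
# a stochastic deviation of the realized harvest rate from the target rate,
# either unbiased or biased, as contrasted on the following slides.

rng = np.random.default_rng(0)

def realized_harvest_rate(target, sd=0.1, bias=0.0):
    """Realized rate = target + bias + noise, truncated to [0, 1]."""
    realized = target + bias + rng.normal(0.0, sd)
    return float(np.clip(realized, 0.0, 1.0))

target = 0.6
unbiased = [realized_harvest_rate(target) for _ in range(5)]           # imprecise, unbiased
biased   = [realized_harvest_rate(target, bias=0.1) for _ in range(5)] # imprecise and biased
print(unbiased, biased)
```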


MSE with CLIM2, a 15-population salmon model

Figure: relative average catch (y-axis, about 1.5 to 2.0) for five spawner-recruit models (Ricker, Ricker AR(1), Kalman filter, non-spatial HBM, distance-based HBM) across three types of outcome uncertainty: none; imprecise and unbiased; imprecise and biased. Annotations mark a 6% difference among models and a 24% decrease in catch under imprecise and biased outcome uncertainty.

(Dorner et al. 2009, in press)


Sources of uncertainty

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication


What scientists have done to deal with ...

5. Inadequate communication

1. Work iteratively with stakeholders and decision makers
   - Clarify management objectives and indicators
     -- Maximize expected value, mini-max, or ...?
2. Conduct sensitivity analyses on management objectives


5. Inadequate communication ... (continued)

Recommendation:

3. Show indicators with uncertainties
   - Use cognitive psychologists' findings about how people think about uncertainties and risks
     -- Cumulative probability distributions
     -- Frequency format, not decimal probability format
        (due to six interpretations of "probability", only one of which is "chance")


"Chance" of an outcome for a given set of management regulations:

Probability format
"There is a probability of 0.2 that SSB will drop below its limit reference point."

Frequency format
"In two out of every 10 situations like this, SSB will drop below its limit reference point."

Gerd Gigerenzer et al.


5. Inadequate communication ... (continued)

Recommendation

4. Creatively display multiple indicators, and trade-offs among them


Radar plots, kite diagrams

Figure: radar/kite plots comparing management scenarios (e.g., Scenario 1 vs. Scenario 4) across ecosystem indicators: target species, bycatch, microfauna, sharks, habitat, TEP species (marine mammals, seabirds), pelagic:demersal ratio, piscivore:planktivore ratio, and biomass size spectra (BSS).

(Fulton, Smith, and Smith 2007)


AMOEBA plots for the North Sea

Figure: AMOEBA plot; Bpa = precautionary biomass. Collie et al. (2003)


Yukon R. fall chum salmon

Figure: average spawners (1000s) as a function of target spawners (100 to 700 thousand) and the harvest rate on the run exceeding target spawners. Collie et al. (in prep.)


Yukon R. fall chum salmon (multi-panel figure): average spawners, average subsistence catch, and average commercial catch (all in 1000s), plus % of years the commercial fishery is closed, each as a function of target spawners (in 1000s) and the harvest rate on the run exceeding target spawners.


Vismon software (in prep.)

Figure: display of trade-offs among proportion harvested, spawning target (1000s) of chum salmon, average commercial catch (1000s), and average subsistence catch (1000s). Booshehrian, Moeller, et al.


Comment on tradeoffs

• Remind managers and stakeholders:
  - Apply the same standards to economists/social scientists and ecologists!!!

Figure: stated vs. actual uncertainty (low to high) for ecological indicators and for socio-economic indicators.


Recommendations to deal with inadequate communication

1. Formal training: scientists, decision makers, and stakeholders

2. "User studies" about the effectiveness of communication methods


Recommendations to deal with inadequate communication ...

3. Develop interactive, hierarchical information systems to show:
   - Management options
   - Consequences
   - Trade-offs
   - Uncertainties

4. Develop communications strategies like the Intergovernmental Panel on Climate Change (IPCC):

IPCC

• Advises decision makers and stakeholders
• Communication challenges:
  - Complexity
  - Uncertainty
  - Risks
  - Credibility


How IPCC solves these communication challenges

1. Multi-level information systems: IPCC (2007) reports
   a. Aim at multiple audiences
   b. Hierarchical
   c. Numerous footnotes (~ hypertext links)
   d. Diverse graphics


IPCC (2007) reports


How IPCC solves these communication challenges ...

2. Standardized format for describing uncertainties associated with "essential statements":
   - Chance of an outcome
   - Confidence in that estimated chance of that outcome
   - "... very high confidence that there is a high chance of ..."
   - "We have medium confidence that ..."

• Similar to recent Marine Stewardship Council guidelines


Sources of uncertainty

1. Natural variability

2. Observation error

3. Unclear structure of fishery system

4. Outcome uncertainty

5. Inadequate communication


What scientists have done to deal with ...

Combination of the first 4 sources of uncertainty

• Simulations of entire fishery systems (a skeleton sketch follows)
  - Closed-loop simulations (Walters 1986)
  - Management strategy evaluations (MSEs) (Punt and Butterworth, early 1990s)
  - Which management procedure is most robust to uncertainties?
    -- A single management procedure includes:
       --- Data collection method
       --- Stock or ecosystem assessment model
       --- State-dependent harvest rule
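A skeleton of such a closed-loop (MSE) simulation, reduced to a single stock with a surplus-production operating model; every function and parameter value below is a simplifying assumption for illustration, not any particular published framework.

```python
import numpy as np

# Minimal closed-loop (MSE) skeleton, a sketch under simplifying assumptions.
# One "management procedure" = data collection + assessment model +
# state-dependent harvest rule, evaluated against an operating model that
# represents the hypothesized true system.

rng = np.random.default_rng(7)

def operating_model_step(biomass, harvest, r=0.4, K=1000.0, sigma=0.2):
    """Surplus-production dynamics with process error (the 'true' system)."""
    surplus = r * biomass * (1.0 - biomass / K)
    return max(1.0, (biomass + surplus - harvest) * np.exp(rng.normal(0.0, sigma)))

def collect_data(biomass, obs_sd=0.3):
    """Survey index with lognormal observation error."""
    return biomass * np.exp(rng.normal(0.0, obs_sd))

def assessment(index_series):
    """Toy 'assessment': smoothed recent survey index as the biomass estimate."""
    return float(np.mean(index_series[-3:]))

def harvest_rule(est_biomass, target_rate=0.15, b_lim=200.0):
    """State-dependent rule: stop fishing below a limit biomass."""
    return target_rate * est_biomass if est_biomass > b_lim else 0.0

def run_mse(n_years=50, b0=600.0):
    biomass, index_series, catches = b0, [], []
    for _ in range(n_years):
        index_series.append(collect_data(biomass))
        quota = harvest_rule(assessment(index_series))
        realized_catch = quota * np.exp(rng.normal(0.0, 0.1))   # outcome uncertainty
        biomass = operating_model_step(biomass, realized_catch)
        catches.append(realized_catch)
    return np.mean(catches), biomass

print(run_mse())   # (average catch, final biomass) for one simulated trajectory
```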


Closed-loop simulation or MSE: ~ a flight simulator

Figure (analogy): uncertainties such as unusual weather, equipment failure, and random events → robust procedures for responding to unexpected events.


Diagram: closed-loop simulation (MSE) of a fishery system

• An operating model (such as Atlantis) stands in for the natural aquatic system, with natural variability
• Sampling and data collection ("what we know" vs. "what we don't know"), with observation error
• An ecosystem assessment model (ESAM, MRM, EwE, GADGET, ...), with structural uncertainty
• Decision makers (harvest rules) and stakeholders, with management objectives and with inadequate communication as a further source of uncertainty
• Fishing regulations (harvest quotas, closed areas, ...) and harvesting, with outcome uncertainty

Entire diagram = closed-loop simulation (MSE)

Peterman (2004)


MSEs include iterating across all major hypotheses about the operating model

Result of an MSE:
Identifies the relative merits of management procedures for meeting management objectives


Conducting MSEs of ecosystem models

Caution: Substantial challenges ahead!

1. Characterizing the operating model
   - Range of alternative hypotheses
   - Reliability of predictions from ecosystem models
   - Nonstationary environment (what if ...?)

2. Simulating the ecosystem assessment process based on "observed" data using GADGET, an ESAM, ...
   - Automation of the assessment process

3. Engaging scientists with decision makers, stakeholders


Conducting MSEs of ecosystem models

4. Simulating outcome uncertainty (deviation from target)
   - Lack of data

5. Simulating the state-dependent decision-making process
   - Lack of clear operational ecosystem objectives and indicators
   - Complex objectives: optimize for one, make tradeoffs for others (Smith et al., Mapstone et al., and others in Dec. 2008 Fisheries Research)

plus ...


Can indicators of ecosystems from PCAs be used as measures of system state for input to harvest rules? (A small sketch follows.)

Figure: ordination of ecosystem states (e.g., categories A, B, C) on principal components PC 1 and PC 2, with ecosystem status expressed as similarity to a PCA category. (Link et al. 2002)
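One way the question on this slide could be explored, sketched with synthetic data: summarize several community indicators with a PCA and feed the first principal-component score into a hypothetical state-dependent harvest rule. Nothing below comes from Link et al. (2002); it is purely illustrative.

```python
import numpy as np

# Minimal sketch (synthetic data, illustrative only): PCA of community
# indicators, with the first principal-component score used as the
# "ecosystem state" input to a hypothetical harvest rule.

rng = np.random.default_rng(3)
# Rows = years, columns = ecosystem indicators (e.g., biomass ratios, size-spectrum slope)
indicators = rng.normal(size=(25, 4))

# Standardize, then PCA via singular value decomposition
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0)
_, _, vt = np.linalg.svd(z, full_matrices=False)
pc1_scores = z @ vt[0]            # ecosystem "state" along the first PC

def harvest_rate_from_state(pc1, base_rate=0.2, slope=0.05):
    """Hypothetical rule: lower the harvest rate as the PC1 score increases
    (assumed here to indicate a less desirable ecosystem state)."""
    return float(np.clip(base_rate - slope * pc1, 0.0, base_rate))

print(harvest_rate_from_state(pc1_scores[-1]))   # rate implied by the most recent state
```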


Conducting MSEs of ecosystem models

6. Interpreting results
   - Across multiple indicators and sensitivity analyses

7. Computations
   - CPU time


Recommendations for next steps for ecosystem models

1. Need standards for evaluating reliability of models

2. If fitting ecosystem models to data, use operating models to check adequacy of estimation methods

3. Evaluate how much difference will be made by proposed "improvements" to ecosystem models

(more complex not necessarily better)

4. Clarify operational management objectives and indicators that reflect ecosystem concerns

5. Analyze multiple models


Recommendations for next steps for ecosystem models

6. If using the MSE approach (Tivoli, Plagányi, NEMoW I):
   - Start simply (ESAMs, MRMs) for assessment models (Butterworth and Plagányi 2004)
   - Choose an operating model (e.g., Atlantis)
   - Build experience
   - Determine the feasibility of MSEs for evaluating more complex assessment models (GADGET, EwE, ...)

7. Add to "Best practices":
   - A standardized protocol for determining the performance of multiple assessment models for a given aquatic ecosystem
   - Training/gaming workshops to improve communication


Reminders

• Sensitivity analyses should focus on finding which components cause changes in management advice.
• We probably underestimate the magnitude of uncertainty in estimates of parameters and state variables.
• C.S. Holling: "The domain of our ignorance is larger than the domain of our knowledge."

