AFRL: Quantifying Logistics Capabilities (PMD-0302) Principal Investigators: Heather Nachtmann, Justin Chimka, Manuel Rossetti Research Assistants: Alex Andelman, David Rieske AFRL POCs: Edward Boyle, Stephanie Swindler Northrop Grumman: John Jacobs
Project Background • Oliver et al. (2002) identified logistics and operational factors associated with mission capability (MC) • Used correlation and regression analysis of quarterly data from FY93-FY00 to explain and predict F-16 MC rates • Cannibalization, funding levels, and personnel skill levels were found to be significant factors • Findings led to recognition that the USAF does not have a metric to relate maintenance (MX) personnel skill level to operational readiness
Project Objectives • Objectives of this project build on Oliver’s work • Investigate the relationship between mission capability and personnel skill level • Develop a metric of MC rate as a function of MX personnel skill level • Identify standards for this metric based on AF objectives for MC rate • Explore relationships between MX personnel skill level and multiple utilization, reliability, and maintainability performance measures
Research Methodology • Four analysis tasks • Variables definition • Correlation analysis • Candidate regression model construction • Final model selection
Correlation Analysis • Sample correlation coefficients between dependent and independent variables were calculated • Interaction was investigated by multiplying strongly correlated independent variables together • Independent variables and interaction terms strongly correlated (r ≥ 0.80) with dependent variables were saved for further model development
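A minimal sketch of this screening step, assuming the quarterly data sit in a pandas DataFrame; the column name "mc_rate" and the function name are hypothetical, while the r ≥ 0.80 threshold comes from the slide (absolute correlation is used here as an interpretive choice).

```python
import pandas as pd
from itertools import combinations

def screen_variables(df: pd.DataFrame, dependent: str = "mc_rate",
                     threshold: float = 0.80) -> pd.Series:
    """Keep independent variables (and pairwise interaction terms) whose
    sample correlation with the dependent variable is at least `threshold`."""
    independents = [c for c in df.columns if c != dependent]
    work = df.copy()

    # Build interaction terms for pairs of strongly correlated independent variables.
    for a, b in combinations(independents, 2):
        if abs(work[a].corr(work[b])) >= threshold:
            work[f"{a}*{b}"] = work[a] * work[b]

    # Retain candidates strongly correlated with the dependent variable.
    candidates = [c for c in work.columns if c != dependent]
    corr = work[candidates].corrwith(work[dependent])
    return corr[corr.abs() >= threshold].sort_values(ascending=False)
```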
Efficiency Analysis • Models with Adjusted R² < 0.64 were eliminated • Efficient frontiers were found by plotting Adjusted R² versus the number of model inputs • This decreased the number of candidate models from 82 to 18
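A sketch of the efficient-frontier screen, assuming each candidate model is summarized as (name, number of inputs, adjusted R²); the dominance definition below is an interpretation of "efficient frontier", not taken from the slides.

```python
from typing import List, Tuple

def efficient_frontier(models: List[Tuple[str, int, float]],
                       min_adj_r2: float = 0.64) -> List[Tuple[str, int, float]]:
    """Drop models below the adjusted-R^2 cutoff, then keep only models not
    dominated by another model with no more inputs and no lower adjusted R^2."""
    kept = [m for m in models if m[2] >= min_adj_r2]
    frontier = []
    for name, k, r2 in kept:
        dominated = any((k2 <= k and r2_2 >= r2) and (k2 < k or r2_2 > r2)
                        for _, k2, r2_2 in kept)
        if not dominated:
            frontier.append((name, k, r2))
    return frontier
```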
Assumption Analysis • The following assumptions were checked for each of the remaining models • Residuals are normally distributed • Residuals have an expected value of zero • Residuals have constant variance • No serial correlation among residuals • Models for which these assumptions did not hold were eliminated
Assumption Analysis, continued • Normality of residuals • Ryan-Joiner test assuming normality • Reject assumption for p < 0.05 • Zero mean of the errors • One-sample t-test assuming μ = 0 • Reject assumption for p < 0.05
Assumption Analysis, continued • Constant error variance • Two-sample t-test assuming μ1 = μ2 • Reject assumption for p < 0.05 • Uncorrelated errors • Correlation analysis • Models with strong correlation (r ≥ 0.80) between subsequent errors were removed
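A sketch of these residual checks, assuming `residuals` is a 1-D NumPy array ordered in time. The slides cite the Ryan-Joiner test (a Minitab correlation-based normality test); the related Shapiro-Wilk test is substituted here since SciPy does not provide Ryan-Joiner, and the two-sample comparison of the series halves is a simple proxy for the slide's μ1 = μ2 test.

```python
import numpy as np
from scipy import stats

def check_residual_assumptions(residuals: np.ndarray, alpha: float = 0.05,
                               corr_cutoff: float = 0.80) -> dict:
    half = len(residuals) // 2

    # Normality of residuals (Shapiro-Wilk as a stand-in for Ryan-Joiner).
    normal_ok = stats.shapiro(residuals).pvalue >= alpha

    # Zero mean of the errors: one-sample t-test of mu = 0.
    zero_mean_ok = stats.ttest_1samp(residuals, 0.0).pvalue >= alpha

    # Constant error variance: two-sample t-test comparing the two halves of the series.
    const_var_ok = stats.ttest_ind(residuals[:half], residuals[half:]).pvalue >= alpha

    # Uncorrelated errors: lag-1 correlation between subsequent residuals.
    lag1_r = np.corrcoef(residuals[:-1], residuals[1:])[0, 1]
    uncorrelated_ok = abs(lag1_r) < corr_cutoff

    return {"normality": normal_ok, "zero_mean": zero_mean_ok,
            "constant_variance": const_var_ok, "no_serial_correlation": uncorrelated_ok}
```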
Final Model Selection • Given efficient models requiring reasonable assumptions, models without interaction terms were preferred • The recommended model of MC Rate as a function of personnel variables is the following • MC Rate = 0.347 + 1.27x7 + 4.89x9 • x7 = % Level 7 Maintainers • x9 = % Level 9 Maintainers • R² = 0.820, Adjusted R² = 0.807
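The recommended model translates directly into a one-line metric function; the coefficients come from the slide, and the percentages are expressed as decimal fractions (e.g., 27% Level 7 maintainers is 0.27).

```python
def mc_rate(pct_level_7: float, pct_level_9: float) -> float:
    """Predicted mission capability rate as a function of maintainer skill mix."""
    return 0.347 + 1.27 * pct_level_7 + 4.89 * pct_level_9

# Example: 27% Level 7 and 3.25% Level 9 maintainers give about 0.8488,
# matching the 84.88% entry in the metric table shown below.
print(f"{mc_rate(0.27, 0.0325):.4f}")  # 0.8488
```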
Metric and Standard Establishment • Model recommended for MC Rate is also recommended as the MC Rate metric • Standard MC rate for FY00 was 84 percent
Representation of the Metric Model
% Level 7 \ % Level 9     2.25      2.50      2.75      3.00      3.25
23                       74.91%    76.14%    77.36%    78.58%    79.80%
24                       76.18%    77.41%    78.63%    79.85%    81.07%
25                       77.45%    78.68%    79.90%    81.12%    82.34%
26                       78.72%    79.95%    81.17%    82.39%    83.61%
27                       79.99%    81.22%    82.44%    83.66%    84.88%
Expected MC rates between observed extremes of the independent variables
Summary • Established a metric and standard for MC rate using statistical models • Developed representations of the dependent variable models as functions of the independent personnel variables • An MX skill level software tool was created to make the models easier to implement
AFRL: Multi-State Selective Maintenance Decisions (MM-0302) Principal Investigator: C. Richard Cassady, Ph.D., P.E. Co-Principal Investigators: Edward A. Pohl, Ph.D.; Scott J. Mason, Ph.D., P.E. Research Assistant: Thomas Yeung
Project Motivation • All military organizations depend on the reliable performance of repairable systems for the successful completion of missions. • Due to limitations in maintenance resources, a maintenance manager must decide how to allocate available resources.
Project Motivation (cont) • Selective maintenance is defined as the process of identifying the subset of maintenance activities to perform from a set of desired maintenance actions. • Selective maintenance models formulated to date are based on the assumption of binary (functioning or failed) component, subsystem and system status.
Project Objective • To develop a modeling-based methodology for managing selective maintenance decisions when multiple (more than two) system states are possible
Outline • Scenario definition • Decision-making • Solution by total enumeration • Heuristic solution • A dispatching rule • Experimental design • Experimental results
Scenario Definition • Set of q independent and identical systems • Each system is composed of m independent subsystems • Motivating example (m = 41) • Subsystems extracted from AFI 21-103_ACCSUP1 (MESL) • F-16A/B/C/D MESL used because of our experience with the F-16 at Hill AFB
Scenario Definition (cont) • All systems idle and available for maintenance • State of system i: a_i = (a_i1, a_i2, …, a_im) • a_ij denotes the amount of time required to bring subsystem j of system i into a properly operating condition
Scenario Definition (cont) • Some maintenance actions require spare parts or other resources that are not readily available. • The ready time of subsystem j in system i, r_ij, is the time at which these resources are available and maintenance on the subsystem can begin. • r_i = (r_i1, r_i2, …, r_im)
Scenario Definition (cont) • n future missions planned (n ≤ q) • Mission k requires some subset of the subsystems to be operational • s_k = (s_k1, s_k2, …, s_km)
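A sketch of how this scenario data might be represented, using simple NumPy arrays; the array names, shapes, and the randomly generated values are illustrative only and are not taken from the project's software.

```python
import numpy as np

m = 41   # subsystems per system (from the F-16 MESL example)
q = 3    # systems available for maintenance
n = 3    # planned missions (n <= q)

# a[i, j]: time required to bring subsystem j of system i into proper operation
# (zero means the subsystem already operates properly).
a = np.random.default_rng(0).uniform(0.0, 8.0, size=(q, m))

# r[i, j]: ready time of subsystem j in system i, i.e., when spare parts and
# other resources become available so maintenance can begin.
r = np.random.default_rng(1).uniform(0.0, 4.0, size=(q, m))

# s[k, j] = 1 if mission k requires subsystem j to be operational, else 0.
s = np.random.default_rng(2).integers(0, 2, size=(n, m))
```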
Scenario Definition (cont) • Motivating example (types of missions) • FSL – Full System List • ADC – Air Defense, Conventional • ASC – Air to Surface, Conventional • ASY – Air Superiority • ASN – Air to Surface, Nuclear • DSP – Defense Suppression • TNG – Training • TST – Testing
Decision-Making • Which system should be assigned to each mission?
Decision-Making (cont) Every mission gets a system. No system gets more than one mission.
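The constraint equations on this slide did not survive extraction. As a sketch, they can be restated with an assumed binary variable x_ik (equal to 1 if system i is assigned to mission k); this notation is illustrative, not confirmed by the original slides.

```latex
% Sketch of the assignment constraints (x_{ik} is an assumed binary variable).
\sum_{i=1}^{q} x_{ik} = 1 \quad \forall k      % every mission gets a system
\qquad
\sum_{k=1}^{n} x_{ik} \le 1 \quad \forall i    % no system gets more than one mission
```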
Decision-Making (cont) • Total time required for maintenance related to mission k • Ready time for maintenance related to mission k
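The expressions for these quantities were also lost in extraction. A plausible formalization, assuming mission k is assigned to system i and reusing the a_ij, r_ij, and s_kj notation defined earlier (an interpretation, not the slide's exact formulas):

```latex
% Assumed formalization of the mission-level maintenance quantities.
T_{ik} = \sum_{j=1}^{m} s_{kj}\, a_{ij}        % total maintenance time for mission k on system i
\qquad
R_{ik} = \max\{\, r_{ij} : s_{kj} = 1 \,\}     % ready time: all required resources available
```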
Decision-Making (cont) • Once the assignments are made, maintenance crews must perform the maintenance. • γ = # of crews • We assume that a crew: • Works on no more than one system at a time • Works on a system only after it is “ready” • Works on a system continuously until all maintenance is finished
Decision-Making (cont) • For each mission, when does maintenance begin and by which crew is maintenance performed?
Decision-Making (cont) Every mission gets a crew. We cannot start maintenance before we are ready.
Decision-Making (cont) A crew cannot work on two systems at the same time.
Decision-Making (cont) • Completion time of maintenance for mission k
Decision-Making (cont) • w_k = importance (weight) of mission k • Larger weight implies more importance • Objective is to minimize total weighted completion time of all maintenance
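In formula form, using C_k to denote the completion time of maintenance for mission k (the symbol is assumed; the objective itself is a direct restatement of the slide):

```latex
\min \sum_{k=1}^{n} w_k \, C_k
```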
Decision-Making (cont) • The full optimization model is a binary programming problem with nonlinearities in both the objective function and several constraints.
Solution by Total Enumeration • Procedure • Enumerates all possible assignments • Enumerates all possible schedules for each assignment • 10^(2nγ) iterations required to enumerate all solutions • 3 missions, 3 systems, 2 crews = one trillion iterations
Solution by Total Enumeration (cont) • One trillion iterations requires weeks to complete. • Computation time is not practical for even small instances.
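A rough back-of-the-envelope check, assuming on the order of one million enumeration iterations per second (an assumed rate, not a figure from the study):

```python
# Rough runtime estimate for total enumeration of the small instance above.
iterations = 10 ** 12          # 3 missions, 3 systems, 2 crews
rate_per_second = 1_000_000    # assumed enumeration speed
days = iterations / rate_per_second / 86_400
print(f"{days:.1f} days")      # about 11.6 days, i.e., weeks of computation
```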
Heuristic Solution • The nonlinearities prevent the problem from being solved directly by most commercial solvers. • The problem was decomposed into two linear subproblems: • Assignment problem • Scheduling problem
Heuristic Solution (cont) • For each system/mission combination, the following ratio is computed: • Assignments are made based on this ratio in descending order. • This computation takes a fraction of a second.
Heuristic Solution (cont) • Heuristic solution to the assignment problem is used as an input for the scheduling problem. • The optimal solution for the scheduling problem is obtained using a commercial solver.
A Dispatching Rule • We also considered a simplified version of the heuristic that does not require the commercial solver. • This dispatching rule is designed to be much simpler computationally than the heuristic approach.
A Dispatching Rule • For each system/mission combination, the following ratio is computed: • Missions are simply “dispatched” or scheduled based on this ratio in descending order.
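The ratio itself was shown as an image and is not recoverable from this transcript; the sketch below therefore assumes a weighted-shortest-processing-time style ratio (mission weight divided by required maintenance time) purely to illustrate dispatching in descending ratio order.

```python
import numpy as np

def dispatch(a, r, s, w):
    """Greedily pair systems with missions in descending ratio order.
    a: (q, m) repair times, r: (q, m) ready times,
    s: (n, m) mission subsystem requirements, w: (n,) mission weights."""
    w = np.asarray(w, dtype=float)
    q, n = a.shape[0], s.shape[0]

    # T[k, i]: total maintenance time if system i is assigned to mission k.
    T = s @ a.T
    ratios = w[:, None] / np.maximum(T, 1e-9)   # assumed weight/time ratio

    assignment, used_systems, used_missions = {}, set(), set()
    for k, i in sorted(np.ndindex(n, q), key=lambda ki: -ratios[ki]):
        if k not in used_missions and i not in used_systems:
            assignment[k] = i
            used_missions.add(k)
            used_systems.add(i)
    return assignment   # mission -> system, in dispatch order
```

Crew scheduling would then follow the same descending-ratio order, for example releasing each job to the first available crew once its ready time has passed; that step is omitted here for brevity.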
Experimental Design • Realistic problem instances of the multi-state selective maintenance problem were generated. • Both the heuristic/optimization and dispatching rule approaches were tested for their performance in terms of: • Solution quality • Computation time
Experimental Design (cont) • The F-16 is our motivating example. • The numerical examples are evaluated at the squadron level (q = 24, n = 24). • All instances have six identical crews available for maintenance at any given time.