
AFRL: Quantifying Logistics Capabilities (PMD-0302)





Presentation Transcript

  1. AFRL: Quantifying Logistics Capabilities (PMD-0302) Principal Investigators: Heather Nachtmann, Justin Chimka, Manuel Rossetti. Research Assistants: Alex Andelman, David Rieske. AFRL POCs: Edward Boyle, Stephanie Swindler. Northrop Grumman: John Jacobs

  2. Project Background • Oliver et al. (2002) identified logistic and operational factors associated with mission capability (MC) • Used correlation and regression analysis of quarterly data from FY93-FY00 to explain and predict F-16 MC rates • Cannibalization, funding levels, and personnel skill levels were found to be significant factors • Findings led to recognition that the USAF does not have a metric to relate maintenance (MX) personnel skill level to operational readiness

  3. Project Objectives • Objectives of this project build on Oliver’s work • Investigate the relationship between mission capability and personnel skill level • Develop a metric of MC rate as a function of MX personnel skill level • Identify standards for this metric based on AF objectives for MC rate • Explore relationships between MX personnel skill level and multiple utilization, reliability, and maintainability performance measures

  4. Research Methodology • Four analysis tasks • Variables definition • Correlation analysis • Candidate regression model construction • Final model selection

  5. Variable Definition

  6. Variable Definition, continued

  7. Correlation Analysis • Sample correlation coefficients between dependent and independent variables were calculated • Interaction was investigated by multiplying pairs of strongly correlated independent variables • Independent variables and interaction terms strongly correlated (r ≥ 0.80) with dependent variables were saved for further model development
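The screening step described on this slide can be sketched in Python. This is an illustrative reconstruction, not the study's code: the function name, the synthetic data, and the use of `numpy.corrcoef` are assumptions; only the r ≥ 0.80 cutoff and the interaction-by-multiplication idea come from the slide.

```python
import numpy as np

def screen_predictors(y, X, names, threshold=0.80):
    """Keep predictors (and pairwise interaction terms) whose sample
    correlation with the dependent variable y meets the threshold."""
    kept = {}
    # Screen the original independent variables.
    for j, name in enumerate(names):
        r = np.corrcoef(y, X[:, j])[0, 1]
        if abs(r) >= threshold:
            kept[name] = r
    # Form interaction terms by multiplying strongly inter-correlated
    # predictor pairs, then screen those against y as well.
    for i in range(X.shape[1]):
        for j in range(i + 1, X.shape[1]):
            if abs(np.corrcoef(X[:, i], X[:, j])[0, 1]) >= threshold:
                interaction = X[:, i] * X[:, j]
                r = np.corrcoef(y, interaction)[0, 1]
                if abs(r) >= threshold:
                    kept[f"{names[i]}*{names[j]}"] = r
    return kept
```

The survivors of this screen would then feed the candidate regression models on the following slides.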

  8. Correlation Analysis Results

  9. Correlation Analysis Results, continued

  10. Candidate Regression Model Construction

  11. Candidate Model Results

  12. Candidate Model Results, continued

  13. Efficiency Analysis • Models with Adjusted R2 < 0.64 were eliminated • Efficient frontiers were found by plotting Adjusted R2 against the number of model inputs • These two steps reduced the number of candidate models from 82 to 18

  14. Efficiency Analysis, continued

  15. Assumption Analysis • The following assumptions were checked for each of the remaining models • Residuals have the normal distribution • Residuals have an expected value of zero • Residuals have a constant variance • No serial correlation among residuals • Models were eliminated where assumptions were unfounded

  16. Assumption Analysis, continued • Normality of residuals • Ryan-Joiner test assuming normality • Reject assumption for p < 0.05 • Zero mean of the errors • One-sample t-test assuming μ = 0 • Reject assumption for p < 0.05

  17. Assumption Analysis, continued • Constant error variance • Two-sample t-test assuming μ1 = μ2 • Reject assumption for p < 0.05 • Uncorrelated errors • Correlation analysis • Models with strong correlation (r ≥ 0.80) between subsequent errors were removed
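The four residual checks on the last two slides can be sketched with `scipy.stats`. This is a hedged approximation: SciPy has no Ryan-Joiner test, so the Shapiro-Wilk test stands in for it, and the split-in-half two-sample comparison is an assumption about how the study applied its t-test.

```python
import numpy as np
from scipy import stats

def check_residual_assumptions(resid, alpha=0.05, r_max=0.80):
    """Return pass/fail flags for the four residual assumptions.
    Shapiro-Wilk substitutes here for the study's Ryan-Joiner test."""
    resid = np.asarray(resid, dtype=float)
    half = len(resid) // 2
    results = {}
    # Normality of residuals (reject for p < alpha)
    results["normal"] = stats.shapiro(resid).pvalue >= alpha
    # Zero mean: one-sample t-test of H0: mu = 0
    results["zero_mean"] = stats.ttest_1samp(resid, 0.0).pvalue >= alpha
    # Constant variance: two-sample t-test between the two halves,
    # mirroring the slide's mu1 = mu2 comparison
    results["constant_variance"] = (
        stats.ttest_ind(resid[:half], resid[half:]).pvalue >= alpha)
    # Serial correlation: lag-1 sample correlation below the r >= 0.80 cutoff
    r1 = np.corrcoef(resid[:-1], resid[1:])[0, 1]
    results["uncorrelated"] = abs(r1) < r_max
    return results
```

Any model whose residuals fail a flag would be dropped, per slide 15.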

  18. Final Model Selection • Given efficient models requiring reasonable assumptions, models without interaction terms were preferred • The recommended model of MC Rate as a function of personnel variables is the following • MC Rate = 0.347 + 1.27x7 + 4.89x9 • x7 = % Level 7 Maintainers • x9 = % Level 9 Maintainers • R2 = 0.820, Adjusted R2 = 0.807
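The recommended model is simple enough to evaluate directly. One assumption, inferred from the table on slide 20, is that the skill-level percentages enter the model as fractions (23% → 0.23):

```python
def mc_rate(pct_level7, pct_level9):
    """Predicted mission-capability rate from the recommended model,
    MC Rate = 0.347 + 1.27*x7 + 4.89*x9, with maintainer skill-level
    percentages expressed as fractions (e.g. 23% -> 0.23)."""
    return 0.347 + 1.27 * pct_level7 + 4.89 * pct_level9
```

For example, `mc_rate(0.23, 0.0225)` gives 0.7491, matching the 74.91% corner of the table on slide 20.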

  19. Metric and Standard Establishment • Model recommended for MC Rate is also recommended as the MC Rate metric • Standard MC rate for FY00 was 84 percent

  20. Representation of the Metric Model

  Expected MC rates between observed extremes of the independent variables:

  % Level 7 \ % Level 9    2.25     2.50     2.75     3.00     3.25
         23               74.91%   76.14%   77.36%   78.58%   79.80%
         24               76.18%   77.41%   78.63%   79.85%   81.07%
         25               77.45%   78.68%   79.90%   81.12%   82.34%
         26               78.72%   79.95%   81.17%   82.39%   83.61%
         27               79.99%   81.22%   82.44%   83.66%   84.88%

  21. Summary • Established metric and standard for MC rate using statistical models • Developed representations of dependent variable models as functions of independent personnel variables • MX skill level software tool was created to make models easier to implement

  22. AFRL: Multi-State Selective Maintenance Decisions (MM-0302) Principal Investigator: C. Richard Cassady, Ph.D., P.E. Co-Principal Investigators: Edward A. Pohl, Ph.D., Scott J. Mason, Ph.D., P.E. Research Assistant: Thomas Yeung

  23. Project Motivation • All military organizations depend on the reliable performance of repairable systems for the successful completion of missions. • Due to limitations in maintenance resources, a maintenance manager must decide how to allocate available resources.

  24. Project Motivation (cont) • Selective maintenance is defined as the process of identifying the subset of maintenance activities to perform from a set of desired maintenance actions. • Selective maintenance models formulated to date are based on the assumption of binary (functioning or failed) component, subsystem and system status.

  25. Project Objective • To develop a modeling-based methodology for managing selective maintenance decisions when multiple (more than two) system states are possible

  26. Outline • Scenario definition • Decision-making • Solution by total enumeration • Heuristic solution • A dispatching rule • Experimental design • Experimental results

  27. Scenario Definition • Set of q independent and identical systems • Each system comprised of m independent subsystems • Motivating example (m = 41) • Subsystems extracted from AFI121-103_ACCSUP1 (MESL) • F-16A/B/C/D MESL used because of our experience with the F-16 at Hill AFB

  28. Scenario Definition (cont) • All systems idle and available for maintenance • State of system i • ai = (ai1, ai2, … , aim) • aij denotes the amount of time required to bring subsystem j of system i into a properly operating condition

  29. Scenario Definition (cont) • Some maintenance actions require spare parts or other resources that are not readily available. • The ready time of subsystem j in system i, ρij, is the time at which these resources are available and maintenance on the subsystem can begin. • ρi = (ρi1, ρi2, … , ρim)

  30. Scenario Definition (cont) • n future missions planned (n ≤ q) • Mission k requires some subset of the subsystems to be operational • sk = (sk1, sk2, … , skm)

  31. Scenario Definition (cont) • Motivating example (types of missions) • FSL – Full System List • ADC – Air Defense, Conventional • ASC – Air to Surface, Conventional • ASY – Air Superiority • ASN – Air to Surface, Nuclear • DSP – Defense Suppression • TNG – Training • TST – Testing

  32. Decision-Making • Which system should be assigned to each mission?

  33. Decision-Making (cont) Every mission gets a system. No system gets more than one mission.

  34. Decision-Making (cont) • Total time required for maintenance related to mission k • Ready time for maintenance related to mission k

  35. Decision-Making (cont) • Once the assignments are made, maintenance crews must perform the maintenance. • γ = # of crews • We assume that a crew: • Works on no more than one system at a time • Works on a system only after it is “ready” • Works on a system continuously until all maintenance is finished

  36. Decision-Making (cont) • For each mission, when does maintenance begin and by which crew is maintenance performed?

  37. Decision-Making (cont) Every mission gets a crew. We cannot start maintenance before we are ready.

  38. Decision-Making (cont) A crew cannot work on two systems at the same time.

  39. Decision-Making (cont) • Completion time of maintenance for mission k

  40. Decision-Making (cont) • wk = importance (weight) of mission k • Larger weight implies more importance • Objective is to minimize total weighted completion time of all maintenance

  41. Decision-Making (cont) • The full optimization model is a binary programming problem with nonlinearities in both the objective function and several constraints.

  42. Solution by Total Enumeration • Procedure • Enumerates all possible assignments • Enumerates all possible schedules for each assignment • 10^(2nγ) iterations required to enumerate all solutions • 3 missions, 3 systems, 2 crews = one trillion iterations
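The iteration count on this slide appears to be 10^(2nγ), which is consistent with the example given: n = 3 missions and γ = 2 crews yield 10^12, i.e., one trillion. A one-line check of that arithmetic:

```python
def enumeration_size(n, gamma):
    """Iterations for total enumeration, 10^(2*n*gamma), per the slide's
    count (3 missions, 2 crews -> one trillion iterations)."""
    return 10 ** (2 * n * gamma)
```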

  43. Solution by Total Enumeration (cont) • One trillion iterations requires weeks to complete. • Computation time is not practical for even small instances.

  44. Heuristic Solution • The nonlinearities render the problem incapable of being solved by most commercial solvers. • The problem was broken apart into two linear problems: • Assignment problem • Scheduling problem

  45. Heuristic Solution (cont) • For each system/mission combination, the following ratio is computed: • Assignments are made based on this ratio in descending order. • This computation takes a fraction of a second.

  46. Heuristic Solution (cont) • Heuristic solution to the assignment problem is used as an input for the scheduling problem. • The optimal solution for the scheduling problem is obtained using a commercial solver.

  47. A Dispatching Rule • We also considered a simplified version of the heuristic that does not require the commercial solver. • This dispatching rule is designed to be much simpler computationally than the heuristic approach.

  48. A Dispatching Rule • For each system/mission combination, the following ratio is computed: • Missions are simply “dispatched” or scheduled based on this ratio in descending order.

  49. Experimental Design • Realistic problem instances of the multi-state selective maintenance problem were generated. • Both the heuristic/optimization and dispatching rule approaches were tested for their performance in terms of: • Solution quality • Computation time

  50. Experimental Design (cont) • The F-16 is our motivating example. • The numerical examples are evaluated at the squadron level (q = 24, n = 24). • All instances have six identical crews available for maintenance at any given time.
