
Predictability Challenges in Aircraft Analysis and Design

This presentation discusses the issues faced in predicting and designing aircraft, including nonlinear, multidisciplinary, and multi-scale problems. It explores the difficulties in assessing predictability and the benefits of higher predictability in terms of cost, performance, and safety.


Presentation Transcript


  1. Predictability Issues in Aircraft Analysis, Design, and Certification
  Chris L. Pettit, Ph.D., P.E.
  Multidisciplinary Technologies Center, Air Vehicles Directorate, Air Force Research Laboratory
  JHU Predictability Workshop, November 13-14, 2003

  2. About This Presentation …
  • Organizers' goal: "Synthesize a template for quantitative processes related to predictability and UQ"
  • My goals as moderator: define context and highlight key questions to motivate group discussion
    • Describe prediction problems being confronted in (military) aircraft design and certification, including:
      • Nonlinear, multidisciplinary, and multi-scale problems
      • Prediction difficulties that limit the performance and health of current systems and the development of future systems
    • Promote discussion and feedback on key topics:
      • Definitions of predictability and predictability-aware models
      • How to assess predictability and what to do with it
      • Error vs. uncertainty
      • Roles of testing during various phases of design and life cycle
      • Role of predictability assessment in aircraft systems engineering and decision-making (What is the risk associated with low predictability?)
      • Current impediments to predictability
      • Benefits of higher predictability (e.g., cost, performance, safety)

  3. About This Presentation (cont) …
  • I'll try to avoid injecting unnecessary bias into what often are controversial philosophical issues
  • I hope to learn more from you than you will from me
  • But, I will assume …
    • We want to measure our ability to model complex systems
    • We are uncertain about all processes
      • Models are not reality
      • Natural and man-made physical systems are never deterministic
    • Assessing predictability requires uncertainty quantification (UQ)
    • Each model has a limited range of validity
      • Model validation ultimately depends on UQ and model usage
    • Predictability ⇒ Model Validity
      • Is the reverse true?
    • Physics-based models promote predictability assessment
      • Error estimators, safer extrapolation, etc.
    • Context does matter: military aircraft prediction is part of the DoD acquisition process ⇒ Models should help to assess system-level risks!

  4. Some tough prediction problems designers and analysts are facing …

  5. Prediction-Critical Disciplines for Current and Future Aircraft Systems
  These are disciplines that lead to severe performance restrictions, high required margins, and re-designs:
  • Extreme environments (e.g., thermoacoustic loads)
  • Nonlinear aeroelasticity
  • Flow control and mixing
  • Signature reduction
    • Radar cross-section (RCS)
    • Thermal
  • Structural integrity
    • Fatigue, fracture, corrosion, delamination, battle damage, etc.
    • Strongly dependent on other disciplines and usage for loads
    • Sensitivity to manufacturing tolerances
  • Structural instability
  • Others??? (e.g., dynamics of UAV swarms, human behavior)

  6. Common Complicating Factors in Prediction
  Prediction-critical phenomena commonly involve …
  • … complex processes that span multiple spatial and temporal scales ⇒ At which scales can we be predictive?
  • … nonlinear processes
  • … multidisciplinary interactions
  • … relatively high epistemic and aleatory uncertainty
  • … low observability in experiments and tests
  • … sensitivity to BCs and ICs
  Each of these factors …
  • … complicates our attempts to predict
  • … impedes our efforts to assess predictability

  7. Current and Expected Practical Prediction Challenges (1/3)
  • Low acceptance of predictive ability
    • Especially for safety-critical and multi-scale phenomena
    • Model validation is a low priority
    • Risk assessment is not trusted
  • Accelerated testing requires
    • More dependable predictions
    • Less subjective risk estimation
    • Model and test integration
  • Nonlinear systems can be very sensitive to variability in the system's properties, loads, and BCs
    • Bifurcations
    • Hard to model in complex systems

  8. Current and Expected Practical Prediction Challenges (2/3)
  • Non-robust optima in aeroelastic tailoring and laminar flow wings
    • Manufacturing variability
    • Off-design conditions
  • Non-traditional design concepts, and highly variable or extreme operating environments
    • Little historical basis for assessing loads and sensitivities
  • Difficult to estimate risks of new technology or design concepts (e.g., TRL assessment)
    • Untapped potential because of low predictability???
    • Are dated safety requirements holding back existing and new technologies?

  9. Current and Expected Practical Prediction Challenges (3/3)
  • Designer materials and non-traditional structures
    • Prediction of properties across length scales
    • Ensuring adequate performance in non-ideal conditions
    • Avoiding unintended failure modes
  • Multi-functional structures and systems integration
    • Structurally integrated (i.e., load-bearing) antennas
    • Distributed control surfaces and shape control
      • Optimization of control laws in multiple flight regimes
      • Load redistribution for non-aerodynamic or non-structural purposes (e.g., antenna pointing or RCS management)?
  • Integrated vehicle health management (IVHM) systems
    • Data fusion and model-based sensor placement optimization
    • On-line modification of control laws for loads management
    • Design of self-healing materials
  • Airframe-propulsion integration in hypersonic vehicles
    • System-level performance metrics
    • Defining trade-offs given multiple energy flow paths
    • Multiple performance modes require multidisciplinary models

  10. Predictability in the context of aircraft design and certification …

  11. How the Tough Problems Affect Processes and Frameworks
  • Current aircraft systems already stress design and certification methods to (or beyond?) their practical limits
  • Unique design concepts suggest increased importance of nonlinear multidisciplinary physics clearly beyond the capability of current design tools and certification processes
  • Physical and computational complexity of nonlinear multidisciplinary models obscures the propagation of uncertainty through networks of models
    • Difficult to dependably assess sensitivities and risks w/o a clear UQ process that is consistently implemented
  • For airframes: this has resulted in a process-centric approach to risk management instead of a knowledge-centric approach
    • This is untenable for future Air Force needs

  12. Multidisciplinary Problems
  • Very hard to predict and validate
    • Multi-scale, nonlinear physics
    • The "correct" uncertainty model often depends on physics modeling choices and measurement limitations
      • e.g., stress FE models vs. dynamics FE models
    • Highly variable operating environments, loads, and material properties
    • Complicated and expensive tests
  • Crucial to the success and safety of high-performance military aircraft
  ⇒ Computational multidisciplinary analyses are always suspect, as is any resulting risk prediction

  13. Predictability in Systems Engineering (SE)
  • Prediction must be performed and assessed in the context of systems engineering
    • Purpose of SE: manage system-level risks from cradle to grave
    • Risk results from uncertainty and error
    • Risk management demands good data and good predictions
  ⇒ Risk management requires predictability assessment
  • SE entails a risk allocation or flow-down from program level to system, sub-system, and component levels
    • Usually implicit and qualitative for complex systems
  • This flow-down parallels a similarly implicit flow-down of uncertainty in multidisciplinary design problems
    • Modeling and data-gathering decisions automatically allocate uncertainty and error to constituent analyses
    • Uncertainty and error budgets are never described explicitly and are extremely difficult to quantify
  ⇒ Predictability assessment ultimately needs UQ

  14. Uncertainty Flow-Down
  • How much uncertainty can be tolerated in the top-level prediction of a multidisciplinary process?
  • How much uncertainty can be attributed to each sub-discipline in the network of models that comprise the multidisciplinary analysis?
    • Must address epistemic and aleatory sources
  • How much uncertainty can be tolerated in each sub-discipline analysis?
    • How do the modeled physics amplify input uncertainty?
  • What test, computer, and training resources must be invested to assess and control the uncertainty in each sub-discipline?
  Do these work for error flow-down also? Do these really help in assessing predictability?
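One minimal way to make this flow-down concrete, under a linearization assumption that is mine rather than the slide's, is first-order variance propagation for a top-level output y = f(x_1, …, x_n) of sub-discipline inputs x_i:

```latex
\operatorname{Var}(y) \;\approx\; \sum_{i=1}^{n}
  \left( \frac{\partial f}{\partial x_i} \right)^{\!2} \operatorname{Var}(x_i)
```

Each term is a candidate uncertainty budget for one sub-discipline, and the sensitivities show how the modeled physics amplify input uncertainty. The approximation degrades exactly where these slides warn of trouble (strong nonlinearity, bifurcations), and it addresses only aleatory variance, not epistemic error, so it does not by itself answer the two closing questions on the slide.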

  15. Prediction and Information-Management Tools
  • Design and test cycles of military aircraft now exceed 20 years!
    • Many airframe designers now work on only a few new aircraft programs during their entire career
    • Opportunities to gain practical experience are extremely limited
  • Can no longer depend on "old-timers" as the primary storehouses of corporate knowledge
    • Retirements and overwhelming demands on their time
    • Even they may not have insight into non-traditional problems
    • Worse yet: they can be "nay-sayers"
  • Analyses used to support design decisions may be obsolete by the time the aircraft is certified
  • DoD Acquisition Reform: mandated evolutionary acquisition and spiral development processes institute definite needs for more complete knowledge to support future upgrades
  • How can prediction frameworks be structured to overcome these difficulties???

  16. Closing Remarks about Aircraft Predictability
  • Predictability must be assessed in terms of which questions are being answered by the model
  • Prediction-critical aircraft phenomena share many complicating characteristics
  • The ability to be predictive and to assess predictability is fundamental to future military aircraft systems and acquisition processes
  • UQ and error estimation are fundamental to predictability
  • Predictability depends as much on the practical details of the modeling and testing process (e.g., best practices, ability to measure key data) as it does on theory

  17. What’s next?

  18. Breakout Group Plan of Action
  • I will present several suggested topics of discussion
    • Summarize first, then cover each separately in detail
    • Each topic addresses some of the concerns I've discussed already
  • Try to step through the topics one-by-one for group discussion
  • We have little time
    • Please try to confine your remarks to the question at hand
    • I encourage open discussion, but I will press ahead if we do not move through the topics quickly enough. Please don't be insulted if I abruptly terminate a portion of the discussion.
  • Remember: our ultimate goal is to begin developing a template for aircraft predictability assessment in the context of uncertainty and error

  19. Suggested Topics of Discussion

  20. Suggested Topics of Discussion
  • What is the working definition of predictability in the context of aircraft analysis, design, and certification?
  • What is the current state of UQ and predictability awareness for aircraft?
  • How can aircraft predictability be assessed objectively?
  • What are the dominant modeling, testing, and validation challenges that impede aircraft predictability?
  • What content must a "predictability-aware" model of a complex aircraft system offer?
  • What "new things" could be accomplished in aircraft analysis if predictability were substantially improved?

  21. Topic #1
  What is the working definition of predictability in the context of aircraft analysis, design, and certification?
  • Does it differ substantially between the Critical Disciplines cited earlier?
  • Do we need to clarify the relationship between predictability, model validity, error estimation, and UQ?
  • "I can't define what it means to be predictive, but I know it when I see it."

  22. Topic #2
  What is the current state of UQ and predictability awareness for aircraft?
  • Does it differ substantially between the Critical Disciplines?
  • Research vs. practice?
  • Do decision-makers place sufficient priority on predictability assessment?

  23. Topic #3 (1/2)
  How can predictability be assessed objectively?
  • What are the appropriate metrics? Is model validity truly a prerequisite?
  • What is the role of experimental evidence in understanding, measuring, and controlling predictability?
  • How is uncertainty related to error estimation?
    • Numerical error vs. statistical error?
    • Is a "converged" deterministic grid automatically good for UQ?
  • How should the error and uncertainty budgets be decomposed to clarify predictability assessment?
  • Global vs. local measures of predictability?
    • Throughout the design parameter space?
    • Throughout the spatio-temporal extent of a given design and its model?
  • Scaling issues in comparing tests and models?
    • Will the common scale factors (Re, Fr, etc.) remain the most important as non-traditional designs are developed? Note: this is already an issue for aeroelastic wind-tunnel models.

  24. Topic #3 (2/2)
  • Are acceptable confidence measures available for error estimates?
    • What is their nature (e.g., fuzzy vs. subjective probability)?
  • Is there agreement on how to combine component- and discipline-level error estimates to obtain system-level error estimates?
    • How should these be communicated to decision-makers?

  25. Topic #4 (1/2)
  What are the dominant modeling, testing, and validation challenges that impede aircraft predictability?
  • Where in the prediction chain do the limitations enter?
    • Availability of accurate input data and its variability (e.g., constitutive properties, geometry, etc.)?
    • A priori knowledge of input errors/uncertainty and their consequences?
    • Math models? Could include unresolved physics …
    • Algorithmic implementation of math models? This could include discipline coupling.
    • Numerical sensitivity (grid and time step, convergence criteria, etc.)
    • Short-term vs. long-term accuracy? Dependable error estimators?
    • Post-processing and interpretation? Model validation and integration with testing? Availability of dependable test data?
  • Can we trade some full-scale tests for more coupon and component tests to improve UQ and error estimation?
  • How are the challenges shaped by the push to reduce test resources and streamline certification decision-making?

  26. Topic #4 (2/2)
  • Which important measurements cannot be made with current capabilities?
    • Are these limitations controlled by physics, technology, cost, resource prioritization, or something else?
    • How can validation plans be adjusted to mitigate these limitations?
  • Given that aircraft normally admit some testing throughout the design process, how should these test resources be allocated to estimate model errors and uncertainty?
  • Other impediments to predictability assessment not mentioned yet???

  27. Topic #5
  What content must a "predictability-aware" model of a complex system offer?
  • How does this depend on the purpose of the model?
    • Who will use it? When? Why?
  • How can information and high-fidelity analysis frameworks be structured to promote predictability?
    • Which types of prediction difficulties are best addressed through process structuring and control?
    • How can frameworks be used to promote communication between analysts and test personnel in estimating predictability?
  • Should the model carry supporting data in parallel to support predictability assessment?
    • What about enforced recording of modeling assumptions and decisions?
    • Multiple spatial and temporal scales?

  28. Topic #6
  What "new things" could be accomplished in aircraft analysis if predictability were substantially improved?
  • How is aircraft performance predictability-limited?
  • Reduce required margins and safety factors?
    • How much of a safety factor is allocated to cover modeling errors and missing info vs. inherent variability?
  • Is a system-level risk or uncertainty budget a practical concept?
    • Can it be allocated rationally to components or modeling disciplines?
    • Should predictability goals be tied to different stages in the design and certification process?
  • Can predictability become a trade-off variable in the systems engineering process? Is this a function of the size of the production run?

  29. Anything else?

  30. Backup Slides

  31. Impediments to Reliable Risk Analysis of MD Aircraft Problems
  • System-level risks generally involve incommensurate types of ignorance whose relative importance is problem-dependent and discipline-dependent
    • No universally accepted way to measure and combine these types of ignorance consistently
  • Industry mindset often prefers wrong answers that come quickly to better answers that take longer
    • Design process is perceived as a time and resource sink that must be tolerated in order to generate revenue downstream
  • Certification processes automatically biased toward the technological status quo
    • Potentially delays transition of beneficial new structures and materials technologies

  32. The Role of Processes and Frameworks
  • We need a comprehensive approach to storing models, traditional analysis results, UQ results, and any other info used to support design and certification decisions (e.g., expert opinions)
  • Must facilitate guided access for "future generations" to support:
    • Future expansions of operational capabilities
    • Life-extension programs
    • Insight into the sources and solutions of unexpected problems
  • Should also promote informative modeling and analysis practice (including UQ) by requesting:
    • Key inputs and outputs, including their uncertain aspects
    • Documentation of modeling decisions

  33. Our Perspective
  • UQ-based analyses are needed to help reveal unexpected failure modes and to assess their risk
    • We already do a reasonable job of preventing most well-known structural failure modes in traditional designs
    • Could be critical for non-traditional designs
  • UQ-centric processes promote maximum payoff from models and tests at all scales (e.g., coupon-level to full-scale)
  • Motivation behind test-planning should be transformed to emphasize model validation in addition to (or instead of?) certification criteria
    • This will require substantial modification of traditional R&D and program funding profiles
    • Allocate additional funds during conceptual and preliminary design stages to support additional data gathering and analysis activities
  ⇒ Need to fill the pool of knowledge as early as possible!

  34. What Should Our Goals Be?
  • USAF needs to increase reliance on multidisciplinary analysis earlier in the design process
    • Detect genuinely avoidable problems before full-scale ground and flight tests
    • Achieve operational capabilities and efficiencies by enabling access to portions of the design space that are precluded by current certification requirements and precautionary biases
  • Ideal outcome: dependable quantification of technical and performance risk early on leads to
    • Informed assessment of competing technologies
    • Accelerated insertion of new material, manufacturing, and assembly processes
    • Proactive prevention of problems instead of compromise fixes after problems are uncovered during testing

  35. Systems Engineering Concepts
  • System*: an integrated composite of people, products, and processes that provide a capability or satisfy a stated need or objective
  • Systems Engineering (SE)*: an interdisciplinary engineering management process that evolves and verifies an integrated, life-cycle balanced set of system solutions that satisfy customer needs
  • Our premise: the goal of SE is to make informed decisions that efficiently mitigate risks while meeting goals
    • Every goal induces risk! Risk results from uncertainty!
  ⇒ UQ should be part of SE
  * Systems Engineering Handbook, DAU Press, 2000.

  36. Uncertainty and Systems Engineering for Aeroelasticity

  37. Airframe Certification (1/2)
  • Certification: the end result of a structured process for identifying and managing risk from conception to regular operation
  • Current processes:
    • Little reliance on analysis for risk assessment
    • Fail to promote interaction between tests and analyses
    • Inadequate for future materials, technologies, and design concepts
  • Result? Structures certified through safety-factor design and expensive "building block" tests
    • Additional $$$$$ spent to certify repairs (e.g., fatigue hot spots) and operational modifications (e.g., aeroelastic stability with new external stores)
    • Even certified airframes still have many unexpected problems
  • How can we learn tomorrow what we're not learning today???

  38. Airframe Certification (2/2)
  • ASIP: USAF certification process for structural integrity
    • Reasonably successful but several shortcomings:
      • Time-consuming and manpower-intensive
      • Dependent on historical database
      • Risk assessment is too qualitative and subjective
  • USAF striving to increase reliance on analysis in airframe certification through
    • Higher-fidelity modeling earlier in the design process
    • Uncertainty quantification (UQ) for risk analysis
    • Verification and validation of models
    • Streamlining and expanding knowledge generation and management processes
  • Why?
    • Increase safety and likelihood of achieving performance goals
    • Save time and money by reducing or eliminating some tests and accelerating iterative design processes
    • Avoid costly changes late in the design cycle
  • Will these benefits actually be realized? TBD …

  39. Prediction and Information-Management Tools (2/2)
  • Need tools that actively promote the gathering and retrieval of relevant information
    • Knowledge-bases for capturing and accessing …
      • Conceptual design support info (e.g., historical requirements and capabilities)
      • Concept maps and influence diagrams for system-level interactions
    • Product-centric, object-oriented design environments that capture the methods operating on each product
      • Could include enforced documentation of modeling decisions
  • Also need tools that support consistent and rational data fusion, inference, risk assessment, and decision-making
    • Automated best practices and guided model-checking
    • Measures of confidence associated with expert opinions
    • Consistent model validation processes

  40. 2-DOF Airfoil LCO: Problem Description
  [Figure: pitch-plunge airfoil schematic (plunge stiffness Kh, pitch stiffness Kθ, pitch angle α, airspeed V) with Monte Carlo simulation (MCS) results]
  • Subcritical Hopf bifurcation
  • 5th-order pitch spring
    • k3 < 0 ⇒ destabilizing
  • MCS on α(t = 0), k3, k5
    • 4,000 realizations at each reduced velocity
  • Incompressible, unsteady aero (Jones' approximation)
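To make the setup concrete, here is a minimal Monte Carlo sketch of this kind of study. It is not the study on the slide: quasi-steady aerodynamics stands in for the Jones approximation, and every parameter value and input distribution below is an assumption chosen only to illustrate sampling α(0), k3, and k5 and sweeping velocity.

```python
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

# Illustrative (assumed) structural and aerodynamic constants; none of these
# values come from the slide.
M, I = 1.0, 0.25            # plunge mass and pitch inertia
C_H, C_A = 0.05, 0.05       # plunge and pitch damping
K_H, K_T = 1.0, 1.0         # linear plunge and pitch stiffness
E, CLA = 0.25, 2.0 * np.pi  # aerodynamic moment arm and lift-curve slope

def rhs(t, y, v, k3, k5):
    """State y = [h, h', alpha, alpha'] with a 5th-order pitch spring."""
    h, hd, a, ad = y
    lift = v**2 * CLA * (a + hd / v)            # quasi-steady lift (simplification)
    spring = K_T * (a + k3 * a**3 + k5 * a**5)  # k3 < 0 => destabilizing cubic term
    return [hd,
            (-C_H * hd - K_H * h - lift) / M,
            ad,
            (-C_A * ad - spring + E * lift) / I]

def peak_pitch(v, k3, k5, a0):
    """Crude response-amplitude proxy: peak |alpha| over the tail of the record."""
    sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 0.0, a0, 0.0],
                    args=(v, k3, k5), max_step=0.1)
    a = sol.y[2]
    return np.abs(a[a.size // 2:]).max()

N_MCS = 100  # kept small for speed; the slide used 4,000 per reduced velocity
for v in np.linspace(0.5, 1.1, 7):          # assumed velocity sweep
    a0 = rng.normal(0.05, 0.02, N_MCS)      # uncertain alpha(t = 0)
    k3 = rng.normal(-3.0, 0.3, N_MCS)       # uncertain destabilizing cubic term
    k5 = rng.normal(20.0, 2.0, N_MCS)       # uncertain stabilizing quintic term
    amps = np.array([peak_pitch(v, k3[i], k5[i], a0[i]) for i in range(N_MCS)])
    print(f"v = {v:.2f}: mean peak pitch = {amps.mean():.3f} rad, "
          f"P(peak > 0.2 rad) = {(amps > 0.2).mean():.2f}")
```

Near a subcritical bifurcation, individual realizations split between decay to equilibrium and a finite-amplitude response, which is why the response statistics can become bimodal, a point picked up again on the later aerodynamic-uncertainties slide.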

  41. Current Issues in Uncertainty Quantification for Airframes

  42. Context for Identifying Research Challenges
  • There are many kinds of risk: safety, performance, cost, and schedule
  • Research justified on scientific grounds must also recognize non-technical priorities
  • Analysis is a tool to support decision making in design and certification, which is a process of managing risk while trying to achieve performance goals

  43. Overview of Challenges
  • Technical challenges
    • Probably familiar to most researchers
  • Non-technical challenges
    • Often a result of "institutional issues"
    • Non-technical because they can't be resolved by technical advancement alone
    • Not always exclusive of technology, because established design and certification practices often reflect assumed technical capabilities
  • Our focus: areas in which targeted research can lead to success given available computing and testing methods
    • No "Unobtainium" allowed!

  44. Aerodynamic Uncertainties (1/2)
  • Typical modeling issues won't go away, but should they be re-prioritized for stochastic considerations?
    • Domain discretization and approximation of BCs: how much "precision" is justified given aleatory uncertainties?
    • Simulation vs. design
    • What is the "appropriate" level of fidelity or complexity?
    • Where and how to insert uncertainty models?
  • Sensitivity to ICs
    • Structure? Flow?
  • Importance of non-stationary or extreme gust loads?
    • Many assumptions commonly made to work around uncertainty in atmospheric turbulence
    • von Karman spectrum and gust length scale are imposed compromises
    • Nonlinear instabilities sensitive to level of disturbance
    • Extreme gust events are not captured
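For reference, the "imposed compromise" in question is typically the von Karman vertical-gust power spectral density in the form used by MIL-spec continuous-turbulence models (quoted from standard references, so treat the constants as indicative):

```latex
\Phi_w(\Omega) \;=\; \sigma_w^{2}\,\frac{L_w}{\pi}\,
  \frac{1 + \tfrac{8}{3}\,(1.339\, L_w \Omega)^{2}}
       {\left[\,1 + (1.339\, L_w \Omega)^{2}\,\right]^{11/6}}
```

where σ_w is the RMS gust velocity, L_w the gust length scale, and Ω the spatial frequency. Fixing σ_w and L_w fixes both the turbulence intensity and its correlation structure, which is exactly where the slide's concerns about non-stationarity and extreme events enter: the model is Gaussian and stationary by construction.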

  45. Aerodynamic Uncertainties (2/2)
  • Stochastic CFD for computational aeroelasticity
    • Model problem currently under study: 2-DOF airfoil with polynomial chaos for response
      • Modeled aero only (Jones approximation)
      • Uncertain: kα, kh, ICs (α0)
    • Which problems would require or benefit from this?
      • Subcritical bifurcations ⇒ bimodal response pdf
      • Bifurcation sensitive to parametric uncertainty
      • Second-moment reliability methods not very "reliable" here
      • Need to know the nature of the bifurcation just to define what constitutes failure
    • Integration with reduced-order solvers?
  • Institutional issues or roadblocks?
    • Training of analysts?
    • Integration with existing design tools and processes?
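A minimal non-intrusive polynomial chaos sketch of the general idea, with an assumed closed-form response standing in for the aeroelastic solver: coefficients are fit by regression onto probabilists' Hermite polynomials of a standard normal germ, and the response moments follow directly from the coefficients.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)

def response(k_a):
    """Stand-in scalar response (e.g., an LCO amplitude) vs. an uncertain
    pitch stiffness k_a. A real study would call the aeroelastic solver here;
    this closed form is purely illustrative."""
    return np.tanh(3.0 * (k_a - 1.0)) + 0.1 * k_a

# Assumed input model: k_a = MU + SIGMA * xi with xi ~ N(0, 1)
MU, SIGMA, DEG, N = 1.0, 0.1, 6, 500
xi = rng.standard_normal(N)
y = response(MU + SIGMA * xi)

# Non-intrusive PCE: least-squares regression onto probabilists' Hermite
# polynomials; Psi[i, n] = He_n(xi_i)
Psi = hermevander(xi, DEG)
c, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Moments follow from orthogonality: E[He_n He_m] = n! * delta_nm
var = sum(c[n]**2 * factorial(n) for n in range(1, DEG + 1))
print(f"PCE mean = {c[0]:.4f}, std = {np.sqrt(var):.4f}")
print(f"MC  mean = {y.mean():.4f}, std = {y.std(ddof=1):.4f}")
```

A single global expansion like this struggles precisely in the situation the slide highlights: near a subcritical bifurcation the response pdf is bimodal, and a smooth low-order polynomial in the germ cannot represent the jump, which motivates multi-element or adaptive chaos schemes.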

  46. Structural and Other Issues
  • Structural damping models
    • Perhaps a key factor in observed limit cycles, but poorly understood
  • Which issues don't we consider now but would need to if certification required quantitative risk estimates of aeroelastic stability and performance?
    • More off-design conditions?
    • Representation of variable fuel and stores loads?
    • Uncertainty in composite lay-ups for aeroelastic tailoring?
      • Maybe not for low drag, but what about for embedded sensors (e.g., Sensorcraft)?
    • Certain people don't want to know about the uncertainties
  • Opportunities?
    • Active aeroelastic wing = built-in risk mitigation???

  47. Certification Philosophy (1/3)
  • Cert needs to be recognized by all as a structured dialogue that includes:
    • Designers and analysts
    • Test and manufacturing personnel
    • Cert officials and users
  • This dialogue establishes perceived levels of acceptable risk for a given aircraft program
    • Safety and performance
    • Cost and schedule
    • Political
  • Cert officials haven't declared how to use UQ to support cert decisions

  48. Certification Philosophy (2/3)
  • Trade-off studies suggest much potential for UQ here
  • Issues that impede use of risk analysis for airframes:
    • Current analysis and manufacturing capabilities
    • Availability of statistically significant input data
    • Background and mindset of decision makers
    • Legal and societal perception of quantified risk
    • Cost and time of design and cert process is high but "known"
  • Safety factors implicitly cover many sources of uncertainty
    • Parametric uncertainty, model errors, non-safety concerns (e.g., serviceability and performance), and "unknown unknowns"
    • How to allocate these in quantitative risk design criteria?

  49. Certification Philosophy (3/3)
  • Proposed innovative designs offer many unknowns w.r.t. current certification procedures
    • Identification of critical failure modes
    • Required testing to ensure safety in these modes
    • Required safety factors for UAVs … no pilot to protect
  • Not yet clear if a "risk-informed" approach would be adequate for airframes
    • ~ Probabilistic safety factors (similar to LRFD in CE)
    • Airframe failure modes can be harder to identify a priori than those of civil structures
    • Hard to integrate new analysis methods and account for reduced risk associated with validated models
    • Hard to test airframes, but much easier than testing buildings!
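For readers outside civil engineering, the LRFD analogy refers to design checks of the standard load-and-resistance-factor form (the extension to airframes is the slide's open question, not established practice):

```latex
\phi\, R_n \;\ge\; \sum_i \gamma_i\, Q_{n,i}
```

where R_n is the nominal resistance, the Q_{n,i} are nominal load effects, and the resistance factor φ < 1 and load factors γ_i > 1 are calibrated against a target reliability index rather than set by precedent. The difficulty flagged above is that this calibration presumes the failure modes can be enumerated a priori, which is harder for airframes than for civil structures.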

  50. Other Considerations
  • Education and training
    • Aerospace engineers get little training in probability and none in formal risk analysis
    • Undergrad curriculum does a poor job of discussing practical failure modes and processes
    • Management often uninitiated also ⇒ hard sell!
    • Widespread high-fidelity analysis will require more sophistication of designers/analysts
  • Cost
    • Potential cost savings of risk-based cert are hard to estimate
    • Inadequacies of the current cert process often not evident until after long-term operation
    • Perhaps the true cost of current design and cert processes should be recalculated to include downstream consequences
