# Part 2

##### Presentation Transcript

1. Part 2 Planning

2. Course Outline

   Day 1
   - Part 0: Student Introduction
     - Paper Helicopter – Pt 0: Use what you know
   - Part 1: DOE Overview
     - What is a Designed Experiment?
   - Part 2: Planning
     - Understand the test item's process from start to finish
     - Identify test objective – screen, characterize, optimize, compare
     - Response variables
     - Identify key factors affecting performance
     - Paper Helicopter – Pt 1: Planning
   - Part 3: Hypothesis Testing
     - Random variables
     - Understanding hypothesis testing
     - Demonstrate a hypothesis test
     - Sample size, risk, and constraints
     - Seatwork Exercise 1 – Hang Time Measurements
   - Part 4: Design and Execution
     - Understanding a test matrix
     - Choose the test space – set levels
     - Factorials and fractional factorials
     - Execution – randomization and blocking
     - In-Class F-18 LEX Study – Design Build
     - Paper Helicopter – Pt 2: Design for Power

   Day 2
   - Part 4: Design and Execution (continued)
   - Part 5: Analysis
     - Regression model building
     - ANOVA
     - Interpreting results – assess results, redesign, and plan further tests
     - Optimization
     - In-Class F-18 LEX Study – Analysis
     - Seatwork Exercise 2 – NASCAR
     - Paper Helicopter – Pt 3: Execute and Analyze
     - Paper Helicopter – Pt 4: Multiple Response Optimization
   - Part 6: Case Studies

3. Science of Test (DOE), with metrics of note for each step:
   - Plan sequentially for discovery – factors, responses, and levels
   - Design with confidence and power to span the battlespace – N, α, power, test matrices
   - Analyze statistically to model performance – model, predictions, bounds
   - Execute to control uncertainty – randomize, block, replicate
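The "N, α, power" metrics above can be explored numerically. The following is a minimal sketch, not from the course material, that estimates the power of a two-sample comparison by Monte Carlo simulation; the mean shift, sigma, and the crude critical value of 2.0 are illustrative assumptions standing in for a proper t-table lookup.

```python
import random
import statistics

random.seed(0)  # fixed seed only so this sketch is reproducible

def estimate_power(n, shift=1.0, sigma=1.0, trials=2000, t_crit=2.0):
    """Monte Carlo power estimate for a two-sample comparison.

    Draws `trials` pairs of samples (one group centered at 0, the other
    shifted by `shift`), computes a Welch-style t statistic for each, and
    counts how often it exceeds the crude critical value `t_crit`.
    """
    hits = 0
    for _ in range(trials):
        a = [random.gauss(0.0, sigma) for _ in range(n)]
        b = [random.gauss(shift, sigma) for _ in range(n)]
        se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
        t = abs(statistics.mean(a) - statistics.mean(b)) / se
        hits += t > t_crit
    return hits / trials

power = estimate_power(n=10)
```

Raising `n` raises the estimated power, which is the trade the slide's "N, α, power" triad is about.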

4. In Planning, We Think Hard!
   - Determine objective(s); consider test phases (test-look-test)
   - Many objectives are possible: troubleshoot, reduce variation, find good design parameters for a machine (robust product design), shift the center
   - We'll look at four others common to military T&E
   - Dig out the response variables – candidate MOPs, etc.
     - Objective, precise, real-valued, measurable – the gold standard
     - Binary and subjective responses, while occasionally unavoidable, are poorer
   - Brainstorm ALL the potential causal factors – then decide how to control them during test
   - No math, no clever ideas here… just plain hard work

5. Test Objective(s)
   - Screening
     - In many experiments we don't know which factors play the greatest role, especially at the outset
     - Screening refers to conducting an experiment to identify the key factors
   - Experimental approach
     - Include in the test matrix any factors thought to affect the response
     - Choose an initial experiment test matrix that uses a minimum of test resources
     - Execute the test
     - Identify the statistically significant factors
     - Continue with a sequential test program to fully characterize the response as a function of the identified key factors
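A minimum-resource screening matrix of the kind described above can be sketched in a few lines. This is an illustrative example, not part of the course material: it builds a two-level full factorial in three hypothetical factors, then cuts it to a half-fraction using the defining relation I = ABC.

```python
from itertools import product

# Hypothetical screening factors at two coded levels (-1, +1)
factors = ["LEX type", "angle of attack", "sideslip"]

# Full 2^3 factorial: every combination of low/high settings (8 runs)
full = [dict(zip(factors, lv)) for lv in product((-1, +1), repeat=len(factors))]

# Half-fraction 2^(3-1) (4 runs): keep only runs where the third factor's
# level equals the product of the first two (defining relation I = ABC)
half = [run for run in full
        if run["sideslip"] == run["LEX type"] * run["angle of attack"]]
```

The half-fraction spends half the test resources at the cost of aliasing the sideslip main effect with the LEX-type-by-angle-of-attack interaction.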

6. Test Objectives
   - Characterization
     - We seek to define a math model for the response(s) as a function of the factors
     - Examples:
       - Aircraft pitching moment coefficient as a function of aircraft canard surface position
       - Turbofan efficiency as a function of altitude

7. Test Objectives
   - Comparisons
     - Using statistically defensible methods, we seek to make a choice between options, or compare to a standard (Spec, ORD, CPD…)
     - Example:
       - The Air Force wants to field a new air-to-ground missile, the REBEL AGM vA
       - Early testing has shown that the missile often lands either well short or well long of the target
       - An experiment must be conducted to determine whether the new variant of the REBEL (REBEL AGM vB) is as good as the old variant
       - We must choose enough trials to prove that either vB is as good as vA or it is not
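A vA-versus-vB comparison like the one above is typically quantified with a two-sample statistic. Below is a minimal standard-library sketch of Welch's t statistic; the miss-distance samples are invented for illustration (they are not REBEL data), and a real analysis would also need degrees of freedom and a critical value from a t table.

```python
import math
import statistics

# Hypothetical miss-distance samples (meters) for the two missile variants
vA = [4.1, 5.0, 3.8, 4.6, 5.2, 4.4]
vB = [5.9, 6.3, 5.1, 6.8, 5.5, 6.0]

def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    mx, my = statistics.mean(x), statistics.mean(y)
    vx, vy = statistics.variance(x), statistics.variance(y)
    return (mx - my) / math.sqrt(vx / len(x) + vy / len(y))

t = welch_t(vA, vB)
```

A large |t| relative to the critical value would indicate the variants differ; the number of trials drives how small a real difference the test can detect.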

8. Test Objectives
   - Optimization
     - Often follows screening and characterization
     - Using a response model from an experiment, find the factor settings that maximize/minimize the response(s)
     - Example:
       - Add strain gages to an existing structure to measure externally applied forces
       - Use DOE to characterize strain gage response from a structural analysis package (FEA)
       - Move gage locations on the structure to maximize strain response to Fx while minimizing response to Fy, Fz, Mx, My, Mz
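Once a response model exists, "find the settings that maximize the response" can be as simple as searching the design space. This sketch (model coefficients invented for illustration, not from any experiment) grid-searches a hypothetical fitted quadratic over the coded region [-1, +1] in two factors:

```python
from itertools import product

def predicted_response(x1, x2):
    """Hypothetical fitted quadratic response model in coded factors x1, x2."""
    return 10 + 2*x1 - 3*x2 - 1.5*x1*x1 - 1.0*x2*x2 + 0.5*x1*x2

# Candidate settings: a 0.1-spaced grid over the coded design space [-1, +1]
grid = [i / 10 for i in range(-10, 11)]

# Pick the grid point with the highest predicted response
best = max(product(grid, grid), key=lambda p: predicted_response(*p))
```

For this model the unconstrained optimum lies outside the tested region, so the best achievable setting sits on the boundary (x2 = -1); extrapolating beyond the design space would be trusting the model where no data exist.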

9. Example: Leading Edge Extension (LEX) Design Comparison
   - [Figure: LEX D1 (new) vs. LEX D2 (existing)]
   - Suppose the USAF proposes an experiment to evaluate a Leading Edge Extension (LEX) modification to the Navy F-18, as part of a (contrived) program to adapt the aircraft for USAF use
   - The goal is to find increased lift over a range of angles of attack without changing the pitching moment characteristics or increasing drag
   - The focus of the program is to enhance low-speed performance

10. Plan: Identify Test Objective
   - What are we testing? Are we screening, characterizing, optimizing, or comparing?
   - In this example we are comparing the LEX types through aerodynamic characterization
   - Once we decide what we're testing, we have to pick our response variables
     - We seek to increase lift, and not increase drag or change pitching moment, as a result of changing to a new LEX
   - Our responses are:
     - Lift (desire increase)
     - Drag (desire no change or reduction)
     - Pitching moment (desire no change)

11. Two Simple Tools for Designed Experiment Planning
   - Process flow diagram (standard flowchart symbols: start, process step, decision with yes/no branches, output), typically built from Standard Operating Procedures
   - Cause and effect diagram, with cause labels: C = Control, H = Hold Constant, N = Noise

12. Understand the Test Process
   - [Figure: model showing stabilizer, angle of attack, LEX, and sideslip]
   - Suppose the budget supports a small-scale model program to be tested in a low-speed atmospheric wind tunnel
   - The key factors, chosen by priority, are:
     - LEX type (of course)
     - Angle of attack (α)
     - Sideslip (β)
     - Stabilizer deflection

13. [Diagram] Decomposing the AMRAAM and F-15C OFP "Shoot 'n Crank" process (F-15C with AIM-120 vs. Su-27B with AA-10; "Fox" and "Crank" events)

14. Tool 1 – Process Flow for Responses (MOPs): a process flow diagram feeding a table of MOPs

15. Example Process Flow: AMRAAM
   - [Process flow diagram: Fighter IP → acquire (time/range) → tracking (range/time) → shoot? (yes/no) → shot (range/time) → end]
   - MOPs captured along the way, feeding the table of MOPs: miss distance, Pk (red/blue), F-pole, A-pole

16. Understand the Test Process
   - 1/15th-scale Langley Full Scale Tunnel test operations
   - Install
     - Mount 6-component force balance in model
     - Mount model to support (sting) on pitch crescent
     - Align and level model with centerline of tunnel
     - Zero the sideslip and angle-of-attack position encoders
   - Operation
     - Set data acquisition system to desired sampling rate
     - Record wind-off values for all forces (these are subtracted from wind-on values to yield the aerodynamic contribution to loads)
     - Start tunnel fan and adjust RPM to desired flow conditions
     - Manually move model to set points in test matrix
     - Record all forces and moments
     - Adjust factors for next set point
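The wind-off/wind-on bookkeeping in the operation steps above is just a channel-wise subtraction. A minimal sketch with invented balance readings (not real tunnel data):

```python
# Wind-off tare readings recorded before the fan starts (hypothetical values)
wind_off = {"lift": 0.8, "drag": 0.3, "pitch_moment": -0.1}

# Wind-on readings at a test-matrix set point (hypothetical values)
wind_on = {"lift": 152.6, "drag": 14.9, "pitch_moment": -3.2}

# Subtract the tare channel-by-channel to isolate the aerodynamic loads
aero_loads = {ch: wind_on[ch] - wind_off[ch] for ch in wind_on}
```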

17. Tool 2 – Fishbone to Construct Candidate Factors
   - Branches (the six M's): Measurements, Materials, Manpower, Machines, Mother Nature, Methods
   - Cause labels: C = Constant (scope limit), N = Noise (uncontrolled), X = Experimental factor
   - Output: response to effect → table of factors

18. Example: LEX Comparison
   - Cause and effect diagram, F-18 LEX study: identify all factors – experimental, constant, and uncontrolled
   - Response: improved lift
   - Key factors: LEX type, angle of attack, angle of sideslip, stabilizer deflection
   - Candidate causes from the fishbone branches (Measurements, Materials, Manpower, Machines, Mother Nature, Methods): model material, support material, force balance, α and β encoders, control-surface protractors, pressure transducer (q), experience, training, wind tunnel set point, tunnel speed control, model attitude set point, temperature, humidity, DAS sampling, rudder/aileron locks

19. Example: AMRAAM Effectiveness
   - Response: miss distance
   - Candidate causes from the fishbone branches (Measurements, Materials, Manpower, Machines, Mother Nature, Methods): locations, missile lot/OFP, red pilot skill, event time, aircraft OFP, turn g's, blue pilot skill, turn direction, ECM settings, altitude, airspeed, winds, red missile/radar, turn duration, tech order, AMRAAM mode, visibility, radar mode, turn delay
   - Output: table of factors

20. Treating Factors Not in the Test Matrix
   - Some factors will vary as noise during test
     - All effects that aren't in the test matrix are interpreted by the mathematics as random noise by default
     - Factors outside the matrix can still affect the response
     - Examples: temperature, atmospheric pressure, other facilities operating nearby, electrical noise affecting the balance, backlash in the gears of the pitch mechanism (which sets angle of attack)
   - Hold constant during test – with care
     - Understand the implication of such a decision: inference applies only to the specific settings of that factor (e.g., rudders neutral)
     - Examples: aileron deflection, rudder deflection – locked out and assumed to have zero influence, yet the surfaces may actually deflect under load
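The "factors vary as noise" concern above is exactly why run order is randomized: slow drifts in temperature or pressure then average out across the matrix instead of confounding with a factor. A minimal sketch (hypothetical coded test matrix; the seed is fixed only to make the sketch reproducible):

```python
import random

random.seed(1)  # fixed seed for reproducibility of this sketch

# Hypothetical test matrix: coded settings for two factors, two replicates
runs = [(-1, -1), (-1, +1), (+1, -1), (+1, +1)] * 2

# Randomize the execution order so uncontrolled drifts do not line up
# with any one factor's settings
order = list(range(len(runs)))
random.shuffle(order)
randomized_runs = [runs[i] for i in order]
```

Blocking and replication (the other two execution tools on the course's Science of Test slide) address known nuisance groupings and pure-error estimation, respectively.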

21. Final Process Description – Input-Process-Output (IPO) Diagram
   - Built from the process flow chart: experimental inputs → process under test → outputs (MOPs)

22. F-18 LEX Study (IPO)
   - Inputs (factors): LEX type, angle of attack, angle of sideslip, stabilizer deflection
   - Process: F-18 aerodynamics
   - Outputs (responses): lift coefficient, drag coefficient, pitch moment coefficient
   - Held constant: rudder, aileron, and flap deflections; test section temperature; atmospheric pressure
   - Noise: flow irregularities, data reduction errors

23. Air-to-Air Missile Simulation (IPO)
   - Inputs (factors): target altitude offset, range to target, angle off shooter nose, background clutter, countermeasures, target maneuver, target aspect
   - Process: air-to-air missile
   - Outputs (responses): hit or miss, miss distance, time to acquire target
   - Noise: rocket motor burn time, boresight misalignment, radome transmissivity, fragmentation pattern, fuze timing

24. Composites Production (IPO)
   - Inputs (factors): resin flow rate, type of resin, gate location, fiber weave, mold complexity
   - Process: resin transfer molding
   - Outputs (responses): product quality (dry spots, voids), tensile strength, fiber weight
   - Noise: raw material quality, pre-form consistency, fiber permeability, test conduct inconsistencies

25. Flight Test (IPO)
   - Inputs (factors): airspeed, turn rate, set clearance plane, crossing angle, ride mode, nacelle, terrain type
   - Process: TF/TA radar performance
   - Outputs (responses): SCP deviation, pilot rating
   - Noise: gross weight, radar measurement, operator variability

26. Penetration Tests (IPO)
   - Inputs (factors): weapon nose shape, weight, impact velocity, impact angle, angle of attack
   - Process: hard target penetration
   - Outputs (responses): penetration path length, vertical/horizontal depth, amount of turn
   - Noise: weather, training, TLE, launch conditions, measurement error

27. Paper Helicopter (IPO)
   - Inputs (X's, factors): type of paper, number of clips, type of clips, wing length, wing width, body length, body width, wing angle
   - Process: paper helicopter
   - Outputs (Y's, responses): accuracy (in), flight quality (pts), hang time (sec)
   - Noise: enters on both the input and output sides

28. Session 2 Summary: Define the Problem
   - Understand the test item's process from start to finish
   - Identify the objective of the test – screen, characterize, optimize, or compare?
   - Identify all factors in the test matrix
   - Identify all responses
   - Identify factors affecting performance but not in the test matrix

29. Paper Helicopter Design – Do Paper Helicopter, Pt 1: Planning