Human Aspects of NEC: Decision-Making, Organisation and Information

  1. Human Aspects of NEC: Decision-Making, Organisation and Information Dr Andy Belyavin A presentation to the Operational Research Society, Farnborough, 18 April 2007

  2. NEC • Introduction of new IT to systems presents substantial challenges • The National Audit Office concluded that benefits are rarely realised if IT is introduced while the previous system and its processes are kept constant • Introduction of IT is an enabler of organisational change • Analysis of the impact must take account of this key element • The focus of the analysis must be on the people dimensions of the system • If the focus is on the IT itself, the wrong conclusions will almost surely be drawn

  3. Approaches to identifying a solution • Developing a strategy for organisational change is a hard problem • It tends to be done by constructing a plausible solution and then iterating by “trial and error” • That is not a good approach for military systems • Clearly better if the problem can be approached analytically • Desirable elements of the solution identified • Undesirable elements ruled out • Final polish put on the solution empirically • This presentation will discuss models of human decision-making and measures of performance for C2 systems

  4. NEC Human Challenges

  5. Key elements of modelling • At an abstract level a C2 system can be regarded as a complex system that takes large volumes of data in at one end and puts out decisions at a number of levels • Critically: we need to be able to describe the human elements in the system • This includes: • the need to represent data flow in the system between human agents • the need to model the process over time • the need to represent the conversion of data into models that can be used for information processing and decision-making

  6. Problems in human representation • Key issues in the people component of NEC that need to be described for long-term concept development: • Decisions • Information flow • Organisation form and process • Training and doctrine • ……… • Discussion focuses on decisions, information flow and the assessment of organisation performance

  7. 01 Decision-making

  8. Basic principles and assumptions • It is assumed that low- to medium-level military decisions are trained decisions made under time pressure • Appeal to Klein’s recognition-primed decision-making as the model • Effectively, the inputs are classified and mapped directly to courses of action • From the statistical point of view this corresponds directly to the multivariate discrimination problem • The first approach was developed by Fisher in the 1930s – Fisher’s Linear Discriminant (LDF) • It can be demonstrated that the solution to the classification problem is optimal if a weighted likelihood ratio is used
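
A minimal sketch of the statistical idea on this slide: an LDF with a cost-weighted likelihood-ratio threshold. The class means, covariance, costs and decision convention are all illustrative assumptions, not the presentation's actual model.

```python
# Sketch of Fisher's Linear Discriminant (LDF) for a trained,
# recognition-primed decision: classify a situation described by two
# measures into course of action A or B. All numbers are illustrative.
import numpy as np

mean_a, mean_b = np.array([1.0, 2.0]), np.array([3.0, 1.0])
cov = np.array([[1.0, 0.3], [0.3, 1.0]])       # shared covariance assumed

# LDF weights: w = Sigma^{-1} (mu_A - mu_B)
w = np.linalg.solve(cov, mean_a - mean_b)
midpoint = 0.5 * w @ (mean_a + mean_b)         # boundary for equal costs

# Weighted likelihood ratio: unequal costs c1, c2 shift the boundary,
# mirroring the c1 > c2 / c1 = c2 / c1 < c2 cases on the next slide
c1, c2 = 2.0, 1.0                              # illustrative costs
threshold = midpoint + np.log(c2 / c1)

def decide(x):
    """Choose A when the log likelihood ratio exceeds the cost threshold."""
    return "A" if x @ w > threshold else "B"

print(decide(np.array([1.2, 1.8])), decide(np.array([3.1, 0.9])))
```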

  9. Simple discrimination [Figure: two regions in the plane of Measure 1 and Measure 2, one where course of action A is appropriate and one where B is appropriate, separated by a discriminant function whose slope shifts with the relative costs: c1 > c2, c1 = c2, c1 < c2.]

  10. Components of the solution • Three key inputs to the classification: • the mental model used to classify outcomes (discriminant function) • the perceived costs and benefits of outcomes (individual characteristics) • the data on which the model is based (information) • A complicating factor is that the decision is not made at a single time • The decision may evolve with time – its development needs to be modelled • We can solve the problem for the optimal classifier • In practice the classifier does not need to be optimal, just pretty good, and it varies from individual to individual under some conditions • The decision can be updated with time to correct imperfect decisions

  11. Investigating model choices in decision-making: DECIDE • The objective of the trials was to investigate the use of information in decision-making • The DECIDE task was developed under the guidance of Neville Moray at Surrey University • The aim was to control the flow of troops through hostile territory, sending the largest number with minimum casualties • Casualties were high when enemy strength was high and low when strength was low • The score was determined as a combination of the flow achieved and the casualties incurred

  12. DECIDE task (1) • The task is to send troops through a hostile zone • Enemy strength varies in the hostile zone and determines the number of casualties taken • Participants had to decide when to send and when to stop sending troops • The goal is to send the largest number of troops through the zone whilst incurring the fewest casualties • Information is initially hidden and participants must request it by clicking on the source • Each request for information is recorded in a data log

  13. DECIDE task (2) • Participants can access four sinusoidal information sources (with added noise) • The four sources have different amplitudes and wavelengths • Participants must use these sources to infer enemy strength • The actual enemy strength is the sum of the four sine waves (without noise) • The best available indicator is the sum of the four noisy sources • The metric of task success combines the number of troops sent and the casualties incurred (the formula is not reproduced in this transcript)
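
A sketch of how DECIDE-style information sources could be generated. The amplitudes, wavelengths, noise level and task length are invented for illustration; only the structure (four noisy sinusoids whose clean sum is the true enemy strength) follows the slide.

```python
# Sketch of the DECIDE information sources: four sinusoids with
# different amplitudes and wavelengths. True enemy strength is the sum
# of the clean waves; participants only ever see the noisy versions.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(200.0)                               # task time steps (assumed)

amps = [4.0, 3.0, 2.0, 1.0]                        # illustrative amplitudes
wavelengths = [80.0, 50.0, 30.0, 15.0]             # illustrative wavelengths
clean = [a * np.sin(2 * np.pi * t / w) for a, w in zip(amps, wavelengths)]

noise_sd = 0.5                                     # assumed noise level
noisy = [c + rng.normal(0.0, noise_sd, t.size) for c in clean]

enemy_strength = sum(clean)     # actual strength: sum of the four sine waves
best_indicator = sum(noisy)     # best available estimate: sum of noisy sources
```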

  14. Participant performance • Performance differed between the three groups • Each group was given a different level of information about the task: • Group A: no information about the sources • Group B: basic information about how the sources relate to enemy strength and an indication that two of the sources are better than the other two • Group C: the same information as Group B, but after a period of training • A score of 500 represents good performance • The best participant scored 1600 on a number of runs

  15. Human variability • Three main sources of human variability: • the sources of information used to estimate enemy strength: deduced from the frequency of request of each source and the post-trial interviews • the frequency of use of each source: each participant had access to different information depending on their update frequencies • willingness to take casualties: some participants sent troops as enemy strength was just starting to drop, others only when enemy strength had reached a trough • The information value at each time step of the task was collected and used to fit classification models to the behaviour of the subjects

  16. DECIDE task: IPME model • DECIDE was simulated using the underlying equations governing the generation of the information sources etc. • A simple probabilistic model of the monitoring of the information sources was created, based on the observed frequency of request for the individual sources • A two-state (sending/not sending) operator decision model was developed: • at the end of each iteration the state was re-evaluated using the classification model • the state was changed when there were two consecutive positive decisions to change state • the classification model used depended on the current state of the decision
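
A sketch of the two-state operator model described above. The bare threshold rule is a stand-in for the fitted classification model; the two-consecutive-decisions rule follows the slide.

```python
# Sketch of the two-state (sending / not sending) operator model: at
# each iteration a classifier proposes a state change, and the change
# is committed only after two consecutive positive decisions.
def run_operator(strength_estimates, threshold=0.0):
    sending = False
    consecutive = 0                      # consecutive votes to change state
    states = []
    for est in strength_estimates:
        # The classification depends on the current state: when idle,
        # "is it safe to start sending?"; when sending, "should we stop?"
        wants_change = (est < threshold) if not sending else (est > threshold)
        consecutive = consecutive + 1 if wants_change else 0
        if consecutive >= 2:             # two consecutive positive decisions
            sending = not sending
            consecutive = 0
        states.append(sending)
    return states

print(run_operator([1.2, -0.5, -0.8, -0.2, 0.9, 1.4, 0.3]))
```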

  17. Classification model • A classification model is used to separate data into a number of populations • In the case of the DECIDE task there are two decisions: • to send when not sending • to stop when sending • The threshold of the decision was determined by coupling the model to an optimisation algorithm • The performance of the operator was used as the objective of the optimisation • The threshold was altered by the algorithm until the performance matched the observed performance • The threshold gives some indication of whether people are willing to send early (upper boundary) or late (lower boundary)
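
A sketch of the calibration loop: vary the threshold until the simulated score matches a participant's observed score. The toy simulator, scoring weights and bounds are assumptions; the original study coupled the IPME model to an optimisation algorithm.

```python
# Sketch of fitting the decision threshold to a participant: the
# threshold is varied until the simulated score matches the observed
# score. The toy simulator and scoring weights are assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

t = np.arange(200.0)
strength = np.sin(2 * np.pi * t / 60.0)            # toy enemy strength

def simulate_score(threshold, casualty_weight=50.0):
    sending = strength < threshold                 # send in the troughs
    sent = 10.0 * sending.sum()                    # troops per step (toy)
    casualties = 10.0 * (sending * np.clip(strength, 0.0, None)).sum()
    return sent - casualty_weight * casualties

observed_score = 633.0                             # example score from slide 18
fit = minimize_scalar(lambda th: (simulate_score(th) - observed_score) ** 2,
                      bounds=(-1.0, 1.0), method="bounded")
print(f"fitted threshold {fit.x:.3f} -> score {simulate_score(fit.x):.0f}")
```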

  18. Performance of the model against the observed data • The classification models were able to reproduce performance scores well for 38 participants • The remainder did not appear to be using the information sources • The start decision was well modelled • The stop decision was more difficult to model: there was a tendency for simulated participants to stop sending too soon and then resend shortly afterwards • There was a relationship between personality and the timing of the start/stop decisions • Example run: observed score = 633, simulated score = 560

  19. Summary conclusions • The basic classification model can vary from individual to individual • A crude representation of the evolution of a decision with time can be quite effective • A rule of three was used in the DECIDE task modelling • The criterion is influenced by individual characteristics – personality in this case • The same principles were employed in simulating the behaviour of Anti-Air Warfare Officers in a naval simulation, with credible results

  20. 02 Information and organisation performance

  21. Information in an organisational context • There are two aspects to system performance: time to perform and quality of output • Much analysis of processes focuses on time to perform, but quality of output is just as important • Decision-making at the pattern-matching level can be modelled as described earlier • Can this be extended to provide an assessment of processes and procedures within a C2 system? • Ideally we need an approach that encapsulates these factors and can be used for engineering a system • The study described here was based on methods for measuring information • Two widely used measures of information content: • Shannon’s information (entropy) • Fisher’s Information

  22. Shannon’s entropy • Data and information are different, although often treated as the same • Data are part of the physical domain and measured in bits; information is in the cognitive domain and is embodied in models of the current and future state of the world • Shannon’s entropy is strictly a measure of optimal coding for messages, and therefore of data • It has no concern with the meaning of a message – its information content • It is concerned with the quantity of data, measured in number of bits • It provides a measure of data flow, given assumptions about the pattern of data elements in the stream

  23. Fisher’s Information • Fisher’s Information measures the amount of information data provide about a set of model parameters • It is expressed in terms of the precision of the parameter estimates provided by the data • It is derived from the Maximum Likelihood estimation procedure • It can be viewed as a measure of the quality of the model in terms of describing the data • It can be extended to describe the information content of the model • It was decided to use Shannon’s entropy as a measure of data flow and Fisher’s Information as a measure of information content • The basic measures are not commensurate • The approach of Cedilnik and Košmelj has been used to bring them onto a common scale

  24. Mathematical definition of the measures • Shannon’s entropy e_p is defined by e_p = −Σ_i p_i log2 p_i • If it is assumed that there are n possible values for the content and all are equally likely, the measure simplifies to e_p = log2 n • Fisher’s Information I is based on the estimated variances of a set of k parameters θ: the variance of each maximum-likelihood estimate is bounded below by the inverse of I • If it is assumed that the parameters lie in a range (a, b), the reduction in uncertainty from that prior range to the estimated standard deviations provides a measure consistent with e_p
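
A sketch of the two measures in code. Shannon's entropy follows the standard definition; the commensurate form for Fisher's Information is an assumption, treating the information as the entropy reduction from a uniform prior on (a, b) to a Gaussian with the estimated standard deviation, since the slide's exact expression is not reproduced in the transcript.

```python
# Sketch of the two measures. shannon_entropy follows the standard
# definition; fisher_information_measure is an assumed commensurate
# form: the entropy reduction from a uniform prior on (a, b) to a
# Gaussian with the estimated standard deviation of each parameter.
import numpy as np

def shannon_entropy(p):
    """e_p = -sum_i p_i log2 p_i; equals log2(n) for n equally likely values."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                         # 0 log 0 is taken as 0
    return float(-np.sum(p * np.log2(p)))

def fisher_information_measure(sds, a, b):
    """Assumed commensurate measure for k parameters with estimated sd's."""
    sds = np.asarray(sds, dtype=float)
    return float(np.sum(np.log2((b - a) / (np.sqrt(2 * np.pi * np.e) * sds))))

print(shannon_entropy([0.25] * 4))                  # 2.0 bits = log2(4)
print(fisher_information_measure([0.1, 0.2], a=0.0, b=10.0))
```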

  25. Example data flow • Consider a sample of data that might be coming into the system • A series of pairs of numbers – a sample: (1.0, 1.0) (2.0, 1.7) (3.0, 3.3) (4.0, 4.1) (5.0, 4.9) (6.0, 5.5) (7.0, 7.2) (8.0, 8.3) (9.0, 8.9) (10.0, 9.9) • Considered from the point of view of Shannon’s entropy, the information content is the length of the message • The message comprises 20 numbers, each reported to a maximum of three decimal digits • The length of the message is a maximum of 20 x 7 bits = 140 bits • That is the data content…….

  26. Develop context and model (1) • Suppose this sequence of pairs of numbers records the advance of an entity with time • Extra information: we can estimate the average speed

  27. Develop context and model (2) • Suppose this sequence of pairs of numbers records the advance of an entity with time • Extra information: we can estimate the average speed • A model we are applying to the data • The speed is not exact as the data have noise • The extra information can be estimated using Fisher’s information • Using basic assumptions the information added is 5.46

  28. Develop context and model (3) • Suppose this sequence of pairs of numbers records the advance of an entity with time • Extra information: we can estimate the average speed • A model we are applying to the data • The speed is not exact as the data have noise • The extra information can be estimated using Fisher’s information • Using basic assumptions the information added is 5.46 • We can estimate the position at times 15 and 18 • Following the same logic, the further information added is 9.48
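
A sketch of the worked example on slides 26–28: fit an average speed to the (time, position) pairs, then predict positions at times 15 and 18. The information formula is the same assumed uniform-to-Gaussian reduction as above, so the numbers are not expected to reproduce the 5.46 and 9.48 on the slides.

```python
# Sketch of the moving-entity example: least-squares speed through the
# origin, its standard error, an assumed information measure, and the
# predicted positions at times 15 and 18.
import numpy as np

t = np.arange(1.0, 11.0)
x = np.array([1.0, 1.7, 3.3, 4.1, 4.9, 5.5, 7.2, 8.3, 8.9, 9.9])

speed = (t @ x) / (t @ t)                # least-squares speed, x ~ speed * t
resid = x - speed * t
sd_noise = resid.std(ddof=1)             # noise level in the observations
sd_speed = sd_noise / np.sqrt(t @ t)     # standard error of the speed

a, b = 0.0, 10.0                         # assumed prior range for the speed
info_speed = np.log2((b - a) / (np.sqrt(2 * np.pi * np.e) * sd_speed))
print(f"speed {speed:.3f} +/- {sd_speed:.3f}: {info_speed:.2f} bits added")

for tp in (15.0, 18.0):                  # extrapolated positions
    print(f"t = {tp:.0f}: position {speed * tp:.1f} +/- {tp * sd_speed:.2f}")
```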

  29. Develop context and model (4) • Suppose the underlying observations are twice as variable • Using basic assumptions the information added is 4.66 • We can estimate the position at times 15 and 18 • Following the same logic, the further information added is 7.88

  30. Fisher and good and bad models • The previous example was developed using the “true” model • What happens if an inappropriate model is applied? • The appropriate model fit is shown in the upper graph, the inappropriate model in the lower graph • The estimates of Fisher’s information for the “slopes” in the two cases are: • 11.46 (appropriate) • 2.04 (inappropriate) • If we used this for prediction, the information added would be small for the inappropriate model
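
A sketch of the good-model/bad-model contrast: fit the true linear form and a deliberately inappropriate form to the same data and compare the precision, and hence information, of the fitted slope. The inappropriate form (position proportional to the square root of time) is an invented stand-in.

```python
# Sketch of the model-appropriateness contrast: the same data fitted
# with the true linear form and an invented inappropriate form; the
# misfit inflates the residuals and so shrinks the information
# carried by the fitted coefficient.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(1.0, 11.0)
x = t + rng.normal(0.0, 0.3, t.size)     # truly linear data with noise

def coeff_sd(design):
    """Standard error of the single coefficient in x ~ beta * design."""
    beta = (design @ x) / (design @ design)
    resid = x - beta * design
    return resid.std(ddof=1) / np.sqrt(design @ design)

a, b = 0.0, 10.0                         # assumed prior range, as before
info = lambda sd: np.log2((b - a) / (np.sqrt(2 * np.pi * np.e) * sd))

print(f"appropriate model:   {info(coeff_sd(t)):.2f} bits")
print(f"inappropriate model: {info(coeff_sd(np.sqrt(t))):.2f} bits")
```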

  31. Metrics, models and data • The examples in the previous slides illustrate three key points: • we can construct a methodology for measuring the effect of information transactions • the metrics are sensitive to data quality and model quality • they demand an understanding of how models are acquired • The simple example deals with a model constructed from data gathered as part of the information flow • For data fusion the model will have been constructed prior to system use • To apply the previous logic we need to know the quality of the model • In addition we will have to handle variability in the data to which we apply predictive models

  32. Approach to testing the metrics in an organisational model • A model with a repetitive decision that had been modelled previously was selected, based on the DECIDE task • The original form comprised a single-person task with multiple information sources • The task was taken as the basis for a model of a headquarters with four streams of information and a simple decision to make • This permits an overall measure of effectiveness through the task score • Information use can be manipulated and the overall effect studied • It includes natural delays and a possible representation of corruption • The information flow resembles that of some Battlegroup headquarters

  33. General Behaviours in an Organisation • Decision-making is a special case of a process in which information is turned into an order

  34. Structure • The structure of an organisation is determined by: • causality between processes • formal relationships between agents • informal relationships between agents

  35. Basic building blocks in the HQ model • Information processing behaviours • Gather data • Process and fuse information • Decide • Order action • Representation of the impact of decisions by closing the loop using a pseudo-military task • Use original information pattern from DECIDE task • Abstract data observation and interpretation as flows between cells in a notional HQ

  36. Problems to be represented in the metrics as applied to the model • Quality of decision-making procedure in information terms – reflecting training and experience • Impact of timeliness on decisions • Impact of unreliable information sources • Impact of inappropriate models • Two aspects must be addressed so that Fisher’s Information can be calculated • Precision of the fusion model • Variability of the data employed in the fusion

  37. Acquisition of the data fusion models • In developing the statistics of the data fusion model it was assumed that the model was based on experience of the real system • This was represented by gathering data from the simulated task and fitting the fusion model to the observations • From the model fits, the variance characteristics of the model are described • It is assumed that training and experience are represented by a level of exposure to real situations • Observations of performance following training indicate a performance curve that follows a t^(-1/2) law, where t is the training time • The model therefore assumes that exposure follows the same law statistically
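
A sketch of the exposure assumption, reading the t^(-1/2) law as the standard deviation of the fusion model's parameters shrinking with exposure time t. The unit-exposure sd and prior range are illustrative.

```python
# Sketch of the training assumption: the sd of the fusion model's
# parameters is taken to shrink as t^(-1/2) with exposure time t, so
# the model's information grows with experience.
import numpy as np

sd_unit = 0.5                            # model sd after unit exposure (assumed)
a, b = 0.0, 10.0                         # assumed prior range
for t in (1, 4, 16, 64):
    sd_t = sd_unit * t ** -0.5           # the t^(-1/2) law from the slide
    info_t = np.log2((b - a) / (np.sqrt(2 * np.pi * np.e) * sd_t))
    print(f"exposure {t:3d}: model sd {sd_t:.3f}, information {info_t:.2f} bits")
```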

  38. Timeliness • The timeliness aspects of information are captured in two components of the model • The rate at which enemy strength changes in the simulated world • Time delays in the processing of information in the model

  39. Unreliability of information and appropriateness of the model • In the simulated HQ, information sources can become corrupt • An extra step was inserted in the information processing to check the quality of the source vulnerable to corruption • Simple linear prediction was used to describe the check • For the construction of this model it was assumed that effectively unlimited experience would be available for “own sensors” • The variance of the model was therefore assumed to be small
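
A sketch of the corruption check using simple linear prediction: extrapolate the recent history of a source and flag a new report that deviates too far from the prediction. The window size and tolerance are assumptions.

```python
# Sketch of the corruption check: linearly extrapolate a source's
# recent history; if the new report deviates from the prediction by
# more than a tolerance, flag the source as possibly corrupt.
import numpy as np

def check_source(history, new_value, window=5, tol_sds=3.0):
    """Return True if new_value is consistent with a linear extrapolation."""
    recent = np.asarray(history[-window:], dtype=float)
    steps = np.arange(recent.size)
    slope, intercept = np.polyfit(steps, recent, 1)   # linear prediction
    predicted = intercept + slope * recent.size
    resid_sd = np.std(recent - (intercept + slope * steps), ddof=1)
    return abs(new_value - predicted) <= tol_sds * max(resid_sd, 1e-6)

print(check_source([1.0, 1.2, 1.4, 1.6, 1.8], 2.0))   # consistent -> True
print(check_source([1.0, 1.2, 1.4, 1.6, 1.8], 9.0))   # corrupt -> False
```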

  40. Conditions tested • Simulations of the HQ model were conducted varying the following conditions: • the amount of experience of the decision-maker • the level of noise on the data for the training of the decision-maker • the level of noise on the data in the simulated decision-making • the presence or absence of source corruption • The simulations effectively try to measure three aspects of information handling: • quality of the basic data • quality of the models used in decision-making • appropriateness of the decision-making models

  41. Basic features of the demonstration • Data flows at the same rate under all circumstances • Noise on the data is used to modify the effective input information according to Shannon’s entropy – it is assumed that data are reported to appropriate precision • Fisher’s Information is summed from the analysis of potentially corrupt data and from the calculation of fused information • In general the information added in data fusion is of the same order as the information in the input data • The quality of training and experience contributes about the same amount as the data gathered from sensors

  42. Effect of noise on performance and ModFI

  43. Effect of training on performance and ModFI

  44. Effect of information delay on performance and ModFI

  45. ModFI as a predictor of performance

  46. Conclusions • It is possible to describe transactions in a model C2 system using a combination of Shannon’s entropy and Fisher’s Information • The information metrics correlate with overall performance in the abstract example used in the study • The key to the approach is the description of the models applied in decision-making • An essential element is the description of the statistical properties of these models • Some of these elements can be estimated through additional simulation • It is also important to describe data accuracy and information content in the same terms

  47. Overall summary • Human decision-making in a range of contexts can be represented using models from statistical classification • There is variability in the quality of the models employed by individuals as a function of training and experience • Individual characteristics can affect the decision taken, through perception of the outcomes • The impact of information flow processes can be captured using Fisher’s information • Sources of variability that affect Fisher’s information include: • the quality of the decision-making model • the reliability of the basic data on which it is based • the influence of organisational processes that affect variability • Within the limits of the current study, Fisher’s information is a passable predictor of organisational performance
