Assessing Collaborative Modeling Quality Based on Modeling Artifacts
D. (Denis) Ssebuggwawo (1), S.J.B.A. (Stijn) Hoppenbrouwers (1) & H.A. (Erik) Proper (1,2)
(1) ICIS, Radboud University Nijmegen, The Netherlands
(2) Public Research Centre -- Henri Tudor, Luxembourg

Presentation Transcript


  1. Assessing Collaborative Modeling Quality Based on Modeling Artifacts. D. (Denis) Ssebuggwawo (1), S.J.B.A. (Stijn) Hoppenbrouwers (1) & H.A. (Erik) Proper (1,2). (1) ICIS, Radboud University Nijmegen, The Netherlands; (2) Public Research Centre -- Henri Tudor, Luxembourg. 3rd Working Conference on The Practice of Enterprise Modeling (PoEM'10), Delft University, The Netherlands, 9-10 November 2010.

  2. MENU: Overview; Collaborative Modeling Evaluation; Hypothesized Model & Alternative Model; Empirical Results; Conclusion & Future Direction.

  3. Overview: Overriding Goals. Determine the efficacy (efficiency & effectiveness): evaluate the different constructs (ML, MP, EP, ST) to determine the overall efficiency and effectiveness -- efficiency means reducing the effort, effectiveness means improving the quality of the result. Determine the success of the collaborative effort (success factors): evaluate the modeling effort to determine the (critical) success factors that influence efficiency & effectiveness.

  4. Overview: Modeling Artifacts. Anchoring collaborative modeling evaluation on the modeling artifacts: Modeling Language (ML), Modeling Procedure (MP), End-Products (EP), and Support Tool or Medium (ST/M).

  5. Overview: The Modeling Artifacts

  6. CM Evaluation: Supporting Frameworks. SEQUAL (Lindland et al., 1994; Krogstie et al., 2006): based on semiotic theory; for understanding the quality of conceptual models. TAM (Davis, 1986; Davis et al., 1989) with TRA/TPB (Fishbein, 1975; Ajzen, 1991): about attitudes, beliefs, intentions/perceptions and behaviour; for explaining and predicting user acceptance of IS/IT. MEM (Moody, 2001; Moody, 2003): based on methodological pragmatism (theoretical knowledge validation); for evaluating IS design methods, building on TAM.

  7. CM Evaluation: Theory of Reasoned Action (TRA). Fig. 1. TRA Model: behavioural beliefs and outcome evaluations (b_i, e_i) form the attitude toward the act or behaviour (AB); normative beliefs and motivation to comply (nb_j, mc_j) form the subjective norm (SN); AB and SN determine the behavioural intention (BI), which leads to actual behaviour (B). External/internal environment variables act as controllable and uncontrollable factors on these psychological (perceptions, intentions) and behavioural variables.
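
The relations in Fig. 1 can be written in the standard Fishbein-Ajzen form; this is a reconstruction from the TRA literature, not an equation shown on the slide:

    \[
      AB = \sum_i b_i\, e_i, \qquad
      SN = \sum_j nb_j\, mc_j, \qquad
      B \approx BI = w_{AB}\, AB + w_{SN}\, SN
    \]

where b_i are the behavioural beliefs, e_i the outcome evaluations, nb_j the normative beliefs, mc_j the motivations to comply, and w_{AB}, w_{SN} are empirically estimated weights.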

  8. CM Evaluation: Hypothesized Model Interactions. Fig. 2. Hypothesized Model Interactions: the four latent constructs ML, MP, EP and ST, each measured by its own indicators (ML_1 … ML_n, MP_1 … MP_n, EP_1 … EP_n, ST_1 … ST_n).
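
In standard CFA/SEM notation (an assumption about how Fig. 2 would be formalised; the slide only shows the diagram), each observed indicator loads on its latent construct:

    \[
      ML_i = \lambda^{ML}_i\, \xi_{ML} + \delta^{ML}_i, \qquad i = 1, \dots, n,
    \]

and analogously for MP, EP and ST; the interactions of Fig. 2 then correspond to the covariances among the latent variables \xi_{ML}, \xi_{MP}, \xi_{EP} and \xi_{ST}.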

  9. CM Evaluation: The Constructs. Perceived Quality of the Modeling Language (PQML); Perceived Usefulness of the Modeling Procedure (PUMP); Perceived Quality of the End-Products (PQEP); Ease of Use of the Medium or Support Tool (EOUM/ST).

  10. CM Evaluation: Original Quality Dimensions

  11. CM Evaluation: Synthesized Quality Dimensions

  12. Hypothesized (a priori) Model Fig. 3. Hypothesized Model

  13. Alternative (Competing) Model Fig. 4. Competing Model

  14. Empirical Results: Modeling Experiment & Evaluation. Modeling experiment: a collaborative modeling session using the COMA tool. Evaluation: a measurement instrument (questionnaire) with a 7-point Likert scale. Constructs to assess: PQML, PUMP, PQEP and EOUM.
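
As a minimal sketch of how such questionnaire data could be scored and checked for internal consistency (the item names and data below are hypothetical, not the authors' instrument):

    # Hypothetical 7-point Likert items (pqml_1 ... eoum_3 are made-up names),
    # scored per construct and checked with Cronbach's alpha.
    import numpy as np
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's alpha for the items measuring one construct."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Simulated responses: 12 participants, 3 items per construct, scores 1-7.
    rng = np.random.default_rng(0)
    data = pd.DataFrame(
        rng.integers(1, 8, size=(12, 12)),
        columns=[f"{c}_{i}" for c in ("pqml", "pump", "pqep", "eoum") for i in (1, 2, 3)],
    )

    for construct in ("pqml", "pump", "pqep", "eoum"):
        block = data.filter(like=construct)
        score = block.mean(axis=1)        # construct score per participant
        alpha = cronbach_alpha(block)     # internal consistency of the item block
        print(construct.upper(), round(score.mean(), 2), round(alpha, 2))

With real responses, an alpha of about 0.70 or higher is the usual rule of thumb for acceptable reliability.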

  15. Empirical Results: Validation & Reliability Tests -- Exploratory Factor Analysis (EFA). Goal: retain the factors that account for a significant amount of variance in the data; a precursor to CFA. Principal Component Analysis (PCA): data reduction (determining the number of factors) and factor rotation (determining the (non-)correlation of factors). Common Factor Analysis / Principal Factor Analysis: understanding the relationship between the indicator variables (measured: MLs, MPs, EPs, STs) and the factor variables (latent: PQML, PUMP, PQEP, EOUM).
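
A rough sketch of this EFA step, reusing the hypothetical data frame from the previous sketch; the choice of scikit-learn is an assumption, as the slides do not name the statistics software:

    # Tooling is an assumption (the slides do not name the package).
    # PCA eigenvalues (on standardised items) suggest how many factors to keep;
    # a varimax-rotated factor analysis then shows which items load on which factor.
    import pandas as pd
    from sklearn.decomposition import PCA, FactorAnalysis
    from sklearn.preprocessing import StandardScaler

    def explore_factors(data: pd.DataFrame, n_factors: int = 4) -> pd.DataFrame:
        z = StandardScaler().fit_transform(data)

        # Data reduction: eigenvalues > 1 (Kaiser criterion) hint at the
        # number of factors worth retaining.
        pca = PCA().fit(z)
        print("Eigenvalues:", pca.explained_variance_.round(2))

        # Factor rotation: varimax keeps the retained factors uncorrelated
        # and makes the loading pattern easier to interpret.
        fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(z)
        loadings = pd.DataFrame(
            fa.components_.T,
            index=data.columns,
            columns=[f"F{i + 1}" for i in range(n_factors)],
        )
        return loadings  # rows = items, columns = candidate latent factors

With well-behaved data, the items of each construct would be expected to load highest on one and the same factor.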

  16. Empirical Results: Validation & Reliability Tests -- EFA Results (Exploratory Factor Analysis).

  17. Empirical Results: Validation & Reliability Tests -- Confirmatory Factor Analysis (CFA), via Structural Equation Modeling (SEM). Testing: the a priori hypotheses/theories. Assessing goodness-of-fit: based on the variance retained after factor reduction in the EFA. Assessing validity & reliability: testing and confirming the validity and reliability of the measurement instrument.
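
A hedged sketch of the CFA step in Python using the semopy package (an assumption; the slides do not state which SEM tool produced Figs. 5 and 6), again with the hypothetical item names from above:

    # semopy is an assumption; the slides do not name the SEM tool.
    # Four-factor measurement model in lavaan-style syntax; fitting it yields
    # loadings, factor covariances and goodness-of-fit indices.
    import semopy

    MODEL_DESC = """
    PQML =~ pqml_1 + pqml_2 + pqml_3
    PUMP =~ pump_1 + pump_2 + pump_3
    PQEP =~ pqep_1 + pqep_2 + pqep_3
    EOUM =~ eoum_1 + eoum_2 + eoum_3
    """

    def fit_cfa(data):
        model = semopy.Model(MODEL_DESC)   # a priori measurement model (cf. Fig. 5)
        model.fit(data)                    # estimate loadings and factor covariances
        print(model.inspect())             # parameter estimates
        print(semopy.calc_stats(model))    # fit indices such as chi-square, CFI, RMSEA
        return model

The competing model of Fig. 6 would use a different MODEL_DESC, and the two fits could then be compared on their reported fit indices.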

  18. CFA Results, Model 1: Hypothesized Model. Fig. 5. Path diagram, Model 1.

  19. CFA Results, Model 2: Competing Model. Fig. 6. Path diagram, Model 2.

  20. Empirical Results: Validation & Reliability Tests -- CFA Results (Confirmatory Factor Analysis).

  21. Conclusion & Future Direction. Conclusion: rather than model quality alone, the other modeling artifacts can also be used in evaluating the quality and success of a collaborative modeling effort. Future direction: establishing the interdependencies of the artifacts and their impact on the overall quality, and measuring the acceptability and adoption of the quality framework in practice.

  22. Thank you. Questions?
