
  1. Applications of Mixed Reality in Architecture, Engineering, and Construction: Specification, Prototype, and Evaluation Xiangyu Wang

  2. Outline • Background • Specification • Prototype Development • System Evaluation • Summaries and Conclusions

  3. Background Mixed Reality (MR): an environment in which real-world and virtual objects are presented together on a single display (Milgram & Kishino 1994; Milgram & Colquhoun 1999).

  4. Background • Goal: • To systematically and comprehensively transfer available MR-based technology into the Architecture, Engineering, and Construction (AEC) arena. • Objectives: • To develop a structured specification for mapping available MR-based technology to specific tasks in AEC. • To develop prototypes, named Mixed Reality-based collaborative virtual environments (MRCVEs), that transform current design review collaboration. • To evaluate these prototype systems in terms of benefits validation and usability engineering.

  5. SPECIFICATION METHODOLOGY: analyze AEC tasks → classify MR technology → map technology to tasks.
PROTOTYPE DEVELOPMENT: study current collaboration mechanisms (concurrent engineering, DVE, CVE) and groupware issues → identify feasible MRCVE scenarios (face-to-face conferencing scenario; virtual space conferencing scenario) → MRCVE prototype development.
PROTOTYPE EVALUATION: benefit validation and usability evaluation of the face-to-face conferencing scenario (vs. the current design review meeting) and the virtual space scenario (vs. current web-based design collaboration) → evaluation methods → system improvements.

  6. Specification: phase 1 • Phase 1: Specify MR Technology • Classify MR (AR) based on four technological components: • Media Representation • Input Mechanism • Output Mechanism • Tracking Technology • The specification covers only the major classes of devices. • The results found and knowledge gained in Phase 1 lay the foundation for mapping technology to tasks.

  7. Specification: phase 1 • Classifying the Media Representation (as an example of the findings in Phase 1) • Augmenting-content continuum, ordered from abstract to concrete: text indication → platform/tablet/screen → 2D image & video → 3D wireframe → 3D data → 3D object. • Abstract vs. concrete (schematic to 3D): a high-fidelity representation is not necessarily superior to a more abstract one, because each type has its own appropriate application area.
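This ordering can be made operational by encoding the continuum as an ordered scale. A minimal Python sketch; the class, the level names, and the numeric values are illustrative paraphrases of the slide, not artifacts from the dissertation:

```python
from enum import IntEnum

class MediaRepresentation(IntEnum):
    """Augmenting-content continuum, ordered from abstract to concrete."""
    TEXT_INDICATION = 1  # textual labels / indications
    IMAGE_2D = 2         # 2D images and video
    WIREFRAME_3D = 3     # 3D wireframe
    DATA_3D = 4          # 3D data
    OBJECT_3D = 5        # full 3D objects

def more_concrete(a: MediaRepresentation, b: MediaRepresentation) -> bool:
    """Higher value = more concrete. Note the slide's caveat: more
    concrete is not automatically better for a given task."""
    return a > b

print(more_concrete(MediaRepresentation.OBJECT_3D,
                    MediaRepresentation.TEXT_INDICATION))  # True
```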

  8. Specification: phase 1 • Analogous continuums for the input metaphor, output metaphor, and tracking technology, based on human cognitive aspects, were also developed and elaborated in the dissertation. All of these findings were used in mapping technology to AEC tasks (Phase 3).

  9. Specification: phase 2 • Phase 2: Analyze AEC Tasks • Task factors influencing the applicability of MR technological components: • Task mental requirements • Working environment • Physical disposition • Hand occupation

  10. Specification: phase 3 • Phase 3: Map MR technology to AEC tasks based on technological feasibility and usability in terms of physical and mental (human) factors.

  11. A User-Centered Framework of Layer Interactions • User Layer: user level, physical movement • Task Layer: task analysis, task mental requirements, working environment, hand occupation • MR Component Layer: media representation, input mechanism, output mechanism, tracking technology

  12. Methodology (procedure) for the MR System Development Cycle
• Step 1: Analyze and break down the task/operation (site observation, interviews, etc.); using an information processing model, decompose composite tasks into perceptual tasks and cognitive tasks.
• Step 2: Assess each of the four MR components (media representation, input mechanism, output mechanism, tracking technology) for physical and mental feasibility, and for usability, against the task factors: task mental requirements, working environment, physical disposition, and hand occupation (see the screening sketch after this list).
• Step 3: Build the MR system prototype and refine it through expert heuristic evaluation and formative user-centered evaluation, yielding a useful MR system.
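Step 2 reads as a filtering procedure: each candidate MR component is screened against the task's physical and mental factors before it enters the prototype. A minimal Python sketch of that screening; the class names, fields, and pass/fail rule are hypothetical illustrations, not the dissertation's actual procedure:

```python
from dataclasses import dataclass

@dataclass
class TaskProfile:
    """The four task factors from Phase 2 (slide 9)."""
    mental_requirements: str   # e.g. "high spatial reasoning"
    working_environment: str   # e.g. "indoor office"
    physical_disposition: str  # e.g. "seated"
    hand_occupation: str       # "hands free" or "hands busy"

@dataclass
class MRComponent:
    """One candidate from the four MR component classes."""
    name: str                  # e.g. "video see-through HMD"
    kind: str                  # media / input / output / tracking
    suits_environments: set    # environments where it is feasible
    requires_hands: bool       # does operating it occupy the hands?

def screen(component: MRComponent, task: TaskProfile) -> bool:
    """Hypothetical pass/fail gate: physically feasible in the task's
    environment, and does not demand hands the task already occupies."""
    env_ok = task.working_environment in component.suits_environments
    hands_ok = not (component.requires_hands
                    and task.hand_occupation == "hands busy")
    return env_ok and hands_ok

task = TaskProfile("high spatial reasoning", "indoor office",
                   "seated", "hands free")
hmd = MRComponent("video see-through HMD", "output",
                  {"indoor office"}, False)
print(screen(hmd, task))  # True: the component survives the screen
```

The dissertation's mapping also weighs usability, not just a binary feasibility gate; the sketch keeps only the gate for brevity.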

  13. Specification: phase 3 • Design specification and guidelines (AEC tasks) • Media Representation • Input Mechanism • Output Mechanism • Tracking Technology

  14. Prototype Development • Vision: To explore Mixed Reality (MR)-based tools that can provide the benefits of both 3D modeling and effective real-time collaboration, to achieve design coordination objectives.

  15. Prototype Development • MRCVE: Mixed Reality-based collaborative virtual environment, realizing design review collaboration through face-to-face conferencing or virtual space conferencing. • Mapping the technology to the design review collaboration task for the MRCVE was carried out using the methodology described earlier.

  16. Prototype Development • Step 1 — analyze the design review task • The composite task decomposes into: • Perceptual tasks: scan, observe; inspect, discriminate; locate, identify • Cognitive tasks: encode, estimate, compare, analyze, plan • Motor task: annotate design (write)

  17. Prototype Development • Step 1 — analyze the design review task (cont'd)

  18. Prototype Development • Step 2 — map the technology to the task: • Media representation: high-fidelity representations; • Input device: tangible input; • Output device: video see-through head-mounted display (ARvision stereoscopic HMD with a color video camera attached); • Tracker: large-scale pattern recognition; • Video and audio communication: commercial NetMeeting software.
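The "large-scale pattern recognition" tracker belongs to the fiducial-marker family of tracking (ARToolKit-style markers were typical for HMD prototypes of this era). A minimal sketch of the same idea using OpenCV's ArUco module as a modern stand-in (OpenCV ≥ 4.7 API; the camera intrinsics and marker size are placeholder values, and this is not the software used in the dissertation):

```python
import cv2
import numpy as np

# Placeholder camera intrinsics; real values come from calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)
MARKER_SIZE = 0.10  # marker side length in metres (assumed)

# 3D corners of a square marker centred at the origin, in marker frame.
h = MARKER_SIZE / 2
obj_pts = np.array([[-h,  h, 0], [ h,  h, 0],
                    [ h, -h, 0], [-h, -h, 0]], dtype=np.float32)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)  # stands in for the HMD-mounted video camera
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        # Recover camera pose relative to the first detected marker;
        # the virtual model is rendered with this pose so it appears
        # anchored to the physical marker.
        _, rvec, tvec = cv2.solvePnP(obj_pts, corners[0].reshape(-1, 2),
                                     K, dist)
        print("marker", int(ids[0][0]), "translation:", tvec.ravel())
cap.release()
```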

  19. Prototype Development • Tangible interface

  20. Prototype Development • Application Scenarios: 1. Face-to-face Scenario 2. Virtual Space Scenario 3. Mixed Scenario 4. Office-to-field Scenario 5. Field-to-office Scenario

  21. Evaluation • Evaluation: • Benefits validation through experiments • To validate the benefits of the MRCVE application scenarios over the prevalent method. • System usability evaluation • Usability engineering evaluation of the current MRCVE prototype against established AR design guidelines.

  22. Evaluation: benefits validation • Design of experiment 1: Face-to-Face Conferencing Scenario vs. Prevalent Design Review • Benchmark: paper-based 3D drawing review collaboration. • Hypotheses: compared to traditional paper-based drawing media, • Hypothesis 1: the MRCD face-to-face scenario will significantly reduce the time to complete the task. • Hypothesis 2: the MRCD face-to-face scenario will significantly reduce the workload of the design review task. • Methodology: • Experiment • Post-test questionnaire, filled in by subjects based on their experience in the experiments. • NASA Task Load Index (TLX) to measure and compare the workload of the alternatives. • Direct observation and monitoring of the subjects' collaborative performance by the experimenter.

  23. Evaluation: benefits validation • Stimulus materials: large-scale, simple models and corresponding 3D drawings adapted from real projects of a BMW contractor. • Subjects: 16 engineering undergraduate and graduate students at Purdue; every two subjects formed a group for each treatment. • Measurements: time of completion and perceived workload. • Procedure: • Training session: subjects were given enough time to practice using the different platforms. • Pre-experiment setting: the two subjects in a group worked with two different sub-models (A and B) in the AutoCAD 3D environment. • Design error education: every subject learned 4 design error patterns (3 known in common). • Experiment: the two subjects sat together and identified errors in model C (A and B combined in a certain way). • Post-session questionnaire: subjects filled in the post-test questionnaire and the NASA TLX rating.

  24. Evaluation: benefits validation • Experimental treatments: paper-based 3D drawing vs. MRCD face-to-face conferencing.

  25. Evaluation: benefits validation • Experimental statistical design: incomplete block design (a single replication of a four-group, two-period crossover design).

  26. Evaluation: benefits validation • Statistical model: Y_ngj = μ + M_n + T_g + P_j + ε_ngj, where • Y = the time of detecting a conflict • μ = the overall mean • M_n = the direct fixed effect of the nth method • T_g = the direct fixed effect of the gth pipe model • P_j = the fixed effect of the jth period • ε_ngj = random fluctuations, independent and normally distributed with mean 0 and variance σ².
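Under this crossover model the fixed effects can be estimated by ordinary least squares. A minimal sketch using Python's statsmodels in place of SAS (the data frame below is made-up toy data in the counterbalanced layout of the design, and the column names are mine):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Toy counterbalanced data: four groups x two periods, with method order
# alternating across groups so that method and period are not confounded.
df = pd.DataFrame({
    "time":   [32.0, 21.5, 24.0, 30.5, 33.5, 22.0, 25.5, 31.0],  # minutes
    "method": ["paper", "mrcd", "mrcd", "paper",
               "paper", "mrcd", "mrcd", "paper"],                # M_n
    "model":  ["A", "B", "A", "B", "B", "A", "B", "A"],          # T_g
    "period": [1, 2, 1, 2, 1, 2, 1, 2],                          # P_j
})

# Y = mu + M_n + T_g + P_j + error, as in the slide's model.
fit = smf.ols("time ~ C(method) + C(model) + C(period)", data=df).fit()
print(fit.summary())
```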

  27. Evaluation: benefits validation • Effect of treatments on time of completion

  28. Evaluation: benefits validation Statistical Results from SAS System

  29. Evaluation: benefits validation An F-test was applied to the model to further validate the simplification. An F-value of 0.052 with a corresponding p-value of 0.95 showed that the dropped terms were insignificant, supporting the simplification.
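A nested-model F-test of this kind compares the full model against the simplified one with an extra-sum-of-squares test. A sketch with statsmodels' anova_lm, reusing the toy data frame from the fitting sketch (the term dropped here, the period effect, is chosen purely for illustration):

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Same toy data frame as in the earlier fitting sketch.
df = pd.DataFrame({
    "time":   [32.0, 21.5, 24.0, 30.5, 33.5, 22.0, 25.5, 31.0],
    "method": ["paper", "mrcd", "mrcd", "paper",
               "paper", "mrcd", "mrcd", "paper"],
    "model":  ["A", "B", "A", "B", "B", "A", "B", "A"],
    "period": [1, 2, 1, 2, 1, 2, 1, 2],
})

full    = smf.ols("time ~ C(method) + C(model) + C(period)", data=df).fit()
reduced = smf.ols("time ~ C(method) + C(model)", data=df).fit()

# Extra-sum-of-squares F-test: a large p-value (like the reported 0.95)
# means the dropped terms explain nothing significant, so the simpler
# model is justified.
print(anova_lm(reduced, full))
```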

  30. Evaluation: benefits validation • Discussion: a t-test was further applied to the model and yielded an estimated performance difference between the two methods of 9.75 minutes (p = 0.0003). [Table: Mean and Median Value of Each Combination]

  31. Evaluation: benefits validation • Effect of treatments on workload (NASA TLX) • F-value is 0.95 and p-value is 0.3385 (insignificant). [Chart: NASA TLX ratings by treatment condition, against the maximum possible rating]
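For context, the NASA TLX workload score being compared here is a weighted mean of six subscale ratings, with weights taken from 15 pairwise comparisons between the dimensions. A minimal sketch of the computation (all ratings and tallies below are invented for illustration):

```python
# NASA TLX: six subscales rated 0-100, weighted by how often each
# dimension "won" its pairwise comparisons (15 pairs in total).
ratings = {  # hypothetical ratings for one subject
    "mental": 70, "physical": 20, "temporal": 55,
    "performance": 40, "effort": 60, "frustration": 30,
}
weights = {  # hypothetical pairwise-comparison tallies, summing to 15
    "mental": 5, "physical": 0, "temporal": 3,
    "performance": 2, "effort": 4, "frustration": 1,
}
assert sum(weights.values()) == 15

overall = sum(ratings[d] * weights[d] for d in ratings) / 15
print(f"weighted TLX workload: {overall:.1f}")  # on a 0-100 scale
```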

  32. Statistical Results for Each NASA TLX Rating Category

  33. Evaluation: benefits validation • Questionnaire Results (Subsection 1): Scale: 1 (poor) to 6 (excellent)

  34. Evaluation: benefits validation • Questionnaire Results (Subsection 2): four-point scale, from totally agree to totally disagree; percentages are listed in that order, and the trailing figure is the reported overall agreement.
• Q1: I felt that 3D interactivity in the MRCVE system aided design comprehension. 25%; 32%; 37%; 6%. (57%)
• Q2: Overall, compared with paper drawing, the AR system better facilitates design collaboration tasks. 25%; 37%; 25%; 13%. (62%)
• Q3: The MRCVE system better facilitated communication. 19%; 12%; 38%; 31%. (69%)
• Q4: The MRCVE system better facilitated creativity. 50%; 50%; 0%; 0%. (100%)
• Q5: The MRCVE system better facilitated problem-solving. 44%; 31%; 19%; 6%. (75%)
• Q6: The AR system increased the overall quality of output from the collaboration. 6%; 38%; 43%; 13%. (44%)
• Q7: The AR system better facilitated the quantity of work I could complete in a given amount of time. 36%; 32%; 20%; 12%. (68%)
• Q8: The AR system increased the quality of my contribution to the project. 32%; 30%; 32%; 6%. (62%)
• Q9: The MRCVE system increased my satisfaction with the outcome of the collaboration. 19%; 55%; 20%; 6%. (74%)
• Q10: The AR system increased understanding between my collaborator and me. 13%; 38%; 25%; 24%. (50%)
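For nearly every question, the trailing percentage equals the combined share of the two most favorable responses. A trivial sketch of that tally, using the Q1 figures from the slide:

```python
# Q1 response distribution on the 4-point scale, most to least favorable
# (percentages copied from the slide).
q1 = [25, 32, 37, 6]

# Reported agreement = share of the two most favorable responses.
agreement = sum(q1[:2])
print(f"Q1 agreement: {agreement}%")  # 57%, matching the reported figure
```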

  35. Evaluation: benefits validation • Design of experiment 2: Virtual Space Conferencing Scenario vs. Web-based Design Collaboration • Benchmark: NavisWorks Roamer. • Hypotheses: compared to NavisWorks, • Hypothesis 1: the MRCD virtual space scenario will significantly reduce the time to perform the design review task. • Hypothesis 2: the MRCD virtual space scenario will significantly reduce the workload of the design review task. • Methodology: • Experiment • Questionnaire, filled in by subjects based on their experience in the experiments. • NASA Task Load Index (TLX) to measure and compare the workload of the alternatives. • Direct observation and monitoring of the subjects' collaborative performance by the experimenter.

  36. Evaluation: benefits validation • Stimulus materials: cluttered 3D models adapted from real projects of a BMW contractor. • Subjects: 16 engineering undergraduate and graduate students at Purdue; every two subjects formed a group for each treatment. • Measurements: time of completion and perceived workload. • Procedure: • Training session: subjects were given enough time to practice using the different platforms. • Pre-experiment setting: the two subjects in a group worked with two different sub-models (A and B) in the AutoCAD 3D environment. • Design error education: every subject learned 4 design error patterns (3 known in common). • Experiment: the two subjects sat together and identified errors in model C (A and B combined in a certain way). • Post-session questionnaire: subjects filled in the post-test questionnaire and the NASA TLX rating.

  37. Evaluation: benefits validation • Treatments: NavisWorks collaboration vs. MRCD virtual space conferencing.

  38. Evaluation: benefits validation • Experimental statistical design • The same as for experiment 1.

  39. Evaluation: benefits validation • Effect of treatments on time of completion

  40. Evaluation: benefits validation Statistical Results from SAS System

  41. Evaluation: benefits validation An F-test was applied to the model to further validate the simplification. An F-value of 0.547 with a corresponding p-value of 0.59 showed that the dropped terms were insignificant, supporting the simplification.

  42. Evaluation: benefits validation • Discussion: a t-test was further applied to the model and yielded an estimated performance difference between the two methods of 17.2 minutes (p = 0.0001). [Table: Mean and Median Value of Each Combination]

  43. Evaluation: benefits validation • Effect of treatments on workload (NASA TLX) • F-value is 4.92 and p-value is 0.047 (significant). [Chart: NASA TLX ratings by treatment condition, against the maximum possible rating]

  44. Statistical Results for Each NASA TLX Rating Category

  45. Evaluation: benefits validation • Questionnaire Results (Subsection 1): Scale: 1 (poor) to 5 (excellent)

  46. Evaluation: benefits validation • Questionnaire Results (Subsection 2): five-point scale, from totally agree through neutral to totally disagree; percentages are listed in that order, and the trailing figure is the reported overall agreement.
• Q1: I felt that 3D interactivity in the MRCVE system aided design comprehension better than the 3D interactivity in NavisWorks. 31%; 38%; 13%; 13%; 5%. (69%)
• Q2: Overall, compared with NavisWorks, the AR system better facilitates design collaboration tasks. 13%; 56%; 0%; 26%; 5%. (69%)
• Q3: The MRCVE system better facilitated communication. 19%; 44%; 6%; 19%; 12%. (63%)
• Q4: The MRCVE system better facilitated creativity. 19%; 44%; 26%; 6%; 5%. (63%)
• Q5: The MRCVE system better facilitated problem-solving. 13%; 52%; 6%; 29%; 0%. (65%)
• Q6: The AR system increased the overall quality of output from the collaboration. 13%; 44%; 13%; 31%; 0%. (57%)
• Q7: The AR system better facilitated the quantity of work I could complete in a given amount of time. 44%; 26%; 13%; 13%; 4%. (70%)
• Q8: The AR system increased the quality of my contribution to the project. 26%; 44%; 6%; 18%; 6%. (70%)
• Q9: The MRCVE system increased my satisfaction with the outcome of the collaboration. 13%; 50%; 19%; 13%; 5%. (63%)
• Q10: The AR system increased understanding between my collaborator and me. 13%; 19%; 31%; 26%; 11%. (32%)

  47. Evaluation: Usability • Heuristic evaluation • Against AR usability guidelines (Gabbard 1997) • Against our specification and design guidelines • Formative user-centered evaluation • The two experiments described earlier also served as usability experiments.

  48. Evaluation: Usability • Results and interpretation of the usability analysis for the face-to-face conferencing scenario. Scale: 1 (very little) to 6 (very much)

  49. Evaluation: Usability • Would you be resistant to using the face-to-face conferencing scenario system or similar MR systems in the future? About 81.3% (13 subjects) gave a negative response. • Would you embrace the opportunity to use the face-to-face conferencing scenario system again in the future? About 69% (11 subjects) gave a positive response.

  50. Evaluation: Usability • Results and interpretation of the usability analysis for the virtual space conferencing scenario. Scale: 1 (very little) to 5 (very much)
