
Preparation and Planning for ICSE 2006 (28th International Conference on Software Engineering)


  1. Preparation and Planning for ICSE 2006 (28th International Conference on Software Engineering) Leon J. Osterweil (ljo@cs.umass.edu) University of Massachusetts Amherst, MA 01003 USA SinoSoft 2003 Beijing, China 3 December 2003

  2. Who am I? • Professor of Computer Science • University of Massachusetts, Amherst • Software Engineering Researcher • Software Process Definition • Software Analysis • Dean, College of Natural Sciences & Mathematics • General Chair, ICSE 2006 • 28th Int’l. Conference on Software Engineering

  3. Why Are We (with Mary Lou Soffa) Here? • Plan for ICSE 2006 • Meet Research colleagues • Plan events leading up to ICSE 2006 • Support Chinese initiatives in software engineering

  4. What is ICSE? • Largest, most important, meeting on Software Engineering Research • Also covers practice, history, future • Actually a set of colocated meetings • Many accompanying workshops • Many tutorials • Usually 1000-1500 attendees • Held annually around the world • In US in odd-numbered years • In Europe every four years • Previous Asian meetings • Tokyo, Osaka, Singapore

  5. Past ICSEs • US: Washington, San Francisco, Atlanta, San Diego, Orlando, Monterey (CA), Pittsburgh, Austin, Baltimore, Seattle, Boston, Los Angeles, Portland, St. Louis (2005) • International: Munich (Germany), Tokyo (Japan), London (England), Singapore, Nice (France), Melbourne (Australia), Sorrento (Italy), Berlin (Germany), Osaka (Japan), Limerick (Ireland), Toronto (Canada), Edinburgh (Scotland, 2004)

  6. ICSE 2006 • In Shanghai • First time on the Asian continental mainland • General Chair • Leon J. Osterweil • Program Co-Chairs • Mary Lou Soffa • Dieter Rombach • Organizer • Prof. Dehua Ju

  7. Component Structure of ICSE • Main Conference • Three days, 4-5 parallel tracks • Satellite conferences • 2-3 smaller ones, both before and after ICSE • Many workshops • 12-15, mostly one or two days before ICSE • 30-60 attendees • Many tutorials • As many as 25 • Both before and after ICSE • Tools/trade exposition (?) • If there is interest

  8. Possible Events Prior to 2006 • In 2005 • One large (200 attendees?) conference in Shanghai • One or more workshops around China • In 2004 • Research seminars in Beijing and Shanghai • Research tutorial series around China • Workshop somewhere in China

  9. What are our research interests? • Leon Osterweil • Software Process • Software Analysis • Mary Lou Soffa • Software Analysis, Testing, and Debugging • Compilers and Optimizers • Dieter Rombach • Empirical Methods • Reviews and Walkthroughs

  10. Research Interests: Leon J. Osterweil

  11. What do I mean by “process”? • Activities like development, verification, evolution, etc. of (e.g.) software products • High-level processes: • Develop requirements, Do Object-Oriented Design, Formally verify • Low-level processes: • Archive test results, Verify a lemma • Often concurrent • Often coordinate people and automated systems • Often resource sensitive • Usually (regrettably) informal or undefined

  12. My interest in process: Reasoning to assure quality products and services • Superior quality products from superior processes • Build quality in, don’t “test in” quality (manufacturing) • Many observed “process errors” • Real processes are intricate • Automation can help/support • Reasoning to assure quality in processes • Simulations support reasoning • Other approaches too (e.g., static analysis)

  13. Other Reasons for Interest in Process • Communication • Coordination • Intuitive understanding • Prediction/projection • Verification • Training • Automation • Deep understanding • Etc.

  14. Appropriate Modeling Notation is Key • Different formalism approaches support different goals • Formalisms vary in • Rigor • Precision (semantic detail) • Semantic scope • Clarity Which formalisms are effective in demonstrably supporting which kinds of reasoning?

  15. Processes are software

  16. Processes are software They should be Engineered

  17. Processes are software They should be Engineered Using appropriate languages

  18. Process Definition Approaches • Natural language • Structured natural language • Pictorial representations • DFDs • FSAs • Petri Nets • Object technologies • Programming languages Directly analogous to product definition approaches. Different approaches for different phases and purposes.

  19. Process definition language issues • Blending proactive and reactive control • Coordinating human and automated agents • Without favoring either • Specification of resources • Exception management • Real time specification

  20. The Little-JIL Process Language • Vehicle for exploring language abstractions for • Reasoning (rigorously defined) • Automation (execution semantics) • Understandability (visual) • Supported by • Visual-JIL graphical editor • Juliette interpreter • Evaluation by application to broad domains • A third-generation process language • A “work in progress”

  21. Little-JIL Example: “Smart” Regression Test Process [Process diagram: steps include RegressionTest, GetArtifacts, SelectTests, PerformTest, ExecuteTest (GetExecutable, GetTestCases, GetInputData, RunTest, GetExpectedOutputData, CompareResults), NoteFailure, ReportFailure, ReportResults, Stop]

  22. The “Step” is the central Little-JIL abstraction [Step-icon diagram: interface badge (includes resource specs), prerequisite badge, postrequisite badge, the step name, substep sequencing, reactions, and exception handlers]
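The step structure described on this slide can be sketched as a plain data structure. This is an illustrative Python model only; the field names (`resources`, `prerequisite`, and so on) are assumptions mirroring the badge names on the slide, not actual Little-JIL syntax, which is a visual language.

```python
# A hypothetical sketch of the Little-JIL "step" abstraction.
# Field names mirror the slide's badges and are illustrative only.
from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Step:
    name: str                                            # the step name
    resources: list = field(default_factory=list)        # interface badge: resource specs
    prerequisite: Optional[Callable[[], bool]] = None    # prerequisite badge
    postrequisite: Optional[Callable[[], bool]] = None   # postrequisite badge
    substeps: list = field(default_factory=list)         # substep sequencing
    handlers: dict = field(default_factory=dict)         # scoped exception handlers

# Steps form a tree: a parent step decomposes into substeps.
review = Step(name="ReviewDesign", resources=["Designer"])
parent = Step(name="Develop", substeps=[review])
assert parent.substeps[0].name == "ReviewDesign"
```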

  23. Trivial SW Development Process

  24. Trivial Example Elaboration of Requirements Step

  25. Trivial Example Elaboration of Design Step

  26. Requirements Rework

  27. Requirements Rework Invocation of step originally defined as substep of Requirements

  28. Requirements Rework Same exception thrown Invocation of step originally defined as substep of Requirements

  29. Requirements Rework Same exception thrown Invocation of step originally defined as substep of Requirements Different invocation context -> different response

  30. What does this tell us? • Abstraction/reinstantiation is necessary • For an adequately articulate language • For clear understanding of “rework” Other language features similarly motivated By specific examples and experiences

  31. Little-JIL Proactive Flow Specified by Four Sequencing Kinds • Sequential: in order, left to right • Parallel: any order (or in parallel) • Choice: choose from agenda; only one choice allowed • Try: in order, left to right • Iteration usually through recursion • Alternation using pre/post requisites
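The four sequencing kinds can be sketched as small routines. This is a simplified Python illustration under an assumed model in which substeps are callables returning True on success; it is not Juliette's real interpreter, and the function names are invented.

```python
# Simplified sketches of Little-JIL's four proactive sequencing kinds.
# Substeps are modeled as callables returning True on success (assumption).
import random

def run_sequential(substeps):
    # Sequential: in order, left to right; all must succeed.
    return all(s() for s in substeps)

def run_parallel(substeps):
    # Parallel: any order; here approximated by shuffling, then running.
    steps = list(substeps)
    random.shuffle(steps)
    return all(s() for s in steps)

def run_try(substeps):
    # Try: in order, left to right, until one alternative succeeds.
    return any(s() for s in substeps)

def run_choice(substeps, chooser):
    # Choice: the agent picks exactly one alternative from its agenda.
    return substeps[chooser(len(substeps))]()

ok  = lambda: True
bad = lambda: False
assert run_sequential([ok, ok]) is True
assert run_try([bad, ok]) is True              # second alternative succeeds
assert run_choice([bad, ok], lambda n: 1) is True
```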

  32. Example of Choice and Try Step Kinds [Diagram: Implement with alternatives Reuse_Implementation and Custom_Implementation; Look_for_Inheritance, Look_for_Parameterized_Class, Look_for_Objects_to_Delegate_to] Main goal: support human flexibility

  33. Reactive Control through Scoped Exception Handling • Steps may have one or more exception handlers • React to exceptions thrown in descendent steps • Handlers are steps themselves [Diagram: DevelopInterfaceFiles step with InterfaceFilesCompile and InterfaceFilesDon’tCompile]

  34. Four different continuations on exception handlers • Complete • Handler was a “fixup” and now it is OK to go back • Continue • Handler brought step to an acceptable postcondition state and it is OK to go on • Restart • SNAFU. Handler cleaned up mess, now OK to redo • Rethrow • Go up to parent and hope the parent knows what to do
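The four continuations can be sketched against a simple substep-list model of a parent step. This is an assumed simplification for illustration, not Juliette's implementation: Complete ends the parent step, Continue moves to the next substep, Restart redoes the step from the start, and Rethrow propagates upward.

```python
# Sketch of the four exception-handler continuations (illustrative model).
COMPLETE, CONTINUE, RESTART, RETHROW = "complete", "continue", "restart", "rethrow"

def run_step(substeps, handler, continuation, _restarts=0):
    """Run substeps left to right; on an exception, run the handler,
    then apply its continuation. Returns indices of substeps that ran."""
    ran = []
    i = 0
    while i < len(substeps):
        try:
            substeps[i]()
            ran.append(i)
        except Exception:
            handler()                       # the handler is itself a step
            if continuation == COMPLETE:
                return ran                  # fixup done; step counts as complete
            elif continuation == CONTINUE:
                pass                        # acceptable state; go on to next substep
            elif continuation == RESTART:
                if _restarts > 0:
                    raise                   # guard against endless restarts
                return run_step(substeps, handler, continuation, _restarts + 1)
            else:                           # RETHROW: let the parent step decide
                raise
        i += 1
    return ran

ok = lambda: None
def fail():
    raise RuntimeError("substep failed")

assert run_step([ok, fail, ok], lambda: None, COMPLETE) == [0]
assert run_step([ok, fail, ok], lambda: None, CONTINUE) == [0, 2]

attempts = {"n": 0}
def flaky():                                # fails once, then succeeds
    attempts["n"] += 1
    if attempts["n"] == 1:
        raise RuntimeError("first attempt fails")

assert run_step([flaky], lambda: None, RESTART) == [0]
```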

  35. Examples of Resources • Input artifacts: requirements document, locks on key artifacts • People: designers with varying skills • Tools: ROSE • Agents: Each step has a distinctly identified unique resource responsible for execution of the step (and all of its substeps)

  36. Resource Model: Requires and Whole-Part Relationships [Diagram: resource hierarchy with Human Design Team (Bob, Carol, Ted, Alice), roles Hardware, Software, Data Manager, Designer, and machines PC, Sparc]

  37. Resource Request Example [Diagram: steps IdentifyRelationships, SpecifyRelationships, RefineRelationships; request specifies agent: OODDesigner (expert), tool: ClassDiagramEditor, artifact: DiagramReposLock] A resource request is a query on the resource specification repository
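The idea of a resource request as a query on a repository can be sketched as follows. The repository contents, attribute names, and exact-match rule are assumptions made up for illustration; Little-JIL's actual resource model is richer.

```python
# Sketch: a resource request as a query over a resource repository.
# Entries and attributes are invented for illustration.
repository = [
    {"name": "Bob",   "kind": "Designer", "skill": "expert"},
    {"name": "Carol", "kind": "Designer", "skill": "novice"},
    {"name": "ROSE",  "kind": "ClassDiagramEditor"},
]

def acquire(**constraints):
    """Return the first repository entry satisfying every constraint."""
    for resource in repository:
        if all(resource.get(k) == v for k, v in constraints.items()):
            return resource
    raise LookupError(f"no resource matches {constraints}")

# A step's agent request resolves to a concrete resource:
agent = acquire(kind="Designer", skill="expert")
assert agent["name"] == "Bob"
```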

  38. Juliette: The Little-JIL Interpreter • Juliette is distributed • Every step has its own interpreter • Interpreter executed on agent’s platform • Communication via Agendas • One for each agent and service • Services include: • Object Management • Resource Management • Step sequence Management • Agenda Management
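The agenda-based communication on this slide can be sketched as per-agent work queues. A FIFO queue per agent is an assumption for illustration; the actual Juliette services (object, resource, and step-sequence management) are not modeled here.

```python
# Sketch of agenda-based communication: one queue of pending step
# assignments per agent (FIFO ordering is an assumption).
from collections import defaultdict, deque

agendas = defaultdict(deque)

def post(agent, step):
    agendas[agent].append(step)      # a step interpreter posts work

def next_item(agent):
    return agendas[agent].popleft()  # the agent takes its next item

post("OODDesigner", "IdentifyRelationships")
post("OODDesigner", "SpecifyRelationships")
assert next_item("OODDesigner") == "IdentifyRelationships"
```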

  39. Achieving Product Quality Through Quality Processes • Through reasoning about process characteristics • Analogous to software product measurement and evaluation • Dynamic monitoring of process execution • Like interactive debugging and tracing • Simulations can be predictive • Tracing provides audit trails • Need static analysis of processes too • Prove absence of pathologies

  40. Process Reasoning Examples • Is the process correct (e.g., consistent with requirements)? • How fast will the process run? • How to be sure that humans perform their jobs? • Train them, monitor their participation • Are resources adequate, efficiently used? • How to improve the process • And be sure that changes are improvements? • Simulations can spot problems • Static analysis can verify for all executions

  41. The Capability Maturity Model (CMM) is a Specific Approach to Software Process Improvement

  42. The Capability Maturity Model (CMM) is a Specific Approach to Software Process Improvement It is a test plan for black box testing of processes

  43. The Capability Maturity Model (CMM) is a Specific Approach to Software Process Improvement It is a test plan for black box testing of processes Can’t test quality into software process products either

  44. Current Evaluation Projects • Software development: • Perpetual testing: programming flexibly evolvable integrated testing and analysis • Configuration management • Collaborative object-oriented design • Performing data flow analysis processes • Robot coordination • Distributed scientific statistical data processing • Medical and nursing processes • E-commerce processes such as auctions • E-government processes

  45. Robot Coordination Process

  46. Scientific Statistical Data Processing • How do scientists do their work? • Reproducing results is the core of all science • Should help in reproducing results • Evidence that this has been done (dynamic) • Determine if there are any statistical processing pathologies • Avoid false “findings”

  47. Produce a 3-D Forest Model [Diagram: multiple Fly-Over Data sets are combined into a Mosaic, which yields the 3D Model] Maybe plan a fly-over, maybe just get a different dataset… Creates “new” versions of the fly-over data

  48. Medical/Nursing Processes • Defining procedures, protocols, formally, rigorously, completely • Complicated by exceptions • Traces provide audit trails • Analysis can find flaws, omissions, etc.

  49. Top Level Medical Process:Blood Transfusion

  50. Collect Blood Substep
