
T-76.4115 Iteration Demo




  1. T-76.4115 Iteration Demo
     Team DTT, I1 Iteration, 7.12.2005

  2. Agenda
     • Project status (10 min): achieving the goals of the iteration, project metrics
     • Work results (20 min): project plan, requirements document
     • Used work practices (5 min)

  3. Introduction to the X-Connector project (1/2)
     • Domain: process simulation; a simulation model with several simulation engines (a cluster)
     • Deliverables:
       • a user interface for defining data exchange between the cluster members
       • a test server environment for prototyping the data exchange
       • a data synchronization scheme
     • Technology:
       • web service technology based on the OPC specifications
       • user interface as an Eclipse plug-in
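
     To make the data-exchange idea concrete, the sketch below shows, in plain Java, the kind of
     information a single connection definition between two cluster members might carry. The class
     and field names (DxConnection, sourceServerUrl, and so on) are illustrative assumptions, not
     the project's actual API.

         /**
          * Hypothetical sketch of a data-exchange connection definition: one value
          * flows from an item on a source server to an item on a target server.
          * Names are illustrative, not the project's real API.
          */
         public class DxConnection {
             private final String sourceServerUrl;  // e.g. an OPC XML-DA endpoint
             private final String sourceItemPath;   // item in the source server's DA tree
             private final String targetServerUrl;
             private final String targetItemPath;

             public DxConnection(String sourceServerUrl, String sourceItemPath,
                                 String targetServerUrl, String targetItemPath) {
                 this.sourceServerUrl = sourceServerUrl;
                 this.sourceItemPath = sourceItemPath;
                 this.targetServerUrl = targetServerUrl;
                 this.targetItemPath = targetItemPath;
             }

             @Override
             public String toString() {
                 return sourceServerUrl + "/" + sourceItemPath
                      + " -> " + targetServerUrl + "/" + targetItemPath;
             }
         }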

  4. Introduction to the X-Connector project (2/2)
     • The customer of the project is VTT Industrial Systems
     • Prioritized objectives:
       • user interface implementation and a test server to store data exchange connection information
       • synchronization implementation for the test server cluster

  5. Status of the iteration’s goals
     • Goal 1: Design core architecture
       • OK: components of the software (UI, DX interface, servers) are defined more thoroughly; the design of the key components is finalized
     • Goal 2: Plan and document quality assurance activities and practices
       • OK: documented in project plan chapter 5.2
     • Goal 3: Produce a first working version of the software with some core functionality
       • OK: user interface with limited functionality
       • Partially completed: static DX configuration, test server
     • Goal 4: Improve working methods
       • OK: communication improved (MS Messenger, Wiki); more effective meeting practices used
     • Goal 5: Create / update SEPA diaries
       • OK
     • Goal 6: Produce the plan for Implementation iteration 2
       • OK: general-level planning done; a more specific plan will be delivered at the beginning of I2

  6. Status of the iteration’s deliverables (1/2)
     Documents:
     • Updated project plan
       • OK: chapter 5.2 (Quality assurance) added, document updated
     • Updated requirements document
       • OK: descriptions of selected use cases added
     • Technical specification
       • OK: updated version of the document
     • Test cases, test report and test log
       • UI tests OK, but the integrated system could not be tested
     • Updated SEPA diaries
       • OK: all practices were used in I1
     • User’s instructions
       • OK: initial version of the user’s manual
     • Progress report
       • OK

  7. Status of the iteration’s deliverables (2/2)
     Software:
     • User interface (limited functionality)
       • Create a workspace (UC1.1): OK
       • Add/remove clusters (UC2.1, UC2.3): unfinished
       • Add servers to clusters (UC3.1): OK
       • Add and remove connection groups (UC5.1, UC5.2): unfinished
       • Add and remove connections to groups (UC5.4, UC5.5): unfinished
       • Browse OPC servers (UC8.1): OK
     • TestSimulator Data Access
       • Browsing of servers: OK
       • Provides simulated server information from a static source: OK
       • Provides dummy data: OK
     • Data eXchange (UC4.1 – UC6.2)
       • Provides the DX web service interface: OK
       • DXConfigurationWrapper: removed (integrated into DXConfiguration)
       • OPC XML DX Kit: OK
     • TestSimulator Data eXchange
       • DX features: unfinished

  8. Realization of the tasks (1/2)
     • Status and effort distribution per task group (4.12.2005)
     • 44 tasks in total, 3 added during the iteration
     • The remaining project management effort goes to the iteration demo and I2 planning
     • The remaining programming effort is mainly for the task ”Implement Server DX Functionality”, which was not finished on time
     • The remaining QA effort is for the task ”Test DX configuration as a system”, which was postponed to I2
     • The SEPA practices, especially heuristic evaluation, are better suited to I2

  9. Realization of the tasks (2/2)
     • Realized effort smaller than estimated (~425 h vs. 514 h)
     • The effort needed to achieve the goals would have been ~460 h

  10. Working hours by person
     Realized hours in this iteration:
     • Coding started one week behind schedule
       • 20% of the effective development time was lost
       • The Christmas vacation will be (at least) one week shorter
     • The separate components could not be integrated, because one DX component was not finished on time
       • System testing could not be done (Ville and Tomi could not use their 20 budgeted hours)

  11. Working hours by person
     [Charts: realized hours in the I1 iteration, the plan at the beginning of the iteration, and the latest plan]
     • 51% of the total effort used at the moment
     • Work done during the Christmas vacation is counted toward I2

  12. Quality metrics
     Bug metrics:
     • No critical or blocker bugs found
     • 4 unfixed defects at the moment (all minor)
     • Weekly code reviews

  13. Quality assessment
     • The user interface showed good quality
       • No blocker, critical or major defects!
       • Thorough testing of the implemented features
       • Only 4 minor defects were found
     • The OPC XML DX Kit has been tested only by the developers during development
       • A formal test session will be arranged after the integration
     • The quality of the implemented software modules is good, but the integration has not yet been done, so no conclusions about the quality of the system as a whole can be drawn
     Legend
     • Coverage: 0 = nothing, 1 = we looked at it, 2 = we checked all functions, 3 = it’s tested
     • Quality: ☺ = quality is good, 😐 = not sure, ☹ = quality is bad

  14. Software size in lines of code (LOC) as of 4.12
     • The number for each component shows the LOC without comments; the percentage indicates the share of comments in the total LOC
     • The PP column shows the LOC for the code provided by the customer
     • DXConfiguration and the OPC XML DX Kit contain auto-generated code as well as hand-written code
     • Because the project contains both auto-generated code and code obtained from the customer, LOC is not a very reliable metric here
     • It is, however, interesting to use it for comparing the code output of the different components (a counting sketch follows below)
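
     As an illustration of how a comment-excluding count like this can be produced, here is a
     minimal sketch. It assumes Java-style // and /* */ comments, ignores edge cases such as
     code and a comment sharing a line, and is not the tool the team actually used.

         import java.io.IOException;
         import java.nio.file.Files;
         import java.nio.file.Paths;
         import java.util.List;

         /** Rough LOC counter: skips blank lines and Java-style comments. */
         public class LocCounter {
             public static void main(String[] args) throws IOException {
                 List<String> lines = Files.readAllLines(Paths.get(args[0]));
                 int loc = 0;
                 boolean inBlockComment = false;
                 for (String raw : lines) {
                     String line = raw.trim();
                     if (inBlockComment) {
                         // Stay inside /* ... */ until the closing marker appears.
                         if (line.contains("*/")) inBlockComment = false;
                         continue;
                     }
                     if (line.isEmpty() || line.startsWith("//")) continue;
                     if (line.startsWith("/*")) {
                         if (!line.contains("*/")) inBlockComment = true;
                         continue;
                     }
                     loc++;  // a non-blank, non-comment line counts
                 }
                 System.out.println(loc + " non-comment lines");
             }
         }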

  15. Changes to the project
     • Goals of the project
       • Implementation of the fast data access was removed
     • Goals of the coming iterations
       • Some features planned for I1 are not yet finished and have to be postponed
       • We will finish most of these before the official start of I2
       • The synchronization scheme is the main goal of I2
     • Requirements
       • New requirements were added
       • Many requirements have been re-specified during I1
     • Technologies
       • Client-side web service functions were implemented in Java/Axis instead of C++/gSoap (see the sketch below)
       • DXConfigurationWrapper was removed
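
     For context on the Java/Axis choice, the dynamic-invocation style offered by Axis 1.x looks
     roughly like the sketch below. The endpoint URL is a placeholder, and a real OPC XML-DA
     client would normally use typed stubs generated from the WSDL rather than an untyped call;
     GetStatus appears here only because it is the simplest operation in the OPC XML-DA interface.

         import javax.xml.namespace.QName;
         import org.apache.axis.client.Call;
         import org.apache.axis.client.Service;

         /** Minimal Axis 1.x dynamic-call sketch against an OPC XML-DA endpoint. */
         public class DxStatusClient {
             public static void main(String[] args) throws Exception {
                 // Placeholder endpoint; in the project this would be a DA/DX server URL.
                 String endpoint = "http://localhost:8081/OpcXmlDaServer";

                 Service service = new Service();
                 Call call = (Call) service.createCall();
                 call.setTargetEndpointAddress(new java.net.URL(endpoint));
                 // GetStatus is part of the OPC XML-DA 1.0 web service interface.
                 call.setOperationName(new QName(
                         "http://opcfoundation.org/webservices/XMLDA/1.0/", "GetStatus"));

                 Object result = call.invoke(new Object[] {});
                 System.out.println("Server replied: " + result);
             }
         }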

  16. Risks
     • Current situation regarding the risks
       • Risk log in the project plan: 10 risks with their effects, estimated impacts and probabilities, and the persons responsible
       • 2 risks were realized during I1:
         • the customer’s CVS server was inaccessible for a week and delayed the start of the development work
         • problems in understanding the project domain caused delays
     • Identified risks
       • A team member quits in the middle of the project or something happens (accident, sickness, etc.)
       • New technologies cause problems
       • The customer is developing some software components that will be used in the project but fails to deliver them in time
       • Requirements change in the middle of the project
       • The workload of the whole project is underestimated
       • The workload of iteration I1 is underestimated and the (software) deliverables are not finished on time
       • Developed software components are incompatible
       • One of the group members has difficulties staying on schedule
       • Difficulties in understanding the project domain
       • Problems with the customer’s CVS server

  17. Results of the iteration
     • System architecture
     • Updated requirements document
     • QA plan
     • Demonstration of the developed software

  18. System architecture
     • The system consists of a server and a user interface
     • The server is implemented in C++ and the UI components in Java

  19. Requirements document
     • Functional requirements
       • 21 functional requirements (and 1 deleted)
       • New functional requirements have been added
       • Existing requirements have been re-specified
     • Use cases
       • 31 use cases (and 4 deleted)
       • Many additions during I1
       • Existing use cases have been re-specified
       • Use case IDs have been renamed
       • The use case list has been re-organized
       • Detailed descriptions of selected use cases
     • Non-functional requirements
       • No changes

  20. QA plan
     • 1 test session
       • Software modules (except the UI) were tested only by the developers (no test sessions)
       • System integration was postponed, so the whole system could not be tested
     • BugZilla was not really used during I1
       • No real need, because there has been only 1 developer per module
       • Bugs found in the test session have been reported in BugZilla
     • Weekly code reviews were held
       • No defects were found in the code reviews
       • Good for coming up with solution ideas and for keeping the management team up to date

  21. Demo script
     • Server
       • Start the server
       • Use the server specification text file as a parameter
       • Locate the IP address of the server
     • User interface & server
       • Start up the user interface in Eclipse
       • Connect to the first DA server, the OPC Foundation’s test server (a reachability check for these endpoints is sketched after this list)
         • http://opcfoundation.org/XmlDaSampleServer/Service.asmx
         • (or, in case it is not online, http://www.tswinc.us/xmldademo/xml_sim/OpcXmlDaServer.asmx)
         • port 80
       • Connect to the test server
         • port 8081
       • Browse the DA trees
         • open subtrees
         • show properties
       • Drag & drop nodes between the two servers
         • enter the needed attributes in the window that opens
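
     Because the demo depends on public test servers being online, a quick reachability check
     along the following lines can pick the live endpoint before the demo starts. This is an
     illustrative sketch using the two URLs listed above, not part of the delivered software.

         import java.net.HttpURLConnection;
         import java.net.URL;

         /** Picks the first reachable OPC XML-DA demo endpoint (illustrative only). */
         public class EndpointCheck {
             static final String[] CANDIDATES = {
                 "http://opcfoundation.org/XmlDaSampleServer/Service.asmx",
                 "http://www.tswinc.us/xmldademo/xml_sim/OpcXmlDaServer.asmx"
             };

             public static void main(String[] args) {
                 for (String candidate : CANDIDATES) {
                     try {
                         HttpURLConnection conn =
                                 (HttpURLConnection) new URL(candidate).openConnection();
                         conn.setRequestMethod("HEAD");
                         conn.setConnectTimeout(5000);
                         if (conn.getResponseCode() < 500) {
                             System.out.println("Using " + candidate);
                             return;
                         }
                     } catch (Exception e) {
                         // Unreachable or timed out; try the next candidate.
                     }
                 }
                 System.out.println("No demo server reachable.");
             }
         }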

  22. Used work practices
     Practices used in the I1 iteration:
     • Iterative development: OK
     • Iteration planning: OK, the task list had to be updated several times
     • Documenting: OK
     • Risk management: risk log updated
     • Time reporting: the effort management tool was used to produce status reports weekly
     • Size reporting: OK
     • Version control: CVS in use; all versions of the documents were published on the project’s web site
     • Communication: the problems of the PP iteration were tackled with Messenger and the Wiki
     • Meeting practices: weekly meetings were more effective than in PP
     • Defect tracking: BugZilla will be used more in I2
     • Heuristic evaluation: SEPA topic
     • Refactoring: SEPA topic

  23. Thank you! Any questions?
