
www.whatsfordinner.com




1. www.whatsfordinner.com
Bryce Rodgers, Kent Warner, Matt Heckman

2. The Plan
• Schedule was:
  • Iteration 1 development: 10/14 through 10/30
    • Core of the classifier, DB initialized, blank web pages
  • Iteration 2 development: 11/3 through 11/15
    • DB–website and DB–classifier connections; add more pages (food, meal)
  • Iteration 3 development: 11/18 through 12/1
    • Site populates the DB, classifier recommends from the DB, touch-ups to pages & 'extra' pages/features (ya right…)
  • Test on 12/2; final fixes 12/4 through 12/7

3. The Reality
• Deliveries were:
  • Iteration 1 completed 11/9
    • Most of it was finished days before, by 11/8
  • Iteration 2 completed over Thanksgiving break
    • Everything connected properly; next was adding the guts
  • Iteration 3 not completed
    • Most important parts completed first: getting a recommendation from the DB, navigation, pages/forms, adding data to the DB
• Tested on 12/8 and 12/9
  • Test doc started on 12/2; it outlined the test plan for the last days

4. What was Delivered?
• Register an account with the site
• An engine that focuses on choosing what it knows you like (a rough sketch follows this slide)
  • If it has nothing to go on, it recommends at random until it does
  • Each recommendation must be voted like/dislike
  • The engine can catch up to the user quickly
  • Recommends meals based on a scoring system plus Bayesian probability
  • Records meals when you accept the recommendation
• Header bar with navigation (most pages/links removed)
• Add new data to the DB
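The slides don't show the classifier's actual code, but the engine described above can be sketched roughly. The class and method names below are hypothetical, and Laplace smoothing stands in for whatever "scoring system + Bayesian probability" the team actually used: each meal's score estimates P(like | meal) from the user's like/dislike votes, and the pick is random until there is any history.

```java
import java.util.*;

// Hypothetical sketch of the recommendation engine described in slide 4.
public class MealRecommender {
    private final Map<String, int[]> votes = new HashMap<>(); // meal -> {likes, dislikes}
    private final Random random = new Random();

    // Record the mandatory like/dislike vote on a recommendation.
    public void vote(String meal, boolean liked) {
        votes.computeIfAbsent(meal, m -> new int[2])[liked ? 0 : 1]++;
    }

    // Laplace-smoothed estimate of P(like | meal); 0.5 for unseen meals.
    private double score(String meal) {
        int[] v = votes.getOrDefault(meal, new int[2]);
        return (v[0] + 1.0) / (v[0] + v[1] + 2.0);
    }

    // Highest-scoring meal once there is any history; random until then.
    public String recommend(List<String> meals) {
        if (votes.isEmpty()) {
            return meals.get(random.nextInt(meals.size()));
        }
        return meals.stream()
                    .max(Comparator.comparingDouble(this::score))
                    .orElseThrow(() -> new IllegalArgumentException("no meals"));
    }
}
```

A caller would load the meal list from the DB, present recommend()'s pick, record the vote with vote(), and (per the slide) save the meal when the user accepts the recommendation.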

5. What was left out?
• Browse All Content page
• MealMap calendar page
• 'Launchy'/iTunes-style search bar
• Few degrees of freedom when creating meals & food items
• Determining the user's taste profile ahead of time
• Scoring meals numerically rather than like/don't like
• JavaScript to simplify data entry
• Nearly all 'bells & whistles' of the classifier, including:
  • Recommending new meals on request, when the user 'Reports a Meal'
  • Recommending from within a small subset of meals
  • User control over the recommendation algorithm
• Never launched on the WWW
• Recipes, ingredients, and tracking what's in your cupboard

6. What went well?
• Diverse skill set; team members knew their domains well
• Google Chat and Google Code were very convenient
• Good IDE and dev-environment configuration
• Good ideas, even if we had too many sometimes
• Requirement/stakeholder analysis of the product
• Team members are veterans of managed development
• Outsourcing some of the data-gathering tasks
• The design helped a lot with generating HTML

7. What Happened?
• We were planning more than we could accomplish
• Very few meetings after being 'turned loose'
• Two people live in CR
• Inefficiencies with technology choices (servlets vs. JSP, JavaScript)
• So many ideas, with almost zero rejection of ideas, led to conflicting views of how we were proceeding
• The schedule was not very specific and wasn't at the front of our minds
• Next time: more specific deadlines, and lean away from iterations toward builds

8. Risks
• Recipes & a data-driven product (the amount of data, such as meals and food items)
  • Caused us to abandon the idea of tracking recipes or ingredients and to focus on general things (high risk, high impact)
  • Trade-off between generality and product uniqueness/value
• The GUI of a website: a blessing & a curse
  • Focus on acquiring data; it's clumsy to enter too much data on one page
  • We had ideas about how to solve the problem, but they cost a lot of time
• Dev-environment-related problems
  • This risk under-occurred; there could have been many more problems
• Recommending based on each individual user's history (data-driven)
  • Practical with research, and for a high-profile site
  • Low time impacted the mitigation plan, causing this risk to occur
• Feature creep
  • Ohhh yeah… that happened
  • Lesson learned: reject ideas to avoid over-complicating, even if it seems narrow-minded
• Gold-plating
  • Either occurred a lot or not at all, depending on how you look at it
• Team distance
  • A catalyst risk; it occurred
• Risk monitoring
  • Formally, it didn't happen too often
  • Lesson learned: even after the project is over, monitoring risk has value
