
Epinions 2.0 Design Review August 1, 2000


Presentation Transcript


  1. Epinions 2.0 Design Review August 1, 2000

  2. To Discuss • I. What informed 2.0 • II. What we’ve done • III. Where we’re headed • IV. Concerns/Issues

  3. I. What informed 2.0 Epinions 1.0 • What we’ve seen users do • Anecdotal and community feedback • Data (site activity, PVs, clickpaths, etc.) • User testing • January test

  4. I. What informed 2.0 Various and sundry redesigns • Browse pages • Low-hanging fruit • Product page • UCD, with scenarios... • Member profile

  5. I. What informed 2.0 MetaDesign’s identity and UI work • Develop system • Color scheme • Visual hierarchy • Iconography • Definitely rough, but a good start

  6. I. What informed 2.0 Buyers’ Experience document • Features • Shopper types

  7. I. What informed 2.0 Efficacy Project • “Getting users to the right product” • Combination of data and user testing • From the proposal: • It will be important to continuously measure user behavior and psychology to understand whether we are: • Successfully meeting our users’ needs. That is, do users find the site • Useful? • Usable? • Enjoyable? • Effectively promoting and enabling “monetizable” behavior • Are we helping users to make purchase decisions? • What brings users back to the site?

  8. II. What we’ve done • Developed a user-centered design schedule • Highly iterative - testing twice a week • Start at the basics and move towards finished designs • Specify the tasks • Define the pages (purposes, goals, etc.) • Develop an architecture • Build wireframes • Design mockups

  9. II. What we’ve done Two tracks – Design and Usability

  10. II. What we’ve done Documents • Reverse User Environment Design • More helpful in grounding than in moving forward • Task Flow and Definitions • Area Cards • Purposes, functions, tasks • Wireframes (lo-fi) • Testing findings • Mockups (hi-fi)

  11. II. What we’ve done Use of convention • Process – well-understood methods of user-centered design • Conceptualization -- ? • Expression --

  12. III. Where we’re headed • Step back and reconsider focus • From “buyer’s experience” to “reader’s experience” • Address all core functionality • Registration, member profile, etc. • Assess process – is it working? • Are we getting good results from testing? • (What kinds of results are we seeking?) • Are we developing quality designs?

  13. IV. Concerns/Issues (Part I) • Shifting focus – buyer’s to reader’s to… ? • Business case for “reader’s experience”? • What, exactly, do we want from it? • Have we reflected appropriately? • What’s the conceptual model? • Re-do scenarios and personae?

  14. IV. Concerns/Issues (Part II) • Demographic data – who is our audience? • Segmented by category? • Integrating with Infrastructure 2.0 team • Not sticking to process • Moved to “hi-fi” before finishing “lo-fi”

  15. IV. Concerns/Issues (Part III) • Scope (again!) • How much do we bite off? • Does it matter? • What does it mean to address all core functionality?
