
CS 430 / INFO 430 Information Retrieval

Lecture 23

Usability 2



Course Administration

Wednesday, November 16

No discussion class

Thursday, November 17

No lecture

No office hours



Course Administration

Assignment 3

Grades were returned yesterday

If you have any questions, send email to cs430-l@lists.cs.cornell.edu

Replies may be delayed until next week



Usability Factors in Searching

Example: Design an interface for a simple fielded search.

Interface: Fill in boxes, text string, ... ?

Presentation of results ... ?

Manipulation of results ... ?

Functions: Specify field(s), content, operators, ... ?

Retain results for manipulation ... ?

Query options ... ?

Data: Metadata formats ... ?

Data structures and file structures ... ?

Systems: Performance ... ?



The Design/Evaluate Process

[Diagram: the iterative design/evaluate cycle. start → Requirements (needs of users and other stakeholders) → Design (creative application of design principles) → Implementation (may be a prototype) → Evaluation → back to Requirements, repeating until release]



Evaluation

What is usability?

Usability comprises the following aspects:

  • Effectiveness – the accuracy and completeness with which users achieve specified goals. Measures: quality of solution, error rates

  • Efficiency – the relation between effectiveness and the resources expended in achieving the goals. Measures: task completion time, learning time, number of clicks

  • Satisfaction – the users’ comfort with and positive attitudes towards use of the system. Measures: attitude rating scales

    From ISO 9241-11
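As a minimal sketch of how these three aspects might be quantified from user-test data (the session fields and the 1-5 rating scale are illustrative assumptions, not part of the standard):

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Session:
    """One user's attempt at a task (fields are hypothetical)."""
    completed: bool      # effectiveness: did the user achieve the goal?
    errors: int          # effectiveness: errors made along the way
    seconds: float       # efficiency: task completion time
    rating: int          # satisfaction: attitude rating on a 1-5 scale

def summarize(sessions):
    """Compute one measure for each ISO 9241-11 aspect."""
    done = [s for s in sessions if s.completed]
    return {
        "completion_rate": len(done) / len(sessions),
        "mean_errors": mean(s.errors for s in sessions),
        "mean_seconds": mean(s.seconds for s in done) if done else None,
        "mean_rating": mean(s.rating for s in sessions),
    }

print(summarize([Session(True, 0, 95.0, 4), Session(False, 3, 240.0, 2)]))
```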



Evaluation

  • The process of determining the worth of, or assigning a value to, a system’s usability on the basis of careful examination and judgment.

  • Making sure that a system is usable before launching it.

  • Iterative improvements after launch.

  • Categories of evaluation methods:

    • Analytical evaluation: without users

    • Empirical evaluation: with users

    • Measurements of operational systems



Evaluation without Users

Assessing systems using established theories and methods

Evaluation techniques

  • Heuristic Evaluation (Nielsen, 1994)

    • Evaluate the design using “rules of thumb” (a sketch of recording such findings follows this list)

  • Cognitive Walkthrough (Wharton et al., 1994)

    • A formalized way of imagining people’s thoughts and actions when they use the interface for the first time

  • Claims Analysis – a scenario-based method

    • Generating positive and negative claims about the effects of features on the user
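Heuristic-evaluation findings are typically recorded as problem/heuristic/severity triples. A minimal sketch, assuming Nielsen's 0-4 severity scale (the example findings are invented for illustration):

```python
# Nielsen's ten usability heuristics (1994).
HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# Severity scale: 0 = not a problem ... 4 = usability catastrophe.
findings = [
    {"heuristic": HEURISTICS[0],
     "problem": "No progress indicator during long searches",  # invented example
     "severity": 3},
    {"heuristic": HEURISTICS[5],
     "problem": "Query syntax must be memorized",               # invented example
     "severity": 2},
]

# Rank problems by severity so the worst are addressed first.
for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
    print(f"[{f['severity']}] {f['problem']} ({f['heuristic']})")
```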



Evaluation with Users

Stages of evaluation with users:

  • Preparation

  • Conducting the sessions

  • Analysis of results

Testing the system, not the users!

User testing is time-consuming and expensive.



Evaluation with Users: Preparation

  • Determine goals of the usability testing (a sketch of checking such a goal follows this list)

    “The user can find the required information in no more than 2 minutes”

  • Write the user tasks

    “Answer the question: how hot is the sun?”

  • Recruit participants

    Use the descriptions of users from the requirements phase to identify potential participants
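A measurable goal like the 2-minute target above can be checked mechanically once sessions are recorded. A minimal sketch with invented timing data:

```python
# Hypothetical goal: "The user can find the required information
# in no more than 2 minutes" -> 120 seconds.
GOAL_SECONDS = 120

# Invented completion times (seconds) for one task.
times = [75.0, 140.0, 88.0, 200.0, 110.0]

met = sum(t <= GOAL_SECONDS for t in times)
print(f"{met} of {len(times)} participants met the 2-minute goal")
```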



Usability Laboratory

Concept: monitor users while they use the system.

[Diagram: evaluators observe the user through a one-way mirror]



Evaluation with Users: Conducting the Sessions

  • Conduct the session

    • Usability Lab

    • Simulated working environment

  • Observe the user

    • Human observer(s)

    • Video camera

    • Audio recording

  • Collect satisfaction data (e.g., attitude rating scales)



Evaluation with Users: Analysis of Results

  • If possible, use statistical summaries (a minimal sketch follows this list)

  • Pay close attention to areas where users

    • were frustrated

    • took a long time

    • couldn't complete tasks

  • Respect the data and users' responses; don't make excuses for designs that failed

  • Note designs that worked and make sure they're incorporated in the final product
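A minimal sketch of such a statistical summary, using invented completion times (None marks a participant who gave up):

```python
from statistics import mean, median, stdev

# Invented task completion times in seconds; None = participant gave up.
times = [75.0, 140.0, None, 88.0, 200.0, None, 110.0]
finished = [t for t in times if t is not None]

print(f"completion rate: {len(finished) / len(times):.0%}")
print(f"time: mean {mean(finished):.0f}s, median {median(finished):.0f}s, "
      f"sd {stdev(finished):.0f}s")

# Sessions worth a close look: failures and the slowest quartile.
slowest = sorted(finished)[int(0.75 * len(finished)):]
print("investigate: participants who gave up, plus times", slowest)
```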



Measurements on Operational Systems

Analysis of system logs (a minimal sketch follows below):

  • Which user interface options were used?

  • When was the help system used?

  • What errors occurred and how often?

  • Which hyperlinks were followed (click-through data)?

Human feedback:

  • Complaints and praise

  • Bug reports

  • Requests made to customer service
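A minimal sketch of this kind of log analysis; the tab-separated format and event names are assumptions for illustration, not the format of any real system:

```python
from collections import Counter

# Assumed format: "timestamp<TAB>user<TAB>event<TAB>detail";
# the event names (query, help, error, click) are invented.
log_lines = [
    "2005-11-14T10:02:11\tu17\tquery\tadvanced_search",
    "2005-11-14T10:02:40\tu17\thelp\tquery_syntax",
    "2005-11-14T10:03:05\tu17\tclick\trank=3",
    "2005-11-14T10:04:19\tu22\terror\ttimeout",
]

events = Counter()          # which options were used, help use, error frequency
clicks_by_rank = Counter()  # click-through data
for line in log_lines:
    _, _, event, detail = line.split("\t")
    events[event] += 1
    if event == "click":
        clicks_by_rank[int(detail.split("=")[1])] += 1

print("event counts:", dict(events))
print("clicks by rank:", dict(clicks_by_rank))
```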



The Search Explorer Application: Reconstructing a User Session



Refining the Design Based on Evaluation

Designers and evaluators need to work as a team. Designers are poor evaluators of their own work, but they know the requirements, constraints, and context of the design:

  • Some user problems can be addressed with small changes

  • Some user problems require major changes

  • Some user requests (e.g., lots of options) are incompatible with other requests (e.g., simplicity)

Do not allow evaluators to become designers.



Usability Experiment: Ordering of Results

The order in which the hits are presented to the user:

  • Ranked by similarity of match (e.g., term weighting)

  • Sorted by a specified field (e.g., date)

  • Ranked by importance of document as calculated by some algorithm (e.g., Google PageRank)

  • Duplicates shown separately or merged into a single record

  • Filters and other user options

What impact do these choices have on usability? (A sketch of these orderings follows.)
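A minimal sketch of three of these orderings applied to the same hit list (the records and scores are invented):

```python
from datetime import date

# Hypothetical hit records: a term-weighting relevance score, a date
# field, and a link-based importance score (PageRank-like).
hits = [
    {"title": "A", "score": 0.82, "date": date(2004, 5, 1), "importance": 0.10},
    {"title": "B", "score": 0.64, "date": date(2005, 9, 3), "importance": 0.55},
    {"title": "C", "score": 0.77, "date": date(2003, 1, 7), "importance": 0.30},
]

# Ranked by similarity of match (term-weighting score)
by_similarity = sorted(hits, key=lambda h: h["score"], reverse=True)

# Sorted by a specified field (most recent date first)
by_date = sorted(hits, key=lambda h: h["date"], reverse=True)

# Ranked by document importance (e.g., a PageRank-style score)
by_importance = sorted(hits, key=lambda h: h["importance"], reverse=True)

for name, order in [("similarity", by_similarity), ("date", by_date),
                    ("importance", by_importance)]:
    print(name, [h["title"] for h in order])
```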



Experiment on the Google Interface

Methodology:

  • 10 information-seeking tasks in 2 categories

  • Users randomized across tasks

  • Click-through data to see what the user did

  • Eye-tracking data to see what the user viewed

  • Google results presented with the top ten ranks reversed (a sketch of this manipulation follows)

An example of interdisciplinary information science research by Cornell's Human Computer Interaction Group and Computer Science Department.
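The rank-reversal manipulation itself is simple. A minimal sketch, assuming the ranked result list has already been retrieved (data invented):

```python
def reverse_top_ten(results: list[str]) -> list[str]:
    """Reverse the first ten results; leave any remainder unchanged."""
    return results[:10][::-1] + results[10:]

# Illustrative result list: r1 is the top-ranked hit.
original = [f"r{i}" for i in range(1, 13)]
print(reverse_top_ten(original))
# ['r10', 'r9', ..., 'r1', 'r11', 'r12']
```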



Evaluation Example: Eye Tracking





Google Evaluation: Click-Through Data

[Chart: number of users who clicked on each link, plotted against rank of hit]



Google Evaluation: Eye-Tracking Data

[Chart: number of users who viewed the short record before the first click, plotted against rank of hit]



Google Evaluation: Eye-Tracking Data

Part of the short record viewed before the first click (% of users):

  • Title: 17.4%

  • Snippet: 42.1%

  • Category: 1.9%

  • URL: 30.4%

  • Other: 8.2% (includes “cached”, “similar pages”, and the description)



Google Experiment: Click-Through Data with Ranks Reversed

[Chart: percentage of users who clicked on each link, plotted against rank of hit]

