Integrating People, Places, and Things into a Desktop Search Engine

Presentation Transcript


  1. Integrating People, Places, and Things into a Desktop Search Engine By Kyle Rector, Senior, EECS, OSU

  2. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  3. What is the Issue? • The amount of email, web browsing history, and files on the computer is always increasing • Solutions: • Filing systems • Desktop search • Web search • Email filtering • However, people can misfile things, and search may not be useful if you don't know what to query

  4. Related Work • Vannevar Bush's concept of the memex [1]: "…a device in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility."

  5. Related Work • Three publications from EuroPARC have investigated logging of user activities • PEPYS [2]: used an active badge system to log location • Video Diary [3]: the two major cues for remembering events were people and objects • Activity-based Information Retrieval [4]: "…systems which aim to support human memory retrieval may require special attention to the user interface; otherwise the cognitive load imposed by interaction can outweigh the reduction in load on the user's memory".

  6. Related Work • Memory landmarks: events that stick out in one's mind • Horvitz et al. [5] designed a Bayesian model to predict important memory landmarks from their study • Important variables: subject, location, attendees, and whether the meeting is recurrent.

  7. Related Work • Episodic memory [6]: memory can be organized into different episodes • Ringel et al. [7] also created a timeline display of files, emails, and web history based on user events

  8. Related Work • Stuff I've Seen [8]: desktop search that indexes email, files, web, and calendar • Initial findings from their experiment: • Time and people are important retrieval cues • 48% of queries involved a filter, the most common being file type • 25% of queries involved people • Sorting by date is a good way for people to find items.

  9. Related Work • Phlat [9]: desktop search using contextual cues • Findings from a long-term study: • 47% of queries involved a filter • People and file type were the most common filters • 17% of queries used only filters • Had an issue with the aliasing of names, which the RFID Ecosystem would fix

  10. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  11. My Approach • Google Desktop Gadget interface • Event filters: people, objects, location, and time • File filters: query string, file type • Uses Google Desktop Search • Displays results in a timeline view • [Screenshot: My Gadget]

  12. Demonstration

  13. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  14. System Architecture • [Diagram: User Input → Google Desktop Gadget ↔ RFID Ecosystem Database; Google Desktop Gadget → Google Desktop Search → Browse Timeline → Results]

  15. Step 1: Configure the Database • [System architecture diagram repeated, with this step highlighted]

  16. Step 1: Configure the Database • Gadget: communicates with the database to retrieve events • User: specifies any combination of events they would like to use • Gadget: set up to perform searches, with a dropdown list of event choices
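
The transcript includes no gadget source, but this step is concrete enough to sketch. Below is a minimal illustration, in Python for readability (the real gadget would have used the Google Desktop Gadget scripting environment), of how a client might pull event choices out of the RFID Ecosystem database. The table and column names (rfid_events, person, location, object) are assumptions, not the project's actual schema.

```python
import sqlite3  # stand-in: the RFID Ecosystem's actual backend is not specified here


def fetch_event_choices(db_path):
    """Pull the distinct people, locations, and objects the gadget can
    offer in its dropdown lists. Table and column names are hypothetical."""
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    choices = {}
    for column in ("person", "location", "object"):
        cur.execute("SELECT DISTINCT {} FROM rfid_events".format(column))
        choices[column] = [row[0] for row in cur.fetchall()]
    conn.close()
    return choices
```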

  17. Step 2: Filter Your Query • [System architecture diagram repeated, with this step highlighted]

  18. Step 2: Filter Your Query • Desktop Search filters: • Event: before, during, or after • File type • Text query • Event filters: • People • Locations • Objects • Date

  19. Step 2: Filter Your Query • User: specifies the filters in the gadget • Gadget: communicates with the database to get the possible event times • User: • can choose one event time or all of them • can decide whether to search before, during, or after one or all events
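
To make the two layers of filtering concrete, here is one way the query could be represented as a data structure. This is a sketch only; every field name is illustrative rather than taken from the gadget.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional


@dataclass
class EventFilter:
    """Contextual event filters drawn from the RFID Ecosystem."""
    people: List[str] = field(default_factory=list)     # e.g. ["Magda"]
    locations: List[str] = field(default_factory=list)
    objects: List[str] = field(default_factory=list)
    date: Optional[datetime] = None


@dataclass
class DesktopQuery:
    """Desktop-search filters layered on top of the chosen event."""
    event: EventFilter
    relation: str = "during"          # "before", "during", or "after" the event
    file_type: Optional[str] = None   # e.g. "ppt"
    text: str = ""                    # free-text query, possibly empty
```

The relation field carries the before/during/after choice that was added in response to the survey described later.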

  20. Step 3: Search Your Desktop • [System architecture diagram repeated, with this step highlighted]

  21. Step 3: Search Your Desktop • Gadget: • Accesses the Google Desktop URL read from the Windows registry • Parses the Google Desktop HTML to reach the Browse Timeline page • Parses the Browse Timeline HTML to find the correct date of the event
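
A rough sketch of this step, again in Python rather than the gadget's own scripting language. Google Desktop exposed its local search URL, including a per-install security token, through a registry value; the key path below is recalled from the Google Desktop API documentation, and the link pattern is a guess at the results markup, so both should be treated as assumptions.

```python
import re
import urllib.request
import winreg  # Windows-only, matching Google Desktop's platform


def get_gds_search_url():
    """Read Google Desktop's local search URL (with its security token)
    from the registry. The key path is assumed from the GDS API docs."""
    key = winreg.OpenKey(winreg.HKEY_CURRENT_USER,
                         r"Software\Google\Google Desktop\API")
    url, _ = winreg.QueryValueEx(key, "search_url")
    winreg.CloseKey(key)
    return url


def fetch_page(url):
    """Download one Google Desktop results page as text."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")


def find_timeline_link(html):
    """Scan results HTML for the Browse Timeline link; the href pattern
    here is an assumption about GDS's markup, not a documented format."""
    match = re.search(r'href="([^"]*browse[^"]*)"', html, re.IGNORECASE)
    return match.group(1) if match else None
```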

  22. Step 3: Search Your Desktop • Browse Timeline: History of file modification times

  23. Step 3: Search Your Desktop • Gadget: • Parses the Browse Timeline HTML to filter files • e.g., if you wanted files that you modified while meeting with Magda on July 14th from 4:30 - 5:00pm, files between those times will be selected • Displays the selected results in an HTML file saved to the Temp directory
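
The filtering and output steps can be sketched compactly, assuming the Browse Timeline HTML has already been parsed into (modified_time, title, url) tuples; that intermediate structure is my assumption, not the gadget's.

```python
import os
import tempfile
from datetime import datetime


def filter_by_event(entries, start, end):
    """Keep only files whose modification time falls inside the event window."""
    return [e for e in entries if start <= e[0] <= end]


def write_results_page(entries):
    """Render the surviving entries as a simple HTML page in the Temp
    directory, mirroring how the gadget displays its results."""
    rows = "\n".join('<li><a href="{}">{}</a> ({})</li>'.format(url, title, ts)
                     for ts, title, url in entries)
    path = os.path.join(tempfile.gettempdir(), "gadget_results.html")
    with open(path, "w", encoding="utf-8") as f:
        f.write("<html><body><ul>\n" + rows + "\n</ul></body></html>")
    return path


# The slide's example window: meeting with Magda, July 14th, 4:30 - 5:00pm
# (the year is arbitrary here, since the slide does not give one).
start = datetime(2008, 7, 14, 16, 30)
end = datetime(2008, 7, 14, 17, 0)
```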

  24. Step 4: The Results • [System architecture diagram repeated, with this step highlighted]

  25. Step 4: The Results • Example: All file types while meeting with Magda

  26. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  27. The Survey • Before the survey, I had a simple prototype program • [Screenshots: Old GUI, Old Results Page]

  28. Survey on Mobile Computer Usage within CSE

  29. The Survey • Sent the survey to faculty, staff, and graduate and undergraduate students • 9 questions, 2 of which were demographic • 33 people responded to the survey • Changes made based on the survey: • Object feature • Before, During, or After meeting option

  30. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  31. Plans for User Evaluation • Questions I want to answer: • Do contextual parameters (people, places, things) with relation to work events save time when doing a desktop search? • Do the size and frequency of text queries decrease when doing a desktop search? • Are the Google Desktop Gadget GUI and the results page easy and functional to use?

  32. Plans for User Evaluation • Each participant will have six tasks: • Three with Google Desktop • Three with my gadget • Develop user scenarios • PowerPoint storyboard with pictures and speech • Shown only for a limited time • Users complete search tasks • Participants should remember and use contextual information to make searching easier

  33. Plans for User Evaluation • Do contextual parameters (people, places, things) with relation to work events save time when doing a desktop search? • Time how long a participant takes from the end of the story session to successful completion of a task • Compare Google Desktop Search times to my gadget's desktop search times

  34. Plans for User Evaluation • Do the size and frequency of text queries decrease when doing a desktop search? • Review what types of filters subjects are using • Count how many times a subject does not use text in their query • If they use text, count how many words are in the query • Can compare results to previous work (Phlat, Stuff I’ve Seen)

  35. Plans for User Evaluation • Are the Google Desktop Gadget GUI and the results page easy and functional to use? • Participants will answer an evaluation survey after the tasks are done • Subjects will rate features and the output page on a Likert scale

  36. Agenda • Background • My Approach • Demonstration • How it works • The Survey • Plans for User Evaluation • Future Plans

  37. Thank you! Any questions?

  38. Sources
  • [1] Bush, V. As we may think. Atlantic Monthly 176, 101–108 (1945).
  • [2] Newman, W., Eldridge, M., Lamming, M. PEPYS: Generating autobiographies by automatic tracking. ECSCW, Amsterdam, The Netherlands, 175–188 (1991).
  • [3] Eldridge, M., Lamming, M., Flynn, M. Does a video diary help recall? People and Computers VII, Cambridge University Press, Cambridge, 257–269 (1992).
  • [4] Lamming, M., Newman, W. Activity-based information retrieval: technology in support of personal memory.
  • [5] Horvitz, E., Dumais, S., Koch, P. Learning predictive models of memory landmarks. Proceedings of CogSci 2004: 26th Annual Meeting of the Cognitive Science Society, Chicago, USA (2004).
  • [6] Tulving, E. Elements of episodic memory. Oxford University Press (2004).
  • [7] Ringel, M., Cutrell, E., Dumais, S., Horvitz, E. Milestones in time: the value of landmarks in retrieving information from personal stores. Proceedings of Interact (2003).
  • [8] Dumais, S., Cutrell, E., Cadiz, J., Jancke, G., Sarin, R., Robbins, D. Stuff I've seen: a system for personal information retrieval and re-use. SIGIR '03, Toronto, Canada (2003).
  • [9] Cutrell, E., Robbins, D., Dumais, S., Sarin, R. Fast, flexible filtering with Phlat: personal search and organization made easy. Proceedings of CHI 2006, Montreal, Quebec, Canada (2006).
