Academic Research Writing


  2. IMPOSSIBLE TO ESTIMATE … • The time and trouble it will take (the power of bad luck) • The power of serendipity (the power of good luck) • Consequently, two key parallel skills: time management and risk assessment

  3. INTENTION Not always to narrow down your data, but to open up possibilities and generate choices

  4. SOLUTIONS The “answer” is NOT in the library or on the Internet … it’s (already) in your head!

  5. TWO APPROACHES • Searching with purpose takes HUGE amounts of time • Exploratory searching (relying on serendipity) takes HUGE amounts of time – more than you have in any course

  6. PREDICT ! • Speculate specifically about what you could find and what you’re likely to find • Even if wrong, it’s better to approach research with articulated assumptions

  7. PROCESS • Predict • Search (gather more than you need) • Evaluate • Document / record • Cluster / group / categorize


  9. TWO TYPES OF LIBRARIES • Public vs. university • Housing two very different kinds of materials

  10. Peer Review Process • Said to be “refereed” • Sent to several 3rd-party experts for scrutiny • Returned as: rejected; publish with minor revisions; publish with major revisions; or publish “as is”

  11. ACADEMIC PUBLISHING • Books • “Articles” in “journals” – usually published monthly or quarterly

  12. OBJECTIVES • Thoroughness: while you’re at it, research more than you need; explore more than is necessary; bring home (copies of) everything you come across • Accuracy & meticulousness: record all meta information (author, title, journal, URL, date, call #, page #, editor, which library), plus where you were and what the date is
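The meta information listed above can be captured in a simple, reusable record. A minimal sketch in Python — the field names are illustrative, not any citation standard:

```python
from dataclasses import dataclass, field
from datetime import date

# One research "source record" holding the meta information the slide lists.
@dataclass
class SourceRecord:
    author: str
    title: str
    journal: str = ""
    url: str = ""
    call_number: str = ""
    pages: str = ""
    library: str = ""                                   # where you were
    accessed: date = field(default_factory=date.today)  # what the date is

# Example entry (Sokal's hoax article, discussed later in the deck).
rec = SourceRecord(author="A. Sokal",
                   title="Transgressing the Boundaries",
                   journal="Social Text",
                   pages="217-252")
print(rec.author, "-", rec.title)
```

Recording every source this way at gathering time makes the later "document / record" and "cluster / group / categorize" steps mechanical rather than archaeological.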

  13. OBJECTIVES • Balance & fairness: choose a variety of sources, opinions and viewpoints; favour current sources, but remember that older ones are not always wrong or outdated • Clarity in complexity: retain paradoxes, dilemmas and inconsistencies; don’t oversimplify

  14. ALAN SOKAL • Published a “BS” article through the peer-review process • Only one article in one journal, BUT it made the critical world wonder about the whole process • Where does truth live? What is the relationship between truth and legitimacy?

  15. PUBLISHING SCANDALS • Sokal’s hoax (see previous slide) • Poor medical research (1/7) • About the peer-review process • The abuse of science

  16. SCRUTINIZE SOURCES The key is TRANSPARENCY Who wrote it? Credentials? When? Why? Are sources used documented and traceable?

  17. EVALUATE SOURCES Thoroughness Accuracy Balance Clarity = TRANSPARENCY

  18. Librarians vs. Internet Search Engines

  19. LIBRARY SCIENCE Is all about standardization • ISBNs • Library of Congress • Standard “subjects”
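Standardization here is concrete: an ISBN-13, for example, ends in a check digit computed from the first twelve digits by a fixed rule, so every library and bookseller validates the same number the same way. A small illustration:

```python
# ISBN-13 check digit: multiply the first 12 digits by alternating
# weights 1 and 3, and choose the digit that brings the total to a
# multiple of 10.
def isbn13_check_digit(first12: str) -> int:
    total = sum(int(d) * (1 if i % 2 == 0 else 3)
                for i, d in enumerate(first12))
    return (10 - total % 10) % 10

# Example: 978-0-306-40615-? -> check digit 7
print(isbn13_check_digit("978030640615"))  # 7
```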

  20. INTERNET / LIBRARY KEY DIFFERENCES Librarians standardize everything, incl. booleans (and, or, adj, not) but Internet search engines standardize little (they compete)

  21. INTERNET SEARCH ENGINES 3 kinds: • Generic (Yahoo) • Meta: search other engines (Metacrawler) • Dedicated (Lawcrawler)

  22. SEARCH ENGINES Each collects, filters, stores, eliminates and serves data … differently

  23. PROBLEMS WITH SEARCH ENGINES Research and promotion = 2 sides of same coin Sometimes what you find has nothing to do with your research skills

  24. TOPICS • Get them there ! • Research vs. Promotion • Active and passive promo • How search engines work • Bots, spiders, the wild (wild) west • Taxonomy of search engines • Controlling text and using links • Help them find what they (and you) want ! • Heuristics (Interface design & layout) • Naming conventions • Keep them there (“stickiness”) ! • Push-pull technologies of promo (blog, rss, podcasts, listserv) • The dance of love between meta-tagging and server logs

  25. RESEARCH - PROMO • R&P = 2 sides of the same coin • Research = the ability to find information • Promotion = the ability to be found • Like Janus, the Roman god behind our “January”, it looks both ways at once

  26. SEARCH ENGINE PROBLEMS Librarians standardize everything -- subjects (and associated subjects), as well as Booleans (and, or, adj, not) • Engines do not standardize: each has its own methods, algorithms and “fuzzy” Booleans
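The strict Booleans that librarians standardize can be pictured as plain set operations over each document's terms. A toy sketch (real engines apply fuzzier, undocumented variants of this, which is exactly the problem the slide names):

```python
# Each "document" is just a set of terms; AND/OR/NOT map directly
# onto set membership tests.
docs = {
    "d1": {"library", "search", "boolean"},
    "d2": {"internet", "search", "engine"},
    "d3": {"library", "catalogue"},
}

def search_and(a, b):
    return {k for k, terms in docs.items() if a in terms and b in terms}

def search_or(a, b):
    return {k for k, terms in docs.items() if a in terms or b in terms}

def search_not(a, b):
    return {k for k, terms in docs.items() if a in terms and b not in terms}

print(search_and("library", "search"))   # {'d1'}
print(search_not("search", "library"))   # {'d2'}
```

With a standardized vocabulary, every catalogue answers these queries identically; a “fuzzy” Boolean engine may silently rewrite or rank them instead.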

  27. HOW ENGINES WORK Gathering, sorting, storing

  28. GENERIC ENGINES Index a variety of different kinds of information Examples … • Live Search (Microsoft) • Gigablast • Vivisimo • Lycos

  29. DEDICATED ENGINES (i) Focus on specific kinds of information • Yahoo Kids • Monster (jobs) • (phone book) • Lawcrawler • Blogscope (for the blogosphere)

  30. DEDICATED ENGINES (ii) • An annotated list of dedicated news search engines is available (also as an MSWord version) • They include: Google News • Ananova • NewsTrawler • Blogdex

  31. META ENGINES (i) … they search and index other engines • Beaucoup • Metacrawler • Google • WebMD • HotBot • Mamma • Dogpile • AltaVista

  32. META ENGINES (ii) … these (and more) search and index other NEWS engines • Infocom • Quick News • Ithaki • See also “The Best and Most Popular Meta Search Engines”

  33. ABOUT ENGINES Sites about search engines: make these your friends! Know your toolset! Examples … • Search Engine Showdown • Search Engine Watch • Search Engines dot com • A Spider’s Apprentice • All Search Engines dot com

  34. PROMOTION Active and Passive

  35. ACTIVE PROMOTION (i) Means aggressively approaching search engines and/or buying ads • Some engines (Yahoo) still accept “submit it” functions • Buying ads: as in Google CPMs (cost per 1,000 impressions) and CPCs (cost per click), in a variety of intensities from random to IP-address targeted
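The two pricing models above reduce to simple arithmetic; a sketch with made-up placeholder rates, not actual ad prices:

```python
# CPM = cost per 1,000 impressions; CPC = cost per click.
def cpm_cost(impressions: int, rate_per_thousand: float) -> float:
    return impressions / 1000 * rate_per_thousand

def cpc_cost(clicks: int, rate_per_click: float) -> float:
    return clicks * rate_per_click

# 50,000 impressions at a $2.00 CPM vs. 500 clicks at $0.25 per click
print(cpm_cost(50_000, 2.00))  # 100.0
print(cpc_cost(500, 0.25))     # 125.0
```

The comparison shows why advertisers care about click-through rate: the same budget buys visibility under CPM but only measured responses under CPC.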

  36. ACTIVE PROMOTION (ii) • Lycos ads start at $500/mo • Subcontract out to multiple submit agencies like Wide Promote and Submitter • Never underestimate “old-fashioned” promo: telephone, f2f, print media

  37. PASSIVE PROMOTION (i) Several variations … • Embedded in code • Right in the main text • Connecting text to code • Online tools such as Google’s keyword tool

  38. PASSIVE PROMOTION (ii) Embedded in the code • Meta tags • Title tags Embedded in the (body) text • Keywords: lexis (buzzwords) for messaging • Stylistic resonance

  39. IN THE CODE Meta tags & title tags • In MSIE choose View > Source • In Firefox and Netscape choose View > Page Source

  40. META TAGS (i) <meta name="language" content="en-CA, English, fr-CA, French" /> <meta name="Keywords" content="HCI, human computer interaction, usability testing and design, Toronto Ontario, accessability, accessibility, acessability compliance, complience, technical writing, writing for the web" />

  41. META TAGS (ii) Sometimes avoiding the engines is as important as getting indexed by them <meta name="robots" content="all, index, follow" /> or <meta name="robots" content="noindex, nofollow" />

  42. META TAGS (iii) <meta name="author" content="Dr. Peter Paolucci is a full-time professor at York University in Toronto, Ontario and the president, CEO, and founder of Learn Canada" /> <meta name="location" content="Toronto, Ontario, Canada" />

  43. META TAGS (iv) <meta http-equiv="Expires" content="Wed, 25 Apr 2009 08:00:00 GMT" /> <meta name="Description" content="Learn Canada is dedicated to the advancement of all online teaching and learning and interfaces, through research, professional and technical writing, improved interface design, and the study and use of human computer interaction" />
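What an indexer does with tags like those above can be sketched with the Python standard library alone: pull out the <title> text and each <meta name=…> content value. The sample page is invented for illustration:

```python
from html.parser import HTMLParser

# Minimal extractor for <title> and <meta name=...> tags.
class MetaExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.meta = {}
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and "name" in a:
            self.meta[a["name"].lower()] = a.get("content", "")
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

page = '''<html><head><title>Learn Canada</title>
<meta name="Keywords" content="HCI, technical writing" />
<meta name="robots" content="noindex, nofollow" /></head></html>'''

p = MetaExtractor()
p.feed(page)
print(p.title)             # Learn Canada
print(p.meta["robots"])    # noindex, nofollow
```

Note that curly quotes inside an attribute (as in some earlier transcriptions of these slides) would break exactly this kind of parsing, which is why the tag syntax must be exact.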

  44. KEYWORD TOOLS • Textalyser • Keyword density analyser • $$ Webjectives • Plus all kinds of other generators
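A toy version of the keyword-density analysers named above: density is simply the occurrences of a word divided by the total word count, expressed as a percentage.

```python
import re
from collections import Counter

# Percentage of the text each distinct word accounts for.
def keyword_density(text: str) -> dict:
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words)
    return {w: round(100 * n / total, 1) for w, n in Counter(words).items()}

sample = "research and promotion: research is finding, promotion is being found"
d = keyword_density(sample)
print(d["research"])   # 20.0
```

Real tools add stop-word filtering and phrase handling, but the core metric is this ratio.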

  45. TITLE TAGS Important because … • Keywords in the title are indexed by search engines • The contents of the title become the name of the user’s Bookmark (Firefox/Netscape) or Favorite (MSIE) • When synchronized with the domain name and the content of the body text, the title is a very powerful locator

  46. HOW YOU ARE MANIPULATED Research and promotion = 2 sides of the same coin • What you find may have little to do with your research skills • Promotion can be: Passive (in the HTML code) or Active (submitting abstracts or buying ads, which are measured in CPMs)

  47. SOME GOOD SOURCES ! • How to do research • Advice on Research & Writing • Style, formatting and documentation (Monash U)

  48. MORE GOOD SOURCES II • Academic Integrity (avoiding plagiarism) (York) • Evaluating Websites: Criteria and Tools (Cornell U)

  49. KNOWLEDGE / WISDOM • Knowledge = mere data (who, where, what, when) • Wisdom = data that’s been scrutinized, labelled, categorized and clustered (linked relationally and to other data) • Wisdom = why and how

  50. MAP IT! Try to locate data in a schematic or map showing its relation to other data, other problems, other concepts, other ideas