Using large information and citation databases for evaluation

Tefko Saracevic, PhD

School of Communication, Information and Library Studies

Rutgers University, USA

[email protected]

http://www.scils.rutgers.edu/~tefko


Full disclosure

  • I have no connection with Scopus

    • But: I am on the Scopus Advisory Board & as such have a free password

      • but I also have Scopus access through the Rutgers University Library and as an Elsevier journal editor

    • So far I have participated in one Scopus Advisory Board meeting (Budapest) and evaluated their product informally in phone conversations

    • I gave an informal talk about using Scopus at the 2006 American Library Association meeting & at Rutgers

What you can’t find on Scopus

Named after the chiffchaff (Phylloscopus collybita), a small bird with great navigational skills



Definition of the central theme

to evaluate (verb)

to consider or examine something in order to judge its value, quality, importance, extent, condition, or performance

However …

  • Evaluation has many components and should use a number of sources

  • Information & citation databases are a powerful source & tool, but one among a number of others

  • Very useful

  • But use with skill & caution!

Overview of Scopus

  • Elsevier’s effort to get into searching

    • & combining ScienceDirect & Scirus (web searching)

  • Massive effort & outlay; big marketing

    • development investment HUGE & undisclosed

  • Headed by Eefka Smit & a young, mostly Dutch team

    • global operations:

      • headquarters: Amsterdam; marketing: global; indexing: Philippines; computers: Dayton, Ohio, USA

  • Unveiled in 2004

    • new features unveiled constantly – innovative

      • e.g. mid-2005: added RefWorks; end of 2005: citation tracking; 2006: author profiling & further analysis tools

  • Search engine licensed from Fast

Coverage

  • Science & technology only, no (or little) humanities

    • includes Chemistry, Physics, Mathematics, Engineering, Life and Health Sciences, Social Sciences, Psychology, Economics, Biological, Agricultural and Environmental Sciences

  • Covers some 15,000 journals, 700 proceedings, 600 trade publications, 125 book series, 12.5 mill. patents

  • Incorporates wall-to-wall Medline, Embase, Compendex, & many other databases

Coverage …

    • Time covered:

      • Abstracts go back to 1966

      • References go back to 1996

    • While having gaps, coverage seems more comprehensive than any other single database

    • Also incorporates web search via Scirus

      • 200 mill. web sources

    • Also strong in non-English & developing country sources

      • More than 60% of titles are from countries other than the US

Overview of other databases – for a few comparisons

    • Web of Science (WoS)

      • Coverage: science, technology, humanities

      • origin in three citation databases

        • Science Citation Index (SCI), Social Science Citation Index (SSCI), Arts & Humanities Citation Index (AHCI)

        • at Rutgers, coverage is only 1994–present (for pricing reasons), with some 8,000 journals, plus patents & other databases – only this is accessible to me

    • DIALOG

      • a very large supermarket – some 900 databases (db) in every field and area, including citation indexes

      • Citation db coverage: SCI 1974– ; SSCI 1972– ; A&H 1980–

        • all accessible to me

Reviews

    • Comparing Scopus and Web of Science

      • 2005: http://www.charlestonco.com/comp.cfm?id=43

      • 2006: http://www.charlestonco.com/comp.cfm?id=43

        • critical of Scopus gaps in coverage, particularly before 1996

      • but it is not clear why these two services are compared

        • Scopus does many different things that WoS does not & vice versa

        • both have citation searching, but Scopus has much more

        • Scopus subject searching is much more comprehensive; WoS citation searching is more comprehensive, but Scopus citation tracking is more usable for evaluation

What can you do?

    • Subject search

      • with many capabilities to limit, modify & rank

    • Source search – journals, types of sources

    • Author search with many extensions

      • e.g. as to citations to and from

    • Citation tracking

    • Integrated with full-text retrieval through the library

    • Integrated with RefWorks, provided the library has it

    • Integrated web search

What do I do?

    • Use it in a variety of roles & evaluations, as a:

      • researcher

      • teacher

      • journal editor

      • mentor

      • promotion & tenure committee member; administrator

      • tool for keeping current; also:

        • for finding what and whom I missed

        • who is leading an area

    [slide callout: concentrate here – with implications]

What do you see?

    • At first: lots of features laid out all at once

    • But a relatively clear interface lays out the capabilities

    • Geared toward fast, intuitive learning & use

      • and indeed it is relatively easy to learn & use

    • Results are displayed in Last In First Out (LIFO) order, but can be ranked or listed in various ways

But let's get going …

    Live examples from

    http://www.scopus.com/

    user: tsaracevic

    password: I am not telling

    or:

    http://www.libraries.rutgers.edu/

Starting …

[screenshot: search options]

Use in research and citation tracking

    • I have completed, and am now updating & rewriting, a comprehensive review of the notion of relevance in information science

    • For that:

      • I did subject searching & identified & evaluated areas of research

      • I also searched for some key authors and did citation tracking & evaluated contributions & trends

        • including, of course, a vanity search

      • then I saved each author or subject search in a list

Fun part

    • Had fun tracking those that cited those that cited them …

    • Eventually got lost in the tracking maze – of course!

    Well, let's take a look

Subject search

[screenshot: search selections]

Search results

    • I found 66 articles about “relevance AND judgment”

      • then saved them in My List, so I can evaluate, use and update them later

      • then I found all the citations to the 66 articles

    • Here is the results page

    • And then two author examples…

Search results …

[screenshot: using options after getting the results]

Following a single author & article

    • Selected one of the most cited articles:

      • saved it in a list as “Voorhees 2000” and did citation tracking: who cited it?

      • it was cited 28 times (“Voorhees children”)

      • then I went on and found 102 articles that cited the Voorhees children (“Voorhees grandchildren”)

    • this way I evaluated the impact of an article and its spread into various publications and areas – the traversal is sketched below
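To make the mechanics concrete, here is a minimal sketch of that “children/grandchildren” tracking as a breadth-first walk over a citation graph. The `cited_by` map and the article IDs are invented for illustration; this is not Scopus data or its API.

```python
# Citation tracking as a breadth-first walk: generation 1 is the papers
# citing the root ("children"), generation 2 is the papers citing those
# ("grandchildren"). All IDs below are hypothetical.
cited_by = {
    "voorhees2000": ["a1", "a2", "a3"],
    "a1": ["b1", "b2"],
    "a2": ["b3"],
    "a3": [],
}

def citation_generations(root: str, depth: int) -> list[list[str]]:
    """Return [children, grandchildren, ...] up to the given depth."""
    generations, frontier, seen = [], [root], {root}
    for _ in range(depth):
        next_frontier = []
        for article in frontier:
            for citer in cited_by.get(article, []):
                if citer not in seen:          # count each paper once
                    seen.add(citer)
                    next_frontier.append(citer)
        generations.append(next_frontier)
        frontier = next_frontier
    return generations

children, grandchildren = citation_generations("voorhees2000", 2)
print(len(children), len(grandchildren))       # 3 3 for this toy graph
```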

    Well, let's take a look

Selected article

[screenshot: various features]

My 11 saved lists

[screenshot: after searching & citation tracking, I create lists]

Voorhees 2000, saved in my lists

[screenshot: various features]

28 Voorhees children

[screenshot: various features]

102 Voorhees grandchildren

[screenshot: various features]

Then …

    • I selected and viewed the list “Mizzaro citations” to work on them further

    • selected them all

    • clicked on citation tracking

    • and voilà!

[screenshot: selected them all for citation overview]

[screenshot: interested in this one]

Follow-up on four articles – Tombros was NEW for me!

Following a vanity but useful trail

    • Created a similar list of my own articles

    • Selected one on interaction & relevance

    • Who cited it?

    • Who cited those who cited me?

    • Discovered a number of previously unknown articles

    • Well, let's take a look

Author selection & disambiguation

[screenshot: list of all 20 authors with last name “Saracevic” – first page; choice highlighted]

Author selection & disambiguation …

[screenshot: list of all 20 authors with last name “Saracevic” – second page; all 5 “Saracevic, T” are me]
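As an aside, the grouping step behind such a disambiguation list can be sketched in a few lines; the name strings below are invented examples, not actual Scopus records, and real disambiguation uses much richer evidence (affiliations, co-authors, subjects).

```python
# Toy author-name grouping: cluster raw name strings by
# (surname, first initial) so variants of one person fall together.
from collections import defaultdict

raw_authors = ["Saracevic, T.", "Saracevic, Tefko", "Saracevic, T",
               "Saracevic, A.", "Saracevic, M."]

def key(name: str) -> tuple[str, str]:
    surname, _, rest = name.partition(",")
    return surname.strip().lower(), rest.strip()[:1].upper()

clusters = defaultdict(list)
for name in raw_authors:
    clusters[key(name)].append(name)

for k, variants in sorted(clusters.items()):
    print(k, variants)
# ('saracevic', 'A') ['Saracevic, A.']
# ('saracevic', 'M') ['Saracevic, M.']
# ('saracevic', 'T') ['Saracevic, T.', 'Saracevic, Tefko', 'Saracevic, T']
```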

Scopus & I: without self-citations

[screenshot: no. of articles in Scopus; no. of citations in Scopus – “this one” highlighted]

Scopus & I: with self-citations

[screenshot: no. of all citations in Scopus]

977 citations in all − 950 without self-citations = 27 self-citations
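The split itself is simple set logic; here is a minimal sketch under the usual convention that a citation is a self-citation when the cited author appears among the citing authors. The records are invented for illustration.

```python
# Toy self-citation split: 'records' are citing papers with author sets.
citing_records = [
    {"authors": {"spink", "saracevic"}},   # a self-citation
    {"authors": {"voorhees"}},
    {"authors": {"mizzaro"}},
]

def split_citations(cited_author, records):
    self_cites = sum(1 for r in records if cited_author in r["authors"])
    return len(records), len(records) - self_cites, self_cites

total, without_self, self_only = split_citations("saracevic", citing_records)
print(total, without_self, self_only)      # 3 2 1 (cf. 977 = 950 + 27 above)
```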

Web of Science (WoS)

    • Same subject search: “relevance AND judgment”

    • Same vanity search

    • Reminder: my access to WoS through Rutgers is limited to 1994–present

    • Well, let's take a look

WoS: subject search

[screenshot: search selections]

WoS: subject search results

[screenshot: search results]

WoS and I: my articles

[screenshot: analysis features; no. of articles in WoS]

WoS and I: authors citing me

[screenshot: no. of all citations in WoS; the author citing me most; self-citations]

WoS and I: my citations

[screenshot: analysis features; no. of all citations in WoS]

Dialog

    • Same vanity search

    • Reminder: my access to Dialog databases includes whatever years they have:

      • Citation db coverage: SCI 1974– ; SSCI 1972– ; A&H 1980–

    • The DialogWeb I use is a command search

      • powerful, but not intuitive at all

      • needs training or an information professional

    • Well, let's take a look

Dialog and I: my citations

[screenshot: list of databases being searched; search command: EXPAND on authors named “saracevic”]

Dialog and I: search process

    • the commands are complex, so screens are not shown, except the final results screen

    • Briefly (the dedup logic is sketched below):

      • found my articles in all 4 databases (126 articles)

      • some articles are in more than one db, so duplicates were removed (102 unique articles remained)

      • found citations to me in all db (1,513 citations)

      • some citations are in more than one db, so duplicates were removed (1,084 unique citations remained, but they include self-citations)

      • finally, eliminated self-citations (1,042 citations without self-citations)
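Since the Dialog command screens are not shown, here is the same dedup-and-filter logic as a small sketch in Python rather than Dialog commands; the record IDs and per-database sets are invented for illustration.

```python
# Toy version of the Dialog workflow: pool hits from several citation
# databases, remove duplicates, then remove self-citations.
results_per_db = {
    "SCI":  {"r1", "r2", "r3"},
    "SSCI": {"r2", "r4"},
    "AHCI": {"r3", "r5"},
}
self_citations = {"r5"}

all_hits = [rid for hits in results_per_db.values() for rid in hits]  # like S3: 7 hits
unique = set(all_hits)                                                # like S4: 5 unique
without_self = unique - self_citations                                # like S5: 4 remain
print(len(all_hits), len(unique), len(without_self))                  # 7 5 4
```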

Dialog and I: search process …

[screenshot of the final results screen, with sets:
  S1: no. of articles in those db
  S2: no. of articles after removing duplicates
  S3: no. of citations in those db
  S4: no. of citations after removing duplicates
  S5: no. of citations after removing self-citations]

Comparisons of my articles & citations

[screenshot: comparison table]

Tracking a single article

Barry, C.L. & Schamber, L. (1998). Users' criteria for relevance evaluation: A cross-situational comparison. Information Processing and Management, 34(2-3), 219-236.

    • Tracked citations in Scopus

    • And in Web of Science

Cited 33 times in Scopus

I followed up on the citations – cited even in: “Evaluating research for use in practice: What criteria do specialist nurses use?” Journal of Advanced Nursing, 50(3), 235-243

And the winner is?

For the Barry & Schamber 1998 article:

  • Scopus: 34 citations

  • Web of Science: 31 citations

  • Oh well …

  • Were they the same articles? Degree of overlap?

    • Overlap: 27 documents (in both Scopus & WoS)

    • Scopus had 7 that WoS did not

    • WoS had 4 that Scopus did not

[Venn diagram: Scopus 34 = 7 unique + 27 shared; WoS 31 = 27 shared + 4 unique]
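Those Venn numbers are just set operations over the two lists of citing documents; a minimal sketch follows, with invented document IDs chosen so the counts match the slide.

```python
# Overlap between two citing-document sets, as plain set algebra.
# IDs are invented stand-ins constructed to reproduce 34/31/27/7/4.
scopus = {f"d{i}" for i in range(1, 35)}    # 34 citing documents
wos    = {f"d{i}" for i in range(8, 39)}    # 31 citing documents

both        = scopus & wos                  # cited in both services
scopus_only = scopus - wos                  # unique to Scopus
wos_only    = wos - scopus                  # unique to WoS
print(len(both), len(scopus_only), len(wos_only))   # 27 7 4
```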

Tracking one of my own articles

Spink, A. & Saracevic, T. (1997). Interaction in information retrieval: Selection and effectiveness of search terms. Journal of the American Society for Information Science and Technology, 48(8), 741-761.

    • Again: Tracked citations in Scopus

    • And in Web of Science

And the winner is? …

For the Spink & Saracevic 1997 article:

  • Scopus: 43 citations

  • Web of Science: 40 citations

  • Oh well …

  • Were they the same articles? Degree of overlap?

    • Overlap: 31 documents (in both Scopus & WoS)

    • Scopus had 12 that WoS did not

    • WoS had 9 that Scopus did not

[Venn diagram: Scopus 43 = 12 unique + 31 shared; WoS 40 = 31 shared + 9 unique]

To my surprise …

    • For my article I followed up a bit on the unique citations in each of Scopus and WoS:

      • WoS had one article that did not cite the original at all

      • WoS did not have five citations from JASIST – it had other citations from that journal – these were in Scopus

      • Scopus did not have one citation from Information Processing & Management and three citations from JASIST – it had other citations from those journals – these were in WoS

      • Oh well…

Editorial uses

    • I use citation tracking as editor of the journal Information Processing & Management:

      • to find [good] referees – the most important function for any editor

        • who did what in this area/topic, and how it is cited

      • for the subject layout of the topic of the paper

      • for tracking the author's own work

        • self-plagiarism?

Inviting referees

[screenshot: editorial page for inviting referees – gets me right into Scopus]

For this particular paper in Scopus

    • I went to author search for the first author

      • over time he was at two institutions

      • published 7 papers, two on data fusion but on different topics

      • was cited only twice, so there was no use in following citation tracking

    • Then I did a subject search, “data fusion AND information retrieval”, since 2004

      • found authors that were cited a few times on the topic

      • invited two to be referees

Citation versus subject searching

    • Each follows a different path for retrieval

    • Studies show that each retrieves different documents

      • low overlap between what is retrieved (one way to quantify this is sketched below)

    • As a rule, when doing serious searching and evaluation I do both

      • popular engines, e.g. Google, are useless for this

    • Citation searching/tracking also serves different purposes

      • mapping of an area/topic and author

      • also used for assessing impact
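One common way to quantify that low overlap is the Jaccard coefficient over the two result sets; the sketch below uses invented result IDs purely for illustration.

```python
# Jaccard overlap between subject-search and citation-search results.
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a or b) else 0.0

subject_hits  = {"p1", "p2", "p3", "p4"}   # from a subject search
citation_hits = {"p3", "p5", "p6"}         # from citation tracking
print(f"overlap: {jaccard(subject_hits, citation_hits):.2f}")   # overlap: 0.17
```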

My preference:

    • Scopus

      • easy & fast to use

      • comprehensive

      • many very useful features

      • combination of several modes of searching

        • use depending on need and task

      • useful for various evaluations

      • has holes, but EVERY database has them – Scopus has fewer

      • helpful people around, easy to reach & communicate with

What is not in Scopus, but I would LOVE it

    • Graphical display of connections

      • add visualization, network maps

    • Coverage going further back in years

      • Web of Science also has limits on years, depending on subscription

        • going back before 1994 costs a gazillion dollars – Rutgers does not have it

    • Massive checking & corrections as needed

      • check what is missing in issues & add it

      • check citations, adding missed ones & deleting wrong ones

    • How about adding humanities?

Conclusions

    • Actually, I do not have any

    • But subject & author searching & citation tracking, besides being serious business and useful for evaluation, are also fun!

    • So have fun!

Thank you

    dank u

    hvala

    danke

    merci

    grazie

    gracias
