NISO’s webinar on item-level (i.e. article) usage statistics

Categories: Midd Blogosphere

On September 15, I “attended” a webinar presented by the National Information Standards Organization (NISO) focused on improving the capacity to measure usage at the article level. The presenters contend that article-level usage information reflects scholarly impact more accurately than the current ‘gold standard,’ citation-based measures.

The first presenter was Paul Needham, who described the progress of the PIRUS 2 project. Its mission is to establish global standards for gathering article-level usage information. (It is funded by the Joint Information Systems Committee (JISC) in the UK, and Counting Online Usage of NeTworked Electronic Resources (COUNTER), the group that established protocols for journal-level usage statistics, participates.) Needham reported that PIRUS has loaded data from six publishers, covering 555,000 articles from 5,500 journals and just shy of 17 million download events. They expect there will be upwards of 3 billion article download events to ‘count’ each year once most publishers are participating. They are also working with institutional repositories. More information can be found at their website: www.cranfieldlibrary.cranfield.ac.uk/pirus2/
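To make the counting idea concrete: a COUNTER-style article-level statistic is, at its core, an aggregation of download events per article identifier (such as a DOI). The sketch below is purely illustrative — the event data and DOIs are hypothetical, and real PIRUS/COUNTER processing also filters out double-clicks and robot traffic:

```python
from collections import Counter

# Hypothetical raw download events: (article DOI, ISO timestamp).
events = [
    ("10.1000/j.example.2010.01", "2010-09-01T10:00:00Z"),
    ("10.1000/j.example.2010.01", "2010-09-01T14:22:00Z"),
    ("10.1000/j.example.2010.02", "2010-09-02T08:30:00Z"),
]

# Article-level usage: one tally per article identifier.
downloads = Counter(doi for doi, _ in events)

for doi, n in downloads.most_common():
    print(doi, n)
```

At journal level, the same events would instead be grouped by journal ISSN, which is why article-level reporting is strictly more granular.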

The second presenter was Johan Bollen, who described the Metrics from Scholarly Usage of Resources (MESUR) project (http://www.mesur.org/MESUR.html). “The project’s major objective is enriching the toolkit used for the assessment of the impact of scholarly communication items, and hence of scholars, with metrics that derive from usage data.” Bollen contends that impact factors based on citation data do not honor the full scope of scholarly activities; furthermore, citation tracking takes more than two years to develop. In contrast, usage data is available in real time and at much larger scales. He described MESUR as a “scientific project to study science itself from real-time indicators.” The usage data he has been working with includes “clickstreams,” so his team can follow which articles are clicked on after a given article is downloaded. One of his articles on this research is available at http://arxiv.org/abs/0902.2183 and includes a graphic showing how citation-based metrics fail to reflect actual usage. It is fascinating in an academic, scholarly kind of way.
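The clickstream idea can be sketched in a few lines: given per-session sequences of articles in the order a reader viewed them, count how often one article is viewed immediately after another. This is a toy illustration, not MESUR’s actual pipeline — the session data and article IDs below are made up:

```python
from collections import defaultdict

# Hypothetical clickstream data: per-session sequences of article IDs,
# listed in the order the reader viewed them.
sessions = [
    ["A", "B", "C"],
    ["A", "B"],
    ["B", "C"],
]

# Count how often article y was viewed immediately after article x.
transitions = defaultdict(int)
for seq in sessions:
    for x, y in zip(seq, seq[1:]):
        transitions[(x, y)] += 1

for (x, y), n in sorted(transitions.items()):
    print(f"{x} -> {y}: {n}")
```

Aggregated over millions of sessions, these transition counts form a navigation network among articles, which is the kind of real-time structure Bollen’s metrics are derived from.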