Friday, October 14, 2016

How to read things without opening them

Brent Seales is a professor of computer science at the University of Kentucky. He drives several very cool research programs in areas ranging from robotic surgery to advanced image processing.

Recently, the Economist (one of my favorite weekly reads) featured some of his work. Here's a link.

How to read an old scroll without opening it

Tuesday, October 11, 2016

iCite and W2P ratios - new ways of quantifying scientific productivity

What's the best way of ranking scientists based on their productivity?

There are lots of options including:
  • number of publications
  • number of citations
  • impact factor of the journals the publications are in
All have strengths and weaknesses.

Recently, leaders at my academic institution have started to talk publicly about h-indices. These are calculated for each author as the number (x) of publications he/she has that have each been cited at least x times. That's an interesting idea because it rewards impact; people who publish lots of papers that nobody cites have lower h-indices than people who publish a few manuscripts that are very influential. However, the h-index has the drawback that it grows with time (because people publish more papers and those papers have more time to be cited). This means that it favors seasoned scientists who have been productive for a long time. It's not a great way of identifying a rising star.
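For the computationally inclined, the h-index definition above is easy to turn into code. Here's a minimal sketch (the citation counts in the example are made up for illustration):

```python
def h_index(citations):
    """h-index: the largest x such that the author has x papers
    each cited at least x times."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h

# A hypothetical author with five papers:
print(h_index([10, 8, 5, 4, 3]))  # 4
```

Sorting the citation counts in descending order makes the calculation a single pass: walk down the list until a paper's citations fall below its rank.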

I don't think that there will ever be a single perfect metric that scientists can use to quantify productivity but I was excited to see that NIH is supporting the Relative Citation Ratio with their new iCite tool.

The Relative Citation Ratio (RCR) is an article level metric that quantifies scientific influence. To quote a help box from iCite, "It is calculated as the cites/year of each paper, normalized to the citations per year received by NIH-funded papers in the same field."

The Weighted RCR is the sum of the individual RCRs for a group of papers.

This creates an interesting opportunity. If you calculate the ratio of the Weighted RCR to Total Publications (I'll call it W2P) you get a single value that defines the influence of a collection of papers.

If your W2P is greater than 1, your papers are more influential than those of your NIH-funded colleagues. If it's less than 1, your papers are being cited less often than average.

W2P doesn't scale with the number of papers you publish, so it shouldn't depend on the length of your career.

Only time will tell how scientists use these new metrics, but I am going to make a bold (and probably rash) prediction. Since NIH is supporting RCRs, I think that they will take over from h-indices as the most commonly used measure of productivity in US biomedical science.

For the record, here are my current stats
  • 46 publications
  • h-index = 22
  • Weighted RCR = 59.98
  • W2P = 1.34
and for the truly nerdy