
Thursday 20 October 2011

Automatic for the People

The second set of notes from yesterday's ESRC Seminar Series on Impact looks at the efforts made by the USA's National Science Foundation (NSF) to automate the collection of science metrics.

Julia Lane, Program Director for the Science of Science and Innovation Policy, gave an overview of the background and development of the 'Star Metrics' system. As in the UK, the 17 federal funding agencies were asked to justify the investment the government had made in science.

Refreshingly, rather than offloading this burden on to individual researchers (as is currently happening with the RCUK ROP system), the NSF decided that:
  • the information should be harvested automatically and electronically;
  • the system should be voluntary.
I know. What were they thinking? What we need is mandatory forms, and lots of them! Do they know nothing about research funding management?

But no, they were thinking very logically. After all, in the twenty-first century, when the internet allows us to order our groceries, book our holidays and buy our road fund tax, why can't it be used to gather information on impact automatically?

Thus, they created a system that does the following:
  1. Follows the trail of grants through individual HEIs' financial systems. This can tell them who is funded (via the HR system), including PIs, Co-Is, RAs and students; where the money is being spent (via the procurement system); and who they are subcontracting to or collaborating with (via the finance system);
  2. Follows the trail of outputs, by linking with the patent office and publication databases;
  3. Follows the individual via various CV systems, such as Vivo, Harvard Catalyst and Eurocris;
  4. Analyses the areas funded by the federal agencies by scanning and machine-reading the applications and doing a keyword analysis (see the sketch after this list).
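To give a flavour of what that last step involves, here is a back-of-the-envelope sketch in Python. It is purely my own illustration, not anything Lane showed: the award IDs, abstracts and stop-word list are made up, and the real system obviously works from the agencies' actual application and award records rather than toy strings.

    # Toy keyword analysis over (made-up) grant application abstracts.
    # Illustrative sketch only, not the STAR METRICS implementation.
    import re
    from collections import Counter

    STOP_WORDS = {"the", "of", "and", "in", "to", "for", "a", "on", "with", "using"}

    def keywords(text, top_n=5):
        """Return the most common non-stop-words in a piece of application text."""
        words = re.findall(r"[a-z]+", text.lower())
        return Counter(w for w in words if w not in STOP_WORDS).most_common(top_n)

    # Hypothetical award abstracts keyed by made-up award IDs.
    awards = {
        "AWARD-0001": "Machine learning methods for protein folding and structure prediction",
        "AWARD-0002": "Protein structure analysis using deep learning on genomic data",
    }

    for award_id, abstract in awards.items():
        print(award_id, keywords(abstract))

Aggregate those keyword counts by agency, institution or state and you have the basis of the 'areas funded' analysis in the list that follows.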
As a result, Star Metrics can be used to identify:
  • What expertise there is in a particular area, or at a particular site;
  • Where there is a shortage of expertise;
  • How much funding has been put into any discipline area;
  • Areas of overlap between funders;
  • What has been funded in any geographical location (such as a state);
  • What has been funded in any institution, or for an individual;
  • What local or national businesses have benefited from the funding;
  • The outputs and outcomes from any funded project;
  • The career development of anyone associated with the project.
And all, as Lane said, without the academic having to lift a pen. Now how refreshing is that?
