TL;DR: it would be nice if my ORCID works list could be ordered by, e.g., impact factor.
Every year, my institute conducts a review of research output, which I’m sure is a common thing amongst research institutes, laboratories and probably universities. I have my ORCID account connected with the DataCite “Search and Link” service, which automatically imports new works from DataCite-connected data centres. So when my manager asked me for the year’s scientific output, instead of trawling through emails, files, calendars, etc., I simply went to my ORCID profile, pulled out a list of works, and handed it over. Five seconds later, still patting myself on the back, my enthusiasm for this self-service mode of output evaluation was somewhat curbed. I had a feeling it was too good to be true…
It turns out that at my institute, as at probably any serious research institute or university, not all scholarly output is created equal. Leaving aside the aspects of wider impact assessment in terms of what kinds of contributions count for assessment - i.e., departing from the “traditional” list of journal articles, conference proceedings, etc. and counting posters, blog posts, technical reviews, software, educational material, peer reviews, and so on - and concentrating only on the quality of the contribution, my manager was faced with a problem. Although ORCID did indeed show a comprehensive view of my research outputs, it did not say anything about which ones were eligible for inclusion in my institutional assessment, because there was no way to map each object to the “publication count” as defined by my institute.
I have project notes, conference posters, software releases, reproducible workflows, educational courses, and of course the “good old” peer-reviewed journal articles. How is the institutional librarian or group leader to know which ones are applicable?
Hm. Fair enough…
Institutional membership to the rescue?
Now, it may be that if my institute had an institutional ORCID membership, it would be able to proactively obtain my contributions for internal records, instead of relying on me to send in my (perhaps biased) report. This would allow it to pull in works from sources which recognise my ORCID, update my profile within the institute, and filter the works according to its internal criteria, before harvesting them and submitting them to the institutional repository. Perhaps…
However, the single most important factor in what to select from my list of works was whether or not the work was published in an accredited journal or publication. Whilst there are a few nuances to this, it essentially comes down to whether or not the publication is peer reviewed and has an impact factor.
Sorting by impact factor
So even though I had to go through my list of works and send through a filtered list, selecting works based on the internal criteria of my institute, the integration of ORCID with the publishers (including DataCite) really saved an incredible amount of time. I could be fairly certain that if a work was not in my ORCID profile, it would not satisfy the institutional criteria, since it would not have been published in an accredited journal, so I didn’t have to spend hours trawling my email, calendar and filesystem, looking for articles which I may have published but forgotten about. Yay ORCID!
However, I still had a fairly long list to go through, which was sorted chronologically. Whilst going through this list I had to check which source each contribution came from (which was pretty easy). To make things even easier, though, it would be awesome if that list could be sorted by impact factor instead of date. This would not only make my annual reporting much easier, but also allow those visiting my profile to see my most impactful works first.
Ideas on implementation
This is probably not trivial to implement, but if I had to think of a solution, it would need the impact factors of the data sources. I’m not sure if the OAI-PMH protocol allows for data centres to include their impact factor in their repository metadata (or if this even makes sense), but that might be one way. Another way would be for an ORCID service to cross-reference the contribution with Web of Knowledge data on impact factors, and try to guess what impact factor to weight the article with.
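As a minimal sketch of this cross-referencing idea: given a list of works pulled from an ORCID profile and a lookup table mapping journal titles to impact factors, each work can be scored and the list sorted accordingly. The lookup table below is entirely invented for illustration; a real implementation would need licensed impact-factor data (e.g. from Web of Knowledge).

```python
# Hypothetical impact-factor lookup, keyed by journal title.
# Real values would come from a licensed source such as Web of Knowledge.
IMPACT_FACTORS = {
    "Journal of Important Results": 7.2,
    "Proceedings of Something": 1.4,
}

def rank_by_impact_factor(works):
    """Sort works by the impact factor of their journal, highest first.

    Works with no recognised journal (datasets, posters, software, ...)
    score 0.0 and sink to the bottom of the list.
    """
    def score(work):
        return IMPACT_FACTORS.get(work.get("journal"), 0.0)
    return sorted(works, key=score, reverse=True)

works = [
    {"title": "A dataset", "journal": None},
    {"title": "A paper", "journal": "Journal of Important Results"},
    {"title": "A proceedings entry", "journal": "Proceedings of Something"},
]

for w in rank_by_impact_factor(works):
    print(w["title"])
```

The hard part, of course, is not the sorting but populating the lookup table reliably.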
Another aspect of a repository is its repository webometrics score, where available. This is what we decided to implement in the CHAIN-REDS Semantic Search tool. The issue here might be cross-referencing the DataCite data centre identifier (the prefix, essentially) with the webometrics identifier. Of course, it would be even better if webometrics used the same persistence and uniqueness infrastructure as the rest of the academic publishing world, but we can’t have everything now, can we?
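The prefix cross-referencing above could look something like the sketch below: a DataCite DOI carries a registrant prefix (the “10.xxxxx” part) identifying the data centre, so if a prefix-to-score mapping existed, works could be weighted by the standing of the repository holding them. Both the prefixes and the scores here are made up for illustration.

```python
# Hypothetical mapping from DOI registrant prefix to a webometrics-style
# repository score; a real mapping would have to be curated by hand, since
# webometrics does not use DOI prefixes as identifiers.
REPOSITORY_SCORES = {
    "10.5072": 0.9,  # hypothetical data centre A
    "10.5073": 0.4,  # hypothetical data centre B
}

def repository_score(doi):
    """Return the (hypothetical) repository score for a DOI's prefix."""
    prefix = doi.split("/", 1)[0]  # "10.5072/abc" -> "10.5072"
    return REPOSITORY_SCORES.get(prefix, 0.0)

print(repository_score("10.5072/example-dataset"))  # 0.9
```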
An alternative approach might be to forget about the publication and focus on the object; the works could be ranked by the Altmetric score, although this may imply some kind of relationship between Altmetric and ORCID. The work should have a persistent, unique identifier and this should allow a unique Altmetric score, which seems to imply a well-formed algorithm for ranking works in the ORCID list… but here I confess my ignorance of whether or not this is actually feasible.
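A rough sketch of the Altmetric approach: the public Altmetric API exposes a score per DOI (at `https://api.altmetric.com/v1/doi/<doi>`), so given persistent identifiers, works could be sorted by that score. The fetch below is a plausible but unverified use of that endpoint (rate limits and API keys are glossed over); the ranking itself needs nothing more than a DOI-to-score mapping, which is why the demonstration uses canned scores rather than live calls.

```python
import json
from urllib.request import urlopen

def fetch_altmetric_score(doi):
    """Fetch the Altmetric score for a DOI; 0.0 if untracked or unreachable."""
    try:
        with urlopen(f"https://api.altmetric.com/v1/doi/{doi}") as resp:
            return json.load(resp).get("score", 0.0)
    except OSError:  # covers HTTP 404 for untracked DOIs and network failures
        return 0.0

def rank_by_altmetric(dois, score_fn=fetch_altmetric_score):
    """Return the DOIs sorted by score, highest first."""
    return sorted(dois, key=score_fn, reverse=True)

# Offline demonstration with canned scores instead of live API calls:
canned = {"10.1000/a": 12.5, "10.1000/b": 3.0}
print(rank_by_altmetric(list(canned), score_fn=lambda d: canned[d]))
```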
Long story short: rank by importance.
So, basically: wouldn’t it be nice to have an ORCID profile which at least tried to list my works ranked by “importance”? We could have multiple definitions of what “important” means, but any of them would be better than ranking by something as insignificant as “title”. This would really make the ORCID profile a more powerful tool in the evaluation of researchers, both “at a glance” and during annual reviews, as in my case.
Bruce Becker writes in his personal capacity and this post in no way reflects the policy or position of the Council for Scientific and Industrial Research. This article serves as a personal account and does not reflect any relationship between the CSIR and ORCID, DataCite or any other entity mentioned. It was written in the context of the THOR Ambassador programme, which he is a member of.
ORCID - Y U NO sort by impact? was originally published at Africa-Arabia Regional Operations Centre on April 11, 2016.
This is a companion discussion topic for the original entry at http://www.africa-grid.org/blog/2016/04/11/ORCID-impact/