16 Mar 2023
ResearchGate recently notified me that I had reached a milestone: my research items had reached 1,000 reads. Ignoring for a moment the awkwardness of “research items,” which we can, I think, chalk up to ResearchGate making it possible to publish a variety of materials, I want to think about what “reads” means here, because in the age of citation metrics and their kin these kinds of quantifications may eventually play more of a role than any of us might wish.
As a member of a department personnel committee, I recently enjoyed reviewing the work of three terrific junior faculty members, all of whom, I will note here, deserve to be tenured and promoted without delay. All three offered not only impressive vitae and compelling portfolios of materials for personnel committee members to peruse, but also very polished slide decks, each of which featured various composite scores of semesterly SEIs (Student Evaluations of Instruction). All three are smart enough to know that SEIs are biased in a variety of well-documented ways and, when they consist of seven fairly subjective questions, offer little of statistical significance. They are also smart enough to know that the same institutions that don’t invest in faculty also tend to think things like SEIs are acceptable forms of assessment and even development. (The kind of professional development I have in mind would include not only funding for travel to conferences but also funding a teaching resource center staffed by people with a real focus on educating college-aged students, as well as funding teaching pairs and training for faculty on how to be better assessors of their own, and their colleagues’, approaches to teaching.)
So add up some scores, calculate an average, and include that in a graphic on a slide. Put another way, it doesn’t matter how meaningful, or meaningless, the number is, so long as it is a number.
And that probably reveals my attitude toward a milestone like 1,000 reads, because … is it? Is it really 1,000 reads? Or is it 1,000 downloads? Or 1,000 views of a page from which you could download the text, which is how Academia.edu seems to work? (More on that later.)
Google Scholar seems to hew to the more conventional approach to counting things by counting only citations. However, it is not entirely clear how it arrives at those counts: does it count only citations from materials also deposited in/with Google Scholar? I ask because, to be honest, the portfolio of my materials with Google Scholar is not complete. Nor is it complete on ResearchGate or Academia.edu. Perhaps worse: the make-up of the portfolio on each is different, with Google Scholar holding older materials, Academia.edu the more conventionally humanities materials, and ResearchGate the more computational materials.
In all honesty, there was no principled division of materials, because I have never been quite sure which one was worth the effort of uploading everything to, and, really, my goal was to put everything in a GitHub repository. (I’m still working on this.)
In a better world, researchers could post their open access, or otherwise accessible, materials in a repository of their choosing, and these sites/services would simply index things there. That has not happened, and my guess is that it is not going to happen. Academia.edu wants you eventually to invest in a premium membership, like LinkedIn, and Google simply wants to keep you in the GooglePlex. ResearchGate seems able to remain in the “if you get enough users, the money question will answer itself” phase. Its “About Us” page hints at perhaps eventually rolling out a job search service or … something.
In the meantime, we’ve got numbers. And I guess you could put them in a slide deck. Or an annual performance evaluation. Or something!
You can go back to the logbook or dive into the archive. Choose your own adventure!