
Citation Searching: ResearchGate

How to find documents that cite a given article

ResearchGate

In the last few years, I have received several questions about ResearchGate, the social networking site for academics.  Launched in 2008, its stated aim was to help researchers communicate quickly via the platform, making it easy to share and access scientific and scholarly knowledge and expertise.  It is free to join, and each member is given a “profile page” where they can give a brief biographical snapshot and list their publications.  Just seven years later, ResearchGate has a noteworthy reach: more than 3,000 scientists polled by Nature reported they were “aware” of ResearchGate, and just under half said they “visited ResearchGate regularly” (Van Noorden 2014).  At first glance, it might seem that ResearchGate has wide coverage of articles across disciplines and years, but its coverage of recent years is far more substantial, and some disciplines, such as the arts and humanities as well as some areas of the social sciences, receive sparse coverage (Thelwall and Kousha 2015).

As an academic social network, then, most reviewers have no qualms with ResearchGate per se.  The problem scholars have is with the ResearchGate Score as a measure of a researcher’s scientific reputation.  Questions that came to mind when taking a closer look at ResearchGate were: 1) How do they get that score? 2) Why isn’t their method transparent? 3) How do we know that their viewing figures are not artificially inflated?  These were the questions I set out to answer as I approached this research project.  I thought surely there must be numerous studies taking a critical look at how far ResearchGate strays from well-established bibliometric guidelines for research metrics.

Luckily, I found quite a few papers that address these very questions, and their conclusions were interesting and surprising.  The most surprising thing I found was that no study, to date, has been able to confirm or refute that ResearchGate’s viewing figures are artificially inflated.  So this question continues to taunt.  There was consensus that ResearchGate’s article views have low to moderate correlations with both Scopus citations and Mendeley readers (Thelwall and Kousha 2014).  Incidentally, Mendeley is Elsevier’s social reference manager, which helps authors keep track of their references and, like ResearchGate and Academia.edu, has a social component.  Some studies concluded that if ResearchGate’s article intake and reputation continue to grow, the correlation between ResearchGate metrics and traditional research metrics will also increase as ResearchGate becomes more comprehensive.  Other studies I read gave opposite evidence and found the ResearchGate Score to have serious limitations, going so far as to say that “the ResearchGate Score should not be considered in the evaluation of academics in its current form” (Kraker and Lex 2015).  Of course, some scholars argue that the ResearchGate Score is a composite metric that takes “social interactions” into account in tandem with traditional research metrics, giving a more “desirable” picture of impact, but in the end there is no consensus on how to measure academic influence via social media (Jordan 2015).
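To make that correlation finding concrete, here is a minimal, purely illustrative sketch (in Python, using the SciPy library) of how studies of this kind typically compare two indicators: collect a pair of counts per article and compute a Spearman rank correlation, the usual choice for skewed count data.  The view and citation numbers below are hypothetical placeholders, not data from any of the studies cited here.

    # Illustrative sketch only: hypothetical counts, not data from any cited study.
    from scipy.stats import spearmanr

    # One pair of counts per article: ResearchGate views and Scopus citations.
    rg_views = [120, 45, 300, 10, 75, 220, 5, 60]
    scopus_citations = [8, 2, 15, 0, 3, 12, 1, 4]

    # Spearman's rho measures how well the rank orderings of the two counts agree.
    rho, p_value = spearmanr(rg_views, scopus_citations)
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

A rho somewhere in the 0.2 to 0.5 range is the sort of value usually described as low to moderate: the two indicators are related, but clearly not interchangeable.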

Apart from the papers and studies, I found that scholars either love or hate ResearchGate.  Many scholars find ResearchGate’s frequent use of automated e-mails (which claim to come from colleagues active on the site) a disgraceful tactic that lures people to join under false pretenses.  There have been incidents where profiles on the site were not created by real people but were generated “automatically and incompletely” by culling details of scholars’ affiliations, publication records, and so on from the web (Van Noorden 2014).  Others find that every important paper in their field can be easily and quickly accessed via ResearchGate.  In the end, what is a researcher to do who is seeking some kind of empirical yes or no about ResearchGate?  Like everything else, it depends . . . if you are seeking an alternative or new way to get your work out there that goes beyond the traditional metrics of scholarly communication, ResearchGate might just be the thing.  However, if you are seeking a reliable tool to measure your scholarly output, then ResearchGate does not make the grade.

References

Corvello, V., Genovese, A., & Verteramo, S. (2014). Knowledge sharing among users of scientific social networking platforms. Frontiers in Artificial Intelligence and Applications, 261, 369-380.

Delgado López-Cózar, E., Robinson-García, N., & Torres-Salinas, D. (2014). The Google Scholar experiment: How to index false papers and manipulate bibliometric indicators. Journal of the Association for Information Science and Technology, 65(3), 446-454.

Hoffmann, C. P., Lutz, C., & Meckel, M. (2015). A relational altmetric? Network centrality on ResearchGate as an indicator of scientific impact. Journal of the Association for Information Science and Technology. doi: 10.1002/asi.23423

Jordan, K. (2015). Exploring the ResearchGate score as an academic metric: Reflections and implications for practice. In: Quantifying and Analysing Scholarly Communication on the Web (ASCW15), 30 June 2015, Oxford. http://oro.open.ac.uk/43538/1/ASCW15_jordan_response_kraker-lex.pdf

Kadriu, A. (2013). Discovering value in academic social networks: A case study in ResearchGate. In Proceedings of the 35th International Conference on Information Technology Interfaces (ITI2013) (pp. 57-62). Los Alamitos: IEEE Press.

Kraker, P., & Lex, E. (2015). A critical look at the ResearchGate score as a measure of scientific reputation. In Proceedings of the Quantifying and Analysing Scholarly Communication on the Web workshop (ASCW’15), Web Science conference 2015 (Oxford, UK, June 28 – July 1, 2015).

Ortega, J. L. (2015). Relationship between altmetric and bibliometric indicators across academic social sites: The case of CSIC's members. Journal of Informetrics, 9(1), 39-49.

Thelwall, M., & Kousha, K. (2014). Academia.edu: Social network or academic network? Journal of the Association for Information Science and Technology, 65(4), 721-731.

Thelwall, M., & Kousha, K. (2015). ResearchGate: Disseminating, communicating and measuring scholarship? Journal of the Association for Information Science and Technology, 66(5), 876-889.

Van Noorden, R. (2014). Scientists and the social network. Nature, 512(7513), 126-129. http://www.nature.com/news/online-collaboration-scientists-and-the-social-network-1.15711