by jordicabot | Jul 6, 2015 | evaluating research, funny, Research Rants
Maybe it’s because of the hypercriticality of our research area, but if I had to summarize what I think is my main role as PC Chair, I’d say it is the following: (if you don’t recognize the song the meme is paraphrasing, you’re too young to be doing...
by jordicabot | Apr 18, 2015 | evaluating research, Research Rants, tools
Announcing the release of our new MetaScience online service, which we have developed to help with the critical matter of evaluating the quality of conferences using metrics that are not usually available (unless you take the time to calculate them yourself). The current...
by jordicabot | Jan 12, 2015 | evaluating research, funny, Research Rants
A very funny (and sad at the same time) collection of reviewers’ comments on research papers at http://shitmyreviewerssay.tumblr.com/ (also on Twitter at yourpapersucks). The comments mix criticisms of what probably are really bad papers. The best thing about the...
by jordicabot | May 6, 2014 | evaluating research, presenting, Research Rants
We recently completed a research project that ended with a bunch of negative results. Even if negative, we thought the results we obtained were valuable because, in our opinion, they were not obvious (in fact we wanted to “prove” that the variables we studied were...
by jordicabot | Apr 18, 2014 | evaluating research, Research Rants
Lionel Briand and André van der Hoek (PC Chairs of ICSE 2014; for those working in other research areas, I think it’s safe to say that ICSE is the most well-known research conference in Software Engineering) have published their analysis of the peer-review process...
by jordicabot | Mar 5, 2014 | evaluating research, publishing, Research Rants
I took the time to count how many new reviews for the ECMFA’14 conference were uploaded each day to the conference’s EasyChair account. The results, displayed below, are exactly what I was expecting (it’s also my own behaviour 🙂 ). Even if the...