First of all, the “I” in the post title is not me; it’s Richard Paige, Professor of Enterprise Systems and (as of May 2015) Deputy Head of Department (Research) in the Department of Computer Science at the University of York. He is also a very opinionated person who recently shared his thoughts on the state of software engineering research after spending many hours this year reviewing around 100 papers for several software engineering conferences and journals.
To be clear, I’m just reposting his tweets here. Since they are public, I’m sure he won’t mind (though I haven’t asked 🙂 — I’ll do it a posteriori: “ask for forgiveness, not permission”), and I believe his opinions are spot on and deserve to be collected here.
Enter Richard:
In the past 8 months I've reviewed around 100 papers for various SE events/journals. Some observations.
— Richard Paige (@richpaige) August 24, 2015
(1) Nothing beats a good analytic/synthetic survey paper. They're hard to write, distill lots of great info, and you keep going back to them
— Richard Paige (@richpaige) August 24, 2015
(2) Many (most?) conference submissions exist not to further science but to further career progression.
— Richard Paige (@richpaige) August 24, 2015
(3) 80% of the papers I reviewed were neither acceptable nor accepted (though at least 15% were close – ie., benefited from reviews).
— Richard Paige (@richpaige) August 24, 2015
(4) At least 70% of the reviews I saw were in the useful-to-insightful range. Around 5% were totally brilliant reviews.
— Richard Paige (@richpaige) August 24, 2015
(5) Analysing Github is exceptionally popular, exceptionally boring (to reviewers), and isn't too difficult to publish.
— Richard Paige (@richpaige) August 24, 2015
(6) Mathematical papers tended to receive stronger reviews than empirical/engineering papers, even if their contribution was much smaller.
— Richard Paige (@richpaige) August 24, 2015
(7) Tool papers are hard to review, manage, publish, except when there is a dedicated artifact track with insanely great reviewers
— Richard Paige (@richpaige) August 24, 2015
(8) There is increasing homogeneity in publishing patterns: early career researchers tend to publish/try to publish in same places. [cont..]
— Richard Paige (@richpaige) August 24, 2015
(8)[cont] Many bibliographies included authors' self-citations of their [top-4-SE-conferences] papers.
— Richard Paige (@richpaige) August 24, 2015
If every early career SE researcher has a CV with an ICSE+FSE+ASE+TSE paper, does that help them stand out in the crowd?
— Richard Paige (@richpaige) August 24, 2015
I wonder if the advice we give to PhD students that they "must have an ICSE/FSE/ASE" paper is good advice.
— Richard Paige (@richpaige) August 24, 2015
Tool tracks at most software engineering conferences encapsulate the disconnect between SE academic research and practice.
— Richard Paige (@richpaige) August 25, 2015
"Here's a tool that partly solves a problem that would never happen in practice because of policy, process, discipline, or nobody caring."
— Richard Paige (@richpaige) August 25, 2015