The 40th ACM Technical Symposium on Computer Science Education
March 4-7, 2009, Chattanooga, TN USA

Guidelines for Reviewers of Paper Submissions

General Recommendations

Here are some recommendations for writing reviews of submitted papers that help the authors and improve the quality of the symposium.

  • Recognize the difference in types of papers. This year we have asked authors to categorize their papers to help you distinguish their aim.
    • Experience reports describe an idea or a course that worked well and is now being recommended to others.
    • Research studies present a more rigorous investigation, differentiated in particular by its use of appropriate methodology. Please note that the methodology need not be quantitative; rather, it needs to be appropriate to support the claims made by the author.
    • Philosophical papers present an argument for some idea about our curriculum, course or field.
    • Tool papers may describe courseware or a technique that the author believes has wide application.

    While each type of paper is evaluated on organization, technical soundness, potential contribution and overall evaluation, what you look for under each of these criteria may differ depending on the type of paper. You do not need to require an experience report to be a research study, nor a research study to provide a tool. All papers should show appropriate organization and should reflect the existing literature related to the topic being discussed. All claims should be supported by logical arguments that follow from the ideas, experience or data provided.

  • Your job as a reviewer is to write detailed reviews, even for excellent papers. Tell the authors why you liked their paper, so that they know what made their work so successful.
  • Even if your opinion is that the paper is poorly written or poorly thought-out, you can still provide constructive criticisms to help the authors, and in the long run, the conference. Think of your goal as convincing the authors of the paper you're reviewing to submit something else next year, but of such high quality that it will be well-reviewed and easily accepted. Give the authors advice on how to do that.
  • The best reviews clearly justify the reviewer's choice of rating. The least valuable review gives a low score with no written comments. That simply tells the authors that they have been unsuccessful, with no indication of how or why.
  • The focus of your review should be on content.
  • Please point out typographic and grammatical errors, unless there are too many to list individually; in that case, note the general problem.
  • Although SIGCSE requires all submitted papers to be polished work, all accepted authors get a brief opportunity to improve the presentation of their paper before camera-ready copy is due. Your detailed feedback may help improve a paper, and in a small way, improve the conference.

Substandard Reviews

SIGCSE 2009 will use a meta-review process after reviews have been turned in. A review may be deleted if it does not objectively, accurately, and clearly assess a paper's suitability for publication at SIGCSE. Such an assessment should be founded in the reviewer's disciplinary expertise and based on the written paper's originality, technical soundness, contribution to CS education, and clarity of presentation.

For example, an unacceptable review might:

  • be incoherent, unreadable, or irrelevant to the paper;
  • focus on the paper's topic area or presumed authors at the expense of assessment of the paper itself; or
  • provide no justification for its numeric ratings. (Even in "obvious" cases, reviewers should briefly justify ratings.)

Please note that a difference in rating or opinion with other reviewers or PC members will NEVER be cause for deletion of a review.

Examples of Good Reviews

To help reviewers better understand the qualities of good, useful reviews, here are several example comments, organized by review category:

  • The paper presentation (organization and writing style)
    • The organization of the units on forensics was well done. However, the discussion of how it fits into the curriculum is overly broad and not too realistic. Many factors were overlooked on the "curriculum side".
    • Good level of detail on your approach. Table 2 is very handy. Under Section 2, it seems like log analysis and auditing may fit in your column two. How will you ensure additional security emphasis is implemented?
    • The paper was easy to read, although it could benefit from a review of the English sentence structure. The paper was organized in an easy-to-follow manner. The authors explained their motivations and methods for their study.
    • The paper could use additional proofing and polishing. I suggest finding a non-robotics person to read for both language and communication. Some sentences are poorly formed (e.g., sent. 1 of last par. in sec. 1). Some content seems misplaced (e.g., discussion of mobility in section 3).
    • The organization is faultless. It is very clear what the paper is going to say and how. The paper follows through with crystal clear subject headings and a logical flow of information. There are some grammatical problems; these are not serious, but a thorough proof-reading would be helpful.
  • The idea or work presented is original and appropriately builds on and acknowledges previous work in the area
    • I would have liked to see some discussion and references setting this work in the context of other studies of student learning and knowledge retention. While I don't know of other studies that have examined exactly the phenomenon this paper does, a short search in the ACM digital library turned up these examples that are relevant...
  • The paper is technically sound; it includes evidence for any statements or conclusions that it makes
    • This paper makes a very good argument in the introduction for why this course is needed. It is timely, and addresses a topic outside of the norm often seen at SIGCSE.
    • I can't recall ever seeing something similar at SIGCSE. In spite of the previous problems, I would urge acceptance of this paper on a topic that we rarely see at SIGCSE.
    • The hypotheses are too obvious, and their validation is insufficient. Therefore, the contribution of this paper is quite limited.
  • The idea or work presented has potential to contribute to CS education
    • This paper should generate a lot of discussion and have a good audience. It is a topic that many schools are trying to address (including mine).
    • Hard to judge given the writing and organization problems, but I do not see a lot of significance here. The verification that the laboratory helped more than the on-line component alone is a nice result, if it is supported by the data. Having taught this course already and collected feedback on your approach makes the paper stronger.
    • It is important for those who might be considering this approach to know that it can be successful. If I were considering this approach I would want to know if the students could understand the code, and how deeply I could get into the material given time constraints.
  • Overall evaluation
    • As noted above, I'd urge acceptance of this paper because it's relevant and unique. If it had competitors, I wouldn't be nearly so kind. But I haven't seen competitors.
    • This is a good, interesting topic, accessible to the SIGCSE audience, and widely useful.
    • A good practical beginning guide to implementing lab exercises in a visualization course.
    • Given the potential interest in this topic the authors could do better to capture the imagination of the reader; perhaps with a paragraph or two on famous cases.
  • Presentation
    • The material stands well on its own but could be made more attractive for a presentation. Help the audience understand why this is interesting. Relate some "exciting horror stories" of real-world failures. Justify why a full semester should be devoted to this course in an already tightly-packed curriculum.
    • After introducing the course, spend time on how you handle the diverse population. Also spend time on the lab components that the students found most interesting. Provide handouts of the lab exercises for those who are interested (or provide a link to a web resource).

Questions? Please contact:

Steve Wolfman and Gary Lewandowski
SIGCSE 2009 Program Chairs
