This is a recruitment post for an academic survey on the quality of comments and documentation for Java programs.
My team is developing a research prototype of a technique for automatically generating comments in Java source code. The long-term goal of the project is to develop a tool that can write Javadoc comments for an otherwise undocumented program.
We are now soliciting feedback on the different types of comments produced by our prototype. We are comparing these against manually written summaries, summaries produced by other algorithms, and combinations of human- and computer-written comments.
We've set up a web survey to collect feedback about the comments. The survey shows 24 different Java methods from 6 different Java programs, as well as English summaries for those methods. To avoid bias, the survey does not say which algorithm generated the summary (or if it is human-written) -- it can be quite difficult to tell in many cases.
We are recruiting Java programmers to participate in the survey. Participation entails reading each of the 24 methods and its associated summary, then answering a few questions about each method. We expect the survey to take 30-60 minutes.
This is a pilot study, and reading every method carefully can be demanding, so we are offering US$10 to the first 10 participants who have at least 2 years of Java experience. If we get a good response, we may extend the offer to more than 10 participants.
If you are interested, send me a Private Message. I will respond with additional explanation and the link to the survey. Note that as a university, we are required to obtain informed consent, which means the study cannot be done anonymously. Of course, no identities will be revealed publicly, here or elsewhere. Once you complete the survey, we are set up to compensate you via your choice of PayPal or Bitcoin.
Thank you for considering!