Project Discussion - https://docs.google.com/document/d/1Zne2CpODO8YvkSf5-h1U4SRt4LsK8e9dAyFUjOqwrHU/edit
- Greg Cusack (gregcusack@ucla.edu)
- Kunal Patel (kunalpatel1793@gmail.com)
- Nathan Pilbrough (nathanpilbrough@gmail.com)
- Tzu-Wei Vivi Chuang (vivi51123@gmail.com)
- Zhengshuang Ren (zhengshuangren@gmail.com)
- Does this report sufficiently describe the motivation of this project?
- Does this report describe when, how, and by whom this research can be used, with examples and scenarios?
- Does the report clearly define a research problem?
- Does the report adequately describe related work?
- Does the report cite and use appropriate references?
- Does the report clearly & adequately present your research approach (algorithm description, pseudo code, etc.)?
- Does the report include justifications for your approach?
- Does this report clarify your evaluation’s objectives (research questions raised by you)?
- Does this report justify why it is worthwhile to answer such research questions?
- Does this report concretely describe what can be measured and compared to existing approaches (if any exist) to answer such research questions?
- Is the evaluation study design (experiments, case studies, and user studies) sound?
- an enhanced program differencing algorithm for Scala
- an evaluation of an automated test coverage collection tool
- an evaluation of Eclipse refactoring engines
- an assessment of false positives in FindBugs
- an evaluation of a code clone detector
- a comparative evaluation of code review tools
- model differencing for LabVIEW models
- a new change impact analysis for UML models
- an assessment of a performance profiling tool
- an empirical study of code clones in OCaml