Many instructors ask what we do to reduce cheating on programming tasks. We take a multi-faceted approach:
- zyLabs
- Similarity checker -- For most zyLabs languages, instructors can run our built-in similarity checker to detect submitted programs that are very similar. Similarities are highlighted. The tool is similar to Stanford's MOSS but far easier to use because it is built right in, and it has been optimized for speed. (A rough sketch of this style of token-based comparison appears after this list.)
- Modification of ZMLs -- Many instructors use our "zyBooks Maintained Labs" (ZMLs). We make changes to many ZMLs each year or even each term, causing previous-term solutions to fail (and be easily noticeable).
- Signatures -- We create a concise visual signature of programming effort, visible to students and instructors. Instructors can readily notice unusual signatures (like just one submission yielding full credit). For best results, we recommend requiring the zyLabs development environment, where students code right in the zyBook, or at least requiring frequent submissions to the auto-grader, such as after every 20 minutes of programming or every 20 lines of code. (A hypothetical sketch of such a signature appears after this list.)
- Analytics -- Instructors can now see how much time a student spent programming in zyLabs' development environment.
- IP logging -- Instructors can see a log of IP addresses for all of a student's submissions, which can be helpful in cases where someone else is doing some of a student's work.
- And a secret technique -- We have a technique that can let us know exactly which student has posted one of our problems to those websites that specialize in solutions. We can't say more, since we don't want students to learn the technique and look for workarounds, but we are exploring what to do when we detect such cases.
- Other
- Auto-generated CAs -- Many of our challenge activities (CAs) are auto-generated, so each student gets a unique problem that requires a unique solution. (A sketch of this kind of per-student generation appears after this list.)
- Scaffolding -- Our material is designed for students to learn incrementally: Participation Activities give an initial chance to practice with feedback, Challenge Activities provide further practice, and Lab Activities offer the greatest opportunity to learn (with ZMLs themselves ranging from easier to moderate to harder).
- DMCA takedown notices -- We dedicate time to scouring the web and submitting takedown notices where appropriate, especially from one well-known .com site whose name we won't mention. We've successfully had many thousands of items taken down, but of course new ones pop up. We continue that battle.
- Time respect / clarity -- Much cheating occurs when students can't understand a task, or find it is taking far longer than it should. We obsess over creating clear activities of appropriate length, paying attention to student feedback and activity data and continually improving in response.
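To make a few of the items above more concrete, here are some rough sketches in Python. First, the similarity checker: the sketch below is not our actual implementation, just a minimal illustration of the general MOSS-style idea of comparing token n-gram fingerprints (with identifiers normalized so that renaming variables doesn't hide copying) via Jaccard similarity. The function names and the toy submissions are invented for the example.

```python
import keyword
import re

def fingerprints(source: str, n: int = 5) -> set:
    """Token n-gram fingerprints of a program. Identifiers are normalized to a
    placeholder so that simply renaming variables does not hide copying."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|\S", source)
    normalized = [
        "ID" if re.fullmatch(r"[A-Za-z_]\w*", t) and not keyword.iskeyword(t) else t
        for t in tokens
    ]
    count = max(len(normalized) - n + 1, 1)
    return {tuple(normalized[i:i + n]) for i in range(count)}

def similarity(code_a: str, code_b: str, n: int = 5) -> float:
    """Jaccard similarity (0.0 to 1.0) of the two programs' fingerprint sets."""
    a, b = fingerprints(code_a, n), fingerprints(code_b, n)
    return len(a & b) / len(a | b) if (a or b) else 0.0

# Two submissions that differ only in variable names still score very high,
# because their normalized token structure is the same.
sub_1 = "total = 0\nfor x in values:\n    total += x\nprint(total)"
sub_2 = "s = 0\nfor v in values:\n    s += v\nprint(s)"
print(f"Similarity: {similarity(sub_1, sub_2):.2f}")  # 1.00 for this toy pair
```

Production tools add per-language tokenization, fingerprint selection (e.g., winnowing), and highlighting of the matched regions, but the core idea of comparing structural fingerprints is the same.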
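Next, signatures. The signature shown in zyLabs is richer than this, but as a hypothetical sketch of the underlying idea, the code below reduces a submission history to one character per submission and flags the "one submission yielding full credit" pattern mentioned above. The `Submission` type, the thresholds, and the sample data are all invented for the illustration.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Submission:
    when: datetime   # time the student submitted to the auto-grader
    score: int       # points earned on that submission

def effort_signature(submissions: list[Submission], max_points: int) -> str:
    """One character per submission, in time order: '.' = low score,
    '+' = partial credit, '*' = full credit."""
    marks = []
    for s in sorted(submissions, key=lambda sub: sub.when):
        frac = s.score / max_points
        marks.append("*" if frac == 1.0 else "+" if frac >= 0.5 else ".")
    return "".join(marks) or "(no submissions)"

def looks_unusual(submissions: list[Submission], max_points: int) -> bool:
    """Flag the pattern mentioned above: a single submission earning full credit."""
    return len(submissions) == 1 and submissions[0].score == max_points

# A typical incremental effort trace versus a one-shot perfect score.
start = datetime(2024, 3, 1, 9, 0)
steady = [Submission(start + timedelta(minutes=20 * i), pts)
          for i, pts in enumerate([2, 4, 4, 7, 9, 10])]
one_shot = [Submission(start, 10)]

print(effort_signature(steady, 10), looks_unusual(steady, 10))      # ...++* False
print(effort_signature(one_shot, 10), looks_unusual(one_shot, 10))  # * True
```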
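Finally, auto-generated challenge activities. Again as a sketch of the general idea rather than a description of our real generator: seeding a random generator from the student and activity IDs gives each student a stable, personal variant of the problem, with a correspondingly different expected answer. The IDs and the toy prompt are hypothetical.

```python
import hashlib
import random

def generate_challenge(student_id: str, activity_id: str) -> dict:
    """Build a per-student variant of a toy challenge activity. Seeding the
    random generator from the student and activity IDs makes the variant
    stable for that student but (almost always) different across students."""
    digest = hashlib.sha256(f"{activity_id}:{student_id}".encode()).digest()
    rng = random.Random(int.from_bytes(digest[:8], "big"))
    low = rng.randint(1, 20)
    high = rng.randint(30, 60)
    prompt = (f"Write a loop that prints the integers from {low} to {high} "
              "inclusive, one per line.")
    expected_output = "\n".join(str(i) for i in range(low, high + 1))
    return {"prompt": prompt, "expected_output": expected_output}

# Two students get different variants of the same activity, and each student
# sees the same variant every time they open it.
print(generate_challenge("student_a", "ch5_loops_1")["prompt"])
print(generate_challenge("student_b", "ch5_loops_1")["prompt"])
```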
We encourage instructors to consider a few things:
- Why cheating occurs: Few students take a course planning to cheat. Many factors are known to push students toward cheating, such as a task that is too big a step, getting stuck with no legitimate means of getting help, a workload so excessive that it is deemed unreasonable, and more. None of this justifies the cheating, but instructors may wish to think about how a class's setup can encourage or discourage cheating.
- Students and decisions: Young students' brains are still developing until about age 25, including the prefrontal cortex, which handles decision making and planning. Instructors may wish to have appropriate expectations of these young students, and design a course where the temptation to cheat is minimized.
- Working together: Pair programming, peer instruction, and other work-together approaches have been shown to improve learning and reduce attrition. We highly recommend making use of such techniques, especially in early CS courses. Students have many years to learn to develop independently.
- Cheating prevention: We encourage instructors to adopt a mindset of cheating prevention rather than cheating detection, which not only keeps students on track, but is more fun for instructors. We encourage instructors to:
- make fun use of signatures and the similarity checker during normal class time, such as to see how students are doing and where they are struggling, or to see who submitted similar programs to a low-stakes quiz question. Students then see what tools instructors have to detect cheating as well.
- assign many small programs instead of one large program in intro classes, an approach shown to yield better learning, less need for help, and happier students.
- use "thresholds" or "drop the lowest X labs" policies.
- discuss academic integrity, not on the first day when nobody is considering cheating, but later when programs are getting harder.
- let students work together, under some constraints of course.
- provide lots of help, whether via discussion forums, office hours, mentors, etc.
We are actively developing more techniques to detect, and even more importantly to prevent, cheating. Signatures are being enriched with more information, similarity checking will detect rings of students, more auto-generated coding CAs are being developed, zyLabs may soon have auto-generation capabilities, and much more is on the way. Ultimately, we hope that our content will be so well structured, with incremental steps and strong feedback, and that students will so clearly see that instructors can view their effort and their code's similarity to others', that most students will simply do the expected work, have a great time learning to program, and feel proud of their accomplishments at the end of the day.