The cards in an advanced zyLab provide information about the workspace and the work learners perform in it. Click anywhere on a card to expand it, then click "Hide" to collapse it.
Instructors and TAs with the proper permissions will see these cards under each advanced lab in a zyBook.
Template, Model Solution, Test Bench
Lab Statistics
Student Behavior Insights
Similarity Detection
Student Activity
Template, Model Solution, Test Bench
This view of the template, model solution, and test bench is read-only, but you can click the "Edit" button at any time to switch to the edit side.
The template is the workspace presented to learners, along with all of its files and settings. Any changes made to the template after learners have begun working will require those learners to open the settings in the bottom left and reset the lab to the template to receive the updates. Resetting will remove their current work and settings.
The model solution represents the ideal/expected workspace and can be used to confirm your tests work properly. Learn more about model solutions here.
The test bench is where you can create automated and manual tests to check your learners' code. Learn more about the test bench here.
Options contains settings like submission limits and model solution visibility that can be changed on the edit side. Learners will need to reset to the template if you change these settings after they begin working.
Lab Statistics
The lab statistics card contains information about submissions and scoring, and allows downloading workspaces at the time of the best submission or at the submission date specified at the top of the card.
The workspace download can contain all files or only specific files for a smaller, more precise download.
A CSV of submissions tied to IP addresses can also be downloaded by clicking the button in the top right. Student names can be anonymized to ID numbers for presentation purposes.
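If you prefer to anonymize a CSV you have already downloaded, a short script can handle it after the fact. Below is a minimal sketch, not a zyBooks feature; the "Name" column and the filenames are assumptions, so check the header row of your actual download first.

```python
# Minimal sketch: replace student names in the downloaded submissions CSV
# with stable ID numbers. The "Name" column is an assumption; adjust to
# match the actual header row of your download.
import pandas as pd

df = pd.read_csv("submissions.csv")  # the CSV downloaded from lab statistics

# Map each unique name to a numeric ID, then drop the name column.
ids = {name: i + 1 for i, name in enumerate(sorted(df["Name"].unique()))}
df["Student ID"] = df["Name"].map(ids)
df = df.drop(columns=["Name"])

df.to_csv("submissions_anonymized.csv", index=False)
```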
Student Behavior Insights (beta)
Student behavior insights provides class averages for submissions, explore runs, time spent, pasted code, and lines of code. It also surfaces outliers within that data, with additional info like code pasted at submission and our abnormal development metric. Any outlier is indicated by an orange dot to the left of the corresponding metric. Click "Download Outliers Data" to download the complete table of outliers.
Click the edit button next to "Analyzed Files" to change the files being analyzed for student insights. Then confirm by clicking "Analyze".
Activity outliers indicate significant deviations from the class average. Each column is explained below:
- Submissions: Total number of submissions to the autograder.
- Explore runs: Number of code executions during development in the lab.
- Time spent: Total time spent on the lab. Outliers are identified using the interquartile range method (see the sketch below).
- Code pasted: Percentage of characters pasted into the lab throughout the session. Outliers have a paste ratio greater than 50%.
- Code pasted at "first highest" submission: Percentage of characters pasted into the lab at the earliest submission with the best score. Outliers have a paste ratio greater than 50%. This metric accounts for cases where students modify their code after achieving full points.
- Abnormal Development (coming very soon): A beta feature designed to detect potentially unnatural code development behaviors, especially manual copying from external sources. Displayed as a value from 0 (normal behavior) to 1 (very abnormal behavior). Each of the following metrics has its own threshold and its own weight, listed in order of weight below.
- Thinking pauses: Gaps in activity while working on the lab.
- Switching frequency: How often students alternate between new code development and modifying existing code.
- Linear development rate: The extent of continuous linear coding without revisions.
- Max Score: Maximum score achieved.
Abnormal Development is an experimental feature, and we encourage instructors to use it cautiously. The goal was to minimize false positives; however, some confident or advanced students may naturally exhibit higher scores. Learners are flagged as outliers when their score exceeds 0.85. If instructors observe unexpected patterns or excessive false positives, feedback is strongly encouraged to refine the metric by emailing us at beta@zybooks.com. Until further validation, this metric should be used as a supplementary tool alongside other insights.
In the example above, the learner's code is 43% below the threshold for what we consider a "normal" amount of directional changes in their code writing, 100% below the threshold we consider a normal amount of pauses, and 13% above our threshold for linearity. We expect to see more pauses and directional changes, but ideally less linearity.
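To make the thresholds above concrete, here is a purely illustrative sketch of these outlier checks. It is not zyBooks' implementation: the sample data, field names, sub-metric values, and weights are invented, and only the interquartile range method, the 50% paste-ratio threshold, and the 0.85 abnormal development cutoff come from the descriptions above.

```python
# Illustrative outlier checks; all data and weights below are invented.
import statistics

def iqr_outliers(values):
    """Flag values outside 1.5 * IQR (the interquartile range method)."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return [v < q1 - 1.5 * iqr or v > q3 + 1.5 * iqr for v in values]

# Hypothetical per-student data: minutes spent and fraction of pasted characters.
time_spent  = [40, 45, 50, 52, 55, 58, 60, 400]
paste_ratio = [0.05, 0.10, 0.62, 0.00, 0.20, 0.15, 0.08, 0.03]

for i, is_outlier in enumerate(iqr_outliers(time_spent)):
    if is_outlier:
        print(f"student {i}: time-spent outlier ({time_spent[i]} min)")
    if paste_ratio[i] > 0.50:  # the 50% paste-ratio threshold
        print(f"student {i}: code-pasted outlier ({paste_ratio[i]:.0%})")

# A weighted combination of sub-metrics into a 0-1 abnormality score; the
# real weights and sub-metric scaling are not published.
def abnormal_development(pauses, switching, linearity, max_score):
    weights = (0.4, 0.3, 0.2, 0.1)  # in order of weight, as in the list above
    metrics = (pauses, switching, linearity, max_score)
    return sum(w * m for w, m in zip(weights, metrics))

score = abnormal_development(pauses=0.9, switching=0.95, linearity=0.8, max_score=1.0)
print(round(score, 3), "-> outlier" if score > 0.85 else "-> normal")
```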
This data is only an indicator of outliers worth investigating. For instance, a learner pasting their own unique work, such as copying one of their own functions from one location to another, would still show up in the outliers. Click on any learner to open them in Student Activity, and click History for a more granular view of their work.
Clicking any run or submission in their coding trail will jump to that point in History, even if their history is not open.
See the student activity section below for more info.
Similarity Detection
Similarity detection is an easy way to compare students' latest submissions, best submissions, or current workspaces, and determine whether the amount of similarity is acceptable. Click "Add known solution" to also compare the model solution with their submissions; another solution can be added in place of the model after checking the box. "Anonymize names" converts student names to ID numbers for presentation purposes.
Note that the comparison is performed on the default file; additional files can be added or removed. Excluding files is necessary for any files that learners are not meant to edit, because their contents are the same among all users.
After selecting best submissions or current workspaces to run the similarity check, or after opening a previous check, the comparison will open. The top of the comparison lists pairs of submitters along with their percentage of similarity.
The file comparison along the bottom of the screen will show a visual comparison of the submitters currently selected.
The similarity checker defaults to the entire class and 90% or more similarity. Select any percentage from the dropdown menu, or enter a custom amount to show submission pairs that are at least that similar. Sections and individuals can also be selected for comparison. Individual selection will automatically default to 10% or more similarity, but this can also be changed using the dropdown.
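For intuition on what a similarity percentage and threshold mean, the sketch below compares hypothetical submissions pairwise and keeps only pairs at or above a cutoff. It uses Python's difflib as a stand-in measure; zyBooks' actual similarity algorithm is not published, so treat this purely as an illustration.

```python
# Illustrative pairwise similarity with a threshold; difflib is a stand-in
# for zyBooks' unpublished similarity measure.
from difflib import SequenceMatcher
from itertools import combinations

submissions = {  # hypothetical student -> contents of the compared file
    "alice": "def area(r):\n    return 3.14159 * r * r\n",
    "bob":   "def area(rad):\n    return 3.14159 * rad * rad\n",
    "cara":  "import math\n\ndef area(r):\n    return math.pi * r ** 2\n",
}

THRESHOLD = 0.90  # the default: pairs with 90% or more similarity

for (a, code_a), (b, code_b) in combinations(submissions.items(), 2):
    similarity = SequenceMatcher(None, code_a, code_b).ratio()
    if similarity >= THRESHOLD:
        print(f"{a} / {b}: {similarity:.0%} similar")
```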
Click "Download sets" to download the similarity data from the table, or click "Download submission files" at the bottom to download the workspaces of the two learners currently selected.
Student Activity
Student activity displays the scoring results for all learners, along with a coding trail that shows the date of the first submission up front. After the date is an indicator for the day, a "-" for each development run, and a score for each submission. The total time spent in the workspace is shown at the end of the coding trail. In the top right of student activity, you can download the data, including test results.
Learn more about coding trails here.
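As a reading aid, the sketch below renders a made-up coding trail in the notation just described: the date up front, a "-" for each development run, a score for each submission, and the total time at the end. The event data is invented, and day indicators are omitted to keep the example short.

```python
# Render an invented coding trail; day indicators are omitted for brevity.
events = [
    ("run", None), ("run", None), ("submit", 4),
    ("run", None), ("submit", 10),
]

trail = "3/14"                 # date of the first submission, up front
for kind, score in events:
    trail += " -" if kind == "run" else f" {score}"
trail += "  (32 min)"          # total time spent in the workspace

print(trail)  # 3/14 - - 4 - 10  (32 min)
```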
Instructors and TAs can open any learner's workspace from student activity to code directly in those workspaces or add comments. Learners can also comment on their own code. To add a comment, highlight a line or block, right-click, and select Add Comment. This can be a great way to aid students who might be having difficulty. Comments can be deleted at any time from the same context menu.
Keep in mind that History is read-only, so no code or comments can be added while it is open.
Sorting and filtering student activity
Student activity defaults to sorting by name, but a number of other options are available. Score is useful for surfacing learners who may be having trouble, and "needs grading" is a great option if you're using manually graded tests.
Additionally, learners can be filtered by section or by individual learners/groups, or sorted by submission state.
Click on any learner to open their current workspace, and open History in the top right for a more granular look at their work. You can also click on any moment in the coding trail, whether the learner's workspace is open or not, to open the workspace to that point in their history. Clicking the 10 in the coding trail below opens the workspace to the state when it earned 10 points.
Learn more about History here.
Scoring for any submission is displayed at the bottom of the student activity card and is expandable to view the complete results. There are also buttons here to navigate to the previous and next students. These buttons are keyboard-focusable: highlight them using Tab/Shift+Tab and activate them with Enter.
That's it!