Seventh SPLICE Workshop Proceedings

Proceedings Citation

C. Shaffer, P. Brusilovsky, K. Koedinger, and S. Edwards (eds.). Proceedings of the SPLICE 2021 Workshop "CS Education Infrastructure for All III: From Ideas to Practice" at the 52nd ACM Technical Symposium on Computer Science Education (SIGCSE 2021), March 15-16, 2021, Virtual Event.

Organizers
Cliff Shaffer, Virginia Tech
Peter Brusilovsky, University of Pittsburgh
Ken Koedinger, Carnegie Mellon University
Steve Edwards, Virginia Tech

Peer-Reviewed Papers

Title: New Acos Content Types
Authors: Ari Korhonen, Giacomo Mariani, Peter Sormunen, Jan-Mikael Rybicki, Aleksi Lukkarinen, Lassi Haaranen, Artturi Tilanterä and Juha Sorva
Slides: Available Here
Abstract: This paper demonstrates three new content packages recently published for Acos, the server for sharing smart learning content. The packages are aimed at 1) code annotation, 2) automatically assessed tracing exercises, and 3) scripted stepwise animations that walk through explanations of the content. The content can be disseminated to different learning management systems by utilising several learning protocols, such as LTI.

Title: A Proposed Workflow For Version-Controlled Assignment Management
Authors: Bob Edmison, Austin Cory Bart and Stephen Edwards
Abstract: Computer science instructors spend a significant amount of time developing software exercises and projects for their classes. Once these assignments are created, sharing them with colleagues can be another large effort, even at the same institution. As part of an effort to scale course offerings as well as to share content, we propose a solution that leverages two existing technologies, Waltz and PEML, to create a complete workflow that allows an instructor to store programming assignments in a Git repository, define each exercise with PEML, and use Waltz to create the student-facing resources in a learning management system (LMS) and to automate the creation of program assessments in autograders. We also discuss future opportunities to extend this workflow.
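
For illustration, a minimal exercise definition in the spirit of PEML might look like the sketch below. The identifier, title, and instruction text are invented for this example, and the field syntax follows the public PEML documentation at peml.org rather than anything in the paper itself.

    exercise_id: edu.example.cs1.hello-world
    title: Hello, World!

    license.id: cc-sa-4.0

    instructions:----------
    Write a program that prints the message "Hello, World!"
    on a single line of standard output.
    ----------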

Title: Integrating Diverse Learning Tools using the PrairieLearn Platform
Authors: Matthew West, Nathan Walters, Mariana Silva, Timothy Bretl and Craig Zilles
Abstract: In this article, we describe PrairieLearn, a flexible open-source platform for posing questions to students that is in broad use for both homework and exams. We demonstrate PrairieLearn's flexibility and its ability to integrate existing code and questions into a single platform using three case studies: Parsons problems, designing finite-state machines, and self-grading "Explain in plain English" questions. We highlight aspects of PrairieLearn's structure that enable this flexibility, in particular PrairieLearn's ability to execute arbitrary code both when generating a question instance and when grading student answers.
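
To make that last point concrete: a PrairieLearn question can include a server.py whose hook functions run when an instance is generated and when a submission is graded. The sketch below is a simplified illustration in the spirit of that documented interface; the parameter names are invented for this example.

    import random

    def generate(data):
        # Runs when a question instance is created: pick random parameters.
        a = random.randint(2, 9)
        b = random.randint(2, 9)
        data["params"]["a"] = a
        data["params"]["b"] = b
        data["correct_answers"]["product"] = a * b

    def grade(data):
        # Runs on submission: arbitrary Python code computes the score.
        submitted = data["submitted_answers"].get("product")
        correct = data["correct_answers"]["product"]
        data["score"] = 1.0 if submitted == correct else 0.0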

Title: CodeOcean and CodeHarbor: Auto-Grader and Code Repository
Authors: Sebastian Serth, Thomas Staubitz, Ralf Teusner and Christoph Meinel
Slides: Available Here
Abstract: The Hasso Plattner Institute (HPI) has successfully operated a MOOC (Massive Open Online Course) platform since 2012. Since 2013, global enterprises, international organizations, governments, and research projects funded by the German Ministry of Education have partnered with us to operate their own instances of the platform. The focus of our platform instance is on IT topics, including programming courses in different programming languages. An important element of these courses is graded hands-on programming assignments. MOOCs, even more than traditional classroom settings, depend on automated solutions to assess programming exercises; manual evaluation is not an option due to the massive number of users that participate in these courses. The paper at hand presents two of the tools developed in this context at HPI: CodeOcean, an auto-grader for a variety of programming languages, and CodeHarbor, a tool to share auto-gradable programming exercises between various online platforms.
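
Auto-graders of this kind typically score a submission by running instructor-written unit tests against the student's code. A generic Python sketch of such a test file follows; the module and function names are assumptions for illustration, not CodeOcean's actual exercise format.

    import unittest

    from solution import fizzbuzz  # student-submitted module (assumed name)

    class FizzBuzzTest(unittest.TestCase):
        def test_multiple_of_three(self):
            self.assertEqual(fizzbuzz(9), "Fizz")

        def test_multiple_of_five(self):
            self.assertEqual(fizzbuzz(10), "Buzz")

        def test_multiple_of_fifteen(self):
            self.assertEqual(fizzbuzz(15), "FizzBuzz")

    if __name__ == "__main__":
        unittest.main()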

Peer-Reviewed Lightning Talks

Title: Exploring the Complexity of Crowdsourced Programming Assignments
Authors: Nea Pirttinen and Juho Leinonen
Slides: Available Here
Abstract: CrowdSorcerer is a tool with which students can create their own programming assignments according to the teacher's instructions and later review their peers' assignments. In this lightning paper, we take a brief look at the complexity of the assignments that novice programmers create with the tool.
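
The abstract does not name the complexity measure used, but one simple proxy for the complexity of a student-authored assignment's model solution is a cyclomatic-style count of branching constructs. A rough sketch using Python's ast module:

    import ast

    # Node types that introduce a branch or loop in the control flow.
    BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp)

    def branch_count(source: str) -> int:
        """Count branching constructs as a rough complexity proxy."""
        tree = ast.parse(source)
        return sum(isinstance(node, BRANCH_NODES) for node in ast.walk(tree))

    code = "for i in range(3):\n    if i % 2:\n        print(i)"
    print(branch_count(code))  # 2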

Title: Analyzing Fine-Grained Material Usage Behavior
Authors: Charles Koutcheme, Juho Leinonen, Juha Sorva and Arto Hellas
Abstract: Most prior work on log data analysis in introductory programming courses has focused on data gathered from programming environments. While such data can provide insights into students' programming process, it misses other parts of the learning process, such as students' use of e-textbooks and other online learning materials. In this work, we present preliminary results from a fine-grained analysis of learning material use, and we discuss the possibilities that such data offers for studying the learning process of students in introductory programming courses.
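
To illustrate what fine-grained material-usage data can support, the sketch below estimates per-student time on each page from the gaps between consecutive interaction events. The schema (user, page, timestamp) is an assumption for this example, not the authors' actual log format.

    import pandas as pd

    # Assumed schema: one row per material-interaction event.
    events = pd.DataFrame({
        "user": ["s1", "s1", "s1", "s2"],
        "page": ["ch1", "ch1", "ch2", "ch1"],
        "timestamp": pd.to_datetime([
            "2021-03-15 10:00", "2021-03-15 10:04",
            "2021-03-15 10:09", "2021-03-15 11:00",
        ]),
    })

    events = events.sort_values(["user", "timestamp"])
    # Time until the same user's next event, attributed to the current page.
    events["dwell"] = -events.groupby("user")["timestamp"].diff(-1)
    print(events.groupby(["user", "page"])["dwell"].sum())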

Title: Microservices for Score and State Saving of Aggregated Interactive Assignments
Authors: Cay Horstmann
Slides: Available Here
Abstract: This lightning talk presents a set of microservices that enable the inclusion of arbitrary interactive JavaScript elements in assignments, course/lab notes, and textbooks. Implementors of the JavaScript elements add calls to a very simple API. The services take care of aggregation, state saving, and the LTI protocol. A current implementation is the CodeCheck assignment feature that allows the creation of LTI assignments consisting of autograded coding exercises, code tracing exercises, and Parsons problems.
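
As a generic illustration of the state-saving half of such a service (the endpoint names and storage here are invented, not the actual CodeCheck API), a minimal Flask sketch might look like this:

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    STATE = {}  # in-memory; a real service would persist state per LTI user

    @app.post("/state/<user_id>/<element_id>")
    def save_state(user_id, element_id):
        # A JavaScript element POSTs its serialized state and score here.
        STATE[(user_id, element_id)] = request.get_json()
        return jsonify(ok=True)

    @app.get("/state/<user_id>/<element_id>")
    def load_state(user_id, element_id):
        # On page reload, the element restores itself from the saved state.
        return jsonify(STATE.get((user_id, element_id), {}))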

Title: Runestone's Question Bank, Exam Generator, and Log Data
Authors: Barbara Ericson and Bradley Miller
Slides: Available Here
Abstract: Runestone is an open-source platform for interactive ebooks. It serves over 120,000 registered learners and averages 350,000 page views a day. There are ebooks for secondary computer science as well as for undergraduate computing courses: CS1, CS2, data science, web programming, and more. The platform supports executable and editable examples in Python, Java, C, C++, HTML, JavaScript, Processing, and SQL, and it includes code visualizers/steppers for Python, Java, and C++. Runestone supports instructional material (text, videos, and images) as well as typical practice problems with immediate feedback, such as multiple-choice, fill-in-the-blank, and matching questions. It also has unusual features such as audio tours of code, clickable areas, adaptive Parsons problems, and a unique practice tool. This paper highlights the new exam-generation feature, which also allows A/B testing; explains the log data that is collected and available for analysis; and describes plans for future development, including making the smart content reusable.

Title: Integrating a Colony of Code Critiquers into WebTA
Authors: Leo Ureel II
Slides: Available Here
Abstract: WebTA is an LTI-based code critiquer: a system that accepts student code submissions and provides computing students with immediate feedback on their program design. Previously, WebTA was used to critique Java programs in introductory CS courses. Our goal is to extend WebTA with a colony of code critics so that we can use it in our Engineering Fundamentals courses, which teach programming in MATLAB, and later add code critics to support a diverse range of computing and non-computing students in courses that use programming in a variety of languages.
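
As a generic illustration of what a single member of such a colony might look like (an invented rule, sketched in Python for brevity rather than WebTA's Java or MATLAB setting), here is a critic that flags "magic numbers":

    import ast

    def magic_number_critic(source: str) -> list[str]:
        """Flag numeric literals other than 0 and 1 as potential magic numbers."""
        critiques = []
        for node in ast.walk(ast.parse(source)):
            if (isinstance(node, ast.Constant)
                    and isinstance(node.value, (int, float))
                    and node.value not in (0, 1)):
                critiques.append(
                    f"Line {node.lineno}: consider giving {node.value!r} a name"
                )
        return critiques

    print(magic_number_critic("total = hours * 7.25"))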