DSLab (Distributed Systems Lab) is a web tool that students can use to self-assess their assignments automatically, under realistic conditions and in a real deployment, in a Distributed Systems course. The tool has the following characteristics:
- Web-based: students can upload their assignments and access all their submissions and the associated information (source code, assessment, feedback, etc.) through a web interface. Instructors have access to all submissions of all users and their associated information. Since there are thousands of submissions, the tool also lets instructors directly access the best submission of each student.
- Automatic assessment: students upload their source code using the web interface. DSLab then compiles and builds it, and stores both the source code and the built code in the database. Next, several instances of the code are executed, interacting with one another. Finally, the final states of the instances are compared: if they are consistent, the assignment is marked as correct; otherwise, as incorrect.
- Feedback: the tool indicates whether the assignment is correct. If it is not, it reveals which data structures used by the distributed algorithm were inconsistent. In addition, students can access the final state of all data structures of each instance, as well as the execution logs each instance produced while running.
- Interaction of the student’s implementation with the instructor’s correct implementation: half of the computers participating in the assessment run the student’s implementation, while the other half run the instructor’s. This gives more certainty about the correctness of the student’s implementation: not only is the final state consistent among all instances (both those running the student’s implementation and those running the instructor’s), but this final state is also reached after full interaction between student and instructor instances.
- Execution under realistic conditions in a real deployment: the instances of the assignment are deployed on different computers (i.e., a real deployment). In addition, to test assignments under different realistic conditions, during the assessment we force the failure of some computers following different failure patterns.
- Unlimited submissions: students may resubmit their assignment as many times as they wish (i.e., there is no limit on the number of submissions). The final mark is the highest mark obtained across all submissions.
- Ability to run several assignments simultaneously: this is explained in more detail when presenting the dispatcher component of the LSim Framework below.
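The consistency check at the heart of the automatic assessment can be illustrated with a short sketch. This is not DSLab's actual code; the class and method names are assumptions for illustration. Each instance's final state is modeled as a map from data-structure name to its serialized contents, and the assignment passes only if all instances agree; divergent structures are reported back as feedback.

```java
import java.util.*;

// Hedged sketch (not DSLab's actual implementation): compare the final state
// of each instance's data structures; the assignment is correct only if every
// instance agrees. Divergent structure names are returned for feedback.
public class ConsistencyCheck {
    // finalStates: instanceId -> (structureName -> serialized contents)
    public static Set<String> inconsistentStructures(
            Map<String, Map<String, String>> finalStates) {
        Set<String> divergent = new TreeSet<>();
        Map<String, String> reference = null; // first instance acts as reference
        for (Map<String, String> state : finalStates.values()) {
            if (reference == null) { reference = state; continue; }
            for (Map.Entry<String, String> e : reference.entrySet()) {
                if (!Objects.equals(state.get(e.getKey()), e.getValue())) {
                    divergent.add(e.getKey()); // this structure is inconsistent
                }
            }
        }
        return divergent; // empty set => assignment marked as correct
    }

    public static void main(String[] args) {
        Map<String, Map<String, String>> states = new LinkedHashMap<>();
        states.put("instance-1", Map.of("log", "[a,b,c]", "dict", "{x=1}"));
        states.put("instance-2", Map.of("log", "[a,b,c]", "dict", "{x=2}"));
        // "dict" diverges between the two instances, so it is reported
        System.out.println(ConsistencyCheck.inconsistentStructures(states));
    }
}
```

The sketch assumes all instances expose the same set of structure names; the real comparison presumably operates on the states collected by the dispatchers.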
DSLab has three components (Figure below, left side in black):
- Web interface: it constitutes the user interface for both students and instructors. Students can upload the code they have implemented, compile and build it with the rest of the project, and then select the conditions under which the assignment will be executed. Instructors, in addition to this, can access all uploaded projects and submissions of all users and identify the best submission of each student. They can also administer users and groups (add, remove, change password), create a new semester, etc.
- Database: it stores all data generated in the system: for each semester, all students and groups, all uploaded code, and the results of all submissions.
- Front-end: it is the interface in charge of deploying and running the assignments on a set of computers. In our case, we developed a custom tool, called LSim, for testing applications on a set of distributed computers (Figure below, right side in blue).
LSim is the tool we developed to test students’ assignments on a set of distributed computers. LSim consists of the LSim Framework and the LSim Library:
LSim Framework: it is a framework that automatically deploys and runs the application on the resources that will execute the assignment. It is composed of the following elements (Figure above, middle side in blue):
- Launcher: it receives the specification that describes the characteristics of the assignment to be executed (number of instances of each piece of code to be run, initialization parameters, etc.); then, it prepares, deploys, initializes and starts up the assignment.
- Dispatcher: each computer in the computational infrastructure that will run instances of the assignment installs a dispatcher. The dispatcher is responsible for creating the environment in which an instance of the assignment runs and for starting it up. For each execution, the dispatcher creates a new running environment on the local computer, which allows a computer to run several assignments simultaneously without interference. In addition, the dispatcher receives the initialization and coordination information the instance needs, both before and during execution. Finally, it collects the results of the execution and sends them transparently to the component in charge of assessing it. Interactions between the application and the local dispatcher are carried out through the LSim Library (explained below).
- Resource-pool: it is in charge of identifying the location (IP and port) of the available dispatchers. For each execution of an assignment, the Launcher requests from the Resource-pool a set of dispatchers (i.e., a set of distributed computers) that will participate in the execution.
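The Launcher's allocation step described above can be sketched as follows. All names here (Spec, Dispatcher, requestFromResourcePool, allocate) are hypothetical illustrations, not LSim's real API: given an assignment specification, the Launcher obtains that many dispatchers from the Resource-pool and, per the assessment scheme described earlier, splits them between the student's and the instructor's implementations.

```java
import java.util.*;

// Hypothetical sketch of the Launcher's allocation step (names are
// assumptions, not LSim's actual API).
public class LauncherSketch {
    record Spec(int instances, List<String> initParams) {}
    record Dispatcher(String ip, int port) {}

    // Stand-in for the Resource-pool lookup: returns n dispatcher endpoints.
    static List<Dispatcher> requestFromResourcePool(int n) {
        List<Dispatcher> ds = new ArrayList<>();
        for (int i = 0; i < n; i++) ds.add(new Dispatcher("10.0.0." + (i + 1), 5000 + i));
        return ds;
    }

    // Split the requested dispatchers: half run the student's implementation,
    // half the instructor's, as described in the assessment scheme above.
    static Map<String, List<Dispatcher>> allocate(Spec spec) {
        List<Dispatcher> ds = requestFromResourcePool(spec.instances());
        int half = ds.size() / 2;
        return Map.of(
            "student", ds.subList(0, half),
            "instructor", ds.subList(half, ds.size()));
    }

    public static void main(String[] args) {
        Map<String, List<Dispatcher>> plan = allocate(new Spec(4, List.of("--seed=42")));
        System.out.println(plan.get("student").size() + " student instances, "
            + plan.get("instructor").size() + " instructor instances");
        // prints: 2 student instances, 2 instructor instances
    }
}
```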
LSim Library: a library that automates initialization, coordination, collection of results, and collection of log messages. Applications call the LSim Library for each of these functions, which relieves the developer of an application executed with LSim of the burden of implementing the code needed to handle these aspects.
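To make the division of labor concrete, the lifecycle an application delegates to the LSim Library might look like the sketch below. The interface and method names (init, log, sendResult) are assumptions chosen for illustration; they are not the library's documented API.

```java
// Hedged illustration of the lifecycle an application delegates to the LSim
// Library. The names below are assumptions, not the library's real API.
public class LSimLifecycleSketch {
    interface LSimLike {
        void init(String[] params);         // receive initialization parameters from the dispatcher
        void log(String message);           // collect a log message for later inspection
        void sendResult(Object finalState); // ship the final state to the assessment component
    }

    // Toy in-memory implementation that just accumulates messages.
    static class ConsoleLSim implements LSimLike {
        private final StringBuilder collected = new StringBuilder();
        public void init(String[] params) { log("initialized with " + params.length + " params"); }
        public void log(String message) { collected.append(message).append('\n'); }
        public void sendResult(Object finalState) { log("result: " + finalState); }
        String dump() { return collected.toString(); }
    }

    public static void main(String[] args) {
        ConsoleLSim lsim = new ConsoleLSim();
        lsim.init(new String[] {"--nodes=4"});      // initialization
        lsim.log("running distributed algorithm..."); // logging during execution
        lsim.sendResult("{dict={x=1}}");            // report final state for assessment
        System.out.print(lsim.dump());
    }
}
```

The point of the sketch is the separation of concerns: the application only calls these hooks, while deployment, coordination, and result collection are handled behind them by the framework.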
The current version of LSim runs Java programs, but it could easily be adapted to other languages.
Both DSLab and LSim are released under an open source license (GNU GPLv3).
Thanasis Daradoumis, Joan Manuel Marquès Puig, Laura Calvet Liñan, Marta Arguedas (2021). "A distributed systems laboratory that helps students accomplish their assignments through self-regulation of behavior". Educational Technology Research and Development. https://doi.org/10.1007/s11423-021-09975-6
Joan Manuel Marquès Puig, Thanasis Daradoumis, Laura Calvet Liñan, Marta Arguedas (2020). "Fruitful Student Interactions and Perceived Learning Improvement in DSLab: a Dynamic Assessment Tool for Distributed Programming". British Journal of Educational Technology, 51(1), 53–70. https://doi.org/10.1111/bjet.12756
Thanasis Daradoumis, Joan Manuel Marquès Puig, Marta Arguedas, Laura Calvet Liñan (2019). "Analyzing students' and teachers' perceptions to improve the design of an automated assessment tool in online distributed programming". Computers & Education. https://doi.org/10.1016/j.compedu.2018.09.021
Joan Manuel Marquès, Daniel Lázaro, Àngel A. Juan, Xavier Vilajosana, Marc Domingo, Josep Jorba (2013). "PlanetLab@UOC: A Real Lab over the Internet to experiment with Distributed Systems". Computer Applications in Engineering Education, 21(2), 265–275. https://doi.org/10.1002/cae.20468
Joan Manuel Marquès, Àngel A. Juan, Alejandra Pérez, Thanasis Daradoumis, Daniel Lázaro, Rubén Mondéjar (2012). "Using real labs over the Internet in distance-learning courses on distributed systems: students' behavior and feedback". International Journal of Electrical Engineering Education, 49(1), 74–87. https://doi.org/10.7227/IJEEE.49.1.6