Feedback on the first French xMOOC: Fundamentals of Project Management

According to the New York Times, 2012 was “The Year of the MOOC”. The popularity of MOOCs, or Massive Open Online Courses, has spread extremely fast during the past twelve months. In less than a year, dozens of the most prestigious universities have joined Coursera, the main MOOC provider; some of these courses have gathered tens of thousands of students, and more than three million people have registered on Coursera so far.

MOOCs started in Canada in 2008. They initially took their inspiration from connectivism, a learning theory stating that the learning process should be decentralized. This type of MOOC is now often referred to as a cMOOC: the instructor acts as a facilitator of interactions between students rather than as a transmitter of knowledge. These courses remained fairly marginal until Stanford University launched two courses dealing with artificial intelligence in November 2011, in which over one hundred thousand students enrolled. The MOOC wave then spread very quickly, reaching first the prestigious universities of the Ivy League and going worldwide by 2013. The first MOOCs were computer science courses, but they now encompass a wide range of topics including physics, medicine, biology and philosophy. The teaching method used in these recent MOOCs is quite similar to the traditional top-down approach of most college classes; such courses are often called xMOOCs, to distinguish them from cMOOCs, and assignments and exams play a key role in them. Most are taught in English, although the movement is slowly gaining momentum in the French-speaking world.

So far, research has focused on cMOOCs, and despite the increasing amount of data, little is known about the underlying mechanisms of xMOOCs. Here we report on some results from the first French xMOOC, ABC de la Gestion de Projet (Fundamentals of Project Management). This course was set up by Remi Bachelet, Professor at the Graduate Engineering School Centrale Lille, along with a team of academic and student volunteers, and ran from March 18th to April 21st. Peer assessment, a familiar approach in xMOOCs, was used extensively to grade students’ assignments. Peer assessment is one of the major challenges of this new teaching approach, since it is one of the few ways to scale up the evaluation process when automated evaluation does not apply. Moreover, it is said to be a very effective teaching approach, and some studies have shown a high degree of correlation between peer and teacher assessments. Nevertheless, most of the research on peer assessment has been carried out on small numbers of students, and it is not clear how relevant and efficient this practice is in the context of a MOOC. First, how reliable is the assessment process? Does reproducibility depend on the type of assignment, or on the instructions given? Is it possible to detect categories of students, and categories of assessors, through peer assessment? Is there a link between a student’s behavior as an assessor and the marks they receive through peer evaluation?

Methods

Attendee enrollment and early organization of the MOOC

Students were enrolled from January 10th to March 21st, 2013. From January 10th to March 2nd, students were recruited via Bachelet’s OpenCourseWare website http://gestiondeprojet.pm and redirected towards a Google Group forum. In addition to gestiondeprojet.pm, social networks such as Twitter, Google+ and Facebook were used to reach students. Some newspaper blogs referred to the MOOC a day before the registration deadline (Figaro l’Etudiant, 20/03/2013). Enrollment stopped on March 20th with 3596 students. The efficiency of these different enrollment methods was monitored whenever possible through Google Analytics or through click analytics services such as goo.gl or bit.ly. On the Google Group, students were free to introduce themselves, discuss their motivations, and so on. According to R. Bachelet, this pre-MOOC phase was meant to get students used to the functioning of the MOOC before it was launched. From March 2nd onwards, students were led to Canvas.net, the platform that hosted the MOOC. The Google Group was not closed during the MOOC, but neither was it officially declared shut down. The course officially began on March 18th.

Sequence of the course on Canvas and on social networks

Canvas is an open source platform launched in 2011 by Instructure, a US company founded in 2008. It is used as a Learning Management System by several American universities. It was originally designed for courses with a few dozen students, a few hundred at most. In August 2012, Instructure decided to join the movement and to develop and host MOOCs on the Canvas LMS; the catalog of MOOC courses is called Canvas.net. The first MOOCs started on Canvas in January 2013. Unlike other MOOC hosting platforms such as edX and Coursera, the number of participants is limited on Canvas due to technical constraints, with a theoretical upper limit of 10,000 participants. Videos were hosted on YouTube, viewed within the platform, and downloadable through Canvas. In total, there were 32 videos spread over four weeks, 5 to 10 minutes each on average, for a total of 5 hours. A forum was opened on Canvas, and communities were created on the social networks Google+ and Facebook for more informal discussions in addition to the forum; 388 and 436 students respectively had joined these communities by April 21st. Both the forum and the social networks were moderated and closely followed by the MOOC staff. On Twitter, the corresponding hashtag was #MOOCGdP.

Quizzes, assignments, exams, certification

Each video was followed by a quiz in order to check whether its content had been seen and understood. Answers could be submitted three times; only the best mark was taken into account. The final exam was open from April 14th to 21st. It consisted of 45 questions; there was no time limit, but only one submission was possible. There were three types of certificates: a basic one, an advanced one and a team certificate. Of the 3596 enrolled students, 618? took part in the advanced certificate. Successful completion of the quizzes and the exam was sufficient to get the basic certificate: 280 points out of 400 were needed to pass it. To obtain the advanced certificate, students were required to submit at least three of the four assignments that were given, and to reach a score of 560 out of 800 points. 200 points could be obtained from quizzes, 200 from the exam, and 400 through assignments, each assignment bringing 100 points. The assignments were centered on a case study: the launch of a new type of biscuit by a company. The first assignment was to draw up the project plan as a concept map, using Visual Understanding Environment or any other open source application. For the second assignment, students had to submit a discounted cash flow analysis using PowerPoint. The third consisted of writing concise minutes of a project meeting, using a standardized form. For the last assignment, students had to illustrate a project schedule by means of a Gantt chart. Each assignment was due on a Saturday at midnight CEST (March 23rd, March 30th, April 6th and April 13th respectively). The day following the submission, each student was asked to evaluate four assignments. Only students who had submitted an assignment were allowed to evaluate their peers. There was no time limit for this evaluation process, nor did students gain points by joining the peer assessment process. Students were given an assessment grid, available before the submission, as well as instructions. For the concept map, a guide was designed to help students evaluate their peers; for the discounted cash flow, a correct version of the calculation was provided; for the meeting minutes, there was only the assessment grid; and for the Gantt chart, a correct version and an evaluation guide were available. The final mark was decided by the MOOC team on the basis of these peer evaluations.
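To make the grading scheme concrete, here is a minimal sketch in R of the certification rules described above. The function and variable names are ours, purely for illustration; they do not come from the MOOC’s own tooling.

basic_certificate <- function(quiz_points, exam_points) {
  # Basic certificate: quizzes (200 pts max) + exam (200 pts max), 280/400 needed
  (quiz_points + exam_points) >= 280
}

advanced_certificate <- function(quiz_points, exam_points, assignment_points) {
  # Advanced certificate: at least three of the four assignments submitted
  # (100 pts each, NA = not submitted) and 560/800 points overall
  submitted <- sum(!is.na(assignment_points))
  total <- quiz_points + exam_points + sum(assignment_points, na.rm = TRUE)
  submitted >= 3 && total >= 560
}

# Example: 170 quiz points, 160 exam points, three assignments handed in
advanced_certificate(170, 160, c(90, 85, 80, NA))   # TRUE: 585 points, 3 submissions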

Data

The only data directly available from Canvas are the gradebook and the answers to quizzes and surveys. Student activity reports are available, providing data on pages visited, quizzes submitted and corresponding timestamps, but they are not downloadable. These reports are not a log file, since only the last timestamp is available; it is not possible to save timestamps for all events. A Ruby script was written by the MOOC team to download these reports in JSON. An R script was then used to extract the required data and build a database. Regarding peer evaluation, a similar process was used to obtain data on marks, comments and timestamps. In addition to these data, students were asked to fill in three surveys over the duration of the course. The first survey was posted on March 18th, the mid-term survey on April 7th and the last one on April 21st. An additional survey was posted only for those who had taken part in the advanced certificate. These surveys provided data on gender, the sociology of the MOOC population and motivations, among other things.
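As an illustration of this extraction step, here is a minimal R sketch that flattens such JSON reports into a single data frame. The jsonlite package and the field names user_id, page and last_access are assumptions for the example, since the exact structure of the Canvas reports is not reproduced here.

library(jsonlite)   # fromJSON() parses the downloaded JSON reports

files <- list.files("activity_reports", pattern = "\\.json$", full.names = TRUE)

# One small data frame per student report, then stacked into a single table
reports <- lapply(files, function(f) {
  report <- fromJSON(f)
  data.frame(user_id     = report$user_id,
             page        = report$page,
             last_access = report$last_access,
             stringsAsFactors = FALSE)
})
activity <- do.call(rbind, reports)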

Statistical Analysis

After anonymisation, the data were analyzed with the open source statistical software R 2.12. The linear model was used whenever possible; otherwise non-parametric tests were used. In the following analysis, we refer to the mean of the marks given by the four peers as the average mark of the user. The corresponding coefficient of variation is the standard deviation of these marks divided by the average mark. It reflects the consensus around a given mark: the higher this coefficient, the lower the consensus. In order to evaluate behavior as an assessor, we calculated a severity index for each evaluation: the difference between the mark given by the assessor and the average mark of the artifact, where the average mark of the artifact is the mean of the marks the artifact received (four in general). This index is negative when the assessor gives a mark below the artifact’s average; an assessor who systematically gives low marks will therefore have a negative index on average. The number of evaluations done for a given assignment and the timestamps were available for each evaluation. From these timestamps, it is possible to compute the time between two submissions, which represents an estimate of the maximum time spent evaluating an artifact. In order to detect categories among students, the k-means clustering algorithm was used. Based on input data such as the severity index, the average mark of the user or other variables, and after parametrization, it computes clusters. This algorithm does not work with missing values, which were therefore excluded from the analysis.
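The following R sketch summarizes these computations on a hypothetical data frame called marks, with one row per evaluation and columns assessor, author and mark, restricted to a single assignment so that each author corresponds to one artifact. None of these names, nor the choice of three clusters, come from the actual MOOC database; they are illustrative assumptions.

# Average mark of each user: mean of the marks their artifact received
avg_received <- tapply(marks$mark, marks$author, mean)
# Coefficient of variation: standard deviation / mean; lower = more consensus
cv_received <- tapply(marks$mark, marks$author, sd) / avg_received

# Severity index of one evaluation: mark given minus the artifact's average
marks$severity <- marks$mark - avg_received[as.character(marks$author)]
# Mean severity per assessor: negative = systematically severe
severity_index <- tapply(marks$severity, marks$assessor, mean)

# Assessor-level features for clustering; k-means cannot handle missing
# values, so incomplete rows are dropped, as in the study
students <- intersect(names(avg_received), names(severity_index))
features <- na.omit(data.frame(avg_mark = avg_received[students],
                               severity = severity_index[students]))

set.seed(1)                               # reproducible cluster assignment
clusters <- kmeans(scale(features), centers = 3)
table(clusters$cluster)                   # size of each detected category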

Results

Out of the 3596 participants enrolled, 481 submitted assignment 1 (the concept map) and 526 submitted assignment 3 (the meeting minutes). The majority of those taking part in peer evaluation were educated at least to post-graduate?? level (Bac+5). According to the demographics questionnaire, this bracket represents 61.7% of all MOOC participants (all certificates) and 70% of the participants in the advanced certificate.
