Moodle Analytics as Course Evaluation

Posted on June 23rd, 2011

What relationship exists between student activity on Moodle and final grades in a course? I downloaded all user activity for my Spring 2011 General Psychology course (68 students across 2 sections) and made a pivot table in Excel to summarize the activity by student and activity type.
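I did this tally in Excel, but the same student-by-activity summary can be sketched in a few lines of pandas. The activity log below is made up for illustration; a real Moodle export would have many more columns.

```python
import pandas as pd

# Hypothetical Moodle activity log: one row per logged action.
log = pd.DataFrame({
    "student": ["alice", "alice", "bob", "bob", "bob", "carol"],
    "activity": ["quiz view", "resource view", "quiz view",
                 "forum view", "resource view", "quiz view"],
})

# Count actions per student and activity type, like an Excel pivot table.
counts = pd.crosstab(log["student"], log["activity"])
print(counts)
```

Each row of `counts` is one student, each column one activity type, and each cell the number of logged actions.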

On average, each student viewed the course 114 times (Range: 39 to 269). Not surprisingly, the more often students visited Moodle, the higher their final grade (r = .25, p = .04). However, when you visit a course page, there are many things you could be doing. If you always went there to take a required quiz, it's not surprising that visiting frequently goes along with a higher final grade. Therefore, I focused on the viewing of Resources (e.g., Class Schedule, Syllabus, and Contact Information) that aren't part of the final grade. Viewing the Class Schedule and the Syllabus were not significantly correlated with the final grade. Even viewing the Review Sheets for exams did not predict a student's final grade; however, viewing the professor's calendar and contact information did (r = .29, p = .04).
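The correlations above are ordinary Pearson correlations between per-student counts and final grades. A minimal sketch with SciPy, using made-up numbers (the real analysis used 68 students):

```python
from scipy.stats import pearsonr

# Hypothetical per-student data: total course views and final grade (%).
views  = [39, 60, 85, 114, 150, 200, 269]
grades = [68, 72, 75, 80, 83, 87, 92]

r, p = pearsonr(views, grades)
print(f"r = {r:.2f}, p = {p:.3f}")
```

With a real roster of 68 students the same two-line call gives the r and p values reported in the post.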

One of the assignments that I gave this semester asked students to conduct a Knowledge Quest: throughout the semester, they chose a topic to learn about in depth and created a webpage summarizing what they learned. Their webpage was part of a larger website for the class where students could view each other's topics. There was a link to this website on our Moodle course page. On average, students visited this website 18 times (Range: 0-75)*, and students who visited more often received higher final grades (r = .32, p = .01). It's possible that students who visited the website were actually working on their webpage; therefore, this positive correlation could just reflect that students who complete assignments do well overall. Yet, visiting the website still accounted for approximately 8% of the variability in the final grade after controlling for the grade earned on the Knowledge Quest assignment, suggesting that students were reading topics besides their own and that this helped.
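"Controlling for" the Knowledge Quest grade can be done as a partial correlation: regress both visits and final grade on the assignment grade, then correlate the residuals. The squared partial correlation is the share of remaining variability explained (the ~8% above). A sketch with NumPy/SciPy and invented data:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-student data (real study: n = 68).
visits   = np.array([ 2,  5, 10, 18, 20, 25, 30, 40])  # website visits
kq_grade = np.array([65, 70, 75, 85, 80, 95, 90, 100]) # Knowledge Quest grade
final    = np.array([64, 68, 74, 83, 82, 92, 90,  98]) # final grade

def residuals(y, x):
    """Residuals of y after a simple linear regression on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation of visits and final grade, controlling for KQ grade.
r_partial, p = pearsonr(residuals(visits, kq_grade),
                        residuals(final, kq_grade))
print(f"partial r = {r_partial:.2f}, variance explained = {r_partial**2:.1%}")
```

A partial r of about .28 would correspond to the roughly 8% of variability mentioned in the post.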

For each of the 4 sections of the course, students also wrote a forum post that made a connection between course material and something from their daily lives, such as a movie, a news article, or an interaction in the residence halls. Making the post was part of the final grade, but reading the posts from other students was optional. However, my hope was that reading other students' posts would help students integrate new information into their existing knowledge and aid memory. Students did read each other's posts. On average, students read 22 posts by other students (Range: 1-122), yet there was no correlation between reading these posts and their semester grade (r = -.074, p = .55). The level of analysis here may be wrong. Possibly I should have looked at the correlation between reading the forum posts for one course section and the exam performance for that specific section rather than reading all forum posts and the final grade.
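That section-level follow-up would just be the same correlation computed within each group. A pandas sketch, again with made-up data and only two sections for brevity:

```python
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical data: posts read and exam score, by course section.
df = pd.DataFrame({
    "section":    ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "posts_read": [  1,   5,  10,  20,  30,   2,   8,  15,  25,  40],
    "exam":       [ 60,  70,  75,  85,  90,  80,  70,  75,  65,  60],
})

# Correlate posts read with exam score separately within each section.
results = {sec: pearsonr(grp["posts_read"], grp["exam"])[0]
           for sec, grp in df.groupby("section")}
print(results)
```

With the invented numbers above, section A shows a positive within-section correlation and section B a negative one, which is exactly the kind of difference a whole-course analysis would wash out.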

What to make of this? Well, I was mainly just curious, but I think that Moodle analytics could be used as an additional piece of evidence when faculty members are considering changes to their courses. For example, if my student evaluations indicate that students don’t find the forum posts helpful for learning, then this in conjunction with the lack of a positive correlation would suggest the need for a different assignment.

*All of these counts are approximations because a student who bookmarked the Knowledge Quest website could go to it directly without using Moodle. Similarly, paper versions of the syllabus (and other resources) were usually distributed in class, so using the electronic version wasn't required to get the information.

