
ANOTHER F@£KING pie chart?!
Lewis Carr discusses the art of meaningful data in your LMS.
All LMS platforms worth their salt offer some kind of reporting tool. Some include dashboards, others include full-blown analytics. With all these tools at our fingertips, it’s tempting to fall into the trap of creating a report for everything and building funky widget-driven dashboards, each with exploding pie charts, sexy data tables and graphs you didn’t even know were graphs!
And because everyone loves a graph, you’re always being asked for “yet another report”. But just because a graph is really pretty doesn’t mean it’s really useful. In the movie Jurassic Park, they’re so busy figuring out whether they could create dinosaurs that they forget to ask whether they should.
So let’s look more closely at the story the data tells, and at why understanding your data is crucial before you start playing with those fancy reporting tools.
Beautiful but Meaningless Data
So, you’ve just generated a stunning 3D pie chart that shows learner engagement across all your courses. It’s colour-coded and interactive. When you click it, it explodes into a heat map or scatter diagram. It is the Inception of dashboards: a dashboard within a dashboard, within a dashboard. But even Christopher Nolan’s head would spin trying to make sense of it, because you have no idea what it actually means.
Understanding the Data behind the LMS
Every piece of data in your LMS represents something in the real world of learning. For instance, when your platform says a learner is “active,” what does that really mean? Is it based on logins? Course progress? Discussion participation? And over what time period is that activity measured?
These aren’t just pedantic questions. They’re crucial for interpreting your data correctly. For example, if you’re reporting on course completion rates, but your definition of “enrolled” includes learners who signed up but never started the course, your numbers could be seriously skewed.
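To make that concrete, here’s a minimal sketch in Python. The field names and toy records are invented for illustration (this isn’t any particular LMS’s schema), but it shows how the same four learners produce two very different completion rates depending on how you define “enrolled”:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Enrolment:
    learner_id: int
    started_at: Optional[datetime]    # None = signed up but never opened the course
    completed_at: Optional[datetime]  # None = never finished

def completion_rate(enrolments, include_never_started=True):
    """Completion rate under two different definitions of 'enrolled'."""
    cohort = enrolments if include_never_started else [
        e for e in enrolments if e.started_at is not None
    ]
    if not cohort:
        return 0.0
    return sum(1 for e in cohort if e.completed_at) / len(cohort)

# Toy data: four sign-ups, two never started, one of the two starters finished.
now = datetime.now()
data = [
    Enrolment(1, now, now),    # started and completed
    Enrolment(2, now, None),   # started, never completed
    Enrolment(3, None, None),  # signed up, never started
    Enrolment(4, None, None),  # signed up, never started
]
print(completion_rate(data, include_never_started=True))   # 0.25 - looks alarming
print(completion_rate(data, include_never_started=False))  # 0.5 - same data, different story
```

Same data, same learners; the only thing that changed is the definition. That definition belongs next to the chart, not in someone’s head.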
In my day job, clients ask me to create reports all the time, even though they all use the same core LMS platform. So why do I keep getting asked for reports when the LMS has reports built in? Great question. It does sound like the built-in reports must be crappy, and that I should just build one new set I can reuse across multiple platforms. The real answer is simpler: the same data set can mean different things to different people. Metrics such as total logins, course completions and quizzes passed are just numbers, and without context, they mean nothing.

For example, if I measured the BMI of a bloke named Dave and it came out at 38, you’d assume Dave needs to lay off the pies. But what if I told you that Dave is a 6-foot-9 rugby-playing boxer built of pure muscle? Dave’s BMI doesn’t mean the same thing now. That’s why your LMS metrics don’t always mean the same thing, and why I’m forever creating reports for clients. It’s also why we don’t mess with Dave. No one messes with Dave.
The Devil’s in the Details: A Case Study
Let’s look at a real-world example to illustrate why understanding the data matters. Suppose you want to create a report on learner engagement in your online courses.
At first glance, you might think it’s as simple as pulling data on how often learners log in. But dig a little deeper, and you’ll realise it’s not that straightforward. There are a few things we need to consider:
Defining Engagement: Is a learner who logs in daily but never posts in forums more or less engaged than one who logs in weekly but actively participates in discussions? How do we even determine this? Back in school, I was that kid who rarely asked a question in class. That didn’t mean I wasn’t engaged; it meant I understood what was being taught and didn’t need the teacher to elaborate (plus, I was a smart ass. Yep, I was that kid too). Is a learner who takes a quiz once and passes more or less engaged than one who takes it three times?
Data Sources: You might need to combine data from multiple places in your LMS – logins, forum posts, assignment submissions and quiz attempts, for instance. Each source tells its own story, and there’s a rough sketch of one way to blend these signals just after this list. If your whole class scores 100% on a quiz, does that mean the questions were too easy? That you have a class full of evil geniuses? Or is your teaching so incredible that your learners can all ace it?
Timeframes: Should you consider engagement over the entire course, or just the most recent week or month? Sometimes learners cram for exams, or ramp up their efforts as the course expiry approaches. Timeframes therefore differ for each learner, so you need to be clear on what story your timeframe is telling you. The question I get asked a lot is, “How long does it take for someone to complete the course?” That depends on so many factors. Are they studying full-time or part-time? Do they have a family to look after? Are they taking multiple modules at once? It’s like asking how long it takes to get to Scotland. Are they walking? Going by train? Stopping off along the way? Saying a course takes three months to complete assumes the learner will sit down and work on it for three months solid. Yes, I can sit and binge-watch Game of Thrones in a few days, but realistically, I’m not going to. I’ve got real work to do.
Contextual Factors: How do you account for differences in course design? A course with weekly deadlines might show different engagement patterns than a self-paced one, and drip-fed courses or those with face-to-face sessions shift engagement too. Without context, you end up with stupid statements like, “In January we had 100 engaged learners, but in March we had 20. Therefore, engagement declines after 3 months, so we need to do something… quick!” What might really be happening is that your learners downloaded all the PDFs in January and therefore don’t need to use the LMS much after that. The learners are still deeply engaged, perhaps even more so, but your data doesn’t show it.
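To show just how slippery “engagement” is in practice, here’s a rough sketch of a score that blends several of the data sources above. The weights and field names are pure assumptions, chosen only to illustrate the point: change the weights, and a different learner becomes your “most engaged”.

```python
from dataclasses import dataclass

@dataclass
class LearnerActivity:
    logins: int
    forum_posts: int
    submissions: int
    quiz_attempts: int

# Invented weights - the whole point is that YOU have to decide (and document)
# what "engaged" means in your context before this number means anything.
WEIGHTS = {"logins": 0.5, "forum_posts": 2.0, "submissions": 3.0, "quiz_attempts": 1.0}

def engagement_score(a: LearnerActivity) -> float:
    """One possible blend of signals. Change the weights, change the 'truth'."""
    return (WEIGHTS["logins"] * a.logins
            + WEIGHTS["forum_posts"] * a.forum_posts
            + WEIGHTS["submissions"] * a.submissions
            + WEIGHTS["quiz_attempts"] * a.quiz_attempts)

# The daily lurker and the weekly contributor from the list above:
lurker = LearnerActivity(logins=30, forum_posts=0, submissions=2, quiz_attempts=1)
contributor = LearnerActivity(logins=4, forum_posts=12, submissions=2, quiz_attempts=1)
print(engagement_score(lurker))       # 22.0
print(engagement_score(contributor))  # 33.0 - "more engaged"? Only under these weights.
```

Neither answer is wrong; they’re just answers to different questions. Which is exactly why the definitions need agreeing before the dashboard gets built.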
By wrestling with these questions, you ensure that your reports and dashboards, however they look, provide meaningful insights.
The Power of Informed Analysis
Before you think about visualisations, start with a question you want to answer, and include a why:
“Why do you need to know this, and what will you do once you know the answer?”
Then, make sure you provide context: explain what the data represents and how it is calculated.
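One lightweight way to do that is to ship the definition with the number. This “metric card” idea is just a sketch (every field and value below is a made-up example, not a standard), but it keeps the context welded to the metric:

```python
# A hypothetical "metric card": the number travels with its definition,
# so nobody can misread the report. All values are invented examples.
active_learners = {
    "metric": "Active learners",
    "value": 134,
    "question": "Are learners still using the course, and what will we do if not?",
    "definition": "Unique learners with at least one login AND one lesson view",
    "timeframe": "Rolling 30 days",
    "excludes": "Admin and test accounts",
    "caveats": "Learners working from downloaded PDFs are invisible here",
}
```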
In the end, the goal of data analysis in an LMS isn’t to create the prettiest charts. It’s to gain insights that can improve teaching and learning. So, the next time you’re tempted to create a report and build a dashboard, take a step back. Dive into your data, understand its nuances, and make sure it means something before you worry about the visualisations. That exploding pie chart will be a whole lot cooler, and more useful, if it tells a good story. This is where your analytics become useful: they tell you what’s really happening and why. Then, armed with this, you can start making continual improvements to your courses.