
WHO’S YOUR ADDIE?

Lewis Carr lies on the couch to talk us through his ADDIE issues and the problems with e-learning project design.

ADDIE. You know the one: Analysis, Design, Development, Implementation, and Evaluation. It’s a trusty and well-used model, but can we apply it to e-learning projects in real life?

I’m talking about actual, real-world projects where the client is the boss and, to some extent, the project manager.

Before we dive in, let me tell you a little about ADDIE. This instructional design framework has been around since the 1950s, when it was first developed by the US Army to create effective training programmes. Over the decades, ADDIE has become the gold standard for designing all kinds of learning experiences, from classroom-based training to online courses and beyond.

Of course, ADDIE isn’t the only player in the game when it comes to instructional design. Some eLearning teams have success with more iterative frameworks like LLAMA or Design Thinking. But for large-scale, high-stakes projects that require a robust, structured approach, ADDIE is still tough to beat.

So what is it about ADDIE that has made it such an enduring model? For starters, its linear, systematic approach provides a clear roadmap for eLearning development. Breaking the process down into distinct phases helps designers stay organised, focused, and on track. The emphasis on analysis, evaluation, and continuous improvement ensures that the final product is carefully tailored to learner needs and performance objectives.

Now, before you all start critiquing my project management skills, let me tell you that ADDIE is great. I’m a fan; however, things can go awry quite quickly when the client dictates the project and all this clever ADDIE stuff goes out of the window.

Let’s look at the first part of the model: A for Analysis. This is where we gather and analyse all the stakeholder requirements, learning objectives, and target audience details. From learner pain points to subject matter expertise, we leave no stone unturned. We may even conduct a needs assessment and job task analysis if that’s what it takes to truly understand the learning.

However, in the real world, it is very difficult to get clients to actually sit down and provide a comprehensive list of requirements, learning objectives, and audience details. More often than not, it’s a constant game of “Oh, by the way…” and “Can you just…?” as they trickle in new information every other day, or after you’ve started the build. And don’t get me started on subject matter experts - they’re often juggling several jobs and barely have the time to answer an email, let alone meet with you. So you rarely end up with all the information you thought you were getting. Analysis still happens, but it’s based on past experiences and gut feelings as a developer.

Getting the most out of the analysis phase is all about thoroughness and attention to detail. I’m talking about comprehensive needs assessments, detailed audience profiles, and a deep dive into the content and its associated tasks and competencies. The better your analysis, the stronger the foundation for the rest of the project. But when this doesn’t happen, and that project brief you were promised never materialises, then you can scratch off that all-important “analysis” section and move on to “D”.

D is for “Design”. As designers, we love to start by sketching out storyboards, planning the navigation, and selecting the perfect colour scheme. After trawling Dribbble for inspiration, we get all excited about the instructional strategies we can employ, such as scenario-based learning and interactive branching scenarios. The client wants gamification elements, so we set about designing those too; it all sounds incredible.

But this is where things start to come undone. As soon as we start putting together storyboards and wireframes, the clients will inevitably swoop in with a barrage of “Can we just…” requests.

“Can we just add in a fancy animated intro?”

“Can we just throw in a random game for no reason?”

“Can we just make the whole thing bright pink because it’s the CEO’s favourite colour?”

“My cousin sketched this on a napkin; can you turn it into a Pixar cartoon?”

And the final crushing blow: the content we receive from scriptwriters and SMEs has no scenarios, no branching content, and zero gamification elements. It reads like an essay or a poorly bullet-pointed PowerPoint. How are we supposed to craft a cohesive, effective learning experience when we’re presented with this?

But the design phase in ADDIE isn’t just about the creative elements. It’s also where we carefully map out the learning objectives, define the instructional approaches, and plan the overall structure and flow of the eLearning course. We may get to flex our creative muscles after all, as long as we ground those ideas in solid instructional design principles. The only problem here is that the client has given us two vague learning objectives, so we spend the next few meetings teasing the rest out of them.

Now we move on to the third letter of ADDIE, D for “Development”. Development is where the real magic happens. We’ll code up those interactions, record the voiceovers, source the stock imagery, and AI the life out of everything. And let me tell you, there’s nothing quite like the satisfaction of seeing all the pieces come together. It’s like watching a caterpillar transform into a butterfly - if that butterfly was made of vector drawings pinched from stock sites, JavaScript, and clickable hotspots.

But alas, we’ll be merrily building away, thinking we’re on track, when suddenly the client decides they hate everything and demands a complete redesign. “I know we approved those interactions last month, but now I want them to look totally different.” “The voiceovers are terrible; we need to re-record everything.” “Wait, why does the script we sent you sound terrible now?” Cue the sound of our souls slowly withering away.

But rest assured, development isn’t a solo endeavour. ADDIE has taught us to keep looping back with stakeholders and subject matter experts, getting their input and feedback at every step to avoid the inevitable trap above. That way, we can address any issues or concerns early on and ensure the final product is truly aligned with learner needs. The only issue is that the client happily nods along and agrees with everything we say and show them until the last minute when suddenly they look at it properly.

The fourth letter in ADDIE is I for “Implementation”. Easy peasy. We’ll upload everything to the LMS, test it thoroughly, and ensure a smooth rollout that would make every project manager swoon. No hiccups, no glitches, no unpleasant surprises. Just a seamless learner experience that has stakeholders high-fiving each other and the compliance team doing cartwheels. But wait. Has the client given us enough time and resources to properly test the eLearning course before unleashing it on learners? Nope, it’s straight from development to “Just publish it already; we’re already two weeks late!” And then, of course, the inevitable avalanche of bug reports and angry user feedback comes crashing down. But hey, at least we get to play the hero when we frantically work overtime to patch everything up, right?

Implementation isn’t just about pushing the “publish” button. It’s also about meticulously testing the course, working out any kinks, and providing comprehensive training and support for the end users. But often, projects don’t go to plan, and much of this is done the day before the project goes live. Now I know a lot of project managers reading this are cursing me, saying this is all just bad project management, and although I agree, I also work directly with clients, and this is the real world (sorry PMs, but you don’t have to build the course; it’s harder than it looks).

Lastly, we move on to E for “Evaluation”. Well, that’s where the ADDIE model and the instructional designer really become the dynamic duo. We gather all the learner feedback, crunch the assessment data, and use those insights to keep refining the course. It’s a never-ending cycle of improvement - just the way I like it. After all, what’s the point of designing an eLearning masterpiece if you’re not going to continuously iterate and enhance it?

However, that assumes clients actually care about collecting data, analysing results, and using that insight to improve the course. They’re already onto the next project, leaving me to awkwardly ask, “Uh, so should we send out a survey or…?”

And heaven forbid I suggest doing a proper post-implementation review - they’ll just glare at me and say, “Can’t you just wrap this up already? We need to move on.” “We haven’t budgeted for phase two.” I have yet to see a course refactored based on student feedback. Once built, it is forever SCORMed in the LMS tomb.

The evaluation phase isn’t just about the final product. We’re supposed to build formative check-ins throughout the entire process so we can make agile adjustments and refinements as we go. That way, we’re not waiting until the end to get crucial feedback - we’re incorporating it every step of the way. If only things were this simple. Yes, there are check-ins in all projects, just as there are testing windows. Often, we end up making continual changes throughout the build, and then there is still a massive end-of-project testing script (or an entirely unscripted testing session), which equates to countless hours spent making tweaks.

I realise that this article sounds very negative, and don’t get me wrong, I’ve been involved in projects that have gone smoothly and used ADDIE to great effect. Incorporating key ADDIE principles - like upfront learner analysis and continuous feedback loops - can be incredibly valuable. But I wrote this article to tell the truth; hopefully, it has resonated with you. Even with the best intentions, you will be railroaded by a client who doesn’t like the characters you’ve drawn, the padding above the logo or the white space you thought looked ace.

Now, it’s not all doom and gloom. Every project starts with a plan, but it’s how we find a balance that allows us to apply ADDIE while accommodating the whims of the client and the human side of development. If you manage all this, then you are a design/project manager legend and I tip my hat to you.
