People often find my question frustrating—especially when the request seems clear and straightforward. It can seem like a waste of time to pause and unpack why learning is needed, rather than jumping straight to what the instruction should cover.
But I ask the question to uncover a hidden problem with the request:
When we build learning experiences around content, we miss the opportunity to design solutions.
Learning experiences—from microlearning to courses to curricula—built around a topic often fall short not because the content is wrong, but because they skip a critical step: defining the specific problem the learning is meant to solve. That problem isn't about what information the course should cover; it's about what you want learners to be able to do that they can't already, and why that matters to them.
I use the “Iceberg Illusion” to help non-designers see what instructional design and strategy really involve. It’s easy to think of design as just the visible part—activities, materials, or course content—but real learning only happens when those visible parts are built on top of deeper concerns.
The Instructional Design Iceberg Illusion
Beneath the surface, we start with the organization’s goals and identify the knowledge and skills people need to achieve them. From there, we clarify the design problem—the specific knowledge and skills the learners lack.
Next, we ask a crucial question: Why would learners want to acquire this knowledge or skill? Understanding learner motivation allows us to choose a strategy for engaging learners in activities that address the design problem and the organization's goals. Only then do we start creating the visible content.
Building up this foundation below the visible layer takes time, but it is what transforms "building a course" into "designing a solution." When we do the deep work upfront, the content becomes obvious and our focus can remain on learning experiences, high-quality materials, and diagnostic assessments that create real, lasting change.
See these ideas in action by exploring my work.
When training staff on a large and complex topic—genetics, in this case—completing a full analysis before development (as in the ADDIE model) would have delayed delivery indefinitely. Because staff in a national research program needed to field genetics questions immediately, I created a new agile-inspired workflow and led designers, stakeholders, and learners through it: we identified a few essential learning objectives, launched an MVP training, and refined it using real-world feedback and analytics. This approach let us scale genetics education rapidly and effectively—focusing on the knowledge and skills staff needed most.
Using this agile workflow, we launched the first learning activities in just a few weeks. Testing and feedback guided improvements to existing content and expansion to cover more of the needs and objectives. Within a year, we saw a significant improvement in the staff's overall ability to achieve the program's goals in this area.