Let me walk you through our approach—one that you might find valuable for your own learning development projects.
Healthcare learning content requires absolute precision. Get it wrong and you're not just creating ineffective training; you could ultimately affect patient care.
Our ATLAS project faced a common challenge: we needed to integrate expertise from various specialists while maintaining a cohesive voice and ensuring educational effectiveness. This meant coordinating:
Academic rheumatologists with deep but highly specialized knowledge
Primary care physicians with broad practical experience
Learning design experts focused on engagement and retention
Project managers concerned with timelines and resources
Without structure, this could quickly become a chaotic process with contradictory feedback and endless revision cycles.
We implemented a structured, tiered review process that balanced thoroughness with efficiency:
All content first passed through our core review team, consisting of:
In-house Project Lead Expert – A senior rheumatologist who verified clinical accuracy
Clinical Lead – A practicing physician who assessed practical relevance and application
Project Manager – Who evaluated alignment with project scope and timeline constraints
This core team provided the first comprehensive review, identifying major clinical issues before content moved to the next stage.
Once approved by the core team, content moved to our instructional design team where we:
Created detailed storyboards reflecting the approved content
Applied adult learning principles to enhance engagement
Designed interactions and assessments to reinforce key concepts
Developed a consistent voice and visual style
Chunked content appropriately for digital consumption
This transformation phase was critical—we weren't just formatting content; we were restructuring it for optimal learning while preserving clinical accuracy.
After digital development, content underwent a rigorous external SME review. Unlike the original content SMEs, these reviewers evaluated the material against a structured evaluation matrix that assessed:
Clinical Accuracy: Was the medical information current and correct?
Educational Effectiveness: Would the approach lead to knowledge retention and application?
Usability: How intuitive was the navigation and interaction?
Accessibility: Could all learners access and understand the content?
Alignment with Guidelines: Did the content reflect current best-practice guidelines?
These external reviewers brought fresh eyes to the content, often identifying issues that internal teams had overlooked.
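To make the matrix concrete, here is a minimal sketch of how such a rubric could be represented in code. The criteria names mirror the list above; the 1–5 scale, field names, and class structure are illustrative assumptions for this example, not part of our actual tooling.

```python
from dataclasses import dataclass, field
from typing import Optional

# Illustrative sketch only: criteria mirror the evaluation matrix above,
# but the scoring scale and class names are assumptions for this example.

@dataclass
class Criterion:
    name: str
    guiding_question: str
    score: Optional[int] = None  # reviewer assigns 1 (poor) to 5 (excellent)

@dataclass
class ReviewRubric:
    module: str
    criteria: list = field(default_factory=lambda: [
        Criterion("Clinical Accuracy", "Is the medical information current and correct?"),
        Criterion("Educational Effectiveness", "Will the approach lead to retention and application?"),
        Criterion("Usability", "How intuitive are the navigation and interactions?"),
        Criterion("Accessibility", "Can all learners access and understand the content?"),
        Criterion("Alignment with Guidelines", "Does the content reflect current best-practice guidelines?"),
    ])

    def unscored(self) -> list:
        """Criteria the external SME has not yet scored."""
        return [c.name for c in self.criteria if c.score is None]
```

Structuring the rubric this way meant every external reviewer answered the same questions about every module, which made their feedback directly comparable.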
Technology played a crucial role in making this multi-layered review process efficient. We utilized Articulate Review 360 as our primary collaboration tool, which offered several advantages:
Contextual Feedback: Reviewers could comment on specific elements rather than providing general feedback
Centralized Discussion: All stakeholders could see each other's comments, reducing contradictory feedback
Version Control: Clear tracking of which version was being reviewed
Resolution Tracking: Ability to mark comments as addressed, with documentation of how they were resolved
Multimedia Review: Capacity to comment on text, interactions, and multimedia elements
This platform transformed what could have been a disjointed process into a streamlined collaboration.
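Review 360 handled this tracking for us, but the underlying idea is easy to illustrate. The sketch below is not Review 360's API; it is a hypothetical, minimal record of a reviewer comment with resolution tracking, showing the kind of audit trail we relied on.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical illustration of contextual feedback with resolution tracking.
# This is NOT the Review 360 API; field names are assumptions for the sketch.

@dataclass
class ReviewComment:
    reviewer: str
    module: str
    element: str          # the specific slide or interaction being commented on
    version: str          # which build of the module was reviewed
    text: str
    resolved: bool = False
    resolution_note: str = ""

    def resolve(self, note: str) -> None:
        """Mark the comment addressed and document how it was resolved."""
        self.resolved = True
        self.resolution_note = f"{datetime.now():%Y-%m-%d}: {note}"

# Example: a clinical lead flags an outdated dosing figure, and the designer
# records how the comment was closed out.
comment = ReviewComment("Clinical Lead", "Module 3", "Slide 12 dosing table",
                        "v2", "Dose range no longer matches current guidance.")
comment.resolve("Updated table against the latest guideline revision.")
```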
A typical module would progress through our review framework as follows:
Initial Content Development: Subject experts drafted core content
Core Team Review: The project lead, clinical lead, and project manager provided initial feedback
Content Revision: Updates based on core team feedback
Storyboarding: Instructional designers transformed content into learning blueprints
Development: Creation of digital learning assets
External SME Review: Evaluation against our quality matrix via Review 360
Refinement: Addressing external SME feedback
Final Approval: Sign-off from core team before launch
This cyclical process typically involved 2-3 revision rounds, with each cycle producing increasingly refined content.
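For teams that like to track progression programmatically, here is a minimal sketch of the stage sequence as an ordered checklist. The stage names are taken from the framework above; the helper function and data shape are illustrative assumptions rather than a description of our production tooling.

```python
# Minimal sketch of the module lifecycle as an ordered checklist.
# Stage names mirror the framework above; the tracking structure itself
# is an illustrative assumption, not part of our production tooling.

STAGES = [
    "Initial Content Development",
    "Core Team Review",
    "Content Revision",
    "Storyboarding",
    "Development",
    "External SME Review",
    "Refinement",
    "Final Approval",
]

def next_stage(completed):
    """Return the first stage not yet completed, enforcing the order above."""
    for stage in STAGES:
        if stage not in completed:
            return stage
    return None  # module is ready to launch

# Example: a module that has cleared core team review is due for revision next.
print(next_stage(["Initial Content Development", "Core Team Review"]))
# -> "Content Revision"
```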
Our system wasn't without challenges. Here's how we addressed common hurdles:
When experts disagreed (which happened regularly with emerging topics like medicinal cannabis), we implemented a consensus-building protocol:
Documented areas of agreement and disagreement
Facilitated evidence-based discussion
Created tiered content structures that presented the core consensus, with additional perspectives offered as supplementary information
To prevent delays from busy SMEs, we:
Set clear review deadlines with calendar invitations
Created focused review sessions for specific content areas
Developed templates that guided efficient feedback
Established escalation protocols for when reviews were delayed
With multiple contributors and reviewers, maintaining a consistent voice was difficult. Our solution:
Developed a comprehensive style guide
Assigned a content editor to harmonize language
Created exemplar modules that demonstrated the desired approach
Included voice and tone as specific review criteria
The true measure of our review process was in the outcomes:
Reduced Revision Cycles: Average revision rounds decreased from 4.5 to 2.7
Improved Clinical Accuracy: Post-launch clinical corrections fell by 92%
Higher Learner Satisfaction: Content quality ratings rose from 4.2/5 to 4.8/5
Streamlined Production: 23% reduction in development time for new modules
Enhanced Stakeholder Satisfaction: SMEs reported greater confidence in the final product
If you're looking to implement or improve your SME review cycle, consider these principles:
Structure is essential: Define clear roles, responsibilities, and review criteria before starting
Layer your reviews: Different reviewers should focus on different aspects rather than everyone reviewing everything
Use the right tools: Platforms like Review 360 dramatically improve collaboration quality
Document everything: Keep clear records of feedback and resolutions
Focus reviews with templates: Guide reviewers to provide the specific feedback you need
Build in time buffers: SME delays are inevitable; plan for them
Celebrate improvements: Recognize when the process is working well
A well-designed SME review process is more than quality control—it's a collaboration framework that brings together diverse expertise to create truly effective learning experiences.
What review challenges have you encountered in your instructional design work? I'd love to hear your experiences in the comments.