When you're developing high-quality healthcare education, your Subject Matter Expert (SME) review process can make or break your project. At Arthritis & Osteoporosis WA, we developed a robust, multi-layered review framework for our ATLAS eLearning project that ensured both clinical accuracy and instructional effectiveness.

Let me walk you through our approach—one that you might find valuable for your own learning development projects.

The Challenge: Multiple Stakeholders, One Voice

Healthcare learning content requires absolute precision. Get it wrong, and you're not just creating ineffective training—you could impact patient care.

Our ATLAS project faced a common challenge: we needed to integrate expertise from various specialists while maintaining a cohesive voice and ensuring educational effectiveness. This meant coordinating:

  • Academic rheumatologists with deep but highly specialized knowledge

  • Primary care physicians with broad practical experience

  • Learning design experts focused on engagement and retention

  • Project managers concerned with timelines and resources

Without structure, this could quickly become a chaotic process with contradictory feedback and endless revision cycles.

Our Multi-Tier SME Review Framework

We implemented a structured, tiered review process that balanced thoroughness with efficiency:

Tier 1: Core Review Team

All content first passed through our core review team, consisting of:

  • In-house Project Lead Expert – A senior rheumatologist who verified clinical accuracy

  • Clinical Lead – A practicing physician who assessed practical relevance and application

  • Project Manager – The team member who evaluated alignment with project scope and timeline constraints

This core team provided the first comprehensive review, identifying major clinical issues before content moved to the next stage.

Tier 2: Instructional Design Transformation

Once approved by the core team, content moved to our instructional design team where we:

  1. Created detailed storyboards reflecting the approved content

  2. Applied adult learning principles to enhance engagement

  3. Designed interactions and assessments to reinforce key concepts

  4. Developed a consistent voice and visual style

  5. Chunked content appropriately for digital consumption

This transformation phase was critical—we weren't just formatting content; we were restructuring it for optimal learning while preserving clinical accuracy.

Tier 3: External Expert Review

After digital development, content underwent a rigorous external SME review. Unlike the SMEs who drafted the original content, these reviewers evaluated the material against a structured evaluation matrix (sketched below) that assessed:

  • Clinical Accuracy: Was the medical information current and correct?

  • Educational Effectiveness: Would the approach lead to knowledge retention and application?

  • Usability: How intuitive was the navigation and interaction?

  • Accessibility: Could all learners access and understand the content?

  • Alignment with Guidelines: Did content align with current best practice guidelines?

These external reviewers brought fresh eyes to the content, often identifying issues that internal teams had overlooked.
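
If you want to capture a matrix like this in a reusable form, here is a minimal sketch of how it might be expressed as a simple scoring rubric. The criterion names mirror the matrix above; the 1–5 scale, the pass threshold, and the Python structure are illustrative assumptions rather than the actual ATLAS tooling.

```python
# Minimal sketch of the external review matrix as a scoring rubric.
# Criterion names mirror the matrix above; the 1-5 scale, field names,
# and pass threshold are illustrative assumptions, not the ATLAS values.

REVIEW_MATRIX = {
    "clinical_accuracy": "Is the medical information current and correct?",
    "educational_effectiveness": "Will the approach lead to retention and application?",
    "usability": "How intuitive are the navigation and interaction?",
    "accessibility": "Can all learners access and understand the content?",
    "guideline_alignment": "Does content align with current best-practice guidelines?",
}

PASS_THRESHOLD = 4  # hypothetical minimum score per criterion (1-5 scale)


def summarise_review(scores: dict[str, int]) -> list[str]:
    """Return the criteria that fall below the pass threshold."""
    return [
        criterion
        for criterion in REVIEW_MATRIX
        if scores.get(criterion, 0) < PASS_THRESHOLD
    ]


# Example: one external reviewer's scores for a module
flagged = summarise_review({
    "clinical_accuracy": 5,
    "educational_effectiveness": 4,
    "usability": 3,          # flagged for refinement
    "accessibility": 5,
    "guideline_alignment": 4,
})
print(flagged)  # ['usability']
```

Keeping the rubric in a single place like this means every reviewer works from the same guiding questions in every round.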

Leveraging Articulate Review 360

Technology played a crucial role in making this multi-layered review process efficient. We utilized Articulate Review 360 as our primary collaboration tool, which offered several advantages:

  • Contextual Feedback: Reviewers could comment on specific elements rather than providing general feedback

  • Centralized Discussion: All stakeholders could see each other's comments, reducing contradictory feedback

  • Version Control: Clear tracking of which version was being reviewed

  • Resolution Tracking: Ability to mark comments as addressed, with documentation of how they were resolved

  • Multimedia Review: Capacity to comment on text, interactions, and multimedia elements

This platform transformed what could have been a disjointed process into a streamlined collaboration.

The Review Cycle in Action

A typical module would progress through our review framework as follows (a simple tracking sketch follows the list):

  1. Initial Content Development: Subject experts drafted core content

  2. Core Team Review: The project lead, clinical lead, and project manager provided initial feedback

  3. Content Revision: Updates based on core team feedback

  4. Storyboarding: Instructional designers transformed content into learning blueprints

  5. Development: Creation of digital learning assets

  6. External SME Review: Evaluation against our quality matrix via Review 360

  7. Refinement: Addressing external SME feedback

  8. Final Approval: Sign-off from core team before launch

This cyclical process typically involved 2-3 revision rounds, with each cycle producing increasingly refined content.
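
To make the flow concrete, here is a minimal sketch of how such a cycle could be tracked per module. The stage names come from the list above; the class, field names, and revision logic are illustrative assumptions, not the tool we actually used.

```python
# Minimal sketch of the eight-stage cycle as an ordered pipeline with a
# per-module tracker. Stage names come from the list above; the class,
# field names, and revision logic are illustrative assumptions.

from dataclasses import dataclass

STAGES = [
    "Initial Content Development",
    "Core Team Review",
    "Content Revision",
    "Storyboarding",
    "Development",
    "External SME Review",
    "Refinement",
    "Final Approval",
]


@dataclass
class ModuleTracker:
    name: str
    stage_index: int = 0
    revision_rounds: int = 0

    @property
    def stage(self) -> str:
        return STAGES[self.stage_index]

    def advance(self) -> None:
        """Move to the next stage unless Final Approval has been reached."""
        if self.stage_index < len(STAGES) - 1:
            self.stage_index += 1

    def complete_revision_round(self) -> None:
        """After Refinement, loop back to External SME Review for another pass."""
        self.revision_rounds += 1
        self.stage_index = STAGES.index("External SME Review")


module = ModuleTracker("Example module")
while module.stage != "Refinement":
    module.advance()
module.complete_revision_round()
print(module.stage, module.revision_rounds)  # External SME Review 1
```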

Challenges and Solutions

Our system wasn't without challenges. Here's how we addressed common hurdles:

Challenge: Contradictory SME Feedback

When experts disagreed (which happened regularly with emerging topics like medicinal cannabis), we implemented a consensus-building protocol (illustrated after this list):

  • Documented areas of agreement and disagreement

  • Facilitated evidence-based discussion

  • Created tiered content structures that presented core consensus with additional perspectives as supplementary information
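
As a rough illustration of that tiered structure, the sketch below separates agreed core content from clearly labelled supplementary perspectives. The field names and placeholder statements are purely illustrative assumptions.

```python
# Minimal sketch of a tiered content structure: agreed core statements sit in
# the main learning path, and differing expert views are attached as clearly
# labelled supplementary material. Names and statements are placeholders.

from dataclasses import dataclass, field


@dataclass
class TieredTopic:
    title: str
    core_consensus: list[str] = field(default_factory=list)           # what all SMEs agreed on
    supplementary_views: dict[str, str] = field(default_factory=dict)  # reviewer role -> differing view


topic = TieredTopic(
    title="Emerging therapy X",  # placeholder topic
    core_consensus=["Statement every reviewer endorsed after evidence-based discussion."],
    supplementary_views={
        "Academic rheumatologist": "Additional specialist perspective, labelled as such.",
        "Primary care physician": "Practical caveat from general practice, labelled as such.",
    },
)

# Learners always see core_consensus; supplementary_views render as optional
# 'differing perspectives' callouts rather than competing core content.
print(topic.core_consensus, list(topic.supplementary_views))
```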

Challenge: Review Bottlenecks

To prevent delays from busy SMEs, we:

  • Set clear review deadlines with calendar invitations

  • Created focused review sessions for specific content areas

  • Developed templates that guided efficient feedback

  • Established escalation protocols for when reviews were delayed

Challenge: Maintaining Voice Consistency

With multiple contributors and reviewers, maintaining a consistent voice was difficult. Our solution:

  • Developed a comprehensive style guide

  • Assigned a content editor to harmonize language

  • Created exemplar modules that demonstrated the desired approach

  • Included voice and tone as specific review criteria

Measuring Success

The true measure of our review process was in the outcomes:

  • Reduced Revision Cycles: Average revision rounds decreased from 4.5 to 2.7

  • Improved Clinical Accuracy: Post-launch clinical corrections reduced by 92%

  • Higher Learner Satisfaction: Content quality ratings increased to 4.8/5 (up from 4.2/5)

  • Streamlined Production: 23% reduction in development time for new modules

  • Enhanced Stakeholder Satisfaction: SMEs reported greater confidence in the final product

Key Takeaways for Your Review Process

If you're looking to implement or improve your SME review cycle, consider these principles:

  1. Structure is essential: Define clear roles, responsibilities, and review criteria before starting

  2. Layer your reviews: Different reviewers should focus on different aspects rather than everyone reviewing everything

  3. Use the right tools: Platforms like Review 360 dramatically improve collaboration quality

  4. Document everything: Keep clear records of feedback and resolutions

  5. Focus reviews with templates: Guide reviewers to provide the specific feedback you need

  6. Build in time buffers: SME delays are inevitable; plan for them

  7. Celebrate improvements: Recognize when the process is working well

A well-designed SME review process is more than quality control—it's a collaboration framework that brings together diverse expertise to create truly effective learning experiences.

What review challenges have you encountered in your instructional design work? I'd love to hear your experiences in the comments.