Planning and Analysis

Why this supra‑badge matters 

Planning and Analysis is where sound instructional design begins—and where leadership shows up first. Building strong solutions requires front‑end clarity: Who are the learners? What performance matters? Where are the constraints and risks? By integrating analysis into planning, I reduce surprises downstream and keep design and development aligned with real performance needs. The artifacts on this page demonstrate that discipline across multiple contexts and methods. 

Sub‑Badge: Gap Analysis 

Competencies. Conduct a rigorous performance analysis to define the Desired Status, the Actual Status, and the Discrepancy between them; triangulate with interviews, surveys, observations, and performance data; translate findings into design implications. 

Personal achievements. My Sailing School Graduates Gap Analysis surfaced why some learners struggled to navigate the Columbia River after completing ASA 101. By contrasting desired on‑water decision‑making with actual post‑course performance and validating with SME interviews, learner surveys, and personal observations, I pinpointed river‑specific scenario gaps (currents, traffic patterns, “no‑go” zones). Those findings redirected design toward scenario‑rich training that mirrors the realities graduates face as new club members. 

Sub‑Badge: Target Population & Environment 

Competencies. Profile the audience and context: entry skills, prior knowledge, attitudes, delivery access, preferences, group characteristics, and environmental constraints; turn insights into concrete design decisions. 

Personal achievements. My Learner Analysis for sailing students mapped entry skills, prior knowledge, attitudes, and environmental constraints into clear implications for instruction. It informed the modality mix (job aids for just‑in‑time use, focused eLearning for knowledge and recognition, VILT/ILT for coached practice), scoped the amount of scaffolding each topic required, and calibrated tone and challenge level. This analysis‑to‑action thread now frames every build plan I create. 

Sub‑Badge: Analysis Techniques for Instruction 

Competencies (Challenge 1). Determine subordinate/prerequisite skills and knowledge; align enabling skills to objectives and assessments; anticipate barriers to performance. 

Personal achievements (Challenge 1). In my AI Tools Academy Evaluation Plan, I mapped instruments directly to objectives and identified prerequisite skills and likely blockers (tool access, policy constraints, novice misconceptions). This ensured that measurement focused on the capabilities that matter and that supports (job aids, quick wins, exemplars) were in place before evaluation. 

Competencies (Challenge 2). Analyze diverse sources and validate content quality; synthesize evidence to guide design. 

Personal achievements (Challenge 2). My literature review—Transforming Workforce Training: The Impact of AI on Soft and Traditional Skills Development—synthesized 30+ peer‑reviewed sources at the onset of the recent AI wave. That early, methodical scan helped separate durable findings from hype and now informs patterns I use for coaching, deliberate practice, and feedback in AI‑mediated learning. 

Sub‑Badge: Analyze Technologies 

Competencies. Evaluate existing and emerging technologies (features, affordances, constraints, risks); align technology choices with goals, learners, and context; propose viable applications. 

Personal achievements. In Innovating Education: Leveraging Constructivism, AI‑Enhanced VR, and Adaptive Learning Platforms for Immersive Learning Experiences, I analyzed VR, AI, and ALPs for instructional feasibility, transfer potential, and implementation risks. The result is a practical lens I now use to judge when immersive tech adds genuine value (e.g., spatial judgment, consequence visualization) and when lighter‑weight solutions are better. 

Overall experience: what I gained 

Working through this supra‑badge sharpened my front‑end toolkit and decision hygiene. I’m faster and clearer at moving from raw data to actionable design constraints; better at sequencing prerequisites and aligning measures; and more disciplined in documenting assumptions so stakeholders can see—and challenge—the logic before we build. 

Applying this in current and future practice 

  • Current practice. Every project begins with a compact analysis plan: define the gap, profile the learners/context, decompose prerequisites, and pre‑align measures. This keeps SMEs focused, shortens review cycles, and reduces rework. 
  • Future practice. I’ll continue to pair evaluation mapping with technology analysis so that measurement and implementation travel together. The goal is consistent: fewer surprises, tighter alignment to performance, and solutions that stand up in the real world. 

Closing thought 

Planning and Analysis made the decision‑work visible: the upstream habits that prevent downstream rework. The artifacts here don’t just show what I discovered; they show how I ask, verify, and translate evidence into design choices—habits I’ll keep refining as projects scale in complexity and impact. 

Badge challenges addressed

  • Gap Analysis (Challenge 1): Thoroughness in executing a gap analysis.
  • Target Population & Environment (Challenge 1): The skillset for determining characteristics of a target population and/or an environment that may impact the design and delivery of instruction.
  • Analysis Techniques for Instruction (Challenge 1): The ability to determine subordinate and prerequisite skills and knowledge.
  • Analysis Techniques for Instruction (Challenge 2): Dedication to using appropriate techniques to analyze various types and sources of content to validate it.
  • Analyze Technologies (Challenge 1): Being aware and open enough as a designer to analyze the characteristics of existing and emerging technologies and their potential use.