2.7 Data Analysis Procedures

Data analysis was applied to the curriculum artifact trail (§2.6.1), with the course evaluation inputs (§2.6.2–§2.6.4) serving as contextual triggers that explain why specific revisions occurred. The analysis was qualitative, developing an account of how the curriculum changed across iterations and which design moves recurred. Reflexive practices, including positionality statements and ongoing peer feedback, were used to examine my assumptions as an instructor-researcher.

2.7.1 Thematic Coding of Curriculum Revisions

Curriculum revisions, captured as dated changes to slide decks and modules, were coded thematically using procedures informed by Lungu (2022). Open coding generated descriptive codes for each revision (for example: tool swap, accessibility adjustment, prompt clarification, example update, pacing change, ethics framing addition). Axial coding then grouped related codes into higher-order themes that surfaced the design moves recurring across the four iterations.

The coding analyzed the revisions themselves: what changed in each slide deck, module, or assignment, and what kind of trigger preceded the change. Where course evaluation feedback noted a recurring student-side issue (a confusing instruction, a tool that had become inaccessible, a missing scaffold), that issue was treated as the contextual trigger for the next-iteration revision and was recorded alongside the revision in the artifact trail. The thematic analysis surfaced repeated reasons for curriculum changes, illuminating the design thinking behind how the course was revised and scaled (RQ1), as well as patterns of contextual and technological adaptation visible across iterations (RQ2A, RQ2B).