Augmented reality and worked examples: Targeting organic chemistry competence

Research output: Contribution to journal › Article › peer-review


Abstract

Instructional guidance, provided using worked examples, helps the inexperienced learner cope with complex information that may be difficult to process in limited-capacity working memory. For students of chemistry, such complex information can pertain to the visualisation of structural changes in molecules throughout chemical reactions, a difficulty that can be alleviated through the affordances of augmented reality (AR) technology. 3D structures are important because they have a crucial impact on the chemical and physical properties of molecules. Within a framework of Cognitive Load Theory, this study illustrates how AR-supported worked examples may enhance learning of electrophilic aromatic substitution. The participant cohort comprised FHEQ Level 5 undergraduate students taking an organic chemistry module. The study also explored learners' achievement motivation and how it may be affected by the provision of AR technology and worked examples. The control group was provided with a copy of our worked examples containing 2D reaction mechanism drawings. Data were collected using a combination of quantitative instruments and qualitative surveys and interviews. For this cohort of students, significant intragroup improvements in conceptual understanding, and greater normalised change values, were observed in the AR group but not in the control group. No significant intergroup differences in reported cognitive load or achievement motivation were found, and this result was unchanged when prior relevant chemistry experience was introduced as a covariate. Student feedback and subsequent thematic analysis reveal not only the positive impacts on student engagement but also how students convey their understanding of electrophilic aromatic substitution principles.
Original language: English
Article number: 100021
Journal: Computers & Education: X Reality
Volume: 2
Early online date: 28 Apr 2023
DOIs
Publication status: Published - 2023