What to Blend? Exploring the Relationship Between Student Engagement and Academic Achievement via a Blended Learning Approach

The present study investigated whether student engagement with different online blended learning activities predicts academic performance, as measured via a multiple-choice online exam, in an undergraduate cognitive psychology course. Higher completion rates of weekly online quizzes predicted final exam performance. Findings are discussed in relation to using online learning resources to enhance student engagement and performance, and to using learning analytics to identify students in need of further support. Because only one significant predictor emerged, more research is required to identify additional factors influencing academic achievement in an online blended learning approach.


Introduction
With the arrival of COVID-19 in 2020, higher education pedagogies shifted towards blended learning (i.e., a combination of online/on-site synchronous and online asynchronous activities via virtual learning platforms; Kintu et al., 2017) or fully online formats. Even prior to the pandemic, the global online education market (including investments in virtual tutoring, video conferencing tools, integrated learning systems and online teaching/learning software) had been growing and was projected to exceed $350 billion by 2025 (Koksal, 2020). Regardless of the driver of change (e.g., social distancing; accommodating part-time and distant learners), the consensus from the academic world (Yeigh & Lynch, 2020) and regulatory bodies (Barber et al., 2021) is that many teaching and learning activities will remain online, and that blended learning "is here to stay" (Abi-Raad & Odhabi, 2021). A recent systematic review of blended learning (Raes et al., 2020) revealed that the flexibility offered improves self-regulated learning, but that synchronous activities lack interaction, are not well attended, and make it difficult to maintain comparable learning standards for everyone. Consequently, the implementation of blended learning should be informed by evidence for the benefits and challenges it carries in relation to learning outcomes, teaching quality and efficiency. In this paper, we investigated whether student engagement with different activities in a blended learning approach predicts academic achievement.
A positive relationship between measurements of student engagement and academic achievement is well-documented across programmes and instructional formats. In most studies, academic achievement is determined by grades. Student engagement is defined as the "…student psychological investment in an effort directed toward learning, understanding, or mastering the knowledge, skills, or crafts that academic work intends to promote" (Newmann et al., 1992, p. 12). This definition reflects that student engagement is multi-faceted and includes behavioural (i.e., student participation in learning activities), emotional (i.e., reaction to learning and other agents) and cognitive components (i.e., strategies used for learning). A meta-analysis of 69 independent studies demonstrated a moderately strong and positive correlation between student engagement and academic achievement (Lei et al., 2018). However, it also revealed contradictory findings across studies (Chen et al., 2013; Shernoff & Schmidt, 2008), raised the importance of accounting for individual differences in how students master skills (e.g., students who perform well master their learning skills and need less time to study), and highlighted that the different components of engagement might be differentially associated with academic achievement (Lei et al., 2018). Crucial to this study, behavioural engagement is consistently considered to be more strongly and positively associated with academic achievement than emotional and cognitive engagement (Furrer & Skinner, 2003).
In a blended learning approach, student participation in learning activities includes the three traditional types of interaction (i.e., learner-content, learner-instructor, and learner-learner interaction; Moore, 1989), which occur asynchronously and synchronously. The learner-content interaction is achieved via asynchronous activities that students complete online before or after a synchronous session (e.g., students read a textbook chapter, watch a pre-recorded video, complete a quiz). Furthermore, when designing learner-content interactions, instructors create their own online resources (e.g., via the use of different virtual learning environments and recording software, instructors create videos about a topic and embed self-paced questions for the students). They can also use online learning platforms (i.e., integrated learning systems) provided by educational publishers, which include individual access to e-books and online learning activities such as flashcards, animated videos, and quizzes that students can complete asynchronously (Nevid & Gordon, 2018). The learner-instructor and learner-learner interactions mainly occur in synchronous sessions online and/or on-site (e.g., live lectures, break-out group discussions). When designing synchronous sessions, instructors are expected to use appropriate technology to make content lively and engaging (Martin & Bolliger, 2018) and to improve students' sense of connection (Yamagata-Lynch, 2014). Despite the rapid development of blended learning, there is limited evidence about its effectiveness in relation to learning outcomes. For example, Kintu et al. (2017) explored the relationship between learner characteristics (e.g., self-regulation), design features (e.g., technology quality) and learning, and demonstrated that no variable predicts learning performance in blended learning.
Furthermore, when adopting a blended learning approach, instructors encounter challenges such as deciding which activities should be completed synchronously and which asynchronously (Kenney & Newcombe, 2011). Knowledge of which learning activities predict academic achievement, and whether students complete them, would be useful when designing and implementing blended learning.
There is literature about the powers and pitfalls of some learning activities used in online learning, but the evidence for their associated learning benefits is mixed and dependent on the metrics of engagement and on whether the activity was incentivised. For example, numerous studies explore the impact of asynchronous quizzes on learning. Some studies show that students' scores in practice tests positively correlate with final course grades (Van Camp & Baugh, 2014), whilst others show that using publisher-provided quizzes (Bell et al., 2015) and web-quizzes (Daniel & Broida, 2004) does not improve class performance. In addition, some studies indicate that online quizzes benefit performance only when they are incentivised with course credit (Beard, 2017; Nevid & Gordon, 2018; Sotola & Crede, 2021) and only for the high-performing students who are already engaged (Nevid et al., 2020). The benefit of incentivised learning activities, especially via added assessment weighting, for student engagement and performance has been well-documented (Beard, 2017; Chevalier et al., 2018; Nevid et al., 2021). However, other studies discuss the detriments of extrinsic rewards to student motivation (Deci et al., 2001) and reveal that incentives have no impact on effort and achievement (Gneezy et al., 2011). Moreover, incentivising all activities via assessment weighting would require substantial administrative support (Yamagata-Lynch, 2014). In addition, most studies focus on quiz performance and how it predicts final learning outcomes via retrieval practice (Roediger & Butler, 2011; Sotola & Crede, 2021), but there is no study focussing on quiz engagement per se and its impact on learning. Finally, other learning activities have gained less research attention. For example, digital textbook usage (e.g., reading time and highlighting) can predict course outcomes and can be an early predictor of student failure or success (Junco & Clem, 2015).
However, there is a need for more evidence about how metrics of engagement derived from learning analytics (e.g., time spent reading, logins, mouse click counts, video-watching frequency, page turning) can be used to support teaching and learning (Caspari-Sadeghi, 2022; Rets et al., 2021).
In addition to the mixed evidence mentioned above, institutions, instructors, and students encounter practical challenges when implementing online blended learning approaches. The integration of online resources has a significant purchasing cost for institutions, as reflected in a recent increase in prices (Lowe et al., 2021). In addition, instructors must invest time to integrate them into their courses, and students must rely on other relevant resources (e.g., a good internet connection) to ensure access, raising challenges for embedding inclusivity when designing teaching and assessment. Furthermore, there is evidence of increased levels of stress in remote professional settings which rely on telecommunication, including education (Mheidly et al., 2020). Consequently, more research is needed to demonstrate the learning benefits when students engage with blended learning activities synchronously and asynchronously, and in turn, to identify the best practices for encouraging student engagement with them.
The present study aimed to measure levels of student behavioural engagement with different learning activities (incentivised and non-incentivised) and explore their association with academic achievement in a real-world online blended learning approach. To this end, data were collected from a second-year Cognitive Psychology module taught in an online blended format at Queen Mary University of London in Fall 2020. We captured levels of behavioural engagement in terms of how many online quizzes and experimental tasks students completed, how many asynchronous videos they watched, how many times they logged in to the online integrated system (i.e., Cengage's MindTap), and how many online synchronous sessions they attended live. We measured academic achievement via students' performance in the final exam. It was hypothesised that student behavioural engagement with the different activities is associated with final exam performance. The relative impact of each of the student engagement measurements was determined via correlational and multiple regression analysis.

Participants
The participants consisted of 123 second-year undergraduate Psychology students from Queen Mary University of London. All students were enrolled on the Cognitive Psychology module and there were no exclusion criteria. Ethical approval was obtained from the Queen Mary University of London Ethics Research Committee (QMERC20.058), and the data collected were considered minimal risk and part of normal educational practice. All students received information about the purposes of this study, and any student who did not want their data to be used could opt out. None did. Anonymity was ensured by assigning each student a random code (e.g., PT01).

Procedure, Module Description and Measures
The module was taught via a blended online learning format. It was designed online using Queen Mary's Virtual Learning Environment (i.e., Moodle) and integrated Cengage MindTap for Cognitive Psychology. Every week, the instructor outlined the weekly learning outcomes and all learning activities (asynchronous-synchronous) that students should complete to meet the learning outcomes. The activities were mapped onto online tasks (see Figure 1). Students were asked to attend the live synchronous session online according to their timetable (duration of 2 h). Before the live session, students would watch asynchronous videos. The videos were either pre-recorded by the instructor or YouTube videos providing a brief introduction to the weekly topic (e.g., if the weekly content was on decision making, the asynchronous video would introduce students to how people make decisions). The average duration of the asynchronous videos was 11 min and would not exceed 20 min each. Most importantly, and in line with the discussion on attention span and interactivity while watching online video content (Bradbury, 2016; Lagerstrom et al., 2015), each asynchronous video would include a reflective question, which students were encouraged to explore and discuss during the live synchronous session. In addition, students would complete an online cognitive experiment using Cengage MindTap CogLab, which was relevant to the weekly topic (e.g., if the weekly content was on decision making, students would complete the CogLab on risky decisions). CogLab tasks were available for completion at any time of the week, and they would be discussed during the live synchronous sessions. Moreover, reading associated with the weekly content was assigned, and students could access the e-book via their individual Cengage MindTap account. Finally, students would complete online retrieval quizzes with multiple choice questions (MCQs), created by the instructor to assess knowledge of the weekly content.
Quizzes would become available after the live synchronous session; students had one attempt and one week to complete them, and received feedback once the deadline elapsed. Participation in all tasks was regularly encouraged and highlighted as important for meeting the intended learning outcomes and facilitating cognitive effort when completing interdependent activities (e.g., students would need to complete CogLab experiments to be able to actively participate in the synchronous discussions). Only participation in the quizzes was incentivised via assessment weighting at low stakes (i.e., student performance on each weekly quiz contributed 0.56% to the overall module score).
Student behavioural engagement with the blended learning activities was captured throughout the semester. Measures included: (a) asynchronous video view rate (i.e., number of weekly videos watched before the live session, out of twelve in total), (b) live online synchronous session attendance rate (i.e., number of synchronous sessions attended at the scheduled time, out of eight in total), (c) online experiment completion rate (i.e., number of weekly CogLab experimental tasks completed, out of eight in total), (d) number of logins to online resources during term and revision time (i.e., how many times students logged in to the Cengage online platform to access the e-book and/or other resources), and (e) online quiz completion rate (i.e., how many weekly MCQ quizzes were completed, out of nine in total). For all behavioural engagement measures, higher numbers indicated increased levels of student engagement with the online blended learning activities.
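The derivation of these engagement measures can be sketched as follows. The activity counts below are hypothetical; only the denominators (twelve videos, eight sessions, eight CogLab tasks, nine quizzes) come from the module description, and logins are kept as a raw count because they have no fixed maximum.

```python
# Sketch: turning raw activity counts into the five engagement measures.
# Denominators follow the module description; the student record is invented.

TOTALS = {"videos": 12, "sessions": 8, "coglabs": 8, "quizzes": 9}

def engagement_rates(counts):
    """Convert raw completion counts into rates in [0, 1].

    `counts` maps each activity to the number completed; the login
    count is passed through unchanged (no fixed denominator).
    """
    rates = {activity: counts[activity] / total
             for activity, total in TOTALS.items()}
    rates["logins"] = counts["logins"]  # raw count, no denominator
    return rates

# Hypothetical student: watched 10/12 videos, attended 7/8 sessions,
# completed all CogLabs and quizzes, logged in 35 times.
student = {"videos": 10, "sessions": 7, "coglabs": 8, "quizzes": 9, "logins": 35}
print(engagement_rates(student))
```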
Academic achievement for this module was measured via student performance in the final exam (i.e., a grade from 0-100%). The final exam consisted of 50 MCQs, completed online at the students' preferred location and within a time window of three hours. All information was gathered from Learning Analytics Dashboards (i.e., Moodle; MindTap).

Design and Data Analysis
The current study used a cross-sectional design. Student engagement with online blended learning activities was operationalised through five measures: view rate of asynchronous videos watched before the live sessions, live attendance rate of the synchronous sessions, completion rate for MindTap CogLab activities, number of logins to MindTap during term-time and the revision period, and completion rate for online MCQ quizzes. The dependent variable was the final exam grade, and the relative contribution of each predictor was determined via correlational and multiple regression analysis. SPSS version 25 was used to analyse the data.

Results
Data from 123 participants were analysed using descriptive statistics to extract frequency information regarding student engagement and academic achievement. Mean scores for each of the engagement measures and the exam grade are presented in Table 1.
A multiple linear regression was carried out to investigate whether engagement with different online blended learning activities could significantly predict students' final exam scores. Multiple regression assumptions were checked and met. A sample of 116 participants was deemed sufficient for a multiple regression analysis with up to five predictors (Green, 1991). Four cases had a standardised residual greater than 2, but none greater than 3; as such, a minimal number of cases were excluded as outliers (Cook & Weisberg, 1982). Visual inspection of scatterplots of residuals against predicted values indicated that the assumptions of normality, linearity, and homoscedasticity were met. The Durbin-Watson test value was close to 2 (i.e., 2.18), showing that the assumption of independent errors was tenable. Finally, all VIF (Variance Inflation Factor) values for each predictor were well below 10 (Field, 2009).
The regression model was significant, F(5, 110) = 2.58, p = .030, and explained 10.5% of the variance in final exam scores (R²). The unique contribution of each predictor is shown in Table 2. Standardised regression coefficients indicated that the completion rate of the weekly online quizzes was significantly and positively related to the final exam grade and had the greatest influence of all the student engagement measures. More specifically, by completing all nine online quizzes, students could increase their final exam grade by 16 grade points.
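The 16-point figure can be read back into a per-quiz slope. The arithmetic below is illustrative (the exact unstandardised coefficient is reported in Table 2): a 16-point gain across nine quizzes implies roughly 16/9 ≈ 1.78 grade points per additional quiz completed.

```python
# Back-of-envelope reading of the reported effect: predicted exam-grade
# gain as a function of quizzes completed, assuming a linear slope
# implied by the reported 16-point gain for all nine quizzes.
TOTAL_QUIZZES = 9
GAIN_ALL_QUIZZES = 16.0
b_per_quiz = GAIN_ALL_QUIZZES / TOTAL_QUIZZES  # ~1.78 grade points per quiz

def predicted_gain(quizzes_completed):
    """Predicted exam-grade gain (in grade points) under the linear model."""
    return b_per_quiz * quizzes_completed

print(round(b_per_quiz, 2), predicted_gain(TOTAL_QUIZZES))
```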

Discussion
We explored whether student behavioural engagement with online blended learning activities predicts academic performance and, if so, to what degree. Findings demonstrated that the completion rate of online quizzes was the only indicator of behavioural engagement that predicted performance on the final exam. To our knowledge, this study is unique for two reasons. It assesses the impact of quiz completion rates, rather than quiz performance, on academic achievement, and it measures behavioural engagement with multiple learning activities, thus suggesting a comparative benefit of completing online quizzes in a real-world online blended learning format. The learning benefit of engaging with online quizzes is in line with, and extends, the existing literature (e.g., Van Camp & Baugh, 2014) on the positive association between quizzing and final course performance. Differing from previous studies, we did not focus on quiz performance and its association with final academic achievement. Rather, we captured quiz completion rate as a metric of behavioural engagement, and we demonstrated the predictive power of quiz completion rate as opposed to an association. As such, this study overcame shortcomings of previous correlational studies and helps rule out the alternative explanation that better-performing students simply tend to perform well in all assessments and make greater use of online learning resources. More specifically, we demonstrated that greater use of online quizzes predicts performance in the final exam, and that by completing all weekly quizzes one could increase one's final exam grade by 16 grade points and change grade classification (e.g., from a 2:1 to a First).
Why does completing online quizzes matter? In line with theoretical and empirical evidence for context-dependent memory (Godden & Baddeley, 1980; Grant et al., 1998), it is plausible that completing multiple practice quizzes as formative assessment helped students encode information in a context similar to the retrieval context of the final exam. Moreover, students completed the quizzes and the final exam in the exact same online format (e.g., MCQs presented in the same online virtual environment), thus facilitating performance via increased familiarity with the environmental context. In addition, the practice online quizzes in this study were spaced evenly across the weeks of the semester, highlighting the importance of the spacing effect in learning (Cepeda et al., 2006) and suggesting that spacing out retrieval practice contributes to academic performance. Finally, students would complete quizzes on all assessed content throughout the semester and, in turn, increase awareness of their understanding to guide their final revision efforts accordingly (Sotola & Crede, 2021).
One may argue that the effect of quizzes relies solely on their incentivised nature, via increasing the likelihood that students will engage with them. Indeed, findings are in line with the literature on the benefits of incentivised learning (Chevalier et al., 2018; Nevid et al., 2021; Sotola & Crede, 2021) and extend it by showing benefits in an online blended learning approach and by focussing on completion rates of incentivised tasks. However, we do not think that the finding is due solely to quizzes carrying assessment weighting as an incentive. Firstly, completion of all activities was constantly encouraged by highlighting their significance to the learning outcomes. We believe that this encouragement acted as an educational nudge based on goal setting, which can lead to positive outcomes (Damgaard & Nielsen, 2018). Indeed, based on supplementary analysis (see Supplemental material), the odds ratio of performing above average in the final exam after completing all, as opposed to some, asynchronous activities was above 1 for all metrics of engagement, further suggesting an association between participation in the activities and performance (Szumilas, 2010). Secondly, watching asynchronous videos did not carry assessment weighting, yet viewing rates positively correlated with final exam grades. Finally, we used the final exam grade rather than the overall module grade, so the actual grades from the quizzes (i.e., the incentive per se) did not contribute to our measurement of academic achievement. As such, we think that the predictive power of quiz completion rates on academic achievement is not contingent solely on their being incentivised. However, future research should compare the predictive power of quiz completion rates when incentivised and when not, and consider adding appropriate low-stakes incentives to other blended learning activities.

Contrary to our predictions, we found no evidence that the other metrics of behavioural engagement with online learning activities predicted final academic performance. We can speculate why in relation to the actions taken and the learning strategies developed during those learning activities. For example, it is surprising that attendance rate at the live synchronous sessions did not significantly predict performance in the final exam. However, Büchele (2021) suggested that it is not a matter of whether students attend the lecture, but how they attend (e.g., actively, by taking notes), that predicts academic achievement.
Such monitoring was impossible in the current study due to online synchronous teaching, however future research should include more nuanced aspects of behavioural engagement during synchronous sessions (e.g., questions asked by the students, contribution to group-discussions, note taking) and assess their benefit on academic performance. Furthermore, this discussion also relates to the importance of the use of cameras during online sessions for encouraging active learning (Castelli & Sarvary, 2021), which, however, was not part of the rationale of this study. Finally, future research should compare the impact of the synchronous sessions on learning outcomes when teaching is blended online only and in a hybrid mode (e.g., students attending the synchronous session online and in-person).
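The supplementary odds-ratio analysis mentioned above follows the standard 2×2 construction. The sketch below uses hypothetical cell counts (the study's actual counts are in the Supplemental material): an odds ratio above 1 indicates that students who completed all activities had higher odds of scoring above average than those who completed only some.

```python
# Sketch of the supplementary odds-ratio analysis (Szumilas, 2010):
# odds of an above-average exam score for "completed all" vs.
# "completed some" students. The 2x2 counts below are hypothetical.

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
                     above avg   not above avg
    completed all        a             b
    completed some       c             d
    """
    return (a * d) / (b * c)

# Hypothetical counts: 30/40 "completed all" students scored above
# average, vs. 35/76 "completed some" students.
or_value = odds_ratio(30, 10, 35, 41)
print(round(or_value, 2))  # OR > 1 favours the "completed all" group
```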
Findings have implications for course design and add to the timely discussion about using learning analytics to inform course design and improve teaching efficiency, student satisfaction and performance (Brown et al., 2020; Caspari-Sadeghi, 2022; Foster & Siddle, 2020). They provide insight into the design and provision of asynchronous online learning activities that can improve student exam performance (e.g., at least nine weekly quizzes per semester, spaced out, carrying low-stakes credit, in a format similar to that of the final exam). They also inform pedagogical practice in relation to using quiz engagement analytics as a pre-diagnostic tool for identifying students in need of further support. For example, generating a "no engagement with quiz" alert could be employed as a time-efficient and student-focused strategy for enhancing student engagement and, in turn, academic achievement.
Nevertheless, some caveats must be mentioned. Firstly, we do not know whether our results are solely relevant to academic performance measured via MCQs and to the specific course content. Perhaps behavioural engagement with online activities differentially predicts academic performance measured via essay writing and performance in other courses. More research assessing performance in distinct types of assessment and subjects is essential (e.g., whether engagement with synchronous sessions and discussions predicts performance in essay-based exams). In addition, even though the regression model was significant, there was variance in academic performance that remained unexplained. For example, future research should explore whether the relationship between students' behavioural engagement and academic performance is moderated by other factors relevant to students' cognitive styles, their familiarity with online learning resources and the perceived effectiveness and popularity of each online learning activity.
To conclude, the present study found that student behavioural engagement with online quizzes predicts student performance at a final exam in an online blended learning approach. With the significant rise in online and blended learning approaches, more research on the factors influencing the link between student engagement and academic achievement will help educators in the effective development and usage of online learning resources.

Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) received no financial support for the research, authorship, and/or publication of this article.

Supplemental Material
Supplemental material for this article is available online.
psychology of language and communication (including non-verbal communication), and she conducts experimental studies with implications in educational and therapeutic settings (e.g., the role of gestures in online, asynchronous learning; the link between gestures and empathy). She currently teaches Emotion, Cognitive Psychology, Business Psychology and Essential Skills for Psychologists. As an educator, she is dedicated to teaching excellence and is particularly interested in enhancing student experience and employability. She is involved in a project evaluating the effectiveness of a school-based, peer-to-peer mentoring intervention which aims to enhance adolescents' well-being and young adults' transferable skills. She has disseminated the outcomes of her research projects at conferences and via publications (e.g., Journal of Experimental Psychology, Learning, Memory, and Cognition) and has presented her scholarship activities at HEA Conferences, with a focus on improving assessment and feedback, and diversifying the curriculum. Dr Argyriou is a member of the Experimental Psychology Society and a Fellow of the Higher Education Academy.
Miss Kenza Benamar is currently an undergraduate student in Psychology at Queen Mary University of London. She has sought out opportunities to gain research experience by assisting with two projects. One project investigates the relationship between academic achievement and student engagement when psychology is taught online/in a blended format. Her interest stemmed from wanting to better understand how students can best be helped with their academic achievement, especially those struggling from early on. The second project explores the role of gesturing/non-verbal communication when academic content is delivered online and asynchronously. This project is still in progress but looks very promising. Although her interests vary, she plans to continue gaining as much experience as possible while contributing to research within multiple fields in psychology.
Miss Milena Nikolajeva is a Queen Mary University of London Psychology alumna and currently a Research Assistant at the Youth Resilience Unit at Queen Mary University of London. During her undergraduate studies, she investigated the impact of COVID-19 on mental wellbeing dependent on sociodemographic status. Additionally, she assisted with research exploring emotion recognition in forced and voluntary migrants. Specifically, the project looked at how emotion recognition and associated cognitive biases are affected by exposure to trauma and change over time, and how these changes correspond with changes in well-being and mental health constructs. Presently, she is part of a team investigating the relationship between student engagement and academic achievement when teaching psychology in an online blended format. The team is also exploring the role of non-verbal communication (particularly the importance of gestures) when delivering learning content asynchronously in an online setting. Finally, she is currently assisting with a systematic literature review to collate the current evidence base on trauma-informed approaches in psychologically distressed adult populations. While she strives to solidify her research interests, she aims to obtain a variety of research experience in different settings to reach her aspirations of pursuing a research career.