Measuring Learning and Promoting Academic Integrity in Online Instruction

in Kinesiology Review
  • 1 Department of Health & Human Performance, Texas State University, San Marcos, TX, USA
  • 2 Department of Kinesiology, Pennsylvania State University, University Park, PA, USA

The COVID-19 pandemic shifted kinesiology courses into more hybrid and online delivery, creating new challenges and opportunities for evaluating learning and online testing. Research using the Biomechanics Concept Inventory indicates that both high-tech and low-tech active learning experiences implemented in hybrid and online formats in biomechanics courses improve student learning above levels for lecture alone. However, online pre- and posttesting using concept inventories or major exams is vulnerable to cheating. Experience and research on proctoring online testing indicate only partial success in detecting cheating absent substantial faculty commitment to investigate suspicious behavior. These difficulties with online testing provide an opportunity for kinesiology faculty to implement more authentic, holistic assessments that are less vulnerable to violations of academic integrity. The importance of well-designed, rigorous assessment methods that uphold academic integrity standards will continue to evolve as kinesiology departments expand online learning.

Public health concerns from the COVID-19 pandemic drove a majority of university instruction from traditional, face-to-face delivery to hybrid and online delivery in early 2020. This likely slowed the spread of the disease and sparked a reexamination of university instruction; however, it may have also adversely affected faculty–student interactions and learning. Some students believe that online classes are inferior to in-person classes (Means & Neisler, 2020) and have gone so far as to sue universities to lower their tuition (Justin & Oxner, 2020; Oxner, 2021). What do we know about student learning in university classes, and what is known about the integrity of online testing when assessing student performance and learning?

In this article, we summarize research on these questions in hybrid and online settings, drawing predominantly on investigations in biomechanics and kinesiology hybrid and online courses that were adapted in response to the COVID-19 pandemic. We first review aggregate learning of core concepts in introductory biomechanics classes across various modes of delivery. We then outline the results of systematic efforts at a large northeastern university to monitor and reduce academic dishonesty in online testing.

Learning Introductory Biomechanics In Person and Online

Most kinesiology professionals have taken a motor learning course and are familiar with rigorous definitions of learning. Thus, many kinesiology faculty avoid the common error of confusing performance measures, such as course grades or student perceptions, with the relatively permanent change in knowledge, skills, or values that constitutes learning. Two decades of research on the learning of biomechanics concepts from the national guidelines for the introductory biomechanics course in kinesiology (Abraham et al., 2018) provide some perspective on measuring learning in this core course.

Most Scholarship of Teaching and Learning (SoTL) research in biomechanics has been based on the Biomechanics Concept Inventory (BCI). The BCI is a 24-question test of four prerequisite and eight biomechanics course competencies from the guidelines, with national normative data (Knudson et al., 2003). The BCI was based on common physics SoTL tests (e.g., Force Concept Inventory, Mechanics Baseline Test), and three versions of the measure have been implemented using paper and electronic formats in numerous studies of in-person, hybrid, and online biomechanics courses (Knudson, 2004, 2006; Knudson et al., 2003).

Many of these studies use an unbiased measure of learning called the normalized gain score, g = (post − pre)/(max score − pre), proposed by Hake (1998). Research with the BCI also indicates the unbiased nature of g as a measure of learning biomechanics. For example, BCI scores, learning (g), and final grades were drawn for three randomly selected classes (n = 98) taught by the same professor, out of the student participants (N = 226) in six classes examined in two studies (Knudson, 2020; Knudson & Wallace, 2021). Knowledge of key biomechanics concepts at the pretest was not associated with any other measure, indicating a lack of bias in the BCI, the learning measure (g), and the final grade (see Table 1). Student performance measured with the final grade in the course was weakly associated with the BCI posttest and learning; however, the shared variance (r² = 6%–8%) of these associations has little practical meaning. This confirms that the performance measure of the final grade is different from the BCI-based learning measure of g.
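As a worked illustration of the gain calculation, the following short Python sketch computes g for a hypothetical student (the scores are invented for illustration, not data from the studies cited above):

```python
def normalized_gain(pre: float, post: float, max_score: float) -> float:
    """Hake (1998) normalized gain: the fraction of the possible
    pretest-to-posttest improvement that was actually realized."""
    if not 0 <= pre < max_score:
        raise ValueError("pretest must be between 0 and the maximum score")
    return (post - pre) / (max_score - pre)

# Hypothetical student on a 24-question inventory such as the BCI:
# 10 correct at pretest, 16 correct at posttest.
g = normalized_gain(pre=10, post=16, max_score=24)
print(f"g = {g:.2f}")  # g = 0.43: 43% of the possible improvement was achieved
```

Because g scales improvement by the room a student has to improve, it is bounded above by 1 and is undefined for a student who is already at the maximum at pretest.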

Table 1

Correlations Between Biomechanics Concept Inventory Scores, Learning, and Final Grade in Introductory Biomechanics

              Posttest   Learning (g)   Final grade
Pretest       0.175      −0.198         0.183
Posttest                 0.917*         0.288*
Learning (g)                            0.238*

*p < .05.

National data using the BCI indicate that students have difficulty mastering Newtonian biomechanics concepts (g ≈ 0.1–0.2), a finding that is consistent with student learning in physics (Hake, 1998). Because biomechanics relies heavily on Newtonian mechanics from physics, many physics education research discoveries are likely relevant to learning biomechanics concepts. For example, physics education research shows that course and instructor variables are weakly associated with learning basic Newtonian mechanics concepts (Halloun & Hestenes, 1985; McDermott, 1991), a finding confirmed (r² < 2.3%) in research using the BCI (Knudson et al., 2009). Student characteristics (i.e., grade point average, student interest, perceived application) and behaviors are more strongly associated (r² = 14%–40%) with normalized gain (Hsieh & Knudson, 2008; Hsieh et al., 2012). Biomechanics is a tough subject for most kinesiology majors, and many dislike it even after the course. Consequently, we need to find ways to engage students and motivate them to take responsibility for the hard work of learning. Studies of biomechanics classes, however, report that students attribute more responsibility for learning to the instructor than to themselves (Knudson & Wallace, 2021; Wallace & Knudson, 2020).

Much of the SoTL research on improving learning in numerous disciplines examines the use of active learning experiences. Active learning is interactive, problem-based instruction that promotes student involvement in relevant activities and thinking about what they are doing (Bonwell & Eison, 1991; Driessen et al., 2020). Active learning significantly improves learning of biomechanics concepts (g = 0.3–0.4) over the levels reported for traditional lecture (g between 0.1 and 0.2; Knudson, 2019, 2020; Knudson & Wallace, 2021; Wallace & Knudson, 2020; Wallace et al., 2020). This observation is consistent with extensive research in physics and in numerous other disciplines, including biology, chemistry, computer science, education, engineering, geology, math, and psychology (Beichner et al., 2007; Freeman et al., 2014; Hake, 1998; Prince, 2004; Soneral & Wyse, 2017; Springer et al., 1999).

Researchers have begun to examine whether active learning instruction can be implemented in hybrid biomechanics classes and labs. Wallace and Knudson (2020) reported similar learning of biomechanics concepts using low-tech active learning in face-to-face and hybrid course formats. Low-tech active learning instruction relies on lower cost movable chairs/tables, whiteboards, and multiple monitors, rather than instructor-controlled, networked computer stations and smart boards/screens. Several innovations in bioengineering/biomechanics activities and labs were tried in the online and hybrid classes that resulted from the COVID-19 pandemic, and anecdotal accounts suggest positive student perceptions and potential utility when these were implemented during emergency remote instruction (Gerstenhaber & Har-El, 2021; Giles & Willerth, 2021; Lee et al., 2021; Ochia, 2021; Ramo et al., 2021; Troutman & Grimm, 2020). Similar positive responses to online active learning experiences have been reported by biology students (McGreevy & Church, 2020). Other research shows that students who voluntarily participated in optional synchronous online physics lessons learned 30% more than students who did not participate (Guo, 2020).

Knudson (2020) reported a study of the effect of low-tech active learning experiences on learning in two biomechanics classes interrupted by emergency online instruction due to COVID-19. Additional posttest questions allowed for the study of student perceptions of the emergency online instruction, along with the planned study of learning and student engagement. Most students (59%) preferred face-to-face over online instruction in biomechanics. About a quarter of the students reported resistance to group-based active learning; however, the low-tech active learning in the face-to-face portion (two-thirds) of the course still improved the mean mastery of biomechanics concepts (g = 0.44) above levels previously reported for lecture, as noted above (Knudson, 2019, 2020; Knudson & Wallace, 2021). The author recommended interpreting this learning as likely greater than from lecture alone, but cautioned that a lack of proctoring of the BCI posttest could have influenced the finding.

This evidence from biomechanics and physics SoTL indicates that both high-tech and low-tech active learning experiences implemented in hybrid and online formats in biomechanics courses improve student learning above levels for lecture alone. Despite student concerns about online course quality (Means & Neisler, 2020), these aggregate improvements in student learning above passive lecture instruction alone can be expected with quality courses and student motivation. Concept inventories, however, cannot by themselves account for all contextual factors that may influence scores. For example, there are numerous ways to cheat (Hearne Moore et al., 2017), and the potential for cheating is noteworthy when major exams are presented online with limited proctoring (Alessio et al., 2017; Goedl & Malla, 2020; Vazquez et al., 2021). The rest of this article addresses the monitoring of the academic integrity of students at a large research university.

Monitoring and Mitigating Academic Dishonesty Online

The pivot to remote instruction presented many challenges for faculty. In-person classes allow for many types of traditional assessment, from practical skills tests to proctored paper-and-pencil exams; however, the rapid change in delivery format created assessment dilemmas for many instructors. The technical challenges of unexpected remote delivery had to be balanced against maintaining rigor and course standards, compounded by some faculty's limited skills in building alternative assessment strategies. In the spring of 2020, these factors coincided to create a context where academic dishonesty could thrive.

Academic integrity is often operationally defined at universities in relation to student conduct, tied to acts such as plagiarism and cheating (Macfarlane et al., 2014). For many years, faculty could detect academic integrity violations through behaviors that were easy to observe in person, such as a student looking at another's exam, similarities between a paper submission and other printed material, or a student using crib notes during a quiz. Faculty were familiar with many of these strategies and could put measures in place to mitigate the risk. Though the growth of online instruction in kinesiology programs may have made some faculty more familiar with online assessment and related anticheating measures (Mahar et al., 2014), many departments had faculty who lacked the technical and pedagogical skills, or the financial resources, to adapt assessments rapidly enough to mitigate cheating.

The most commonly identified reasons for cheating in the literature include peer pressure, lack of preparedness, performance anxiety, lack of knowledge about appropriate behavior, limited understanding of the consequences, inability to manage the multiple demands of student life, feeling anonymous in a classroom setting, and situations that encourage academic integrity violations (Whitley & Keith-Spiegel, 2002). Other factors could include cheating because the assessment is high-stakes, the course content is particularly challenging, or students are high achievers (Anderman & Won, 2019; Ottaway et al., 2017). The mode of delivery is also related to cheating rates, though less consistently so. Faculty are more likely to perceive that cheating is happening in online learning situations compared with in-person learning (Jaschik & Lederman, 2019). In the spring of 2020, many of these factors were in place for kinesiology students across the country, with students suddenly removed from their campuses, learning resources, and peer social support networks. This challenged educators substantially with respect to the assessment of student learning and promotion of academic integrity.

Experience With Academic Integrity Violations During the Pandemic

The Spring 2020 mid-semester pivot to remote learning at most universities across the United States meant that assessment strategies that were well defined in syllabi had to be adapted for remote delivery. Faculty may have spent years honing quizzes and exams to achieve their preferred level of specificity and rigor, only to have to modify these assessments quickly to be given online rather than in the classroom. The following sections summarize the experience of kinesiology faculty at a large research university in monitoring potential violations of academic integrity and implementing prevention strategies.

A Case of Unauthorized Collaboration

One large, lecture-format kinesiology class that had planned only in-person, high-stakes exams for assessment experienced significant cheating after the pivot to remote learning. With the move online, the faculty member again emphasized academic integrity in their communication and included an “honor code” statement with the instructions for the online exams. Despite this appeal to honorable behavior, considerable cheating occurred on these online major exams.

The cheating was detected by the faculty member, who noticed in a visual scan of the digital gradebook that several groups of students had submitted similar or identical responses on exams after the pivot to remote learning. The instructor opted to use analyses outlined in previous research on exact errors in common, that is, the degree of similarity of students’ incorrect answers on multiple-choice exams (Harpp & Hogan, 1993; Nath & Lovaglia, 2009; Nelson, 2006). This led to the discovery of several pairs and small groups with very high similarity on the online exams, but not on the previous in-person exams. These pairs and small groups comprised almost 25% of the total class. The students were confronted with this information and charged with academic integrity violations. They were offered the opportunity to confess and accept the charges or to participate in the university’s process of academic integrity “trials.” The overwhelming majority of students confessed to unauthorized collaboration, messaging each other for answers during the exam.
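The core idea behind such an exact-errors-in-common screen can be sketched in a few lines of Python (the answer data and flagging threshold below are hypothetical; the published indices in Harpp and Hogan, 1993, and Nelson, 2006, rest on more formal statistical criteria):

```python
from itertools import combinations

# Hypothetical 8-question multiple-choice exam.
answer_key = ["A", "C", "B", "D", "A", "B", "C", "D"]
responses = {
    "S1": ["A", "C", "D", "D", "B", "B", "C", "A"],
    "S2": ["A", "C", "D", "D", "B", "B", "C", "A"],
    "S3": ["B", "C", "B", "A", "A", "B", "D", "D"],
}

def exact_errors_in_common(a, b, key):
    """Count questions both students missed with the SAME wrong option,
    and questions both students simply got wrong."""
    same_wrong = sum(x == y != k for x, y, k in zip(a, b, key))
    both_wrong = sum(x != k and y != k for x, y, k in zip(a, b, key))
    return same_wrong, both_wrong

for (s1, r1), (s2, r2) in combinations(responses.items(), 2):
    same, both = exact_errors_in_common(r1, r2, answer_key)
    # Flag pairs whose shared errors are almost all identical (arbitrary cutoff).
    if both > 0 and same / both > 0.75:
        print(f"Review {s1} & {s2}: {same}/{both} identical wrong answers")
```

A flagged pair is only a starting point for investigation; as in the case above, comparing the same pair’s similarity on earlier proctored, in-person exams helps separate collusion from coincidence.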

Prevention Strategies

This experience led to heightened concern about academic integrity as online instruction was designed for the remainder of 2020. These semesters had the benefit of planning for completely online (summer) or some degree of hybrid (fall) instruction. A wider range of university tools was in place to allow instructors to make choices that met their pedagogical needs and preferences for the prevention of cheating. These methods focused predominantly on building faculty skills and awareness of features and prevention strategies, and included more technology training for faculty (increased use of learning management system features, creating awareness of possible problems, etc.), templates for syllabus language and course contracts addressing academic integrity/technology issues, and instruction for monitoring via videoconferencing (e.g., monitoring exam taking via Zoom™). Student-focused initiatives were also implemented. For example, the availability of university-prepared academic integrity training programs helped to increase student accountability and awareness regarding the definition and scope of academic dishonesty in the evolving learning climate. The need for different methods of assessment was omnipresent as the academic year began, and faculty were faced with many choices for monitoring their exams. Table 2 outlines the different types of proctoring that were used in coursework and includes some of the lessons learned from their use. The department primarily used Examity for online proctoring, although there are other services such as Examsoft, Honorlock, and ProctorU. None offered an easy solution, nor were any highly effective at preventing cheating. Faculty varied in their skill at using these tools effectively and adapting them to their assessments.

Table 2

Characteristics of and Experience With Online Proctoring

Live proctoring by the instructor
  • Description: Takes place in real time during the exam with an instructor and/or teaching assistants.
  • Pros: Corrections can be made in real time; allows for interaction between the proctor and student; no additional cost.
  • Cons: Must take place at a scheduled time; technology challenges; internet access issues.
  • Pandemic lessons learned at Penn State: Difficult to manage large class sizes; multiple proctors were needed to keep fewer students onscreen; correcting students for bad behavior during the exam was disruptive to other students; students used screen savers to fool proctors.

Live proctoring via online service
  • Description: Takes place in real time during the exam with a trained, human proctor.
  • Pros: Corrections can be made in real time; allows for interaction between the proctor and student.
  • Cons: Often expensive; must take place at a scheduled time; technology challenges; internet access issues.
  • Pandemic lessons learned at Penn State: Students who lived in rural areas or lacked strong enough Wi-Fi access had difficulties using their webcam and the LMS; scheduling times were an issue for students in different time zones.

Recorded proctoring via online service
  • Description: Involves the video recording of a student taking an online exam; the proctor reviews the video at a later time.
  • Pros: More flexibility with scheduling; less expensive than live proctoring; a record exists for disputes.
  • Cons: Often expensive; no live interaction to correct behavior; technology challenges; internet access issues.
  • Pandemic lessons learned at Penn State: Instructors still needed to review footage of the exam for each student, which was very time-consuming; poor video quality of the recording made detection of problems difficult; video captured inappropriate background footage.

Automated proctoring via online service
  • Description: The student is videoed taking an exam, and a program is used to detect suspicious behavior.
  • Pros: More flexibility with scheduling; less expensive than live proctoring; a record exists for disputes.
  • Cons: Possible “false positives”; “flags” still need to be reviewed; technology challenges; internet access issues.
  • Pandemic lessons learned at Penn State: Instructors still need to review flagged footage; the number of flagged issues was significant; flagged issues were not clear violations; students reported a high degree of anxiety and fear from the exam environment.

Unproctored
  • Description: Students complete assessments with no proctoring, though other controls may be in place in the LMS (e.g., lockdowns, logs, question banks).
  • Pros: Provides flexibility for students who may have technology/internet issues; less time-consuming to oversee; no additional costs; assessment can be designed as “open-note.”
  • Cons: Greater potential for cheating through unauthorized aid and collaboration.

Note. Descriptions and categories adapted from Hussein et al. (2020). LMS = learning management system.

One example of how the assessment and proctoring options overlapped during the academic year was a large lecture class that was fully remote for the 2020–2021 academic year, with different instructors in the fall and spring. The fall instructor adapted their assessments to be more applied and administered them as unproctored and open note. The questions were more complex, often requiring several steps to complete, which made cheating through unauthorized collaboration quite difficult within the confines of the exam time. Of course, this modification required a fair amount of thoughtfulness on the part of the instructor in creating the questions and a greater amount of time to grade, which are important considerations with a large class. This instructor reported grades similar to previous semesters and did not find any instances of academic integrity violations. The spring instructor opted for the same types of assessments used in previous semesters with no proctoring beyond what the learning management system (Canvas™; Instructure, Salt Lake City, UT) offered (e.g., limited time, shuffled answers, quiz logs). This instructor experienced multiple instances of cheating and reported poorer scores on assessments than in previous years. Using features within the learning management system, predominantly quiz logs showing whether a student had left the exam page (see the sketch below), the instructor was able to determine that unauthorized aid was being used in some cases. Other cases in this spring course were brought forward by students, most notably group text messages showing that students were collaborating during the exam. In these cases, with the minimal prevention methods used (limited time, shuffled answers, and quiz logs) and no additional forms of proctoring, the instructor would not have been able to tell that this unauthorized collaboration was occurring. The exact-errors-in-common analyses described above would not apply to the kinds of questions on these exams. The fall-term approach appears to show how designing the assessment strategy to fit online delivery resulted in better academic integrity and learning outcomes, though these conclusions are limited by our ability to detect all forms of cheating.
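A minimal sketch of this kind of log screening follows. The export format, column names, and event labels here are assumptions for illustration, not the actual Canvas quiz-log schema:

```python
import csv
from collections import defaultdict

def suspicious_absences(log_path: str, min_seconds: float = 30.0) -> dict:
    """Flag students with long stretches away from the exam page, based on
    a hypothetical CSV export with columns student_id, timestamp (seconds
    into the exam), and event ("page_left" or "page_returned")."""
    left_at = {}                 # student -> time they left the page
    flagged = defaultdict(list)  # student -> list of long absence durations
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            sid, t = row["student_id"], float(row["timestamp"])
            if row["event"] == "page_left":
                left_at[sid] = t
            elif row["event"] == "page_returned" and sid in left_at:
                away = t - left_at.pop(sid)
                if away >= min_seconds:
                    flagged[sid].append(away)
    return dict(flagged)

# Usage: absences = suspicious_absences("quiz_log.csv")
```

As with automated proctoring flags, long absences are circumstantial evidence at best and warrant human review rather than automatic accusations.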

Future Instructional Trends and Academic Integrity Promotion

Before the COVID-19 pandemic, discussion of the future of higher education was predominantly concerned with changing demographics that could impact recruitment and retention (Grawe, 2018), with online education a continuing topic of interest. The pandemic brought many online education issues to the forefront, making content delivery, assessment, and learning outcomes immediate concerns. This abrupt and challenging perturbation to the delivery of instruction has provided an opportunity to thoughtfully plan for the future of both in-person and virtual instruction in kinesiology.

Strategies for the prevention of cheating and the promotion of academic integrity in online settings have been attempted with varying success. One intervention designed to educate health science students and improve their attitudes toward academic integrity reported improvements in attitudes (Azulay Chertok et al., 2014), while a similar study of undergraduates observed no change in attitudes but less cheating behavior (Stephens et al., 2021). Both interventions used educational methods to give students a better understanding of the different facets of academic integrity, university-level policies, and the consequences of academic integrity violations. The interventions were well designed and included a postassessment of student understanding. A prepandemic review of educational, preventive strategies used at institutions of higher education in Australia and New Zealand found varying levels of effectiveness; however, large-scale institutional support for the strategies, in the form of sanctions for nonparticipation, appeared to improve compliance with completing the training (Sefcik et al., 2020). With rapidly evolving technology, there is an increased need for research on effective preventive educational models. While there was little time to train students before the instructional transition in the spring of 2020, these types of student training offer better potential outcomes than strategies limited to instructor comments and integrity language in syllabi. Perhaps introductory kinesiology courses could include increased content on ethical issues specific to our field, along with academic integrity training, regardless of the mode of course delivery (in person or online).

Perhaps the most effective strategy for the prevention of cheating lies with the faculty member designing assessment strategies that work well in whatever setting they are implemented. Kinesiology departments can provide training for faculty on what puts students at greater risk of engaging in cheating behavior and help them build skills in creating assessment tools that best gauge student learning outcomes in a variety of classroom settings (labs, large lectures, online, etc.). Training should cover which situations create more opportunity for cheating, as well as a variety of assessment techniques. This may also help to alleviate some faculty concerns regarding cheating and online learning (Jaschik & Lederman, 2019). As technology and the capabilities of learning platforms evolve rapidly, it may be worthwhile to consider annual “continuing education credits” to prepare faculty for current issues in instruction and academic integrity. Creating more authentic assessments of disciplinary and applied knowledge might also minimize the opportunity to use electronic resources to cheat in online testing.

Another lesson from the pandemic that is relevant to kinesiology departments is the use of different proctoring options for online classes. While these proctoring services are not completely effective and require faculty attention, access to a growing variety of proctoring methods will offer instructors flexibility in course delivery moving forward (Hussein et al., 2020). A comparison of performance levels between different proctoring scenarios revealed that live, in-person proctors had a greater effect on reducing bias in online test scores than web-based proctors or no proctors (Vazquez et al., 2021). In line with this, better outcomes (less time used and lower scores, on par with in-person tests) were found when online proctoring was used in an undergraduate health science course (Alessio et al., 2017). Unproctored online exams may not automatically result in excessive cheating when assessments are well designed, though previous research has shown higher scores on unproctored online exams than on proctored tests (Alessio et al., 2017; Goedl & Malla, 2020; Vazquez et al., 2021).

Various proctoring techniques, combined with data-driven techniques to examine digital outputs as described above and elsewhere (Cleophas et al., 2021), can help faculty feel more secure in their assessments. However, to date, little research has been conducted on newer forms of automated proctoring. Faculty needing to give traditional tests online can also use several strategies to minimize the impact of cheating (Cluskey et al., 2011; Norris, 2019). Kinesiology departments can work with faculty to provide information on different strategies to proctor assessments and how to adapt their assessments to these different forms of monitoring.

Recommendations

A large body of evidence from many disciplines supports the use of active learning in improving student learning. Active learning experiences implemented in hybrid and online formats in biomechanics courses significantly improve student learning above levels for lecture alone. Pre- and posttest concept inventories are the standard measure of student learning; however, tests of learning or performance presented online clearly cannot measure all aspects of student learning and are more vulnerable to cheating than in-person testing. Experience and research on proctoring online testing indicate only partial success in detecting and preventing cheating absent substantial faculty commitment to investigate suspicious behavior. These difficulties with online testing provide an opportunity for kinesiology faculty to implement more authentic, holistic assessments that are less vulnerable to violations of academic integrity. Faculty wanting to rely on traditional exams or concept inventories can implement restrictive testing procedures and proctoring services to minimize cheating. Well-designed, rigorous assessment methods that uphold academic integrity standards are essential to kinesiology instruction and will continue to evolve as kinesiology departments expand online learning.

References

  • Abraham, L., Bird, M., Johnson, J., Knudson, D., Russell, P., Smith, D., & Strohmeyer, S. (2018). Guidelines for undergraduate biomechanics (4th ed.) [Guidance document]. SHAPE America. https://www.shapeamerica.org/uploads/pdfs/2018/guidelines/Guidelines-for-UG-Biomechanics.pdf
  • Alessio, H.M., Malay, N., Maurer, K., Bailer, J.A., & Rubin, B. (2017). Examining the effect of proctoring on online test scores. Online Learning, 21(1), 146–161. https://doi.org/10.24059/olj.v21i1.885
  • Anderman, E.M., & Won, S. (2019). Academic cheating in disliked classes. Ethics & Behavior, 29(1), 1–22. https://doi.org/10.1080/10508422.2017.1373648
  • Azulay Chertok, I.R., Barnes, E.R., & Gilleland, D. (2014). Academic integrity in the online learning environment for health sciences students. Nurse Education Today, 34(10), 1324–1329. https://doi.org/10.1016/j.nedt.2013.06.002
  • Beichner, R.J., Saul, J.M., Abbott, D.S., Morse, J.J., Deardorff, D.L., Allain, R.J., Bonham, S.W., Dancy, M.H., & Risley, J.S. (2007). The student-centered activities for large enrollment programs (SCALE-UP) project. Reviews in Physics Education Research, 1, ar3. https://www.compadre.org/per/per_reviews/volume1.cfm?#Cite
  • Bonwell, C.C., & Eison, J.A. (1991). Active learning: Creating excitement in the classroom. ASHE-ERIC higher education reports. Association for the Study of Higher Education. https://eric.ed.gov/?id=ED336049
  • Cleophas, C., Hoennige, C., Meisel, F., & Meyer, P. (2021). Who’s cheating? Mining patterns of collusion from text and events in online exams. SSRN. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3824821
  • Cluskey, G.R., Ehlen, C.R., & Raiborn, M.H. (2011). Thwarting online exam cheating without proctor supervision. Journal of Academic and Business Ethics, 4, Ar10. https://www.aabri.com/manuscripts/11775.pdf
  • Driessen, E.P., Knight, J.K., Smith, M.K., & Ballen, C.J. (2020). Demystifying the meaning of active learning in postsecondary biology education. CBE—Life Sciences Education, 19, ar52. https://doi.org/10.1187/cbe.20-04-0068
  • Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H., & Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
  • Gerstenhaber, J.A., & Har-El, Y-E. (2021). Virtual biomaterials lab during COVID-19 pandemic. Biomedical Engineering Education, 1, 353–358. https://doi.org/10.1007/s43683-020-00045-6
  • Giles, J.W., & Willerth, S.M. (2021). Strategies for delivering biomedical engineering elective during the COVID-19 pandemic. Biomedical Engineering Education, 1(1), 115–120. https://doi.org/10.1007/s43683-020-00023-y
  • Goedl, P.A., & Malla, G.B. (2020). A study of grade equivalency between proctored and unproctored exams in distance education. American Journal of Distance Education, 34(4), 280–289. https://doi.org/10.1080/08923647.2020.179637
  • Grawe, N.D. (2018). Demographics and the demand for higher education. Johns Hopkins University Press.
  • Guo, S. (2020). Synchronous versus asynchronous online teaching of physics during the COVID-19 pandemic. Physics Education, 55(6), Article 065007. https://doi.org/10.1088/1361-6552/aba1c5
  • Hake, R.R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics. American Journal of Physics, 66, 64–74. https://doi.org/10.1119/1.18809
  • Halloun, I.A., & Hestenes, D. (1985). The initial knowledge state of college physics students. American Journal of Physics, 53(11), 1043–1055. https://doi.org/10.1119/1.14030
  • Harpp, D.N., & Hogan, J.J. (1993). Crime in the classroom: Detection and prevention of cheating on multiple-choice exams. Journal of Chemical Education, 70(4), 306. https://doi.org/10.1021/ed070p306
  • Hearne Moore, P., Head, J.D., & Griffin, R.B. (2017). Impeding students’ efforts to cheat in online classes. Journal of Learning in Higher Education, 13(1), 9–23.
  • Hsieh, C., & Knudson, D. (2008). Student factors related to learning in biomechanics. Sports Biomechanics, 7(3), 398–402. https://doi.org/10.1080/14763140802233207
  • Hsieh, C., Smith, J., Bohne, M., & Knudson, D. (2012). Factors related to students’ learning of biomechanics concepts. Journal of College Science Teaching, 41(4), 83–89.
  • Hussein, M.J., Yusuf, J., Deb, A.S., Fong, L., & Naidu, S. (2020). An evaluation of online proctoring tools. Open Praxis, 12(4), 509–525. https://doi.org/10.3316/informit.620366163696963
  • Jaschik, S., & Lederman, D. (2019). 2019 survey of faculty attitudes on technology. Inside Higher Education.
  • Justin, R., & Oxner, R. (2020). Texas universities are moving more classes online but keeping the tuition the same. Students are asking if it’s worth the money. Texas Tribune. https://texastribune.org/2020/07/06/texas-universities-coronavirus-online-classes
  • Knudson, D. (2004). Biomechanics concept inventory: Version two. In M. Lamontagne, D.G.E. Robertson, & H. Sveistrup (Eds.), Proceedings of XXIInd International Symposium on Biomechanics in Sports (pp. 378–380). University of Ottawa. https://ojs.ub.uni-konstanz.de/cpa/article/view/1326
  • Knudson, D. (2006). Biomechanics concept inventory. Perceptual and Motor Skills, 103(1), 81–82. https://doi.org/10.2466/PMS.103.1.81-82
  • Knudson, D. (2019). Do low-tech active learning exercises influence biomechanics student’s epistemology of learning? Sports Biomechanics. Advance online publication. https://doi.org/10.1080/14763141.2019.1682650
  • Knudson, D. (2020). A tale of two instructional experiences: Student engagement in active learning and emergency remote learning of biomechanics. Sports Biomechanics. Advance online publication. https://doi.org/10.1080/14763141.2020.1810306
  • Knudson, D., Bauer, J., & Bahamonde, R. (2009). Correlates of student learning in introductory biomechanics. Perceptual and Motor Skills, 108, 499–504. https://doi.org/10.2466/PMS.108.2.499-504
  • Knudson, D., Noffal, G., Bauer, J., McGinnis, P., Bird, M., Chow, J., Bahamonde, R., Blackwell, J., Strohmeyer, S., & Abendroth-Smith, J. (2003). Development and evaluation of a biomechanics concept inventory. Sports Biomechanics, 2, 267–277. https://doi.org/10.1080/14763140308522823
  • Knudson, D., & Wallace, B. (2021). Student perceptions of low-tech active learning and mastery of introductory biomechanics concepts. Sports Biomechanics, 20(4), 458–468. https://doi.org/10.1080/14763141.2019.1570322
  • Lee, C.-H., Liu, Y., Moore, M., Ge, X., & Siddique, Z. (2021). Enhancement of stay-at-home learning for the biomechanics laboratory course during COVID-19 pandemic. Biomedical Engineering Education, 1(1), 149–154. https://doi.org/10.1007/s43683-020-00025-w
  • Macfarlane, B., Zhang, J., & Pun, A. (2014). Academic integrity: A review of the literature. Studies in Higher Education, 39(2), 339–358. https://doi.org/10.1080/03075079.2012.709495
  • Mahar, M.T., Hall, T.R., Delp, M.D., & Morrow, J.R. (2014). The state of online education in kinesiology in the United States. Kinesiology Review, 3(4), 177. https://doi.org/10.1123/kr.2014-0068
  • McDermott, L.C. (1991). Millikan lecture 1990: What we teach and what is learned—Closing the gap. American Journal of Physics, 59(4), 301–315. https://doi.org/10.1119/1.16539
  • McGreevy, K.M., & Church, F.C. (2020). Active learning: Subtypes, intra-exam comparison, and student survey in an undergraduate biology course. Education Sciences, 10(7), 185. https://doi.org/10.3390/educsci10070185
  • Means, B., & Neisler, J. (2020). Suddenly online: A national survey of undergraduates during the COVID-19 pandemic. Digital Promise. https://digitalpromise.dspacedirect.org/handle/20.500.12265/98
  • Nath, L., & Lovaglia, M. (2009). Cheating on multiple choice exams: Monitoring, assessment, and an optional assignment. College Teaching, 57(1), 3–8. https://doi.org/10.3200/CTCH.57.1.3-8
  • Nelson, L.R. (2006). Using selected indices to monitor cheating on multiple-choice exams. Journal of Educational Research and Measurement, 4(1), 1–18. https://so05.tci-thaijo.org/index.php/RMCS/article/view/46802/38785
  • Norris, M. (2019). University online cheating—How to mitigate the damage. Research in Higher Education Journal, 37, Ar11. https://www.aabri.com/manuscripts/193052.pdf
  • Ochia, R. (2021). A hybrid teaching method for undergraduate biomechanics lab. Biomedical Engineering Education, 1, 187–193. https://doi.org/10.1007/s43683-020-00033-w
  • Ottaway, K., Murrant, C., & Ritchie, K. (2017). Cheating after the test: Who does it and how often? Advances in Physiology Education, 41(3), 368–374. https://doi.org/10.1152/advan.00103.2016
  • Oxner, R. (2021). Student files class-action lawsuit against UT-Austin, saying online education last year wasn’t worth the tuition. Texas Tribune. https://texastribune.org/2021/03/02/ut-austin-lawsuit-pandemic-tuition/
  • Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93, 223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x
  • Ramo, N.L., Lin, M.A., Hald, E.S., & Huang-Saad, A. (2021). Synchronous vs. asynchronous vs. blended remote delivery of introduction to biomechanics course. Biomedical Engineering Education, 1(1), 61–66. https://doi.org/10.1007/s43683-020-00009-w
  • Sefcik, L., Striepe, M., & Yorke, J. (2020). Mapping the landscape of academic integrity education programs: What approaches are effective? Assessment & Evaluation in Higher Education, 45(1), 30–43. https://doi.org/10.1080/02602938.2019.1604942
  • Soneral, P.A.G., & Wyse, S.A. (2017). A SCALE-UP mock-up: Comparison of learning gains in high- and low-tech active-learning environments. CBE—Life Sciences Education, 16, ar12. https://doi.org/10.1187/cbe.16-07-0228
  • Springer, L., Stanne, M.E., & Donovan, S.S. (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis. Review of Educational Research, 69(1), 21–51. https://doi.org/10.3102/00346543069001021
  • Stephens, J.M., Watson, P.W.S., Alansari, M., Lee, G.C., & Turnbull, S.M. (2021). Can online academic integrity instruction affect university students’ perceptions of and engagement in academic dishonesty? Results from a natural experiment in New Zealand. Frontiers in Psychology, 12, Article 569133. https://doi.org/10.3389/fpsyg.2021.569133
  • Troutman, V.A., & Grimm, M.J. (2020). Interactive digital experience as an alternative laboratory (IDEAL): Creative investigation of forensic biomechanics. Journal of Applied Biomechanics, 37(2), 163–170. https://doi.org/10.1123/jab.2020-0171
  • Vazquez, J.J., Chiang, E.P., & Sarmiento-Barbieri, I. (2021). Can we stay one step ahead of cheaters? A field experiment in proctoring online open book exams. Journal of Behavioral and Experimental Economics, 90, 101653. https://doi.org/10.1016/j.socec.2020.101653
  • Wallace, B., Gheidi, N., & Knudson, D. (2020). Incorporating problem-based learning with direct instruction improves learning in undergraduate biomechanics. Journal of Hospitality, Leisure, Sport & Tourism Education, 27, Article 100258. https://doi.org/10.1016/j.jhlste.2020.100258
  • Wallace, B., & Knudson, D. (2020). The effect of course format on student learning in introductory biomechanics courses that utilise low-tech active learning exercises. Sports Biomechanics. Advance online publication. https://doi.org/10.1080/14763141.2020.1830163
  • Whitley, B.E., Jr., & Keith-Spiegel, P. (2002). Academic dishonesty: An educator’s guide. Lawrence Erlbaum Associates Publishers.

Knudson (dk19@txstate.edu) is corresponding author, https://orcid.org/0000-0003-0809-7970
