Universities betting on EdTech? New data says faculty make or break the impact

In an era of increasing pressure on universities to improve student retention and graduation rates, a recent study examines the true value of educational technology interventions. The paper, “Using Technology to Support Success: Assessing Value Using Strategic Academic Research and Development”, published in Education Sciences in May 2025, provides a detailed evaluation of a collaborative studying software tool deployed across multiple campuses of a large Midwestern university.
The research, rooted in a Strategic Academic Research and Development (SARD) framework, triangulates qualitative and quantitative data to assess whether educational technology investments can significantly enhance student outcomes. The findings reflect the complexity of technology adoption in higher education and underscore that success is contingent on thoughtful integration, instructor engagement, and systemic alignment.
How was the technology implemented and assessed?
The study focuses on a two-year pilot program (2021–2023) that introduced a peer-based collaborative studying platform across five regional campuses and one urban research campus. The platform aimed to lower DFW rates (grades of D, F, or course withdrawal) by encouraging low-stress, peer-driven academic engagement.
Students could use the platform to share notes, organize study groups, and communicate with tutors and peers, but not directly with instructors. The tool also featured gamification elements, offering points redeemable for gift cards. However, usage remained modest, with only 25% of students adopting the software during the pilot's first year.
In response, the university implemented a more structured rollout in Fall 2022. Instructors were directed to embed the tool in syllabi, course announcements, and learning management systems. A targeted campaign, modeled on prior success in an anatomy course at the urban campus, sought to replicate earlier gains in student outcomes.
The SARD framework guided the evaluation, combining statistical analysis of course grades and pass rates with qualitative feedback from instructors and students. Fidelity of implementation (how consistently instructors promoted and integrated the tool) was also assessed to understand variation in results.
Did the technology improve student outcomes?
Quantitative results presented a nuanced picture. Across 22 matched courses, overall pass rates rose from 61% in Fall 2021 to 70% in Fall 2022, a statistically significant increase. Average GPA also improved from 1.85 to 2.19. However, when analyzed across different levels of software use (low, medium, high), gains were distributed nearly evenly, suggesting that other factors may have influenced outcomes.
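As a quick illustration of how a pass-rate change like this can be checked for significance, the sketch below runs a two-proportion z-test with Python's statsmodels. The enrollment counts are assumptions for illustration only; the article reports the percentages (61% and 70%) but not the underlying sample sizes.

```python
from statsmodels.stats.proportion import proportions_ztest

# Assumed counts: the article gives pass rates (61% -> 70%) but not
# enrollment, so these numbers are illustrative only.
passes = [610, 700]      # students passing in Fall 2021 vs. Fall 2022
enrolled = [1000, 1000]  # total enrollment in each term

# Two-proportion z-test: is the difference in pass rates significant?
z_stat, p_value = proportions_ztest(count=passes, nobs=enrolled)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```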
A mixed-effects model revealed that instructor fidelity had a significant impact on student grades, while student-level usage did not show a statistically significant effect. Notably, a significant interaction was found between fidelity and student usage, suggesting that the benefits of the tool depended largely on how actively instructors encouraged its use.
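The article does not reproduce the model itself, but an analysis of this shape (fidelity and usage as fixed effects, an interaction term, and grouping by course) is typically fit as a mixed-effects regression. The sketch below shows one plausible specification using Python's statsmodels; the file pilot_grades.csv and the column names grade, fidelity, usage, and course_id are hypothetical stand-ins, not the study's actual variables.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per student-course record, with an
# instructor-fidelity score, a student-usage measure, and a grade.
df = pd.read_csv("pilot_grades.csv")

# Fixed effects for fidelity, usage, and their interaction
# (fidelity * usage expands to fidelity + usage + fidelity:usage),
# with a random intercept for each course.
model = smf.mixedlm("grade ~ fidelity * usage", data=df,
                    groups=df["course_id"])
result = model.fit()

# The interaction coefficient captures the reported pattern: student
# usage mattering more when instructor fidelity was high.
print(result.summary())
```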
Drilling down into discipline-specific outcomes, some anatomy courses experienced dramatic improvements. For instance, one campus saw a 34% increase in pass rates compared to pre-intervention years. However, similar courses at other campuses saw flat or negative changes, indicating that contextual factors, such as pedagogy, student demographics, and departmental culture, played crucial roles.
What did faculty and students report about the tool?
Instructors provided mixed feedback on changes in student engagement. Nearly half reported improvements, citing increased peer collaboration, attendance, and classroom participation. Others noted no change or cited pandemic-related disengagement. Interestingly, online instructors observed stronger community-building effects than their in-person counterparts did.
About 59% of instructors noted positive shifts in student performance. Some cited jumps in exam scores or final grades. Others pointed to active supplemental instructors or peer mentors as catalysts. However, instructors also reported frustration due to their inability to view the platform from a student’s perspective, limiting their ability to provide hands-on guidance.
Student surveys confirmed uneven adoption. While 75% tried the platform, only 52% continued using it. Reasons for abandonment included difficulty navigating the tool, a preference for direct instructor support, and reliance on other resources. However, those who used the tool regularly praised its collaboration features, shared resources, and peer feedback mechanisms.
Students also reported a strong preference for solo studying, with 73% favoring individual learning. Most turned to classmates or online searches when facing academic difficulty, while only 16% used campus tutoring services. This finding highlights the potential of peer-based tools to complement, but not replace, existing study habits.
The authors emphasize the need for institutions to plan rigorous, multi-method assessments from the outset of any technology rollout. They recommend involving faculty, students, and support staff in both implementation and evaluation phases, and caution against relying solely on vendor-provided analytics. The SARD framework proved effective in capturing the complex ecosystem within which such technologies operate.