Andy Ray is Graduate Admissions Director at Ohio University. He earned a B.S. in Audio Engineering and B.A. in Spanish from Middle Tennessee State University. He then served as a Rural Community Development Volunteer in the Peace Corps in southern Costa Rica, before earning an M.A. in Spanish and Portuguese, and later a Ph.D. in Spanish, from the University of Tennessee.
The onset of the COVID-19 pandemic disrupted traditional methods of assessing English proficiency in higher education and beyond. In-person testing was cancelled, forcing universities to reconsider how to evaluate international students for both admission and teaching assistantship eligibility.
At Ohio University, this challenge was heightened by the recent retirement of the locally administered Speaking Proficiency English Assessment Kit (SPEAK) test, a long-standing oral proficiency measure for international teaching assistants (ITAs).
Our university had already begun accepting the Duolingo English Test (DET) for undergraduate admissions in March 2020, but graduate faculty were hesitant to follow suit. They wanted more peer-reviewed evidence and assurance of the DET’s security and scoring rigor before extending it to graduate contexts. The DET’s growing empirical research base and its developers’ willingness to share data played an important role in helping us take the first step.
From crisis response to structured adoption
Rather than committing fully to the DET for graduate admissions right away, we implemented it in a phased rollout, first using it only for ITA eligibility, our most rigorous standard. During the pilot, we relied on the DET’s integrated subscores, especially Conversation, to judge oral proficiency.
As the test evolved, we transitioned to the more traditional individual subscores (Speaking, Writing, Reading, and Listening), which aligned more closely with TOEFL and IELTS categories. Over two semesters, faculty gained confidence in the DET, and Ohio University’s Graduate Council codified it as an official tool for ITA certification.
By the time we proposed extending the DET to admissions and funding decisions, adoption was straightforward. Faculty had become familiar with the test, internal score data were available, and external benchmarks supported equivalencies with IELTS and TOEFL.
The results speak for themselves
Since adopting the DET for ITA certification two years ago, our data have been encouraging. Between Spring 2023 and Spring 2025, ITAs assessed with the DET achieved average overall scores of 128.6 (SD = 9.6), with Speaking subscores averaging 130.4. ITAs admitted with the DET came from diverse backgrounds, including Nepal, Iran, China, Ghana, and Argentina, and maintained strong academic records (average GPA: 3.62, SD = 0.38).
While our sample is small (12 students), the results reinforce the DET’s suitability as both a predictive measure of graduate student success and an effective tool for ensuring the required level of English proficiency for ITA positions.
What we learned from adopting and scaling the DET
Our process highlighted three important takeaways for other institutions:
- Dialogue is essential. We benefited from ongoing conversations with colleagues in the English department, the Ohio Program of Intensive English, and the DET researchers. This collaboration built trust in the test’s validity and broadened our understanding of proficiency assessment.
- Incremental change is more sustainable. Adopting the DET in stages, beginning with ITA eligibility, allowed faculty to adjust gradually and provided a safeguard before broader adoption. Starting small also made scaling easier: the pilot showed how the test works in practice, reassured faculty, and paved the way for full adoption across admissions and funding decisions.
- Consistency in scoring matters. Initially, we faced the question of whether to rely on the DET’s integrated subscores or its individual categories. The Graduate College ultimately recommended consistency with other major tests, using the newer subscore framework. This decision improved comparability and transparency across admissions and assistantship policies.
Research confirms DET aligns with real classroom performance
Ohio University’s experience is consistent with emerging external research. A recent multi-institutional study (publication forthcoming) led by Dr. Okim Kang, professor of applied linguistics at Northern Arizona University, examined the DET’s predictive validity for ITAs across four large U.S. universities with established ITA programs.
The study directly compared DET and locally administered SPEAK test scores with ITAs’ performance in in-class teaching presentations. At the time of the study, the SPEAK test remained the standard tool for ITA screening at these universities, providing a meaningful benchmark for comparison.
The researchers found that DET Speaking scores correlated strongly with classroom performance, much more strongly than SPEAK scores did. Perhaps most importantly, the study confirmed that DET scores aligned not only with general oral proficiency but also with key dimensions of teaching effectiveness, such as clarity of speech and language use. This provides empirical backing for what we have observed at Ohio University: the DET can serve as an efficient and valid proficiency measure for both admissions and ITA programs.
Of course, it’s important to remember that standardized proficiency tests cannot fully capture the complexity of teaching success. Institutional context, faculty expectations, and pedagogical training remain powerful determinants of ITA outcomes. For this reason, Ohio University continues to view the DET as one component of a holistic assessment system, complemented by ongoing support and professional development.
Key insights from Ohio University’s experience with the DET
The move to adopt the DET at Ohio University was not a single decision but the product of careful piloting, faculty engagement, and validation against internal and external evidence. Our journey reflects both the challenges and the opportunities of updating English proficiency assessments, for our ITA program as well as for graduate admissions.
We started with incremental adoption to build faculty trust, and the deliberate pace ultimately worked to our advantage. It gave us time to collect evidence, address questions, and build consensus across departments.
Looking back, I am glad we took the measured approach we did, but I also recognize that institutions considering the DET today have the benefit of stronger research and practical examples, including ours. The DET is now a regular part of our admissions and ITA assessment process, valued for its practicality and supported by faculty confidence.