Why this prize-winning CMU team says experimentation will improve education

By Atiya Irvin-Mitchell

PITTSBURGH — Every year, the XPRIZE Digital Learning Challenge calls for teams from universities nationwide to use their skills to improve existing learning tools or find more effective ones. And in the 2023 edition that concluded last month, the team judged to have done that best had members from Pittsburgh’s own Carnegie Mellon University.

The team’s Adaptive Experimentation Accelerator was developed to allow educators to conduct experiments in the classroom to gauge which teaching methods work best for students. The tool took home first place and a $1 million prize.

The team included members from North Carolina State University and the University of Toronto. But from CMU’s side of things, members included Human-Computer Interaction Institute associate professor John Stamper and Open Learning Initiative Director Norman Bier, with support from Steven Moore, Raphael Gachuhi, Tanvi Domadia and Gene Hastings. They told Technical.ly the 20-member team built the tool over the course of two years.

This wasn’t the first time Bier and Stamper participated in the competition. According to Bier, winning builds on the university’s history of investing in educational technology work.

“This provides a demonstrable hypothesis-driven path towards development and iterative improvement and I think it really distinguishes the way that CMU does learning science and ed-tech from a lot of other institutions,” he said.

Stamper added that the Adaptive Experimentation Accelerator was built in part with tools that existed because of participation in a previous challenge, in which participants built software for a handheld device that could be deployed in Africa to teach math, English and Swahili. By using those tools, Stamper said, the team was able to ultimately succeed in the competition’s most recent challenge.

The CMU pros noted that an important feature of the new tool is that it allows teachers to set it to automatically find and default to better methods of teaching. For example, if a certain number of students aren’t responding well to a particular message, they’ll be presented with an alternative message.

Part of why tools like this matter, Bier said, is that in the world of education, there’s not often a lot of room for troubleshooting which methods of teaching work best for students. He appreciated that the Open Learning Initiative, which seeks to improve learning regardless of the format in which a person is learning, provided tools that were used to develop the one the team used to win.

“In order to really make progress in learning sciences, we have to be able to run these and replicate them at scale, and it has to be accessible,” Bier said. Experimentation “was crucial to the success of this project.”

Now that the 2023 competition has wrapped up, the Adaptive Experimentation Accelerator will be used in coursework at CMU and by its partners. Both researchers hope this will improve learning for students and teachers alike across the country.

“The tools that were there, that we’re building out now, really opened the doors for every educator anywhere to participate in this kind of experimentation and join in helping us to better understand and improve human learning,” Stamper said.

Atiya Irvin-Mitchell is a 2022-2023 corps member for Report for America, an initiative of The GroundTruth Project that pairs young journalists with local newsrooms. She wrote this piece for Technical.ly Pittsburgh, a publishing partner of the Pennsylvania Capital-Star, where it first appeared.

Originally published at www.penncapital-star.com, by Special to the Capital-Star
