From Prime Minister Boris Johnson’s pledge to fund a £250m AI lab for the NHS, to the Department for Education’s recently launched ‘Artificial Intelligence (AI) horizon scanning group’, you could be forgiven for thinking that AI is being lauded as a panacea to some of the most pressing issues society faces.
As an EdTech professional very much engaged in exploring AI in personalised learning technologies, perhaps I should welcome such movements wholeheartedly. But, while any such focus on investigating how to improve learning is positive, my worry is that such expert groups sometimes miss the most important facet of their remit – interrogating empirical evidence.
Anyone who takes an interest in EdTech will no doubt have read many a comment piece either extolling the virtues of AI in education or questioning its impact. These opinions are passionately and persuasively argued, drawing on personal experience and insight (not least from suppliers of AI-powered EdTech solutions such as myself!). However, two important elements seem to be missing from the debate as it stands.
The first concerns EdTech evangelists’ seeming willingness to put all their eggs in the AI basket. In much of the commentary I see, AI is positioned as the answer to any number of classroom challenges, while similarly innovative alternative technologies are side-lined in favour of AI’s buzzword-based pulling power.
The second missing link in this debate is the impersonal – that is to say, empirical, unbiased evidence based on rigorous research rather than personal passion.
It’s that lack of evidence that is most concerning. As Nick Gibb, Minister of State at the Department for Education, said in a response to a written parliamentary question, “AI is a complex, emerging area… However, the impact of these technologies in the classroom still remains largely unevidenced.”
My worry is that any group of experts, such as the Department for Education’s new AI horizon scanning group, will inevitably have very human opinions and personal passions that may overshadow the current lack of robust research and empirical evidence.
Beneath this letter is a collection of the research we have undertaken over the past few years. Whilst this – alongside collaboration with schools and our team’s experience – informs the decisions we make at Sparx, there is still much work to be done. Our Education Research Team is committed to an ongoing programme of research, and we will share our findings with the education and EdTech sectors in the hope that an evidence-based design culture will soon underpin all improvements within education.
Compared with national norms, Sparx students made 67% more progress in Year 7 and a further 63% more progress in Year 8.
Compared with doing no homework, students who completed 15 minutes of Sparx homework made 83% more progress, with a further 23% more progress for every additional 15 minutes of Sparx homework thereafter.
For any randomised control trial, getting the trial design right is critical as it determines the reliability of the evidence that will be obtained.
“I have the confidence to increase my part-time teaching hours to (almost) full time. This is directly due to Sparx and the enormous time-saving nature of this fantastic product.”
Schools that have adopted Sparx Maths Homework see an increase in homework completion rates to over 90%.
Although Sparx Maths is delivered online, we encourage students to write their workings and answers down, and we reinforce this through regular bookwork checks. The two external papers below justify why we place so much emphasis on handwritten working:
How handwritten input helps students learning algebra equation solving (March 2008).
The pen is mightier than the keyboard: advantages of longhand over laptop note taking (April 2014).
Teachers using Sparx Maths save an average of five hours per week (based on a teacher with 10 classes per week).
Maths confidence is strongly associated with student progress and attainment, as well as maths enjoyment and perceived importance of maths.
St James is a mixed secondary school in Exeter with over 650 students. The school is committed to raising standards so that its students can move on to exciting post-16 activities and, ultimately, lead successful lives.
Sparx is the crucial combination of what is practical, what is proven amongst educators, what is proven within educational research, and our own research where there isn’t yet consensus within the education community.
Multi-Academy Trust reports provide insights into student progress in maths compared with other academies in the group.
End-of-term reports provide an ‘at a glance’ overview and insights into class-wide progress in the preceding term.
We need to encourage an evidence-based design culture to ensure impact-centric EdTech.