Using AI In Classrooms Is A Test Of Public Trust
Teachers will soon be able to use artificial intelligence (AI) to create lesson plans and mark homework. A £4m project, announced by the UK government ahead of the 2024 school year, includes giving AI companies access to school material so they can create educational content.
In support of educators, the Secretary of State for Science, Innovation, and Technology, the Rt Hon Dr Peter Kyle MP, said: “This is the first of many projects that will transform how we see and use public sector data. By making AI work for them [teachers], this project aims to ease admin burdens and help them deliver creative and inspiring lessons every day while reducing the time pressures they face.”
However, ministers were also keen to reassure parents and teachers that any use of AI would be safe.
“The new project will bring teachers and tech companies together to develop and use trustworthy AI tools to help mark homework and save teachers time,” the government said in a statement.
Balancing Innovation With Public Trust In AI
As part of its announcement, the government published research showing a degree of nervousness around AI, especially among parents. While it should come as no surprise that schools were “most trusted to make decisions about the use of pupils’ work and data,” the same cannot be said for tech companies with a possible interest in building AI platforms and tools.
“Trust in tech companies was extremely limited and there was little to no support for them to be granted control over AI and pupil work and data use,” the government said. This touches on an issue relevant to this project and to every AI deployment: trust.
Whatever benefits AI-driven applications deliver, they must be balanced against public perceptions around fairness and privacy. For many people, the algorithms behind AI can appear opaque. So, when AI systems fail, whether through error, bias, or misuse, it’s little wonder the public becomes sceptical.
Transparency & Accountability Must Be On The Curriculum
Trust is a cornerstone of any successful AI implementation. And with AI, accountability has to be built deep into the system through a robust observability strategy, an approach that allows IT professionals to monitor and understand a system based on the data it generates.
As the name suggests, observability provides in-depth visibility into an IT system. It’s an essential resource for overseeing sprawling toolsets and complex public sector workloads, and it is vital for helping ensure AI operations function correctly and ethically. It can also play a crucial role in regulatory compliance by providing detailed data points for auditing and reporting.
Not only does observability enhance the operational aspects of AI systems, but it also plays a pivotal role in building public trust by helping ensure these systems are transparent and aligned with user needs.
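To make that concrete, here is a minimal sketch of what observability for an AI marking tool might look like in practice: every automated decision emits a structured, machine-readable audit record. The function name, model identifier, and scoring logic below are illustrative assumptions, not part of any announced system.

```python
import json
import logging
import time
import uuid

logger = logging.getLogger("ai_marking_audit")

def mark_homework(submission_id: str, answer: str) -> dict:
    """Score a submission and emit an auditable, structured log record."""
    start = time.perf_counter()
    # Placeholder scoring logic standing in for a real model call.
    score = min(100, len(answer.split()) * 10)
    record = {
        "event": "homework_marked",
        "trace_id": str(uuid.uuid4()),          # correlates logs end to end
        "submission_id": submission_id,
        "model_version": "demo-0.1",            # assumed identifier
        "score": score,
        "latency_ms": round((time.perf_counter() - start) * 1000, 2),
    }
    # One JSON line per decision gives auditors a machine-readable trail.
    logger.info(json.dumps(record))
    return record

result = mark_homework("sub-001", "Photosynthesis converts light into energy")
```

A record like this, captured for every marked submission, is the raw material that auditing, compliance reporting, and bias investigations all depend on.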
Data Privacy Is A Top Concern
Then, there are the issues most likely to gain public scrutiny. As the government’s research for the schools AI project showed, protecting data privacy is often a top concern. Proper protocols include embedding robust data encryption, stringent access controls, and comprehensive vulnerability assessments as standard. These steps help ensure that sensitive information is safeguarded and that systems are protected against external attacks and internal leaks.
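As a minimal sketch, the "stringent access controls" mentioned above often come down to a deny-by-default permission check: no role can do anything it has not been explicitly granted. The role and permission names below are illustrative assumptions, not a specification of any real school system.

```python
# Deny-by-default role-based access control for pupil records (sketch).
PERMISSIONS = {
    "teacher":   {"read_own_class", "write_feedback"},
    "admin":     {"read_own_class", "read_all", "export_data"},
    "ai_vendor": set(),  # vendors get no direct access by default
}

def can(role: str, action: str) -> bool:
    """Return True only if the role explicitly grants the action."""
    return action in PERMISSIONS.get(role, set())

# A teacher may write feedback; a vendor may not read pupil data.
teacher_ok = can("teacher", "write_feedback")
vendor_blocked = not can("ai_vendor", "read_all")
```

The design choice worth noting is the empty set for vendors: access for tech companies is something to be granted deliberately and narrowly, mirroring the public sentiment in the government’s research.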
Any AI project also needs to consider the end user. There needs to be a gradual build-up of trust in the tools rather than a jarring change, which can immediately put users on the defensive.
Beyond the pace of adoption, regular engagement with stakeholders and other interest groups is essential for understanding public expectations and concerns about AI. In this regard, open communication and educating both the public and agency personnel about AI’s capabilities and limitations can help quell any concerns about the technology and promote a public conversation.
That’s why I’m watching this education project in the UK with such interest. It’s an examination of the public sector’s approach to one of the most precious things: the education of our children. One thing is clear: stakeholders, including teachers, parents, and students, will be watching to see whether those behind this AI project pass with flying colours or need to go back to the drawing board.
Rob Johnson is VP and Global Head of Solutions Engineering at SolarWinds