This blog is from one of COBIS’ Supporting Associates.
Written by iSAMS
Since the release of ChatGPT, more of us are looking for ways to reap the benefits of generative AI tools, but there are concerns over what the future looks like as AI evolves.
These concerns are particularly prevalent in the education sector. Safeguarding, data security, and the impact on learning are all top of mind for school leaders and educators.
So, what stance should your school be taking? And how do you prepare for something completely new?
What does AI look like in today’s classrooms?
Teachers and students alike are continually finding new ways to bring generative AI into the classroom. For example, many teachers are using AI to craft lesson plans, teaching resources, and new learning experiences, while learners can use tools like ChatGPT to create revision materials and mock exam questions. Some of the most popular tools currently in use include:
· Canva Classroom Magic: This visual AI tool for educators includes features like Magic Design, a free AI design tool for creating visual content, and Magic Write, an AI text generator that helps produce written content.
· Curipod: Generate an interactive slide deck on a topic in seconds, including polls, word clouds, open-ended questions for discussion, and more.
· Education Copilot: This tool provides AI-generated templates for lesson plans, writing prompts, educational handouts, student reports, project outlines, and more.
Challenges presented by AI in schools
Inside the classroom, one of the primary challenges on the minds of teachers and school leaders is the impact of AI technology on the learning experience and on students' ability to learn. Are students really learning if their work is being produced by AI tools?
Outside the classroom, the introduction of artificial intelligence also presents challenges for IT leaders and Data Managers, raising many questions about data security and safeguarding. It's vital to understand how AI technologies use school data and what vulnerabilities they could introduce. Examples of known risks include:
· Inappropriate use or sharing of personal data with the potential for surveillance and individual profiling
· Tracking students' online activities and sensitive information
· Predicting future behaviours or outcomes in ways that undermine students' privacy rights
· School data being exploited in cyber-attacks that access sensitive information or disrupt educational activities
How schools can mitigate these risks:
· Make sure that all AI tool users in your school understand what data should and should not be entered into generative AI tools (a simple illustration of keeping personal data out of prompts follows this list)
· If tools are being integrated via APIs, thoroughly check settings to find out what information is accessible
· Follow your data security procedures and best practices as a priority: ensure data is encrypted, keep servers up to date, and continually monitor systems for suspicious changes
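For the first point in particular, it can help to show staff what keeping personal data out of prompts looks like in practice. The snippet below is a minimal, illustrative Python sketch, not an iSAMS feature or any specific vendor's tool: the patterns and the redact_personal_data function are hypothetical examples, and the rules shown (email addresses and UK mobile numbers) are far from exhaustive.

```python
import re

# Illustrative patterns only; real personal data takes many more forms.
REDACTION_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "uk_mobile": re.compile(r"\b07\d{3}\s?\d{3}\s?\d{3}\b"),
}

def redact_personal_data(text: str) -> str:
    """Replace anything matching the patterns above with a labelled placeholder."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text

if __name__ == "__main__":
    prompt = (
        "Write a report comment for Jamie. "
        "Parent contact: jamie.parent@example.com, 07123 456 789."
    )
    # Only the sanitised prompt, not the original, would be sent to the AI tool.
    print(redact_personal_data(prompt))
```

In practice, a school would pair an approach like this with staff training and its existing data protection procedures rather than relying on pattern matching alone.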
Ultimately, these challenges all culminate in the same question for most schools: how do we build AI into our policies, and what should those policies look like?
To learn more about how the iSAMS MIS supports school Data Managers in their daily roles, you can download your free guide here: https://hubs.li/Q02s5BJL0
Regulating Artificial Intelligence
While there are currently no specific regulations on how to implement AI in schools, EdTech and education sector experts have offered advice on how best to approach it in your policymaking.
In the UK, you can find the Government's current position on the use of AI in schools here. This advice focuses on how schools can use AI effectively while remaining aware of its limitations. The document makes the point that generative AI is not a substitute for knowledge: writing good prompts and sense-checking the results are important skills that still need to be taught.
However, regarding data collection and protection, the advice states that schools and colleges should:
· Always protect personal data in accordance with relevant legislation
· Not allow or cause intellectual property to be used to train these models
· Review and strengthen their cybersecurity by referring to cyber standards
· Ensure that students are not accessing or creating harmful content online by better preparing them for the online world
But what stance are schools actually taking in practice?
Experts state the best approach is to focus on your core principles and ethos when considering the use of AI in your school, ensuring your strategy has your school’s best interests at heart.
Preparing for advancing AI in schools
While no one can say for certain what schools will look like in the next five or ten years, your existing core skills are key to tackling the coming challenges.
A great way to prepare for the future of technologies in schools is to stay abreast of emerging trends. Examples of key EdTech and AI resources include:
· AI Explained: Regular updates on what’s happening in the world of AI
· The EdTech Podcast: A podcast that discusses all things new in the world of EdTech including AI
· Dedicated tech publications like MIT Technology Review
· Press releases from governments and policymakers: the EU AI Act, proposed by the European Commission, is a recent example
It's also important to lean on your existing skills and school teams. Teachers will be a great source of feedback on how both they and their students are already using AI, while IT teams can help identify data security concerns and advise on the most secure options.
Communicate and consult with others in the sector, and be adaptable but firm about your boundaries around new technologies.
To learn more about iSAMS, download your free guide for School Data Managers, including information about the future of AI in schools and the rise of Power BI technology, here: https://hubs.li/Q02s5BJL0