Making the Most of AI

By Stuart Eaves, Assistant Principal, King Edward VI College, Stourbridge

What is this resource and who is it for?

This resource provides an introductory overview of artificial intelligence (AI) in the education sector, specifically tailored for teachers, support staff, managers, and other key stakeholders in the sixth form college setting. It aims to demystify AI and illustrate its practical relevance and potential applications, alongside the risks and ethical considerations that need to be managed.

Whether you are directly involved in teaching, supporting educational programmes, or making strategic decisions, this guide will equip you with foundational knowledge and insights into how AI can enhance educational practices and outcomes.


Why should we pay attention?

AI is revolutionising the way we deliver, assess, and personalise educational content. AI-driven tools can adapt learning experiences to individual needs, providing real-time feedback and support that can dramatically enhance student engagement and understanding. Moreover, as the workplace becomes increasingly reliant on AI technologies, integrating AI into education not only prepares students academically but also equips them with the digital competencies essential for adult life and future careers. Predicting the impact of any technology always comes with a level of uncertainty; however, there is a strong consensus amongst experts that AI will significantly change the way we learn and work within the next 5 to 15 years. We have a responsibility to be at the forefront of this development and must move quickly to keep pace with the rapid, exponential development of this technology. Young people increasingly anticipate the integration of AI tools in their education, and future employers expect new talent to be ‘AI literate’. We serve as the crucial bridge that facilitates this transition.


Risks and threats vs opportunities and benefits

Opportunities and Benefits:

  • Individualised learning and support
  • Promotion of independent learning
  • More accessibility for all
  • Significant workload reduction for staff, allowing more time to focus on impactful work
  • Considerable SEN (Special Educational Needs) support
  • More effective and efficient data analysis so we know our students better and can spot any gaps more easily
  • Promotion of a flipped learning model, enabling more time in the classroom for high-level discussions and thinking

Risks and Threats:

  • The spreading of misinformation
  • Plagiarism/cheating issues
  • Potential bias
  • An over-reliance on the technology, which in turn could create a level of laziness and reduction in critical thinking
  • The world of employment changing faster than we can manoeuvre, leaving our careers advice and curricula outdated and not fit for purpose
  • Awarding bodies not adapting assessment methods quickly enough
  • Data security and GDPR breaches

Moral and ethical considerations

College-wide

  • Needs assessment and stakeholder involvement: Consider who is consulted when implementing new AI initiatives. Stakeholders should include students, teachers, all staff, parents, governing bodies, unions, etc.
  • Ethical and privacy considerations: Any AI procurement or development must have a strong focus on ethical considerations, data privacy, and GDPR compliance.
  • Vendor evaluation: Carry out due diligence on the companies and vendors behind the AI tools we use.
  • Training and support: Comprehensive AI training and upskilling should be delivered to staff regularly.

Staff

  • Human oversight: All AI output must be monitored and validated by staff. AI complements what we do and does not replace essential human interactions.
  • Misinformation: AI can get things wrong. Fact-check AI output to avoid spreading misinformation.
  • Bias: There is evidence of bias in AI technology. A model can only learn from the data it has been fed, so it can compound the biases in that data.
  • Transparency: Be transparent with other staff and students about where and how AI has been used.
  • GDPR: Data collection and privacy are important. Some AI models protect your data (e.g., Bing Enterprise), whereas others don't (e.g., ChatGPT).
  • Plagiarism: Teaching staff should keep assessment methods up to date and ensure checks for plagiarism/cheating.

Students

  • Source checking: Research the AI tools you use, checking for bias and accuracy, just as you would for any other source.
  • Copying: Do not copy/paraphrase AI-generated content, in part or whole.
  • Plagiarism: Do not pass off AI-generated analysis and evaluation as your own work.
  • Referencing: Reference the use of any AI, just as you would any other source.
  • Data safety: Be safe with personal data, complying with GDPR guidance.
  • Responsibility: Do not use AI to spread misinformation.