Governance and AI


By Mark Monahan, assistant principal - BHASVIC

Introduction

We are experiencing an explosion in technological capability; it is safe to say that we will look back on this time as a technological revolution. As we all rush to jump on board, we need to take a moment to step back and plan our route carefully.

Artificial Intelligence offers immense benefits but poses significant challenges. Governance is crucial to ensure these technologies are used responsibly and ethically, both centrally from the DfE and from our own governing bodies in our colleges and trusts. Good AI governance considers the needs of teachers, leaders, support staff, students and parents - whether that governance is about mitigating the risks or maximising the benefits of those tools.


The Current Situation

The EU has already started carving out legislation to regulate the use of artificial intelligence in all sectors, including education. This has the potential to restrict how it is implemented across the continent.

The UK government has taken a lighter touch. Its philosophy on regulating the implementation and use of AI puts us firmly in the 'pro-innovation' camp, as reflected in the DfE's position on the use of generative AI, including large language models (LLMs) like Google Gemini or ChatGPT, in the education sector.

Generative tools such as ChatGPT are already being used in UK education for a variety of purposes, which other authors have outlined in this collection, from planning, workload reduction and lesson content generation all the way through to students accessing and using the tools to support their own learning. The fact that teachers and students can interact with these tools in natural language is what makes them so accessible.

However, as the DfE notes, content produced by these tools can be inaccurate, inappropriate, biased, taken out of context, out of date and unreliable. Moreover, the DfE emphasises that generative AI tools can replace neither the judgement and deep subject knowledge of a human expert, nor the interpersonal relationships and understanding that teaching requires. Education leaders have a responsibility to navigate the murky waters of choosing, testing and implementing AI tools in their institutions.

Ofsted's guidance, meanwhile, provides little in the way of suggestions for how schools and colleges should or could use AI. It does, however, state that it won't inspect the use of AI directly but will consider how it is used to improve student outcomes. In effect, one could argue that it is no different to any other tool that a college may use to drive better results.


The Opportunities and Risks

The Opportunities

Choice

The market for AI tools is filled with hundreds of providers, but with the notable names of ChatGPT, Copilot and Gemini sitting firmly at the top of the list. Most colleges and further education institutions are already firmly in Camp Google or Camp Microsoft. Buying into a tool which aligns with your own college infrastructure will likely give you greater control, visibility and auditability down the line, making Copilot a logical choice for Microsoft colleges and Gemini for Google colleges. This gives colleges greater ability to control and manage the rollout of these tools to staff and students. Choosing tools such as ChatGPT gives colleges no control or oversight of how and to whom the tools are deployed, or of which functions are enabled.

Free Tools

Almost all popular AI tools, including Copilot, Gemini, ChatGPT and Claude, offer a free service. Staff and students can choose to use any of these tools, with the only stipulation being that users set up an account, or sign in with a Microsoft or Google account. OpenAI has recently even opened up the newer, faster GPT-4o model for free. The landscape is always shifting; colleges can choose to recommend specific tools, but students will likely choose tools that suit their workflow and their wallet. Many by now will probably already have used Snapchat's My AI and stuck with that as their tool of choice.

Learning

There is no doubt these tools can support the learning of students, and other chapters will deal with pedagogical use in the classroom, and how we can support students in the ethical and safe use of these tools. What is for certain is that these tools will open up a wider world of learning, giving students the equivalent of a personal tutor in their pocket.

The Risks

Navigating Age Restrictions

Not all AI tools are created equal. JISC has outlined the most popular AI tools and the age restrictions imposed on them. Most AI tools currently have an age limit of either 18 and over, or 13 and above with parental permission. None of the sites, however, require evidence of this. For colleges, it does mean you may need at least to advise students that they should only use tools for which they are either old enough or have parental permission. Whether you elect, as a college, to inform parents of this, or to actively seek permission, is a tough question governors must consider. At present, Microsoft is limiting Copilot access to students over 18, with a pilot in the pipeline to assess the safety of its use by younger students.

Data Protection

Most AI tools improve and develop through training on the prompts and data they receive from users. This poses a risk, as sensitive and personally identifiable information can be fed into these tools without real awareness of how it will be used. Furthermore, while most tools have a free version, privacy can come at a cost - if you're willing to pay, you can access greater controls over your data and how it is used.

It is important for colleges to train their students to understand this, and therefore the importance of not entering personal data. When an institution centrally rolls out Copilot or Gemini as part of its institutional subscriptions, data protection is baked in: if a user is signed in as a paid-for academic user, neither Google nor Microsoft will store or see their data. Use the free models, however, and your prompts and responses form part of the wider training data.
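Colleges can also put simple technical guardrails in front of free tools. The sketch below is illustrative only - the patterns and function names are hypothetical, and a real deployment would need far broader coverage - but it shows the kind of pre-submission check that could flag obvious personally identifiable information before a prompt is sent:

```python
import re

# Illustrative only: a minimal pre-submission filter that flags obvious
# personally identifiable information before a prompt is sent to a free
# AI tool. The patterns are hypothetical and far from complete.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_phone": re.compile(r"\b(?:\+44\s?|0)\d{4}\s?\d{6}\b"),
    "postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s?\d[A-Z]{2}\b"),
}

def check_prompt(prompt: str) -> list[str]:
    """Return the names of any PII patterns found in the prompt."""
    return [name for name, pattern in PII_PATTERNS.items() if pattern.search(prompt)]

hits = check_prompt("Email jo.bloggs@example.com about the BN1 3XE open day")
if hits:
    print(f"Possible PII detected ({', '.join(hits)}) - redact before sending.")
```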

Digital Poverty

As noted, a great many AI tools are free, but enhanced models are accessed via a subscription. What impact will this have on students who can only access the free models? For example, students can access GPT-3.5 at no cost, but GPT-4 is almost £20 per month.

It has a more current training model and much-enhanced capabilities. At the time of writing, there was almost a two-year difference between the training materials used for GPT-3.5 and GPT-4, which means greater restrictions on the free model's currency.

GPT-4 is also between 10 and 100 times more powerful, depending on the task it is given. The same applies to enhanced Google Gemini tools, with paid options not only giving access to newer, more knowledgeable and more effective AI models, but also to enhanced AI features baked into Google Docs, Sheets and Gmail. There are often also usage limits, and at peak times users' access to the tools may be restricted.

The premium versions of these tools generally allow not only much longer prompts but also more extensive data input. Some models can accept file uploads, handle images, take in thousands of pages of text from a document, and even hours of video. This seriously affects what students can achieve with these tools.

So, an important aspect of the governance and selection process is to consider what functionality you are giving your students by recommending or purchasing particular AI tools.

Steps for Colleges

Ethical Use of AI

At the heart of AI governance lies the ethical use of AI, including ensuring that AI technologies respect the privacy and rights of individuals. For teachers, this means using AI tools that comply with data protection laws and safeguard student information. It is also important to use AI systems transparently, and to avoid relying on them for decisions that significantly impact students' lives without human oversight. AI should not be used to make academic decisions based on student datasets, or used alone to grade assessments.

Data Protection

Institutions should already be familiar with data protection laws like the UK GDPR and the Data Protection Act 2018, but applying them to AI tools is a different question. What restrictions do you have on the movement of data? Are staff aware that personally identifiable data should not be entered into these tools?

Use AI tools that prioritise student privacy, and look to procure tools that offer additional protections for the data and prompts that are entered. Ensure transparency in AI-assisted decision-making processes. In many instances, the procurement process is led by technical teams or enthusiasts. Colleges need to consider which stakeholder groups are consulted in determining the best tool for staff and students, and to ensure the final decision on which tools to use, and how, is made by or with senior decision-makers in the institution. Colleges should also seek to update data protection policies, GDPR training and acceptable use policies to include references to acceptable and unacceptable use; the model policy in this collection provides a start.

Bias and Fairness

AI systems can inadvertently perpetuate biases present in their training data, leading to unfair outcomes. Teachers must be aware of these potential biases and actively seek to mitigate them. This involves choosing AI tools that are designed with fairness in mind and being vigilant about the results they produce. Current EU guidance requires a human to be involved in the system at some point to safeguard against biased responses. While we are usually encouraged to provide an AI with more context in a prompt to get a better output, there are indications that feeding it demographic information about a student when giving it a task to assess can lead to biases in the system affecting the outcome. Does the AI assume that boys, or certain ethnic groups, perform worse at that task?
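One practical check, sketched below, is to give the same piece of work to a tool with and without demographic framing and compare the marks over many runs. This is illustrative only: `ask_model` is a hypothetical placeholder (here it returns dummy marks so the sketch runs), to be wired up to whichever tool your college has approved.

```python
import random
import statistics

ESSAY = "The causes of the First World War were..."  # the same work in both prompts

def ask_model(prompt: str) -> str:
    """Placeholder for your college's approved AI tool. Replace this stub
    with a real API call; here it returns a dummy mark for illustration."""
    return str(random.randint(12, 18))

def average_mark(prompt: str, runs: int = 20) -> float:
    """Average the marks over several runs to smooth out noise."""
    return statistics.mean(int(ask_model(prompt)) for _ in range(runs))

neutral = average_mark(f"Mark this essay out of 20. Reply with a number only.\n{ESSAY}")
primed = average_mark(f"Mark this essay, written by a boy, out of 20. Reply with a number only.\n{ESSAY}")

# A consistent gap across many essays and runs would suggest the
# demographic framing, not the work itself, is moving the mark.
print(f"neutral: {neutral:.1f}  with demographic framing: {primed:.1f}")
```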

Student and Staff Induction and Training

How do we proactively inform and train our staff and students? All governors need to consider this question. Some of the answers BHASVIC has come up with are below.

At BHASVIC, new students undergo a "Student Digital Induction" to upskill them with the tools they will use in their day-to-day life at the college. The college is working to include training around the use of AI in this, and to ensure the college's guardrails and protections are in place. We have developed a simple user guide for students, which will also serve parents and teachers. It highlights the ways in which users can and should use AI, and focuses less on what they should not do. Of course, there is a strong emphasis on what constitutes academic malpractice.

BHASVIC also recently ran a three-day INSET programme, including its first big sessions on AI. We saw that almost half of our staff had limited experience with AI tools, which underlined that training is key. Once staff had recognised the time-saving benefits, the engagement was impressive. Lack of familiarity and experience seem to be the major barriers.

Finally, we have no doubt our students are already using AI, but BHASVIC is scaffolding support for students via college-wide induction, tutor-time content and course-level inductions, with ongoing guidance and support on how to use these tools wisely. This includes clear signposting of the JCQ guidance for students which, in a nutshell, says: don't use these tools to cheat, and "reference, reference, reference."

Accountability and Transparency

AI governance also demands accountability and transparency in the use of AI systems. Teachers should understand how AI tools make decisions and be able to explain these processes to students.

When AI is used to support teaching, it’s important that its role is clear and that there is accountability for its recommendations.

Many institutions may be keen to explore how AI can rationalise a range of processes, from areas as diverse as SQL report writing, data analysis, UCAS references and HR processes, to marking and feedback. What is important here is that the process and its automation should be clear. The EU’s philosophy here is a good one: in any process like this, there should always be a human in the system. We can use AI to save time, and maybe even to aid decision-making, but an accountable human being should always sit somewhere in that process.
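As a minimal sketch of that principle - all the names and fields below are hypothetical, not a description of any particular system - an AI-drafted output could simply be blocked from use until a named, accountable person has signed it off:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Draft:
    task: str                          # e.g. "UCAS reference for a student"
    ai_output: str                     # what the AI tool produced
    approved_by: Optional[str] = None  # the accountable human, once signed off

def approve(draft: Draft, reviewer: str) -> Draft:
    """Record the accountable human before the output can be used."""
    draft.approved_by = reviewer
    return draft

def publish(draft: Draft) -> None:
    """Refuse to act on any AI output that no human has signed off."""
    if draft.approved_by is None:
        raise PermissionError("No accountable human has signed this off")
    print(f"Published '{draft.task}' (approved by {draft.approved_by})")

draft = Draft(task="UCAS reference draft", ai_output="(model text)")
publish(approve(draft, reviewer="Head of Sixth Form"))  # OK: a human is in the loop
```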

Environmental Concerns

Many colleges have a strong sustainability ethos and vision. As part of staff and student induction and training, it is therefore important to highlight the energy demands and environmental impact of AI tools. Current estimates indicate that a single ChatGPT query uses the same amount of energy as running a 5W light bulb for one hour and 20 minutes, compared with a Google search needing the equivalent of 3 minutes of power for the same light bulb - more than a twenty-five-fold difference.
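That ratio follows directly from the figures quoted above; a quick back-of-envelope check:

```python
# Back-of-envelope check of the energy figures quoted above,
# using a 5 W bulb as the reference load.
BULB_WATTS = 5

chatgpt_wh = BULB_WATTS * (80 / 60)  # 1 hour 20 minutes -> ~6.7 Wh per query
google_wh = BULB_WATTS * (3 / 60)    # 3 minutes -> 0.25 Wh per search

print(f"ChatGPT query : {chatgpt_wh:.2f} Wh")
print(f"Google search : {google_wh:.2f} Wh")
print(f"Ratio         : {chatgpt_wh / google_wh:.0f}x")  # roughly 27x
```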

What colleges do with this information is hard to say. We might consider offsetting programmes, or simply advise users to think carefully about how and when they use AI tools and to be mindful of their increased energy demands; but it does need to be taken into consideration in our governance and strategies for AI.


Conclusion

For sixth form colleges, navigating AI governance doesn’t require deep technical knowledge. It's about understanding the ethical implications, potential biases, and the need for accountability in AI use. By taking proactive steps to educate all staff and students about these concepts, we can foster a learning environment that responsibly leverages AI technologies for education.

College leaders and governors need to think carefully about the tools they use; in the early days of any technology, there are always countless platforms offering magical solutions. In reality, few platforms offer a unique edge, so colleges need to be assured that thorough processes are in place and that sufficient scrutiny is given to the system being used, and to what it is being used for.

There is no harm in being in the middle of the pack. Taking the time to do the due diligence will ensure colleges have made wise and considered decisions.
