Principles for generative AI in education

This page sets out the principles which guide the use of generative AI (GenAI) in any aspect of education at the University of Sheffield.


A Common Approach to generative AI in the curriculum

The University has developed a Common Approach to ensure every undergraduate programme integrates GenAI literacy through structured teaching activities and formal assessment, while providing clarity to staff and students about acceptable AI use across all assessments.

Our approach achieves:

  • clear expectations for staff and students around GenAI use
  • structured development of GenAI literacy skills for all students
  • preparation of graduates for an AI-enabled future
  • maintained academic standards and integrity
  • consistency and equity across the institution.

Our approach is grounded in harmonisation, not homogenisation. We provide institutional expectations while respecting disciplinary expertise.

More about our Common Approach (staff university login required)

How to use GenAI for assessment (student guidance)


Our principles for generative AI in education

The following principles are the University’s response to generative AI in education. They inform the approach to GenAI within schools, programmes and modules.

These principles are intended to be relevant to a range of AI and GenAI tools and applications: 

  • AI in this instance refers to computer systems and software that can perform tasks that would normally require human intelligence, such as pattern recognition, data analysis and decision-making. 
  • GenAI refers to a subset of AI tools that create content, including but not limited to text, images, code and audio, in response to user prompts. 

These principles support a positive approach to AI, support staff and students in understanding the limitations and challenges of using such tools, and enable staff and students to use GenAI tools appropriately, responsibly and ethically. 


A positive approach to AI

The University of Sheffield takes a positive approach to AI, and this means:

  • Staff will be provided with appropriate development and training opportunities to support the embedding of AI literacy within the curriculum, taking a student-centric approach and engaging students in this process.
  • All students will have the opportunity to develop their AI literacy during their time with the University.
  • Clarity will be provided within assessment criteria regarding how AI can contribute to or detract from the achievement of learning objectives, including recognition for appropriate and responsible use.
  • We will take a proactive approach to engaging with employers and other external stakeholders, so that we best understand their needs from our future graduates.
  • These activities are supported by central services including Elevate, 301 Academic Skills, the Library, ELTC, and the Careers and Employability Service.

The limitations of AI tools

We recognise the limitations of AI tools and will support students and staff to engage critically with the tools and their outputs, and this means:

  • Recognising the potential for bias, as a result of existing biases within training datasets, and also the potential of AI tools to provide inaccurate information, false references or other hallucinations.
  • Foregrounding the primacy of learning as a human process; taking a considered approach to how AI can be used as a tool to support learning, whilst also recognising when it might hinder the acquisition of key skills, knowledge and critical thinking.
  • Recognising that the algorithms employed by many AI tools are hidden from the end user, and as such are not necessarily replicable, testable or understood.

Inclusivity and accessibility

We endorse the use of AI as an assistive technology, ensuring barriers to the use of AI tools are removed for all staff and students. This means:

  • Due consideration is given to the equity of access to AI tools used in learning and teaching activities, ensuring that student access does not require any additional cost to the student.
  • All students and staff have access to Google Gemini (university login required) as the institutionally supported GenAI tool. Where possible, Gemini should be used to support learning and teaching activities.
  • Digital accessibility (university login required) is a priority when integrating AI tools into teaching, recognising that some AI tools have accessibility issues. Where these are identified, accessibility statements will be used to manage expectations and suggest alternatives.
  • An awareness that AI tools can benefit students who are neurodivergent, anxious or managing other invisible barriers to learning, and we engage with students to find out how these technologies could aid their learning and participation.

Ethical use of AI tools

We recognise the importance of ethical use of AI tools, including issues regarding environmental, social and economic factors. This means:

  • Raising critical awareness of the potential environmental, social and economic impacts of using AI tools. This includes the consumption of electricity and other natural resources, as well as the exploitation of workers in the development of AI tools.
  • Recognising the role of AI tools in perpetuating existing inequalities inherent to the datasets they are trained on, user inputs and the opaque way in which these tools arrive at a particular answer.
  • Acknowledging there are potential copyright implications in the use of GenAI tools, and that this is currently a partially understood and contested area. This includes the potential for AI to reuse original content without acknowledging its creators.
  • Ensuring students and staff are familiar with the University’s principles for using GenAI in research and innovation.

Data protection and safe use of AI tools

We practise the safe use of AI tools, including recognition of issues regarding data, privacy and intellectual property implications. This means:

  • Students and staff are educated in how personal data is processed or stored within AI tools.
  • Ensuring staff and students are aware of, and have access to, training and guidance relating to information security and data protection in the context of AI use, recognising ways in which companies developing the tools may access and use this data.
  • Ensuring staff and students are aware of how to safely use AI for research purposes, in accordance with the University’s research principles.
  • Where tools other than Google Suite are made available to students on a use case basis, the New IT Solution Request Process (staff university login required) has been followed to ensure they comply with data protection and information security policies.

AI and academic integrity

AI can be a powerful tool to support learning, but it is important to understand the boundary between academic integrity and misconduct. We will support students to uphold academic integrity in their use of AI. This means:

  • Providing clarity in expectations regarding the appropriate use of AI, with a specific focus on how this would apply in assessments. These expectations are clearly communicated to students in programme and module handbooks and align with the academic misconduct policy principles and underpinning values.
  • Giving a clear and consistent approach for students to declare their use of AI tools within assessment activities. The Acknowledge, Describe, Evidence approach is provided as a model for this.
  • Giving students the opportunity to engage in conversations about their use of AI tools, including as assistive technology, and to know where to go for guidance and support when needed (both school and central support).
  • Giving consideration to where students use AI tools as assistive technology, and ensuring that this use is appropriately supported.
  • Where the use of AI is not encouraged or is prohibited, providing clarity on what this includes and why this decision has been made.

In applying the above principles, we recognise that the use of AI within education is developing at pace. These principles will be revisited and updated regularly.

These principles were last updated on 16 March 2026.

They are also supported by a suite of central guidance and support for both staff and students provided by Elevate, 301 Academic Skills, the Library, ELTC, the Careers and Employability Service and others.
