
Dr Caroline Green is keeping social care human in the age of AI
AI ethics concerns not only philosophers, computer scientists, policy makers and lawyers – AI is increasingly part of people's lives. We must ensure that it is developed, rolled out and used responsibly, in a way that does not reinforce inequalities or undermine the values that critical, life-improving activities – such as providing social care – are built on.
This is a concern that drives Dr Caroline Green, Director of Research and Head of Public Engagement at the University of Oxford’s Institute for Ethics in AI, and lead of the Accelerator Fellowship Programme.
Dr Green is rapidly becoming one of the UK’s leading voices on the responsible use of artificial intelligence in adult social care. A Senior Research Fellow in Social Sciences at Jesus College, her scholarship sits at the nexus of AI governance, human rights and gerontology.
Whilst acknowledging that AI can benefit humanity, Dr Green stresses the importance of ensuring the responsible development, roll-out and use of the technology across people’s lives.
As artificial intelligence becomes more embedded in everyday life, the social care sector in particular stands at a critical crossroads. Without a clear roadmap, the use of AI – while full of potential to improve people’s lives – risks reinforcing inequalities or undermining the very values care is built on.
A passion for care
It is important to remember that older people are not the problem here. The lack of adequate structures and ageism are some of the major factors we need to address urgently. AI and technology can make an immense difference to build systems that are fit for our ageing populations.
Dr Caroline Green
With a background in human rights and a PhD in gerontology, focussing on human rights in care homes for older people, Dr Green’s work is driven by a dedication to improving the lives of people who need care and support, and by the passion she sees in people working in care in the UK and other countries:
'In the UK specifically, we have a foundation of what I call "fundamental values of care", many of which are enshrined in law.'
Still, many people are not receiving the access, level or quality of care they need and deserve. That passion keeps drawing her back to the topic, as does her interest in helping to build systems fit for our ageing populations.
'It is important to remember that older people are not the problem here. The lack of adequate structures and ageism are some of the major factors we need to address urgently. AI and technology can make an immense difference to build systems that are fit for our ageing populations.'
Care is a universal experience
It was through my grandparents and working with people in care homes for older people that I saw what matters the most at the end of life. Next to good care, it is the people around us, empathy, love, forgiveness... I love the richness in what care means. That's why I am working in this area - it's such a human topic, full of emotions and the realities of living and dying.
Dr Caroline Green
We all receive care, and care for others, at some point in our lives. Dr Green herself saw her grandparents, who lived with cancer and dementia, being cared for by multiple generations of her family in different settings: at home, in hospitals and in care homes.
'It was through my grandparents and working with people in care homes for older people that I saw what matters the most at the end of life. Next to good care, it is the people around us, empathy, love, forgiveness... I love the richness in what care means. That's why I am working in this area - it's such a human topic, full of emotions and the realities of living and dying.'
Social care in public narratives is often reduced to tasks like helping someone with the activities of daily life, but social care consists of much more. For example, helping someone shower is not just about the task itself. It is also about the quality of the interaction between the person receiving care and the caregiver.
These interactions can have a major impact on whether someone feels comfortable being supported in this task or goes away feeling ashamed. It’s about the quality of relationships.
AI's role in social care
AI and technology generally can really benefit people in care. It could help people to stay independent for longer, to stay in their homes, to support caregiving informally at home. It can help to lift the heavy administrative burden for professional carers. Yet we must make sure that we do not roll out AI and technology across social care blindly – there are risks that must be mitigated.
Dr Caroline Green
The use of AI applications in social care settings like care homes is not new. Common examples include fall-detection systems and pain analysis for people who cannot verbalise it, but the wide availability of new generative AI systems has sped up the development of AI-powered tools to support caregiving. AI is now often hailed as a saviour for the overstretched and underfunded care system.
'Generative AI has many use cases in social care. Care planning is one example. Different generative AI-powered tools allow professional caregivers to record someone's day and the system then generates a plan or report, instead of having to write it all out by hand.'
Much development and research is also being carried out into the potential role of care robots, many of which integrate AI systems to be able to hold conversations. Whilst Dr Green does not see a roll-out of care-bots across social care in the near future, she does believe we will see increased use of AI systems within services in the UK:
'AI and technology generally can really benefit people in care. It could help people to stay independent for longer, to stay in their homes, to support caregiving informally at home. It can help to lift the heavy administrative burden for professional carers. Yet we must make sure that we do not roll out AI and technology across social care blindly – there are risks that must be mitigated.'
Noticing a lack of guidance around the responsible use of AI in social care, Dr Green is working with colleagues across the care sector to address this.
'We need to understand better how people are using these systems now and what is needed in order to gain the most benefits from AI.'
What is clear to Dr Green and her colleagues is that AI will change the way that care is provided, so it is vital to understand which values of care we want to preserve.
'Addressing the gap in guidance for social care is urgent because there are risks to core values like person-centredness and dignity, risks that arise from the technology itself, from the way people use it, or from how policy is shaping up. We need to respond now, not only when we see people being harmed.'
Collaboration is essential to the responsible use of AI
Co-production to me is an essential component to understand what a responsible approach to AI in social care means because it challenges existing power dynamics between groups of people. We need to practice what we preach. Without the expertise of people who draw on care and caregivers, how are we going to understand what values we are trying to protect? And without tech providers on board, how are we going to know how to design technology in the right way?
Dr Caroline Green
Dr Green knows that new tech solutions and guidance in social care can make a real difference if people in care have co-produced them. Co-production is more than just listening to different people’s opinions; it is about working together as equals to identify problems and find solutions.
'Co-production to me is an essential component to understand what a responsible approach to AI in social care means because it challenges existing power dynamics between groups of people. We need to practice what we preach. Without the expertise of people who draw on care and caregivers, how are we going to understand what values we are trying to protect? And without tech providers on board, how are we going to know how to design technology in the right way?'
As a research method, good co-production is difficult, demanding strong stakeholder management and a thorough ethical framework to guide the work. However, the end results reflect all those involved in the process, enhancing a sense of togetherness and community in the sector. This is where Dr Green draws her energy from.
Over the past year, Dr Green has co-hosted the Oxford Project on the Responsible Use of AI in Adult Social Care – a project that has arisen from the desire to enhance collaboration and support the sector as it navigates the use of generative AI.
We have an opportunity to make a real difference here, and engaging with policy makers is key because we want to avoid a fragmentation of guidelines.
Dr Caroline Green
The Oxford Project started after a roundtable at Reuben College in spring 2024, which generated a shared statement outlining the need to address gaps in guidance around the responsible use of generative AI in adult social care. Over the course of a year, people from across the care community worked in collaboration to make sense of AI within their own lives and to draw out what is important to various stakeholder groups in the social care space.
In early 2025, Dr Green and the Oxford Project convened more than 150 participants – encompassing people with lived experience, adult social care workers, academics, policy professionals and technologists – for the first time, to deliberate on the definition of the responsible use of artificial intelligence in social care.
Throughout the day there was a palpable sense of community, openness, and shared purpose. People from across the sector listened, challenged, and built together, creating a space for honest conversations, bold ideas, and a shared belief that ethical, inclusive AI is both possible and necessary.
'We asked what responsible generative AI means in care. To do so, we took a step back and reflected on what good social care is and how AI and technology can fit. Because there's no official guidance yet, we have produced a White Paper and co-designed guidance aligned with CQC regulations, plus a tech pledge.'
This work has been supported by the Department of Health and Social Care throughout, with the active engagement of civil servants working on AI policy and social care. The Minister of State at the Department of Health and Social Care, Stephen Kinnock, recorded a message for the AI social care summit, highlighting the potential of AI for the sector.
'We have an opportunity to make a real difference here, and engaging with policy makers is key because we want to avoid a fragmentation of guidelines,' said Dr Green.
Phase two of this project will keep co-production at its heart, leading to the creation of an Alliance for AI in Social Care to share good practice and continue developing the ethical framework for AI in social care.
Leading the conversation, globally
We need to make sure that we do not just invest in AI and technology in caregiving, but that we keep on investing in people, in professionals who will be able to provide that care. What use is an AI system that predicts falls, when there is no one around to address the risks for falls?
Dr Caroline Green
In addition to the Oxford Project, Dr Green is leading the international conversation on the responsible use of AI in social care, bringing it to the attention of academia, the social care sector and the public in and beyond the UK.
'I have learnt so much throughout my research and engagement with people in the care community and I am really excited about what AI and technology can do for caregiving.
'But I am a bit sceptical about future policy directions. For example, we need to make sure that we do not just invest in AI and technology in caregiving, but that we keep on investing in people, in professionals who will be able to provide that care. What use is an AI system that predicts falls, when there is no one around to address the risks for falls?'
To Dr Green, the lessons from the UK are transferable to other countries whose care systems are experiencing similar problems, such as understaffing.
'On the international level, there is also a lot of hope that AI can solve issues in care and beyond, such as social isolation amongst adult and ageing populations. But AI should not be seen as a panacea to some of those very big problems that we are seeing in social care provision. AI is part of the solution, but we must not believe that we can simply replace human caregiving. We also need more evidence of just how well AI applications are working to address problems like social isolation in a meaningful way.'
With funding from the UKRI Healthy Ageing Challenge, Dr Green has been working on an AI-powered chatbot called Dedicate. The idea behind Dedicate is to make available a trustworthy, reliable source of information and caregiving advice to family caregivers, drawing on the expertise of care organisations.
'I have heard so many stories of family caregivers, who are often facing caregiving situations in which they really need some support and advice in the moment. For example, I spoke to a person who did not know how to respond to his partner’s refusal to eat and drink while living with dementia. Generative AI can be really helpful for carers to ask care-related questions, but there are many considerations around the quality of the information and how people are using these systems.
'Dedicate is about building an AI-powered tool that is based on trustworthy information sources and is aligned with good practice.'
With support from the University of Oxford, a closed beta is set to be launched soon. The hope is that quicker access to reliable guidance will change carers' lives and, eventually, make this information easily available to people in other countries.
'I have had interest from the US, Canada and other countries in collaborating with me on scaling Dedicate, and so much support from many people, such as a team at Carnegie Mellon University helping to test and improve Dedicate.'
Keeping the human at the centre of a tech-supported future
If we shape responsible AI around that core of caregiving and what it means to be human, we'll see a much better future for people who need support.
Dr Caroline Green
Dr Green’s hope for the future of social care is one in which AI and other technologies are seen as tools to improve people’s lives, but where humans and their needs and experiences come first.
'I am always inspired and humbled by the work family caregivers and professional caregivers do. If we shape responsible AI around that core of caregiving and what it means to be human, we'll see a much better future for people who need support—and that will be all of us at some point. That's what drives me and gives me hope.'
Find out more about Dr Caroline Green's work at the Institute for Ethics in AI.