Features
In a guest post for Arts Blog, Katrin Kohl, Professor of German Literature and Lead Researcher on the Creative Multilingualism research project, writes about recent calls for all British citizens to be able to speak English.
Should we be for or against British citizens having to be able to speak English? What makes a British artist sing in Cornish when she could be communicating so much more usefully in English – not only the lingua franca of England and the British Isles, but a language that's now spoken and sung across the world? Why is the Irish language such a politicised issue when some claim that there are now more Polish than Irish speakers in Northern Ireland? And where do users of sign language fit into these debates?
The fact is this: the UK has always been, and will remain, multilingual. And this is no more incompatible with everyone being able to speak English in the UK than it would be in India. Many people switch between different languages every day, and we all, at the very least, keep different linguistic registers in play as we move between different spheres and groups of people at home, at work or at school.
Louise Casey recently asserted that the UK should set a date for everyone to speak English. She's surely not wrong when she argues that additional funding should be provided for fostering English language skills, or that building linguistic bridges between communities can promote integration. But integration isn't helped by imposing a single language top-down or assuming that diversity is best eradicated. Languages are neither confined to what is useful nor just about what the majority speaks – we need look no further than the establishment of Welsh as an official language of the UK to appreciate this fact.
Languages are about lives, as the production of Gwenno Saunders' Cornish album shows us. While her linguistic heritage may be unique (with a Cornish poet and a Welsh language activist as parents), she's not alone in being able to draw on diverse languages as a personal treasure trove. All across the UK, people cherish the languages that are part of their heritage or that they have come into contact with in other, often very individual ways. Communities pass on their languages in religious practice, supplementary schools and cultural events, and individuals make something linguistically new from cross-cultural marriages and culturally diverse school environments. A language is a special emotional resource, an inner voice that embodies memories of conversations with loved (and hated) ones past and present.
This personal, emotional dimension of languages has been sidelined in the way foreign languages have come to be taught in the UK – if the value of knowing a language is reduced to its practical function, it becomes unclear why we should bother with the hard graft of learning a new language when we can make ourselves understood in English. By the same token, it then seems sufficient to promote English as the sole passport to global success, whatever other languages children might already be familiar with. Many children are made to feel ashamed of knowing another language, and some schools indeed prohibit their speaking anything other than English on the assumption that they are thereby doing the children a favour – English is imposed as part of a lifelong school uniform.
Fortunately, many schools instead embrace the multilingualism of their students, enable them to take qualifications in their home languages, and allow them to discover their own linguistic resources in creative writing that extends beyond linguistic boundaries. Creative Multilingualism has been working with Oxford Spires Academy and with Haggerston School in Hackney to find out how children respond to exploring new language spaces. Modern foreign languages can be taught as part of that process and in interaction with it. This fosters a spirit of community that isn't confined to a single language, but characterised by shared variety and enhanced understanding of the potential that linguistic diversity holds for us all: each language is a subtly different window on to the world and a different link with other groups of people. As a preparation for life in an increasingly global world, this is hard to beat.
The UK rightly takes pride in its exuberantly diverse creative talent, but there's currently little appreciation of the ways in which languages enrich the country's creative identity. The UK music scene is tremendously diverse not just culturally and ethnically, but linguistically too. Take Punch Records, a company set up to work with emerging Black British and British Asian artists who have grown up in urban contexts where varieties of English routinely mingle with other languages. The Slanguages exhibition project serves to showcase hip-hop, grime and rap as multilingual forms with a political edge. Here, Birmingham school playgrounds have served as seedbeds for adventurous modes of communication that offer exciting scope for developing new rhythms, speech forms and gestural language.
The UK's extraordinarily varied linguistic heritage is an invaluable national resource. At a time when the country wants to project itself as being more than Little Britain, and more than a country on the edge of Europe, it makes sense to value all those languages that have entered the UK over the decades, centuries and indeed millennia. Each of them has left its audible traces in the population, and together they open up a multitude of living pathways to other parts of the world. We might as well celebrate our flourishing abundance of languages – they're certainly not likely to go away.
Creative Multilingualism is funded by the Arts and Humanities Research Council as part of the Open World Research Initiative.
Once upon a time, the concept of machines that could think and act like people was a fantasy – or, more often than not, the recipe for a male-dominated blockbuster movie. Fast forward thirty years and artificial intelligence is transforming, at pace, both the world around us and the way we live, work and communicate within it.
Despite its topical prevalence, AI is a social hot potato, regarded as a gift or a curse depending on who you talk to, and its purpose is widely debated. Well-documented issues include the technology's impact on the labour market and concern about the gender gap among the designers behind the scenes, evidenced by the perceived white male bias in the algorithms they generate. However, the field is gradually changing, and more women are not only building a future in tech but driving some of the incredible breakthroughs that are shaping our society.
As the University prepares for its first AI Expo event next week, the women closing the interdisciplinary AI research gender gap at Oxford will discuss with ScienceBlog their experiences, career highlights and some of the biggest challenges facing the industry.
Marina Jirotka is Professor of Human-Centred Computing, Associate Director of the Oxford e-Research Centre and Associate Researcher at the Oxford Internet Institute.
Professor Marina Jirotka.
Putting people at the heart of computing
As Professor of Human-Centred Computing, Associate Researcher at the Oxford Internet Institute and governing body fellow at St Cross College, Marina Jirotka's work focuses on keeping people at the heart of technological innovation. Her research group undertakes projects that aim to enhance our understanding of how technology affects human collaboration, communication and knowledge exchange across all areas of society, in order to inform the design and development of new technologies.
What is human-centred computing and how did you come to specialise in it?
Human-centred computing puts people at the heart of computing, so that they have some control over how technology affects their lives. However, as technology has become more advanced, particularly with new developments in AI and machine learning, this becomes harder. I am very keen to keep people at the centre of the drive towards machine learning.
I became interested in computational models of the brain in the 1980s when I was studying anthropology, and my interest in AI and its societal impact grew from there. I took further studies in computing and artificial intelligence after that.
My first research position was on one of the Alvey projects. Alvey was a large UK government sponsored research programme in IT and AI which ran from 1983 to 1987. The programme was a reaction to the Japanese fifth generation computer project and defined a set of strategic priorities for channelling British research into IT improvements. I was involved in building a planner to give people advice about the welfare benefits system. The final product was a great example of early inter-disciplinary collaboration, fusing STEMM technology with social sciences understanding.
What drew you towards a career in science and AI?
Science has always been a big part of my family; my parents and my grandfather were chemists in the Czech Republic. My mother was actually one of the first women in the country to get a degree at Charles University.
I became interested in computational models of the brain in the 1980s when I was studying social anthropology and psychology. My interest in AI and its societal impact grew from there when I studied computing and artificial intelligence.
As a society we are striving to create artificial intelligence without really understanding what intelligence is, or how to get the most from it, and that is a problem. I personally like seeing where AI can actually go and where it can take us - what it really can do compared with the Hollywood hype, and then using that knowledge to hopefully make a difference.
How has the field changed for you as a woman in AI, and what can be done to encourage more women to join the field?
When I first started I was a real oddball, not only a woman but also a social scientist. But now that there are more of us, I notice it less. I can't generalise, but I think the human-centred theme could be a big draw. In my experience women are keen to see the outcome of an application and understand what their work and contribution will actually achieve, whereas some of my male colleagues are more driven by product development and the theoretical side.
What are the biggest challenges facing the field?
It is important to consider the kind of world that we want to live in, build from there and start thinking about the impact that developments will have on society and institutions. At the same time, it is paramount to involve and engage people in those visions, so that human society is taken on the AI journey as well, rather than left behind.
What research are you most proud of?
In the early days it was my contribution to Alvey, but currently I would say it is the Digital Wildfire Project.
The project grew from a desire to understand and address the spread of hate speech and misinformation online. For example, we looked at public reaction during events such as Hurricane Sandy, when false rumours circulated that the New York Stock Exchange had flooded, and, crucially, at the spread and impact of hate speech.
In everyday society there are safeguards in place to protect people from hate speech, but in an online environment these defences do not exist. As a result, people sometimes feel that they can say things and behave in ways that they wouldn’t in any other area of life. People are subjected to abuse that they would not normally be faced with.
Our research looked at this phenomenon and offered advice to people on how to engage with and control it. We worked with a number of different stakeholders, from those trying to prevent and manage it, such as the police and schools, to those who are most vulnerable: children.
More recently we have worked with policy makers, such as the House of Lords Select Committee on Communications, to advise on and support children's digital rights. I was specialist advisor to the Committee, which produced the report 'Growing Up with the Internet', making recommendations on how internet policy should involve the participation of multiple stakeholders and promote digital literacy for children. The report was debated in the House of Lords. Following this, the Secretary of State for Digital, Culture, Media and Sport responded to the Committee's report and announced the launch of the Government's Green Paper for an Internet Safety Strategy, which proposed a digital literacy programme involving different stakeholders in order to protect children when they are online.
I have learned so much from this project, particularly about how government works. It has also been a great way of engaging with the public. We have worked with technology companies and sponsors like Santander to engage with young people and get them to share their experiences online through art and other channels.
When I first started I was a real oddball, not only a woman but also a social scientist. But now that there are many more of us, it isn't unusual – it's vital. The challenges that we have to face in the 21st century can't be solved by one discipline or mindset alone.
What excites you most about the future of AI?
Given the state of the planet, the ways in which AI is being used in areas that humans have not been able to access, such as extreme environments, and to help wildlife and conservation, are really exciting.
I'm also equally interested in and worried by transhumanism – the notion of embedding technology into a human in order to give them superhuman abilities. There is already research taking place in the US which aims to improve people's cognitive faculties through neuroscience.
What can be done to help public understanding of AI?
People want to know how things apply to them and how something is going to affect them. We need to convey the current knowledge about machine learning to people so that they understand its potential and capabilities. In many ways this is much more interesting than the current media hype.
What role does interdisciplinary collaboration play in machine learning and AI?
It is imperative, to the point that research councils actively encourage interdisciplinary work now. The challenges that we have to face in the 21st century may not be solved by one discipline alone.
You are chairing the AI & Ethics debate panel at next week's AI Expo, what are your thoughts on the event?
The AI Expo is a great idea that will hopefully serve as a reminder of Oxford's commitment to supporting well-considered machine learning progression. I hope it will inspire more events of its kind in the future.
Learn more about Professor Jirotka’s research here
Digital Wildfire Project: #TakeCareOfYourDigitalSelf
The Big Bang has long been taken to be our universe's beginning. However, recent Oxford University research, published in Physics Letters B, has revealed that our universe actually existed before the point known as the Big Bang. David Sloan, Postdoctoral Research Associate in Oxford's Department of Physics, discusses the thought-provoking findings.
In the 1960s Stephen Hawking and Roger Penrose proved the 'singularity theorems'. These theorems showed that Einstein's model of the early universe always reaches a point in the past at which it cannot continue. This point is what most physicists have taken to be the beginning of time. However, our findings have shown that although the interpretation of Einstein's work breaks down, the reality of physics continues.
There have been several previous attempts to resolve the problem of the limits of Einstein's model. In many cases scientists have created models that introduce new effects to gravity (such as string theory or loop quantum gravity), altering the models so that they never encounter this problematic point.
Our approach is significantly different in that we do not avoid the Big Bang, but rather continue our solution straight through it to what happened before. We introduce no new principles and make no modifications to Einstein's theory of General Relativity; we change only the interpretation that is put upon objects. Our findings do not dispute the occurrence of the event, only its position as the beginning of time. Our equations predict that the Big Bang was simply the moment where the orientation of space changed.
In other words, this new research resolves a dilemma in understanding the early universe not by creating a new model of it, but rather by reimagining the interpretation of Einstein’s existing model.
We separate the behaviour of entities in our early universe from the map that they make of the universe, and find that although the map breaks down, physics itself does not.
The technical reason why this is possible is that the equations Einstein developed contain terms which themselves cannot be calculated at the Big Bang – physical parameters such as energy density or curvature which tend towards infinity.
Importantly, however, there is a remarkable property of the equations that until now has been deeply hidden: All the terms that are problematic turn out to be irrelevant when working out the behaviour of quantities that determine how the universe appears from the inside.
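As a rough illustration only (a minimal sketch using the standard textbook Friedmann equation for a spatially flat, homogeneous universe, not the specific formulation used in the paper itself), the kind of divergence involved can be written as:

\[
\left(\frac{\dot{a}}{a}\right)^{2} = \frac{8\pi G}{3}\,\rho, \qquad \rho_{\text{matter}} \propto a^{-3}, \qquad \rho_{\text{radiation}} \propto a^{-4}.
\]

Here \(a(t)\) is the overall scale factor of the universe. As \(a\) tends to zero at the Big Bang, the energy density \(\rho\), and the curvature built from it, blow up, so these terms cannot be evaluated there. A dimensionless ratio of scales, by contrast (for example the ratio of scale factors along two different directions in an anisotropic model), need not blow up, and it is quantities of this kind, shapes and relative sizes, that remain well defined.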
A cosmic timeline of the evolution of our stars, galaxy and Universe after the Big Bang. Image credit: Shutterstock

When seen from the inside, there is no way to measure the overall size of the universe. All that we can see are the relative sizes of objects and their shapes. We then write the equations that determine how these relative sizes and shapes evolve purely in terms of one another, without ever referencing the overall scale.
When described just in terms of shapes and relative sizes, the universe approaches the Big Bang by flattening out like a pancake. Any three-dimensional object becomes effectively two dimensional at the Big Bang. Going through the Big Bang the object becomes three dimensional again, but will appear to be back-to-front.
We based our interpretation on these terms and found that there is a well-defined universe on the other side of the Big Bang, where the same interpretation of the theory can be applied. There we see that the universe before the Big Bang looks qualitatively similar to our own with some interesting differences.
There is an inversion of "chirality", meaning that objects that look right-handed in our universe will emerge left-handed on the other side.
Initial work has shown that thermodynamic quantities like entropy (which determine, for example, how refrigerators work, and the heat we get from the sun) are also inverted, so someone who lived in this universe would experience time that ran the opposite way to our own. From their perspective our universe would be their past.
In future work we hope to gain a better understanding of the details of this mirror universe. This observation has the potential to provide further insights into the nature of time in our universe and our own origins.
1) Eating too many bananas makes you grow more body hair by increasing levels of potassium.
2) Maggots are used in hospitals to clean infected wounds.
3) Excessive cycling can cause permanent damage to the muscles in the face.
Look at the statements above. One refers to current medical thinking, one is an idea from the past, and one has been made up entirely. But which is which (answers below)?
These are just some of the weird and wonderful statements put to people who play Mind-Boggling Medical History, a game developed by Oxford's Dr Sally Frampton and colleagues, funded by the Arts and Humanities Research Council.
Mind-Boggling Medical History is an educational game designed to challenge preconceptions about history and show how ideas in medicine change for a variety of reasons. From floating kidneys and wandering wombs to transplanted heads and dogs that detect diseases, the game challenges players to look at a series of statements and decide which concern current medical practice, which are based on historical ideas or practices no longer used, and which have been, well, made up.
Players can choose from a number of rounds related to different medical themes, including sex and reproduction, animals, mind, and treatment. A physical card pack is available to those working in education, nursing, public engagement and museums, and an online version is freely available to all.
The game draws on the interdisciplinary work of the Constructing Scientific Communities project, led by Professor Sally Shuttleworth of Oxford’s Faculty of English, which explores the concept of citizen science in the 19th and 21st centuries.
Dr Frampton said: 'Mind-Boggling Medical History originated as a public engagement activity for museum events. Because it had such a positive reception, we decided to apply for Arts and Humanities Research Council funding to help develop the game into a more sophisticated learning resource designed to aid critical thinking.
'The game is aimed at school students, nursing and medical undergraduate students, and museum visitors. Our collaboration with the Royal College of Nursing has been a really important part of the project and has helped us explore how the game might be used by healthcare students to get them thinking about the ways medical knowledge and scientific evidence change over time.
'Through the game we have tried to build on the objective of the Constructing Scientific Communities project of enhancing understanding of public engagement with medicine and science. We hope it will also show how historical facts and theories can be used to prompt questions about current understandings of medicine and science.'
Answers: (1) Fictional; (2) Present; (3) Past.
Two of Oxford's early-career academics have been chosen among this year's New Generation Thinkers by the BBC and the Arts and Humanities Research Council (AHRC).
The scheme gives researchers with a passion for communication a platform to share their ideas with a wider audience via BBC Radio 3 and other outlets.
Representing Oxford this year are theologian Dr Dafydd Mills Daniel and English scholar Dr Lisa Mullen. They are the latest in a total of ten Oxford academics who have been selected as New Generation Thinkers since the first cohort was announced in 2011.
Dr Daniel is researching Sir Isaac Newton the committed Christian and alchemist, while Dr Mullen is working on a monograph looking at the work of George Orwell through the lens of his complex medical history, examining how his experience of being a patient influenced his political thought and use of language.
Dr Dafydd Mills Daniel (AHRC / Steven Haywood Photography)

Dr Daniel is the McDonald Departmental Lecturer in Christian Ethics in Oxford's Faculty of Theology and Religion, as well as a theology lecturer at Jesus College. He said: 'I was absolutely thrilled to have been chosen as one of this year's New Generation Thinkers. To go to the BBC and to talk to producers and representatives from the AHRC about making arts and humanities programmes was an incredible experience.
'I am particularly keen to highlight what it means to study theology. Theology is a number of things, and one of those things is the opportunity to study a range of subjects, from languages, ancient texts and the rich tapestry of diverse religious traditions, to ethics, philosophy of religion, and the history of ideas.
'As a New Generation Thinker, my research concerns Sir Isaac Newton. Not Newton the rational man of science, who is often regarded as the founder of the secular age, but Newton the committed Christian, who was also an alchemist – staying up all night in his laboratory attempting to discover the philosopher's stone, which would turn ordinary metal into gold. I find the interplay between Newton the scientist, alchemist and theologian a compelling subject for research in itself. It also crosses over into wider areas of my academic work, which concern the history and development of such ethical and philosophical concepts as "reason" and "conscience" from the 17th and 18th centuries into the modern day.'
Dr Mullen, the Steven Isenberg Junior Research Fellow at Worcester College, said: 'It's a huge honour to be chosen, and I can't wait to get started. Communicating ideas is a key part of being an effective researcher, and the feedback and advice I've already had from people at the BBC and the AHRC has been really useful.
Dr Lisa Mullen (AHRC / Steven Haywood Photography)

'Like most academics, I'm always happy to talk about my particular area of research and why I find it fascinating, but the New Generation Thinkers award is an opportunity to plug into all kinds of different cultural conversations, and to think about how my work intersects with broader questions and debates. Questions about language, knowledge, power, the value of literature – these are all things that really got Orwell going, and they are just as urgent now as they were in the first half of the 20th century.'
The ten New Generation Thinkers for 2018 were selected after a nationwide search for the best academic ideas with the potential to be shared through the media. The winners will now have the opportunity to make programmes for BBC Radio 3 and other outlets, as well as contributing to wider media through the AHRC. In addition, the scheme partners with BBC Four, where some of the selected academics will be given the opportunity to present a programme for TV.
Alan Davey, Controller of BBC Radio 3, said: 'Radio 3's mission is to connect our audiences with pioneering music and culture, and since its launch in 2010, the New Generation Thinkers has been a central part of this. The scheme has supported and nurtured some extraordinary academic talent, giving the broadcasters of tomorrow a platform through which to present their fascinating and thought-provoking research to our listeners, and I can't wait to hear what ideas these ten exciting thinkers will bring to us in the coming year.'
Professor Andrew Thompson, Chief Executive of the AHRC, said: 'This scheme is all about helping the next generation of academics to find new and wider audiences for their research by giving them a platform to share their ideas and allowing them to have the space to challenge our thinking. The New Generation Thinkers scheme is also one of the AHRC's major vehicles for engaging the public with the inspiring research taking place across the UK. More than ever we need the new insights and knowledge that come from arts and humanities researchers to help us navigate through the complexities of our globalised world and address the moral and ethical challenges of today and tomorrow.'