Edinburgh professors to lead AHRC programme on AI
Two Edinburgh researchers will lead a project to ensure artificial intelligence (AI) and data are used responsibly and ethically across society and industry.
The Arts and Humanities Research Council (AHRC), part of UK Research and Innovation (UKRI), has appointed Professors Shannon Vallor and Ewa Luger to direct the £3.5 million programme ‘Enabling a Responsible AI Ecosystem’ in collaboration with the Ada Lovelace Institute.
The Programme Directors will work closely with the Ada Lovelace Institute, selected by AHRC as a Collaborating Partner for the Programme, to define and shape the research priorities and themes, and deliver other activities to support a responsible AI ecosystem.
Mobilising expertise
The research programme will mobilise expertise from arts, humanities and social sciences to create an environment within which AI and data-driven innovation is responsible, ethical, and accountable by default.
By harnessing the expertise of researchers and innovators from a range of disciplines, the programme will develop a responsible AI ecosystem that is responsive to the needs and challenges faced by policymakers, regulators, technologists, and the public.
The programme includes experts in philosophy, human-computer interaction, law, art, health, and informatics.
AI ecosystem
The programme will build connections between academia, industry, policy, regulation and the public to help build confidence in AI, enable innovation, stimulate economic growth and deliver wider public benefit.
Professor Shannon Vallor holds the Baillie Gifford Chair in the Ethics of Data and Artificial Intelligence in the School of Philosophy, Psychology and Language Sciences and is Director of the Centre for Technomoral Futures at the Edinburgh Futures Institute.
Professor Ewa Luger holds a personal chair in Human-Data Interaction within the School of Design, is Co-Director of the Institute of Design Informatics, and Director of Research Innovation for Edinburgh College of Art.
For AI technologies to be successfully integrated into society in ways that promote shared human flourishing, their development has to be guided by more than technical acumen. A responsible AI ecosystem must meld scientific and technical expertise with the humanistic knowledge, creative vision and practical insights needed to guide AI innovation wisely. This programme will work across disciplines, sectors and publics to lay the foundations of that ecosystem.
We have reached a critical point in responsible AI development. A foundation of good practice now exists, but it is rarely connected to the sites where innovation and change happen, such as industry and policy. We hope that this programme will make new connections, creating an ecosystem where responsibility is not the last, but the first, thought in AI innovation.
The three-year programme is supported by AHRC, part of UK Research and Innovation (UKRI), to develop research into the challenges and opportunities around AI and data-driven technologies.
Enabling a Responsible AI Ecosystem is the first large-scale research programme on AI ethics and regulation in the UK.
Artificial intelligence is already transforming the ways we live and work, revolutionising sectors as diverse as health care, education, and the creative and entertainment industries. The potential benefits of these huge strides in technological capability can only be fully realised if all parts of society, including the public, can trust that these technologies have been developed responsibly from the outset. Under Shannon and Ewa’s expert leadership, the AHRC Enabling a Responsible AI Ecosystem programme will work to ensure that the development and deployment of AI and related data-driven technologies are responsible, ethical and accountable by default.
The project will be delivered with the Ada Lovelace Institute to broaden the existing foundation of responsible AI research and contribute to the wider UK vision.
There is a real opportunity for the UK research community, in collaboration with policymakers and industry, to lead the way in developing a responsible ecosystem capable of ensuring AI works for people and society. We are delighted that Professors Shannon Vallor and Ewa Luger have been appointed as Programme Directors for this major AHRC-funded research programme. We look forward to working closely with both AHRC and the Programme Directors to define and shape the strategy, identifying and amplifying diverse perspectives, engaging with the existing ecosystem and influencing policy and practice.
Related links
Arts and Humanities Research Council (AHRC), part of UK Research and Innovation (UKRI)
School of Philosophy, Psychology & Language Sciences
Image credit: L-R, Professor Shannon Vallor and Professor Ewa Luger (image courtesy of the University of Edinburgh).
General AI image - Alexa Steinbrück / Better Images of AI / Explainable AI / CC-BY 4.0