Bridging the ethical divide

Many organisations including Socitm are working to provide guidance on digital ethics, writes William Barker

Whether it is the issue of preventing the misuse of Covid-19 technology and data, the need to combat algorithmic bias in the development and design of public services or controversy around discriminatory facial recognition, the issue of digital ethics is never far away from the headlines.

Taking a step back can help us focus on the role of ethics. Quite simply, these are moral principles that govern a person’s or group’s behaviour and inform decision making. The ethics of digital technology and data cover ethical aspects of technological design and use, together with the ethical impacts of digital technology on society as a whole.

Organisations working to provide guidance

Given the global nature of this challenge, international bodies such as the Organisation for Economic Co-operation and Development (OECD) and the Institute of Electrical and Electronics Engineers (IEEE) have identified broad principles and standards on digital ethics. These principles have been adopted by the G20 group of industrialised countries and have recently led to the creation of the Global Partnership on Artificial Intelligence, which the UK, US, EU and other partners have established to champion responsible AI and data governance.

Alongside this, official bodies such as the Office for Artificial Intelligence, the Centre for Digital Ethics and the Information Commissioner’s Office are working closely with the Digital Ethics Lab, the Alan Turing Institute, the Open Data Institute and Digital Catapult to champion digitally ethical practice across the UK public sector.

Framework of core values

As a result, we are seeing an emerging digital ethics framework, built around the following core values:

  1. Beneficence: do good
The benefits of a piece of work should outweigh its potential risks. Digital, data and technology projects that touch people’s wellbeing, finances, relationships and health particularly require robust ethical principles.
  2. Non-maleficence: do no harm
Risks and harms need to be considered holistically, rather than just for the individual or organisation. Mitigating privacy and security risks requires social and organisational design, not just technological safeguards.
  3. Autonomy: preserve human agency
    To make choices, people need to have sufficient knowledge and understanding. It is important to involve stakeholders and interest groups in ethical risk assessment and design.
  4. Justice: be fair
    Specific issues include algorithmic bias and equitable treatment. Consider whether a technology could produce or magnify unequal outcomes, and if so, how to mitigate this.
  5. Explicability: operate transparently
    Be ready to explain a system’s working as well as its outputs. Make all stages of the implementation process open to public and community scrutiny.

Socitm is developing a digital ethical practice collection on its resource hub, built around these five core values, to help leaders, policy-makers and practitioners bridge the divide between theory and practice and embed digital ethics at operational and place-based levels. The collection aims to raise awareness among political and executive leaders, encouraging them to dedicate more resources to identifying and addressing digital ethics concerns, not only across their own organisations but also in the wider communities they serve.

It also aims to educate policymakers and those in legal and policy teams on how to build governance structures that incentivise organisations to address digital ethics issues and meet the needs of local people and communities in inclusive and just ways. Finally, it offers technology practitioners, data analysts and developers techniques to detect, address and mitigate ethical concerns in AI and big data projects.

William Barker is an associate director of Socitm and its practice lead on digital ethics. Read his full Briefing Paper, which looks in detail at the ethical use of emerging technologies and data.

Read more:

This is a feature in issue 21 of In our View.