Digital ethics collection | Article

Ethics by design

“Ethics by design” focuses on the design phase of digital and data tools. It directly concerns technology in all its technical complexity and the know-how of engineers, programmers and others. This approach therefore touches in particular on the deontology (duty-based ethics) of digital creators of all kinds (developers, digital designers, project managers, etc.). Indeed, they have an ethical responsibility from the design stage onwards, insofar as data or algorithms may or may not reproduce human biases, reveal new discriminations (or reproduce them on a larger scale), give rise to injustices, and so on.

Responding to the challenge

Ethical design is critical to the ethical use of emerging technologies and data. It is important to recognise that digital technology and data analytics are a means, not an end. With this in mind, the Socitm Local Public Service CIO Council (LCIOC) Simplify – Standardise – Share paper sets out a set of ‘Design Principles’ for local public service organisations making digital-technology-enabled investment decisions: Simplify, Standardise and Share:

  • Take a place first, organisation second approach to technology and data decision making and commit to making appropriate organisational trade-offs to achieve this.
  • Work towards converging to one digital transformation strategy and associated digital technology roadmap for a place, with investment decisions referenced against place-based outcomes rather than organisational priorities.
  • Insist that all solutions and providers commit to using an open standard approach that enables interoperability with other parts of the system. This includes specifying open standards for: requirements, architecture (including data models), APIs (to enable sharing information between systems), governance (as a basis for information sharing), and citizen requirements.
  • Where possible, work with citizens, local public service professionals and technologists to identify issues and collaboratively co-produce solutions that deliver the desired outcomes.
  • Recognise that many business capabilities across local public service organisations are very similar, offering an opportunity for simplification, standardisation and sharing.

“Ethics, like technology, is design – As we’re designing the system, we’re designing society. Ethical rules that we choose to put in that design [impact the society] … Nothing is self-evident. Everything has to be put out there as something that we think will be a good idea as a component of our society.”

Source: Sir Tim Berners-Lee, speaking to the International Conference of Data Protection and Privacy Commissioners, October 2018

A number of key resources are available to help embed ethical design at the heart of adopting emerging technologies and data solutions:

  • The Government Digital Service (GDS) Service Standard – Service Manual is a 14-point standard that helps teams create and run great public services. It focuses on user needs, as well as the team involved in the project and how to design a service.
  • The Open Data Institute Data Ethics Canvas helps identify and manage ethical issues – at the start of a project that uses data, and throughout. It encourages you to ask important questions about projects that use data, and reflect on the responses.
  • The Alan Turing Institute publication Understanding artificial intelligence ethics and safety is a guide for everyone involved in the design, production, and deployment of a public sector AI project: from data scientists and data engineers to domain experts and delivery managers.
  • Doteveryone has created the Consequence Scanning Manual, an iterative development tool to help organisations think about the potential impact of their solutions or services on people and society. It is for anyone directly or indirectly involved with the design of public sector digital and data solutions or services.
  • The Data Justice Lab data literacy guidebook provides an overview of different types of tools that aim to educate citizens about datafication and its social consequences. It is for anyone working directly or indirectly with data in the public sector, including data practitioners (statisticians, analysts and data scientists), policymakers, operational staff and those helping produce data-informed insight, to ensure the highest ethical standard of their projects.
  • The Midlands Accord and East Accord Digital Ethics Charter is a set of common principles for digital professionals and those working with “technology for public use”.
  • Delft University’s Ethical tools for designers can help you uncover, explore and discuss the ethical aspects of your designs. The tools are grouped by the skill they help you develop and can be used in different phases of the design process.
  • For a more detailed approach, see the SHERPA development guidelines, which address ethical design and practice issues across smart information systems. The guidelines set out six development steps (business understanding, data understanding, data preparation, modelling, evaluation and solution/service deployment) together with how ethical considerations come into play at each step.

Wider points to consider

The following points to consider are by no means exhaustive so should be seen as a starting point to further discovery…

Are the solutions and approach under examination proportionate and ethical?

  • Ensure the system is appropriate for the application (the function, the problem) under consideration
  • Will the solution work and be better than anything else for that purpose?
  • Take care to ensure that its use in the proposed context is lawful, safe, acceptable to stakeholders, wise, and free of harmful side effects

Do you understand the data context, and are you rigorously testing it? This is critical to the integrity of any process. (Data use needs to be examined on a case-by-case basis by addressing the following points in tandem with the proportionality question above.)

  • What are the precise calculations and data processing methods used in the program?
  • What is the exact nature of the data used in the development and testing of the program, and in its intended operation?
  • What is the purpose for which the system will be used?
  • Remember the social, political, professional, environmental and operational process and practice contexts within which it will be used: its use will change these contexts, and they are where the consequences will be felt.

Are the risks of bias in the datasets identified and addressed?

  • Train policymakers, designers and practitioners to identify, define and understand potential algorithmic risks from the outset; this is essential if we are serious about avoiding discriminatory and adverse personal outcomes.
  • Carry out rigorous impact assessments (in accordance with official guidance and the GDPR) to analyse the possible design-induced discriminatory impacts of algorithms
  • Put in place checks and balances at each stage of development to ensure there is no bias in the results
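One such check can be automated at each development stage. The sketch below is illustrative only (the group labels, data and threshold are assumptions, not part of the guidance): it computes per-group selection rates from outcome data and applies the widely used "four-fifths" rule of thumb, flagging a disparate impact ratio below 0.8 as a potential sign of bias.

```python
from collections import Counter

def selection_rates(records):
    """Compute per-group selection rates from (group, selected) pairs."""
    totals, selected = Counter(), Counter()
    for group, was_selected in records:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact_ratio(records):
    """Ratio of the lowest to the highest group selection rate.

    The 'four-fifths' rule of thumb treats a ratio below 0.8 as a
    potential indicator of bias worth investigating.
    """
    rates = selection_rates(records)
    return min(rates.values()) / max(rates.values())

# Illustrative outcome data: (demographic group, was the applicant selected?)
outcomes = [("A", True), ("A", True), ("A", False), ("A", True),
            ("B", True), ("B", False), ("B", False), ("B", False)]
ratio = disparate_impact_ratio(outcomes)
print(f"disparate impact ratio: {ratio:.2f}")  # group B selected far less often
```

A check like this is a tripwire, not a verdict: a low ratio should trigger the human impact-assessment work described above, not an automatic conclusion of bias.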

Can the operating rationale of the algorithms deployed for artificial intelligence be explained?

  • Have a system explainability policy encompassing the whole chain (data provenance, explanation of the reasoning followed)
  • Develop algorithms that are transparent by design, in order to make it easier to explain them and to analyse how they reason
  • Adopt a labelling (with an ethical scoring/rating system) and ethical support approach
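To make "transparent by design" concrete, one minimal pattern is to express decision logic as explicit, human-readable rules and record which rule produced each outcome. The example below is a hypothetical sketch (the rules, field names and thresholds are invented for illustration), not a recommended eligibility policy:

```python
def assess_application(applicant, rules):
    """Apply explicit, human-readable rules and record the reasoning.

    Returns (decision, trace) so every outcome can be traced back to
    the rule that produced it -- 'transparent by design'.
    """
    trace = []
    for description, predicate in rules:
        passed = predicate(applicant)
        trace.append((description, passed))
        if not passed:
            return "refer", trace   # refer to a human rather than auto-reject
    return "approve", trace

# Illustrative rules for a hypothetical benefits pre-check.
rules = [
    ("resident in the service area", lambda a: a["resident"]),
    ("income below threshold", lambda a: a["income"] < 25_000),
]

decision, trace = assess_application({"resident": True, "income": 30_000}, rules)
print(decision)  # refer
for step, ok in trace:
    print(f"  {step}: {'pass' if ok else 'fail'}")
```

The point of the trace is that an affected citizen, auditor or caseworker can be shown exactly which condition failed, which is the kind of whole-chain explanation the policy above calls for.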

Does the organisation offer training programmes on ethics in the creation of digital tools?

  • Set up training workshops and/or skills refresher courses within the IT and related Service Departments

Are solution designers representative of the social, ethnic and gender diversity of society?

  • Draw up an HR policy ensuring social and gender diversity in the workplace

Are new projects evaluated for their impact on privacy and personal data?

  • Consider setting up an ethics committee to approve sensitive projects

Do tools and solutions protect personal data by design, and is the right to be forgotten factored into the design chain?

  • Adopt a privacy by design approach, in accordance with the requirements of the GDPR: this means building the protection of personal data into products and services by design, but also by default (notably by abiding by the data minimisation principle introduced by the GDPR).
  • Remember that this is also a cultural challenge, because the concept needs to be factored into a project early on
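In code, privacy by design and default often starts at the point of intake: keep only the fields the service actually needs, and replace direct identifiers before data enters the pipeline. The sketch below assumes invented field names and uses a salted hash as a simple pseudonymisation step (note that under the GDPR pseudonymised data is still personal data, so this reduces rather than removes risk):

```python
import hashlib

# Fields the service actually needs; everything else is dropped at intake
# (the data minimisation principle). Field names are illustrative.
REQUIRED_FIELDS = {"postcode_district", "age_band"}

def minimise_and_pseudonymise(record, salt):
    """Keep only required fields and replace the identifier with a
    salted hash, so raw identity never enters the processing pipeline."""
    kept = {k: v for k, v in record.items() if k in REQUIRED_FIELDS}
    token = hashlib.sha256((salt + record["national_id"]).encode()).hexdigest()[:16]
    kept["pseudonym"] = token
    return kept

raw = {"national_id": "AB123456C", "name": "Jo Bloggs",
       "postcode_district": "M1", "age_band": "30-39", "email": "jo@example.com"}
safe = minimise_and_pseudonymise(raw, salt="per-project-secret")
print(sorted(safe))  # ['age_band', 'postcode_district', 'pseudonym']
```

Keeping the salt per project also supports the right to be forgotten: destroying the salt severs the link between the pseudonym and the original identifier.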

Does the correlation of data collected from various sources result in the production of personal information (as part of big data and AI projects, for instance)?

  • Put in place a system that measures the personalisation of data after processing operations
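One way to measure this, sketched below with invented data, is a k-anonymity check over the merged dataset: if any combination of quasi-identifiers (attributes that are individually harmless but identifying in combination) is shared by only one record, the correlation has effectively produced personal information.

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Smallest group size over the quasi-identifier combination.

    k = 1 means at least one person is uniquely identifiable from the
    combined attributes, i.e. the merged data has become personal data.
    """
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

# Illustrative merged dataset from two hypothetical sources.
merged = [
    {"postcode": "M1", "age_band": "30-39", "service": "housing"},
    {"postcode": "M1", "age_band": "30-39", "service": "parking"},
    {"postcode": "M2", "age_band": "60-69", "service": "housing"},  # unique
]
print(k_anonymity(merged, ["postcode", "age_band"]))  # 1
```

Running a check like this after each processing or linkage step gives the measurement system the point above asks for: a falling k value is a signal that combined data is becoming personal and needs the privacy safeguards described earlier.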