Values not eyeballs: a framework for public service digital media

Javier Ruiz is a consultant working on the interaction of technology, policy and human rights. He is the former policy director of the Open Rights Group and was a principal investigator in the EU Horizon 2020 project Values and Ethics in Responsible Technology in Europe (VIRT-EU). We have asked him to look at the Human Values work and provide his independent perspective.

Younger audiences are moving away from traditional television and radio towards online platforms such as YouTube and Netflix. These services have revolutionised the media landscape, offering flexibility, choice and quality. Unfortunately, they also carry many of the negative impacts of the internet, such as market concentration, promotion of addictive behaviours – including binge-watching – and loss of privacy. Unsurprisingly, the current approach to measuring impact, based on audience reach and maximising time spent online, drives those negative impacts and is biased towards technology companies. Public media organisations like the BBC face a stark choice: compete by reproducing those methods – with the associated problems – or change the rules of the game.

The Human Values (HV) framework developed by BBC R&D proposes a shift in the way that media products and services are designed and evaluated. Instead of simply looking at numbers, these alternative assessments focus on how a product contributes to well-being and helps people make positive choices in their lives. Following extensive empirical research, the HV team have created a list of 14 key “human values” relevant to the development and experiences of young people (16-34), and fundamental to all of us, regardless of age. Values here provide a link between basic psychological needs and behaviour, and include “Being safe and well”, “Receiving recognition” and “Having autonomy”.

Assessment Frameworks

The main innovation from the HV project is an assessment framework based on a psychometric approach that surveys the audience of a product, asking them questions related to the 14 “human values”, such as whether they feel that the product “equips me (with information) that helps me feel safe”. These questions are developed from an extensive survey of established psychometric scales for a variety of indicators, such as social connectedness, self-esteem, or well-being. This audience survey is designed to be used as an alternative or complement to other traditional metrics.
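To make the mechanics concrete, the sketch below aggregates Likert-style survey responses into per-value scores. The value names come from the article, but the specific items, the 1-5 scale and the simple averaging are illustrative assumptions: the actual HV instrument is not reproduced here.

```python
# Illustrative sketch only: the items and 1-5 Likert scale are assumptions,
# not the actual HV survey instrument.
from statistics import mean

# Hypothetical responses: per respondent, item ratings grouped by value.
responses = [
    {"Being safe and well": [4, 5], "Having autonomy": [3, 4]},
    {"Being safe and well": [2, 3], "Having autonomy": [5, 5]},
]

def value_scores(responses):
    """Pool each value's item ratings across respondents and average them."""
    totals = {}
    for r in responses:
        for value, items in r.items():
            totals.setdefault(value, []).extend(items)
    return {value: round(mean(scores), 2) for value, scores in totals.items()}

print(value_scores(responses))
# → {'Being safe and well': 3.5, 'Having autonomy': 4.25}
```

In practice a validated scale would involve reverse-coded items, weighting and reliability checks; the point here is only that each value yields its own score rather than one engagement number.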

This is an interesting approach and an improvement on current methods, but it also raises new questions. Asking people about their experience and feelings within a value framework is a good start, but moving from those subjective perspectives to specific elements of a product or service could require an additional layer of interpretation. Also, in some cases, people may not be aware that a product is manipulating them or that their data is being abused.

The metric part of the psychometric approach also warrants additional consideration. Numerical scores are attractive because they are easily understood, but quantifying ethical design assessments needs to be done very carefully. If a technical system causes severe problems in one particular area, a higher overall score does not mean that those issues should not be fixed. Fixed thresholds are particularly open to gaming and manipulation.
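The masking problem can be shown in a few lines. The value names are from the article, but the scores, the 1-5 scale and the floor of 2.5 are invented for illustration: a single average looks healthy while one value is in serious trouble, whereas a per-value floor surfaces it.

```python
# Hypothetical per-value scores on an assumed 1-5 scale.
scores = {
    "Being safe and well": 1.2,   # severe problem in one area...
    "Receiving recognition": 4.8,
    "Having autonomy": 4.6,
    "Expressing oneself": 4.9,
}

overall = sum(scores.values()) / len(scores)
print(f"overall: {overall:.2f}")   # → overall: 3.88 (looks healthy)

# A per-value floor (threshold chosen here arbitrarily) exposes what the
# single aggregate hides.
FLOOR = 2.5
flagged = [value for value, s in scores.items() if s < FLOOR]
print("needs attention:", flagged)  # → needs attention: ['Being safe and well']
```

This also illustrates the gaming risk mentioned above: any fixed floor invites optimising scores to sit just above it, so thresholds are a prompt for scrutiny, not a pass mark.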

The HV team is also developing a framework for senior staff that provides a formal assessment process for testing products and services against a series of questions related to each human value. This tool will be available to any organisation that wishes to see whether it is aligned with the Human Values approach.

This raises the question of the relationship between ethical design assessments like this and other mechanisms, for example data protection impact assessments, which tend to follow a model of risk identification, prioritisation and mitigation. Specifically, there is a need to consider the overlap between ethics and potential legal obligations, e.g. providing information in plain language on how people’s data is being used by the product or service.

The fundamental approach of the project is based on psychology, and its engagement with young people gives it a solid grounding, but it may also be useful to look at the extensive developments in the field of ethics and values in technology design, where some of these issues have already been explored.


The toolkit presented by the HV project includes a handbook, a canvas and a set of 14 ethical value cards with more general questions that aim to elicit critical thinking about the wider implications of the new products and services being created. Card sets with probing questions are a popular tool for this kind of ethical ideation exercise. Similarly, large paper canvases are used extensively for structuring group discussions.

The HV tools are well-crafted, but implementation of these and similar tools can be challenging and may require a wider effort across an organisation to develop the capacity of staff who may not be sensitised to ethical challenges. In some contexts, project members may not feel free to raise difficult questions with their managers. Additionally, although the HV toolkit is not designed specifically to increase diversity and inclusion, these participatory approaches will always be limited by the composition of the group involved. If the staff involved are not sufficiently diverse, the issues they raise may not reflect other demographics.

Creators can struggle to figure out what could go wrong and to propose solutions. This is particularly the case when values come into conflict, e.g. between allowing users to express themselves and protecting people from abuse. More sophisticated tools, speculative fiction and games can be useful in those situations.

Policy context

The HV project is very relevant to the current policy context, where there is widespread social demand for a change in our relationship with digital technology and the way it is regulated.

The GDPR has put data protection on the agenda of service providers. Besides the requirements of Data Protection Impact Assessments in some circumstances, one of the main new developments is the principle of designing technical systems with privacy in mind from the ground up, known as “data protection by design and by default”. In practice, most organisations struggle with implementing this requirement, and tools such as the Human Values toolkit can be useful.

There are also specific design requirements for children in the ICO’s Age Appropriate Design Code of Practice, which comes fully into force in September 2021 and will apply to many BBC internet services. The code requires providers of certain internet services to deal with the needs and data of children (including 16-17 year olds) separately from adults. The ICO code covers several areas included in the HV framework, such as user behaviour and mechanisms to raise concerns.

Digital services are also being targeted for regulation from a competition angle, with an enhanced role for the Competition and Markets Authority (CMA) and the creation of a new Digital Markets Unit. Although this may be less relevant to the BBC, there are some important challenges that will likely affect all internet services. For example, the CMA is looking into user tracking by market leaders such as Google and the impact it has on competition. The provision of a private/incognito mode of access, proposed in the HV assessment, can be undermined if users can be tracked elsewhere.

Finally, the upcoming Online Safety Bill will place new duties of care on the “design and operation” of any BBC services that enable user interaction. The values of “Connecting with Others”, “Belonging to a Group”, “Expressing oneself”, “Growing myself” or “Receiving recognition” all could be impacted. These new duties include carrying out various risk assessments and mitigation mechanisms to deal with harmful or illegal content, child safety, and ensuring free expression and privacy. The new regime will likely increase the level of complaints and conflicts over what is appropriate content and the survey approach introduced here could be useful to establish the right balance. There are also new requirements to enable user reporting mechanisms, enhanced record-keeping and to make policies available that chime with several HV areas.

The HV framework is also aligned with other frameworks: for example, the IEEE standard for design ethics in autonomous systems requires that creators of autonomous and intelligent systems “adopt increased human well-being as a primary success criterion for development”.


The Human Values project is a very good initiative that forms part of a serious effort by BBC R&D to redefine public service media beyond their own organisation. The framework might be improved by incorporating insights from existing approaches to values in design, but the main challenge for all ethical approaches to technology is the need for wider changes in the culture of organisations beyond the use of any tools.

The values and related materials cut across some of the main policy issues in media and technology. Although the framework is not structured along the typical contours of “tech policy”, we find data and privacy, design and information – addiction, filter bubbles, discrimination – and the challenges of managing user interaction that could lead to abuse, harm or misinformation. Some of these issues are now subject to extensive regulation requiring technology assessments that are aligned with the HV approach.

Many of these policy challenges are also tackled in the wider work of BBC R&D, which includes other forms of value in the modern data economy, rethinking a public internet service and dealing with trust and fake news.

The HV framework is work-in-progress and the BBC R&D team are looking for people to test it and break it, so they can learn together. This is a great opportunity for organisations and individuals working to change the shape of digital tech and media.