The COVID‐19 pandemic has caused major disruptions for workplaces, creating legal and practical challenges around working safely. An increasing number of tools (both web‐based and smartphone applications) are emerging to help employers address these changing expectations and move towards a ‘new normal’. These tools capture and analyse data in new ways to monitor employees and processes, some using more sophisticated artificial intelligence (AI) techniques. This presents an opportunity to interrogate the potential and risks of using such applications in a wider context of decreasing public trust in digital and AI technology.
In this case study we focus on the Hygieia app as an example of such data‐driven tools. Blending theory and methods from different academic disciplines, we will critically evaluate the app’s features and explore user experiences during an ongoing trial in a workplace. The project will involve a rapid literature review on the ethical and lawful use of workplace monitoring tools, qualitative interviews with employers and employees, and critical analysis of app features. We aim to deepen understanding of how to maximise opportunities and minimise risks of using such tools, while building towards developing guidance and theory over the long‐term.
Development of workplace safety monitoring tools is evolving: first advertised as a means to improve workplace safety during the pandemic, these tools are now moving towards general health and safety in the workplace. The design of the Hygieia app reflects this – it has the classic look and feel of a corporate app, with built-in checklists to begin with, while more advanced features involving artificial intelligence are still under development.
To feed the artificial intelligence models, the app collects data from those checklists in an attempt to predict typical incidents and provide further checks to minimise them. This in turn yields an approximation of “safety”, however limited.
Responsibility for the data rests with the managers in the organisations that use the app. It is the managers’ job to create the necessary checklists, pass them on to employees, and monitor the responses. The developers, for their part, focus on legal compliance with privacy and data protection laws.
The app’s data is viewed as just that – data – possibly with little regard for the people who fill in those checklists, or for the wider unintentional harms the app may cause.
All in all, there are factors that could both increase and decrease end-user trust in such apps. On the one hand, the look and feel of the app, the developers’ compliance with laws and standards, and their attention to discourses around AI and big data may improve end-user trust. On the other hand, prioritising features relevant to the organisations buying the app rather than to end-users, the lack of detail on how the AI uses the data, and placing responsibility on the organisations using the app may lower end-user trust.
The project has potential impact in providing insights into developing AI-based systems in fast-changing environments, such as the COVID-19 pandemic. These insights identify areas of likely impact on trust, how the focus on requirements and environment is likely to evolve, and some of the specific challenges in generating suitable AI support when training data is limited – all of which can inform future systems development, allocation of resources, and policy focus.
This project is innovative because we used an autoethnographic approach in the analysis of the app, drawing on the team’s multidisciplinary expertise in psychology, computer science, law and digital health. The project also included in-depth interviews with developers.
The project’s next steps involve publishing the results in academic articles, building collaborations, and developing further research proposals based on this case study. The team is also developing guidance and feedback for the developers, employers, and employees.