Community staff recently completed a training course delivered by Dr Christina Colclough, head of the Why Not Lab and an expert in the interrelationship between work and technology. Staff from across Community have been trained up as tech champions, empowered to tackle digital issues in all of our workplaces.
Over the course of the sessions, we explored the data lifecycle at work: which technologies are used in workplaces, the different stages of the process, and when we can ask relevant, useful questions about our members’ data and their rights at each stage. We also drew out the ways in which we as a union can ensure that workers have a seat at the table: the questions we should ask, the points we should raise, and the structures we should set up, such as a tech forum, so that we have a space to work with management to shape technology change.
For me, a particular highlight of the training was the algorithm game. In groups, we set about designing the steps of a recruitment algorithm, which showed us just how much room there is for biases to creep in, and how formulating shortcuts and rules to categorise and classify candidates can so easily create gross unfairness. This unfairness in recruitment is another crucial reason why trade unions like Community are taking this conversation so seriously: it’s about ensuring workers’ rights are protected even as the world of work changes around us.
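To make that concrete, here is a minimal, purely illustrative sketch of the kind of thing the game highlighted. It is not an algorithm from the training, and every rule, weighting and name in it is hypothetical; the point is simply that rules which look like neutral efficiency shortcuts can quietly disadvantage whole groups of candidates.

```python
# Hypothetical illustration: a naive rule-based CV screening score.
# Each "shortcut" below looks like a neutral efficiency rule, yet each one
# can systematically disadvantage groups of candidates (for example carers
# with career breaks, or people from less "prestigious" universities).

PREFERRED_UNIVERSITIES = {"University A", "University B"}  # hypothetical shortlist

def screening_score(candidate: dict) -> int:
    score = 0
    # Rule 1: penalise gaps in employment - hits parents and carers
    # who have taken career breaks.
    if candidate.get("employment_gap_months", 0) > 6:
        score -= 2
    # Rule 2: reward degrees from a shortlist of universities - filters out
    # equally capable candidates from other backgrounds.
    if candidate.get("university") in PREFERRED_UNIVERSITIES:
        score += 3
    # Rule 3: reward years of experience - indirectly disadvantages younger
    # workers and career changers.
    score += min(candidate.get("years_experience", 0), 10)
    return score

# Two equally capable candidates can end up with very different scores:
candidate_a = {"university": "University A", "employment_gap_months": 0, "years_experience": 5}
candidate_b = {"university": "University C", "employment_gap_months": 12, "years_experience": 5}
print(screening_score(candidate_a))  # 8
print(screening_score(candidate_b))  # 3
```

None of these rules mentions a protected characteristic, yet together they can produce exactly the kind of gross unfairness the game was designed to expose.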
During the sessions we talked about algorithmic management, an umbrella term for the whole range of technologies used to discipline, measure, track, hire and fire workers. Recently, we’ve seen examples of smart cameras being used to observe workers’ behaviour in invasive ways that threaten their right to privacy. We considered the key questions that we all need to be asking of management throughout the process, for example: what rights of redress can we secure if this goes wrong?
Christina equipped us with a whole range of tools to enforce our members’ rights. We talked about how we can use the articles of the GDPR, including by running a transparency survey with shop stewards (based on Article 13, which covers the information that must be provided when personal data is collected). And we explored the importance of going beyond this to negotiate stronger collective data rights where there are gaps in the law, such as the ability to object where workers are subject to inferences that are not directly related to their own personal data.
Running this training at Community was critical: if we don’t know what technologies are used at work, we cannot go on to protect workers’ rights, dignity, and autonomy. Sometimes the conversation about digital tools at work sounds remote, or isolated from our workplaces. But the examples that stood out during the sessions show that these issues are happening in our workplaces today.
Another key takeaway for me was how much it matters how we talk about this issue, so that people can see how critical it is. Does talking about automation and technology leave you cold or confused? Perhaps it’s better to talk about digital?
This training course followed the work that Community has done developing recommendations for bargaining around technology at work. The next step in this important process will be the reps training due to be delivered this year, equipping our reps with the same tools to address digital systems at work.
Thank you to Christina and to all the staff who joined the sessions and took the time to share such powerful insights which we hope we can translate into real changes in our workplaces going forwards.
If you’re a workplace rep, check out Christina’s guide to workplace co-governance of algorithmic systems and Community’s guide to bargaining around technology in the workplace, and reach out to the research team if you want to discuss digital at work.
If you are a member of Community and need help or advice, please contact us at help@community-tu.org or on 0800 389 6332.