With the combined challenges of the climate and ecological emergencies, the pandemic and mass disinformation, rising poverty and global inequality, and heightened regional tensions, it seems impossible to drive forward ‘innovation’ without an ethical foundation.
Technology isn’t neutral: it’s vastly more powerful in the hands of those with huge financial and legal resources than it can ever be in the hands of a small open source project, non-profit or contributor. As technologists we need to be honest with ourselves about the potential impact of our innovations; the last few decades have shown they’re regularly more harmful than we anticipate.
While anyone can publish a well-worded statement of ambitious goals, to be at the heart of an organisation, a values statement needs to be driven by those who will try to deliver it – and we’re still pre-launch. The statement will shape our direction and we’ll actively assess progress against it. So it needs to respond to, and be written with, the founding team, advisory network and community who continue after the first MOVA project is complete. But that doesn’t mean we can’t start thinking about it…
Don’t assume your organisation’s values will emerge along the way if you lead with a profit model. Hold yourself accountable to these values as you grow (as if they were the shareholders).
Balancing work that generates income with work we enjoy isn’t enough; we need to allocate time and resources to caretaking – of ourselves as workers, our communities, our users and the environment we operate in.
Successful digital entities go global, but most paths lead to a global entity built around the values of a single country, mindset and legal jurisdiction. Growth through federation ensures regional and local divergence is part of an organisation’s fabric.
Media, more than most sectors, carries historical cultural baggage, and is vulnerable to manipulation and unconscious bias. Having the broadest team, community and advisor diversity isn’t box-ticking but an essential path to ensuring we understand and reflect our users. And where there are gaps, we need to ensure the channels for democratic engagement are fully open, well-maintained and work properly.
Any system with the potential to increase online video consumption needs strong awareness of the related impacts across the whole life-cycle – data-centres, CDNs, network infrastructure and playback devices – and a strategy to reduce and compensate for those impacts.
Open source offers transparency and open participation few organisations attempt, but its benefits are rarely seen or enjoyed by non-coders. Can we bring this to an entire organisation? The collapse in confidence around consumed media and the rise of business models built on disinformation invite technology solutions that are not only open but transparent about their decisions, ownership, motivation and funding.
Content identifiers could help facilitate automated systems of censorship by repressive regimes and entities seeking to silence criticism. We need to be proactive in standing against censorship, while balancing the social and legal responsibilities around privacy, disinformation, hate speech and illegal content. Sometimes this could create conflicting priorities – which country’s definition of legal should be followed in a global decentralised registry? We’re aware the answers may not be simple, but they will need input from, and transparent discussion with, a range of voices, in particular those most impacted by the decisions. If that can’t be offered because of gaps in language or expertise, then we shouldn’t be involved.
There was no climate emergency or sweatshop factory labour before the innovation-led industrial revolution. For technologists, claiming neutrality is a denial of the potential for harm in our work; neutral tech in an unjust world is a position. With open source the risks are higher – our work could be used to attack the ideals we hold dearest: democracy, justice and the right of all humans to a good, free and healthy life, in a sustainable relationship with our planet.