Project values

Technology isn’t neutral.

With the combined challenges of the climate and ecological emergencies, the pandemic and mass disinformation; with rising poverty and global inequality; and with heightened national divisions and conflict, it seems unwise to drive forward ‘innovation’ without a clear ethical foundation.

Technology isn’t neutral; it is vastly more powerful in the hands of nations and giant corporations, whose financial and legal resources dwarf those of any small open source project, non-profit or contributor. As technologists we need to be honest about the potential consequences of our innovations, and about how often we underestimate their risks.


{values statement}

While anyone can publish a well-worded statement of ambitious goals, to sit at the heart of an organisation a values statement needs to be driven by those who will try to deliver it – and we’re still sandboxing. Such a statement not only shapes direction; progress should be regularly assessed against it. So it needs to respond to, and be written with, the founding team, advisory group and community who will take forward any project after the first MOVA project is complete. But that doesn’t mean we can’t start thinking about it…

Existing themes

Approaches to addressing ethics in tech continue to emerge – from more restrictive versions of the General Public License, like the Peer Production License, to frameworks like Kate Raworth’s Doughnut Economics and operating structures such as Platform Coops and Distributed Co-Operatives (DISCOs). The model of “Exit to Community”, described by Nathan Schneider, rather than an IPO or takeover, is appealing and is something the founding team could commit to. The DISCO Manifesto already offers some insights about responsible and sustainable tech development:

Lead from your values

Don’t assume your organisation’s values will emerge along the way if you lead with a profit model. Hold yourself accountable to those values as you grow (as if they were the shareholders).


Balancing work that generates income with work we enjoy isn’t enough; we need to allocate time and resources to caretaking – of ourselves as workers, of our communities and users, and of the environment we operate in.

Build for federation not scale

Successful digital entities go global; but most paths lead to a global entity built around the values of a single country, mindset and legal jurisdiction. Growth through federation ensures regional and local divergence is part of an organisation’s fabric.

Likely areas of focus for our values statement

Diversity and democracy

Reflecting the world’s web users, not the founders

Media, more than other industries, not only carries historical cultural baggage but is vulnerable to manipulation and unconscious bias. Having the broadest team, community and advisor diversity isn’t box-ticking but an essential path to ensuring we understand and reflect the world we exist in. And where there are gaps that can’t be filled internally, the channels for democratic engagement must be open, active and meaningful.

Planetary footprint

Tracking and compensating for CO2-equivalent emissions and resource, energy & water use

Any system with the potential to increase online video consumption will most likely have a negative planetary impact. A focus on the CO2-equivalent, water, resource, land and energy impacts of online video across its life-cycle – from data-centres, CDNs and network infrastructure to playback devices – is essential: first to assess impacts, then to reduce them, and to compensate for what remains.
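To make “assess, then reduce” concrete, a first-pass estimate can be built from a handful of intensity figures. The numbers below are placeholders for illustration only, not measured values – a real life-cycle assessment would need per-component data for the data-centres, CDNs, networks and devices actually involved:

```python
# Rough, illustrative sketch of streaming emissions per viewing hour.
# Every constant below is an assumption for illustration, not measured data.

GB_PER_HOUR = 3.0            # assumed data volume for an hour of HD video
KWH_PER_GB_NETWORK = 0.1     # assumed data-centre + network energy per GB
KWH_DEVICE_PER_HOUR = 0.05   # assumed playback-device energy per hour
GRID_KG_CO2E_PER_KWH = 0.4   # assumed grid carbon intensity

def co2e_per_hour_kg(gb_per_hour: float = GB_PER_HOUR) -> float:
    """Estimated kg CO2-equivalent per hour of video watched."""
    energy_kwh = gb_per_hour * KWH_PER_GB_NETWORK + KWH_DEVICE_PER_HOUR
    return energy_kwh * GRID_KG_CO2E_PER_KWH

print(round(co2e_per_hour_kg(), 3))
```

Even a crude model like this makes the levers visible: bitrate, network efficiency, device draw and grid intensity can each be assessed and reduced independently before any compensation is considered.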

Transparency and trustworthiness

Pull requests are the new complaints box

Open source offers a transparency and openness to participation that few organisations attempt, but its benefits are rarely seen or enjoyed by non-coders. Can we bring this to an entire organisation? The collapse in confidence in consumed media and the rise of business models built on disinformation invite technology solutions that are not only open, but transparent about their decisions, ownership, motivation and funding.

Free and safe speech

Anti-censorship, pro-accountability

As well as carrying helpful metadata, content identifiers can (and do) facilitate automated censorship, both by repressive regimes and by entities seeking to silence criticism. Any system using identifiers needs to be proactive against censorship while meeting the legal and social responsibilities around privacy, disinformation, hate speech and illegal content. Sometimes these priorities will conflict – which country’s definition of legal would a global decentralised registry follow? How would consensus be found, if it can be? We’re aware that the search for answers here will need input from, and transparent discussion with, a range of voices, in particular those most impacted by the decisions. If that can’t be offered because of gaps in language or expertise, then we shouldn’t be involved.

Unafraid to defend these principles

As tech isn’t neutral, we must be clear about what we believe in

For technologists, neutrality is a denial of the potential for harm in our work; ‘neutral’ tech in an unjust and unsustainable world protects not only the status quo, but the powerful entities with the resources to exploit that tech. With open source the risks are higher – our work can be used to attack the very ideals we care most about, so we need to be able to state them clearly. Technology needs to support democracy, justice and the right of all humans to a good, free and healthy life, in a sustainable relationship with our planet and each other. If we can’t be sure our work is doing that, then we shouldn’t be doing it.

This is a conversation that we’re just beginning – we’d love to hear your thoughts.