
Why we drafted the Living Commons License

A notes-from-the-workbench post. What we thought we were doing, where it came from, and the open problems we have not solved.

Ethical software licences have been around for a while, and most of them are better than nothing. We wrote a new one anyway. This post is the honest explanation of why, what we think it does, and — equally important — which problems we have not yet solved.

The licence is the Living Commons License (LCL). It lives as a draft under MorphLab, the part of Petitgen that works on questions that are too open-ended for a product roadmap and too important to ignore.

The default pitch for ethical licences doesn't work

If you look across the existing landscape — the Hippocratic License, the Anti-Capitalist Software License, CC BY-NC, the Atmosphere License, various Do-No-Harm variants — you see roughly the same move: take a permissive base, bolt on a list of prohibitions, and hope enforcement works out somehow.

This is better than silence. It is also narrower than the problem. Each of those licences picks a single axis (human rights, capital structure, commercial use, climate) and writes around it. You end up with a patchwork that is hard to combine, hard to explain, and easy for a well-resourced user to pattern-match into compliance without actually changing anything.

The deeper issue: software licences encode values, and the values baked into the licences most of us reach for — MIT, Apache, GPL — were set in a period when software was not yet a planetary-scale force. "Free for any purpose" made sense when the worst outcome we worried about was getting sued by IBM. It starts to look different when "any purpose" includes training a model on biomedical literature to design a novel pathogen, or optimising a feedlot operation.

What the LCL is actually trying to do

Five principles, none of them optional, written in the preamble and enforced across the terms. The short version:

  • Life-first ethics. All living beings have intrinsic value. Technology should enhance, protect, and respect life in all its forms — not merely serve human economic interests. Explicit, not vibes.
  • Planetary boundaries. Earth has limits. Software consumes compute, water, attention, and trust — all finite. The licence treats these as budget lines, not externalities.
  • Commons over commodification. Knowledge that affects all life should default to the commons. Commercial exploitation is conditional, not prohibited — but it must give back.
  • Intergenerational justice. Decisions today reach beings not yet born. They are stakeholders in the licence, not an externality to apologise for later.
  • Transparency. Hidden harms break every other principle. If a user of the software cannot say what they built and why, the licence is not satisfied.

Then there is the refusal list — what the licence names as prohibited uses. We have published ours publicly because we think an ethic you cannot name is not an ethic. The things Petitgen will not build are the same things we think the LCL should refuse to license.

What is different, concretely

Three things distinguish the LCL from the field.

It is biocentric, not anthropocentric. Existing ethical licences treat humans as the protected class by default. The LCL extends standing to non-human animals, ecosystems, and future generations — and weights their interests concretely in the terms around prohibited uses and commercial conditions. This matters because AI-adjacent software increasingly acts on biological systems directly: gene synthesis, pathogen modelling, agricultural optimisation, conservation monitoring that doubles as poaching intelligence.

Commercial use is conditional, not binary. Most existing licences sit at one of two poles: fully permissive (Apache, MIT) or restrictive along a single axis (AGPL's copyleft, CC BY-NC's non-commercial clause). The LCL allows commercial use but requires: alignment with the principles, transparency reporting on impact, and a contribution to a commons fund or approved ecological cause. This is heavier than "free for everyone" and lighter than "non-commercial only" — and we think it matches how licensed software actually moves through the world.
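To make the shape of those conditions concrete, here is a purely illustrative sketch of what a machine-readable commercial-use declaration might look like. Every field name here is our invention for this post — the draft licence does not yet specify a schema.

```python
# Hypothetical LCL commercial-use declaration, sketched as plain Python.
# The three required conditions mirror the ones named in the text above.

REQUIRED_CONDITIONS = (
    "principles_alignment",   # statement of alignment with the five principles
    "transparency_report",    # link or path to the annual impact report
    "commons_contribution",   # record of the commons-fund or ecological contribution
)

def check_declaration(declaration: dict) -> list[str]:
    """Return the conditions a commercial-use declaration has not met."""
    return [c for c in REQUIRED_CONDITIONS if not declaration.get(c)]

# Example: a user that has reported impact but not yet contributed.
decl = {
    "principles_alignment": "We build conservation-monitoring tools only.",
    "transparency_report": "reports/2025-impact.md",
    "commons_contribution": None,
}
print(check_declaration(decl))  # → ['commons_contribution']
```

The point of the sketch is the structure, not the code: conditional commercial use means compliance is a checklist you can fail partially, not a binary grant.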

It is written for digital organisms. The Morph research line explores AI systems that evolve, replicate, and carry memory forward across deployments. Existing licences have nothing to say about an organism that forks itself into a context its originator never imagined. The LCL draft carries early-stage provisions for containment, lineage attribution, and termination — we expect those to be the first part to need real work from a lawyer and a community.

Problems we have not solved

We would rather list these than have them pointed out.

Enforcement is not figured out. Ethical licences live in a grey zone between copyright, contract, and community norm. The LCL does not yet specify who has standing to enforce, what remedies look like across jurisdictions, or how to handle actors who simply ignore it. We have read the existing literature, we have a pilot plan, and we have no confidence we will solve it alone.

Defining "harm" is a minefield. Medical research that harms lab animals to treat sick humans. Agriculture that harms some insects to feed others. Conservation that culls invasive species to protect natives. The licence currently punts these to a "weigh competing harms with transparent justification" clause, which is honest but not operational.

Monitoring is unsolved. We do not know, at scale, how to verify that a commercial user is abiding by the conditions they accepted. Annual reporting is a start and also a joke — everyone reports honestly until they don't.

Compatibility with the rest of the ecosystem. OSI-approved licences do not recognise use-restriction clauses. A project under the LCL is, strictly, not open source in the OSI sense. We think that is fine for the projects where the LCL fits — and we acknowledge the cost: it means LCL-licensed work cannot be freely vendored into permissively-licensed codebases, and it complicates contribution workflows.

Greenwashing is a real risk. A licence that says "serves life" is, at the limit, a compliance product for companies that want to look good. We think the refusal list and the commons-fund clause make this harder — but not impossible. We are watching for it.

Why publish a draft at all

Because writing it in silence would be a worse version of the problem we are trying to solve. Part of the Living Commons idea is that knowledge which shapes how life unfolds should not be held by the first mover. That includes the licence itself.

So: the LCL is a draft. We use it internally — some Petitgen public releases will go out under it once it is past v1.0 — and we invite anyone working on adjacent problems to push back, fork, reuse, or reject outright. The repo is in MorphLab. If you want to argue with us, the intake agent is the quickest path; email works too.


This is the first in a series of notes from the workbench. Future pieces will cover the mechanics of memory over energy in our substrate work, our current thinking on consent-to-remember for persistent AI, and whatever else we find ourselves arguing about. Index lives here.

Working on an adjacent problem?

We would like to hear from you. Disagreement included — we would rather have the conversation than the silence.

Start a conversation → Petitgen principles