Inside planetary limits
Earth has boundaries. Compute is not free. Attention is not free. Trust is not free. We design to work within limits, and we plan for regeneration rather than extraction.
A memory-over-energy AI/ML platform inspired by biological neural networks. Systems that learn from every interaction instead of being frozen after one training cycle. Digital organisms, not static models.
The dominant AI story is scale: larger models, more data, more compute, more datacentres. Progress measured by training-run cost. The bill is paid in watts, water, attention and trust. OpenMorph is an attempt at a different answer: biological architectures that keep learning from every interaction, encode information in structure rather than in brute-force parameter counts, and stay small enough to run close to where they are used.
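The "keep learning from every interaction" idea can be sketched with a local, online learning rule. The code below is purely illustrative (the class name and rule choice are my assumptions, not OpenMorph's actual API): a tiny layer whose weights are nudged by an Oja-style Hebbian update on every input, rather than being frozen after a training phase.

```python
import random

class OnlineHebbianLayer:
    """Hypothetical sketch: a layer that updates on every interaction."""

    def __init__(self, n_in, n_out, lr=0.01, seed=0):
        rng = random.Random(seed)
        self.lr = lr
        self.w = [[rng.uniform(-0.1, 0.1) for _ in range(n_in)]
                  for _ in range(n_out)]

    def forward(self, x):
        # Plain weighted sums; a real system would add a nonlinearity.
        return [sum(wi * xi for wi, xi in zip(row, x)) for row in self.w]

    def interact(self, x):
        # Every interaction reshapes structure: a Hebbian "fire together,
        # wire together" term, plus Oja's decay term so weights stay bounded.
        y = self.forward(x)
        for j, row in enumerate(self.w):
            for i in range(len(row)):
                row[i] += self.lr * (y[j] * x[i] - row[i] * y[j] * y[j])
        return y

layer = OnlineHebbianLayer(n_in=4, n_out=2)
for _ in range(100):
    # A repeated stimulus strengthens its own pathway over time.
    layer.interact([1.0, 0.5, 0.0, -0.5])
```

The point of the sketch is the contrast with batch training: there is no separate "training cycle", only interactions, and information accumulates in the connection structure itself.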
Formalising the containment and lineage protocols (what happens when an organism evolves, when it interacts with external systems, and when it is copied or forked), alongside benchmark work on sparse-activation inference.
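Sparse-activation inference is often formulated as a k-winners-take-all step; the sketch below shows that common formulation as one plausible reading, though the actual benchmark work may use a different mechanism.

```python
def topk_sparse(activations, k):
    """Zero all but the k largest activations (k-winners-take-all).

    Downstream layers then only need to touch the k active units,
    which is where the inference savings come from. Ties at the
    threshold may keep slightly more than k units.
    """
    thresh = sorted(activations, reverse=True)[k - 1]
    return [a if a >= thresh else 0.0 for a in activations]

out = topk_sparse([0.3, 1.1, 0.4, 1.5, 0.2, 0.9], k=2)
# Only the two strongest units remain active.
```

Compute per inference then scales with k rather than with layer width, which is what makes small, locally-run models plausible.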
Knowledge that shapes how life unfolds should belong to life — not to whichever entity reached a market first. We default to the commons, and we license like we mean it.
Full set at /why.
If OpenMorph resonates — as inspiration, as a collaboration, or as a problem you are solving in parallel — we would like to hear from you.