
I'm never quite sure if the phrase is that we're blessed or we're cursed to live in interesting times. Either way, we're definitely here and it's definitely interesting.

Both the internet and the web were purposefully designed to be decentralized: no one could take control of them, let alone turn them off. Since then we've witnessed many power grabs, from privately owned social networks to nation states 'protecting' their citizens, interspersed with waves of community action and associated projects to re-decentralize things.

Decentralizing done well is a means to some very welcome ends, but it has proven to be a harder design challenge than centralizing. And just as you think you're making progress, along comes the next centralizing innovation.

The rate of generative AI adoption is faster than it was for PCs or the internet, and yet training the large language models (LLMs) brute-forcing this phenomenon requires so much power that retired nuclear power stations are being recommissioned. Clearly, unless we all get a nuclear fusion machine under the stairs any time soon, LLM training is going to remain the domain of a tiny minority.

It's a new age of centralization. But maybe not for long ...

Yes, given their training biases, we need to be careful how and where we apply LLMs. And yes, they make mistakes, and increasingly so. Nevertheless, I'm optimistic that they may be the most wonderful stepping-stone technology. But stepping to where exactly? And how?

My colleagues and I at Unnamed Labs launched Transition9 today in partnership with the wonderful Kernel community. Please check it out and don't hesitate to join in. If your organization would like to be considered as a partner, please get in touch.