new website - confidentialinference.net

I’m building several projects in the confidential inference space (enclava.ai, Mango AI), so I tend to stay on top of the providers in the space, what they offer, and how the industry is developing. I’ve had many conversations with friends building AI things about inference providers, explaining which ones to use or, more frequently, just letting them know that alternatives exist. So I decided to build a website: confidentialinference.net. Pricing in the confidential inference space is very… varied (to put it mildly). And to make things even more interesting, the number of companies that offer confidential models is significantly larger than the number that actually run the infrastructure rather than just reselling other providers with a markup. Not to mention the even smaller group: companies that actually expose all the data needed to perform real remote attestation, so we can be sure the models are confidential and can verify their stack. That number is smaller still. ...

April 19, 2026

The end of 'trust me bro' - confidential computing for everyone

Since the dawn of computing we’ve been handing over our data to machines we don’t control. First it was the mainframe in the basement, then the server in the data center, now the container in someone else’s cloud. The bargain has always been the same: convenience in exchange for trust. You trust that the admin won’t peek at your data. You trust that the code running is what they claim it is. You trust that nobody compromised the infrastructure while you weren’t looking. ...

February 6, 2026