While the tech world debates whether A.I. is the biggest bubble since the Dutch tulip mania or the path to a few Tech Bro billionaires owning the planet, we thought a different point of view would be refreshing.
We believe artificial intelligence will diffuse to everyone – or at least to everyone who wants or needs it. A.I. will be democratized.
We believe it because we are witnessing it – experiencing it. The democratization of A.I. has begun.
Our point of view begins with tech diffusion cycles.
We attended the birth, or arrived not long after, of the enterprise software industry.
Hardy souls at a now forgotten company called ADR, Applied Data Research, sold products called LIBRARIAN and ROSCOE.
If you are over 60 and have played around in mainframe software, you have likely used them.
IBM didn’t think much of ADR’s products – which optimized hardware. IBM sold less of that hardware because ADR enabled computers to do more – so a lawsuit followed.
Surprise! ADR won and the enterprise software industry emerged.
Big vendor, trying to stop the little guy, lost.
If you are over 40, you likely recall the days when there were no personal computers – the smallest computer was a DEC minicomputer, then a Sun workstation, among scores of others.
Tech democratization began – everyone, or most everyone now has a computer at home. Computers are in cars and watches.
Music was once only available on vinyl records controlled by the record companies – today nearly every song ever recorded can be downloaded to your phone.
Music video production was once the realm of large studios. Just last week the Wall Street Journal featured Piper Ally, a young woman who plays rock music on bagpipes – she used TikTok to become a thing, and lots of people now love her music, because music video production has been democratized.
We could go on endlessly – but you get it – almost every tech hurdle eventually falls to where the money is – the mass market.
The same is happening in artificial intelligence – we see it from our little corner and we want to share it with you here on this Substack – where we document how A.I. is happening below the surface of the lazy tech press.
According to the early market entrants, A.I. has at least three hurdles limiting it to a wealthy few:
- A.I. needs tech wizards who are reportedly hiring agents – not the A.I. kind but the sports agent kind – to negotiate their breathtaking contracts.
- A.I. needs data centers to drive massive LLMs – large language models – and those data center costs are prohibitive to the masses.
- A.I. needs electric power so enormous that entire towns and landscapes must be sacrificed – or America falls behind China, and in a few years China owns the world.
We at Fractal witness a very different reality, which we share here with our fast-growing readership. (Watch for the P.S. – post script – at the end of this article for an important announcement.)
Currently the most visible form of A.I. is the LLM kind – like ChatGPT and scores of others. Their makers preach that aggregating ALL information to answer ANY possible question is the future.
Elon Musk, for instance, is building a data center near Memphis to store all knowledge and wisdom – that is part of the current mindset.
The question “What is the purpose of mankind?” is the type of thing these LLMs seek to answer.
If that is the question, you better have a planet-sized data center and million-dollar-a-year wizards for sure.
What the tech press misses is that this is not where the A.I. money is.
Look, Kierkegaard, Nietzsche and Hegel couldn’t agree on this stuff, so no data center is going to kick out the answer.
The current craze of asking any question of Google Gemini is a lot of fun, but as article after article shows, nobody is making dough doing it.
That is why you see articles about how Oracle is in waist-deep water now that smart guys are discovering that the numbers – the A.I. LLM kind, with data centers as big as aircraft carriers – DO NOT ADD UP.
Just this week, our pals at Oracle reportedly backed away from the data center construction bar before closing time, pushing the announcement about their new data centers for ChatGPT to 2028 – and you can bet it will never happen.
The dough in A.I. is not in LLMs and Chat GPT stuff.
You don’t have to believe us – the articles are everywhere. Nobody is making any money doing it. Nobody.
Those are fun applications.
This is the same trajectory as the first personal computers, which were about games – like flying airplanes.
Kids – who later ran tech companies – we have one or two at Fractal – bought Heathkits and built these computers in their bedrooms.
Along came Lotus 1-2-3, followed by WordPerfect, Microsoft desktop products, and entirely new ecosystems on the desktop – delivering previously unimaginable wealth.
Apple Computer, for instance.
The future – which we encounter daily – is people solving real world problems with A.I.
Almost nobody writes about them.
Real world problem solutions make money – like routing a call to the correct agent based on a few spoken words, even picking up a regional accent.
Real world problems have little attributes called “constraints.”
The moment you introduce constraints to A.I. several things happen:
- You do not need wizards to build them, you need guys and gals who know the niche – as we show below.
- You do not need LLMs, because you are solving a specific problem – so you can use far smaller, precise models – and those need far less data center capacity and far less power. With Fractal, you skip the data center altogether.
- When you solve real world problems, people pay you real money – so you grow and become a market player.
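A toy sketch of what a constrained problem looks like in practice. Everything below – the department names, the keyword lists – is invented for illustration; a real system would use a small speech-to-text and intent model rather than bare keywords:

```python
# Hypothetical sketch of constrained call routing: the problem space is
# three known departments, so a tiny rule set (or a small model) suffices.
# All department names and keywords here are invented for illustration.
ROUTES = {
    "billing": {"bill", "invoice", "charge", "payment"},
    "cancellations": {"cancel", "unsubscribe", "stop"},
    "support": {"broken", "help", "error"},
}

def route_call(transcript: str) -> str:
    """Return the department whose keywords match the caller's words."""
    words = set(transcript.lower().split())
    for department, keywords in ROUTES.items():
        if words & keywords:
            return department
    return "general"  # fallback queue when nothing matches

print(route_call("I want to cancel my subscription"))  # cancellations
```

The point is the constraint: three departments and a known vocabulary – no planet-sized model required.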
Let’s take an example.
A consulting company came to Fractal with the “know how” to comb through hospital bills and identify 10% to 17% of missed revenue. They have clients any health care organization would die to have.
Their stuff works every time.
Because missed revenue is identified after the fact, the hospital can typically only recover about a year back.
This company wants to use quantum speed on current hardware to identify missed revenue the moment a claim is entered.
That’s real money – and hospitals run on margins like Target and Walmart, which means they don’t make a lot on each transaction – or procedure.
The consulting company wants to create an A.I. system to do this at scale.
That is an application worth doing and it could generate billions of dollars in revenue for them by solving a huge problem.
What’s needed is their expertise – built over decades – the A.I. part comes second.
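To make the stakes concrete, a back-of-envelope sketch. Only the 10%–17% missed-revenue range comes from the consulting company; the annual billing figure is a hypothetical number chosen purely for illustration:

```python
# Back-of-envelope: what 10%-17% missed revenue means in dollars.
# The $500M annual billing figure is hypothetical; the 10%-17% range
# is the consulting company's claim described above.
annual_billing = 500_000_000  # hypothetical mid-size hospital system

for missed_rate in (0.10, 0.17):
    missed = annual_billing * missed_rate
    print(f"{missed_rate:.0%} missed -> ${missed:,.0f} per year")
```

Even at the low end, catching that at claim time – instead of clawing back only one year’s worth after the fact – is the kind of money that builds a market player.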
There are scores of companies working with A.I. to deliver real world solutions – that make money – because they solve a problem.
Business problems are defined – thus have constraints – so nobody cares about Shakespeare sonnets when they are trying to pay a bill or cancel a subscription.
A couple of these niche A.I. companies work with Fractal – and you can bet hundreds more have never heard of us.
They are out there and they are the future of A.I.
Solving problems – via constraints to the solution – small language models – no wizards needed.
Remember the quote from our mysterious Substack reader last week – which is appropriate here:
“Everyone assumes the AI race is a linear function of data centers, energy, and capital expenditure.
But that assumption itself is industrial-age thinking.
If someone shows that:
- compute can be moved to data,
- latency can be collapsed instead of scaled,
- and small systems can outperform hyperscale ones,
then the entire compute economy gets inverted.
And here’s the deeper implication no one is saying:
A world where intelligence concentrates in trillion-dollar data centers is very different from a world where intelligence runs at the edge.
One creates AI empires.
The other creates AI insurgencies.
If even part of this claim is true, the geopolitical map of the AI era shifts overnight not because models get smarter, but because the substrate of power stops being centralized.
That’s the real Black Swan.”
As for the trajectory of the data center construction industry – which we can demonstrate is a bubble outweighing the Dutch tulip mania of the 1630s – let us refresh your thinking.
Data centers are now being opposed – successfully – in communities across America.
Here are a couple of recent examples.
Here is the press coverage of the Georgia data centers:
https://www.bbc.com/news/articles/cy8gy7lv448o
Heck, why list them individually? Here are $64 billion worth of stopped data centers in one report:
https://www.datacenterwatch.org/report
Only a year ago, the future was defined as data centers for A.I. – the irresistible force – A.I. everywhere – hitting the immovable object – not enough energy.
Farmland had to be ripped up or America would fall behind; lifestyle as we know it would end. People actually believe some of this stuff.
Those wedded to this dying narrative still believe it – the construction types, the tech press, the obsolete software companies who require gargantuan data centers because their software is mind-numbingly ponderous.
They believe it – not because it is true – but because they must believe it to survive and keep their faltering stock multiples.
Now that data centers have become the new ASBESTOS, SECOND HAND SMOKE, FLUORIDE IN THE WATER – combined – municipalities are stopping them. Done.
Citizens are winning as never before.
The contingency tort lawyers are not far behind – as reports multiply of the health problems to those living near data centers.
Here: https://news.ucr.edu/articles/2025/11/21/california-data-center-health-impacts-tripled-4-years
That’s why you are hearing about data centers in space – there are few citizen groups up there.
Into this maelstrom comes low I/O computing, which enables any current data center to do 10x to 1,000x the processing with no appreciable increase in energy.
No new data centers are needed – as we often point out – we aren’t the only guys proving this is possible – we are just the ones with the best Substack – so keep sending this to your pals. We have hundreds of new subscribers.
Read our piece on how the data centers being built today will become the abandoned JC Penney strip malls of high tech – stranded assets within half a decade.
Environmental organizations and their attorneys contact us constantly, asking us to demonstrate – on an Apple or Intel device the size of a shoe box – the equivalent of an entire Oracle cluster of the kind that resides in most data centers. The Apple Mini does the work of that Oracle cluster.
For the last two years the Fractal team has demonstrated low I/O computing to companies, generally in the electric utility industry.
We even built a website showing all the electric utility applications we currently run in production:
For due diligence visitors, we demonstrate an Oracle-based billing system – most utilities use it.
The Oracle system, in a fully dedicated data center, processes their roughly one million bills in about 4 hours a day, 23 days a month. That’s 92 clock hours – in a full data center.
Fractal built the parallel system, and it runs in less than 10 minutes on a computer the size of a cigarette carton. Much less than 10 minutes.
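A quick sanity check on the speedup those two numbers imply – reading the Oracle figure as 4 clock hours per daily run and the Fractal figure as at most 10 minutes for the same daily run. The measurements are from above; only the arithmetic is added here:

```python
# Implied per-run speedup: 4 clock hours on the Oracle system versus
# at most 10 minutes on the Fractal parallel system. "Much less than
# 10 minutes" means the result is a lower bound on the speedup.
oracle_minutes = 4 * 60   # one daily billing run, in minutes
fractal_minutes = 10      # upper bound on the Fractal run

speedup = oracle_minutes / fractal_minutes
print(f"Implied speedup: at least {speedup:.0f}x per run")  # at least 24x
```

And since energy roughly tracks machine-hours for the same work, that runtime cut on a cigarette-carton-sized machine is what lets the data center drop out of the equation.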
Here’s the complete set of metrics:

The Fractal Utility site, noted above, shows a couple of dozen other utility applications with the same performance. Every one of these applications is in full production today, and we have a luminary energy partner taking them to utilities across America.
When we called this low I/O computing – everyone thought it was pretty cool.
When we built application after application in parallel with Oracle or other DB systems, always showing those performance improvements, nobody thought much about it.
They appreciated the quantum speed on current hardware but never took the thought to its conclusion.
We have a video – 3 years old – on one of our sites noting there is no application that requires a data center.
Nobody ever commented on that video.
Nobody did the math about the implications.
Nobody asked, “Well, if you run 1,000 times faster, you only need one-thousandth the hardware, and one-thousandth the energy, for the same work – so why do we need a data center?”
Whoa! When we pointed that out, all hell broke loose.
It’s OK to have quantum speed on current hardware – it’s OK to demonstrate it to hundreds of people – but do not say America does not need more data centers – even if that is the provable, logical conclusion.
Now things have settled down, and some of those niche companies are challenging their markets by building A.I. applications that solve gnarly problems – without a data center. They run them on a few Apple Minis or other cheap hardware plugged into the wall.
Their costs are so low that they too are becoming disruptive.
If you can deliver virtually any application, without a data center, everything changes.
The result is that the last of the three barriers to A.I. democratization collapses – anyone can build an A.I. system around THEIR expertise, without the prohibitive costs of a data center.
That day is at hand.
We know it is because we have been asked to perform live demonstrations – using an Apple Mini the size of a shoe box – of an Oracle cluster equivalent that normally runs in a data center, running on that tiny box.
We can do this on any computer, we just like Apple.
So we now demo this to prospective users – particularly on the warfare side of the government – because everyone knows there is no data center on a battlefield.
Soon you will see us with those environmental organizations – showing up at conferences and council meetings with that Apple Mini, demonstrating the equivalent of an Oracle cluster – plugged into a wall outlet.
When a few Apples can run a huge application – no data center is needed.
We also get calls from the hedge funds. One took us through a full due diligence with their hired tech team.
Hedge funds are interested in the no-data-center story for very different reasons: put options on Oracle and Palantir.




