Brixo Newsletter: LLM Efficiency & Thinking Machines
From Notion's shrinking profits to OpenAI's $115B burn rate—plus the story of how four researchers accidentally named artificial intelligence.
Brixo Latest
Right now, your engineering team is stuck in optimization quicksand. While your competitors ship new features, your developers are buried in performance tuning and cost optimization—critical work that pulls them away from innovation. This isn't sustainable.
That's where our Model Efficiency Assessment comes in. We evaluate your inference and LLM production stack to determine whether it's performing well and optimized for cost—so your team doesn't have to.
You'll receive a comprehensive report featuring an efficiency score that can be tracked over time, helping you understand if you're making the right improvements without diverting engineering resources.
When we identify opportunities for improvement, our suite of LLM optimization tools can automate the tedious work, freeing your team to focus on the critical roadmap items that drive your business forward.
If you’re interested, drop us a line 😎: contact@brixo.com
What Stood Out
Cutting-Edge AI Was Supposed to Get Cheaper. It’s More Expensive Than Ever.
“Ivan Zhao, chief executive officer of productivity software company Notion, says that two years ago, his business had margins of around 90%, typical of cloud-based software companies. Now, around 10 percentage points of that profit go to the AI companies that underpin Notion’s latest offerings.”
While it’s expected that AI costs will squeeze margins, it’s still jarring to see it spelled out in writing.
What we read, listened to, and watched
'Inference whales' are eating into AI coding startups' business model
Making the case that fixed seat-based pricing is not going to work for most AI companies. This gets most interesting at the top of the market: how much cost are the frontier model providers willing to eat in exchange for market share?
Econ 102 Podcast with Noah Smith and Erik Torenberg
An interesting conversation between two people who are deep into this topic. You can read more on Noah’s Substack.
OpenAI expects business to burn $115 billion through 2029
This news came out alongside the announcement that OpenAI will partner with Broadcom to develop its own chips.
Tokens Are Getting More Expensive, from Ethan Ding’s Substack
His point of view on token costs has become a trending narrative.
Final brick
Ever wondered how the term “artificial intelligence” came to be?
In 1956, John McCarthy gathered twenty scientists at Dartmouth College to form a new discipline around the question "Can machines think?" Initially calling it "automata studies," McCarthy found this term didn't attract much attention, so he rebranded it as "artificial intelligence"—a deliberate marketing choice that embedded grand promises within the technology's very name.
The word "intelligence" sounded inherently desirable and sophisticated, immediately garnering more interest from both funders and scientists eager to join a field with such ambitious goals.