> “When we work on making our devices accessible by the blind, I don’t consider the bloody ROI.”
This made me sad. I moved out to Silicon Valley a few months after Jobs passed. I remember feeling so hopeful and inspired that technology could make the world a better place, and I saw the same in other founders. Today I look around and feel ashamed of the tech industry. The founders don’t talk about changing the world anymore, they just have dollar signs in their eyes. It’s been a long time since I saw any technology that felt inspiring the same way it used to feel.
Since there’s no such thing as 100% proof in science, who gets to “call it” when signs of life have been discovered outside of Earth, and when? Presumably NASA, but how would they decide?
in summary, we were all imagining a Star Trek "take me to your leader" moment of First Contact, and instead he imagines it will be a slow transition of increasing evidence that convinces more and more people over years and years.
Officially it'll be "called" when a politician feels the proof is solid enough for them to leech off of it.
Scientifically, it'll be called when there's too much evidence to genuinely dispute it. Probably when a collection of specimens is found, either dead or alive, that can be shown to be unrelated or distantly related to Earth life.
I think we’ll look back on this period as The Great Enshittification where everyone ran out of ideas but capitalism demands growth so everything just got worse. The mass manufacturing of mediocre AI content might be the force that ends the digital era and maybe we’ll all just go outside again.
Here's hoping. It's not that there are no new ideas, it's that they don't deliver VC-sized returns. The only thing that delivers the returns investors are looking for is Ponzi schemes.
Nothing can beat Ponzi schemes because it's all promises and the returns only trickle up to the top, they are not evenly distributed.
There are ideas that can deliver VC-sized returns but they depend on a system state different from ours - scaled up energy system, housing market, education system, manufacturing capacity, research capacity, etc etc. Until we unscrew the current status quo, we(USA) will continue limping through these innovations without the capacity to fully utilize them.
> From the limited perspective of software development, today’s models are well-worth their per-token cost.
At the current price or the real price? Anthropic said a $200 subscription can cost them $5000, so the real price could be anywhere from 10-30x the current one.
No, that is probably one of the worst cases they saw. Most likely the typical subscription's inference cost is much lower than you expect. If you look at prices for similar open models, they are much lower than what you pay when buying from Anthropic, so that is closer to the real cost basis I'd expect.
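A rough back-of-envelope version of that comparison, with entirely illustrative numbers (neither rate below is Anthropic's real price or cost basis; the token volume is a guess at a heavy subscriber):

```python
# Back-of-envelope: what a heavy month of usage "costs" at a frontier
# API list price vs. an assumed open-model provider rate.
# All numbers are illustrative assumptions, not real figures.

API_PRICE_PER_M = 15.00   # $/1M output tokens, assumed frontier API list price
OPEN_PRICE_PER_M = 1.00   # $/1M output tokens, assumed open-model provider rate

def monthly_cost(tokens_m: float, price_per_m: float) -> float:
    """Dollar cost for `tokens_m` million tokens at a given per-million rate."""
    return tokens_m * price_per_m

heavy_month_tokens_m = 300  # assumed heavy subscriber: 300M tokens/month

api_equiv = monthly_cost(heavy_month_tokens_m, API_PRICE_PER_M)
open_equiv = monthly_cost(heavy_month_tokens_m, OPEN_PRICE_PER_M)

print(f"API-equivalent cost: ${api_equiv:,.0f}")   # $4,500
print(f"Open-model cost:     ${open_equiv:,.0f}")  # $300
```

The point of the sketch: a scary "API-equivalent cost" quoted at list price can be an order of magnitude above what the same tokens would cost at open-model provider rates, which is why list price is a poor proxy for the provider's actual inference cost.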
It's likely Amazon is making a fucking killing though.
While $5000 is a lot, people who rack up close to or just over a thousand dollars in "API-equivalent cost" are pretty common.
> Most likely the subscription inference cost is much lower than you expect.
This is probably not true because they'd be screaming it off every rooftop were that the case.
Same deal with the API inference. Even the "profitable on inference" claim is sourced back to hearsay of informal statements made by OpenAI/Anthropic staff. No formal announcements, nothing remotely of the "You can trust what I'm saying, because if I'm lying the SEC will have my head" sort.
Yet making such statements would be invaluable. If Anthropic can demonstrate profitability before OpenAI, they could poach most of the funding. There's no reason to keep it a company secret.
And API inference is only part of the total cost, to say nothing of training and ongoing fine-tuning. If they're not even profitable on inference, how could they hope to be profitable overall?
I'm going to be a dickhead for a moment here, apologies, there's no way to say this that isn't rude to you. This is still the same hearsay "In an interview, somewhere."
> Let’s say half of your compute is for training and half of your compute is for inference. The inference has some gross margin that’s more than 50%.
But the context, the very previous sentence is:
> Think about it this way. Again, these are stylized facts. These numbers are not exact. I’m just trying to make a toy model here.
Here, Amodei is in effect using weasel words. He is not making any actionable claims about Anthropic's margins, merely plucking an arbitrary number. Why 50%? Is 50% reasonable? Is 50% accurate to the company? Those are all conclusions the listener draws, not claims Amodei makes.
> I don't know about SEC rules
The main premise is that, as a CEO, there are regulations you are beholden to. You're not allowed to announce you've made a trillion dollars in profit, sell all your stock, and then go "teehee, just kidding". The SEC will go after you for securities fraud if you do that stuff.
This makes weasel words like the earlier ones suspicious, because the exact statement Amodei gives is not prosecutable. He's not saying anything about the company, just building a little "toy model".
How intentional it is that this hearsay travels and gets extrapolated from "well, he picked 50% because it's a reasonable figure, and since he's the CEO, a reasonable figure would have to be akin to what his company can achieve" into "Anthropic has a 50% margin" is up for debate. Maybe it is intentional, maybe Amodei is exactly the same kind of shitweasel as Altman. Or maybe he's just a dumbass who runs his mouth in interviews and, for whatever reason, won't issue the true number in an authoritative statement to dismiss the misconception.
Hence my original comment: if the real number were better than the hearsay, Amodei would immediately issue a correction; it'd be great for the company. Hell, even if 50% were roughly the margin, that'd be great! Promoting it from mere hearsay to "we're profitable, go invest all your money" would be huge. Really, any positive margin at all would put him ahead of OpenAI.
But he doesn't issue a correction. He doesn't affirm the statement. Perhaps he has other reasons for that, but a rather big reason could be that the margin number is in fact pretty bad.
Now, the observant reader will note I am also using a weasel word there. I do not know whether the number is good or bad; your takeaway should be "it could be bad", not "it is bad". Go pressure Amodei into giving us the real number.
Self-reply, as I could've explained the SEC thing better:
Anti-fraud regulators like the SEC give an inherent trustworthiness and credibility to CEOs and other market participants. You can trust that they're not lying to you, because they would be sent to jail if they were.
Another example is general anti-fraud regulation; consider how one would trust North American or European steel suppliers more than Chinese steel suppliers.
It's not that Chinese suppliers are "evil lying people" and American ones are "saints who never lie"; it's that you can trust American, Canadian, and European courts to hold liars accountable under those regulations even if you're not in any of those regions, while Chinese liars won't be held accountable the same way.
Thus also the opposite: if someone opts out of the credibility granted by anti-fraud regulations, their words may not be quite so truthful.
SEC rules mean a CEO cannot lie about or deliberately hide the cost of something.
The 50%+ margin statements have basically been "we are making 50% on delivering it." This does not include ANY of the costs of getting to this point: training, scraping, datacenters, people, and so forth.
They are basically saying "oh yeah, the cost of gas for the car is only X, so charging Y per mile is great margin" while ignoring maintenance, the cost of acquiring the car, and so forth.
That's a tad naive. CEOs can lie, have lied, and often do lie about everything:
Sam Bankman-Fried, Elizabeth Holmes, Kenneth Lay - and hundreds if not thousands more.
The SEC is a civil regulatory agency and cannot bring criminal charges. The above-named for the most part had to be prosecuted by the Department of Justice or by state attorneys general.
But comparing your margin on charging per mile to the price of gas makes a lot of sense? That is the only variable cost in the equation; training, scraping, and people are all pretty much fixed costs.
> While $5000 is a lot, the people who rack up close or just over a thousand "API equivalent cost" are pretty common.
I think if you're not Anthropic and you don't have access to the actual data, then you can't say for sure. A bunch of anecdotes from terminally-online AI people on Twitter doesn't make a convincing case, IMO.
On the other hand, if similarly sized models cost much, much less than this, why in the world would Anthropic have much higher costs?
Also, counterpoint: maybe they want you to think they have higher costs so you're more willing to actually pay for it?
It's less than the other tech CEOs who seem to evade criticism on HN. Elon literally worked for Trump, accomplished nothing, and ended up just leaking everyone's social security data. Thiel and Palantir are profiting from war and building out the surveillance state. Bezos made a $75M documentary about Melania. Larry Ellison took over TikTok US to squelch any criticism of US and Zionist war atrocities.
We need to kill SaaS. Apps should be local-first and have peer-to-peer data sync. These companies won't stop until they use your data to replace you and enrich their owners.
What’s the scaling bottleneck? If you made a local-first, P2P version of Figma what would break first? For a company of like 50 people, I doubt you’d have more than 100GB of data so it should fit on everyone’s computers. The P2P syncing part seems solvable, even if you need a centralized handshake server somewhere. And from the user perspective I don’t see why the UX couldn’t be identical, so it’s all the same to them.
It seems like the real bottleneck is something else.
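For the P2P syncing part, the textbook starting point is a last-writer-wins merge, where each peer can merge any other peer's state and all replicas converge. This is a minimal sketch of that idea (real local-first systems use proper CRDTs with causal ordering; the names here are invented for illustration):

```python
# Minimal last-writer-wins (LWW) replica merge, the simplest building
# block behind local-first P2P sync. Each key keeps its latest write;
# a peer id breaks timestamp ties so every replica converges to the
# same state no matter the order in which peers sync.

from dataclasses import dataclass

@dataclass(frozen=True)
class Entry:
    value: str
    timestamp: float  # logical or wall-clock time of the write
    peer_id: str      # deterministic tie-breaker

def merge(local: dict[str, Entry], remote: dict[str, Entry]) -> dict[str, Entry]:
    """Merge two replicas key-by-key; the newest write wins."""
    merged = dict(local)
    for key, entry in remote.items():
        current = merged.get(key)
        if current is None or (entry.timestamp, entry.peer_id) > (current.timestamp, current.peer_id):
            merged[key] = entry
    return merged

a = {"title": Entry("Design v1", 10.0, "alice")}
b = {"title": Entry("Design v2", 12.0, "bob"),
     "notes": Entry("todo", 11.0, "bob")}

doc = merge(a, b)
print(doc["title"].value)  # Design v2 (newer write wins)
```

Because `merge(a, b) == merge(b, a)`, peers can exchange state in any order and still agree, which is what makes the centralized server optional (needed only for discovery/handshake, as the parent suggests). The hard parts in practice are conflict semantics for rich data like design documents, not the raw data volume.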