
The thing I keep coming back to is that an LLM-backed query is so, so much more expensive than a typical web request. What kind of advertising is going to generate enough value to cover those costs, plus margin? Chatbots aren't YouTube; users aren't going to sit through 30-second ads, I don't think.

Neither was 99.99999% of the content they stole.

Any source for this claim?

All the scraped data on the internet?? Are you that naive lol

Scraped != stolen.

LOL. No, if you didn't have explicit permission to use it for training, you didn't have permission. This is called stealing.

Nonsense. Copying is not theft; this debate was settled 25+ years ago!

Have you seen the "you wouldn't steal a car" ads? Or this video? https://youtu.be/IeTybKL1pM4?si=utZ5KjmK-C2-fFdP



Obligatory: do you actually need Kubernetes? I struggle to imagine any tiny startup that does.

As a solo dev who just started his second cluster a few days ago... I like it.

Upfront costs are a little higher than I'd like: I'm paying $24 for a droplet + $12 for a load balancer, plus maybe $1 for a volume.

I could probably run my current workload on a $12 droplet, but apparently Cilium is a memory hog, which makes the smaller droplet infeasible, and it doesn't seem practical to skip the load balancer.

But now I can run several distinct apps using different frameworks and versions of PHP, Node, Bun, nginx, whatever, and spin them up and tear them down in minutes, and I kind of love that. And if I ever get any significant number of users I can press a button and scale up or horizontally.
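
For concreteness, a minimal sketch of what one of those apps looks like as Kubernetes config. The name, namespace, and image here are hypothetical placeholders, not the poster's actual setup:

    apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: blog                 # hypothetical app
      namespace: blog
    spec:
      replicas: 1                # bump this (or add an HPA) to scale horizontally
      selector:
        matchLabels:
          app: blog
      template:
        metadata:
          labels:
            app: blog
        spec:
          containers:
            - name: blog
              image: registry.example.com/blog:1.2.3   # placeholder image
              ports:
                - containerPort: 8080
    ---
    apiVersion: v1
    kind: Service
    metadata:
      name: blog
      namespace: blog
    spec:
      selector:
        app: blog
      ports:
        - port: 80
          targetPort: 8080

Each app gets its own namespace and manifests, which is what makes spinning one up or tearing it down independent of the others.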

I don't have to muck about with pm2 or supervisord or cronjobs, that's built in. I don't have to muck about with SSL certs/certbot, that's built in.
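
The cron part maps onto the built-in CronJob resource; a sketch with a made-up schedule, namespace, and image (the SSL part typically comes from cert-manager, though the poster doesn't name the tool they use):

    apiVersion: batch/v1
    kind: CronJob
    metadata:
      name: nightly-report       # hypothetical job
      namespace: blog
    spec:
      schedule: "0 3 * * *"      # run at 03:00 every day
      jobTemplate:
        spec:
          template:
            spec:
              restartPolicy: OnFailure
              containers:
                - name: report
                  image: registry.example.com/report:1.0.0   # placeholder image
                  args: ["node", "generate-report.js"]       # hypothetical command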

I have SSO across all my subdomains. That was a little annoying to get running (it took a day and a half to figure out), but it was a one-time thing, and the config is all committed in YAML, so if I ever forget how it works I have something to reference instead of trying to remember 100 shell commands I randomly ran on a naked VPS.
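
The poster doesn't say how the SSO is wired up; one common pattern, assuming ingress-nginx fronted by oauth2-proxy (both assumptions, and the hostnames are made up), is a pair of auth annotations on each subdomain's Ingress:

    apiVersion: networking.k8s.io/v1
    kind: Ingress
    metadata:
      name: notes                # hypothetical app
      annotations:
        # every request is checked against oauth2-proxy before it reaches the app
        nginx.ingress.kubernetes.io/auth-url: "https://auth.example.com/oauth2/auth"
        nginx.ingress.kubernetes.io/auth-signin: "https://auth.example.com/oauth2/start?rd=$scheme://$host$request_uri"
    spec:
      ingressClassName: nginx
      rules:
        - host: notes.example.com
          http:
            paths:
              - path: /
                pathType: Prefix
                backend:
                  service:
                    name: notes
                    port:
                      number: 80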

Upgrades are easy. I can update the distro or whatever package without much fuss.

The main downside is that deploys take a minute or two instead of being sub-second.

It took weeks of tinkering to get a good DX going, but I've happily settled on DevSpace. Again, it takes a couple of minutes to start up and probably oodles of RAM instead of milliseconds, but I can maintain 10 different projects without trying to keep my dev machine in sync with everything.

So some trade-offs but I've decided it's a net win after you're over the initial learning hump.


> I can run several distinct apps running different frameworks and versions

> I don't have to muck about with pm2 or supervisord or cronjobs, that's built in. I don't have to muck about with SSL certs/certbot

But doesn't literally any PaaS or provider with a "run a container" feature (AWS Fargate/ECS, etc.) fit the bill without the complexity, moving parts, and failure modes of K8s?

K8s makes sense when you need a control plane to orchestrate workloads on physical machines - its complexity and moving parts are somewhat justified there because that task is actually complex.

But to orchestrate VMs from a cloud provider, where the hypervisor and control plane already offer all of the above? Why take on the extra overhead of layering yet another orchestration layer on top?


Not the original poster, but I've tried all of that. It's far easier with Kubernetes: just Deployment, Service, Secret & Ingress config, and stuff just works cleanly in namespaces without any risk of things clobbering each other.
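
For reference, the Ingress (and TLS Secret) half of that stack might look roughly like this, assuming cert-manager with a ClusterIssuer named letsencrypt (an assumption; the commenter doesn't specify their setup):

    apiVersion: networking.k8s.io/v1
    kind: Ingress
    metadata:
      name: blog                 # hypothetical app
      namespace: blog
      annotations:
        cert-manager.io/cluster-issuer: letsencrypt   # assumes cert-manager is installed
    spec:
      ingressClassName: nginx
      tls:
        - hosts: [blog.example.com]
          secretName: blog-tls   # Secret created and renewed by cert-manager
      rules:
        - host: blog.example.com
          http:
            paths:
              - path: /
                pathType: Prefix
                backend:
                  service:
                    name: blog
                    port:
                      number: 80

The Deployment and Service it points at are like the earlier sketch; the TLS Secret never has to be written by hand.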

As the sibling comment already mentioned, k8s isn't much more complex once you're past the learning curve. I used to host with EC2 + scripts earlier. K8s actually solves a lot of problems that you'd have to solve yourself anyway.

Running Kubernetes in a managed environment like DO is no harder than using docker compose.

The "app" isn't even capitalized, which is my favorite part!

Yes. There are numerous fresh accounts being created to flood this very thread.

Can't help but wonder if that's a strategy that works until it doesn't.

I'd rather get nothing, because a thoughtless blob of text being pushed on me is insulting. Nothing, otoh, is just peace and quiet.


Is that machine also going to be segmented on a private VLAN?


Or this behavior is just programmed, the old-fashioned way.


This is one of the things that’s so frustrating about the AI hype. Yes, there are genuinely things these tools can do that couldn’t be done before, mostly around language processing, but so much of the automation work people are putting them up to just isn’t that impressive.


But it’s precisely the automation around LLMs that makes the end result itself impressive.

