Wonderful article and a good fit with HN’s motto of “move slowly and preserve things” as opposed to Silicon Valley’s jingoistic “move fast and break things”.
It highlights the often perplexing human tendency to reinvent rather than reuse. Why do we, as a species, ignore hard-won experience and restart instead, often making mistakes that could have been avoided if we’d taken the time, or had the curiosity and humility, to learn from others? This seems particularly prevalent in software, where “standing on the shoulders of giants” is the exception rather than the default.
That aside, the article was thoroughly educational and enjoyable. I came away with much-deepened insight and admiration for those involved in researching, designing and building the language. I’ve resolved to find and read the referenced “steelman” and language design rationale papers.
Yes, it would be nice to know with certainty who is behind these bills. It sucks how much opaque money influences American politics.
Josh Gottheimer's press release[1] on HR8250 mentions the "Meta Parents Network." I don't know what that is, but it does have "Meta" in the name.
Buffy Wicks's noise about AB1043 claimed it was passed with the support of tech companies. I have spoken directly to one person close to AB1043 who told me Facebook argued against it. I have doubts, but if true, I suspect they were not arguing in good faith and had ulterior motives.
In the end, no matter who is secretly lobbying for or against age verification bills all over the planet, the bills are terrible, and we should fight them.
a) The agent doesn't need to read the implementation of anything - you can stuff the entire project's headers into the context, and the LLM gets a better bird's-eye view of what is there and what is not, what goes where, etc.
and
b) Enforcing "Parse, don't Validate" using opaque types - an LLM writing a function that uses a user-defined composite datatype has no knowledge of the implementation, because it has read only the headers.
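To make (b) concrete, here is a minimal sketch in C. The `Email` type and function names are hypothetical, just for illustration; the point is that only the header half would ever be stuffed into the context, so the only way any generated code can obtain an `Email` is through the validating parser.

```c
#include <stdlib.h>
#include <string.h>

/* --- email.h: the only part the agent ever sees --- */
/* Opaque type: consumers cannot construct or inspect an Email directly,
   so the only way to obtain one is through the validating parser. */
typedef struct Email Email;
Email *email_parse(const char *s);      /* NULL if s is not a valid address */
const char *email_str(const Email *e);
void email_free(Email *e);

/* --- email.c: the implementation, never enters the context --- */
struct Email { char *text; };

Email *email_parse(const char *s) {
    /* Toy validity check, purely illustrative: require an '@'. */
    if (s == NULL || strchr(s, '@') == NULL)
        return NULL;
    Email *e = malloc(sizeof *e);
    if (e == NULL)
        return NULL;
    size_t n = strlen(s) + 1;
    e->text = malloc(n);
    if (e->text == NULL) {
        free(e);
        return NULL;
    }
    memcpy(e->text, s, n);
    return e;
}

const char *email_str(const Email *e) { return e->text; }

void email_free(Email *e) {
    if (e != NULL) {
        free(e->text);
        free(e);
    }
}
```

Any function written against the header can receive an `Email` only from `email_parse`, so "already validated" is encoded in the type system rather than in comments the model may or may not honor.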
I don't get this take. Once a modern corporation starts making money, all the people in it diligently work to expand their influence by starting new projects and hiring as many people as possible. That seems to be human nature. Why will AI tools change that? Nobody is feeling important because they manage 50 AI agents. They feel important because they manage 50 people.
What percentage of the jobs in a modern office are truly necessary? If automation had the ability to kill jobs over the long term, we'd all have been idle since the industrial revolution. But instead we keep inventing new things that we need.
There is no anonymity on the internet. The sum of your device's characteristics is close to unique anyway (I could be wrong, but I think this is accurate), which kind of supports the hypothesis that this is about shifting responsibility for age verification due to laws recently passed in other countries. I have no idea how this will work on Linux; it probably won't.
Ah, I guess by "that" you meant the touch part, not the uncopiable part.
There are many ways to implement this. I think some Chromebooks have FIDO gated on a physical button.
If you have an unlocked device with keys usable requiring a mere touch, I'm not sure fingerprint adds much value. A button would be enough.
Actually, checking with a fingerprint only addresses an extremely narrow attack: someone who wants to target you steals your device while it's unlocked (so they already have physical access, meaning not DPRK hackers), and they only get a window of opportunity until you've called your security department to lock your account.
… and yet this attacker would NOT be willing to use force against your person, to make you use your fingerprint.
Sure, if that's a threat model that's worth your time, use fingerprint too.
Keep in mind that going from software-only (and arguably this includes an OTP app on your phone) to security keys already means effectively going to zero. Google moved to security keys and says "We have had no reported or confirmed account takeovers since implementing security keys at Google" (https://krebsonsecurity.com/2018/07/google-security-keys-neu...).
So there are extreme diminishing returns after just security key with touch.
An app solution even gets a callout in that article as being not as good.
Hopefully this doesn't seem like advertising - I'm not affiliated with the project in any way. I just particularly enjoy playing with cyberdecks [1] and stumbled upon this while browsing. I continue to have a lot of fun with the uConsole and SDR, but I've long wanted a Cardputer with a bit more oomph. I should add, if anyone is interested in a uConsole, brace yourself for the shipping times... [2].
Again, it is not based on the number of tokens. If it were based solely on the number of tokens, then things like cache misses would not impact the usage so much. It's based on the actual cost, which includes things like caching costs.
The quilt patch series comes from the time when I was basing my work on the Debian version; it was easier for me to follow upstream that way than to rebase branches.
Most patches are now merged into master, only some unfinished work is still in that series. I should update the docs.
> we have failed to broadly adopt any new compiled programming languages for HPC
The article neglects that all of C, C++, and Fortran have evolved over the last 30 years.
Also, you'll find significant advances in the HPC library ecosystem over the trailing years. Consider, for example, Trilinos (https://trilinos.github.io/index.html) or Dakota (https://dakota.sandia.gov/about-dakota/) both of which push a ton of domain-agnostic capabilities into a C++ library instead of bolting them into a bespoke language. Communities of users tend to coalesce around shared libraries not creating new languages.
> That means a cautious C programmer who doesn't know who will be using their code never allows NULL to be passed to `free()`.
If your compiler chokes on `free(NULL)` you have bigger problems that no LLM (or human) can solve for you: you are using a compiler that was last maintained in the 80s!
If your C compiler doesn't adhere to the very first C standard published, the problem is not the quality of the code that is written.
> If they aren't avoiding passing NULL to `free()`, they haven't suffered long enough to be good.
I dunno; I've "suffered" since the mid-90s, and I will free NULL, because it is legal in the standard, and because I have not come across a compiler that does the wrong thing on `free(NULL)`.
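For anyone who wants to see the standard-blessed behavior in one place, here is a tiny sketch (the `make`/`destroy` names are mine, not from any particular codebase). `free(NULL)` has been a well-defined no-op since the very first ANSI C standard, so no NULL check is needed before calling `free()`:

```c
#include <stdlib.h>

/* free(NULL) is a no-op per C89 and every later standard,
   so this wrapper needs no NULL check before calling free(). */
void destroy(int **pp) {
    free(*pp);      /* well-defined even when *pp == NULL */
    *pp = NULL;     /* make repeat calls harmless too */
}

int *make(int v) {
    int *p = malloc(sizeof *p);
    if (p != NULL)
        *p = v;
    return p;
}
```

Calling `destroy` twice in a row is fine: the second call passes NULL straight to `free()`, which does nothing.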
Parent talks about new languages; as per the article, Fortran and C are doing fine. I'd speculate that one benefit of C++ over Rust is how it lets programmers give the compiler guarantees that go beyond the language's base semantics: see __restrict, __builtin_prefetch and __builtin_assume_aligned. The programming language is a space for conversation between compiler builders and hardware designers.
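A small sketch of what those guarantees look like in practice (the `scale` function is my own illustration; `__builtin_assume_aligned` is a GCC/Clang extension, not standard C):

```c
#include <stddef.h>

/* restrict promises the compiler that out and in never alias, and
   __builtin_assume_aligned (a GCC/Clang extension) promises 32-byte
   alignment; together they let the loop vectorize without runtime checks. */
void scale(float *restrict out, const float *restrict in, size_t n, float k) {
    float *o = __builtin_assume_aligned(out, 32);
    const float *i = __builtin_assume_aligned(in, 32);
    for (size_t j = 0; j < n; j++)
        o[j] = k * i[j];
}
```

Neither hint changes what the code computes; both narrow the set of programs the compiler must account for, which is exactly the kind of conversation with the hardware that the comment describes.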