Hacker News | MaxBarraclough's comments

> A programming language is a formal specification language that we know how to compile

Plenty of real programming languages are underspecified in ways that surely disqualify them as formal specification languages. A trivial example in C: decrementing an unsigned int variable that holds 0. The subtraction is guaranteed to wrap around, but the value you get (UINT_MAX) depends on the platform, since the C standard doesn't fix the width of unsigned int.
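
To make that concrete, here's a minimal sketch; the value printed is whatever UINT_MAX happens to be on the platform in question:

    #include <stdio.h>

    int main(void)
    {
        unsigned int u = 0;
        u--;                /* well-defined: wraps around to UINT_MAX */
        printf("%u\n", u);  /* e.g. 4294967295 if unsigned int is 32 bits,
                               but the standard doesn't fix the width */
        return 0;
    }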

> There are plenty of formal specifications that cannot be compiled, even not by an AI. If you use AI, how do you make sure that the AI compiler compiles correctly?

By proving that the code satisfies the formal spec. Getting from a formal spec to a program (in an imperative programming language, say) can be broken down into several stages of 'refinement'. A snippet from [0]:

> properties that are proved at the abstract level are maintained through refinement, hence are guaranteed to be satisfied also by later refinements.

[0] https://www.southampton.ac.uk/~tsh2n14/publications/chapters...


A formal specification language doesn't have to be deterministic.

And yes, if you can prove that the implementation is correct with respect to the formal spec, you are good, and it doesn't really matter how you got the implementation.

Refinement is one approach; personally, I just do interactive theorem proving.


> A formal specification language is a programming language that we don't know how to compile.

Not really, on both counts.

Firstly, they're not really programming languages in the usual sense, in that they don't describe the sequence of instructions that the computer must follow. Functional programming languages are considered 'declarative', but they're still explicit about the computational work to be done. A formal spec doesn't do this; it just expresses the intended constraints on the correspondence between input and output (very roughly speaking).
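
As a toy illustration (my own made-up example, not any particular specification language), compare a spec-style predicate, which only says what counts as an acceptable output, with a program that spells out how to compute one:

    #include <stdbool.h>
    #include <stddef.h>

    /* Spec-style: a predicate on the output, saying nothing about how it
       is computed. (A real sorting spec would also require the output to
       be a permutation of the input; omitted to keep the toy example
       short.) */
    bool meets_sorted_spec(const int *out, size_t n)
    {
        for (size_t i = 1; i < n; i++)
            if (out[i - 1] > out[i])
                return false;
        return true;
    }

    /* Program-style: an explicit sequence of steps (insertion sort). */
    void sort(int *a, size_t n)
    {
        for (size_t i = 1; i < n; i++) {
            int key = a[i];
            size_t j = i;
            while (j > 0 && a[j - 1] > key) {
                a[j] = a[j - 1];
                j--;
            }
            a[j] = key;
        }
    }

The first only constrains the result; the second commits to a particular way of producing it.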

Secondly, regarding the 'we don't know how to compile it' aspect: 'constraint programming' and SMT solvers essentially do this, although they're not a practical way to build most software.


> Open source is not synonymous with the GPL and most businesses try to avoid open source software when implementing their core competency.

What do you mean here? Businesses often implement their own core code, but they don't deliberately favour closed-source software.


It's no guarantee, but it's a positive indicator of trustworthiness if a codebase is open source.

I don't have hard numbers on this, but in my experience it's pretty rare for an open source codebase to contain malware. Few malicious actors are bold enough to publish the source of their malware. The exceptions that spring to mind are source-based supply chain attacks, such as publishing malicious packages to PyPI, the index used by Python's pip package-manager.

You have a valid point that a binary might not correspond to the supposed source code, but I think this is quite uncommon.


> It's no guarantee, but it's a positive indicator of trustworthiness if a codebase is open source.

It's something we as techies like to believe due to solidarity or belief in the greater good, but I'm not sure it's actually justified? It would only work if there's a sizeable, technically-inclined userbase of the project so that someone is likely to have audited the code.

If you're malicious, you can still release malicious software with an open-source cover (ideally without the source including the malicious part - but even then, you can coast just fine until someone comes along and actually checks said source). If you're anonymous, there is little actual downside to detection: you can just try again under a different project.

Remember that the xz-utils backdoor was only discovered because they fucked up and caused a slowdown and not due to an unprompted audit.


> It would only work if there's a sizeable, technically-inclined userbase of the project so that someone is likely to have audited the code.

Not really. There's a long history of seemingly credible closed-source codebases turning out to have concealed malicious functionality, such as smart TVs spying on user activity, or the 'dieselgate' scandal, or the Sony rootkit. This kind of thing is extremely rare in Free and Open Source software. The creators don't want to run the risk of someone stumbling across the plain-as-day source code of malicious functionality. Open source software also generally makes it easy to remove malicious functionality, or even to create an ongoing fork project for this purpose. (The VSCodium project does this, roughly speaking. [0])

Firefox's telemetry is one of the more high-profile examples of unwanted behaviour in Free and Open Source software, and that probably doesn't even really count as malware.

> If you're malicious, you can still release malicious software with an open-source cover (ideally without the source including the malicious part - but even then, you can coast just fine until someone comes along and actually checks said source).

I already acknowledged this is possible; you don't need to spell it out. Again, I don't have hard numbers, but it seems to me that in practice this is quite rare compared to malicious closed-source software of the 'ordinary' kind.

A good example of malicious distribution under an open-source banner was SourceForge injecting adware into the installers of projects it hosted. [1]

> Remember that the xz-utils backdoor was only discovered because they fucked up and caused a slowdown and not due to an unprompted audit.

Right, that was a supply chain attack. They seem to be increasingly common, unfortunately.

[0] https://vscodium.com/

[1] https://en.wikipedia.org/wiki/SourceForge#Installer_with_adw...


> In future everyone will expect to be able to customise an application, if the source is not available they will not chose your application as a base. It's that simple.

This seems unlikely. It's not the norm today for closed-source software. Why would it be different tomorrow?


Because we now have LLMs that can read the code for us.

I'm feeling this already.

Just the other day I was messing around with Fly's new Sprites.dev system and I found myself confused as to how one of the "sprite" CLI features worked.

So I went to clone the git repo and have Claude Code figure out the answer... and was surprised to find that the "sprite" CLI tool itself (unlike Fly's flyctl tool, which I answer questions about this way pretty often) wasn't open source!

That was a genuine blocker for me because it prevented me from answering my question.

It reminded me that the most frustrating thing about using macOS these days is that so much of it is closed source.

I'd love to have Claude write me proper documentation for the sandbox-exec command for example, but that thing is pretty much a black hole.


I'm not convinced that lowering the barrier to making software changes will result in this kind of change of norms. The reasons for closed-source commercial software not supporting customisation largely remain the same. Here are the ones that spring to mind:

• Increased upfront software complexity

• Increased maintenance burden (to avoid breaking officially supported plugins/customisations)

• Increased support burden

• Possible security/regulatory/liability issues

• The company may want to deliberately block functionality that users want (e.g. data migration, integration with competing services, or removing ads and content recommendations)

> That was a genuine blocker for me because it prevented me from answering my question.

It's always been this way. From the user's point of view there has always been value in having access to the source, especially under the terms of a proper Free and Open Source licence.


For good measure, here's a link to Dijkstra's 'The undeserved status of the pigeon-hole principle' (EWD1094).

https://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD109...


For even better measure here's a slice of HN reactions to EWD1094:

https://news.ycombinator.com/item?id=46085897


That's not the approach they're referring to; iOS doesn't support that. They're referring to delivering the compiled native code as part of the app package.

> faster WASM and WebGPU

Regarding WASM at least, it seems to depend. https://arewefastyet.com/


The last link is broken. GitHub repo: https://github.com/howerj/muxleq


Thanks.


From the article:

> At this time an open source HDMI 2.1 implementation is not possible without running afoul of the HDMI Forum requirements.

I wonder on what basis. Perhaps an obligation to ensure the software resists reverse-engineering?


> Perhaps an obligation to ensure the software resists reverse-engineering?

I assume that Blu-Ray is similar. As I understand it, there are no fully open source implementations of a video decoder for Blu-Ray discs. (Is that still true in 2025?)


> As I understand, there are no fully open source implementations of a video decoder for Blu-Ray discs. (Is that still true in 2025?)

As far as I am aware, VLC Media Player is capable of playing Blu-Ray discs:

https://www.reddit.com/r/linux4noobs/comments/1ke5ysq/how_to...

but you have to install some additional files:

https://wiki.videolan.org/VSG:Usage:Blu-ray/

https://www.reddit.com/r/linux4noobs/comments/1ke5ysq/commen...

If this does not satisfy your claim that "there are no fully open source implementations of a video decoder for Blu-Ray discs", tell me where I am wrong.


Hat tip. I was unaware. When I looked deeper, I found that it requires you to supply the encryption keys for each disc. I highly doubt this method is "approved" by the Blu-Ray consortium, and I'm not even sure of its legality in highly advanced economies.

