Hacker News

Yeah, my inner pessimist says that's a stupid thing to say. On the other hand, when was the last time something this bad and widespread showed up?


Let's surmise that the reason two independent parties (Mehta and the Codenomicon team) found the same bug within a short timeframe is that the recent Apple and GnuTLS bugs prompted many teams to begin a fresh review of long-ignored shared codebases.

If so, is this the first major bug discovered, with many more to come as they are flushed out by the new level of vigilance? Or is it the only one, revealed now because the deep dives have wrapped up?

Those seem to me to be the interesting questions.


A few minutes' thought doesn't turn up a counterexample, so I concede the point.

I still wouldn't bet on it, though, as we haven't had a lifetime's experience with such widespread use of relatively homogeneous software. The failure of Enigma wasn't worldwide, but for those who used it, it was a security calamity.

If anything, the limited message I'll take away from this is that defense in depth is good. It's only the web's reliance upon OpenSSL that makes this particular bug so bad.


Well, the internet is only ~20 years old. I'm not strong enough in statistics to actually model this, but if something "this bad" happened once within 20 years, it might be reasonable to expect it to happen again within another 20. It's possible it's an every-hundred-years event, but that seems unlikely. Perhaps it will even happen more often in the future, as software complexity continues to grow?
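As a rough sketch of that back-of-the-envelope reasoning: if we assume (purely for illustration) that catastrophic, internet-wide bugs arrive as a Poisson process, then one observed event in 20 years gives a crude rate estimate of 1/20 per year, and the chance of a repeat follows directly:

```python
import math

# Assumption (illustrative only): such bugs arrive as a Poisson
# process. One observed event in 20 years gives a crude rate
# estimate of 1/20 per year.
rate_per_year = 1 / 20

def prob_at_least_one(years, rate=rate_per_year):
    """P(at least one event within `years`) under a Poisson model."""
    return 1 - math.exp(-rate * years)

print(f"Next 20 years:  {prob_at_least_one(20):.0%}")   # ~63%
print(f"Next 100 years: {prob_at_least_one(100):.0%}")  # ~99%
```

So even under this crude model, a repeat within 20 years is more likely than not, which matches the intuition above. (A single observation is far too little data to pin down the rate, of course.)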


The Ruby on Rails 'arbitrary code execution' bug? Debian removing entropy from OpenSSL's random number generator?




