That has nothing to do with anything. If you're just having a dick measuring contest, Google has contributed a ton to the open source community. Google also provides nearly 100% of funding for Mozilla.
Open source is not about smelly dicks; it's about transparency, reusability, philosophy, politics. It's a lot of things (and they matter, because right now, society and technology are changing fast).
For the average end user, it may not particularly matter that Mozilla's PDF implementation is open source and Google's isn't. However, for people like me who want to implement PDF-related functionality in their web/desktop application, it matters a great deal.
If you're in the US, you may not notice it, but Bing is plain awful for international users. Google has spent a considerable amount of resources improving their engine to serve results based on your location and Google's Search Engine is clearly the best by a long shot. This is also why I'm not using DuckDuckGo.
My guess would be that if the offer is similar, Mozilla would be inclined to stick with Google. I'm sure you know how irritating it is to realise you missed a checkbox and now your homepage and default search are set to Bing - it's probably a user experience Mozilla was happy to avoid.
Plus, there are probably some advantages to having Google 'on side'.
Perhaps they made an offer. We'll never know.
We don't even know the details of the Mozilla-Google deal. For all the openness that Google and Mozilla appear to proffer, the agreement between them is a secret.
Mozilla has been saying for years that they will use the best engine, and take that engine's money if it's available. If Bing were clearly better than Google, they'd switch and be taking Microsoft's money.
They've said that during a period when the best search engine also happened to offer them the most money. That's easy. But Google keeps paying more, which suggests Mozilla is simply after the highest bidder.
That is not how Mozilla's decision making works. There's considerably more to it than such simple economics.
As a non-profit, we are here to create a public good. It's actually goods, plural, and services too. Firefox and other Mozilla products and services exist to promote the Mozilla mission, not to enrich the Mozilla organization.
Yes. Sustainability is certainly a factor in some of our decision making. Mozilla would be foolish to ignore significant opportunities to increase its mission impact. But revenue for sustainability is not the sole or even the dominant factor in any major product decision at Mozilla.
I'd started using Foxit for the same reason. But the speed advantage is no longer an issue. Adobe has closed the gap, and Acrobat Reader now loads just as fast as Foxit. Besides, Acrobat's rendering is far superior to Foxit's (not to mention Foxit fails to load some PDF documents or displays garbage).
My problem with Adobe is that whenever I install any of their products I get the feeling I've just made my system more vulnerable to malware and the like. They've got an atrocious record when it comes to security bugs.
I've not found anything that is even close to on par with Acrobat for fillable PDF forms either... Other viewers can sort of handle them, but the results don't look right most of the time.
It's not just a matter of open/closed source. Bundling in yet another C binary is only going to widen the attack surface for millions of people, and there's really not a great reason to do so anymore. PDF.js proves that it's possible to write high-performance JavaScript to do this stuff, without introducing new security risks, and it's completely cross-platform. Computers are only going to get faster, and JS compilers are only going to get better too.
But Chrome's works better and has been around (and useful for millions) for years. Perhaps they'll move to a JS solution in the future, but if I had to choose between the two right now I'd take the faster more complete one.
JS-everywhere ideologues are always trying to buy a hamburger today and pay for it tomorrow. Someday machines will be faster. Someday the implementation will be on par with the alternatives.
How it works today is what matters to users. Tomorrow the native stack will just make use of more hardware features, it will still be faster, and we'll still be having the same conversation that we've been having with JS ideologues for 10 years now. "Someday ..."
This is what the XML ideologues used to say too. The problem wasn't XML, it was the libraries, or the lack of dedicated XML processing hardware. Then someone realized that if you got rid of the core complexity -- XML -- you could stop layering on piles of additional complexity while trying to work around the root of the problem.
Something being closed source matters to some people. You might not be one of them, but other people might have different perspectives that cause them to value an open source option above a closed source equivalent.
With a couple Google queries, you can find out that the Chrome PDF viewer has had multiple vulnerabilities that allow code to be executed in the sandboxed environment, and that there have also been Chrome vulnerabilities that allow sandboxed code to escape the sandbox.
Sandboxing is certainly a good thing, but it's not perfect. Sandboxed code still increases the attack surface. The rationale behind pdf.js is that, because it's almost entirely unprivileged JavaScript, it's unlikely that there's a way to exploit a browser in the presence of pdf.js that wouldn't work in the absence of pdf.js.
The browsers themselves have also been subject to such vulnerabilities. I dare say the attack surface of a full browser implementation is much larger than just the sandboxing, and sandboxing ideally sits behind all untrusted operations in the browser, providing a single common last line of defense.
Source? I don't believe this, since so many levels of confirmation are required. I think the biggest source is simple links like "KimKSexTape.jpg.exe".
Java has had multiple drive-by exploits discovered in just the past few months, which by definition don't require confirmation.
I was inaccurate, however. Java is almost certainly the cause of most exploits, but I daresay that user inexperience (to put it kindly) is probably the source of most infections. Touché.