
> serious stuff should be built and tested before pushing anyway

Yes and no. Build and test on a CI server that's set up to mirror your production environment after pushing, but before merging. That's what the whole industry of CI providers and the integrations built into and around GitHub and GitLab is for.



To be fair, one of the annoying things about how PRs work is that they don't test the merge result; they test the commit relative to its original base. Your tests may pass in the PR, but fail once the change is applied on top of later commits to the main line.
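A minimal sketch of the difference (branch names are hypothetical; a real CI job would fetch the PR branch and mainline from origin instead of building a throwaway repo): master moves on after the PR branches off, and the thing worth testing is the merge result, not the PR tip alone.

```shell
# Sketch: reproduce what a merge-testing CI job does, in a throwaway repo.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b master                     # -b requires git >= 2.28
git config user.email ci@example.com && git config user.name ci
echo base > file.txt && git add . && git commit -qm base

git checkout -qb feature                  # the "PR" branch
echo feature >> file.txt && git commit -qam feature

git checkout -q master                    # mainline moves on after the PR opened
echo mainline > other.txt && git add . && git commit -qm mainline

# What merge-testing CI does: merge the PR into the *current* master...
git checkout -qb ci-merge-check master
git merge -q --no-ff --no-edit feature
# ...and run the tests against the merged tree, not the PR tip alone.
grep -q feature file.txt && grep -q mainline other.txt && echo "merge result OK"
```

Testing only the `feature` tip would never see `other.txt`; the merge-check branch sees both changes together, which is what will actually land on master.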


Most CI solutions will take care of that; they'll pull in the PR, merge it locally with master, and run tests on that.


I can't speak to the "merge-and-run" behavior of every CI, but in my experience most of these failures arise from the time delay between opening the PR (when CI runs) and merging it. It would be quite resource-intensive to rebuild every PR against every new commit to master.


> It would be quite resource-intensive to rebuild every PR against every new commit to master.

Depends on just how resource intensive a build is, and how often commits are made to master.

Here at work, we don't do that, but it wouldn't be a complete clusterfuck if we were to.


In my experience, Travis CI tests the merge at the time the pull request is created. It also has a "re-run" button that lets you merge the latest master and run the tests on top of that.


And depending on setups, testing locally is testing far fewer environments than CI. I only have one laptop, but many of my projects need at least OS X, Linux, and Windows, and often multiple versions of those.
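Travis CI (mentioned above) can cover that spread from a single config via an OS matrix. A hedged sketch of a `.travis.yml` fragment; the `os` values are the ones Travis accepts, and `make test` is a placeholder for whatever the project's real test command is:

```yaml
# Sketch of a .travis.yml OS matrix: one laptop can't cover this,
# but CI runs the script once per listed OS.
os:
  - linux
  - osx
  - windows
script: make test   # placeholder for the project's actual test command
```

Each entry in `os` becomes its own build job, so a single push exercises all three platforms in parallel.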




