We now mirror GitHub PRs into Gerrit. Gerrit is still our upstream Git server. Gerrit is still where we do reviews and press the "Merge" button. (And once merged, it gets mirrored back to GitHub)
But now we have a bot that slurps PRs into Gerrit and syncs comments back and forth and closes PRs when needed. (We'll do more fine-grained comment sync in the future. We just shipped an "MVP" for the Go 1.11 tree opening).
This is smart: decreasing the hurdles to contribution will inevitably result in more contributions. Economics 101. And I don't think it necessarily follows that they will be lower quality either; maybe for first-time contributors, but we all start at the same place and have to learn. More first-time contributors will also lead to more repeat contributors.
Any time I come across a project that uses Gerrit, I immediately abandon any patches I had planned to submit. Gerrit is terrible to actually work with and is unfriendly even to experienced git users. I have to battle with Gerrit to submit patches to MediaWiki, and it is a painful process every time.
To clarify: I believe my personal off-work time is valuable to me. The more time and friction I have to spend getting a pull request through code review, the less time I get to myself.
The problem I see with GitHub is that it encourages contributors to commit repeated fixes to a pull request rather than amending/rebasing the patch set. The other problem is that GitHub doesn't provide a way to get interdiffs between patch set revisions. That is, if I open a pull request, get some comments, address those comments, and rebase my changes in, then there's no way for the reviewer(s) to see what I actually changed.
This obviously increases contributions, which is great. Does this now require more time from people with Run Trybot access to manually kick off a CI run? I wondered why Go always used that instead of automatically kicking off CI for each patchset. To save resources perhaps?
At its heart, I am not certain the increased contributions are always good. There is a lot of theory written about how to build and grow out languages, and most of it seems to indicate that there needs to be a strong, central group of people with veto power. So long as the Go internal team retains this power, it's good, but having a community driving development for itself outside of the core language team often leads to (a) too many cooks in the kitchen and (b) adding features vocal fans want that nobody else cares about.
Given that one of the reasons GitHub is so popular is that it's a common toolchain, I wonder if it is indeed obvious that replacing the code-review tool with a different one (not to mention one that is often considered much less intuitive, if not outright unusable) will result in more contributions. I hope the Go team posts some results in a few months.
This is an excellent change. While Gerrit is undoubtedly more powerful than GitHub's PRs, GitHub is easy and familiar. I've found a fair number of small fixes and changes I've wanted to make to Gerrit projects but just abandoned because the process wasn't in line with my current workflow. Now I'll submit a PR and figure out the rest later. Hopefully code review still being through Gerrit doesn't mean more stale PRs to deal with, though. I guess time will tell...
It seems odd to me that Golang treats GitHub as a first-class citizen for package management, yet a second-class citizen for core contributions.
I'm really only a dabbler in Golang, so would appreciate any context.
For instance, I was affected by the recent go-bindata owner change[1]. The GitHub user deleted their account and some random user re-registered the original user's name and recreated the repo (albeit seemingly innocently, to help everyone get their CIs running again).
Actually, while we're on the subject, Golang's whole package management experience is surprisingly disappointing :/
[1] https://www.reddit.com/r/golang/comments/7vv9zz/popular_lib_...
Go does not treat Github any differently than any other VCS provider when it comes to package management.
OG gophers vendor their deps into their repo and avoid creating dependencies on 3rd-party code whenever possible. The Go stdlib is very full-featured, so that is not unrealistic.
The go-bindata problem could have been avoided by always vendoring/forking your deps and never trusting any VCS provider (e.g. GitHub, Bitbucket, SourceForge) to handle your critical dependency needs.
Take a look at the dep[1] tool. It allows you to ship your project with all of its dependencies included. This means that only your source is needed to build your application/package.
The Gopkg.toml/Gopkg.lock files in dep are quite similar to Rust's Cargo.toml/Cargo.lock files. I think it's a good move as Rust has probably the best package management story out there.
Also, as the other comment mentioned, github.com is not "special" in any way. Any website with a git repo will work just as well. In fact, some key libraries are served from golang.org/x/<whatever>[2], not github.com.
[1] https://github.com/golang/dep
[2] https://github.com/golang/go/wiki/SubRepositories
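To make the golang.org/x point concrete: a package from one of those subrepositories is imported by that path, and running go get on it (golang.org/x/sync, to pick one arbitrarily) resolves the import through a meta tag served at golang.org rather than through a github.com URL. A small illustrative example:

    // Example dependency on a golang.org/x subrepository: the import path
    // is golang.org/x/sync/errgroup, resolved via golang.org, not github.com.
    package main

    import (
        "fmt"

        "golang.org/x/sync/errgroup"
    )

    func main() {
        var g errgroup.Group
        for _, task := range []string{"vet", "test", "build"} {
            task := task // capture the loop variable for the closure
            g.Go(func() error {
                fmt.Println("running", task)
                return nil
            })
        }
        if err := g.Wait(); err != nil {
            fmt.Println("failed:", err)
        }
    }

Where the code is actually hosted sits behind that import path, which is the point: the hosting backend can move without breaking importers.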
Slightly off topic, but would anyone have a similar solution/experience to sync between GitHub and Gitlab? (community version, not EE, since this is for a Free Software project)
Some more information is at https://golang.org/wiki/GerritBot
Code is at https://github.com/golang/build/blob/master/cmd/gerritbot/ge...
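In outline, the PR-mirroring half of that looks something like the sketch below. This is only a rough illustration with a hypothetical helper (listOpenPRs) and assumed git remotes ("github", "gerrit"), and it glosses over things the real bot has to handle, such as adding Change-Id footers and syncing comments:

    // Simplified sketch of a PR-to-Gerrit mirroring loop. listOpenPRs is a
    // hypothetical stand-in for a GitHub API query; the "github" and "gerrit"
    // remotes are assumed to exist in the local clone.
    package main

    import (
        "fmt"
        "log"
        "os/exec"
        "time"
    )

    type pullRequest struct {
        Number int
        Branch string // target branch, e.g. "master"
    }

    // listOpenPRs would query the GitHub API for open pull requests.
    func listOpenPRs() []pullRequest { return nil }

    // mirror fetches the PR head from GitHub and pushes it to Gerrit's
    // refs/for/<branch> ref, which creates or updates a change for review.
    func mirror(repoDir string, pr pullRequest) error {
        cmds := [][]string{
            {"fetch", "github", fmt.Sprintf("refs/pull/%d/head", pr.Number)},
            {"push", "gerrit", "FETCH_HEAD:refs/for/" + pr.Branch},
        }
        for _, args := range cmds {
            cmd := exec.Command("git", args...)
            cmd.Dir = repoDir
            if out, err := cmd.CombinedOutput(); err != nil {
                return fmt.Errorf("git %v: %v\n%s", args, err, out)
            }
        }
        return nil
    }

    func main() {
        for {
            for _, pr := range listOpenPRs() {
                if err := mirror("/srv/go.git", pr); err != nil {
                    log.Printf("mirroring PR %d: %v", pr.Number, err)
                }
            }
            time.Sleep(time.Minute)
        }
    }

Comment sync and closing PRs would then sit on top of a loop like this, via the two services' APIs.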
I have first-hand information from two projects (one of them large) that contributions, apart from drive-by doc fixes, do not increase.
Meaningful contribution decreases: existing devs leave or become inactive because of GitHub.
A lot of projects moving to GitHub are in the Gervais-principle stage.
For security, mostly. Our isolation is good but not perfect.
We could probably do a smaller set of (secure) builders for all patches, but it hasn't been a priority.
Anyone who vendored, or at worst kept local copies of their dependencies littered somewhere on their machine, would have been fine.