Let the Merge Games begin!
We currently have four different teams working on large modifications to RavenDB.
Large modifications means that each team is working on a feature for a relatively long time, and very frequently needs to touch large swaths of the code base. Oh, and the common theme for all of them is that they are big enough that you cannot just merge back into the main branch. There are often a lot of failing tests, or even an uncompilable state, during a long refactoring session.
The good thing is that we are pretty good about making sure that we merge from the main branch on a regular basis. The bad thing is that once we start merging those big changes, the other large refactorings are going to have to deal with a lot of changes happening very quickly.
Hence, the merge games. The faster a team hits the “can we put this in the main branch” point, the less work it is going to be for them. On the other hand, the slower you are in getting to that point, the more conflicts you are likely to run into and have to resolve.
I don’t think it would be a best-selling book series, and I doubt that I’ll get a movie deal from it, but for a certain select group of people, I think that this will be an amazingly fun game (as long as you aren’t the one left holding the shitty end of the merge conflicts).
Comments
The best way to deal with this kind of situation is to always put refactoring work into separate commits, or even have a shared branch for refactoring purposes only. It's kind of mind-twisting to refactor on a branch, then pull it in and add your functional changes on top, but it can work a lot better than giant merges mixed up with refactorings.
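The workflow in that comment can be sketched with plain git. This is a minimal, hypothetical demo (branch and file names are made up, not from the post): mechanical refactorings land on their own branch, feature work branches off it, and once the refactoring is merged to main the feature is rebased so only the functional change remains to replay.

```shell
set -e
# Hypothetical demo: refactorings on their own branch, feature rebased on top.
cd "$(mktemp -d)"
git init -q
git config user.email demo@example.com
git config user.name demo

echo 'original code' > app.txt
git add app.txt && git commit -qm 'initial'
git branch -M main

git checkout -qb refactoring            # refactoring-only commits live here
echo 'renamed code' > app.txt
git commit -qam 'mechanical rename'

git checkout -qb my-feature             # feature work starts from the refactored code
echo 'new behavior' >> app.txt
git commit -qam 'functional change'

git checkout -q main                    # meanwhile, unrelated work lands in main
echo 'unrelated' > other.txt
git add other.txt && git commit -qm 'unrelated main work'
git merge -q --no-edit refactoring      # the refactoring lands in main first

git checkout -q my-feature
git rebase -q main                      # replay only the functional change on top
git log --oneline -1                    # the feature commit now sits on main's tip
```

The point of the split is that the noisy rename commits merge into main once, cleanly, and never show up as conflicts when the feature branch is replayed.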
Pop, In one of those branches we have, as of this morning, 350 changed files, 17K lines removed, 9K lines added, and over 150 commits.
I thought ravendb 4 dev was winding down? Seems like some big changes are still coming?
Have you looked at using http://www.semanticmerge.com/ to help with merging strategy? I've used both PlasticSCM and Semantic Merge and it helps a lot with long-term merging. First, the fact that the merging engine is aware of C# syntax and semantics means that you can more easily track and merge large scale changes. Second, (this is more Plastic than SemanticMerge), when merging from main to dev repeatedly, it can keep track of merges that have already taken place and use the already resolved conflicts to keep you from having to repeat merge-work you have already done.
Eric, Yes, we have one major feature remaining, but there is a lot of bookkeeping work that we started to do, and some of it is quite large, unfortunately.
Stuart, Yes, some of the guys are using it, but the issue isn't actually how to merge; the issue is how to work with / review four major parallel branches. The answer, by the way, is a lot of rebases :-)
Totally understand - I've been in the same boat. Got into Plastic before I got into Git, so merges are my preferred tool, with a strong merging engine. That said, I can definitely appreciate rebasing as a good technique as well. I love that final push when they start to come together and you can close the branches with a satisfied feeling that "big things" have been accomplished! :)
https://trunkbaseddevelopment.com/ is the way to go. Avoid the horrible merges.
Something I like to do in this situation is to have each person "own" their merge, and help others resolve it into their branch by pairing with them just for the merge. Yes, the person who commits first has it easy, but with this practice they are encouraged to be compassionate to their teammates(!). So if A commits first, and B and C are still going, A is responsible for making sure B and C have successfully merged in her changes before she is "done" with the ticket. If B is next, he has to help C. As he is last, C has to do the most merges, but doesn't need to help anyone merge his changes into their feature branch.