Good is the enemy of great, but ‘great’ is not universal

I have lots of great discussions with Simon Harris, James Ross and Perryn Fowler.  Mostly because we’re all passionate about doing the best possible job we can.  Also because we like to poke fun at our own little pet loves and fanboi’isms.

Our latest round of shelling each other has been about source code control systems.  Now, the actual tool is pretty irrelevant, but it serves as a vehicle for concrete discussion of a more abstract topic.

Essentially, for any particular problem, there is a solution which is “good enough”.  Now, for this particular case, I’ll posit that SVN is good enough.  A great solution for a very competent, distributed team may be Git.  A great solution for a less experienced team may be SVN.

What?  How can you say that?  Well, it’s pretty simple really. 

Good covers all the basic requirements of the solution.  For source code control, let’s say that’s ‘saving various versions of my code and letting my peers share it in a reliable way’ (you can tell I’m not putting too much effort into this, but deal with it).  SVN covers this, and you really can’t go below this level of functionality – because then you’re not covering all your bases.

Next, we look at great, but ‘great’ for Simon and Perryn is very different to ‘great’ for me, and for many of the teams I work with.  ‘Great’ is all about making that next step, and being as productive as possible.  Git (or whatever tool-du-jour) is going to be more productive for Simon, but it’s going to be an unmitigated disaster in the hands of inexperienced developers, or developers who really just don’t care that much.  So, moving ‘up’ to Git will be a productivity decline for development teams who don’t have the capacity to deal with it.

I blogged a few years ago about similar problems with languages here and here.


Vista and Windows 7 slow copying

For whatever reason, the disk access system on Vista and Windows 7 is completely fucking broken.  With certain combinations of driver (and/or hardware), if you copy across a network, or even off a USB-connected drive, the performance is completely shit.  This is 100% due to the upper-layer code, because using lower-level primitives the performance is spectacular.

So, why do I care?  We have a Netgear ReadyNAS with 40GB of photos that have been backed up over time (from XP), and Susi’s laptop died, so we got a new one (nice new Dell for $800) and found that trying to copy them back would have taken longer than the heat death of the universe.

I did all the Googling and applied all the “fixes” (which mostly related to networking), but none of it resolved the underlying issue.

I thought – fuck this – time to go down the Unix path.  So I looked at rsync, and found that Windows 7 and Vista have a command-line tool called ‘robocopy’.  It’s basically a poor man’s copying tool with archiving and/or mirroring capabilities.  It’s pretty cumbersome to use, but not a real stretch for me.  However, as part of the whole investigation I did come across a GUI for robocopy, called RichCopy.
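For anyone in the same boat, the command-line version looks roughly like the sketch below.  The paths here are made up for illustration (your NAS share and target folder will differ); the flags are standard robocopy options.

```shell
:: Copy the photos share from the NAS back down to the laptop.
:: Paths are illustrative -- substitute your own share and destination.
robocopy \\readynas\photos D:\Photos /E /Z /R:2 /W:5 /LOG:C:\temp\photocopy.log

:: /E    copy subdirectories, including empty ones
:: /Z    restartable mode, so a network hiccup doesn't restart the whole file
:: /R:2  retry a failed file twice, instead of the absurd default of a million
:: /W:5  wait 5 seconds between retries
:: /LOG  write progress to a log file instead of flooding the console
```

If you want a true mirror (deleting files at the destination that no longer exist at the source), swap /E for /MIR – but be careful, because it will happily delete things.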

It works great.  It copied the 40GB at 150Mb/sec rather than the 3kb/sec I was getting.  This might not work for everybody, but it certainly resolved the issues I was having.