"I do not feel obliged to believe that the same God who has endowed us with sense, reason, and intellect has intended us to forgo their use."
- Galileo Galilei

Rewriting from scratch? Yeah, it's a bad idea.
Friday, December 2, 2011

We all know the feeling. We have code that's been lying around for a long time, collected a bit of dust, gotten a bit obsolete perhaps, or gotten twisted up and hacked around. Code that we're just not very happy with. It's a mess. It's ugly. It's fundamentally broken. It's not future-proof. It's probably still working and doing its thing for the most part, but we're not happy with it. So we're bringing up that big "rewrite" idea.

I'm probably just rehashing common knowledge here, but rewriting software from scratch is almost always a bad idea. Yet it is so appealing, and smart people fall into the trap all the time. And we never learn. No amount of experience seems to take the sheen off the imagined perfect, cleaned-up code that would result from such an effort.

There are others that have written on this topic before, for instance here. And I could add another example to the list: the Orca driver back when I was at ATI. It was decided that the old OpenGL driver sucked. It was a mess. Nvidia had been producing higher quality drivers, but from my perspective we were catching up and were actually in pretty decent shape at that point. Of course, I wasn't a driver writer, so what did I know? When I heard about the complete rewrite of the OpenGL driver, I was quite skeptical but positive at the same time. But before long I started questioning the wisdom of going down this path.

Initially both drivers were maintained in parallel. Fewer bugs were fixed and very little happened in terms of features, but there was hope for the better once the new driver was in good shape. But things were taking longer than expected. Don't they always? Eventually the whole team worked on the new driver. The old driver came to a complete halt. No updates. Month after month. Nvidia was pumping out cool new extensions on a regular basis like always. The new driver wasn't ready to ship. Fun stuff, especially when the next hardware generation came around, like they do on a regular basis, and gosh, perhaps we need an OpenGL driver for it? The new driver didn't have nearly all the features of the old one, nor did it perform better.

Of course, once the driver was in good enough shape that it could run Doom3, it was all fun again. Soon enough they had optimized it so that it showed a decent performance gain over the old driver. So it was all worth the effort, yay! Except, of course, Doom3 was important and the Orca driver wasn't ready for prime time, so some of these optimizations were backported into the old driver, and wouldn't you know it, now the old driver was faster again!
Now, I wasn't a driver writer and I didn't have full insight into all the events surrounding this driver, so this story is mostly my casual observation from an ISV relations point of view. Someone on the team might have seen it differently. But still, I believe ATI lost at least a year, if not two, in their OpenGL support. By now that's ancient history, and I was happy that AMD shipped OpenGL 4.2 drivers on the release date. But I bet there are people on the team now complaining that the code is a mess.

Anyway, my intention wasn't really to tell that story, but instead to tell a story of my own. Some of you may remember my old announcement that I had started working on Framework4. It's depressing when I look it up and realize that was three years ago, and it's still not done! Gosh, I thought it was only two years ago, but it's three. So what prompted me to rewrite my framework? It's not like Framework3 wasn't working. I've been pumping out oodles of demos using it. It's just that it was a bit ugly. It was getting kind of obsolete. DX11 was around the corner and OpenGL was getting a serious overhaul and finally started deprecating things, and I wanted to make this big move into the new age. I had rewritten my framework before; this was my third one. I had great visions about new clean code, using standard coding conventions, with nicer interfaces. In the beginning it was fun, of course. I was coding up cool stuff: new SIMD-optimized vector classes, compile-time hashes, modern interfaces matching DX11 and OpenGL 3.0+, poking around a little with compute shaders.

Soon though I wanted to make a real demo using it, and pretty soon I realized the framework had a pretty long way to go before I could ship a demo with it. So I started fixing up the rendering stuff. I was pretty much just writing code that wasn't that much different from what I was doing in Framework3. "I'll optimize it later anyway, I just want it to work for now." And then came the boring parts. My GUI system. At first I wanted something new, but found that I had no particular vision of what a better one might look like. So I just copied the old one over from Framework3, changed the code to standard conventions, and fixed up all the rendering to Framework4 style. Boooooring, but eventually up and running. And I kept running into all those small things, things that I hadn't implemented yet, things that separate a running prototype from a releasable demo. So I got kind of bored with the whole thing, and ended up spending less and less time on the project.

Of course there are other factors at play here too. The last couple of years have been rather life changing for me. I went from single to having a girlfriend, to getting married, and now I'm the father of a two-month-old baby. I rarely find myself spending whole days coding anymore. Even if I'm free the whole day, it's not like I can sit down and code eight hours straight like I sometimes used to. I haven't exactly stopped coding though, but whenever I've had a demo idea recently, I've been implementing it in Framework3.

I'm still poking around in Framework4 at times though. Actually, just recently I did some work to bring the OpenGL part into the 4.2 era. But I haven't been able to let go of the feeling that somehow I took the wrong path after all. Perhaps I should've just upgraded Framework3 instead. Back when DX10 was new I opted to just write a D3D10 renderer instead of doing the big rewrite. I felt it was too much work. I suppose it was one of my wiser moves. I had to bend things a little and change a few interfaces to shoehorn it into the Framework3 paradigm, but I got it working. I did have to update a bunch of old demos, but that was actually not that much work. So I've been thinking a little lately: what is it actually that I wanted from a new framework? And the answer is pretty much this: DX11 rendering. So after contemplating the move for a while I finally gave it a shot yesterday. I implemented a DX11 renderer in Framework3. Basically I just copied the DX10 renderer, search-replaced "D3D10" with "D3D11", and then crunched through all the compile errors until I had a fully working DX11 renderer. This exercise took about an hour. An hour!!! I spent one hour and now I have a fully working DX11 renderer in Framework3. It's ahead of where I am with Framework4.
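For the curious, the kind of compile error you crunch through in a port like that looks roughly like this. This is just an illustrative sketch, not the actual Framework3 code, and DrawMesh is a made-up name. The bulk of a D3D10-to-D3D11 port is mechanical renaming of types and enums; the one real structural change is that D3D11 splits the device into ID3D11Device (resource creation) and ID3D11DeviceContext (state setting and draw calls).

#include <d3d11.h>

// D3D10 style (before):
//   device->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
//   device->Draw(vertexCount, 0);

// D3D11 style (after): the same calls, just routed through the immediate context.
void DrawMesh(ID3D11DeviceContext *context, ID3D11Buffer *vb, UINT stride, UINT vertexCount)
{
    UINT offset = 0;
    context->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    context->Draw(vertexCount, 0);
}

// Mapping a dynamic buffer also moved from the resource to the context:
//   D3D10:  vb->Map(D3D10_MAP_WRITE_DISCARD, 0, &data);
//   D3D11:  context->Map(vb, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);

Since almost every call has a one-to-one D3D11 counterpart, a search-replace plus an hour of fixing up calls like these really is about all it takes for a renderer of this kind.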

So now I'm not sure what I will do going forward. For sure I'll replace the DX10 renderer with the DX11 one and let all old DX10 demos run through the DX11 API, but with a D3D10 feature level. Perhaps I should just refactor Framework3 to fix all those things I've been unhappy about. Surely I'll have to update all old demos too, but chances are that that's not much work this time around either. Or maybe I'll just copy Framework3 to a new folder, call it Framework4, and start refactoring things from there?
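To make "DX11 API with a D3D10 feature level" concrete, here's a minimal sketch of device creation under that plan. Again, this is not the actual framework code and the function name is just for illustration: you ask D3D11CreateDevice to accept only D3D_FEATURE_LEVEL_10_0, so everything goes through the D3D11 interfaces while the old demos keep their DX10-class hardware requirements.

#include <d3d11.h>

HRESULT CreateDX10LevelDevice(ID3D11Device **device, ID3D11DeviceContext **context)
{
    const D3D_FEATURE_LEVEL requested[] = { D3D_FEATURE_LEVEL_10_0 };
    D3D_FEATURE_LEVEL obtained;

    return D3D11CreateDevice(
        NULL,                       // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        NULL,                       // no software rasterizer module
        0,                          // no creation flags
        requested, 1,               // only accept feature level 10.0
        D3D11_SDK_VERSION,
        device, &obtained, context);
}

The returned device and context are ordinary D3D11 interfaces either way, so the renderer code stays the same regardless of which feature level the demo asks for.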

Moral of the story: Rewriting from scratch is almost never the right way to go. It's easy to overestimate the benefits and underestimate the effort necessary to get there.
