"Wimps take backups;
real men upload to ftp and let people mirror it."
- Linus Torvalds
Rewriting from scratch? Yeah, it's a bad idea.
Friday, December 2, 2011

We all know the feeling. We have code that's been lying around for a long time, collected a bit of dust, gotten a bit obsolete perhaps, or gotten twisted up and hacked around. Code that we're just not very happy with. It's a mess. It's ugly. It's fundamentally broken. It's not future-proof. It's probably still working and doing its thing for the most part, but we're not happy with it. So we're bringing up that big "rewrite" idea.

I'm probably just rehashing common knowledge here, but rewriting software from scratch is almost always a bad idea. Yet it is so appealing. And smart people fall into the trap all the time. And we never learn. No amount of experience seems to take the sheen off the imagined perfectly cleaned-up code that would result from such an effort.

Others have written on this topic before, for instance here. And I could add another example to the list: the Orca driver back when I was at ATI. It was decided that the old OpenGL driver sucked. It was a mess. Nvidia had been producing higher-quality drivers, but from my perspective we were catching up and were actually in pretty decent shape at that point. Of course, I wasn't a driver writer, so what did I know? When I heard about the complete rewrite of the OpenGL driver, I was quite skeptical but positive at the same time. But before long I started questioning the wisdom of going down this path. Initially both drivers were maintained in parallel. Fewer bugs were fixed and very little happened in terms of features, but there was hope for better times once the new driver was in good shape. But things were taking longer than expected. Don't they always?

Eventually the whole team worked on the new driver. The old driver came to a complete halt. No updates. Month after month. Nvidia was pumping out cool new extensions on a regular basis like always. The new driver wasn't ready to ship. Fun stuff, especially when the next hardware generation came around, as they do on a regular basis, and gosh, perhaps we need an OpenGL driver for it? The new driver didn't have nearly all the features of the old one, nor was it any faster. Of course, once it was in good enough shape to run Doom3 it was all fun again. Soon enough they had optimized it so that it showed a decent performance gain over the old driver. So it was all worth the effort, yay! Except, of course, Doom3 was important and the Orca driver wasn't ready for prime time, so some of these optimizations were backported into the old driver, and would you know it, now the old driver was faster again!

Now, I wasn't a driver writer and I didn't have full insight into all the events surrounding this driver, so this story is mostly my casual observation from an ISV relations point of view. Someone on the team might have seen it differently. But still, I believe ATI lost at least a year, if not two, in their OpenGL support. By now that's ancient history, and I was happy that AMD shipped OpenGL 4.2 drivers on release date. But I bet there are people on the team now complaining that the code is a mess.

Anyway, my intention wasn't really to tell that story, but rather to tell a story of my own. Some of you may remember my old announcement that I had started working on Framework4. It's depressing when I look it up and realize that was three years ago, and it's still not done! Gosh, I thought it was only two years ago, but it's three. So what prompted me to rewrite my framework? It's not like Framework3 wasn't working. I've been pumping out oodles of demos using it. It's just that it was a bit ugly. It was getting kind of obsolete. DX11 was around the corner and OpenGL was getting a serious overhaul and had finally started deprecating things, and I wanted to make this big move into the new age. I had rewritten my framework before; this was my third one. I had great visions of new clean code, using standard coding conventions, with nicer interfaces.

In the beginning it was fun, of course. I was coding up cool stuff: new SIMD-optimized vector classes, compile-time hashes, modern interfaces matching DX11 and OpenGL 3.0+, poking around a little with compute shaders. Soon though I wanted to make a real demo using it, and I realized the framework had a long way to go before I could ship one. So I started fixing up the rendering stuff. I was pretty much just writing code that wasn't that much different from what I was doing in Framework3. "I'll optimize it later anyway, I just want it to work for now." And then came the boring parts. My GUI system. At first I wanted something new, but found that I had no particular vision of what a better one might look like. So I just copied over the old one from Framework3, changed the code to standard conventions, and fixed up all the rendering to Framework4 style. Boooooring, but eventually up and running. And I kept running into all those small things, things that I hadn't implemented yet, things that separate a running prototype from a releasable demo. So I got kind of bored with the whole thing, and ended up spending less and less time on the project.

Of course there are other factors at play here too. The last couple of years have been rather life-changing for me. I went from single to having a girlfriend, to getting married, and now I'm the father of a two-month-old baby. I rarely find myself spending whole days coding anymore. Even if I'm free the whole day, it's not like I can sit down and code for eight hours straight like I sometimes used to. I haven't exactly stopped coding though, but whenever I've had a demo idea recently, I've been implementing it in Framework3.

I'm still poking around in Framework4 at times though. Actually, just recently I did some work to bring the OpenGL part into the 4.2 era. But I haven't been able to let go of the feeling that somehow I took the wrong path after all. Perhaps I should've just upgraded Framework3 instead. Back when DX10 was new I opted to just add a D3D10 renderer instead of doing the big rewrite. I felt it was too much work. I suppose it was one of my wiser moves. I had to bend things a little and change a few interfaces to shoehorn it into the Framework3 paradigm, but I got it working. I had to update a bunch of old demos too, but that was actually not much work. So I've been thinking a little lately: what was it that I actually wanted from a new framework? And the answer is pretty much this: DX11 rendering. So after contemplating the move for a while I finally gave it a shot yesterday. I implemented a DX11 renderer in Framework3. Basically I just copied the DX10 renderer, search-replaced "D3D10" with "D3D11", and then crunched through all the compile errors until I had a fully working DX11 renderer. This exercise took about an hour. An hour!!! I spent one hour and now I have a fully working DX11 renderer in Framework3. It's ahead of where I am with Framework4.
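For the curious, here is roughly what that mostly mechanical port boils down to. The biggest structural change from D3D10 to D3D11 is that state setting and draw calls move from the device interface to an immediate device context, so most of the "crunch through the compile errors" work is routing those calls through the context. The snippet below is just an illustrative sketch of that pattern (a hypothetical helper, not the actual Framework3 code):

#include <d3d11.h>

// In D3D10 these calls lived on ID3D10Device; D3D11 moves them to
// ID3D11DeviceContext. Hypothetical helper for illustration only.
void drawTriangles(ID3D11DeviceContext *context, ID3D11Buffer *vb,
                   UINT stride, UINT vertexCount)
{
    UINT offset = 0;
    // Was device->IASetVertexBuffers(...) / device->Draw(...) in the DX10 renderer.
    context->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    context->Draw(vertexCount, 0);
}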

So now I'm not sure what I will do going forward. For sure I'll replace the DX10 renderer with the DX11 one and let all the old DX10 demos run through the DX11 API, but at a D3D10 feature level. Perhaps I should just refactor Framework3 to fix all those things I've been unhappy about. Surely I'll have to update all the old demos too, but chances are that it's not much work this time around either. Or maybe I'll just copy Framework3 to a new folder and call it Framework4, and then start refactoring things from there?
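For reference, running the old DX10 demos through the DX11 API at the D3D10 feature level is mostly a matter of asking for D3D_FEATURE_LEVEL_10_0 at device creation. A minimal sketch, assuming a plain hardware device with no extra creation flags (again, not the actual framework code):

#include <d3d11.h>

// Create a D3D11 device restricted to the D3D10 feature set, so demos written
// against DX10-class capabilities keep working through the DX11 API.
HRESULT createD3D10LevelDevice(ID3D11Device **device, ID3D11DeviceContext **context)
{
    const D3D_FEATURE_LEVEL requested = D3D_FEATURE_LEVEL_10_0;
    D3D_FEATURE_LEVEL obtained;

    return D3D11CreateDevice(
        NULL,                      // default adapter
        D3D_DRIVER_TYPE_HARDWARE,  // hardware device
        NULL,                      // no software rasterizer module
        0,                         // no device creation flags
        &requested, 1,             // accept only feature level 10_0
        D3D11_SDK_VERSION,
        device, &obtained, context);
}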

Moral of the story: Rewriting from scratch is almost never the right way to go. It's easy to overestimate the benefits and underestimate the effort necessary to get there.

xoofx
Friday, December 2, 2011

Indeed!

Though, as your story implies, there is a noticeable difference between rewriting a whole application from scratch and rewriting part of the code from scratch.
If the original application uses component/interface layers between parts of the code, it is always easier to plug in a new underlying implementation, sometimes tweaking the interface to expose the new features to other parts of the application, with a manageable impact and without having to break the whole codebase, as you probably did for your Framework3.
The question of rewriting from scratch arises when the paradigm of the application's whole design is completely changing: the external interfaces are changing, as well as all the internal plumbing, interfaces, workflows, data, etc. In the end, this is not a rewrite but the engineering of a new solution.
In the case of the AMD OpenGL story, as the external interface was obviously not changing, rewriting such a critical component was clearly not a good move. It would have been more prudent to identify the key parts of the component to rewrite, and start incrementally from there.

Ashkan (@Aphrodite3D)
Friday, December 2, 2011

Don't rewrite software and you have to deal with the same mess each day, every day, watching it get worse and worse by the day, until production literally grinds to a halt. In such a code base, there eventually comes a time when we, as software developers, spend ever more time hacking around to fix a bug or to make the new feature/improvement play *nicely* with the rest of the codebase. When your code is a mess, debugging suffers, readability suffers, productivity suffers, scalability suffers, performance suffers, hell, even the business itself suffers as it slowly starts to lose its competitive edge, since adding new features or fixing newly found bugs (not to mention that messy code is prone to producing more bugs in the first place) takes more and more resources, be it time or money. You are losing on all grounds. Period.

Moral of the story: Not rewriting software sucks. Rewriting software also sucks. We either have to choose the lesser of two evils when it's too late, or be smart from the get-go, write clean software overall, and strive to keep it in good shape. That's the lesson that software developers and business owners must learn. That's the lesson that must be drilled into their ears. A lesson which, as a result of short-sightedness, they seem to so fervently resist acknowledging. Hacks will always come back to bite us in the ass. That's the golden rule. Rewriting is evil, sure, but it's not the root of all evil. Hacking is the root of all evil. Hacking is the reason why rewrites eventually become so devilishly tempting, and as long as we so ceaselessly insist on making the same mistake over and over again, we're doomed to live the same fucking hell over and over again.

Hoof! Had to get this off my chest!

- Peace

Nick
Friday, December 2, 2011

Since I don't know your whole situation and experience, you can take this comment or leave it; I'm commenting from my own perspective on similar efforts -

I wouldn't consider the rewrite wasted effort at all. It sounds to me like you learned plenty, including what was good about your previous framework. If you don't try something like that, you'll never know. Going through the experience you did also means that next time you find yourself in a similar situation, you'll be better able to know when to cut your losses.

I sometimes embark on a rewrite knowing that I will throw it away. Sometimes I do it because I have an idea that I want to try out, and sometimes I want to prove to myself that some great idea is not really that great - and a lot of the time the only way to know those things is to do it.

@jlauha
Friday, December 2, 2011

Copy Framework3 to Framework5 and fix the most important things.

Wouldn't be the first time in history when "version 4" never gets out (at least Microsoft doesn't seem to like v4, e.g. DirectX 4 was never released).

Overlord
Friday, December 2, 2011

In my experience, rewriting parts of the code on a continuing basis and having a plan for doing so is the key to keeping all your stuff up to date.
I probably spend more time planning what and how to code than I do actually coding.
I also tend to write code with just this purpose in mind from the beginning, keeping things pretty much self-contained so I can swap out a piece whenever I like.

Total rewrites are never a good idea, though every so often I still find myself wanting to do one. Either way, I'll pretty soon be forced to do it anyway, as my last/first engine is pretty old and no amount of rejiggering can fix that mess.
I just hope it doesn't take that long (though it probably will).

Axel
Friday, December 2, 2011

Winamp also did a rewrite with Winamp 3 that failed (bloated, slow, etc.). They backported most of the stuff to the Winamp 2 code base and called it Winamp 5 (2 + 3 equals 5, right?)

So copy Framework3, backport the changes from 4, and call it Framework7.

Rob L.
Friday, December 2, 2011

"Or maybe I'll just copy Framework3 to a new folder and call it Framework4, and then start refactoring things from there?"

Yup, that's what I do most of the time. Otherwise you can't really test things or run side-by-side comparisons to verify that things work as expected, or even better.

Alternative: A few years ago I ripped out the window management from my framework and implemented a separate system in a stand-alone application from scratch. New interfaces and all, but it didn't do anything else, it just created a window with an OGL context. After that I put it into my framework, updated all the applications, and got a nice, new, clean window management system. It's good to have your code modular and with very few dependencies (internal and external), so you can replace components without too many headaches.

Although, at work we just had to rewrite all of our new technology from scratch, because the old code base was such a mess and was too tightly integrated with other outdated and, by that time, unsupported code. I don't mind writing "boring" code, but I made sure to write all the "basic" things (game engine and low-level systems) first anyway and the rendering last. ;D

So, a rewrite can be good sometimes, but it is very rarely really necessary.

Best regards!
