"I don't see the logic of rejecting data just because they seem incredible."
- Fred Hoyle
Humus takes a look at the world today
Thursday, November 21, 2002 | Permalink

I guess by now no one has really missed the launch of the GeForce FX. This next chip from nVidia, which is supposed to move the King of 3D title back to nVidia again, has been met with mixed feelings from media, gamers and developers. We have heard everything from "rocks" to "sucks". Here's my take ...

I'm certain the card will rock. There are certainly strong points of this chip where it outdoes the Radeon 9700, like longer fragment programs, up to 32 bits/channel operation, more flexible math etc. There are also weak points, like floating point textures being limited to texture_rectangle only, while the 9700 can use any of 1D/2D/3D/cubemap/texture_rectangle. All this is fine and dandy, but there's a trend that worries me. I had been expecting the graphics market to mature over the years, but I guess I've been proven wrong, with both good and bad consequences ...

The good part first: I expected graphics to slow down in progress a little, with smaller performance boosts between generations and fewer new and exciting features, like where the audio card industry is today. The Radeon 9700 really proved me wrong. It outperformed anything on the market, especially with anisotropic filtering and antialiasing maxed out, where it beat earlier and competing chips by an order of magnitude. It also came with a large set of new features, and I currently find myself in a situation with more possibilities than ever. The step is larger than what I felt going from my G400 to the Radeon, or from the Radeon to the Radeon 8500.

The bad part, which worries me a little, is that the extremely competitive nature of the graphics industry is leading us (maybe with no return) in a direction where we don't really want to go. What I'm thinking of is the vacuum cleaner nVidia has attached to their GeForce FX chips.

Let's go back in time a little. Back in the old days, not even CPUs had active cooling. When chips that required active cooling started to appear, people had a hard time accepting it at first, but over time they have gotten used to it. Today everyone takes for granted that a CPU should have a monstrous heatsink with a loud fan attached to it, and no one reflects on the fact that the little thing uses more power than a lightbulb. I guess electricity is simply too cheap for anyone to care, but many do care about the noise. I certainly do.
Now take a look at graphics cards. My VoodooII was fine with passive cooling. So was my G400. My Radeon had a small fan. My Radeon 8500 had a small fan. My Radeon 9700 has a slightly larger fan and requires extra power from a HD connector. The Voodoo 5 had a similar solution, but people bashed it for its ridiculous power consumption. No one complained about the 9700 requiring it. Maybe because it was so far ahead of everything else, something you couldn't say about the Voodoo 5. Now the GeForce FX arrives, also requiring extra power from a HD connector. Additionally, it comes with a vacuum cleaner attached to it to keep it cool. Of course nVidia pushes the thing as a revolutionary cooling solution, and some people will drool over it as if it's the coolest thing ever. But I'll tell you something: this is NOT progress. This is a step towards somewhere we don't want to go. Computers are already noisy enough. I was delighted to read that the new Athlon 64 runs quite cool thanks to the SOI process. That's progress! If a vacuum cleaner attached to your graphics card is what it takes to beat a competitor, then I think it's time to get back to the drawing board and rethink your strategies.

It keeps amazing me that performance is still the main selling point of graphics boards. One would expect that in this day and age people would start to value other attributes a little higher; most games already run fine in high resolution with anisotropic filtering and antialiasing. It's time to move along. Graphics companies need to start pushing other attributes. There's no shame in being slower, even if you release your card later. The GeForce FX has enough new cool stuff to be able to sell without requiring its own ventilation system and filling both the AGP and a PCI slot. Just clock the darn thing at a level it can handle! Power consumption scales roughly linearly with clock speed, and quadratically with voltage. Lower the clock a little, which might mean you can turn the voltage down a little too; you get slightly less performance, but without the need for a vacuum cleaner, and thus probably a more affordable price for the customer in the end. There are so many attributes of the card that nVidia could push instead, like features, driver quality, Linux support (a large market I'm still waiting on ATi to give a rat's ass about) etc. It doesn't need to outperform the 9700 to be worth its price. In all honesty though, the "silent computing" thing is progress. That's a really good feature I hope to see from more vendors. There's no need to run a fan at full speed when the chip doesn't have a whole lot of work to do. I don't buy the claims that it doesn't matter if the thing is noisy during a gaming session. Sure, that's partly true for high-speed action games like UT, but not for other games. You don't want air flow noise to disturb you while playing Unreal/Doom3 style games.
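To put numbers on the scaling claim above, here is a small sketch of the standard CMOS dynamic-power relation P = C · V² · f. All the figures (capacitance factor, baseline voltage and clock, the 10% and 5% reductions) are made up purely for illustration, not specs of any real chip:

```python
# Sketch of CMOS dynamic power scaling: P = C * V^2 * f.
# All numbers below are hypothetical, chosen only to illustrate the effect.

def dynamic_power(c: float, voltage: float, freq_mhz: float) -> float:
    """Dynamic power in arbitrary units: P = C * V^2 * f."""
    return c * voltage ** 2 * freq_mhz

# Hypothetical baseline: 1.6 V at 500 MHz.
base = dynamic_power(c=1.0, voltage=1.6, freq_mhz=500)

# Drop the clock 10%; the lower clock may allow dropping the voltage 5% too.
tweaked = dynamic_power(c=1.0, voltage=1.6 * 0.95, freq_mhz=500 * 0.90)

# 0.95^2 * 0.90 ≈ 0.81, i.e. roughly a 19% power reduction for a 10% clock cut.
print(f"power relative to baseline: {tweaked / base:.3f}")
```

The quadratic voltage term is what makes even a small voltage reduction pay off disproportionately, which is the point of the argument above.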
Anyway, lots of words here; what I really wanted to say is just this: please, don't make noisy computing the norm of the future.

- Humus


Two new demos
Tuesday, November 19, 2002 | Permalink

Two new demos up. Motion blur and depth of field.


Tuesday, November 19, 2002 | Permalink

No, it's not about an update of the tweaker. Rather the opposite: I'm planning on removing it from this site. This piece of software is horribly outdated, so it doesn't make much sense to keep it here. I'll leave it up for a week or so, then I'll remove it. I'm putting this note up now so I don't get a lot of emails asking where it went.
I do occasionally get questions about it; heck, I even still get some questions about the Raid-On tweaker. While it's always nice to know that people use my software, this piece of code represents something of the past and an interest in tweaking I had a couple of years ago. Today I'm not much of a tweaker; tweaking is something for the enthusiast gamer, not for a developer. I need stability and reliability; performance is secondary.
So ...



Sunday, November 17, 2002 | Permalink

I've gotten several reports about the infinite terrain demo crashing; it happened for me from time to time too. I *think* it's gone now, but I'm not sure; at least it hasn't crashed in a while. I have also fixed a problem with normals at block edges which caused visible lines along the terrain in certain places.
I'll move on to some other project now.


Demo news
Saturday, November 16, 2002 | Permalink

I have uploaded a new infinite terrain demo.

I have also gone through the pain of updating and re-uploading my older demos based on the new framework, due to some design changes in the framework. All these demos should now compile and work, except the "Shadows that don't suck" demo, which is still affected by a render-to-texture driver bug.

Quick update:
It turned out my first version of the infinite terrain demo didn't work at all on cards that don't support GL_ATI_vertex_array_object, but that's fixed now. I also included the font texture that I forgot the first time.
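For readers wondering what such a capability check involves: OpenGL reports its extensions as one space-separated string, and an app should look a name up in that list before using the extension. This is a minimal sketch of the idea, not the demo's actual code; the sample string below is invented:

```python
# Sketch of checking an OpenGL extension by name. In a real GL program the
# string would come from glGetString(GL_EXTENSIONS); here it is a made-up
# sample. Extensions are reported as one space-separated list of names.

def has_extension(extensions: str, name: str) -> bool:
    """Exact token match; a plain substring test could wrongly match prefixes
    (e.g. searching for GL_EXT_foo would also hit GL_EXT_foo_bar)."""
    return name in extensions.split()

sample = "GL_ARB_multitexture GL_ATI_vertex_array_object GL_ARB_fragment_program"

print(has_extension(sample, "GL_ATI_vertex_array_object"))  # True on this sample
print(has_extension(sample, "GL_NV_vertex_array_range"))    # False on this sample
```

If the check fails, the app falls back to a plain vertex-array path instead of the extension's optimized one.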


Saturday, November 16, 2002 | Permalink

Finally, some snow!
This has been a slow winter in northern Sweden, with some snow coming and going, but now for the first time the ground is completely covered with snow.


ATi demos and drivers
Wednesday, November 13, 2002 | Permalink

Microsoft just released DX9 RC0 to the public. Following this, ATi finally released their demos showing off the Radeon 9700 features that require DX9. To be able to run them, ATi also released a new set of DX9 RC0 compatible drivers. These drivers also contain a new OpenGL driver with full support for GL_ARB_fragment_program, GL_ATI_draw_buffers, WGL_ATI_pixel_format_float and GL_NV_occlusion_query. Guess I'll be even more busy in the coming days.

It also turns out these drivers fix a few bugs. The Laplace and Dither demos now work again. The "Shadows that don't suck" demo is still broken though.

Click the headline for the goods.


Mandelbrot demo
Monday, November 11, 2002 | Permalink

Ah, what we have all been waiting for: a Mandelbrot renderer running in hardware.
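The math a Mandelbrot renderer evaluates is the escape-time iteration z ← z² + c, starting from z = 0 for each pixel's point c, counting iterations until |z| exceeds 2. This CPU sketch shows just that recurrence for reference; it is not the demo's fragment program:

```python
# Reference Mandelbrot escape-time iteration: z <- z^2 + c, starting at z = 0,
# counting iterations until |z| > 2 or an iteration cap is reached. A hardware
# renderer evaluates the same recurrence per pixel and maps the count to color.

def mandelbrot_iters(c: complex, max_iters: int = 64) -> int:
    z = 0j
    for i in range(max_iters):
        if abs(z) > 2.0:
            return i  # escaped: point is outside the set
        z = z * z + c
    return max_iters  # never escaped within the cap: treated as inside

print(mandelbrot_iters(0 + 0j))  # origin is in the set, hits the cap: 64
print(mandelbrot_iters(2 + 2j))  # escapes after one step: 1
```

The iteration count per pixel is what gets mapped to a color gradient in the familiar fractal images.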

