Synthetic Benchmarking: Good or Obsolete?

A-KO recently told me about a discussion on the Guru of 3D forums regarding the ATI/NVidia driver cheating debate, and I wanted to chime in from an avid Unreal gamer's point of view. In my opinion, the debate over who does what with their drivers for the sake of 3DMark scores is no longer valid. The days of Futuremark, or any other 'synthetic' gaming benchmark, are gone. While these benchmarks are fun to run and watch, they are no longer relevant to 'real world' gaming. Here is why.

When 3DMark and other synthetic benchmarks first came onto the scene, gaming was completely different. Those benchmarks used mostly OpenGL tests and very few Direct3D ones, which was a direct reflection of the state of gaming at the time. Just before the benchmarks arrived, the biggest controversy in the gaming community involved the original Quake and OpenGL: with the advent of OpenGL, some gamers had the ability to 'see through' the water and kill opponents who couldn't see them. Was this cheating? No, because the code was there for everyone; not every card could support the feature. It wasn't cheating, but that didn't make it fair. Things were about to change in the gaming world, though.

As the gaming community moved from OpenGL to Direct3D, the hardware changed as well. Every new card that came out supported '3D gaming'. With two main formats, you wanted to know which card did best in which format, and this is where synthetic benchmarks earned their fame and fortune: you needed something that could push the card and tell you what would happen. When PC Mag's online benchmark was originally created, it tested how your card handled 8, 16, 32, or 64 megs of textures. The benchmark keyed on what your card CAN or CANNOT do, as opposed to what somebody THINKS it should do. That was important, because it helped you decide which features to turn on while playing the game.
Those features directly affected your online game play through the number of frames your computer could produce. That is no longer the case today: most games run the current Direct3D format, and graphics engines have completely changed as well. There are now non-essential features, such as the 'rag doll' effect, that have no impact on game play and are purely for looks. Since this is pure eye candy, you turn it on only if you want it and your PC can support it.

So for Futuremark to try to emulate both current games and 'future' games, and to dictate how a gamer plays, is not very wise. How can one benchmark emulate several different engines and how gamers will play them? It can't. The only real way to tell how your card will do in a certain game is to run that game. What is the point of scoring 10,000 marks if you go online to play your favorite game and get 30fps? I'll tell you what will happen: you will get beaten by someone holding a smooth 75fps average while your machine bogs down on a frame-rate-killing map.

So in essence, people should not be mad at ATI or NVidia for trying to 'gain' an advantage in a benchmark. You should be mad at them for wasting time they could have spent making their drivers better in the games you play now, all while keeping an eye on the future. In the end, if you like Futuremark, use it. However, don't go to a forum and bitch that your rig is fast in the benches but sucks in real world games, because that will only get you flamed.

As always, these are just my opinions and mine only.

Tycho

If you wish to discuss this more, head over here, or join us in IRC at #unrealops on irc.gameradius.org.

_______________
Sources: FutureMark, PC Mag Online, Neowin.net
The Unreal Ops site was created with inspiration from Eyeball-Design. All © content on this site is property of its respective owners. You must contact the Webmaster if you would like to borrow any Unreal Ops exclusive content from this site.