nVidia caught cheating on benchmarks


  • #76
    Originally posted by Ned


    Well, I don't even want to imagine what it would be like to be a Vietcong going up against the US Army and Marine Corps in Vietnam. We had an awesome firepower advantage because we had airmobile resupply of ammo, artillery, tactical air, and B-52s. The Vietcong had none of this. I think our "kill" ratio was at least 10-1 and more like 20-1.
    I would wonder... the poor VC were pretty diehard in their beliefs, so they fought to the death.
    For there is [another] kind of violence, slower but just as deadly, destructive as the shot or the bomb in the night. This is the violence of institutions -- indifference, inaction, and decay. This is the violence that afflicts the poor, that poisons relations between men because their skin has different colors. - Bobby Kennedy (Mindless Menace of Violence)



    • #77
      Originally posted by Asher

      Nvidia and ATI were both in that developer beta program, but Nvidia left a long time ago in protest over how 3DMark03 was being written with ATI cards in mind, with so much stuff emulated on the Nvidia architecture. It makes even less sense, since the 1.1/1.3 pixel shaders are almost universally used in DX8 games, not ATI's 1.4.
      Yes, it was about 1.4 falling back to 1.1, skipping 1.3, if the card doesn't support 1.4? Or something like that?
      I just took their word when they said that 1.3 has no real performance advantage over 1.1. Nobody challenged that argument (well, I didn't see it anywhere), so I have to go with what they (Futuremark) said.
      As for using 1.4 when the card supports it, I don't see any reason _not to use it_ if it performs better, like they said, and the card supports it. There is no reason they should have done it the way games up till then had, because when 3DMark03 was released it was meant for future games. Is there any reason to believe games won't support 1.4 from now on (now that there are more cards that can do 1.4)?
      There was a lot of discussion about this back then, and the answers sure satisfied me.

      What I don't know is (I am no pro, just someone who reads the news!):
      What is more likely: will games use ps2.0 in the future and fall back to 1.1/1.3 if the card doesn't support it (is this practical/possible)?
      Or 1.4, falling back to 1.1/1.3?
      Or just use 1.1/1.3?
      It's about goddamn time games start using DX8/9; until then my GF4 will stay in my comp... we'll have to wait.
      As long as games don't use any cool shaders, 3DMark03 will not serve its purpose. The dawn of cinematic computing, where the hell are you? "I have been ready" for ages now.



      • #78
        Originally posted by tinyp3nis
        Yes, it was about 1.4 falling back to 1.1, skipping 1.3, if the card doesn't support 1.4? Or something like that?
        I just took their word when they said that 1.3 has no real performance advantage over 1.1. Nobody challenged that argument (well, I didn't see it anywhere), so I have to go with what they (Futuremark) said.
        That's not the problem, though.

        The GeForce FX supports 1.4; it just has to be emulated. 3DMark pings the drivers, determines the card has 1.4 support, and runs the 1.4 shaders. They're far slower than Nvidia's equivalent 1.3 path. One of the things they got busted for was forcing the GeForce FXs to run it in 1.3, since that makes so much more sense.


        As for using 1.4 when the card supports it, I don't see any reason _not to use it_ if it performs better, like they said, and the card supports it.
        The only card it performs better on is the Radeon. Don't let the numbering confuse you: 1.4 is not inherently better than 1.3. They're just different versions that MS okayed. Nvidia made 1.1-1.3 (1.0 never got released); ATI made 1.4. They're just different ways to go about doing pixel shading.

        Since ATI cards also support 1.1-1.3, virtually no game has implemented 1.4, since it's rather silly to. Which is why I'm completely dumbfounded that Futuremark made 3DMark03 so heavy on it. Hell, in all but one of the DX9 tests, it still uses 1.4 instead of 2.0! It's like somebody went out of their way to make the Radeon's performance look better, but that's just my opinion. Either that, or gross stupidity.

        What is more likely: will games use ps2.0 in the future and fall back to 1.1/1.3 if the card doesn't support it (is this practical/possible)?
        Or 1.4, falling back to 1.1/1.3?
        Or just use 1.1/1.3?
        What's most likely is that no games at all will use 1.4. If it's a DX8 game, it'll use 1.3; if it's a DX9 game, it'll use 2.0.

        1.4 is ridiculous, and slow on everything but Radeon cards, and the only GeForce card that even accepts it is the FX. There's also no real advantage to it.
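
        To make that concrete, here is a minimal sketch (mine, not from any shipping game or from Futuremark) of how a DX9-era engine would typically pick its shader path from the caps the driver reports. It assumes the Direct3D 9 SDK headers; error handling is omitted and the returned profile strings are just labels for the example.

        #include <d3d9.h>

        // Pick a pixel shader path from the capabilities the driver reports.
        // D3DCAPS9 and D3DPS_VERSION are the real DX9 SDK definitions.
        const char* PickPixelShaderPath(IDirect3DDevice9* device)
        {
            D3DCAPS9 caps;
            device->GetDeviceCaps(&caps);  // highest PS version the driver claims

            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                return "ps_2_0";        // full DX9 path
            if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
                return "ps_1_1";        // vendor-neutral DX8 fallback (1.1-1.3 family)
            return "fixed-function";    // no pixel shaders at all
        }

        Notice there is no 1.4 branch at all, and nothing is lost by that: every 1.4-capable card also runs 1.1-1.3.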
        "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
        Ben Kenobi: "That means I'm doing something right."



        • #79
          Originally posted by Asher

          That's not the problem, though.

          The GeForce FX supports 1.4; it just has to be emulated. 3DMark pings the drivers, determines the card has 1.4 support, and runs the 1.4 shaders. They're far slower than Nvidia's equivalent 1.3 path. One of the things they got busted for was forcing the GeForce FXs to run it in 1.3, since that makes so much more sense.

          The only card it performs better on is the Radeon. Don't let the numbering confuse you: 1.4 is not inherently better than 1.3. They're just different versions that MS okayed. Nvidia made 1.1-1.3 (1.0 never got released); ATI made 1.4. They're just different ways to go about doing pixel shading.

          Since ATI cards also support 1.1-1.3, virtually no game has implemented 1.4, since it's rather silly to. Which is why I'm completely dumbfounded that Futuremark made 3DMark03 so heavy on it. Hell, in all but one of the DX9 tests, it still uses 1.4 instead of 2.0! It's like somebody went out of their way to make the Radeon's performance look better, but that's just my opinion. Either that, or gross stupidity.

          What's most likely is that no games at all will use 1.4. If it's a DX8 game, it'll use 1.3; if it's a DX9 game, it'll use 2.0.

          1.4 is ridiculous, and slow on everything but Radeon cards, and the only GeForce card that even accepts it is the FX. There's also no real advantage to it.
          Now hold on there, 1.3 faster than 1.4? I would love to see a link. Seriously, I haven't heard that said even once. I also thought that 1.4 (like 2.0) is supposed(!!) to be in the card if it is a fully DirectX 9 card. It's supposed to be one of the DX9 specifications, not just an ATI thing?
          Links!



          • #80
            1.3 is "faster" on Geforce, because they can't do 1.4 well, is that what you are saying?
            If so, it's not the same thing as 1.3 actually being faster than 1.4!
            Nvidia does have monopoly, and the games will be made so that they work on most+slowest cards.



            • #81
              Originally posted by tinyp3nis
              Now hold on there, 1.3 faster than 1.4? I would love to see a link. Seriously, I haven't heard that said even once. I also thought that 1.4 (like 2.0) is supposed(!!) to be in the card if it is a fully DirectX 9 card. It's supposed to be one of the DX9 specifications, not just an ATI thing?
              Links!
              1.3 is much faster than 1.4 on the GeForce FX, because the 1.4 code is translated into 2.0 code, THEN evaluated by the GeForce FX. Not only is there the extra translation step, there's also the overhead of running 2.0 code to do simple 1.4 things.

              The GeForce FX runs 1.1-1.3 natively and 2.0 natively; 1.4 is emulated.

              1.3 is "faster" on Geforce, because they can't do 1.4 well, is that what you are saying?
              If so, it's not the same thing as 1.3 actually being faster than 1.4!
              Neither is faster than the other if implemented equally. 1.4 is drastically different from 1.3; one is not better than the other. 1.3 is Nvidia's DX8.1 implementation, 1.4 is ATI's DX8.1 implementation. Nvidia turned their spec in first, so it was labeled 1.3; ATI turned theirs in later, so it got 1.4.

              The thing is, the only video cards that support 1.4 natively are Radeons. 1.3 is supported natively by S3, SiS, Nvidia, and ATI. That's why games use 1.3 for DX8 and 2.0 for DX9.

              1.4 is a failed addition by ATI, and the only mainstream app that uses it is 3DMark03. And it uses it very heavily.
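
              And the caps bits can't even tell you whether a version is native or emulated. Here's a hypothetical sketch (mine, assuming the same Direct3D 9 SDK headers as the earlier one) of a "use the highest version reported" check, which is roughly what the 3DMark03 behaviour described above boils down to:

              #include <d3d9.h>

              // A GeForce FX reports pixel shader 2.0, which implies 1.4
              // support, so this check passes -- and the 1.4 path then runs
              // on the driver's translated/emulated route, not a native one.
              // Nothing in D3DCAPS9 distinguishes native from emulated support.
              bool CardGetsPs14Path(const D3DCAPS9& caps)
              {
                  return caps.PixelShaderVersion >= D3DPS_VERSION(1, 4);
              }

              That single comparison is why the FX lands on its slow path: the caps say 2.0, 2.0 implies 1.4, and no flag says "but 1.4 is emulated here".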
              "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
              Ben Kenobi: "That means I'm doing something right."



              • #82
                So in short, do you think Futuremark was lying when they said that 1.1 -> 1.3 gave no real performance advantage, but 1.1 -> 1.4 did?



                • #83
                  Originally posted by tinyp3nis
                  So in short, do you think Futuremark was lying when they said that 1.1 -> 1.3 gave no real performance advantage, but 1.1 -> 1.4 did?
                  It has a performance advantage on the Radeon only, which is why I find it strange that ATI sent developers to help with the program, and why Nvidia quit it in protest...

                  And the reason it has a performance advantage on the Radeon is that the Radeon was designed around using 1.4; 1.1/1.3 support was added at the last minute, but still runs natively.
                  "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                  Ben Kenobi: "That means I'm doing something right."



                  • #84
                    Now that I can agree with. The reason the better (only a little, but still) 1.4 won't be used is that Nvidia refuses to support it; the FX should have had it built in.



                    • #85
                      The FX doesn't have it natively, because they didn't think they'd need it. Why waste more transistors when you've already got 130M of them?

                      No games used 1.4 then, and only two use it now (and they also have native 1.3 paths). 3DMark is a very notable exception...
                      "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                      Ben Kenobi: "That means I'm doing something right."



                      • #86
                        Two games that use it is two games more than I knew of. What are their names?



                        • #87
                          Hell if I can remember them.
                          "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
                          Ben Kenobi: "That means I'm doing something right."



                          • #88
                            Originally posted by Fez
                            I always played MOHAA... my favorite game indeed. I never played Ghost Recon though.

                            In Vietcong, you usually have a medic, or sometimes first aid kits. If you play on easy you can always get healed to 100%, but if you play on normal or harder you will notice the medic can only do so much: you sustain injury over time whether or not the medic treats you.

                            That is exactly how it is in Vietcong too: if you play Vietcong mode, there is no save, and it is insanely difficult. I tried it. They say in the manual that is how it really was.

                            Also you have to keep your team alive because each member has a specific function.
                            Well Fez, I just got through Vietcong on easy mode. It was really hard. I can only imagine what it would be like on Vietcong mode. For example, in most cases I would just run up to the VC and blaze away with my automatic weapon. If I were killed with only one hit, this would be impossible.

                            That last mission, defending the base, was made a lot harder because I started with the wrong weapon. I think the shotgun is best. I had the M1 Garand sniper, which I quickly had to throw away in favor of a shotgun.

                            I can see how this game has a lot of replayability with the choice of weapons on each mission. A very good game, overall.

                            BTW, the book We Were Soldiers reported how loud the battles in Vietnam were, with so many automatic weapons shooting all at once. I think this game captured that aspect of the war to a tee. That last battle was really loud.
                            http://tools.wikimedia.de/~gmaxwell/jorbis/JOrbisPlayer.php?path=John+Williams+The+Imperial+March+from+The+Empire+Strikes+Back.ogg&wiki=en



                            • #89
                              Originally posted by Ned

                              ... that Russia took enormous casualties in assaulting the German army in 44-45, suffering far more casualties than it inflicted on the Germans, while at the same time the US Army took very few casualties and inflicted many times that number on the Germans? Was the primary difference in the two theaters the use of airpower by the US? Or was there something more, like bad Russian tactics?)
                              Actually, many factors. Most important was the lack of any option to surrender on the Eastern Front (you became a casualty anyway).

                              After reading this thread, and trying the latest Nvidia driver for my obsolescent TNT2 (it trashed my hard disk), I can safely say my next upgrade is going to be a Radeon.

                              And I will be playing Ghost Recon and Vietcong on it (there are lots of the latter in the local exchange game stores - it must be very tough).
                              Some cry `Allah O Akbar` in the street. And some carry Allah in their heart.
                              "The CIA does nothing, says nothing, allows nothing, unless its own interests are served. They are the biggest assembly of liars and theives this country ever put under one roof and they are an abomination" Deputy COS (Intel) US Army 1981-84



                              • #90
                                Originally posted by Cruddy


                                Actually, many factors. Most important was the lack of any option to surrender on the Eastern Front (you became a casualty anyway).

                                After reading this thread, and trying the latest Nvidia driver for my obsolescent TNT2 (it trashed my hard disk), I can safely say my next upgrade is going to be a Radeon.

                                And I will be playing Ghost Recon and Vietcong on it (there are lots of the latter in the local exchange game stores - it must be very tough).
                                Cruddy, the two games are simply extraordinary.

                                As to the differences between Russian and American performance, I can hardly think it has anything to do with the doggedness of the Germans on the Eastern Front. In the West, we killed a lot more Germans per American lost. It seems, rather, that the Germans were not surrendering on the Western Front.

                                The Russians had a lot of airpower as well. So I am beginning to think that the Russians were big on human wave charges.
                                http://tools.wikimedia.de/~gmaxwell/jorbis/JOrbisPlayer.php?path=John+Williams+The+Imperial+March+from+The+Empire+Strikes+Back.ogg&wiki=en
