GeForce FX (NV30) is finally here -- What a disappointment...

  • #31
    8x Nvidia AF is equivalent to 16x ATI AF. They just use different algorithms and names; they're effectively the same thing (see the sketch below).
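    For what it's worth, the AF number is just a ceiling the application requests; the sampling algorithm behind it is vendor-specific, which is why the two companies' modes don't line up one-to-one. A minimal sketch of how a game asks for it, assuming the standard GL_EXT_texture_filter_anisotropic extension (the helper name is illustrative, not from any particular game):

    ```c
    /* Hypothetical helper: request the strongest anisotropic filtering
     * the card supports via GL_EXT_texture_filter_anisotropic. The value
     * is only a maximum (8.0 on NV30, 16.0 on R300); how the samples are
     * actually taken is up to the driver, which is why "8x" and "16x"
     * across vendors aren't directly comparable. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void set_max_aniso(GLuint texture)
    {
        GLfloat max_supported = 1.0f;

        /* Ask the driver for its anisotropy ceiling. */
        glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_supported);

        glBindTexture(GL_TEXTURE_2D, texture);
        glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                        max_supported);
    }
    ```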
    "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
    Ben Kenobi: "That means I'm doing something right. "



    • #32
      The story behind NV30:

      The NV30 originally taped out in its first form back in February of last year, which is why everyone around here kept getting into flame wars all summer after every single stockholder address: it's taped out, it's not taped out, it's taped out, etc. There were even *very* early revisions of the NV30 in people's hands before the R9700 debuted at E3 in that Doom III system.

      The problem was that, for whatever reason, the NV30 was suffering serious signal bleed and other anomalies, which forced various low-level revamps of the design. Yes, there were a few issues with the .13u process where the NV30 was concerned, but other products had already been completely finished at .13u all through last year.

      However, the entire nature of the problem changed once the true nature of the R300 was revealed. The focus shifted to getting the core speed up. Originally the NV30 was designed to be released at 350 and 400 MHz, which was clearly not going to cut it against the R300 from a technology and performance leadership position. So it was back to the drawing board for a face lift; final revisions were finished around August, and the chip was ready for a final tape-out in early September.

      This is where all the MAJOR issues with .13u come into play. At that point they hit serious problems with the implementation of low-k dielectrics, which they were completely banking on for lower power and heat in order to get the part running at 500 MHz, the new target clock speed for the Ultra. The whole train wrecked for about 14 days. Nvidia at that point said: screw this low-k, we can make it without it. They got the chip back in its final form without low-k. It *almost* worked, but some issue required a final metal-layer respin, which left them with a working prototype board and initial drivers just before their big launch. However, the card was still only running at 400 MHz at that point; it would run at 500 MHz, but it was very unstable. So at that point (I believe) they did one more revision to get the chip they have now. They already knew they would need a giant cooling system as soon as low-k fell through. What they did not count on until pretty late in the game was the need for a 12-layer PCB for complete stability, due to power requirements and other issues. Hence the big announcement a few weeks ago.

      Which all brings us to where we are today, and why the NV30 is in the condition it is. Ultimately it boils down to an NV30 that was originally targeted at 350-400 MHz, with about 2.5x the performance of the GF4 in the *best* scenario. Which of course would have been a great product if the R300 had been targeted at the GF4.
      "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
      Ben Kenobi: "That means I'm doing something right. "



      • #33
        ATI fanboys get my goat.

        You can brag all you want when they actually release drivers that work with high end games.

        Yeah, I've heard all the stuff about the Catalyst drivers getting better and better, but the Radeon 7000 in my laptop would disagree.



        • #34
          The plot thickens...

          GeForce FX Reviews Wrong?
          While we are still looking into this, it seems that the in-game screen shots posted on the Net yesterday showing off IQ produced by the GeForceFX 5800 Ultra are "wrong".

          There is no doubt that we criticized the GFFX for its AntiAliasing, and now it seems that we may not have had the proper evidence on which to base our conclusions. To quote ourselves from this page:


          "With NoAA you can see the aliasing is quite predominant. 2X AA and Quincunx don’t seem to do much on the GeForceFX visually, but the FPS are affected comparing the shots to the original with no AA enabled."

          Of course all of this left us a bit puzzled, and wondering about the AA abilities of the drivers, but are the "facts" in fact correct?

          We have been working with NVIDIA on this to get an answer, and it seems we now have preliminary information that gives us a bit more insight into the question.

          The GeForceFX's technology applies filters that affect AntiAliasing and Anisotropic filtering both before the frame buffer and after the frame has left the frame buffer. In short, this means that all of our screenshots do not accurately represent the true in-game visual quality that the GFFX can and will produce, as the screen shots were pulled from the frame buffer (in the "middle" of the AA process). We have come to conclusions about the GFFX IQ (Image Quality) that may simply be wrong.

          While we cannot answer for other reviews of the GeForceFX it is very possible this is an issue with those articles as well, if they were in fact thorough enough to cover IQ.

          We are currently working on a way to capture the images properly and will be revisiting the GeForceFX 5800 preview, redoing the IQ portion with proper screen shot comparisons or further information addressing the truth of this situation.

          Certainly this is a huge issue, and one it seems NVIDIA was not even aware of when they issued us the review units. Having 48 hours to preview the card over Super Bowl weekend compounded this, and while that is no excuse for improper evaluation on our part, it certainly did impact our ability to do a better job. We are sorry for any incorrect evaluations we have made and are working now to remedy the situation. Any new information will be posted here on our news page.
          From HardOCP.
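          If it helps to see why a screenshot can miss filtering applied after the frame buffer: the usual capture path reads pixels straight out of the frame buffer, so anything the hardware does to the image later (e.g. at scanout, as described above) never appears in the file. A minimal sketch, assuming a plain OpenGL screenshot routine -- this is illustrative, not HardOCP's actual capture tool:

          ```c
          /* Hypothetical screenshot routine: glReadPixels() copies the
           * current frame buffer contents. Any filtering applied after
           * the frame leaves the frame buffer (as claimed for the GFFX
           * above) is invisible to this kind of capture. */
          #include <GL/gl.h>
          #include <stdio.h>
          #include <stdlib.h>

          /* Dump the current back buffer to a raw RGB file. */
          void capture_framebuffer(const char *path, int width, int height)
          {
              unsigned char *pixels = malloc((size_t)width * height * 3);
              FILE *f;

              if (pixels == NULL)
                  return;

              glPixelStorei(GL_PACK_ALIGNMENT, 1); /* tightly packed rows */
              glReadBuffer(GL_BACK);               /* pre-scanout contents */
              glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE,
                           pixels);

              f = fopen(path, "wb");
              if (f != NULL) {
                  fwrite(pixels, 3, (size_t)width * height, f);
                  fclose(f);
              }
              free(pixels);
          }
          ```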
          "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
          Ben Kenobi: "That means I'm doing something right. "



          • #35
            I have an ATI 9700 Pro. Good to see that I get all this performance out of one card where the equivalent Nvidia card requires two.

            BTW, the ATI card does require a power input. I also have a second HDD. To date, my original Dell power supply has been sufficient to handle both.



            • #36
              I have an ATI 9700 Pro. Good to see that I get all this performance out of one card where the equivalent Nvidia card requires two.
              I think you misunderstand. They're both single cards; Nvidia's takes up an additional slot because the cooling unit on the Ultra version is so large.
              "The issue is there are still many people out there that use religion as a crutch for bigotry and hate. Like Ben."
              Ben Kenobi: "That means I'm doing something right. "

