Nvidia GPUs

Buy or hold off?

  • Wait until reviewers have thoroughly tested and posted results?
    Votes: 3 (42.9%)
  • Be the early adopter and pre-order?
    Votes: 1 (14.3%)
  • Wait until AMD has something to offer?
    Votes: 0 (0.0%)
  • Keep what you have and wait for next gen?
    Votes: 3 (42.9%)

  Total voters: 7

Nukem

Are you going to purchase a new 30 series GPU this time around?

EDIT:
While there are a ton of things that will have to get verified, I saw this card and it made me start to consider it.
https://videocardz.net/colorful-geforce-rtx-3080-10gb-igame-neptune

But... I also think I am going to wait to see the release roadmap, since I read somewhere (can't remember where) that there might be a 20GB version of the 3080. And while ray tracing isn't that big a deal to me... there are (and will be) titles that will saturate VRAM, so I'd rather "future-proof" without spending 2K. 'Cause like everything else, if you think these cards are gonna go for MSRP, you're crazy. :p
 
While I am an early adopter in many ways, I am going to pass on the 3000 series for now. My 5700 XT is still fairly new, and I don't play any games that even stress my current GPU. Plus, I don't have the funds to run out and grab one when they become available, so that takes care of that.
 
None of the above. I just got an RTX 2060 (about a year ago, actually) that's gonna hold up for years to come for me. I'm a 1080p60 gamer, so maybe I'll upgrade after 3-4 more years or even more. I can just reduce settings to manage.

I think people believe RT and extremely high graphical settings are "better" differences just because they choose to see it that way, while actually they are just differences at this level. Rasterized effects vs. RT, and high-very high vs. ultimate/extreme/nightmare/etc. (depending on the game's naming) settings only look different, not better, in actual gameplay, which is different from standing right next to objects, staring at the sky, or toggling between settings to compare. Enemies won't just stand there watching. Some wise people say high-very high is for playing the game and extreme max is for screenshots. Otherwise people wouldn't still play at 1080p-2K to prioritise high frame rates over 4K in fast-paced games.

I remember once I had to go from 1080p down to 720p. At first the difference bothered me because I compared. But I played for some time and forgot about it. That was when I had a GTX 680 4GB (before the RTX 2060) and wanted a stable 60 FPS in Dark Souls 3.

But if I'm to get a new card, I'd wait for RTX 3000 reviews first. AMD/ATI cards don't work well on the TVs I use for some reason (I don't use monitors anymore). Nvidia cards were always TV-friendly for me, unlike AMD/ATI. But maybe later models improved (the last I tried was the HD 6000 series).
 
The 3090 is the Titan replacement, and since this is only the first of many cards coming out, I'm sure there will be Ti versions. My only question is: will cards be anywhere close to the MSRP that Nvidia put out?
 
With their history in naming there's no telling. At some point the xx90 card was a dual-chip card.
Price does look good for the 3070 if it practically outperforms the 2080 Ti overall. Seems too good to be true, really, so yeah, the MSRP thing...
 
With their history in naming there's no telling.
I mean, they came out and said the 3090 is the Titan halo card and the 3080 is the performance card. If there is any form of Ti card, it's going to be a 3080 Ti that will only be dropped in response to AMD.
the MSRP thing...
Nvidia has been good about MSRP; it's the AIBs and e-tailers that keep raising the prices.
Price does look good for the 3070 if it practically outperforms the 2080 Ti overall
It outspecs the 2080 Ti in every way; it should beat it across the board.
 