Bavor

What does everyone think of the upcoming Nvidia 3000 series GPUs?


20 hours ago, Bavor said:

I'm comparing the Nvidia "brand" Founders/reference edition cards to keep the price comparisons equal.

Am I wrong in remembering that AIB cards were actually cheaper than Founders Edition cards before the 2000 series? I thought that was the case. Regardless, Nvidia has won complete mind-share with almost everyone with their 3000 cards; they're fucking great at advertising and are going to retain the top two card spots. I just disagree with their practices even more now, and GPUs are hugely overpriced.

2 hours ago, Raj said:

Am I wrong in remembering that AIB cards were actually cheaper than Founders Edition cards before the 2000 series? I thought that was the case. Regardless, Nvidia has won complete mind-share with almost everyone with their 3000 cards; they're fucking great at advertising and are going to retain the top two card spots. I just disagree with their practices even more now, and GPUs are hugely overpriced.

The blower-style cards, stock-clock cards, and some of the cards with lower-end coolers were cheaper than the Founders Edition cards at release. I think many AIB partners tried to cash in on the market, and most non-blower GTX 1070 cards were in the $419.99-$499.99 range.

The plastic-shroud blower cards were $399.99-$414.99 at release. I can't find the pricing from release day, but there was a mention that the MSI Armor cards, which came with the inadequate GTX 970 cooler, were under $419.99. I thought I had a screenshot of EVGA's prices at release, but I can't find it. I did find one mention of the stock-clock version of the EVGA ACX dual-fan model being $419.99 around release time.

I think most GPUs are priced around what the market will bear. The 2000 series was priced higher for its performance and didn't sell as well as the 1000 series. I'm pretty sure the first run of the 3070 and 3080 will sell out quickly. You will see even the first run of custom-PCB 3090 series GPUs sell out quickly in the $1700-$2000 price range. People look at custom 3090s as a great performance bargain compared to the 2080 Ti cards with custom coolers.

AMD's advertising helps Nvidia. I think the issue is that a large part of AMD's advertising and marketing is actually awful. I find half of AMD's social media output to have too much trolling, childish behavior, and immaturity for a company trying to advertise to people who can afford its more profitable products. They have also hyped up their products before release with half-truths, and that leads to disappointment. Their approach doesn't help their reputation. Basically, it seems the social media side of AMD's GPU marketing is aimed at people who usually can't afford their GPUs without the financial assistance of others (teenagers).

Other parts of their marketing almost seem misleading. For example, there was the pre-release "Ryzen is for gamers" messaging, yet 1st-gen Ryzen performed worse in most games at release than the i5 and i7 CPUs available at the time. AMD marketed the Radeon VII as a gaming card, then a few months after release, when there was some disappointment, changed the marketing to position it as a GPU for content creators.

I understand that you have to appeal to a wide audience when marketing computer gaming hardware, that some humor helps sell products, and that you have to emphasize your best features, but it almost seems that some of AMD's marketing hurts them more than it helps. GamersNexus's recent video about AMD's GPU marketing covered some of the same points I've made in the past.

I want to see what Intel brings to the GPU market. I don't expect their first GPUs to be great, but I expect their second GPU series will actually be competitive.

Noctua already lists this cooler as incompatible with many X299 motherboards. This is nothing new. I almost feel this is clickbait, because you run into the exact same issue with most 5700 XT cards that have backplates on the same X299 motherboards. Didn't Noctua already come out with a different NH-D15 version with more GPU clearance?


lol worrying about CPU bottlenecks...

It's not going to be an issue at all unless you are trying to run a 360Hz monitor at 1080p or running a 6-year-old CPU at high refresh rates.

To be realistic, if you are going to be buying a 3070/3080/3090, you aren't going to be gaming at 1080p. You will be at 1440p 144Hz at least, in which case you probably also have a CPU from the last 2-3 years, all of which will be absolutely fine for that resolution and refresh rate.


>10GB VRAM

>only significantly better at 4K

oh no no no, I was right, better turn ray tracing off in BF5 or else you can't run it. Also, I'm seeing just over 370 watts of usage. I guess Nvidia's strategy of getting people to impulse buy worked; the gains are good in and of themselves, but Nvidia overhyped their own cards, mostly for 4K, where they only have 10GB of VRAM.


It's enough for 1440p 144Hz.

You don't have to care too much about VRAM.

It's fine for 4K now, but 2-3 years is a long time for games that are probably going to increase their texture sizes. Who knows what will happen with MS DirectStorage and what difference that will make.

It would be less of a gamble than the RTX 20-series, but a gamble nonetheless for 4K.

23 hours ago, Raj said:

>10GB VRAM

The 10GB VRAM buffer isn't even relevant in most games. Doom Eternal with maxed textures was the only game to exceed 8GB of VRAM used in everything I've seen benchmarked.

3 hours ago, TouchFluffyTail said:

The 10GB VRAM buffer isn't even relevant in most games. Doom Eternal with maxed textures was the only game to exceed 8GB of VRAM used in everything I've seen benchmarked.

In 1-2 years, do you think 10GB will be enough?

On 9/17/2020 at 4:58 PM, TouchFluffyTail said:

In 1-2 years, 8GB will likely remain enough for the vast majority of games at 4K.

So far, Red Dead Redemption 2 and a few games with RT on are already maxing the 10GB out at 4K.


Windows and other programs report VRAM allocation, not use. Just because it says a game is using 10+GB of VRAM doesn't mean it is.

Not to mention that the 3080 is still largely incapable of 4K60 DXR without using DLSS, which doesn't render at 4K anyway.

On 9/20/2020 at 12:51 AM, Raj said:

So far, Red Dead Redemption 2 and a few games with RT on are already maxing the 10GB out at 4K.

Utilities such as MSI Afterburner and Windows Task Manager show VRAM reserved, not VRAM in use. When you actually measure the VRAM in use, you see it's a lot less than the VRAM requested/reserved.
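For anyone who wants to poke at those numbers themselves, here's a minimal sketch using NVIDIA's NVML bindings (the pynvml Python package; whether Afterburner uses this exact interface internally is my assumption). The point is that NVML's "used" counter is memory allocated on the device, not memory the game is actively touching:

```python
# Minimal sketch: reading VRAM counters via NVIDIA's NVML
# (pip install pynvml). "used" here means *allocated* on the
# device, not actively-touched memory, so it overstates what
# a game really needs, just like overlay/OSD readouts do.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)    # first GPU, assumed
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)     # total / free / used, in bytes
print(f"Total VRAM:     {mem.total / 2**30:.1f} GiB")
print(f"Allocated VRAM: {mem.used / 2**30:.1f} GiB")  # what overlays report
pynvml.nvmlShutdown()
```

The takeaway: a game that requests a big pool up front will push that counter toward the cap long before it actually needs that much.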

On 9/16/2020 at 5:48 PM, Raj said:

Also, I'm seeing just over 370 watts of usage.

The reference-design 6800 XT uses 280-325 watts during gaming, based on the reviews I've seen. That doesn't look like a huge difference, especially when you look at its power consumption in newer games, where the 6800 XT is in the ~320 watt range. So it's 50 watts less than the 3080. Considering GDDR6X also uses more power, that's not a huge difference.

The aftermarket 6800 XT models are using 350-385+ watts for 2 to 4 FPS more on average than the reference 6800 XT card at various resolutions, and still performing worse than the RTX 3080 Founders Edition card in most games.

It doesn't look like the power consumption is that bad now.


I've undervolted my 3080 to use 300W and lose no perf. I generally sit in the 200W range.
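A proper undervolt (fixed clocks at lower voltage) is set on the voltage/frequency curve in something like MSI Afterburner, which I can't show in a snippet. As a rough illustration of the blunter alternative, here's a sketch that just caps board power through NVML's power-limit API; the 300W target and device index are assumptions, and lowering the limit needs admin rights:

```python
# Rough sketch: capping board power via NVML instead of undervolting.
# This tells the driver to throttle at a power ceiling rather than
# running lower voltage at the same clocks, so it's less efficient
# than a real undervolt. Requires admin/root. 300 W is an assumption.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, assumed

# The card reports the range it will accept, in milliwatts.
lo, hi = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = max(lo, min(hi, 300_000))          # clamp 300 W into range

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W")

# Current draw can be polled the same way (also in milliwatts).
print(f"Drawing {pynvml.nvmlDeviceGetPowerUsage(handle) / 1000:.0f} W now")
pynvml.nvmlShutdown()
```

It won't match a real undervolt's perf-per-watt, but it shows why a 3080 can sit well under its stock limit.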


I'm assembling a space heater for the winter.


