Bavor

What does everyone think of the upcoming Nvidia 3000 series GPUs?

Recommended Posts

Apparently they throw off a lot of heat. EVGA already confirmed that all of their hybrid cards will have a 240mm radiator minimum, and the Kingpin card will come with a 360mm radiator. The 2080 Ti Kingpin came with a 240mm radiator, and the rest of the 2000 series hybrids had a 120mm radiator.


I'll believe the performance improvements they claim when I see them via benchmarks from reputable tech sites and YouTube channels. That 3070 seems like a steal though if their claims do hold true.

58 minutes ago, Spinee said:

I'll believe the performance improvements they claim when I see them via benchmarks from reputable tech sites and YouTube channels. That 3070 seems like a steal though if their claims do hold true.

The 3070 has more cores than the 2080 Ti and faster memory. If it's not faster than a 2080 Ti, many people will wonder WTF happened. More cores and faster memory making it slower would be really odd.


10,000 CUDA cores. Potentially 2x the speed of a 2080 Ti.

Let the price gouging begin.

IF you can find one.

That being said, I'm going to wait until 4K gaming monitors come standard with HDMI 2.1.

3 hours ago, crapcannon said:

10,000 CUDA cores. Potentially 2x the speed of a 2080 Ti.


That being said, I'm going to wait until 4K gaming monitors come standard with HDMI 2.1.

Apparently the way they count CUDA cores is different so it's not an apples to apples comparison.

At this point, it's TVs rather than gaming monitors that are pushing the latest display tech (OLED/HDR) and they are starting to get adaptive sync as well. The LG CX TVs are "It" right now: 4K, 120Hz, OLED, HDR, Freesync... and starting @ $1500 for 48". Plus you would need a 3090 to really get the most out of it.  That's still too much for even most enthusiast gamers.

I think the 5000 generation will be when 4K 120/144Hz really becomes affordable, so the 3070/3080 looks like the sweet spot if you don't need the absolute cutting edge. I could see a long, prosperous lifetime @ 1440p/144Hz or 4K/60Hz (same way the 970 was golden for 1080p peeps).

I'm planning on finally making the upgrade from a 970 @ 1200p/60Hz to 1440p/144Hz. Still on the fence about 3070 vs 3080, but my plan would be to skip the 4000 generation and then upgrade again in ~4 years to the 5000 series + 4K.


My plan is a 3080 for my 1440p/144Hz monitor, potentially a monitor upgrade in two years, and then skipping a generation of GPUs.

Also, a 3080 Ti this generation is extremely likely, just to shit on AMD, because it's Nvidia.


I heard each core's ALU has double the FP32 throughput, so Nvidia just multiplied the real CUDA core count by 2; divide by 2 to get the old-style count. This makes sense as to why all the cards have seemingly ridiculous amounts of CUDA cores: the core counts aren't equivalent.
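A rough way to put the counts side by side (a sketch in Python; the Ampere numbers are the announced shader counts, and halving them is only an approximation, since the second datapath is shared FP32/INT32):

```python
# Announced Ampere shader counts vs. a crude "Turing-equivalent" count.
# Halving is approximate: Ampere's second datapath also handles INT32 work,
# so the true apples-to-apples figure sits somewhere between the two.
ampere = {"RTX 3070": 5888, "RTX 3080": 8704, "RTX 3090": 10496}
TURING_2080_TI = 4352

for card, cores in ampere.items():
    print(f"{card}: {cores} listed -> ~{cores // 2} Turing-style "
          f"(2080 Ti: {TURING_2080_TI})")
```

By that crude measure the 3070 actually has fewer "old-style" cores than a 2080 Ti, which is exactly why the listed counts can't be compared directly.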

They're still gonna be fast as fuck, but I'll believe the claimed performance when I see it. Saying "oh, it's X% faster" when your benchmarks use ray tracing and DLSS is bullshit, because the new cards have faster and more plentiful RT and Tensor cores, so it's a meaningless comparison.

 


I'm happy with my GeForce GTX 1660 and I don't have any plans to replace it for the next three years.

On 9/3/2020 at 5:27 AM, Deus__Ex__Machina said:

lmao anyone who bought a 2080ti for 1200$

that is all 

how come? haven't watched the video. are they pricing them cheaper than the 20 series? that cost was a major turn-off for me.

On 9/4/2020 at 4:46 PM, DrWeb7_1 said:

I'm happy with my GeForce GTX 1660 and I don't have any plans to replace it for the next three years.

Likewise with my 1070, still maxes every game I own at 1080p, and I don't have plans for any monitor upgrades in the near future either. 

22 minutes ago, Assassin7 said:

how come? haven't watched the video. are they pricing them cheaper than the 20 series? that cost was a major turn-off for me.

Likewise with my 1070, still maxes every game I own at 1080p, and I don't have plans for any monitor upgrades in the near future either. 

If Nvidia's marketing claims for the 3000 series are true, then the 3080 is double the 2080 Ti's performance at just above half the cost, and the 3070 will be about equal, if not faster, at half the price. If those claims hold, yeah, anyone that bought a 2080 Ti will probably be kicking themselves.

But while it does sound like this generation of cards will be a major step forward, I'm also sticking with a 1080p native monitor for the foreseeable future, so I expect my 1660 to continue to be more than adequate.

18 minutes ago, Balthazars said:

If Nvidia's marketing claims for the 3000 series are true, then the 3080 is double the 2080 Ti's performance at just above half the cost, and the 3070 will be about equal, if not faster, at half the price. If those claims hold, yeah, anyone that bought a 2080 Ti will probably be kicking themselves.

But while it does sound like this generation of cards will be a major step forward, I'm also sticking with a 1080p native monitor for the foreseeable future, so I expect my 1660 to continue to be more than adequate.

People forget that the 2080 Ti was absurdly overpriced in the first place. The 980 Ti was $649 at launch and the 1080 Ti was $699. This is finally bringing prices back down to where they should be for that performance level.

Now people think it's a fantastic deal, even though it's what it should be.

1 hour ago, Tman450 said:

Now people think it's a fantastic deal, even though it's what it should be.

doesn't make it any less fun to watch the 2080 Ti buyers cry

@Assassin7 like Balth mentioned, the 3070 is supposed to basically be the same as a 2080 Ti, if not slightly better, and it costs $500 US, while the 3080 is supposedly about 40% better than the 2080 Ti while being priced at $700 US

so basically people who bought the 2080 Ti @ $1,200 US are seething atm because their cards just got devalued by more than half. go look up 2080 Tis on eBay or similar sites atm, it's hilarious

12 hours ago, Assassin7 said:

Likewise with my 1070, still maxes every game I own at 1080p, and I don't have plans for any monitor upgrades in the near future either. 

When it comes to new-generation hardware, I prefer a "tier+1" upgrade scheme. For example, I used a GTX 1050 from 2017 till 2020; since 2020 I've had the GTX 1660 I already mentioned.

When a hypothetical 1700 series launches, the upgrade would be GTX 1660 => GTX 1770.

 

I'm afraid my explanation was a bit unclear, since English is not my native language.

4 hours ago, DrWeb7_1 said:

When it comes to new-generation hardware, I prefer a "tier+1" upgrade scheme. For example, I used a GTX 1050 from 2017 till 2020; since 2020 I've had the GTX 1660 I already mentioned.

When a hypothetical 1700 series launches, the upgrade would be GTX 1660 => GTX 1770.


You're good on the English, mate :P

I won't see a need to upgrade until my PC feels like it's not playing new games maxed out at the resolution and FPS I want, which is currently 1080p/60Hz.

I built my rig in early 2017, so basically just before Intel's incremental upgrades (due to being so far ahead of AMD) ended, lol. I'll upgrade it maybe next year or the year after; we'll see what games come out. I'll probably upgrade with the Nvidia 4000 series, tbh.

I went 660 > 970 > 1070 for graphics cards, though the first two were bought as a broke student, lol. I got lucky with Twitch donations to afford the 970.

28 minutes ago, Assassin7 said:

I built my rig in early 2017, so basically just before Intel's incremental upgrades (due to being so far ahead of AMD) ended, lol

My rig was built in mid-2017 on the LGA1150 platform because I wasn't happy with LGA1151 at the time and wasn't interested in AMD either. Also, I'm using Windows 8.1 because Windows 10 is not an option for me due to really bad behavior on all of my hardware (builds newer than 1507 produced a ton of errors), and Windows 7 is a case of "we disabled Aero because that application sucks."

Assuming that AM4 boards with B450/X470 chipsets support Windows 7, reusing those drivers in Windows 8.1 won't be a problem for me. Also, 8.1 has native NVMe support, so there will be no hassle with driver embedding.


Expensive, and I doubt their performance-gain claims; they're probably literal lies. I'm guessing we get a 3070 Super competitor from AMD this year for $150 less, with a reasonable amount of VRAM and with ray tracing this time. Need to see independent benchmarks.

Weird how people think the 3080 and 3070 are any cheaper; they're just as overpriced as the Turing shit, and the Super cards are going to be ridiculous. The 3070 should be like $300-350.

On 9/2/2020 at 5:30 PM, dustygator said:

At this point, it's TVs rather than gaming monitors that are pushing the latest display tech (OLED/HDR) and they are starting to get adaptive sync as well. The LG CX TVs are "It" right now: 4K, 120Hz, OLED, HDR, Freesync... and starting @ $1500 for 48". Plus you would need a 3090 to really get the most out of it.  That's still too much for even most enthusiast gamers.

I have one on my desk; I finally cracked and drank the OLED Kool-Aid. WoT runs maxed at 4K even with a 1080 Ti, so the 3080 should be glorious overkill. Seven more days...

(The game is claiming 95 FPS, but I'm assuming the screen is only showing 60 FPS over G-Sync through the 1080 Ti's HDMI 2.0 connector.)
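The 60 Hz cap checks out against the link bandwidth, too. A quick sketch (raw pixel rate only, 8-bit RGB assumed, ignoring blanking overhead):

```python
# Raw pixel data rate in Gbps (ignores blanking, so real links need headroom).
def gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

print(gbps(3840, 2160, 60))   # ~11.9 Gbps: fits HDMI 2.0 (~14.4 Gbps usable)
print(gbps(3840, 2160, 120))  # ~23.9 Gbps: needs HDMI 2.1 (~42 Gbps usable)
```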

13 hours ago, Raj said:

Expensive, and I doubt their performance-gain claims; they're probably literal lies. I'm guessing we get a 3070 Super competitor from AMD this year for $150 less, with a reasonable amount of VRAM and with ray tracing this time. Need to see independent benchmarks.

Weird how people think the 3080 and 3070 are any cheaper; they're just as overpriced as the Turing shit, and the Super cards are going to be ridiculous. The 3070 should be like $300-350.

 

That's a bold statement about "lies." Nvidia has the potential for actual competition this time around, so I doubt they exaggerated the performance by much. You really expect a lot for the price. The leaked benchmarks so far show very little exaggeration about performance.

You think 2080 Ti performance (the 3070) should be $300-$350? That would be the PC performance deal of the century.

I doubt that AMD is going to be able to offer anything with similar performance for much more than $50 less than Nvidia.

On 9/2/2020 at 8:30 PM, dustygator said:

Apparently the way they count CUDA cores is different so it's not an apples to apples comparison.

At this point, it's TVs rather than gaming monitors that are pushing the latest display tech (OLED/HDR) and they are starting to get adaptive sync as well. The LG CX TVs are "It" right now: 4K, 120Hz, OLED, HDR, Freesync... and starting @ $1500 for 48". Plus you would need a 3090 to really get the most out of it.  That's still too much for even most enthusiast gamers.

I think the 5000 generation will be when 4K 120/144Hz really becomes affordable, so the 3070/3080 looks like the sweet spot if you don't need the absolute cutting edge. I could see a long, prosperous lifetime @ 1440p/144Hz or 4K/60Hz (same way the 970 was golden for 1080p peeps).

I'm planning on finally making the upgrade from a 970 @ 1200p/60Hz to 1440p/144Hz. Still on the fence about 3070 vs 3080, but my plan would be to skip the 4000 generation and then upgrade again in ~4 years to the 5000 series + 4K.

 

If I remember correctly from reading the documentation the day of the video, Nvidia optimized the CUDA cores for two different tasks, so you can't compare core counts directly. Half the cores are optimized for one type of task and the other half are optimized for another.

A 2080 Ti still struggles with many newer games at max settings at 3440x1440 100+ Hz and at 4K, so there is still room for improvement. I think that's where DLSS will help.

The volume of sales probably helps make TV tech cheaper than high-end monitors. We just need the high-end TVs and monitors to all support the latest HDMI and DisplayPort versions to make the tech usable.

3 hours ago, Necrophore said:

I have one on my desk; I finally cracked and drank the OLED Kool-Aid. WoT runs maxed at 4K even with a 1080 Ti, so the 3080 should be glorious overkill. Seven more days...

(The game is claiming 95 FPS, but I'm assuming the screen is only showing 60 FPS over G-Sync through the 1080 Ti's HDMI 2.0 connector.)

WoT is still not very demanding on graphics hardware. It's playable at 4K 60 FPS with a 1050 Ti on the medium or high preset. I was getting 180-200 FPS at 4K with the Ultra preset with my current cards.

What has me excited about the 3000 series is newer games, plus the reduction in video encoding and 3D rendering times.

1 hour ago, Bavor said:

 

That's a bold statement about "lies." Nvidia has the potential for actual competition this time around, so I doubt they exaggerated the performance by much. You really expect a lot for the price. The leaked benchmarks so far show very little exaggeration about performance.

You think 2080 Ti performance (the 3070) should be $300-$350? That would be the PC performance deal of the century.

I doubt that AMD is going to be able to offer anything with similar performance for much more than $50 less than Nvidia.

 

 

t. nvidia shill

yes, I do think an Nvidia 70-series card should be $300-350. You don't fucking pay extra for each performance increase every 2-3 years; that's to be expected in the new card. Yes, I understand inflation is a thing, so maybe $400 at the most. Nvidia literally shills this idea of getting better performance for $100 or $200 less, when the cards overall are going up in price each generation. I'm saying an 86% increase in performance from the 2080 Ti to the 3080 is extremely suspect when their best generational leap, 980 Ti -> 1080 Ti, was a lot less than 86%, more like 35%, which is normal.

Why would you believe that, with the same technology, at the same native resolution with DLSS off, the gain would be anywhere near 86%? Perhaps DLSS 2.0 with RT may account for much better performance, but I'm severely doubting an 86% increase from the lower-tier card. The RX 5700 XT was selling for $350 for a long-ass time and is now like $400, and it's on average 2% slower in games, while the 2070 Super costs $500. So explain to me, why is it a bold statement to doubt something that's extremely fishy?
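Just to put numbers on the value argument, a quick perf-per-dollar sketch using my rough figures above (relative game performance normalized to the 2070 Super; prices are approximate street prices):

```python
# Perf-per-dollar from the rough numbers above (both values approximate).
cards = {
    "RX 5700 XT":     (0.98, 400),  # ~2% slower, ~$400 street
    "RTX 2070 Super": (1.00, 500),  # baseline, ~$500
}
for name, (rel_perf, price) in cards.items():
    print(f"{name}: {rel_perf / price * 100:.3f} perf per $100")
# -> the 5700 XT comes out ~22% ahead on perf-per-dollar at these prices
```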

 

There is one possibility I'm going to entertain: the 3080 may actually be insanely good at 4K for some reason. But I don't see myself going for 4K when I can have a higher-FPS 1440p experience.

It wouldn't surprise me if the 3000 series cards get insane gains over Turing in synthetic benchmarks. But the real test is always games. And actually, I'm slightly worried that CPUs aren't good enough to keep up with GPUs this generation specifically; a high-end Ryzen 4000 CPU might be necessary for some games not to be bottlenecked. That is, for anything below 4K.


Jensen called the RTX 3080 the flagship. Also, Nvidia has been adding more lower-end models than they had in previous generations, which basically moved the product stack up in Nvidia's lineup, especially with the addition of the Ti and Super cards at the lower end.

If Nvidia considers the RTX 3080 the flagship, then it's the card that replaces the 2080 Ti and is the equivalent of the 1080 Ti in the 1000 series and the 980 Ti in the 900 series. So the 3060 series will be the replacement for the equivalent 2070, 1070, and 970.

There is another way to look at it. I bought an Nvidia GTX 1070 new back when they were released; MSRP was $449.99 at the time. I had a Best Buy coupon and rewards points, so I ended up getting $50 off, but that's irrelevant. Considering the price of the 3070 only went up $50 from the release price of the 1070, that's not bad. Looking at the online inflation calculators, the $450 GTX 1070 price on release date is the equivalent of $490-$500 now. The 3070's price seems reasonable when you consider inflation.

Eventually the GTX 1070's MSRP decreased to $380 with the release of the GTX 1070 Ti. $380 is about $415-$425 now when you account for inflation, and the $350 price you want for the RTX 3070 is the equivalent of about $320 on the release date of the GTX 1070.
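For anyone who wants to sanity-check the inflation math, here's the back-of-the-envelope version (the ~10% cumulative figure for 2016-2020 is my rough read of the online CPI calculators):

```python
# Rough CPI adjustment, 2016 -> 2020 (~10% cumulative; approximate).
INFLATION_2016_TO_2020 = 1.10

print(f"{450 * INFLATION_2016_TO_2020:.0f}")  # ~$495: 1070 launch price in 2020 dollars
print(f"{380 * INFLATION_2016_TO_2020:.0f}")  # ~$418: post-1070 Ti MSRP in 2020 dollars
print(f"{350 / INFLATION_2016_TO_2020:.0f}")  # ~$318: a $350 card today, in 2016 dollars
```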

I don't think you understood the chart Nvidia showed during the presentation, if that's where you got the 86% number. It was based on performance relative to the GTX 980 in 4K gaming across multiple games. I don't know how you get 86% from that chart or any other official info released by Nvidia. The chart put the 2080 Ti and the 3070 at about 3.25 times the 980's FPS at 4K, and the 3080 at about 4.5 times.

[Nvidia slide: relative 4K performance vs. the GTX 980]
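Doing the arithmetic on that chart (the multipliers are eyeballed from the slide, so treat them as approximate):

```python
# Relative 4K FPS vs. a GTX 980, read off Nvidia's slide (approximate).
perf_vs_980 = {"2080 Ti": 3.25, "3070": 3.25, "3080": 4.5}

gain = perf_vs_980["3080"] / perf_vs_980["2080 Ti"] - 1
print(f"Implied 3080 gain over the 2080 Ti: ~{gain:.0%}")  # ~38%, not 86%
```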

If you are referring to the "leaked" Geekbench results, where the 3080 is 86% faster than the 2080 in some tests, that has nothing to do with gaming performance. If Nvidia allows all the CUDA cores to operate at the same time on different instruction types, then it's possible that in certain math calculations the 3080 is 86% faster than a 2080, because it can run more calculations per clock cycle.
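The paper math backs that up. A sketch using the published shader counts and reference boost clocks, with the usual cores × 2 ops (FMA) × clock formula for peak FP32 (a theoretical ceiling, not game performance):

```python
# Peak FP32 throughput: cores x 2 ops per clock (FMA) x boost clock in GHz.
def tflops(cores, boost_ghz):
    return cores * 2 * boost_ghz / 1000

rtx_3080 = tflops(8704, 1.71)  # ~29.8 TFLOPS
rtx_2080 = tflops(2944, 1.71)  # ~10.1 TFLOPS

# ~3.0x on paper, so an 86% win in a compute benchmark is unsurprising
# and says little about FPS in games.
print(f"{rtx_3080 / rtx_2080:.1f}x")
```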

Is the 5700 XT even worth $350, or its $399 launch price, considering it took AMD 6+ months to get properly working drivers for many people? Is 6 months of black screens, crashes, and games not running properly worth the savings? Also, the launch price reduction AMD did with the 5700 XT may have been a gamble to win new customers over to AMD; basically, it sounds like a marketing ploy. AMD may also have had to reduce the price a second time, to the $350-$380 range, to get people to buy the 5700 XT after all the problems customers had with it at release. Every hardware forum, Facebook group, and subreddit was full of people having issues with the cards, and AMD officially posted on Reddit asking for feedback from customers having issues with the 5000 series GPUs so they could resolve the widespread problems.

You can find non-Founders Edition 2070 Super cards from EVGA and MSI for $449.99 and $459.99 on sale on a regular basis. It almost sounds worth it not to have to deal with the possibility of those widespread driver issues.

I wouldn't worry about a CPU bottleneck unless you have a low-end CPU. Even with a 2080 Ti, the difference between a 3600X and a 3800X locked at the same clock speed is minimal outside of a few games. Game engine developers are only recently starting to adapt their engines to use more than 6 or 8 threads. There are games where an overclocked 9900K or 10900K still shows a significant improvement at 1440p over a 3800X or 3950X, but those aren't the majority.

I actually hope AMD can make a competitive higher-end GPU this time around so we can finally get some competition back into the market. I think Nvidia left that wide-open space above the 3080 for a 3080 Ti or Super card in case AMD has something competitive to release.


I'm comparing the Nvidia-brand Founders/reference edition cards for price comparisons to keep things equal. There are too many differences in coolers and other hardware across the various manufacturers' models to compare them equally.

No hardware manufacturers or vendors had a $379.99 card available at launch. The $379.99 cards weren't available until months later; they may have been advertised, but nobody had them in stock. Using the lowest-priced models also brings up another issue: people could argue the semantics of lower-priced models with inadequate cooling, such as some of the cheaper MSI models that overheated because they used a cooler originally designed for a lower-wattage GPU.

You need to actually fact-check Wikipedia and not assume everything there is correct. Just because someone entered information incorrectly on Wikipedia doesn't make it a fact.

[Screenshot: Ars Technica article listing GTX 1080/1070 launch pricing]

https://arstechnica.com/gadgets/2016/05/nvidia-gtx-1080-1070-pascal-specs-pricing-revealed/

 

 

 

