Shade421

GTX 1080 As tested by an actual owner...


So, as some of you know I have been rocking AMD's unholy abomination, the R9 295X2 for about a year now. I loved it. I'm actually considering framing it and hanging it on my wall as a shrine to my favorite video card of all time. I in fact bought a 980 at launch and sold it to pick up the 295. But, Nvidia has hyped their new GPU launch sufficiently for me to bite and order a GTX 1080 Founder's Edition. Why I paid the premium for the reference card will become clear later.

First off, some system info:

Spoiler

i7-4790K @ 4.8GHz

16GB AMD Gamer Series 2133MHz DDR3

ASUS ROG Maximus VII Formula

PNY Nvidia GeForce GTX 1080 Founder's Edition

Intel 730 Series 240GB SSD

Some 3TB HDD I can't remember

Windows 10 64-bit

Cooler Master Silent Pro Gold 1000 Watt PSU

Acer XR341CK primary display

Samsung U28D590D secondary

Fractal Design Define S Mid Tower

Front mounted 360mm rad, 3x Corsair SP120 fans

Top mounted 240mm rad, 2x Corsair SP120

Rear mounted 140mm rad, 1x Corsair SP140

Thermaltake Pacific PR22-D5 pump and res combo

Koolance CPU block

EK GPU block

 

Spoiler

mPEECFH.jpg

So, anyone that has done custom loop water cooling already knows why I spent the $100 premium to get a reference design. Water blocks are typically only made for reference PCBs. If you are not going to water cool, DO NOT get the Founder's Edition. This is the only logical reason to buy one.

It's not a small card, but it's significantly smaller than the 295X2. Here's a comparison:

Spoiler

mUmmEj0.jpg

It will fit in mid towers no problem. Anything smaller and you'll have to do some research, but that's the price you pay for small form factor building.

OK, so baseline testing. I am strapped for time at the moment, but will be diving into a bunch of in-depth testing tomorrow since I have the day off. For now, I ran 3DMark Fire Strike (all three versions) using default settings and a quick overclock I got stable using MSI Afterburner, the same way I OC'd the 295.

My 295X2 was not a particularly good overclocker, but it wasn't bad either. I was able to run it stable all the time at 1089MHz core and 1470MHz memory on both GPUs. Max temp was stable at 73C; the thermal throttle threshold was 75C.

The 1080 I got seems to be a pretty good overclocker. So far I've gotten it stable at 2126MHz core, 5508MHz memory. The core clock is the max reported boost. GPU-Z reports this incorrectly for some reason, saying my max boost is 1959, but Afterburner, 3DMark, and Unigine Valley all show it pegged at 2126 under full load. From the reviews I've seen, that's a pretty decent speed to get out of a Founder's Edition. This is all with a 10% power limit increase. I will refine this and probably see if I can't get over 2150 on lower power limits when I have more time. The thermal throttle threshold for the card is 83C, but on a custom loop that kept the 295 under 75C, this thing hasn't gotten above 42C, and is usually under 40. There's some headroom here to play with for sure.
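For anyone who wants to sanity-check the reported boost clock independently of Afterburner or GPU-Z, here's a minimal sketch using NVIDIA's NVML Python bindings (pynvml). This wasn't part of the original testing; the polling interval and run length are just assumptions, the idea is simply to log clocks and temperature while a benchmark loops and report the peaks.

```python
# Rough sketch: poll GPU core clock and temperature once per second while a
# benchmark runs, then print the maximums observed. Requires the pynvml
# package (nvidia-ml-py); the 5-minute window is an arbitrary assumption.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

max_core = 0
max_temp = 0
try:
    for _ in range(300):  # ~5 minutes, long enough for a Fire Strike run
        core = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        max_core = max(max_core, core)
        max_temp = max(max_temp, temp)
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()

print(f"Max observed core clock: {max_core} MHz, max temp: {max_temp} C")
```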

So initial results compared between the 2 cards:

Spoiler

1PrbMXj.png

oTYoyZC.png

uHDfu5c.png

A big note to keep in mind here: the 295X2 scores here are the highest I was ever able to achieve with that card. Running for pure score, it was a struggle just to get it to complete the tests without crashing. It was artifacting all over the place, the screen was flickering, and the temp was creeping up to 74C, only avoiding throttling because the tests ended first. I was NEVER able to actually game at these settings. All this while, no shit, noticeably raising the temperature in my house (not my room, the whole house), and drawing something like 700W just for the GPUs.

The 1080 breezed through these tests with not a single flicker, artifact, stutter, driver crash, etc. And not once did it break 40C, all while drawing <200W. And this is a single GPU, which means far better stability and no dicking around with XFire/SLI profiles.

Spoiler

Trash pile after installing the waterblock:

DbXH8Lr.jpg

So, IMO it's very much worth it so far. If anyone has any specific questions or benchmarks they want to see, fire away. I'll be diving into this head first tomorrow morning.

@BlackAdder the HEVC media file is downloading, I'll get that test to you as well.


Thanks for this, might reconsider the founders since I'll be upgrading both case and cooling system (to watercooling). 

Will have to wait for AMD releases still since they're talking a VERY big game for that kind of price range, even with SLI profiles.

9 minutes ago, Folterknecht said:

EK and others also make blocks for custom cards

Yes they do, but which ones they'll cover is hit and miss, and none of them are out yet (or weren't when I ordered mine, anyway). Plus I didn't want to rely on the availability of a certain custom card and a block for it when I add a second one, which I almost certainly will sometime in the next few months.


Assuming that two XFire 480s are approximately equivalent to a 1080 (yes, I'm aware that's a big if), would you say the price difference overrides the downsides of using a dual GPU setup?


1 minute ago, thegeek2 said:

 

Assuming that two XFire 480s are approximately equivalent to a 1080 (yes, I'm aware that's a big if), would you say the price difference overrides the downsides of using a dual GPU setup?

Ashes is an "AMD tech demo" ... expect 2x 480s to land more in the area of a single 1070, if everything works correctly. And that's beside the SLI/CF limitations and NV's response in the form of the coming 1060.

3 minutes ago, thegeek2 said:

Assuming that two XFire 480s are approximately equivalent to a 1080 (yes, I'm aware that's a big if), would you say the price difference overrides the downsides of using a dual GPU setup?

 

You have some other issues:

  • additional heat (two sandwiched cards produce a lot of heat)
  • additional power (double the draw)
  • driver/application optimizations (from what I heard, WoT is poor in SLI/CF setups)
  • additional problems (overheating/no support/crashing/stutter)

When you consider everything, I think it's a better deal to buy the overpriced single card; at least then the price is the only downside...

 

1 minute ago, thegeek2 said:

Assuming that two XFire 480s are approximately equivalent to a 1080 (yes, I'm aware that's a big if), would you say the price difference overrides the downsides of using a dual GPU setup?

If it is equivalent, it's a no-brainer; get a 1080. If it's better by a decent margin, then it's much more of a judgment call. If you're an experienced user and know how to dig into the backdoor driver settings and registry entries to get Crossfire/SLI working in some titles, then it's up to you to decide whether the performance is worth the headache. I loved my 295X2, but I also almost never recommended it to anyone looking for a powerhouse video card; the tweaking to get it working right in some games (WoWS in particular comes to mind) was just not something the average user is going to want or be able to do. If the performance is just equal, then the single GPU is going to be much better.

2 hours ago, Kolni said:

Thanks for this, might reconsider the founders since I'll be upgrading both case and cooling system (to watercooling). 

Will have to wait for AMD releases still since they're talking a VERY big game for that kind of price range, even with SLI profiles.

Wait for Vega as they're releasing HBM2 cards, and then decide.

 

Also, nice review there of the 1080. I'm honestly more interested in real-time FPS in games than in pure benchmarks. I don't really know how you'd compare that right now since you've already switched your cards out, drivers and all...


First: I'm jealous as all fuck of you.

Second: I would love to see a comparison of what you get out of actual game settings on the 295X2 and the highest stable OC on the 1080 that you could properly game on. Or a comparison at reference speeds.

9 minutes ago, Siimcy said:

Wait for Vega as they're releasing HBM2 cards, and then decide.

 

Also, nice review there of the 1080. I'm honestly more interested in real-time FPS in games than in pure benchmarks. I don't really know how you'd compare that right now since you've already switched your cards out, drivers and all...

 

3 minutes ago, Assassin7 said:

First: I'm jealous as all fuck of you.

Second: I would love to see a comparison of what you get out of actual game settings on the 295X2 and the highest stable OC on the 1080 that you could properly game on. Or a comparison at reference speeds.

I'll be doing this tomorrow, mostly in Witcher 3 for the comparison, since I have a few Fraps benchmark runs on the 295 saved in that game. As for stock speeds, I never saved any stock-speed results on the 295, but the benchmarks others published when it was released are pretty representative of stock performance. Not much variation between individual cards until you OC.

I'll run a few Fraps benchmarks on the 1080 in Witcher 3, TW:WH, and WoWS once I get the OC dialed in better. I'm not home now, so it won't be until tomorrow morning.


I guess this means I need to pick one up when my favorite vendor has a spiffy one available.

13 minutes ago, TaylorSwift said:

Thoughts on losing freesync due to different GPU brand?

It was completely unnoticeable in the benchmarks I ran before I had to leave. I didn't use it much with the 295 either. When you're maxing the refresh rate of the display anyway, just cap the FPS; no sync required. Tearing does drive me absolutely insane though, so if I start seeing it in games I may just sell the 4K secondary and get an X34.

 

4 minutes ago, PityFool said:

That's pretty impressive.

However, all that dust all over the desk and jamming up your intake/exhaust fans legit triggered me internally.

Dust is annoying but unavoidable. I have two German Shepherds; even when they aren't shedding they get dander everywhere. But the intakes are all filtered, and there are no fans on the bottom.


Someone on a buy & sell Facebook page in my city listed a 1080 for $2050 CAD... I didn't know scalping transferred over to computer components xD

2 minutes ago, Evroz621 said:

Someone on a buy & sell Facebook page in my city listed a 1080 for $2050 CAD... I didn't know scalping transferred over to computer components xD

lol not surprised. I looked at them on Amazon on launch day; they were all well over $1k.

If I followed the current market, I could sell my used 295 for over $600. Top-end PC components are kinda stupid as far as pricing trends go. Mr. Facebook Canadian selling that thing is nuts, though.

2 hours ago, Evroz621 said:

Someone on a buy & sell Facebook page in my city listed a 1080 for $2050 CAD... I didn't know scalping transferred over to computer components xD

It's because there's always this one dumbass that will buy it. :kjugh:


OK so I lied; apparently I don't have any of the Fraps benchmark runs from the 295X2 saved anymore. Oh well. Ran a few more and here are the results:

Witcher 3

2016-06-11 23:51:27 - witcher3
Frames: 4346 - Time: 60000ms - Avg: 72.433 - Min: 64 - Max: 76

2016-06-11 23:53:07 - witcher3
Frames: 4323 - Time: 60000ms - Avg: 72.050 - Min: 63 - Max: 76

This was done after I changed my graphics settings, since they reverted to defaults when I launched. Basically, I hit the Ultra preset and then turned everything up to max that wasn't already there, including the Nvidia HairWorks crap. This was from running around fighting giant centipedes, and you can see the results. Holy shit, this game is beautiful. The 295 would average about 70 FPS with some settings turned down a bit, like grass density, and with HairWorks off completely. VSync was enabled for this run.

World of Warships

2016-06-12 00:09:18 - WorldOfWarships
Frames: 4399 - Time: 60000ms - Avg: 73.317 - Min: 66 - Max: 76

Same as Witcher 3; hit the max preset and then went and turned everything up to max manually. VSync was on, hence the 76FPS cap. However, I had to turn off the OC to run it. I had the same issue with the 295 in this game; launching it with an OC enabled caused a display driver crash upon loading the port or a replay. So 73 FPS average with no overclock at max settings. Boo hoo.

The Division

2016-06-12 00:28:14 - TheDivision
Frames: 2791 - Time: 60000ms - Avg: 46.517 - Min: 39 - Max: 58

I basically hit the Ultra preset, maxed everything else manually, and ran the in-game benchmark. What, did anyone think I was actually going to play this game? Lolno. It punished the system a bit though; this is the lowest result I've gotten in anything. I had to run this game on a single GPU in borderless window mode on the 295. I gave up bothering with this game before they fixed the horrendous epilepsy-inducing flicker on Crossfire and SLI setups. It ran at 30 FPS at best.

HEVC Video Test

2016-06-11 23:56:17 - wmplayer
Frames: 4471 - Time: 60000ms - Avg: 74.517 - Min: 59 - Max: 87

For this test I used this video file from the site @BlackAdder linked in the other thread. It played perfectly. No load on the CPU at all and maxed out at 2% load on the GPU at base clock speed. It didn't even bring it out of an idle state.
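For anyone who wants to crunch these Fraps numbers themselves, the summary lines above are easy to parse. Here's a minimal sketch, assuming a plain-text log (the FRAPSLOG.txt filename is a placeholder, not from the original runs), that reads the summaries and re-derives the average from frames and time, e.g. 4346 frames / 60 s = 72.433 FPS.

```python
# Rough sketch: parse Fraps benchmark summary lines like the ones quoted above
# and sanity-check the reported average (Avg should equal Frames / seconds).
# The log filename and exact layout are assumptions for illustration.
import re

SUMMARY = re.compile(
    r"Frames:\s*(\d+)\s*-\s*Time:\s*(\d+)ms\s*-\s*Avg:\s*([\d.]+)"
    r"\s*-\s*Min:\s*(\d+)\s*-\s*Max:\s*(\d+)"
)

def parse_fraps_summaries(path):
    results = []
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = SUMMARY.search(line)
            if not m:
                continue
            frames, time_ms, avg, lo, hi = m.groups()
            seconds = int(time_ms) / 1000.0
            results.append({
                "frames": int(frames),
                "seconds": seconds,
                "avg_reported": float(avg),
                "avg_computed": round(int(frames) / seconds, 3),  # e.g. 4346 / 60 = 72.433
                "min": int(lo),
                "max": int(hi),
            })
    return results

if __name__ == "__main__":
    for run in parse_fraps_summaries("FRAPSLOG.txt"):
        print(run)
```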

 

This is the one anomaly I've encountered, and I'm at a bit of a loss to explain it:

Spoiler

8nvpfM4.png

 


Looks like I might just have to get this or the 1070 in order to play games at med-high settings when I get my new monitor on Monday... I suspect my 770 will struggle @ 3440x1440...

 

It's because there's always this one dumbass that will buy it. :kjugh:

That's the sad part. There's so much foreign money in my city, someone with 'stupid money' will pay that much for it.

11 minutes ago, Evroz621 said:

Looks like I might just have to get this or the 1070 in order to play games at med-high settings when I get my new monitor on Monday... I suspect my 770 will struggle @ 3440x1440...

It will, but all these results are on my XR341CK, same resolution as what you're getting. #UltraWideMasterRace2016. Which one are you getting?


Glad to know; I assumed you were at 1080p because of the Unigine benchmark settings. Your results, along with other people's benchmarks at ultrawide, have made it clear which GPU I will have to buy.

I'm going to wait a month or two until the aftermarket coolers are in supply, and AMD might even have a competitive card for cheaper. I hope the 490, or whatever they name it, will compete in the high end.

I bought an LG 34UC87-C on eBay a week ago for $850 CAD after shipping and duties. I did all my research and decided I'd bite the bullet and get the 34" 3440x1440 instead of the 34" 2560x1080 or 29" 2560x1080. Also, I read great things about the curvature increasing immersion, so that's another preference I ticked off with my purchase.

r/ultrawidemasterrace was a very helpful community 


I actually went from 4K to ultrawide. I still use the 4K display as a secondary, but I can never go back to 16:9 as a primary. The curve is very nice; you'll love it. The 1070 should handle most everything at medium-high settings, I would think. But the board partner 1080s are going to be cheaper than mine and perform better out of the box, so I'd suggest doing what you have to do to get one of those instead.

The Unigine settings are the highest default settings. I used that preset so it would be simple for anyone else to replicate the test on their machine and get an equal comparison.

