Price and lifespan are nowhere near the only criteria when it comes to caps. Different types have different characteristics and are suited to different purposes.
MLCCs are cheaper. The FE has MLCCs all over it. Other companies have tried using a mixture, and it is leading to issues. These issues did not exist on Turing. It's just another sign of how dodgy Samsung dies are. Or is it that Nvidia clocked the balls off them because they are so crap? Underclocking magically fixes this issue, after all. I never thought I would ever see Nvidia release a Vega.
I have the LG 38GL950G, which is also for sale in the marketplace. I think the aspect ratio is great, as you get the best of both worlds: 16:9 with black borders on either side for unsupported titles, and 21:9 for many modern games. For productivity this is also great, as there is lots of screen real estate - I think @Anfield explains it well in his post.
It is either Nvidia or the AIBs who are to blame. However, unless some AIB commits seppuku by leaking the design guideline documents they got from Nvidia, we will never know whether Nvidia provided AIBs with sufficient and accurate information to make the proper decisions.
Having had time to digest the information, I am actually going to agree with Jay for once and defend Nvidia and their partners here. If the box states a 1700-odd MHz boost clock and the card meets that criterion, then people can GTFO. People are installing tools to boost the clocks over 1900MHz and then whining because their card is crashing, when removing that boost makes the card work fine. All this will do is get people to abuse DSR and send back cards that are perfectly fine.

I've not changed my stance. I know how poor 8nm Samsung is, but if Nvidia or partners state a frequency and it's met, you can bugger off. I have never had any luck in the silicon lottery. Not ever. I bought one half-decent 670 Jetstream; the second one wouldn't even do 50MHz more than stock clocks. I never sent it back because I didn't care. Same goes for the Vega 64 I had: 20MHz would see it pink-screening. Again, though, I didn't send it back.

People do need to get real with their expectations these days, or at least understand that Ampere has been pushed pretty hard already, and hey, it is cheaper than Turing. Expecting a fantastic overclock, though? Yeah, that's not realistic. Now, if I paid £1700 for a 3080 super duper edition I would expect it, like I did with my Kingpin. But then EVGA guarantee 2150MHz on the cooler it comes with, and that is exactly what I got. And I paid for it. Beyond that it literally comes down to pure luck. Two people I know bought the cheap Gigabyte OC and both have it boosting well clear of 2GHz on air.
Thank you so much for the very detailed responses. Not sure it's the right resolution for me... I don't really want to deal with tweaking to get games working. The 1600 height is very appealing for productivity, though. I'll start a new thread when I'm ready to upgrade in the future.
This is a good res and aspect ratio. I use a lesser Dell version (the 3818?) that only does 60Hz, and everything works well on it, no scaling needed. I just use it for office and design work and have only done minimal gaming on it, as I use a larger 4K TV for that. On the productivity side of things it's great if you can't stomach a 4K/5K screen.
This is true. On the many 4K screens I have used, including Acer's 49" monitor, Windows wants to use at least 300% scaling, even though you can read the screen perfectly from 90cm away at 100%. My LG just works at 100%, so no messing around with oversized apps. I also find that although this monitor is less than 4K resolution, it seems to be the sweet spot for the 2080 Ti, running most games over 100fps.
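For anyone curious why ~1600p at 38" holds up at 100% while 4K panels beg for scaling, here's a quick back-of-the-envelope PPI check. The panel sizes below are assumed typical figures (the 38GL950G is a 37.5" panel), not measurements:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed panel specs, for illustration only:
print(f'38" 3840x1600: {ppi(3840, 1600, 37.5):.0f} PPI')  # ~111 PPI -> readable at 100%
print(f'27" 3840x2160: {ppi(3840, 2160, 27.0):.0f} PPI')  # ~163 PPI -> Windows suggests ~150%
```

~111 PPI is close enough to the old desktop baseline that 100% stays comfortable; once you're past ~160 PPI, scaling becomes hard to avoid.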
Although it does kind of lead you down a dangerous path of wanting more. I went from an old 60Hz AOC 34" 3440x1440 to a 75Hz 38" LG... Then I decided the size was nice but still too slow, so I bought a 144Hz 34" MSI, but using a smaller monitor feels incredibly wrong. So I now have the 32:9 240Hz Odyssey G9 on preorder (and I have no idea how well I will cope with the extreme curve on that, so we'll see how long it'll be until I'm demanding even more).
Apparently the Ampere Quadros use an EPS 8-pin for power instead of the new 12-pin. Which, in the context of the types of machine Quadros are normally found in, I suppose sort of makes sense... maybe...
So my gut feeling was right after all. The day after I "bought" the card, I got an email informing me there was no stock and that I could either cancel my order or wait. I chose the latter. In the meantime I was starting to think I might not see the card until late 2020, seeing as 99% of the stock went to scalpers and reviewers, but after nine days of agonizing wait the card finally arrived today! From my limited testing, aka playing Control for an hour, I haven't experienced any crashes so far.
Yeah, I've seen the reports of crashing. Here are two benchmark runs with a small 50MHz "overclock":
https://www.3dmark.com/3dm/50954230?
https://www.3dmark.com/3dm/50959799?
The core clock stayed mostly at 1900MHz with occasional spikes to 2010-2025MHz. Adding another 10MHz would crash the benchmark, though. Not sure if that's an actual issue with the card or just the silicon being pushed to its limit, since the advertised boost clock is 1815MHz. Whatever the case, the fact I had to go out of my way to make it crash makes me think the issue could be overblown. Only time will tell.
If it helps, you only need to tweak if you want to. You don't have to play everything in ultrawide. As true_gamer hinted, it's the best of both worlds: if ultrawide isn't supported, you still have a massive 16:9 display that looks and plays incredibly well. I have a 34" 1440p ultrawide and have played 150 hours of Hollow Knight in 16:9 and didn't even notice the black bars. When I did notice, I simply didn't care. The game is amazing.
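To put a rough number on how big that letterboxed 16:9 area still is, here's a quick sketch assuming a typical 34" 3440x1440 panel (the sizes are illustrative, not a claim about any specific monitor):

```python
import math

# How big is the 16:9 letterboxed area on a 34" 3440x1440 ultrawide?
full_diag_px = math.hypot(3440, 1440)        # ~3729 px across the full diagonal
ppi = full_diag_px / 34.0                    # ~110 pixels per inch

# A 16:9 game keeps the full 1440px height; width becomes 1440 * 16/9 = 2560px.
box_diag_px = math.hypot(2560, 1440)         # ~2937 px
print(f'16:9 area = {box_diag_px / ppi:.1f}" diagonal')  # ~26.8"
```

So the black-barred area is effectively a 27" 1440p monitor sitting in the middle of the screen, which is why the bars are so easy to ignore.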
It's the 970 memory debacle all over again. Tens of thousands of people bought 970s because they performed so well in reviews. Then a few weeks later PCPer dropped the "3.5GB" bomb and everyone went nuts about how terrible the card was, when the reality was that the performance hadn't changed; only people's perception of the card had. The 3080 still has beastly performance but, like with CPUs, the days of manual overclocking are near an end, as cards already overclock themselves to near enough their maximum potential anyway.
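To illustrate that last point, here's a toy model of an opportunistic boost scheme. This is NOT Nvidia's actual GPU Boost algorithm; every number below is made up for illustration. The idea is simply that the card already steps itself up until it nears a ceiling, so even a small manual offset can tip a die past its stable limit:

```python
# Toy model only - all constants are assumptions, not real card behaviour.
BIN_MHZ = 15               # boost step size (assumption)
ADVERTISED = 1710          # advertised boost clock
STABILITY_CEILING = 2010   # highest clock this particular die can actually run

def boosted_clock(power_headroom_bins: int, user_offset_mhz: int = 0) -> int:
    """Clock after the card opportunistically boosts, plus any manual offset."""
    return ADVERTISED + power_headroom_bins * BIN_MHZ + user_offset_mhz

clock = boosted_clock(power_headroom_bins=18)   # card boosts itself to 1980MHz
print(clock, "stable" if clock <= STABILITY_CEILING else "crash")   # stable

clock = boosted_clock(power_headroom_bins=18, user_offset_mhz=50)   # user adds +50
print(clock, "stable" if clock <= STABILITY_CEILING else "crash")   # 2030 -> crash
```

The card eats most of the headroom on its own, which is why a +50MHz offset that would have been trivial on older generations now lands you past the edge.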
People got used to lots of free performance. Nvidia have used most of that free performance up. If the card works at the stock settings it's shipped at, people have no right to return it.
What I have read from the, frankly, mostly ignorant YouTubers is that MLCCs are some magic ********, that it's hard to get the "more expensive" caps, and that it's totally not a lack of time to properly develop non-FE boards.

Super Smash Bros rundown time: small-value caps filter high frequencies, large-value caps filter low frequencies (rough numbers in the sketch below). So why is every freaking board on the market dual-footprinted for a gazillion small caps (at quantity, under 1 US cent a part) while everyone is tossing on rather large beefcake tants to hit the total goal? Either:

A) AIBs were not informed of the stability characteristics of the chip (which happens all the time) and were playing it safe
B) AIBs were informed of the stability characteristics of the chip and chuckled "heh heh heh, you cannot be serious at this ****"
C) foul play from Nvidia to give themselves a one-up

Look at the back of the GPU: there are about 1.3 billion 0402 pads on the back of the damn thing, with a couple of 1210 or 2512 pads occupying the same space. It's got to be option A or B. Option C is not an option, because Nvidia has been playing the same X, X Ultra, X GTS, X Ti, X Titan game for 25 years now and had no reason to suspect the general public would catch on. It's 100% a lack of time: Nvidia did it as a safety net, used it during EMI testing, then when passing the design along to the AIB partners provided the data and said "hey, YO", and the likes of Zotac (stable but crap cards) said "yeah, but I can save a nickel".
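For anyone who wants the rundown with numbers: modelling each cap as a simple series R-L-C, a bank of small MLCCs beats one big bulk cap at high frequencies because of its much lower parasitic inductance (ESL), while the big cap wins at low frequencies on sheer capacitance. All the part values below are assumed, illustrative figures, not from any actual 3080 BOM:

```python
import math

def cap_impedance(f_hz: float, c_farads: float, esr_ohms: float, esl_henries: float) -> float:
    """|Z| of one capacitor modelled as a series R-L-C (simplified)."""
    x = 2 * math.pi * f_hz * esl_henries - 1 / (2 * math.pi * f_hz * c_farads)
    return math.hypot(esr_ohms, x)

def parallel(z: float, n: int) -> float:
    """n identical caps in parallel divide the impedance by n."""
    return z / n

# Assumed part values: one 470uF polymer/tantalum bulk cap
# vs ten 22uF MLCCs sharing the same footprint area.
for f in (1e3, 1e6, 100e6):
    bulk = cap_impedance(f, 470e-6, esr_ohms=10e-3, esl_henries=3e-9)
    mlcc = parallel(cap_impedance(f, 22e-6, esr_ohms=2e-3, esl_henries=0.5e-9), 10)
    print(f"{f/1e6:>8.3f} MHz   bulk = {bulk:.4f} ohm   10x MLCC = {mlcc:.4f} ohm")
```

With these numbers the bulk cap has the lower impedance at 1kHz, but by 100MHz the MLCC bank is roughly 60x lower, because at that point impedance is dominated by ESL, not capacitance. That is the whole argument over the filtering behind the GPU in one loop.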
The only way any theory could really be proven would be to remove a die from an "offending" card and put it on one that does not have the issue. I know three guys with 3080s and each bought a "crap" one. Yet none of them has had any issues, and one got his to 2070MHz with no problems at all (a Gigabyte OC, on Jay's crap list). So is it because the power phases are supposedly crap at filtering, or is it that some dies are simply more sensitive to noise - just crap dies?

If Nvidia were not allowing them to clock so high (it's the driver, apparently, not tools artificially upping power limits etc.), then there's a very strong likelihood they would all be perfectly stable at 1710MHz (the FE boost clock) and this problem would not exist - mostly because when people overclocked the cards themselves, they would simply have put it down to a poor overclocker and not winning the lottery. Instead you have *all* cards trying to boost to well over 2GHz, and some of them are failing at it. Which, when you take everything that has happened into account (mainly Nvidia going first, not being sure what AMD have, and pushing up the clocks to look better), makes it a bit clearer.

Recently something similar happened with motherboards: they were enabling certain features as "stock", and thus certain boards looked a lot faster than others. I don't remember the exact specifics, but reviewers ended up having to go back and review all of the boards again.