Discussion in 'Article Discussion' started by Sifter3000, 13 Apr 2010.
Yes, that would be interesting, but unfortunately NEC needed its screens for an event.
Jeez... The game from this test was indeed DiRT 2. However, the in-game picture on page two is from an older article and shows GRiD.
I would also like to say that after all the 'FAIL' comments regarding nVidia's Fermi, I'm amazed that ATi would release a card this expensive with no benefit over much cheaper cards. I'd also question what's up with the results of the HD 5970, as in some tests it seemed to perform slower than an HD 5870... Is this purely down to driver support, or the lower clock rates?
Somehow, for the price of an Eyefinity setup, I can't see high-street versus internet cable prices being a problem. It's also shown me why the Fermi cards, with 512MB more memory, outperformed ATI's - something I've been wondering about ever since seeing the reviews.
I know the 5970 is slower than the 5870 in DoW because DoW hates multi-GPU setups; not sure about Crysis though...
From the article:
It’s also worth noting that the dual-GPU Radeon HD 5970 was significantly faster than either game in many games, although it refused to work in Crysis at the Eyefinity resolution of 5,760 x 1,080.
"...faster than either [game] in many games" - Surely that should read "...faster than either card in many games".
I would have liked to have seen a photo of what the windows desktop looks like with this arrangement. Does Windows 7 treat all the displays as one screen (with the taskbar across the whole bottom), or as six separate screens (task bar only on the "main" center). I could see this card being very useful in productivity environments.
One large screen. You can see the task bar stretched across the three bottom screens in the Hands On article.
lol... I was going to comment that you couldn't, but you just don't notice the taskbar at first, it looks so small. But every time I see examples of Eyefinity, and especially that AMD pic from the mentioned article, I can't help but feel it is something beautiful gone horribly wrong... like Angelina Jolie after she has been bottled in the face.
I love my Eyefinity - best money I ever spent. Is it something you need to rush out for? Ehh, not completely, but I've never had more fun, both at first and even now, playing games. I also have a slight advantage compared to most people, which has helped on numerous occasions in both racing and shooting games. I have three 24-inch Dell TNs; I got these, a DisplayPort adapter and three adjustable desk mounts for around 800 USD... Worth it for me - I enjoy it more than any other part of this PC.
I tried that with my setup - yuck! It doesn't help that the middle screen is a Dell U2410 so it's different depths on each side but, for me at least, it's better to deal with the bezels being a bit wider. After about 10 minutes or so into a game I don't notice it anyway, I just look past it.
You set it for what you want - Eyefinity mode treats it as one desktop with the taskbar across all displays but you can switch it back to a traditional multi monitor setup just by changing the profile and Win 7 treats them as separate again.
I find that having the screens flatter and a touch further away from me works better than angled either side as I can see all of the side screens in my peripheral vision. It's not ground breaking but it is nice for games that support surround. BFBC2 does an excellent job, as does AvP, others less so.
Both from page 10:
left and right screen (unless your cockney side is coming out)
being able to see
I think the possibility of a large but not too large screen (just enough to fill your peripheral vision) that wraps around you more could be nice, like having 3x3 19" monitors or something. If they were to really move in this direction, I would think the whole graphics chip could do with an overhaul. Our peripheral vision is probably much worse, and I doubt we really need high resolutions or great anti-aliasing there to give the desired effect.
Yes, sorry. Mi reading skilz iz obvously az bad az mi speling skillz. So I've edited it to something hopefully less useless.
I am one of the lucky people with such a setup at home, and I can say that while rotating to 3600 x 1920 does work, it's just much less immersive. Basically, if you want to do this, you are far better off getting a Full-HD 42" TV and using that.
In my experience, the whole point of this super-widescreen setup is that you cannot see all of it.
I can best explain it with the example of being overtaken in a racing game. On a conventional aspect ratio, the car comes onscreen from behind a bezel. On a wide enough setup, like this eyefinity one, the car just comes into view from an area you were not looking at. This creates the illusion that if you should turn your head, there would be even more to see there. A setup where you see the bezels on the sides fails to create this illusion.
If you want to, I could post a few videos to illustrate the effect and the difference between 5760x1200 and 3600x1200, but really, take my word for it.
Did I mention yet that it rocks beyond belief?
Also, I'd like to add a few comments about the DisplayPort adapter discussion. There's a lot of confusion going around, with people reporting that passive adapters work as well. They don't, at least not for 3x1 Eyefinity setups.
The reason for this is that the Cypress core only has two DVI/HDMI clock generators, so it can only output two DVI/HDMI signals. This is also the reason you cannot simply use 2x DVI + HDMI on the regular 5xxx cards. A passive adapter asks for an HDMI/DVI signal over the DP port; an active adapter asks for a DP signal and converts that to HDMI/DVI.
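To make that clock-generator constraint concrete, here's a toy Python sketch - my own illustration, not AMD's spec or driver logic - that checks a proposed three-monitor hookup against the two-TMDS-clock budget:

```python
# Toy model of the Cypress output limit described above (my own
# illustration): the GPU has only two TMDS clock generators, so at
# most two outputs can carry a DVI/HDMI-style signal. A passive DP
# adapter still consumes a TMDS clock; only a native DP monitor or
# an active adapter uses the DisplayPort path instead.

def needs_tmds_clock(connection):
    """DVI, HDMI and passive DP adapters all need a TMDS clock;
    native DisplayPort and active adapters do not."""
    return connection in ("dvi", "hdmi", "passive_dp_adapter")

def setup_works(connections, tmds_clocks=2):
    """True if the proposed hookup fits within the clock budget."""
    return sum(needs_tmds_clock(c) for c in connections) <= tmds_clocks

# 2x DVI + passive adapter wants three TMDS signals - only two clocks
print(setup_works(["dvi", "dvi", "passive_dp_adapter"]))  # False
# 2x DVI + active adapter keeps the third output on the DP path
print(setup_works(["dvi", "dvi", "active_dp_adapter"]))   # True
```

The same check explains why 2x DVI + HDMI fails on the regular 5xxx cards: that's three TMDS consumers too.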
This might be useful to add to the article, since it is this piece of understanding that allowed me to go out and get the right gear. I always like to think that Bit-Tech is not only about knowing the how, but also the why.
Hence this review showed Eyefinity in its far more useful 3x1 layout. I agree, though - 3x2 seems pretty pointless in its current form.
Given that a 30" won't provide the additional peripheral vision eyefinity provides I am slightly confused by your comment. Also remember that eyefinity isn't just for games.
What would be very useful to review in the future is whether the HD5970 shows an improvement with 2GB of memory per GPU (and clocked to HD5870 clocks). Its clear by reviews at high resolutions / settings that it has the gpu grunt but runs out of memory and stumbles. At these resolutions/settings the HD5870 runs out of grunt before it runs out of memory.
I agree with Pookeyhead (though I wouldn't have been quite so dismissive!). I can't forgive the bezels, especially for FPSs. Basically the technology has IMHO been introduced a little too early.
However I can see this being extremely useful for graphics designers/CAD users etc. I wouldn't be surprised to see a Firepro version popping up later this year, albeit for an extortionate price.
I sit about a yard away from my screen and the immersion is huge, just about filling up all my vision. Rendering extra pixels just for peripheral vision is IMO a waste of computing resources. I remember someone suggesting that ATI should implement a low resolution mode for peripheral screens - this would be a good solution for those still wanting this feature.
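Just to put rough numbers on that low-resolution-peripheral suggestion - this is my own back-of-envelope arithmetic, assuming three 1,920 x 1,080 panels, not anything ATI has announced:

```python
# Back-of-envelope: pixels saved on a 3x1 Eyefinity setup if the two
# side screens were rendered at half resolution on each axis (a
# quarter of the pixels), then upscaled by the display.
centre = 1920 * 1080          # full detail on the centre screen
side_full = 2 * 1920 * 1080   # two side screens at full resolution
side_low = 2 * 960 * 540      # same screens at quarter pixel count

full = centre + side_full
reduced = centre + side_low
saving = 1 - reduced / full
print(full, reduced, round(saving * 100))  # 6220800 3110400 50
```

So even a crude half-resolution mode on the side screens would halve the pixel count - roughly the difference between rendering at 5,760 x 1,080 and at a single 2,560 x 1,200 screen.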
I'd like to see 5 in portrait, that would be funky.
What would be useful (to me at least) would be if games supported multiple monitors so you could have different views on each one. E.g. in a racing sim, have the centre screen showing what's in front, and the two side screens showing their respective left/right views.
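For what it's worth, games that do this usually render each screen with its own camera, yawed by the screen's physical angle. A rough Python sketch of the geometry - my own numbers and function names, not any particular engine's API:

```python
import math

def side_screen_yaw(screen_width_mm, distance_mm):
    """Yaw offset (degrees) for a side screen whose centre sits one
    screen-width to the side of the viewer's axis. Pure geometry -
    a game would feed this angle into its per-screen view matrix."""
    return math.degrees(math.atan2(screen_width_mm, distance_mm))

# Assumed setup: a ~520mm-wide 24" panel, viewer ~700mm away
yaw = side_screen_yaw(520, 700)

# Three views: left, centre, right - each rendered with its own camera
views = [-yaw, 0.0, yaw]
print([round(v, 1) for v in views])  # [-36.6, 0.0, 36.6]
```

The appeal over plain Eyefinity stretching is that the side views stay in correct perspective for where the monitors physically sit, instead of one very wide frustum smeared across three panels.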
Exact same conclusion as hard. They tested a CrossFire setup too and said the extra memory helped there, as it gave enough power to enable higher AA. I guess if drivers improve and allow enough extra frames to enable AA, you might start to see a need for 2GB on a single-card Eyefinity setup.
16:9 is too skinny by default, even with just one screen
shame the 16:10 ones are so much more expensive