Discussion in 'Article Discussion' started by Tim S, 16 Mar 2007.
I'll really start to care when they get a product out. It's interesting, and good to hear about because it will force nVidia to think about responding with similar technology, but it all looks like damage control over the late launch now.
It sounds like they are still in R&D
Nah, my guess is that they wanted to wait 'til launch to drop all these awesome bombs about the tech, but were held back from launching for whatever reason, so to spoil nVidia's party they're having to release some info to keep the punters hot.
What's the audio going to be like on these?
This is awesome news. Finally some people got it right. We need dedicated video processors. PCs are starting to look more like consoles nowadays, with multiple cores and multiple specialised processors. This is the future. Old-thinking generic processing cores just don't cut it anymore with ever more demanding games and video codecs.
To play full 1080p VC-1 HD DVD right now you need at least a dual-core 2GHz CPU and a powerful graphics card with HDCP. Is anyone else seeing a pattern here? No wonder AMD suddenly realized that an independent (dedicated) video processor is the solution.
I don't believe GPUs handle 50% of the job of playing back video. IMO GPUs only deinterlace and do IVTC; they don't decode H.264 or MPEG-2, which is still handled by the main CPU. I have an NVIDIA 7800 GTX 256MB based graphics card and it just can't play back 1080p AVC and MPEG-2 even when hardware acceleration is activated. The only player I have tested hardware-accelerated AVC decoding with is Nero ShowTime, which indicated that it was enabled during playback. I'm using NVIDIA's own PureVideo decoder for MPEG-2 acceleration when watching MPEG-2 .ts (transport stream) movies and DVDs. 720p is just fine, but 1080p stutters and lags.
I will add that I have an AMD Opteron 150 single-core CPU @ 2.6GHz (OC) and 2GB RAM.
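One crude way to check whether hardware-accelerated decode is really kicking in is to compare average CPU load during playback with acceleration switched on and off in the player. A minimal Python sketch of that idea (assuming the third-party psutil package is installed; the 30-second sample window is arbitrary):

import psutil

def average_cpu_load(duration_s=30, interval_s=1.0):
    # cpu_percent(interval=...) blocks for the interval and returns
    # system-wide CPU usage as a percentage
    samples = [psutil.cpu_percent(interval=interval_s)
               for _ in range(int(duration_s / interval_s))]
    return sum(samples) / len(samples)

# Start playback first, then run this once with hardware acceleration
# enabled in the player and once with it disabled; a clearly lower
# figure in the first run suggests the GPU really is offloading decode work.
print("Average CPU load: %.1f%%" % average_cpu_load())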
Surely this line should be the other way around? PCs have had multiple CPUs for far longer than consoles have; it's just that recent (i.e. 'next gen') consoles took the lead for a while because they are (arguably) so ludicrously overpowered. Intel seems dead set on making dual-core PCs standard in just about every home, and quad-core affordable.
This is a good thing for AMD/ATi - they seem to be trying to get a lot of things out into the wild at the minute that their competitors don't have... however, it would be nice to see some hardware, rather than just endless press-releases...
Nice! It's about time. I don't game, so I've no need for a high end video card...but I'd like good video playback. If they stick one of these chips in a sub-$80 video card, and it can decode 1080p H.264 without leaving the CPU with work to do, I'll be very very happy. Especially if the video card, being cheap/low-power, is also passively cooled.
Something I don't understand... there are HD recorders that can play 1080p video (as long as the recorded/copied movie was already 1080p, of course). That thing doesn't have such a big video card, and isn't that power-hungry.
Then how is it possible that PC users are "struggling" to get 1080p on their monitors? Am I wrong to think that if someone has a 30" monitor and a video card that supports that monitor, he can play PC games at the native resolution? I mean, 20" and bigger monitors have been around for longer than 2007, and we've seen vertical resolutions of 1050 for a few years now. Then why is it suddenly so hard to view 1080p movies on a system?
Maybe it's a silly question, but with similar monitor resolutions having been around for quite some time, and gaming at that same resolution having been possible for a long time as well... it just seems weird.
The standalone players have a specialised chip that is dedicated to HD decoding, which is what is being implemented here.
Thinking about it, surely the existing hardware is more than powerful enough to do it. Why bother with adding more hardware, more complexity and more cost to an already powerful unit?
I suspect that you're not alone in that. It would be silly if this is only for high-end cards (that shouldn't need it).
What interests me is the built-in sound. If this has any decent quality I could see this as a challenge to Creative. Perhaps Nvidia would have to bring back an integrated version of SoundStorm as a counter to this.
It just pulls whatever sound your chosen soundcard produces and pushes it internally rather than externally. I've no idea HOW it works exactly; I'm guessing it lets the soundcard do what it does and then just rips the output signal, but we won't know until the R6xx launch.
The audio quality should be perfect as it's digital, so the sound quality should depend on your decoding hardware. Although I say "should" with a great deal of emphasis, as there are many ways of losing information.
Thanks for the clarification. I'm now wondering what effect this will have on Digitally Restrictive Mangling.