Discussion in 'Article Discussion' started by Tim S, 2 Mar 2007.
Interestingly, I've just taken off the heatsink from the Asus board and the RS690 northbridge is labelled slightly differently!
Also, why does it say "time in seconds - lower equals better" in the FPS test?
Does this mean AMD/ATI are going to end up dominating the mobile market? At least they're now concentrating on it and producing some decent kit.
"but it was delayed because of *dalays" *delays
"Join us as * delve down inside" *we?
"Radeon X1250 it isn’t *an derived" *-an?
That's just me being picky though; the review is really informative, and this chipset looks like it could be a major bonus for use in a multimedia rig.
Thanks, fixed - drop me a PM if there's anything else.
I was wondering if anyone could tell me what that glue-like stuff around the edge is (maybe it even is glue) and what it's for. Always wondered about that.
Wow, thanks for that info, that's quite interesting. I would have guessed that the northbridge also connected to the memory.
But either way, only one thing can be read from memory at a time, so whether it's the NB reading it or the CPU, I guess it doesn't make a huge difference. Plus, the CPU really needs the memory access more.
Am I mistaken, or is the MSI board the one with four memory slots, while the Asus only has two?
Nope, the board with the Asus logo on it appears to have four memory slots: http://www.bit-tech.net/hardware/2007/03/02/amd_690g_chipset/2.html
The ATI X1250 is a modified version of the X700 series, so we can't expect top scores in benchmarks. Anyway, this will perform far better than the Nvidia 6100 onboard GPU solution.
It's just a bonding material used to seal the edges of the die to the packaging.
My fault on the graphs too!
First, nice review! I particularly liked seeing what you lose in non-graphics performance by sharing 256MB of RAM with the IGP. About 0-6% by my calculations, which is pleasantly lower than I was expecting.
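For anyone curious, here's the back-of-the-envelope sum behind a figure like that. The scores below are made-up placeholders for illustration, not numbers from the review:

```python
def slowdown_pct(dedicated: float, shared: float) -> float:
    """Percent of performance lost when the IGP shares system RAM."""
    return (dedicated - shared) / dedicated * 100

# Hypothetical scores (not from the article): 100 points without the
# IGP carve-out vs 94 points with 256MB shared with the IGP.
print(round(slowdown_pct(100.0, 94.0), 1))  # 6.0 -- the top of the 0-6% range
```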
I've read elsewhere that the display controllers are single link only?
I don't understand this. Could you explain why the RS690 doesn't get to use both of the memory channels in the memory controller (128-bit)?
Rumour has it Intel is working on a new driver for the G965 which should largely address the current compatibility and performance issues. Could you confirm this?
Are these boards micro-ATX? Socket AM2?
Tim was using the 30" Dell at the time, but I will test if it's dual link. I'm pretty certain it IS dual link since it has all the necessary pins for Dual Link DVI-D.
I've amended the article. For some reason my memory told me that DDR is 32-bit to match CPUs (hence you used to only need one, unlike SIMMs), so dual channel was 64-bit.
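For reference, each DDR/DDR2 channel is actually 64 bits wide, so dual channel gives an effective 128-bit path. A quick sketch of the peak-bandwidth arithmetic, using DDR2-800 numbers purely for illustration:

```python
CHANNEL_WIDTH_BITS = 64  # each DDR/DDR2 channel is 64 bits wide, not 32

def peak_bandwidth_gb_s(transfers_per_sec: float, channels: int = 1) -> float:
    """Peak bandwidth = transfer rate x bus width in bytes x channel count."""
    return transfers_per_sec * (CHANNEL_WIDTH_BITS // 8) * channels / 1e9

# DDR2-800 runs at 800 million transfers/sec per channel:
print(peak_bandwidth_gb_s(800e6))     # 6.4 GB/s, single channel (64-bit)
print(peak_bandwidth_gb_s(800e6, 2))  # 12.8 GB/s, dual channel (128-bit)
```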
No idea if Intel is working on a new driver, but even if they are, the hardware to do the work isn't there. Even if they manage to get it DX9-compatible, it'll still be completely unplayable.
You say it will play pitifully at 640x480, yet you mention the detail settings are the same as in your other benchmarks, and that if they were lowered the frame rates would be higher.
When you say the benchmarks for other hardware, are you talking about the top-end graphics cards that you look at, e.g. the 8800?
If so, why would anyone play with high details at a resolution of 640x480 when they could bump it up to a playable 1024x768 and just lower the unneeded detail?
It's not as if anyone would buy this board and then try to play all the latest games on a 30" LCD; they'd be looking to play common games like CS: Source at a playable frame rate, which would mean skipping most of the shiny details.
I don't think the GMA X3000 in the i965 got a fair shake at all. Please spend some more time with the drivers to get it running the games you benchmarked. Xbit Labs were able to get reasonable frame rates out of the X3000 in Quake 4 and Half Life 2.
At the moment, yours is the only comparison of the sort, and I'm very interested in the results of these tests.
Featured on Engadget!
The review by Anandtech says the 690G supports two independent display controllers, both digital and HDCP-compliant. However, they mention that HDCP only works for one of the digital connections at a time. Given this info, I'll paste the same questions I asked there but haven't had a reply to, namely:
1) How do they accomplish having HDCP support for both DVI and HDMI given that they're on independent display controllers? My understanding was that separate crypto ROMs were required for each controller/output. The simple answer would be that they indeed have two sets of keys, but I assume this isn't the case given that they only let you use HDCP on one digital output at a time. So how does that all work?
2) How is VGA implemented in the display controllers? 1=HDMI, 2=DVI-I (hence DVI or VGA), or some other configuration?
3) On a related point (the upcoming mobile version of the chipset): what connection do laptops use internally for their screens? I've asked this question on a few other sites but never got an answer. Surely someone must know? The reason I ask is I'm interested in getting a laptop in future which supports HDCP both for the laptop screen AND via an external digital connection to a larger display.
Sorry for being long-winded, but I would really appreciate any info you can give.
I will go back to it for the next 690G review, but every time I tried a DX9 game it threw up a DX error. Our tests were run on maximum quality to provide a direct comparison with other reviews, with just the resolution changed, although in the next one we will lower all the settings as well.
The crypto ROMs are specific to each video output afaik, but every board I've seen bar the Asus has only one digital output, so it hasn't mattered. Asus uses a PCI-Express card for HDMI and HDTV out, which wasn't included in our initial package since it's a pre-production jobbie and we literally just got the board (no drivers even, and AMD doesn't have any on their site either).
Don't understand question 2?! DVI and HDMI use internal TMDS transmitters, which are then disabled in the value model. You can change outputs in the ATI display driver.
Laptop: no idea, depends on the specific LCD controller used in the laptop. Not many people take apart review samples they get because it usually means irreversibly breaking the unit.
Unless your laptop has Blu-ray, is HDCP needed?
Yeah, confusion definitely results from the fact that what a chipset supports and what the mobo maker implements isn't always the same. If the crypto ROMs are specific to each digital output (which I also thought was the case), then why is HDCP only supported via one output at a time? If the crypto ROMs are part of the chipset (and hence there are two sets), then I see no reason for limiting their use to one digital output at a time, unless it's a driver issue.
Sorry for not being clear. I'm only interested in the 690G. There are two display controllers; what does each support? E.g. controller 1 = HDMI, controller 2 = DVI & VGA (hence DVI-I), with it being up to the mobo maker which of these outputs they implement, or some other configuration? The promo material for the chipset says it supports two independent digital outputs and VGA, but doesn't say how exactly these outputs are arranged within the chipset.
Yeah, as mentioned, no one's been able to answer that question. The reason I'm interested is that I'm wondering what my next laptop requires in order to play hi-def DVDs, should I choose to replace the optical drive it comes with with a hi-def one once the prices come down. I plan to buy a new laptop later this year, so I won't be able to afford one that comes with a hi-def drive and would need to buy one separately further down the line. Essentially I want a laptop that's ready in every way for next-gen DVDs but without initially coming with the necessary optical drive. Hope that makes sense.