AMD 690G: Technology Overview
by Derek Wilson and Gary Key on February 28, 2007 2:30 PM EST - Posted in CPUs
The AMD 690G/690V chipset consists of an RS690 Northbridge and SB600 Southbridge. AMD's intent with this chipset is to provide an attractive alternative to the NVIDIA 6100 family and, more importantly, a total platform solution that, when paired with an AM2 processor, is very competitive against the current Intel G965 family in the budget sector. The 690G is directed towards the consumer market with a heavy emphasis on multimedia capabilities via the X1250 graphics core, while the X1200 core on the 690V chipset targets the business market, where the X1250's AVIVO capabilities are not as important. Processor support for the 690G/V is extensive, with the AMD Athlon 64, AMD Sempron, AMD Athlon 64 X2 Dual-Core, and AMD Athlon 64 FX all working properly on our test units.
In the case of the X1250, it is no surprise that AMD has reached back to previous-generation hardware for the base design of its new integrated GPU. A lower transistor count means a smaller die and lower cost, and the X100 series fits the bill with its lack of SM3.0 support and its use of 24-bit floating point precision. The basic design of the X1250 is taken from the X700, with some modification. While we would love to see Shader Model 3.0 support (which NVIDIA offers in its 6100 series and which current Intel hardware claims to support), developers writing DX9 applications will still be designing for the SM2.0 target, which the X1250 meets.
Many AVIVO features (including 10-bit per component processing) have been implemented in the X1250, bringing higher quality video decode to integrated graphics. Unfortunately, this improvement comes with some sacrifice, as the number of pipelines is cut down from the X700. The X1250 weighs in at four pixel shaders, and as with other X100 series hardware this also means four texture units, z-samples, and pixels per clock. The major change from the X700 is that the vertex shader units have gone from six to zero: all vertex shader operations are handled by the CPU. The core clock runs at 400MHz and can be raised to 500MHz in the BIOS, depending on the board manufacturer. We have also overclocked one of our boards to 550MHz with a third-party utility, but performance unfortunately does not scale well due to the limitations of the GPU. Based on these numbers, you can expect overall performance closer to that of the X300 than the X700.
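To put those pipeline counts in perspective, here is a back-of-envelope peak fillrate calculation (our own arithmetic for illustration only; actual throughput will be lower, especially with all vertex work falling back to the CPU):

```python
# Theoretical peak fillrate for a 4-pixel-per-clock part
# (back-of-envelope only; real-world throughput will be lower,
# particularly with all vertex shading handled by the CPU).
PIXELS_PER_CLOCK = 4

for clock_mhz in (400, 500, 550):  # stock, BIOS-raised, our utility overclock
    fillrate_gpix = PIXELS_PER_CLOCK * clock_mhz * 1e6 / 1e9
    print(f"{clock_mhz} MHz -> {fillrate_gpix:.1f} Gpixels/s theoretical peak")
```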
As for memory, the GPU can address up to 1GB, but support is dependent on the BIOS. AMD uses an optimized unified memory architecture (UMA) design, and all graphics memory is shared with system memory. In our tests we found 256MB to be the sweet spot, as performance seemed to degrade with 512MB or 1GB of graphics memory, especially under Vista, where the base memory requirements are significantly higher than under XP. This may differ depending on the implementation, but we will stick with the 256MB recommendation for now; our upcoming test results bear this out.
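The UMA trade-off itself is easy to quantify. As a simple sketch (the 1GB system total here is a hypothetical figure, not one of our test configurations), whatever the BIOS carves out for graphics comes straight out of system RAM:

```python
# Illustration of the UMA trade-off with a hypothetical 1GB system:
# the graphics frame buffer is carved directly out of system RAM.
SYSTEM_RAM_MB = 1024  # assumed total installed memory

for uma_mb in (128, 256, 512):
    print(f"UMA {uma_mb}MB -> {SYSTEM_RAM_MB - uma_mb}MB left for the OS")
```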
While none of this will revolutionize gaming, it certainly puts AMD ahead of Intel and NVIDIA at this time. We'll have to wait and see what happens with the final performance numbers, but the X1250 looks to be a very solid platform for integrated graphics performance and features, with an extremely competitive price target of around $85.
Looking beyond architecture, most people who will actually be using integrated graphics won't be bothered with games or high-end 3D applications. This hardware will be most used for 2D and video applications. Let's take a look at the features we can expect in these areas.
With a maximum resolution of 2048x1536, the X1250 can easily drive any CRT at full resolution. This tops NVIDIA's 6150 maximum of 1920x1440 and equals Intel's G965. Larger 30" flat panel monitors won't be able to run at native resolution, so business users who need huge desktop real estate will have to stick with add-in graphics cards. As for output features, the video hardware supports S-Video, YPbPr, HDMI 1.3, and Dual-Link DVI. Of course, the actual interfaces available will depend on the implementation, but HDMI and DVI ports will also support HDCP. The digital outputs each use TMDS transmitters running at 165MHz.
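For reference, a rough pixel-clock estimate (our own math; the flat 15% blanking overhead is an approximation standing in for exact CVT/GTF timing calculations) shows why a single 165MHz TMDS link tops out well below 2048x1536 and dual-link is needed at the high end:

```python
# Rough pixel-clock requirements per display mode (approximate: a flat
# 15% blanking overhead stands in for exact CVT/GTF timing math).
TMDS_LINK_MHZ = 165  # bandwidth of a single TMDS link

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.15):
    return width * height * refresh_hz * blanking / 1e6

for width, height in ((1280, 1024), (1920, 1200), (2048, 1536)):
    clk = pixel_clock_mhz(width, height, 60)
    verdict = "fits one link" if clk <= TMDS_LINK_MHZ else "needs dual-link"
    print(f"{width}x{height}@60Hz: ~{clk:.0f} MHz ({verdict})")
```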
The GPU supports two independent display outputs, and the DVI and HDMI outputs can be used at the same time. The only caveat is that HDCP works over just one digital output at a time. This isn't a huge issue, as most people won't be watching two different protected movies simultaneously on a single computer. Also, in spite of the single-display limitation, HDCP can be used over either HDMI or DVI. This gives the X1250 an advantage over early HDCP-capable graphics cards, many of which only allowed HDCP over one specific HDMI or DVI port while the other remained unprotected.
As for HDMI, audio support is enabled through an interface in the RS690 Northbridge, while the SB600 Southbridge handles the HD audio controller interface. The standard HD audio codec will come from Realtek, which has developed a driver package that allows the user to control both the HDMI and HD audio interfaces from a single application. Having the HDMI audio hardware on-board means we aren't simply using a pass-through on the graphics card as we have seen in most implementations, giving the user better control over audio adjustments through the integrated software package. We are still waiting for the full specifications of the HDMI audio capabilities, but we have tested our 5.1 setup with output set to 24 bits at 48kHz.
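As a sanity check on that test configuration, the raw PCM bandwidth of a 5.1 stream at those settings is simple arithmetic (our own calculation; HDMI packet and container overhead is ignored):

```python
# Raw PCM bandwidth of a 5.1 stream at 24-bit/48kHz
# (our own arithmetic; HDMI packet overhead is ignored).
channels, bit_depth, sample_rate_hz = 6, 24, 48_000

bitrate_mbps = channels * bit_depth * sample_rate_hz / 1e6
print(f"5.1 @ {bit_depth}-bit/{sample_rate_hz // 1000}kHz: "
      f"{bitrate_mbps:.2f} Mbps raw PCM")
```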
For video acceleration, the X1250 offers hardware acceleration of MPEG-2 and WMV playback. MPEG-4 decode is not hardware accelerated, but it is supported in software via the driver. DVD and TV (both SD and HD resolution) playback can be offloaded from the CPU, but we have seen some severe choppiness and blank-screen issues with HD media formats at 1080p, although 720p worked fine. AMD has indicated that this issue will be addressed in a future driver and that the chipset is fully capable of 1080p output with an upper-end CPU and proper software support.
For those who wish to use discrete graphics alongside the integrated solution, AMD supports a feature it calls SurroundView, which enables three independent monitors in systems with both integrated and discrete AMD graphics. The feature works as advertised and may be useful for business users who want more than two monitors at a low price. Gamers who want more than two monitors will certainly have to take a different route.
The AMD 690G/690V utilizes the SB600 Southbridge introduced last May, which continues to be a competitive offering, although both Intel's and NVIDIA's latest chipsets offer six SATA ports along with RAID 5 capability. We will delve further into the differences between the NVIDIA 6100/6150, Intel G965, and AMD 690G chipsets in our upcoming performance review.
In the meantime, AMD has introduced a very competitive integrated graphics chipset that offers slightly better video performance than its competition at a price point that should please the majority of potential customers. How long it will remain competitive is up for debate, with both NVIDIA and Intel launching new IGP platforms in the coming months. For now, AMD can brag about having the fastest IGP solution - which, unfortunately, is only slightly faster than a discrete X300 card.
14 Comments
phusg - Thursday, March 1, 2007 - link
Excuse the pretty much off-topic post, but hey, it's from the article: Buggy as hell, that AMD chipset, but at the time a nice low-cost entry into dual core computing. I can only get mine stable when I run my RAID and Gbit cards in the 32-bit/33MHz slots, which is a real shame. Sound drivers are also fun. How is yours set up? This would also be a good place to post it if somebody feels like sharing: http://www.2cpu.com/forums/showthread.php?t=46394. Cheers.
Renoir - Wednesday, February 28, 2007 - link
1) How do they accomplish having HDCP support for both DVI and HDMI given that they're on independent display controllers? My understanding was that separate crypto ROMs were required for each controller/output. The simple answer would be that they indeed have 2 sets of keys, but I assume this isn't the case given that they only let you use HDCP on one digital output at a time. So how does that all work?

2) How is VGA implemented in the display controllers? 1=HDMI, 2=DVI-I (hence DVI or VGA), or some other configuration?
3) In a related point (the upcoming mobile version of the chipset): what connection do laptops use internally for their screens? I've asked this question on a few other sites but never got an answer. Surely someone must know? The reason I ask is that I'm interested in getting a laptop in the future which supports HDCP both for the laptop screen and via an external digital connection to a larger display.
Sorry for the long post
A5 - Wednesday, February 28, 2007 - link
On page 2, in the audio section, you wrote: "of the HDMI audio capabilities but have tested our 5.1 setup with output set to 24-bits at 48,000MHz."
Should be 48KHz or 48,000Hz, not 48GHz :)
JarredWalton - Wednesday, February 28, 2007 - link
We have very high quality audio. It's under tight wraps, though, so you didn't hear about it here! (*Unh... sorry*)

Tujan - Wednesday, February 28, 2007 - link
to see how this chipset is implemented. Probably won't get a setup with everything shown in the schematic at the first of the article.
Remembering when AMD used to have their ''own'' chipsets... is this chipset going to be something which is 'solely' AMD marketed? Or will there be derivatives from MicroStar, Asus, Gigabyte, etc.? Will ALL the chipsets featured in this article, considering AMD/ATI, be only that single vendor representation?
I'll be glued to the articles as they transpire. So these chipsets are actually Blu-ray/HD-DVD ''ready'' chipsets. Big hmmm there. Personally I don't think anything less than 32 inches qualifies as HD. So the HDCP stuff, etc., seems to be a by-product of selling the new discs from the copyright world, implementing their schemes on less than what would be considered HD (another topic of course). I think HD should be considered in inches no matter the parameters of pixel count. Or frame/..strength.

Move it on over for the space necessary for HD. Minus the screen size and video strength... I'd enjoy the technology for its medium's strength of storage space. Another implementation facet creeping up into the computerdom spectrum.
Spoelie - Wednesday, February 28, 2007 - link
The chipset as displayed is all chipset features, no extra chips needed. Not providing the necessary connectors would be extremely stupid.

In inches? Someone trying to compensate? ;p
You can have a 50" plasma with 800xWhatever resolution all you want, it will never be HD or come close to portraying the details a 24" 1920x1200 LCD can. "High definition" defines how much information you display, how sharp the picture is, not how big a surface you're displaying that information on. If that's what it meant, "HD" has been available for decades, thanks to projectors.
So if you really only think in inches, you shouldn't be bothering with the new formats, just watch VHS or videocd's (300xblabla mpeg) in your big ass projector room. Call your friends and tell them what a superHD setup you have. Never mind the blur and color bleeding, look how big it is!
Tujan - Thursday, March 1, 2007 - link
Right now I have a max capable computer screen of 1280x1024 using a VGA connection, with a 19" monitor. The definition is not as much a credential as the media, or medium, which you would wish to display. DVDs right now are not actually HD. Unlike DVDs, though, Blu-ray and HD-DVD require credentials of HDCP for the video card and display in order to use the media.

You could go and use the newer large storage media for something other than protected content - something that perhaps you created yourself, as in HD, utilizing the criteria for 'high-definition' detailed as x(pixel) by x(pixel). I can get everything I need out of copy-protected media on a 19" screen, or 24" for that matter.

I don't see the agony of accommodating the studios on the mere relevance of x(pixel) by x(pixel) when the reality is anything other than a large screen is defined by whether it has HDCP or not. I'm happy that there is a difference. Home-made HD does not need the criteria of credential in order to be such a thing. If it is defined by x(pixel) by x(pixel), which video cards are capable of, and any screen size will do, more power to the smaller screens.

Still... the new HDCP content is a 'good thing'. It just, to me, doesn't have relevance on small screens. HD is not about small screens and HDCP. It's about having an HD movie.
I am definitely waiting for the bleeding to stop. More power to you at your desktop of course.
strikeback03 - Thursday, March 1, 2007 - link
HDCP was about the movie studios attempting to stop piracy. Of course now that AACS has been broken (multiple times in multiple ways), HDCP will just be a big pain in the rear and still not stop piracy.

OrSin - Wednesday, February 28, 2007 - link
I'm going to just say: what? I missed about half of what he was saying.
sprockkets - Wednesday, February 28, 2007 - link
Having 6 SATA ports is nice, but RAID 5 is all done in software anyhow with nVidia chipsets, so what is the big deal?

What is the max resolution over DVI/HDMI? I know the 6150 does not have the same resolution over DVI as it does over the VGA port.