AMD 690G: Technology Overview
by Derek Wilson and Gary Key on February 28, 2007 2:30 PM EST - Posted in CPUs
The AMD 690G/690V chipset consists of an RS690 Northbridge and an SB600 Southbridge. AMD's intent with this chipset is to provide an attractive alternative to the NVIDIA 6100 family and, more importantly, a total platform solution that, when paired with an AM2 processor, is very competitive against the current Intel G965 family in the budget sector. The 690G is directed at the consumer market with a heavy emphasis on multimedia capabilities via the X1250 graphics core, while the X1200 core on the 690V chipset targets the business market, where the X1250's AVIVO capabilities are less important. Processor support for the 690G/V is extensive, with the AMD Athlon 64, AMD Sempron, AMD Athlon 64 X2 Dual-Core, and AMD Athlon 64 FX all working properly on our test units.
In the case of the X1250, it is no surprise that AMD has reached back to previous generation hardware for the base design of its new integrated GPU. A lower transistor count means a smaller die and lower cost, and the X100 series fits the bill with its lack of SM3.0 support and use of 24-bit floating point precision. The basic design of the X1250 is taken from the X700, with some modifications. While we would love to see Shader Model 3.0 support (which NVIDIA offers in its 6100 series and current Intel hardware claims to be capable of), developers writing DX9 applications will still be designing for the SM2.0 target, which the X1250 meets.
Many AVIVO features (including 10-bit per component processing) have been implemented on the X1250, bringing higher quality video decode to integrated graphics. Unfortunately, this improvement comes with some sacrifice, as the number of pipelines on the X1250 is cut down from the X700. The X1250 weighs in at four pixel shaders, and like other X100 series hardware this also means four texture units, z-samples, and pixels per clock. The major change compared to the X700 is that the number of vertex shader units has gone from six to zero; all vertex shader operations are handled by the CPU. The core clock operates at 400MHz and can be increased to 500MHz in the BIOS, depending on the board manufacturer. We have also overclocked one of our boards to 550MHz with a third-party utility, but performance unfortunately does not scale well due to the limitations of the GPU. Based on these numbers, you can expect overall performance closer to that of the X300 than the X700.
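The pipeline cuts above translate directly into peak fillrate. A back-of-the-envelope sketch (pipe counts and clocks from the figures above; the X700 numbers are our assumption of a typical 8-pipe configuration at the same clock) illustrates why performance lands closer to the X300 than the X700:

```python
def fillrate_mpixels(pixel_pipes, core_mhz):
    """Peak pixel fillrate in megapixels per second (pipes x clock)."""
    return pixel_pipes * core_mhz

x1250 = fillrate_mpixels(4, 400)     # 4 pipes at the stock 400MHz core
x1250_oc = fillrate_mpixels(4, 500)  # BIOS overclock ceiling
x700 = fillrate_mpixels(8, 400)      # assumed discrete X700: 8 pipes at 400MHz

print(x1250, x1250_oc, x700)  # 1600 2000 3200
```

Even at the 500MHz BIOS ceiling, the X1250's peak fillrate sits well under the assumed X700 figure, before accounting for the missing vertex hardware.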
As for memory, the GPU can handle up to 1GB, though support depends on the BIOS. AMD uses an optimized unified memory architecture (UMA) design, and all graphics memory is shared with system memory. In our tests we found 256MB to be the sweet spot, as performance degraded with 512MB or 1GB of graphics memory, especially under Vista, where base memory requirements are significantly higher than under XP. This may differ by implementation, but we will stick with the 256MB recommendation for now; our upcoming test results bear it out.
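Because the UMA frame buffer is carved out of system RAM, a larger graphics allocation leaves less for the OS. A minimal sketch (the installed-RAM figure is illustrative, not a measured value) shows why oversized carve-outs hurt on a memory-hungry OS like Vista:

```python
def usable_system_mb(total_mb, uma_mb):
    """System memory left after the BIOS reserves the UMA frame buffer."""
    if uma_mb >= total_mb:
        raise ValueError("UMA carve-out cannot exceed installed RAM")
    return total_mb - uma_mb

# With 1GB installed, the 256MB sweet spot still leaves 768MB for the OS,
# while a 512MB carve-out halves what remains for everything else.
for uma in (128, 256, 512):
    print(uma, usable_system_mb(1024, uma))
```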
While none of this will revolutionize gaming, it certainly puts AMD ahead of Intel and NVIDIA at this time. We'll have to wait and see what happens with the final performance numbers, but the X1250 looks to be a very solid platform for integrated graphics performance and features with an extremely competitive price target around $85.
Looking beyond architecture, most people who will actually be using integrated graphics won't be bothered with games or high-end 3D applications. This hardware will be most used for 2D and video applications. Let's take a look at the features we can expect in these areas.
Supporting a maximum resolution of 2048x1536, the X1250 can easily run any CRT at maximum resolution. This tops NVIDIA's 6150 max resolution of 1920x1440 and equals Intel's G965. Larger 30" flat panel monitors won't be able to run at native resolution, so the business user who needs huge desktop real estate will have to stick with add-in graphics cards. As for output features, the video hardware supports S-Video, YPbPr, HDMI 1.3, and Dual-Link DVI. Of course, the actual interfaces available will depend on the implementation, but HDMI and DVI ports will also support HDCP. The digital outputs each use TMDS transmitters that run at 165MHz.
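A quick pixel-clock estimate shows what a 165MHz single-link TMDS transmitter can actually drive. The blanking overheads below are rough assumptions (reduced blanking adds roughly 10-12% over the active pixels; classic CRT timings add considerably more), so treat this as a sketch rather than exact timing math:

```python
def pixel_clock_mhz(h, v, refresh_hz, blanking_overhead):
    """Approximate pixel clock (MHz) needed for an h x v mode at refresh_hz."""
    return h * v * refresh_hz * (1 + blanking_overhead) / 1e6

TMDS_LIMIT_MHZ = 165.0

# 1920x1200@60 with reduced blanking fits under the single-link cap...
print(round(pixel_clock_mhz(1920, 1200, 60, 0.11), 1))  # ~153.4
# ...while 2048x1536@60 needs well over 165MHz even with reduced blanking,
# so that maximum mode is realistic only on the analog (CRT) output.
print(round(pixel_clock_mhz(2048, 1536, 60, 0.11), 1))  # ~209.5
```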
The GPU supports two independent display outputs, and both DVI and HDMI outputs can be used at the same time. The only caveat is that HDCP will only work over one digital output at a time. This isn't a huge issue, as most people won't be watching two different protected movies at the same time on a single computer. Also, in spite of the single display limitation, HDCP can be used over either HDMI or DVI. This gives the X1250 an advantage over graphics cards that initially supported HDCP: many of those cards only allowed HDCP over one HDMI or DVI port while the other port remained unprotected.
As for HDMI, the audio support is enabled through an interface in the RS690 Northbridge while the SB600 Southbridge handles the HD audio controller interface. The standard HD audio codec will come from Realtek who has developed a driver package that allows the user to control both the HDMI and HD audio interfaces from a single application. Having the HDMI audio hardware on-board means we aren't simply using a pass-through on the graphics card as we have seen in most implementations. This enables the user to have better control over audio adjustments through the integrated software package. We are still waiting for the full specifications of the HDMI audio capabilities but have tested our 5.1 setup with output set to 24-bits at 48kHz.
For video acceleration, the X1250 offers hardware acceleration of MPEG-2 and WMV playback. MPEG-4 decode is not hardware accelerated, but it is supported in software via the driver. DVD and TV (both SD and HD resolution) playback can be offloaded from the CPU, but we have seen severe choppiness and blank-screen issues with HD media at 1080p, although 720p worked fine. AMD has indicated that this issue will be addressed in a future driver and that the chipset is fully capable of 1080p output with an upper-end CPU and proper software support.
For those who wish to use discrete graphics alongside their integrated solution, AMD supports a feature it calls SurroundView. This enables support for three independent monitors in systems with integrated and discrete AMD graphics. The feature works as advertised and may be useful for business users who want more than two monitors at a low price. Gamers who want more than two monitors will certainly have to take a different route.
The AMD 690G/690V utilizes the SB600 Southbridge introduced last May, which remains a competitive offering, although the latest Intel and NVIDIA chipsets offer six SATA ports along with RAID 5 capability. We will delve further into the differences between the NVIDIA 6100/6150, Intel G965, and AMD 690G chipsets in our upcoming performance review.
In the meantime, AMD has introduced a very competitive integrated graphics chipset that offers slightly better video performance than the competition at a price point that should please the majority of potential customers. How long it will remain competitive is up for debate, with both NVIDIA and Intel launching new IGP platforms in the coming months. For now, AMD can brag about having the fastest IGP solution - which, unfortunately, is only slightly faster than a discrete X300 card.
14 Comments
mino - Wednesday, February 28, 2007 - link
Single-link (165MHz) DVI is limited to 1920x1200@60Hz. It is caused by the bandwidth limitation of the DVI standard. Shame they did not include a dual-link DVI interface :(.
chucky2 - Wednesday, February 28, 2007 - link
2 Questions to start:
1.) Is it really HDMI 1.3 that's supported, or 1.2? Everyone seems to be saying something different.
2.) Is HDCP <i>always</i> enabled over DVI and HDMI, or, do the motherboard manufacturers have to make it so? I'm just wondering if a onboard DVI port motherboard is bought based on this chipset, if it's a guaranteed that DVI port is HDCP enabled.
3.) Can anyone from AMD or the motherboard manufacturers give us a more than off the cuff indication if these 690G/690V boards will support the new incoming AM2+ CPU's in the coming months?
Thanks!
Chuck
mino - Wednesday, February 28, 2007 - link
1) I see no reason for ANYBODY launching a new GPU solution with just 1.3 support; it would be stupid beyond comprehension.
2) HDCP means the board must have a code license. Therefore it is up to the board maker to decide whether to pay for the license or pass the savings on to consumers who do not need HDCP (i.e. business users).
3) All AM2+ CPUs will be compatible with all AM2 boards, the only limitation being with special low-TDP SFF boards. AMD has stated this countless times.
chucky2 - Wednesday, February 28, 2007 - link
1.) Well, I see no reason why this chipset took months longer to get out than it did, but there you go. Trust me, with the bungles (the only way to look at the slip the AMD version of this chipset has encountered) that ATI/AMD has let happen, HDMI 1.2 or even 1.1 wouldn't surprise me.
2.) That's why I wanted confirmation from AnandTech on this, because board makers are notoriously bad about being specific or less than forthright about things like this.
3.) When AnandTech says they confirmed with AMD that AM2+ CPUs will work in all 690G/690V motherboards, then I'll believe it. Until then, it's a popular rumor.
Chuck