NVIDIA 680i: The Best Core 2 Chipset?
by Gary Key & Wesley Fink on November 8, 2006 4:45 AM EST, posted in CPUs
It has been six months since NVIDIA announced their new 500 series chipsets. That launch, this past May, coincided with the release of AM2. Today NVIDIA launches a new chipset family, called the 600i family, with no mention of AMD at all and a launch date timed to coincide with the new Intel Core 2 Quad (Kentsfield).
Perhaps these two events, set just six months apart, best define the dramatic shifts that have occurred in the enthusiast market during this time. AMD had been the undisputed performance leader for the past couple of years, and enthusiasts didn't much care about Intel chipsets. With the launch of Core 2, however, the enthusiast world changed. Today Core 2 Duo and Quad are the undisputed performance leaders and AMD is once again the "value" chip. This will likely change again in the future, but for now Intel Core 2 is clearly the processor enthusiasts are demanding.
Of course, that has been the problem for NVIDIA. While their 590/570/550 family performed very well with AMD processors, their Intel variants left a lot to be desired. NVIDIA is a company that loudly proclaims its support of the enthusiast, and it had to be embarrassing that the NVIDIA chipsets for Intel were also the worst overclocking chipsets on the market. NVIDIA needs credibility as a provider of enthusiast chipsets in order to sell their top-end SLI to Intel buyers, since Intel has supported the competitor's ATI CrossFire as their multi-GPU standard. The nForce 590's features looked great on paper, but the overclocking performance, or rather the lack of it, kept enthusiasts away from the 500 series for Intel.
In addition, in the past 6 months AMD bought ATI, NVIDIA's major competitor in graphics. NVIDIA had become the leading supplier of chipsets for motherboards supporting the AMD processor, and with ATI moving to AMD that market position was now in jeopardy. ATI also had competent chipsets for AM2, and everyone expected AMD to make good use of those capabilities in the future.
What had been the minor annoyance of not having a good enthusiast chipset for Intel's Core 2 Duo quickly became a major problem for NVIDIA. The enthusiast was now buying Intel processors instead of AMD, their major competitor was now part of their largest customer in the chipset market, and the world's largest supplier of chipsets for Core 2 Duo - Intel themselves - was supporting the ATI CrossFire multi-GPU solution. NVIDIA needed a new product for Intel Socket 775 that would excite enthusiasts enough to choose NVIDIA for their Intel systems, increase NVIDIA's market share in the Intel chipset market, and provide a superior platform for SLI on Intel.
That product launches today in the NVIDIA 600i chipset family. The "i" is for Intel, and for now the 600 family is only available for the Intel Socket 775. (Future NVIDIA chipsets for the AMD platform will be named with a small "a" following the number.) The family will include some value boards and a top-end 680i that claims incredible overclocking on Core 2 Duo and Core 2 Quad processors. The new chipset also delivers dual x16 SLI to the Intel platform in a board NVIDIA is confident enthusiasts will want to own.
NVIDIA cut their teeth in the AMD market, but the Intel chipset market is a much more ambitious target. AMD was never more than a minor player in chipsets for its own processors, whereas Intel is the largest supplier of chipsets for its own Socket 775 processors. Intel also has a long and impressive history of innovation in the chipset market, and Intel chipsets are widely regarded as the top performers in almost every category when paired with Intel processors. This is a very different market from the AMD platform NVIDIA targeted and conquered. With ATI now part of AMD, with current AMD chipsets moving toward the value category, and with the enthusiast buying Intel processors, the desire to target the Intel market is logical. However, as NVIDIA quickly found out with the 500 family for Intel, they must have the goods to persuade buyers to choose NVIDIA instead of Intel.
The real question then is whether the 680i and the 600i chipset family are the best available in the Intel market. If we believe NVIDIA's marketing, the answer is a resounding yes. Does the 680i live up to all the advance hype? We hope to provide answers to that question.
60 Comments
yyrkoon - Thursday, November 9, 2006 - link
From my little experience with an Asrock board that can use this program, it WILL adjust clock frequency on the fly; however, I think that voltage changes can only be applied by rebooting. Regardless of whether I'm remembering correctly, I'm fairly certain at least one possible change needs to be done during, or after, a reboot - could be thinking of the clock multiplier maybe?
Pirks - Thursday, November 9, 2006 - link
that sucks. guess I'll have to wait till nVidia makes a 100% non-reboot OC mobo, or an on-the-fly-OC mobo where you just click a couple of buttons in Windows and voila - your machine turns from a quiet office machine into a Crysis fireball, and vice versa - I can dream, can't I? ;)
ssiu - Wednesday, November 8, 2006 - link
Since NVIDIA claims the 680i has better FSB overclocking than the 650i, and the 680i results are only on par with the mainstream P965's, I am afraid the 650i boards will be significantly worse than the DS3s/P5Bs. In other words, I am afraid the 650i is not really a new competitive option for budget/mainstream overclockers.
yyrkoon - Wednesday, November 8, 2006 - link
I don't think any true enthusiast is going to be buying a mid-range board (chipset) to begin with. If the Intel numbering scheme is anything like the AM2 numbering scheme, the 650i will probably have fewer available PCI-E lanes as well, which would be a major factor in my personal decision to buy any such hardware, and I know I'm not alone ;)
Jedi2155 - Wednesday, November 8, 2006 - link
I don't think your definition of enthusiast is wholly correct; it's rather the manufacturer's idea of an enthusiast. I personally think many enthusiasts do indeed have a limited budget, and after seeing the pricing of the Asus 680i board, I think mid-range is the way to go... hoping for a cheap < $250 680i board >_>.
yyrkoon - Wednesday, November 8, 2006 - link
Yeah, he wasn't talking about true enthusiasts though; I realize this after re-reading his post.
On a side note, if my board brand of choice (ABIT) suddenly went away, I would seriously consider buying a Gigabyte board, but the DS3 doesn't seem to be making a lot of people happy in the stability category. What I'm trying to say here is that perhaps the board MAY not OC as well, but according to what I've read (reviews, forum posts, and A LOT of Newegg user reviews), it couldn't do much worse than the Gigabyte board in this area.
The second question I'd be asking myself is WHO THE HELL is EVGA... we all know they make video cards (probably the best for customer support among nVidia products).
I'm definitely interested in the 680i chipset, but I think my brand of choice for MANY years now will remain the same, and I'll be sticking with ABIT :)
Gary Key - Wednesday, November 8, 2006 - link
1. The reference board is designed and engineered by NVIDIA. Foxconn manufactures the boards for the "launch" partners, which include BFG and others. Asus, Abit, DFI, Gigabyte, and others will have their custom-designed boards out in a few weeks.
2. The Abit board is very interesting; here is a pic of it (Abit 680i): http://img474.imageshack.us/img474/2044/in932xmaxy... ;)
yyrkoon - Thursday, November 9, 2006 - link
Didn't even know there was one this close to release, Gary, lol, thanks for the link. Judging by the 5 SATA II connectors, previously released ABIT boards, and what LOOKS like an eSATA connector on the back panel, I suppose this board will support eSATA, and possibly a SATA PM?
Stele - Friday, November 10, 2006 - link
That Abit 680i board looks very interesting indeed... if nothing else because it looks like it sports digital PWM power supply circuitry similar to that used by DFI in the latter's LANParty UT NF590 SLI-M2R motherboard (the Pulse PA1315NL coupled inductor array is a dead giveaway, as it is designed for use only with Volterra's VT11x5M digital PWM circuitry).
Unfortunately, more information on such circuitry is proving very difficult to find (Volterra themselves restrict their product details and datasheets to design partners only)... it'd be great to know how such a power circuit compares in performance and capabilities with traditional PWM-MOSFET-based ones.
Curiously, the Abit 680i seems to have dropped the AudioMax daughter board.
yyrkoon, I'm guessing the 5th SATA II and the eSATA port are there courtesy of an SiI3132 controller - which is likely the little square IC under the upper heatpipe, just beside the audio connector block. As such, the usual capabilities and features of the said IC would apply, I think :)
yyrkoon - Wednesday, November 8, 2006 - link
I'd just like to point out that DualNet technology is NOT true NIC teaming, or rather link aggregation (802.3ad I think).
When I first heard about DualNet I was extremely excited, since I had been doing TONS of research on NIC bonding etc., but after doing some homework, I found that DualNet only supports outgoing packets. It was my hope that you could link two of these boards via a regular GbE switch and get instant 2GbE connections, but this is not the case (unless they've recently redone DualNet).
Now to the question: since SATA Port Multiplier HBAs require specific SiI chip(s) on the device they communicate with (to give the full speeds of a true RAID), what are the chances that nVidia boards will work with these devices?
In the past, I've seen two AM2 boards that have a built-in SiI chip with eSATA connectors on the board's back panel (ABIT and Asus), but onboard SiI 'chipsets' seem to be rather limited (as in only offering PM support on two SATA connections). I'd personally REALLY like to see this technology standardized, so it doesn't matter WHAT SATA controller chipset you're using. I also think that once nVidia realizes that onboard PM support is a major plus, and once they implement it, they COULD be taken seriously by many Intel fans.
Also, some Intel chipset fans believe that Intel chipsets are best for a rock-solid system (for the record, I'm not one of these people); I guess we'll see if nVidia will change their minds.