Intel Atom D510: Pine Trail Boosts Performance, Cuts Power
by Anand Lal Shimpi on December 21, 2009 12:01 AM EST
No Native Hardware H.264 Decoding: Long Live Ion
The integrated GMA 3150 graphics core is new to Intel's lineup; it's a 45nm shrink of the GMA 3100. Technically it's a DX9 GPU running at 400MHz, but as you'll soon see, you can't really play any games on this platform. The GPU only offers hardware acceleration for MPEG-2 video; H.264 and VC-1 aren't accelerated.
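If you want to verify this from software, the quickest check is to ask the driver which codecs it exposes hardware decode entrypoints for. Below is a minimal sketch, assuming a Linux box with libva's vainfo utility installed (my assumption; Windows users would check DXVA instead). On a GMA 3150 you'd expect to see only MPEG-2 profiles listed, with H.264 and VC-1 absent.

```python
import subprocess

def hw_decode_profiles():
    """List codec profiles that expose a hardware decode (VLD) entrypoint."""
    # vainfo prints lines like: "VAProfileMPEG2Main : VAEntrypointVLD"
    out = subprocess.run(["vainfo"], capture_output=True, text=True).stdout
    return sorted({line.split(":")[0].strip()
                   for line in out.splitlines()
                   if "VAProfile" in line and "VAEntrypointVLD" in line})

if __name__ == "__main__":
    for profile in hw_decode_profiles():
        print(profile)
```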
Max output resolution is also limited. The best you can get over a digital connection (HDMI/DVI) is 1366 x 768; over analog VGA you can do 2048 x 1536 (only 1400 x 1050 on the N450). In a curious coincidence, Poulsbo had the same 1366 x 768 digital output limitation.
Dual-core Atom "Pineview": the left half holds the two CPU cores, the right portion the GPU + memory interface
And now we see why Intel skimped on the GPU's abilities: nearly half the die is used for graphics.
Single-core Atom "Pineview"
On the single-core part, more than half the die is GPU/memory controller. At 32nm this won't be a problem, but today at 45nm it is what it is - we get a mediocre GPU.
The NM10 Express Chipset
Pine Trail is all about integration. Pulling the memory controller and GPU on-die lets board makers build smaller, simpler, or more feature-rich motherboards. In fact, one of the benefits of integration is that all Atom motherboards can now be built using 4-layer PCBs. Previously, only the desktop Atom boards could be built on 4-layer PCBs; now netbook boards can be just as cost effective.
The old ICH7 (left) vs. the new NM10 (right) - pictures are to scale; the NM10 really is that much smaller
With the memory controller and GPU on-die, the “chipset” in Pine Trail has been reduced to a single chip external to the CPU. It’s called the NM10 Express Chipset and it connects to the new Atom CPU via a 2.5GB/s DMI link.
Intel's NM10 supports eight USB 2.0 ports, two SATA ports, HD audio, two 32-bit PCI slots and four PCIe lanes. The NM10 is derived from existing ICH technologies, but bundled in a smaller package for use in small form factor motherboards.
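Because the platform is now just two chips, the whole layout is easy to see from software. Here's a minimal sketch (Linux sysfs; my own illustration, not anything Intel ships) that walks the PCI bus and prints the Intel devices - on a Pine Trail board that list would surface the on-die Pineview graphics alongside the NM10's USB, SATA, HD audio and PCIe root port functions:

```python
import os

SYSFS_PCI = "/sys/bus/pci/devices"
INTEL_VENDOR = "0x8086"

def read_attr(path):
    with open(path) as f:
        return f.read().strip()

# Enumerate the PCI bus and list every Intel device with its ID and class code.
for dev in sorted(os.listdir(SYSFS_PCI)):
    base = os.path.join(SYSFS_PCI, dev)
    if read_attr(os.path.join(base, "vendor")) == INTEL_VENDOR:
        device = read_attr(os.path.join(base, "device"))
        pci_class = read_attr(os.path.join(base, "class"))
        print(f"{dev}  device={device}  class={pci_class}")
```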
Intel lists one interesting “optional ingredient” that can be connected to the NM10 chipset: a third party HD video decoder.
The FTC hasn't won its case yet, so most manufacturers still prefer to support Intel and would rather pair Pine Trail with a Broadcom H.264 decoder than go for something like Ion. It's Intel's concession to those who demand high definition video acceleration. Honestly, I would've preferred to see something that could do it natively instead of relying on a 3rd party solution. I suspect that the 3rd generation of Atom will solve this; at 32nm there's more than enough transistor budget to integrate a GMA 4500 series core, which would finally bring Atom up to feature parity with NVIDIA's Ion chipset...just two years later.
What About Ion 2?
Pine Trail uses Intel’s DMI to connect Pineview and the NM10 chipset. NVIDIA doesn’t have Intel’s blessing to sell chipsets that use DMI, so NVIDIA can’t produce something that takes the place of the NM10 chip.
NM10, however, has an integrated PCIe controller. It’s possible that NVIDIA’s next-generation Ion will simply connect via PCIe to the NM10 chip.
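Bandwidth would be the obvious constraint in that arrangement. A rough back-of-the-envelope sketch (the lane counts are illustrative - NVIDIA hadn't announced any Ion 2 configuration at the time):

```python
# PCIe 1.1 lane rate: 2.5 GT/s with 8b/10b encoding = 250 MB/s per lane,
# per direction. The lane counts below are illustrative only.
LANE_MBPS = 2.5e9 * (8 / 10) / 8 / 1e6  # -> 250.0

for lanes in (1, 4):
    print(f"x{lanes}: {lanes * LANE_MBPS:.0f} MB/s per direction")
```

Even using all four of the NM10's lanes, that's 1GB/s per direction - well short of the x16 links desktop GPUs enjoy, and the 2.5GB/s DMI link behind the NM10 would be the next bottleneck anyway.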
41 Comments
Shadowmaster625 - Monday, December 21, 2009
Why doesn't AMD just take one of their upcoming mobile 880 series northbridges and add a memory controller and a single Athlon core? It would be faster than Atom, more efficient than Ion, and could be binned for low power. Instead they just stand there with their thumbs up their butts while Intel shovels this garbage onto millions of unsuspecting consumers at even higher profit margins.

JarredWalton - Monday, December 21, 2009
The problem is that even single-core Athlons are not particularly power friendly. I'm sure they could get 5-6 hours of battery life if they tried hard, but Atom can get twice that.

Hector1 - Monday, December 21, 2009
Do you want some whine with that? Where were you when chipsets were created by taking a bunch of smaller ICs on the motherboard and putting them all together into one IC? PCs became cheaper and faster. We thought it was great. Do you know anything about L2 cache? It used to be separate on the motherboard as well until it was integrated into the CPU. PCs became cheaper and faster and we thought it was great. Remember when CPUs were solo? They became Duo & Quad, making PCs faster and dropping price/performance. AMD & Intel integrated the memory controller and, whoa!, guess what? Faster & lower price/performance and, yes, we thought it was great. It's called Moore's Law and it's all been part of the semiconductor revolution that's been going on since the '60s. GPUs are no different. They're still logic gates made out of transistors, and with new 32nm technology, then 22nm and 16nm, the graphics logic will be integrated as well. Seriously, what did you think would happen?

TETRONG - Monday, December 21, 2009
Do you understand that Moore's Law is not a force of nature? Intel has artificially handicapped the low-voltage sector in order to force consumers to purchase Pentiums. Right where they wanted you all along.
Since when is it ok for Intel to dictate what type of systems are created with processors?
First it was the 1GB of Ram limitation, now you can't have a dual-core. When does it end?
"We have a mediocre CPU, combined with a below average GPU-according to our amortization schedule you could very well have it in the year 2013(after the holidays of course), by which time we should have our paws all over the video encoding and browsing standards, which we'll be sure to make as taxing as possible. Official release of USB 3.0 will be right up in a jiff!
Voldenuit - Monday, December 21, 2009
The historical examples you cite are not analogous, because Intel bundling their anemic GPUs onto the package makes performance *worse*, and bundling the two dies onto a single package (they're not on the same chip, either, so there is no hard physical limitation) makes competing IGPs more expensive, since you now have to pay for a useless Intel IGP as well as a 3rd party one if you were going to buy an IGP system.

And just because a past course of action was embraced by the market does not mean it was not anti-competitive.
bnolsen - Saturday, December 26, 2009
Performance is worse?? As far as I can see the bridge requires no heat sink and the CPU can be cooled passively. Power use went way down. For this platform that is improved performance.

My current Atom netbooks do fine playing Flash on their screens and just fine playing 720p H.264 MKV files.
If you want a bunch of power use and heat, just skip the Ion platform and go with a Core 2-based system.
Hector1 - Monday, December 21, 2009
You need to re-read the tech articles. Pineview does integrate both the graphics and memory controller into the CPU. It's the ICH that remains separate. Even if it didn't, what do you think will happen when this goes to 32nm, 22nm and 16nm? As for performance, Anand says in the title "Pine Trail Boosts Performance, Cuts Power" so that's good enough for me.

Intel obviously created the Atom for a low cost, low power platform and they're delivering. It'll continue to be fine-tuned with more integration to lower costs. The market obviously wants it. SoC (System on a Chip) designs are coming too, for even lower costs. Not the place for high performance graphics, I think.
This is really about Moore's Law marching on. It's driven down prices, increased performance and lowered power more than anything else on the planet. Without it, we'd still be paying the $5000 I paid for my 1st PC in 1980 -- an Apple II Plus. What you're saying, whether you know it or not, is that we should stop advancing processes and stop Moore's Law. Personally, I'd like to see us not stop at 45nm and keep going.
kaoken - Monday, December 21, 2009
I agree that progress should be made, but bundling an Intel CPU and IGP into a chip is anti-competitive. I wouldn't mind, though, if there were an Intel CPU and an ATI/NVIDIA GPU on a chip.

JonnyDough - Tuesday, December 22, 2009
Hector is right in one respect, and that is that if Intel is going to be dumb, we don't have to purchase their products. I especially like the sarcastic cynicism in the article when mentioning all the things that Intel's chip CAN'T do. They just don't know how to make a GPU without patent infringement. If they can't compete, they'll try using their big market share to hurt competition. Classic Intel move. They never did care about innovation, only about market share and money. But I guess that's what happens when you're a mega corp with lots of stockholder expectations and pressure. I'll give my three cheers to the underdogs!

overvolting - Monday, December 21, 2009
Hear, hear!