Mac OS X 10.6 "Snow Leopard" milks multi-processor CPUs with "Grand Central"

Amid all the iPhone talk yesterday, very little was said about Snow Leopard, the 10.6 update to Mac OS X. John Markoff of the New York Times fills in the blanks.

Apple, he claimed, has made a parallel-programming breakthrough.
It is all about the software, he said. Apple purchased a chip company, PA Semi, in April, but the heart of Snow Leopard will be a parallel-programming technology that the company has code-named Grand Central.

"PA Semi is going to do system-on-chips for iPhones and iPods," he said.

Grand Central will be at the heart of Snow Leopard, he said, and the shift in technology direction raises lots of fascinating questions, including what will happen to Apple's partnership with Intel.

"Basically it lets you use graphics processors to do computation," he said. "It's way beyond what Nvidia or anyone else has, and it's really simple."

Apple in Parallel: Turning the PC World Upside Down? [Bits.Blogs.NYTimes.com]

Update: Here's Apple's official Snow Leopard page.
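
Apple hasn't said what the Grand Central API actually looks like. Purely as a hedged sketch of the kind of primitive that would make multicore "really simple" (a parallel-for that hands loop iterations to a system-managed thread pool, so the programmer never touches threads), here's what that might look like in C with blocks. The dispatch_apply call below is our illustration of the pattern, not a confirmed Apple interface:

```c
/* Hypothetical sketch: a "parallel for" primitive of the sort Grand
   Central might offer. Assumes a toolchain with blocks and libdispatch. */
#include <dispatch/dispatch.h>
#include <stdio.h>

#define N 1000000

int main(void) {
    static float in[N], out[N];
    for (size_t i = 0; i < N; i++)
        in[i] = (float)i;

    /* Fan the N loop iterations out across all available cores.
       The runtime, not the programmer, decides how many threads to use. */
    dispatch_apply(N,
                   dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0),
                   ^(size_t i) { out[i] = in[i] * 2.0f; });

    printf("out[42] = %f\n", out[42]);
    return 0;
}
```

The appeal of a design like this is that the same binary would scale from a two-core MacBook to an eight-core Mac Pro without recompilation, because the thread count becomes a runtime decision rather than something baked into the program.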

5 Comments

  1. disclosure: i only read the linked article and didn’t look for any further info …

    not that i really expected apple to be more forthcoming with the info, but it strikes me as more than a little silly to say that no one knows how to efficiently use multiple cores but that you’ve found out how to do so AND that it involves harnessing yet more computational resources.

  2. Given John Markoff’s history of breathless stenography, I’ll take this with a grain of salt until I see some real world benchmarks. Remember the ludicrous desktop supercomputer hype around the PowerPC offerings.

  3. How come Apple wants an NVIDIA killer, given that Apple’s using their GPUs? CUDA is pretty easy to use if you know C, but it takes a lot of effort to get really kick-butt performance with it. (It’s easy to write slow CUDA, alas…) Maybe Apple has solved some of those problems but they need a new language design to accomplish it? Or does Apple not want to be tied to NVIDIA’s whims?

  4. btw not to be elitist, but shouldn’t this sort of thing go on /. instead of here? The steampunk ironic-retro-pop-art post-singularity applications are only indirect in this case 😉

  5. HilbertAstronaut,

    OpenCL isn’t intended to be an “NVIDIA killer” at all. You answered your own question when you mentioned, “CUDA is pretty easy if… but it takes a lot of effort…”

    If you look at the history of Apple's API development, they tend to emphasize making technologies so compelling and easy to use that developers actually do use them, eagerly. For example, look at how Apple lets developers take advantage of the OpenGL hardware acceleration built into most Macintosh computers through the Core Image and Core Animation APIs, without those developers needing to learn much about OpenGL at all.
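
For readers wondering what "GPU compute without learning the GPU" means in practice: C-style GPU host APIs of the CUDA/OpenCL sort all follow the same pattern (copy data to the device, run a small kernel over it in parallel, copy the results back). Below is a minimal vector-add sketch using OpenCL-style calls. Error handling and cleanup are omitted, and since Apple hasn't published a spec, treat the specific calls as illustrative of the pattern rather than as a confirmed API:

```c
/* Sketch of GPU host code: add two arrays on the graphics processor. */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

/* The kernel: the only part that actually runs on the GPU. */
static const char *src =
    "__kernel void vadd(__global const float *a,\n"
    "                   __global const float *b,\n"
    "                   __global float *c) {\n"
    "    size_t i = get_global_id(0);\n"
    "    c[i] = a[i] + b[i];\n"
    "}\n";

int main(void) {
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Boilerplate: find a GPU and set up a context and command queue. */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Compile the kernel source at runtime for this particular GPU. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to device memory; allocate space for the output. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                               sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    /* Launch N parallel work items, then read the result back. */
    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[3] = %.1f (expect 9.0)\n", c[3]);
    return 0;  /* Cleanup of GPU objects omitted in this sketch. */
}
```

Even this "simple" version shows why commenter 3 has a point: the bookkeeping around a three-line kernel is substantial, and hiding exactly that bookkeeping is what a Core Image-style abstraction would be for.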
