Multi-Core

What became of the problems of multi-core programming?

SAN FRANCISCO--As the Intel Developer Forum gets under way this week, one hardly unexpected theme of CEO Paul Otellini's keynote address was that Moore's Law continues. Ivy Bridge, Intel's upcoming 22-nanometer processor platform, is slated for 2012. This continuation of Moore's Law means that a given area of silicon will contain more transistors.

Until relatively recently, more transistors mapped more or less directly to faster processor performance. That's because the additional transistors were primarily used to boost processor frequency and expand fast local cache memory--changes that were largely invisible to software. However, beginning around the middle … Read more

Pervasive takes on multicore programming

Writing software that can simultaneously make use of multiple processors can be hard. Yet the advent of multicore processors--four cores per chip is now common--means that more and more software needs to do just that.

With processor performance gains now coming mostly from the ability to handle more execution threads, rather than from running individual threads faster, multithreaded programming, in one form or another, is pretty much the only path to writing faster software going forward.

Researchers and developers are tackling this issue from a lot of different angles, including new languages and a greater focus on multithreaded programming in computer … Read more
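The excerpt cuts off there, but the problem it describes is easy to make concrete. Below is a minimal sketch of my own (not from the article, and not Pervasive's approach) of the kind of multithreaded code in question: summing a large array by splitting it across however many hardware threads the machine reports, using C++11's std::thread.

```cpp
// A minimal sketch of splitting work across cores with C++11
// std::thread: summing a large array in parallel, one slice per thread.
#include <cstddef>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;   // 16M elements
    std::vector<double> data(n, 1.0);

    // One worker per hardware thread the CPU reports.
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;   // fallback if the count is unknown

    std::vector<double> partial(workers, 0.0);
    std::vector<std::thread> threads;

    for (unsigned t = 0; t < workers; ++t) {
        threads.emplace_back([&, t] {
            // Each thread sums its own contiguous slice and writes only
            // to its own partial[t] slot, so no locking is needed.
            std::size_t begin = n * t / workers;
            std::size_t end = n * (t + 1) / workers;
            partial[t] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& th : threads) th.join();

    // Combine the per-thread results on the main thread.
    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::cout << "sum = " << total << "\n";  // expect 16777216
}
```

Even this toy shows why such code can be hard to write: the work has to be partitioned by hand, and correctness rests on each thread writing only to its own slot; get the slicing or the sharing wrong, and the bug may surface only under load.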

Intel: Use our CPU (not their GPU) for games

Intel is back, pitching its processors for gaming graphics.

The chipmaker will attempt to promote its silicon for sophisticated game effects at the upcoming Game Developers Conference in March, as it strives to make a case for quad-core processors in lieu of graphics chips from Nvidia and Advanced Micro Devices.

The pitch goes like this: "Learn how to easily add real-time 3D smoke, fog and other fluid simulations to your game without using up the GPU." That's according to an Intel Web page entitled Intel at Game Developers Conference. (The CPU is the central processing unit, or … Read more
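Intel's page promises sessions rather than source code, but the idea behind the pitch can be sketched: keep the GPU free for rendering by running the effect simulation on spare CPU cores. Here is a hypothetical toy version (mine, not Intel's): a diffusion pass over a 2D smoke-density grid, split across hardware threads by row bands.

```cpp
// A toy sketch (not Intel's code) of the pitch: run a fluid-style
// simulation step on CPU threads so the GPU stays free for rendering.
// A simple diffusion pass over a smoke-density grid is split across
// hardware threads by row bands.
#include <thread>
#include <vector>

void diffuse(const std::vector<float>& src, std::vector<float>& dst,
             int width, int height) {
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 4;  // fallback if the count is unknown
    std::vector<std::thread> threads;

    for (unsigned t = 0; t < workers; ++t) {
        threads.emplace_back([&, t] {
            // Each thread updates its own band of interior rows. Reads
            // come from src and writes go to dst, so no locking is
            // needed within a single pass.
            int y0 = 1 + (height - 2) * t / workers;
            int y1 = 1 + (height - 2) * (t + 1) / workers;
            for (int y = y0; y < y1; ++y)
                for (int x = 1; x < width - 1; ++x)
                    dst[y * width + x] = 0.25f *
                        (src[y * width + x - 1] + src[y * width + x + 1] +
                         src[(y - 1) * width + x] + src[(y + 1) * width + x]);
        });
    }
    for (auto& th : threads) th.join();
}

int main() {
    const int w = 256, h = 256;
    std::vector<float> a(w * h, 0.0f), b(w * h, 0.0f);
    a[(h / 2) * w + w / 2] = 1.0f;  // seed a puff of "smoke" in the center
    for (int step = 0; step < 100; ++step) {
        diffuse(a, b, w, h);        // one simulation step per frame
        a.swap(b);
    }
}
```

Each cell depends only on last frame's neighbors, so the rows can be carved up freely; that data-parallel shape is exactly what makes this kind of workload a plausible fit for the idle cores of a quad-core chip.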

Five riffs on EmTech08

I spent the past couple of days attending Technology Review's EmTech08 conference at MIT. Lots of interesting speakers and ideas, some in areas of tech that I follow day-to-day (such as cloud computing) and others that I follow more as an interested observer (alternative fuels, open voting systems). In many respects, it was a refreshing change of pace from the events I commonly attend, which tend to be more focused on today's immediate IT concerns.

EmTech08 gave me lots to mull--and I'll roll that mulling into more in-depth pieces down the road. For today, … Read more

Microsoft, Intel to sponsor multicore development research

Correction: The Microsoft and Intel press conference is scheduled for Tuesday.

Microsoft and Intel on Tuesday are expected to launch a joint research initiative to tackle programming for multicore processors.

The two PC industry giants sent out a media alert saying that they would host a teleconference to announce the research venture.

The Wall Street Journal on Monday reported that the venture will focus on multicore programming and that the bulk of the work will be done at the University of California at Berkeley.

The need for more research stems from the emergence of processors with two or more processing … Read more

CPU: The future of the GPU?

For those who play PC games (and please count me in), the most expensive and necessary investment has always been the graphics card, built around the GPU, or graphics processing unit. High-end cards, from either ATI or Nvidia, can cost $500 and up. That's not even factoring in the case, cooling system, power supply, etc., which have to be equally high-end to support the increasingly large and power-hungry graphics cards. And there seems to be no end to all this. Or is there?

At IDF 2007, there was a demo running Quake 4. There wasn't much to … Read more