
Gartner Says as the Number of Processors Swells Inside Servers, Organizations May Not Be Able to Use All Processors Thrust on Them


Server Processor Counts Will Rise to Levels That Strain the Ability of Current Software to Make Effective Use of That Capability

STAMFORD, Conn., The relentless doubling of processors per microprocessor chip will drive the total processor counts of upcoming server generations well above the levels for which key software has been engineered, according to Gartner, Inc. Operating systems, middleware, virtualization tools and applications will all be affected, leaving organizations facing difficult decisions, hurried migrations to new versions and performance challenges.

“Looking at the specifications for these software products, it is clear that many will be challenged to support the hardware configurations possible today and those that will be accelerating in the future,” said Carl Claunch, vice president and distinguished analyst at Gartner. “The impact is akin to putting a Ferrari engine in a go-kart; the power may be there, but design mismatches severely limit the ability to exploit it.”

On average, each chip generation, arriving roughly every two years, doubles the number of processors per chip through some combination of more cores and more threads per core, turning the same number of sockets into twice as many processors. Thus a 32-socket, high-end server with eight-core chips in the sockets would deliver 256 processors in 2009. In two years, with 16 processors per socket appearing on the market, the same machine swells to 512 processors in total. Four years from now, with 32 processors per socket shipping, that machine would host 1,024 processors.
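The scaling arithmetic above can be sketched in a few lines. This is a hypothetical illustration of the release's numbers, not code from Gartner; the socket count and per-socket figures are taken directly from the example in the text.

```python
# Total processors = sockets * processors per socket, with the
# per-socket count doubling roughly every two years.
sockets = 32
procs_per_socket = 8  # eight-core chips, 2009

totals = []
for year in (2009, 2011, 2013):
    totals.append((year, sockets * procs_per_socket))
    procs_per_socket *= 2

print(totals)  # [(2009, 256), (2011, 512), (2013, 1024)]
```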

Gartner said that organizations need to take heed of the issue because there are real limits on the ability of the software to make use of all those processors. “Most virtualization software today cannot use all 64 processors, much less the 1,024 of the high-end box, and database software, middleware and applications all have their own limits on scalability,” Mr. Claunch said. “There is a real risk that organizations will not be able to use all the processors that are thrust on them in only a few years’ time.”

Mr. Claunch said that the software running today’s servers has both hard and soft limits on the number of processors it can effectively handle. Hard limits are often documented by the vendor or creator of the product and are therefore relatively easy to discover; they stem from implementation details inside the software that stop it from handling more processors. For example, an operating system might use an eight-bit field to hold the processor number, imposing a hard limit of 256 processors. Soft limits, by contrast, are uncovered only through word of mouth and real-world cases. They are caused by the characteristics of the software’s design, which may deliver poor incremental performance or, in many cases, a decrease in useful work as more processors are added.
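The eight-bit example can be made concrete with a quick calculation. This is an illustrative sketch of the limit described in the text, not any particular operating system's implementation.

```python
# An 8-bit processor-number field can hold values 0..255, so at
# most 2**8 = 256 processors are addressable -- a hard limit
# baked into the implementation.
FIELD_BITS = 8
hard_limit = 2 ** FIELD_BITS
print(hard_limit)  # 256
```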

The soft limit for software is often noticeably below the hard limit, meaning that overheads and inefficiencies seriously diminish the value of large processor counts that may technically fall within the software’s supported configurations.
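The release does not cite a formula for this diminishing return, but a standard speedup model, Amdahl's law, illustrates why adding processors past a soft limit yields little useful work. The 95% parallel fraction below is an assumed figure chosen only for illustration.

```python
# Amdahl's law: with a fixed serial fraction of the workload,
# speedup on n processors flattens out as n grows.
def speedup(n, parallel_fraction=0.95):
    """Speedup on n processors when parallel_fraction of the work scales."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# Sixteen times as many processors buys only a modest gain.
print(round(speedup(64), 1))    # 15.4
print(round(speedup(1024), 1))  # 19.6
```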

“There is little doubt that multicore microprocessor architectures are doubling the number of processors per server, which in theory opens up tremendous new processing power,” concluded Mr. Claunch. “However, while hard limits are readily apparent, soft limits on the number of processors that server software can handle are learned only through trial and error, creating challenges for IT leaders. The net result will be hurried migrations to new operating systems in a race to help the software keep up with the processing power available on tomorrow’s servers.”

Additional information is available in the Gartner report “The Impact of Multicore Architectures on Server Scaling.” The report is available on Gartner’s Web site at

Gartner, Inc. (NYSE: IT) is the world’s leading information technology research and advisory company. Gartner delivers the technology-related insight necessary for its clients to make the right decisions, every day. From CIOs and senior IT leaders in corporations and government agencies, to business leaders in high-tech and telecom enterprises and professional services firms, to technology investors, Gartner is the indispensable partner to 60,000 clients in 10,000 distinct organizations. Through the resources of Gartner Research, Gartner Consulting and Gartner Events, Gartner works with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Connecticut, U.S.A., and has 4,000 associates, including 1,200 research analysts and consultants in 80 countries. For more information, visit


This news content was configured by WebWire editorial staff. Linking is permitted.
