Boeing And Sun Microsystems Federal To Launch Open Architecture To Solve Extreme Data Computing Issues


Boeing [NYSE: BA] and Sun Microsystems Federal, a wholly owned subsidiary of Sun Microsystems, Inc. [Nasdaq: SUNW], have announced plans to launch an industry-leading open architecture that will enable organizations to collect, process and store massive amounts of data at extremely high speeds.

Capable of processing more than 10 gigabits of data per second, the joint solution is designed to foster data analysis, sharing and decision making for a variety of markets, including government, life science, energy, education, aerospace, entertainment and media. Ten gigabits per second is equivalent to processing 250 copies of the complete works of Shakespeare or 125 chest x-rays in one second.
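The release's equivalences can be checked with a quick back-of-the-envelope calculation. The sketch below assumes a plain-text copy of Shakespeare's complete works is roughly 5 MB and an uncompressed digital chest x-ray roughly 10 MB; neither size is stated in the release.

```python
# Sanity check of the 10 gigabit-per-second equivalences.
# Assumed sizes (not given in the release):
#   - complete works of Shakespeare, plain text: ~5 MB
#   - one uncompressed digital chest x-ray:     ~10 MB

GIGABIT = 10**9        # bits
MB_IN_BITS = 8 * 10**6  # bits per decimal megabyte

throughput = 10 * GIGABIT  # 10 gigabits per second

shakespeare_mb = 5
xray_mb = 10

copies_per_second = throughput / (shakespeare_mb * MB_IN_BITS)
xrays_per_second = throughput / (xray_mb * MB_IN_BITS)

print(copies_per_second)  # 250.0
print(xrays_per_second)   # 125.0
```

With those assumed file sizes, the arithmetic matches the release's figures of 250 copies and 125 x-rays per second.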

“This collaboration will enable us to provide images to our geospatial and intelligence customers faster, more cost effectively and in a higher resolution than ever before,” said Brian Knutsen, general manager of Boeing Mission Systems, Boeing’s center for data collection, imaging and archiving. “With 50 years of service to the National Geospatial-Intelligence Agency, Boeing remains committed to this important customer community and is pleased to leverage Sun’s expertise in open architectures in our products and services.”

“More and more organizations need a computing architecture that enables real-time access to data at speeds of 10 gigabits per second,” said Evan Harrigan, principal engineer at Sun Microsystems. “Sun and Boeing have designed an architecture that provides both real-time access and redundancy, eliminating a single point of failure.”

The architecture addresses the computing demands of several data-intensive tasks, including operational intelligence and surveillance, epidemic trend analysis and prediction, failure analysis of aircraft and ships, predictive traffic management, weather and ocean forecasting, and virtual design.

Target applications for the 10 gigabit technology include experimental analyses and simulations in scientific disciplines such as high-energy physics, climate modeling, earthquake engineering, astronomy, human genomics and the development of nano-scale electronic devices. In such applications, massive datasets must be shared by a community of hundreds or thousands of researchers distributed worldwide.

These researchers need to be able to transfer large subsets of these datasets to local sites or other remote resources for processing. The success and continued proliferation of such advanced data sharing depends heavily on high-performance data acquisition, transfer and storage for real-time data collection, processing, visualization and simulation.

Building the new architecture on open standards enables fast connections to external data sources and allows the architecture to be reused across organizations and industries while reducing technology investment.



News Release Distribution and Press Release Distribution Services Provided by WebWire.