You might be asking yourself: why would NASA need a network with speeds of 100 gigabits per second (Gbps)? The answer is big data.
When data becomes so large and complex that it cannot be handled by a standard IT system, it is classified as big data. One example of the scale the space agency will be dealing with is the Square Kilometer Array telescope: expected to be fully operational by 2016, it will generate 700 TB of data per second. If the network cannot handle many Gbps, NASA will experience the mother of all bottlenecks.
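To put those figures side by side, here is an illustrative back-of-the-envelope calculation (not from the article) converting the telescope's projected output in terabytes per second into gigabits per second, using decimal units (1 TB = 8,000 Gb):

```python
def tb_per_s_to_gbps(tb_per_s: float) -> float:
    """Convert terabytes/second to gigabits/second (decimal units: 1 TB = 8,000 Gb)."""
    return tb_per_s * 8_000

ska_output_gbps = tb_per_s_to_gbps(700)  # 700 TB/s figure from the article
link_capacity_gbps = 100                 # the demonstrated link speed

print(f"Telescope output: {ska_output_gbps:,.0f} Gbps")
print(f"Link capacity:    {link_capacity_gbps} Gbps")
print(f"Mismatch:         {ska_output_gbps / link_capacity_gbps:,.0f}x")
```

Even a 100 Gbps link carries only a small fraction of that raw output, which is why such data is heavily processed and reduced before transport.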
The demonstration will be held at SC12, the supercomputing conference. The system was developed by NASA's Goddard Space Flight Center and Ames Research Center.
cPacket's role will be to provide real-time monitoring of the networks, measuring the performance of distributed applications, cloud infrastructure and networks with its cVu traffic monitoring switches. This will help NASA identify any problems with the system during the demonstration.
“High speed networks are important for our efforts to expand our supercomputing and cloud infrastructure. These higher speed networks are mission critical and it is important to monitor them in real time to identify any performance bottlenecks or application behavior anomalies, which interfere with our mission,” said Paul Lang, senior networking engineer in the High End Computer Networking (HECN) Group at NASA Goddard Space Flight Center.
NASA, along with its partners, will conduct a live demonstration of a 100 Gbps file transfer between NASA Goddard in Greenbelt, MD and its booth at the conference, using both local and wide area networks (LAN and WAN).
The goal is to reach the 100 Gbps mark, surpassing the 60 Gbps achieved at last year's conference.
The technology is the work of the HECN team, whose goal is to give NASA's High End Computing Capability (HECC) project and the NASA Advanced Supercomputing (NAS) division a means of data transport suited to the volume of data created across the many research disciplines the agency pursues each year.
NASA relies on supercomputers not only to gather data, but also to run simulation experiments, such as predicting weather patterns and modeling other complex phenomena, that produce massive amounts of data. An effective transport method means data can be sent or received over long distances, including from outer space, without much delay.
Edited by Braden Becker