Organizations today face data volumes measured in petabytes, and traditional systems struggle to store, let alone analyze, data at that scale. Apache Hadoop has given them a fundamentally new way to handle this exponential growth efficiently.
Archimedes, a healthcare modeling organization that uses publicly available clinical data to answer complex, vital healthcare questions for researchers, pharmaceutical companies and government agencies, adopted the Hadoop framework for distributed processing of large data sets across clusters of computers.
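To give a concrete sense of the distributed processing model involved, the sketch below shows a canonical Hadoop MapReduce job in Java (the classic word count). It is illustrative only, not Archimedes' actual code: mappers process splits of the input in parallel across the cluster, and reducers aggregate their intermediate results.

    // Illustrative Hadoop MapReduce job (word count); class and path names are
    // hypothetical and stand in for whatever analysis a real job would run.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mappers run in parallel across the cluster, one task per input split.
      public static class TokenizerMapper
          extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          for (String token : value.toString().split("\\s+")) {
            if (!token.isEmpty()) {
              word.set(token);
              context.write(word, ONE);  // emit (word, 1) for each token
            }
          }
        }
      }

      // The reducer aggregates the per-word counts emitted by all mappers.
      public static class IntSumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);  // pre-aggregate on each node
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // e.g. HDFS input dir
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // e.g. HDFS output dir
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }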
"Up until the time we had big data analytics, research using healthcare data took a significant amount of time and effort to analyze," stated Katrina Montinola, vice president of engineering at Archimedes.
However, the heavy cost of buying and managing a dedicated new cluster just to deploy Hadoop was a deterrent. Archimedes needed a scalable, reliable solution that could handle the growth the company anticipated, so it selected Univa Grid Engine, a distributed resource management platform, to overcome the bottleneck.
"Univa Grid Engine is enabling organizations to unleash the power of Hadoop framework,” said Gary Tyreman (News - Alert), CEO at Univa.
With Univa Grid Engine managing the open-source Hadoop framework, Archimedes built its ARCHeS Aggregator, enabling researchers and healthcare practitioners to turn data into decisions faster than ever before.
"With Univa Grid Engine, the complex analysis being completed by Archimedes' solutions can be done quickly and made available to researchers and physicians in a convenient format that is informative and efficient," added Montinola.
Archimedes thus built a scalable, efficient, cost-effective cluster architecture capable of supporting multiple applications, and reported that its deployment and operating costs had been cut by 50 percent.
More importantly, the company could run its Hadoop application on its existing compute infrastructure, eliminating the need for additional hardware. The approach also minimized the risks associated with open-source software and ensured that jobs were properly prioritized.
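For illustration only: under a Grid Engine-style scheduler, a Hadoop job is typically wrapped in a submission script so the resource manager can place and prioritize it alongside other workloads on shared machines. The sketch below uses standard Grid Engine directives, but the parallel environment name, slot count, job name and jar are hypothetical assumptions, not Archimedes' actual configuration.

    #!/bin/bash
    # Hypothetical Grid Engine submission script for a Hadoop job.
    #$ -N archimedes-analysis   # job name (hypothetical)
    #$ -cwd                     # run from the current working directory
    #$ -pe hadoop 32            # request 32 slots from a Hadoop-aware parallel
                                #   environment; the PE name is site-specific
    #$ -p -100                  # lower priority so interactive work is not starved

    hadoop jar analysis-job.jar com.example.AnalysisDriver input/ output/

Because the scheduler, not Hadoop itself, decides where and when the job runs, the same nodes can serve Hadoop and non-Hadoop workloads, which is what lets an organization reuse existing infrastructure rather than standing up a dedicated cluster.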
In related news, Attunity Ltd., a provider of information availability software, recently released Attunity Managed File Transfer for Hadoop, an enterprise data transfer solution designed to accelerate big data collection and move files seamlessly into and out of Hadoop.
Edited by Rachel Ramsey