Enterprises today collect and generate more data than ever before. Hadoop is an open-source framework for running applications on large clusters of commodity hardware, designed for scalable, reliable storage and analysis of both structured and complex data. Ted Dunning, architect at MapR Technologies, recently presented “The Power of Hadoop to Transform Business,” a talk about the future of Hadoop. Integrating Hadoop with traditional IT makes a big difference in how companies can use scalable computing.
Supermicro is a hardware-focused company, providing end-to-end green computing solutions for enterprise IT, data center, cloud computing, HPC and embedded systems worldwide. It designs and develops turnkey pilot racks for getting started with Apache Hadoop. Leveraging its optimized servers and switches as a foundation, Supermicro has designed two turnkey racks, in 14U and 42U versions, to get anyone started.
Supermicro has focused on integration, remote systems and power management to shorten deployment and commissioning timeframes, giving developers quick and easy access to native hardware environments. I recently got the chance to speak with Charles Liang, CEO, and Tau Leng, VP and GM, at Supermicro about the company’s offerings and how it approaches Hadoop and big data.
Liang explained that Supermicro’s products focus on optimizing power consumption, density, performance and efficiency. Its Hadoop-optimized hardware differentiates the company from competitors, and its products emphasize not only data capacity and density, but also bandwidth performance. The company has a big opportunity in this space, as it spends a lot of time optimizing its hardware and solutions for Hadoop. According to Liang, Hadoop has become a cost-effective way to grow a business, and the industry will see more and more companies dedicating resources to this area.
Its open platform offers Hadoop developers access to a large-scale infrastructure for testing, refining and enhancing their big data analytics applications. Supermicro’s enterprise-class compute and storage systems offer an ideal platform for organizations looking to quickly implement or transition to Hadoop analytics, along with a flexible, cost-effective path to scalability as business needs evolve.
Its extensive line of products includes 1U and 2U rackmount servers optimized for Hadoop. Supermicro Hadoop systems feature thermal and power designs that deliver optimal performance-per-watt and performance-per-dollar.
Supermicro offers a series of products called the Twin Family of solutions, a set of building blocks for application optimization. Its patented Twin Architecture, with up to 16 DIMMs per node for highest performance, is the foundation of its most advanced server platforms in HPC/data center, cloud computing and enterprise IT applications. It also offers the FatTwin for Hadoop, which features a 4-node front-I/O design with support for the Intel Xeon processor E5-2600 family, up to 512GB of DDR3 1600MHz ECC registered DIMMs and built-in server management tools with a dedicated LAN port.
When it comes to enterprise IT and Hadoop, one of the most important considerations is power efficiency. Supermicro’s system easily outperforms others, and can save $600 over the lifespan of one server. To put that in perspective, Google has tens of thousands of servers in a single data center; for just one data center, that can add up to $4 million in power savings in one year.
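A quick back-of-the-envelope check shows how the per-server figure scales to that order of magnitude. The article gives only the $600-per-server lifetime savings and "tens of thousands of servers"; the server lifespan and fleet size below are illustrative assumptions, not Supermicro or Google figures.

```python
# Back-of-the-envelope estimate of annual power savings at data-center scale.
# Assumptions (illustrative): a 4-year server lifespan and a 27,000-server
# data center. Only the $600 lifetime savings comes from the article.

SAVINGS_PER_SERVER_LIFETIME = 600      # USD over one server's life (per the article)
SERVER_LIFESPAN_YEARS = 4              # assumed typical refresh cycle
SERVERS_PER_DATA_CENTER = 27_000       # assumed ("tens of thousands")

savings_per_server_per_year = SAVINGS_PER_SERVER_LIFETIME / SERVER_LIFESPAN_YEARS
annual_savings = savings_per_server_per_year * SERVERS_PER_DATA_CENTER

print(f"${annual_savings:,.0f} per year")  # → $4,050,000 per year
```

Under these assumptions the yearly savings land right around the $4 million the article cites; a shorter refresh cycle or a larger fleet moves the figure proportionally.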
To learn more about Supermicro and its Hadoop offerings, visit www.supermicro.com/hadoop.
Edited by Allison Boccamazzo