Hadoop: How Open Source can Whittle Big Data to Size
Added 3rd Mar 2012
In 2011 'Big Data' was, next to 'Cloud', the most dropped buzzword of the year. In 2012 Big Data is set to become a serious issue that many IT organisations across the public and private sectors will need to come to grips with.
The challenge essentially comes down to this: How do you store the massive amounts of often-unstructured data generated by end users and then transform it into meaningful, useful information?
One tool that enterprises have turned to for help with this is Hadoop, an open source framework for the distributed processing of large amounts of data.
Hadoop lets organisations "analyse much greater amounts of information than they could previously," says its creator, Doug Cutting. "Hadoop was developed out of the technologies that search engines use to analyse the entire Web. Now it's being used in lots of other places."
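The processing model Hadoop popularised, MapReduce, splits an analysis into a "map" phase that emits key-value pairs and a "reduce" phase that aggregates them, so the work can be spread across many machines. A minimal sketch of the idea, in plain Python rather than a real Hadoop job (the word-count task and the data are illustrative assumptions, and both phases run locally instead of on a cluster):

```python
from collections import defaultdict

def map_phase(documents):
    """Emit (word, 1) pairs, as a Hadoop mapper would for each input record."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    """Sum the counts per key, as a Hadoop reducer would after the shuffle."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["Big Data", "big data big deal"]
print(reduce_phase(map_phase(docs)))  # {'big': 3, 'data': 2, 'deal': 1}
```

In a real deployment the mappers and reducers run in parallel on different nodes, each reading its slice of the data from Hadoop's distributed filesystem, which is what lets the same pattern scale to the Web-sized datasets Cutting describes.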
In January this year Hadoop finally hit version 1.0. The software is now developed under the aegis of the Apache Software Foundation.
"The releases coming this year will effectively become Hadoop 2.0," Cutting says. "We're going to see enhanced performance, high-availability and an increased variety of distributed computing metaphors to better support more applications. Hadoop's becoming the kernel of a distributed operating system for Big Data."
Hadoop grew out of Nutch, a project to build an open source search engine Cutting was involved in. Development of Nutch is also conducted under the patronage of the Apache Software Foundation.
"The Hadoop ecosystem now has more than a dozen projects around it," says Cutting. "This is a testament to the utility of the technology and its open source development model. Folks find it useful from the start. Then they want to enhance it, building new systems on top.
"Apache's community-based approach to software development lets users productively collaborate with other companies to build technologies from which they can all profitably share."
Hadoop setups are available from big names in the Cloud computing space, including Amazon (through Amazon Elastic MapReduce) and IBM; in December Microsoft announced a "limited preview" of Hadoop on its Windows Azure Cloud service. Hortonworks, a company set up by Yahoo (which runs a 42,000-node Hadoop environment and is a key driver of the project), and Cloudera, which employs Cutting as chief architect, also offer Hadoop-related services.
Cloudera offers a distribution of Big Data software called CDH -- Cloudera's Distribution Including Apache Hadoop. "This is open-source, Apache licensed software," Cutting says. "Folks can develop their applications against these APIs without fear of ever being locked into paying any one vendor."
The company sells support and a licence to its proprietary software, Cloudera Manager, which helps deploy and monitor CDH. The Oracle Big Data Appliance, released in January, runs CDH.
"Appliances are a great way to get a customer in the door, but most folks end up buying a customised cluster," Cutting says. "Some folks may find the appliance itself to be the right solution, but more frequently people want something that's more suited to their particular uses.
"Folks tend to start with a small proof-of-concept system, perhaps 10 or 20 nodes. Once they've gained some experience with this then they have an idea of both how big their production system needs to be and what its bottlenecks are. This informs the balance of storage, compute, memory and networking that will serve them best.
"Over time, as workloads evolve and grow, folks may gravitate towards common configurations, but we're not yet seeing a lot of one-size-fits-all solutions."
Cutting says when he started Hadoop, which was named after his son's toy elephant, he didn't realise just how significant the project would end up being. "I thought it would probably be useful to lots of folks, but I didn't think much about how many or how they might use it," Cutting says. "I certainly didn't think that it would become the central component of a new paradigm for enterprise data computing."
However, the software is "ultimately the product of a community," he adds. "I contributed the name and parts of the software and am proud of these contributions. The Apache Software Foundation has been a wonderful home for my work over the past decade, and I am pleased to be able to help sustain it."
Cutting uses the example of a hypothetical large retailer to explain what Hadoop can do with an enterprise's data: "Instead of just being able to analyse national sales over the past month, with Hadoop it can analyse sales trends over many years. This lets them better manage pricing, inventory and other core aspects of their business: They get a higher resolution picture of their business.
"Similarly, credit card companies can better guess whether a transaction is fraudulent, banks can better guess whether someone is credit worthy, oil companies can better guess where to drill, and so on. In nearly every case they can use data they were formerly discarding to improve the quality and profitability of their products."
Cutting predicts continued exponential growth in Big Data analytics. "We're still in the steep part of the adoption curve and will be for at least a few more years," he says.
"It will be a while before growth merely tracks that of the larger economy. Developing economies like China and India will fuel continued growth in this space."
In the government sphere, adoption of Big Data technologies has been mixed, Cutting says: Intelligence communities have been early adopters, but other parts of government may not have even begun grappling with it.
"Even folks who are already using these technologies will continue to expand their use for years, incorporating data from new sources and finding new applications," he adds. "We're still at an early stage of the adoption curve.
"Most industries are currently dipping their toes into Big Data. The ones to watch are the industries we expect to grow the most. For example, healthcare and telecom create huge amounts of data that's not yet used as effectively as it could be."
"China, India and Australia are generating an increasing number of fast-growing, successful technology companies and are challenging Taiwan as a technology hub in the region," said Ichiro Nakayama, DTTL leader, Technology Fast 500 Asia Pacific program.
This growth was at 36.3% for this year and will hold strong at these levels over the next 4-5 years, according to a newly released report by IDC.
Mobile devices generated 20% of the world's browsing activity last month, the first time that the surging category reached the 1-in-5 milestone, according to StatCounter, a Web analytics company.
Microsoft sold out the Dell Venue 8 Pro tablet within minutes of kicking off an online discount Monday morning.
One of the biggest drivers for change in 2013 was the rapid growth of unstructured data, according to CommVault A/NZ area vice president, Bryan Stibbard.
Hewlett-Packard released new "converged systems" that aim to get customers up and running quickly with virtualized applications and big-data analytics.
Summary: Samsung launched the 840 EVO mSATA (mini-Serial ATA) Solid State Drive (SSD) line-up, which includes the industry's highest capacity mSATA SSD ultra-thin notebooks.
Verizon has signed an agreement to acquire EdgeCast Networks, in an effort to enhance its video delivery and Web services capabilities.
It's hard to get a good job in IT these days, but it's all too easy to lose one.
Microsoft has waffled over the past few days about how long it will continue to sell Windows 7, initially stating that it had already stopped shipping the operating system to retailers and OEMs, but shifting the status over the weekend to "to be determined."
Citrix has started shipping a slimmed-down version of its GoToMeeting online meeting and video conferencing product that had been in beta testing since September.
Akamai Technologies today has agreed to acquire Prolexic Technologies, a distributed denial-of-service (DDoS) mitigation services company, for $370 million.
Cisco Systems's third annual Global Cloud Index forecasts that global cloud traffic will more than quadruple, from 1.2 zettabytes in 2012 to 5.3 ZB in 2017.
The research firm notes that currency issues and a lack of reforms have slowed Indian IT spending growth in the current year.
The director of information architecture at agency Nomensa details the biggest design trend for 2014.