Developing Big-Data Applications with Apache Hadoop
Following is an extensive series of tutorials on developing Big-Data applications with Hadoop. Since each section includes
exercises and their solutions, the series can also be used as a self-paced Hadoop training course.
All the slides, source code, exercises, and exercise solutions are free for unrestricted use.
Click on a section below to expand its content. The few parts on IDE development and deployment use
Eclipse, but none of the actual code is Eclipse-specific. These tutorials assume that you already know Java; they move too fast
for readers without at least moderate prior Java experience. If you do not already know the Java language, please see
the Java programming tutorial series.
It is becoming increasingly common to have data sets that are too large to be handled by traditional databases, or by any technique
that runs on a single computer or even a small cluster of computers. In the age of Big Data, Hadoop has emerged as the library of choice for processing such data sets.
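To give a feel for the programming model the tutorials cover, here is a minimal plain-Java sketch of Hadoop's signature example, word count, expressed in MapReduce style. This is not Hadoop code: the class and method names are hypothetical, and the grouping that Hadoop's framework performs during the shuffle phase is simulated in ordinary collections.

```java
import java.util.*;
import java.util.stream.*;

// Plain-Java simulation of the MapReduce word-count pattern (no Hadoop APIs).
// The "map" phase emits (word, 1) pairs; the "reduce" phase, which in real
// Hadoop runs after the framework groups pairs by key, sums the counts.
public class WordCountSketch {
    // Map phase: split one input line into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\W+"))
                .filter(w -> !w.isEmpty())
                .map(w -> Map.entry(w, 1))
                .collect(Collectors.toList());
    }

    // Shuffle + reduce phase: group the pairs by word and sum their counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs) {
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = List.of("to be or not to be",
                                     "to see or not to see");
        List<Map.Entry<String, Integer>> mapped = new ArrayList<>();
        for (String line : input) {
            mapped.addAll(map(line));
        }
        // Prints {be=2, not=2, or=2, see=2, to=4}
        System.out.println(reduce(mapped));
    }
}
```

In real Hadoop, the map and reduce steps become Mapper and Reducer classes, the input comes from HDFS rather than an in-memory list, and the framework distributes the work across the cluster; the later sections walk through that in detail.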
This tutorial gives a thorough introduction to Hadoop, along with many of the supporting libraries and packages. It also includes a free downloadable
virtual machine that already has Hadoop installed and configured, so that you can quickly write code and test it out.
See the "Source Code and Virtual Machine" section at the bottom of this tutorial.
Installing and configuring Hadoop is a tedious and time-consuming process, so we have provided an Ubuntu virtual machine with Hadoop already installed
(plus Java, Eclipse, and all the code from this tutorial and its associated exercises). This VM can be installed for free on any Windows,
macOS, Linux, or Solaris platform. Click on the link below for details.