Hadoop is a framework for running applications on large clusters of commodity hardware. The Hadoop framework transparently provides applications with both reliability and data motion.
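Hadoop's best-known processing model is MapReduce: a map phase emits key/value pairs, and a reduce phase aggregates all values sharing a key. As a rough, single-process illustration (plain Java standard library, not the actual Hadoop API), word count — the canonical MapReduce example — can be sketched as:

```java
import java.util.*;
import java.util.stream.*;

public class WordCount {
    // Map phase: split each line into words (the emitted keys).
    // Shuffle + reduce phase: group equal words and sum their occurrences.
    static Map<String, Long> count(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        // Toy input standing in for files distributed across a cluster.
        List<String> lines = Arrays.asList("to be or not", "to be");
        System.out.println(count(lines));
    }
}
```

In real Hadoop the map and reduce steps run as separate tasks on the nodes holding the data, and the framework handles the shuffle, retries, and data placement transparently — which is what "reliability and data motion" refer to above.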
The lucky kids of JavaSchools are never going to get weird segfaults trying to implement pointer-based hash tables. They're never going to go stark, raving mad trying to pack things into bits. They'll never have to get their head around how, in a purely functional program, the value of a variable never changes.
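For readers who never had to do it, "packing things into bits" means squeezing several small values into one machine word with shifts and masks. A minimal sketch (a hypothetical `BitPack` helper, chosen here just for illustration) packing an RGB color into a single Java `int`:

```java
public class BitPack {
    // Pack three 8-bit channels into the low 24 bits of one int:
    // red in bits 16-23, green in bits 8-15, blue in bits 0-7.
    static int pack(int r, int g, int b) {
        return (r & 0xFF) << 16 | (g & 0xFF) << 8 | (b & 0xFF);
    }

    // Unpack a channel by shifting it down and masking off the rest.
    static int red(int c)   { return (c >> 16) & 0xFF; }
    static int green(int c) { return (c >> 8) & 0xFF; }
    static int blue(int c)  { return c & 0xFF; }

    public static void main(String[] args) {
        int c = pack(200, 100, 50);
        System.out.println(red(c) + " " + green(c) + " " + blue(c)); // 200 100 50
    }
}
```

Get a mask or a shift count wrong and the channels silently bleed into each other — exactly the kind of low-level bug the quote says JavaSchool graduates never learn to hunt down.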