I got an email recently which asked the question, “If our world is really looking down the barrel of environmental catastrophe, how do I live my life right now?” This motto — shorthand for Gandhi’s…
Facebook revealed some big, big stats on big data to a few reporters at its HQ today, including that its system processes 2.5 billion pieces of content and 500+ terabytes of data each day. It’s pulling in 2.7 billion Like actions and 300 million photos per day, and it scans roughly 105 terabytes of data each half hour. Plus it gave the first details on its new “Project Prism”.