Apache Spark is a lightning-fast cluster computing framework designed for fast computation.
It extends the Hadoop MapReduce model to efficiently support more types of computations, including interactive queries and stream processing.
This is a brief tutorial that explains the basics of Spark Core programming.
List of Topics:
Apache Spark - Introduction
Apache Spark - RDD
Apache Spark - Installation
Apache Spark - Core Programming
Apache Spark - Deployment
Advanced Spark Programming