Big Data refers to collections of data sets so large that they become difficult to process with traditional data processing applications. The associated tasks include analysis, capture, curation, search, sharing, storage, transfer, and visualization. Big Data spans scales from kilobytes and megabytes up through gigabytes, terabytes, petabytes, exabytes, zettabytes, and yottabytes. A generally accepted definition of Big Data is "data so large that it becomes difficult to process using traditional systems." Whether a data set counts as "big" therefore depends on the capabilities of the system analyzing it. Big Data results in large, rapidly growing files arriving at high speed in many formats, and it is usually characterized by three properties: velocity, volume, and variety. Velocity is the high speed at which data arrives; volume is the sheer size of the resulting files; and variety refers to the many formats in which the files appear. Big Data is used to support decision making: it is a way to identify challenges and opportunities and to extract meaningful information through analysis.
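As a small illustration of the scale these units represent, the sketch below (assuming the binary convention, where each unit is 1,024 times the previous one; the decimal convention uses powers of 1,000 instead) computes the size of each unit in bytes:

```python
# Illustrative sketch: the storage units mentioned above, binary convention.
UNITS = ["Kilobyte", "Megabyte", "Gigabyte", "Terabyte",
         "Petabyte", "Exabyte", "Zettabyte", "Yottabyte"]

def unit_in_bytes(name):
    """Return the size of the named unit in bytes (powers of 1,024)."""
    return 1024 ** (UNITS.index(name) + 1)

for name in UNITS:
    print(f"1 {name} = {unit_in_bytes(name):,} bytes")
```

The gap between a terabyte and a petabyte alone is a factor of 1,024, which conveys why tools built for gigabyte-scale files struggle at petabyte scale.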