Big Data: how big is big?

Size

When it comes to Big Data, there is no getting away from a discussion of the size of the data. Size is, after all, one of the three V’s (Volume):

“Big data” is high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making. ~Gartner

The interesting part about the Volume of data is not just the sheer amount of it that we generate on a daily basis, but the fact that, thanks to the falling cost of storage, we are able to actually keep a good portion of it around.  This in turn gets us talking about size.  This is where people try to dazzle you with their intellect.  For example, you may have heard of an Exabyte.  You may even know that an Exabyte is

1 EB = 10^18 bytes (a 1 followed by 18 zeros)

1,000,000,000,000,000,000 bytes
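If you would rather let a computer count the zeros, a quick sketch makes the arithmetic concrete (this uses decimal, powers-of-ten units, which is one of two common conventions; the variable names are mine):

```python
# Sanity-check the exabyte arithmetic using decimal (powers-of-ten) units.
GB = 10**9   # 1 gigabyte
EB = 10**18  # 1 exabyte

print(f"1 EB = {EB:,} bytes")     # the 1 followed by 18 zeros, with commas
print(f"1 EB = {EB // GB:,} GB")  # a billion gigabytes
```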

But seriously, even if you are a mathematician, the problem is that most people cannot visualize the size of data written in that format.

Is that important?   You bet it is.  One of the key attributes of a Data Scientist is the ability to convey the complex to non-technical people, to bridge the gap between those with the tools and those with the power (or budget) in an organization.  This may be a small example, but it is a good one: breaking down how big data is, in terms people can understand, matters.

So, all of that being said, I like to describe it in a way that even my Grandma can understand.  I just relate everything back to a GB.  From GB hard drives, to GB in your phone, to GB in your camera, most people can wrap their heads around the size because they are used to doing mental calculations based on how many songs, how many files, how many pictures they can hold on x number of GBs.

OK… So let’s start talking about big numbers now.

Terabyte = 1024 GB.

With the low cost of storage now, many of you may have Terabyte drives at home or in your computer.  I have a 2 TB drive that I carry around with me with EVERYTHING I need on it.  So this one is not much of a comprehension challenge.
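Since the whole ladder is just repeated multiplication by 1024, a small helper can translate any raw byte count into the friendliest unit. This is a sketch of my own (the function name and the binary 1024-based convention are my choices, not from any particular library):

```python
def humanize_bytes(n_bytes: int) -> str:
    """Express a byte count in the largest convenient binary (1024-based) unit."""
    units = ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]
    size = float(n_bytes)
    for unit in units:
        if size < 1024 or unit == units[-1]:
            return f"{size:,.1f} {unit}"
        size /= 1024

print(humanize_bytes(2 * 1024**4))  # the 2 TB drive from above -> "2.0 TB"
print(humanize_bytes(10**18))       # an SI exabyte -> "888.2 PB" in binary units
```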

Petabyte = Over 1 Million Gigabytes

OK, so one step up: over one million Gigabytes.  What does that get us?  Well, the 10 billion photos on Facebook take up about 1.5 Petabytes.  And Google processes around 20 Petabytes of data per day.
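Those two Facebook numbers let us do a fun back-of-the-envelope calculation of our own (decimal units assumed): if 10 billion photos occupy 1.5 PB, the average photo is only about 150 KB.

```python
photos = 10_000_000_000     # 10 billion Facebook photos (from the text)
total_bytes = 1.5 * 10**15  # 1.5 PB, in decimal (SI) units

avg = total_bytes / photos
print(f"{avg / 1000:.0f} KB per photo on average")  # 150 KB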

Exabyte = Over 1 Billion Gigabytes

When we get into Exabytes, we start talking about how much storage entire data center buildings hold.  Like the Utah Data Center built for the Comprehensive National Cybersecurity Initiative (CNCI).  Or the cold storage for those Facebook pictures we mentioned.

Zettabyte = Over a Trillion Gigabytes

It has been estimated that around 2 ZB of data is created every year.  We don’t store this much (we don’t have the capacity), but this much is created.  The prefix “zetta” was adopted by the 19th General Conference on Weights and Measures in 1991 (along with “yotta”).  So where is all of this data coming from?  See the infographic below.

[Infographic: Data in One Minute, by Domo.com]

Yottabyte = Over a Quadrillion (a Thousand Trillion) Gigabytes

Of course, if you are the NSA, you need even more storage.  Like the NSA’s secret (ha ha) data center in Utah. http://www.foxnews.com/tech/2013/06/13/what-we3-know-utah-nsa-mega-data-warehouse/

Imagine being able to comb through that data.   A Yottabyte is so big that you have to start talking about small things to put it in comparison.  For example, a yottabyte is roughly the number of atoms in just 1/7000th of a human body.  Or: it has been estimated that a Yottabyte of 1 TB hard drives would require a data center covering 1 million city blocks.
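As a final sanity check on that drive count (decimal units, and purely my own arithmetic): a yottabyte spread across 1 TB drives is a trillion drives.

```python
YB = 10**24  # 1 yottabyte (SI)
TB = 10**12  # 1 terabyte (SI)

drives = YB // TB
print(f"{drives:,} one-TB drives")  # 1,000,000,000,000 -- a trillion drives
```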