The web is getting smarter. And just as the world begins to figure the internet out, it changes. The internet of a decade ago is nothing like today's version, and today's version will be antiquated within five years, ChicagoNow reports. The reason: big data.

Big data is a loosely used term that roughly describes the extremely large volumes of complex data that avalanche the web daily. This data is currently measured in petabytes and exabytes (the latter equal to a 1 followed by 18 zeros). Furthermore, statistics suggest the net will reach 1.3 zettabytes of data in 2016.
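To put those units in perspective, here is a minimal sketch (using the standard decimal SI prefixes; the variable names and the byte conversions are illustrative, not from the article):

```python
# Decimal (SI) data units referenced above, in bytes.
PETABYTE = 10**15    # 1 followed by 15 zeros
EXABYTE = 10**18     # 1 followed by 18 zeros
ZETTABYTE = 10**21   # 1 followed by 21 zeros

# The 1.3-zettabyte figure cited for 2016, expressed in smaller units.
forecast = 1.3 * ZETTABYTE
print(f"1.3 ZB = {forecast:.2e} bytes")
print(f"1.3 ZB = {round(forecast / EXABYTE):,} exabytes")
```

Each step up the scale is a factor of 1,000, so 1.3 zettabytes works out to 1,300 exabytes, or 1.3 million petabytes.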

However, all of this was predicted decades ago by data miners. They knew that huge volumes of information would engulf the web and that there would be a market for finding the hidden gems in big data.

Before the advent of wikis and the dot-com bust, it was understood that big data was going to be a different animal. At the turn of the century, the internet's place in economic markets was solidified, and big data was beginning to take shape.

Speaking about the adolescent years of the web, Rich Spitzer of TrendPointers, Inc. relates, "There was a trajectory of information circulating; it was the old connect-the-dots."


About the Author:

eCampus News staff and wire reports

