Pinning down what qualifies as Big Data — and whether it’s an educational panacea or a corporate-driven technological phase — is sure to be at the center of educational technology discussions for years to come.
eCampus News assistant editor Jake New, in the first part of our “Higher Education’s Big (Data) Bang” series, explored the working definition of Big Data analytics, as understood by some of the field’s most prominent voices.
“Big Data is this exponential increase of information that’s been going on since the 1950s,” said Jim Spohrer, the director of Global University Relations Programs at IBM, a company that has partnered with campuses to drive the study and adoption of data analytics.
The varying definitions and understandings of Big Data, and what it might mean for higher education, were recently summarized in the MIT Technology Review, which examined survey results on how the term is defined.
Here are some of the various Big Data definitions — as told by data-centric organizations and corporations — mentioned in the Technology Review.
1. Gartner. In 2001, a Meta (now Gartner) report noted the increasing size of data, the increasing rate at which it is produced, and the increasing range of formats and representations employed. This report predated the term “big data” but proposed a three-fold definition encompassing the “three Vs”: Volume, Velocity, and Variety. This idea has since become popular and sometimes includes a fourth V: Veracity, to cover questions of trust and uncertainty.
2. Oracle. Big data is the derivation of value from traditional relational database-driven business decision making, augmented with new sources of unstructured data.
3. Intel. Big data opportunities emerge in organizations generating a median of 300 terabytes of data a week. The most common forms of data analyzed in this way are business transactions stored in relational databases, followed by documents, e-mail, sensor data, blogs, and social media.
4. Microsoft. “Big data is the term increasingly used to describe the process of applying serious computing power—the latest in machine learning and artificial intelligence—to seriously massive and often highly complex sets of information.”
5. The Method for an Integrated Knowledge Environment open-source project. The MIKE project argues that big data is not a function of the size of a data set but its complexity. Consequently, it is the high degree of permutations and interactions within a data set that defines big data.
6. The National Institute of Standards and Technology. NIST argues that big data is data which “exceed(s) the capacity or capability of current or conventional methods and systems.” In other words, the notion of “big” is relative to the current standard of computation.
How do you define Big Data? Share your thoughts with me at @eCN_Denny and @ecampusnews.