
What does the Big Data term really mean?


The full definition with the 5 V's

Big data refers to a precisely defined kind of data set. In this article, I unpack the most recent and complete definition, which takes into account all the components this type of data involves.


Introduction

Big data is a relatively modern field of data science. It explores how large data sets can be broken down and analyzed in order to systematically extract insights and useful information. Conventional data processing solutions were never very efficient at capturing, archiving, and analyzing big data, yet corporate infrastructures increasingly manage ever larger and more complex data sets from different sources.

But what does big data really mean?

While many enterprises have invested in building data storage and aggregation infrastructure in their organizations, many do not realize that aggregating data alone does not add value.

What matters is what you do with the collected data. With the help of advanced data analysis, it is possible to derive useful information from the collected data. These insights are what add real value to decision making.

Companies with only traditional business intelligence solutions are therefore unable to maximize the value of their data. To truly understand what big data means, it is necessary to spell out the definition built on the 5 V's: volume, variety, velocity, veracity, and value.

Figure 1: The 5 V's of Big Data.

1 Volume

The volume of big data defines the amount of data that is produced. Today, data is generated in large quantities and from many sources: unstructured data from social feeds, clickstreams on web pages, mobile apps, and sensor-enabled equipment, for example. Many companies acquire tens of terabytes of data per day; for some, it can be hundreds of petabytes.

The development of recent open-source frameworks such as Apache Spark has been essential to the growth of big data, because they make data cheaper to store, easier to manage, and faster to analyze, as the sketch below suggests.
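As a minimal sketch, assuming PySpark is installed and using a hypothetical partitioned Parquet dataset with a `timestamp` column, a framework like Spark can aggregate data far larger than memory by reading it lazily and distributing the work:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("volume-example").getOrCreate()

# Hypothetical clickstream dataset stored as partitioned Parquet files.
# Spark reads it lazily, so nothing is loaded until an action is triggered.
events = spark.read.parquet("data/clickstream/")

# Count events per day; the aggregation runs in parallel across the cluster.
daily_counts = (
    events
    .groupBy(F.to_date("timestamp").alias("day"))
    .count()
)

daily_counts.show()
```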

With such high volumes, it is difficult for an enterprise to manage the data with conventional business intelligence methods. Companies must implement modern business intelligence infrastructures and tools to effectively capture, store, and process these unprecedented amounts of data, in some cases even in real time.

2 Variety

Big data also implies the processing of different types of data collected from multiple channels such as computer systems, networks, social media, and smartphones. This data is generally classified as structured, semi-structured, or unstructured. While structured data is data whose format, length, and volume are clearly defined, semi-structured data only partially conforms to a specific data format.

Unstructured data, on the other hand, is not organized and does not conform to traditional data formats. Data generated via digital media and social media can be classified as unstructured. In fact, nearly 80% of the data produced globally, including photos, videos, mobile data, and social media content, is unstructured in nature. The small example below contrasts the three categories.
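As an illustrative sketch in plain Python (the records and values are invented for the example), the three categories can be handled very differently:

```python
import json

# Structured: fixed schema, every record has the same fields and types.
structured_row = {"order_id": 1001, "amount_eur": 59.90, "country": "IT"}

# Semi-structured: self-describing (JSON), but fields may vary between records.
semi_structured = json.loads('{"user": "anna", "tags": ["sport", "travel"], "bio": null}')

# Unstructured: no predefined schema; meaning must be extracted by analysis.
unstructured = "Loved the new running shoes, delivery was fast!"

print(structured_row["amount_eur"])      # direct access by a known field
print(semi_structured.get("tags", []))   # fields may or may not be present
print(len(unstructured.split()))         # only basic processing without NLP
```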


3 Velocity

The speed with which data is generated, collected and analyzed has a direct impact on timely and accurate business decisions. Data should be captured as close to real-time as possible to make it available at the right time.

Normally, the highest data speed is obtained by keeping the contents directly in RAM (Random Access Memory) rather than writing them to hard disk. Often even a limited amount of data available in real time produces better business results than a large volume of data that takes a long time to capture and analyze.
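As a small sketch, assuming PySpark is available, Spark Structured Streaming can aggregate events as they arrive; the built-in `rate` source used here only generates synthetic rows in memory, so the example is self-contained:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("velocity-example").getOrCreate()

# The "rate" test source emits rows with a timestamp and a value column.
stream = spark.readStream.format("rate").option("rowsPerSecond", 100).load()

# Count events per 10-second window as they arrive, instead of waiting
# for a nightly batch job.
per_window = (
    stream
    .groupBy(F.window("timestamp", "10 seconds"))
    .count()
)

query = (
    per_window.writeStream
    .outputMode("complete")
    .format("console")
    .start()
)

query.awaitTermination(30)  # let the demo run for ~30 seconds
query.stop()
```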

4 Veracity

Because big data is vast and involves so many data sources, the possibility exists that not all data collected is of good quality or accurate in nature. Therefore, when processing large data sets, it is important that the validity of the data be verified before proceeding and relying on the information gathered.

In other words, it is essential to assess its validity and truthfulness. The veracity of big data is the guarantee of the quality and credibility of the information collected.
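As a minimal sketch with invented sample data, a few basic quality checks (missing values, duplicate identifiers, out-of-range amounts) can be run with pandas before trusting the numbers downstream:

```python
import pandas as pd

# Hypothetical sales data with typical quality problems.
df = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "amount_eur": [59.9, None, 120.0, -5.0],
})

checks = {
    "missing_amounts": int(df["amount_eur"].isna().sum()),
    "duplicate_ids": int(df["order_id"].duplicated().sum()),
    "negative_amounts": int((df["amount_eur"] < 0).sum()),
}

print(checks)  # {'missing_amounts': 1, 'duplicate_ids': 1, 'negative_amounts': 1}
```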


5 Value

Although data is produced in large volumes today, collection alone is of no use. We need to generate business insights to add real value to companies. In the context of big data, value refers to the extent to which data positively impacts a company's business. This is where big data analytics comes in.

One way to ensure that the value of big data is substantial and worth the investment of time and resources is to conduct a cost/benefit analysis. By calculating the total cost of processing big data and comparing it with the ROI that the business insights are expected to generate, companies can decide whether big data analytics will actually add value to their business.
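As a toy calculation (all figures are illustrative assumptions, not benchmarks), the comparison can be as simple as this:

```python
# Yearly costs and expected benefits of a hypothetical big data initiative (EUR).
infrastructure_cost = 120_000   # storage + compute
team_cost = 180_000             # analysts and engineers
expected_benefit = 450_000      # revenue gains + savings from insights

total_cost = infrastructure_cost + team_cost
roi = (expected_benefit - total_cost) / total_cost

print(f"Total cost: {total_cost} EUR, ROI: {roi:.0%}")  # Total cost: 300000 EUR, ROI: 50%
```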

Conclusions

Although big data is a reality today and volumes continue to grow, we are only beginning to exploit its usefulness. Cloud computing has further expanded the possibilities on offer, with truly elastic scalability that lets developers spin up ad hoc clusters to test a subset of the data.

Finding value in big data does not only mean managing it well, although that in itself is a big advantage. It means developing an entire process that requires experience and focused analysis in order to ask the right questions and formulate informed business hypotheses.

To learn more about the services I offer for this type of data, contact me directly from the contacts section. It will be a pleasure to provide you with further information useful to your business.

You may also be interested in reading the article How Parquet files save time and resources.

Published by: Nicola Lapenta

Photo by Markus Spiske on Unsplash

Credits: Berkeley Executive Education, Wikipedia