By Clare O’Gara
Mon | Nov 4, 2019 | 7:42 AM PST

The new National Institute of Standards and Technology Big Data Interoperability Framework has us talking a big game.

A big game about securing Big Data, that is.

Security and privacy differences in Big Data

In the new framework, NIST cites several ways in which securing Big Data differs from traditional security and privacy implementations.

NIST summarizes the difference like this:

"Big Data is increasingly stored on public cloud infrastructure built by employing various hardware, operating systems, and analytical software. Traditional security approaches usually addressed small-scale systems holding static data on firewalled and semi-isolated networks."

Now let's look at some helpful specifics.

Specific challenges for Big Data security and privacy

NIST created a list of eight major characteristics that set Big Data projects apart, making these projects a security and privacy challenge:

  1. Big Data projects often encompass heterogeneous components in which a single security scheme has not been designed from the outset.
  2. Most security and privacy methods have been designed for batch or online transaction processing systems.
  3. The use of multiple Big Data sources not originally intended to be used together can compromise privacy, security, or both (a short sketch after this list illustrates the risk).
  4. A huge increase in the number of sensor streams for the Internet of Things creates vulnerabilities in the Internet connectivity of the devices, in the transport, and in the eventual aggregation.
  5. Certain types of data thought to be too big for analysis, such as geospatial and video imaging, will become commodity Big Data sources.
  6. Issues of veracity, context, provenance, and jurisdiction are greatly magnified in Big Data.
  7. Volatility is significant because Big Data scenarios envision that data is permanent by default.
  8. Data and code can more readily be shared across organizations, but many standards presume management practices that are confined to a single organizational framework.
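
Item 3 is worth a concrete illustration. The sketch below is hypothetical (the datasets, fields, and names are invented), but it shows the basic mechanics of a linkage attack: two sources that look harmless on their own can re-identify people once they are joined on shared quasi-identifiers.

```python
# Hypothetical illustration of item 3: two data sources that were never
# meant to be combined can compromise privacy when joined on shared
# quasi-identifiers (ZIP code and birth date here). All records are invented.

health_records = [  # released without names, treated as "de-identified"
    {"zip": "02138", "birth_date": "1945-07-01", "diagnosis": "hypertension"},
    {"zip": "60614", "birth_date": "1988-03-12", "diagnosis": "asthma"},
]

voter_rolls = [  # a separate, public source that does carry names
    {"name": "J. Smith", "zip": "02138", "birth_date": "1945-07-01"},
    {"name": "A. Jones", "zip": "60614", "birth_date": "1988-03-12"},
]

# Index the named records by their quasi-identifiers, then look up each
# "anonymous" health record. A match links a name to a diagnosis.
by_quasi_id = {(v["zip"], v["birth_date"]): v["name"] for v in voter_rolls}

for record in health_records:
    key = (record["zip"], record["birth_date"])
    if key in by_quasi_id:
        print(f'{by_quasi_id[key]} -> {record["diagnosis"]}')
```

At Big Data scale, the same join runs across billions of rows and far more than two sources, which is exactly why NIST flags combined data sources as a privacy risk.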

The need for specifics: the rise of Big Data

Maybe you are a data scientist trying to get a handle on Big Data security and privacy, or perhaps you are a cybersecurity leader trying to do the same.

Either way, expect record amounts of Big Data to secure. According to NIST:

"Data generation is expected to double every two years to about 40,000 exabytes in 2020. It is estimated that over one-third of the data in 2020 could be valuable if analyzed.

Less than a third of data needed protection in 2010, but more than 40 percent of data will need protection in 2020."
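
To get a feel for that doubling claim, here is a back-of-the-envelope sketch. It assumes a clean doubling every two years ending at roughly 40,000 exabytes in 2020; the curve is purely illustrative, not NIST's underlying model.

```python
# Rough projection of the "doubles every two years" claim, anchored to
# roughly 40,000 exabytes (EB) in 2020. Illustrative arithmetic only.

volume_2020_eb = 40_000

for year in range(2010, 2021, 2):
    doublings_left = (2020 - year) // 2
    volume_eb = volume_2020_eb / (2 ** doublings_left)
    print(f"{year}: ~{volume_eb:,.0f} EB")
```

Run it and 2010 comes out around 1,250 exabytes, which makes the jump to 40,000 exabytes by 2020 easier to picture.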

All the more reason to check out the new NIST framework for Big Data, and its Security and Privacy section in particular.

[RELATED: 5 Things to Know About the NIST Cybersecurity Framework]

Tags: Big Data, NIST