To manage risk, NIST 800-39 recommends organizing risk across “three tiers of organization, mission/business processes, and information systems” (NIST, 2011). To some extent, this risk framework assumes an organizational monoculture that may not be present for Big Data. Managing risk across organizations may prove to be the norm under certain cyber-physical system / IoT scenarios. []
13.30.1 PII as Requiring Toxic Substance Handling
Subsection: Need text.
Subsection: Need text.
13.30.3 Transparency Portal Scenarios
Subsection: Need text.
13.30.4 Cross-Organizational Risk Management
Subsection: Need text.
13.30.5 Algorithm-Driven Issues
Subsection: Need text.
13.30.6 Big Data Forensics and Operational AAR
Subsection Scope: Need text. AAR = After Action Review. Describe why big data requires potentially unique preparation for forensics and AAR.
13.31 Big Data Security Modeling and Simulation (ModSim)
Subsection: Needs references to NIST and other ModSim standards.
Penetration testing is accepted as a best practice for security professionals. However, penetration testing cannot detect many of the security problems that arise in practice, and its limitations grow as systems become more complex and multi-organizational.
More than a decade ago, Nicol called for increased “emulation, in which real and virtual worlds are combined to study the interaction between malware and systems” [20]. Such methods question the usual assumptions about attack surfaces: red teams typically focus on perimeter attacks. White hat efforts do not share this limitation, but they lack the tools needed to test internal what-if scenarios. ModSim, in addition to code walkthroughs and other methods, allows security threats to complex systems to be studied more systematically.
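To make the emulation idea concrete, the following minimal sketch runs a Monte Carlo simulation of compromise propagation across an internal service-dependency graph, the kind of internal what-if scenario that perimeter-focused red teaming does not exercise. The topology, component names, and spread probability are illustrative assumptions only; they are not drawn from NIST or any ModSim standard.

```python
# Minimal sketch of security-oriented simulation: propagating a compromise
# through an internal service-dependency graph rather than testing only the
# perimeter. All component names and probabilities are illustrative
# assumptions, not drawn from any standard or real system.
import random

# Directed edges: "A depends on B" means a compromise of B can spread to A.
DEPENDS_ON = {
    "web_frontend": ["auth_service", "api_gateway"],
    "api_gateway": ["user_db", "analytics"],
    "auth_service": ["user_db"],
    "analytics": ["data_lake"],
    "user_db": [],
    "data_lake": [],
}
SPREAD_PROBABILITY = 0.4  # assumed chance a compromise crosses one edge

def simulate(initial_compromise: str, trials: int = 10_000) -> dict:
    """Estimate how often each component is reached from an initial foothold."""
    reached_counts = {c: 0 for c in DEPENDS_ON}
    # Reverse the graph once: compromise flows from dependency to dependent.
    spreads_to = {c: [] for c in DEPENDS_ON}
    for dependent, deps in DEPENDS_ON.items():
        for dep in deps:
            spreads_to[dep].append(dependent)
    for _ in range(trials):
        compromised, frontier = {initial_compromise}, [initial_compromise]
        while frontier:
            node = frontier.pop()
            for neighbor in spreads_to[node]:
                if neighbor not in compromised and random.random() < SPREAD_PROBABILITY:
                    compromised.add(neighbor)
                    frontier.append(neighbor)
        for c in compromised:
            reached_counts[c] += 1
    return {c: n / trials for c, n in reached_counts.items()}

if __name__ == "__main__":
    # What-if scenario: an internal foothold in the data lake, not the perimeter.
    for component, rate in sorted(simulate("data_lake").items()):
        print(f"{component:15s} compromised in {rate:.1%} of trials")
```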
13.31.1 Safety Systems Modeling
Subsection Scope: Need text.
13.32 Security and Privacy Management Phases
Earlier versions of this standard did not distinguish design-time, in situ, and forensic (after-the-fact) considerations. This version explicitly addresses three phases for managing security and privacy in big data:
- Build Phase: The SnP Build Phase occurs while a system is being planned or under development (in the agile sense). In the straightforward case, the Build Phase takes place in a “green field” environment; however, significant Big Data systems will be designed as upgrades to legacy systems. The Build Phase typically incorporates the heaviest requirements analysis, relies most upon application domain-specific expertise, and is the phase during which most architectural decisions are made (Ryoo, Kazman, & Anand, 2015).
- Note: This phase is roughly analogous to the 800-53 Planning controls.
- Build phases that incorporate explicit models include the business model canvas. As Scott Shaw argued, “If architecture is the thing you want to get right from the start of your project, you should be modelling the business domain as the sequence of events that occur” (Lea, 2015).
- At the build phase, delegated access management approaches should be designed in, using, for example, two-way TLS, OAuth, OpenID, JSON Web Tokens, HMAC signing, or NTLM. Architects must consider compatibility with the big data stack of choice.
- The design pattern recommended for authorization is stateless, not using sessions or cookies; a minimal sketch of this pattern appears at the end of this subsection.
- In Situ Phase: This phase reflects a fully deployed, operational system. An in situ security scenario shares elements with operational intelligence and controls. In a small organization, operations management can subsume security operations. Development may be ongoing, as in an agile environment where code has been released to production. Microservices present “huge challenges with respect to performance of [an] overall integrated system” [21]. Regardless of the predecessor tasks, once released into production, a system faces security challenges in an arena shared with operations, including issues such as performance monitoring and tuning, configuration management, and other well-understood concepts. This relationship is discussed in more detail in Volume 6, “Management Fabric.”
- Decommissioned Phase: In its simplest form, this phase reflects a system that is no longer operational. For example, data from a (probably) decommissioned Radio Shack application was provided by the bankruptcy court to a third party. There is a more nuanced version of the Decommissioned Phase as well: “significant” changes to an existing application could be seen as a decommissioning. See also Gartner’s “Structured Data Archiving and Application Retirement” (Landers, Dayley, & Corriveau, 2016). This phase also includes design for forensic analytics.
In addition to prior work by Ruan et al. (Ruan & Carthy, 2013), the Cloud Security Alliance proposed a Cloud Forensics Capability Maturity Model. []
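The stateless authorization pattern noted in the Build Phase list above can be illustrated with a short sketch. The token below carries its own HMAC-signed claims, so the server keeps no session state. The key handling, claim names, and expiry shown are illustrative assumptions; a production system would use a vetted JWT library and managed keys rather than this hand-rolled form.

```python
# Minimal sketch of stateless authorization: an HMAC-signed token carries its
# own claims, so no server-side session store is required. Key handling,
# claim names, and expiry are illustrative assumptions.
import base64, hashlib, hmac, json, time

SECRET_KEY = b"replace-with-a-managed-secret"  # assumed to come from a KMS

def issue_token(claims: dict, ttl_seconds: int = 3600) -> str:
    payload = dict(claims, exp=int(time.time()) + ttl_seconds)
    body = base64.urlsafe_b64encode(json.dumps(payload).encode())
    signature = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + signature

def verify_token(token: str) -> dict | None:
    """Return the claims if the signature and expiry check out, else None."""
    try:
        body, signature = token.rsplit(".", 1)
    except ValueError:
        return None
    expected = hmac.new(SECRET_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):  # timing-safe comparison
        return None
    claims = json.loads(base64.urlsafe_b64decode(body))
    return claims if claims.get("exp", 0) > time.time() else None

token = issue_token({"sub": "analyst-17", "role": "read-only"})
print(verify_token(token))        # valid claims dict
print(verify_token(token + "x"))  # None: tampered signature
```

Because verification needs only the shared key, any service in the stack can authorize a request without consulting a session store, which is what makes the pattern attractive for horizontally scaled big data components.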
Modifications for Agile Methodologies
Subsection Scope: Need text.
Domain-Specific Security
The importance of domain-specific considerations was a key insight derived from the HL7 FHIR consent workflow use case. Implementers cannot assume that genomic data should be treated using the same practices as electric utility smart meter data. At a minimum, implementers should:
- Identify domain-specific workflows
- Identify domain-specific roles
- Identify domain-specific “share” policies, content, and controls (a minimal sketch follows this list).
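The sketch below illustrates how domain-specific roles and “share” policies might be encoded. The roles, data classes, and permitted actions are illustrative assumptions, loosely echoing the genomic versus smart-meter contrast above; they are not a normative model.

```python
# Minimal sketch of a domain-specific "share" policy. Roles, actions, and
# data classes are illustrative assumptions for a health-care domain (in the
# spirit of the HL7 FHIR consent use case), not a normative model.
SHARE_POLICY = {
    # (role, data_class) -> set of permitted actions
    ("clinician", "genomic"): {"read"},
    ("researcher", "genomic"): {"read_deidentified"},
    ("billing", "genomic"): set(),            # no access at all
    ("grid_operator", "smart_meter"): {"read", "aggregate"},
}

def permitted(role: str, data_class: str, action: str) -> bool:
    """Deny by default: unknown (role, data_class) pairs get no actions."""
    return action in SHARE_POLICY.get((role, data_class), set())

print(permitted("researcher", "genomic", "read"))               # False
print(permitted("researcher", "genomic", "read_deidentified"))  # True
print(permitted("grid_operator", "smart_meter", "aggregate"))   # True
```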
Organizations, even sole proprietorships, must identify which facets of their big data systems are sharable, and with whom. This is key to understanding how the BDSQ should be applied. For some, the domain model is not significantly different from that of the profession or industry; these aspects are, in some sense, “global” and non-proprietary. Other aspects of the domain model contain intellectual property, internal roles, execution strategy, branding, tools deployed, and so on; these are shared only selectively.
In the BDSQ, this is simplified to public and private “views” (Burger, 2014). Using this approach, views can evolve (co-evolve with code, or as code itself) over time. When it comes time to federate, a “public” view is available as a BDRA component.
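A minimal sketch of the public/private view split follows. The record fields and the projection into two views are illustrative assumptions about how a “public” view could be prepared for exposure as a BDRA component while proprietary facets stay internal.

```python
# Minimal sketch of public/private "views": one domain record is projected
# into a public view (eligible for federation as a BDRA component) and a
# private view (retained internally). Field names are illustrative assumptions.
RECORD = {
    "dataset_id": "sensor-2016-q3",
    "schema_version": "1.2",          # sharable: industry-standard facet
    "row_count": 18_500_000,          # sharable
    "pricing_model": "tiered",        # proprietary: execution strategy
    "internal_owner": "team-raven",   # proprietary: internal role
}
PUBLIC_FIELDS = {"dataset_id", "schema_version", "row_count"}

def split_views(record: dict) -> tuple[dict, dict]:
    """Partition a record into its public and private views."""
    public = {k: v for k, v in record.items() if k in PUBLIC_FIELDS}
    private = {k: v for k, v in record.items() if k not in PUBLIC_FIELDS}
    return public, private

public_view, private_view = split_views(RECORD)
print(public_view)   # what a federation partner would see
print(private_view)  # retained behind the organizational boundary
```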