Wednesday, June 7, 2017

NIST SP 800-171 and its potential impact on NSF science

By: Grayson Harbour, Scott Russell, Craig Jackson, and Bob Cowles

CTSC has recently seen an uptick in conversations in the community concerning NIST Special Publication 800-171 (SP 800-171) [1] and “Controlled Unclassified Information,” or CUI. In this post, we explain what CUI is, what SP 800-171 is, when and where SP 800-171 applies, and the impact it might have on the NSF science community. As will be elaborated in this post, we are concerned that the requirements of SP 800-171 may evolve into a general-purpose set of security requirements applied to a wide range of non-federal entities, something SP 800-171 was not intended to do and for which it is not a good fit. The NSF science community should keep SP 800-171’s limitations in perspective, particularly its focus on confidentiality-oriented controls, as opposed to the availability and integrity protections so critical to the science mission and of growing importance in the context of ransomware.

NIST SP 800-171 and CUI both fall under the umbrella of federal cybersecurity. The primary federal cybersecurity law is FISMA, [2] which requires federal agencies to implement security controls for their systems commensurate with those systems’ risks. (This process is outlined in detail in the NIST Risk Management Framework, [3] and the controls used are compiled in NIST SP 800-53. [4]) Agencies also have narrower security requirements arising from other statutes, such as HIPAA. [5] These requirements are typically vague, though, resulting in a hodgepodge of different control sets based on varying agency interpretations of what each law requires. And this system only becomes more complex when agencies try to extend their specific requirements to third parties.

The specific security requirements imposed on federal entities don’t directly affect anyone who isn’t a federal entity. [6] However, these regulations can be (and are) imposed indirectly through binding agreements, like procurement contracts, grants, and cooperative agreements. The logic of this is fairly intuitive: data doesn’t suddenly stop needing protection when passed to a contractor. Although we normally associate these requirements with classified data, the same principle applies to unclassified data too. Which brings us to CUI . . .

What is CUI?
Controlled Unclassified Information, or CUI, is exactly what its name suggests: it is (1) unclassified information that (2) is subject to federal regulatory control. So CUI quite simply is unclassified information that a federal law, regulation, or policy says needs to be protected in some way. But there’s a catch: since we’re in the world of federal cybersecurity, CUI only includes information that is made by or for the federal government. [7] Information non-federal entities make entirely on their own isn’t included, even if there is a law regulating that type of data. [8] This distinction will be important when we discuss scenarios where SP 800-171 could potentially apply.

What is the CUI program?
Before 2010, CUI was handled in different ways by different agencies; DoE, HHS, and DoJ each applied their own cybersecurity standards. In 2010, the White House decided that the federal government needed to standardize how it handled CUI, so it issued Executive Order 13556, creating the Controlled Unclassified Information program. The goal was to create a uniform minimum set of requirements for all CUI handled by the federal government. Rather than the prior hodgepodge, every federal agency would follow the same federal regulation, codified in 32 CFR 2002. [9]

What is SP 800-171?
NIST SP 800-171 is a corollary to the CUI program and 32 CFR 2002, designed to help federal agencies apply the new, uniform CUI standards to non-federal entities that may handle their CUI. It is intended to serve as a point of reference for the terms of binding agreements between federal entities and non-federal entities that receive CUI. Put simply, SP 800-171 is a guidance document; it has no independent regulatory force. This makes sense in context: CUI is by definition information that federal entities have to protect, and the requirement to protect CUI doesn’t disappear when the federal entities hand the CUI to a third party. So federal entities are supposed to incorporate SP 800-171 adherence into their binding agreements with the third parties to ensure those third parties apply the same protections the federal entity is obligated to apply.

In case the preceding paragraph didn’t make sense, it may help to understand how federal regulations work when applied to non-federal entities. Unlike Congress, which can legislate on almost anything, the executive branch can only regulate how executive branch agencies behave (or exercise powers given to it by Congress). But the executive branch can regulate third parties indirectly through binding federal agreements (contracts, grants, etc.). It’s not the law of the land, but if you want to do business with the federal government, you have to follow its rules. This is why SP 800-171 isn’t simply “the law” or “not the law”: it all comes down to what you’ve agreed to.

Ok, but what is SP 800-171?
Now that you know why it exists, what exactly does SP 800-171 require? Quite simply, SP 800-171 is a list of security controls taken straight from the confidentiality-protection requirements in NIST SP 800-53. [10] Although SP 800-171 is frequently described as SP 800-53’s little sibling, that analogy is somewhat misleading: SP 800-171 is focused on confidentiality and doesn’t include controls whose purpose is primarily integrity or availability. This actually makes perfect sense in context, since the CUI program mostly standardizes the implementation of federal laws (most of which are privacy laws) whose primary concern is typically confidentiality.

That being said, not all CUI is created equal, and under the federal CUI program, some CUI carries slightly higher requirements than the rest. For federal agencies, this boils down to “CUI-Basic” and “CUI-Specified.” Essentially, CUI-Basic is the floor below which no one may fall, whereas CUI-Specified means that some requirements will be heightened. If this all sounds potentially confusing, don’t worry: the categorization of CUI is laid out in the CUI Registry [11], which helpfully tells you both whether information is CUI and whether it is CUI-Basic or CUI-Specified. If it isn’t in the CUI Registry, it isn’t CUI. (However, as discussed below, the specific terms of your agreement will always be the most important consideration.)

Now here’s where things get tricky: the CUI-Basic/CUI-Specified distinction and the CUI Registry all apply to federal entities, not non-federal entities. Naturally, the intended effect is that they apply to non-federal entities transitively through binding agreements, but it’s easy to imagine some wires getting crossed in this process. So how will you know what to do?

How do I know if SP 800-171 applies to my project?
Despite all of the potential for confusion, the application of SP 800-171 is very simple: if SP 800-171 makes its way into your binding agreement, do SP 800-171. Otherwise, you are not directly obligated to adhere to it. As stated above, SP 800-171 is a guidance document; it is not mandatory for non-federal entities unless incorporated as a requirement through some binding legal agreement. So while the terms of this discussion often feel very broad (both “Controlled Unclassified Information” and “non-federal entity” enjoy enormously broad definitions), this is because they are just building blocks for agencies to use when drafting contracts and other binding agreements.

That being said, we’re glossing over a lot of details. For instance, CUI is subject to marking requirements [12], so it is unlikely that you could unwittingly receive CUI. Similarly, there are provisions for challenging the designation of information as CUI if you believe that certain data shouldn’t be covered. But the broader point that must always be emphasized is that everything ultimately depends on what you agree to in your binding agreement. If an organization agrees to meet SP 800-171 for all of its systems (not just those that handle CUI), it may well be held to that standard. The CUI program is intended only for CUI, but the standards it uses could be applied more broadly.

How might the CUI Program affect the science community?
Putting aside the strict question of “am I bound,” the potentially more salient question is “will I be bound in the future?” This is much harder to answer, but a quick perusal of the CUI Registry should give some indication of whether SP 800-171 might eventually apply to NSF science projects and facilities. The CUI Registry lists several categories of information that a researcher might encounter, store, and subsequently need to protect under the CUI program: student records, export-controlled research, critical infrastructure data, and controlled technical information [13] are all CUI. And as the CUI program continues to expand, the CUI Registry is expected to have more categories added.

There are two basic ways you could imagine the CUI program impacting a researcher: (1) you receive CUI from the federal government specifically for research purposes, or (2) your grant or cooperative agreement incorporates the requirements of SP 800-171 in some form. [14] The first is fairly uncontroversial, but also very limited. The second, however, would be an enormous shift in the research community, and probably a bad one.

Would SP 800-171 be a good thing for the NSF science community?
To put it bluntly: no. SP 800-171 was not designed to advance the science community’s mission, and the requirements it puts in place impose a substantial burden without any evidence of a corresponding benefit. [15] Apart from this basic mismatch in scope and capabilities, we’re also concerned that the approach of SP 800-171 reinforces a checklist mentality toward security (notoriously common in FISMA/NIST RMF environments, despite their lip service to risk management), where problems are solved by predetermined lists of controls. This incorrectly assumes that security is a solved problem, and it doesn’t allow for effective risk management. Also troubling is that SP 800-171’s focus on controls may lead organizations to view security as an add-on: something slapped onto existing systems to manage their problems, rather than a core component of system design. This is a particular concern because much of our community is designing and building new technology. More troubling still, SP 800-171 is likely to be quite expensive to implement. After all, SP 800-171 is not a comprehensive information security framework [16]; it is simply a collection of confidentiality requirements arising from federal privacy laws. So it is certainly not tailored for scientific research, given its notable lack of focus on integrity and availability, two critical concerns for science, despite its high cost.

Ultimately, our biggest concern is that SP 800-171 will come to be seen as a sort of RMF-light: something that is regularly applied as a security baseline to any project or entity that has an agreement in place with a federal entity. As stated above, SP 800-171 was not drafted as a comprehensive security framework for anyone, let alone the science community, and its goals and structure are not in line with the needs and capabilities of the science community. So, although some are beginning to recommend preemptively implementing standards like SP 800-171 as a form of future-proofing against possible federal regulation, we certainly would not go that far. However, considering the possibility that these requirements might find their way into grants and cooperative agreements, the science community should be aware of SP 800-171, its structure, limitations, and burdens, and be prepared to negotiate reasonable confidentiality provisions, perhaps like those controls already in place to protect confidential information at your facility or institution. Increasingly, the science community needs to show that it has effective, efficient approaches to cybersecurity tailored to its mission and the actual risks it faces.

[2] Federal Information Security Management Act of 2002 (FISMA), Pub. L. No. 107-347, Title III, 116 Stat. 2899, (2002), available at
[3] For an overview of the NIST Risk Management Framework, see, e.g., NIST SP 800-37 rev. 1, “Guide for Applying the Risk Management Framework to Federal Information Systems,” (Feb. 2010), available at
[4] NIST SP 800-53 rev. 4, “Security and Privacy Controls for Federal Information Systems and Organizations,” (Apr. 2013), available at
[5] For more information on the HIPAA Security Rule, see, e.g., “Security Rule Guidance,” HHS,
[6] Certain laws, like HIPAA, regulate both federal and non-federal entities. Our discussion is limited to the ways these laws regulate federal entities.
[7] 32 CFR 2002.4(h), available at (defining “Controlled Unclassified Information” to include only “information the Government creates or possesses, or that an entity creates or possesses for or on behalf of the Government” (emphasis added)).
[8] Although it may not be CUI, the data may still be subject to security regulations, as with HIPAA and the HIPAA Security Rule.
[10] The specific controls selected are largely equivalent to SP 800-53 Moderate - Confidentiality, but SP 800-171 does omit a select few security controls that are: (1) entirely federal (and therefore wouldn’t make sense to apply to non-federal entities), (2) not directly related to the protection of CUI, or (3) assumed to be implemented by third parties already.
[12] The details of the marking requirement are laid out in 32 CFR 2002.20, available at For the various markings used, see
[13] “Controlled Technical Information” means “technical information with military or space application that is subject to controls on the access, use, reproduction, modification, performance, display, release, disclosure, or dissemination.”
[14]  Although the definition of CUI only includes data that is made by or for the federal government, there is potentially a great deal of leeway in exactly what data is made “for the federal government,” which may allow for SP 800-171 to creep into a wide range of government agreements. But since this would be a new addition, such a broad interpretation of CUI could be resisted by the third party.
[15] For a discussion of the application and implementation of SP 800-171 in higher education institutions, see “An Introduction to NIST Special Publication 800-171 For Higher Education Institutions,” HEISC, (Oct. 2016), available at
[16] For an example of a more comprehensive set of security controls, see, e.g., CIS Critical Security Controls, SANS   

Monday, June 5, 2017

CCoE Webinar June 19th 11am ET: Using the Blockchain to Secure Provenance Meta-Data

Dr. Richard Brooks and Dr. Tony Skjellum are presenting the talk "Using the Blockchain to Secure Provenance Meta-Data," on June 19th at 11am (Eastern). Note: Due to a CTSC conflict this presentation is being held a week earlier than our normal schedule.

Please register here. Be sure to check spam/junk folder for registration confirmation with attached calendar file.
Provenance meta-data, also known as data pedigree, is a set of data that explains how information was derived. A number of provenance systems exist. They are useful for finding the sources of errors; allowing system users to have confidence in the materials; and potentially providing legal justification for decisions. An open issue has been how to properly secure this meta-data, in a manner that extends beyond trusting the information providers. Blockchain technology provides a universally accessible ledger of transactions that is the basis of the current generation of crypto-currencies. The blockchain structure provides guarantees of system integrity that make it exceedingly difficult for malicious insiders to tamper with data. Our project adapts blockchain concepts to securing provenance meta-data. The talk we present will include the following topics: 
  • A brief survey of provenance systems that discusses security needs; 
  • The presentation of three illustrative use-cases that motivate the development of a provenance security framework; 
  • A short tutorial on the structure of the blockchain; 
  • A brief overview of the current generation of crypto-currencies; 
  • An explanation of what aspects of crypto-currencies are ill-suited to our application; 
  • An overview of our system architecture, emphasizing two important points:
    • Our ability to integrate existing tools, and 
    • The portions of the system that we are developing; 
  • A discussion of our current status; and 
  • Plans for the next phase.
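As a rough illustration of the tamper-evidence property described in the abstract above, here is a minimal hash chain sketched in Python. This is not the project's implementation, and the record fields are hypothetical; a real blockchain adds distributed consensus on top of this hash linking.

```python
import hashlib
import json

def block_hash(block):
    # Hash the canonical JSON serialization of a block.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_record(chain, record):
    # Each new block commits to the hash of the previous block,
    # so altering any earlier record invalidates every later link.
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "record": record})

def verify_chain(chain):
    # Recompute each link; any tampered block breaks the chain.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

# Hypothetical provenance records for illustration.
chain = []
append_record(chain, {"derived_from": "raw_sensor.dat", "tool": "calibrate v1.2"})
append_record(chain, {"derived_from": "calibrated.dat", "tool": "analyze v0.9"})
assert verify_chain(chain)

chain[0]["record"]["tool"] = "calibrate v9.9"  # a malicious insider edit
assert not verify_chain(chain)
```

Because each block's hash covers the previous block's hash, an insider who rewrites one provenance record would have to rewrite every subsequent block as well, which is what makes tampering detectable.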
More information about this presentation is on the event page.

Presentations are recorded and include time for questions with the audience.

Join CTSC's discuss mailing list for information about upcoming events. To submit topics or requests to present, contact us here. Archived presentations are available on our site under "Past Events."

Friday, May 26, 2017

Workshop on Trustworthy Scientific Cyberinfrastructure at PEARC17

Please join us for CTSC's Workshop on Trustworthy Scientific Cyberinfrastructure the morning of Thursday, July 13 at PEARC17 in New Orleans. The workshop will have the following schedule:

Time Content
9:00-9:30 Update from the NSF Cybersecurity Center of Excellence (CCoE)
9:30-10:30 Cybersecurity for Small and Medium Science Projects
10:30-11:00 Refreshment Break
11:00-11:30 Security for Science Gateways
11:30-12:30 Community Forum

The workshop concludes with a Community Forum where attendees can share cybersecurity challenges and success stories in an informal setting. CTSC representatives will be on hand to lead the discussion and answer questions. We invite community members to present short lightning talks during the Community Forum; contact us to register your lightning talk topic.

View the PEARC17 Schedule for more details on the workshop and other PEARC17 sessions. PEARC17 Registration rates increase on June 1, so register early!

Tuesday, May 23, 2017

CTSC’s Situational Awareness Expanding Collaborations

To improve the security alert service CTSC offers, the Situational Awareness (SA) group within CTSC has recently expanded its monitoring streams and collaborations to include the Open Science Grid (OSG) and the European Grid Infrastructure’s Software Vulnerability Group (EGI SVG). These entities join CTSC’s existing collaborations with REN-ISAC and XSEDE in mutually sharing advisories. The immediate benefit is that CTSC and its collaborators now have more monitoring channels, widening the window for spotting potential threats to science cyberinfrastructure. This improvement in shared knowledge will better position us, as well as our new partners, to pass important alerts on to our communities.

To register for CTSC SA advisories, see

If you're a member of a trust community that would like to share alerts with CTSC, please contact us at

Monday, May 8, 2017

CCoE Webinar May 22nd 11am ET: Cybersecurity Research: Transition To Practice (TTP)

Emily Nichols and Dr. Alec Yasinsac are presenting the two-part talk "Cybersecurity Research: Transition To Practice (TTP)," on May 22nd at 11am (Eastern).

Please register here. Be sure to check spam/junk folder for registration confirmation with attached calendar file.
The U.S. National Science Foundation Transition To Practice (TTP) program is critical to the successful deployment and realization of value for NSF-funded cybersecurity research. Transition to practice has been named a priority by the National Science and Technology Council’s subcommittee on Networking and Information Technology Research and Development (NITRD) since 2011, as the participating agencies recognize the need to see funded research adopted by the operational community and ultimately make a positive impact on society. Currently, a chasm exists between the output of the academic cybersecurity research community and the operational Information Technology (IT) community, which acquires system prototypes that often result from later-stage academic research and implements them in operational environments, either as proofs of concept or in production. The goal of the NSF TTP program is to enable NSF-funded cybersecurity research to cross this chasm and become an operationalized asset that adds value to our nation’s cybersecurity efforts.
Internet2 Collaborative Innovation Community (CINC UP): Cybersecurity Research Transition to Practice Acceleration Opportunities (Emily Nichols)
Internet2 is leading an NSF-funded EAGER project through which members are working together to develop a comprehensive TTP program, with the goal of enabling as many NSF cybersecurity grants as possible to transition to practice in an accelerated fashion. 
Please join us to discuss how the Internet2 community of NSF-funded cybersecurity researchers, IT operations staff, and institutions, including universities, labs, industry members, and affiliates, can work together to enable the application of cybersecurity research. 
NSF's SATC TTP Ecosystem (Dr. Alec Yasinsac)
NSF is offering substantial resources to support TTP efforts, including TTP training for PIs, match-making services, mentoring services, a best-practices repository, and software development resources. 
A key resource is the SaTC (Secure and Trustworthy Cyberspace) TTP designation. PIs that have mature research results can apply for three year awards up to $500k or four year projects up to $1.2m exclusively to conduct TTP activities. 
In this presentation, we will present the case for TTP, identify the unique aspects of the TTP designation in the SaTC solicitation, and describe elements of the anticipated TTP ecosystem. This talk is relevant to academics of all ranks, to research scientists in government and academic laboratories, and to industry members interested in harvesting NSF-funded cybersecurity research.
More information about this presentation is on the event page.

Presentations are recorded and include time for questions with the audience.

Join CTSC's discuss mailing list for information about upcoming events. To submit topics or requests to present, contact us here. Archived presentations are available on our site under "Past Events."

Monday, May 1, 2017

2016 NSF Community Cybersecurity Benchmarking Survey Report

The 2016 NSF Community Cybersecurity Benchmarking Survey Report is now available:

Benchmarking information is frequently used to develop a common understanding of cybersecurity’s status and norms within a community. The purpose of this survey project was to collect, analyze, and publish useful baseline benchmarking information about the NSF science community’s cybersecurity programs, practices, challenges, and concerns. We received 27 responses to the survey, including 16 from respondents with annual budgets greater than $1M (among them, 9 from the ~25 NSF Large Facilities).

We hope the results and analysis provide some benchmarking insight and inspire discussion.

Thursday, April 27, 2017

OSiRIS Engagement Summary

OSiRIS (Open Storage Research Infrastructure, NSF award #1541335) is a multi-institutional project aimed at providing a distributed storage infrastructure that allows researchers to manage and share data from their home computing facilities with other partner locations. The University of Michigan, Michigan State University, Wayne State University, and Indiana University are working together to develop this transparent, high-performance storage infrastructure, which will be available to connected locations on participating campuses. The project will provide data sharing, archiving, security, and life-cycle management, all implemented and managed through a single distributed service.

In October 2016, CTSC began an analysis of the new OSiRIS Access Assertions (OAA) design. CTSC and OSiRIS staff worked together via a series of weekly phone calls to review the design of the authentication and authorization framework for OSiRIS. As OSiRIS is an open-source project, all design documentation and related code for OAA is available on GitHub.

Since the OAA design was at an early stage, CTSC asked OSiRIS staff to document the various use-case scenarios which would be addressed by OAA. This resulted in a set of requirements needed by scientists (end-users), system administrators, and network administrators.

Next, CTSC began the review of the core OAA system. The review found that OAA borrows concepts from OAuth 2.0 (RFC 6749), including JSON Web Tokens (RFC 7519) and the practice of issuing short-lived access tokens and long-lived refresh tokens. The resemblance of OAA to OAuth 2.0 led the team to use the OAuth 2.0 Threat Model and Security Considerations (RFC 6819) as an evaluation framework for the OAA system. Over the course of several weeks, the OSiRIS team used recommendations from the OAuth 2.0 Threat Model to make modifications to the evolving OAA design, as noted in the final engagement report.
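To illustrate the token pattern mentioned above, here is a minimal sketch of HS256-signed JWTs (RFC 7519) pairing a short-lived access token with a long-lived refresh token. This is not OAA's actual code; the secret key, claim names, and token lifetimes are illustrative assumptions.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # hypothetical signing key, for illustration only

def b64url(data):
    # Base64url without padding, as JWTs require.
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def sign_jwt(claims):
    # Minimal HS256 JWT: header.payload.signature
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    sig = b64url(hmac.new(SECRET, header + b"." + payload, hashlib.sha256).digest())
    return (header + b"." + payload + b"." + sig).decode()

def issue_tokens(subject, now=None):
    # The OAuth 2.0 pattern: a short-lived access token limits the damage
    # if it leaks, while the long-lived refresh token lets a client obtain
    # new access tokens without re-authenticating.
    now = int(time.time()) if now is None else now
    access = sign_jwt({"sub": subject, "exp": now + 900})        # 15 minutes
    refresh = sign_jwt({"sub": subject, "exp": now + 2592000})   # 30 days
    return access, refresh
```

A relying service would verify the signature and reject any access token whose `exp` claim has passed; production systems should use a vetted JWT library rather than hand-rolled signing.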

The above swim lane diagram, produced by the OSiRIS team during the engagement, helped the CTSC team understand the OSiRIS Access Assertions (OAA) design.

After the review of the core OAA design, the review shifted to the integration of OAA with other OSiRIS components including Ceph and NMAL/perfSONAR. As the integration is still in an early phase, CTSC staff reviewed the integration design for potential issues drawing on knowledge of similar analyses in the past.

OSiRIS is using COmanage Registry for managing groups and roles for researchers and administrators. CTSC staff has significant experience with COmanage, so several conference calls were of the question-and-answer variety where OSiRIS staff were able to ask detailed questions about COmanage and how to best leverage the power of the software for their particular scenarios.

CTSC's involvement early in the design and implementation phase enabled the OSiRIS developers to incorporate several security recommendations before development had proceeded to a point where change would have been painful. CTSC identified no significant weaknesses in the resulting design. CTSC encouraged OSiRIS to apply for a follow-on engagement after implementation is complete, to review design changes that may have occurred during implementation and initial deployment.

Edited to add: See also the OSiRIS blog post on our engagement.