Coordinate, Not Mandate, Cloud Computing Standards, NIST Told
Cloud computing is so hot that too many groups are trying to develop standards for the technology, imperiling broader adoption, industry representatives said Thursday at a National Institute of Standards and Technology workshop in Washington. But the proper role for NIST is to aid the “de facto efforts around standards” among these groups by coordinating them, not writing its own rules, said Tim Mather, a founding member of the Cloud Security Alliance and a former executive at RSA and Symantec. The government must be careful not to “squelch” innovation in new approaches to cloud computing, said Stephen Schmidt, Amazon Web Services chief information security officer and a former FBI section chief.
"We're perhaps a third of the way through a 10-year shift” to cloud computing, said David Campbell, a technical fellow at Microsoft. Having provided proven savings in “high volume hardware,” the technology shows real potential in the “re-architecting” of information systems themselves, he said. The technology makes trying new ideas fast and inexpensive, reducing the “cycle time” for innovation, Schmidt said. Leasing out computing resources from a huge online commerce platform “wasn’t a radically new concept” but Amazon was the “catalyst,” said Jim Blakley, Intel’s director of data center virtualization and cloud computing. “I don’t think we've even seen the beginning” of rapid cost declines and virtualization, he said, projecting about 20 years of cloud evolution.
But some hurdles remain, such as slow Internet access speeds, Blakley said. At a recent conference of rural hospitals, several said their upload speed was 1.5 Mbps, which is “very prohibitive” to cloud services, he said. Schmidt identified “psychological challenges” posed by IT workers who want to keep close control over what they consider their computers, instead of working with remote or virtualized servers. They must be persuaded of the cloud’s benefits, such as “full visibility” into an entire system from one location and the chance to help employees do their jobs better instead of fielding their gripes, he said. Many IT workers rightly fear for their jobs -- but they must “get over it,” Mather said to laughs.
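How prohibitive is a 1.5 Mbps uplink? A minimal back-of-the-envelope sketch, with the patient-data file sizes as illustrative assumptions rather than figures from the conference:

```python
# Back-of-the-envelope upload times over a 1.5 Mbps uplink, the speed the
# rural hospitals reported. The file sizes are illustrative assumptions.

UPLINK_MBPS = 1.5  # megabits per second

def upload_seconds(size_gb: float, mbps: float = UPLINK_MBPS) -> float:
    """Seconds to move size_gb gigabytes over an mbps link, ignoring protocol overhead."""
    return size_gb * 8_000 / mbps  # 1 GB ~= 8,000 megabits (decimal units)

for label, gb in [("single X-ray image (~0.03 GB)", 0.03),
                  ("CT study (~0.5 GB)", 0.5),
                  ("nightly backup (~20 GB)", 20)]:
    secs = upload_seconds(gb)
    print(f"{label}: {secs / 60:.0f} min ({secs / 3600:.1f} h)")

# single X-ray image (~0.03 GB): 3 min (0.0 h)
# CT study (~0.5 GB): 44 min (0.7 h)
# nightly backup (~20 GB): 1778 min (29.6 h)
```

At that speed a routine nightly backup alone would outlast the night, which is the sense in which such links shut hospitals out of cloud services.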
Without standards specifically for cloud computing, private efforts abound, Mather said. “There’s too many of these things and we need some consolidation around those efforts.” The government’s new FedRamp program, which certifies tech products once so all agencies can use them (see separate report in this issue), is a good start that “should have been done 20 years ago,” but it’s important to coordinate with business and with the European Union and its GovCloud initiative, he said. “This is all about economies of scale,” and government use of the cloud won’t be helped by “a bunch of one-off efforts.” With its move into cloud computing, especially the security and compliance aspects, the federal government will bear the “pain” of the standards process so the transition will be smoother for future users, Blakley said.
The problem isn’t how to define cloud computing, said Ari Schwartz, chief operating officer at the Center for Democracy & Technology. “We’re well past that at this point.” The problem is the jurisdictional issues raised by a technology that doesn’t fit legislative frameworks such as the Privacy Act of 1974 and the Electronic Communications Privacy Act of 1986, which were created for earlier systems that weren’t networked, he said. ECPA has 17 standards for information access, and law enforcement won’t necessarily notify users if their cloud-stored information is accessed, he said.
Settling on “global norms” is important but only to a point, Schwartz said, warning of the danger of picking technology winners and losers. U.S. mediation with other countries on conflicting de facto and de jure standards would be helpful, Schmidt said. Campbell asked for help in distinguishing commercial interests from “national” interests in countries that are new to IT planning. FedRamp is important to Amazon because it will simplify the process of designing products to each agency’s specifications, Schmidt said. Imagine tweaking the same product for its 900,000 commercial customers, he said. Interoperability and compatibility concerns are holding providers back from “doing things in a big way,” Blakley said, asking NIST to involve them in a “community process” instead of working with a few big providers behind closed doors.
"I get a little nervous when I hear ‘cloud computing’ and ’standards’ in the same sentence,” Mather said. NIST can give the guidance and set the framework, but “the market has to be free to innovate here.” It’s not bad that traditional standards organizations don’t “move at the speed of light,” he said. The government can use its purchasing power to shape standards, Campbell said. Perhaps most relevant for NIST is settling disputes over interpreting federal rules, Schmidt said. “Let’s make the same words mean the same things” across agencies and even within them, since some departments’ bureaus disagree with each other. That’s exactly the problem in interpreting cloud providers’ service-level agreements, Campbell said: If it takes a couple minutes to reach a particular resource, does that violate a contractual pledge of 99.999 percent “availability"? There’s no common definition for a “security incident,” either, Mather said. But Blakley said it’s “really too early” to talk about such standards.
NIST can start laying the groundwork for standards in areas such as identity management, Campbell said. Blakley said NIST should consult the Distributed Management Task Force, led by Intel, VMware, Cisco and Microsoft among others, before settling on how to administer cloud environments. Mather recommended work on cryptographic standards and other matters “down in the weeds,” to keep the development of “inter-clouds” from being hampered: “We’re frankly stuck with a scalability problem” otherwise. He also said NIST should start reviewing which standards are outdated and should be ditched -- the “baggage” -- in consultation with agency CIOs.
Tim Grance, NIST’s cyber and network security program manager, called much of the worry unwarranted. NIST rarely devises standards on its own, because its charge is to develop them through “open, voluntary industry consensus” processes, Grance said. The problem isn’t necessarily outdated standards but how they’re used in risk management, Curt Barker, NIST chief cybersecurity adviser, told an Oracle employee in the audience who complained about rules blocking “collocation” of applications on the same server. If an agency takes a list of 200 NIST controls and tries to implement all of them, “we’ll break all of our systems,” he said. Each application of controls should be system-specific, and NIST should work more closely with agencies to make them so, Barker said. -- Greg Piper
NIST Cloud Computing Notebook
The federal government needs vendors’ help to devise standards that will spread cloud computing more widely across agencies, federal CIO Vivek Kundra told the workshop. Agencies will benefit from new governmentwide initiatives such as FedRamp, under which all agencies can use products that any one of them has certified, instead of vendors needing separate certifications for each, he said. Kundra rattled off notable successes in cloud computing. The SEC replaced decade-old technology with Salesforce.com to help investors, cutting the time to process new cases to 7 days from 30 and giving officials a single platform to track cases, upload documents and share them, he said. The old system could take 10 seconds to recognize input “keystroke to keystroke,” but the new one operates closer to real time. Recovery.gov, set up to track stimulus spending, moved this week to Amazon Web Services at a substantial saving, forgoing new servers and data centers, Kundra said. It’s the first governmentwide program in the cloud, he said. “It cuts horizontally, every single department … and it also cuts vertically, to the state and local level.” Utah’s government has moved from 1,800 servers to 400 virtual servers under state CIO Steve Fletcher, with projected savings of $4 million in a $140 million budget, and is offering services to local governments, Kundra said. Los Angeles adopted Google Apps to get video and instant messaging features, as well as 25 times more storage with redundancy, with “iterative upgrades built into the products themselves.” The Department of Health and Human Services took six weeks to implement Salesforce for 2,000 users working on health IT implementation, a process that otherwise would have taken years, he said. Most important for Interior Secretary Ken Salazar: He can now e-mail all his employees at the same time from a cloud platform, Kundra said. “They were spending a fortune, and they had a ridiculous ratio of employees to servers,” roughly six-to-one. “It’s real, it’s here, it’s part of the policy framework,” and cloud computing is being implemented in a “responsible and methodical way,” Kundra said. “We’re not simply webifying” everything.