Updated: Aug 8, 2020
In this section of each issue, industry experts share thought leadership on the latest innovations and developing technology and how to manage it. So far, 2012 has seen rapid advances in technology and policy for government contracting companies, particularly in the areas of cyberspace rules of engagement, the effects of FISMA updates, and the success of Cloud First.
DoD Cyberspace Rules of Engagement
Today’s battlefield has extended to cyberspace, and cybersecurity is no longer a defense-only mission. With the Pentagon’s DARPA soliciting proposals for Plan X, for instance, attacks on the virtual battlefield will soon form a critical prong of any offensive. According to Army Gen. Keith Alexander, commander of the U.S. Cyber Command, “The risks that face our country are growing faster than our progress, and we have to work hard on that.” As the Department of Defense looks to update its 2004 cyberspace “rules of engagement” to clarify its role, responsibilities, and capabilities in defending its own networks and civilian resources, plans are in the works for initiatives such as network resiliency efforts, cryptographic systems, and department-wide cybersecurity standards and technologies, as well as consolidation of IT infrastructure and networks to improve effectiveness and efficiency. Over at least the next year in this new environment, questions will abound: When and how can the military engage? What proactive responses to threats will fit these new rules? Leaders in the field discuss how these and other complications might be addressed.
Keith Rhodes, chief technology officer, services and solutions, QinetiQ North America
The root challenge to the U.S. government is that “proaction” assumes that one knows and understands what one sees. During the Cold War, we watched actions near strategic borders, and we had action plans that escalated based on what opponents were doing.
In the borderless virtual world, movements are short-lived and neither easily attributable nor understandable. For example, the impact of “proactive defense” is not clearly definable in terms of collateral damage, because virtual defense and offense may be the exact same action, and what looks like an opponent may be just a “downstream victim.”
Only government can determine what is acceptable in terms of a cyber attack. The industry’s role is to explain and show the difference between the possible and the probable, so that the government can better understand the impacts of certain actions. From this understanding, the government can define a successful engagement. It can determine the rules for a proactive defense and make it easier for the industry to understand what it can legally do to help the government meet its mission.
Bill Varner, president and chief operating officer, mission, cyber, and intelligence solutions group
There is little doubt that the traditional battlefield of air, land, and sea has evolved to air, land, sea, and cyberspace. It is a vector of attack against our nation on a daily basis. As with the more traditional battlefield, weaponry must be developed and updated to combat the threats. The ability to strike back will need to be an option that, as a last resort, can be exercised if necessary.
Obviously, there are challenges associated with this option. The first is that developing sustainable alternatives would require examining the legal framework. That’s not to say the work cannot be done within the existing framework, but doing so might make the options less useful. No matter how the legislative process shakes out, these options will have to be carefully thought out and subject to checks and balances.
More importantly, and far more complex, is determining who else is sharing the infrastructure that might be targeted. Much of the world’s cyber infrastructure is operated by the private sector. A key consideration has to be who else could be impacted. Given the global nature of our communications, it is likely that U.S. companies and interests are sharing those same communications channels. It is paramount that the weaponry of the future be precisely targeted and cause as little collateral damage as possible, whether to critical infrastructure or to the broader economy.
The federal government has always relied heavily on outside expertise to create cyber capabilities. The demand in the private sector for the talent needed to develop techniques and capabilities in this space makes it challenging for the federal workforce to retain that talent internally.
Wendy Martin, vice president, advanced information solutions, Harris Corporation
The primary challenges to offensive cyber protection are technical difficulty and policy ambiguity. Industry is working with the government to determine strategies and identify technologies that, when coupled with an approved concept of operations, will work effectively to deal with cyber attacks proactively. We feel that, to be successful, proactive cyber actions must be carried out with both accurate attribution and positive control of action.
Growing, training, and retaining cyber talent is also critical, and industry is helping by partnering with education institutions to promote science, technology, engineering, and math (STEM). Focusing on these hard sciences today helps grow tomorrow’s cybersecurity innovators. We find that today’s cyber experts gravitate towards positions that offer the most challenging and intrinsically rewarding experiences, while allowing them to make a real difference to our country’s public safety and national security.
Charlie Croom, vice president of cybersecurity solutions, Lockheed Martin Information Systems & Global Solutions
In many cases, network enterprises were formed through multiple network consolidations over time. The security solutions for these networks were not implemented as a cohesive whole, but acquired in piecemeal fashion. Our focus is on providing solutions built on three pillars: integrated solutions, proactive services, and resilient systems.
Integrated solutions meld commercial point solutions into a seamless security fabric stretched across the software and hardware enterprise. If seams remain, we close them with added layers of protection. Fortunately, today’s automated technologies are providing integrated solutions, delivering greater security, and doing it more efficiently. But integrated solutions are not sufficient to defend against the most sophisticated threats.
Proactive services must be employed to get ahead of sophisticated attacks. Our capability is built on an intelligence-driven defense called the “cyber kill chain,” where understanding the adversary and taking advantage of their persistence and repetitive behavior allows us to anticipate and block their next steps.
Despite best efforts, intrusions will continue to occur and systems must be designed to be resilient and operate through an attack. Approaches include redundancy and self-healing where a system either repairs itself or returns to a trusted state.
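The kill-chain idea above can be sketched in a few lines: an intrusion is modeled as an ordered sequence of phases, so spotting an adversary’s repetitive behavior at an early phase tells the defender which later phases can still be blocked. This is a minimal illustration, not Lockheed Martin’s implementation; the phase names follow the commonly published kill-chain model, and the example indicator is invented.

```python
# Hedged sketch of kill-chain reasoning: the phases are ordered, and an
# indicator observed at one phase means every later phase can still be
# anticipated and blocked. Phase names follow the published model.
KILL_CHAIN = [
    "reconnaissance", "weaponization", "delivery",
    "exploitation", "installation", "command_and_control", "actions",
]

def phases_still_blockable(observed_phase: str) -> list[str]:
    """Return the phases that remain after the one where an indicator was seen."""
    idx = KILL_CHAIN.index(observed_phase)
    return KILL_CHAIN[idx + 1:]

# An indicator caught at the delivery phase (e.g., a known malicious
# attachment) leaves four later phases the defender can disrupt.
print(phases_still_blockable("delivery"))
```

The defensive payoff described in the passage falls out of the ordering: the earlier in the chain an indicator is recognized, the more of the adversary’s next steps remain blockable.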
About 10 years after the original Federal Information Security Management Act (FISMA) guidelines were established, the FISMA 2012 standards now moving through Congress respond to the need to strengthen the federal government’s security framework against increasingly sophisticated and dangerous cyber attacks. Predominant among these standards is the requirement for near-continuous monitoring, so that threats can be caught before they cause damage. Even as 64 percent of agencies polled in a survey said that they believe continuous monitoring will improve their IT security status, fewer than half said they expected to be ready by this month. We asked several leaders in the field about the challenges and opportunities they see posed by FISMA 2012.
JR Reagan, federal chief innovation officer, Deloitte & Touche LLP
The proposed amendment to FISMA provides a comprehensive framework for protecting our nation from current and future cyber threats. It gives the government the fundamental building blocks for advanced security capabilities to mitigate increasingly sophisticated threats.
Agencies need to prepare for automated continuous monitoring through uninterrupted, ongoing, real-time or near real-time processes to protect their assets. Implemented correctly, this affords agencies an opportunity to capitalize on existing IT security investments and tool sets and to leverage industry-leading practices and methodologies for data management and integration. Taking a risk-based approach presents agencies with the opportunity to balance IT spending against cybersecurity risk and to have the ability to evaluate and monitor the areas that present the greatest threats to their missions.
Clients may also want to consider outsourcing cyber operations to augment their own internal appraisals. It can be very valuable to have a view provided from outside the network perimeter.
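The shift to automated continuous monitoring can be pictured as a recurring scan of every asset against a set of compliance controls, with failures surfaced as they appear rather than in an annual paperwork exercise. The sketch below is illustrative only; the asset attributes and control names are invented, not drawn from any FISMA standard.

```python
# Hedged sketch of one pass of automated continuous monitoring: each asset
# is evaluated against every control, and failing pairs are reported
# immediately. Assets and controls here are hypothetical examples.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Asset:
    name: str
    patch_level: int
    av_enabled: bool

# Each control is a named predicate over an asset.
CONTROLS: dict[str, Callable[[Asset], bool]] = {
    "patched-to-baseline": lambda a: a.patch_level >= 7,
    "antivirus-running":   lambda a: a.av_enabled,
}

def scan(assets: list[Asset]) -> list[tuple[str, str]]:
    """One monitoring pass: return (asset, control) pairs that fail."""
    return [(a.name, c) for a in assets
            for c, check in CONTROLS.items() if not check(a)]

fleet = [Asset("web-01", 7, True), Asset("db-02", 5, False)]
print(scan(fleet))  # db-02 fails both controls
```

Run on a schedule (or on configuration-change events), a loop like this is what replaces the episodic checklist: the same data that drives remediation can also feed automated reporting, which is the risk-based opportunity described above.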
Sam Chun, director, cybersecurity practice, U.S. public sector, HP Enterprise Services
The bipartisan FISMA Act of 2012 recently introduced in the House is less of an overhaul of FISMA than one that codifies and expands (through parallel frameworks for national security systems) existing continuous monitoring requirements already being driven by the Office of Management and Budget.
The transition away from the labor-intensive annual checklist process of FISMA 2002 to real-time continuous monitoring and automated submission through Cyberscope—though challenging to implement—is expected to improve visibility and insight, thereby improving federal security posture while simultaneously reducing the effort required. Moving away from a monumental, episodic paperwork process to one that focuses on operating effectiveness through automation is definitely a positive regardless of the task (cybersecurity included).
The big challenges that agencies will face in implementing the FISMA update (assuming it is signed into law) will be similar to the implementation of continuous monitoring. Without robust support from OMB and Congress in the form of additional budget authorizations, agency leaders will be compelled to make hard choices to comply with the new law, potentially sacrificing security. Specifically, the Congressional Budget Office estimates that over the next five years, the proposed FISMA updates would cost agencies an additional $710M to implement. With the mantra of the current federal CIO, “do more with less,” the benefits of the act may have to take a back seat to federal budget realities.
Tiffany Jones, director, public sector programs & strategic initiatives, Symantec Corporation
As the Federal CIO report on FISMA metrics accurately states, the Federal cybersecurity defensive posture “is constantly shifting because of the relentless dynamic threat environment, emerging technologies, and new vulnerabilities.” Under the FISMA 2012 metrics, Federal Agencies will be required to implement “automated and continuous monitoring” of IT systems, and specific performance metrics/targets are defined.
Meeting these targets means less dependence on human involvement and more reliance on best practices and the latest technologies to contribute to “near real-time” monitoring and action. This change will increase the overall security posture of agencies against more sophisticated threats, while also providing opportunities for companies to contribute capabilities to agency enterprise environments. The work being done by the Department of Homeland Security’s Federal Network Security team, the National Institute of Standards and Technology, OMB, and others is moving us closer to a state of continuous monitoring and situational awareness—beyond certification and accreditation (C&A) alone to a state of security automation and risk management.
A major near-term challenge is the looming federal budget stalemate that could bring automatic program cuts under sequestration. This could have significant impacts on agency spending and their overall cybersecurity posture. Another challenge is the ability of agencies to rapidly make the required changes to meet the FISMA metrics/targets, improve their security posture, and ultimately better protect government and citizen data. Last, visibility into accountability for how agencies are meeting performance targets is important to ensure cybersecurity requirements are being met.
Cloud First Check-in
Cloud First…or second, or third? What’s the take from those who are working to support cloud readiness? When Cloud First was introduced, in 2010, as part of the Office of Management and Budget’s 25-point plan to reform federal IT, it generated excitement for potential transparency, efficiencies, and cost-savings. However, a recent study by Deltek revealed varying levels of “cloud readiness” across federal agencies. With the big changes in thinking about the government’s role, establishing standards, and working out security requirements—and, sometimes, additional hardware and software expenses—the leap to the cloud may be more like a climb for some. Here, leaders give a balanced perspective on the rate of progress—and share ideas on how to keep the steady momentum going.
Chris Smith, U.S. federal chief technology and innovation officer, Accenture Federal Services
A current view of the market shows that commercial entities are adopting cloud-based solutions at a more rapid rate than are governments around the world. While perhaps conservative by industry standards, the U.S. federal government is making solid and steady progress in implementing cloud solutions. Multiple agencies are running internal private cloud services, and a number of agencies already have or now are adopting public cloud services in the messaging, unified communications, and customer relationship management arenas.
One of the most important steps an agency can take to increase the rate of cloud adoption and more rapidly achieve increased productivity, agility, and cost savings is to develop a robust cloud strategy. This strategy aligns private internal cloud development with enterprise data center consolidation as well as public cloud offerings.
A well-articulated strategy, architecture, and operational model will allow agencies to take advantage of disparate public and private cloud environments and better meet mission requirements. The common perceived barriers to entry such as security, maturity of cloud offerings, and data protection can be clearly identified, quantified, and mitigated with thoughtful planning and well-managed execution.
Yogesh Khanna, vice president, chief technology officer, CSC North American Public Sector
There are several factors that U.S. federal agencies continue to grapple with as they strive to adhere to the Cloud First policy.
Chief among the concerns is security. Most federal CIOs are still uncomfortable with their data being commingled with that of other clients. There is a general lack of trust in multi-tenant clouds. The FISMA controls help, but the process to certify and accredit cloud services takes too long and is expensive. FedRAMP is designed to address this issue, but that program is just getting off the ground, and it will take some time before agencies start harnessing its benefits.
At most agencies, the CIO’s operations staff fears losing control of the IT infrastructure and, in some cases, of applications and data. The desire to maintain control of and visibility into day-to-day operations often leads to requirements that are not aligned with cloud computing business models.
There are some in the federal government who are quietly questioning the economic benefits of cloud services. Cloud computing represents yet another service delivery vector for most CIOs. No agency is about to put every workload in its environment in a cloud. Quite a bit of the legacy infrastructure and applications will likely remain outside the cloud and still have to be supported and maintained.
While there are economic benefits to moving a specific workload to a cloud, the expense to oversee or manage the cloud service is additive to the cost of servicing the non-cloud environment. So, for some in the federal government, if the economic benefits of migrating to the cloud do not help reduce the overall cost of delivering IT services, status quo takes hold, and the pace to change and adopt cloud services slows down.
John Lee, vice president, cloud solutions, Carahsoft Technology Corp.
It’s been our experience that agencies are in fact moving to the cloud in great numbers and are already achieving significant benefits from doing so. We’re seeing significant progress, for example, across organizations within the Department of Energy, the U.S. Army, and the U.S. Food and Drug Administration. They’ve reduced IT complexity and cost and are now building out secure, flexible private, public, and hybrid clouds using VMware and virtualization as the underlying cloud enabler.
What factors make these agencies and others like them agile and successful in their adoption of cloud technologies?
First, they have adopted a virtualization strategy that gets them to flexible internal private clouds and uses that platform to continue the journey to the cloud. Second, they’ve clearly defined their business processes and workflows and are laser-focused on the business problems cloud technology can help them solve. Third, they have developed a phased approach to implementation that produces high-impact results and demonstrates immediate productivity and agility enhancements, all with fast returns on their investments. GCE