SUBCOMMITTEE ON GOVERNMENT EFFICIENCY, FINANCIAL MANAGEMENT AND INTERGOVERNMENTAL RELATIONS

Congressman Stephen Horn (R-CA), Chairman


Oversight hearing on

"What Can be Done to Reduce the Threats Posed by Computer Viruses and Worms to the Workings of Government?"

August 29, 2001

Testimony of 

Peter G. Neumann
Principal Scientist, Computer Science Laboratory
SRI International

before the 


Subcommittee on Government Efficiency, 
Financial Management 
and Intergovernmental Relations 


Summary

This is the fourth time I have provided testimony to a U.S. House of Representatives committee on matters of computer-communication security; the previous three occasions were in Washington, D.C., in 1997, 1999, and 2000 [1,2,3]. The situation has not noticeably improved; indeed, we seem to be falling further behind.

Although the research community has made advances in information security, trustworthiness, and dependability, the overall situation in practice appears to keep getting worse relative to the increasing threats and risks, for a variety of reasons. The information infrastructure is still fundamentally riddled with security vulnerabilities, affecting end-user systems, routers, servers, and communications; new software is typically flawed, many old flaws still persist, and, worse yet, patches for residual flaws often introduce new vulnerabilities. There is much greater dependence on the Internet, for Governmental use as well as private and corporate use. Many more systems are being attached to the Internet all over the world, with ever-increasing numbers of users -- some of whom have decidedly ulterior motives. Because so many systems are so easily interconnected, the opportunities for exploiting vulnerabilities multiply, and the sources of threats become correspondingly more ubiquitous. Furthermore, even supposedly stand-alone systems are often vulnerable. Consequently, the risks are increasing faster than our ability to ameliorate them.

Discussion

There are quite a few realistic but sometimes dirty truths that remain largely unspoken and under-appreciated:

  • Secure information systems and networks are extremely difficult to design, develop, operate, and maintain. Although perfect security is inherently impossible (especially when insider threats are considered), what we have today is a far cry from what is straightforwardly possible. System developers, and particularly mass-market software developers, are not adequately addressing the underlying security needs of computer-communication technologies.
  • Computer-communication systems and their development processes are becoming increasingly complex, which runs counter to security. Ideally, it should be possible to configure less complex systems specifically tailored to their given requirements, perhaps as stark subsets of generic secure systems, rather than continually adding more functionality without security.
  • Our critical national infrastructures -- including our information infrastructures -- are not only vulnerable, but highly at risk, as was noted by the President's Commission on Critical Infrastructure Protection (PCCIP) [4] in the previous Administration. The risks pointed out then are essentially all still present today, and have not substantially diminished. In some senses, the risks may be greater because of increased opportunities for exploitation of the vulnerabilities.
  • The Internet is an enormous distributed system, international in nature. U.S. laws intended to outlaw bad behavior here seem to have relatively little effect in thwarting malicious activities from offshore. Because of generally weak information security, threats arising from anywhere in the world are often very difficult to trace accurately. Improving the dependability and security of our computer and communication systems would be a good place to start, with (for example) sensible uses of cryptography, user authentication that cannot easily be bypassed, and meaningful accountability; a sketch of one such measure follows this list. Laws and law enforcement have roles to play, but they cannot be the primary means of discouraging misuse.
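
As one concrete illustration of the "less easily bypassed" authentication mentioned above, the minimal sketch below stores only salted, iterated password hashes rather than the passwords themselves, so that a stolen credential file cannot simply be replayed. This is an illustrative sketch under stated assumptions, not a description of any deployed system; the function names and iteration count are hypothetical.

    # Minimal sketch (hypothetical names and parameters): store a salted,
    # iterated hash instead of the password, so that a stolen credential
    # file cannot be replayed directly.
    import hashlib
    import hmac
    import secrets

    ITERATIONS = 100_000  # assumed work factor; tune to available hardware

    def enroll(password: str):
        """Return (salt, digest) to store in place of the password."""
        salt = secrets.token_bytes(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return salt, digest

    def verify(password: str, salt: bytes, digest: bytes) -> bool:
        """Recompute the salted hash and compare in constant time."""
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
        return hmac.compare_digest(candidate, digest)

    salt, digest = enroll("example-passphrase")
    assert verify("example-passphrase", salt, digest)
    assert not verify("wrong-guess", salt, digest)

Such a mechanism addresses only one layer, of course; meaningful accountability and sound system architecture must accompany it.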

Internet-connected systems are especially vulnerable to viruses, worms, Trojan horses, e-mail letter bombs, calendar-time bombs, and other malicious attacks, and they remain so despite nominal improvements. The long history of relatively simple-minded mail bombs (Melissa, ILoveYou, SirCam) and of other attacks such as the recent Code Red variants suggests that much more destructive attacks can easily be conceived and perpetrated. Denial-of-service attacks, and especially widely distributed denial-of-service attacks, are easy to mount and can be quite debilitating. However, much more serious system subversions are also easy to perpetrate.

Education relating to computer systems and computer security is woefully inadequate. The technical field has developed very rapidly, and education is always hard-pressed to keep up. The problems are particularly acute for systems with critical requirements. For example, developers of secure systems, ultra-reliable systems, life-critical systems, and other systems with stringent requirements need to be more than merely competent; extensive backgrounds in dependable software engineering are required, and in some cases an understanding of mathematics far beyond what the average college student receives. System administrators are generally unprepared for the sophistication required to deal with flawed system security and weak configurations; the steady flow of security patches intended to fix earlier flaws often remains uninstalled. Managers often do not have a clue. Legislators need a much better understanding of the social and technical implications. Some people have advocated certification of developers and programmers; however, this is a very contentious matter which, if adopted badly, could easily create a false sense of security. Overall, much greater emphasis on education is needed, both to train would-be experts and to enlighten less technical people.

Outsourcing critical functionality to people who must be trusted even if they are not trustworthy is a risky strategy, although it is increasingly used in various branches of government. Dependence on questionable outsiders for software development, operations, maintenance, and administration presents many additional risks. DoD outsourcing of critical system-administration functionality and the recent use of foreign nationals for the Year-2000 remediation of air-traffic control software (apparently unbeknownst to the technical people at the Federal Aviation Administration) are recent examples of the potential risks.

In general, seemingly simple solutions are often ineffective; they can be misleading, and they tend to offer a false sense of security. Several examples are given here:

--The existing Federal Digital Millennium Copyright Act (DMCA) and the emerging Uniform Computer Information Transactions Act (UCITA, either passed or under consideration in various states) both seem to be having a chilling effect by seriously impeding the research community from helping to improve security, and by allowing system developers and vendors to hide behind inferior security. Also, genuinely well-intentioned whistleblowers are increasingly finding themselves threatened with prosecution.

--Past government efforts to prevent or impede the use of strong cryptography have seriously retarded progress in security. Cryptography and strong security should have been routinely embedded into our standard protocols and products, but unfortunately this has not happened. Security is extremely difficult to retrofit into systems that are fundamentally flawed. It should not be surprising to anyone that many cryptographically enhanced systems are so easily broken.

--At the moment, there is a mad rush to replace punched-card ballots and their vote-counting systems with all-electronic voting systems. However, today's fully electronic voting systems (such as direct-recording equipment, DREs) and especially Internet voting software fundamentally lack meaningful accountability. Because there are no voter-verified independent audit trails, there is typically no assurance whatever that a vote as cast is identical to the vote as counted. Although some people hope that this serious deficiency can be overcome in the future, that may be possible only at the sacrifice of voter privacy. In addition, Internet voting adds opportunities for election fraud from anywhere in the world, not just locally within a given precinct. Proprietary electronic voting and Internet voting systems are both highly susceptible to insider fraud that can seriously alter the results of elections; Internet voting is also especially susceptible to bogus polling places and fraudulent voting software, plus hacker attacks, viruses, worms, calendar-time bombs, and external denial-of-service attacks (to mention just a few security risks). The proprietary nature of the election software means that voters must trust software that is seldom subjected to external scrutiny. However, even open examination of the software would not be enough to prevent election fraud. I have grave doubts that fully electronic voting systems will ever be adequately fraud resistant. Interestingly, the problem of attaining high-integrity election systems is a paradigmatic example of the general system security problem, exhibiting many of the usual difficulties: inadequate requirements, lack of adequate standards, unverified proprietary software, and many unchecked operational problems.
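
To make the accountability point concrete, the minimal sketch below shows how a hash-chained audit trail would let an auditor detect after-the-fact alteration of recorded votes. It is illustrative only: real voter-verifiable audit trails are generally independent physical records, and all names here are hypothetical.

    # Minimal sketch (hypothetical design): each recorded ballot extends a
    # hash chain, so tampering with any stored vote changes every later
    # digest and is caught when the trail is re-audited.
    import hashlib

    def chain(prev: str, ballot: str) -> str:
        return hashlib.sha256((prev + ballot).encode()).hexdigest()

    def record(ballots):
        """Store each ballot together with its running chain digest."""
        trail, digest = [], "genesis"
        for ballot in ballots:
            digest = chain(digest, ballot)
            trail.append((ballot, digest))
        return trail

    def audit(trail) -> bool:
        """Recompute the chain; a single altered ballot breaks it."""
        digest = "genesis"
        for ballot, recorded in trail:
            digest = chain(digest, ballot)
            if digest != recorded:
                return False
        return True

    trail = record(["candidate-A", "candidate-B", "candidate-A"])
    assert audit(trail)
    trail[1] = ("candidate-A", trail[1][1])  # an insider alters a stored vote
    assert not audit(trail)                  # the alteration is detectable

Note that such a mechanism only makes tampering detectable; by itself it does nothing for voter privacy, nor does it assure that the vote as cast was recorded correctly in the first place.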

--Attempts to hinder Internet spamming (with its potentially huge amounts of unsolicited and often offensive e-mail) through legislation requiring filtering will always be of limited effectiveness. Simplistic spam filters are usually counterproductive: because they match on particular character strings, they have often filtered out the Bible, encyclopaedias, valuable Web sites, people's and place names (Sussex and Essex are common examples), and other generally desirable material.
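
The character-string problem is easy to demonstrate. In the hypothetical sketch below (the blocklist is an illustrative assumption), a filter that matches raw substrings blocks legitimate mail about Sussex and Essex along with the spam it was meant to stop:

    # Minimal sketch: naive substring matching cannot distinguish an
    # offensive term from an innocent word or place name that contains it.
    BLOCKED_SUBSTRINGS = ["sex"]  # hypothetical, illustrative blocklist

    def is_spam(message: str) -> bool:
        text = message.lower()
        return any(term in text for term in BLOCKED_SUBSTRINGS)

    assert is_spam("University of Sussex admissions")  # false positive
    assert is_spam("Moving the meeting to Essex")      # false positive
    assert not is_spam("Quarterly budget report")      # legitimate mail passes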


Conclusions

One conclusion from the above discussion is very simple: we are not progressing sufficiently in our attempts to achieve acceptable information security. Essentially everything I wrote in my 1995 book [5] about computer-related risks -- and particularly security risks -- still seems to apply today.

A broadly coordinated effort is needed, not just palliative measures. In principle, technological problems need technological solutions, not legal ones; legal problems need laws and enforcement, not technological solutions. In general, technologists understand the technical problems best, and the legal communities the legal ones; mismatched solutions tend not to be effective. However, many of our emerging problems require a careful combination of approaches, cognizant of the full spectrum of social, economic, technological, legal, and other needs. At the very minimum, we need vastly improved security, reliability, dependability, and survivability in the face of adversity in the computer and communication systems on which we so critically depend.

It is unfortunate that many important research advances are not finding their way into practice. In the research community, we have long known how to do much better. For example, many approaches for developing and operating vastly more secure systems and networks can be found in a recent report [6], including system and network architectures that sharply reduce the need to trust potentially untrustworthy components and individuals, while also achieving extensive interoperability and the ability to evolve over time without sacrificing the desired requirements. However, many factors have contributed to our having less information security than we deserve, including (for example) the U.S. Government's past restrictions on cryptography, the House's predominant concern with the immediate future rather than the longer term, corporations determined to deliver functionality without regard for security, customers unaware of the risks, and a general lack of commitment to progress.

What Might Congress Do?

To begin with, Congress should avoid repressive legislation that discourages better security, as has happened, for example, with past constraints on the use of cryptography and the implicit sanctioning of weak systems. On the other hand, leaving progress solely to the marketplace evidently does not work, because there are very few financial incentives to improve security significantly in the absence of serious government and customer demand. The DMCA is already causing enormous grief by retarding progress and hampering the research community's ability to inspire improved security; it needs to be revised.

There are various roles that the National Institute of Standards and Technology (NIST) could play, particularly in the development of relevant interoperable, vendor-neutral security standards. Although the Common Criteria are emerging as a potential framework for security, much remains to be done to make that process realistic. For example, NIST (when it was the National Bureau of Standards) was actively involved in election standards; a serious application of the Common Criteria to voting systems would be a major step forward. H.R. 1165 could be a step in that direction for security standards of general applicability.

Another direction to consider is liability legislation. UCITA, emerging one state at a time in state legislatures, among other things allows information-system developers and vendors to disclaim essentially all liability for failures of their products. Federal legislation imposing strict liability and consequential damages for grossly negligent system development and flagrant corporate misbehavior might go a long way toward ratcheting up the dependability, reliability, and security of our information infrastructures.

Relevant research and development efforts are still needed to provide the basis for dramatically increasing the security and reliability of our computer systems and networks. However, that research also needs to find its way into systems that are procured by the U.S. Government, setting a good example for others.

Improved computer-related education is an area strongly in need of support, to help overcome many of the problems noted above.

Overall, there are few incentives today for the development, operation, and maintenance of robust, secure, reliable computer-communication systems that are so badly needed as a basis for our future. That needs to be corrected.


References

(Hot links to the references are included in the Web version of this document: http://www.csl.sri.com/neumann/house01.html).

1. Peter G. Neumann, Computer-Related Risks and the National Infrastructures. U.S. House Science Committee Subcommittee on Technology, 6 November 1997. In The Role of Computer Security in Protecting U.S. Infrastructures, Hearing, 105th Congress, 1st session, No. 33, 1998, pages 64--99, ISBN 0-16-056151-5, preceded by the oral presentation on pages 61--63. Oral responses to oral questions are on pages 101--118, and written responses to subsequent written questions are on pages 148--161. (Written testimony at http://www.csl.sri.com/neumann/house97.html; written responses to written questions at http://www.csl.sri.com/neumann/house97.ans)

2. Peter G. Neumann, Melissa is Just the Tip of a Titanic Iceberg. Written testimony for the U.S. House Science Committee Subcommittee on Technology, hearing of 15 April 1999. (Written testimony at http://www.csl.sri.com/neumann/house99.html)

3. Peter G. Neumann, Risks in Our Information Infrastructures: The Tip of a Titanic Iceberg Is Still All That Is Visible. Written testimony for the U.S. House Science Committee Subcommittee on Technology, hearing of 10 May 2000, introduced into the record on my behalf by Keith Rhodes of the General Accounting Office. (Written testimony at http://www.csl.sri.com/neumann/house00.html)

4. Tom Marsh (ed.), Critical Foundations: Protecting America's Infrastructures, President's Commission on Critical Infrastructure Protection, October 1997. (CIAO Web site at http://www.ciao.org; PCCIP report information at http://www.ciao.org/PCCIP/index.htm)

5. Peter G. Neumann, Computer-Related Risks, Addison-Wesley, 1995.

6. Peter G. Neumann, Practical Architectures for Survivable Systems and Networks, SRI report for the U.S. Army Research Laboratory, 30 June 2000. (HTML, PostScript, and PDF versions available at http://www.csl.sri.com/neumann)