
Cybersecurity & Consumer Data: What's at Risk for the Consumer?

Subcommittee on Commerce, Trade, and Consumer Protection
November 19, 2003
10:00 AM
2123 Rayburn House Office Building 

Ms. Mary Ann Davidson
Chief Security Officer
Oracle Corporation
500 Oracle Parkway
1OP5
Redwood City, CA, 94065

Mr. Chairman, Ranking Member Schakowsky, and members of the Subcommittee, my name is Mary Ann Davidson, Chief Security Officer of Oracle Corporation. Thank you for inviting me here again to talk about cybersecurity, and specifically, the efforts all of us can take - as information technology consumers, producers, caretakers and policymakers - to advance information assurance.

As you know, I appeared before this subcommittee just a few months after the ghastly events of September 11th. In the shadow of one of the most tragic terrorist attacks in history, all of us contemplated the potential catastrophe caused by cyberterror on a massive scale, and the need for all of us to take far greater responsibility toward better information assurance.

While we have yet to witness a point-and-click terrorist attack, we have experienced its forebears - CodeRed, Blaster and Sobig.F - which caused billions of dollars in damage and lost productivity. These attacks are a grim reminder of what I warned this subcommittee about two years ago: Far too much commercial software is built without attention to information assurance principles, leaving many of our national cyberassets - most in private hands - vulnerable to attack.

This vulnerability increases every day. Bounty money may result in the arrest of one or two of those responsible for cyberplagues, but it won't slow the development of advanced hacking tools, or change our increasing dependence on Internet-based platforms to administer public and private enterprises - two trends that are at the heart of our growing vulnerability. We are in our own version of an arms race, and the bad guys are winning.

For the information technology industry, our contribution to cybersecurity is straightforward: to achieve a marketplace and an industry culture where all commercial software is designed, delivered and deployed securely. There are no 'silver bullets' to get there. A culture of security will require years to achieve and decades to maintain. Good intentions are not good enough and frankly, can do more harm than good. We already have seen one instance, in California, where a cyber-related event triggered a rush by the legislature to impose reporting requirements on security breaches. This law was passed without a fundamental understanding of the limits of current technology, and arguably could make consumer data more vulnerable to unauthorized access. It's not good intentions, but sound ideas that we need from government, and fortunately, there are a number of constructive steps the federal government can take, as both a software buyer and policy-maker to move us toward a culture of secure software.

Let the buyers be wary. Try as you might, Congress can't legislate good software. Those in a position to make a difference for the better are software consumers, from small business enterprises to big government agencies. All they have to do is make security a purchasing criterion. We at Oracle made the investments to integrate security throughout our development process because our customers asked for it. Our first customers, the intelligence community, whom I affectionately call the 'professional paranoids,' are some of the most security-conscious people on the planet.

After ten years of an on-again, off-again merry-go-round by the federal government to become a more responsible software buyer, we are seeing constructive action being taken by the Defense Department to enforce a pro-security approach to software procurement known as NSTISSP #11. Simply put, for national security systems, an agency can only purchase commercial software that has been independently evaluated under the international Common Criteria (ISO 15408) or the Federal Information Processing Standards (FIPS) Cryptomodule Validation Program (CMVP).

Since NSTISSP #11 went into effect 14 months ago, we've seen several positive developments. First, a number of firms, including several of our competitors, are getting their products evaluated under FIPS or the Common Criteria for the first time. Second, we're seeing firms, including Oracle, financing evaluations of open source products. The security of open source versus proprietary software must not be a religious argument, as it so often is, but a business one. Open source, like proprietary software, is here to stay. We must all work to make it as secure as possible. Third, several industry organizations, such as the financial services industry, are coming together to make security a purchasing criterion industry-wide and are using NSTISSP #11 as a model.

We're seeing all of this because the initial impression from an industry perspective is that the federal government - the largest single buyer of commercial software - means business this time. As a result, security is now more in the software development consciousness than it was two years ago, and all of us as information technology consumers stand to benefit. That, in and of itself, is a major victory, and credit goes to the people within the Defense Department and intelligence agencies, as well as Congress, who are making a concerted effort to make this process work.

Secure 'out of the box.' NSTISSP #11 is a strong lesson that the federal government, acting as a security-conscious software buyer, can change the entire commercial software landscape for the better. That said, are there other measures that can accomplish the same purpose? We believe one worth considering is for the federal government to insist that the commercial software it buys either defaults to a secure setting right out of the box, or makes it easy for the customer to change security settings, for example through automated tools that enable customers to become, and remain, secure. The Office of Management and Budget, working in conjunction with the federal agencies, the National Institute of Standards and Technology (NIST) and private industry, could specify the appropriate default security settings for the software the government buys, or require appropriate and easy-to-use tools to change these settings.

Software Underwriters Lab. Government can be a useful vehicle to promote voluntary cooperation in the name of better security. For example, the Federal Trade Commission could work with the software industry to establish the software equivalent of the Underwriters Laboratories (UL). Security evaluations under the Common Criteria, which can cost half a million dollars per evaluation, are not for everyone, especially for many forms of consumer software. A software version of the UL is a cost-effective vehicle to capture less complex, more consumer-oriented forms of software. Again, the fundamental goal is to make all commercial software secure by design, delivery and deployment. To get there, the federal government should work with private industry to establish a consumer software equivalent of the UL. Thanks to the UL, most consumer products are generally difficult to operate in an insecure fashion. For example, Cuisinarts are designed so that you can't lose a finger while the blades are whirling. We don't expect the consumer to do anything special to operate Cuisinarts securely; they just are secure. Similarly, consumers should not be expected to be rocket scientists or security experts. Industry needs to make it easy to be secure.

Better Information for Buyers. There are already several good web sites to help private and public customers understand Common Criteria, FIPS and NSTISSP #11. However, particularly as more and more private customers see Common Criteria as a potential security benchmark, we are finding that what many of our customers need is a one-stop, 'go to' site where they can validate vendor security claims against the evaluation results themselves. It would be useful for a government procurement officer, or a private sector buyer, to be able to see all evaluations of any type, for a single vendor, at a single glance, from a single location, whether FIPS-140 or Common Criteria, whether evaluated here or abroad. This empowers them to make apples-to-apples comparisons. For example, two database vendors can both receive an EAL4 certification even though one made only two functionality claims in its security target while the other made forty. A clearinghouse would enable buyers to perform security target 'scorecarding' and facilitate this and other types of comparisons.

Academic Research and Professional Development. As in many disciplines, the market alone cannot produce every security solution. A culture of security, like any professional culture, has to have an academic component for professional development, and to advance the field in areas not addressed in the commercial marketplace. For example, even with a good development process, "to err is human." A developer can check 20 of 21 conditions, and if failure to check the 21st causes a buffer overflow, the system is still potentially vulnerable. Keep in mind, hackers only need to find one error, while developers have to anticipate and close every one. It's an uneven battle. Federal resources directed toward academic talent, in partnership with industry, can help level the playing field.

One area that deserves attention, especially as more and more US firms partner with foreign countries on software development, is research on effective tools that can scan software and pinpoint irregularities or backdoors in the code. Unfortunately, this type of product research and development is not seen as an attractive option among venture capitalists, who generally channel their funds toward products that are nothing more than techno-band-aids for security faults. In other words, the market mentality toward information assurance is focused on developing a better Band-Aid, rather than an effective vaccine.

Congress last year took an important step in filling this void when it passed the Cyber Security Research and Development Act, which authorizes nearly a billion dollars over five years to invest in projects like code-scanning tools. Yet as we enter the second year of this five-year program, Congress has appropriated only limited funds to pursue the goals of this legislation. We hope Congress will increase its investment.

If the medical community could eradicate smallpox with a strong investment in research, we should be able to eradicate buffer overflows. It's just code, after all.

A portion of the proposed investments under the Cyber Security R&D Act is authorized to create or improve academic programs and research centers on computer security in order to increase the number of graduates with this specialty. These kinds of investments are needed. The National Science Foundation reported earlier this year that only seven PhDs in cybersecurity are awarded each year. Research conducted more than two years ago found that while there were twenty-three schools identified as 'centers of excellence' in information assurance, not one four-year university offered a bachelor's program in cybersecurity. Only one associate degree program was offered at two-year institutions. We've seen some progress on this front, but much more can be done if the federal government invests more resources in this effort. The private sector can be a critical support component as well, especially given the current and growing demand for information security professionals among publicly held corporations.

In the IT industry, no one should be able to work on software that becomes part of critical infrastructure without proving that they understand and can demonstrate sound software design, coding and engineering principles. We do not allow engineers to design buildings merely because they use "the coolest materials." They must be licensed professional engineers. Why do we hire programmers to design critical IT infrastructure merely because they know the coolest programming languages? Ignorance and hubris are the enemies of reliable cyber infrastructure. Industry lacks for neither of these, unfortunately, so long as we hire based on what programming languages someone knows, and not whether they speak the language of cybersecurity. We are at war, and all our footsoldiers must be armed with the knowledge of what the enemy can and will do to the unprepared or careless.

A strong academic component in our culture of security also fosters a competitive and diverse culture. Strong competition and diversity will prevent the IT equivalent of the Irish potato famine, where reliance on one strain of potatoes brought on mass starvation and emigration. Similarly, lack of 'biological' diversity in many IT infrastructures renders them immensely susceptible to cyberplagues. I dare say that far more than one quarter of our population would be affected should the next cyberplague be more destructive than its predecessors. Biological diversity breeds resistance. Lack of it is deadly.

As today's hackers and virus spreaders demonstrate every day, cybersecurity is an evolving discipline, one that combines art and science, and determination and passion. One cannot simply take a snapshot of a company's IT systems today and compare it to some preconceived list and say 'yes, you are secure,' or 'yes, you are doing the right things toward better security.' The state of the art is in a perpetual state of revolution.

Ultimately, any culture is as good as the institutions that serve as the foundation of that culture. So, if there is an overarching recommendation for you and your congressional colleagues, it is to work with us in industry and in academia to facilitate the development of the institutions, practices and mores necessary to build a strong, vibrant and diverse culture of security. I believe we have turned a corner, and are making progress toward getting more and more of our customers to think about security. Further steps are needed, such as the ones outlined here. Again, these recommendations are no silver bullets, but what we at Oracle believe are the next appropriate steps up this ladder of better security. We are very pleased to be a part of next month's Cybersecurity Summit being planned by the Department of Homeland Security, and some of our leading trade associations. Establishing that kind of regular, continuing dialogue is yet another link toward making sure we have truly turned a corner for the better, rather than yet another trip on the merry-go-round of information assurance.

Thank you again, Mr. Chairman, for the opportunity to appear before you today.