ACM Washington Update, Vol. 12.2 (February 5, 2008)

By David Bruggeman
February 5, 2008


[1] Newsletter Highlights
[2] ACM Groups Urge Actions to Broaden Web Accessibility
[3] ACM Announces 2007 Turing Award Winners
[4] ACM Public Policy Office Assesses Technology Policy in 2008
[5] USACM Criticizes the Final REAL ID Regulations
[6] Voting Study Addresses Usability Concerns
[7] FY 2008 Appropriations Significantly Affect Physical Sciences
[8] MPAA Acknowledges Flawed Study on University Downloading
[9] About USACM

[An archive of all previous editions of Washington Update is available at]


It’s already been a busy year with technology policy developments on a number of issues. Congress just started its new session and we take a look at what is on its agenda for this year. There is more detail on each item below, as well as on our weblog at

* USACM, along with three of ACM’s Special Interest Groups and the Computer Science Teachers Association, urged policymakers to take more steps to broaden web accessibility.

* ACM announced that Edmund M. Clarke, E. Allen Emerson, and Joseph Sifakis won the 2007 Turing Award.

* The ACM Public Policy Office is releasing a weekly series of posts analyzing key issues in technology policy for 2008. In this newsletter we cover e-voting, REAL ID, the “Innovation Agenda”, and identity theft and data security. Be sure to read up on each topic!

* The Department of Homeland Security released final regulations for the REAL ID program, addressing few of the privacy and security concerns outlined by USACM and others in comments submitted last year.

* A new study takes a comprehensive look at usability issues related to electronic voting machines.

* The FY 2008 Appropriations will have significant impact on various government agencies that support physical science research.

* The Motion Picture Association of America acknowledged that its study on university downloading of movies was flawed, significantly overstating the frequency of illegal downloads on university campuses.


In a Joint Statement with the ACM Special Interest Groups on Accessibility (SIGACCESS), Hypertext, Hypermedia and the Web (SIGWEB), and Computer-Human Interaction (SIGCHI); and the Computer Science Teachers Association (CSTA), USACM released a statement outlining principles computer scientists should follow to ensure that web pages are accessible to those with disabilities. You can see the statement at:

The statement emphasizes the following principles:

Increase Awareness. The Federal government should undertake a range of activities, including public-private partnerships, to increase awareness of the value of building accessibility into systems. These efforts should include building awareness of these issues internationally as well as domestically.

Develop Tools. Resources (such as software tools and guidelines) already exist to help make commercial websites more accessible. The information technology community should continue to develop additional low-cost web-development tools and promote their adoption. Further, the Federal government should continue to promote and fund research and development of more accessible information technology systems.

Extend Accessibility Standards While Minimizing Regulatory Burden. Free participation and innovation by individuals and organizations contributing to the Internet and its growth are important. Minimal regulation has helped foster the development and spread of Internet technologies. We recommend that requirements be crafted that balance the values of accessible participation and innovation, and that those requirements be extended to public, commercial websites. In addition, we recommend that well-known and well-vetted standards be used as the starting point for enhancing and extending accessibility requirements so that public, commercial websites are accessible to all people.

There are resources online to assist web page designers with making their sites more accessible. Links to those resources are available with the statement online.
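As a toy illustration of the kind of low-cost tooling the statement calls for, the sketch below (a hypothetical example, not one of the linked resources) uses Python’s standard-library HTML parser to flag images that lack the alt text screen readers depend on:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Collect the src of every <img> tag missing an alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)  # attrs arrives as (name, value) pairs
            if "alt" not in attributes:
                self.missing.append(attributes.get("src", "(unknown)"))

def check_alt_text(html):
    checker = AltTextChecker()
    checker.feed(html)
    return checker.missing

page = '<p><img src="logo.png" alt="ACM logo"><img src="chart.png"></p>'
print(check_alt_text(page))  # ['chart.png']
```

Real accessibility tools (and the guidelines the statement points to) check far more than alt text, but the principle is the same: many accessibility problems are mechanically detectable.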


ACM has named Edmund M. Clarke, E. Allen Emerson, and Joseph Sifakis the winners of the 2007 A.M. Turing Award, widely considered the most prestigious award in computing, for their original and continuing research in a quality assurance process known as Model Checking. Their innovations transformed this approach from a theoretical technique to a highly effective verification technology that enables computer hardware and software engineers to find errors efficiently in complex system designs. This transformation has resulted in increased assurance that the systems perform as intended by the designers.

Clarke of Carnegie Mellon University, and Emerson of the University of Texas at Austin, working together, and Sifakis, working independently for the Centre National de la Recherche Scientifique at the University of Grenoble in France, developed this fully automated approach that is now the most widely used verification method in the hardware and software industries.

Among the beneficiaries of Model Checking are personal computer users, medical device makers, and nuclear power plant operators. As computerized systems pervade daily life, consumers rely on digital controllers to supervise critical functions of cars, airplanes, and industrial plants. Digital switching technology has replaced analog components in the telecommunications industry, and security protocols enable e-commerce applications and privacy. Wherever significant investments or human lives are at risk, quality assurance for the underlying hardware and software components becomes paramount.
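At its core, Model Checking exhaustively explores every reachable state of a finite model of a system and checks each state against a desired property. The sketch below is an illustrative toy, not the laureates’ algorithms (which also handle temporal-logic properties and use symbolic techniques to cope with enormous state spaces); it checks mutual exclusion for two processes guarded by a shared lock:

```python
from collections import deque

def replace(t, i, v):
    """Return tuple t with position i set to v."""
    return t[:i] + (v,) + t[i + 1:]

def successors(state):
    """Toy transition system: each process cycles idle -> trying ->
    critical -> idle, entering the critical section only when the
    shared lock is free."""
    procs, lock = state
    for i, p in enumerate(procs):
        if p == "idle":
            yield (replace(procs, i, "trying"), lock)
        elif p == "trying" and lock is None:
            yield (replace(procs, i, "critical"), i)   # acquire the lock
        elif p == "critical":
            yield (replace(procs, i, "idle"), None)    # release the lock

def check_safety(init, bad):
    """Breadth-first search over all reachable states; returns a
    counterexample state violating the property, or None."""
    seen, queue = {init}, deque([init])
    while queue:
        state = queue.popleft()
        if bad(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

init = (("idle", "idle"), None)
violation = lambda s: s[0].count("critical") > 1
print(check_safety(init, violation))  # None: mutual exclusion holds
```

Industrial model checkers apply the same reachability idea, but with symbolic data structures (such as binary decision diagrams) that let them handle state spaces far too large to enumerate explicitly.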


The ACM Policy Office is releasing a series of posts this year assessing key issues in technology policy and likely activity in those areas during 2008. These posts are released approximately once a week, and are available at the Policy Office Blog:

So far this year, posts have been released on electronic voting, REAL ID, the ‘Innovation Agenda’ and identity theft/data security. Other topics scheduled in the series include health IT, patents and copyrights, and internet filtering.

The E-Voting post covered legislative developments in electronic voting along with the new proposed standards for electronic voting systems, and the continued contest of a House race involving electronic voting machines.

The REAL ID post addressed the final regulations (see article below), states’ actions against REAL ID, and the possibility of expanding the use of the program.

For the Innovation Agenda, we assessed the legislative and budgetary picture for the four main goals of the agenda: STEM education, funding for physical science research, the R&D tax credit, and immigration reform.

The identity theft and data security post covered the continuing trend of data breaches and the likelihood of Congressional action on both data security and privacy legislation.

Keep checking the USACM Weblog for additional posts as 2008 continues.


In January the Department of Homeland Security released the final rule on REAL ID. This is the program that requires new driver’s licenses or identification cards for people to gain access to federal facilities or other federally controlled activities that accept licenses for identification. In other words, you will need a REAL ID if you want to present your license at the airport. Criticized as a de facto national ID, REAL ID was passed as part of a budget bill in 2005 with little debate. It would enshrine the notion of one driver, one ID in a card that is supposed to be more secure than current identification documents.

The following press release summarizes some of the proposed changes in the final rule compared to the preliminary rules released last March. Most of the changes were to the implementation schedule, meaning that REAL ID will roll out in phases over a few more years than originally intended. States can apply for extensions, but they must demonstrate that they are attempting to comply with the law. Seventeen states have passed legislation rejecting or lodging objections to the law.

The Department released preliminary rules on REAL ID in March of 2007, and in May USACM submitted lengthy comments objecting to various provisions of the proposed rules, as did 21,000 other parties. You can see a summary of our comments, and the comments in full, online:

The changes made to those preliminary rules are insufficient to counter the significant privacy and security risks embedded within REAL ID. We indicated as much in our press release on the subject:

The extension of implementation deadlines will simply spread the significant costs of this program over more years. The states will still shoulder the burden of digitizing and storing large amounts of information, which will prove a greater target for identity theft, as the licenses issued will become more valuable due to the increased trust placed in their validity. The databases required by this program lack specific privacy and security policy guidelines, including access controls, that would help minimize intrusions into this data that can affect people’s privacy and weaken the security of these identification documents. Additionally, the scaling up of several databases as envisioned with these rules will trigger additional problems not foreseen in these systems at their current, smaller scale.


In the debates about electronic voting machines the claim is often made that direct recording electronic (DRE) voting machines are much easier to use and much more accurate in capturing voter intent than other voting systems. A new comprehensive usability study of five commercial e-voting machines (published by the Brookings Institution) finds that we still have a long way to go in improving machine usability and that ballot design remains a challenging issue. We haven’t read the full study (available from Brookings Press for $19.95), but you can order it here:

While the study found voter confidence and satisfaction with the systems was good, error rates increased as voter tasks became more complex. MIT Technology Review has an article on the study. From the magazine: “… Bederson says that even for the simplest task, voting in one presidential race on a single screen, participants had an error rate of around 3 percent. When the task became more complicated, such as when voters were asked to change their selection from one candidate to another, the error rate increased to between 7 and 15 percent, depending on the system. Bederson notes that, although the error rate that occurred in the study may not necessarily mean that there is the same error rate in terms of actual votes on actual machines, the study does raise concern, considering how close some recent elections have been.”

The full article in MIT Technology Review is available at:


Last month, we discussed the budget meltdown that crippled the American Competitiveness Initiative’s (ACI) efforts to double physical science research funding. Now, thanks in part to analysis by Peter Harsha at the Computing Research Association, we have some of the specific details about how this budget will seriously compromise federal physical science research.

Peter’s analysis focuses on the National Science Foundation and DOE Office of Science parts of the ACI. You can read it online at:

Several programs in both institutions will see grant and award sizes reduced, scientists let go, and other programs delayed, reduced in budget, or flatlined. The picture for the National Institute of Standards and Technology is no better. Many of the affected programs listed below involve computing in some capacity.

The President had requested about a $66 million increase for NIST’s labs this fiscal year, but they received only $6 million. Here is how the agency will allocate that $6 million:

* $893,000 to fund the earmark for the New York Nano Measurement Facility
* $3.2 million for the Innovations in Measurement Science program
* $1 million for the U.S. Measurement System
* $700,000 for nuclear magnetic resonance equipment at the Hollings Marine Lab
* $300,000 for the Baldrige National Quality Program
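As a quick sanity check, the listed allocations can be tallied to confirm they account for roughly the $6 million the labs received (figures as given above):

```python
# FY 2008 NIST labs allocations, in dollars (from the list above).
allocations = {
    "New York Nano Measurement Facility earmark": 893_000,
    "Innovations in Measurement Science": 3_200_000,
    "U.S. Measurement System": 1_000_000,
    "Hollings Marine Lab NMR equipment": 700_000,
    "Baldrige National Quality Program": 300_000,
}
total = sum(allocations.values())
print(f"${total:,}")  # $6,093,000 -- roughly the $6 million appropriated
```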

What is more instructive is the laundry list of new/expanded programs that were proposed but will not be funded:

* $11 million Enabling Nanotechnology from Discovery to Manufacture
* $5 million Measurements and Standards for the Climate Change Science Program
* $7 million Enabling Innovation Through Quantum Science (including Quantum Computing)
* $4 million Disaster Resilient Structures and Communities
* $3.25 million National Earthquake Hazards Reduction Program
* $4 million Enabling the Hydrogen Economy
* $1 million Manufacturing Innovation through Supply Chain Integration
* $1.5 million Synchrotron Measurement Science and Technology: Enabling Next Generation Materials Innovation
* $1 million International Standards and Innovation: Opening Markets for American Workers and Exporters
* $1 million Bioimaging: A 21st-Century Toolbox for Medical Technology
* $600,000 Cyber Security: Innovative Technologies for National Security
* $2 million Biometrics: Identifying Friend or Foe

This serious cut in funding makes the efforts for the FY 2009 budget, which is starting now, even more important. We will follow this issue throughout the budget process.


The Associated Press recently reported that a high-profile study the Motion Picture Association of America issued in 2005 is significantly flawed. Specifically, the study said that 44 percent of the industry’s domestic losses came from students’ illegal downloading at universities. The MPAA says that due to “human error” that figure is more like 15 percent. Some are even arguing that it is more like three percent because of further flaws with the study. The Associated Press report is available here:

That’s quite a difference, and it calls into question the credibility of the entire report. What other data is flawed? Is the $6.1 billion in reported losses also suspect?

Errors like this are a big deal because they misinform the critical policy debates that often surround studies like this. In fact, this report helped drive recent Congressional proposals to either require universities to install technology filters or strong-arm them to do so.

In 2006 the House Judiciary Committee held a hearing titled “An Update: Piracy on University Networks.” The letter announcing the hearing leads with the 44 percent statistic, which is used to frame the hearing. That hearing was used as a platform for a legislative proposal on campus-based filtering that Congress is likely to consider in February. The point here is that advocates, think tanks, and policymakers have used the 44 percent figure to justify legislative action in this area. The question is whether Congress will change course now that the data is clearly flawed.

Regardless of the outcome of the proposed legislation, it is hard to have a rational debate about controversial issues when advocates are throwing around deeply flawed data.


USACM is the U.S. Public Policy Committee of the Association for Computing Machinery (ACM). ACM is an educational and scientific society uniting the world’s computing educators, researchers and professionals to inspire dialogue, share resources and address the field’s challenges. ACM strengthens the profession’s collective voice through strong leadership, promotion of the highest standards, and recognition of technical excellence. ACM supports the professional growth of its members by providing opportunities for life-long learning, career development, and professional networking.

For more information about USACM and ACM, see:


For earlier editions of the ACM Washington Update, see:


To subscribe to ACM’s Washington Update newsletter, send an e-mail to with “subscribe WASHINGTON-UPDATE First Name Last Name”
(no quotes) in the body of the message.

To unsubscribe, simply include the “SIGNOFF WASHINGTON-UPDATE” command in an
email to