HFP Subcommittee Teleconference
Friday, November 3, 2006 2 PM ET

Agenda:

0. TGDC and EAC updates from Allan, John Wack
1. Discussion of updated Usability and Accessibility Sections--Sharon and John
2. Update on benchmark experiments and ballot instruction experiments--Sharon
3. Discussion of next steps to talk about at the TGDC meeting--Sharon
4. Other issues or items

Next telecon is Thursday, November 9, 11:30AM ET.

Participants: Allan Eustis, David Baquis, John Cugini, John Wack, Nelson Hastings, Sharon Laskowski, Tricia Mason, Wendy Havens

Administrative Update:

  • Allan: With appropriate public posting, David is now an invited government expert from the U.S. Access Board and can participate in HFP teleconferences between now and the December plenary meeting.
  • Sharon: the majority of today's meeting will be John Cugini going over the updated usability and accessibility section. We would like to get a draft for formatting out in the next week.

Discussion of updated Usability and Accessibility Sections

John C. has marked with question marks the items he feels need to be discussed. Anyone who has other issues is welcome to raise them.

3.1.3. New section called "The Relationship between HAVA and VVSG". There is confusion about how the two interact. This material was added to clarify the distinction between HAVA legal requirements and "requirements" within the VVSG. Maybe this belongs in a general VVSG section, not just an HFP section. [Allan agrees it should be in the overview.] Will stay as is unless there are objections. [David: Questions about enforcement. Under the VVSG, do we really want the test labs to provide enforcement, or should it be the EAC? JC and SL agreed to change from test labs to the EAC (they are the rule-making body).]

3.2.2.1 B. Overall Efficiency. Clarification. The overall requirements apply to all voting stations; the accessibility requirements are additional. The usability requirements need to be applied in a different way at the accessible stations, in particular because of the differences between video and audio. We do not have a benchmark for this yet. We want a requirement for efficiency, i.e., how long it takes voters to vote; note that different benchmarks are needed for audio versus visual. No objections.

3.2.2.2. Performance Requirements for Specific Tasks. Thinking is turning against this because it is too detailed and too hard to measure, so this section may go away. We don't want testing by the labs to be too detailed because of costs.

3.2.3.2.C. Handling of Marginal Marks. Handed off from CRT. This is for paper ballots. Will stay as is unless there are objections. [David: Clarification. Will they be filled out by hand and inserted into a scanning machine? Yes, this applies to opscan systems.]

3.2.4.G. Icons and Language. Also from CRT. This is because we can't rely on color alone. Must require icon and text. No objections.

3.2.5.G. Visual Access to VVPAT. Most accessibility issues for VVPAT are covered by general requirements. This one deals with comparing the paper to the screen. If you want comparable formats between the two, you need to call that out specifically. [What does same posture mean? Paper visible at the same height as the machine.] Is "posture" the best way to write this? Sharon/John to re-word. [What is meant by same format? Summary capabilities must be the same.] Default position: doing nothing for now on this requirement; John W to come up with some language. [David to take posture issues to an ADA specialist for recommendations.]

Can the TGDC recommend areas to the EAC that they may want to consider for research? Hopefully so. STS wants to do this as well. Note: May want to create a list of things to consider.

3.2.6.1 Timing Issues. New section; we should look at it carefully. You have to do different things for audio versus video. The requirements have changed. [David: Clarification on section C, is this about initializing? This is any kind of response from the machine. There is no good way to put an end time on audio since it is content dependent. We should have input from the disability community.]

3.2.9. Usability for Poll Workers. This is one we gave away to the Core Requirements subcommittee. Originally this was a section about usability of documentation. It seemed more appropriate to give to CRT since they were responsible for other documentation. It should be practical and usable for average poll workers. There should be style guides on how to put documentation together. [David to provide documents on accessibility of documentation.] Sharon not sure how far to go into accessibility for this topic.

3.3.2. (C) and (D) High Contrast for Displays and Adjustable Saturation for Color Displays. John thinks the purpose is that people with vision problems have a choice of high and low contrast and high and low saturation, not an actual color change. John reformulated and simplified this section. We should put out guidance about universal colors and publish best practices. [David: Good place to harmonize between this and the Section 508 standards.]

3.3.3.D Ballot Activation. The clause about "normal procedure" is a little odd; who knows what the normal procedure is? John would like to reword. Must be based on equipment, not on procedures in the polling place.

3.3.5.B Allowance for Assistance. New requirement. Suggested in comment period. No objections. [David will pass to ADA specialist - "adequate room" is arbitrary and not measurable.]

John C. intends to have this to the formatting contractor by next weekend. Any comments should get to John a.s.a.p.

Adjustable Controls: A lot of these things are adjustable by the voters, except contrast, which should be adjustable either by the voter or the poll worker. Not sure why contrast is singled out. John proposes making all visual aspects adjustable by the voter. Will discuss further with Whitney.

Update on benchmark experiments and ballot instruction experiments - Sharon

We're trying to derive benchmarks from the usability testing. We don't have a formal report; we're still looking at preliminary statistics to figure out how to write the benchmarks. Initial "rough" results (data were collected on paper optical scan and on DRE with VVPAT systems) show that timing wasn't much different between the two. All voters were confident and had average satisfaction with the voting experience. This is the first batch. Out of 23 DRE users, 15 made mistakes. With the paper optical scans, 7 people made mistakes. Outcome: We are able to measure error rates.
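As a rough illustration only, the sketch below (Python, with hypothetical function and variable names) shows the kind of error-rate calculation the counts above support: an observed rate plus a simple normal-approximation confidence interval for the DRE participants. This is not the NIST benchmark methodology, which is still being worked out; the paper optical scan rate would be computed the same way once its participant count is reported.

    import math

    def error_rate_with_ci(errors, participants, z=1.96):
        # Observed error rate with a normal-approximation 95% confidence
        # interval. Illustrative only; the actual benchmark statistics
        # are still under development.
        p = errors / participants
        half_width = z * math.sqrt(p * (1 - p) / participants)
        return p, max(0.0, p - half_width), min(1.0, p + half_width)

    # DRE with VVPAT: 15 of 23 participants made at least one mistake
    # (figures from the minutes above).
    rate, low, high = error_rate_with_ci(15, 23)
    print(f"DRE error rate: {rate:.2f} (95% CI roughly {low:.2f} to {high:.2f})")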

Discussion of next steps to talk about at the TGDC meeting - Sharon

Any pressing issues to discuss with Bill Jeffrey?

  • STS plans and how they affect usability. Is this controversial? Ron Rivest and David Wagner want to put forward that software independent systems should be the only ones allowed. People might say paper is not usable or accessible. Preliminary evidence is that people make fewer errors using paper ballots than DREs.
  • Test and certification procedures. EAC accreditation. [See CRT's website; John W to forward to Sharon.]

Next telecon is Thursday, November 9, 11:30AM ET.


