PUBLIC COMMENT ON THE VOLUNTARY VOTING SYSTEM GUIDELINES, VERSION 1.1*

Submitted to
The United States Election Assistance Commission
September 28, 2009

* This material is based upon work supported by the National Science Foundation under A Center for Correct, Usable, Reliable, Auditable and Transparent Elections (ACCURATE), Grant Number CNS-0524745. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation. This public comment narrative was prepared by Aaron Burstein and Joseph Lorenzo Hall in consultation with the ACCURATE Principal Investigators and Advisory Board Members Lillie Coney, David Jefferson, and Whitney Quesenbery. These comments benefited from contributions by Andrew Appel, Matt Bishop, Paco Hope, Sean Peisert, Eric Rescorla, Gregg Vanderheiden, and Ka-Ping Yee.

ACCURATE Principal Investigators

Aviel D. Rubin, ACCURATE Director
Department of Computer Science, Johns Hopkins University
rubin@cs.jhu.edu, http://www.cs.jhu.edu/~rubin/

Dan S. Wallach, ACCURATE Associate Director
Department of Computer Science, Rice University
dwallach@cs.rice.edu, http://www.cs.rice.edu/~dwallach/

Dan Boneh
Department of Computer Science, Stanford University
dabo@cs.stanford.edu, http://crypto.stanford.edu/~dabo/

Michael D. Byrne
Department of Psychology, Rice University
byrne@rice.edu, http://chil.rice.edu/byrne/

Drew Dean
Computer Science Laboratory, SRI International
ddean@csl.sri.com, http://www.csl.sri.com/users/ddean/

David L. Dill
Department of Computer Science, Stanford University
dill@cs.stanford.edu, http://verify.stanford.edu/dill/

Jeremy Epstein
Computer Science Laboratory, SRI International
jepstein@csl.sri.com, http://www.csl.sri.com/people/epstein/

Douglas W. Jones
Department of Computer Science, University of Iowa
jones@cs.uiowa.edu, http://www.cs.uiowa.edu/~jones/

Deirdre K. Mulligan
School of Information, University of California, Berkeley
dkm@ischool.berkeley.edu, http://www.ischool.berkeley.edu/people/faculty/deirdremulligan

Peter G. Neumann
Computer Science Laboratory, SRI International
neumann@csl.sri.com, http://www.csl.sri.com/users/neumann/

David A. Wagner
Department of Computer Science, University of California, Berkeley
daw@cs.berkeley.edu, http://www.cs.berkeley.edu/~daw/

Preface
A Center for Correct, Usable, Reliable, Auditable and Transparent Elections (ACCURATE),[1] a multi-institution, interdisciplinary, academic research center funded by the National Science Foundation's (NSF) "CyberTrust Program,"[2] is pleased to provide these comments on the Voluntary Voting System Guidelines Version 1.1 (VVSG v1.1) to the Election Assistance Commission (EAC).

ACCURATE was established in 2005 to conduct fundamental research into methods for improving voting technology. ACCURATE's Principal Investigators direct research investigating software architecture, tamper-resistant hardware, cryptographic protocols and verification systems as applied to electronic voting systems. Additionally, ACCURATE evaluates voting system usability and how public policy, in combination with technology, can better support elections.

Since receiving NSF funding in 2005, ACCURATE has made many important contributions to the science and policy of electronic voting.[3] The ACCURATE Center has published groundbreaking results in security, cryptography, usability, and verification of voting systems. ACCURATE has also actively contributed to the policy discussion through regulatory filings, through testimony and advising decisionmakers, as well as conducting policy research.[4] ACCURATE researchers have participated in running elections and assisting election officials in activities such as unprecedented technical evaluation of voting systems and redesigning election procedures.[5] Finally, the education and outreach mission of ACCURATE has flourished through the development of numerous undergraduate and graduate classes and the creation of the premier venue for voting technology research.[6]

With experts in computer science, systems, security, usability, and technology policy, and knowledge of election technology, procedure, law and practice, ACCURATE is uniquely positioned to provide helpful guidance to the EAC as it attempts to strengthen the specifications and requirements that ensure the functionality, accessibility, security, privacy and trustworthiness of our voting technology.

We welcome this opportunity to further assist the EAC and hope this process continues the collaboration between the EAC and independent, academic experts in order to sustain improvements in election systems and procedures.
[1] See: http://www.accurate-voting.org/.
[2] National Science Foundation Directorate for Computer & Information Science & Engineering, CyberTrust, see: http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=13451&org=CISE.
[3] A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. 2006 Annual Report. Jan. 2007. URL: http://accurate-voting.org/wp-content/uploads/2007/02/AR.2007.pdf ; A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. 2007 Annual Report. Jan. 2008. URL: http://accurate-voting.org/wp-content/uploads/2008/01/2007.annual.report.pdf ; A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. 2008 Annual Report. Jan. 2009. URL: http://accurate-voting.org/wp-content/uploads/2008/12/2008annualreport.pdf.
[4] List of ACCURATE Testimony. ACCURATE Website. URL: http://accurate-voting.org/pubs/testimony/ ; A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. Public Comment on the 2005 Voluntary Voting System Guidelines. Sept. 2005. URL: http://accurate-voting.org/accurate/docs/2005_vvsg_comment.pdf ; A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. Public Comment on the Voluntary Voting System Guidelines, Version II (First Round). May 2008. URL: http://accurate-voting.org/wp-content/uploads/2008/05/accurate_vvsg2_comment_final.pdf.
[5] ACCURATE researchers have participated in voting system evaluations sponsored by the States of California, Florida, Kentucky, Ohio and the District of Columbia.
[6] For more on our educational output, please see those sections of our Annual Reports (see note 3). The Electronic Voting Technology Workshop (EVT), co-located with the USENIX Security Symposium, was started in 2006 and continues to attract the highest caliber voting technology research. See: http://www.usenix.org/event/evtwote09/.
Contents

Preface
1 Introduction and Background
2 Transitioning From VVSG 2005 to VVSG II via VVSG v1.1
  2.1 Effects of the Transition on the Market for Voting Systems
  2.2 Effective Date of VVSG v1.1
3 The Importance of Auditability and Structured Data
  3.1 Progress in Supporting Auditability and Structured Data Formats
  3.2 The Importance of Standardized Structured Data
  3.3 Mandating Voting Systems Support EML
4 Significant But Limited Improvements in Cryptography
  4.1 FIPS 140-2: a Solid Foundation for Implementing Cryptography in Voting Systems
  4.2 Changes in Cryptography Specifications do not Address Systemic Issues
5 Changes to Software Security do not Obviate Software Independence
  5.1 Software Development and Workmanship Requirements
  5.2 Software Validation Requirements
6 New Requirements for Accuracy and Reliability Testing
7 The Need for Performance-based Usability Benchmarks and Testing
8 System Documentation and Technical Data Package Requirements
  8.1 Standardized Configuration Checklists for Assessing Auditing Functionality
  8.2 The Benefits of More Complete Security Specification Requirements
  8.3 Expanded Requirements for Technical Data Package Contents Are Warranted
  8.4 Requirements for Identifying Protected and Confidential Information
9 Conclusion
1 Introduction and Background
The EAC's proposed revision of VVSG 2005 will require extensive changes to current voting systems, yet yield modest benefits on key issues of accessibility, usability, reliability, accuracy, and security. The Commission claims that its proposal aims to revise technical requirements that do not require changes in current voting system hardware or "complex software changes,"[1] and to include VVSG II provisions that clarify existing requirements or improve testing; the Commission notes that VVSG II commenters were "near[ly] unanimous" in praising these provisions.[2] The EAC positions the proposed revision as a set of relatively minor technical changes and non-controversial changes in testing requirements.[3]

While the inclination to proceed incrementally is understandable (although we believe it ill-advised as a policy and market matter), the changes proposed in the draft VVSG v1.1 in some instances will require complex software changes, and they omit some crucial improvements in testing recommended in the draft VVSG II. Most importantly, we believe the proposed revisions are not well targeted. In our last public comment on the draft VVSG II we highlighted four essential improvements necessary to make substantial progress: software independence, adversarial vulnerability testing (OEVT), usability benchmark testing and volume testing.[4] However, the EAC's approach is not currently moored in a substantive prioritization of the risks that need to be addressed. The impact of this gap is most profound in proposed security revisions that lack the core requirement for software independence recommended by NIST in the draft VVSG II. There are also practical issues surrounding the transition from VVSG 2005 to VVSG v1.1 that amplify our concern with the substantive proposals. Specifically, it is unclear how the revision will affect voting systems that are currently being tested under VVSG 2005 or that will have obtained certification by the time the EAC adopts VVSG v1.1. Additionally, under the current draft, VVSG v1.1 will go into effect immediately upon adoption by the EAC, without the transition period of past VVSGs, further complicating the certification environment in an already troubled market.

Viewed in isolation, many of the proposed technical requirements would improve upon VVSG 2005. However, the proposed changes must be assessed for their effects in promoting trustworthiness in entire voting systems. While the proposed changes will yield improvements, they neither target the most substantial problems nor address second- and third-tier problems in an economical and efficient fashion. For this reason we find them misguided and inefficient: in the long run they are likely to result in greater costs for less benefit than a more comprehensive or risk-oriented approach. It is important to note that the VVSG II recommendations that provide the basis for the new software requirements in VVSG v1.1 were formulated against the backdrop of the principle of software independence, which holds that "an undetected error or fault in the voting system's software is not capable of causing an undetectable change in election results."[5] In the absence of this core principled change to the structure of voting systems, the proposed revisions lose their importance. Picking and choosing elements from VVSG II guided by a desire to avoid complex and/or controversial revisions will produce costly yet substantively unsatisfactory results.

[1] U.S. Election Assistance Commission. Proposed Guidance on Voluntary Voting System Guidelines. 74 Federal Register 26665. June 2009, 26666.
[2] Ibid., 26666.
[3] The EAC states two broad goals in issuing the draft VVSG v1.1: improving testing under the VVSG and revising certain requirements to "reflect advancements in voting technology." (Ibid., 26666) The EAC did not issue a final version of the "next iteration" of the VVSG, commonly known as "VVSG II" or "VVSG 2007," to achieve these goals. Instead, it decided to replace portions of VVSG 2005 with portions of VVSG II. See: (U.S. Election Assistance Commission, Technical Guidelines Development Committee. Voluntary Voting System Guidelines Recommendations to the Election Assistance Commission. Aug. 2007. URL: http://www.eac.gov/files/vvsg/Final-TGDC-VVSG-08312007.pdf ; U.S. Election Assistance Commission. Draft Voluntary Voting System Guidelines Version 1.1. May 2009. URL: http://www.eac.gov/program-areas/voting-systems/voting-system-certification/2005-vvsg/draft-revisions-to-the-2005-voluntary-voting-system-guidelines-vvsg-v-1-1 , Introduction).
[4] A Center for Correct, Usable, Reliable, Auditable and Transparent Elections. Public Comment on the Voluntary Voting System Guidelines, Version II (First Round). May 2008. URL: http://accurate-voting.org/wp-content/uploads/2008/05/accurate_vvsg2_comment_final.pdf.
[5] U.S. Election Assistance Commission, Technical Guidelines Development Committee, VVSG II, see n. 3, Vol. 1, § 2.4.
The EAC has yet to decide what to do with the TGDC's software independence recommendation, and the proposed VVSG v1.1 must be understood in light of this uncertainty.[6] The software changes proposed in VVSG v1.1 cannot make up for its omission, and these changes do little to lay the groundwork for software independence in the future.[7] Some changes will not be fully realized without requiring software independence; for example, requiring voting systems to generate electronic data in auditable, structured formats is an extremely valuable improvement (as discussed in Section 3), but not nearly as powerful as it would be when embedded in a framework for software-independent voting systems. Similarly, though VVSG v1.1's cryptography requirements would probably remedy some of the past misuses in voting systems' use of cryptography, it is unrealistic to expect better cryptography to produce a leap forward in the integrity of election results without software independence. As discussed in Section 4, other components of voting systems will remain vulnerable to attacks that do not require subverting their cryptographic modules. Finally, some of the new software security requirements push (or exceed) the state of the art but bring a questionable benefit to the security of the voting system as a whole. In Section 5 we point out that these requirements either focus on workmanship improvements that are likely to yield marginal improvements in testability and security, or require software validation methods that lack foolproof real-world implementations.

The changes in voting system testing and documentation hold greater promise for improving voting system testing and increasing voting system trustworthiness in the near term. However, the testing and documentation changes that are included in VVSG v1.1 would benefit from further refinement. Yet here, too, we are disappointed to see that what we believe were core recommendations in VVSG II with respect to testing, adversarial vulnerability testing (OEVT) and volume testing,[8] were not included. While the new, more stringent and precise requirements for accuracy and reliability testing are welcome, the absence of adversarial testing and volume testing misses an opportunity to centralize at the federal level expensive types of security and reliability testing that have proved valuable. The requirements for performance-based usability benchmarks and testing mark major advances in the assessment of voting systems. Yet, as we note in Section 7, test protocols for usability benchmarking being developed by NIST, which are not included in VVSG v1.1, are critically important. We recommend that they be added to VVSG v1.1 if they are ready; if not, the EAC should encourage their completion and add them to the standards as soon as possible. The documentation requirements in VVSG v1.1 hold considerable promise for encouraging voting system vendors to adopt sound security engineering practices and for setting a high standard for security testing and evaluation during the certification process. If these were coupled with core substantive revisions, including software independence, adversarial vulnerability testing, usability benchmarking and volume testing, progress would be certain. We discuss these changes, and suggest some further revisions, in Section 8.

[6] The EAC had planned to revise the VVSG II and submit the revision for another round of public comment before adoption. However, at the beginning of this year, this plan was changed to include an interim VVSG (VVSG v1.1) before taking substantially more time to consider the draft VVSG II and the associated public comments. See: U.S. Election Assistance Commission. Implementation Plan for 2005 Revision / Next Iteration. Jan. 2009. URL: http://www.eac.gov/program-areas/voting-systems/docs/vvsg-timeline-update-2005-31march09/attachment_download/file .
[7] Nor would they push voting technology in a direction that would serve an alternative to software independence. For example, in response to direction from the EAC to "[d]evelop possible alternatives to the requirement of Software Independence," NIST has offered "auditability": ensuring that "any error in [a voting system's] recording of votes or vote totals, whether randomly occurring or maliciously induced, is detectable." National Institute of Standards and Technology. EAC Research Areas for the TGDC VVSG Recommendations. Jan. 2009. URL: http://www.eac.gov/program-areas/voting-systems/docs/nist-response.pdf/attachment_download/file , 1, 3.
[8] As explained in U.S. Election Assistance Commission, Technical Guidelines Development Committee, VVSG II, see n. 3, 3:3.4, open-ended vulnerability testing involves attempts by skilled testers to falsify the assertion that a system is secure by demonstrating that vulnerabilities in a system can be exploited. Volume testing, discussed in Section 6, involves testing a large number of exemplar voting systems under conditions meant to simulate the election environment.
2 Issues Arising from the Transition from VVSG 2005 to VVSG v1.1

This section discusses some of the issues involved in the transition from VVSG 2005 to VVSG v1.1.
2.1 Effects of the Transition on the Market for Voting Systems

Given that the Commission expects the VVSG 1.1 to be in effect for two years before issuing the next iteration of the VVSG,[9] how the proposed revision will affect manufacturers and jurisdictions is an issue worth considering. First, it is unclear how the revision will affect systems that are currently under testing.[10] Our understanding of the EAC's certification program is that any system certified under VVSG 2005 will be able to maintain its certification, provided that none of the conditions that might trigger decertification[11] arise. The revision's effects on systems that are undergoing testing are less clear. Will manufacturers be required to make changes to their systems to conform to new requirements governing cryptography, software development and workmanship, and other substantive technical revisions? Or will manufacturers and VSTLs have the option of continuing testing under the VVSG 2005? Current EAC guidance on this subject, primarily the Testing and Certification Manual, does not answer these questions.

Second, the VVSG 1.1 could create some uncertainty for jurisdictions that are considering purchasing new voting systems. A division between systems certified to VVSG 2005 and VVSG 1.1 might encourage jurisdictions to wait until the latter are available.[12] This would prolong states' dependence on voting systems that were certified to the 2002 VSS (or earlier standards).[13]

2.2 Effective Date of VVSG v1.1

Further complicating matters is the fact that the Commission proposes to make VVSG 1.1 effective immediately upon adoption.[14] This is in contrast to the 24-month waiting period between the adoption of VVSG 2005 and their going into effect.[15] This choice is understandable from the standpoint that the EAC intends to make immediate improvements in testing and technical requirements until it adopts the VVSG II in final form. The Commission is also assuming that the changes in requirements are minor and thus make the immediate effect of VVSG 1.1 realistic. As later sections of our comment suggest, this assumption might not be correct. Changes in the requirements for voting system cryptography and software workmanship, for example, could require significant changes to current voting system design and architecture.[16]

[9] U.S. Election Assistance Commission, Implementation Plan for 2005 VVSG Revision / Next Iteration, see n. 6.
[10] U.S. Election Assistance Commission. Voting System Testing and Certification Program Manual. Dec. 2006. URL: http://www.eac.gov/voting%20systems/docs/testingandcertmanual.pdf/attachment_download/file . Chs. 3 & 4 do not discuss how to handle changes in standards during the course of testing a voting system.
[11] Ibid., § 7.
[12] It seems that few, if any, state statutes that require use of NASED-certified or EAC-certified voting systems require that those systems be certified to the latest national standards or guidelines. Therefore, most election jurisdictions that face such requirements would seem to have the option of purchasing older systems. See: U.S. Election Assistance Commission. State Requirements and the Federal Voting System Testing and Certification Program. 2009.
[13] Jurisdictions might be able to manage this uncertainty by purchasing systems certified to the VVSG 2005 and specifying in the purchase contract that the manufacturer is obligated to update the system to conform to the VVSG 1.1. Our experience, based on a review of hundreds of manufacturer-jurisdiction contracts, is that such contract terms are relatively rare; it is far more common for contracts to specify that jurisdictions must pay to have their systems brought into conformance with new standards.
[14] U.S. Election Assistance Commission, Draft VVSG v1.1, see n. 3, vi.
[15] On a related historical note, the EAC proposes to strike much of the description of the history of the voting system testing and certification process. See ibid., §§ 1.1-1.2. This historical content is quite helpful for understanding the current federal system, and we suggest leaving this material in VVSG 1.1.
3 The Importance of Auditability and Structured Data

Most voting systems fielded today have not been designed for auditability and evidentiary needs, i.e., robust support for maintaining the integrity of event logs, vote data and ballot configuration data. In this section, we argue that it is important for voting systems to keep data that support various notions of procedural and forensic auditing, and to do so in standardized formats and methods.

3.1 Progress in Supporting Auditability and Structured Data Formats

The VVSG v1.1 makes a number of significant changes to support more robust auditing. The VVSG v1.1 now recognizes[17] that electronic records must support the 22-month archival storage requirements from federal law,[18] recently applied to electronic records in litigation in California.[19] Changes to the standards have also been made to increase the fidelity and integrity of audit logs and event logs.[20]

In terms of electronic reports of ballot records or aggregated ballot data, the new § 2.4.4 requires voting systems to have the capability to export these records in a:

    non-restrictive, publicly available format. Manufacturers shall provide a specification describing how they have implemented the format with respect to the manufacturer's specific voting devices and data, including such items as descriptions of elements, attributes, constraints, extensions, definitions for data fields and schemas.[21]

Subsequent sections go on to detail what these reports must contain, the quality of randomization of ballot records, and other polling-place-related metadata.[22] For systems that include a VVPAT, the guidelines now include a requirement that ballot image records be exportable in "a publicly documented format, such as XML".[23]
3.2 Why Standardized Structured Data Is Important

Standardized, documented, structured, open data formats could significantly increase voting system competition and support for auditability, forensics and results reporting. Each of these four elements is important:

• Structured: Data formats need to be structured through the use of technologies like XML, in which the data is accompanied by a machine-readable document definition that can be used to validate the data. This allows enforcement of complex type systems in an unambiguous manner. A data user with XML data and the document definition can "transform" the data into any format they wish.
[16] See §§ 4 and 5, below.
[17] U.S. Election Assistance Commission, Draft VVSG v1.1, see n. 3, §§ 2.1.10, 4.1.3.2, 4.1.6.1.b, 4.1.6.2.c, 4.1.7.1 and 5.3. (Volume II requires documentation of support for these requirements. (Ibid., Volume II § 2.8.5.i.))
[18] 42 U.S.C. § 1974.
[19] Americans for Safe Access, et al. v. County of Alameda, et al., 174 Cal. App. 4th 1287 (May 22, 2009).
[20] U.S. Election Assistance Commission, Draft VVSG v1.1, see n. 3, § 2.4.4.
[21] Ibid., § 2.4.4.1.
[22] Ibid., §§ 2.4.4.2, 2.4.4.3.
[23] Ibid., § 7.9.3.b.
• Documented: Public documentation of all elements of the data format is crucial so that data creators and users can share a common understanding of the semantic and technical meaning of each data element. Data formats like PDF and DOC do not meet this element of the definition because they are either not formally documented or contain proprietary elements that are undocumented and unavailable to most users.

• Open: Data formats must be open; they must be publicly available for royalty-free use, unencumbered by copyright or patent claims from contributors to the creation of the format.

• Standardized: Finally, all this can be achieved without a deliberative, peer-review process. However, standards created in openly deliberative forums allow all potential stakeholders input into the direction of the standard, and short-circuit efforts by special interests to control its evolution.
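The "transform" property described in the list above can be sketched briefly: given a structured results report and knowledge of its document definition, a consumer can mechanically recast the data in any form they need. The element names below are invented for illustration and are not the actual EML vocabulary; this is a minimal sketch, not a real voting system export.

```python
import csv
import io
import xml.etree.ElementTree as ET

# A hypothetical structured results report; element and attribute
# names are invented for illustration, not drawn from EML.
REPORT = """\
<results contest="Mayor">
  <precinct id="P-001"><candidate name="Alice" votes="312"/><candidate name="Bob" votes="290"/></precinct>
  <precinct id="P-002"><candidate name="Alice" votes="128"/><candidate name="Bob" votes="455"/></precinct>
</results>"""

def results_to_csv(xml_text: str) -> str:
    """Transform the XML report into CSV rows: precinct, candidate, votes."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["precinct", "candidate", "votes"])
    for precinct in root.findall("precinct"):
        for cand in precinct.findall("candidate"):
            writer.writerow([precinct.get("id"), cand.get("name"), cand.get("votes")])
    return out.getvalue()

print(results_to_csv(REPORT))
```

Because the format is machine-readable and its definition is public, the same report could just as easily be recast as JSON or loaded into a statistics package; the consumer, not the vendor, chooses the presentation.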
The only standard for election data that meets these elements is OASIS' Election Markup Language (EML).[24] Other election-related data interchange standards are either proprietary, i.e., Hart InterCivic's EDX,[25] or have long since stalled and never reached approved standard status, i.e., IEEE 1622.[26]

The VVSG v1.1 could go further to support competition between voting system vendors, increased auditability of voting systems, evidentiary requirements, and results reporting to the public and the media. In terms of competition, requiring that ballot definition files be standardized into a particular format would allow, for example, makers of voter registration products to export ballot definitions and other election-specific configuration data such that a jurisdictional customer could use an improved voter registration product, or simply the voter registration product that their state-level election entity has chosen or developed to implement HAVA's requirements for statewide voter registration databases.[27] Recently, patents have been awarded to manufacturers on crucial election-related technologies, such as ballot marking devices (BMDs), that, when combined with obscure data interchange formats, ensure that customers who wish to use only the BMD product have to purchase the vendor's entire election suite. A well-documented data interchange format could allow a voting services company to design an EMS product that serviced a variety of vendors' voting system components.

Recently, interest has increased remarkably in the topic of voting system auditability. Post-election machine and manual tally auditing, in particular, has received a great deal of attention. In terms of auditability, requiring that manufacturers support standardized, structured, documented, and open data formats shifts the focus of these records from proprietary uses contemplated by the manufacturer to a customer-centered focus. Such data formats will enable election officials, not just the vendor, to perform audits. For example, in a recent study implementing state-of-the-art "risk-limiting" post-election audit models in California, the most substantial hurdle in performing the work was not complicated mathematics and statistics, but simply getting election results in a machine-readable format to perform the necessary calculations.[28] When these researchers attempted to use electronic results reports from the voting system EMSs from three major vendors, the default format was PDF, a format designed for presentation of data, not calculation. EMSs that did allow export in formats suitable for computation, such as comma-separated value (CSV) and Microsoft Excel (XLS) formats, did not describe the formats and often contained significant problems.[29] While the systems used in this study were likely certified under the 2002 VSS, we would expect that the requirements in the VVSG v1.1 should make substantial progress towards fixing problems like these. There are a number of enhancements to the VVSG v1.1 electronic results report requirements that could assist contest-specific auditing efforts.[30]

[24] Election and Voter Services Technical Committee. Election Markup Language (EML). Organization for the Advancement of Structured Information Standards. URL: http://xml.coverpages.org/eml.html.
[25] Hart InterCivic, Inc. EDX Specification 2.27.1. 2005. URL: http://www.hartintercivic.com/files/edx/edxdoc/index.htm.
[26] IEEE. Voting Systems Electronic Data Interchange—Project 1622. 2005. URL: http://grouper.ieee.org/groups/scc38/1622/index.htm.
[27] 42 U.S.C. § 15483 et seq.
[28] Joseph Lorenzo Hall, Luke W. Miratrix, Philip B. Stark, Melvin Briones, Elaine Ginnold, Freddie Oakley, Martin Peaden, Gail Pellerin, Tom Stanionis, and Tricia Webber. "Implementing Risk-Limiting Post-Election Audits in California". Electronic Voting Technology Workshop/Workshop on Trustworthy Elections 2009 (EVT/WOTE 2009) (Aug. 2009). URL: http://www.usenix.org/events/evtwote09/tech/full_papers/hall.pdf.
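One of the problems the auditors encountered, aggregate numbers that did not match values summed from individual values, becomes a mechanical check once results are machine-readable. The sketch below uses an invented CSV layout (not any vendor's actual export format) to flag contests whose reported total disagrees with the sum of per-precinct counts.

```python
import csv
import io

# Invented CSV layout for illustration; real EMS exports vary by vendor
# and, as noted above, are often undocumented.
EXPORT = """\
contest,precinct,votes
Mayor,P-001,602
Mayor,P-002,583
Mayor,TOTAL,1185
Measure A,P-001,598
Measure A,P-002,571
Measure A,TOTAL,1200
"""

def check_totals(csv_text):
    """Return contests whose reported TOTAL differs from the precinct sum,
    mapped to (computed_sum, reported_total)."""
    sums, reported = {}, {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        contest, votes = row["contest"], int(row["votes"])
        if row["precinct"] == "TOTAL":
            reported[contest] = votes
        else:
            sums[contest] = sums.get(contest, 0) + votes
    return {c: (sums.get(c, 0), reported[c])
            for c in reported if sums.get(c, 0) != reported[c]}

print(check_totals(EXPORT))
```

With a documented export format, checks like this can run automatically over every contest, rather than requiring auditors to re-key figures from PDF reports.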
Increasingly, types of forensic election data are being consulted as evidence in investigations and election contests. Researchers have pointed out that voting systems do not record sufficient forensic data.[31] They have determined what additional data should be recorded, as well as how these data can be better captured and stored to assist investigators in determining which theoretical causes and effects are the actual causes and effects of anomalies of interest. A standard model of what data need to be collected for forensic use, as well as an open, standardized grammar for recording them, would mean that auditors and investigators could concentrate on the problems at hand, instead of the idiosyncrasies and deficiencies of the voting system's infrastructure.
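To illustrate what a standardized grammar for forensic records might look like in the simplest case, the sketch below defines a minimal, hypothetical event-log record (field names invented for illustration, not drawn from any standard) and rejects records that do not conform, so that every conforming log can be analyzed with the same tooling.

```python
import json
from datetime import datetime, timezone

# A minimal, hypothetical grammar for event-log records; the required
# field names are invented for illustration only.
REQUIRED_FIELDS = {"timestamp", "device_id", "event_type", "details"}

def make_record(device_id, event_type, details):
    """Emit one structured log record as a JSON line."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "device_id": device_id,
        "event_type": event_type,
        "details": details,
    }, sort_keys=True)

def validate_record(line):
    """Accept a record only if it parses and carries every required field."""
    try:
        record = json.loads(line)
    except ValueError:
        return False
    return isinstance(record, dict) and REQUIRED_FIELDS <= record.keys()

print(validate_record(make_record("DRE-07", "ballot_cast", "ballot committed")))
```

The point is not this particular schema but the division of labor it enables: the grammar is fixed once, and investigators write their analysis tools against it rather than against each vendor's ad hoc log format.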
We have seen recent developments in the use of structured results reporting to better serve the information consumption needs of the media and the public. California uses the OASIS standard Election Markup Language (EML)[32] to report results in real time on election day and immediately afterward. This has a number of benefits, in that media outlets can design their own tools to transform or adapt this data into formats and analyses suitable for engaging the public in election news. Unfortunately, California spends a good deal of time hand-coding results that it receives from counties as PDF reports and even faxes to get this data into the EML format. Requiring voting systems to export results in publicly documented structured data formats, as the VVSG v1.1 now does, will allow states like California to simplify this process, at most having to write a transform for each vendor or county that will cast the results reported from the EMS into the format that they need for aggregation at the state level and reporting.
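Once every county reports in one structured format, the state-level aggregation step described above is a small computation. The sketch below uses invented element names standing in for a real EML report, and merges per-county XML reports into statewide totals.

```python
import xml.etree.ElementTree as ET

# Hypothetical per-county reports sharing one structured format;
# element and attribute names are invented for illustration, not EML.
COUNTY_REPORTS = [
    '<report county="Alameda">'
    '<candidate name="Alice" votes="1200"/><candidate name="Bob" votes="900"/>'
    '</report>',
    '<report county="Yolo">'
    '<candidate name="Alice" votes="300"/><candidate name="Bob" votes="450"/>'
    '</report>',
]

def statewide_totals(reports):
    """Sum candidate totals across county reports that share one schema."""
    totals = {}
    for text in reports:
        for cand in ET.fromstring(text).findall("candidate"):
            name = cand.get("name")
            totals[name] = totals.get(name, 0) + int(cand.get("votes"))
    return totals

print(statewide_totals(COUNTY_REPORTS))
```

The same aggregator works for fifty-eight counties as for two; the per-vendor effort reduces to one transform from each EMS export into the shared format.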
3.3 The VVSG v1.1 Should Require Voting Systems to Support EML

All this being said, it seems appropriate that the VVSG v1.1 should require voting systems to support specific standardized data interchange elements. The VVSG v1.1 should mandate support of election input, forensic records and output from voting systems in a specific standardized, documented, structured, open format.

What is needed here is not simply the publicly available and specified format that the VVSG v1.1 calls for, but a specific standardized grammar that voting systems must "speak".[33] At this point in time, this would likely mean that voting systems must support the following in EML v5 or later:

• election-specific voting system input in the form of ballot definitions;
[29] For example, where columns or rows appear to have been swapped and aggregate numbers did not match values summed from individual values.
[30] While an extensive list is beyond the scope of this document, we will list a few: results need to be reported at the contest level for each unit subject to audit (which can be machine, precinct, polling place, etc., depending on state law). Also, for absentee and vote-by-mail balloting, "batches" of ballots are scanned in and this "batch" is the natural unit for auditing; results reports should report all contests present in a batch with their associated aggregate results. Finally, provisional ballots need to be reported separately in these reports so they can be disaggregated from the other
[31] Matt Bishop, Sean Peisert, Candice Hoke, Mark Graff, and David Jefferson. "E-Voting and Forensics: Prying Open the Black Box". USENIX/ACCURATE/IAVoSS Electronic Voting Technology Workshop/Workshop on Trustworthy Elections 2009 (EVT/WOTE '09) (Aug. 2009). URL: http://www.usenix.org/events/evtwote09/tech/full_papers/bishop.pdf, 5.
[32] California Secretary of State. Election Night Data-Feed Information. URL: http://www.sos.ca.gov/media/.
[33] Bishop makes this case for audit logs, although his argument translates easily to election-specific input and output. See: Matt Bishop. "A Standard Audit Trail Format". Proceedings of the Eighteenth National Information Systems Security Conference, 136-145 (Oct. 1995). URL: http://nob.cs.ucdavis.edu/~bishop/papers/1995-nissc/stdaudfmt.pdf.