Coalition for Voting Integrity, the home of the Voice of the Voters

Rebuttal to Bucks County Voting Machine Assessment



Bucks County Voting Machine Assessment Rebuttal by CVI Volunteer Janis Hobbs-Pellechio et al.

 

March 29, 2006

Bucks County Voting Machines

Results of Work Group Assessment

I.       Background

 

An interdepartmental, interdisciplinary work group was formed by the Board of Elections for the purpose of examining in depth the proposals from three voting machine companies.

 

Over the past several weeks, the work group undertook the following activities:

 

- Interviewed three vendors
- Analyzed cost proposals
- Compared cost proposals with the CoStars-10 (state) purchasing contract
- Secured and reviewed the contracts from other jurisdictions for voting machine acquisition
- Projected the 10-year operating costs for each of the three systems
- Assessed cost, reliability, and compliance with the Help America Vote Act (HAVA)
- Discussed deliverables and schedules with each vendor
- Prepared a comparison of the three systems, noting pros and cons
- Considered the input from the public at the Voters' Forums held in January 2006
- Formulated observations and conclusions to pass on to County administration

       Why didn't they examine research material provided by the Coalition for Voting Integrity?

The work group contained representatives from the following departments:

- County Solicitor
- Information Services
- Purchasing
- Finance
- Voter Registration/Board of Elections
- Voting Machines
- Community Services Division/Planning Commission

Who exactly were the persons who comprised this group?

Why aren't names provided?  Why aren't we told how often they met?

     

The three vendors under consideration are:

 

1.   Advanced Voting Solutions, Inc.

2.   Election Systems and Software (ES+S)

3.   Danaher Industrial Controls

 

These three vendors demonstrated their machines at the public Voter Forums sponsored by the County Commissioners on January 18 and 19, 2006.

 


II.     Factors Considered

The factors considered by the group were the following:

 

RELIABILITY OF VOTE – Information Services Evaluation

Reliability of vote

How in the world could they figure out the reliability without an independent, hard-copy backup of the data?

Election Night Reporting -  Integration with our election reporting system

Software and technology used

Long-term record and stability of company

Did they even check the backgrounds of these voting machine companies?  If they did, they might be less comfortable with the choices!

 

HAVA REQUIREMENTS and COMMONWEALTH, FEDERAL CERTIFICATION

At the time they were checking these systems, the Danaher system was NOT up to 2002 standards. According to a phone conversation with Chet Harhut, HAVA administrator for the State, the State and the vendor decided by internal agreement to drop the State requirement that a vendor receive a NASED 2002 qualification number before counties could purchase that machine.

 

On the NASED website, Danaher is qualified to the 1990 standards. Repeated phone calls to the EAC ITA secretary have not been returned, so it is not possible to know whether the NASED technical review committee had its questions about lab tests answered and has issued a NASED 2002 qualification number to Danaher.

VOTER-VERIFIED PAPER AUDIT TRAIL

If this was a consideration, then the optical scan system was the ONLY system available that offers it. The Danaher system has never had a VVPAT, and its add-on has certainly never been approved in Pennsylvania, as this very report mentions.

 

EASE OF USE, FAMILIARITY

Full face or partial ballot

Stability of machine

Care of machine

 

COUNTY STAFF RESPONSIBILITIES

County staff responsibilities for set-up and delivery of initial system

 

Training and support provided by vendor

County staff

Poll workers

Public

 

WAREHOUSE AND TRANSPORT

Warehouse needs: storage area, electrical, climate control

Transport requirements

              Each Danaher machine needs to be plugged in year-round

DELIVERABILITY

Earliest Date

 

AFFORDABILITY

Initial Investment

Cost over 10-year period

Number of machines needed

Available to purchase from State contract

Rent or lease option

Life expectancy of machines

Options for high-turnout presidential elections

These factors are covered in the comparison matrix. Information on how the assessments were made is discussed in the text.


Information Services Assessment

Two members of the IS staff were assigned to the work group and assessed the software/hardware/IS needs of all three systems. Advanced uses Windows-based software, which was considered to have pros and cons. The other two vendors use proprietary firmware. Advanced uses Smart Cards, which were confusing to poll workers.

Did they have actual poll workers trying these systems? If not, where did they get this claim about Smart Cards being confusing for poll workers? This report either relies solely on the “work group” members, or it also uses outside sources. If the latter, we can introduce reams of outside sources to contradict nearly everything in this report. Also, any IS professional is more aware than the average citizen of how computers can be error-prone, can crash, can be hacked, and so on; they would know that an independent backup is essential to preserving data.

 

HAVA Compliance

All three systems meet the system requirements of HAVA and have been certified by the federal and state governments.

 

Voter-Verified Paper Audit Trail (VVPAT)

ES+S uses a paper ballot which is scanned. The other two systems can be adapted to include a paper ballot, although these add-on components have not been approved in Pennsylvania. The Advanced VVPAT system is not available.

Not all VVPATs are equal in reliability, security, and ease of use. As stated above, the optical scan is the only system that inherently provides a voter-verified paper trail. The voter fills out the paper ballot by marking circles with a pencil or pen next to the candidates; the ballot is then fed into and read by a scanner and retained in the scanner for use in recounts or audits. In one very easy operation, the ballot is filled out, voter-verified, cast, and counted: it is simple, easy, and reliable.

 

The so-called VVPATs in touchscreen systems are provided by hooking up a separate printer (more about that later) to each machine, which prints out the voter’s choices and requires the extra step of the voter rechecking the print-out to verify it is correct. Problems with this approach include the documented extra time it takes and the difficulty of reading and verifying the print-out, since the print is small, hard to read, and not in the same format as the original on-screen ballot. Studies have found that nearly 80% of voters do not take the extra step of confirming that the print-out matches their actual choices, due to confusion with computers, inability to see the print-out, and nervousness about taking too much time to vote. When we pointed out this difficulty to the commissioners, their response was that it would be the voter’s fault if they do not choose to make the extra effort to verify their choices! Wouldn’t the better solution be to make the voting process as easy and secure as possible for the voters instead?

 

Ease of Use and Familiarity

Consideration was given to the ease of use of the machines. The survey results from the voter forums are included. The stability and care of the machines were discussed.

 

County Staff Responsibilities

County staff will have responsibilities in any system, in terms of setting up and printing ballots; transporting machines; storing machines; answering questions; working with the software provided; collecting and tabulating election night results (unofficial count); placing election night results on the website. The county will also have to dedicate computer hardware to the election reporting system. The systems for communication with the courthouse and means by which election results are collected and tabulated were discussed.

In the certification report for Danaher, the State Examiner, Dr. Shamos, questions the use of a dedicated phone line to transmit election results to the County courthouse. Dr. Shamos says that it is a large expense and that results transmitted electronically cannot be used as official results, because electronic transmission of official results is not authorized by the Election Code.

Training

All vendor proposals include training for county staff, election technicians, and poll workers. Public training is an optional add-on and can include demonstrations, videos, and public service announcements. This is an important component for each system.

There is much documented evidence that expecting non-technical and possibly computer-phobic poll workers and voters to work with the more complex touchscreen systems leads to more errors and security lapses than should be acceptable in an election system. The commissioners pointed out that most of the problems they read about concerning the Danaher system were “human errors.” Aside from the fact that there are hundreds of documented Danaher machine failures in Philadelphia alone, one should keep in mind that a system that produces many human errors is not well designed, and that should be a major concern.

 

Warehouse, Transport, and Storage Needs

Each voting system would require storage in the warehouse, including electrical drops. None of the systems require climate control.

Danaher machines need to be plugged in year-round.

Deliverability

None of the system vendors can deploy machines in time for the May 16, 2006 primary election. No partial deliveries can be made, and no used or reconditioned machines are available. Danaher and ES+S predict a delivery date of June or July, with set-up and training occurring in time for the November 2006 election. Advanced stated that it would be “impossible to guarantee delivery by May 16.” A delivery date in late April is a possibility, but there would be no time to set up the machines and provide training before the primary election.

 

Affordability

Cost proposals were received from the three vendors. The work group reviewed the costs prior to interviewing vendors and then discussed each line item with the vendor representatives. The proposals contained the costs for voting machines, consumable products, set-up costs, training, technical support, information services requirements, and computer hardware and software. The number of machines per precinct varies, depending on the type of system, but is never fewer than two per precinct.

 

Through this discussion and by comparing the cost proposal with the CoStars-10 contract (state purchasing agreement) and with the executed contracts from other jurisdictions, the work group was able to determine which elements of individual proposals could be modified or reduced. In some cases, the county’s existing departments (Information Services, Voter Registration, Voting Machines, and Public Works) are able to perform tasks that the vendor also offers. Where existing personnel can provide the service, the vendor proposal was modified.

 

Assessment of 10-year costs

A projection of 10-year costs was made, based on the information provided by the vendors and assuming two elections per year. Included in this projection are license fees for software, maintenance fees, and consumable products such as paper, printing, batteries, electronic media, seals, and so forth. Every potential cost item was examined to prepare the 10-year projected costs. In year one, the initial purchase covers some of the costs, so the operating costs are calculated for a nine-year period. These are all itemized on the attached cost estimate sheets.

We have major questions about how some of the costs were arrived at. We do not see an itemized cost estimate sheet, which would allow us to be sure all costs were compared fairly and accurately.
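For reference, the arithmetic the report describes (a year-one purchase plus nine years of operating costs) is easy to check against the comparison table's own totals. A minimal sketch in Python, using only figures printed in that table:

    # Minimal check of the 10-year totals: initial purchase plus the
    # nine-year operating figure shown in the comparison table.
    costs = {
        "Advanced": (4_565_560, 1_569_905),
        "ES+S":     (4_215_518, 2_371_715),
        "Danaher":  (5_018_315, 1_801_224),
    }
    for vendor, (initial, operating) in costs.items():
        print(f"{vendor}: ${initial + operating:,}")
    # Advanced ($6,135,465) and Danaher ($6,819,539) reproduce the table's totals.
    # The ES+S figures as printed sum to $6,587,233 rather than the $6,545,853
    # shown; that may simply be a transcription issue, but it is one more reason
    # to ask for the itemized sheets.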

 


III.    Profile of Three Systems

 

Advanced Voting Solutions

The Advanced system is a touch-screen machine. The entire ballot does not appear to the voter at the same time; the program scrolls through the ballot listings. The number of machines needed is based on the number of registered voters. For the purpose of our estimate, it was assumed that the state-recommended ratio of one machine per 300 voters would be needed.
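To make the allocation rule concrete, a minimal sketch of the arithmetic follows. The county's registration total is not stated in the report, so the 430,000 figure below is only an assumption, roughly the number implied by the 1,435-machine estimate in the comparison table:

    import math

    def machines_needed(registered_voters, voters_per_machine=300):
        # County-wide count at the state-recommended one-per-300-voters ratio.
        return math.ceil(registered_voters / voters_per_machine)

    # Roughly 430,000 registered voters (an assumption, not a report figure)
    # yields about 1,434 machines; per-precinct rounding would push the total
    # toward the 1,435 shown in the comparison table.
    print(machines_needed(430_000))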

 

The vote tallies are registered on the voting machine (which has a Smart Card specific to the precinct in which the machine is located), on a paper tape, and on a USB device. The USB device is used to transfer the vote tallies to election central. At the precinct level, the tallies from the USBs in use are totaled on one USB, which is delivered to election central, either in Doylestown or in Levittown.

 

Handicapped voters are accommodated with headset devices which can be fitted on the voting machines.

 

There is a VVPAT under development for the Advanced system; however, it has not been certified by Pennsylvania.

 

Voting machines are lightweight and are stored in a precinct cart, which holds up to 10 machines. The machines can be plugged into the cart and the cart connected to a power supply.

 

ES+S

The ES+S system is an optical scan card reader system. Voters receive a paper ballot, which is 8½ inches by 14 inches (or larger if needed). The voter marks the ballot and inserts it into an optical scan machine. If a ballot contains overvotes, a display screen alerts the voter to the problem, and the ballot can be corrected before it is counted. After scanning, the ballot goes into a ballot box.

 

ES+S requires a separate type of system to accommodate handicapped voters. There are two options. One, called the AutoMark, uses a keypad and headphones, and the machine marks the paper ballot, which the voter can then insert into the optical scanner. The second option is a touch screen machine called the iVotronic, which also uses headphones. In either case, a separate unit would be required in every polling place for handicapped voters.

 

The optical scanner uses proprietary software and PCMCIA cards in each scanner to record votes. At the end of voting, the cards and a results tape are carried to election central. Election results are tabulated from the cards.

 

Paper ballots are required to be printed for every voter, at a cost of $0.31 per ballot.

This is a misleading statement, seemingly intended to imply that ballot costs are a significant expense. Also, especially for primaries, it is only necessary to print enough ballots for a percentage of registered voters. There is no mention that paper ballots (absentee and provisional) would also be required should a touchscreen system be purchased, and in the very likely event of machine failures, paper ballots would need to be on hand for voters. Because of the volume discounts possible when printing for an entire county, the paper ballot costs would be a known, fixed cost with an optical scan system. The cost per ballot needed with a touchscreen system would be considerably higher, because there would be the same printer set-up costs with no volume discount. We would also need at least one scanner to read the absentee and provisional ballots, unless we opt to count those by hand.
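As a rough illustration of why we call the optical-scan ballot cost a known, fixed cost, here is a back-of-the-envelope sketch. The $0.31 per-ballot price comes from the report, but the registration and print-percentage figures are hypothetical and used only for illustration:

    # Hypothetical per-election printing cost at the quoted $0.31 per ballot.
    price_per_ballot = 0.31       # from the report
    registered_voters = 430_000   # hypothetical county-wide registration
    print_fraction = 0.70         # print ballots for 70% of registered voters
    cost = registered_voters * print_fraction * price_per_ballot
    print(f"about ${cost:,.0f} per election")   # about $93,310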

The vendor said that optical scan results can be inaccurate, due to marginal marks on a ballot, folds in the paper, or smudges on the paper. He expressed surprise that the county was considering optical scan because the touch screens are more accurate. If a ballot cannot be read by a scanner, it can be hand-copied onto a new ballot and then scanned. High-speed scanners are available for central locations, but they are less accurate than the scanners proposed for each voting precinct.

Here, the report seems to imply that because the vendor himself seems reluctant to tout this product, it must be inferior, especially since a fault was intentionally pointed out. First of all, all the voting machine vendors want to push their touchscreen systems because their profit margins are so much larger than what they get for the optical scan systems: many more touchscreens are required (three to six times as many), and vendors make vast amounts in escalating and hidden “maintenance fees,” among other reasons.

 

Also, how a vendor can claim that touchscreens are more accurate when there is absolutely no way to verify their results is outrageous. One will notice there is no mention of the hundreds or thousands of touchscreen malfunctions, breakdowns, lost votes, and elections thrown into chaos with no way to recapture the voters’ intent; only a “surprise” that optical scans are being considered, with errors occurring because of folds or smudges on paper. Absolutely incredible.

 

Danaher

The Danaher machine is a full-screen touch screen system. It replaces the lever machines, one for one, and has the same appearance as the machines with which voters are familiar. The entire ballot is displayed on the machine face.

 

Each unit comes with an audio keypad so it can be used by handicapped voters. Results are tallied on memory cartridges, which are carried to election central. They use proprietary software. There are redundant reporting mechanisms, with aggregate result totals being stored in six locations that are cross-checked against each other.

Many of these security problems (data stored on small chips that can be stolen, lost, and so on, and private, secret software used to run the machines) are shared with the optical scan machines. However, the optical scans have a secure, redundant, voter-verified hard copy of the ballots to use for recounts and audits should problems or disputes arise. The alleged redundant reporting mechanisms stored in numerous locations and cross-checked against each other are useless, because they are all generated by the same computer! If a vote is incorrectly recorded, no matter how many times you recount it or print it out, it will still be wrong, and you will have no way to check. Should the entire machine crash, you have no way to recover the votes.

 

In addition to the memory cartridge, Danaher has developed a VVPAT module that can be added, but it is not certified and cannot be used in Pennsylvania.

Danaher has never used a VVPAT module in any election; this is a vague promise of something that may never come about, with no guarantee of ever being available or certifiable in Pennsylvania. Should Pennsylvania ever join the 38 other states that currently require, or in the very near future will require, a VVPAT of some kind, we will either be at the mercy of whatever printer system Danaher comes up with (and pay whatever they demand), or toss out the whole system and start over (most likely going to an optical scan system, as entire states are currently doing!). Either scenario will prove very expensive for Bucks County taxpayers.

 


IV.    Preferences of Voters and Poll Workers at Voter Forums

Voters and poll workers who used the machines expressed preferences. The results, which are attached, indicate that the Danaher machine received the highest marks in all questions asked.

When the first voter forum was held in the county courthouse, last-minute and unannounced, in December of 2005, members of the Coalition for Voting Integrity (CVI) were barred from providing a counterbalancing view to what the vendors were telling officials and citizens. There was no optical scan or Danaher machine at that first forum. The results from the next two forums are listed below; those forums were also set up so that no cross-examination or probing questions could be put to the vendors, which would have let the citizens in attendance hear all sides. These were basically sales pitches by the vendors, who gave misleading and sometimes outright wrong information to all.

 

We also objected to the alleged reasoning given for these “forums,” which should have been debates and not so completely one-sided. They were set up to sell voters on the machines’ ease of use, which should NOT be the main reason to pick a voting system. We feel the security and reliability of a voting system, protecting the integrity of each person’s vote, is the MOST important consideration! Instead, people who may have had no experience or knowledge of the pros and cons of the systems were to be sold on the bells and whistles.

 

Also, many of the attendees were already wedded to one system or another.

 

You will notice that every single question on the survey below dwells on how easy a machine was to use, and absolutely no mention is made of how secure a machine might be, or whether it would actually record a voter’s true intent. In our opinion, this entire survey is irrelevant and should have carried no weight in the decision.    

 

Bucks County Voting Machine Survey

Grand Total

 

(Scale: 1 = Strongly Agree, 5 = Strongly Disagree)

Was this machine easy to read?
                        1    2    3    4    5   No Answer
  Advanced Voting      58   33   20    4    6       2
  Danaher              66   38   28    9    2       1
  ES&S iVotronic       32   34   27    8    4       4
  ES&S Optical Scan    49   23   23   12    4       8

Was this machine easy to use?
                        1    2    3    4    5   No Answer
  Advanced Voting      52   35   20   10    3       3
  Danaher              65   31   30   13    4       1
  ES&S iVotronic       18   30   31   19    6       5
  ES&S Optical Scan    50   26   20   14    3       6

Did the voting machine offer privacy?
                        1    2    3    4    5   No Answer
  Advanced Voting      24   37   36   15    6       3
  Danaher              86   24   22    5    5       2
  ES&S iVotronic       16   20   37   20   11       5
  ES&S Optical Scan    44   23   28   12    4       6

Were the instructions sufficient?
                        1    2    3    4    5   No Answer
  Advanced Voting      47   32   34    4    4       2
  Danaher              51   38   31   16    2       6
  ES&S iVotronic       17   28   33   19    7       5
  ES&S Optical Scan    47   26   22   12    3       9

Could you complete the voting process within 3 minutes?
                        1    2    3    4    5   No Answer
  Advanced Voting      49   27   23    8    6       8
  Danaher              52   49   19   11    5       8
  ES&S iVotronic       24   27   27   12   11       8
  ES&S Optical Scan    53   17   25   10    5       7

Did you find the write-in process easy to use?
                        1    2    3    4    5   No Answer
  Advanced Voting      34   30   19   13    9      18
  Danaher              44   33   22   26   10       9
  ES&S iVotronic       20   32   24    9    5      18
  ES&S Optical Scan    38   20   19   14    6      20


V.      Work Group Observations

 

1.      The side-by-side comparison of voting systems is contained in a chart which is attached.

 

2.      Cost differences exist among the three systems, but are not dramatic, especially over a 10-year period that reflects initial purchase and operating costs.

 

The cost differences shown in this comparison do not reflect anything remotely like cost comparisons between touchscreen and optical scan systems done all over the USA.  We would like to examine itemized comparisons to determine how these figures were arrived at.

Many jurisdictions are also finding that there are many hidden costs and fees that vendors spring on them after the systems are purchased, in some cases exceeding 78% of the original cost estimates.

 

3.   None of the vendors can fully deploy machines and complete set-up and training by the May 16, 2006 primary election.

 

4.   All machines are HAVA-compliant.

 

5.      ES+S optical scan is not recommended due to questions about its reliability with the scanning operation.

This statement seems to shout out the bias the writers of this report had going into this whole “comparison.” If they can eliminate the optical scan because of its alleged unreliability, but ignore the thousands of documented touchscreen machine failures (even statements in the federal Government Accountability Office report attesting to their inherent unreliability!), then this whole “assessment” and “exhaustive study” is truly a farce.   

 

6.   The work group evaluated the pros and cons of the remaining two systems and felt that there are advantages to each system:

 

Advanced:

- The Advanced Voting Solutions technology is more advanced. This means that it is more complicated to use, but more appealing to the IS department.
- The Advanced system is the least expensive option, both in initial purchase and 10-year operating costs.
- The Advanced machines are lightweight and easy to store and transport.

 

 

Danaher:

- The Danaher full-screen ballot and its similarity to the lever machines are advantages.

- The Danaher system uses proven technology. The company has a track record with voting machines in our area and with its technology.

What the heck is “proven technology”? If we are going to go by “proven technology and track records,” the optical scan has been used for many decades in voting, lottery machines, SAT testing, and other everyday applications. The Danaher voting system has a proven track record of nearly 400 reported machine failures in the Philadelphia area alone, and this does not include “voter errors,” which should be included because design flaws cause confusion in casting votes correctly.

- The Danaher system was preferred by voters at the Voters’ Forums.

Yes, preferred by voters who were misled by vendor misstatements and who did not know what questions to ask. Informed voters such as members of CVI were not allowed to challenge any claims. The Voter Forums could have been an excellent opportunity for election education but were instead used as leverage to choose an unreliable (but profitable for someone) system.

- Poll workers at the Voters’ Forums preferred the Danaher system.

Did the poll workers understand what they will be required to do in the event of touchscreen errors, malfunctions, maintenance and voter questions?  Will they feel comfortable with a system that has no paper back-ups and so cannot guarantee the votes were accurately counted?  

 


VI.    Voting Machine Comparisons

 

 

                                      ADVANCED         ES+S             DANAHER
Description                           Touch Screen     Optical Scan     Touch Screen
Initial Purchase Costs                $4,565,560       $4,215,518       $5,018,315
Operating costs over 10-year period   $1,569,905       $2,371,715       $1,801,224
TOTAL COST, 10 years                  $6,135,465       $6,545,853       $6,819,539
Number of machines needed             1,435            646              744
Available to purchase from State
  contract                            Yes              Yes              Yes
Life expectancy of machines           15-20 years      20 years         20 years

Rent or lease option
  Advanced: Unclear
  ES+S: Unclear
  Danaher: Rent machines for $500 during high-turnout events (through end of 2007)

DELIVERABILITY (earliest date)
  Advanced: Not in time for May 16 primary
  ES+S: June-July 2006
  Danaher: June-July 2006

INFORMATION SERVICES EVALUATION

Long-term record and stability of vending company
  Advanced: Unknown track record; no Pennsylvania installations
  ES+S: Closest machines are in Ohio and West Virginia; most ES+S clients opting for touch screen systems
  Danaher: Company has a track record; proximity of supplier is helpful. Used in Philadelphia, Berks, Dauphin, and Delaware Counties

Reliability of vote
  Advanced: Votes recorded in several places; carrying tabulated results on USB drives may be an issue
  ES+S: Paper can result in inaccuracies due to folds, moisture, scanning errors; paper ballots can be recounted
  Danaher: Votes recorded in several places, including paper tape, for cross-checking

Election Night Reporting (integration with county election reporting system)
  All three: System for getting tabulations from remote drop-off to election central undetermined

Software and technology used
  Advanced: Windows-based system; most technologically advanced system
  ES+S: Proprietary firmware
  Danaher: Proprietary firmware

WAREHOUSE AND TRANSPORT

Warehouse needs (storage area, electrical, climate control)
  All three: No climate control needed; all systems will require some changes to the warehouse for storage and electrical connections

Transport requirements
  Advanced: Lightest machine, 22 lbs.; stored in machine carts carrying 10 units
  ES+S: (left blank in the report)
  Danaher: 195 lbs. (lighter than current lever machines)

COUNTY STAFF RESPONSIBILITIES

County staff responsibilities for set-up and delivery of initial system
  All three: Require the involvement of Information Services, Voter Registration, and Voting Machines

Training and support provided by vendor
  County staff:          Advanced Yes / ES+S Yes / Danaher Yes
  Poll workers:          Advanced Yes / ES+S Yes / Danaher Yes
  Public:                Advanced No / ES+S No / Danaher No
  Election-Day support:  Advanced Yes / ES+S Yes / Danaher Yes

HAVA REQUIREMENTS and STATE, FEDERAL CERTIFICATION
  Advanced: Yes   ES+S: Yes   Danaher: Yes

VOTER-VERIFIED PAPER AUDIT TRAIL
  Advanced: Add-on not fully developed; not available; future cost estimated at $1,000 per machine
  ES+S: System includes paper ballot
  Danaher: Add-on available but not approved for PA use; cost is $2,495 per machine

EASE OF USE, FAMILIARITY

Ballot design
  Advanced: Partial ballot, scroll through; can zoom in for larger print
  ES+S: Full paper ballot
  Danaher: Full-face ballot, similar to lever machines

Stability of machine
  Advanced: Machine legs seemed shaky and unstable; screen cleaning difficult
  ES+S: (left blank in the report)
  Danaher: Stable; screen can be wiped clean

Poll worker requirements
  Advanced: Smart Cards judged to be difficult and confusing
  ES+S: Voter inserts paper ballot; difficulty would arise with rejected ballots or if the scanner is not working. If the scanner fails, ballots are put in a special place for later scanning.
  Danaher: Operation is similar to the lever machine

Results from Voters’ Forum
  Ease of use:   Advanced 2nd of 4; ES+S 3rd of 4; Danaher 1st of 4
  Easy to read:  Advanced 2nd of 4; ES+S 3rd of 4; Danaher 1st of 4
  Privacy:       Advanced 3rd of 4; ES+S 2nd of 4; Danaher 1st of 4

 

REMARKS ABOUT TABLE

 

COSTS

 

The cost comparisons between the Danaher touchscreen system and the ES&S optical scan system are so strange to our eyes that we insist on seeing how they were arrived at. We also do not understand how they came up with the number of machines they did for each system. With 301 precincts, they seem to be averaging only 2½ touchscreen machines per precinct (so that 744 number seems too low). All the data we have suggests unequivocally that more touchscreens are needed to replace levers, especially factoring in the extra time required when a machine is being used as handicapped-accessible. Also, machine malfunctions are a very real possibility, and without enough machines, a huge backup can result while machines are being repaired or taken out of service. Only one optical scan machine is needed per precinct, plus a handicapped-accessible ballot marker (AutoMark), so where did that 646 number come from? Should a malfunction occur, voting can continue on the paper ballots, which can be counted in a scanner once it is fixed; the whole election process needn’t be stopped.
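The per-precinct arithmetic behind our question is simple; a quick sketch using the report's own machine counts and the 301 precincts:

    # Machines per precinct implied by the report's own figures.
    precincts = 301
    machine_counts = {"Advanced": 1_435, "Danaher": 744, "ES+S optical scan": 646}
    for system, count in machine_counts.items():
        print(f"{system}: {count / precincts:.2f} machines per precinct")
    # Advanced: 4.77, Danaher: 2.47, ES+S optical scan: 2.15. The Danaher figure
    # is the roughly 2.5 per precinct questioned above, and the ES+S figure sits
    # above the two units (one scanner plus one AutoMark) we would expect per
    # precinct.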

 

REMARKS BY INFORMATION SERVICES

 

They keep mentioning Danaher’s track record; it is obvious they ignored a great deal of information available from us and other resources, including our own government.  They also place stock in the fact that other counties (according to ES&S) are choosing the iVotronic (their brand of touchscreen machine) over the optical scan.  We gave reasons ES&S is pushing the iVotronic over the optical scan.  Also, even a cursory study of what is happening around the country should give anyone pause: as the myriad overwhelming problems of touchscreens are coming to light in places that have used them for years, they are being dumped by counties and entire states (Michigan, New Mexico, Maryland) in favor of optical scans. The vendors are panicking, and trying by any means (low fire-sale prices, misleading “facts” about their products, outright deceptions) to sell the more profitable systems while they can.

 

Under “reliability of vote,” it is obvious that the writers of this report relied solely on vendor propaganda and not on any common sense or our reams of information to the contrary.  Because of the inherent nature of touchscreens, especially paperless ones, there is no way to ensure that they are accurate.  There is no hard copy or verification of the vote generated outside the machine itself.  It is impossible to claim that they are reliable and accurate.  Alleged “testing” of the machines relies on test labs paid by the vendors.  Accuracy is determined by running a very small number of votes (as low as 12) through a single machine, in test mode.  It has also been proven that a machine run in “test” mode may come up with accurate results, but completely different outcomes can happen in real “election” mode!  The alleged cross-checking inside a touchscreen is bogus, because it is simply rechecking its original record, which may have been incorrect to begin with.  In any case, how does one know all this cross-checking is going on?

 

WAREHOUSE AND TRANSPORT

 

Does anyone notice that there is no mention of the weight and ease of transport of the optical scan systems?  The fact that they are lightweight and easy to transport is entirely omitted, and instead we find the much heavier Danaher machines compared to the lever machines!

 

TRAINING AND SUPPORT PROVIDED BY VENDOR

 

Take a very good, hard look at the cost figure for the alleged future add-on printer that would be required when Pennsylvania inevitably (like most other states) requires a voter-verified paper trail of some kind: $2,495 per machine. If that is not a huge typo, it would cost the taxpayers an additional $1,856,280 just for the initial printers, to say nothing of the added expense of paper and the huge headaches of printers jamming, and so on. We would be at the mercy of whatever the vendor wanted to charge us.
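The arithmetic behind that figure, using only the machine count and printer price quoted in the comparison table:

    # 744 Danaher machines at the quoted $2,495 add-on printer price.
    machines = 744
    printer_cost = 2_495
    print(f"${machines * printer_cost:,}")   # $1,856,280 for the initial printers alone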

 

Note again that with an optical scan system, we have the voter-verifiable factor already inherent in the system.

 

EASE OF USE, FAMILIARITY

 

Why the commissioners place so much emphasis on this, while ignoring the primary importance of vote integrity and the ability to recount, is beyond us. One notes the blank entry for stability of the optical scan; could that be because it is the most stable of them all? It is just a square unit with a hole in it for you to put your ballot in.

 

Then we read something very enlightening.  The report authors feel compelled to point out the problems one may encounter with an optical scanner rejecting a ballot.  It is supposed to do that if the voter has an under- or over-vote (didn’t vote in a category or voted for too many); this alerts the voter to correct the ballot before recasting it.  It is a failsafe, not a failure of the system!  They also make a big deal about the scanner not working at times, but at least the election can continue. 

 

But look: there are NO PROBLEMS pointed out for the touchscreen system! In fact, it is compared to the good, old, reliable lever machine again! Not one word about the many security, reliability, and ease-of-use disasters that have occurred for years all over the country with touchscreens that crash, blank out, change votes right before voters’ eyes, end up with tallies of zero, or report 100,000 more votes than voters in a precinct. Not one little hint or acknowledgement of election disasters that have occurred because of touchscreen systems.

 

CONCLUSIONS

 

This report seems to have been generated with one purpose in mind: to try to give credence to the commissioners’ selection of the Danaher touchscreen voting system.  If one reads over the way this is presented, the categories that are given precedence, how comments are made in all categories, one cannot escape the conclusion that the commissioners and their higher-ups had already chosen the system they wanted.  Everything is then slanted to downplay or totally ignore any faults of the Danaher system, while trying to cast as much positive light on it as possible, even to the point of relying solely on vendor talking points.  There is precious little evidence that any of the tons of information we gave them concerning the pitfalls of touchscreens is even considered. 

 

By contrast, the optical scan system’s faults are pointed out or grossly exaggerated at every opportunity (even faults shared with, but not attributed to, the touchscreens), or are fabricated outright, or are presented as faults when they are actually assets. Favorable characteristics are omitted entirely or downplayed. Its greatest asset, and what should be the primary focus in choosing a system, is that the vote can be verified, recounts are possible, and election results can be audited; this is not given any weight (and so, of course, the fact that none of this can be accomplished with a touchscreen system need not be mentioned either).

 

It is obvious that county officials had no desire to listen to informed citizens, computer scientists, voting system experts, other jurisdictions with negative experiences, and non-partisan government agencies, all of whom have overwhelming evidence of the disadvantages of touchscreen systems. They chose to ignore the advice of seven former county commissioners, resolutions from Bucks County municipalities, and hundreds or thousands of phone calls, emails, and personal pleas from citizens to pick the system that is the most secure, easy to use, and cost-effective. In other words, the desires of the voting electorate were not an important consideration and could be ignored.

 

If they think that this “report” (to which no one would put their name) validates their choice of system, or shows that they extensively studied the issue and came to the best conclusion, then they are sorely mistaken. It instead proves to be an indictment of their true agenda, applying positive and negative spin as suits their purpose and omitting verifiable data when convenient. We challenge their data, request breakdowns of their cost analysis and 10-year projections, and feel we are entitled to know exactly who worked on this project and report. If they are going to choose a non-transparent election system for the citizens, then at the very least their data sources should be available and transparent.