Book Review: Burdens of Proof

benrothke writes "When the IBM PC first came out 31 years ago, it supported a maximum of 256KB RAM. You can buy an equivalent computer today with substantially more CPU power at a fraction of the price. But in those 31 years, the information security environment in which the PC operates has not progressed accordingly. In Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents, author Jean-François Blanchette observes that the move to a paperless society means that paper-based evidence needs to be recreated in the digital world. That move also requires underlying security functionality that flows seamlessly across organizations, government agencies and the like. While the computing power is there, the ability to create a seamless cryptographic culture is much slower in coming." Keep reading for the rest of Ben's review.
Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents
author: Jean-François Blanchette
pages: 288
publisher: MIT Press
rating: 9/10
reviewer: Ben Rothke
ISBN: 978-0262017510
summary: Excellent overview and history of using cryptography to build a trust framework
The so-called Year of PKI has been awaited for over a decade, and after reading Burdens of Proof, it is evident why a large-scale PKI will be a long time in coming. More than that, getting the infrastructure in place in an environment as complex as the USA, with its myriad jurisdictions and technologies, may ultimately prove to be an impossibility.

The irony is that an effective mechanism for digital authentication would seem to be an indispensable part of the digital age. The lack of such an authentication infrastructure may be the very reason that fraud, malware, identity theft and much more, are so pervasive on the Internet.

The premise of this fascinating book is that the slow decline of paper from a legal and evidentiary perspective has significant consequences. For the last few hundred years, paper has been ubiquitous in modern life: legal and health records, school, employment, and everything in between.

The book details the many challenges that businesses and governments face in moving from a paper-based record society, with the underlying trust mechanisms that go along with it, to a new digital record system, and why a new framework is needed for such a transition. The book lays out part of that new framework.

The book opens with an observation on the authenticity of President Obama's birth certificate. While Blanchette is not a birther, he does note that if the moral authority of paper records has diminished, then the electronic documents replacing them, which are what the Obama administration provided, appear to be even more malleable. And that is precisely the issue that he addresses.

Blanchette details a compelling story and writes it as an insider. He was a member of a task force appointed in 1999 by the French Ministry of Justice to provide guidance on reforming the rules governing the admissibility of written evidence in French courts to accommodate digital documents.

The first few chapters provide an excellent overview of the history of cryptography. Chapter 3, On the Brink of a Revolution, gives an excellent summary of cryptography from 1976 on, starting with the seminal research by Diffie and Hellman, and by Rivest, Shamir and Adleman (RSA).

In chapter 5, Blanchette details his narrative about how France embraced and moved to a more digital governmental framework. He notes that the challenge was that France is the country that gave bureaucracy its name, a place where citizens must carry their papiers d'identité at all times, and a society enmeshed in paper. Blanchette writes of the many French bureaucracies that had to let go of their protectionist stances as they moved down the path to letting electronic documents have legal validity.

Blanchette writes that in France, one of the biggest impediments to moving to a digital framework was the French civil-law notary, or notaire. French notaires are much more powerful than a notary public in the US, and are closer in function to a US paralegal.

The French notaires are a wealthy and powerful monopoly when it comes to purchases, sales, exchanges, co-ownerships, land plots, leases, mortgages and the like. A notaire can form a corporation, prepare commercial business leases, and much more. The entire French notary profession had been dependent on its monopoly to grant authenticity, and no definition of electronic authenticity could emerge and succeed if it did not meet the profession's criteria.

While paper trust may be intuitive now, Blanchette writes that it wasn't always the case. When documents were first created (whenever that may have been), they did not immediately inspire trust. As with other innovations, there was a long and complex period of evolution needed to gain accepted levels of trust.

In chapter 6, the book notes that many people assumed cryptography would be the mechanism to inspire trust in the digital world. Blanchette writes that the mistake cryptographers made, and sometimes continue to make, is assuming that the properties of cryptographic objects will translate transparently into the complex social and institutional settings in which they are deployed.

This was incisively noted in Why Johnny Can't Encrypt, a usability evaluation of PGP by Whitten and Tygar. The authors observed that user errors cause or contribute to most computer security failures, yet user interfaces for security still tend to be clumsy, confusing, or near-nonexistent. While the paper was written in 1999, most of its findings are still relevant.

Chapter 6 provides three fascinating case studies showing how different approaches to security technology and cryptographic deployment are imperative to ensuring that they work.

In just under 200 pages, the book's seven chapters provide both a fascinating overview of the history of cryptography and a demonstration of how cryptography can be used effectively to authenticate digital documents. The book also offers a high-level blueprint (a comprehensive one would require at least five times as many pages) for an effective cryptographic framework for digital trust.

As Blanchette notes many times in the book, the challenge in getting digital signatures to work is not the technology; rather, it is the underlying societal infrastructure needed to make them work. France was brought kicking and screaming into the age of electronic authentication, and is one of the few countries to have had such widespread success.

The book is a fascinating read that details how frustratingly difficult it has been to create a comprehensive mechanism for digital authentication. The book raises many beguiling questions, and Blanchette is smart enough to note that there are no simple answers to these multifaceted problems.

Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents is both a fascinating overview of the history of paper and electronic authentication and a synopsis of what it will take to create a cryptographic culture, where digital evidence will be as accepted in the courtroom as its antique paper cousin.

Ben Rothke is the author of Computer Security: 20 Things Every Employee Should Know.

You can purchase Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
This discussion has been archived. No new comments can be posted.


Comments:
  • by stanlyb ( 1839382 ) on Monday December 24, 2012 @04:21PM (#42383469)
    The security, the protocol, the encryption, BUT, as you already guessed, our sweet government does not want us to have secure and secret documents without the ability to spy on them, whenever they want to, and for whatever funny reason.
  • by Anonymous Coward on Monday December 24, 2012 @04:23PM (#42383485)

    as I am the first one to make a comment!

  • by M0j0_j0j0 ( 1250800 ) on Monday December 24, 2012 @04:37PM (#42383571)

    I can assume this was written by a moderate person, by his qualification of "substantially".

  • by Anonymous Coward on Monday December 24, 2012 @04:41PM (#42383601)

    ... the problem is we've always had problems 'authenticating' what is true or false because it relies on human beings not being compromised and/or beholden to power.

    What I'm most worried about is how digital technology gives power to private actors outside the law to scoop up and profile everyone on the net. Everyone has something the rich and powerful who own the government can stick them with.

    I think many future political activists who were very 'open' on the net when young and stupid will end up paying for it hugely down the line when they mature and want to change the world for the better and then find out your political enemies goons know about things that could discredit you in the public eye.

    While encryption and pseudo-anonymity barriers (like tor) that make it harder to identify you is coming online it doesn't take long before a government wants to make it against the law or will have the resources to compromise enough servers/routers.

    The real problem for the end users is ease of use, cost, decentralization and the rather low amount of people who are intelligent and aware enough to be careful of what they post online as it is all being sucked into profiles about you.

    The internet + your cell phone + every company you interact with is just one giant total information awareness grid, with the likes of Facebook/Google and telcos being linked up to secretive government agencies. The fact that governments now have the tools to chew through enormous amounts of data and pre-emptively attack the little people is disturbing, given that technology is a real game changer when it comes to social change. Being able to profile, predict and pre-emptively defuse public anger and outcry before it becomes a political threat is something we all should be concerned about.

    We can see this in the US elections: when the banks were busy robbing the treasury and no one was going to jail, media outlets successfully steered the public's outrage into traditional political venues and ideologies and made the ignorant public of america direct their anger towards each other instead of the whole corrupt corporate establishment. Government is so fucked because of rich corporations and their lobbying. If americans were smart they'd threaten the powers that be with socialism (not that they have to believe it, they just need to make the powers believe it), since Obama and Romney are just two sides of the same fascist coin; they are both hard right conservative republicans. The democratic party in america hasn't been liberal for decades, and most liberals in america are just as myopic as their conservative brethren in that they believe the system can be reformed through traditional means.

  • by gelfling ( 6534 ) on Monday December 24, 2012 @06:08PM (#42384075) Homepage Journal

    EVERYTHING related to PCs is still, after 30 years, a clumsy bolt-on. Hell, networking and printing still have to be added, tweaked and configured, and VPN is still a mess. As long as we tolerate companies like MS shoveling Windows 8 at us while the guts under the covers are garbage, this is what we'll get. I mean, with a multicore processor there's no way to make one of those cores a security-specific ASIC that does all the heavy lifting for security across the board in hardware? But we'll never get that, because it's more important to have live tiles and 12 different apps that all do photo filters the same way. Hooray.

  • by Corwn of Amber ( 802933 ) <corwinofamber@ s k> on Monday December 24, 2012 @06:22PM (#42384165) Journal

    I should begin crowdsourcing a slew of form documents, in the style of "here is why your spam solution won't work".

    Beginning with "So you wrote up a cyber law. It won't work. Here's why it won't work."

    NOTHING can be saved if it can't be freely copied by anyone from anywhere. Most documents won't survive anyway, lacking interest in making copies for all of the time they're available.

    The one and only way to keep a document is to have it freely copyable by everyone everywhere forever, end of story. Everything else is on reprieve.


  • by sugarmotor ( 621907 ) on Monday December 24, 2012 @06:30PM (#42384221) Homepage

    Amazon "search inside this book" has no results for "NP" as in P vs. NP. How can that be? The book doesn't draw the connection to this major relevant open question on one hand, but has "burden of proof" in the title on the other hand?

  • by TheSHAD0W ( 258774 ) on Monday December 24, 2012 @08:39PM (#42384839) Homepage

    And 64KB on the motherboard. I know, I had one.

    • by Anonymous Coward on Monday December 24, 2012 @09:14PM (#42384999)

      like that fact really matters.....

    • by WhirledOne ( 213095 ) on Monday December 24, 2012 @09:40PM (#42385087)

      Y'know, I wondered if anyone was going to point out something along those lines. Actually, IIRC, the original maximum "official" memory capacity of the early 64K PC1 was in fact 256K if you only used official IBM memory expansion cards, but the memory map officially allowed up to 512K of RAM (and was supported by some 3rd party expansion cards). A few years later, IBM apparently realized that there wasn't really a need to reserve the entire remaining 512K of addressing space for ROM and device-specific RAM (such as video RAM), so they "unreserved" a block of 128K, thus bringing the official maximum to 640K.
      Even then, it was still possible to get beyond 640K of base ram by adding RAM in the "holes" unused by ROM on your particular PC, and using an appropriate driver in MS-DOS so that DOS would know about it. Examples of such "holes" in the memory map would be the space reserved for PCjr cartridge ROM, or the MDA video RAM space if you didn't have an MDA (or the CGA space if you had an MDA instead of a CGA). When VGA became commonplace, there was a shareware driver out there that would map the 64K VGA "window" to MS-DOS use, and switch the card to CGA compatibility mode. This gave you 704K of usable base RAM in DOS without any additional hardware, and was great for text-mode or CGA-mode only software where the VGA modes wouldn't be needed anyway.

      • by old_fortran ( 80585 ) on Tuesday December 25, 2012 @02:34PM (#42388977)

        The original 64KB 5150 motherboard (4 banks of 16KB each) supported 512KB via AST and other 3rd-party option cards, but carried ROMs that had a total system limit less than 640KB. The second gen motherboard supported 4x 64KB, or 256KB on the motherboard, and 640KB of main memory overall. My recollection is of some number like 512KB + 32KB, for a total of 544KB, but it could have been 512KB + 64KB, or 576KB; STILL not 640KB. I remember this because I once had to replace ROMs from gen 1 motherboards so I could get some machines up to the full 640KB memory available.

        What I remember more, however, was how fast IBM's original expectations for the PC were surpassed by people using its relatively open architecture to do far more with it than IBM had planned (or anticipated). In 1980-81, few at IBM (or anywhere, apparently) could conceive why one would want a PC with more than 128KB (64+64). By being open to change, the PC went quickly from that early 8-bit kind of view to one that would lead to a revolution in business and home computing. You can say what you want about PCs and Windows, but this is being typed on a garden variety home built PC vastly more powerful in every way than that distant ancestor. It has also had RAM, storage, and P/S upgraded over its lifetime (5 1/2 years thus far). Still useful, and more importantly, still usable with new OS versions (started on XP, moved to Win 7 when stable); and while I like having long HW and SW lifecycles, the point is I am not stuck with it as it was, like I am with my DVR (which is just a specialized Linux appliance really).

        Thus I do wonder what the last 30+ years would have been like if the "Apple appliance computing" model had been adopted by IBM in 1980-85 instead of the more open one used for the PC / XT / AT. Even though it came out in 1984, the first gen Mac was ridiculous: a completely closed Moto 68K-based mini-workstation with an "8-bit machine" memory limit. It **could** have been built with a removable bottom plate and enough memory sockets for 4x 64KB, but only half populated (an expansion capability similar to what is now available for its distant descendant, the Mac Mini). But it wasn't; you had to physically upgrade your 1st gen Mac to get decent memory (to 512KB, aka the "Fat Mac"), and then upgrade again to get a hard drive in the Mac Plus. Or you had to resort to strategies that would void your warranty (e.g., the hardware equivalent of a "jailbreak"). This Jobsian approach to evolution (via sales of more hardware) should sound familiar to Apple fanbois everywhere at this point (and why I opted not to buy this year's version of the iPad Mini, but wait for, GASP, the one with the proper CPU, camera, RAM, and screen).

        Ancient history? Not really. At least two current trends, (1) "wire-cutters" and (2) cloud computing, are going to see this "open architecture versus closed appliance/service" competition played out yet again. (A third may be iOS versus Android smartphones.) Overall I am still optimistic that on balance openness will lead to innovation that will be beneficial and also not necessarily anticipated by those who want everything tightly controlled for their own profit. This doesn't mean, however, that appliance advocates won't put up a good fight.

  • by DamnStupidElf ( 649844 ) <> on Tuesday December 25, 2012 @05:16AM (#42386193)

    In the paper world you have to invest significant resources to forge each paper document. In the digital world, if you can forge one document with a free tool you can forge as many as you want. To raise the cost of forging a digital document beyond what an attacker is willing to pay, the cost of legitimate use becomes greater than the benefit.

    One possible solution is a hierarchy of security where the higher layers increase both the cost of forgery and the cost of legitimate use and let the market decide how much risk to bear. The SSL world tried to do that with extended validation certificates (the green address bar) but I'm not convinced it actually improved overall security since the problem is almost always at the user level. Maybe if they started selling extended validation hardware clients whose components were fixed in epoxy and ran highly secure firmware and software it would actually work. Trusted Computing is the obvious parallel in the PC world but it fails because the cost of developing software aimed at general purpose computers to rigorous security standards is too high. It's possible that as Moore's Law shoves hardware prices through the floor banks will just send relatively cheap secure hardware home with their new customers.

    • by stevetruelord ( 2801571 ) on Tuesday December 25, 2012 @09:33AM (#42386797)
      Good point... Bruce Schneier has written tons on this topic.
    • by starfishsystems ( 834319 ) on Tuesday December 25, 2012 @05:48PM (#42390243) Homepage

      We need to be clear about what EV is. It's not about SSL, it's about X.509. It doesn't solve a technical problem because EV identifies no technical problem with X.509 certificates. EV promises a procedural solution to a procedural problem, namely the failure by Certificate Authorities to take reasonable care to check the real-world credentials of certificate requestors in order to determine that they are who they claim to be. In effect, the CAs are saying, "Yeah, well, we were a bit negligent the last time around, but we promise to do a better job next time if you just pay us more money."

      So I share your misgivings about whether EV has improved security, but for rather different reasons. And there's nothing saying that we both can't be right.
    • To raise the cost of forging a digital document beyond what an attacker is willing to pay, the cost of legitimate use becomes greater than the benefit.

      Exactly why we have spam. Make sending forged-address and spammy e-mail require prepaid costs such that it is as expensive as snail mail, and spam would drop from 90% of mail to less than 0.1%. It's only because it costs nearly nothing, either directly for small quantities (like phishing scams) or via botnets for huge quantities (bulk ads for fraudulent products, including erectile dysfunction drugs), that so much can be generated. We have some partial solutions, but the entire solution requires better tools to manage the movement and transfer of email.

      As for security of e-documents and other electronic transmissions, we need good tools to seamlessly handle authentication (proof that the sender is who they claim to be) and non-repudiation (proof that you signed or authorized the document and cannot later claim you didn't, because the recipient has proof that you did).
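A minimal sketch of how a digital signature provides both properties, using textbook RSA over small primes in plain Python. This illustrates only the math, not a secure scheme; real systems use 2048+ bit keys and padding, and the key values and document strings here are made up for illustration:

```python
import hashlib

# Toy RSA key built from two small primes; real keys are 2048+ bits.
p, q = 1000003, 999983
n, e = p * q, 65537                 # public key (n, e)
d = pow(e, -1, (p - 1) * (q - 1))   # private key d (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    # Only the holder of the private exponent d can compute this value.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    # Anyone holding the public pair (n, e) can check the signature.
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

doc = b"Pay to the order of Alice: $500"
sig = sign(doc)
assert verify(doc, sig)                     # authentication: the key holder signed it
assert not verify(b"Pay Alice: $900", sig)  # any alteration breaks the signature
```

Because only the private-key holder can produce a value that verifies, the signer cannot plausibly deny having signed (non-repudiation), while any recipient with the public key can confirm who did (authentication).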

      Repudiation is extremely critical to claim fraud or to stop a transaction, like when you claim a credit card charge is not authorized or the merchant stiffed you and you want a chargeback, or when you want to stop payment on a check. Non-repudiation is critical if you verified the recipient, provided the goods and services, and want to guarantee you get paid and the recipient can't falsely file a chargeback when they legitimately did receive the merch.

      For ordinary transactions using paper documents we have notaries public, who provide both: the notary affixes their seal and signature, essentially countersigning the signature on original signed documents or verifying certified copies of documents. Some organizations, like stock certificate issuers, require more, and so there is the Medallion Signature Guarantee, where the authenticating official certifies the identity of the party to the document and agrees to cover any financial loss to the recipient of the document if the person who was guaranteed is not the correct individual (or agent, if they're signing as attorney-in-fact or on behalf of an entity like a corporation or LLC).

      I am a notary public, licensed in two states. (A notary's license, or commission, is, with minor exceptions, only valid in the state of issue.) The jobs of a notary, in general, include the certification of documents, authentication of signatures, and taking oaths and affirmations (and some states don't include all of these or add special extras: notaries in Maryland can't certify copies of documents like contracts, while in Ohio, notaries have the power to issue subpoenas). All done on paper, and the job of a notary hasn't changed from essentially the way it was done 200 to 500 years ago. There are provisions for notaries to authenticate electronic documents, but they're not uniform from state to state (some states don't allow it at all, or lack standards on how it is to be done), and the infrastructure to support it (including proper software and/or hardware for document certification and authentication) isn't there yet.

      Until we have well-defined electronic equivalents like notaries public and guarantees like Medallion, we're going to continue to have a problem with those very issues of authentication and/or non-repudiation (plus security when the document must be kept private like certain contracts or secret like classified material.)

  • by AmiMoJo ( 196126 ) * <mojo AT world3 DOT net> on Tuesday December 25, 2012 @09:13AM (#42386747) Homepage Journal

    Jean-Fran&amp;amp;amp;amp;#231;ois Blanchette

    When TFA text is mangled by Slashdot's encoding you know something is wrong. I know Unicode has its problems but it really should be a priority at this point.

  • by Anonymous Coward on Tuesday December 25, 2012 @05:17PM (#42390043)

    We sure have a lot to worry about if a tech site can't even get this right...

    Ditto for the RSS feed, which too often is treated like it's HTML while it isn't. It's an XML implementation...

  • by Anonymous Coward on Monday January 07, 2013 @02:20PM (#42507347)

    Just about none of the comments here deal with the claims that cryptography can improve legal evidence. It can bolster assertions of integrity, in that good use of a hash function can show that a document has not been altered between time A and time B. But its use in authentication is subject to a lot of problems. Those problems do not arise because of any problem with the mathematics, and debates about what level of encryption can be broken by what level of attack or attacker are not really helpful for the usual processes in civil or criminal courts.
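The integrity claim can be demonstrated in a few lines of Python with the standard library's hashlib (the document contents below are made up for illustration):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest; any change to the data changes the digest."""
    return hashlib.sha256(data).hexdigest()

original = b"I agree to pay $100."
tampered = b"I agree to pay $900."

# Record the digest at time A...
digest_at_a = fingerprint(original)

# ...then at time B, a matching digest shows the bytes are unaltered.
assert fingerprint(original) == digest_at_a
assert fingerprint(tampered) != digest_at_a
```

Note that this shows only that the bytes did not change between the two measurements; it says nothing about who created them, which is the authentication problem discussed next.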

    The problem with encryption systems is the interface between the user of the computer and the information created by it. How does the computer user know for sure that what he or she intends to create/encrypt/sign is actually what the computer generates as a result? And on the other end (authentication in a court process i.e. submission of evidence), how does the relying party prove that what was received was created/signed by the other party, rather than by someone else at the creator's computer or by intervening malware or a man-in-the-middle?

    Generally, access to the encryption tools is by way of a user ID and password, which are very vulnerable to attack. Even biometric controls are not foolproof, and most people don't have them. I doubt that even French notaries access their document-creating computers through a biometric screen.

    Some elements of the discussion are here: .

Never worry about theory as long as the machinery does what it's supposed to do. -- R. A. Heinlein