Book Review: Burdens of Proof

benrothke writes "When the IBM PC first came out 31 years ago, it supported a maximum of 256KB of RAM. Today you can buy an equivalent computer with substantially more CPU power at a fraction of the price. But in those 31 years, the information security infrastructure in which the PC operates has not progressed accordingly. In Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents, author Jean-François Blanchette observes that the move to a paperless society means that paper-based evidence needs to be recreated in the digital world. It also requires underlying security functionality that flows seamlessly across organizations, government agencies and the like. While the computing power is there, the ability to create a seamless cryptographic culture has been much slower in coming." Keep reading for the rest of Ben's review.
Title: Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents
Author: Jean-François Blanchette
Pages: 288
Publisher: MIT Press
Rating: 9/10
Reviewer: Ben Rothke
ISBN: 978-0262017510
Summary: Excellent overview and history of using cryptography to build a trust framework
The so-called Year of PKI has been anticipated for over a decade, and after reading Burdens of Proof, it is evident why a large-scale PKI will be a long time in coming. More than that, getting the infrastructure in place in an environment as complex as the USA, with its myriad jurisdictions and technologies, may ultimately prove to be an impossibility.

The irony is that an effective mechanism for digital authentication would seem to be an indispensable part of the digital age. The lack of such an authentication infrastructure may be the very reason that fraud, malware, identity theft and much more, are so pervasive on the Internet.

The premise of this fascinating book is that the slow decline in the use of paper, from a legal and evidentiary perspective, has significant consequences. For the last few hundred years, paper has been ubiquitous in modern life: in legal and health records, school, employment and everything in between.

The book details the many challenges that businesses and governments face in moving from a paper-based record society, and the underlying trust mechanisms that go along with it, to a new digital-based record system, and why a new framework is needed for such a transition. The book sketches part of that new framework.

The book opens with an observation on the authenticity of President Obama's birth certificate. While Blanchette is not a birther, he does note that if the moral authority of paper records has diminished, then the electronic documents replacing them, which are what the Obama administration provided, appear to be even more malleable. And that is precisely the issue that he addresses.

Blanchette details a compelling story and writes it as an insider. He was a member of a task force appointed in 1999 by the French Ministry of Justice to provide guidance on reforming the rules governing the admissibility of written evidence in French courts to accommodate documents in digital format.

The first few chapters provide an excellent overview of the history of cryptography. Chapter 3, On the Brink of a Revolution, gives an excellent summary of cryptography from 1976 on, starting with the seminal research of Diffie and Hellman, and of Rivest, Shamir and Adleman (RSA).

In chapter 5, Blanchette details his narrative about how France embraced and moved to a more digital governmental framework. He notes that the challenge was that France is the country that gave bureaucracy its name, a place where citizens must carry their papiers d'identité at all times, and a society enmeshed in paper. Blanchette writes of the many French bureaucracies that had to let go of their protectionist stances as they moved down the path to granting electronic documents legal validity.

Blanchette writes that in France, one of the biggest impediments to moving to a digital framework was the French civil-law notary, or notaire. French notaries are much more powerful than notaries public in the US; their role is closer to that of a US paralegal.

The French notaires are a wealthy and powerful monopoly when it comes to purchases, sales, exchanges, co-ownerships, land plots, leases, mortgages and the like. A notaire can form a corporation, prepare commercial business leases and much more. The entire French notary profession had been dependent on its monopoly to grant authenticity, and no definition of electronic authenticity could emerge and succeed if it did not meet the profession's criteria.

While paper trust may be intuitive now, Blanchette writes that it wasn't always the case. When documents were first created (whenever that may have been), they did not immediately inspire trust. As with other innovations, there was a long and complex period of evolution needed to gain accepted levels of trust.

In chapter 6, the book notes that many people assumed cryptography would be the mechanism to inspire trust in the digital world. Blanchette writes that the mistake cryptographers made, and sometimes continue to make, is to assume that the properties of cryptographic objects will translate transparently into the complex social and institutional settings in which they are deployed.

This was incisively noted in Why Johnny Can't Encrypt, a usability evaluation of PGP by Whitten and Tygar. The authors observed that user errors cause or contribute to most computer security failures, yet user interfaces for security still tend to be clumsy, confusing, or near-nonexistent. While the paper was written in 1999, most of its findings are still relevant.

Chapter 6 also provides three fascinating case studies showing how different approaches to security technology and cryptographic deployments are imperative to ensuring that they work.

In just under 200 pages, the book's seven chapters provide both a fascinating overview of the history of cryptography and a demonstration of how cryptography can be effectively used to authenticate digital documents. The book also offers a high-level framework for digital trust built on cryptography (a comprehensive treatment would require at least five times as many pages).

As Blanchette notes many times in the book, the challenge in getting digital signatures to work is not the technology; rather, it is the underlying societal infrastructure needed to make them work. France was brought kicking and screaming into the age of electronic authentication, and is one of the few countries to have had such widespread success.

The book is a fascinating read that details how frustratingly difficult it has been to create a comprehensive mechanism for digital authentication. The book raises many beguiling questions, and Blanchette is smart enough to note that there are no simple answers to these multifaceted problems.

Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents is both a fascinating overview of the history of paper and electronic authentication and a synopsis of what it will take to create a cryptographic culture in which digital evidence is as accepted in the courtroom as its antique paper cousin.

Ben Rothke is the author of Computer Security: 20 Things Every Employee Should Know.

You can purchase Burdens of Proof: Cryptographic Culture and Evidence Law in the Age of Electronic Documents from amazon.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
This discussion has been archived. No new comments can be posted.

  • by stanlyb ( 1839382 ) on Monday December 24, 2012 @04:21PM (#42383469)
    The security, the protocol, the encryption, BUT, as you already guessed, our sweat government does not want us to have secure and secret documents, without the ability to spy on them, whenever they want to, and for whatever funny reason.
    • by Anonymous Coward on Monday December 24, 2012 @04:37PM (#42383567)

      *puts on tinfoil hat so he can reason with you*

      Any government that wants to control its citizens wants authenticated computing. It provides evidence and a cheap and easy method for spying, since it will undoubtedly be a black box as far as the owner of the computer is concerned.

      • by stanlyb ( 1839382 ) on Monday December 24, 2012 @05:21PM (#42383835)
        You do realize that full-blown encryption between person A and person B does not involve anything but the public keys of A and B for encryption, and the private keys of A and B for decryption? You do realize that there is no need for a third party (government) to assist you in securing an encrypted tunnel between A and B? Again, do you really know what a secure connection is?
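        A minimal sketch of that A-to-B exchange, assuming Python's pyca/cryptography package; the key names and sample message are illustrative:

        ```python
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        # B generates a key pair; only the public half ever leaves B's machine.
        b_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        b_public = b_private.public_key()

        # A encrypts to B's public key -- no third party is involved.
        oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                            algorithm=hashes.SHA256(), label=None)
        ciphertext = b_public.encrypt(b"meet at noon", oaep)

        # Only B's private key can recover the plaintext.
        assert b_private.decrypt(ciphertext, oaep) == b"meet at noon"
        ```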
        • by SuricouRaven ( 1897204 ) on Monday December 24, 2012 @06:08PM (#42384077)

          Almost. Such an encryption protects entirely against passive interception, but has a serious weakness: MITM attacks. There are only two ways to solve this problem. One is to pre-exchange keys over a secure channel. That's fine for connecting to the company VPN and similar situations where someone has to physically set up the endpoints, but it's not a lot of use on the internet. The other is to have a trusted third party provide confirmation of identity, and in turn authenticate this third party by keys exchanged over a secure channel. It's a really ugly method (Can you *really* trust any of those CAs? Of course not!) but as of now, it's the only option there is. Some protocols rely on a web-of-trust system, but again it isn't suitable for all situations, particularly those in which nodes are many and connections infrequent and transient.
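          A minimal sketch of that MITM weakness against an unauthenticated Diffie-Hellman key exchange, again assuming pyca/cryptography; the party names are illustrative:

          ```python
          from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

          # Alice and Bob want a shared secret; Mallory controls the wire.
          alice = X25519PrivateKey.generate()
          bob = X25519PrivateKey.generate()
          mallory = X25519PrivateKey.generate()

          # Mallory substitutes his own public key in both directions, so
          # each victim unknowingly agrees on a secret with him instead.
          alice_secret = alice.exchange(mallory.public_key())  # Alice thinks: Bob
          bob_secret = bob.exchange(mallory.public_key())      # Bob thinks: Alice

          # Diffie-Hellman is symmetric, so Mallory can derive both secrets
          # and transparently decrypt, read, and re-encrypt all traffic.
          assert mallory.exchange(alice.public_key()) == alice_secret
          assert mallory.exchange(bob.public_key()) == bob_secret
          ```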

          • by kwerle ( 39371 ) <kurt@CircleW.org> on Monday December 24, 2012 @09:02PM (#42384929) Homepage Journal

            Almost. Such an encryption protects entirely against passive interception, but has a serious weakness: MITM attacks...

            It seems like you are conflating security and encryption. Perfect encryption exists and is trivial for any two parties to use. There is no MITM problem. Security is only possible insofar as you trust someone or something - whether it be the person you are handing/receiving your public key to, or the web of trust, or whatever else.

            And, of course, once you have exchanged public keys and can start an *encrypted* conversation, verification of identity can be established by external (what was our previously agreed upon exchange, or let me call you and make sure you are who I am talking to) or even internal (let's see you on encrypted video) protocol.

            Really, the GP author is somewhat to blame - implying that an encrypted connection is the same as secure. It IS, as long as the endpoints are trusted. That's the only problem, and in practice it is not so large in many circumstances.
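            One minimal sketch of such an out-of-band check is comparing key fingerprints over the phone, assuming pyca/cryptography and an Ed25519 key; the grouping format is illustrative:

            ```python
            import hashlib
            from cryptography.hazmat.primitives import serialization
            from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

            public_key = Ed25519PrivateKey.generate().public_key()
            raw = public_key.public_bytes(encoding=serialization.Encoding.Raw,
                                          format=serialization.PublicFormat.Raw)

            # Both ends compute the fingerprint of the key they received and
            # read it aloud; a mismatch means a key was swapped in transit.
            fingerprint = hashlib.sha256(raw).hexdigest()
            print(" ".join(fingerprint[i:i + 4] for i in range(0, 32, 4)))
            ```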

            • by stanlyb ( 1839382 ) on Tuesday December 25, 2012 @12:09AM (#42385561)
              Actually, what i am trying to imply is that if I, and my Friend decide to establish secure tunnel between us, and having before that exchanged the private keys, then it will become extremely difficult for anyone else to decode our conversation, or to pretend to be one of the parties (yes, that's true, the man in the middle has to have one of the private keys too if he wants to succeed). In most cases, this is enough, as the only way for the Evil guy to take your keys is to do it in person, physically. And if he has to do it for 300 million people...you make the math.
              • by kdemetter ( 965669 ) on Tuesday December 25, 2012 @02:11AM (#42385853)

                Unless he manages to convince each party that the public key has changed, without the secure connection.

                For example, you have JohnA@gmail.com and JohnB@gmail.com, and they have a secure connection through asymmetric encryption.
                I create my own private keys for JohnA@gmail.com and JohnB@gmail.com, and corresponding public keys.

                I send a mail from JohnB@gmail.com to JohnA@gmail.com, stating that I've changed my key, and this is the new public key.
                I do the same from JohnA@gmail.com to JohnB@gmail.com.

                Now I can intercept the messages, which will be encrypted with the new keys, decrypt them with my new private keys, and resend them using the old public keys.
                Each party will just receive and decrypt them using their old keys, while thinking the other person has a new key.

                Of course, this could be avoided by having a clearly defined system for exchanging new keys, preferably with the new keys signed and encrypted by the old ones.
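                A minimal sketch of that sign-the-new-key-with-the-old-one idea, assuming pyca/cryptography and Ed25519 keys; the variable names are illustrative:

                ```python
                from cryptography.exceptions import InvalidSignature
                from cryptography.hazmat.primitives import serialization
                from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

                old_key = Ed25519PrivateKey.generate()  # key the peer already trusts
                new_key = Ed25519PrivateKey.generate()  # its intended replacement

                new_pub = new_key.public_key().public_bytes(
                    encoding=serialization.Encoding.Raw,
                    format=serialization.PublicFormat.Raw)

                # Announce the rotation by signing the new public key with the old one.
                rotation_sig = old_key.sign(new_pub)

                # The peer verifies against the old key it already holds; an attacker
                # without the old private key cannot forge this announcement.
                try:
                    old_key.public_key().verify(rotation_sig, new_pub)
                    print("rotation accepted")
                except InvalidSignature:
                    print("rotation rejected")
                ```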

              • by kwerle ( 39371 ) <kurt@CircleW.org> on Tuesday December 25, 2012 @04:34PM (#42389809) Homepage Journal

                Actually, what i am trying to imply is that if I, and my Friend decide to establish secure tunnel between us, and having before that exchanged the private keys,

                Public keys.

                then it will become extremely difficult for anyone else to decode our conversation, or to pretend to be one of the parties (yes, that's true, the man in the middle has to have one of the private keys too if he wants to succeed).

                It is *impossible* unless they have your private key *and the ability to use it* (your passphrase).

                In most cases, this is enough, as the only way for the Evil guy to take your keys is to do it in person, physically. And if he has to do it for 300 million people...you make the math.

                Right. Or compromise your system in such a way that they can view what you do when you decode the conversation.

            • by SuricouRaven ( 1897204 ) on Tuesday December 25, 2012 @04:53AM (#42386147)

              You skip over the problem - exchanging public keys. If the channel isn't secure then what is there to stop an attacker from intercepting the keys in transit and replacing them with his own? You need to have either a secure channel for key exchange or a pre-shared secret. Neither of which is an option when you just want to view a website you've never visited before over SSL.

        • by TwezerFace ( 2788771 ) on Wednesday December 26, 2012 @04:09PM (#42397455)
          Sorry...who u directing this query to?
      • by TwezerFace ( 2788771 ) on Wednesday December 26, 2012 @04:08PM (#42397451)
        I guess you never saw how much staff the NSA has....
    • by stevetruelord ( 2801571 ) on Monday December 24, 2012 @04:38PM (#42383583)
      Nice conspiracy theory...u have the slightest bit of evidence to back up your claim?
    • by Anonymous Coward on Monday December 24, 2012 @05:25PM (#42383867)

      I have no idea what a "sweat government" is, but governments and corporations alike would love to have the sort of crypto tech they're really talking about here: automatic noncircumventable tracking of the source of every bit of data.
      The remaining shreds of our civil liberties traditions are still standing in the way of such progress.

    • by Zero__Kelvin ( 151819 ) on Tuesday December 25, 2012 @11:12AM (#42387189) Homepage
      Well, this guy [slashdot.org] is dead now, so there might be less sweat [thefreedictionary.com] government than you think!
    • by stevetruelord ( 2801571 ) on Tuesday December 25, 2012 @08:40PM (#42391197)
      As to our govt., you do realize that the Mayan apocalypse really did happen...the govt. just covered it up.
    • by TwezerFace ( 2788771 ) on Wednesday December 26, 2012 @10:15AM (#42394231)
      I think there is a lot of hype around the NSA and their capability to break encryption. We have never seen a single instance where they have gone to the courts and shown they decrypted strong encryption without some external vulnerability.
  • by Anonymous Coward on Monday December 24, 2012 @04:23PM (#42383485)

    as I am the first one to make a comment!

  • by M0j0_j0j0 ( 1250800 ) on Monday December 24, 2012 @04:37PM (#42383571)

    I can assume this was written by a moderate person, by his qualification of "substantially".

  • by Anonymous Coward on Monday December 24, 2012 @04:41PM (#42383601)

    ... the problem is we've always had problems 'authenticating' what is true or false because it relies on human beings not being compromised and/or beholden to power.

    What I'm most worried about is how digital technology gives power to private actors outside the law to scoop up and profile everyone on the net. Everyone has something the rich and powerful who own the government can stick them with.

    I think many future political activists who were very 'open' on the net when young and stupid will end up paying for it hugely down the line, when they mature and want to change the world for the better, only to find out that their political enemies' goons know things that could discredit them in the public eye.

    While encryption and pseudo-anonymity barriers (like Tor) that make it harder to identify you are coming online, it doesn't take long before a government wants to make them against the law or has the resources to compromise enough servers/routers.

    The real problem for the end users is ease of use, cost, decentralization, and the rather small number of people who are intelligent and aware enough to be careful of what they post online, as it is all being sucked into profiles about them.

    The internet + your cell phone + every company you interact with is just one giant total information awareness grid, with the likes of Facebook/Google and telcos being linked up to secretive government agencies. The fact that governments now have the tools to chew through enormous amounts of data and pre-emptively attack the little people is disturbing, given that technology is a real game changer when it comes to social change. Being able to profile, predict, and pre-emptively defuse public anger and outcry before it becomes a political threat is something we all should be concerned about.

    We saw this in the US elections: when the banks were busy robbing the treasury and no one was going to jail, media outlets successfully steered the public's outrage into traditional political venues and ideologies, and made the ignorant public of America direct their anger towards each other instead of the whole corrupt corporate establishment. Government is so fucked because of rich corporations and their lobbying. If Americans were smart they'd threaten the powers that be with socialism (not that they have to believe it, they just need to make the powers believe it), since Obama and Romney are just two sides of the same fascist coin; they are both hard-right conservative Republicans. The Democratic party in America hasn't been liberal for decades, and most liberals in America are just as myopic as their conservative brethren in that they believe the system can be reformed through traditional means.

  • by gelfling ( 6534 ) on Monday December 24, 2012 @06:08PM (#42384075) Homepage Journal

    EVERYTHING related to PCs is still, after 30 years, a clumsy bolt-on. Hell, networking and printing still have to be added, tweaked and configured, and VPN is still a mess. As long as we tolerate companies like MS shoveling Windows 8 at us while the guts under the covers are garbage, this is what we'll get. I mean, with a multicore processor, is there no way to make one of those cores a security-specific ASIC that does all the heavy lifting for security across the board in hardware? But we'll never get that, because it's more important to have live tiles and 12 different apps that all do photo filters the same way. Hoo ray.

  • by Corwn of Amber ( 802933 ) <corwinofamber@@@skynet...be> on Monday December 24, 2012 @06:22PM (#42384165) Journal

    I should begin crowdsourcing a slew of form documents, in the style of "here is why your spam solution won't work".

    Beginning with "So you wrote up a cyber law. It won't work. Here's why it won't work."

    NOTHING can be saved if it can't be freely copied by anyone from anywhere. Most documents won't survive anyway, for lack of interest in making copies during the time they're available.

    The one and only way to keep a document is to have it freely copyable by everyone everywhere forever, end of story. Everything else is on reprieve.

  • by sugarmotor ( 621907 ) on Monday December 24, 2012 @06:30PM (#42384221) Homepage

    Amazon "search inside this book" has no results for "NP" as in P vs. NP. How can that be? The book doesn't draw the connection to this major relevant open question on one hand, but has "burden of proof" in the title on the other hand?

  • by TheSHAD0W ( 258774 ) on Monday December 24, 2012 @08:39PM (#42384839) Homepage

    And 64KB on the motherboard. I know, I had one.

    • by Anonymous Coward on Monday December 24, 2012 @09:14PM (#42384999)

      like that fact really matters.....

    • by WhirledOne ( 213095 ) on Monday December 24, 2012 @09:40PM (#42385087)

      Y'know, I wondered if anyone was going to point out something along those lines. Actually, IIRC, the original maximum "official" memory capacity of the early 64K PC1 was in fact 256K if you only used official IBM memory expansion cards, but the memory map officially allowed up to 512K of RAM (and was supported by some 3rd party expansion cards). A few years later, IBM apparently realized that there wasn't really a need to reserve the entire remaining 512K of addressing space for ROM and device-specific RAM (such as video RAM), so they "unreserved" a block of 128K, thus bringing the official maximum to 640K.
      Even then, it was still possible to get beyond 640K of base RAM by adding RAM in the "holes" unused by ROM on your particular PC, and using an appropriate driver so that MS-DOS would know about it. Examples of such "holes" in the memory map would be the space reserved for PCjr cartridge ROM, or the MDA video RAM space if you didn't have an MDA (or the CGA space if you had an MDA instead of a CGA). When VGA became commonplace, there was a shareware driver out there that would map the 64K VGA "window" for MS-DOS use and switch the card to CGA compatibility mode. This gave you 704K of usable base RAM in DOS without any additional hardware, and was great for text-mode or CGA-mode-only software where the VGA modes wouldn't be needed anyway.

      • by old_fortran ( 80585 ) on Tuesday December 25, 2012 @02:34PM (#42388977)

        The original 64KB 5150 motherboard (4 banks of 16KB each) supported 512KB via AST and other 3rd-party option cards, but carried ROMs that imposed a total system limit of less than 640KB. The second gen motherboard supported 4x 64KB, or 256KB on the motherboard, and 640KB of main memory overall. My recollection is of some number like 512KB + 32KB, for a total of 544KB, but it could have been 512KB + 64KB, or 576KB; STILL not 640KB. I remember this because I once had to replace ROMs on gen 1 motherboards so I could get some machines up to the full 640KB of available memory.

        What I remember more, however, was how fast IBM's original expectations for the PC were surpassed by people using its relatively open architecture to do far more with it than IBM had planned (or anticipated). In 1980-81, few at IBM (or anywhere, apparently) could conceive why one would want a PC with more than 128KB (64+64). By being open to change, the PC went quickly from that early 8-bit kind of view to one that would lead to a revolution in business and home computing. You can say what you want about PCs and Windows, but this is being typed on a garden-variety home-built PC vastly more powerful in every way than that distant ancestor. It has also had RAM, storage, and P/S upgraded over its lifetime (5 1/2 years thus far). Still useful and, more importantly, still usable with new OS versions (started on XP, moved to Win 7 when stable); and while I like having long HW and SW lifecycles, the point is I am not stuck with it as it was - like I am with my DVR (which is just a specialized Linux appliance, really).

        Thus I do wonder what the last 30+ years would have been like if the "Apple appliance computing" model had been adopted by IBM in 1980-85 instead of the more open one used for the PC / XT / AT. Even though it came out in 1984, the first gen Mac was ridiculous - a completely closed Moto 68K-based mini-workstation with an "8-bit machine" memory limit. It **could** have been built with a removable bottom plate and enough memory sockets for 4x 64KB - but only half populated (an expansion capability similar to what is now available for its distant descendant, the Mac Mini). But it wasn't - you had to physically upgrade your 1st gen Mac to get decent memory: to 512KB, aka the "Fat Mac", and then upgrade again to get a hard drive in the Mac Plus. Or you had to resort to strategies that would void your warranty (e.g., the hardware equivalent of a "jailbreak"). This Jobsian approach to evolution - via sales of more hardware - should sound familiar to Apple fanbois everywhere at this point (and is why I opted not to buy this year's version of the iPad Mini, but to wait for - GASP - the one with the proper CPU, camera, RAM, and screen).

        Ancient history? Not really. At least two current trends, (1) "wire-cutters" and (2) cloud computing, are going to see this "open architecture versus closed appliance/service" competition played out yet again. (A third may be iOS versus Android smartphones.) Overall I am still optimistic that on balance openness will lead to innovation that will be beneficial and also not necessarily anticipated by those who want everything tightly controlled for their own profit. This doesn't mean, however, that appliance advocates won't put up a good fight.

  • by DamnStupidElf ( 649844 ) <Fingolfin@linuxmail.org> on Tuesday December 25, 2012 @05:16AM (#42386193)

    In the paper world you have to invest significant resources to forge each paper document. In the digital world, if you can forge one document with a free tool you can forge as many as you want. To raise the cost of forging a digital document beyond what an attacker is willing to pay, the cost of legitimate use must become greater than the benefit.

    One possible solution is a hierarchy of security where the higher layers increase both the cost of forgery and the cost of legitimate use and let the market decide how much risk to bear. The SSL world tried to do that with extended validation certificates (the green address bar) but I'm not convinced it actually improved overall security since the problem is almost always at the user level. Maybe if they started selling extended validation hardware clients whose components were fixed in epoxy and ran highly secure firmware and software it would actually work. Trusted Computing is the obvious parallel in the PC world but it fails because the cost of developing software aimed at general purpose computers to rigorous security standards is too high. It's possible that as Moore's Law shoves hardware prices through the floor banks will just send relatively cheap secure hardware home with their new customers.

    • by stevetruelord ( 2801571 ) on Tuesday December 25, 2012 @09:33AM (#42386797)
      Good point...Bruce Schneier has written tons on this topic.
    • by starfishsystems ( 834319 ) on Tuesday December 25, 2012 @05:48PM (#42390243) Homepage


      We need to be clear about what EV is. It's not about SSL, it's about X.509. It doesn't solve a technical problem because EV identifies no technical problem with X.509 certificates. EV promises a procedural solution to a procedural problem, namely the failure by Certificate Authorities to take reasonable care to check the real-world credentials of certificate requestors in order to determine that they are who they claim to be. In effect, the CAs are saying, "Yeah, well, we were a bit negligent the last time around, but we promise to do a better job next time if you just pay us more money."

      So I share your misgivings about whether EV has improved security, but for rather different reasons. And there's nothing saying that we both can't be right.
    • To raise the cost of forging a digital document beyond what an attacker is willing to pay, the cost of legitimate use must become greater than the benefit.

      Exactly why we have spam. If sending forged-address and spammy e-mail required prepaid costs that made it as expensive as snail mail, spam would drop from 90% of mail to less than 0.1%. It's only because it costs nearly nothing, either directly for small quantities (like phishing scams) or via botnets for huge quantities (bulk ads for fraudulent products, including erectile dysfunction drugs), that so much can be generated. We have some partial solutions, but the entire solution requires better tools to manage the movement and transfer of email.

      As for the security of e-documents and other electronic transmissions, we need good tools to seamlessly handle authentication (proof that the sender is who they claim to be) and non-repudiation (proof that you signed [or authorized] the document and are unable to claim you didn't - or that the recipient has proof you did).

      Repudiation is extremely critical to claiming fraud or stopping a transaction, like when you claim a credit card charge is not authorized, or the merchant stiffed you and you want a chargeback, or when you want to stop payment on a check. Non-repudiation is critical if you verified the recipient, provided the goods and services, and want to guarantee you get paid and that the recipient can't falsely file a chargeback when they legitimately did receive the merch.

      For ordinary transactions using paper documents we have notaries public, who provide both; the notary affixes their seal and signature, essentially countersigning the signature on original signed documents or verifying certified copies of documents. Some organizations, like stock certificate issuers, require more, and so there is the Medallion Signature Guarantee, where the authenticating official certifies the identity of the party to the document and agrees to cover any financial loss to the recipient of the document if the person who was guaranteed is not the correct individual (or agent, if they're signing as attorney-in-fact or on behalf of an entity like a corporation or LLC).

      I am a notary public, licensed in two states. (A notary's license, or commission, is, with minor exceptions, only valid in the state of issue.) The jobs of a notary, in general, include the certification of documents, the authentication of signatures, and the taking of oaths and affirmations (and some states don't include all of these, or add special extras; notaries in Maryland can't certify copies of documents like contracts, while in Ohio notaries have the power to issue subpoenas). All done on paper, and the job of a notary hasn't changed from essentially the way it was done 200 to 500 years ago. There are provisions for notaries to authenticate electronic documents, but they're not uniform from state to state (some states don't allow it at all, or lack standards on how it is to be done), and the infrastructure to support it (including proper software and/or hardware for document certification and authentication) isn't there yet.

      Until we have well-defined electronic equivalents of notaries public and guarantees like Medallion, we're going to continue to have a problem with those very issues of authentication and/or non-repudiation (plus security, when the document must be kept private, like certain contracts, or secret, like classified material).
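      A minimal sketch of the signature primitive an electronic notary equivalent would rest on, assuming pyca/cryptography; the document text and key names are illustrative:

      ```python
      from cryptography.exceptions import InvalidSignature
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      signer = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      document = b"I authorize payment of $100 to ACME Corp."

      pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                        salt_length=padding.PSS.MAX_LENGTH)

      # Only the holder of the private key can produce this signature,
      # which is the basis of a non-repudiation claim.
      signature = signer.sign(document, pss, hashes.SHA256())

      # Anyone with the public key can verify; changing a single byte
      # of the document makes verification fail.
      try:
          signer.public_key().verify(signature, document, pss, hashes.SHA256())
          print("signature valid")
      except InvalidSignature:
          print("signature invalid")
      ```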

  • by AmiMoJo ( 196126 ) * on Tuesday December 25, 2012 @09:13AM (#42386747) Homepage Journal

    Jean-Fran&amp;amp;amp;amp;#231;ois Blanchette

    When TFA text is mangled by Slashdot's encoding, you know something is wrong. I know Unicode has its problems, but it really should be a priority at this point.

  • by Anonymous Coward on Tuesday December 25, 2012 @05:17PM (#42390043)

    We sure have a lot to worry about if a tech site can't even get this right...

    Ditto for the RSS feed, which too often is treated as if it were HTML when it isn't. It's an XML implementation...

  • by Anonymous Coward on Monday January 07, 2013 @02:20PM (#42507347)

    Just about none of the comments here deal with the claims that cryptography can improve legal evidence. It can bolster assertions of integrity, in that good use of a hash function can show that a document has not been altered between time A and time B. But its use in authentication is subject to a lot of problems. Those problems do not arise because of any problem with the mathematics, and debates about what level of encryption can be broken by what level of attack or attacker are not really helpful for the usual processes in civil or criminal courts.

    The problem with encryption systems is the interface between the user of the computer and the information created by it. How does the computer user know for sure that what he or she intends to create/encrypt/sign is actually what the computer generates as a result? And on the other end (authentication in a court process i.e. submission of evidence), how does the relying party prove that what was received was created/signed by the other party, rather than by someone else at the creator's computer or by intervening malware or a man-in-the-middle?

    Generally, access to the encryption tools is by way of a user ID and password, which are very vulnerable to attack. Even biometric controls are not foolproof, and most people don't have them. I doubt that even French notaries access their document-creating computers through a biometric screen.

    Some elements of the discussion are here: http://www.slaw.ca/2012/07/16/the-myth-of-non-repudiation/ .
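    A minimal sketch of the integrity use described above (hash the document at time A, re-check at time B), using only Python's standard library; the filename is illustrative:

    ```python
    import hashlib

    def digest(path: str) -> str:
        """SHA-256 of a file, read in chunks so large files are fine."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    # Record the digest at time A (say, when the evidence is collected)...
    baseline = digest("contract.pdf")
    # ...then at time B recompute and compare: any alteration changes it.
    assert digest("contract.pdf") == baseline, "document was altered"
    ```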
