Friday, September 21, 2007

2007-09-21 DC Bar Publishes Metadata Ethics Opinion - Sending and Mining Issues Addressed

The District of Columbia Bar recently issued an opinion on a lawyer's ethical obligations and responsibilities in connection with sending or receiving, outside the discovery context, electronic documents that might contain confidential information.

The DC Bar takes a somewhat less balanced approach than Florida, permitting greater leeway to a receiving party. To that end, the DC Bar opinion places upon the sending attorney a duty to take reasonable steps not to send electronic information containing confidence-laden metadata, but applies an "actual knowledge" standard (rather than Florida's more stringent, negligence-based "knows or should know" standard) to the receiving attorney's awareness that a received document contains metadata.

In actuality, the "actual knowledge" element might well prove difficult to establish, as one might presume that a recipient would act upon such knowledge in a way that does *not* give rise to any suspicion of metadata mining. For instance, an attorney might have "actual knowledge," but unless that attorney acted in some way in furtherance of that knowledge (such as asserting a fact found only in the metadata, or writing a brief setting forth such confidences, both of which seem fairly unlikely), an allegation of actual knowledge would probably fail.

Excerpt from DC Ethics Opinion 341:

Electronic Documents Provided Outside of Discovery

1. The Sending Lawyer:
Lawyers sending electronic documents outside of the context of responding to discovery or subpoenas have an obligation under Rule 1.6 to take reasonable steps to maintain the confidentiality of documents in their possession. This includes taking care to avoid providing electronic documents that inadvertently contain accessible information that is either a confidence or a secret and to employ reasonably available technical means to remove such metadata before sending the document. See N.Y. State Bar Ass'n Committee Op. 782. Accordingly, lawyers must either acquire sufficient understanding of the software that they use or ensure that their office employs safeguards to minimize the risk of inadvertent disclosures.

2. The Receiving Lawyer:
More often than not, the exchange of metadata between lawyers is either mutually helpful or otherwise harmless. Lawyers routinely exchange contracts, stipulations, and other documents that include “track changes” or other software features which highlight suggested modifications. Similarly, spreadsheets include necessary metadata such as formulas for the columns and rows, thereby providing a useful understanding of the calculations made. But when a receiving lawyer has actual knowledge that the sender inadvertently included metadata in an electronic document, we believe that the principles stated in Opinion Nos. 256 and 318 relating to inadvertent production of privileged material should be used in determining the receiving lawyer’s obligations. In Opinion No. 256, we stated that, where a lawyer knows that a privileged document was inadvertently sent, it is a dishonest act under D.C. Rule 8.4(c) for the lawyer to review and use it without consulting with the sender. We reached a similar conclusion in Opinion No. 318, regarding the receipt of documents from third parties. However, we noted in Opinion 318 that, where the privileged nature of the document is not apparent on its face, there is no obligation to refrain from reviewing it, and the duty of diligent representation under D.C. Rule 1.3 may trump confidentiality concerns.

Saturday, September 08, 2007

2007-09-08 Source Code for Breath Alcohol Device Revealed --- With Scathing Expert Analysis and a pre-4th Circ. Crawford v. Washington Testimonial Hearsay Argument - State v. Chun (N.J. Supreme Court, Docket No 58,879)

More proof that "code speaks" after all, and a good reference tool for those who argue that examining the source code comprising a computing environment is essential to arguments either supporting or challenging computer-generated evidence.

A breakthrough occurred during the pendency of an appeal of a DUI conviction involving a breath-alcohol device before the New Jersey Supreme Court. The manufacturer of the device, the "Draeger Alco-Test 7110 Mk III," voluntarily provided the source code in advance of a decision by the Supreme Court of New Jersey. A source code examination was performed by software house Base One.

Overview of Expert Findings (which identified more than 19,400 potential errors in the code):

"1. The Alcotest Software Would Not Pass U.S. Industry Standards for Software Development and Testing: The program presented shows ample evidence of incomplete design, incomplete verification of design, and incomplete “white box” and “black box” testing. Therefore the software has to be considered unreliable and untested, and in several cases it does not meet stated requirements. The planning and documentation of the design is haphazard. Sections of the original code and modified code show evidence of using an experimental approach to coding, or use what is best described as the “trial and error” method. Several sections are marked as “temporary, for now”. Other sections were added to existing modules or inserted in a code stream, leading to a patchwork design and coding style. The software development life-cycle concept is governed by one of the nationally and internationally recognized development standards to prevent defects from entering the software during the design process, and to find and eliminate more defects as the software is coded, tested, and released to the field. This concept of software development using standards requires extensive and meticulous supporting data, and notations in source files, and a configuration management system. None of this methodology is evident in the Alcotest code. Further, the decision method of how to allocate the architecture and assignment of tasks does not match any of the software standards. This further substantiates that software development standards were not used to verify or test the software, including the ISO 9000 family of standards. It is clear that, as submitted, the Alcotest software would not pass development standards and testing for the U.S. Government or Military. It would fail software standards for the Federal Aviation Administration (FAA) and Federal Drug Administration (FDA), as well as commercial standards used in devices for public safety. 
This means the Alcotest would not be considered for military applications such as analyzing breath alcohol for fighter pilots. If the FAA imposed mandatory alcohol testing for all commercial pilots, the Alcotest would be rejected based upon the FAA safety and software standards.

2. Readings are Not Averaged Correctly: When the software takes a series of readings, it first averages the first two readings. Then, it averages the third reading with the average just computed. Then the fourth reading is averaged with the new average, and so on. There is no comment or note detailing a reason for this calculation, which would cause the first reading to have more weight than successive readings. Nonetheless, the comments say that the values should be averaged, and they are not.

3. Results Limited to Small, Discrete Values: The A/D converters measuring the IR readings and the fuel cell readings can produce values between 0 and 4095. However, the software divides the final average(s) by 256, meaning the final result can only have 16 values to represent the five-volt range (or less), or, represent the range of alcohol readings possible. This is a loss of precision in the data; of a possible twelve bits of information, only four bits are used. Further, because of an attribute in the IR calculations, the result value is further divided in half. This means that only 8 values are possible for the IR detection, and this is compared against the 16 values of the fuel cell.

4. Catastrophic Error Detection Is Disabled: An interrupt that detects that the microprocessor is trying to execute an illegal instruction is disabled, meaning that the Alcotest software could appear to run correctly while executing wild branches or invalid code for a period of time. Other interrupts ignored are the Computer Operating Property (a watchdog timer), and the Software Interrupt.

5. Implemented Design Lacks Positive Feedback: The software controls electrical lines, which switch devices on and off, such as an air pump, infrared source, etc. The design does not provide a monitoring sensory line (loop back) for the software to detect that the device state actually changed. This means that the software assumes the change in state is always correct, but it cannot verify the action.

6. Diagnostics Adjust/Substitute Data Readings: The diagnostic routines for the Analog to Digital (A/D) Converters will substitute arbitrary, favorable readings for the measured device if the measurement is out of range, either too high or too low. The values will be forced to a high or low limit, respectively. This error condition is suppressed unless it occurs frequently enough.

7. Flow Measurements Adjusted/Substituted: The software takes an airflow measurement at power-up, and presumes this value is the “zero line” or baseline measurement for subsequent calculations. No quality check or reasonableness test is done on this measurement. Subsequent calculations are compared against this baseline measurement, and the difference is the change in airflow. If the airflow is slower than the baseline, this would result in a negative flow measurement, so the software simply adjusts the negative reading to a positive value. If the measurement of a later baseline is taken, and the measurement is declared in error by the software, the software simply uses the last “good” baseline, and continues to read flow values from a declared erroneous measurement device.

8. Range Limits Are Substituted for Incorrect Average Measurements: In a manner similar to the diagnostics, voltage values are read and averaged into a value. If the resulting average is a value out of range, the averaged value is changed to the low or high limit value. If the value is out of range after averaging, this should indicate a serious problem, such as a failed A/D converter.

9. Code Does Not Detect Data Variations

10. Error Detection Logic: The software design detects measurement errors, but ignores these errors unless they occur a consecutive total number of times. For example, in the airflow measuring logic, if a flow measurement is above the prescribed maximum value, it is called an error, but this error must occur 32 consecutive times for the error to be handled and displayed. This means that the error could occur 31 times, then appear within range once, then appear 31 times, etc., and never be reported. The software uses different criteria values (e.g. 10 instead of 32) for the measurements of the various Alcotest components, but the error detection logic is the same as described.

11. Timing Problems: The design of the code is to run in timed units of 8.192 milliseconds, by means of an interrupt signal to a handler, which then signals the main program control that it can continue to the next segment. The interrupt goes off every 8.192 ms, not 8.192 ms from my latest request for a time delay. The more often the code calls a single 8.192 ms interrupt, the more inaccurate the software timing can be, because the requests from the mainline software instructions are out of phase with the continuously operating timer interrupt routine.

12. Defects In Three Out Of Five Lines Of Code: A universal tool in the open-source community, called Lint, was used to analyze the source code written in C. This program uncovers a range of problems from minor to serious problems that can halt or cripple the program operation. This Lint program has been used for many years. It uncovered that there are 3 error lines for every 5 lines of source code in C. While Draeger's counsel claims that "The Alcotest [7110] is the single best microprocessor-driven evidential breath tester on the market", Draeger has already replaced the antiquated 7110 with a newer Windows® based version, the 9510. The computer code in the 7110 is written on an Atari®-styled chip, utilizing fifteen to twenty year old technology in 1970s coding style."
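The averaging defect described in item 2 is easy to make concrete. Here is a minimal Python sketch (the function names and sample readings are hypothetical illustrations, not the actual Alcotest code): averaging each new reading against the running result weights the readings unequally, rather than producing the arithmetic mean the code's own comments call for.

```python
def pairwise_running_average(readings):
    """Average each new reading against the previous result, as item 2
    describes: avg = (avg + next) / 2 at every step. For four readings
    the effective weights are 1/8, 1/8, 1/4, 1/2 -- not 1/4 each."""
    avg = readings[0]
    for r in readings[1:]:
        avg = (avg + r) / 2
    return avg

def arithmetic_mean(readings):
    return sum(readings) / len(readings)

readings = [100, 200, 300, 400]
pairwise_running_average(readings)  # -> 312.5
arithmetic_mean(readings)           # -> 250.0
```

Whichever reading ends up over- or under-weighted, the point stands: the result is not the simple average the comments describe.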
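The precision loss in item 3 can likewise be demonstrated in a few lines (a hypothetical Python sketch, not the device's code): integer-dividing a 12-bit A/D value by 256 leaves only 16 distinct outputs, and the further halving for the IR channel leaves 8.

```python
def quantize(raw, divisor=256):
    """Collapse a 12-bit A/D reading (0..4095) as item 3 describes."""
    return raw // divisor

# Only 16 distinct values survive for the fuel cell, and 8 for IR:
distinct_fuel_cell = {quantize(v) for v in range(4096)}
distinct_ir = {quantize(v) // 2 for v in range(4096)}
len(distinct_fuel_cell), len(distinct_ir)  # -> (16, 8)

# Raw readings up to 255 counts apart become indistinguishable:
quantize(256), quantize(511)  # -> (1, 1)
```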
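Item 7's baseline handling can be sketched as follows (hypothetical Python; the names and values are illustrative). Because the power-up reading is accepted without any reasonableness check and negative differences are simply flipped positive, a drifted baseline becomes indistinguishable from genuine airflow.

```python
def flow_delta(reading, baseline):
    """Difference from the power-up 'zero line', per item 7: a negative
    result is simply made positive rather than treated as an error."""
    delta = reading - baseline
    return -delta if delta < 0 else delta

# With a baseline that drifted 20 counts high, a no-flow reading of 100
# produces exactly the same output as a genuine 20-count flow:
flow_delta(100, baseline=120)  # -> 20
flow_delta(120, baseline=100)  # -> 20
```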
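The consecutive-error logic in item 10 is worth seeing directly. In this hypothetical Python sketch (only the limit of 32 is taken from the report), a fault that recurs 31 times, clears for a single reading, and recurs again is never reported at all.

```python
CONSECUTIVE_LIMIT = 32  # per item 10; other components reportedly use e.g. 10

def count_reported_errors(measurements, max_flow):
    """Report an error only after CONSECUTIVE_LIMIT out-of-range readings
    in a row, as item 10 describes."""
    consecutive = 0
    reported = 0
    for m in measurements:
        if m > max_flow:
            consecutive += 1
            if consecutive == CONSECUTIVE_LIMIT:
                reported += 1
        else:
            consecutive = 0  # one in-range reading hides all prior errors
    return reported

faulty = ([999] * 31 + [50]) * 10                 # 310 out-of-range readings
count_reported_errors(faulty, max_flow=100)       # -> 0 (never reported)
count_reported_errors([999] * 32, max_flow=100)   # -> 1
```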
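Finally, the phase problem in item 11 can be sketched as well (hypothetical Python; only the 8.192 ms period comes from the report). Because the interrupt fires at fixed boundaries rather than 8.192 ms after each request, a one-tick delay requested mid-interval completes early by the phase offset.

```python
import math

TICK_MS = 8.192  # free-running interrupt period, per item 11

def actual_delay(request_time_ms, ticks_requested=1):
    """Time until the caller is released: the first 'tick' is only whatever
    remains until the next fixed boundary, not a full TICK_MS."""
    next_boundary = math.ceil(request_time_ms / TICK_MS) * TICK_MS
    return (next_boundary - request_time_ms) + (ticks_requested - 1) * TICK_MS

# A one-tick delay requested 4.0 ms into an interval lasts only about
# 4.192 ms, not the 8.192 ms the caller asked for:
actual_delay(4.0)
```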

The link to the Overview:

The link to the complete Base One Expert Report:

Crawford v. Washington Argument re: Testimonial Hearsay: As for the application of Crawford v. Washington to computer-generated information used to accuse (and convict), appellant's counsel Evan Levow argues in Appellant's Brief that "[T]he Framers would be astounded to learn that ex parte testimony could be admitted against a criminal defendant because it was elicited" not "by 'neutral' government officers," but by a machine. Mr. Levow's argument is on point, but his brief concludes that the computer code is testimonial hearsay without addressing whether there is a human "declarant" uttering the out-of-court statement; he does not address the definitional issue that would elevate the source code from non-hearsay to hearsay status. Here, the link to Appellant's Brief:

The result of the Base One examination provides additional support for the proposition that the "statements" of the computer programmers in fashioning a series of "if-then" statements, particularly when taken together with the remarks that may be recorded in uncompiled source code used to create a software and hardware environment for a breath-alcohol testing device (or any computer, for that matter), are the "statements" of a human declarant made out of court. This critical issue has been misaddressed, or through semantic confusion misinterpreted, by the 4th and the 10th Circuits in their view that the computer code used to produce results (and I am careful not to limit the code process to "calculating" them) is not a "statement" made by a "person" --- and therefore also not hearsay that might subject the code to the 6th Amendment scrutiny afforded such testimony by the Supreme Court in Crawford.