
The Deciders at the Tip of the Technological Iceberg

Yesterday, the Huffington Post published a story about something I covered in a couple of blog articles a while back. They mentioned what I wrote about, getting part of it wrong, but that’s to be expected from reporters who aren’t necessarily familiar with the law.[1]

Later in the day, I received a phone call from another reporter wanting to discuss the same topic.

I tried to convince that reporter that the real story was not exactly what everyone seems to be focused on. Or, more accurately (because the Huffington Post did address one aspect of private companies being used for law enforcement data storage), the real story is the iceberg, and Axon (formerly Taser) comprises only the Deciders at the Tip of the Iceberg.

To quote from the Huffington Post article,

[T]he problem exposes a more fundamental risk – a risk largely underappreciated [sic] by entrepreneurs and venture funds eagerly investing in police technology companies. Unlike many other industries or enterprises, policing and criminal justice is a tradition-bound and law-bound world, where legal rules, requirements, and practice can thwart innovation.

How many CEO’s understand the rules of criminal procedure, constitutional requirements or evidence rules? How many entrepreneurs know it is a criminal offense in some jurisdictions to share confidential juvenile records? As a VC, would you bet a company on your ignorance?

The focus in the Huffington Post is on the fiscal and legal fortunes of those who take on the task of creating “police technology companies.”

That’s all fine and good, but to quote “HuffPo” again,

Constitutional rights do not bend in the face of disruptive innovations. Zealous lawyers with obligations to protect statutory rights do not care about cost efficiencies or actionable analytics.

What I care about—and what is the foundational concern to the articles I’ve written already—is “who decides how tech interfaces with the criminal law system?”

The Military-Industrial Police State

To paraphrase G.R.R. Martin, “technology is coming.” Whether we want it or not, the police are going to adopt new technologies, and people who like to make money are going to do whatever they can to provide new technologies for adoption. From what I hear, GoPro is already working on a competing product to Axon’s Evidence.com. I’m sure there are others.

In fact, technology has already come. Long before Evidence.com was a thorn in my side, President Dwight D. Eisenhower told us why it had come, and warned about the need for an alert, knowledgeable citizenry to keep a leash on it. He said:

This conjunction of an immense military establishment and a large arms industry is new in the American experience. The total influence—economic, political, even spiritual—is felt in every city, every state house, every office of the Federal government. We recognize the imperative need for this development. Yet we must not fail to comprehend its grave implications. Our toil, resources and livelihood are all involved; so is the very structure of our society.

The truth is, we have largely failed at that task.

The military-industrial complex has continued to expand. Eisenhower’s observation that the total influence was felt “in every city, every state house, every office of the Federal government” has never been more true than today, when not just the military, but local police forces in small towns make use of advanced military technology—particularly, but not exclusively, weaponry—to execute search warrants for small-time drug dealers, petty theft investigations, and cock-fighters.

There is, after all, money to be made:

After the free trial is up, Axon’s Body 2 camera has a one-time cost of $399, while use of the company’s platform costs $15 to $89 per month, per officer.

Deus Ex Machina

Police technology isn’t limited to adoption of military weapons. On other fronts, defense attorneys face the deus ex machina—or the diabola ex machina—that dwells at the hearts and in the “minds” of Intoxilyzers, Video Spectral Comparators, and DNA sequencers, to name just a few of the technologies used by law enforcement.

Who knows how these things work? Well, someone does. But great efforts are exerted to keep criminal defense lawyers from finding out.

This is important. Because, as this 2008 article points out,

These black boxes not only deprive citizens’ of their right to drive, but also wrecks [sic] lives and puts [sic] innocent people in jail.

Yet, as the article also notes, the Attorney General for the State of Minnesota initially tried to cover up known flaws in the Intoxilyzer software—for Intoxilyzers that were still being used to deprive citizens of their rights at the time that article was written.

Don’t think he’s the only one.

Americans have been indoctrinated by various CSI television shows to believe the contraptions of science provide definitive answers to questions of guilt, or lack thereof.

Nowhere is this more true than with DNA. Unpronounceable, impressive-sounding, science-y numbers are thrown out by experts to prove that no life form in the known universe could possibly match a particular DNA sequence, except for the accused. (What the hell is an “enneadekillion”? It’s the new name for what’s known in the U.S. as an “octodecillion,” or, in Europe’s system, developed by Jacques Pelletier du Mans, as a “nonilliard,” of course.)

There’s just one problem with this. Actually, there’s more than one, but in the interest of brevity,[2] I’m only going to mention this one: technicians often have to interpret the results output by the machine. And to do that requires the technician to have the ability to mind-meld with the machine.

Okay, maybe not mind-meld (yet). But the technician has to know how the machine works in order to accurately interpret the results.

This, of course, means that the criminal defense team needs to know how the machine works in order to adequately challenge both the results, and the interpretation of the results. But, for reasons allegedly pertaining to intellectual property, companies do not want anyone to know how the machines work. Oz remains safely enshrined behind a curtain of “trust us.”
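To make those astronomical numbers concrete: they typically come from the “product rule,” multiplying per-locus frequencies across many tested loci. The sketch below uses invented frequencies (real casework draws them from population databases) to show how quickly modest per-locus numbers become a one-in-sextillions claim, and why the independence assumption baked into the multiplication is itself fair game for cross-examination:

```python
# Illustrative sketch of the "product rule" behind DNA match statistics,
# using invented per-locus genotype frequencies. Real casework draws
# these from population databases; the independence assumption that
# justifies multiplying them together is itself contestable.

locus_frequencies = [0.05, 0.10, 0.08, 0.12, 0.07] * 4  # 20 loci

rmp = 1.0  # random match probability
for freq in locus_frequencies:
    rmp *= freq  # multiply across loci, assuming they are independent

print(f"Random match probability: 1 in {1 / rmp:.2e}")
# With these made-up numbers: roughly 1 in 8 sextillion (about 7.85e21).
```

Five frequencies between one-in-eight and one-in-twenty, repeated over twenty loci, are all it takes to reach a number larger than any jury can meaningfully evaluate.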

The deus, or diabola, ex machina problem is one that has existed for some time. It’s an important area that needs to be addressed. It’s going to become even more important as newer tech is phased in not just for the creation of evidence, such as DNA profiles, but to manipulate other pre-existing forms of evidence, like pictures, video, or sound.

But before I say anything more about that, let’s look at electronic “storage” software and facilities, such as those comprising Evidence.com.

Electronic “Storage” Facilities

So in my initial discussions with the District Attorney’s Office over the use of Evidence.com in juvenile cases, the response was, “These are just electronic ‘storage’ areas.” The Assistant District Attorney who ended up calling me to talk about this wanted to know how this differed from the District Attorney’s Office renting a room at Derrell’s Mini-Storage, and putting paper juvenile files in it.

The difference, as I explained, is that Derrell’s doesn’t have the key to the lock on the rental space. While certainly they could cut the lock off to gain access to the files, doing so would become immediately apparent (to anyone who looked). Someone would certainly be looking at some criminal charges for doing that. Should the DA decide to quit using Derrell’s, it’s not going to be that difficult to retrieve, and remove, all the files before Derrell’s takes control of the space.

Axon/Taser, on the other hand, never has to worry about not having the key to the lock on the rental space. There might be ways to set up a virtual system (via encrypted files, for example) that have a similar effect. Notwithstanding the User Agreement I was originally asked to sign, I’m sure Axon/Taser might even try to say that they have something like that in place right now.[3]

But if Axon (or anyone else) breaks the encryption key (if there is one) to access the data, who’s going to know? It’s not like there will be a broken lock lying on the ground, or missing from the storage area door.
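For what it’s worth, the “broken lock” signal can be recreated digitally. The sketch below illustrates one generic technique (not anything Axon/Taser actually implements, as far as I know): the agency keeps a secret key the storage host never sees, and records a keyed fingerprint of each file at upload time. Any later alteration of the hosted copy makes the fingerprint stop matching.

```python
import hashlib
import hmac

# Hypothetical sketch of a tamper-evident "digital lock": the agency
# holds a secret key the storage host never sees, and records a keyed
# fingerprint (an HMAC) of each file at upload time. This illustrates
# a general technique, not a description of anything Axon/Taser does.

secret_key = b"held-by-the-agency-not-the-host"  # hypothetical key

def fingerprint(evidence: bytes) -> str:
    return hmac.new(secret_key, evidence, hashlib.sha256).hexdigest()

original = b"body-cam footage, unit 1234"  # stand-in for a video file
receipt = fingerprint(original)            # recorded at upload time

tampered = b"body-cam footage, unit 1234 (edited)"

# An unaltered copy still matches the receipt; an altered one does not.
print(hmac.compare_digest(receipt, fingerprint(original)))  # True
print(hmac.compare_digest(receipt, fingerprint(tampered)))  # False
```

A mismatched fingerprint is the digital equivalent of a cut lock lying on the ground: it doesn’t stop the intrusion, but it makes the intrusion visible to anyone who looks.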

I’m not going to repeat my entire “The Nerve of Law Enforcement” article here, but Axon/Taser has indicated that it has other uses—such as TV shows—for the data uploaded to Evidence.com. I’m guessing that means they know how to access it.[4]

The bottom line is that an electronic “storage” facility is about as much like a real-world storage facility as my Second-Life avatar is like my real-life body.[5]

What’s What, Who’s Who, and Who Do I Cross-Examine?

Aside from the above, there are other important concerns with technology, particularly with “storage.” We already know that Evidence.com provides the capability to manipulate evidence that is uploaded to it. We also know that police officers have allegedly used such capabilities. Other digital media used for evidentiary purposes suffer the same potential defects.

I’m not even talking about the “low-tech” methods some police use.

What happens to the evidence once it’s uploaded to Evidence.com, or similar services?

The technology that could enable the future…isn’t the cameras themselves, but artificial intelligence tasked with processing the video and audio—transcribing police interviews and automatically generating timelines, descriptions, and testimonies surrounding crime. Algorithms considered intelligent today are mainly specialized to accomplish one task with great speed—an algorithm can identify objects in photos far faster than humans, or transcribe hours of speech into text in minutes.

Civil rights activists are rightly concerned about the uses to which this technology might be put. As a criminal defense attorney, I have different concerns, more similar to those relating to DUI defense, or defending cases with DNA evidence.

I’m not an expert on facial recognition. Half the time, I’m not even an expert at recognizing faces. Similar as those last two things might seem, I’m expert enough to know that they aren’t. If you think about it, you’ll realize you know a thing or two about facial recognition, and facial recognition software flaws, also. Facebook, for example, sometimes correctly recognizes faces, sometimes incorrectly “recognizes” faces, and sometimes fails to see a face at all where you can clearly see one.
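Facebook’s hit-and-miss recognition can be made concrete. The sketch below uses made-up embedding vectors and a made-up threshold (nothing from any real vendor’s system) to show why the same matcher can both miss the right face and “recognize” the wrong one:

```python
import math

# Toy sketch with invented numbers: face matchers reduce two images to
# embedding vectors, then compare a similarity score to a threshold.
# Both the embeddings and the threshold here are hypothetical.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

enrolled       = [0.9, 0.1, 0.4]     # the face on file
same_face_dark = [0.5, 0.5, 0.1]     # same person, badly lit photo
lookalike      = [0.88, 0.12, 0.42]  # a stranger who photographs similarly

THRESHOLD = 0.95  # an arbitrary design decision, tunable by the vendor

for label, probe in [("same person, bad lighting", same_face_dark),
                     ("look-alike stranger", lookalike)]:
    score = cosine_similarity(enrolled, probe)
    verdict = "match" if score >= THRESHOLD else "no match"
    print(f"{label}: score={score:.3f} -> {verdict}")

# The badly lit photo of the right person scores below the threshold
# (a false negative), while the look-alike scores above it (a false
# positive): Facebook's hits and misses in miniature.
```

Everything hangs on where the vendor sets that threshold, and on how the embeddings behave under real-world conditions like lighting, angle, and skin tone, which is exactly the sort of thing the defense needs to be able to probe.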

In any event, just as blood-or-breath-alcohol analysis suffers a diabola ex machina problem, just as interpretation gives a false sense of the deus ex machina in DNA analysis, facial recognition is going to have its issues, as well.

Am I hanging out with Lana Lang of Smallville? Or Nicole Arnaiz of the Hernandez Strategy Group?

Axon isn’t planning to stop at facial recognition. Are defense attorneys going to have as much trouble learning how the Artificial Intelligence “decides” what to include in timelines, reports, interviews, and “testimonies”? Will we have unadulterated raw data, or just what’s been manipulated in some way (however minor)? What will this do to cross-examination? Will the officers become incompetent witnesses regarding evidence they themselves collected? Will defense attorneys be able to cross-examine AI constructs? What are the foundational issues to AI-mediated, AI-created, AI-curated evidence? What happens to evidence when cases are closed? What happens 20 years from now when someone files a habeas writ in a death-penalty case, and issues arise around either the evidence, or the way it was “stored” by Axon/Taser, or a similar company? A company which has gone out of business? Technology that evolved, but didn’t bring the evidence along with it?

Who Decides: Private Companies, Law Enforcement, Prosecutors, Legislatures

As it stands right now, these decisions are being made—if they are even considered—by private companies, at best in conjunction with law enforcement agencies that comprise their target market.

Let’s take a moment to remember that these agencies are the same ones who sue for the right to hire less-intelligent officers so they won’t get bored with their jobs; the same ones whose officers are not expected to know the law—so courts invented the Leon “good faith” exception to cover pseudo-investigatory mistakes.

Perhaps that’s why, as I complained in “The Nerve of Law Enforcement,” they ignored the law relating to juvenile case files: they just weren’t really aware of it. Then, too, the District Attorney’s Office apparently signed off on it.

This is one reason why I not only balked at the requirements to access discovery, but keep blogging about it. Axon/Taser, GoPro—and I should note that someone alleging to work as a consultant on the GoPro project called me after I wrote my article because they “want to do it right”—and other tech companies should not be the Deciders on the Tip of this Technological Iceberg.

Unlike the Huffington Post, I don’t have a lot of faith that,

Judges – not infrequently uncertain about new technologies – will likely side with defendant’s rights over corporate innovation.

Aside from their vested financial interest, and their need to satisfy law enforcement (their customers) rather than defense attorneys or civil rights lawyers, the companies currently making these decisions simply have no understanding of the public policies behind the legislation that comprises the criminal law system.

Guess whose responsibility that is?

  1. And it should be noted that I don’t recall talking to anyone from HuffPost, so that’s probably another reason for it. [↩]
  2. Ha ha! [↩]
  3. I have heard from a number of sources, including the Assistant DA who was handling this, that Axon/Taser has changed the User Agreement, partly in response to complaints like mine. [↩]
  4. And, by the way, this is another issue that probably should be considered by Legislatures. See the last section of this article. [↩]
  5. Second Life is still a thing, right? [↩]