Chains of autonomous responsibility

How will technology and law collide?

Supply Chain and Systems

Today I'd like to talk about something really boring. I've covered the granular details of implementing robotic forklifts, and how autonomous trucking fleets still have a way to go, but that stuff's nowhere near as dry as this next piece.

To answer your question: no, I am not trying to test out a new form of warfare that bores people to sleep. I am simply trying to clear the backlog of material I've not posted anywhere, and I will do my absolute best to make this topic less painful.

That topic is chain of responsibility, or CoR. Personally, I think this is one of the most important parts of logistics and supply chain management. For those who don't know, CoR is the industry's way of keeping track of who to blame: a checklist of causes, if you will. A typical CoR framework (a word I admittedly have no idea how to use) looks something like this.

A truck gets loaded by a forklift operator with the driver supervising. Once the loading is done, the forklift operator signs a piece of paper declaring that they have followed the driver's load instructions. From that point on, responsibility for the truck and its load is all on the driver.
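
Since I just described that handoff in words, here's how I picture it as data: a minimal, hypothetical sketch in Python of an append-only log where each signed declaration hands responsibility to the next party. The names (`SignOff`, `ResponsibilityChain`) and fields are my own invention for illustration, not anything from a real CoR system.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class SignOff:
    """One signed declaration in the chain of responsibility.

    Note: a toy model, not a real industry record format.
    """
    party: str                      # who is signing, e.g. "forklift operator"
    declaration: str                # what they are attesting to
    responsibility_passes_to: str   # who holds responsibility from here on
    signed_at: datetime = field(default_factory=datetime.now)


@dataclass
class ResponsibilityChain:
    """An append-only log of sign-offs for one truck and its load."""
    vehicle_id: str
    sign_offs: list[SignOff] = field(default_factory=list)

    def sign(self, party: str, declaration: str, passes_to: str) -> None:
        # Links are only ever added, never erased or rewritten.
        self.sign_offs.append(SignOff(party, declaration, passes_to))

    def currently_responsible(self) -> str:
        """Whoever the most recent sign-off handed responsibility to."""
        if not self.sign_offs:
            return "operator"  # nobody has signed anything yet
        return self.sign_offs[-1].responsibility_passes_to


# The loading scenario described above.
chain = ResponsibilityChain(vehicle_id="TRUCK-042")
chain.sign(
    party="forklift operator",
    declaration="Load secured per the driver's load instructions",
    passes_to="driver",
)
print(chain.currently_responsible())  # -> "driver"
```

The point of the append-only list is that you never erase a link in the chain; you only add the next sign-off, which is roughly what the paperwork does too.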


Another example of CoR in action happens before a truck is even taken on the road: the driver needs to confirm that the truck is safe to operate by ticking off a checklist and signing their name to it. The office participates too, because staff, within reason, need to assess whether the driver is fit to carry out their driving duties.

There are a lot more checks and structures in place, and all of them are designed to ensure that the links in the supply chain are held to some sort of responsibility. The idea is to manage risk and, in a way, to disincentivise companies from running questionable operations. Because, and I know I promised to try to keep things light, the chain of responsibility is a very serious and important thing. It is dangerous to put machinery on the road, and even more so when it's carrying heavy goods. It's dangerous for the truck's driver, and for the other drivers, pedestrians, joggers, cyclists, and small furry creatures who also use public spaces.

The big question I have is: how will all of this work with automated actors in the supply chain?
Sure, you could say that robots don't make mistakes, that a robot wouldn't drive off the road or load a truck incorrectly (both are, in fact, possible). But what about people? People make mistakes. Which protocols are triggered then? Who takes responsibility for that accident? The programmer? The pedestrian?

One could also say that a technology needs to be established before it can be regulated. You can't write legislation based on a hunch, right? Then again, you can incentivise progress this way…

I don't really have an answer to these questions; it will take minds greater and more legally tuned than mine to figure them out. But whether it's self-driving cars, autonomous drones, or a fleet of seriously heavy trucks, the question of who to hold accountable is one we need to start thinking about. So far, the supply chain industry has attempted to manage this risk with concepts like CoR.

But as we transition into a new era of hybrid responsibility between man and machine, we need to look at how to address this kind of risk sooner rather than later. I'd like to believe that, after a few years of 'technological advancement', we are capable of being proactive today rather than reactive tomorrow.

If you have any insights into this, please let me know.