The Random Darknet Shopper, an automated online shopping bot with a budget of $100 a week in Bitcoin, is programmed to do a very specific task: go to one particular marketplace on the Deep Web and make one random purchase a week with the provided allowance. The purchases have all been compiled for an art show in Zurich, Switzerland titled The Darknet: From Memes to Onionland, which runs through January 11.
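The bot's weekly routine, as described above, boils down to "pick one random item the allowance can cover." A minimal sketch of that logic might look like the following; the function and variable names are hypothetical stand-ins, not the artists' actual code, and a real marketplace would expose listings very differently.

```python
import random

def pick_random_purchase(listings, budget):
    """Pick one random item whose price fits within the weekly budget.

    `listings` is a hypothetical list of (name, price) tuples standing in
    for whatever the marketplace actually exposes. Returns None if nothing
    is affordable this week.
    """
    affordable = [item for item in listings if item[1] <= budget]
    if not affordable:
        return None
    return random.choice(affordable)

# A toy catalogue and the bot's $100 weekly allowance.
catalogue = [("sneakers", 60), ("e-book", 5), ("watch", 250)]
purchase = pick_random_purchase(catalogue, 100)
```

The point of the sketch is the blindness: the selection is uniform over whatever is affordable, with no filter on what the item actually is.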
The concept would be all gravy if not for one thing: the programmers came home one day to find a shipment of 10 ecstasy pills, followed by an apparently very legit falsified Hungarian passport – developments which have left some observers of the bot's blog a little uneasy.
The title of the piece (Robots are starting to break the law and nobody knows what to do about it) elicits worries of AIs gone amok, but the basic conundrum of this piece and others about the Random Darknet Shopper is more complex: if I design an AI which takes a random, blind action in a space which is largely – but not uniformly – illicit, am I legally culpable?
Take this thought experiment: imagine going around your office with a ten dollar bill, offering to buy whatever your colleagues would be willing to sell to you at that price, but under the condition that you do not see the item until the transaction has taken place. If one of your colleagues slipped you some cocaine, who would be at fault? And if you chose to repeat the experiment in an area of town infamous for drug deals, would you suddenly be more culpable?
When I was young, I used to order what they called “Grab Bag” comic packs, where I would pay a set amount of money for an unknown, random assortment of comic books. If someone had slipped a pornographic comic into my grab bag, it’s hard to see how I would be at fault. But where I choose to make my blind transactions seems to augment how we perceive culpability.
Several years ago I wrote a piece about how randomness can complicate our standard notions of guilt. The intersection of randomness, culpability and the law sounds like an area that – if someone hasn’t written about it a lot already – is ripe for further work.