Well, it’s not exactly Skynet, but that doesn’t mean that robots aren’t already taking over. Bad bots now constitute 37.9% of all web traffic. And that share, as you might imagine, is growing.
Hard cases make bad law, as they say. Daphne Keller of the Stanford Center for Internet and Society on why Australia’s response to the Christchurch massacre does not create the right incentives for internet platforms.
You don’t say? Ford’s CEO acknowledges that the company and many of its competitors may have overestimated how soon self-driving cars would arrive.
In recent law review articles, Hannah Bloch-Wehba argues that FOIA should give interested and aggrieved parties access to the algorithms that governments and government agencies use to make automated decisions. Without taking a position on the article’s legal claims, I’m curious who, if she’s right, would actually make such FOIA requests and what they would do with the information. I imagine it could be useful to some plaintiffs’ lawyers, and perhaps to some journalists and academics. Not sure who else.
Jeremy Kidd asks: Who’s Afraid of Uber? The answer, not surprisingly, is mainly entrenched interests and academics, whom Kidd compares to Bootleggers and Baptists.
In not-so-recent law review articles, here’s Eric Goldman on the ten most important Section 230 cases of all time. Spoiler Alert! The defendants almost always win.
Last and least, we have the UK “Online Harms White Paper.” Long on words and short on specifics, the paper follows the grand European internet regulatory tradition of choosing aspiration over analysis.
Here’s an excerpt:
Many of our international partners are also developing new regulatory approaches to tackle online harms, but none has yet established a regulatory framework that tackles this range of online harms. The UK will be the first to do this, leading international efforts by setting a coherent, proportionate and effective approach that reflects our commitment to a free, open and secure internet.