#017 - 2026/03/04
A selection of what I've read this past week.

My main newsletter, Complex Machinery, includes a section called "In Other News..." It's where I list one-liners about interesting articles that didn't fit into any segments.
You can think of this list as a version of In Other News, but with a wider remit than Complex Machinery's "risk, AI, and related topics."
Above the fold
- American and British intelligence agencies had sorted out Russia's plans well in advance of the invasion of Ukraine. The problem? No one believed them – most notably, Ukrainian officials. While I understand the hesitance to believe the "Iraq has WMDs" crew, history will file this alongside the Challenger disaster, Al Qaeda attacking the US, Iraq descending into civil war, and several other cases in which leaders failed to heed experts' warnings. (The Guardian)
- How will the death of CJNG cartel boss "El Mencho" impact the flow of illegal drugs and the violence associated with it? History tells us that taking out a kingpin offers temporary relief, followed by the situation getting worse. (FT)
- Unexpected knock-on effects: the crypto boom spiked interest in luxury watches, and now Rolex has opened a free (but highly selective) training school for watch repair technicians. Repair of high-end devices is a rare AI-proof job, too. (GQ)
- Back when I covered web3, Sam Altman's Tools for Humanity project – aka Worldcoin, aka World, aka The Orb, aka hand your biometric data over to some random tech company so you can prove that you're human – kept cropping up. Usually in the "a local government demands Worldcoin halt operations" kind of way. So if The Orb partnering with major mainstream retailers sounds to me like a bit of reputation-washing, that's why. (Gizmodo, WSJ)
- Spotify claims that its top developers haven't written any code since December because genAI is handling the work. I'll also note that they dropped this gem during an earnings call, so make of that what you will. (Business Insider)
- The downside of letting genAI bots churn out mountains of code? Human reviewers can't keep up, which increases exposure to bugs lurking in the software. (The Register)
- Tech reporter Benj Edwards, who recently published an article that included AI-generated quotes, no longer works at Ars Technica. Compare this to everyone else who has misused genAI in the workplace – including several attorneys who included hallucinated citations in official filings – and still remains employed. (Futurism)
Special section: AI and war
- The standoff between Anthropic and the DoD has ended with Anthropic sticking to its principles, and the DoD marking it as a supply chain risk. (WSJ, Die Zeit 🇩🇪, Le Monde 🇫🇷, Les Echos 🇫🇷)
- Hot on the heels of the Anthropic/DoD breakup, OpenAI swoops in to fill the void. (CNBC)
- In simulated war games, genAI bots repeatedly hit the nuclear button. (New Scientist)
- Major genAI companies seem all too eager to help automate war … (The Verge)
- … especially OpenAI, which seems to have given in to the Pentagon's demands even though they claim that they didn't … (The Verge)
- … but they do acknowledge that the deal "looked opportunistic and sloppy." Perhaps because it was? (CNBC)
- Whatever the case, Claude's iOS app is more popular than ChatGPT's. (Sherwood News)
The rest of the best
- Microsoft doesn't like people referring to its genAI artifacts as "Microslop." It went so far as to ban the term from its Discord server. (Windows Latest)
- Adding to the list of potential problems caused by the massive datacenter buildout: what happens when several of them disconnect from the power grid at once? (WSJ)
- Celebrity wins follow a suspicious pattern at an online casino. Is it pure luck? Or a way to entice other gamblers to join in? (Bloomberg)
- High-ranking execs are (still) having a rough time with AI adoption… (HBR)
- … which is probably why so few companies are seeing tangible benefits from AI. (Sherwood News)
- In an attempt to control his own robot vacuum, one guy uncovers a security hole that lets him control everyone else's robot vacuum. (Inc)
- Instead of AI-based translation into English, what if we just give you AI-based accented English? (AP News)
- A lot of AI is supported by human labor. Humanoid robots are no exception. (MIT Technology Review)
- ISIS discovers new uses for AI on social networks. You know, right as major social networks are scaling back on their content moderation systems. (404 Media)