Developments in AI have been coming fast and furious: chatbots are gaining in apparent sophistication and plausibility, Wikipedia is drowning in false and distorted AI-created content, and, most recently, impactful AI music has arrived.
Casey Newton — the journalist behind the excellent “Platformer” newsletter — wrote a thoughtful essay this week about why it’s difficult to cover AI:
Unlike other technological shifts I’ve covered in the past, this one has some scary (and so far mostly theoretical) risks associated with it. But covering those risks is tricky, and doesn’t always fit into the standard containers for business reporting or analysis. . . . The reason I’m having trouble covering AI lately is because there is such a high variance in the way that the people who have considered the question most deeply think about risk.
So whether you think we’re on the cusp of an age where computers begin serving us in ways that will make our lives far better, or on the brink of societal catastrophe, nothing is clear except the anxiety. Newton, however, is unwilling to hype AI, given how uncritical hyping of social media companies led to dark outcomes that persist to this day: political manipulation by hostile states, genocides, insurrections, social fragmentation, misinformation, and more.
But perhaps we can say a few things about AI that are concerning yet less charged.
So, let’s talk about AI music, and how it might affect culture, law, individuality and agency, the pace of change, and our connection with reality.