Dita Parker

Monday, October 2, 2023

That escalated quickly

Someone took time out of their precious weekend to email me just to tell me... well, I'm not going to post misinformation or disinformation on my blog; that's what Elon's X is for.

Firstly, thank you for writing, although since we don't know each other, your words didn't sting quite as much as you perhaps hoped they would. Anonymous + disposable email = zero fucks given. Secondly, if you disagree but can't/won't converse in an articulate and polite manner, scroll on to topics/opinions more to your liking. It's the World Wide Web, full of nooks and crannies for all tastes and occasions. Thirdly, insults reveal nothing about what you are so vehemently opposed to, so I don't know what to tell ya. Come back with an articulate argument and present it in a polite manner? Because things I care about = all fucks given, so shall we continue our discussion on AI? Despite popular demand, I'm about to.

I speak from the viewpoint of a writer (fiction) and translator (non-fiction), but I know all manner of people in the arts, from music through photography to theater. Oh, and a game developer. I also know people in business, so I know the reasoning on both sides of the aisle, so to speak: what creatives fear they are losing, what entrepreneurs hope to gain. And it's not that clear-cut, that black-and-white; not at all. There is plenty artists can do with AI. If we could only agree on some rules. But as things stand, it's almost entirely up to the boards of AI/tech companies to make those rules and regulate themselves. Which translates to: No laws = no restrictions.

I also know several teachers, and herein lies my greatest worry: every single one of them has a bad feeling about the digitalization of education and the effects of mobile phones and social media on children and teens. These have been studied and proven to exist, the detrimental effects, I mean. Something else they often hear: Why do we have to learn these things when we can just look them up if we want to or need to?

What these teachers are trying to hammer home with varying success: Without a baseline, a touchstone, without any accumulated, internalized knowledge, without media literacy and tools to spot misinformation and combat disinformation, facts become a matter of opinion, and the adults of tomorrow easily fooled and led. You don't know what you don't know.

AI doesn't know what it doesn't know; it has to be taught. But it learns in such a different way from humans that it gets things wrong all the time, makes guesses, talks like a confused individual suffering from memory loss, makes things up as it goes, or just confidently gives you an answer that looks perfect on the surface but turns out to be BS. AI also already knows plenty, serves several functions quite admirably, gathers and arranges data super fast, and keeps on learning.

But Pinocchio has a long way to go to become a real boy. It will need everything it can get its hands on, even that which we haven't volunteered to give. It will have to wade in the cesspools and enjoy the pinnacles of human achievement alike. What will it present to us as its findings, its truth about things? Now that is the question. If we start going to these programs as we would an oracle, if they become omnipresent, a verb, like Google, but we don't know the first thing about what we don't know and have no idea where else to look, we're bound to be fooled and led, inadvertently or intentionally. (Because not everyone knows what they are doing. And just imagine what oppressive regimes could do with a Truth Machine all their own; China and Russia already have government-curated internets.)

You may not care, but many do. You may gladly volunteer your stuff, but not everyone wants to, so please try to understand, respect, and sympathize with their point of view. A rising tide lifts all boats, they say. As a friend noted, this feels like a tsunami, and many of us will simply drown.

We need a cheering anecdote to cap things off, don't we? I can't recall it verbatim, but you'll find it in This Is How They Tell Me the World Ends: The Cyberweapons Arms Race by Nicole Perlroth. Picture a roomful of tech leaders. When asked to please raise a hand if they liked living in this world they've created, not a single hand went up.

(Another book recommendation, this one re: the reading brain in a digital world: Reader, Come Home by Maryanne Wolf. Every parent, teacher, tech leader, tech follower, AI developer, and human being should read this book.)