In the world we like to talk about because it reflects ourselves, technology weaves dendritically through our lives. Much of it is invisible to us because we take it for granted.
The wires overhead spark with Nikola Tesla’s brilliance, water flows through pipes whose lineage dates back to the Indus Valley of 3000–4000 BC, gas moves through its own lines for cooking and heat, and we spend far too much time in our automobiles.
Now even internet access is taken for granted by many, as social media platforms vie for timeshares of our lives, elbowing out more and more by giving people what they want. Except Twitter, of course. For the most part, though, social media is the new Hotel California: you can check out any time you like, but you never really leave as long as the people you interacted with are still there.
This is why, when I read “Panic about overhyped AI risk could lead to the wrong kind of regulation,” I wondered about what wasn’t written. It’s a very good article that underlines the necessity of asking the right questions about regulation while attempting to undercut some of the hype against it. Written by machine learning expert Divyansh Kaushik and by Matt Korda, it reads well and makes the case, which I agree with, that there may be a bit too much backlash against artificial intelligence technologies.
Yet their jobs are safe. In Artificial Extinction, I addressed much the same thing, not as an expert but as a layperson who sees the sparking wires, the flowing water, the cars stuck in traffic, and so on. It is not far-fetched to think that the impacts of artificial intelligence reach beyond the scope of what experts on artificial intelligence consider. What they omit from the article is what should be more prominent.
I’m not sure we’re asking the right questions.
The economics of jobs is called into question as people who spent their lives doing something find that it can be replaced. That in turn affects a nation’s economy, which in turn affects the global economy. China wants to be a world leader in artificial intelligence by 2030, but given its population and its history on human rights, one has to wonder what it will do with all those suddenly extra people.
Authoritarian governments could manipulate machine learning and deep learning to ensure everyone is on the same page of the same version of the same book quite easily, with a little tweaking. Why write propaganda when you can have a predictive text algorithm with a thesaurus of propaganda strapped to its chest? Maybe in certain parts of Taliban-controlled Afghanistan, it will detect that the user is female and serve her a different set of propaganda, telling her to stay home and stop playing with keyboards.
Artificial Extinction, KnowProSE.com, May 31st, 2023.
These concerns are not new, but artificial intelligence makes them more plausible, because whoever controls these systems controls much more than social media platforms. We have no real idea what the models are being trained on or where that data came from, and let’s face it: we’re not that great at sorting out who owns whose data. Henrietta Lacks immediately comes to mind.
My mother wrote a poem about me when I joined the Naval Nuclear Propulsion program, annoyingly pointing out that I had stored my socks in my toy box as a child and contrasting it with my belief at the time that science and technology could be used for good. She took great joy in reading it to audiences when I was present, and she wasn’t wrong to do so, however annoying I found it.
To retain a semblance of balance between humanity and technology, we need to look at our own faults. We have not been so great about that, and we should evolve our humanity to keep pace with our technology. Those in charge of technology, be it social media or artificial intelligence, are far removed from the lives of the people who use their products and services, even as they make money from the lives of those very same people. It is not an insult; it is a matter of perception.
Sundar Pichai, CEO of Google, seemed cavalier about how artificial intelligence will impact the livelihoods of some. While we all stared at what was happening with the Titan, or wasn’t, most of the people I knew were openly discussing what sort of person would spend $250K US to descend to a very dark place to look at a broken ship. Extreme tourism, they call it, and it’s within the financial bracket of those who control technologies now. The people who take such trips to space, or underwater, are privileged, and in that privilege they have no perspective on how the rest of the world gets by.
That’s the danger, though it is not a danger to them, and because they seem cavalier about it, it becomes a danger to everyone else. These aren’t elected officials held accountable through democracy, as strange a ride as that is.
These are just people who sell stuff everybody buys, and who influence those who think of themselves as temporarily inconvenienced billionaires to support their endeavors.
It’s not good. It’s not really bad either. Yet we should be aspiring toward ‘better’.
Speaking for myself, I love the idea of artificial intelligence, but that love is not blind. There are serious impacts, and I agree that they aren’t the same as nuclear arms. Where nuclear arms can end societies quickly, the way we use technology, and even how many of us remain ignorant of it, can cause something I consider worse: a slow and painful end of societies as we know them, with no apparent plans for the new society.
I’d feel a lot better about what experts in silos have to say if they… weren’t in silos, or in castles with moats protecting them from the impacts of what they are talking about. This is pretty big. Blue-collar workers are under threat from smarter robots, white-collar workers are under threat, and even creatives are wondering what comes next as they are no longer as needed for images, video, and more.
It is reasonable for a conversation about these things to take place, yet such conversations almost always come only after the fact.
We should be aspiring to do better than that. It’s not the way the world works now, and maybe it’s time we changed that. We likely won’t, but with every new technology, we should have a few people pointing that out in the hope that someone might listen.
We need leaders who understand what lies beyond the moat, and if they don’t, we should stop considering them leaders. That’s why the United States threw a tea party in Boston, and that’s why the United States is celebrating Independence Day today.
Happy Independence Day!