When You Can’t Trust Voices.

Generative AI is allowing people to do all sorts of things, including imitating voices we have come to respect and trust over the years. The most recent case is Sir David Attenborough’s; he objects strongly to it and finds it ‘profoundly disturbing’.

His voice is being used in all manner of ways.

It wasn’t long ago that Scarlett Johansson suffered a similar insult, one that was quickly ‘disappeared’.

The difference here is that a man who has spent decades showing people the natural world is having his voice used in disingenuous ways, and that should give us all pause. I use generative artificial intelligence, as do many others, but I would never consider misrepresenting what I write or work on in someone else’s voice.

Who would do that? Why? It dilutes the trust that voice carries. Sure, it can be funny to have a narration by someone like Sir David Attenborough, or Morgan Freeman, or… all manner of people… but to trot out their voices to misrepresent truth is a very grey area in an era of half-truths and outright lies being distributed on the Babel of the Internet.

Somewhere – I believe it was in Lessig’s ‘Free Culture’ – I had read that the UK allowed artists to control how their works were used. A quick search turned this up:

The Copyright, Designs and Patents Act 1988, is the current UK copyright law. It gives the creators of literary, dramatic, musical and artistic works the right to control the ways in which their material may be used. The rights cover: Broadcast and public performance, copying, adapting, issuing, renting and lending copies to the public. In many cases, the creator will also have the right to be identified as the author and to object to distortions of his work.

The UK Copyright Service

It would seem that something similar would have to be done with the voices and even the appearance of people around the world – yet in an age moving toward artificial intelligence, where content has been scraped without permission, the only people who can actually stop this are the ones doing the scraping.

The world of trusted humans is being diluted by untrustworthy humans.

Introspection (Writing)

Reviewing the statistics between KnowProSE.com and RealityFragments.com has been a bit revealing – empirically.

Between the two sites, I’ve done about 300 posts in the last year.

KnowProSE.com has 27 followers and 50 likes (WordPress) with roughly 5,000 views, averaging 2.5 visitors a day with 100 posts over a year.

RealityFragments.com, on the other hand, has 89 followers and 750 likes (WordPress) with roughly 2,000 views, averaging 1.4 visitors a day with 200 posts over a year.
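
Just to show where the daily figures come from, here’s a quick back-of-the-envelope sketch using only the rounded numbers quoted above – nothing more precise than that.

```python
# Back-of-the-envelope math on the rounded yearly figures quoted above.
sites = {
    "KnowProSE.com":        {"posts": 100, "views": 5000, "visitors_per_day": 2.5},
    "RealityFragments.com": {"posts": 200, "views": 2000, "visitors_per_day": 1.4},
}

DAYS = 365

for name, s in sites.items():
    views_per_day = s["views"] / DAYS
    views_per_post = s["views"] / s["posts"]
    print(f"{name}: ~{views_per_day:.1f} views/day, "
          f"~{views_per_post:.0f} views/post, "
          f"~{s['visitors_per_day']} visitors/day reported")
```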

I did not make goals on these sites – I simply allowed myself to post as I wished to, when I wished to, as often as I wished to, so that I could see what happened – because, despite what your goal-oriented classes have told you, the best thing to do sometimes is to see what happens. That some more technology-related writing has gone to TechNewsTT.com isn’t really worth factoring in – my contributions there, while appreciated, aren’t numerous enough to affect things.

My non-technology writing gets more interest than my technology writing – we could argue that I posted less tech, but there are some other factors: KnowProSE.com has been my domain for over 10 years, whereas RealityFragments is only a year old (and doesn’t carry the history OpenDepth once did, which skewed the statistics).

So what does it mean? Nothing, really. But it’s interesting to look at. It’s a datapoint.

What Society Wants.

Since I’m writing about technology-related things, it makes sense that I talk a little about society. After all, technology is a tool that society uses for a variety of things – from million-dollar lawsuits over flatulence-based applications to ‘sex beds’ in Second Life having copyright issues to… oh, things that very few people see as meaningful. As a society, we’re kind of like kids and we want to play, eat sweet things and have everything catered to us. No, maybe not you. After all, you’re reading this… but take a look around at what is popular. Take a look at

That’s all open to argument, I suppose – I’m a bit cynical of late – but the point is that, as a general rule, society pretty much tells us what it wants from technology. It wants stuff that is easy, that is fun to use, that lets us be healthy while tasting perfect, that boosts our sex appeal even when everyone else has it, and so on. That’s a bridge too far for us, so let’s keep it simple.

We want to have tools that allow us to do things with less effort. Plowing fields? Yeah, got a tool for that. Shoving metal pieces into wood, or even twisting them? We have tools for that. Boosting your social media presence? Well, we allegedly have tools for that.

So what is it, exactly, that society wants?

Stuff that makes doing things simpler. And the stuff that makes doing things simpler should be simple.

And that requires a fairly high level of complexity to create it.

Behold, the cognitive dissonance of humanity.

Apple vs. FBI: Hedgehog Factor

On the old site, I wrote quite a bit about the Hedgehog’s Dilemma and how it applied to social media. I didn’t write about my own experiments with code, what I found, etc. – and that’s because I didn’t fully understand what I found. I still don’t. But I think it’s appropriate to bring it up now in the context of Apple’s amazingly open battle against the government about backdooring its own phone. It almost sounds like forced incest when you put it like that. Give me about 4 paragraphs before I make the point, OK?

So, first, the Hedgehog’s Dilemma itself. I like what Schopenhauer wrote:

A number of porcupines huddled together for warmth on a cold day in winter; but, as they began to prick one another with their quills, they were obliged to disperse. However the cold drove them together again, when just the same thing happened. At last, after many turns of huddling and dispersing, they discovered that they would be best off by remaining at a little distance from one another. In the same way the need of society drives the human porcupines together, only to be mutually repelled by the many prickly and disagreeable qualities of their nature. The moderate distance which they at last discover to be the only tolerable condition of intercourse, is the code of politeness and fine manners; and those who transgress it are roughly told—in the English phrase—to keep their distance. By this arrangement the mutual need of warmth is only very moderately satisfied; but then people do not get pricked. A man who has some heat in himself prefers to remain outside, where he will neither prick other people nor get pricked himself.

This is the battle playing out between social media and networks, their consumers, and the government. It’s constantly shifting. Personally, I’m amazed at how much people give away simply to have social connections of convenience – but it somehow works. So we have people’s expectations and wants of privacy, varying from person to person, across a network. But privacy is also intimacy, and privacy is largely a matter of how much one wants to be intimate with someone else – not everyone else. So we’re lax about privacy because we don’t consider much of what we do to be a personal space.

Cross into that intimate space, and bad things happen. People get upset, talking about privacy. I suppose my own intimate space is a vast wasteland, and I take it more seriously than others do, projecting that into how I interact on social media and networks without actually sucking too much at it. It goes beyond settings hidden behind a gear icon. It’s how much you share, what you share, etc.

So I’m going to drag this home. What’s at stake is the government forcing Apple to backdoor – to create something that wasn’t there – their own device, where so many people now keep their intimacy.

The Hedgehog factor, you see, is intimacy.

SunTechRamble: Liability And Technology

The really interesting thing that happened this week relates to the regulation of a computer system as a driver (at least in some circumstances). It means that computer systems are gaining ‘privileges’ that were formerly only for humans. It was bound to happen sooner or later, but admittedly I blinked when I saw it.

Google’s efforts and its return in this area are noteworthy:

It appears that Google has persuaded federal regulators that — in some situations at least — the Tin Man has a heart.

In a letter sent this month to Google, Paul Hemmersbaugh, the chief counsel for the National Highway Traffic Safety Administration, seemed to accept that the computers controlling a self-driving car are the same as a human driver…

So there’s the very cool side of this where we could celebrate this as a win. Technology in this area has gotten to the point where we can replace humans as drivers by virtue of increased safety. Google has been posting monthly reports on their self-driving car project, and it seems that a self-driving car’s greatest danger comes from behind. Google’s first accident involving one of their vehicles was in July of last year – and they were rear-ended.

It’s going to get more complicated if you consider the architecture.

If the vehicle is self-contained, it means it will likely need software updates. That means that unpatched cars may be roaming the countryside, since unpatched software is all over the place.

If the vehicle is completely stupid without an internet connection, as the Amazon Echo is, then connectivity to the controlling application will be an issue.

It’s most likely to be a hybrid of both. Where does your responsibility as a passenger of a vehicle you own start and end? Will you be able to modify your own vehicle as you can now? What about auto insurance – will that go away, or will we be paying insurance on a vehicle we may not own and can’t control ourselves?
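
To make the hybrid idea a little more concrete, here’s a purely hypothetical sketch of the kind of policy decision that architecture forces – none of the names or thresholds come from any real manufacturer; they’re invented for illustration.

```python
# Hypothetical sketch only: a hybrid update policy for a self-driving vehicle.
# The names and the 30-day grace period are invented for illustration; no real
# manufacturer's system is being described here.
from dataclasses import dataclass
from datetime import datetime, timedelta

GRACE_PERIOD = timedelta(days=30)  # how long an unpatched car may keep driving itself

@dataclass
class VehicleSoftware:
    version: str
    last_patched: datetime
    online: bool  # can it reach the controlling application right now?

def may_self_drive(sw: VehicleSoftware, latest_version: str, now: datetime) -> bool:
    """Decide whether autonomous mode is allowed under this invented policy."""
    if sw.online and sw.version != latest_version:
        return False  # connected and out of date: the hybrid answer is "update first"
    if not sw.online:
        # offline: fall back to a grace period rather than bricking the car
        return now - sw.last_patched <= GRACE_PERIOD
    return True  # connected and current

car = VehicleSoftware("2.3.1", datetime(2016, 1, 10), online=False)
print(may_self_drive(car, "2.4.0", datetime(2016, 3, 1)))  # False: past the grace period
```

Even a toy policy like that drags the questions above right back in: who picks the grace period, and what happens to the passenger when it runs out?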

Technology and Law are about to meet again. It’s going to get messy.

You might want to start negotiating your side now.

Did We Stop Dreaming of Technology?

It seems we’ve stopped dreaming of technology. It’s something that we just kind of expect, the constant improvement of what we can do with what we have. Technology has become so commonplace that it’s boring, and access to it is a necessity.

Mundane and addictive. A mother has to convince her neighbor to lock their wifi so that the children’s internet curfew is enforced.

There was a time when we dreamed. There was this period of Star Trek, of SkyNet and the Terminator.

[youtube https://www.youtube.com/watch?v=-UOTLTgDH44]

The Matrix, Tron: Legacy, and so much more. People were talking about Collective Intelligence and things of that ilk. Second Life opened up the concept of the Metaverse and became an early simulation of what the Internet of Things would later get credit for.

But we don’t really have this sort of stuff anymore because we have this sort of stuff everywhere.

The electric sheep need a new dream.

SunTechRamble: Right to Repair and Modify

There’s a new advocacy group lobbying for the right to repair everything. It’s not so odd that I found this the same week that Apple will brick (make useless) your phone if you get non-Apple repairs.

It’s not a coincidence. There’s a quickening. Just recently, General Motors (GM) told consumers that they don’t own their cars. They license them.

Why? Because profit. Share prices. And maybe even that 401K you’re letting someone else manage is applying the pressure to the companies to make larger decisions like this. Or maybe it’s just the way things go, like when Dell was nasty enough to make sure that its computer components were incompatible with off-the-shelf components. I don’t know if they still do it, but I still won’t buy a Dell. Sadly, I use one at work because… low cost. Warranty. Convenient for companies.

Most people who don’t fix things may not understand why all of this is important. Most people run screaming from anything that blinks 12:00 at them – which is kind of understandable because that was a horrid design from the start (what, no battery? Really?). But non-technical people don’t want things that blink at them expectantly.

People want their cars to run. They want their computers and software – to them, they are sometimes the same thing – working so that they can do whatever it is they wish, which apparently includes malware. They don’t want technology, as Douglas Adams wrote. They want stuff that works. So why is this so important?

What people need to understand is that the idea of paying for something and having no one but the seller repair it could be a win for everyone, except for one thing: things inconveniently break, and warranties aren’t always as long or as inclusive as the people who paid expect them to be.

Ask anyone who has been to a car dealership with a problem, or has had to return a device they got.

Repairing things, be it on your own or in your local area, is a handy thing that enhances a local economy, develops intellectual capital in a geographic (and geopolitical) area. Sure, Cuba’s been embargoed so long that many people don’t know why – yet they have cars from the 1950s driving around. Why? Because they fix their own stuff.

For those of us from the 1970s and before, that was simply a fact of life.

Now we have manufacturing life cycles.

The life cycle isn’t, ‘built to last’. It’s ‘built to last this long’.

Probably since before the moment you started hearing about lifetime warranties, this was reality. The days of building things ‘to last’ had passed into the days of manufacturing things ‘to last this long’. Really, it’s not all bad, but doing this in conjunction with copyrights and patents assures that no one can repair except those who are authorized.

To those of us in the software world who have been paying attention, this is nothing new. Famously, the Free Software Movement began when Richard Stallman (RMS) was unable to fix someone else’s code. The Open Source Initiative splintered off over distinctions in defining whether people could lock the source code away or not. There are plenty of opinions on that, and I do have one, but suffice to say that while distinctions are made between the two, the overall philosophy is largely the same. Both sides would argue with me.

Software itself suffers entropy. It gets more complicated no matter how hard you try for it not to – except maybe Solitaire and Notepad.

So people fix software if it’s worth it to them. Like a car, if they want to spend the money to get something fixed, they can – except maybe in the near future. I wonder how they’ll handle the performance market, the tuners, etc.

I won’t even touch patents in this post.

The point is that what started off as a software problem is now seen in just about every field. And it’s why Repair.org exists now.

What Repair.org focuses on.

The focus is on a few different industries:

 

I just joined as an individual member. I’m not going to make money off of my membership, and neither will you. But you may be able to help make legislation such that you’re not stuck with items that can’t be repaired or modified.

The Pitter Patter of Digital Footprints

Anything you have ever done online is a part of your digital footprint. The other part of your digital footprint is what other people and entities have determined about you.

Some use this information to predict what you’re going to do even when you don’t know you’re going to do it. That’s fairly benign in the hands of marketers because all that will happen is that you’ll go broke buying things that you hopefully need… but likely just want because of ‘good’ marketing.

The increasingly granular bits of information out there might make it more toxic. Old data hangs around, and while it may have been representative of you at certain points, it may not represent you at this time.

I have that issue with LinkedIn.com and wherever I’ve posted my resume in the past. People still ask me if I want to do Drupal when, no, I don’t. But because of this latency of the information, I still get people with thick accents calling me about Drupal. This is an annoyance, but if it were something else it could be damaging. In fact, if I were looking for a job, it would limit me to what I have done rather than what I want to do. It would work against me.
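
Purely as an illustration – the field names and the one-year cutoff are mine, not anything LinkedIn or the list-sellers actually use – the fix isn’t complicated: weight or drop a skill by when it was last used.

```python
# Illustrative toy only: filtering profile skills by recency. The field names
# and the one-year cutoff are invented; no real recruiting database is described.
from datetime import date

STALE_AFTER_DAYS = 365  # anything untouched for over a year drops out

profile_skills = [
    {"skill": "Drupal",  "last_used": date(2015, 1, 1)},
    {"skill": "Writing", "last_used": date(2016, 2, 1)},
]

def current_skills(skills, today):
    """Keep only the skills used within the cutoff window."""
    return [s["skill"] for s in skills
            if (today - s["last_used"]).days <= STALE_AFTER_DAYS]

print(current_skills(profile_skills, date(2016, 3, 1)))  # ['Writing'] – Drupal ages out
```

That the old lists clearly do nothing of the sort is rather the point.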

So from my digital footprint, I cast Drupal in my digital shadow and it bytes me on the ankle. Why? Because I made the mistake of working with it for a while – and that, to headhunters who pay for old lists, means I’m still doing it well after leaving it behind for a year. Sure, we know that they’re cheap. That’s not the issue.

The issue is that I’m being judged based on data that is no longer relevant. It could just as easily have been something else. It could be that I was accused of a crime that I was later found innocent of – and an employer might see that and decide that they don’t want to hire someone felonious. I boggle at writing an example of it because someone’s cheap bot might scrape it and think I did something wrong, when in fact, I did not. But it’s in there now, stuck in the head of a demented network. Like a bad song that plays only to me. Or you.

So when you’re making that little digital footprint sound in what you consider private, it’s not that private – and those little echoes will play until eternity.

Or everyone updates their databases.

Buying the Future – What are we buying?

A while back, I wrote about the Tyranny of an Inefficient Skynet. I found the thought of a Skynet that is buggy and makes a lot of mistakes a bit scarily amusing. We project our logic onto what we build and we almost always imprint at least some of our irrational behavior on it. Software developers of all ilks have their own styles; the conformists are usually the ones who made the wrong choice of major and feel like they have to suffer for it for the rest of their lives. Either way, all these people hammering out Code. Remember Lessig’s Code: And Other Laws of Cyberspace, Version 2.0?

Now we have the Internet of Things. IoT. A dressed-up and marketable version of Web 3.0. People are attaching all manner of things to the Internet, collecting data, acting on data when the people themselves may not even know what the data is. Privacy is traded for convenience and the ability to post cat videos on Facebook. The data is collected and decisions get made – to the point where people will quite literally have worse lives if they don’t fit the criteria the algorithms (written by those software developers, remember them?) set, even if the data is misrepresentative or outright wrong.

That’s where we are now.

And with 3 decades of using and adapting technology behind me, I can’t help but wonder where exactly we’re headed as a society. Feynman spoke about it in his lectures (The Meaning of It All: Thoughts of a Citizen-Scientist) – from his perspective, it’s society that drives how we use science and technology. From the Atomic Bomb to your smartphone and its applications, small decisions add up to societal decisions… and ultimately, this Skynet we’re building. The Cold War gone, we live in an era where governments war over the Internet with propaganda machines powered by technology and hackers who have an allegiance of some sort, be it to a government or to a corporation or to themselves – rarely to society itself.

Children that used to go outside to play stay indoors, using software (games), living in worlds created by the imagination(s) of teams and implemented by programmers, increasingly educated by the same software with data that is selectively converted into information. And it happens faster and faster. Remember Gleick’s Faster: The Acceleration of Just About Everything?

So where am I going with this? I’m just wondering more and more often what sort of society we’re buying with our finances that are upended by algorithms on Wall Street. I’m wondering about all the shoddy software pushed out to meet some business need before it’s ready, fragile enough in certain spots that it allows breaking and bending at the weak points. We’re changing the world and we hardly know it as we drive into work, sipping coffee as we dodge traffic – but soon, the cars will be transporting us around.

The Internet has allowed people with common bonds to work together, play together – but in doing so, inadvertently, it has also allowed us to war against those we dislike – from CAPS-LOCK stuck on to outright attacks on someone else’s systems.

Transporting us around so that we can write code to buy things and influence our own future without a thought as to the long-term consequences of our actions, in a period when medicine and associated technologies will have us living longer to see the consequences of our collective decisions.

The small unconscious decisions making the big unconscious decisions for us, mindfulness out the window.

I suppose I may be in a dark mood this morning. I suppose that this may seem pessimistic or cynical. I suppose it’s disturbing if one were to think about it all.

We should go buy something to feel better about all of this…