Reflect. Refract Toward The Future

I got it all wrong.

This is not to say that I have regrets, or that I’m disillusioned. It’s more the realization that I have suffered an illusion, and while I do not yet understand how I became illusioned, I understand that I have been.

It started in childhood, really. I grew up the son of an engineer, and understanding how things worked was simply a way of living. It’s not a bad way to live. Later, the personal computer revolution started, and despite living in a developing nation at the time, PCs became my surfboard – and writing code became a primal need. I happened to be good at it.

The early 80s were a happening time in tech. It was a true revolution; the power of a computer in the hands of individuals and small businesses was unheard of. Given that we didn’t have the Internet and networks were just beginning, the world changed as rapidly as that would allow. The teenage version of me thought that it would be a great way to add value to the world. To make things that would make the world a better place, like the advertising promised… but I was too young to understand that one shouldn’t believe the advertising.

At one point, I began to understand that. And I began to understand that despite my best intentions, I wasn’t actually doing anything of worth. You, reader, may believe you are doing something of worth. I will tell you that maybe you are now, but it will likely not last – the churning evolution of technology swallows things, digests them and incorporates them into other parts – and you never see those things again. And it does so with people, too. Sure, you have the success stories.

In the end, though, you look back on the things you’ve played with and worked on decades later, nostalgically, and realize that they are gone. You made companies money for your living expenses, sold your abilities to the highest bidders, and one morning you wake up and realize that coding is the next blue-collar job. There’s nothing wrong with that. But code has a way of changing, being tossed out, or simply sitting somewhere on a server as technology rolls by.

I recall being asked at job interviews over the past 10 years about things I wrote, as if I single-handedly wrote or maintained anything in the last 10 years other than websites – and the websites I built disappear over time not through fault of the coders, but through faults of the businesses. And the same happens with the less visible code. Companies get bought out and their technology is either adapted or tossed out (even if it’s better).

What I got wrong in all of this is not what I did but why I did it. This idea of generating actual value instead of making money is antiquated in this world, and perhaps the best reason for that is that the people running things believe money is the value and everything else is transient.

Had I known that 3.5 decades ago, my approach to many things would have been different. I joke about being raised wrong, and there was a point when I wistfully pointed out that things used to be built to last – but the world doesn’t want that. The constant evolution of everything requires, in this world, a financial backbone. No technology survives without its own economy, and in that it is a slave to those with the disposable income to pay – not the masses whose lives could be improved by it. The cognitive dissonance of Silicon Valley in this regard, as well as others, lays out a path for those who wish to follow – and that path is one of the financial backbone, of bankruptcies and failures unmentioned in the marketing brochures.

Tech will continue to change the world, but the socioeconomic disparity is playing itself out in democracies around the world. Interesting times.

Apples and Orangutans.

There was a discussion on Facebook about whether Apple products were worthy of the Enterprise, and some CTO of some company that processes data (just like everyone else) put her title in front of her arguments – a nasty habit that diminishes a point – saying that they are.

When it comes to processing and ability, Apple products are often superior to Windows products – but typically not within the same price range, so it’s an odd comparison of Apples and… well, you get the drift. But the ability of a single machine wasn’t at issue; it was whether it could work within the Enterprise. At this time, I contend that Apple isn’t Enterprise-friendly because it’s not as cost-effective – and let’s be serious, that’s not the market that Apple has really been going after. Yet? Historically, it never has.

But in this discussion, I was trying to tease out the importance of cost-effectiveness and cross-compatibility between Apples and other machines on a network by pointing out that the developing world simply can’t afford the Apple-esque thought of the Enterprise, and that in turn got us into the Lowest Common Denominator (LCD) ‘discussion’ – where our opinions were drastically different. Her contention was not to worry about the LCD; she doesn’t care about them. Well, really, of course she doesn’t, because the company she worked for at the time (and maybe now) doesn’t deal with users, and it hoards the processing. That’s their business model. But she couldn’t seem to make that distinction.

That’s a problem for the Enterprise, more so than the cost of Apples. The Enterprise, whether companies like it or not, extends beyond their infrastructure to other infrastructures – which are largely Windows and Linux hybrids. Why? Cost. And where does cost come to be a factor?

Oh. The Enterprise and the developing world. And – excuse me, I need to twist this into an ending you didn’t expect – it’s really about mobile devices (thin clients) and access to data.

Natural Language Processing, Health Records and the Developing World.

The Veterans Administration will be using Natural Language Processing (NLP) for their medical records. It can be a powerful tool for searching for trends and getting the right people to the right treatments in a timely manner. That’s a gross oversimplification.

I know a bit about medical records1. I also happen to know quite a bit about Natural Language Processing, since I’ve worked with it in the context of documentation management.

And, as it happens, I know a bit about the developing world – the Caribbean and Latin America. And I know a bit about the hospitals in the region, where handwritten records are kept but lack the rigor and discipline necessary for them to truly be useful. I recently looked at the medical record of someone in Trinidad and Tobago – if you could call it that. I had found it odd that the Doctors and Nurses didn’t seem to communicate, not only with each other but even within their own subgroups; looking at the record, I saw why.

I know of one doctor who keeps patient records in Microsoft Word documents – a step in the right direction.

There is an opportunity here for the developing world in general, but it’s a technology leap that must be undertaken with the discipline of good medical records in the first place. These dilapidated medical systems, despite new buildings, need medical records that enable good care.

There’s no reason that medical care in the developing world should suffer; it can be done much more cheaply than in the developed world, and with advancements such as NLP already being implemented, it’s vacuous to build shiny buildings when the discipline of the medical records themselves should be paramount.
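
To make the “searching for trends” idea concrete, here is a minimal sketch – nothing like a production clinical NLP pipeline, and the note text and term list are invented for illustration – of pulling term frequencies out of free-text notes:

```python
import re
from collections import Counter

# Hypothetical free-text notes, invented for illustration; real records
# would come out of an electronic medical record system.
notes = [
    "Pt c/o chest pain radiating to left arm. Hx of hypertension.",
    "Follow-up: hypertension controlled, chest pain resolved.",
    "New pt, type 2 diabetes, reports blurred vision.",
]

def tokens(text):
    # Naive tokenizer: lowercase alphabetic runs. Real clinical NLP needs
    # abbreviation expansion, negation detection, and a medical ontology.
    return re.findall(r"[a-z]+", text.lower())

# Count mentions of terms of interest across all notes.
terms_of_interest = {"hypertension", "diabetes", "pain"}
counts = Counter(t for note in notes for t in tokens(note) if t in terms_of_interest)

for term, n in counts.most_common():
    print(f"{term}: {n} mention(s)")
```

Even something this crude only works if the underlying records are disciplined – which is the point.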

But then, maybe implementing electronic medical records properly would be a good start to building that discipline. 

1Medical records have interested me since my days as a U.S. Navy Corpsman, where we were assiduous about them – Doctors’ orders, nursing SOAP notes, lab results – all had their place within a folder. That was just on the very edge of the medical databases the U.S. Navy rolled out. When I was at my first USMC command, the first job for me and the other corpsmen was to get the medical records ready enough to allow us to deploy – an onerous task, since those who had gone before had not taken the records as seriously as they should have. Later, I worked with a Reserve USMC unit at Floyd Bennett Field, where I was commended for my database work on their medical records.

The AI Future On Mankind’s Canvas

I met her and the young Brazilian woman on the flight from Miami to Orlando – this young Doctor who had an interview in Ocala. She was to drive across to Ocala to see if she would get the job. She didn’t look old enough to be a Doctor, but I passed the age threshold where doctors are younger than me years ago. We talked about medicine and medical administration for a while, even as I checked up on the nervous Brazilian high school graduate. I sat, a thorn between two roses, all the while thinking:

What sort of world were they entering? Doc Leia, a graduate from The University of the West Indies, off to Ocala, and the young woman to my right, off to see the sights as a reward for having survived so many years of schooling. They were both easily younger than most of my nieces. The Doctor had already become heavily invested in her future – medical school was a daunting path and might have been one I would have pursued with the right opportunities. The other was about to invest in her future and it bothered me that there wasn’t as clear a path as there used to be.

Artificial intelligence – diagnosing patients on the other side of the world – is promising to change medicine itself. The first AI attorney, ‘Ross’, had been hired by an NYC firm. The education system in the United States wasn’t factoring this sort of thing in (unless maybe you’re in the MIT Media Lab), so I was pretty sure that the education systems in the Caribbean and Latin America weren’t factoring it in either. I’ve been playing with Natural Language Processing and Deep Learning myself, and was amazed at what already could be done.

The technology threat to jobs – to employment – has historically been robotics, something that has displaced enough workers to cause a stir over the last few decades – but it has been largely thought that technology would only replace blue-collar jobs. Hubris. Any job that requires research and repetition, and can allow for reduced costs for companies, is a target. Watson’s bedside manner might be a little icier than House’s, but the results aren’t fiction.

What are the jobs of the future for those kids in, starting, or just finished with a tertiary education? It’s a gamble by present reckoning. Here are a few thoughts, though:

  • A job that requires legal responsibility is pretty safe, so far. While Watson made that diagnosis, for legal reasons I am certain that licensed doctors were the ones that dealt with the patient, as well as gave the legal diagnosis.
  • Dealing well with humans, which has been important for centuries, has just become much more important – it separates us from AI. So far.
  • Understanding the technology and, more importantly, the dynamic limits of the technology will be key.

Even with that, even as fast food outlets switch to touchscreens for ordering food (imagine the disease vectors off of those!), even as AIs become more and more prominent, the landscape is being shaken by technology driven by financial profit.

And I don’t think that it’s right that there’s no real plan for that. It’s coming, there is no stopping that, but what are we as a society doing to prepare the new work force for what is to come? What can be done?

Conversations might be a good place to start.

Reinvention, Recursive.

Warning: This is kind of long and is a rant-ble. The short of it is that I’m not on the market anymore.

It’s time to evolve again.1

No, this is not the announcement of some Silicon Valley startup that will make you better elbows to stick in your ears or, heaven forbid, something useful.

No, this is about the site, myself, and the career path. To cut to the chase, I’m no longer looking for work or contracts in technology.

There are a few reasons for this.

  • After 2 and a half decades, it gets boring when done right and annoyingly exciting when done wrong. More often than not in most companies, it’s being done wrong and it’s no fun getting excited for the wrong reasons.
  • Everyone wants a specialist and I’m a generalist.
  • Management doesn’t like me wandering around outside the building. They don’t think I’m working just because of the GPS coordinates of my body during thought.
  • AI is gonna take over at least some programming jobs (advances in programming in the past have had the reverse effect, broadening the field – something else for another time). It will only take one programmer who does it because s/he can, and then an ecosystem to evolve it.
  • Did I mention I’m bored?
  • I have other options.

Plugging tech together can only be done in so many permutations – a mathematical fact, even when you factor in the geometric progression of evolving through those permutations.
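
To put a rough number on that intuition – a toy model, assuming each pair of technologies is simply either connected or not – the space of wirings is enormous but still finite:

```python
from math import comb

# Toy model: n technologies, with a link between each pair either
# present or absent, gives 2 ** C(n, 2) possible "wirings".
# Enormous, but finite - the point in numbers.
for n in (3, 5, 10):
    print(f"{n} technologies -> {2 ** comb(n, 2):,} possible wirings")
```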

I’m not sure I like how the ecosystem is plugging tech together. Frankly, while it’s nice that the iFart application created a few jobs (don’t be the guy with the microphone), and while it will be seen as invaluable to those who pay for it, it’s crap and really doesn’t advance anything but a paycheck. Because, really, money got mistaken for something of value somewhere in the history of mankind.

Because I don’t like the way things are getting plugged together, to work means to evolve again, and the value of working on things I increasingly don’t like is… silly from both a human and a financial perspective. I’ve always believed that people should do what they want to; later I understood that people should do what they want to only if they’re good at it. I’m still good at it, but I don’t want to think about that too much.

There are other things I’m good at, and it’s time to go do them. It’s not that I’m becoming a Luddite – far from it, you should see this heap of silicon I just bought – but that it’s not a career for me, at least for a few years. I’ll be using tech in other endeavors, and a great way to spend time waiting on others is to solve problems: write code, design systems, or make a better mousetrap. But it’s not my main thrust, and oddly, I’ve been telling kids starting college not to do tech but to do other things with tech.

And in the meanwhile, things that I put my own sweat equity into over 5 years ago are paying, and require some attention.

1 Now there’s a marketing line…

Imagination, Creativity, Innovation (2015)

Imagination. Creativity. Innovation. Powerful words that have become cheapened as buzzwords, falsely attributed to some and sometimes never attributed to the deserving. I’ve been accused of all 3 at different points in my life and it’s not a brag I make – most of the time it has seemed a curse. In fact, the thing that gained the most visibility remains one of my most painful memories.

It should be no surprise that I picked up Imagine: How Creativity Works and have been reading it in spurts. It’s not that my ADD has kicked in or that it’s a difficult read – it’s something I consider a thoughtful read. It’s criticized for not being scientific in that it’s largely – if not entirely – anecdotal and doesn’t cite references, but my own experience is anecdotal too. One point does not a graph make, but the book has caused some introspection. That’s healthy and, if you do it yourself, it’s cheaper than lying on someone else’s couch. Really.

The reason I picked up the book is that for decades I’ve been dealing with the software developer/engineering side while tossing in the creative side. Getting the two to work in conjunction has become easier over time, but it has become more difficult to function within companies that don’t understand that the two are not necessarily mutually exclusive.

A story I often tell involves an old manager at Honeywell who told me that something needed to be done. I said I would ‘play’ with it after lunch. He told me, “We don’t play at Honeywell. We work!” I arched an eyebrow and went back to what I was doing. My mentor at the time shook his head at the manager. My mentor got me. But it wasn’t new to me – my father was much the same way. In fact, most of my family is the same way. “If you’re not miserable, uncomfortable and otherwise aggravated, it is not WORK.”

Fine. Maybe it isn’t. But that isn’t productive for me.

Misery Loves Company

I’ve worked with people who took great pleasure in grinding away at a problem – and they were often good at it, though in a very brute-force manner. I’ve always known that a playful mind is where ideas come from. I lean on it under stress; when most people are miserable (see the above definition of work), I’ll seem upbeat and amused at the world. Why? That’s how I handle stress. It’s also how I solve problems most efficiently. I don’t beat myself like a member of Opus Dei.

Don’t get me wrong – there are times to be serious. The reality, though, is that being miserable and expecting other people to be miserable lends itself to spreading misery more than creativity and problem solving. Some people mistake this for a positive attitude. It’s not. I’ve found that once I separate myself from the problem I can walk around it, dance with it and get to know it in a circumspect way.

I know I’m not alone in this but I’m fairly certain I’m in a minority.

Everyone’s Creativity and Imagination are… Different

Ask 5 kids to make up a story, individually, and you’ll likely get 5 different stories. If they play at it together, though, you get stuff like ‘tape a cheetah to her back‘. The different ways that individuals approach problems is something that can work synergistically or… not. It means reining in egos and being able to discuss a problem outside of one’s self. It takes a level of trust and a willingness to have bad ideas.

Imagine: How Creativity Works talks about different paths to creativity. Being relaxed allows one to solve problems more readily – something that seems very intuitive but is counterintuitive in most work cultures I have experienced (see definition of ‘work’ above). As a software developer, I typically solve all the problems away from the keyboard and go to the keyboard when I’m ready to implement or test an idea… but most work cultures expect a software developer to sit in place. I often go for walks throughout the day not just to get the blood moving rather than congealing in my posterior but to change what I am experiencing. It’s almost something that Buddhists have patented – being in the present. Being a little distracted, like watching something from the corner of your eye. Throwing yourself at the ground and missing – or as I explained it to someone recently, woolgathering.

It wasn’t too long ago that I fixed a browser plugin as I walked to lunch and pondered Costa Rican addresses. When Microsoft’s documentation fails – more often than Microsoft likes to think – all you can do is look at what you know and try to find a solution. Lateral thinking let me look at the problem through my own experiences and understand a problem that was, literally, not in the documentation: a call to Microsoft’s printer API was returning more information than the plugin had planned for, causing it to break 3% of the time.
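
As a general illustration of that failure mode – the field names and response shape here are invented, not the actual Microsoft API – the fix amounts to parsing defensively, taking only the fields you expect instead of assuming the response matches the documentation:

```python
# Hypothetical handler: the field names and response shape are invented
# for illustration and are NOT Microsoft's actual printer API.
EXPECTED_FIELDS = {"printer_name", "status", "job_count"}

def parse_printer_status(response: dict) -> dict:
    # Keep only the fields we know how to handle, ignoring any extras
    # the API adds, instead of letting them break downstream code.
    parsed = {k: response[k] for k in EXPECTED_FIELDS if k in response}
    missing = EXPECTED_FIELDS - parsed.keys()
    if missing:
        raise ValueError(f"response missing expected fields: {missing}")
    return parsed

# An undocumented extra field no longer breaks the caller:
print(parse_printer_status({
    "printer_name": "HP-2",
    "status": "idle",
    "job_count": 0,
    "vendor_extension": {"tray": 3},  # the surprise extra data
}))
```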

There are other paths to creative problem solving, and the book (ibid) points them out – anecdotally – but I’ve experienced much of what I read myself. Did you know that people with ADD/ADHD seem to do better where creativity is required?

Look! Squirrel!

Technology and Innovation

When it comes to technology, most people associate innovation with adding features and functionality (Open Source and Microsoft do have something in common, after all) – but innovation isn’t about adding features and functionality. Innovation is about adding the right features and functionality for a problem or group of problems. You want innovation? Don’t look at the iPad, the iPod or what have you – look at the damned wheel. Look at fire. Look at Visicalc. Yeah, we know you want to talk about the Internet, but the Internet hasn’t changed things as much as accelerated them, in my opinion – we could get into the decreased distance per unit time discussion but that’s not what I’m writing about right now. Simmer down.

When you read about how Scotch tape was made, you get a real idea of innovation. Incidentally, the company that brought you that also brought you the touch screen. 3M. Not Apple.

Real innovation is rare, and it’s even more rarely commercialized. Tesla invented prolifically, and he basically had to give away his patent royalties to Westinghouse to get alternating current off the ground while Thomas Edison was busy electrocuting any creature he could find during the War of the Currents. If you think Apple vs. Microsoft is a better love story than Twilight, the War of the Currents will knock your socks off – static electricity and all.

People all over the world want to know where innovation comes from – as if it’s a secret sauce that you add or a base upon which to build a foundation. What Imagine: How Creativity Works explores isn’t exactly that but the many paths of getting there – and there are quite a few. And the roots of such innovation, as one might expect, are in imagination and creativity – but the real roots are in being able to harness them in different ways and be willing to fail. Innovation is rarely someone in a white lab coat staring at a computer 12 hours a day. Innovation is what happens when you’re in the shower thinking about what happened during that 12 hour day.

It seems that in a world that shouts for innovation, we work ourselves into a corner where we can’t get to it. Everyone has imagination and creativity and, yes, they can be as overdone as the stereotypical young black-clad poet screaming into a microphone – but enough imagination and creativity can go a long way.

Image at top left courtesy Flickr user Hartwig HKD, whose inspirational shots were tough to choose from. The work is made available through this Creative Commons License.

Disruptive vs. Sustainable

All the drivel posts on ‘disruptive’ this and ‘disruptive’ that have been driving me a little nuts over the last few years, particularly when ‘sustainable’ was the catch-phrase from a few years ago that still lingers doubtfully in the verbiage of non-profits. In fact, I tend to gloss over ‘disruptive’ these days when it shows up, because so many people don’t balance it with sustainability.

You see, I was fortunate enough to read The Innovator’s Dilemma: When New Technologies Cause Great Firms To Fail back when it first came out in 1997 – I still have a copy of the first revision. So for this post, and some thoughts on a potential startup or two, I referred back to what I consider the best work out there on disruption and sustainability.

Here are the high points from the Introduction of Christensen’s book. I use ‘product’ interchangeably with ‘service’ in this context, since a service is a product of sorts.

Sustaining Technology:

  • Can be discontinuous or radical (many internet posts confuse this with disruptive, when it can be either).
  • Can be of an incremental nature or, as I like to think of it, iterative.
  • Improves the performance of established products along the dimensions of performance that mainstream customers in majority markets have historically valued.
  • Accounts for most of the advancements in an industry.

Disruptive Technology:

  • Results in worse product performance, at least in the near term (in the majority market context).
  • Brings to market a very different value proposition.
  • Under-performs in established markets.
  • Has new fringe features/functionality.
  • Is typically cheaper, smaller, simpler, and frequently more convenient to use.
  • Offers lower margins, not greater profits.
  • Typically is embraced by the least profitable customers of the majority market.

These are very, very simple ways of looking at the differences between the two. A startup can utilize disruptive technologies and enter the market, but there has to be a plan for sustainability (other than being bought by another company) to present itself as a value proposition to anyone involved.

And that’s the key issue that most of the posts I’ve read on disruptive anything fail to mention. Sure, there is risk, but where there is risk, there should be risk mitigation. Don’t get me wrong, I understand solving problems as they come, but presenting only one half of disruptive technology – or disruptive anything, for that matter – is disingenuous.

The disruption of today, to be successful, should be successful tomorrow. Sustainability. Sustainability is why alternating current is used to transmit power over long distances; marketing is why people still think that Edison was more inventor than he was, and that Marconi invented the radio.