Reflect. Refract Toward The Future

I got it all wrong.

This is not to say that I have regret, or that I’m disillusioned. It’s more the realization that I have suffered under an illusion, and while I do not yet understand how I became illusioned, I understand that I have been.

It started when I was a child, really. I grew up the son of an engineer, and understanding how things worked was simply a way of living. It’s not a bad way to live. Later on, the personal computer revolution started and, despite living in a developing nation at the time, PCs became my surfboard – and writing code became a primal need. I happened to be good at it.

The early 80s were a happening time in tech. It was a true revolution; the power of a computer in the hands of individuals and small businesses was unheard of. Given that we didn’t have the Internet and networks were just beginning, the world changed as rapidly as that would allow. The teenage version of me thought that it would be a great way to add value to the world. To make things that would make the world a better place, like the advertising promised… but I was too young to understand that one shouldn’t believe the advertising.

At one point, I began to understand that. And I began to understand that despite my best intentions, I wasn’t actually doing anything of worth. You, reader, may believe you are doing something of worth. I will tell you that maybe you are now, but it will likely not last – the churning evolution of technology swallows things, digests them and incorporates them into other parts – and you never see those things again. And it does so with people, too. Sure, you have the success stories.

In the end, though, you look back on the things you’ve played with and worked on decades later, nostalgically, and realize that they are gone. You made companies money for your living expenses, sold your abilities to the highest bidders, and one morning you wake up and realize that coding is the next blue collar job. There’s nothing wrong with that. But code has a way of changing, being tossed out or simply sitting somewhere on a server as technology rolls by.

I recall being asked at job interviews over the past 10 years about things I wrote, as if I single-handedly wrote or maintained anything in the last 10 years other than websites – and the websites I built disappear over time not through any fault of the coders, but through the faults of the businesses. And the same happens with the less visible code. Companies get bought out and their technology is either adapted, or tossed out (even if it’s better).

What I got wrong in all of this is not what I did but why I did it. This idea of generating actual value instead of making money is antiquated in this world, and perhaps the best reason for that is the people running things believe that money is the value and that everything else is transient.

Had I known that 3.5 decades ago, my approach on many things would have been different. I joke about being raised wrong, and there was a point when I wistfully pointed out that things used to be built to last – but the world doesn’t want that. The constant evolution of everything requires, in this world, the financial backbone to do so. No technology survives without its own economy, and in that it is a slave to those with the disposable income to pay – not the masses whose lives could be improved by it. The cognitive dissonance of Silicon Valley in this regard, as well as others, leads a path to those who wish to follow – and that path is one of the financial backbone, of bankruptcies and failures unmentioned in the marketing brochures.

Tech will continue to change the world, but the socioeconomic disparity is playing itself out in democracies around the world. Interesting times.

Apples and Orangutans.

There was a discussion on Facebook about whether Apple products were worthy of the Enterprise, and there was some CTO of some company that processes data (just like everyone else) who put her title in front of her arguments – a nasty habit that diminishes a point – saying that Apple products are.

When it comes to processing and ability, Apple products are often superior to Windows products – but typically not within the same price range, so it’s an odd comparison of Apples and… well, you get the drift. But ability of a single machine wasn’t at issue, it was whether it could work within the Enterprise. At this time, I contend that Apple isn’t Enterprise-friendly because it’s not as cost effective – and let’s be serious, that’s not the market that Apple has really been going after. Yet? Historically, it never has.

But in this discussion, I was trying to tease out the importance of cost effectiveness and cross-compatibility between Apples and other machines on a network by pointing out that the developing world simply can’t afford the Apple-esque thought of the Enterprise, and that in turn got us into the Lowest Common Denominator (LCD) ‘discussion’ – where our opinions were drastically different. Her contention was not to worry about the LCD; she doesn’t care about them. Well, really, of course she doesn’t, because the company she worked for at the time (and maybe now) doesn’t deal with users, and it hoards the processing. That’s their business model. But she couldn’t seem to make that distinction.

That’s a problem for the Enterprise, more so than the cost of Apples. The Enterprise, whether companies like it or not, extends beyond their infrastructure to other infrastructures – which are largely Windows and Linux hybrids. Why? Cost. And where does cost come to be a factor?

Oh. The Enterprise and the Developing world. And – excuse me, I need to twist this into an ending you didn’t expect – it’s really about mobile devices (thin clients) and access to data.

Natural Language Processing, Health Records and the Developing World.


The Veterans Administration will be using Natural Language Processing (NLP) for their medical records. It can be a powerful tool for searching for trends and getting the right people to the right treatments in a timely manner. That’s a gross oversimplification.

I know a bit about medical records1. I also happen to know quite a bit about Natural Language Processing, since I’ve worked with it in the context of documentation management.

And, as it happens, I know a bit about the developing world – the Caribbean and Latin America. And I know a bit about the hospitals in the region, where handwritten records are kept but lack the rigor and discipline necessary for them to truly be useful. I recently looked at the medical record of someone in Trinidad and Tobago, if you could call it that; I had found it odd that the Doctors and Nurses didn’t seem to communicate with each other, or even within their own subgroups. I saw why.

I know of one doctor who keeps patient records in Microsoft Word documents – a step in the right direction.

There is an opportunity here for the developing world in general, but it’s a technology leap that must be undertaken with the discipline of good medical records in the first place. These dilapidated medical systems, despite new buildings, need medical records that enable good care.

There’s no reason that medical care in the developing world should suffer; it can be done much more cheaply than in the developed world, and with advancements such as NLP already being implemented, it’s vacuous to build shiny buildings when the discipline of the medical records themselves should be paramount.

But then, maybe implementing electronic medical records properly would be a good start to building that discipline. 

1Medical Records have interested me from my days as a U.S. Navy Corpsman, where we were assiduous about medical records – Doctor’s orders, nursing SOAP notes, lab results – all had their place within a folder. It was just on the very edge of the medical databases that the U.S. Navy rolled out. When I was at my first USMC command, the first job for myself and the other corpsmen was to get the medical records ready enough to allow us to deploy – and it was an onerous task, since those who had gone before had not taken the records as seriously as they should have. Later, I would work with a Reserve USMC unit at Floyd Bennett Field, where I would be commended for my database work as related to their medical records.

The AI Future On Mankind’s Canvas

I met her and the young Brazilian woman on the flight from Miami to Orlando, this young Doctor who had an interview in Ocala. She was to drive across to Ocala, to the East, to see if she would get the job. She didn’t look old enough to be a Doctor, but I passed the age threshold where doctors were younger than myself years ago. We talked about medicine and medical administration for a while even as I checked up on the nervous Brazilian high school graduate. I sat, a thorn between two roses, all the while thinking:

What sort of world were they entering? Doc Leia, a graduate from The University of the West Indies, off to Ocala, and the young woman to my right, off to see the sights as a reward for having survived so many years of schooling. They were both easily younger than most of my nieces. The Doctor had already become heavily invested in her future – medical school was a daunting path and might have been one I would have pursued with the right opportunities. The other was about to invest in her future and it bothered me that there wasn’t as clear a path as there used to be.

Artificial intelligence – diagnosing patients on the other side of the world – is promising to change medicine itself. The first AI attorney, ‘Ross’, had been hired by an NYC firm. The education system in the United States wasn’t factoring this sort of thing in (unless maybe you’re in the MIT Media Lab), so I was pretty sure that the education systems in the Caribbean and Latin America weren’t factoring it in. I’ve been playing with Natural Language Processing and Deep Learning myself, and was amazed at what already could be done.

The technology threat to jobs – to employment – has historically been robotics, something that has displaced enough workers to cause a stir over the last decades – but it has been largely thought that technology would only replace the blue collar jobs. Hubris. Any job that requires research, repetition, and can allow for reduced costs for companies is a target. Watson’s bedside manner might be a little more icy than House, but the results aren’t fiction.

What are the jobs of the future for those kids in, starting, or just finished with a tertiary education? It’s a gamble by present reckoning. Here are a few thoughts, though:

  • A job that requires legal responsibility is pretty safe, so far. While Watson made that diagnosis, for legal reasons I am certain that licensed doctors were the ones who dealt with the patient and gave the legal diagnosis.
  • Dealing well with humans, which has been important for centuries, has just become much more important – it separates us from AI. So far.
  • Understanding the technology and, more importantly, the dynamic limits of the technology will be key.

Even with that, even as fast food outlets switch to touchscreens for ordering their food (imagine the disease vectors off of that!), even as AIs become more and more prominent, the landscape is being shaken by technology driven by financial profit.

And I don’t think that it’s right that there’s no real plan for that. It’s coming, there is no stopping that, but what are we as a society doing to prepare the new work force for what is to come? What can be done?

Conversations might be a good place to start.

Reinvention, Recursive.

Warning: This is kind of long and is a rant-ble. The short of it is that I’m not on the market anymore.

It’s time to evolve again.1

No, this is not the announcement of some Silicon Valley startup that will make you better elbows to stick in your ears or, heaven forbid, something useful.

No, this is about the site, myself, and the career path. To cut to the chase, I’m no longer looking for work or contracts in technology.

There are a few reasons for this.

  • After 2 and a half decades, it gets boring when done right and annoyingly exciting when done wrong. More often than not in most companies, it’s being done wrong and it’s no fun getting excited for the wrong reasons.
  • Everyone wants a specialist and I’m a generalist.
  • Management doesn’t like me wandering around outside the building. They don’t think I’m working, based on nothing more than the GPS coordinates of my body during thought.
  • AI is gonna take over at least some programming jobs (advances in programming in the past have had the reverse effect, broadening the field – something else for another time). It will only take one programmer who does it because s/he can, and then an ecosystem to evolve it.
  • Did I mention I’m bored?
  • I have other options.

Plugging tech together can only be done in so many permutations. That’s a mathematical fact, even if you factor in the geometric progression of permutations that evolution opens up.

I’m not sure I like how the ecosystem is plugging tech together. Frankly, while it’s nice that the iFart application created a few jobs (don’t be the guy with the microphone), and while it will be seen as invaluable to those who pay for it, it’s crap and really doesn’t advance anything but a paycheck. Because, really, money got mistaken for something of value somewhere in the history of mankind.

Because I don’t like the way things are getting plugged together, to work means to evolve again, and the value of working on things I increasingly don’t like is… silly from both a human and a financial perspective. I’ve always believed that people should do what they want to, then later understood that people should do what they want to only if they’re good at it. I’m still good at it, but I don’t want to think about that too much.

There are other things I’m good at, and it’s time to go do them. It’s not that I’m becoming a Luddite – far from it; you should see this heap of silicon I just bought – but that it’s not a career for me, at least for a few years. I’ll be using tech in other endeavors, and a great way to spend time waiting on others is to solve problems: write code, design systems, or make a better mousetrap. But it’s not my main thrust, and oddly, I’ve been telling kids starting college not to do tech but to do other things with tech.

And in the meanwhile, things that I put my own sweat equity into over 5 years ago are paying off, and require some attention.

1 Now there’s a marketing line…

Pulling My Photos From Flickr: Lessons Learned, More.

I wrote ‘Risk and the Photo Cloud’ as a first stab at identifying the problem I am having with mitigating risk around my Flickr stream, as well as the new needs I have for my collection of images.

Bear in mind, this is a ‘one off’ problem for me, and I approached it as such.

I was a bit more focused on what I wanted to do with them – I have people willing to buy prints – and I may not have made the best choice by paying $25 for Bulkr Pro. The trick was to get the tags, etc., and in doing the initial research on the Flickr forums, I saw nothing better. In writing this today, I found FlickrDownloader, which I probably would have chosen given that it’s open source and available at no cost. I may give it a spin anyway. Try it out and let me know how it differs.

When using Bulkr Pro, you can opt to have the full information saved to text files. This is generally not a bad idea, but it’s certainly not a great idea when you have over 19,000 files on Flickr as I do (only 18,000 or so are public). So, the general idea is to get them into a database.

After some thought, I opted not to import them directly into the database because I need to do some manual editing of which files I want to keep – and a spreadsheet would be easier for that part of the process. So that’s what I did.

The text files generated by Bulkr are mildly annoying in this regard because their format isn’t meant for what I wanted to do. However, the line numbers for the specific information are constant, as is the labeling, so I was able to whip together a Python 3.x script that reads all the text files and writes them all to a CSV. I tossed it up on my GitHub; you can check out https://github.com/knowprose/BulkrTxtFilesToCSV
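
For the curious, here’s a minimal sketch of that approach – not the script from the repository above, just an illustration, with the labels and line positions invented for the example (the real Bulkr layout differs):

```python
import csv
import glob

# Hypothetical layout: each labeled value sits on a fixed line in the Bulkr
# text file. These positions and labels are invented for illustration.
FIELDS = {"Title": 0, "Description": 1, "Tags": 2, "Date Taken": 3, "URL": 4}

def parse_bulkr_file(path):
    """Pull each labeled value off its fixed line in one Bulkr text file."""
    with open(path, encoding="utf-8") as handle:
        lines = [line.rstrip("\n") for line in handle]
    row = {}
    for name, index in FIELDS.items():
        # Lines look like "Label: value"; strip the label and keep the value.
        line = lines[index] if index < len(lines) else ""
        row[name] = line.split(":", 1)[-1].strip()
    return row

def bulkr_to_csv(folder, out_path):
    """Read every Bulkr .txt file in `folder` and write one combined CSV."""
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.DictWriter(out, fieldnames=list(FIELDS))
        writer.writeheader()
        for path in sorted(glob.glob(f"{folder}/*.txt")):
            writer.writerow(parse_bulkr_file(path))

if __name__ == "__main__":
    bulkr_to_csv("bulkr_txt", "flickr_metadata.csv")
```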

So now, I have the information in a CSV and, as time permits, I’m working on choosing which images I am getting rid of (with Flickr, I was lazy about space). Let’s just be clear and call it ‘dirty data’; when I uploaded I did it as a hobby and I now have a professional use.

From there, it’s a simple matter of uploading the CSV to MariaDB, which is extremely fast.
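
As a rough sketch of that step – assuming the PyMySQL driver, an existing database, and made-up credentials, table, and column names – MariaDB’s LOAD DATA bulk loader is what makes it fast:

```python
import pymysql  # assumption: PyMySQL as the client; any MariaDB connector works

# Hypothetical connection details - substitute your own, and note that the
# server must allow LOCAL INFILE for the bulk load to work.
conn = pymysql.connect(host="localhost", user="photos", password="secret",
                       database="photo_db", local_infile=True)
try:
    with conn.cursor() as cur:
        # Table columns mirror the CSV produced in the previous step.
        cur.execute("""
            CREATE TABLE IF NOT EXISTS flickr_metadata (
                title       VARCHAR(255),
                description TEXT,
                tags        TEXT,
                date_taken  VARCHAR(64),
                url         VARCHAR(512)
            )
        """)
        # Bulk-load the CSV in one statement instead of row-by-row INSERTs.
        cur.execute("""
            LOAD DATA LOCAL INFILE 'flickr_metadata.csv'
            INTO TABLE flickr_metadata
            FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
            IGNORE 1 LINES
        """)
    conn.commit()
finally:
    conn.close()
```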

Having spoken to a few professional photographers I know, I’m aware there are professional tools out there that do what I want with image management, but I’m not too pleased with them. I may end up spinning my own image and file management system in Python, or not – I’m undecided at this point because I’m moving from a ‘one off’ to a more consistent system that will suit my personal needs.

Why Python? Honestly, I like coding in Python, but the job market has always been more interested in my C, C++, C#, VB, VB.Net, PHP/MySQL, etc. This is a low priority project for me, and it’s something I want to be fun while getting used to Python 3 a bit more. 

At this point, the system I’m thinking of will create resized images for the web, as well as allow me to edit based on tags. Pillow will allow for much of that, and more.
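
As a first sketch of the resizing piece (the folder names and the 1600-pixel long edge are just assumptions, and the tag-based editing would hang off the metadata in MariaDB), Pillow’s thumbnail() does most of the work:

```python
from pathlib import Path
from PIL import Image  # Pillow

def resize_for_web(src_folder, dest_folder, max_edge=1600):
    """Write web-sized JPEG copies of everything in src_folder into dest_folder."""
    dest = Path(dest_folder)
    dest.mkdir(parents=True, exist_ok=True)
    for path in Path(src_folder).glob("*.jpg"):
        with Image.open(path) as img:
            # thumbnail() resizes in place, keeps the aspect ratio, never upsizes.
            img.thumbnail((max_edge, max_edge))
            img.save(dest / path.name, "JPEG", quality=85)

resize_for_web("originals", "web")
```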

The Why and Why Not Of Documentation

My core pet peeve of the last 10 years is the lack of documentation I’ve encountered with Software Projects. Sometimes I see it as the consultant called in after someone has already charged a lot of money for a substandard job. Sometimes it’s that legacy project that sticks around through generations of developers who leave a company. Sometimes it’s the “we don’t have time” factor.

Documentation is a value-added part of any software project for a variety of reasons:

  • Reference: The ability of software engineers involved with a project to look back and see why things were done the way they were, and to allow for knowing the limitations and potential of a project.
  • On-boarding: Bringing new software engineers up to speed quickly without having to bounce around the company finding answers.
  • New Projects: Proper documentation of old projects, including estimates and other data, is of great use for estimating similar projects. (Project Management and Business Analysis)
  • Code Reviews: Complicated code reviews are simplified.
  • Continuity: If someone gets hit by a bus, someone else can take over. Quickly.
  • Legal: If a project is supposed to meet certain requirements, those requirements should be traced to the actual implementation.
  • Teams: Proper documentation in an Agile or DevOps process allows team members to see what others are doing in similar areas of code, and allows easy identification of problem areas quickly.

These are just some of the reasons documentation is important. More often than not, I’ve ended up writing documentation at different companies because it simply did not exist before.

Why didn’t it exist? Here are the standard excuses:

  • No Time: We didn’t have any time to write the documentation because we’re busy working on the next thing!
  • “Me no write so good”: Fair enough, but there’s only one way to get better at writing.
  • “It’s not my job, man”: Erm – no. Documentation is a part of software engineering. A big part.
  • We don’t know how to organize it: Fair enough. Spend some time and figure it out and implement it.

At the base of all the excuses are 2 things that are really simple:

  1. Software Developers generally don’t like writing documentation.
  2. Management fails to have developers write documentation.

These 2 things are understandable, wrong, and amazingly easy to deal with if a company wants to.