Media Responsibility and Learning.

I often cringe when I read what people share on social media. Aside from triggering the inner proofreader that was so necessary in my youth, I run across things like, “TTPS: Illegal entry into T&T is a crime”.

What else is illegal that is a crime? 

If the goal was to make the Trinidad and Tobago Police Service look illiterate – mission accomplished. If the goal was to make The Morning Brew, a local program, look a bit foolish – mission accomplished. And it’s there for all the world to see.

If you watch the video, though, the headline does not represent what was actually said – a distillation that demonstrates a lack of thought and consideration.

Who came up with this headline, and do they even understand their mistake?

This prompted me to immediately mock it, of course – pondering with a friend what else that is illegal might also be a crime.

Murder is illegal, so is it a crime?  Littering is illegal, so is it a crime? And so on and so forth – which amused me for a few minutes, but then it struck me:

There are people who may seriously be thinking in that way.

Words have a power all their own, and the way we all learn is not by reading dictionaries but through context.

So yes, I’m picking on this particular headline, which is unfair. In a world where all too often people share without reading the associated link, we’re implicitly showing people how to communicate by example. There could be a secondary school student right now writing an essay who may pull the ‘illegal’ and ‘crime’ construction out of their bag unwittingly… only to be openly mocked by an English teacher and their class.

Why? Because they made the mistake of learning from a media headline.


Career Advice from a Neo-generalist Perspective.

People ask me for career advice now and then. Generally, the people who do so can’t follow the beaten path.

There’s plenty of career advice out there for the beaten paths. The basic recipe is simple:

  • Secondary School
  • Tertiary Education
  • Maybe specialize further.
  • ?????
  • Profit!

I’ve met a few people for whom this has worked – which sometimes means going into debt with student loans, or having a tether to parents paying for things, or what have you.

The last part, ‘Profit’, is delayed until the loans are repaid – bad news, parents are never repaid. In the context of the United States, which is hardly a data model for the rest of the world (but is my experience), we have the rising cost of not continuing one’s education versus the toll of student debt. The fact that the studies are largely done by people who followed the beaten path further confuses the issue at times.

How often do you hear a college say you don’t need to go to college? Of course they wouldn’t say that – and one could say that the student loan issue in the United States is akin to tossing out mortgages to people who can’t afford to pay them. It’s all very muddy water; where once I had an opinion, I now see only a sea of biased data and biased opinions, and have none myself.

My Path.

My life, my work history, my education – they don’t fit the accepted model of education, ???, profit. I grew up working through secondary school in a printery, in an electrical motor rewinding workshop, and at whatever odd jobs came my way. Despite this, and (I would later learn) because of a former Irish brother who had married a nun, I did not get expelled and managed to graduate – well.

My parents didn’t put me through college, and the debt I did incur toward not finishing college in the late 80s is something I paid off about 22 years later. The interest was bad, but I managed to settle with the Department of Education for pennies on the dollar. Incidentally, despite being what one might term a minority, I wasn’t African or Hispanic enough to gobble up any grants specifically for those minorities. Equal opportunity ain’t so equal.

My time in the Navy was so busy that I never seemed to have time for college classes or college credits. It’s hard to study full time in NNPS and work on college credits at the same time, or to work in emergency medicine and pop off whenever you needed to; when people’s lives are at stake you don’t have that luxury. And getting myself together after being discharged while attempting to support an ill parent just didn’t leave much room for college or new debt – or for paying the debt I still owed, without which I couldn’t continue college. A nasty trap, that, even with the military deferment.

And so I found myself back behind a computer again through some luck, working at Honeywell and proving my worth. It was a cool job, and I had convinced my manager to give me a book allowance, with which I read the most bleeding edge stuff I could find back then. It was awesome, if only for a while. Others, like Dr. VcG, tried to help round me out and did so a bit, but really, I was focused on just… learning.

I was told that they would pay for my classes to finish a degree so that they could promote me, which I then began – oceanography – only to find out that they wouldn’t pay for classes toward that end. No, they wanted me to get a degree in something they were already paying me to do. Why on Earth would I need that validation to be promoted when I was already validating my work every day?

There was only so much I could learn there, and changes to the company started taking the ‘play’ out of it all.

I moved from this company to that, building up references, building up experience, but most importantly to me, knowledge. My knowledge wasn’t validated by some group of academics, it came from the Real World. As time progressed, the economy went down as my age went up, and I found myself working for money instead of knowledge. It was not fun anymore, and I moved on… to where I am now, with a few interesting stops on the way.

Serial Specialist, the Neo-Generalist.

The beauty of software engineering when I started out is that once you could get a computer to do what someone else wanted to do, you got to learn about what they wanted to do. I got to learn about business, banking, avionics, emergency communications, data analysis, science, robotics, and so much more – and I have this knowledge, hard won, without following the beaten path and getting a bunch of letters behind my name.

It’s where all that knowledge intersects that the cool and fun stuff happens. The beaten path could not have given me that.

Frankly, in my experience, the beaten path is pretty slow – which some say is a reflection of my ability. I don’t know that that is true. What I do know is that the real world – paying bills and keeping abreast of responsibilities – required me to learn faster than a formal education could teach me, and I did. Simply put, I had to. I loved most of it; I hated parts of it.

Career Advice

When it comes to the beaten path that I did not take, I point at it. It works for a lot of people, though the ‘works’ does seem increasingly dubious to me as far as a return on investment. Go study something if you have the opportunity or if you can create the opportunity. Get your education validated, but don’t stop learning.

You see, what I can tell you with a degree of certainty is that the world is as bad as it is because of those who rest on their laurels after getting a certificate or a degree. I can also tell you with certainty that the world is as good as it is because of the people who keep learning and applying that knowledge toward good ends.

We don’t need people who are ‘educated’. We have enough of those clogging up the system. We need more people who are constantly learning, certificate or degree or not – those are the ones who create true progress. Speaking for myself, I pace myself at two books a week or more, on topics that range widely.

The world is interesting in many ways. You can make it more interesting by knowing how interesting it is from different perspectives.

Learn how to negotiate. Get as much as you can even though you don’t need it – a problem I had – because you don’t know when you will need it.

And avoid working for idiots if you can. You won’t always be able to, and sometimes it’s not obvious until later on, but ditch them as soon as you can.

We don’t think the world is getting better. This is why we’re not sure.

I came across Max Roser’s (Programme Director, Oxford Martin School, University of Oxford) post on the World Economic Forum through social media, and I didn’t have the time to address some of the issues I saw when I posted it. Something about it had struck me as viscerally wrong.

Now I know. In the broad strokes, the data points are cherry picked. When we look at how the world has improved based on static measures, we all should know that yes, the world has gotten better. That’s not why we don’t think it is.  It’s because the measures themselves haven’t improved. I’ll make my points quickly as related to his points.


Globally, we have fewer people starving per capita. There’s no debate there. Where the debate should be is whether this should be part of the debate at all. Population growth around the world varies; a nation with a lower standard of living tends to have higher population growth, while a nation with a higher standard of living tends to have lower population growth.

So, if we look at the shell game of poverty, overall the number is decreasing. But is the standard of living? Are people moving forward without people being left behind? Is the number of people we’re leaving behind increasing or decreasing?
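The shell game can be shown with simple arithmetic. A minimal sketch, using invented numbers purely for illustration: the poverty *rate* falls even as the absolute number of people left behind grows, because the population grows faster than poverty shrinks.

```python
# Hypothetical figures, invented for illustration only - not real statistics.

def poverty_rate(poor, population):
    """Fraction of the population counted as poor."""
    return poor / population

# "Then": 1.0 billion poor out of 5.0 billion people.
rate_then = poverty_rate(1.0e9, 5.0e9)   # 0.20, i.e. 20%

# "Now": the population grew to 7.5 billion, and 1.2 billion are poor.
rate_now = poverty_rate(1.2e9, 7.5e9)    # 0.16, i.e. 16%

# The headline number improves...
print(f"Rate then: {rate_then:.0%}, rate now: {rate_now:.0%}")
# ...while the count of people left behind rises by 200 million.
print(f"People left behind grew by {1.2e9 - 1.0e9:,.0f}")
```

Both statements are true at once; which one gets reported is a choice.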

We hear more often than not about the ‘erosion of the middle class’. Where did they all go?

These are questions that we want answered; we know poverty is decreasing, but if our goal is constant improvement, shouldn’t our measure of how we’re doing improve as well? Or are we comparing poverty now with the cave people of a few thousand years ago? Not literally – but metaphorically, comparing poverty across a few hundred years is the optimistic perspective frequently presented when the masses get a bit disturbed.


Judging by social media alone, we know more people are attempting to communicate – some literacy is involved, but I daresay there is some functional illiteracy out there that has snuck past the testing that is supposed to demonstrate literacy.

I had a real world example today. A friend of mine’s granddaughter needed a reference on a form since the form she had filled out was outdated. He told me he needed me to sign it. I looked the old form over and told him I didn’t need to sign it, that she already had references on the old form, and all she needed to do was transfer them to the new form. No signatures required.

An hour later, while I was writing this, he stopped by and told me the new form needed my signature. It did not need my signature. I didn’t need to sign anything. Functionally, that’s a form of illiteracy.  Functional literacy was defined by UNESCO in 1960 – 58 years ago – as:

“using these skills in ways that contribute to socio-economic development, to developing the capacity for social awareness and critical reflection as a basis for personal and social change”

Not knowing the difference between putting your contact information on a form or signing a form is one example. So how are we measuring literacy?

By the numbers reported of those who can read – who pass certain tests that, if you ever spend time on social networks, you will need to question. Never mind reading comprehension.

So yes, the numbers of those reported as literate can be shown to have gone up – but from students to teachers to administrators to nations, who wants to give worse reports? The incentive for true reporting is simply not there. How many college professors lose their hair dealing with freshmen?

Has functional literacy gone up? With increased bureaucracy over the decades, as well as technology, what is the new literacy? No one really knows. It’s sort of like the difference between pornography and art; we know it when we see it.


Germ theory is the basis of the postulation here – something devised in the latter half of the 19th century. We’re in the 21st century; we’ve made leaps and bounds since germ theory that have been put into practice – open heart surgery, as an example, has come a long way in the last few decades. Granted, it could not have happened without germ theory, but if we’re measuring how well we’ve done since germ theory, a lot of other things should be spoken of.

Yet there is at least the allegation that big pharmaceutical companies overcharge – Brazil even went rogue with HIV medications because of it. Borders between nations become more permeable when there is a noticeable price difference in medications, where the medications flow to places of higher costs. The United States is no different here; people get medications from Mexico and Canada as examples. How much? I’m pretty sure we don’t have the data for it; black markets don’t publish their data.

Access to healthcare? In the U.S. alone, this has been one of the most sharply debated topics in the last decade.

So yes, germ theory has brought us a lot of good, but what have we done since? With an increased population – remember population growth? – partly because of our advances in medicine, I’d think we’d get some better points than just germ theory.

Yet I can see why no one wants to talk about how health insurance has helped people. After all, it was only about a century ago that doctors were paid in livestock. Germ theory, apparently, gave doctors much more.


Oh, freedom. How do we define it? Is the person who works three jobs to pay the bills, ‘free’? Fortunately, no solid points were made in this section because it’s all pretty ambiguous. One has to wonder why it’s even in there.


Our population is increasing! Yes, we know that. We’re painfully aware of it, and I am not certain that it’s an indicator of things being better. It could mean that a lot of people in nations with lower standards of living might simply be unable to watch the television that they want because of content distribution rights or lack of internet access.

As I pointed out in the section related to poverty – population growth is a factor that is not spoken of enough. You can check out all manner of statistics in the United Nations World Population Prospects 2017.


We live in an era where there is cultural value placed on academic degrees; they were incentivized by salaries – at least at some point – and now their value is publicly questioned. Getting into debt for a college education and then being unable to get a job to repay that debt is a reality in the world. Yet we say that education has increased.

Formal education, that is. But how has formal education changed? Aside from changing and adding some subjects and adding a lot of administration, education itself has not changed – and more than once we’ve seen education standards lowered so that more people pass. We don’t talk about that.

So while more people may suffer a formal education by 2100, can we honestly say that they have been educated better than now? Than 10 years ago? We’re talking about quantities when we should also be dealing in quality.

Why Do We Not Know That The World Has Changed?

We know that the world has changed – in our little pockets of what we read and see in the media, social or otherwise, and the reinforced perspectives we get from them. People share things without reading them, without rigorous thought (education? literacy?).

The world has gotten much better since we were cave dwelling mammals, though there is at least a sense of wonder when I consider that: Did we leave the caves because of the population boom caused by fire? Cave real estate maybe got so expensive that finally – probably a guy named Bill or Steve – said, “screw this, I’ll make my own cave!”. And so to this day, we live in variations of the cave, usually made by someone else. With fire. And cooling.

And yet, how have we really improved? The same country that has children eating tide pods also had an immigrant send an electric car to Mars while at least one person on the Tesla waiting list got upset (if they didn’t, I wonder if they should own one?). We have advances in medicine that should have us discussing contraception, even of the immaculate variety, and technology is giving us sex robots that – fortunately, so far – don’t distribute little humans like sexually transmitted diseases, or like Oprah. Look under your seat! There’s one for you!

We have advanced so far in technology that our education, our literacy and lack of it, has become more pronounced as we reinvent Babel despite people speaking the same language. We have people who are so angry that they’re either a mass shooter or a terrorist (but never both). We have archaic systems of governance that cannot shift as fast as the public can become less accurately informed.

The world has gotten better in some ways, yes, but it has become worse because people who never would have known each other 100 years ago now see each other’s posts quickly, algorithmically, based on what someone in a code cave thought was the best solution… so far.

We really don’t know whether things are getting better or worse. We only know within our own contexts and what we are told, and what we are told we too rarely question because our education systems teach us to accept what we are told rather than challenge it.

Challenge it. Challenge everything.  Things will not get better otherwise, and if people actually challenge things more, people won’t feel the need to write posts about ‘how much better things are’, a Hallmark card from the World Economic Forum to the ailing masses who aren’t seeing the improvements promised, with the dreams of yesteryear either dashed or worse, undreamed.

I, for one, do not wish any carcinogens blown up my posterior, no matter how fancy the pipe.

2018: Tech and Society

On the human meta level, it’s pretty clear that robotics and AI will continue making inroads into our societies in ways that we aren’t yet prepared for. Personally, it’s amusing to watch what got me into software engineering for a living as a young man increasingly become a reality two decades later. In fact, it’s the only reason I code these days, and coding itself as we know it is in its twilight.

While blue collar jobs have always been the worry as far as ‘machines taking jobs’ goes, the clear bias is toward expense. Where technology can make things cheaper, it does, so those with high salaries and jobs that can be automated will increasingly be put on notice. This leaves us with the dilemma of how people will earn a living – a real problem in a world where bureaucracies have demonstrably been slow to react to these changes, where politics around the world has somehow become more palpably connected with fear, where people see things faster, and where our ability to use technology to communicate dwarfs our ability to communicate well.

Renewable energy has gone beyond being a novelty. Even here in Trinidad and Tobago, where over a decade ago my father tried to sell the government on solar powered street lights, the local electricity company – state owned T&TEC – announced in late 2017 that they’ll be doing something with solar. Technology lags in countries around the world, and 2018 will continue increasing that divide – but a nation’s ability to use technology does not define its advancement, as economic policies on a global scale have the developed world in for a redefinition. BRIC is a reality, and network power continues to make those nations powerhouses.

I think of my nieces in college, my nephews about to start college, and how their education can be made worthwhile by simply being relevant over the next few decades of their lives – but their lives will be redefined by things larger than the education systems that they will be indentured to. We are on the precipice of change that we cannot possibly understand the implications of until we’re on the other side of it.

And 2018 will be increasingly about that.

The AI Future On Mankind’s Canvas

I met her and a young Brazilian woman on the flight from Miami to Orlando – this young doctor who had an interview in Ocala. She was to drive across to Ocala, to the East, to see if she would get the job. She didn’t look old enough to be a doctor, but I passed the age threshold where doctors are younger than me years ago. We talked about medicine and medical administration for a while, even as I checked up on the nervous Brazilian high school graduate. I sat, a thorn between two roses, all the while thinking:

What sort of world were they entering? Doc Leia, a graduate of The University of the West Indies, off to Ocala, and the young woman to my right, off to see the sights as a reward for having survived so many years of schooling. They were both easily younger than most of my nieces. The Doctor had already become heavily invested in her future – medical school was a daunting path, and one I might have pursued with the right opportunities. The other was about to invest in hers, and it bothered me that there wasn’t as clear a path as there used to be.

Artificial intelligence – diagnosing patients on the other side of the world – is promising to change medicine itself. The first AI attorney, ‘Ross’, had been hired by a NYC firm. The education system in the United States wasn’t factoring this sort of thing in (unless maybe you’re in the MIT Media Lab), so I was pretty sure that the education systems in the Caribbean and Latin America weren’t factoring it in either. I’ve been playing with Natural Language Processing and Deep Learning myself, and was amazed at what already could be done.

The technology threat to jobs – to employment – has historically been robotics, something that has displaced enough workers to cause a stir over the last decades – but it has been largely thought that technology would only replace the blue collar jobs. Hubris. Any job that requires research, repetition, and can allow for reduced costs for companies is a target. Watson’s bedside manner might be a little more icy than House, but the results aren’t fiction.

What are the jobs of the future, for those kids in, starting, or just finished with a tertiary education? It’s a gamble by present reckoning. Here are a few thoughts, though:

  • A job that requires legal responsibility is pretty safe, so far. While Watson made that diagnosis, for legal reasons I am certain that licensed doctors were the ones that dealt with the patient, as well as gave the legal diagnosis.
  • Dealing well with humans, which has been important for centuries, has just become much more important – it separates us from AI. So far.
  • Understanding the technology and, more importantly, the dynamic limits of the technology will be key.

Even with that, even as fast food outlets switch to touchscreens for ordering their food (imagine the disease vectors off of that!), even as AI’s become more and more prominent, the landscape is being shaken by technology driven by financial profit.

And I don’t think that it’s right that there’s no real plan for that. It’s coming, there is no stopping that, but what are we as a society doing to prepare the new work force for what is to come? What can be done?

Conversations might be a good place to start.