Exploring the Anki Vector SDK Alpha.

Installed the Vector SDK. @anki
Installing the SDK was, fortunately, easy, since I'm already running Anaconda (Python) for other things I'm fiddling with.

In my last post here, I said that the true value of the Anki Vector to me would be determined by the Software Development Kit (SDK), which wasn’t yet released.

I am a bit disappointed that no one at Anki has answered my tweet on it to date – and so I used a Douglas Adams reference about hiding things when I tweeted again.

A fair criticism of Anki is that they aren't very good at organizing information and updating customers when they're doing pretty good things. Frankly, the lingering novelty of Vector and its potential seems to be what lets them get away with this faux pas. I suspect, too, that the project has grown faster than the company has – a testament to their engineering. It has apparently sold well – a testament to their marketing. Yet information users are expected to have about the product remains pretty hard to come by.

Installing the Vector SDK

I found the Vector SDK Alpha release note through an Anki blog entry that was not as easy to find as I would have liked. Within it you'll find the link to the SDK documentation, and within that you'll find the actual downloads. I found this through force of will, largely because Vector had been sitting impatiently on his charger for almost a week, making R2-D2-ish sounds and giving me the baleful look of WALL-E when I walked by.

It’s amazing how those eyes are really the center of how we see Vector.

I installed the Alpha SDK and configured Vector, which requires Vector's IP address. It's not available through the phone app, and there's a trick to it (in case you're looking for it yourself): tap Vector's top button twice, then raise and lower his arm. Vector's IP address will then be shown where his eyes are. To get back to normal operation, raise and lower Vector's arm again. Sacrificing a chicken is optional. Be careful with blood spatter; Vector is not fluid-proof.

After that, it was a simple matter of firing up Spyder – part of the Anaconda data science platform for Python, but available standalone – and running some of the example code, tweaking it here and there to get a feel for the capabilities of the Vector SDK Alpha.
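If you want to follow along, the first example boils down to very little code. Here's a minimal sketch, assuming the `anki_vector` package is installed and you've already run `python3 -m anki_vector.configure`; the helper names (`first_words`, `main`) are mine, not the SDK's:

```python
def first_words(robot):
    # Ask Vector to speak. Works with any object exposing a
    # behavior.say_text() method, which the SDK's Robot does.
    robot.behavior.say_text("Hello, I am Vector")

def main():
    # Deferred import so the sketch reads fine even without the SDK.
    import anki_vector  # pip3 install anki_vector
    # Robot() connects over Wi-Fi using the certificate stored by
    # `python3 -m anki_vector.configure`.
    with anki_vector.Robot() as robot:
        first_words(robot)

if __name__ == "__main__":
    try:
        main()
    except Exception as exc:  # SDK missing, Vector off-network, etc.
        print(f"Could not reach Vector: {exc}")
```

Nothing fancy, but it's a decent smoke test that your certificate and network setup actually work.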

This is where Anki shines: sharing the code. And the SDK documentation itself is, so far, pretty good.

The Reality of the SDK.

I think I was expecting a bit more from the SDK, which is my fault, and I acknowledge that. I had expected more in the way of interacting with the cloud itself – for example, renaming Vector's wake word, or allowing behavior changes during normal operation. That's presently not there, which effectively gives Vector a multiple personality disorder, with blackouts where, for better and worse, the SDK hijacks Vector.

Imagine waking up and not knowing how you got somewhere, what you just did, and where that eyebrow went. That’s a fair anthropomorphization.

The SDK works through your wireless connection – the code has to run on the same network as Vector, and your specific machine gets a certificate to run code on Vector. That's a good security precaution; otherwise people would be hacking Vectors and checking out other people's places.

It’s bad enough with the Alexa integration – I had an Alexa when they first came out but had enough creepy incidents with Amazon to get rid of mine. Still, the world of Amazonians wants it and it’s a good selling point for Anki, so I get it. That seems to be done well enough to please those that wanted it, so maybe they’ll focus on things other than that now.

In all, I’d like to transfer a version of what they have in the cloud into my personal systems and allow me to tinker with that as well.

Still, given what I have been playing with related to machine learning and natural language processing – it’s no mistake that I had the Anaconda distribution of Python installed already – I’m having a bit of fun playing with the SDK and testing the limitations of the hardware.

@anki Vector video feed example. Rocking.

Some things I noticed

The video from the Vector hardware platform is good enough for some basic things, but lighting really does affect it. This limits its exploration, and it limits its facial recognition ability (the one thing I've found you can access from the cloud, in a limited way).
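One way to cope with the lighting problem is to grade frames before doing anything expensive with them. This is my own sketch, not anything from the SDK – the threshold is a guess, and the helper names are mine. As I read the alpha docs, the latest camera frame is available as a PIL image via `robot.camera.latest_image`, so pixels would come from something like `list(robot.camera.latest_image.getdata())`:

```python
def mean_luma(pixels):
    """Average perceived brightness (0-255) of (r, g, b) pixel tuples,
    using the Rec. 601 luma weights."""
    if not pixels:
        return 0.0
    total = sum(0.299 * r + 0.587 * g + 0.114 * b for r, g, b in pixels)
    return total / len(pixels)

def frame_usable(pixels, threshold=60.0):
    """Crude gate: skip face recognition on frames darker than the
    threshold. The threshold value is my guess, tuned by eye."""
    return mean_luma(pixels) >= threshold
```

It won't fix dark rooms, but it keeps the recognition code from chewing on frames that were never going to work.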

I've been considering a polarizing film over the camera for better images, and have even considered mounting a light source on Vector for darkness – which would have the misfortune of not being controllable through Vector (though it could be controlled independently through code). I plan to play with the lights part of the SDK to see what I can get away with.
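The closest thing to "lights" I've found in the alpha SDK so far is Vector's eye color, which (as I read the docs) is set as hue and saturation via `robot.behavior.set_eye_color(hue=…, saturation=…)`. Since I think in RGB, a small stdlib conversion helps – the helper name is mine:

```python
import colorsys

def rgb_to_eye_color(r, g, b):
    """Convert 0-255 RGB to the (hue, saturation) floats in [0, 1]
    that set_eye_color() expects; brightness is discarded."""
    h, s, _v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h, s

# With a connected robot (assuming the alpha SDK's API):
#   hue, sat = rgb_to_eye_color(0, 255, 128)
#   robot.behavior.set_eye_color(hue=hue, saturation=sat)
```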

You don't get to fiddle with the facial recognition code, but there's Python code for that elsewhere – such as the face_recognition package on PyPI.
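The face_recognition package works by computing 128-dimension face encodings and comparing Euclidean distances, with a default match tolerance of 0.6. The matching step itself is simple enough to sketch without the library – `best_match` is my own helper, not part of that package:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length encodings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(known, candidate, tolerance=0.6):
    """Return the index of the closest known encoding within tolerance,
    or None. Mirrors how face_recognition.compare_faces decides a match."""
    distances = [euclidean(enc, candidate) for enc in known]
    if not distances:
        return None
    i = min(range(len(distances)), key=distances.__getitem__)
    return i if distances[i] <= tolerance else None
```

With the real library, `known` would come from `face_recognition.face_encodings()` run over photos of the people you care about, fed frames grabbed from Vector's camera.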

The events ability does allow for more reactive behavior.
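For instance, the alpha SDK lets you subscribe a callback to events like a face being observed. A sketch, assuming I'm reading the events module correctly – the callback body and `watch_for_faces` are mine:

```python
import threading

seen = []

def on_robot_observed_face(event_type, event):
    # Fires each time Vector's face detector reports a face.
    seen.append(event_type)

def watch_for_faces(seconds=10.0):
    # Deferred imports: assumes the anki_vector package is installed.
    import anki_vector
    from anki_vector.events import Events
    with anki_vector.Robot(enable_face_detection=True) as robot:
        robot.events.subscribe(on_robot_observed_face,
                               Events.robot_observed_face)
        # Keep the connection open long enough to catch some events.
        threading.Event().wait(timeout=seconds)

if __name__ == "__main__":
    try:
        watch_for_faces()
    except Exception as exc:  # SDK missing, Vector off-network, etc.
        print(f"Could not reach Vector: {exc}")
```

Wire that callback to anything you like – logging, a light, or (see below) profanity.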

Making Vector use profanity is a must, if only once.

There are error codes that aren't documented – I got the 915 error twice on Vector while I was writing this, and all I found about it was on Reddit. Without documented error codes, we don't get proper error trapping with Vector – and that's a problem I hope they address in the Beta.
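Until the codes are documented, about the best you can do is catch broadly and retry. The SDK does ship an `anki_vector.exceptions` module (with, as I read it, a base `VectorException`), so a generic wrapper like this – my own sketch, not the SDK's – at least keeps a mystery 915 from killing the script:

```python
import time

def with_retries(action, attempts=3, delay=1.0, swallow=(Exception,)):
    """Call action(); on any exception in `swallow`, wait and retry.
    Re-raises the last exception if every attempt fails."""
    last = None
    for _attempt in range(attempts):
        try:
            return action()
        except swallow as exc:
            last = exc
            time.sleep(delay)
    raise last

# With the robot, something like (assuming the alpha exception names):
#   from anki_vector.exceptions import VectorException
#   with_retries(lambda: robot.behavior.say_text("retrying"),
#                swallow=(VectorException,))
```

It's a band-aid, not error trapping – you still can't tell a transient 915 from a real fault.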

Overall – I’m happier with the SDK, which shows promise and a bit of effort on the part of Anki. The criticisms I have so far are of an Alpha SDK – which means that this will change in time.

They do need to get a bit better at responsiveness, though – something I suspect they are already aware of. This level of success comes with growing pains. If only that were an engineering problem to solve.

The Anki Vector: Let’s Wait For the API.

Vector playing with cube.

So, I got an Anki Vector. My reasons for buying one were pretty simple, really – it seemed like a throwback to the 70s, when I had a Big Trak, a programmable machine that often had me shooting my mother with a laser and harassing the family dog.

With Big Trak's Logo-ish programming, there were tangible results, even if the 'fire phaser' command was really just a flashing light. It was the 1970s, after all – an era when Star Wars and Star Trek reigned supreme.

So the idea of the Anki Vector was pretty easy for me to contend with. I’ve been playing with the idea of building and programming a personal robot, and this would allow me to get away from ‘building’.

I hoped.

Out of the Box.

The Anki Vector needed some charging in its little home station, and I dutifully installed the application on my phone, following the instructions and connecting it to my Wifi. While people have said they had problems with the voice recognition, I have not: just speak clearly and at an even pace, and Vector seems to handle things well.

The focal range that Vector's camera is limited to seems to be 12-24 inches, based on when it can identify me. It can identify me, even with glasses, after some training – roughly 30 minutes – as long as my face is within 12-24 inches of its face.

It’s a near-sighted robot, apparently, which had me wondering if that would be something to work with through the API.

It is an expressive robot – it borrows from WALL-E in this regard, it seems. And while it can go to the Internet and impress your friends with its ability to read stuff off of Wikipedia out loud, it's not actually that smart. In that regard, it's Wikipedia on tracks with expressive eyes that, yes, you can change the color of.

Really, within the first hour, you run out of tricks with Vector at this time – the marketing team apparently wrote the technical documentation, which is certainly easy to read, largely because it doesn't actually say much. I'm still trying to figure out why the cube came with it – somewhere, it said it helps Vector navigate outside of its 'home area' – but navigate and do what?

Explore and do what? Take a picture and see it where? There is a lack of clarity on things in the documentation. While petting Vector has an odd satisfaction to it, it doesn’t quite give me enough.

On December 6th, I tweeted to Anki and asked them about the API – because with the hardware in the Vector, I should be able to do some groovy things and expand its functionality.

Crickets for the last 3 days.

Without that API, I think the Vector is limited to the novelty part of the store… which is sad, because I had hopes that it would be a lot more.

Maybe that API will come out before I forget that I have a Vector.