
Getting started writing an Alexa Skill

We now have 4 Amazon Echo devices in the house, and, inspired by a demo LornaJane gave me at DPC, I have decided to write some skills for them. This article covers what I learnt in order to get my first Swift skill working.

Our bins are collected by the council every other week; one week it’s the green recycling bin and the other week, it’s the black waste bin. Rather than looking it up, I want to ask Alexa which bin I should put out this week.

Firstly, you need an Echo, so go buy one, set it up and have fun! When you get bored of that, it’s time to create a skill.

Creating a skill

Start by registering on the Amazon Developer Portal. I signed in and then had to fill out a form with information that I thought Amazon already knew about me. Accept the terms and you end up on the dashboard. Click on the “Alexa” link and then on “Alexa Skills Kit” to get to the page where you can add a new skill; this is where you’ll find the “Add a New Skill” button.

I selected a “Custom Interaction Model”, in “English (U.K.)”. Rather unimaginatively I’ve called my first skill “Bin Day” with an Invocation Name of “Bin Day” too. Pressing “Save” and then “Next” takes us to the “Interaction Model” page. This is the page where we tell Alexa how someone will speak to us and how to interpret it.

The documentation comes in handy from this point forward!

The interaction model

A skill has a set of intents, which are the actions that we can do, and each intent can optionally have a number of slots, which are the arguments to the action.

In dialogue with Alexa, this looks like this:

Alexa, ask/tell {Invocation Name} about/to/which/that {utterance}

An utterance is a phrase that is linked to an intent, so that Alexa knows which intent the user means. The utterance phrase can have some parts marked as slots which are named so that they can be passed to you, such as a name, number, day of the week, etc.
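
For example, a hypothetical utterance with slots might look like this:

was the {Colour} bin collected on {Day}

where Colour and Day are slot names whose spoken values are passed to the skill along with the intent.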

My first intent is very simple; it just tells me the colour of the next bin to be put out on the road. I’ll call it NextBin and it doesn’t need any other information, so there are no slots required.

In dialogue with Alexa, this becomes:

Alexa, ask Bin Day for the colour of the next bin

And I’m expecting a response along the lines of:

Put out the green bin next

To create our interaction model we use the “Skill Builder”, which is in Beta. It’s a service from a big tech giant, so of course it’s in beta! Click the “Launch Skill Builder” button and start worrying, because the first thing you notice is that there are video tutorials to show you how to use it…

It turns out that it’s not too hard:

  1. Click “Add an Intent”
  2. Give it a name: NextBin & click “Create Intent”
  3. Press “Save Model” in the header section

We now need to add some sample utterances which are what the user will say to invoke our intent. The documentation is especially useful for understanding this. For the NextBin intent, I came up with these utterances:

  • “what’s the next bin”
  • “which bin next”
  • “for next bin”
  • “get the next bin”
  • “the colour of the next bin”

I then saved the model again and then pressed the “Build Model” button in the header section. This took a while!

Click “Configuration” in the header to continue setting up the skill.

Configuration

At its heart, a skill is simply an API. Alexa is the HTTP client and sends a JSON POST request to our API and we need to respond with a JSON payload. Amazon really want you to use AWS Lambda, but that’s not very open, so I’m going to use Apache OpenWhisk, hosted on Bluemix.

The Configuration page allows us to pick our endpoint, so I clicked on “HTTPS” and then entered the endpoint for my API into the box for North America as Bluemix doesn’t yet provide OpenWhisk in a European region.

One nice thing about OpenWhisk is that the API Gateway is an add-on, and for simple APIs it’s an unnecessary complexity; web actions are ideal for this sort of situation. As Alexa is expecting JSON responses, we can use the following URL format for our endpoint:

https://openwhisk.ng.bluemix.net/api/v1/web/{fully qualified action name}.json

The fully qualified name for the action can be found using wsk action list. I’m going to call my action BinDay in the package AlexaBinDay, so this is 19FT_dev/AlexaBinDay/BinDay for my dev space. Hence, my endpoint is https://openwhisk.ng.bluemix.net/api/v1/web/19FT_dev/AlexaBinDay/BinDay.json

Once the endpoint is entered, you can press “Next” and then set the certificate information. As I’m on OpenWhisk on Bluemix, I selected “My development endpoint is a sub-domain of a domain that has a wildcard certificate from a certificate authority”.

Testing

The Developer page for the skill has a “Test” section which, once enabled, lets you type in some text and send it to your endpoint to check that everything is working. This is convenient as we can then log the request we are sent and develop locally using curl (see the example below). All we need to do now is develop the API!
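
For example, once a request has been captured into a (hypothetical) request.json file, it can be replayed directly against the web action:

$ curl -X POST -H "Content-Type: application/json" \
    -d @request.json \
    https://openwhisk.ng.bluemix.net/api/v1/web/19FT_dev/AlexaBinDay/BinDay.json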

Developing the API endpoint

I’m not going to go into how to develop the OpenWhisk action in this post – that can wait for another one. We will, however, look at the data we receive and what we need to respond with.

Using the Service Simulator, I set the “Enter Utterance” to “NextBin what’s the next bin” and then pressed the “Ask Bin Day” button. This sends a POST request to your API endpoint with a payload that looks like this:

{
  "session": {
    "sessionId": "SessionId.e57e7015-585e-4140-8fa0-982eea4c6d44",
    "application": {
      "applicationId": "amzn1.ask.skill.{some uuid}"
    },
    "attributes": {},
    "user": {
      "userId": "amzn1.ask.account.{some string}"
    },
    "new": true
  },
  "request": {
    "type": "IntentRequest",
    "requestId": "EdwRequestId.3678eeef-bfd5-4711-bc86-a5b40ab768e0",
    "locale": "en-GB",
    "timestamp": "2017-07-15T10:35:45Z",
    "intent": {
      "name": "NextBin",
      "slots": {}
    }
  },
  "version": "1.0"
}

You should probably check that the applicationId matches the ID in the “Skill Information” page on the Alexa developer portal as you only want to respond if it’s what you expect.
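
A minimal sketch of that check in Swift, assuming the parsed JSON payload is passed straight through as the action’s args and using a placeholder skill ID, might look like this:

func isExpectedApplication(args: [String: Any]) -> Bool {
    // Placeholder for the skill ID shown on the "Skill Information" page
    let expectedId = "amzn1.ask.skill.{some uuid}"

    // Walk session -> application -> applicationId in the incoming payload
    guard let session = args["session"] as? [String: Any],
          let application = session["application"] as? [String: Any],
          let applicationId = application["applicationId"] as? String else {
        return false
    }
    return applicationId == expectedId
}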

The request is where the interesting information is. Specifically, we want to read the intent’s name as that tells us what the user wants to do. The slots object then gives us the list of arguments, if any.
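
Pulling out the intent name is the same kind of dictionary walking (again, just a sketch under the same assumption about args):

func intentName(args: [String: Any]) -> String? {
    // request -> intent -> name tells us which intent the user invoked
    guard let request = args["request"] as? [String: Any],
          let intent = request["intent"] as? [String: Any],
          let name = intent["name"] as? String else {
        return nil
    }
    return name
}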

Once you have determined the text string that you want to respond with, you need to send it back to Alexa. The format of the response is:

{
  "response" : {
    "shouldEndSession": true,
    "outputSpeech" : {
      "type": "PlainText",
      "text": "Put out the green bin next"
    }
  },
  "version": "1.0"
}

To make this work in OpenWhisk, I created a minimally viable Swift action called BinDay. The code looks like this:

BinDay.swift:

func main(args: [String:Any]) -> [String:Any] {

    return [
        "response" : [
            "shouldEndSession" : true,
            "outputSpeech" : [
              "type" : "PlainText",
              "text" : "Put out the green bin next",
            ],
        ],
        "version" : "1.0",
    ]
}

And uploaded it using:

$ wsk package create AlexaBinDay
$ wsk action update AlexaBinDay/BinDay BinDay.swift --web true

For production we will need to compile the Swift before we upload, but this is fine for testing. The Service Simulator now works, so we can get it onto an Echo!
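
Before moving on, here’s a rough sketch (the real logic is for another post) of what the alternating-week rule from the start of this article might look like in Swift, assuming a hypothetical reference date that falls in a green-bin week:

import Foundation

func nextBinColour(on date: Date = Date()) -> String {
    let calendar = Calendar(identifier: .gregorian)

    // Hypothetical reference date: a day in a week when the green bin goes out
    let reference = DateComponents(calendar: calendar, year: 2017, month: 7, day: 10).date!

    // Whole weeks elapsed since the reference week (assumes date >= reference)
    let days = calendar.dateComponents([.day], from: reference, to: date).day ?? 0
    let weeks = days / 7

    return weeks % 2 == 0 ? "green" : "black"
}

The string it returns would then replace the hard-coded sentence in the outputSpeech text.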

Beta testing on an Echo

To test on an Echo, you need to have registered on the developer portal using the same email address as the one that your Echo is registered with. I didn’t do this as my Echo is registered with my personal email address, not the one I use for dev work.

To get around this, I used the Beta testing system. To enable beta testing you need to fill in the “Publishing Information” and “Privacy & Compliance” sections for your skill.

For Publishing Information you need to fill in all the fields and provide two icons. I picked a picture of a friend’s cat. Choosing a category was easy enough (Utilities), though none of the sub-categories fit; you have to pick one anyway! Once you fill out the rest of the info, you go on to the Privacy & Compliance questions, which also need answering.

The “Beta Test Your Skill” button should now be enabled. You can invite up to 500 Amazon accounts to beta test your skill. I added the email address of my personal account as that’s the one registered with my Echo. We also have some Echos registered to my wife’s email address, so I will be adding her soon.

Click “Start Test” and your testers should get an email. There’s also a URL you can use directly, which is what I did; this link allowed me to add Bin Day to my Echo.

Fin

To prove it works, here’s a video!

That’s all the steps required to make an Alexa skill. In another post, I’ll talk about how I built the real set of actions that run this skill.
