jacobytes

Reputation: 3241

Creating a custom adapter in Bot Framework for Actions on Google

I'm currently writing a custom adapter in TypeScript to connect Google Assistant to Microsoft's Bot Framework. In this adapter I'm attempting to capture the Google Assistant conversation object through a webhook call and modify it using my bot.

At this moment the only thing my bot does is receive the request from Actions on Google and parse the request body into an ActionsSdkConversation object. After this I call conv.ask() to try a simple conversation between the two services.

API endpoint:

app.post("/api/google", (req, res) => {
  googleAdapter.processActivity(req, res, async (context) => {
    await bot.run(context);
  });
});
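
For completeness, the snippet above assumes a plain Express host with JSON body parsing enabled, since AoG posts JSON to the webhook. A minimal host setup could look like this (the port and names here are my assumptions, not from the original post):

import * as express from "express";

// Hypothetical host setup: the JSON body parser is required so that
// req.body is populated before processActivity reads it.
const app = express();
app.use(express.json());

app.listen(3978, () => {
  console.log("Webhook listening on port 3978");
});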

Adapter processActivity function:

public async processActivity(req: WebRequest, res: WebResponse, logic: (context: TurnContext) => Promise<void>): Promise<void> {
  const body = req.body;
  let conv = new ActionsSdkConversation();
  Object.assign(conv, body);
  res.status(200);
  res.send(conv.ask("Boo"));
}

When I try to start the conversation I get the following error in the Actions on Google console.

UnparseableJsonResponse

API Version 2: Failed to parse JSON response string with 'INVALID_ARGUMENT' error: "availableSurfaces: Cannot find field." HTTP Status Code: 200.

I've already checked the response, and I can find a field called availableSurfaces both in the AoG console and when I call my bot using Postman.

Response:

{
  "responses": [
    "Boo"
  ],
  "expectUserResponse": true,
  "digested": false,
  "noInputs": [],
  "speechBiasing": [],
  "_responded": true,
  "_ordersv3": false,
  "request": {},
  "headers": {},
  "_init": {},
  "sandbox": false,
  "input": {},
  "surface": {
    "capabilities": [
      {
        "name": "actions.capability.MEDIA_RESPONSE_AUDIO"
      },
      {
        "name": "actions.capability.AUDIO_OUTPUT"
      },
      {
        "name": "actions.capability.ACCOUNT_LINKING"
      },
      {
        "name": "actions.capability.SCREEN_OUTPUT"
      }
    ]
  },
  "available": {
    "surfaces": {
      "list": [],
      "capabilities": {
        "surfaces": []
      }
    }
  },
  "user": {
    "locale": "en-US",
    "lastSeen": "2019-11-14T12:40:52Z",
    "userStorage": "{\"data\":{\"userId\":\"c1a4b8ab-06bb-4270-80f5-958cfdff57bd\"}}",
    "userVerificationStatus": "VERIFIED"
  },
  "arguments": {
    "parsed": {
      "input": {},
      "list": []
    },
    "status": {
      "input": {},
      "list": []
    },
    "raw": {
      "list": [],
      "input": {}
    }
  },
  "device": {},
  "screen": false,
  "body": {},
  "version": 2,
  "action": "",
  "intent": "",
  "parameters": {},
  "contexts": {
    "input": {},
    "output": {}
  },
  "incoming": {
    "parsed": []
  },
  "query": "",
  "data": {},
  "conversation": {
    "conversationId": "ABwppHEky66Iy1-qJ_4g08i3Z1HNHe2aDTrVTqY4otnNmdOgY2CC0VDbyt9lIM-_WkJA8emxbMPVxS5uutYHW2BzRQ",
    "type": "NEW"
  },
  "inputs": [
    {
      "intent": "actions.intent.MAIN",
      "rawInputs": [
        {
          "inputType": "VOICE",
          "query": "Talk to My test app"
        }
      ]
    }
  ],
  "availableSurfaces": [
    {
      "capabilities": [
        {
          "name": "actions.capability.AUDIO_OUTPUT"
        },
        {
          "name": "actions.capability.SCREEN_OUTPUT"
        },
        {
          "name": "actions.capability.WEB_BROWSER"
        }
      ]
    }
  ]
}

Does anyone know what might be causing this? I personally feel that the way I create the ActionsSdkConversation could be the cause, but I haven't found any examples of using Google Assistant without getting the conv object from the standard intent handling setup.

Upvotes: 0

Views: 1022

Answers (1)

jacobytes

Reputation: 3241

So I managed to fix it by changing the approach: instead of exposing an API endpoint that fits the structure of Bot Framework, I changed it to the intent handler setup of AoG.

Google controller

export class GoogleController {

  public endpoint: GoogleEndpoint;
  private adapter: GoogleAssistantAdapter;
  private bot: SampleBot;

  constructor(bot: SampleBot) {
    this.bot = bot;
    this.adapter = new GoogleAssistantAdapter();
    this.endpoint = actionssdk();
    this.setupIntents(this.endpoint);
  }

  private setupIntents(endpoint: GoogleEndpoint) {
    endpoint.intent(GoogleIntentTypes.Start, (conv: ActionsSdkConversation) => {
      this.sendMessageToBotFramework(conv);
    });

    endpoint.intent(GoogleIntentTypes.Text, conv => {
      this.sendMessageToBotFramework(conv);
    });
  }

  private sendMessageToBotFramework(conv: ActionsSdkConversation) {
    this.adapter.processActivity(conv, async (context) => {
      await this.bot.run(context);
    });
  }
}

interface GoogleEndpoint extends OmniHandler, BaseApp, ActionsSdkApp<{}, {}, ActionsSdkConversation<{}, {}>> {}
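
Since the app returned by actionssdk() is itself a request handler, the controller's endpoint can be mounted directly on the webhook route, replacing the processActivity endpoint from the question. A sketch, assuming the same Express app as before:

const bot = new SampleBot();
const controller = new GoogleController(bot);

// The AoG endpoint handles the raw request/response cycle itself,
// so it replaces the manual req/res handling from the question.
app.post("/api/google", controller.endpoint);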

Once the conv object reached the adapter, I used it to create an activity for the bot to process, and saved the conv object in state using context.turnState.

Adapter processActivity

public async processActivity(conv: ActionsSdkConversation, logic: (context: TurnContext) => Promise<void>): Promise<ActionsSdkConversation> {
  const activity = this.createActivityFromGoogleConversation(conv);
  const context = this.createContext(activity);
  context.turnState.set("httpBody", conv);
  await this.runMiddleware(context, logic);

  const result = context.turnState.get("httpBody");
  return result;
}
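
createActivityFromGoogleConversation and createContext are not shown in the original answer; a minimal sketch of what they could look like, assuming the adapter extends BotAdapter and using Activity, ActivityTypes, ConversationAccount, and ChannelAccount from botbuilder (the field mapping below is my assumption, not the author's exact code):

// Hypothetical mapping of the AoG conversation onto a Bot Framework
// message activity; conv.input.raw holds the raw user utterance.
private createActivityFromGoogleConversation(conv: ActionsSdkConversation): Partial<Activity> {
  return {
    type: ActivityTypes.Message,
    text: conv.input.raw,
    channelId: "google",
    conversation: { id: conv.id } as ConversationAccount,
    from: { id: "user" } as ChannelAccount,
    recipient: { id: "bot" } as ChannelAccount
  };
}

// Wrap the activity in a TurnContext bound to this adapter, which is
// what runMiddleware and the bot's ActivityHandler expect.
private createContext(activity: Partial<Activity>): TurnContext {
  return new TurnContext(this, activity);
}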

Bot

export class SampleBot extends ActivityHandler {

  constructor() {
    super();

    this.onMessage(async (context, next) => {
      await context.sendActivity(`You said: ${context.activity.text}`);
      await next();
    });
  }
}

Once the bot sent a response, I used the result to modify the conv object, saved it, and then returned it from processActivity().

private createGoogleConversationFromActivity(activity: Partial<Activity>, context: TurnContext) {
  const conv = context.turnState.get("httpBody");

  if (activity.speak) {
    // Send separate display text and speech when the activity provides both.
    const response = new SimpleResponse({
      text: activity.text,
      speech: activity.speak
    });

    conv.ask(response);
  } else {
    if (!activity.text) {
      throw new Error("Activity text cannot be undefined");
    }

    conv.ask(activity.text);
  }

  context.turnState.set("httpBody", conv);
}
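
The bot's context.sendActivity() reaches the adapter through sendActivities, which is where createGoogleConversationFromActivity gets invoked. That override isn't shown in the original answer, but a sketch of it could look like this:

// Hypothetical sendActivities override: translate each outgoing message
// activity back onto the conv object stored in turn state.
public async sendActivities(context: TurnContext, activities: Partial<Activity>[]): Promise<ResourceResponse[]> {
  const responses: ResourceResponse[] = [];

  for (const activity of activities) {
    if (activity.type === ActivityTypes.Message) {
      this.createGoogleConversationFromActivity(activity, context);
    }
    responses.push({ id: "" });
  }

  return responses;
}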

That resulted in a simple conversation between Google Assistant and Bot Framework.


Upvotes: 2
