Talk: Chatting with data

Nov 29, 2018

I was fortunate enough to speak at September's .NET Oxford event this year about chatbots. More specifically, it was an exploration of the key concepts involved in taking a pre-existing source of data and using a chatbot to surface it to users in a new way.

I get to work with some great clients at Ridgeway, and this year one of them gave me free rein over some of their initial project content to experiment with chatbots.  Truth be told, I've been pretty busy, so this hadn't received as much of my attention as I would have liked until recently.  Back in 2017, I (like many other developers) had a bit of a mess about with Amazon Alexa and Google Home to see what all of the fuss was about.  There were plenty of examples online that got you off to a basic start, but what I really wanted to look at was some real-world data that I'd used in a web application and how I could present it in a different way to add more value.

Before I got stuck into the detail of the subject, I did some digging around for some information on how and where bots are being used.  There are a lot of predictions about the use of bots in customer services and the financial sector, but one statistic stuck out more than most: 

"£3.5 billion is expected to be invested in enterprise intelligent assistants by 2021." (Opus Research)

That's a pretty sizeable estimate of the kind of investment that we might see over the next couple of years, and one which should perhaps help us to focus on the task in hand.  The rate at which market analysts predict people will start interacting with bots is a good indicator that, as developers, it's a skill we need to learn.

I did a very brief exploration of some of the technology choices available; should I choose Amazon, Google, or Microsoft? Ultimately, I decided to experiment with Google Actions and Dialogflow, though Amazon Lex and Microsoft Bot Framework had both been considered and put through the same initial tests to create a very simple demo. I felt that Dialogflow was initially easier to work with (and I happened to have a spare Google Home Mini to hand).

The usual demos

Having run through essentially the same demo with Google, Amazon, and Microsoft, one thing stuck out: all of the demos seemed very 'hard-coded'.  All of the values and responses that could be accepted and given were supplied by me as I worked through the demo.  There weren't many examples of actually retrieving the information from elsewhere.  What I wanted to do was pull some data out of Kentico Cloud and Azure Search to drive a simple user experience.

Actually finding useful examples in .NET seemed to be a little difficult, though if you know your way around JavaScript then you'll be in for a fun ride.  There were basic examples in C#, but I would have liked more of them, preferably with better documentation.

The talk & demo

About half of the talk was spent going over some of the key points of bot development and explaining some of the elements and terminology that you will come across as a developer. There are some key concepts that are common across all of the platforms but need to be understood in order to progress.  An example of this is how an utterance works and what its key parts are. 

Simple description of an utterance
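To make the idea concrete, here's a toy sketch of how an utterance breaks down into an intent and entity values. This is purely illustrative (the regex-based matcher, the intent name, and the entity name are all my own inventions, not any platform's API):

```python
import re

# A toy model of an utterance: a training-phrase-style template with an
# entity slot, matched against what the user actually said.
TEMPLATE = r"describe (?P<dish>.+) for me"

def parse_utterance(text):
    """Return (intent, entities) if the utterance matches, else None."""
    match = re.fullmatch(TEMPLATE, text.strip().lower())
    if match:
        return ("describe_dish", {"dish": match.group("dish")})
    return None
```

So "Describe chicken katsu curry for me" resolves to the `describe_dish` intent with `dish = "chicken katsu curry"`; the real platforms do much fuzzier matching than a regex, but the decomposition into intent plus entities is the same.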

With the basics covered, I started to put my demo through its paces and show how it all worked.  To put the whole demo together I used Kentico Cloud, Azure Logic Apps, Azure Search, and Dialogflow. 

The source of my data was something that I'd initially used back in April when I explored importing data into Kentico Cloud. By using the webhooks in Kentico Cloud to trigger an Azure Logic App, I was able to respond to publish/unpublish events and process the data that would drive the conversation my customers would (hopefully) be having. 
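The routing step can be sketched like this. The payload shape below only approximates Kentico Cloud's webhook notification (a `message` describing the operation and a list of affected items under `data`); check the actual format against the documentation before relying on it:

```python
def route_webhook(payload):
    """Decide what to do with each content item in a webhook notification.

    Assumed (approximate) Kentico Cloud webhook shape:
      {"message": {"operation": "publish" | "unpublish"},
       "data": {"items": [{"codename": "..."}]}}
    """
    operation = payload["message"]["operation"]
    actions = []
    for item in payload["data"]["items"]:
        if operation == "publish":
            actions.append(("index", item["codename"]))   # add to search/entities
        elif operation == "unpublish":
            actions.append(("remove", item["codename"]))  # take it back out
    return actions
```

In the real demo this branching lives inside the Logic App's designer rather than in code, but the decision being made is the same.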

When you consume data from Kentico Cloud in Azure Logic Apps, you can easily get lost trying to retrieve information from the JSON; you'll see a lot of things called "name" when you try to use the object browser in your Logic App. I'd thoroughly recommend using variables as much as possible to access the values in your content items (there is a library called JSONata which I really want to explore to simplify the data coming from Kentico Cloud, but that will have to wait!).


Once I'd identified the information coming in from Kentico Cloud, the Logic App made light work of routing it and deciding whether it should be added to or removed from the Azure Search index and the Dialogflow entities.  I ended up having two lists of entities, which enabled me to only ever add ingredients (let's be honest, there's probably going to be more than one thing with rice in it), but to both add and remove dish names, as they're quite unique.  
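The asymmetry between the two entity lists can be captured in a few lines. This is a sketch of the rule, not the Logic App itself; the item shape is my own:

```python
def sync_entities(ingredients, dishes, event, item):
    """Apply a publish/unpublish event to the two entity sets.

    Dish names are unique per content item, so they can be safely removed
    on unpublish. Ingredients are shared across dishes (lots of things
    contain rice), so they are only ever added, never removed.
    """
    if event == "publish":
        dishes.add(item["dish"])
        ingredients.update(item["ingredients"])
    elif event == "unpublish":
        dishes.discard(item["dish"])
        # ingredients deliberately left alone: another dish may still use them
    return ingredients, dishes
```

A smarter version could reference-count ingredients across dishes, but add-only was good enough for the demo.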

Now that Dialogflow was populated, I could use the information to help train the model.  The more information you can give the better, but it's not just about entities; it's also about providing training phrases like "Describe {chicken katsu curry} for me" or "I'd like to know more about {Tuna Maki}".
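Under the hood a training phrase is a sequence of parts, some plain text and some annotated with an entity. The structure below mirrors Dialogflow's v2 intent format as I understand it (parts with optional `entityType`/`alias` annotations); verify it against the current API reference before building on it:

```python
def training_phrase(parts):
    """Build a Dialogflow-style training phrase from (text, entity) pairs.

    Pass entity=None for plain text, or an entity name (e.g. "dish") for a
    part that should be annotated as that entity.
    """
    built = []
    for text, entity in parts:
        part = {"text": text}
        if entity:
            part["entityType"] = f"@{entity}"  # assumed Dialogflow convention
            part["alias"] = entity
        built.append(part)
    return {"type": "EXAMPLE", "parts": built}

# "Describe {chicken katsu curry} for me" becomes:
phrase = training_phrase([("Describe ", None),
                          ("chicken katsu curry", "dish"),
                          (" for me", None)])
```

Generating these programmatically from the content coming out of the Logic App is what lets the model keep up with the data instead of being hand-typed.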

The final step in the process was looking at how I could link what is in Dialogflow with the content that now resides in Azure Search.  This is the bit where I found myself reading a lot of documentation.  Dialogflow describes this as fulfilment, but there aren't many .NET examples that I could find with a good description.  For me, this ended up being an Azure Function.  Dialogflow allows a webhook to be configured as the fulfilment end-point, and Azure Functions fit the bill perfectly. A lot of the extremely clever and witty responses that I'd put into Dialogflow seemed to go to waste at this point, as all of the responses were now coming from my Azure Function.  Something to take forward is that I can probably find a better way to do that bit.
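The shape of a fulfilment handler is simple once you've seen it. My real handler was a C# Azure Function, but the sketch below shows the same idea in Python: the request/response shapes follow Dialogflow's v2 webhook format as I understand it, and `search` stands in for a hypothetical Azure Search lookup:

```python
def fulfil(request_json, search):
    """Handle a Dialogflow fulfilment webhook call.

    `request_json` is the (assumed) v2 webhook request body; `search` is a
    stand-in for an Azure Search query that returns a document or None.
    """
    params = request_json["queryResult"]["parameters"]
    dish = params.get("dish")
    document = search(dish) if dish else None
    if document is None:
        text = "Sorry, I couldn't find that dish."
    else:
        text = f"{document['name']}: {document['description']}"
    # Dialogflow reads the reply from the response body's fulfillmentText
    return {"fulfillmentText": text}
```

Note that once a webhook is wired up, the response text comes from your code, which is exactly why the witty static responses in Dialogflow went to waste.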

Two things that made the demonstrations easy to put together for this talk were Kentico Cloud and Azure Logic Apps.  I decided to use Azure Logic Apps to stitch everything together based on some work from June this year, when I took part in a hack day at the Kentico MVP summit.  Our team explored ways in which we could use Kentico Cloud with Azure Logic Apps to quickly build up the fabric of an application. Having not used Azure Logic Apps prior to that, it was an invaluable start.

Takeaway points

Aside from picking up some of the technical aspects of bot development while creating this talk, one key thing stood out to me: creating a conversation that flows easily is by far the most difficult part of the puzzle to solve.  My demo was like a conversation between a bunch of socially awkward people; if I needed to do this for a proper commercial project then I would most certainly start with the conversation.  If you're going to undertake this kind of project, make sure that you work out the conversation first; then you can worry about which platform best fits your business.

The other thing that struck me was the ease with which everything fell together using Azure Logic Apps with Kentico Cloud.  Using variables was invaluable, as the structure of the message from Kentico Cloud can be difficult to work with in Logic Apps.