9.9.1 Creating Bots with Watson
With the advancement of Artificial Intelligence, conversational user interfaces have become quite popular. Conversational UIs, whether through messaging or voice, rely heavily on natural language processing. Natural language processing is in itself achieved through machine learning.
The Caligrafy Bot makes use of IBM Watson Assistant as the core natural language processor, and it provides a server-side framework that communicates with the IBM Watson Assistant API. On the client side, Caligrafy uses VueJS to provide an appealing user interface for the conversation.

The Caligrafy Bot relies on the following:
- Server-side: Caligrafy core framework class that interfaces with the IBM Watson Assistant API
- Client-side: Caligrafy VueJS logic that uses bot-ui API
- IBM Cloud free account with an IBM Watson Assistant
- All the Dialog logic is created using IBM Watson Assistant interface
- Caligrafy VueJS integration. You can learn more about the integration in the Caligrafy and Vue section
In this video, we show how Caligrafy connects to IBM Watson to help you implement your own custom bot.
Live Example: Ask Cali to show you a picture of a dog or a cat
Caligrafy comes prepackaged with a Bot that can connect to your IBM Watson Assistant and that you can start using in 4 steps:

1. From the Caligrafy project root, use Caligrafer to create a bot app by running `.bin/caligrafer bot <your_bot_app_name>` (or `php caligrafer.php bot <your_bot_app_name>`). A new app `<your_bot_app_name>` is created in the `public` folder.

   After you open an account with IBM Cloud, you can create an IBM Watson Assistant. From the IBM Watson Assistant manager, you can create several assistants. When you create an assistant and launch it, you are provided with an API key and an Assistant ID. You need both of these credentials to link the assistant to Caligrafy.

2. Add the API key to the `WATSON_API_KEY` entry in the `.env` file in Caligrafy.

3. In the Caligrafy `config/routing/web.php` file, out-of-the-box routes are already built in to help you get started quickly. Later on, we will explain how you can modify those routes to best match your application.

4. Now you can test the Bot. Make sure to specify the Watson Assistant ID as a URL parameter when accessing the route: `localhost/caligrafy-quill/bot/<your_bot_app_name>?botSkillId=<Watson Assistant ID>`
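The `.env` entry for the API key might look like this (the key value below is a placeholder; use the API key provided by your IBM Watson Assistant):

```
# .env (Caligrafy project root)
WATSON_API_KEY=your-ibm-watson-api-key-here
```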
Now that the assistant is linked to Caligrafy, you can start building the dialog flow. This documentation will not explain how to build a dialog flow; IBM provides great tutorials and videos on how to do that.
It is important to understand what types of responses IBM Watson can respond with. There are 4 possible response types:

- `text`: The simplest form of response: a sentence in any form or tense.
- `option`: The assistant can provide a collection of choices for users to select from. This is a typical way to restrict the user to making choices as opposed to open-ended answers.
- `image`: The assistant can send images in a response.
- `pause`: The assistant can pause for a certain period of time, either to allow the user to answer or to execute an action that requires some time before responding back to the user.
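As a sketch, the four response types can be dispatched with a small helper on the client side. The field names (`text`, `options`, `source`, `time`) follow Watson's generic response format but may vary by API version, so treat them as assumptions:

```javascript
// Maps a Watson response element to a simple display action.
// Field names (text, options, source, time) are assumptions based on
// Watson's generic response format; adjust them to your API version.
function toDisplayAction(element) {
  switch (element.response_type) {
    case 'text':
      return { kind: 'message', content: element.text };
    case 'option':
      return { kind: 'choices', content: (element.options || []).map(o => o.label) };
    case 'image':
      return { kind: 'image', content: element.source };
    case 'pause':
      return { kind: 'wait', content: element.time };
    default:
      return { kind: 'unknown', content: element };
  }
}

console.log(toDisplayAction({ response_type: 'text', text: 'Hello' }));
// → { kind: 'message', content: 'Hello' }
```

A helper like this keeps the `switch` in your `communicate` method (shown further down) small and testable.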
It is also important to understand what types of inputs can be given to the IBM Watson Assistant:

- `text`: The assistant understands text and, through natural language processing, directs the dialog by providing follow-up responses.
- `context variables`: The assistant can also receive context variables as inputs. Think of these context variables as storage sent to the assistant that constitutes its knowledge. For example, if you give your name to the assistant, it can refer to you by name throughout the session and stop asking for it.
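For illustration, an input payload carrying both text and context variables might be shaped like this. The field names are assumptions; match them to what the `communicate` endpoint of the Caligrafy Bot Controller actually forwards to Watson:

```javascript
// Hypothetical input payload: a message plus context variables the
// assistant can remember for the rest of the session. Field names are
// assumptions; verify them against your communicate endpoint.
const input = {
  text: "My name is Lea",
  context: {
    username: "Lea" // the assistant can now address the user by name
  }
};

console.log(JSON.stringify(input));
```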
Learn more about creating dialogs in the IBM Watson Assistant documentation.
Once your Assistant is ready to have a conversation, you can first test it in the IBM Watson Assistant to make sure that it is conversing the way you intend it to.
Now you can test the Bot. Make sure to specify the Watson Assistant ID as a URL parameter when accessing the route: `localhost/caligrafy-quill/bot/<your_bot_app_name>?botSkillId=<Watson Assistant ID>`
If you want to create your own client-side UI, you will need to use the Caligrafy and Vue integration.
Just like any other Caligrafy-Vue app, your Bot app needs to be created in the `public` folder using Caligrafer. Refer to Step 1 in the previous section to see how it is done.

Once the bot app is created, a folder with the name you specified appears under the `public` folder. There you can find `index.php` and a `scripts` folder containing a JavaScript file, `main.js`, that holds the VueJS code. We will explore both of those files in the next steps.

`index.php` is the file responsible for running your Bot application.
<!DOCTYPE html>
<html lang="en">
<head>
<meta http-equiv="content-type" content="text/html; charset=UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1">
<meta name='apple-mobile-web-app-capable' content='yes'>
<meta name='apple-mobile-web-app-status-bar-style' content='black'>
<title>Caligrafy Bot</title>
<!-- Stylesheet and head scripts go here -->
<link rel="shortcut icon" href="<?php echo scripts('favicon'); ?>" type="image/x-icon" />
<link rel="stylesheet" href="<?php echo scripts('bootstrap_css'); ?>" />
<link rel="stylesheet" href="<?php echo session('public').'css/botui.min.css';?>">
<link rel="stylesheet" href="<?php echo session('public').'css/botui-theme-default.css';?>">
</head>
<body>
<!-- Beginning of the app -->
<div id="app">
</div>
<div id="my-botui-app">
<bot-ui></bot-ui>
</div>
<!-- Initialization scripts -->
<script src="https://cdn.jsdelivr.net/npm/vue/dist/vue.js"></script>
<script src="https://unpkg.com/axios/dist/axios.min.js"></script>
<script src="<?php echo session('public').'js/services/botui.js';?>"></script>
<script src="<?php echo APP_SERVICE_ROOT.'app.js'; ?>"></script>
<script>loadEnvironment(`<?php echo $env; ?>`);</script>
<script>
/* Load the app client framework.
 * Any environment variables to be passed on from the server can be handled here.
 */
loadVue({
scripts: ['main']
});
</script>
<!-- Additional scripts go here -->
<script src="<?php echo scripts('bootstrap_jquery'); ?>"></script>
<script src="<?php echo scripts('bootstrap_script'); ?>"></script>
<!--[if lt IE 9]>
<script src="<?php echo scripts('fallback_html5shiv'); ?>"></script>
<script src="<?php echo scripts('fallback_respond'); ?>"></script>
<![endif]-->
</body>
</html>
Your VueJS code lives in the `scripts` folder. It can have any name you desire. Once named, the name needs to be invoked (without the `.js` extension) in `index.php` (check the code in the previous step). The main script has a typical VueJS structure.
var app = new Vue({
el: '#app',
data () {
return {
response: null,
env: env
}
},
/* Method Definition */
methods: {
},
/* upon object load, the following will be executed */
mounted () {
}
});
Just like any Caligrafy and VueJS app, the server side needs to route the URL to the client-side pages. For that, you can make small modifications to the routes that come out-of-the-box with Caligrafy, since we will be using the same Caligrafy Bot Controller.
...
// BOT PAGE ROUTES
Route::get('/bot/{appName}', 'ClientController'); // A botSkillId needs to be provided in the URL.
// BOT DATA ROUTES
// routes need to start with reserved path __bots__
Route::post('/__bots__/{appName}/communicate', 'WatsonController@communicate');
Route::post('/__bots__/{appName}/{appId}', 'WatsonController@connect');
Route::delete('/__bots__/{appName}', 'WatsonController@end');
...
Notice that there is one Page Route and 3 Data Routes (refer to the Routes section of the Caligrafy and Vue section to learn about the difference). The Data Routes are request endpoints to the Caligrafy Bot Controller:
- `connect`: POST request that connects to the IBM API.
- `communicate`: POST request that sends a message and retrieves a response from the IBM API.
- `end`: DELETE request that manually ends the Bot instance.
There are many changes you can make to the Page Routes to tailor them to the needs of your application:
- For Page Routes, you can specify any path to route to, as long as it always ends with the `{appName}` variable. Don't forget that a `botSkillId` needs to be provided in the browser URL of the route, as such: `localhost/caligrafy-quill/<path>/<your_bot_app_name>?botSkillId=<Watson Assistant ID>`
- If the destination page of your Page Route is not `index.php`, you can override that in the URL by appending `app=<name of php file>` to the URL provided in the browser.
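For example, assuming a hypothetical destination page named `chat.php` inside your bot app, the overriding URL would look like:

```
localhost/caligrafy-quill/bot/<your_bot_app_name>?botSkillId=<Watson Assistant ID>&app=chat
```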
Your VueJS main script needs to interface with the endpoints through asynchronous requests made to the Caligrafy server side. In Caligrafy, this is usually achieved using `axios`, which is already included in the VueJS integration.
var app = new Vue({
el: '#app',
data() {
return {
response: null,
env: env,
botId: env.botSkillId,
route: env.home + "__bots__/" + env.appName + "/",
config: {
async: true,
crossDomain: true,
headers: {
"Authorization": "Bearer " + apiKey, // apiKey must be available in scope (e.g. exposed by the environment loaded in index.php)
'Content-Type': 'application/json'
}
}
}
},
/* Method Definition */
methods: {
// Connect to the bot
connect: function(route) {
axios.post(route + this.botId, [], this.config)
.then(response => {
// the connect endpoint returns true on success
if(response.data === true) {
console.log('Connected');
} else {
console.log('Connection could not be established');
}
})
.catch(error => (console.log(error)));
},
// Communicate with the bot
communicate: function(route, input) {
axios.post(route + 'communicate', input, this.config)
.then(response => {
if (response.data && response.data['action_success'] === true && response.data.response) {
response.data.response.forEach((element) => {
switch(element['response_type']) {
case 'text':
// do something when the assistant responds with text
break;
case 'option':
// do something when the assistant responds with options
break;
case 'image':
// do something when the assistant responds with an image
break;
case 'pause':
// do something
break;
default:
console.log(response.data);
}
});
} else {
console.log("chat ended");
}
})
.catch(error => (console.log(error)));
}
},
/* upon object load, the following will be executed */
mounted () {
this.connect(this.route);
}
});