Discussion: Bot orchestration #715

Open · jesus-seijas-sp opened this issue Nov 9, 2020 · 10 comments

@jesus-seijas-sp (Contributor)

Hello!

For version 5 we are working on chatbot orchestration.
You can see an explanation and example in this comment: #713 (comment)

It lists the commands that are already implemented, but we want to start a discussion to get feedback about which commands should be implemented.

We also developed a connector for building bots using CDD (Conversation Driven Development); you can see an example here:
https://github.com/axa-group/nlp.js/blob/master/packages/bot/test/bot.test.js#L54
This unit test runs the bot and checks it against a scenario described in the file scenario01.dlt, with this content:

user> hello
bot> What's your name?
user> John
bot> (Converting name to uppercases...)
bot> Hello JOHN
bot> This is the help
bot> This is the second help
bot> Bye user
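
For context, a minimal sketch of what such a scenario runner could look like; parseScenario, runScenario, and the send interface are hypothetical helpers written for illustration, so check the linked bot.test.js for the real API:

// Minimal sketch of a CDD scenario runner (illustrative only, not the real
// bot package API). It parses "user>" / "bot>" lines from a .dlt script and
// replays each user turn against a bot exposed as async send(text) => string[].
const fs = require('fs');

function parseScenario(text) {
  const turns = [];
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    if (line.startsWith('user>')) {
      turns.push({ input: line.slice(5).trim(), expected: [] });
    } else if (line.startsWith('bot>') && turns.length > 0) {
      turns[turns.length - 1].expected.push(line.slice(4).trim());
    }
  }
  return turns;
}

async function runScenario(file, send) {
  for (const turn of parseScenario(fs.readFileSync(file, 'utf8'))) {
    const answers = await send(turn.input);
    expect(answers).toEqual(turn.expected); // jest-style assertion
  }
}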

So feel free to comment with whatever features you would like to see related to conversation orchestration.

Thank you!

@intech commented Nov 10, 2020

If I understand correctly, this is analogous to stories in Rasa; the name here is more logical.

They have interactive training that creates a story out of a dialogue, which can also be used for tests. It is very convenient.

I like the idea of a single syntax (JSON) better, but for some reason in nlp.js everything is divided into JSON, Markdown, and a kind of pipeline, which is not very convenient.

@syntithenai (Contributor) commented Nov 10, 2020 via email

@ericzon (Collaborator) commented Nov 10, 2020

From my point of view, the most interesting goals for the orchestration would be to:

  • Be able to follow the flow just by switching between .dlg files, resolving all the conditions from code developed with "registerAction" or "registerCondition". This implies having if ... else if ... else (loops could be handled with conditions).
  • Be able to call a .dlg from another .dlg (and the same for .dlt).
  • Be able to generate answers with adaptive cards or similar.

Something like this, with these assumptions in mind (a wiring sketch follows the example):

getWeatherCondition
async code attached to the bot with registerAction or similar, where the getWeatherAnswer value can be accessed, along with other data such as the user's coordinates, to perform an async request to a weather API

getWeatherCard
a function that can receive parameters and access the context to generate the JSON of an adaptive card

plot1.dlg

dialog main
  nlp

dialog askWeatherDialog
  say do you want to know the weather?
  ask getWeatherAnswer
  [if getWeatherCondition is true] run acceptGetWeather
  [else] run declineGetWeather

dialog acceptGetWeather
  read weather.dlg
  run byeDialog

dialog declineGetWeather
  say well, If you have any request, just tell me
  run main

dialog byeDialog
  say bye bye :)

weather.dlg

dialog acceptGetWeather
  say well, the weather in your coordinates is 
  card getWeatherCard
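
A hypothetical sketch of how those two handlers could be wired up, assuming a registerCondition API shaped like the existing registerAction (the weather fetch and the card builder are stubs for illustration):

// Hypothetical wiring for the handlers above; fetchWeather and
// buildWeatherCard are stubs standing in for real implementations.
const fetchWeather = async (coordinates) => ({ summary: 'sunny', coordinates });
const buildWeatherCard = (weather) => ({
  name: 'weatherCard',
  type: 'message',
  attachments: [], // a real adaptive card payload would go here
});

// Proposed API: a condition resolved from code, usable as [getWeatherCondition is true]
bot.registerCondition('getWeatherCondition', (session, context) =>
  /^(yes|sure|ok)/i.test(context.getWeatherAnswer || '')
);

// Async action that reads the context and sends a generated card
bot.registerAction('getWeatherCard', async (session, context) => {
  const weather = await fetchWeather(context.coordinates);
  session.sendCard(buildWeatherCard(weather), context);
});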

@jesus-seijas-sp (Contributor, Author) commented Nov 13, 2020

Hello, a little update about the latest version, 4.16, and what it includes:

You have an example here: https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot

Comments
If a line starts with #, it is a comment. Example:

# this is a comment

Import another .dlg
If a line starts with import, it will import other .dlg files. You can provide several, separated by spaces:

import something.dlg ./others/other.dlg

language
If a line starts with language, it sets the locale for the code that follows, until another language command is found.

# this sets the language to english
language en

intents

You can add new intents using .dlg. They will be added to the locale of the last language command.
For each intent you can define the utterances, tests, and answers:

language en
intent agent.acquaintance
  utterances
  - say about you
  - describe yourself
  - tell me about yourself
  - who are you
  - I want to know more about you
  answers
  - I'm a virtual agent
  - Think of me as a virtual agent

entity
In the same way that you can define intents, you can define entities. There are two different kinds of entities that you can define: enum and regex.

entity hero
  - spiderman: spiderman, spider-man
  - ironman: ironman, iron-man
  - thor: thor
entity email 
  regex /\\b(\\w[-._\\w]*\\w@\\w[-._\\w]*\\w\\.\\w{2,3})\\b/gi

This code will create two entities: hero and email. Email is a regex entity. Hero is an enum entity with three options: spiderman, ironman and thor. Spiderman is identified when the text "spiderman" or "spider-man" is found.

dialog
This will create a new dialog with a pipeline of commands. Example:

# Script for a simple turn conversation
import corpus.dlg
dialog main
  nlp
dialog hellodialog
  [!user_name] run greet
  run bye
dialog greet
  say Hello user!
  say Tell me your name
  ask user_name
  call uppers user_name
  [user_name !== 'ADMIN'] say Hello {{ user_name }}
  [user_name === 'ADMIN'] say You unblocked admin mode
dialog bye
  say Bye user 

say
This is the command for the chatbot to say something to the user.

say Hello, I'm a chatbot

ask
This waits for input from the user and stores it in a variable. Important: this will not say anything to the user; it just waits for the input and stores it. If you want to say something to the user, use say.

say What's your name?
ask name

This will store the input from the user in the name variable.

run
Executes another dialog by name.
Dialog execution uses a stack that also stores the current position within each dialog on the stack. Example:

dialog main
  run greet
  run help
  run bye
dialog greet
  say hello user
dialog help
  say I'm a bot
  say You can ask me questions
dialog bye
  say bye user
  • We start with an empty stack: [].
  • The dialog main starts, so we put it on the stack at position 0: [main:0]
  • We read that position 0 is run greet, so we advance the position of main and add the greet dialog: [main:1, greet:0]
  • We execute "say hello user" and then greet ends, so it is removed from the stack: [main:1]
  • We are now at line 1 of main, which is run help, so we move main to position 2 and add help: [main:2, help:0]
  • We execute "say I'm a bot" and move help to position 1: [main:2, help:1]
  • We execute "say You can ask me questions", so help ends and is removed from the stack: [main:2]
  • We execute run bye, so we move main to the next position and add bye to the stack: [main:3, bye:0]
  • We execute "say bye user", so bye is removed from the stack: [main:3]
  • main does not contain more commands, so it is removed from the stack: []
  • As the stack is empty, the bot returns to the default state, which is to wait for input.
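
A toy model of that bookkeeping (illustrative; the real engine tracks more state per frame):

// Toy model of the dialog stack described above: each frame stores a dialog
// name and the position of the next command to execute.
function runBot(dialogs, entry) {
  const stack = [{ name: entry, position: 0 }];
  while (stack.length > 0) {
    const frame = stack[stack.length - 1];
    const command = dialogs[frame.name][frame.position];
    if (command === undefined) {
      stack.pop(); // dialog exhausted: remove it from the stack
      continue;
    }
    frame.position += 1; // advance this frame's position before executing, matching the trace above
    if (command.startsWith('run ')) {
      stack.push({ name: command.slice(4), position: 0 });
    } else if (command.startsWith('say ')) {
      console.log(command.slice(4));
    }
  }
}

runBot({
  main: ['run greet', 'run help', 'run bye'],
  greet: ['say hello user'],
  help: ["say I'm a bot", 'say You can ask me questions'],
  bye: ['say bye user'],
}, 'main');
// prints: hello user / I'm a bot / You can ask me questions / bye user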

nlp
This command sends the user input to the NLP and retrieves the answer.
One important thing here: the answer from the NLP can be a message for the user, or it can start with /. If the answer starts with /, it means a dialog should be executed, so it acts as "run /dialogname". In the example, if you go to the corpus you'll find this:

https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot/corpus.dlg#L502

intent greetings.hello
  utterances
  - hello
  - hi
  - howdy
  answers
  - /hellodialog

That means that when someone says 'Hello' to the bot, the dialog '/hellodialog' is executed:

dialog hellodialog
  [!user_name] run greet
  run bye

inc
It will increment a variable by its name. You can provide the increment or not; by default it is 1. If the variable does not exist, it is initialized to 0 before the increment.

# This will add 1 to count
inc count
# This will add 3 to count
inc count 3

dec
It will decrement a variable by its name. You can provide the decrement or not; by default it is 1. If the variable does not exist, it is initialized to 0 before the decrement.

# This will subtract 1 from count
dec count
# This will subtract 3 from count
dec count 3
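
A toy model of both commands' semantics (illustrative, not the engine's actual code):

// Toy semantics of "inc name [amount]" and "dec name [amount]":
// a missing variable is treated as 0 before being adjusted.
const inc = (context, name, amount = 1) => {
  context[name] = (context[name] || 0) + amount;
};
const dec = (context, name, amount = 1) => {
  context[name] = (context[name] || 0) - amount;
};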

set
This sets the value of a variable. You can provide an expression:

# this will set count to (count*2+1)
set count count * 2 + 1

conditions
You can add a condition before each command so that the command will be executed only if the condition evaluates to truthy. Just add the condition between square brackets:

# Script for a simple turn conversation
import corpus.dlg
dialog main
  nlp
dialog hellodialog
  [!user_name] run greet
  run bye
dialog greet
  say Hello user!
  say Tell me your name
  ask user_name
  call uppers user_name
  [user_name !== 'ADMIN'] say Hello {{ user_name }}
  [user_name === 'ADMIN'] say You unblocked admin mode
dialog bye
  say Bye user

The [!user_name] run greet means that the dialog greet will be executed only if the variable user_name does not exist.
When user_name is ADMIN, the user will receive the message "You unblocked admin mode"; otherwise the user will receive "Hello {{ user_name }}".

String templating
When you see "Hello {{ user_name }}", it means that the {{ user_name }} part will be replaced with the value of the user_name variable from the context.
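
A minimal sketch of that substitution (the actual template engine may support more than plain variable lookup):

// Replaces each {{ variable }} occurrence with the matching context value.
const render = (text, context) =>
  text.replace(/{{\s*(\w+)\s*}}/g, (match, name) =>
    context[name] !== undefined ? String(context[name]) : '');

render('Hello {{ user_name }}', { user_name: 'JOHN' }); // => 'Hello JOHN'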

call
This is used to call functions, so you can code chatbot actions yourself. In the example you'll find:

  call uppers user_name

That means that the bot will try to find the function uppers and call it with the parameter "user_name".
Important: user_name will not be replaced with the user name from the context; user_name is provided exactly as is, a string with the value "user_name".
The function uppers has this code:

const uppers = (session, context, params) => {
  if (params) {
    // params can be a single string or an array of strings
    const variableName = typeof params === 'string' ? params : params[0];
    if (variableName) {
      // Replace the context variable with its uppercase version
      context[variableName] = (context[variableName] || '').toUpperCase();
    }
  }
};

The way to register an action is to call bot.registerAction with the name of the action and the function to be executed:

bot.registerAction('uppers', uppers)

The signature of each action is:

  function action(session, context, params)

So the action will receive the session object, the context object and the parameters.

@jesus-seijas-sp (Contributor, Author)
New features added:

  • Now import accepts files in JSON format, for importing corpora or cards.
  • Card JSON files can contain one card or an array of cards. Every card must have a "name" property with the name of the card.
  • Command suggest:
    Adds suggestionActions (quick buttons)
suggest Car|Bus|Bicycle
  • Command card:
    Sends a card by name
card card02

Example:

# Script for a simple turn conversation
import corpus-ner.json
import card01.json card02.json card03.json
dialog main
  nlp
dialog demo01
  suggest Car|Bus|Bicycle
  say Please enter your mode of transport.
dialog demo02
  card card02

card02.json content:

{
  "name": "card02",
  "type": "message",
  "attachments": [
    {
      "contentType": "application/vnd.microsoft.card.adaptive",
      "content": {
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.0",
        "type": "AdaptiveCard",
        "speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
        "body": [
          {
            "type": "TextBlock",
            "text": "Passengers",
            "weight": "bolder",
            "isSubtle": false
          },
          {
            "type": "TextBlock",
            "text": "Sarah Hum",
            "separator": true
          },
          {
            "type": "TextBlock",
            "text": "Jeremy Goldberg",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "Evan Litvak",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "2 Stops",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 10 8:30 AM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "ColumnSet",
            "separator": true,
            "columns": [
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "San Francisco",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "SFO",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": "auto",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": " "
                  },
                  {
                    "type": "Image",
                    "url": "http://adaptivecards.io/content/airplane.png",
                    "size": "small",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "Amsterdam",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "AMS",
                    "spacing": "none"
                  }
                ]
              }
            ]
          },
          {
            "type": "TextBlock",
            "text": "Non-Stop",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 18 9:50 PM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "ColumnSet",
            "separator": true,
            "columns": [
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "Amsterdam",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "AMS",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": "auto",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": " "
                  },
                  {
                    "type": "Image",
                    "url": "http://adaptivecards.io/content/airplane.png",
                    "size": "small",
                    "spacing": "none"
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "San Francisco",
                    "isSubtle": true
                  },
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "size": "extraLarge",
                    "color": "accent",
                    "text": "SFO",
                    "spacing": "none"
                  }
                ]
              }
            ]
          },
          {
            "type": "ColumnSet",
            "spacing": "medium",
            "columns": [
              {
                "type": "Column",
                "width": "1",
                "items": [
                  {
                    "type": "TextBlock",
                    "text": "Total",
                    "size": "medium",
                    "isSubtle": true
                  }
                ]
              },
              {
                "type": "Column",
                "width": 1,
                "items": [
                  {
                    "type": "TextBlock",
                    "horizontalAlignment": "right",
                    "text": "$4,032.54",
                    "size": "medium",
                    "weight": "bolder"
                  }
                ]
              }
            ]
          }
        ]
      }
    }
  ]
}


@torloneg

Is it possible to send a card as a template?

@jesus-seijas-sp (Contributor, Author)

@torloneg Yes, you can include templating inside the card, and it will be replaced with context information.
When you send a card, all the nodes of the card object are visited and passed through a template, so if you put something like "Hello {{ name }}", the "{{ name }}" part will be replaced by the value of context.name.
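
A minimal sketch of that traversal (illustrative only; the actual implementation in the bot package may differ):

// Walks every node of a card and renders {{ name }} templates in string
// values using the context, returning a new card.
function renderCard(node, context) {
  if (typeof node === 'string') {
    return node.replace(/{{\s*(\w+)\s*}}/g, (match, key) =>
      context[key] !== undefined ? String(context[key]) : '');
  }
  if (Array.isArray(node)) {
    return node.map((item) => renderCard(item, context));
  }
  if (node && typeof node === 'object') {
    const result = {};
    for (const [key, value] of Object.entries(node)) {
      result[key] = renderCard(value, context);
    }
    return result;
  }
  return node; // numbers, booleans, null pass through unchanged
}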

@torloneg

Can I generate a card at runtime in the onIntent function?

@jesus-seijas-sp (Contributor, Author)

No, not in onIntent, because onIntent is part of the NLP, not part of the bot orchestration.
But you can do it in a "call", which, as explained earlier in this thread, is used to build actions yourself.

But OK, step by step.

  1. In your code you can build your card dynamically, but here is an example as a constant:
const card = {
  "name": "card04",
  "type": "message",
  "attachments": [
    {
      "contentType": "application/vnd.microsoft.card.adaptive",
      "content": {
        "$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
        "version": "1.0",
        "type": "AdaptiveCard",
        "speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
        "body": [
          {
            "type": "TextBlock",
            "text": "Passengers",
            "weight": "bolder",
            "isSubtle": false
          },
          {
            "type": "TextBlock",
            "text": "Sarah Hum",
            "separator": true
          },
          {
            "type": "TextBlock",
            "text": "Jeremy Goldberg",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "Evan Litvak",
            "spacing": "none"
          },
          {
            "type": "TextBlock",
            "text": "2 Stops",
            "weight": "bolder",
            "spacing": "medium"
          },
          {
            "type": "TextBlock",
            "text": "Fri, October 10 8:30 AM",
            "weight": "bolder",
            "spacing": "none"
          },
          {
            "type": "Input.Text",
            "placeholder": "Placeholder text",
            "id": "inputText"
          },
          {
            "type": "Input.Date",
            "id": "inputDate"
          },
          {
            "type": "Input.Time",
            "id": "inputTime"
          },
          {
            "type": "ActionSet",
            "actions": [
              {
                "type": "Action.Submit",
                "title": "Send!"
              }
            ]
          }
        ]
      }
    }
  ]
}
  2. Create the code for an action. An action receives the session, the context, and the parameters. The session contains the method "sendCard", which receives the card and the context.
const sendCard = (session, context, params) => {
  session.sendCard(card, context);
};
  3. Register the action with the bot under a name. That way you'll be able to call this action from the bot script using 'call':
  bot.registerAction('sendCard', sendCard);
  4. Define a dialog that calls this action:
dialog dialogCard
  call sendCard

OK, now we have the dialogCard dialog, which calls the sendCard action, which sends the card. But how do we call this dialog from an intent?

  1. Create an intent where the answer is '/dialogCard' (see the snippet below).

  2. Run the bot, trigger the intent.
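
The intent for step 1 could look like this in a .dlg corpus (the intent name and utterances are illustrative):

intent card.show
  utterances
  - show me the card
  answers
  - /dialogCard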

@rohit-32

Hello, I am using nlp.js in a React Native project. Is it possible to achieve bot orchestration in React Native? If not, what would be the right way to do it?

FYI: I would prefer to do it in code rather than in script format.

Thank you
