Discussion: Bot orchestration #715
Comments
If I understand correctly, this is an analogue of stories in Rasa, and the name here is more logical. Rasa has interactive training that creates a story out of a dialogue, which can be used for tests too; it is very convenient. I like the idea of a single syntax (JSON) better, but for some reason in nlp.js everything is divided into JSON, md, and a kind of pipeline, which is not very convenient. |
Hi, I've been working on something similar.
I've implemented something like RASA core with rules, stories and forms
based on your neural network library so it can run in a browser.
https://github.com/syntithenai/voicedialogjs
I'm using it in https://opennludata.org/ which is a web UI for annotating
intent data and importing/exporting Jovo/RASA/Mycroft
intents/responses/entities.
When I saw your neural network library I figured I could use that for the
stories part of Rasa-style core routing, and I've added features to the UI
to manage stories, forms, rules, actions and APIs (written in JavaScript to
run in a browser) that can be exported as a single JSON file containing
everything needed to run the bot in the browser.
Published skills are available as a standalone chat application served from
GitHub, e.g.:
https://opennludata.org/static/skills/Syntithenai-music%20player.html
All heavily based on your work with NLP.js, so THANKS A MILLION.
Still a work in progress and a bit of a source-code dog's breakfast, but I saw
your post and it seemed timely to reply.
-------------------------------------------------
I think the combination of forms (slot filling) with machine-learning-based
stories is a winner in building a bot framework.
I wonder what other conversational structures might have a place; the
idea of goals and goal completion would be a good addition.
Food for thought.
Again, thanks.
Steve
|
From my point of view, one of the most interesting purposes for the orchestration would be something like the following, having in mind these assumptions: a getWeatherCondition action, a getWeatherCard action, and the dialog files plot1.dlg and weather.dlg.
|
Hello, a little update on the latest version, 4.16, and what it includes. You have a complete example here: https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot

Comments
Comment lines are supported in .dlg files.

import
Import another .dlg file.

language
Set the current language; the commands that follow apply to it.

intents
You can add new intents using dlg. They will be added to the language of the last language command.

entity
Define entities. For example, you can create two entities, hero and email, where email is a regex entity and hero is an enum entity with three options: spiderman, ironman and thor. Spiderman is identified when the text "spiderman" or "spider-man" is found.

dialog
Define a named dialog: a sequence of commands to execute.
say
Send a message to the user.

ask
Ask the user for input; for example, ask name will store the input from the user in the name variable.

run
Execute another dialog.

nlp
https://github.com/jesus-seijas-sp/nlpjs-examples/blob/master/04.bot/corpus.dlg#L502
This section maps intents to dialogs: when someone says 'Hello' to the bot, the dialog '/hellodialog' is executed.

inc
Increment a variable.

dec
Decrement a variable.

set
Set a variable.

conditions
A command can be guarded by a condition: [!user_name] run greet means that the dialog greet will be executed only if the variable user_name does not exist (see the sketch below).

String templating
Context variables can be interpolated into responses.

call
Call a JavaScript action registered in the bot by name.
That means that the bot will try to find the function uppers and call it with the parameter "user_name":

const uppers = (session, context, params) => {
  if (params) {
    // params is either a single string or an array of parameters
    const variableName = typeof params === 'string' ? params : params[0];
    if (variableName) {
      // overwrite the context variable with its uppercase version
      context[variableName] = (context[variableName] || '').toUpperCase();
    }
  }
};
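Putting these commands together, the dialog behind the scenario01.dlt transcript from the original post might look roughly like the following. This is a sketch reconstructed from the descriptions above, not verified syntax; in particular, the {{...}} templating form and the exact shape of each line are assumptions, so check the linked corpus.dlg for the real format:

dialog /hellodialog
  say What's your name?
  ask name
  say (Converting name to uppercases...)
  call uppers name
  say Hello {{name}}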
The way to register an action is calling bot.registerAction with the name of the action and the function to be executed.
The signature of each action is function action(session, context, params), so the action will receive the session object, the context object and the parameters.
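For example, registering the uppers function above under the name that the call command looks up:

bot.registerAction('uppers', uppers);
|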
New features added: support for sending cards.
Example, with this card02.json content:
{
"name": "card02",
"type": "message",
"attachments": [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.0",
"type": "AdaptiveCard",
"speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
"body": [
{
"type": "TextBlock",
"text": "Passengers",
"weight": "bolder",
"isSubtle": false
},
{
"type": "TextBlock",
"text": "Sarah Hum",
"separator": true
},
{
"type": "TextBlock",
"text": "Jeremy Goldberg",
"spacing": "none"
},
{
"type": "TextBlock",
"text": "Evan Litvak",
"spacing": "none"
},
{
"type": "TextBlock",
"text": "2 Stops",
"weight": "bolder",
"spacing": "medium"
},
{
"type": "TextBlock",
"text": "Fri, October 10 8:30 AM",
"weight": "bolder",
"spacing": "none"
},
{
"type": "ColumnSet",
"separator": true,
"columns": [
{
"type": "Column",
"width": 1,
"items": [
{
"type": "TextBlock",
"text": "San Francisco",
"isSubtle": true
},
{
"type": "TextBlock",
"size": "extraLarge",
"color": "accent",
"text": "SFO",
"spacing": "none"
}
]
},
{
"type": "Column",
"width": "auto",
"items": [
{
"type": "TextBlock",
"text": " "
},
{
"type": "Image",
"url": "http://adaptivecards.io/content/airplane.png",
"size": "small",
"spacing": "none"
}
]
},
{
"type": "Column",
"width": 1,
"items": [
{
"type": "TextBlock",
"horizontalAlignment": "right",
"text": "Amsterdam",
"isSubtle": true
},
{
"type": "TextBlock",
"horizontalAlignment": "right",
"size": "extraLarge",
"color": "accent",
"text": "AMS",
"spacing": "none"
}
]
}
]
},
{
"type": "TextBlock",
"text": "Non-Stop",
"weight": "bolder",
"spacing": "medium"
},
{
"type": "TextBlock",
"text": "Fri, October 18 9:50 PM",
"weight": "bolder",
"spacing": "none"
},
{
"type": "ColumnSet",
"separator": true,
"columns": [
{
"type": "Column",
"width": 1,
"items": [
{
"type": "TextBlock",
"text": "Amsterdam",
"isSubtle": true
},
{
"type": "TextBlock",
"size": "extraLarge",
"color": "accent",
"text": "AMS",
"spacing": "none"
}
]
},
{
"type": "Column",
"width": "auto",
"items": [
{
"type": "TextBlock",
"text": " "
},
{
"type": "Image",
"url": "http://adaptivecards.io/content/airplane.png",
"size": "small",
"spacing": "none"
}
]
},
{
"type": "Column",
"width": 1,
"items": [
{
"type": "TextBlock",
"horizontalAlignment": "right",
"text": "San Francisco",
"isSubtle": true
},
{
"type": "TextBlock",
"horizontalAlignment": "right",
"size": "extraLarge",
"color": "accent",
"text": "SFO",
"spacing": "none"
}
]
}
]
},
{
"type": "ColumnSet",
"spacing": "medium",
"columns": [
{
"type": "Column",
"width": "1",
"items": [
{
"type": "TextBlock",
"text": "Total",
"size": "medium",
"isSubtle": true
}
]
},
{
"type": "Column",
"width": 1,
"items": [
{
"type": "TextBlock",
"horizontalAlignment": "right",
"text": "$4,032.54",
"size": "medium",
"weight": "bolder"
}
]
}
]
}
]
}
}
]
} |
Is it possible to send a card as a template? |
@torloneg Yes, you can include templating inside the card that will be replaced with context information.
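For example, assuming the same {{variable}} templating that dialogs use (an assumption; the card-specific template syntax isn't shown in this thread), a TextBlock could reference a context variable:

{
  "type": "TextBlock",
  "text": "Hello {{user_name}}, your flight is confirmed"
}
|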
Can I generate a card at runtime in the onIntent function? |
No, not in onIntent, because onIntent is part of the NLP, not of the bot orchestration. But OK, step by step: first, define the card and an action that sends it.
const card = {
"name": "card04",
"type": "message",
"attachments": [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"content": {
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.0",
"type": "AdaptiveCard",
"speak": "Your flight is confirmed for you and 3 other passengers from San Francisco to Amsterdam on Friday, October 10 8:30 AM",
"body": [
{
"type": "TextBlock",
"text": "Passengers",
"weight": "bolder",
"isSubtle": false
},
{
"type": "TextBlock",
"text": "Sarah Hum",
"separator": true
},
{
"type": "TextBlock",
"text": "Jeremy Goldberg",
"spacing": "none"
},
{
"type": "TextBlock",
"text": "Evan Litvak",
"spacing": "none"
},
{
"type": "TextBlock",
"text": "2 Stops",
"weight": "bolder",
"spacing": "medium"
},
{
"type": "TextBlock",
"text": "Fri, October 10 8:30 AM",
"weight": "bolder",
"spacing": "none"
},
{
"type": "Input.Text",
"placeholder": "Placeholder text",
"id": "inputText"
},
{
"type": "Input.Date",
"id": "inputDate"
},
{
"type": "Input.Time",
"id": "inputTime"
},
{
"type": "ActionSet",
"actions": [
{
"type": "Action.Submit",
"title": "Send!"
}
]
}
]
}
}
]
};

// Action that sends the card; the context is passed so templating can be applied
const sendCard = (session, context, params) => {
  session.sendCard(card, context);
};

bot.registerAction('sendCard', sendCard);
OK, now we have the dialog card, which will call the action sendCard, which sends the card. But how do we call this dialog from an intent?
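The rest of this answer isn't preserved in the thread, but the mechanism was already described in the 4.16 update above: the nlp section of the .dlg corpus maps an intent to a dialog, the same way saying 'Hello' executes /hellodialog. A purely hypothetical sketch of such a mapping (the real syntax is at the corpus.dlg link above):

nlp
  # hypothetical: when this utterance matches, run the dialog that sends the card
  show me the flight -> /card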
|
Hello, I am using nlp.js in a React Native project. Is it possible to achieve bot orchestration in React Native? If not, what would be the right way to do it? Thank you |
Hello!
For version 5 we are working on chatbot orchestration.
You can see an explanation and an example in this comment: #713 (comment)
You can see there the commands that are already implemented, but we want to start a dialog to get feedback about which commands should be implemented.
Also, we developed a connector for building bots using CDD (Conversation Driven Development); you can see an example here:
https://github.com/axa-group/nlp.js/blob/master/packages/bot/test/bot.test.js#L54
This unit test is able to run the bot and test it using a scenario described in the file scenario01.dlt, with this content:
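user> hello
bot> What's your name?
user> John
bot> (Converting name to uppercases...)
bot> Hello JOHN
bot> This is the help
bot> This is the second help
bot> Bye user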
So, feel free to comment on the features you would like related to conversation orchestration.
Thank you!