Live Helper Chat support forum. The forum is locked; new questions go to GitHub Discussions.
Good evening. I searched the forum but couldn't find a way to integrate Dialogflow. Can anyone tell me how to do this integration?
Offline
Hi,
At the moment I don't have a sample for that. I will prepare one over the next week.
You could take a look at https://github.com/LiveHelperChat/lhcparcel
Offline
Thank you very much. I have a Node.js server that is integrated with Dialogflow, and I want to connect it to LHC. Your example will be very helpful; I look forward to seeing it. Thank you very much.
Offline
Here is a tutorial which can be used with any external bot.
https://doc.livehelperchat.com/docs/bot … any-ai-bot
Offline
Thank you very much. I will try to understand your code. But could you tell me whether it is possible to return images and cards from the AI to LHC?
Offline
I'm not sure I understood. If you read the description you will see that answers can come either from LHC or from the AI itself. It all depends on how you set everything up.
Offline
Hi,
I'm also interested in how to use Dialogflow with LHC.
Can you make a video showing how to implement this integration?
Thanks
Offline
Hi,
Sorry, but this implementation requires coding. The extension I prepared is a perfect bootstrap for that purpose, but the rest is up to you.
Offline
Hi,
Do you have any docs about which structures (objects, functions) I can use to send the bot's answers to the user?
If not, can you tell me the names of some of them?
Thanks
Offline
If you look at the example, e.g.
https://github.com/LiveHelperChat/lhcai … ap.php#L51, here I just execute a trigger and pass whatever the bot can return as a "text message".
If you want to send a response directly from the bot, not within the extension, you can use https://doc.livehelperchat.com/docs/dev … /rest-api/ and the https://api.livehelperchat.com/#/chat/p … ddmsgadmin method.
As for the structures, you can see how they look in the bot itself: there is a "Show code" button under all responses.
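To make the "send a response directly via the REST API" route concrete, here is a minimal sketch of how such a request could be assembled. The endpoint path `/restapi/addmsgadmin` and the payload field names (`chat_id`, `msg`) are assumptions for illustration only; check the API explorer at api.livehelperchat.com for the exact path, fields, and authentication your LHC version expects.

```python
import json
from urllib import request

# Placeholder base URL of your LHC installation (an assumption).
LHC_BASE = "https://example.com/lhc"

def build_addmsgadmin_request(chat_id: int, text: str, token: str) -> request.Request:
    """Build (but do not send) a POST request that would push a bot
    message into an existing chat. Field names are illustrative;
    verify them against the real addmsgadmin method documentation."""
    payload = {"chat_id": chat_id, "msg": text}
    return request.Request(
        f"{LHC_BASE}/restapi/addmsgadmin",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",  # tell LHC the body is JSON
            "Authorization": f"Basic {token}",   # LHC REST API uses basic auth
        },
        method="POST",
    )

req = build_addmsgadmin_request(123, "Hello from the bot", "dXNlcjpwYXNz")
print(req.full_url)
print(json.loads(req.data))
```

Sending it would then be a single `urllib.request.urlopen(req)` call from wherever your bot logic runs.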
Offline
Hi. This option does not appear on my system.
Offline
Hi,
You need to update your LHC version
Offline
Hello. I tried overwriting the old files with the new version and it didn't work. I redid the installation, and now the option has appeared. What is the best way to update the system?
Offline
Good morning. I have almost everything working, but I'm having trouble sending raw JSON. I'm using the Body section with "Request body" set to Raw and JSON. For testing I'm sending: {"texto":"Minha impressora esta quebrada"}
But my application reports that the request is unknown. Am I doing something wrong, or could there be a bug in sending JSON?
The same test in Postman succeeds.
Last edited by edersonbologna (2020-05-05 15:46:41)
Offline
Hi,
Try adding a header indicating that you are sending a JSON body. See the Headers section in the REST API settings.
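The reason the header matters: most web frameworks (including Dialogflow fulfillment backends and Express's JSON body parser) only parse the body as JSON when the request declares `Content-Type: application/json`; Postman adds it automatically when you pick the JSON body type, which is why the same test succeeded there. A minimal sketch of preparing such a request body:

```python
import json

def prepare_json_post(payload: dict):
    """Serialize a payload and attach the header that tells the
    receiving server the body is JSON rather than form data."""
    body = json.dumps(payload).encode("utf-8")
    headers = {"Content-Type": "application/json"}  # the missing piece
    return body, headers

body, headers = prepare_json_post({"texto": "Minha impressora esta quebrada"})
print(headers["Content-Type"])
```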
Offline
OK, thank you. With the header added, my application no longer reports an error. Now I have a problem handling the answer. In Postman the JSON answer is displayed completely, and Dialogflow sends a large JSON response with many levels. How do I get at the inner levels of the JSON response?
Offline
If you saw the YouTube video where I use the REST API for IP location, you can do the same. See the examples; that way you can get to any attribute, no matter how deep it is.
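The idea behind LHC's "response location" fields is a dot-separated path into the nested JSON. A sketch of that mechanism, using an illustrative Dialogflow-style response (the field names follow Dialogflow's detectIntent output, but verify them against your own response body):

```python
import json

def dig(obj, path: str):
    """Follow a dot-separated path like 'a.b.0.c' into nested
    dicts/lists, the way a response-location expression would."""
    for key in path.split("."):
        if isinstance(obj, list):
            obj = obj[int(key)]   # numeric segment indexes a list
        else:
            obj = obj[key]        # string segment indexes a dict
    return obj

# Trimmed, illustrative response; not the full Dialogflow payload.
response = json.loads("""
{
  "queryResult": {
    "fulfillmentText": "Ok, I will forward this to support.",
    "intent": {"displayName": "printer.broken"}
  }
}
""")

print(dig(response, "queryResult.fulfillmentText"))
```

So a path such as `queryResult.fulfillmentText` reaches the text reply no matter how deeply it is nested.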
Offline
Hello, good day. Wonderful system, and your tips are very accurate. Everything is taking shape. Thank you.
Another question: is there a way for LHC to take a pre-formatted answer (in JSON, I think) and automatically assemble a card or add the quick-reply buttons?
Offline
1. What do you mean by a preformatted answer?
2. In the REST API there is a dedicated field called "Meta msg location. If you support Live Helper Chat json syntax you can set location of this response." You just tell it where the location is, and your REST API can reply with meta messages.
https://doc.livehelperchat.com/docs/bot … g-location
The easiest way to learn the API: just create a trigger, execute a chat, and see in the database how the meta_msg field looks, so you can just copy and paste.
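The "copy from the database" advice can be sketched in code: the quick-replies meta message quoted later in this thread has the shape below. Treat the exact keys as something to verify against the meta_msg field of your own installation; this only assembles the structure seen in the thread.

```python
import json

def quick_replies_meta(buttons):
    """Assemble a meta message in the quick-replies shape shown in
    this thread (a list of (label, payload) button pairs). Verify
    the keys against a stored meta_msg field in your own database."""
    return {
        "content": {
            "quick_replies": [
                {
                    "type": "button",
                    "content": {"name": name, "payload": payload},
                }
                for name, payload in buttons
            ]
        }
    }

meta = quick_replies_meta([("Rest API Button", "rest_api_button")])
print(json.dumps(meta))
```

Your REST endpoint would place this object at the location configured in "Meta msg location", and LHC renders the buttons alongside the text reply.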
Offline
I did not understand the use of Meta msg location. Do you have any videos showing how to use it?
Offline
https://doc.livehelperchat.com/docs/bot … g-location
"Output parsing" has a field "Meta msg location. If you support Live Helper Chat json syntax you can set location of this response."
Offline
Yes, I found this option, but I didn't understand how it will appear in the chat. When I use the Response Location 1 field, I put {content_1} in the bot. But when using the meta_msg field, what do I put in the bot to get the answer?
Offline
You don't need to put anything. Just leave the bot response as is; it will write the message as it does now, and if it sees a meta message it will render that as well.
Offline
You don't need to put anything. Just leave the bot response as is; it will write the message as it does now, and if it sees a meta message it will render that as well.
OK. And the meta_msg will only appear if it is formatted according to the format recognized by LHC?
Is this format correct: ('{"content":{"quick_replies":[{"type":"button","content":{"name":"Rest API Button","payload":"rest_api_button"}}]}}',true)?
Last edited by edersonbologna (2020-05-06 15:05:49)
Offline