We want to index our client's website and store all the data in the IBM Watson Discovery service. When a user asks a question related to the client's data, the chatbot (we will connect Discovery with Watson Assistant) should query Discovery and fetch the data to respond.
Problem:
The client website has multiple links, and each link has further links. We want to crawl all the data from the website, index it, and store it in the Watson Discovery service. We tried crawling the site, but the Discovery service is taking a very long time and still had not completed the task after a week.
Please let us know how we can achieve this in a better and faster way.
Note that web crawling is currently in beta, and the Watson Discovery documentation for the web crawl states that, depending on the website, it may not ingest all data.
I used the web crawl in Discovery in a scenario similar to yours and query my website using a chat built with Watson Assistant. What you should do:
increase the number of hops: how deep Watson Discovery should crawl your website
depending on your website, add multiple entry points
specify all the paths that you want to exclude. I added those that would create duplicate entries and those that generated summary pages, RSS feeds, etc.
adjust how often it should crawl
check that Watson Discovery can access your website and that your website does not block crawling (a quick way to check this is sketched below)
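On that last point, a quick informal check (outside of Discovery itself) is to fetch the site's robots.txt: a blanket Disallow under User-agent: * will also stop Discovery's crawler. A minimal Node.js sketch, with www.example.com standing in for the client site:

    // Fetch the site's robots.txt and print it, so you can see whether a
    // blanket "Disallow: /" under "User-agent: *" would block crawlers.
    // "www.example.com" is a placeholder for the client website.
    const https = require('https');

    https.get('https://www.example.com/robots.txt', (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => {
        console.log('HTTP status:', res.statusCode);
        console.log(body);
      });
    }).on('error', (err) => console.error('Could not reach the site:', err.message));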
Related
We have a survey which is based on Microsoft Forms, and we are using 3 URLs from our website which all auto-redirect users to the Form. However, I want to check that Google Analytics will pick up user and page view stats so we can monitor the performance of the 3 URLs. Can you confirm if there is anything I need to do within Google Analytics to ensure this? I've seen Tag Manager mentioned. Thanks, Jayne
There are plenty of websites that can check if a site is penalized, including Google's webmaster tools. I need a scripted method to do this as I have to monitor hundreds of domains. Is there a Google REST interface, API, or sample code for this?
I can already connect my conversation to my Facebook page and the bot is working.
My problem is that I couldn't find a single tutorial on how to make calls to a server that is owned by a company and has information about accounts and payments. For example, when I tell the chatbot that I want to see a list of accounts, the chatbot should reply with a list from that API; or I could log in to that server with my account by typing the account's name and password in the chatbot. I just want to know if anyone knows how to do that, whether it is even possible, or knows a tutorial that explains it. I would also prefer a detailed tutorial because I'm new to this field.
As #Felipe Paixao said, you need to develop an orchestrating application.
I've found one example that explains step-by-step how to use Watson Conversation with Facebook Messenger.
Requirements:
Facebook page
IBM Bluemix
A Facebook Developer Account
Source code.
Step - IBM Bluemix:
Create an SDK for Node.js application
Create a Watson Conversation service
Create your Workspace for Watson Conversation and build your Dialog
Configure the JS app and bind it to the Watson service via Connections -> Connect Existing:
In your app created in IBM Bluemix, open the tools, go to the Eclipse IDE, and create a new file, app.js, with this code.
Replace the username and password with your credentials.
You can find your credentials under Service Credentials, or by going to your workspace and accessing them there.
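For reference, here is a rough sketch of what the start of app.js might look like, assuming the watson-developer-cloud Node SDK used at the time; the username, password, and workspace ID are placeholders for your own Service Credentials:

    // Minimal sketch: connect to Watson Conversation using the values from
    // the service's Service Credentials page (placeholders below).
    const ConversationV1 = require('watson-developer-cloud/conversation/v1');

    const conversation = new ConversationV1({
      username: 'YOUR_SERVICE_USERNAME',
      password: 'YOUR_SERVICE_PASSWORD',
      version_date: '2017-05-26'
    });

    // Send a user utterance to the workspace (Dialog) you built in the tooling.
    conversation.message({
      workspace_id: 'YOUR_WORKSPACE_ID',
      input: { text: 'Hello' },
      context: {}   // pass the previous turn's context back on each call
    }, (err, response) => {
      if (err) { return console.error(err); }
      console.log(response.output.text.join('\n'));
    });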
Step - Facebook Developers:
Sign in to Facebook Developers with your Facebook account
In the Dashboard, select Create a New App and select Messenger
Enter the URL of your Bluemix (now IBM Cloud) app and select the marked webhook options: messages, message_deliveries, messaging_optins, and messaging_postbacks.
Now return to your app, configure the app.js file with the newly autogenerated Facebook token, and deploy it.
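If you end up wiring the webhook yourself instead of using the linked source code as-is, the Messenger Platform handshake looks roughly like this (a sketch with Express; VERIFY_TOKEN is any value you choose and also enter in the Facebook app's webhook settings):

    // Sketch of the Messenger webhook endpoints in Express.
    const express = require('express');
    const app = express();
    app.use(express.json());

    const VERIFY_TOKEN = 'my-verify-token';   // same value entered on Facebook

    // Facebook calls this once to verify the webhook URL you entered.
    app.get('/webhook', (req, res) => {
      if (req.query['hub.mode'] === 'subscribe' &&
          req.query['hub.verify_token'] === VERIFY_TOKEN) {
        return res.send(req.query['hub.challenge']);
      }
      res.sendStatus(403);
    });

    // Incoming messages from your Facebook page arrive here.
    app.post('/webhook', (req, res) => {
      (req.body.entry || []).forEach((entry) => {
        (entry.messaging || []).forEach((event) => {
          if (event.message && event.message.text) {
            console.log('User said:', event.message.text);
            // forward event.message.text to conversation.message(...) here
          }
        });
      });
      res.sendStatus(200);
    });

    app.listen(process.env.PORT || 3000);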
Step - Final:
Test your chatbot: if everything was done correctly, your chatbot will be available on your Facebook page.
Note: to build your example, you can add the code that calls your API to app.js using the http module. Use the Advanced JSON editor in Watson Conversation to create an "action": "callApi" in the response, and have your code recognize it and do something, like this example.
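As an illustration of that idea (not the exact code from the linked example), the orchestrator can watch for the action set in the Advanced JSON and call your API with Node's https module; the backend host and path here are made up:

    // Sketch: react to an "action": "callApi" set by the Dialog through the
    // Advanced JSON editor. "response" is the object returned by
    // conversation.message(); the backend URL below is hypothetical.
    const https = require('https');

    function handleConversationResponse(response, sendToUser) {
      const action = (response.output && response.output.action) ||
                     (response.context && response.context.action);

      if (action === 'callApi') {
        https.get('https://api.your-company.example/accounts', (res) => {
          let body = '';
          res.on('data', (chunk) => (body += chunk));
          res.on('end', () => sendToUser('Your accounts: ' + body));
        }).on('error', () => sendToUser('Sorry, the account service is unavailable.'));
      } else {
        sendToUser(response.output.text.join('\n'));
      }
    }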
See the official reference that I used to build this: the Facebook Messenger with Node.js SDK (back-end) source code, based on Priscila Parodi's code.
You need to develop an orchestrating application that is responsible for connecting IBM Watson Conversation with your desired front end, Facebook in this case, and with the back-end systems. I imagine you are using the automatic deployment to Facebook from the Conversation tooling; that option will not address your need to access a back end.
The image posted by #data_henrik shows exactly that.
In Conversation, you can create context variables that are used as triggers in the orchestrator to let it know when it needs to call a back-end service to provide some information, and then write the response back into the Conversation context.
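A small sketch of that pattern; the context variable names here are illustrative, not from the original answer:

    // Sketch of the orchestrator's trigger check and context write-back.
    function needsBackendCall(conversationResponse) {
      // The Dialog sets e.g. context.callBackend = "getAccounts" when it needs data.
      return Boolean(conversationResponse.context && conversationResponse.context.callBackend);
    }

    function writeBackendResult(conversationResponse, result) {
      // Write the back-end response into the context so the next Dialog turn
      // can use it (for example as $backendResult in a response condition or text).
      conversationResponse.context.backendResult = result;
      conversationResponse.context.callBackend = null;   // clear the trigger
      return conversationResponse.context;
    }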
Regarding using the chatbot to log in to a system by passing a password: this is not a safe option, since everything the user types goes into the Watson Conversation logs, and anyone with access to the tooling/API could see the passwords.
Good day,
Background: We have a single web application that multiple external websites link to; users visit www.aaa.com or www.bbb.com and can then click through to our web site at www.example.com.
When we send email comms, the users are directed to their respective client URL.
We require a method of tracking these users from email comms using Google Analytics so that we can see their activity in the Campaigns section.
Issue: The problem is that while we have Google Analytics enabled on www.example.com, we are not able to install Analytics on the client URLs. This means that if we append the UTM tags to the URL in emails, these are stripped out when a user navigates around a client URL before visiting ours, so the visits do not appear in the 'Campaigns' tab of GA. That is:
trackable --> www.example.com?utm_source=offeremail&utm_campaign=testcampaign&utm_medium=email
not trackable --> www.aaa.com?utm_source=offeremail&utm_campaign=testcampaign&utm_medium=email
Question: Are we able to start the tracking once a user clicks a link in an email but then accesses our site from another site and then show the results in the campaign tab?
Thank you!
Short answer: no. You need to pass the UTM parameters in the URL for Google Analytics to recognize the visit as campaign traffic. This is the usual issue with redirects between websites: the UTM tagging is lost, and with it the campaign information.
What I would suggest is that you look at referrals, or have your clients add tracking to the links back to your website, and then correlate that with the dates of your send-outs. It is a bit of a middle ground, but it should do the trick to at least see trending success.
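One way clients could add that tracking without installing Analytics themselves is to forward any utm_* parameters from their landing URL onto the links that point back to your site, so the tagging survives the hop. A browser-side sketch, assuming the client is willing to add a small script (the www.example.com selector is a placeholder):

    // Forward utm_* parameters from the current page's URL onto links that
    // point to www.example.com, so campaign tagging is not lost on the way.
    document.addEventListener('DOMContentLoaded', () => {
      const params = new URLSearchParams(window.location.search);
      const utm = new URLSearchParams();
      for (const [key, value] of params) {
        if (key.startsWith('utm_')) utm.append(key, value);
      }
      if (![...utm].length) return;   // nothing to forward

      document.querySelectorAll('a[href*="www.example.com"]').forEach((a) => {
        const url = new URL(a.href);
        utm.forEach((value, key) => url.searchParams.set(key, value));
        a.href = url.toString();
      });
    });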
I've got a client who has several different websites, each with its own Google Analytics account.
Our marketing department is also running Google Adwords/Facebook/Instagram ads for each of those websites.
I want data/leads from Google and Facebook on my local server. I also want real-time data that automatically updates from Google & Facebook at a fixed time interval.
I've been Googling and experimenting with the Analytics API docs, but no luck.
Can anyone please help me find a better solution?
This question may not be for this website, but I didn't find any other option.
Thank you.
There are several services that cover what you are looking for. Use this info as a getting-started guide; every service has its own simplicities and complexities.
Adwords
Use Campaign Performance Report to download your campaign's metrics. See examples in PHP, Java, Python, etc.
Analytics
Use the Real Time Reporting API to get real-time data. See examples
Use the Core Reporting API to get general data. See examples (a sketch follows after this list)
There is a very useful tool, the Query Explorer, to see the Analytics API in action
Facebook / Instagram
Use the Ads Insights API to download your campaigns' metrics. See examples
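For the Analytics part, a rough sketch of what polling those two APIs from your own server could look like, assuming the googleapis Node client and a service account with read access to the view; the view ID, e-mail, and key are placeholders:

    // Sketch: pull Core Reporting and Real Time data with the googleapis
    // Node client and a service account. All IDs/credentials are placeholders.
    const { google } = require('googleapis');

    const auth = new google.auth.JWT(
      'service-account@your-project.iam.gserviceaccount.com',   // client email
      null,
      'YOUR_PRIVATE_KEY',                                       // from the JSON key file
      ['https://www.googleapis.com/auth/analytics.readonly']
    );

    const analytics = google.analytics({ version: 'v3', auth });

    // Core Reporting API: sessions and users for the last 7 days.
    analytics.data.ga.get({
      ids: 'ga:12345678',            // your Analytics view ID
      'start-date': '7daysAgo',
      'end-date': 'today',
      metrics: 'ga:sessions,ga:users'
    }).then((res) => console.log(res.data.rows));

    // Real Time Reporting API: active users right now.
    analytics.data.realtime.get({
      ids: 'ga:12345678',
      metrics: 'rt:activeUsers'
    }).then((res) => console.log(res.data.totalsForAllResults));

Wrapping calls like these in a setInterval or a cron job is one way to refresh the data on your local server at a fixed interval.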