I'm trying to export data from Presence Insights on Bluemix. I followed this documentation:
https://presenceinsights.ng.bluemix.net/pidocs/analytics/export/
However, I can't seem to find the export button mentioned in the document.
Data can be exported from the IBM Presence Insights Dashboard if you have data available. There are also REST APIs for exporting data. They are documented in the Floors, Sites, and Zones sections of the API Reference.
There were REST APIs in the product some time ago, but they were found to have limitations that made them less useful in production. In particular, the amount of data that built up caused the response time of the API to grow beyond what the Bluemix infrastructure allowed, and the API requests would time out. As a result, the APIs were backed out, but it appears the documentation was left behind; that will be removed shortly.
Presence Insights still understands the value of exporting the data, so a new scheme is under investigation. For example, it would be ideal if the data could be exported under the covers to a production data storage facility on a regular schedule.
In the interim, an alternative is to use a Subscription to gather the backend enter/exit/dwell/timeout events directly and roll your own solution that stores only what you need in whatever facility works for your application (a rough sketch follows below).
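As an illustration of the "roll your own" idea, here is a minimal sketch of a webhook-style receiver that filters the event types above and stores them locally. The endpoint path, the payload fields (type, deviceDescriptor, timestamp), and the exact way Presence Insights delivers subscription events are assumptions for the sketch, not the documented API.

```python
# Minimal sketch of a receiver for subscription events (hypothetical payload shape).
# Assumes events arrive as JSON POSTs with "type", "deviceDescriptor" and "timestamp"
# fields; adjust to whatever the Presence Insights subscription actually delivers.
import sqlite3
from flask import Flask, request

app = Flask(__name__)
db = sqlite3.connect("pi_events.db", check_same_thread=False)
db.execute("""CREATE TABLE IF NOT EXISTS events
              (type TEXT, device TEXT, timestamp TEXT, raw TEXT)""")

@app.route("/pi-events", methods=["POST"])
def pi_events():
    event = request.get_json(force=True)
    # Keep only the event types we care about.
    if event.get("type") in ("enter", "exit", "dwell", "timeout"):
        db.execute("INSERT INTO events VALUES (?, ?, ?, ?)",
                   (event.get("type"), event.get("deviceDescriptor"),
                    event.get("timestamp"), request.data.decode("utf-8")))
        db.commit()
    return "", 204

if __name__ == "__main__":
    app.run(port=8080)
```

From there you can export or report on the local table however you like, independent of the service.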
Related
I need to connect Salesforce to an external database we have, and constantly keep both the database and Salesforce updated in as close to real time as we can get. I have tried Google-searching possible solutions, but nearly all of them are more than a year out of date. Any ideas?
Thank You!
It is quite difficult to give you a proper answer without knowing your exact scenario.
However, off the top of my head, I would suggest two Salesforce products.
Salesforce Connect
https://www.salesforce.com/products/platform/products/salesforce-connect/
Salesforce Connect allows you to connect to various data sources, for example MySQL, Microsoft SQL Server, Oracle, etc., and surface the tables/objects of that data source as SObjects (external objects). There are limitations, and thus it would be better to talk to a Certified Architect about such an implementation.
Heroku Connect
https://www.heroku.com/connect
Heroku Connect allows you to sync a Heroku data source with a Salesforce object. The sync is not immediate, but there are quite a few customisations inside the product to make the sync as "live" as possible. There are limitations, and thus it would be better to talk to a Certified Architect about such an implementation.
Salesforce Connect has limitations. It's good for presenting data via the interface, but if you need to act on the data and report on it, it might not be the best bet.
For close-to-real-time hand-coded sync, look at the Streaming API or Salesforce Platform Events; a rough sketch of pushing a Platform Event into Salesforce follows below.
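For example, the external system can publish a custom Platform Event into Salesforce over the standard REST endpoint. The event name (Order_Update__e), its field, and the OAuth token handling below are placeholders for illustration, not a prescribed design.

```python
# Sketch: publish a (hypothetical) custom Platform Event via the Salesforce REST API.
# Assumes an OAuth access token has already been obtained and that an
# Order_Update__e event with a Payload__c field has been defined in the org.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...your_oauth_token"                    # placeholder

def publish_order_update(payload_json: str) -> None:
    resp = requests.post(
        f"{INSTANCE_URL}/services/data/v52.0/sobjects/Order_Update__e/",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json={"Payload__c": payload_json},
    )
    resp.raise_for_status()
    print("Published event:", resp.json())

publish_order_update('{"orderId": 42, "status": "SHIPPED"}')
```

Subscribing to changes on the external side goes the other way and needs a CometD/Bayeux client for the Streaming API; that part is omitted here.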
If you want to use an ETL tool, my organization has had decent luck with DBAmp, which is a SQL Server add-on product and fairly inexpensive compared to a lot of ETL tools ($1,625 annually): http://www.forceamp.com/ We're able to replicate the entire SF database offline in SQL with DBAmp, push changes to the offline SQL copy, and upsert the changes. It's also a good backup solution via an offline full data copy. We got very good support from them as well when we encountered challenges.
Hope this helps.
Not sure if you are syncing one object or multiple objects, but there are a few options you have.
You can try the Salesforce-provided feature Salesforce Connect, which allows you to view and update data from your external source in Salesforce, but there are limitations with reporting and other considerations to keep in mind.
If you make use of Heroku, Heroku Connect is your best bet.
You can also use a middleware/ESB solution like MuleSoft, which can orchestrate keeping data in sync across multiple data sources and do batch loads; but depending on how often changes occur, you will want to keep an eye on API limits for inbound calls to Salesforce.
You can roll your own solution where you use Outbound Messages in workflow (or a trigger that initiates an Apex class that calls out, but that is more cumbersome, and you have to do the custom error handling and retry logic that you get for free with Outbound Messages) to send changes from Salesforce to your homegrown service that writes to your database, and have your homegrown solution write back to Salesforce using the SOAP or REST API (a rough write-back sketch follows below). That would probably take you some time to build, and you would still need to be aware of API limits depending on how many updates are made on the non-Salesforce side.
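A minimal sketch of the write-back half, assuming you maintain an external ID field on the Salesforce object and already have an OAuth access token; the object, the External_Id__c field, and the field names are placeholders. An upsert by external ID keeps retries idempotent.

```python
# Sketch: upsert a record into Salesforce by external ID via the REST API.
# Account, External_Id__c and the field values are placeholders for illustration.
import requests

INSTANCE_URL = "https://yourInstance.my.salesforce.com"   # placeholder
ACCESS_TOKEN = "00D...your_oauth_token"                    # placeholder

def upsert_account(external_id: str, fields: dict) -> None:
    url = (f"{INSTANCE_URL}/services/data/v52.0/sobjects/"
           f"Account/External_Id__c/{external_id}")
    resp = requests.patch(
        url,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}",
                 "Content-Type": "application/json"},
        json=fields,
    )
    resp.raise_for_status()   # 201 = created, 204 = updated

upsert_account("DB-1001", {"Name": "Acme Corp", "Phone": "555-0100"})
```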
You can create a Canvas App which displays data from your DB in Salesforce as a tab and hook it up via SSO so users are auto-logged in. But again there would be no reporting, or any other Salesforce features that you could take advantage of.
But I really think you should spend some time determining which system is your source of truth, because that will determine how the data should be synced. You should also investigate whether you really need the sync to be real-time or near-real-time, or whether you can manage with something like an hourly true-up on the system that is not the source of truth.
I was following the blog below and trying to execute the POC, but with no luck. I did follow all the steps as suggested, but I could not see any report in Google Analytics after saving the content. No user is shown in the report. Please suggest what could be wrong in my implementation.
Reference Link
It is very hard to give a generic answer without looking into the configuration. I just followed the tutorial myself and it all worked fine (to test, I was making curl calls in the terminal window on my laptop and watching the Real-Time / Overview report in Google Analytics).
First and foremost, please check that _system/governance/apimgt/statistics/ga-config.xml has Enabled set to true, and TrackingID set to the UA- tracking code you got from GA.
One thing to check is whether you are looking at the Real-Time report or the historical ones. When you have just implemented the change, look at the Real-Time / Overview report first, as it starts showing data much faster.
Also, since API Cloud has multiple gateway nodes, it takes time for configuration changes to propagate. So one thing to try is to wait 15 minutes or so from the time you applied the configuration changes in the cloud, and then try invoking the API (for example, with a small script like the sketch below) and see if the sessions are reflected in Google Analytics.
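Something like the following loop can generate test traffic while you watch the Real-Time / Overview report. The gateway URL, resource path, and access token are placeholders for whatever your published API actually uses.

```python
# Sketch: invoke a published API a few times to generate sessions that should
# appear in the Google Analytics Real-Time / Overview report.
# The gateway URL, resource path and token below are placeholders.
import time
import requests

GATEWAY_URL = "https://gateway.api.cloud.wso2.com/t/yourtenant/yourapi/1.0.0/status"
ACCESS_TOKEN = "your_oauth_access_token"

for i in range(10):
    resp = requests.get(GATEWAY_URL,
                        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    print(f"call {i}: HTTP {resp.status_code}")
    time.sleep(5)   # spread the calls out a little
```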
Finally, if these do not help, just submit a support ticket in API Cloud - support is included for free with the cloud service.
How do I retrieve a large amount of data from the GitHub REST API? Currently it provides only a small amount of JSON data from the GitHub timeline, in many cases limited to only 300 events. I need a bigger volume to work with in my Master's research, and I need to know how to get it via the REST API.
GitHub's API (and, IMHO, most good APIs) uses pagination to reduce load on both the server and its clients. You could write a simple script to go through all the "pages" of results one at a time, then combine the results locally after the fact; a sketch follows after the link below.
More info here:
http://developer.github.com/guides/traversing-with-pagination/
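A minimal sketch of that approach against the events endpoint (the repository and token are just example placeholders). The requests library parses the Link header for you, so following rel="next" is straightforward.

```python
# Sketch: walk every page of a paginated GitHub API endpoint by following the
# Link: rel="next" header, then combine the results locally.
import requests

url = "https://api.github.com/repos/torvalds/linux/events?per_page=100"
headers = {"Authorization": "token YOUR_GITHUB_TOKEN"}   # placeholder token

events = []
while url:
    resp = requests.get(url, headers=headers)
    resp.raise_for_status()
    events.extend(resp.json())
    # requests exposes the parsed Link header via resp.links
    url = resp.links.get("next", {}).get("url")

print(f"collected {len(events)} events")
```

Note that the events endpoint itself only retains a limited history, so for truly large volumes you would combine this with other endpoints or archives.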
Does anyone have experience with programmatic exports of data in conjunction with BaaS providers like e.g. parse.com or StackMob?
I am aware that both providers (as far as I can tell from the marketing talk) offer a REST API which will allow for queries against the database, not only to be used by mobile clients but also by e.g. custom web apps.
I am also aware that both providers offer a manual export of data (parse.com via their web interface, StackMob via support).
But let's say I would like to dump all data nightly, so that I can import it into a reporting system, for instance. Or maybe simply to have an up-to-date backup.
In this case, I would need a programmatic way to export/replicate the data stored in the backend. Manual exports are not an option for obvious reasons.
The REST APIs offered, however, seem to be designed for specific queries, not for mass reads (performance?). Let alone the pricing - I assume none of the providers would be happy about a nightly X-gigabyte data export via their REST API, so there will probably be a price tag.
I just couldn't find any specific information on this topic so far, so I was wondering if anyone else has already gone through this. Also, any suggestions on StackMob/parse alternatives are welcome, especially if related to the data export topic.
Cheers, Alex
Did you see the section of the Parse REST API on batch operations? Batch operations reduce the number of API calls needed to grab data, so that you are not using a call for every row you retrieve. Keep in mind that there is still a limit (the default is 100, but you can set it to a maximum of 1000), which means you are still limited to pulling down 1,000 rows per API call (see the paging sketch below).
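A minimal sketch of pulling a class down in 1,000-row pages with the classic hosted Parse REST API; the class name, application ID, and REST key are placeholders, and a self-hosted Parse Server would use its own server URL and headers.

```python
# Sketch: page through a Parse class 1,000 rows at a time using limit/skip.
# GameScore, the application ID and the REST key are placeholders.
import requests

HEADERS = {
    "X-Parse-Application-Id": "YOUR_APP_ID",
    "X-Parse-REST-API-Key": "YOUR_REST_API_KEY",
}

rows, skip = [], 0
while True:
    resp = requests.get(
        "https://api.parse.com/1/classes/GameScore",
        headers=HEADERS,
        params={"limit": 1000, "skip": skip, "order": "createdAt"},
    )
    resp.raise_for_status()
    batch = resp.json()["results"]
    if not batch:
        break
    rows.extend(batch)
    skip += len(batch)

print(f"exported {len(rows)} objects")
```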
I can't comment on StackMob because I haven't used it. At my present job, we are using Parse and we wrote a C# app which compares the data in a Parse class with a SQL table and pulls down any changes.
I have been playing around with GWT and the GWT Visualization wrapper API. One thing I learned recently is that the GWT Visualization API does not work without an internet connection (I was working offline the other day, and it took me a good half hour to figure out why my charts were not loading).
After doing a lot of reading online about privacy, data, and GWT, it seems that many people, including me, are concerned about sending data to Google when trying to display graphs. I have already searched through many sources, including Stack Overflow, and I would like to confirm 100% that my assumptions are correct.
People's concern about sending data to Google stemmed from the old way of getting an image of a chart: the data had to be sent to Google, which processed it and returned an image to be embedded in your website. According to my research, that feature has been deprecated from Google Charts (and for good reason). The way it works now, to my understanding, is that every time you want to display a chart, you download the most up-to-date library on the client side and perform all the calculations on the client. This means Google does not actually get any of the information you display on the charts.
Thus, I can continue using the Visualization API as long as I keep using interactive charts and keep checking that the Google Charts documentation page for the particular chart, e.g. the Line Chart:
https://developers.google.com/chart/interactive/docs/gallery/linechart
(see the bottom of the page) still says: "All code and data are processed and rendered in the browser. No data is sent to any server." In that case I do not have to worry about anyone getting my data, because all information is processed client-side.
Please correct any incorrect assumptions that I may have. Thank you.
The charts on this page, https://developers.google.com/chart/interactive/docs/gallery, all include a "Data Policy" section which details whether the chart is rendered on the client and what data will leave the client. Currently, only GeoChart communicates with Google (in order to do the Geocoding); obviously, this could change in the future.
The charts on this other page, https://developers.google.com/chart/interactive/docs/more_charts, include some that were written by Google, and some that were written by third parties. These also include a Data Policy section. For those written by Google, you can rely on this policy. For those written by third parties, Google has not validated the claims and cannot guarantee them.