I am having difficulty getting accurate data from my Analytics account. The numbers in the Management Interface are higher than the ones in Analytics.
Any idea what is wrong and how it can be sorted out?
I am looking for a way to fetch data into my Flutter app that can be adjusted and modified dynamically after deploying the app. For example, I may want to change the images in the carousel depending on promotions, or add new books to the digital library. I need an economical option to host the data in cloud storage and fetch it from there.
I have considered Firebase as well as Google Drive, but have yet to find a good guide. Being a beginner with concerns about security, I would appreciate some expert advice.
*Edit: Having watched many tutorials, I assume there is no better way than linking file URLs from Cloud Storage. To change those dynamically, would it be possible to point the app at the fields of a spreadsheet and obtain the URLs from there? Those fields could then be adjusted without any hard coding, but the question is how to refer to such a sheet from the app.*
I also want to segregate users into paid and free tiers. I have set up authentication with Firebase, but I still don't understand how to put users into groups and restrict what data they can access (I have some idea of the concepts but don't know where to do this). Any guidance, links, and helpful advice will be cordially appreciated.
Based on what you are looking for, I highly recommend Firebase Remote Config, a cloud service that lets you modify your app's behavior and appearance without forcing users to download an update. You define in-app default values that control the functionality and appearance of your app; then, for all app users or for subsets of your user base, you can use the Firebase console or the Remote Config backend APIs to override those defaults.
Your app can control when updates are applied; it can check for updates regularly and apply them with minimal performance impact.
Remote Config comes with a client library that takes care of essential functions like fetching parameter values and caching them while still allowing you to manage when new values are active and how they affect the user experience in your app.
Here is a tutorial that uses Flutter with Firebase Remote Config, which could also help you.
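For the carousel use case, a minimal sketch of the client side might look like the following, assuming the firebase_core and firebase_remote_config packages; the parameter name carousel_image_urls is just an illustrative choice, not something Remote Config defines for you:

```dart
import 'dart:convert';

import 'package:firebase_core/firebase_core.dart';
import 'package:firebase_remote_config/firebase_remote_config.dart';

/// Loads the carousel image URLs from Remote Config, falling back to the
/// in-app defaults until a fetch has succeeded.
Future<List<String>> loadCarouselUrls() async {
  await Firebase.initializeApp();
  final remoteConfig = FirebaseRemoteConfig.instance;

  // In-app defaults, used before the first successful fetch.
  await remoteConfig.setDefaults(const {
    'carousel_image_urls': '[]', // JSON array of image URLs
  });

  // Fetch the latest values from the Remote Config backend and activate them.
  await remoteConfig.fetchAndActivate();

  final raw = remoteConfig.getString('carousel_image_urls');
  return (jsonDecode(raw) as List).cast<String>();
}
```

With something like this in place, you change the JSON array in the Firebase console whenever a promotion starts, and the app picks up the new URLs on its next fetch with no redeploy.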
Visualization tools like Tableau, Looker, and Apache Superset are not meant to be used for multi-tenant products.
For example, a product with thousands of users wants to offer analytics on their data. This needs to be secure, so company A cannot see company B's visualizations. For this to work, these tools need to understand whether a user has the privileges to view the data, which is usually established through cookies after the user has logged in.
To ensure data is only accessed by authorized users, these third-party tools should not be used. Instead, sticking to Ruby on Rails with d3.js, Highcharts, etc. is the better option: access can be managed much more easily through the same authentication you already use to log in, so the data stays secure.
Actually, Looker handles the multi-tenant data situation just fine; it is quite a common use case for Looker.
You can bind attributes to users that will force the right SQL to be written to guarantee that the user only sees appropriate data.
https://docs.looker.com/reference/explore-params/access_filter
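As a rough illustration of what that looks like in a LookML model (the table and attribute names here are made up for the example), an access_filter on an Explore pairs a data field with a per-user attribute, so every query that user runs is automatically restricted to their own tenant's rows:

```lookml
# Hypothetical multi-tenant Explore: every row of `orders` carries the id
# of the company that owns it.
explore: orders {
  access_filter: {
    field: orders.company_id      # column that identifies the tenant
    user_attribute: company_id    # value assigned to each user or group by an admin
  }
}
```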
We've got lots of customers building extranets for their businesses this way.
Disclosure: I work at Looker.
The complexity of multi-tenant deployments goes far beyond the setup of some filter:
Data privacy - you are one typo away from a data privacy breach with the filters. You should use the database security and privacy capabilities to isolate your tenants.
Performance - you need to scale the underlying database to handle the load of concurrent users.
Customization - your tenants might need to load and analyze their own custom data. They need custom reports, etc.
Take a look at gooddata.com and their workspaces.
Disclosure: I work at GoodData.
I was following the blog below and trying to execute the POC, but with no luck. I followed all the steps as suggested; however, I could not see any report in Google Analytics after saving the content. No users are shown in the report. Please suggest what could be wrong in my implementation.
Reference Link
It is very hard to give a generic answer without looking into the configuration. I just followed the tutorial myself and it all worked fine (to test, I was making curl calls in a terminal window on my laptop and watching the Real-Time / Overview report in Google Analytics).
First and foremost, please check that _system/governance/apimgt/statistics/ga-config.xml has Enabled set to true, and TrackingID set to the UA- tracking code you got from GA.
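For reference, that file typically looks something like the snippet below; the tracking ID is a placeholder you would replace with your own:

```xml
<!-- _system/governance/apimgt/statistics/ga-config.xml -->
<GoogleAnalyticsTracking>
    <!-- Turn Google Analytics tracking on -->
    <Enabled>true</Enabled>
    <!-- The web property ID from your Google Analytics account -->
    <TrackingID>UA-XXXXXXXX-X</TrackingID>
</GoogleAnalyticsTracking>
```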
One thing to check is whether you are looking at the Real-Time report or the historical ones. When you have just implemented the change, look at the Real-Time / Overview report first, as it starts showing data much faster.
Also, since API Cloud has multiple gateway nodes, it takes time for the configuration changes to propagate. So one thing to try is to wait 15 minutes or so from the time you applied configuration changes in the cloud, and then try invoking the API and see if the sessions are reflected in Google Analytics.
Finally, if these do not help, just submit a support ticket in API Cloud; support is included for free with the cloud service.
I'm trying to export data from Presence Insights on Bluemix. I followed this documentation:
https://presenceinsights.ng.bluemix.net/pidocs/analytics/export/
However, I can't seem to find the export button mentioned in the document.
Data can be exported from the IBM Presence Insights Dashboard if you have data available. There are also REST APIs for exporting data. They are documented in the Floors, Sites, and Zones sections of the API Reference.
There were REST APIs in the product some time ago, but they were found to have limitations that made them less useful in production. In particular, the amount of data that builds up forced the API's response time to grow beyond what the Bluemix infrastructure allowed, so the API requests would time out. For that reason, the APIs were backed out, but it appears the documentation was left behind; it will be removed shortly.
Presence Insights still understands the value of exporting the data, so a new scheme is under investigation. For example, it would be ideal if the data could be exported under the covers to a production data store on a regular schedule.
In the interim, an alternative solution would be to use a Subscription to gather the backend enter/exit/dwell/timeout events directly and roll your own solution to store only what you need in whatever facility works for your application.
What is a good way to store email analytics and run reports on them? Imagine a pretty high volume of emails, with sent, open, and click stats broken down by email category. This info has to be stored in some DB, and we need to be able to slice the data in different ways to extract valuable business information.
One way to do this would be in-house: build a new database, log and track every action, and later build reports on top of it.
However, this is a lot of work, and I was wondering if there's a cloud service we could use instead. Azure and Amazon offer cloud DB storage; I guess we could use them, but that means a lot of setup work as well. I'm not sure whether there's a third-party email analytics cloud service.
Any recommendations on the best way to tackle this problem?
Indeed, tracking your email links and analyzing the resulting data is a lot of work. I recommend you use a third-party application to take care of it.
Most email-as-a-service providers offer pretty good analytics, including things like opens and clicks broken down by location, time, etc.
It really depends on the provider, but I think you should take a look at some of the best-known ones, such as SendGrid, Postmarkapp, or Mailjet.
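To give an idea of the kind of slicing these providers support, here is a rough Dart sketch that pulls per-category daily stats from SendGrid's v3 Stats API. The endpoint path, query parameters, and response shape are recalled from their documentation rather than taken from this thread, so treat them as assumptions and check the current API reference:

```dart
import 'dart:convert';
import 'dart:io';

import 'package:http/http.dart' as http;

/// Fetches daily open/click stats for a single email category from SendGrid.
/// Assumes a SENDGRID_API_KEY environment variable holds a valid API key.
Future<void> printCategoryStats(String category, String startDate) async {
  final apiKey = Platform.environment['SENDGRID_API_KEY'];

  // GET /v3/categories/stats?categories=<cat>&start_date=YYYY-MM-DD&aggregated_by=day
  final uri = Uri.https('api.sendgrid.com', '/v3/categories/stats', {
    'categories': category,
    'start_date': startDate,
    'aggregated_by': 'day',
  });

  final response = await http.get(uri, headers: {
    'Authorization': 'Bearer $apiKey',
  });

  // The response is a list of per-day entries, each with a date and its metrics.
  final days = jsonDecode(response.body) as List;
  for (final day in days) {
    print('${day['date']}: ${day['stats']}');
  }
}
```

If the built-in dashboards are not flexible enough, most of these providers also offer event webhooks, so you can still land the raw events in your own database and report on them however you like.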