Issue with google analytics API regarding number of rows - google-analytics-api

I am new to Google Analytics. I have a dashboard and want to export data from the Google Analytics dashboard. My problem is that as soon as I increase the time frame in the dashboard, the number of rows decreases. How is that possible? Even if rows are being grouped together, the number of rows should not decrease when the time frame is increased. Can anybody help me with this?

Related

How to convert analytics views into GB

I would like to have an estimate or an idea of how to convert Google Analytics views into GB.
I have been looking everywhere in the Google Analytics portal, but all I see is the number of views,
for example:
244 views a day, ~855 views a week.
Now, I'm trying to calculate a price estimate with Application Insights, but their pricing table is in GB, for example:
$2.76 per GB per Day
100GB per day = $220.67
You can check the estimated cost from Azure Monitor Costs.
Azure Monitor pricing has two phases: first it estimates the cost according to your data, and later it gives you the actual cost after deployment.
Metrics queries and Alerts
Log Analytics
Application Insights
In your case you can check the Azure Application Insights metrics pricing documentation (MS Docs) and Azure Monitor Pricing.
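The conversion asked about above is simple arithmetic once you assume an average payload size per view. A minimal sketch follows; the 2 KB-per-view figure is purely an assumption for illustration (real telemetry payloads vary, so measure your own), and the per-GB price is taken from the table quoted above.

```python
# Back-of-envelope conversion from a view count to ingested gigabytes
# and an estimated cost. BYTES_PER_VIEW is an assumed average payload
# size per view -- measure your own data to refine it.

BYTES_PER_VIEW = 2_000   # assumption: ~2 KB of telemetry per view
PRICE_PER_GB = 2.76      # $ per GB, from the pricing table above

def views_to_gb(views, bytes_per_view=BYTES_PER_VIEW):
    """Estimate ingested gigabytes for a given number of views."""
    return views * bytes_per_view / 1024**3

def estimated_cost(views):
    """Estimated dollar cost for the data those views generate."""
    return views_to_gb(views) * PRICE_PER_GB

daily_views = 244  # the daily figure from the question
print(f"{views_to_gb(daily_views):.6f} GB per day")
print(f"${estimated_cost(daily_views):.4f} per day")
```

At these numbers a few hundred views per day amounts to well under a megabyte, so the cost estimate is dominated by whatever minimum or reservation tier applies, not by the raw view count.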

Google Analytics: chart and table show different number of users

In both the iOS and Android apps, Google Analytics shows quite different numbers of unique users on the chart and in the table:
The problem is the same even for different time intervals. The strange thing is that the chart shows many more users than the table; the opposite case would not be a problem, since the same users can appear in different events (or time intervals, as in this question). But on the chart GA counts more people than the sum of active users based on the events.
Is it a bug? Or does GA count active users using something other than the events?
Here are my report settings:

Out of memory - Java heap space

I have a report which should produce around 18,000 pages after export and has 600K rows of records. It gives an out-of-memory error when run. I have applied virtualization, but it's not working. I also tried increasing the memory size in the Tomcat server, but after increasing it the server does not start.
From my experience, you do not have enough RAM on your server.
Is it absolutely necessary to display the report as a web page? The feedback from our customers is that they never want to browse through so many pages. It can be better to export the data directly into an Excel file, where they have many options for working with it.
One solution could be to put more records on each page, which leads to fewer pages being generated. But in that case, given your RAM, I am not sure it will help.
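On the "server does not start after increasing memory" symptom: a common cause is a malformed JVM option or a heap size the JVM cannot allocate (for example, asking a 32-bit JVM for more than it can address). A typical place to raise Tomcat's heap is a setenv script; the file below is a hedged sketch, and the sizes are examples to adjust to your machine, not recommendations.

```shell
# $CATALINA_HOME/bin/setenv.sh (create it if it does not exist;
# on Windows use setenv.bat with "set CATALINA_OPTS=...").
# -Xmx must fit in available physical RAM; an oversized -Xmx, or a typo
# such as a missing unit suffix, will prevent Tomcat from starting.
export CATALINA_OPTS="$CATALINA_OPTS -Xms512m -Xmx2048m"
```

If the server still fails to start, check catalina.out (or the console on Windows) for the JVM's error message; it will usually say explicitly whether the heap options were rejected.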
How to control the number of rows in a JasperReport

What is the upload limit on soundcloud

I sometimes get the error: { error_message: 'Sorry, you\'ve exceeded your upload limit.' } when I post sound files to soundcloud, using their http api.
I couldn't find any explanation for this 'upload limit' in their documentation.
Does anyone know if it's a daily limit? Or a size limit? Or a combination of both?
Thanks
Sparko is mostly right. The only difference is that you can tell how much remaining time you have by requesting the current user details (GET /me); the response includes a key called upload_seconds_remaining.
Free users get 2 hours, Pro gets 4 hours, and Pro Unlimited is unlimited. Regardless of the plan, individual tracks also cannot be longer than ~6.5 hours (I forget the exact number).
Individual files cannot exceed 500 MB: Uploading Audio Files
However, I'd imagine this relates to your overall limit for uploading audio to SoundCloud, based on the plan attached to the account you're posting to, i.e. exceeding the 2 hours provided by the free plan.
The API doesn't appear to provide a property for the remaining time available to the user, although you could infer this from [user]plan and by looping through all of their tracks and summing each [track]duration (although this is probably not advised).
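The fallback described in that last answer (sum existing track durations against the plan quota) can be sketched as a small pure function. The plan names and quotas come from the answers above; the assumption that track duration is reported in milliseconds, and the dict shape used here, are illustrative and should be verified against the live API. Fetching the track list itself is omitted.

```python
# Sketch of inferring remaining upload time from already-fetched tracks.
# Quotas follow the plans described above; "duration" is assumed to be
# in milliseconds (verify against the actual API response).

PLAN_QUOTA_SECONDS = {
    "Free": 2 * 3600,   # 2 hours
    "Pro": 4 * 3600,    # 4 hours
}

def remaining_upload_seconds(plan, tracks):
    """Return estimated remaining seconds, or None for unlimited plans."""
    quota = PLAN_QUOTA_SECONDS.get(plan)
    if quota is None:   # e.g. "Pro Unlimited"
        return None
    used = sum(t["duration"] for t in tracks) / 1000.0  # ms -> s
    return max(quota - used, 0.0)

tracks = [{"duration": 180_000}, {"duration": 240_000}]  # 3 min + 4 min
print(remaining_upload_seconds("Free", tracks))  # 6780.0
```

Note that, per the first answer, upload_seconds_remaining from GET /me makes this whole calculation unnecessary when it is available.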

how to improve the performance of smartgwt listgrid

I have to show around 30,000 records. I am using DataSource.setData() to set the records. My ListGrid fetches the records from the attached DataSource. But I am facing a performance issue: it takes too much time to show the records, and if I update the records, the browser (both IE and Firefox) hangs.
What is a possible solution to this problem?
These records are on the client side only. I have to perform some operations on the records and then save them.
Any help is greatly appreciated.
There's no such thing as DataSource.setData().
The best way to do this is to implement paging so that you do not load all 30,000 records into the browser. This will improve server performance as well since the server will not have to deliver such a large dataset when most users will only look at a handful of records. To see how to do all this, look at the SmartGWT QuickStart Guide and focus on the Data Binding and Data Integration chapters.
If for some reason you have to load 30,000 records, you had better encourage your users not to use IE. Then, use a client-only DataSource.
As for the "hang when updating", you need to be more specific.
There is no paging component in SmartGWT; you have to implement it yourself.
I had the same problem as yours.
The solution was to simulate paging:
The client doesn't retrieve all 30,000 records; instead it asks for the first 100 records. When the user scrolls to the bottom of the ListGrid (there is an event for scrolling), the client asks the server for the next 100 records, and so on.
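The scroll-driven scheme just described boils down to fetching fixed-size windows on demand. Here is a language-agnostic sketch of that logic in Python; in actual SmartGWT you would instead bind the grid to a DataSource that honors startRow/endRow in its fetch requests, so the class and method names below are purely illustrative.

```python
class PagedFetcher:
    """Fetch records in fixed-size pages, as a scroll handler would."""

    def __init__(self, fetch_range, page_size=100):
        # fetch_range(start, end) returns records [start, end) from the server
        self.fetch_range = fetch_range
        self.page_size = page_size
        self.records = []
        self.exhausted = False

    def on_scrolled_to_bottom(self):
        """Called from the grid's scroll event: load the next page."""
        if self.exhausted:
            return
        start = len(self.records)
        page = self.fetch_range(start, start + self.page_size)
        self.records.extend(page)
        if len(page) < self.page_size:  # short page => no more data
            self.exhausted = True

# Simulated server holding 250 records
server = list(range(250))
pager = PagedFetcher(lambda s, e: server[s:e], page_size=100)
pager.on_scrolled_to_bottom()  # loads records 0-99
pager.on_scrolled_to_bottom()  # loads records 100-199
pager.on_scrolled_to_bottom()  # loads 200-249 and marks exhaustion
print(len(pager.records))      # 250
```

The browser only ever holds the pages the user has actually scrolled through, which is what keeps both the initial render and subsequent updates fast.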