Real time (true real time) Google Analytics (GA4) - not the last 30 minutes [duplicate] - google-analytics-api

I've recently added a Google Analytics GA4 tag to a website for the purpose of counting the currently active users. Before, I could see the number of real-time users right now (I think it had a one-minute delay, but I'm not sure).
But in the new GA4, I can only see the number for the last 30 minutes, which is not what I need.
I've looked around and found an option to add a time dimension of 1 minute, but it was for the old Google Analytics, and it didn't seem right to me.
I'm not sure whether I need to provide my code for this question, but if it's a must, I'll add it.
Edit:
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import Metric, RunRealtimeReportRequest

# Run a realtime report to get the desired metrics.
def run_realtime_report(property_id):
    # Runs a realtime report on a Google Analytics 4 property.
    client = BetaAnalyticsDataClient()

    # Build and run the request.
    request = RunRealtimeReportRequest(
        property=f"properties/{property_id}",
        metrics=[Metric(name="activeUsers")],
    )
    response = client.run_realtime_report(request)

    # Parse the response: with no dimensions, the report has at most one
    # row whose single metric value is the activeUsers count.
    activeUsers = int(response.rows[0].metric_values[0].value) if response.rows else 0
    return activeUsers

# Run the realtime report function.
def run_sample():
    global property_id
    return run_realtime_report(property_id)

The number of users in the last 30 minutes in GA4 is similar to the "active users on site right now" metric in Universal Analytics (UA). However, after about 5 minutes of inactivity, UA decides users are no longer active on the site: a UA realtime report generated with pageviews in the last 5 minutes shows those users as active, and after 5 minutes of inactivity the active-users-on-site count drops to zero.
In GA4, you can recreate that calculation by specifying minuteRanges in your realtime report request (the Data API documentation describes the MinuteRange parameter). As a JSON request, this report counts only users who were active in the last 5 minutes:
{
  "metrics": [
    {
      "name": "activeUsers"
    }
  ],
  "minuteRanges": [
    {
      "startMinutesAgo": 5,
      "endMinutesAgo": 0
    }
  ]
}
This request differs from the GA4 realtime reporting UI, which highlights "users in last 30 minutes" as the primary realtime metric.
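If you are making the call from the Python client shown in the question, the same 5-minute window can be expressed with the MinuteRange type. A minimal sketch, assuming the google-analytics-data package and a report with no dimensions (active_users_last_5_minutes is a hypothetical helper name; only minute_ranges differs from the request in the question):

from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    Metric,
    MinuteRange,
    RunRealtimeReportRequest,
)

def active_users_last_5_minutes(property_id):
    # Restrict the realtime report to users active in the last 5 minutes.
    client = BetaAnalyticsDataClient()
    request = RunRealtimeReportRequest(
        property=f"properties/{property_id}",
        metrics=[Metric(name="activeUsers")],
        minute_ranges=[MinuteRange(start_minutes_ago=5, end_minutes_ago=0)],
    )
    response = client.run_realtime_report(request)
    return int(response.rows[0].metric_values[0].value) if response.rows else 0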

Related

Algolia Insight Library - timestamp must not be in the future

Under the Insights API logs, some events fail to log with a 422 error code, and the details say "timestamp must not be in the future". The Insights library is configured as below in the Android application:
Insights.Configuration configuration =
new Insights.Configuration(5000, 5000, userToken);
Insights insights = Insights.register(GoldenScentApp.getAppContext(), AppConstants.ALGOLIA_APP_ID,
AppConstants.ALGOLIA_KEY, indexName, configuration);
insights.setMinBatchSize(1);
The event was logged at GMT: Monday, 31 May 2021 06:03:07.048, but the time in Algolia shows "Received at: 2021-05-31T06:01:13.859Z".
This problem is in the Algolia SDK. It sends old events whenever a new event is sent: each event (e.g. a user click) goes into a local database, and the batch is only sent once the 10-event limit is reached. So if a user sends 9 events in one session and then opens the app 4 days later, the next logged event makes Algolia send the old events too, but their timestamps are by then stale, so Algolia logs them as errors.
Workaround: set the event flushDelay to 5 seconds instead of 30 seconds (which, oddly, the documentation lists as 30 minutes).

Using the POST method to run an export takes too much time

I developed a Facebook application and I'm doing some offline analytics on the data from the dashboard.
I am using the Export API described here: https://developers.facebook.com/docs/analytics/export
According to your documentation, it should take about 1-2 hours, but I ran it every day in the past week and each export took about 2 days to download.
My application doesn't have a big amount of data; each export is small.
Can you please help me understand what I am doing wrong, or is there a problem with the export service?
I have a Facebook page where I sell sport clothes.
Each day I would like to see the following parameters from the traffic:
- Timezone of the device
- Device platform - iOS, Android, PC
- Event_log_time - timestamp at which the event is logged
Every day I make a POST request to https://graph.facebook.com/v2.6/[Application FBID]/analytics_app_events_exports with the following data:
access_token = {my access token}
start_ts = timestamp, beginning of the day
end_ts = timestamp, end of the day
Then, after 5 hours, I make a GET request to https://graph.facebook.com/v2.6/[Export FBID]?access_token={app access token}.
The response I get is:
"id": "##############",
"start_ts":"***",
"end_ts": "***",
"status": "SCHEDULED",
"column_names":[***],
"event_param_names":[{"event_name":[Event name]',"param_names":[Custom param names]}]
I have noticed that only after 2 days does the GET request to https://graph.facebook.com/v2.6/[Export FBID]?access_token={app access token} return the status COMPLETED.
My question is: why does it take so long? According to your API documentation, it should take about 1-2 hours.
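For what it's worth, the start-then-poll workflow described above looks roughly like this in Python (a sketch using the requests library; start_export and wait_for_export are hypothetical helper names, and the one-hour poll interval is arbitrary):

import time
import requests

GRAPH = "https://graph.facebook.com/v2.6"

def start_export(app_id, token, start_ts, end_ts):
    # Schedule an export for one day of data; returns the export FBID.
    resp = requests.post(
        f"{GRAPH}/{app_id}/analytics_app_events_exports",
        data={"access_token": token, "start_ts": start_ts, "end_ts": end_ts},
    )
    return resp.json()["id"]

def wait_for_export(export_id, token, poll_seconds=3600):
    # Poll the export object until its status moves from SCHEDULED to COMPLETED.
    while True:
        info = requests.get(f"{GRAPH}/{export_id}", params={"access_token": token}).json()
        if info["status"] == "COMPLETED":
            return info
        time.sleep(poll_seconds)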

How do I gather geolocation data on GitHub users in a week's time, given their API rate limiting constraints?

To make a long story short, I need to gather geolocation data and signup date on ~23 million GitHub users for a historical data visualization project. I need to do this within a week's time.
I'm rate-limited to 5000 API calls/hour while authenticated. This is a very generous rate limit, but unfortunately, I've run into a major issue.
The GitHub API's "get all users" feature is great (it gives me around 30-50 users per API call, paginated), but it doesn't return the complete user information I need.
If you look at the following call (https://api.github.com/users?since=0), and compare it to a call to retrieve one user's information (https://api.github.com/users/mojombo), you'll notice that only the user-specific call retrieves the information I need such as "created_at": "2007-10-20T05:24:19Z" and "location": "San Francisco".
This means that I would have to make 1 API call per user to get the data I need. Processing 23,000,000 users then requires 4,600 hours or a little over half a year. I don't have that much time!
Are there any workarounds to this, or any other ways to get geolocation and sign up timestamp data of all GitHub users? If the paginated get all users API route returned full user info, it'd be smooth sailing.
I don't see anything obvious in the API. It's probably specifically designed to make mass data collection very slow. There are a few things you could do to try to speed this up.
Increase the page size
In your https://api.github.com/users?since=0 call, add per_page=100. That cuts the number of calls needed to walk the whole user list to roughly a third (the default page size is 30).
Randomly sample the user list
With a set of 23 million people you don't need to poll every single one to get good data about GitHub signup and location patterns. Since you're sampling the entire population randomly, there's no polling bias to account for.
If user 1200 signed up on 2015-10-10 and user 1235 also signed up on 2015-10-10, then you know users 1201 to 1234 also signed up on 2015-10-10. I'm going to assume you don't need any more granularity than that.
Location can also be randomly sampled. If you randomly sample 1 in 10 users, or even 1 in 100 (one per page), you still get great data: 230,000 out of 23 million is a large polling sample. Professional national polls in the US use sample sizes of a few thousand people for an even bigger population.
How Much Sampling Can You Do In A Week?
A week gives you 168 hours or 840,000 requests.
You can get 1 user or 1 page of 100 users per request. Getting all the users, at 100 users per request, takes 230,000 requests. That leaves you with 610,000 requests for individual users, or about 1 in 37 users.
I'd go with 1 in 50 to account for download and debugging time. So you can poll two random users per page, or about 460,000 out of 23 million; see the sketch below. This is an excellent sample size.
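A rough sketch of that sampling loop in Python (using the requests library; sample_github_users is a hypothetical helper, and error handling and rate-limit bookkeeping are omitted):

import random
import requests

def sample_github_users(token, sample_per_page=2):
    # Walk the paginated user list at 100 users per call, randomly sample
    # a couple of users from each page, and fetch their full profiles to
    # get created_at and location.
    headers = {"Authorization": f"token {token}"}
    since = 0
    while True:
        page = requests.get(
            "https://api.github.com/users",
            params={"since": since, "per_page": 100},
            headers=headers,
        ).json()
        if not page:
            break
        for stub in random.sample(page, min(sample_per_page, len(page))):
            user = requests.get(stub["url"], headers=headers).json()
            yield user["login"], user.get("created_at"), user.get("location")
        since = page[-1]["id"]  # the since parameter is a user ID cursor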

Facebook Ads API: Can you get a timeseries of CPI in 15-minute intervals?

I've just started playing with APIs for the first time, and my first attempt is with the Facebook Ads API. I'm looking through the documentation and can see how to pull all cost-per-action data. Right now I'm really interested in CPI (Cost Per Install) data for a specific campaign, ad set, and ad in 15-minute intervals.
Does anyone know if this is even possible with the current API?
You can get reporting stats through the Ads Insights API:
https://developers.facebook.com/docs/marketing-api/insights/v2.5
As you mentioned, you can get Cost Per Install data by requesting the cost_per_action_type field in your request.
For instance, a call to v2.5/<AD_SET_ID>/insights?fields=['cost_per_action_type'] would include the cost of mobile app installs in the response:
{
  "data": [
    {
      "cost_per_action_type": [
        {
          "action_type": "mobile_app_install",
          "value": ...
        }
      ],
      "date_start": ...,
      "date_stop": ...
    }
  ]
}
You can make API calls at your discretion as long as you stay within the API rate limits: https://developers.facebook.com/docs/marketing-api/api-rate-limiting
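For example, a simple poller that samples CPI every 15 minutes might look like this in Python (a sketch using the requests library; poll_cpi is a hypothetical helper, and the ad set ID and token are placeholders — the API has no built-in 15-minute breakdown, so this just samples the current totals on a 15-minute cadence):

import time
import requests

def poll_cpi(ad_set_id, token, interval_seconds=15 * 60):
    # Every 15 minutes, request cost_per_action_type for the ad set
    # and print the mobile_app_install (CPI) entry.
    url = f"https://graph.facebook.com/v2.5/{ad_set_id}/insights"
    params = {"fields": "cost_per_action_type", "access_token": token}
    while True:
        for row in requests.get(url, params=params).json().get("data", []):
            for action in row.get("cost_per_action_type", []):
                if action["action_type"] == "mobile_app_install":
                    print(row["date_start"], row["date_stop"], action["value"])
        time.sleep(interval_seconds)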