Historical ad set budget - Facebook

Is there any way to get an ad set's historical budget? When I request the budget fields, the API always returns the current daily budget.
Example request:
https://graph.facebook.com/v5.0/act_${id}/adsets?fields=daily_budget,lifetime_budget,insights.fields(clicks,conversions).time_range({'since':'2019-11-14','until':'2019-11-14'})

Related

XRPL: How to get the history of the balance of an account?

I would like to query the history of the balance of an XRPL account with the new WebSocket API.
For example, how do I check the balance of an account on a particular day?
I know that with the v2 API it was possible to query balance_changes, but this doesn't seem to be part of the new version.
For example:
https://data.ripple.com/v2/accounts/rf1BiGeXwwQoi8Z2ueFYTEXSwuJYfV2Jpn/balance_changes?start=2018-01-01T00:00:00Z
How is this done with the new WebSocket API?
There's no single convenient call in the WebSocket API that returns this. I assume you want the XRP balance, not token/issued currency balances, which live in a different place.
One way to go about it is to make an account_tx call and then iterate through the metadata. Many, but not all, transactions will have a ModifiedNode entry of type AccountRoot; if a transaction changed the account's XRP balance, you can see the difference between PreviousFields and FinalFields for that entry. The Look Up Transaction Results tutorial has some details on how to parse out metadata this way. There are some tricky edge cases here: for example, if you send a transaction that buys 10 drops of XRP on the decentralized exchange but burns 10 drops of XRP as a transaction cost, the metadata won't show a balance change because the net change was zero (+10, -10).
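A minimal sketch of that first approach, assuming the public server at wss://s1.ripple.com and the Python websockets package; it only fetches the first page of results and ignores the marker field for pagination:

```python
# Sketch: collect XRP balance changes for one account from account_tx metadata.
# Assumptions: public server wss://s1.ripple.com, `websockets` installed,
# and only the first page of results (no `marker` pagination handling).
import asyncio
import json
import websockets

ACCOUNT = "rf1BiGeXwwQoi8Z2ueFYTEXSwuJYfV2Jpn"  # example account from the question

async def xrp_balance_changes(account):
    async with websockets.connect("wss://s1.ripple.com") as ws:
        await ws.send(json.dumps({
            "id": 1,
            "command": "account_tx",
            "account": account,
            "ledger_index_min": -1,
            "ledger_index_max": -1,
            "limit": 200,
        }))
        result = json.loads(await ws.recv())["result"]
        changes = []
        for entry in result["transactions"]:
            for node in entry["meta"].get("AffectedNodes", []):
                modified = node.get("ModifiedNode")
                if not modified or modified.get("LedgerEntryType") != "AccountRoot":
                    continue
                final = modified.get("FinalFields", {})
                prev = modified.get("PreviousFields", {})
                # Only AccountRoot nodes for *this* account whose Balance changed
                if final.get("Account") == account and "Balance" in prev:
                    delta = int(final["Balance"]) - int(prev["Balance"])  # in drops
                    changes.append((entry["tx"]["ledger_index"], delta))
        return changes

print(asyncio.run(xrp_balance_changes(ACCOUNT)))
```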
Another approach is to estimate which ledger_index was most recently closed at a given time, then use account_info to look up the account's balance as of that ledger. The hard part is figuring out what the latest ledger index was at a given time. This is one of the places where the Data API was simply more convenient than the WebSocket API: there's no way to look up a ledger by date over WebSocket, so you have to try a ledger index, check the ledger's close time, try another index, check its date, and so on.
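A rough sketch of that second approach: binary-search ledger indexes by close time, then read the balance with account_info at that ledger. This assumes a server that keeps enough history (the example uses wss://s2.ripple.com) and uses 32570 as the earliest generally available ledger index; close times are seconds since the Ripple Epoch (2000-01-01 UTC):

```python
# Sketch: find the last ledger closed at or before a target date, then look up
# the account's XRP balance (in drops) as of that ledger via account_info.
import asyncio
import json
from datetime import datetime, timezone
import websockets

RIPPLE_EPOCH_OFFSET = 946684800  # seconds between the Unix and Ripple epochs

async def call(ws, **request):
    # Minimal request/response helper; no error handling for brevity.
    await ws.send(json.dumps(request))
    return json.loads(await ws.recv())["result"]

async def balance_at(account, when):
    target = int(when.timestamp()) - RIPPLE_EPOCH_OFFSET
    async with websockets.connect("wss://s2.ripple.com") as ws:
        newest = await call(ws, command="ledger", ledger_index="validated")
        lo, hi = 32570, newest["ledger_index"]  # 32570 = earliest available ledger
        while lo < hi:  # last ledger whose close_time is at or before the target
            mid = (lo + hi + 1) // 2
            ledger = await call(ws, command="ledger", ledger_index=mid)
            if ledger["ledger"]["close_time"] <= target:
                lo = mid
            else:
                hi = mid - 1
        info = await call(ws, command="account_info", account=account, ledger_index=lo)
        return int(info["account_data"]["Balance"])  # drops (1 XRP = 1,000,000 drops)

drops = asyncio.run(balance_at("rf1BiGeXwwQoi8Z2ueFYTEXSwuJYfV2Jpn",
                               datetime(2018, 1, 1, tzinfo=timezone.utc)))
print(drops / 1_000_000, "XRP")
```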

Facebook Marketing API - results and cost per result

I want to retrieve, via the API, the values of the "results" and "cost per result" columns in Facebook Business Manager. Is this possible?
Yes, it's possible through the Insights API.
Request insights for a campaign and include the cost_per_action_type field. That field is an array containing the cost per result for various action types.
In a traffic campaign the action type of interest is link_click. The value for link_click is what you would see under Results in the Facebook Ads Manager.
https://developers.facebook.com/docs/marketing-api/reference/ad-campaign-group/insights/
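For example, a quick sketch with Python's requests library; the campaign ID, access token, and API version are placeholders, and the response parsing just follows the field shapes described in the Insights docs above:

```python
# Sketch: read the link_click results and cost per result for one campaign.
# <campaign_id>, <access_token> and the API version are placeholders.
import requests

resp = requests.get(
    "https://graph.facebook.com/v5.0/<campaign_id>/insights",
    params={
        "fields": "actions,cost_per_action_type",
        "access_token": "<access_token>",
    },
)
resp.raise_for_status()
for row in resp.json().get("data", []):
    for item in row.get("actions", []):
        if item["action_type"] == "link_click":
            print("results (link clicks):", item["value"])
    for item in row.get("cost_per_action_type", []):
        if item["action_type"] == "link_click":
            print("cost per result:", item["value"])
```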

How do I gather geolocation data on GitHub users in a week's time, given their API rate limiting constraints?

To make a long story short, I need to gather geolocation data and signup date on ~23 million GitHub users for a historical data visualization project. I need to do this within a week's time.
I'm rate-limited to 5000 API calls/hour while authenticated. This is a very generous rate limit, but unfortunately, I've run into a major issue.
The GitHub API's "get all users" feature is great (it gives me around 30-50 users per API call, paginated), but it doesn't return the complete user information I need.
If you look at the following call (https://api.github.com/users?since=0), and compare it to a call to retrieve one user's information (https://api.github.com/users/mojombo), you'll notice that only the user-specific call retrieves the information I need such as "created_at": "2007-10-20T05:24:19Z" and "location": "San Francisco".
This means that I would have to make 1 API call per user to get the data I need. Processing 23,000,000 users then requires 4,600 hours or a little over half a year. I don't have that much time!
Are there any workarounds to this, or any other ways to get geolocation and sign up timestamp data of all GitHub users? If the paginated get all users API route returned full user info, it'd be smooth sailing.
I don't see anything obvious in the API. It's probably specifically designed to make mass data collection slow. There are a few things you could do to try to speed this up.
Increase the page size
In your https://api.github.com/users?since=0 call, add per_page=100. That cuts the number of calls needed to fetch the whole user list to roughly a third of what you'd need at the default page size.
Randomly sample the user list.
With a set of 23 million people you don't need to poll every single one to get good data about GitHub signup and location patterns. Since you're sampling the entire population randomly, there's no polling bias to account for.
User IDs are assigned sequentially, so if user 1200 signed up on 2015-10-10 and user 1235 also signed up on 2015-10-10, then you know users 1201 to 1234 signed up on 2015-10-10 as well. I'm going to assume you don't need any more granularity than that.
Location can also be randomly sampled: you could sample 1 in 10 users, or even 1 in 100 (one per page), and still learn a lot. 230,000 out of 23 million is a great polling sample; professional national polls in the US draw a few thousand people from an even bigger population.
How Much Sampling Can You Do In A Week?
A week gives you 168 hours, or 840,000 requests at 5,000 requests per hour.
You can get 1 user or 1 page of 100 users per request. Getting all the users, at 100 users per request, takes 230,000 requests. That leaves 610,000 requests for individual users, or about 1 in 37 users.
I'd go with 1 in 50 to account for download and debugging time. That means polling two random users per page, or about 460,000 out of 23 million. This is an excellent sample size.
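A rough sketch of that plan in Python; the token is a placeholder, and rate-limit handling, retries, and persistence are left out:

```python
# Sketch: page through /users with per_page=100 and fetch full profiles for
# two random users per page (~1 in 50), keeping id, location and created_at.
import random
import requests

API = "https://api.github.com"
HEADERS = {"Authorization": "token <personal_access_token>"}  # placeholder

def sample_users(sample_per_page=2, max_pages=None):
    since, pages, samples = 0, 0, []
    while True:
        page = requests.get(f"{API}/users",
                            params={"since": since, "per_page": 100},
                            headers=HEADERS).json()
        if not page:
            break
        for user in random.sample(page, min(sample_per_page, len(page))):
            profile = requests.get(f"{API}/users/{user['login']}",
                                   headers=HEADERS).json()
            samples.append({
                "id": profile["id"],
                "location": profile.get("location"),
                "created_at": profile.get("created_at"),
            })
        since = page[-1]["id"]  # IDs are sequential, so this advances the cursor
        pages += 1
        if max_pages and pages >= max_pages:
            break
    return samples

print(sample_users(max_pages=3))
```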

Facebook Marketing API get total reach of multiple Ad Sets

I am trying to use the Marketing API to get the summary data for multiple Ad Sets.
I am able to get the data for each Ad Set with the following:
insights/?ids=[ad_set_ids]&fields=impressions,clicks,reach,actions,total_actions
I can add up the numbers for each Ad Set to get the totals, and that is fine except for "reach": the per-ad-set reach values don't simply add up to the combined reach (see image below).
Is there any way to get the summary of data for the ad sets (the last row in the image "Results from 3 Ad Sets")?
I also tried to add the param default_summary=true but it gives me the summary for each ad set instead of the sum of all ad sets.
What you actually need is the summary parameter. For example, you can query insights at the ad account level and specify level=adset. Then, in the filtering, specify the list of adset.id values. Very importantly, add summary=["reach"] so that you get the aggregated reach.
Here is an example:
https://graph.facebook.com/act_[acc_id]/insights?limit=5000&level=adset&summary=["reach"]&date_preset=lifetime&action_attribution_windows=["default"]&filtering=[{"field":"adset.id","operator":"IN","value":["adsetID1","adsetID2"]}]
You can also do this on a campaign node instead of the ad account node; it may give you better performance.
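The same call as a Python requests sketch; the ad account ID, ad set IDs, access token, and API version are placeholders:

```python
# Sketch: one Insights call at the ad account level, filtered to specific ad
# sets, with summary=["reach"] so the response includes the aggregated reach.
import json
import requests

resp = requests.get(
    "https://graph.facebook.com/v5.0/act_<ad_account_id>/insights",
    params={
        "level": "adset",
        "fields": "impressions,clicks,reach",
        "date_preset": "lifetime",
        "summary": json.dumps(["reach"]),
        "filtering": json.dumps([
            {"field": "adset.id", "operator": "IN",
             "value": ["<adsetID1>", "<adsetID2>", "<adsetID3>"]}
        ]),
        "limit": 5000,
        "access_token": "<access_token>",
    },
)
resp.raise_for_status()
payload = resp.json()
print("per ad set:", payload["data"])         # one row per ad set
print("combined:", payload.get("summary"))    # de-duplicated total reach
```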

How to set the budget unit when calling the API

My ad account's payment method may be set to CNY.
How do I set the budget unit when calling the API?
Here is my cURL code:
create ad set by CURL
Error msg:
Your budget is too low. The minimum budget for this ad set is $1.00.
Thanks!
As stated in the docs related to bidding:
The bid amount's unit is cent for currencies like USD, EUR, and the basic unit for currencies like JPY, KRW.
In your call you use bid_amount=1 and daily_budget=3, which means 1 cent and 3 cents. For bid_amount that may be a reasonable value (depending on the billing_event used), but for the budget it's far too low.
You can see the minimum budget values for different currencies in the budget limits docs.
Also note that not all currencies use the same offset, so 1 does not always mean 1 cent. Those offsets are documented here.
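For example, a hedged sketch of an ad set creation call where the budget is passed in the currency's minor unit; every ID, the targeting spec, and the token are placeholders, and the exact minimum depends on your account's currency:

```python
# Sketch: create an ad set with daily_budget in the currency's minor unit.
# For a USD account, daily_budget=1000 means $10.00; check the budget limits
# and currency offset docs for CNY and other currencies.
import json
import requests

resp = requests.post(
    "https://graph.facebook.com/v5.0/act_<ad_account_id>/adsets",
    data={
        "name": "Example ad set",
        "campaign_id": "<campaign_id>",
        "daily_budget": 1000,      # minor units: 1000 = $10.00 on a USD account
        "bid_amount": 100,         # also in minor units
        "billing_event": "IMPRESSIONS",
        "optimization_goal": "LINK_CLICKS",
        "targeting": json.dumps({"geo_locations": {"countries": ["US"]}}),
        "status": "PAUSED",
        "access_token": "<access_token>",
    },
)
print(resp.json())
```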