Beats Music: why does this playlist's tracks fetch return invalid results? - beatsmusic

I'm trying to fetch tracks for a specific playlist (pl128943612061942272) and it gives me strange results.
https://partner.api.beatsmusic.com/v1/api/playlists/pl128943612061942272/tracks?access_token=
The response has a total count of 15 but returns only 13 tracks.
Also, if I try to get the 2nd track (offset 1, limit 1) with the following request, the response is empty.
https://partner.api.beatsmusic.com/v1/api/playlists/pl128943612061942272/tracks?access_token=&offset=1&limit=1

This is happening because two of the tracks are no longer available. Songs sometimes drop out of the catalog, and this is often only temporary.
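A client can work around the mismatch by paging through the reported total and tolerating short or empty pages, instead of trusting the count. A minimal sketch (the `fetch_page` callback is hypothetical, standing in for the real HTTP request):

```python
def fetch_available_tracks(fetch_page, total, limit=20):
    """Page by offset up to the reported total; pages may come back
    short or empty where tracks are no longer available."""
    tracks = []
    for offset in range(0, total, limit):
        tracks.extend(fetch_page(offset=offset, limit=limit))
    return tracks

# Simulated playlist: the API reports total=15, but only 13 tracks resolve.
available = [f"tr{i}" for i in range(13)]

def fake_page(offset, limit):
    return available[offset:offset + limit]

tracks = fetch_available_tracks(fake_page, total=15, limit=5)
```

With this approach, unavailable tracks simply shrink the result instead of breaking the iteration.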

Related

How to Get Multiple Instrument Candles in one GET request with Oanda's API

A request for a single instrument's candles is, e.g.
"https://api-fxtrade.oanda.com/v3/instruments/EUR_USD/candles?price=M&from=2020-06-01T10%3A00%3A00.000000000Z&granularity=M10"
to get all EUR_USD 10 minute midpoint candles from a specified time. How can I alter the above to get multiple instruments in one request? For example, an attempt to get EUR_USD and GBP_USD candles, such as
"https://api-fxtrade.oanda.com/v3/instruments/EUR_USD%2CGBP_USD/candles?price=M&from=2020-06-01T10%3A00%3A00.000000000Z&granularity=M10"
does not work.
I would like to get up to 50 different instruments in one request and avoid the need to loop on the client side and make 50 separate requests.
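As far as I know, the v3 candles endpoint accepts only one instrument per request, so the usual workaround is to issue the per-instrument requests concurrently rather than serially. A sketch under that assumption, where `fetch_candles` is a hypothetical stand-in for the real GET to `/v3/instruments/{name}/candles`:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_candles(instrument):
    # Hypothetical per-instrument fetcher; a real one would issue the
    # HTTP GET and parse the JSON candle response.
    return {"instrument": instrument, "candles": []}

def fetch_many(instruments, max_workers=10):
    # One request per instrument, but issued in parallel so 50
    # instruments cost roughly one round-trip, not 50.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(zip(instruments, pool.map(fetch_candles, instruments)))

results = fetch_many(["EUR_USD", "GBP_USD", "USD_JPY"])
```

This still makes one request per instrument on the wire, but the client-side loop no longer runs sequentially.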

Unable to iterate over (actual) results in Spotify API search

The number of returned results in Spotify API varies with different values of offset and limit. For instance:
https://api.spotify.com/v1/search?query=madonna&offset=0&limit=50&type=playlist
playlists.total= 164
https://api.spotify.com/v1/search?query=madonna&offset=0&limit=20&type=playlist
playlists.total= 177
https://api.spotify.com/v1/search?query=madonna&offset=10&limit=50&type=playlist
playlists.total= 156
https://api.spotify.com/v1/search?query=madonna&offset=100&limit=50&type=playlist
playlists.total= 163
The real problem is that some items are missing when iterating over the results. This can be easily reproduced as follows:
Make the following request:
https://api.spotify.com/v1/search?query=redhouse&offset=0&limit=20&type=album
The response returns albums.total=27 and 20 items.
Make another request to get the next page:
https://api.spotify.com/v1/search?query=redhouse&offset=20&limit=20&type=album
The response returns albums.total=21 and 1 item. (6 missing items!)
Make the same request with offset=0 and limit=30
https://api.spotify.com/v1/search?query=redhouse&offset=0&limit=30&type=album
The response returns albums.total=27 and 27 items, which is correct.
This happens in all searches for albums, artists, tracks, and playlists. A few people (including myself) have reported it as a (critical) bug in the Spotify issue tracking system.
I just wonder if there is any reliable way of iterating over the results of a search.
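One pragmatic approach is to treat the reported `total` as advisory: keep paging until the API returns fewer items than requested, and deduplicate by item id in case the result window shifts between requests. A sketch, with a hypothetical `fetch_page` callback in place of the real search call:

```python
def iterate_search(fetch_page, limit=20, max_pages=100):
    # Ignore the unstable 'total': page until a short page arrives,
    # deduplicating by id because pages can shift between requests.
    seen, items, offset = set(), [], 0
    for _ in range(max_pages):
        page = fetch_page(offset=offset, limit=limit)
        for item in page:
            if item["id"] not in seen:
                seen.add(item["id"])
                items.append(item)
        if len(page) < limit:
            break
        offset += limit
    return items

# Simulated search over 27 albums where the second page overlaps the first.
catalog = [{"id": f"album{i}"} for i in range(27)]

def fake_search(offset, limit):
    shift = 2 if offset else 0  # results shifted between requests
    return catalog[max(offset - shift, 0):offset - shift + limit]

albums = iterate_search(fake_search, limit=20)
```

This does not recover items the API never returns at any offset, but it at least avoids double-counting and stops at a sensible point.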

Facebook Request(s): what counts as 1 request?

I am currently creating an application that polls Facebook for data. First, I request a page in this fashion...
pageID/posts?fields=id,message,created_time,type&limit=250
This returns the top 250 posts from a page. I then check whether the page's next link is set and, if it is, make another request for the next 250 posts. I continue this recursively until there are no more posts.
With each post that is returned I go out and fetch the post details from the graph api as well.
My question is: if I had 500 posts on a page, would that equate to 502 requests? (500 requests, one per post's details, plus 2 for paging through the page data to get the posts.) Or am I incorrect in my understanding of a "request"? I know that when batching calls, each query included in the batch counts as 1 request. The goal is to avoid the 600 calls / 600 seconds rate limiting. Thanks!
Every API call is...well, 1 request. So every time you use the /posts endpoint with whatever limit, it will be 1 request. For example, if you do that call you posted, it will be one request that returns 250 elements.
Batch requests are just faster, but each call in the batch counts as a request. So if you combine 10 calls in a batch, it will be 10 requests. The benefit of batch calls is really just that they are a lot faster: as fast as the slowest call in the batch.
If you want to get 500 posts with that example of yours, you would only need 2 calls: the first returns 250 elements, and the second uses the API call given in the "next" value to get another 250. Just keep in mind that the default is usually 25 elements, and you can't use any limit you want; there is a max limit for calls, and it changes from time to time AFAIK, so don't count on getting the same result every time.
Btw, don't be too fixated on that 600 calls / 600 seconds limit; it's just a general guideline. The real limit is dynamic and depends on many factors. It's not public, of course. But if you really hit the limit, you are doing something wrong anyway.
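The counting described above can be sketched as follows. For illustration, the pages here are pre-resolved dicts linked in memory; a real client would GET the URL found in `paging.next` instead:

```python
def collect_posts(first_page, want=500):
    # Each page fetched via the 'next' link is one request, no matter
    # how many posts it contains.
    posts, requests_made, page = [], 0, first_page
    while page is not None and len(posts) < want:
        requests_made += 1
        posts.extend(page["data"])
        page = page.get("next")
    return posts[:want], requests_made

# Two linked pages of 250 posts each: 500 posts cost 2 requests,
# not 502, as long as per-post detail calls are avoided.
page2 = {"data": [f"post{i}" for i in range(250, 500)], "next": None}
page1 = {"data": [f"post{i}" for i in range(250)], "next": page2}

posts, n_requests = collect_posts(page1)
```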

Paging in inbox threads/comments does not work?

I am trying to list all messages for a thread in the inbox. I notice that I get the last 25 messages by default by doing something like this:
https://graph.facebook.com/<threadID>/comments?access_token=<token>
I get data for the last 25 messages in the thread, in this case messages 4 to 28. The first message has a "created_time" of "2011-01-21"; the last (newest) has a "created_time" of "2013-09-24".
The data returned for the "comments" connection has paging; the "next" and "previous" links are present and look like this:
"previous"
https://graph.facebook.com/<threadID>/comments?access_token=<token>&limit=25&since=1380049638&__paging_token=<threadID>_28
"next"
https://graph.facebook.com/<threadID>/comments?access_token=<token>&limit=25&until=1295625728&__paging_token=<threadID>_4
However, both return empty data sets!
How can I get this to work?
Another observation: when experimenting with "until", I noticed that setting "until=2013-02-23" or earlier also returns an empty data set!
I have also noticed another thing: the default limit seems to be 25 messages; however, even when setting the limit to a high number (like "limit=100") you only get around 28-30 messages per request. So for the thread/comments connection there seem to be two problems: 1) "limit=" does not work as expected; 2) "until=" does not work as expected: going back before a certain date/time returns an empty data set (which is why the paging does not work, I guess).
Any ideas on how to get around this?
If you have a problem with the next URL for the pagination, try using the offset parameter along with limit in the URI.
For example, instead of making an API call to <threadID>/comments, make a call to <threadID>/comments?limit=100&offset=0. This starts the list of messages at an offset of 0 and displays 100 messages per page. The next URL works just fine in this case. You can increase the limit of messages per page if needed.
Also, there are some known issues with the paging. Have a look at this post to learn how it actually works.
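A small sketch of building the suggested offset-based URL (the thread id and token are placeholders, and only the parameters already shown in the answer are used):

```python
from urllib.parse import urlencode

def comments_url(thread_id, token, offset, limit=100):
    # Offset-based paging: bump 'offset' by 'limit' for each
    # successive page instead of following the broken until/since links.
    query = urlencode({"access_token": token, "limit": limit, "offset": offset})
    return f"https://graph.facebook.com/{thread_id}/comments?{query}"

url = comments_url("1234567890", "TOKEN", offset=0)
```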

Fastest way to query a users news feed on Facebook

As part of an application I am building, we need to retrieve a user's Facebook news feed (preferably about 150 posts).
var client = new FacebookClient(accessToken);
var result = client.Get(connection);
The call above takes roughly 2 seconds.
However, when I increase the query to 150 posts:
var client = new FacebookClient(accessToken);
var result = client.Get(connection, new { fields = "name,from,story,message,picture,comments", limit = count });
This now takes 6-8 seconds. This is not a nested query, so am I right in thinking FQL would give me no performance increase? Is this sort of response time about the best I can hope for?
Doing multiple smaller queries is probably better. You can fire off each request asynchronously and progressively load the data: the first call loads posts in ~2 seconds, and after ~4 seconds you get the second batch from Facebook. Repeat until you have the desired number of posts.
This will mean the user will get to see the data quicker, while your app processes smaller chunks of data.
Take care when coding the loop and account for failure, i.e. if a call fails, retry it or fail gracefully.
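A sketch of that chunked, retrying loop; the `fetch_page` callback is hypothetical, returning a page of posts and a next cursor:

```python
def load_progressively(fetch_page, total=150, chunk=50, retries=3):
    # Load the feed in smaller chunks so each response arrives quickly;
    # retry a failed chunk a few times before giving up. Callers can
    # render each chunk as soon as it is appended.
    posts, cursor = [], None
    while len(posts) < total:
        for attempt in range(retries):
            try:
                page, cursor = fetch_page(cursor, chunk)
                break
            except OSError:
                if attempt == retries - 1:
                    raise
        posts.extend(page)
        if cursor is None:
            break
    return posts[:total]

# Simulated feed where the second request fails once before succeeding.
data, calls = list(range(150)), {"n": 0}

def fake_page(cursor, chunk):
    calls["n"] += 1
    if calls["n"] == 2:
        raise OSError("transient error")
    start = cursor or 0
    nxt = start + chunk if start + chunk < len(data) else None
    return data[start:start + chunk], nxt

feed = load_progressively(fake_page)
```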