Does the JIRA REST API have a limitation on the number of JIRA ids in a JQL search?

I am running a request against the JIRA REST API and I am getting an error:
Response status code does not indicate success: 400 (Bad Request).
I am thinking it has to do with a maximum number of items that can be in the JQL search query.
My query looks something like this:
search?jql=issue+in+('PROJ-741','PROJ-724','PROJ-851','PROJ-854','PROJ-856','PROJ-857','PROJ-980','PROJ-1133','PROJ-1132','PROJ-1071','PROJ-852','PROJ-727','PROJ-725','PROJ-853','PROJ-726','PROJ-434','PROJ-436','PROJ-433','PROJ-734','PROJ-733','PROJ-732','PROJ-182','PROJ-174','PROJ-173','PROJ-133','PROJ-301','PROJ-300','PROJ-281','PROJ-266','PROJ-253','PROJ-293','PROJ-287','PROJ-284','PROJ-276','PROJ-271','PROJ-262','PROJ-248','PROJ-214','PROJ-322','PROJ-323','PROJ-310','PROJ-332','PROJ-399','PROJ-600','PROJ-346','PROJ-389','PROJ-409','PROJ-521','PROJ-505','PROJ-490','PROJ-432','PROJ-486','PROJ-464','PROJ-438','PROJ-566','PROJ-534','PROJ-471','PROJ-178','PROJ-240','PROJ-210','PROJ-205','PROJ-655','PROJ-427','PROJ-419','PROJ-422','PROJ-426','PROJ-441','PROJ-442','PROJ-193','PROJ-194','PROJ-197','PROJ-195','PROJ-196','PROJ-513','PROJ-198','PROJ-514','PROJ-199','PROJ-516','PROJ-515','PROJ-200','PROJ-517','PROJ-201','PROJ-441','PROJ-188','PROJ-190','PROJ-189','PROJ-191','PROJ-192','PROJ-134','PROJ-213','PROJ-217','PROJ-219','PROJ-238','PROJ-237','PROJ-239','PROJ-221','PROJ-330','PROJ-418','PROJ-119','PROJ-463','PROJ-789','PROJ-331','PROJ-837','PROJ-959','PROJ-864','PROJ-957','PROJ-787','PROJ-445','PROJ-476','PROJ-786','PROJ-790','PROJ-791','PROJ-792')&startAt=0&maxResults=900&fields=labels,assignee,components,id,key,created,resolutiondate,customfield_10100,summary,issuetype,status,priority
I could try to batch these up into multiple queries, but I first wanted to see if there was any documented limit (I couldn't find anything mentioned in the documentation).

No restriction on the number of issue IDs in a JQL query, or on the fields parameter, is mentioned in the Atlassian documentation.
However, if the query is too large to be encoded as a query parameter, you should POST to this resource instead.
Your JQL uses characters that are reserved in URLs, which is why it is returning a 400 error.
I have corrected your query so that it encodes correctly and returns a 200 status.
Here it is:
search?jql=issue%20in%20('PROJ-741'%2C'PROJ-724'%2C'PROJ-851'%2C'PROJ-854'%2C'PROJ-856'%2C'PROJ-857'%2C'PROJ-980'%2C'PROJ-1133'%2C'PROJ-1132'%2C'PROJ-1071'%2C'PROJ-852'%2C'PROJ-727'%2C'PROJ-725'%2C'PROJ-853'%2C'PROJ-726'%2C'PROJ-434'%2C'PROJ-436'%2C'PROJ-433'%2C'PROJ-734'%2C'PROJ-733'%2C'PROJ-732'%2C'PROJ-182'%2C'PROJ-174'%2C'PROJ-173'%2C'PROJ-133'%2C'PROJ-301'%2C'PROJ-300'%2C'PROJ-281'%2C'PROJ-266'%2C'PROJ-253'%2C'PROJ-293'%2C'PROJ-287'%2C'PROJ-284'%2C'PROJ-276'%2C'PROJ-271'%2C'PROJ-262'%2C'PROJ-248'%2C'PROJ-214'%2C'PROJ-322'%2C'PROJ-323'%2C'PROJ-310'%2C'PROJ-332'%2C'PROJ-399'%2C'PROJ-600'%2C'PROJ-346'%2C'PROJ-389'%2C'PROJ-409'%2C'PROJ-521'%2C'PROJ-505'%2C'PROJ-490'%2C'PROJ-432'%2C'PROJ-486'%2C'PROJ-464'%2C'PROJ-438'%2C'PROJ-566'%2C'PROJ-534'%2C'PROJ-471'%2C'PROJ-178'%2C'PROJ-240'%2C'PROJ-210'%2C'PROJ-205'%2C'PROJ-655'%2C'PROJ-427'%2C'PROJ-419'%2C'PROJ-422'%2C'PROJ-426'%2C'PROJ-441'%2C'PROJ-442'%2C'PROJ-193'%2C'PROJ-194'%2C'PROJ-197'%2C'PROJ-195'%2C'PROJ-196'%2C'PROJ-513'%2C'PROJ-198'%2C'PROJ-514'%2C'PROJ-199'%2C'PROJ-516'%2C'PROJ-515'%2C'PROJ-200'%2C'PROJ-517'%2C'PROJ-201'%2C'PROJ-441'%2C'PROJ-188'%2C'PROJ-190'%2C'PROJ-189'%2C'PROJ-191'%2C'PROJ-192'%2C'PROJ-134'%2C'PROJ-213'%2C'PROJ-217'%2C'PROJ-219'%2C'PROJ-238'%2C'PROJ-237'%2C'PROJ-239'%2C'PROJ-221'%2C'PROJ-330'%2C'PROJ-418'%2C'PROJ-119'%2C'PROJ-463'%2C'PROJ-789'%2C'PROJ-331'%2C'PROJ-837'%2C'PROJ-959'%2C'PROJ-864'%2C'PROJ-957'%2C'PROJ-787'%2C'PROJ-445'%2C'PROJ-476'%2C'PROJ-786'%2C'PROJ-790'%2C'PROJ-791'%2C'PROJ-792')&maxResults=900&fields=labels%2Cassignee%2Ccomponents%2Cid%2Ckey%2Ccreated%2Cresolutiondate%2Ccustomfield_10100%2Csummary%2Cissuetype%2Cstatus%2Cpriority
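If the encoded query is still too long for a GET URL, the same search can be issued as a POST with the JQL in the request body. A minimal sketch, assuming the Python requests library, JIRA's /rest/api/2/search resource, and placeholder credentials:

import requests

# Placeholder base URL and credentials -- replace with your own.
BASE_URL = "https://jira.example.com"
AUTH = ("username", "password-or-api-token")

issue_keys = ["PROJ-741", "PROJ-724", "PROJ-851"]  # ...the full list of keys
jql = "issue in ({})".format(", ".join("'{}'".format(k) for k in issue_keys))

payload = {
    "jql": jql,
    "startAt": 0,
    "maxResults": 900,
    "fields": ["labels", "assignee", "components", "id", "key", "created",
               "resolutiondate", "customfield_10100", "summary", "issuetype",
               "status", "priority"],
}

# POSTing the JQL in the body avoids URL-length and percent-encoding problems entirely.
resp = requests.post(BASE_URL + "/rest/api/2/search", json=payload, auth=AUTH)
resp.raise_for_status()
print(len(resp.json().get("issues", [])))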

Related

Aggregate functions in WIQL query

I want to download a WIQL report using the REST API. The REST response doesn't give all fields; it gives a list of URLs as workItems.
To get field values I need to download each work item separately.
Is there any direct REST way to accomplish this in a single REST call?
Repeated REST calls give me rate-limiting or similar errors; I get 500-type errors after repeated GET requests.
The genesis of this need is that WIQL has no aggregate functions such as SUM, MAX, MIN, AVG, etc.
Sorry, but as far as I know there's no direct REST API available to get a WIQL report with field values in one call. An alternative workaround would be:
1. Use Query By Wiql to return the list of work item IDs and URLs (I think this is the same REST API you are using).
2. Then use Get Work Items Batch to get all the details (fields) for the requested work item IDs. There is also an issue with similar needs to yours; you can use the upgraded script from konpro11 to get the list of IDs, and then use those IDs to build your report (with the help of the second REST API).
Hope it helps :)
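A minimal sketch of that two-step approach, assuming the Python requests library, a personal access token, and the wiql and workitemsbatch endpoints; the organization, project, and field names below are placeholders:

import requests

ORG = "https://dev.azure.com/your-org"   # placeholder organization URL
PROJECT = "your-project"                 # placeholder project
AUTH = ("", "personal-access-token")     # PAT goes in the password slot

# Step 1: run the WIQL query to get the matching work item IDs.
wiql = {"query": "SELECT [System.Id] FROM WorkItems WHERE [System.TeamProject] = @project"}
r = requests.post(f"{ORG}/{PROJECT}/_apis/wit/wiql?api-version=6.0", json=wiql, auth=AUTH)
r.raise_for_status()
ids = [wi["id"] for wi in r.json()["workItems"]]

# Step 2: fetch the fields in batches (the batch endpoint caps the number of IDs
# per call -- 200 at the time of writing).
fields = ["System.Id", "System.Title", "Microsoft.VSTS.Scheduling.Effort"]
work_items = []
for i in range(0, len(ids), 200):
    r = requests.post(f"{ORG}/{PROJECT}/_apis/wit/workitemsbatch?api-version=6.0",
                      json={"ids": ids[i:i + 200], "fields": fields}, auth=AUTH)
    r.raise_for_status()
    work_items.extend(r.json()["value"])

# Aggregate locally, since WIQL itself has no SUM/MAX/MIN/AVG.
total_effort = sum(wi["fields"].get("Microsoft.VSTS.Scheduling.Effort", 0) for wi in work_items)
print(len(work_items), total_effort)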

How to get ALL communities from DSpace rest api

Using the endpoint:
GET https://mydspace.org/rest/communities/
I seem to only get 100 results returned. I can't see any options in the documentation to return more. How do I do this?
Most of the DSpace REST endpoints support a limit parameter. There may be some maximum size you can request, but the limit can certainly go higher than 100.
https://demo.dspace.org/rest/communities?limit=500
If you are still unable to retrieve everything in one request (or if the request times out), you can paginate through the results in your code.
https://demo.dspace.org/rest/communities?limit=100
https://demo.dspace.org/rest/communities?offset=100&limit=100
https://demo.dspace.org/rest/communities?offset=200&limit=100
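For example, a minimal paging loop, assuming the Python requests library and the JSON form of the DSpace 6 REST responses:

import requests

BASE = "https://demo.dspace.org/rest"   # replace with your own DSpace instance
LIMIT = 100

communities = []
offset = 0
while True:
    r = requests.get(f"{BASE}/communities",
                     params={"limit": LIMIT, "offset": offset},
                     headers={"Accept": "application/json"})
    r.raise_for_status()
    page = r.json()
    communities.extend(page)
    if len(page) < LIMIT:   # a short page means there is nothing left to fetch
        break
    offset += LIMIT

print(len(communities))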
There is an endpoint (in DSpace 6) that will allow you to retrieve the ids and handles for the full hierarchy.
https://demo.dspace.org/rest/hierarchy
It's not very prominent, but pagination of REST responses is documented at https://wiki.duraspace.org/display/DSDOC6x/REST+API#RESTAPI-RESTEndpoints just above "Index / Authentication".

OneNote API: "Query operation not accepted" when filtering a pages GET

I'm attempting to get a list of pages by notebook from the OneNote REST API.
One option is to iterate over a list of sections and GET pages per section using /api/v1.0/notes/sections/[ID]/pages
Instead, to reduce the number of requests I'm filtering a call to /v1.0/me/notes/pages by parentNotebook/id but I'm getting, for example:
The query operation(s) parentNotebook/id eq [ID]
OR parentNotebook/id eq [ID] OR parentNotebook/id eq [ID] not supported.
http://aka.ms/onenote-errors#C20106
Error #20106 states:
Your request contains a query operator that is not supported. See
OneNote API reference.
My brief reference to the OData 4.0 docs leads me to believe that parentNotebook/id is the correct syntax, so what am I doing wrong please?
UPDATE
So it does work if I filter on one notebook at a time; the problem seems to be the OR?
Can I change my filter to include multiple notebooks or should I be doing an API call per notebook?
Ironically, when I did make a call per notebook I got my first HTTP 429 throttling hand slap, though adding a two-second pause between requests seemed to solve that.
Just tried the following in the interactive console and got back a successful 200 response.
GET https://www.onenote.com/api/v1.0/me/notes/pages?filter=parentNotebook/id eq '0-C6703B3BE9D6A5E0!317' or parentNotebook/id eq '0-D8E91FC12048CF4A!515'
*NOTE* replace the above notebook ids with your own ids.
So filtering on multiple notebooks using an or expression is possible.
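For reference, a minimal sketch of the same call from code, assuming the Python requests library and a valid OAuth token (the notebook IDs are placeholders); requests handles the percent-encoding of the filter expression:

import requests

TOKEN = "your-oauth-access-token"   # placeholder
notebook_ids = ["0-C6703B3BE9D6A5E0!317", "0-D8E91FC12048CF4A!515"]   # replace with your own

# Build "parentNotebook/id eq '...' or parentNotebook/id eq '...'".
odata_filter = " or ".join(f"parentNotebook/id eq '{nid}'" for nid in notebook_ids)

# The console above accepted "filter"; OData conventionally names this "$filter",
# so switch the key if your endpoint expects that form.
resp = requests.get("https://www.onenote.com/api/v1.0/me/notes/pages",
                    params={"filter": odata_filter},
                    headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print([page["title"] for page in resp.json()["value"]])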

Facebook Graph API search limit & offset

I want to do a query like this:
search?q=KEY_NAME&type=page&fields=id,name,location&limit=500&offset=0
When I do this the first time, I get about 470 results. Now I set the offset to 471 and repeat the query:
search?q=KEY_NAME&type=page&fields=id,name,location&limit=500&offset=471
and the result is empty.
Why? The KEY_NAME is a common word like "fan", and I don't think there are only 471 matching Facebook pages!
What is the problem?
Never use a limit that high; as far as I know, a limit of 100 should be the maximum. Everything else may be buggy. If you use this API call, you get more than 500 results with paging:
/search?pretty=0&fields=id,name,location&q=fan&type=page&limit=100
Don't use "offset"; always use the "next" link in the JSON document to get the next batch of results: https://developers.facebook.com/docs/graph-api/using-graph-api/v2.4#paging
The next 100 entries would be available with the following endpoint for me:
/search?pretty=0&fields=id,name,location&q=fan&type=page&limit=100&after=OTkZD
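A minimal sketch of that next-link style of paging, assuming the Python requests library, Graph API v2.4 as in the link above, and a placeholder access token:

import requests

ACCESS_TOKEN = "your-access-token"   # placeholder
url = "https://graph.facebook.com/v2.4/search"
params = {"q": "fan", "type": "page", "fields": "id,name,location",
          "limit": 100, "access_token": ACCESS_TOKEN}

results = []
while url:
    data = requests.get(url, params=params).json()
    results.extend(data.get("data", []))
    # Follow the "next" link instead of computing offsets by hand.
    url = data.get("paging", {}).get("next")
    params = {}   # the next link already carries all query parameters
print(len(results))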
Please refer to the following blog post:
https://developers.facebook.com/blog/post/478/
The gist of it:
As such, when querying the following tables and connections, use time-based paging instead of “offset” to ensure you are getting back as many results as possible with each call. For these Graph API connections, use the “since” and “until” parameters

Pagination in the event search API

I am performing a REST call to Facebook's search API using type=event,
e.g.
search?fields=id,name,picture,owner,description,start_time,end_time,location,venue,updated_time,ticket_uri&q=concert&type=event
I have looked through the documentation and still have a few questions about the specific pagination behavior of the event search API.
If I use a broad search term like "ma" and keep querying the pagination ['next'] URL, would I cycle through all Facebook events starting with "ma"? Does the pagination array give any indication when there are no more results to return?
Do these searches include past events? If so is it possible to eliminate past events using the "since" parameter?
What is the maximum for the limit parameter?
Update:
As far as I can tell, the number of pages you can get from a Facebook search is limited to 500. This includes pages that can be accessed via pagination. In other words, a query with limit >= 500 will not return a pagination URL; likewise, a query with limit 250 will only return one page's worth of pagination.
You keep requesting the "next" page until the number of results returned is less than the limit.
I'm not sure if that is possible using a simple Graph request; maybe using FQL.
I don't know exactly, but I used a limit of 2000 one day and it worked.
For other doubts, you can get answers by testing your requests with this tool:
https://developers.facebook.com/tools/explorer/
I am also doing the same thing as you; I am collecting public posts using the Graph Search API.
When there are no results available, or you reach the maximum limit, the paging section will not be present in the response. So you can always check whether paging exists in the JSON response, something like this:
NextResult = DeserJsonFBResponce.paging != null ? DeserJsonFBResponce.paging.next : string.Empty;
I am not so sure about this with events, but for public posts I am able to eliminate posts using the since and until parameters.
The maximum for the limit parameter is 2000 per GET request.
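Putting those pieces together, a minimal sketch, assuming the Python requests library and a placeholder access token: it uses since to drop past events (the cutoff date is only an example) and stops as soon as the response no longer carries a paging section:

import requests

ACCESS_TOKEN = "your-access-token"   # placeholder
FIELDS = ("id,name,picture,owner,description,start_time,end_time,"
          "location,venue,updated_time,ticket_uri")

url = "https://graph.facebook.com/search"
params = {"q": "concert", "type": "event", "fields": FIELDS,
          "since": "2015-01-01",   # example cutoff to skip past events
          "limit": 100, "access_token": ACCESS_TOKEN}

events = []
while url:
    data = requests.get(url, params=params).json()
    events.extend(data.get("data", []))
    # No paging section (or no "next" link) means there are no more results.
    url = data.get("paging", {}).get("next")
    params = {}   # the next link already carries all query parameters
print(len(events))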