Is there a way to update the nextBuildNumber directly via the REST API? I found the parameter here:
/job/MyJob/api/xml?tree=nextBuildNumber
and in the job directory there is a nextBuildNumber file.
We already use the REST API for creating/updating jobs and views, so it would be nice to stick to it instead of using the CLI or the nextbuildnum plugin.
Edit: the new approach raises another question:
Java send integer value with HTTP POST
The REST API does not currently (at the time of writing) support changing the value of nextBuildNumber. As you have found, you can only read it.
The easiest way I know to manipulate this value is through the Next Build Number plugin.
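For the read-only part, here is a minimal sketch of fetching the value over the XML API from Java (using java.net.http, Java 11+). The Jenkins URL, job name, and credentials are placeholders, and authentication details depend on your setup:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ReadNextBuildNumber {
    public static void main(String[] args) throws Exception {
        // Placeholders: adjust the Jenkins URL, job name, and user/API token.
        String jenkins = "https://jenkins.example.com";
        String auth = Base64.getEncoder()
                .encodeToString("user:apitoken".getBytes(StandardCharsets.UTF_8));

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(jenkins + "/job/MyJob/api/xml?tree=nextBuildNumber"))
                .header("Authorization", "Basic " + auth)
                .GET()
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The body looks something like:
        // <freeStyleProject><nextBuildNumber>42</nextBuildNumber></freeStyleProject>
        String body = response.body();
        int start = body.indexOf("<nextBuildNumber>") + "<nextBuildNumber>".length();
        int end = body.indexOf("</nextBuildNumber>");
        System.out.println("nextBuildNumber = " + body.substring(start, end));
    }
}
```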
Related
Has anyone already called an AWS REST API with AWS Signature Version 4?
I'm not sure how to generate the signature when calling the REST API from the hierarchical stage.
Thank you.
What do you mean by hierarchical stage? Do you want to build the whole signature on your own?
Which language are you trying to call it with? There are different packages depending on that; I highly recommend using one of those instead of rebuilding it on your own.
agnostic-aws-signature worked for me, for example.
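If you do decide to build the signature yourself (e.g. inside the hierarchical stage), the heart of Signature Version 4 is a chain of HMAC-SHA256 calls that derives a signing key from your secret key, the date, the region, and the service. Below is a rough Java sketch of just that derivation step; the values are placeholders and the canonical request / string-to-sign construction described in the AWS docs is not shown:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class SigV4KeyDerivation {

    static byte[] hmacSha256(byte[] key, String data) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(key, "HmacSHA256"));
        return mac.doFinal(data.getBytes(StandardCharsets.UTF_8));
    }

    // Derives the SigV4 signing key: an HMAC chain over date, region, service, "aws4_request".
    static byte[] signingKey(String secretKey, String dateStamp,
                             String region, String service) throws Exception {
        byte[] kSecret  = ("AWS4" + secretKey).getBytes(StandardCharsets.UTF_8);
        byte[] kDate    = hmacSha256(kSecret, dateStamp);   // e.g. "20240115"
        byte[] kRegion  = hmacSha256(kDate, region);        // e.g. "eu-west-1"
        byte[] kService = hmacSha256(kRegion, service);     // e.g. "execute-api"
        return hmacSha256(kService, "aws4_request");
    }

    public static void main(String[] args) throws Exception {
        // Placeholder values only.
        byte[] key = signingKey("my-secret-access-key", "20240115", "eu-west-1", "execute-api");
        // The final signature is hex(HMAC-SHA256(key, stringToSign)), where stringToSign
        // is built from the hashed canonical request as described in the AWS docs.
        System.out.println("signing key length: " + key.length + " bytes");
    }
}
```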
When creating an instance of a REST API (an application), a version (appearing as a prefix) then has to be included in the URL when calling it.
Is there a way to manage several versions of an API at the same time? Can we change the version number, or how is it changed?
The only link I have found is https://docs.marklogic.com/guide/rest-dev/intro#id_64988, but it is not entirely clear to me.
Thank you for your help
As the link says, "The version number is only updated when resource addresses and/or parameters have changed. It is not updated when resource addresses and/or parameters are added or removed."
In other words, the REST API will increment the version only if it ever becomes necessary to rename or restructure the addresses of resources. Ideally, that will never need to happen. If incrementing becomes necessary, the goal will be to maintain a deprecated interface at the old address for one release, if possible.
In addition to David's good suggestion, you could also build your own version numbers into the name of the resource service extension if it's better to support multiple versions of an extension in a single modules database.
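As a rough illustration of that idea from the client's side: MarkLogic's own prefix (v1 or LATEST) stays fixed, while your version lives in the resource extension's name. The host, port, and extension names below are made up, and authentication is omitted:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class VersionedExtensionCall {
    public static void main(String[] args) throws Exception {
        // Hypothetical REST app server and extensions whose names carry their own version.
        String v1Url = "http://localhost:8010/v1/resources/invoice-v1";
        String v2Url = "http://localhost:8010/v1/resources/invoice-v2";

        HttpClient client = HttpClient.newHttpClient();
        for (String url : new String[] { v1Url, v2Url }) {
            HttpResponse<String> resp = client.send(
                    HttpRequest.newBuilder().uri(URI.create(url)).GET().build(),
                    HttpResponse.BodyHandlers.ofString());
            System.out.println(url + " -> " + resp.statusCode());
        }
    }
}
```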
If the goal is to version your REST extensions and use the V# in that process, then I think you could deploy multiple sets of your code in different modules databases (one per version), dynamically switch the modules database based on the version, and then rewrite the URL so it plays well with MarkLogic's REST API.
http://developer.marklogic.com/features/enhanced-http
Does the API Manager / Bluemix provide an interface (API, hook) to automatically update API definitions when I push Swagger 2.0 API definition changes to a GitHub repository?
This is currently not possible. Your best bet is to manually re-import the Swagger using the GitHub raw URL every time you update it; however, doing this will require that you create a new API via the import, remove the old API, and then add the new one to the same plan and re-deploy.
You can manually update the API by clicking the Update button in the API editor.
Using this feature will automatically overwrite all changes you've manually made to the API, so it's recommended that you create a new API revision before uploading an updated Swagger doc.
If you plan to make manual changes to your API via the API Manager UI, then you may be better off making small updates by hand.
If you really need an automated approach, you may be able to write your own app / script that gets called any time the Swagger is updated in GitHub, and then can call the API Manager APIs to update the Swagger. This will likely be pretty complicated to set up.
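As a very rough sketch of such an automated approach, the snippet below uses the JDK's built-in HTTP server to receive a GitHub push webhook and re-fetch the raw Swagger file. The repository URL is a placeholder, and the call into the API Manager itself is left as a TODO, since its management endpoints and authentication are specific to your installation:

```java
import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class SwaggerWebhook {
    // Placeholder: raw URL of the Swagger doc in your GitHub repository.
    static final String RAW_SWAGGER_URL =
            "https://raw.githubusercontent.com/my-org/my-repo/master/swagger.yaml";

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/github-webhook", exchange -> {
            try {
                // GitHub sends a push event here; we ignore the payload and simply re-fetch the doc.
                HttpRequest request = HttpRequest.newBuilder()
                        .uri(URI.create(RAW_SWAGGER_URL)).GET().build();
                HttpResponse<String> swagger = HttpClient.newHttpClient()
                        .send(request, HttpResponse.BodyHandlers.ofString());

                // TODO: push swagger.body() to the API Manager using its management API
                // (endpoint and credentials depend on your installation).
                System.out.println("Fetched " + swagger.body().length() + " bytes of Swagger");

                exchange.sendResponseHeaders(204, -1);
            } catch (Exception e) {
                exchange.sendResponseHeaders(500, -1);
            } finally {
                exchange.close();
            }
        });
        server.start();
    }
}
```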
We're currently using MarkLogic's dls functions to handle document versioning, and are trying to switch over to use the REST API. The document endpoint doesn't use versioning by default, and I can't figure out a way to get it to. I'm referring to the dls functions for keeping multiple document versions, btw, not the new "content versioning" the REST API documentation mentions. In fact, the only reference to document versions in the REST API docs seems to be a line saying that content versioning isn't the same thing.
The only solution we've been able to come up with is to write a custom endpoint that duplicates everything the existing document endpoint's PUT does, plus document management. I'd rather avoid that if possible, especially when looking at MarkLogic 7's partial document updates. We're using MarkLogic 6 now, if it matters, but it doesn't look like 7 has any new features related to this.
Is there a way to do this using MarkLogic's existing endpoints?
You can write a REST API extension that automates the DLS operations. See http://docs.marklogic.com/guide/rest-dev/extensions. You will largely end up duplicating a lot of the same things, but this will plug into the existing endpoints.
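For illustration, once such an extension is installed (under a made-up name like dls-docs via /v1/config/resources), the client can keep talking to the standard /v1/resources endpoint. The extension name, rs: parameter, port, and payload below are all hypothetical, not part of MarkLogic's API:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DlsExtensionClient {
    public static void main(String[] args) throws Exception {
        // Hypothetical "dls-docs" extension that performs a DLS-managed insert server-side.
        String body = "<invoice><amount>42</amount></invoice>";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:8003/v1/resources/dls-docs?rs:uri=/invoices/42.xml"))
                .header("Content-Type", "application/xml")
                .PUT(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
    }
}
```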
Yes, MarkLogic 7 added content versioning to make refreshing of caches easier. Unfortunately, the DLS library hasn't been integrated into the REST API so far. You can file a feature request with support if you like.
In the meantime, the best suggestion I can give is to use a separate route for document updates using DLS (your current route or a limited custom endpoint that only supports the DLS functions you need for doc updates), and do everything else (as far as possible) using the existing REST API. You can look at this other Stack Overflow question to see how to limit searches to the latest doc versions:
Marklogic REST API search for latest document version
HTH!
A member of MarkLogic has put together a REST extension to provide better DLS support in the REST-api. Hopefully that makes working with DLS over the MarkLogic REST-api a lot easier:
https://github.com/sanjuthomas/marklogic-dls-rest-extension
HTH!
I'm currently using the Jira SOAP interface from a C# application (I suppose the language used here isn't terribly important).
Basically, I'm creating an API and a WinForms app that wrap some of the functionality of the SOAP service so that our devs can programmatically add bugs when something goes wrong in our application.
As part of this, I need to know the custom field IDs that are in use in Jira. Rather than hardcoding them (as they are still prone to the occasional change), I used the GetCustomFields() method in the jira-rpc API and then filtered the result, so that all a developer needs to know is the name of the field; the ID is filled in for them automagically.
This all works fine, but with one quite important proviso: you have to log in to the SOAP/RPC service as a user with administrative privileges.
The Jira documentation indicates that the SOAP/RPC service follows the usual workflows and security schemes; however, I can't find anything anywhere that would appear to remove this restriction on enumerating custom fields (and quite why anyone would HAVE to be an administrator to gain this access, especially as the custom field IDs tend to appear in Jira's HTML source, is beyond me).
Does anyone know if I've missed a setting somewhere? Or is there some sort of workaround for this, short of hardcoding the custom field IDs?
Or is this a case of having to delve in to Jira's RPC plugin and modifying the source for it in order to give me the functionality I require?
Cheers
Edit for the sake of google/posterity
Wow, all this time on, and it looks like Atlassian still haven't changed this behavior.
Worked around this by creating a custom dictionary that logs in as an administrative user, grabs the custom fields, and then logs out. Not ideal, but it should work until Atlassian changes things.
You're not missing anything - there's no way to get custom fields via the standard SOAP API.
In JIRA Client, we learn about custom fields in two ways:
We download issues via the RSS view of the issue navigator, or via the XML representation of a specific issue. If a custom field is set for an issue, the XML will have its ID, class, and value(s); a sketch of parsing that XML is shown below.
From time to time we inspect the content of the IssueNavigator search page, looking for searchers for the custom fields. Screen-scraping the HTML gives us not only the IDs of the custom fields but also the possible values for enum fields.
This is hackery, of course, and it may go wrong, so a good API would have been a lot better.
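As a sketch of the first approach, the snippet below fetches the classic XML view of a single issue and lists the custom field IDs and names it contains. The base URL and issue key are placeholders, credentials are omitted, and the element/attribute names reflect the classic XML view, so they may vary between Jira versions:

```java
import java.io.ByteArrayInputStream;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CustomFieldIds {
    public static void main(String[] args) throws Exception {
        // Placeholder: the classic XML view of a single issue.
        String url = "https://jira.example.com/si/jira.issueviews:issue-xml/PROJ-123/PROJ-123.xml";

        HttpResponse<byte[]> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder().uri(URI.create(url)).GET().build(),
                HttpResponse.BodyHandlers.ofByteArray());

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(response.body()));

        // Each custom field set on the issue appears as <customfield id="customfield_NNNNN">
        // with a <customfieldname> child.
        NodeList fields = doc.getElementsByTagName("customfield");
        for (int i = 0; i < fields.getLength(); i++) {
            Element field = (Element) fields.item(i);
            String id = field.getAttribute("id");
            String name = field.getElementsByTagName("customfieldname").item(0).getTextContent();
            System.out.println(name + " -> " + id);
        }
    }
}
```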
In your case, I can suggest two solutions:
Create your own SOAP (or REST) remote API plugin that will give you just the info that's missing from the standard API. Since you're seemingly in control of your JIRA, you can install anything there.
Screen-scrape the "New Bug" page for the project and issue type you need to submit. You'll get all the info: fields, options, default values, and which fields are required.