My application generates an OpenAPI/Swagger specification file, which may gain new API additions and updates over time. I want an automated script/job that uploads this specification into an existing Postman collection, updating it in place. I looked into PowerShell, Terraform, and the Postman APIs, but they don't seem to support this directly.
Is there any other way to achieve what I want?
Postman has an official tool called swagger2-postman2-converter for updating collections, and the official blog post Sync your specs describes in detail how to update collections using it.
Here are the detailed steps. I am using the sample code provided by Postman; you can replace the corresponding file in the sample with your own.
1. If you don't have a Postman API key, go to your account settings and generate one. See Postman API for detailed information.
2. Store your key in a JSON file (the _secrets.json file in this sample).
3. Call the GET endpoints to find your collection_uid and collection_id, then update the config.js file accordingly:
Get All Collections: https://api.getpostman.com/collections
Get Single Collection: https://api.getpostman.com/collections/{{collection_uid}}
4. Run npm install swagger2-postman2-converter to install the dependency (fs is a built-in Node module, so it does not need to be installed from npm).
5. Run node converter.js.
6. Call the PUT API to update your collection:
PUT Update Collection: https://api.getpostman.com/collections/{{collection_uid}}
After you have run through this once, whenever your swagger.json file changes you only need to repeat steps 5 and 6.
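Steps 5 and 6 can be combined into one small Node script. The sketch below only shows the request-shaping part; buildCollectionUrl and wrapPutBody are hypothetical helper names, not part of any Postman tooling, and the commented usage assumes converter.js has already written the updated collection JSON to disk.

```javascript
// Sketch: prepare an updated collection for Postman's PUT /collections/{uid} endpoint.
// The Postman API expects the collection JSON nested under a top-level "collection" key
// and the API key passed in the X-Api-Key header.
// buildCollectionUrl and wrapPutBody are illustrative helpers, not Postman APIs.

function buildCollectionUrl(collectionUid) {
  return `https://api.getpostman.com/collections/${collectionUid}`;
}

function wrapPutBody(collectionJson) {
  return JSON.stringify({ collection: collectionJson });
}

// Usage sketch (Node 18+ for built-in fetch):
// const fs = require('fs');
// const updated = JSON.parse(fs.readFileSync('collection.json', 'utf8'));
// await fetch(buildCollectionUrl(collectionUid), {
//   method: 'PUT',
//   headers: { 'X-Api-Key': apiKey, 'Content-Type': 'application/json' },
//   body: wrapPutBody(updated),
// });
```

Running this after each converter.js invocation keeps the remote collection in sync without any manual PUT from the Postman UI.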
I'm trying to write a script that quickly checks if our Jenkins plugins are up to date. I know that this is a built in feature in Jenkins, but for security reasons, our Jenkins instance doesn't have internet access.
I know that I can get a lot of information about a plugin, including version, from:
https://plugins.jenkins.io/<name-of-plugin>
However, I can't get it to return anything other than HTML. I could scrape the HTML for the version number, but if there is a stable API that returns JSON or similar, that would be preferred. I'm pretty sure Jenkins isn't scraping HTML to check for updates, so the API must exist. Does anyone know where it is?
There seem to be two solutions available. I ended up scraping:
https://updates.jenkins.io/download/plugins/<name-of-plugin>
The latest version is always in the second column of the second row, so scraping it is trivial. It works well most of the time, but sometimes the connection is refused, which I assume is due to the volume of requests the script sends.
Another option that I found is to download the following JSON file:
https://updates.jenkins.io/current/update-center.actual.json
It is currently 1.7 MB and contains information about the latest version of every Jenkins plugin. It also contains metadata such as dependencies, which allows your script to verify that all dependencies are satisfied.
Unfortunately I haven't found a way to download JSON for individual plugins, so you either have to scrape HTML for individual plugins or download a massive JSON for all plugins.
Update: I found the API:
https://plugins.jenkins.io/api/plugin/<name-of-plugin>
And I also found the source code and the documentation:
https://github.com/jenkins-infra/plugin-site-api
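For the update-center approach, once you have downloaded update-center.actual.json, checking your installed plugins against it is a simple lookup: the file contains a plugins map keyed by plugin ID, where each entry carries a version field. The findOutdated helper below is a hypothetical name for illustration, and it uses exact string comparison rather than real version ordering.

```javascript
// Sketch: compare installed plugin versions against update-center.actual.json.
// updateCenter.plugins is a map of plugin id -> { version, dependencies, ... }.
// findOutdated is an illustrative helper, not part of any Jenkins API.

function findOutdated(installed, updateCenter) {
  const outdated = [];
  for (const [name, installedVersion] of Object.entries(installed)) {
    const entry = updateCenter.plugins[name];
    // Flag any plugin whose installed version differs from the published one.
    if (entry && entry.version !== installedVersion) {
      outdated.push({ name, installed: installedVersion, latest: entry.version });
    }
  }
  return outdated;
}
```

A production script would want a proper version comparison instead of string inequality, since Jenkins plugin versions are not strictly semver, but for an "is anything stale?" report the exact match against the published latest version is usually enough.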
We were previously using TFS 2013 and XAML-based builds. I had created a number of custom tasks for the XAML builds in C#, including one that set the build number programmatically through the TFS .NET Client API libraries (not the REST API). We have now moved to TFS 2015 and are trying to move to vNext-based builds, converting our C# customizations to the new build process by re-coding them as PowerShell scripts. I am trying to use the following script to programmatically set the build number (as we did previously via C#):
# Step 1: load the TFS assemblies
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Client")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Common")
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.TeamFoundation.Build.Client")
# Step 2: get the collection
$collection=[Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection($env:SYSTEM_TEAMFOUNDATIONCOLLECTIONURI)
# Step 3: get the build server
$buildServer=$collection.GetService([Microsoft.TeamFoundation.Build.Client.IBuildServer])
# Step 4: get the details for the currently running build
$buildDetail = $buildServer.GetBuild($env:BUILD_BUILDURI)
# Step 5: update the build number
$buildDetail.BuildNumber = "1.2.3.4-MyBuild"
# Step 6: save the changes to the build details
$buildDetail.Save()
Everything seems to work just fine until step 6, when I try to save the change made to buildDetail.BuildNumber. When the Save() method is called, the following error is generated:
Exception calling "Save" with "0" argument(s): "TF215070: The build URI vstfs:///Build/Build/40177 is not valid. Make sure that this build exists and has not been deleted."
As best I can tell, step 4 is returning an instance that implements the Microsoft.TeamFoundation.Build.Client.IBuildDetail interface (as I expected it would). Also, clearly the build URI is valid as it is specifically used to load the build information in that same step. Again, this logic mimics the same logic we use in our C#-based XAML customizations, just rewritten under PowerShell.
Searching the internet, I can find nothing related to this error and cannot figure out why I would be receiving it. I did find a (much more complex) version of what I am trying to do here: https://github.com/voice4net/build-scripts/blob/master/ApplyVersionFromSourceControl.ps1. While I haven't tried using this script directly, it appears to be doing basically the same thing and, presumably, worked for its author.
Is there anyone out there that can help me to understand this error, why I am getting it, and, ideally, how to fix it?
SIDE NOTE: This is not Microsoft's hosted TFS; this is a traditional TFS system that our company has installed internally.
It seems like you are trying to use the incorrect API to update your build number. You should be using the WebAPI/REST portion of the client DLLs.
That said, I highly suggest you create a custom task in pure PowerShell or TypeScript to achieve your goal.
Here is an example of how to do so in TypeScript:
console.log(`##vso[build.updatebuildnumber]${newBuildNumber}`);
or in PowerShell:
Write-VstsUpdateBuildNumber -Value $newBuildNumber
You can check the reference here on how to create build tasks:
https://github.com/Microsoft/vsts-task-lib
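To make the logging-command route a little more concrete, here is a minimal JavaScript sketch of the same idea as the TypeScript one-liner above. The agent watches stdout for lines of the form ##vso[build.updatebuildnumber]...; the formatBuildNumber helper and its version scheme are assumptions for illustration, not part of any task library.

```javascript
// Sketch: emit the VSTS/TFS logging command that updates the build number.
// The build agent intercepts ##vso[...] lines printed to stdout during a task.
// formatBuildNumber is an illustrative helper; the version scheme is an assumption.

function formatBuildNumber(major, minor, buildId) {
  return `${major}.${minor}.0.${buildId}`;
}

function updateBuildNumberCommand(newBuildNumber) {
  return `##vso[build.updatebuildnumber]${newBuildNumber}`;
}

// In a real build task you would print it, e.g.:
// console.log(updateBuildNumberCommand(formatBuildNumber(1, 2, process.env.BUILD_BUILDID)));
```

The key point is that no API object model is involved at all: the task just prints a specially formatted line, and the agent applies the change to the running build.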
The methods in the TFS SOAP object model (such as Microsoft.TeamFoundation.Build.Common) only apply to the XAML build system. You're targeting the wrong build system, which is why you're having problems.
Use the appropriate object model (either the REST APIs directly, or the C# REST API wrappers).
I am using Meteor 1.3 and am unable to get my fixtures to appear on the client.
For testing purposes, I created a new Meteor project and followed the tutorial steps up to this point.
I created a file fixtures.js in /imports/api/
I import this in /server/main.js
When I start the server, I see the console log line indicating that the file was imported and if I query the collection using the mongo shell, I see the fixtures.
If I query the collection from the client, I see nothing.
Am I missing something to get the fixtures to appear on the client? I have not modified the installed packages, so autopublish is still present.
If you are just using the Chrome console to query your collection, that no longer works in 1.3, because application code is now module-scoped. If you want console access, you should put the collection into the global namespace:
global.CollectionName = CollectionName
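The effect is easy to see outside Meteor as well: a module-scoped binding is invisible to the console, but anything attached to global (or window in the browser) is reachable everywhere. A plain Node sketch of the same idea, with Tasks as a stand-in for your Meteor collection:

```javascript
// Sketch: module-scoped values are not visible from the global namespace,
// which is why the browser console can't see collections in Meteor 1.3.
// Attaching the value to `global` (or `window` in the browser) exposes it.

const Tasks = { name: 'tasks' }; // stand-in for new Mongo.Collection('tasks')

// Before the assignment, nothing is reachable as global.Tasks:
const visibleBefore = typeof global.Tasks !== 'undefined';

// Expose it for console access:
global.Tasks = Tasks;
const visibleAfter = global.Tasks === Tasks;
```

Note that this is purely a debugging convenience; your templates and helpers should keep importing the collection as a module.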
I'm designing my new Ember CLI application. I am going to use Ember Simple Auth for authorisation, and I also need CSRF protection. I know how to inject the token into requests; the problem is how to embed the token into the application in the right way.
The token must be generated on the backend and embedded into the page, and it must be the same one the API uses to verify requests.
In my case the API and the Ember application would be served by the same web server (most likely Microsoft IIS, since the API is built with .NET MVC). So the web server should generate the token and embed it into the Ember application whenever its index file is requested.
But Ember CLI generates static index.html file during its build process. And I see just two ways of embedding the token.
The first one is to parse Ember's index.html file on each request and embed a CSRF token meta tag in the proper place. I don't really like this approach.
The second one is to make Ember CLI produce index.aspx instead of index.html during the build process. In this case the web server would insert the token automatically. I like this way better, but I'm not sure whether it's possible, and if it is, how to do it.
Maybe there is another, better way? Any help and advice would be appreciated.
It is possible to implement your second idea. You need to specify the options.outputPaths.app.html property of EmberApp in ember-cli-build.js:
const app = new EmberApp({
  outputPaths: {
    app: {
      html: 'index.aspx'
    }
  }
});
Doing this will result in Ember CLI writing index.aspx instead of index.html to the filesystem.
You can read more about configuring the output paths of your Ember CLI application in the user guide.
You can also dig directly in Ember CLI's ember-app.js source code to see if you can adjust options more to your needs.
Is it possible to make it environment-dependent? It works perfectly for production builds, but it doesn't work for development, since Ember CLI cannot serve .aspx files. It would be nice to override the output files for production builds only and use the generic ones for the development and test environments.
You can achieve this by checking the value of app.env. So, in ember-cli-build.js, instead of passing outputPaths as an argument, use the following technique:
const app = new EmberApp();

if (app.env === 'production') {
  app.options.outputPaths.app.html = 'index.aspx';
}
This way you still get index.aspx in dist/ when building for production, but development and test environments' outputPaths options are untouched and live development/testing works.
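Putting both pieces together, a complete ember-cli-build.js might look like the sketch below. This follows the standard Ember CLI file layout; the require path and the toTree call are the usual boilerplate generated by ember new, not anything specific to this technique.

```javascript
/* eslint-env node */
'use strict';

const EmberApp = require('ember-cli/lib/broccoli/ember-app');

module.exports = function (defaults) {
  const app = new EmberApp(defaults, {
    // other build options...
  });

  // Only production builds emit index.aspx; development and test keep
  // the default index.html so `ember serve` and testing still work.
  if (app.env === 'production') {
    app.options.outputPaths.app.html = 'index.aspx';
  }

  return app.toTree();
};
```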
I am working on a CI system with Jenkins, but I have run into a problem. I need to perform the following steps:
1. Jenkins build.
2. Deploy to Tomcat.
3. Find a way to send the build parameters (job name, build number, ...) to a web server (I am using REST now).
4. The web server triggers the testing system.
5. Jenkins gets the result from the testing system.
6. Update the build status.
7. Send emails.
I have a problem with step 3: I need to send that info after the deploy. The approach I am considering is to write those parameters to a file during the build step, then call a script or Java program to process the file and send the info out via REST.
But that is ugly. Is there a better way to do it?
Side questions:
Can Groovy do this?
How do I import the Groovy http-builder library into Jenkins?
I found a workaround.
1. Run an echo command during the build to print the build ID to the log.
2. Write a small Java program that gets the JSON response for the build and then sends the necessary info as a REST request to your server. The program acts as a message forwarder.
3. In the post-build actions, use the Groovy Postbuild plugin to fetch the log and then call the Java program.
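The forwarder in step 2 boils down to: read the build's identifying fields, shape them as JSON, and POST them to your server. A sketch of that shaping step, where JOB_NAME, BUILD_NUMBER, and BUILD_URL are standard Jenkins environment variables, while buildPayload and the /builds endpoint are assumptions about your own web server, not Jenkins APIs:

```javascript
// Sketch: shape Jenkins build info into a JSON payload for a custom web server.
// JOB_NAME, BUILD_NUMBER, and BUILD_URL are standard Jenkins environment variables;
// buildPayload and the target endpoint are illustrative, not Jenkins APIs.

function buildPayload(env) {
  return {
    jobName: env.JOB_NAME,
    buildNumber: Number(env.BUILD_NUMBER), // Jenkins exposes this as a string
    buildUrl: env.BUILD_URL,
  };
}

// Usage sketch (Node 18+ built-in fetch):
// await fetch('https://your-server.example/builds', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildPayload(process.env)),
// });
```

Keeping the payload construction separate from the HTTP call makes the forwarder easy to test without a live Jenkins instance.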