Creating Hudson jobs and views from command line

Can I create Hudson jobs from the command line? If so, can views be created the same way?

You can try the Hudson Remote API, which includes an API to create jobs.
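For example, a minimal sketch with curl (the server URL, credentials, and job/view names are placeholders; the same endpoints still exist in Jenkins, Hudson's successor):

```sh
# Create a job by POSTing an existing config.xml to the createItem endpoint
curl -X POST "http://hudson.example.com/createItem?name=my-new-job" \
     --user user:apitoken \
     --header "Content-Type: application/xml" \
     --data-binary @config.xml

# Views can be created the same way via the createView endpoint
curl -X POST "http://hudson.example.com/createView?name=my-new-view" \
     --user user:apitoken \
     --header "Content-Type: application/xml" \
     --data-binary @view.xml
```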

You can always create the config.xml file for the job yourself: place it in a new folder under the jobs directory, then trigger a reload of the configuration. It should work similarly for views.
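A rough sketch of that approach, assuming a standard HUDSON_HOME layout and the /reload endpoint:

```sh
# Drop a job definition into a new folder under the jobs directory
mkdir "$HUDSON_HOME/jobs/my-new-job"
cp config.xml "$HUDSON_HOME/jobs/my-new-job/config.xml"

# Ask Hudson to reload its configuration from disk
curl -X POST "http://hudson.example.com/reload" --user user:apitoken
```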

Related

Run custom script in `Initialize Job` task

I have a problem when building a C++ project in an Azure Pipeline: some DLL files get "access denied" errors.
So I need to run a batch script that stops the services using these DLLs.
I tried running my script as a pre-build event in Visual Studio, but it executes after the Initialize Job task, so that doesn't work.
Is there any way to run a script in Initialize Job?
I am afraid there is no way to run a script in Initialize Job at this moment. Prepare job/Initialize Job are predefined steps built into the pipeline; we cannot add a custom script in or before them.
So, to resolve this issue, we have to find the cause of the error and address it directly.
Generally, this error most often appears when your build and release agent goes offline, or a build is interrupted by an issue on the machine itself and specific files have been created mid-flight inside the Azure DevOps working directory. When Azure DevOps/TFS then re-attempts a new build and tries to write to or recreate files that already exist, it fails, and the above error is displayed.
The best resolution is to log in to the agent machine manually, navigate to the affected directory/file (in this example, C:\VSTS\_work\xxx\xx\.tmp) and delete the file/folder in question. Removing the offending items will effectively "clean slate" the next Build definition execution, which should then complete without issue.
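For example, a one-liner on the agent machine (the placeholder path from the error message is kept as-is; substitute your actual working directory):

```powershell
# Remove the leftover temp items so the next build starts from a clean slate
Remove-Item -Path 'C:\VSTS\_work\xxx\xx\.tmp' -Recurse -Force
```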
Hope this helps.
I had to solve this exact same problem. The solution is not ideal, but it works.
I created two pipelines. The first pipeline does any required pre-build steps, like stopping services. The second pipeline is the actual build pipeline, and it gets triggered when the first pipeline finishes. (See the build triggers section in pipeline #2.)
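A rough sketch of the wiring in YAML; the pipeline, service, and solution names below are all hypothetical:

```yaml
# azure-pipelines-prebuild.yml (pipeline #1): stop the services holding the DLLs
steps:
  - pwsh: Stop-Service -Name 'MyDllHoldingService' -Force

# azure-pipelines-build.yml (pipeline #2): runs whenever pipeline #1 completes
resources:
  pipelines:
    - pipeline: prebuild          # local alias for the completed pipeline
      source: 'pre-build-steps'   # the name of pipeline #1 in Azure DevOps
      trigger: true               # fire this pipeline when that one finishes
steps:
  - script: msbuild MySolution.sln
```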

Reusing PowerShell Scripts in Azure DevOps

I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in the pipeline of any other repository, then reference the script directly from the pipeline workspace.
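A minimal sketch of that setup; the repository alias, project/repo name, and script name are assumptions:

```yaml
resources:
  repositories:
    - repository: scripts               # alias used by the checkout step below
      type: git                         # Azure Repos Git
      name: MyProject/shared-scripts    # <project>/<repository>

steps:
  - checkout: self
  - checkout: scripts
  # With multiple checkouts, each repository lands in a folder named after it
  - pwsh: ./shared-scripts/Invoke-SharedBuild.ps1
```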
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script as inline PowerShell. But this only works at project scope.
Another approach would be to create a repo containing your PowerShell scripts, add that repo as a submodule to the repository you are building, and then call the scripts from the submodule folder. But this only works with Git repos.
Or you could create a custom build task that contains your script.
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, saving the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script all over. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary).
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell in a NuGet package and installing that in dependent projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think it may help in this case (depending on how large your file is): you can put an inline PowerShell script in that template YAML and reuse it from your main YAML.
Documentation is pretty straightforward.
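A minimal sketch of the pattern, with placeholder file names and script contents:

```yaml
# shared-steps.yml - the template holding the inline PowerShell
steps:
  - pwsh: |
      Write-Host "This inline script is maintained in exactly one place"

# azure-pipelines.yml - the main pipeline reusing those steps
steps:
  - template: shared-steps.yml
```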

Share the same PowerShell script file between multiple repos/builds

We are using VSTS for CI and CD in my team. We have over 40 repositories, which are separate projects, but all of them have to run the same PowerShell script in one of their build steps.
The PowerShell file is too big to be kept as an inline script, so we need to save it in a file. Obviously, I have a copy of the PowerShell file in each repository.
Problem:
Now whenever I need to update the script, I end up updating it in every repository, which is over 40 at the moment.
I think there should be a better approach. Is there any way I can put my script in one single repo (a repo dedicated to holding the script) and use it within each build, so that when I need to update it I only have to update it once?
There are a few options.
My general recommendation is to publish the script as a package (NuGet or otherwise) and restore it during your application builds. This allows consumers to stay "pinned" to a known-good, known-working version, and update on a schedule that works for them.
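As a sketch of the consuming side, assuming the script ships in a hypothetical package called MyOrg.BuildScripts:

```yaml
steps:
  # Restore a pinned, known-good version of the shared-scripts package
  - script: nuget install MyOrg.BuildScripts -Version 1.2.3 -OutputDirectory $(Build.BinariesDirectory)/tools
  # Run the script from the restored package folder (<id>.<version>)
  - pwsh: $(Build.BinariesDirectory)/tools/MyOrg.BuildScripts.1.2.3/scripts/Shared.ps1
```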
Another option is to add a submodule to each repository that requires the script dependency, then initialize the submodule during the build process.
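A sketch of the submodule route (the repository URL and script name are placeholders):

```sh
# One-time setup in each consuming repository
git submodule add https://myaccount.visualstudio.com/MyProject/_git/shared-scripts scripts
git commit -m "Add shared build scripts as a submodule"

# During the build (or enable submodule checkout in the definition's Get Sources step)
git submodule update --init
powershell -File scripts/Shared.ps1
```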
A third option is to turn the shared script into a VSTS build task or extension. This is extensively documented and easily located so I won't belabor the point by including instructions for doing that here.
You can add a Git repository to store your PowerShell file.
Then add a build step that fetches the file from that repository during the build and uses it.

Trigger Jenkins job on change of directory content

Below is the problem:
There is a directory, let's call it user/mydir.
Is it possible to trigger a Jenkins job on any change to this directory (e.g. copying a file, deleting a file, or creating a file in the directory)?
Any feedback will be really helpful.
You can use the FSTrigger Plugin to do that.

TYPO3: Publish workspace edit via scheduler?

I managed to run the scheduler with a cron task.
I managed to auto publish a whole workspace on the publish date with the scheduler.
But I don't want the whole workspace to be published by the scheduler task, only a single edit.
I tried to give the edit in the workspace a publish date, but that didn't work out.
Is this even possible?
TYPO3 version: 4.5.x
You could create another workspace for this single edit and publish that workspace, which contains only the one change.
If you give some more details, perhaps it is possible to do it without workspaces. For example, you can create two content elements with different start and stop dates.