Imagine a typical Deploy target that copies a thousand files to a remote network folder using the MSBuild Copy task, which I believe is a pretty common scenario. When the folder is not accessible or there are access-privilege problems, the Copy task obviously cannot copy the files, but it still tries to copy each one anyway. I want to prevent this so the Deploy target fails fast for this case: report a Failed status immediately instead of waiting 30-60 minutes while it works through every file in the queue...
How can I force the MSBuild Copy task to stop immediately when a file fails to copy, instead of trying to copy all the remaining files?
If this is not possible with the Copy task, can it perhaps be achieved with other facilities?
You can use the Exec task instead. Exec fails the target as soon as its command returns a nonzero exit code, and xcopy stops and returns nonzero on the first error (unless /C is passed), so the build stops immediately. Like this:
<Exec Command="xcopy /s &quot;from with spaces&quot; &quot;$(WebDeployFolder)\$(WebDeployName)&quot;" />
Would it not be better to use robocopy for the copy? It has plenty of options for exactly this sort of thing. See the RoboCopy task in the MSBuild Extension Pack:
http://www.msbuildextensionpack.com/help/4.0.4.0/index.html
The Options property of the task accepts the standard robocopy parameters:
http://technet.microsoft.com/en-us/library/cc733145%28WS.10%29.aspx
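If you stick with Exec, a fail-fast robocopy call can also be wired up directly. This is a minimal sketch (the $(SourceFolder) property, the /E /Z options, and the retry limits are assumptions; robocopy signals failure with exit codes of 8 or above, so the code is captured and checked explicitly):
<Target Name="DeployFiles">
  <!-- /R:2 /W:5 limits retries so an unreachable share fails quickly -->
  <Exec Command="robocopy &quot;$(SourceFolder)&quot; &quot;$(WebDeployFolder)\$(WebDeployName)&quot; /E /Z /R:2 /W:5"
        IgnoreExitCode="true">
    <Output TaskParameter="ExitCode" PropertyName="RoboCopyExitCode" />
  </Exec>
  <!-- robocopy exit codes 0-7 indicate success; 8 and above indicate failures -->
  <Error Condition="$(RoboCopyExitCode) &gt;= 8"
         Text="Deploy failed: robocopy returned exit code $(RoboCopyExitCode)" />
</Target>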
How can I run a PowerShell script after all stages have completed deployment? I have currently selected a deployment group job, but I am not 100% sure that is what I need. I have included the script as part of the solution being deployed so that it will be available on all machines. Based on what I can find in the UI, there seem to be two tasks that could work.
The first option would be the "PowerShell Script" task, but it asks for a path in the drop directory. The problem is that the file I am interested in is inside a zip file, and there does not seem to be a way to point at a file within a zip.
The other task I see is "PowerShell on Target Machines", which asks for a list of target machines. I am not sure what to enter here, since I want to run the script on the current machine in the deployment group. This task seems intended to run PowerShell from the deployment machine against a remote machine, so it does not appear to fit my use case.
The answers I have come across either talk about how to do this for an Azure site using something called "Kudu" (not relevant here), don't answer my questions about these tasks, or seem to be out of date.
A deployment group job will run on all of the servers specified in that deployment group. Based on what you have indicated, it sounds like that is what you are looking for.
Since you indicated that the file in question is a zip, you will actually need to use two separate tasks:
Extract Files - use this to extract the zip file so that you can execute the script
Powershell script - use this to execute the script. You can set the working directory the script executes in if necessary (under advanced options). Also remember that you don't have to use the file/folder picker "helper"; it won't work in your case while the file is inside a zip. It only populates the text box, which you can fill in manually, starting with the $(System.DefaultWorkingDirectory) variable and adding the path to the script.
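If you would rather avoid the extra extraction step, the script task can also do the unzipping itself. A minimal sketch, assuming a hypothetical drop\MyApp.zip artifact and a scripts\PostDeploy.ps1 packaged inside it (adjust both to your layout):
# Hypothetical artifact and script paths - adjust to your drop layout.
$zip    = Join-Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY 'drop\MyApp.zip'
$target = Join-Path $env:AGENT_TEMPDIRECTORY 'MyAppExtracted'
# Expand the deployed zip, then run the script that was packaged inside it.
Expand-Archive -Path $zip -DestinationPath $target -Force
& (Join-Path $target 'scripts\PostDeploy.ps1')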
I'm completely new to Bamboo, so thank you in advance for the help.
I'm trying to create a Bamboo plan that zips files from a git repo and uploads the archive to Artifactory. Currently my build contains two tasks: a source code checkout and a simple PowerShell script. The first time I run it, it builds perfectly fine, but without any modifications, every subsequent run fails.
The error I'm getting in the log is the following:
Failing task since return code of [powershell -ExecutionPolicy bypass -Command /bin/sh /opt/bamboo/agent/temp/OR-J8U-JOB1-4-ScriptBuildTask-539645121146088515.ps1] was -1 while expected 0
Replacing the PowerShell script's contents with whitespace does not resolve the issue; only removing the script task completely allows the build to succeed, but as soon as I reinsert a script it fails again. Other questions online suggest that I "merge the user-level PATH environment information into the system-level PATH", but I cannot find the user-level environment information; my environment variables section is completely empty.
Like Vlad, I found it more efficient to reimplement my PowerShell script as a batch script.
I'm having difficulty setting up a startup task in an Azure role.
The ultimate goal is to disable the RC4 cipher, along with other SSL configuration hardening. In my (VS2012 Express) project (a solution partially achieved by following another answer here on SO that led me to https://gist.github.com/sidshetye/29d6d48dfa0c2f5488a4), I created a Startup.cmd file like this:
REM Execute PowerShell script to disable RC4 and improve SSL security settings
ECHO Batch started >> "StartupLog.txt" 2>&1
PowerShell -ExecutionPolicy Unrestricted .\HardenSSL.ps1 >> log-HardenSSL.txt 2>&1
EXIT /B 0
HardenSSL.ps1 is the PowerShell script from the link above. Both the .cmd and .ps1 files are placed in the application root directory, marked as "Content" with "Copy to Output Directory" set to "Copy always".
In my service definition, I put this:
<Startup>
<Task commandLine="Startup.cmd" executionContext="elevated" taskType="background"></Task>
</Startup>
Now, when I deploy the application to Azure, "nothing" happens. I configured the role instance to allow Remote Desktop and connected to the machine. I verified the scripts were published; there were no log files, and RC4 was still enabled. When I run the .cmd manually, the machine runs the scripts to completion, disables RC4, and restarts. So the scripts themselves are "correct".
The problem is that the scripts are not being run at startup. I may be wrong, but I don't see anything related in the Windows event logs. The server does keep all the configuration now, but I need to be sure the scripts get executed in case I have to publish to new instances/cloud services.
I also tried to:
1. place the scripts in a child directory
2. create two other "simpler" .cmd files that just write "script started" to a log file, to rule out problems with the .cmd calling the PowerShell script.
None of those scripts got executed.
Hope I've been sufficiently clear, any help would be greatly appreciated.
Thank you in advance,
Alberto
UPDATE 1
Reading through various discussions, I realized I had missed one very important thing: the script files are actually published to two distinct places, one of them inside the /bin folder.
For example: I placed my scripts in a /StartupScripts folder in my project, and when I connect via Remote Desktop to the Azure server I find the scripts both in "approot/StartupScripts" and in "approot/bin/StartupScripts".
The scripts that actually execute are the ones inside the "bin" folder. The real problem is most likely a path problem inside the .cmd, since I have now found the execution logs and they contain an error.
I will now try to fix it and update the question here on SO.
Ok.
In the end it was indeed a problem with a path in my Startup.cmd file: .\HardenSSL.ps1 could not be found when the startup task pointed to a subfolder.
The solution was to place both the Startup.cmd and HardenSSL.ps1 files in the application root and remove the ".\" prefix when calling the PowerShell script; after that, everything worked.
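If you prefer to keep the scripts in a subfolder, an alternative (untested here, but %~dp0 is standard batch behavior) is to anchor the path to the .cmd file's own directory:
REM %~dp0 expands to the directory containing this .cmd, so the script is
REM found regardless of the startup task's working directory.
PowerShell -ExecutionPolicy Unrestricted -File "%~dp0HardenSSL.ps1" >> "%~dp0log-HardenSSL.txt" 2>&1
EXIT /B 0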
Anyway, I would suggest that anyone pick this other solution I found on Security Stack Exchange:
https://security.stackexchange.com/a/79957
It links to a NuGet package that does the same thing as the script from the GitHub link in the original post, just "better"; mainly:
Better configuration of cipher suites, with Forward Secrecy support for all reference browsers on SSL Labs
Retains SSL support for Internet Explorer 8 on Windows XP (unfortunately still a necessity for us)
Alberto.
I have TFS running an automated build. The solution runs a cmd file which runs our batch file and then performs an xcopy at the end to copy the results to our release PC.
If I run the cmd file manually the files copy over.
If I run it in the automated build, it fails with an "Access denied: cannot create directory" error (the directory is there).
Is there an issue with services creating directories on other PCs, or anything like that?
Sorted - the build PC's machine account needed to be added as a user on the remote machine in order to have access rights!
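For anyone hitting the same wall, a minimal sketch of the grant (the domain, machine name, and folder are placeholders; computer accounts are referenced with a trailing $). Run on the remote machine from an elevated prompt:
icacls "D:\Releases" /grant "DOMAIN\BUILDPC$:(OI)(CI)M"
The share-level permissions on the folder need to allow the account as well, not just the NTFS ACL.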
I have a startup task for my web role that downloads some executable files from a blob and then proceeds with the installation.
From a .cmd file, I start a PowerShell script that downloads the files; the .cmd then starts the downloaded file.
The script works fine if I run it manually through RDP after the publishing is done.
But when it runs as a startup script, it sometimes (often) fails at different points.
The taskType is set to background.
Last time, the error was that the command "PowerShell" does not exist...
Also, I run powershell -command set-executionpolicy unrestricted before running my PS script, but I read here that another task may reset this setting and make mine fail.
Quite a mess.
So that makes me think that if I could wait for all the other tasks to finish before starting mine, it would eliminate these kinds of problems.
I suppose I could check whether some process is running and wait for it to finish, but I have no clue which process to check.
Or maybe there's another solution.
~edit~
I read here that the error about PowerShell not existing may be caused by the batch file being saved as UTF-8 in Visual Studio. I rewrote it from scratch in Notepad++ and made sure it is saved as ANSI. Then, the same error. The full message is:
'PowerShell' is not recognized as an internal or external command,
operable program or batch file.
Again, the script runs perfectly from the command line in Remote Desktop.
It would be possible to set an environment variable at the end of the script that is required to finish first; then, in the script that is waiting on the dependency, loop until the environment variable is set, and then kick off its activities.
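A minimal sketch of that pattern, assuming a hypothetical PREREQ_DONE flag; machine-level variables are read straight from the registry, so a change made by one startup task is visible to the others without a restart:
# In the task that must finish first, as its last line:
[Environment]::SetEnvironmentVariable('PREREQ_DONE', '1', 'Machine')

# In the dependent startup task:
while ([Environment]::GetEnvironmentVariable('PREREQ_DONE', 'Machine') -ne '1') {
    Start-Sleep -Seconds 5   # poll until the prerequisite task reports done
}
# ...continue with this task's work...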
You could also run everything from a single PowerShell script and use the '-AsJob' switch on your installer statement, then use the 'Wait-Job' cmdlet to block until the task is complete and carry on. PowerShell also offers the '$?' automatic variable, which reports whether the last statement executed successfully.
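A minimal sketch of the job approach using Start-Job/Wait-Job (the installer path and arguments are placeholders):
# Launch the installer in a background job and block until it finishes.
$job = Start-Job { & 'C:\install\setup.exe' /quiet | Out-Null; $LASTEXITCODE }
Wait-Job $job | Out-Null
$exitCode = Receive-Job $job
if ($exitCode -ne 0) { exit 1 }   # abort the startup task if the install failed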
This might be caused by an encoding issue. As mentioned in this answer, you should save your file in ASCII to ensure correct interpretation of your script.
From the linked answer:
Open your whatever.cmd file in VS 2012 Ultimate. Click File -> Save whatever.cmd As -> on the dialog there is a little arrow next to the [Save] button. It will show a menu with the option Save with Encoding.
Select it, then choose "US-ASCII Codepage 20127" from the list of available encodings.
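If you prefer to do the re-encoding outside the IDE, a one-liner in PowerShell works too (run it from the folder containing the file):
# Parentheses force the file to be read completely before it is rewritten.
(Get-Content .\Startup.cmd) | Set-Content -Encoding Ascii .\Startup.cmd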