I need to run a PowerShell script as a task when doing the release in VSTS. The script runs perfectly if I run it from the PowerShell ISE, but I can't get it to run as a task: in the Script Path parameter I'm using $(System.DefaultWorkingDirectory)/vs-releasenotes.ps1, but it does not work.
The script is at the top level of the solution folder. How can I access the .ps1 script if it is located at solution level (no folder for the script)?
Thanks
Found the error: I was running the script during the release phase but had not added the .ps1 file to the task that copies all the files to the build directory, so it was not available. I just added the file and everything works OK.
I use SSIS to run a PowerShell script that downloads a file which used to be a CSV but recently became large enough to be zipped. I updated the PowerShell script to look for a zip file and added a task to the package to unzip the file so it can be loaded into a SQL database. Then it came through as a CSV again. I need a solution that handles either the zip file or the CSV file. I'm not sure whether this should be a task in SSIS or an update to the PowerShell.
I would go with a PowerShell task to download the files (either zip or CSV), then an SSIS Foreach Loop container to iterate over the files you just downloaded. Doing this, you assign each individual file to a user variable. Inside the container, if the file is a zip (use a variable set via an expression to determine whether it is), run a task that calls PowerShell to unzip it, then an Expression Task to update the variable holding the file path to the newly unzipped CSV path. Then run your Data Flow Task to import the CSV.
If the file is a CSV to begin with, just run the DFT.
Either way the Data Flow Task is the same: take the CSV and load it. I have found I like to keep the PowerShell in my SSIS packages very purpose-driven. I have a tendency to build my logic in PowerShell because it is easier, but then my package becomes harder to debug, because an issue in my PowerShell script will fail the SSIS package and SSIS tells me nothing useful about what in the script failed (unless you are redirecting stdout and stderr from your PowerShell, or doing some other logging).
Best to keep the PowerShell as simple as possible for each task you need to do; a minimal sketch of the unzip step follows.
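As a sketch only, assuming PowerShell 5+ (where Expand-Archive is available) and with all file names and paths as hypothetical placeholders the SSIS package would supply from user variables:

param(
    # Hypothetical values: the package passes these in from user variables
    [string]$DownloadedFile = 'C:\staging\feed.zip',
    [string]$ExtractDir     = 'C:\staging\extracted'
)

if ([IO.Path]::GetExtension($DownloadedFile) -eq '.zip') {
    # Unzip, then emit the extracted CSV path for the package to pick up
    Expand-Archive -Path $DownloadedFile -DestinationPath $ExtractDir -Force
    Get-ChildItem -Path $ExtractDir -Filter '*.csv' |
        Select-Object -First 1 -ExpandProperty FullName
} else {
    # Already a CSV: pass the original path straight through
    $DownloadedFile
}

Keeping the script this small means the SSIS expression only has to consume a single output (the CSV path), which matches the purpose-driven approach above.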
I have a script that I've created to prep our customers' servers for a software install. Part of this requires the script to be run as administrator, so just instructing people to click "Run with PowerShell" doesn't get the job done. The script is in a folder with a number of .ini files that the script needs to copy to different server locations. If I just right-click the PowerShell script and select "Run with PowerShell," it is able to find the files and copy them without issue.

Unfortunately, if I open the script in the ISE, it opens with a default directory of C:\users\user, and I can't seem to copy those .ini files without first running a change-directory command to get to the folder that the script and the .ini files are in. But I'd like our installation techs to be able to run this without worrying about the exact location they initially drop these folders, and without having to change the directory manually in PowerShell. Some of our customers have multiple drives, and it might make sense to put this stuff on something other than the C: drive, so it's hard to tell where this folder might end up. But I'm not sure of a command that will get me to the directory of the *.ps1 file without knowing where that file is beforehand... Anyone have a suggestion?
You can use the automatic variable $PSScriptRoot, which contains the directory the script is located in (available in scripts since PowerShell 3.0); see the sketch below.
This is referenced in the following post:
How can I get the file system location of a PowerShell script?
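A minimal sketch of copying files relative to the script's own folder, regardless of where the folder was dropped or what the current directory is (the destination path is a hypothetical placeholder):

# Copy every .ini sitting next to this script, wherever the folder lives
Get-ChildItem -Path $PSScriptRoot -Filter '*.ini' |
    Copy-Item -Destination 'C:\Server\Config'    # placeholder target location

Because $PSScriptRoot is resolved from the script file itself, this works the same from the ISE, "Run with PowerShell," or an elevated console.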
I have some NUnit automated tests. I am trying to use the NUnit console runner to run those tests over Selenium Grid. I have created a .bat file to launch the tests over the grid, which reads a list of tests from a text file.
When I manually run the .bat file, my tests run and an XML results file is created automatically.
When I have Windows Task Scheduler run the .bat file, my tests run; however, I do NOT get my test results.
I need help understanding why I do not get my results when I use Task Scheduler.
Here are the contents of the .bat file:
nunit-console /result:console-testResults2.xml /work:C:\Selenium\TestResults /runlist:C:\Selenium\testlist.txt "C:\Selenium\VisualStudio\Automated Tests\Automated Tests\AutomatedTests\bin\Debug\AutomatedTests.dll"
I had the same problem; it turned out to be the directory the tests run from. Once I set the "Start in" option of the scheduled task to the same directory as my test DLL, it created the XML file in that directory fine.
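For reference, if you create the scheduled task from PowerShell, the "Start in" directory corresponds to the -WorkingDirectory parameter of the ScheduledTasks module (Windows 8/Server 2012 and later). A sketch, where the .bat file name and the schedule are hypothetical and the working directory comes from the command line above:

# Register a daily task whose working directory matches the test DLL's folder
$action  = New-ScheduledTaskAction -Execute 'C:\Selenium\run-tests.bat' `
    -WorkingDirectory 'C:\Selenium\VisualStudio\Automated Tests\Automated Tests\AutomatedTests\bin\Debug'
$trigger = New-ScheduledTaskTrigger -Daily -At 2am
Register-ScheduledTask -TaskName 'NUnit Selenium run' -Action $action -Trigger $trigger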
I am investigating how to use both GlassFish and Ant build files. I've written a build script (for the first time) which will create a WAR file of my basic Hello World app.
I am then trying to deploy this WAR to GlassFish as part of the build script. I found details of the glassfish-deploy task and have managed to get it included in the build script after adding the Ant tasks JAR file to the classpath.
However, when I run the script I get the message:
[glassfish-deploy] Install Directory of application server not known. Specify either the installDir attribute or the asinstall.dir property
I've tried to find out what is meant by this, but can find no reference to either the installDir attribute or the asinstall.dir property. I have managed to deploy the created WAR file through GlassFish's admin web page, but I cannot get this Ant script to do it successfully.
Any pointers or guidance would be most helpful.
Okay, after a lightbulb moment I managed to resolve this by editing the task in my build script so it is now <glassfish-deploy file="${name}.war" installDir="C:\\glassfish3\\glassfish" force="true"/>
It appears that the Ant script does not know where GlassFish is installed. This resolves the issue and the script runs (and works). However, the next stage is to figure out how to do this without hardcoding the location into the build script, for example by setting the asinstall.dir property mentioned in the error message in an external properties file or on the command line with -Dasinstall.dir=C:\glassfish3\glassfish. That matters especially if I want to write a script that can deploy the WAR file to a remote server.
I would like to zip a bunch of files (.exe and .dll) before I overwrite them with the new build. Is there a simple way to zip files without using some sort of DLL?
Just creating a folder with the build number / date-time stamp would also work great. How do I then pass parameters from the CruiseControl.NET build process into the PowerShell script that will do the work?
Is this a sustainable way to do things?
Thanks
You could either use CCNET's Package Publisher Task directly or zip the files via the PowerShell Task introduced in CCNET 1.5.
Configuration sample for PowerShell Task:
<powershell>
  <description>Adding scheduled jobs</description>
  <scriptsDirectory>ScheduledTasks</scriptsDirectory>
  <script>CreateScheduledJobsFromListOfTasks.ps1</script>
  <buildArgs>-zipDir="C:\foo"</buildArgs>
</powershell>
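On the script side, the buildArgs arrive as ordinary parameters. A sketch of a script that takes -zipDir and archives the existing binaries before the new build overwrites them; note that Compress-Archive requires PowerShell 5+, which postdates CCNET 1.5, so on older hosts you would need another approach (e.g. .NET's System.IO.Compression), and the build-output path here is a placeholder:

param(
    # Populated from <buildArgs>-zipDir="C:\foo"</buildArgs> in the CCNET config
    [string]$zipDir
)

$buildOutput = 'C:\deploy\bin'   # placeholder: the folder the new build will overwrite

# Timestamped archive name, so each build's backup is kept
$stamp   = Get-Date -Format 'yyyyMMdd-HHmmss'
$archive = Join-Path $zipDir "backup-$stamp.zip"

$files = Get-ChildItem -Path $buildOutput -Include '*.exe', '*.dll' -Recurse
Compress-Archive -Path $files.FullName -DestinationPath $archive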