We are running into a deployment nightmare with our current setup:
1 - DEV makes changes to SSIS
2 - All the packages that change have to be uploaded again to MSDB
3 - Once the deployment is done we run dtexec with the /SQL switch (see the example below)
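To illustrate, the current Step 3 looks roughly like this (the package path and server name are made up):

dtexec /SQL "\MyFolder\MyPackage" /SERVER "PRODSQL01"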
What we would essentially like is to remove Step 2 and deploy the dtsx as files rather than upload them to MSDB; that way, any DEV changes that need to go to production are just file pushes.
For that purpose we can use the dtexec /F switch.
The question I had is: is there a way to still use the /SQL switch and have it use the file system so that deployments are transparent, or is there some other way to achieve this?
We ended up using the following solution:
Use the dtsx packages as files rather than uploading them to MSDB.
Use the /F switch inside a command file passed with /COM (/CommandFile).
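A minimal sketch of what that looks like (the file names and UNC paths are hypothetical):

Contents of MyPackage.cmdfile (just the dtexec options to apply):

/F "\\fileserver\ssis\MyPackage.dtsx"

Invocation (dtexec reads the remaining options from the command file):

dtexec /COM "\\fileserver\ssis\MyPackage.cmdfile"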
I have some strange problems when I deploy an application built with Play Framework 2.0. It looks like the deployment didn't clean/overwrite the compiled files. I know this because one method was changed, but the old version is somehow still being called...
My deployment steps:
in the app directory: ../play stop
upload all files to the app directory from development (except application.conf)
../play clean compile stage
in target directory: ./start -Dhttp.port=80 &
I need some "best practices" advice ;)
Some tips:
Try to use play clean-all instead of play clean.
Use Git for controlling changes; maybe you forgot to upload something.
Use an alternative configuration file for running the app in different environments.
Use the dist command for building independent production versions; with some bash scripts + git hooks + a load balancer you'll be able to switch versions without stopping the application. Anyway, remember to move and unzip the created file outside the /dist directory, as it is cleaned every time you call the play dist command (see the sketch below).
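A rough sketch of that flow, assuming a Unix host (the app name, version and paths are made up):

../play clean-all dist
mkdir -p ~/releases
mv dist/myapp-1.0.zip ~/releases/    # move it out of /dist before the next `play dist` wipes it
cd ~/releases && unzip -o myapp-1.0.zip
myapp-1.0/start -Dhttp.port=80 -Dconfig.file=/etc/myapp/production.conf &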
How can I run a custom script that will upload the ClickOnce deployment files to a web server (in my case Windows Azure Blob Storage) right after publishing? Is it possible to modify the MSBuild file in some way so it would run a custom script right after ClickOnce publishes the files into a local folder?
Yes, you can hook into the build process using various techniques:
pre- and post-build actions (from the Visual Studio project properties menu). It's actually an Exec task hooked into your project file
you can override the DependsOn property for a concrete target and append execution of your own target (the pre-MSBuild 4.0 way)
you can declare your own target and hook it in with the AfterTargets\BeforeTargets attributes (the MSBuild 4.0 way).
As for uploading something to blob storage - you can use an Exec task in your own target to upload, or use whatever tool\script you usually use to upload files to a website\Blob storage.
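As a rough illustration of the first option, a post-build event command along these lines could call an upload script (UploadToBlob.ps1 and its parameters are placeholders for whatever tool you actually use):

rem UploadToBlob.ps1, -Source and -Container are hypothetical
powershell.exe -ExecutionPolicy Bypass -File "$(ProjectDir)Scripts\UploadToBlob.ps1" -Source "$(ProjectDir)publish" -Container clickonce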
NB: You could clarify your question with the following points (if you need a more concrete answer):
what kind of build process you are using - a build from VS, a CI server with a custom MSBuild script, a CI server that builds your .sln file, etc.
what kind of script\tool you want to execute to upload the build result.
whether you know the name of the last executed MSBuild target, after which you want to fire your tool.
Is there a way to create a "build" but not actually compile the output of the site? Basically I want to push the files live from source control in TFS to the final IIS folder destination.
I have used CopyDirectory on my other project builds, but that requires a BuildDetail.DropLocation (a compiled build). Maybe there is another option for the CopyDirectory Source I could use that wouldn't require a build DropLocation.
To simplify: I want to copy files directly from TFS source control to a folder, using a build template, but without compiling the files. Is that possible?
While the default .xaml build workflow for Team Foundation Build is indeed a compilation build, it does not have to be. I usually recommend that teams have at least one compile and one deploy .xaml workflow.
1) The CompileMyStuff.xaml (DefaultBuildTemplate.xaml) should take my stuff from source control and do whatever is needed to create a Build Drop folder with my output. I may or may not need to actually compile before creating the drop, and it looks like you just want to copy to the drop location.
2) The DeployMyStuff.xaml should take a build number and deploy my code to an environment of my choice.
It looks like you want to skip the "Drop" and go straight to deploy, and while I would never recommend this, you do have a "BuildDetail.BuildLocation" for the local workspace where the build server has done a get of your code. You can just "CopyDirectory" from there to your server/host for the website.
If you are having a little trouble you could use the Community Build Extensions and fire up PowerShell to do your copy/deploy.
I figured out a solution to this problem. I created a new .xaml file, and the only item I put in the sequence was "DownloadFiles". Then I filled out the properties of the activity, ran a "build", and it worked.
We've started using TFS 2010 at the company I work for. We create e-commerce web applications (shopping sites). I'm creating a custom template to deploy web projects after a build using a build template.
I've looked at the web deploy tool, but MSDN seems to indicate that it can only do initial deployments, and I need to be able to do incremental deployments with the same script.
I'm thinking of using the InvokeActivity activity in the template to have PowerShell do the job by specifying an FTP script which automatically copies the output of a build to a designated FTP site and then runs the SQL (upgrade) scripts, if needed, using SSH or a PowerShell remoting interactive session (possibly specified in a separate SQL script).
There are some unknowns for me which I haven't been able to clear up through Google:
When queuing a build, will I be able to let the user specify a script present in source control (e.g. $(source)\scripts\ftpscript.ps1) as the script which is to be used? Will PowerShell be able to access/use that file, or should I copy it to the build directory and specify it when I run it? (I don't know how to set up the template to get files from source control, so a pointer to some helpful info on how to do that would be very much appreciated.)
If the previous just doesn't work at all, could I create a folder \scripts\ in my website project, commit that to source control, and then use BuildDetail.DropLocationRoot & "\scripts\" as the location for the script, and force a copy of the script files by enabling the force copy option?
To run a PowerShell script I think you can use the InvokeProcess activity which would trigger something like this:
%windir%\system32\WindowsPowerShell\v1.0\powershell.exe "$(SolutionRoot)\test.ps1"
And yes, you can reach a script file present in source control using the "SourcesDirectory" keyword.
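For example, the InvokeProcess properties could be set roughly like this (the \scripts\ftpscript.ps1 path is only an assumption based on your \scripts\ folder idea):

FileName:  "%windir%\system32\WindowsPowerShell\v1.0\powershell.exe"
Arguments: "-ExecutionPolicy Bypass -File """ & SourcesDirectory & "\scripts\ftpscript.ps1"""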
I'm using jammit to package the js and css files for a rails project.
I would like now to upload the files to Amazon S3 and use CloudFront for the delivery.
What would be the best way to deal with new versions?
My ideal solution would be to have a capistrano recipe to deal with it.
Has anyone already done something like that?
You could simply create a Capistrano task that triggers the copy to S3 after deploying.
You might use s3cmd as the command line tool for that.
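For example, such a task could shell out to something like this after deploy (the bucket name and the Jammit output directory are made up):

s3cmd sync --acl-public public/assets/ s3://my-assets-bucket/assets/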
Alternatively you could create a folder mounted by FuseOverAmazon and configure it as the package_path in your Jammit assets.yml. Make sure to run the rake task for generating the asset packages manually or in your deploy recipe.
http://s3tools.org/s3cmd
http://code.google.com/p/s3fs/wiki/FuseOverAmazon