What is the proper way of delivering temporary build-time assets using NuGet?
I am making a NuGet package with a single file, which dependent projects require during the build phase. I would like the content of the file to be copied to the obj\$(Configuration) folder inside a dependent project before the rest of the build proceeds. Of course, the obj folder is temporary, so I would like my file to be copied there again as part of the next build if obj gets cleared out.
I tried the contentFiles approach described here. It takes care of packaging my file inside the nupkg file, but I was unable to set it up so that my file gets delivered (and re-delivered) to obj\$(Configuration).
You're looking for NuGet's MSBuild extensibility. Unfortunately, it means you'll need to learn a bit about MSBuild if you don't already know it. I recommend running msbuild -bl or dotnet build -bl, which creates an msbuild.binlog file that you can view with the MSBuild structured log viewer.
One option is to have a target that copies the file into the intermediate output directory at an appropriate time (you'll probably need BeforeTargets). You could use the Inputs and Outputs attributes to have MSBuild do incremental build checks and skip the copy when it doesn't need to, possibly making the build a little faster.
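For illustration, a minimal sketch of what such a .targets file (shipped under the package's build or buildTransitive folder) might look like; the file name mydata.txt and the target name are assumptions, not anything prescribed by NuGet:

```xml
<!-- Hypothetical MyPackage.targets shipped under build\ (or buildTransitive\)
     in the package. MSBuildThisFileDirectory points at the extracted package
     folder; mydata.txt is an illustrative file name. -->
<Project>
  <Target Name="CopyMyPackageAsset"
          BeforeTargets="BeforeBuild"
          Inputs="$(MSBuildThisFileDirectory)mydata.txt"
          Outputs="$(IntermediateOutputPath)mydata.txt">
    <!-- Inputs/Outputs let MSBuild skip this target when the copy in
         obj\$(Configuration) is already up to date. -->
    <Copy SourceFiles="$(MSBuildThisFileDirectory)mydata.txt"
          DestinationFolder="$(IntermediateOutputPath)" />
  </Target>
</Project>
```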
However, unless the file has dynamic content, copying it is a waste: it's just going to be consumed as an item in another part of the build process anyway. So, if it's static content, you could simply create the relevant item in your targets file pointing at your package's extracted directory, and then it's just as good as if it had been copied to the intermediate output directory, without the wasted time and duplicated disk space.
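A sketch of that alternative, again with assumed names (the item type MyBuildAsset is purely illustrative; use whatever item type the consuming build step actually reads):

```xml
<!-- Hypothetical alternative .targets content: no copy at all, just an item
     that points straight at the file in the extracted package folder. -->
<Project>
  <ItemGroup>
    <!-- MyBuildAsset and mydata.txt are assumed names. -->
    <MyBuildAsset Include="$(MSBuildThisFileDirectory)mydata.txt" />
  </ItemGroup>
</Project>
```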
Related
Is it possible to include arbitrary files (in this case a .csv) from a TwinCAT project directly in the Boot directory of a PLC?
By using PATH_BOOTPATH in the file open/read FBs it is possible to load files from this directory in a convenient manner, regardless of whether a CE or Windows deployment is used. However, deploying files to this location seems to be the sticking point.
I know that a copy of the project code is included within the CurrentConfig<Project>.tpzip file, but this file is not easily accessible from code, nor easily updateable.
I've found the 'Additional Files' section within the system configuration, but its behaviour makes little sense:
Adding a file from inside the project as a 'Relative' path doesn't seem to do anything.
Adding a file from inside the project as an external path includes the file (via symbolic links?) in the 'CurrentConfig.tszip' file, which has the same issues as the .tpzip.
Adding an external file as an external path again includes the file inside the .tszip.
I'm willing to accept that this might not be possible, but it just feels odd that the PATH_BOOTPRJ and PATH_BOOTPATH roots are there yet don't give access to useful paths.
Deployment
To quote Beckhoff:
Deployment is used to set up commands that are to be executed during the installation and startup of an application.
The event types essentially describe at which stage of the deployment process the command is performed; the command can be either copying a file or executing a script/program.
I haven't performed extensive testing, but between absolute/relative pathing and command execution this should solve nearly all deployment configuration issues.
Is there any way I can just call into a define such as LIBFOO_DIRCLEAN and do what was implemented in that define?
Inside HOST_LIBFOO_INSTALL_CMDS, I copy files to the target directory, and I would like 'make package-dirclean' to delete what was copied into the target directory. 'make clean' would obviously do this (and much more), but that is far more than I want to do.
I see the following Buildroot variables: LIBFOO_EXTRACT_CMDS, LIBFOO_CONFIGURE_CMDS, LIBFOO_BUILD_CMDS, HOST_LIBFOO_INSTALL_CMDS, LIBFOO_INSTALL_TARGET_CMDS, etc.
make foo-dirclean is a simple rule that just deletes the package build directory. In most cases, when the list of files installed by a package does not change over time (only the file contents change), you can simply rebuild the package and the target directory will be rebuilt correctly.
If you want, you can implement your own foo-myclean step with your own logic. However, you must understand that deleting files in the target directory is not supported by Buildroot, so you are on your own.
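If you do go that route, a rough sketch of such a custom step in package/libfoo/libfoo.mk might look like the following; the installed-file list is an assumption you would have to maintain by hand, and, as noted above, removing files from the target directory is unsupported:

```make
# Hypothetical addition to package/libfoo/libfoo.mk.
# LIBFOO_INSTALLED_FILES is an assumed, hand-maintained list of the files
# that HOST_LIBFOO_INSTALL_CMDS copies into $(TARGET_DIR).
LIBFOO_INSTALLED_FILES = usr/bin/foo usr/lib/libfoo.so

.PHONY: libfoo-myclean
# (the recipe line below must start with a tab)
libfoo-myclean:
	rm -f $(addprefix $(TARGET_DIR)/,$(LIBFOO_INSTALLED_FILES))
```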
So I'm trying to get the custom build task directory name from PowerShell when executing a custom build task.
The purpose is that I want JSHint to run at build time, and I've got it doing so, but the .jshintignore file needs to know a relative path to exclude files or folders.
So I need to be able to get that path at runtime in order to know how many "../" to prepend to the excluded files for the minimatch engine, which JSHint uses to match them.
I can, of course, hard code it, but that's really not what I'd prefer to do.
Yes, you can use the Agent.HomeDirectory variable.
Agent.HomeDirectory | AGENT_HOMEDIRECTORY | The directory the agent is installed into. This contains the agent bits.
The tasks folder will be $(Agent.HomeDirectory)\tasks\(TaskFolder)\
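As a minimal sketch, assuming the task folder is named MyJsHintTask (a hypothetical name), the task's PowerShell script could resolve its own directory like this:

```powershell
# Minimal sketch: Agent.HomeDirectory is exposed to scripts as the
# AGENT_HOMEDIRECTORY environment variable; 'MyJsHintTask' is a
# hypothetical task folder name.
$agentHome = $env:AGENT_HOMEDIRECTORY
$taskDir   = Join-Path $agentHome 'tasks\MyJsHintTask'
Write-Host "Custom task directory: $taskDir"
```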
Is there a way to create a "build" but not actually compile the output of the site? Basically I want to push the files live from source control in TFS to the final IIS folder destination.
I have used CopyDirectory on my other project builds, but that requires a BuildDetail.DropLocation (compiled Build). Maybe there is another option for the CopyDirectory Source I could use that wouldn't require a build DropLocation.
To simplify: I want to copy files directly from TFS source control to a folder, using a build template, but without compiling the files. Is that possible?
While the default .xaml build workflow for Team Foundation Build is indeed a compilation build, it does not have to be. I usually recommend that teams have at least one compile and one deploy .xaml workflow.
1) The CompileMyStuff.xaml (DefaultBuildTemplate.xaml) should take my stuff from source control and do whatever is needed to create a Build Drop folder with my output. I may or may not need to actually compile before creating the drop, and it looks like you just want to copy to the drop location.
2) The DeployMyStuff.xaml should take a build number and deploy my code to an environment of my choice.
It looks like you want to skip the "Drop" and go straight to deploy, and while I would never recommend this, you do have a "BuildDetail.BuildLocation" for the local workspace where the build server has done a get of your code. You can just "CopyDirectory" from there to your server/host for the website.
If you are having a little trouble you could use the Community Build Extensions and fire up PowerShell to do your copy/deploy.
I figured out a solution to this problem. I created a new .xaml file, and the only item that I put in the sequence was "DownloadFiles". Then I filled out the properties of the activity, ran a "build", and it worked.
I store all SSIS packages in a Subversion repository, and their configuration files as well. The configuration file is almost always stored in the same folder as the package.
The problem is that SSIS always seems to store the path to the configuration file (the one saved in the package itself) as an absolute path.
When someone else checks out the folder with the package to a location different from the one on my development PC, the configuration file is not detected (because my absolute path is stored and it doesn't exist on the other developer's PC). So the other developer has to remove this configuration and add it again from where it now sits on his local hard drive. The changed package is then saved, which causes a new version to be committed. When I get that version from SVN, it no longer matches the local path on my PC.
On a related note, another developer may want to change values in the configuration file as well. If I later get the latest version of everything from SVN, the package will no longer work on my PC.
How do you work around these inconveniences?
Another solution is to save your configuration in a database, with an environment variable as the first configuration to tell it which database to look in; that's what we do. We keep scripts to populate ssisconfig for each server in our source control, but the package uses the actual table data from the database named in the environment variable we are using.
Anyone who has heard my SQL Saturday presentations knows I don't much care for XML, and this is one of the reasons. A trick to using XML configuration with varying locations is to use an environment variable (indirect configuration) to tell SSIS where it can look for that resource. The big, big downside to this approach is that you'd generally need to create an environment variable for each set of configuration files, or have a massive, honking .dtsconfig file, which becomes painful for versioning.
The option I prefer, if XML configuration is a must, is to remove the "variableness" altogether. Developers and admins get together and everyone agrees "there will be a folder everywhere SSIS is done to hold configuration files, and that location is X", and then it's just a matter of solving for X. At a previous job, we used D:\ssisdata\configs.
@HLGEM's approach of a table for configurations is hands down my favorite approach to SSIS configuration (until you get to 2012 and the project deployment model, where configuration is an entirely different animal).
I add a folder called "config" under my project's folder, add it to source control, and maintain the config file in this folder. You can also add it to the SSIS project if you like.
I think it's a good solution because everybody can have this folder and download the config file.
When the package is deployed, it will read the config file from the location you specify in the deployment manifest, so this solution won't impact your development.