I have a build server, and I would like to copy the newest deployment file from it to another location on a remote server using MSBuild, but I am stuck and confused about how to do this.
So in my build output directory I have files such as these:
Installer - 2.5.1403.1201.msi
Installer - 2.5.1405.0701.msi
Now I want to copy the newest file, Installer - 2.5.1405.0701.msi, to a remote server called ServerB.
I read that PsExec shouldn't be used to copy files from a build server to another server. Is there a reason why?
Currently I have the following code, but I have only got it working locally on my machine:
<ItemGroup>
<File Include="C:\LocalCopy\Installer - 2.5.1403.1201.msi" />
</ItemGroup>
<PropertyGroup>
<DestinationFolder>C:\Dump</DestinationFolder>
</PropertyGroup>
<Import Project="$(MSBuildToolsPath)\Microsoft.CSharp.targets" />
<Target Name="AfterBuild" Inputs="@(File)" Outputs="@(File -> '$(DestinationFolder)\%(RelativeDir)%(Filename)%(Extension)')">
<Copy SourceFiles="@(File)" DestinationFolder="$(DestinationFolder)" />
</Target>
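One direction I've been toying with is to point the copy at a UNC path and let PowerShell pick the newest installer. This is only a rough sketch and assumes ServerB exposes a writable share; \\ServerB\Dump and the target name are made up, and I haven't verified the permissions:
<Target Name="CopyNewestInstaller" AfterTargets="Build">
  <!-- \\ServerB\Dump is an assumed share; the build account would need write access to it -->
  <Exec Command="powershell.exe -NoProfile -Command &quot;Get-ChildItem 'C:\LocalCopy\Installer - *.msi' | Sort-Object LastWriteTime | Select-Object -Last 1 | Copy-Item -Destination '\\ServerB\Dump'&quot;" />
</Target>
Part of what I'm asking is whether a plain copy over a share like this is reasonable, or whether something like PsExec is needed at all.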
Can someone give me advice on how I could go about doing this?
Thanks & Regards,
Related
I'm setting up database projects for the first time and I'm trying to build/deploy using Azure DevOps. I'm using the MSBuild task hosted in Azure (windows-2019) for the build, and a Command Line task running on the SQL server to execute SqlPackage.exe from a working directory of
C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\Common7\IDE\Extensions\Microsoft\SQLDB\DAC\140.
I'm getting errors for the master & msdb databases during deployment with SqlPackage.exe:
No file was supplied for reference master.dacpac; deployment might fail. When C:\Jen_DacpacTest\Artifact\whatever.dacpac was created, the original referenced file was located C:\PROGRAM FILES (X86)\MICROSOFT VISUAL STUDIO\2019\ENTERPRISE\COMMON7\IDE\EXTENSIONS\MICROSOFT\SQLDB\EXTENSIONS\SQLSERVER\140\SQLSCHEMAS\MASTER.DACPAC.
I thought that the references to master and msdb were resolved on build and again on deploy (since the path is a variable), but that doesn't seem to be the case. It seems to figure out what $(DacPacRootPath) is during the build and "hardcodes" the references in the model.xml file that is generated inside the dacpac. The references look like this in the proj file:
<ArtifactReference Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\master.dacpac">
<HintPath>$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\master.dacpac</HintPath>
<SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
<DatabaseVariableLiteralValue>master</DatabaseVariableLiteralValue>
</ArtifactReference>
<ArtifactReference Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\msdb.dacpac">
<HintPath>$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\msdb.dacpac</HintPath>
<SuppressMissingDependenciesErrors>False</SuppressMissingDependenciesErrors>
<DatabaseVariableLiteralValue>msdb</DatabaseVariableLiteralValue>
</ArtifactReference>
Here's what it looks like in the model.xml file if the dacpac is unpacked:
<CustomData Category="Reference" Type="SqlSchema">
<Metadata Name="FileName" Value="C:\PROGRAM FILES (X86)\MICROSOFT VISUAL STUDIO\2019\ENTERPRISE\COMMON7\IDE\EXTENSIONS\MICROSOFT\SQLDB\EXTENSIONS\SQLSERVER\140\SQLSCHEMAS\MASTER.DACPAC" />
<Metadata Name="LogicalName" Value="master.dacpac" />
<Metadata Name="ExternalParts" Value="[master]" />
<Metadata Name="SuppressMissingDependenciesErrors" Value="False" />
</CustomData>
<CustomData Category="Reference" Type="SqlSchema">
<Metadata Name="FileName" Value="C:\PROGRAM FILES (X86)\MICROSOFT VISUAL STUDIO\2019\ENTERPRISE\COMMON7\IDE\EXTENSIONS\MICROSOFT\SQLDB\EXTENSIONS\SQLSERVER\140\SQLSCHEMAS\MSDB.DACPAC" />
<Metadata Name="LogicalName" Value="msdb.dacpac" />
<Metadata Name="ExternalParts" Value="[msdb]" />
<Metadata Name="SuppressMissingDependenciesErrors" Value="False" />
</CustomData>
So I'm wondering how this is supposed to work, since the Azure-hosted MSBuild task (and the Visual Studio Build task) references the master and msdb dacpacs from a Visual Studio Enterprise folder, but on the SQL server, where SqlPackage.exe is run to do the deployment, those files live in the BuildTools folder instead of Enterprise. (And I don't really think I should have to install VS Enterprise edition on my SQL server to get this to work?)
In Azure:
C:\PROGRAM FILES (X86)\MICROSOFT VISUAL STUDIO\2019\ENTERPRISE\COMMON7\IDE\EXTENSIONS\MICROSOFT\SQLDB\EXTENSIONS\SQLSERVER\140\SQLSCHEMAS
On the SQL server:
C:\Program Files (x86)\Microsoft Visual Studio\2019\BuildTools\Common7\IDE\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SQLSchemas
It seems weird that the path to these files on the build server would have to exactly match what's on the SQL server. I thought there would be a bit more magic happening to find these dacpac references, since from what I understand you aren't supposed to add them to your project manually and drag them around everywhere.
Database projects deployment - No file was supplied for reference master.dacpac
This is a typical SSDT issue. The path to these files on the build server is not the main cause of your issue.
Please check "No file was supplied for reference ABC.dacpac; deployment might fail" and "Error: The reference to external elements from the source named 'master.dacpac' could not be resolved".
Try copying master.dacpac to the same folder where your xx.dacpac exists (you can do this with the copy/xcopy command in cmd), and make sure the working directory of your Command Line task is that same folder. Hope it helps :)
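If you prefer to script that copy, one option (just a sketch, not the only way) is to have the build drop master.dacpac and msdb.dacpac next to your own dacpac so they travel with the artifact. Something along these lines could go in the .sqlproj; it reuses the $(DacPacRootPath) reference the project already has, and the target name is made up:
<Target Name="CopySystemDacpacs" AfterTargets="Build">
  <ItemGroup>
    <!-- the same locations the ArtifactReference items point at on the build machine -->
    <SystemDacpac Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\master.dacpac" />
    <SystemDacpac Include="$(DacPacRootPath)\Extensions\Microsoft\SQLDB\Extensions\SqlServer\140\SqlSchemas\msdb.dacpac" />
  </ItemGroup>
  <!-- $(OutputPath) is where the project's own dacpac is produced -->
  <Copy SourceFiles="@(SystemDacpac)" DestinationFolder="$(OutputPath)" />
</Target>
Either way, the point is simply that master.dacpac and msdb.dacpac end up in the folder SqlPackage.exe reads your dacpac from.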
I have a problem with WiX when I try to generate an MSI (WiX v3.11).
My product is a Windows service that must be installed, and during the installation process I launch a form that collects the database connection information. This information is stored in a .config file that is installed in the same application folder.
The problem is that when I update the package, the configuration file must be kept, but when I uninstall the application, the configuration file should be deleted.
The configuration file can be modified during or after installation.
<MajorUpgrade Schedule="afterInstallInitialize" />
<ComponentGroup Id="ConfigFiles" Directory="INSTALL_SERVICE">
<Component Id="ConfigFile" Guid="11FDDC05-F4D2-4418-82E8-0CB3B3784300" Win64="$(var.Win64)" NeverOverwrite="yes" >
<File Id="F.config" Name="service.config" DiskId="1" Vital="yes" KeyPath="yes" Source="..\Resources\service.config" />
<RemoveFile Id="CleanUpLogFile" On="uninstall" Name="service.config"/>
</Component>
</ComponentGroup>
With this I have managed to get the file deleted on uninstall and left untouched during an update, but the upgrade process itself fails.
I have tried and read many solutions on the web, but I can't get any of them to work.
I am using ClickOnce application deployment, and I just got my code certificate from Verisign. I am using this certificate to sign the manifest.
When I download and install the application, SmartScreen comes up with my name on it (lame, but I think this is what is supposed to happen). When the ClickOnce installer completes, SmartScreen comes up again for the execution of the actual application, and here it says 'Unknown Publisher'.
Does ClickOnce not sign the assemblies by default? How do I do this?
Edit: Currently I am letting VS sign my manifest (installer) for the ClickOnce, and I am using a post-build event to sign my assembly. But still, when I install the application, it says 'Unknown Publisher' when I go to actually run it.
That does not sound right to me. I have used exactly the same workflow for multiple applications, and it works fine. Most likely there is an issue with your post-build step. Make sure that you sign the EXE file inside the OBJ folder (because that's where ClickOnce takes all the files from), not the BIN one.
Do the ClickOnce publishing, go to the OBJ folder, right-click on your application.exe file, and select Properties. It should have six tabs, the last one being "Digital Signature".
If you don't have it, you don't sign your application properly.
And here is my post-build step; note that I sign the "RELEASE" configuration only:
<Target Name="SignOutput" AfterTargets="CoreCompile" Condition="'$(ConfigurationName)'=='Release'">
<PropertyGroup>
<TimestampServerUrl>http://timestamp.verisign.com/scripts/timestamp.dll</TimestampServerUrl>
<ApplicationDescription>my app</ApplicationDescription>
<SigningCertificateCriteria>/n "my company."</SigningCertificateCriteria>
</PropertyGroup>
<ItemGroup>
<SignableFiles Include="$(ProjectDir)obj\$(ConfigurationName)\$(TargetName)$(TargetExt)" />
</ItemGroup>
<Exec Condition=" '$(ConfigurationName)'=='Release'" Command="&quot;c:\Program Files (x86)\Windows Kits\8.0\bin\x64\signtool.exe&quot; sign $(SigningCertificateCriteria) /d &quot;$(ApplicationDescription)&quot; /t &quot;$(TimestampServerUrl)&quot; &quot;%(SignableFiles.Identity)&quot;" />
</Target>
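If you'd rather have the build confirm the signature instead of checking the file-properties dialog by hand, a follow-up target along these lines should work (a sketch that reuses the signtool path and obj-folder convention from above; signtool verify /pa exits non-zero when there is no valid Authenticode signature, which fails the build):
<Target Name="VerifySignature" AfterTargets="SignOutput" Condition="'$(ConfigurationName)'=='Release'">
  <!-- verifies the freshly signed EXE in obj; a failed verification fails the build -->
  <Exec Command="&quot;c:\Program Files (x86)\Windows Kits\8.0\bin\x64\signtool.exe&quot; verify /pa &quot;$(ProjectDir)obj\$(ConfigurationName)\$(TargetName)$(TargetExt)&quot;" />
</Target>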
Currently I have a post-build event configured in my web project using Visual Studio 2012 like this:
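(The screenshot of the post-build event isn't reproduced here; judging by the script referenced later in this question, it was roughly equivalent to a PostBuildEvent like the following in the .csproj, a reconstruction rather than the exact original:)
<PropertyGroup>
  <!-- reconstructed from the Copyright.ps1 call shown further down; the exact original command may differ -->
  <PostBuildEvent>powershell.exe -file "$(ProjectDir)\Copyright.ps1"</PostBuildEvent>
</PropertyGroup>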
This basically calls a PowerShell script to add a copyright notice to every .cs file.
What I'd like to do is execute this PowerShell script only before publishing the web app to the remote server. That way I won't experience a delay every time I need to debug the project. Do you know of any way of accomplishing this?
Following Sayed's answer, I customized a specific publish profile and added this:
<PropertyGroup>
<PipelineDependsOn>
CustomBeforePublish;
$(PipelineDependsOn);
</PipelineDependsOn>
</PropertyGroup>
<Target Name="CustomBeforePublish">
<Message Text="******* CustomBeforePublish *******" Importance="high" />
<Exec Command="powershell.exe -file &quot;$(ProjectDir)\Copyright.ps1&quot;" />
</Target>
It depends on how you define "before", but below is one technique.
When you create a publish profile with VS2012, it will create a .pubxml file for you in the Properties\PublishProfiles folder (My Project\PublishProfiles for VB). These are MSBuild files, and you can edit them to customize the publish process. In your case you can inject a target into the publish process, before the publish actually occurs. You can do that by extending the PipelineDependsOn property as below.
<PropertyGroup>
<PipelineDependsOn>
CustomBeforePublish;
$(PipelineDependsOn);
</PipelineDependsOn>
</PropertyGroup>
<Target Name="CustomBeforePublish">
<Message Text="********************************** CustomBeforePublish ***********************************" Importance="high"/>
</Target>
FYI, regarding the customization of .wpp.targets: that was the only technique we had for VS2010. My recommendation here is as follows: customize the .pubxml file for most cases, and only create a .wpp.targets file if you want to customize every publish of the given project.
Sayed's answer nails the problem. However, I thought I'd provide a fully working answer (tested in Visual Studio 2017):
<PropertyGroup>
<PipelineDependsOn>
PreBuildScript;
$(PipelineDependsOn);
</PipelineDependsOn>
</PropertyGroup>
<Target Name="PreBuildScript">
<Message Text="Executing prebuild script" Importance="high"/>
<Exec Command="powershell.exe -file &quot;$(ProjectDir)\InnerFolder\script.ps1&quot;" />
</Target>
Note: This will execute for both Preview and actual Publish action, so one can find out pre-publish errors before the actual Publish.
Declare the following ProjectName.wpp.targets file in the root of your web application:
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<PropertyGroup>
<BeforeAddContentPathToSourceManifest>
$(BeforeAddContentPathToSourceManifest);
AddCopyright;
</BeforeAddContentPathToSourceManifest>
</PropertyGroup>
<Target Name="AddCopyright">
<!-- I recommend passing in $(_MSDeployDirPath_FullPath) to your script
as the base path to search to avoid having to perform a VCS rollback
(files are copied there before the deployment)
-->
<Exec Command="powershell.exe -file &quot;$(SolutionDir)Copyright.ps1&quot; &quot;$(_MSDeployDirPath_FullPath)&quot;" />
</Target>
</Project>
I have the following target in my NAnt script:
<target name="update" verbose="true">
<copy todir="${dirs.deploy}">
<fileset basedir="${dirs.drop}\_PublishedWebSites\RomanceReminder.Web">
<include name="**/*.*" />
</fileset>
</copy>
</target>
When I run this script manually, the following output is visible in the log:
[nant] C:\Projects\RomanceReminder\BuildScripts.Custom\_test_deploy.build
Buildfile: file:///C:/Projects/RomanceReminder/BuildScripts.Custom/_test_deploy.build
Target framework: Microsoft .NET Framework 3.5
Target(s) specified: go
error_check:
stop_w3svc:
cleanup:
[echo] Deleting C:\Webs\Nightly.
update:
[copy] Copying 93 files to 'C:\Webs\Nightly'.
start_w3svc:
go:
BUILD SUCCEEDED
Total time: 2.6 seconds.
As you can see, it copies 93 files into the Webs\Nightly folder.
When this script is run via TeamCity, the copy doesn't happen for some reason. TeamCity is running under an admin account, so it should have all the permissions it needs. The log file for TC shows the exact text above, except the update task shows nothing.
Anyone have ideas on how I can even troubleshoot this?
UPDATE: I flipped the bit on the copy task to enable verbose logging, and now I see the following in my TeamCity log:
[copy] Copying 0 files to 'C:\Webs\Nightly'.
I am still flummoxed by the fact that I can run it from the command line and everything works, but TC doesn't copy the files... 8(
User Error User Error User Error
Of course, I was not trusting the tool and assumed it was doing something wrong. The drop directory is only populated in the package step, and this particular script executes before that. TeamCity destroys the build directory every time it runs, including the drop directory. So NAnt was correct: there were no files to copy. I modified my script to use the build output, and all is good with the world.