I tried searching with as many different terms as I could and couldn't find exactly what I'm looking for.
I have a C++ project developed in Visual Studio 2019, and I am trying to build and deploy it with Azure Pipelines. It uses Boost and OpenCV. I skipped trying to publish these to Azure Artifacts because of a rabbit hole of Azure CLI errors that cost me almost half a day.
So it seems there is a task for publishing pipeline artifacts in the .yml file. How do I do this when my project needs to reference a whole directory, instead of one specific file or .dll? Here are images showing how this is configured in Visual Studio:
[Image: include directory settings for Boost]
[Image: include directory settings for OpenCV]
Edit: Still trying, see my comment. Thinking about switching over to CircleCI.
I found out what to do. Hopefully no one else wastes as much time as I did.
The key was MSBuild. First, find out the values of $(IncludePath) and $(LibraryPath) by doing the following in Visual Studio:
Right-click on your project, choose "Properties"
Go to the Build Events tab, and click "Pre-Build Event"
Click on and expand the Command Line row, and click "Edit"
Now click the button that says "Macros>>"
You will see a bunch of different variables and their values. Find the values for LibraryPath and IncludePath, and copy and paste them into a text file.
Now, assuming you already set up a local agent, follow these steps:
Put the text file in the root folder of where your agent is installed. For me, this was "C:\agents"
Have the first line be "LibraryPath=value" and the second line be "IncludePath=value". Use double backslashes in the directory paths.
Rename the file to .env. If the agent is currently running, restart it so it can read in the environment variables it will use during your build.
In the MSBuild task of your pipeline, specify additional arguments (see the sketch after these steps). For my case, it was simply this: /p:IncludePath="C:\Program Files\boost_1_77_0;$(IncludePath)" /p:LibraryPath="$(LibraryPath)"
Run the pipeline. You can then check the completed build on the local machine. For me, the build output kept going to "C:\agents\_work\2\s".
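For reference, here is a minimal sketch of the two pieces, with made-up values: the Boost path matches the argument above, while the other include/library paths are placeholders you would replace with whatever you copied out of the Macros dialog.

The .env file at C:\agents\.env (one variable per line, double backslashes, no quotes):

    IncludePath=C:\\opencv\\build\\include;C:\\some\\other\\include
    LibraryPath=C:\\opencv\\build\\x64\\vc15\\lib;C:\\some\\other\\lib

The arguments given to the MSBuild task in the pipeline:

    /p:IncludePath="C:\Program Files\boost_1_77_0;$(IncludePath)" /p:LibraryPath="$(LibraryPath)"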
Related
We are creating a new data warehouse using SSIS and are looking at BIML Studio. I know that for BIML Express I need Visual Studio, but for BIML Studio it seems that we don't even need Visual Studio if we develop our entire ETL with BIML Studio. Is this correct or do I still need Visual Studio in some way?
Standard consulting answer: "It depends"
Your BimlStudio workflow is probably going to be a few BimlScript files that contain your core logic, and then there are going to be the generated artifacts. As an example, here's a shot of the Logical view of my current Biml project. It's a large and still growing "export procs to fixed width files" solution for a client.
Since I need to write to a flat file, each package needs a Flat File Connection Manager, and a Flat File Connection Manager needs a Flat File Format definition. So, 1 logical entity requires 3 Biml artifacts (at least for how I'm building it).
What you see in the BimlScripts and Connections folders is what runs the project (plus the custom metadata repository, aka "one big table").
The black circle next to 01_FFF.biml means that's a "live" Biml file, so every time I make a change to my metadata or the underlying file, whoosh, out come 45 File Format entries (project view).
All of this is great but eventually, I need to translate what's in my Integration Services node (1 project, 45 Packages) into a deliverable.
What's your deliverable?
Right-clicking on the project gives me 3 options: Build, Build & Run, and Build & Open in SSDT.
Build - this results in a .ispac file being created. That's the unit of deployment for pushing a project deployment model into the SSISDB.
Build & Run - Honestly, I don't know what this option does. I should check the book https://link.springer.com/book/10.1007/978-1-4842-3135-7
Build & Open in SSDT - This results in everything you need to interact with the project in Visual Studio (a .dtproj file, Project.params, any Project level .conmgr files and all the associated .dtsx files)
When would I need Visual Studio?
Debugging. I'm pretty good at this stuff, but even I miss some settings for things I don't have solid patterns for. For example, this project uses fixed width file formats, and in early iterations I was getting defects opened because the files weren't correct: the default file encoding was Unicode, despite each individual column being defined as DT_STR (non-Unicode). Little stuff like that is much easier to find and fix back in BimlStudio. Otherwise you're trying to debug the results of a package execution, and if you had known you had the wrong pattern, you wouldn't have built the bug into your pattern in the first place.
I try to use programs or scripts from GitHub from time to time, and several times I have found myself unable to execute the program, generally because the tutorial points towards a .exe that does not exist in the GitHub repository (the folder containing all the project files).
This is a perfect example:
https://github.com/agaboduarte/AliExpressScraper
The instructions indicate that to use it you run
alishop.exe -ProductId=32704963843
or
alishop.exe -ProductUrl=https://pt.aliexpress.com/store/product/Original-Meizu-MEILAN-E-5-5-inch-2-5D-FHD-1080P-MTK-Helio-P10-Octa-Core/103919_32712980451.html?detailNewVersion=&categoryId=5090301&spm=a2g03.8047714.2169898.2.6F0m5X
from the command line.
But there is no such file as 'alishop.exe' in this repo!
The only .exe file actually present is NuGet.exe, so I tried to use
NuGet.exe -ProductId=32704963843
Result: the command '-ProductId=32704963843' is reported as 'unknown' (the command was launched from the folder containing NuGet.exe, of course).
Any idea what I am doing wrong?
Source code must typically be built or compiled before you can run it. In this case, it's a C# project, and it expects you to have Visual Studio (or at least the MSBuild tooling) to produce the executable yourself; the NuGet.exe you found is only the package manager used to restore the project's dependencies, not the program itself.
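Roughly, building it yourself from a Visual Studio Developer PowerShell prompt (so that msbuild is on the PATH) could look like the sketch below. The solution file name and output folder are assumptions; check the repository for the actual names.

    # grab the source and build it; the solution name and output folder are assumptions
    git clone https://github.com/agaboduarte/AliExpressScraper
    cd AliExpressScraper
    .\NuGet.exe restore                                      # restore the package dependencies
    msbuild AliExpressScraper.sln /p:Configuration=Release   # the compiled .exe should land under bin\Release

Once it builds, run the produced executable with the -ProductId or -ProductUrl arguments shown above.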
In other cases, Github projects may simply be scripts, documents, or plugins. They may not be standalone executables, and you will need other tools to make use of them.
I've added a WebJobs SDK project to my existing website. The website runs as an Azure App Service. I've always built and deployed by queueing up a new build in Visual Studio Online and deploying from there to my Azure website. Recently I created this webjob project in the same solution; based on the webjobs-list.json that was generated and put in the website project, the webjob should also be deployed with the website during deployment (or so the documentation says). What is happening, though, is that when it deploys and I look at what is in app_data\jobs\continuous, it is not the binaries and executable that I expect; it's the actual source code/project files that have been copied in there. Obviously that isn't going to run, and it shouldn't have put source code on my website anyway.
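For context, the webjobs-list.json that Visual Studio generated in the web project's Properties folder is just a small manifest pointing at the webjob project; it looks roughly like this (the project path here is a placeholder):

    {
      "$schema": "http://schemastore.org/schemas/json/webjobs-list.json",
      "WebJobs": [
        {
          "filePath": "../MyWebJobProject/MyWebJobProject.csproj"
        }
      ]
    }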
I also had to change my release definition in Visual Studio Online to look only for [my website project name].zip instead of *.zip, because otherwise I'd get an error from the release: "Error: More than one package matched with specified pattern. Please restrain the search pattern."
...this appeared to be because the build process not only creates a zip file for my website, it also creates one for the webjob project. From what I understand and have read, I am supposed to change my release to look only for the website zip file, ignore the other zip file, and just let that get deployed, and it should all work fine. But again, what is copied into the jobs folder on the website isn't the binaries or executable for the webjob; it's the actual source files.
How can I get this to deploy just the binaries and executable with the site instead of the source files?
The only other thing I could find to do is remove the webjobs-list.json file from the web project so they are no longer linked together (which stops the build from populating app_data\jobs\continuous with the webjob project's source files on deployment) and then create an additional task in my release definition to grab and deploy the other zip file created during the build (the one for the webjob project, which contains the binaries along with the debug files for whatever reason). However, everything I read tells me this is not supposed to be necessary; it should just work without me having to do any of that.
EDIT:
My web project is an MVC 5 project that I created with VS 2013. The web jobs project uses the 2.0.0.0 version of the webjobs sdk.
I followed the steps in this article to create the build and release definitions:
https://www.visualstudio.com/en-us/docs/build/apps/cd/deploy-webdeploy-webapps
The only additional thing I did after following this article is that, in my release definition, I changed the Package or Folder field to look for [my mvc web project name].zip instead of *.zip; otherwise I'd get the error message noted above.
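For anyone wondering what that field ends up looking like, a typical value follows this shape (the project name here is just a placeholder):

    $(System.DefaultWorkingDirectory)/**/MyMvcWebProject.zip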
How can I run a custom script that uploads the ClickOnce deployment files to a web server (in my case, Windows Azure Blob Storage) right after publishing? Is it possible to modify the MSBuild file in some way so it runs a custom script right after ClickOnce publishes the files into a local folder?
Yes, you can hook into the build process using various techniques:
pre- and post-build actions (from the Visual Studio project properties menu); this is actually an Exec task hooked into your project file
you can override the DependsOn property of a concrete target and append execution of your own target (the pre-MSBuild 4.0 way)
you can declare your own target and hook it in with the AfterTargets / BeforeTargets attributes (the MSBuild 4.0 way)
As for uploading something to blob storage: you can use an Exec task in your own target to do the upload, or use whatever tool/script you usually use for uploading files to a website/blob storage (see the sketch below).
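As a rough, non-authoritative sketch, the upload step that such a target would invoke could be a small PowerShell call to the Azure CLI, along these lines; the storage account, container, and source folder are placeholders, and you would wire this up through an Exec task or a post-publish target as described above.

    # Upload everything ClickOnce published locally (e.g. the app.publish folder) to a blob container.
    # Assumes the Azure CLI is installed and you have already run "az login"; all names are placeholders.
    az storage blob upload-batch `
        --auth-mode login `
        --account-name mystorageaccount `
        --destination clickonce-deploy `
        --source "bin\Release\app.publish"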
NB: You could clarify your question with the following points (if you need a more concrete answer):
what kind of build process you are using: a build from VS, a CI server with a custom MSBuild script, a CI server that builds your .sln file, etc.
what kind of script/tool you want to execute to upload the build result
whether you know the name of the last executed MSBuild target, after which you want to fire your tool
Is there a way to create a "build" that does not actually compile the output of the site? Basically, I want to push the files live from source control in TFS to the final IIS folder destination.
I have used CopyDirectory in my other project builds, but that requires a BuildDetail.DropLocation (a compiled build). Maybe there is another option for the CopyDirectory Source I could use that wouldn't require a build DropLocation.
To simplify: I want to copy files directly from TFS source control to a folder, using a build template, but without compiling the files. Is that possible?
While the default .xaml build workflow for Team Foundation Build is indeed a compilation build, it does not have to be. I usually recommend that teams have at least one compile and one deploy .xaml workflow.
1) The CompileMyStuff.xaml (DefaultBuildTemplate.xaml) should take my stuff from source control and do whatever is needed to create a Build Drop folder with my output. I may or may not need to actually compile before creating the drop, and it looks like you just want to copy to the drop location.
2) The DeployMyStuff.xaml should take a build number and deploy my code to an environment of my choice.
It looks like you want to skip the "Drop" and go straight to deploy. While I would never recommend this, you do have "BuildDetail.BuildLocation" for the local workspace where the build server has done a get of your code. You can just "CopyDirectory" from there to your server/host for the website.
If you are having a little trouble, you could use the Community Build Extensions and fire up PowerShell to do your copy/deploy (a rough sketch follows).
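A bare-bones PowerShell copy step might look like the following; both paths are placeholders for the build server's workspace folder and the IIS folder you want to publish to.

    # Copy the checked-out site files straight to the IIS folder (both paths are placeholders).
    Copy-Item -Path 'C:\Builds\1\MyTeamProject\MySite\src\*' `
              -Destination '\\webserver\c$\inetpub\wwwroot\MySite' `
              -Recurse -Force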
I figured out a solution to this problem. I created a new .xaml file, and the only item I put in the sequence was the "DownloadFiles" activity. Then I filled out the properties of the activity, ran a "build", and it worked.