We are creating a new data warehouse using SSIS and are looking at BIML Studio. I know that for BIML Express I need Visual Studio, but for BIML Studio it seems that we don't even need Visual Studio if we develop our entire ETL with BIML Studio. Is this correct or do I still need Visual Studio in some way?
Standard consulting answer: "It depends"
Your BimlStudio workflow is probably going to be a few BimlScript files that contain your core logic, and then there are the generated artifacts. As an example, here's a shot of the Logical view of my current Biml project. It's a large and still growing "export procs to fixed width files" solution for a client.
Since I need to write to a flat file, each package needs a Flat File Connection Manager, and a Flat File Connection Manager needs a Flat File Format definition. So one logical entity requires three Biml artifacts (at least for how I'm building it).
What you see in the BimlScripts and Connections folder are what run the project (plus the custom metadata repository aka "one big table").
The black circle next to 01_FFF.biml means that's a "live" Biml file, so every time I make a change to my metadata or the underlying file, whoosh, out come 45 File Format entries (project view).
All of this is great but eventually, I need to translate what's in my Integration Services node (1 project, 45 Packages) into a deliverable.
What's your deliverable?
Right-clicking on the project gives me 3 options: Build, Build & Run, and Build & Open in SSDT.
Build - this results in a .ispac file being created. That's the quantum for pushing a project deployment model into the SSISDB
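As a side note, once you have the .ispac, pushing it into the SSISDB can be scripted. Below is a minimal sketch using the SSIS deployment wizard's silent mode; the server name, folder, and paths are made-up placeholders, and the switch names should be verified against your SQL Server version's documentation:

    # Deploy the built .ispac into the SSISDB catalog silently (PowerShell).
    # All paths/names below are hypothetical examples.
    & ISDeploymentWizard.exe /Silent `
        '/SourcePath:C:\Builds\ExportProcs\ExportProcs.ispac' `
        '/DestinationServer:MySqlServer' `
        '/DestinationPath:/SSISDB/ExportProcs/ExportProcs'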
Build & Run - Honestly, I don't know what this option does. I should check the book https://link.springer.com/book/10.1007/978-1-4842-3135-7
Build & Open in SSDT - This results in everything you need to interact with the project in Visual Studio (a .dtproj file, Project.params, any Project level .conmgr files and all the associated .dtsx files)
When would I need Visual Studio?
Debugging. I'm pretty good at this stuff but even I miss some settings for things I don't have solid patterns for. For example, this project is using Fixed Width File Formats, and in early iterations I was getting defects opened because the files weren't correct: the default file encoding was Unicode, despite each individual column being defined as DT_STR (non-Unicode). Little stuff like that is much easier to find and fix back in BimlStudio. Otherwise you're left debugging the results of a package execution, and if you had known the pattern was wrong, you wouldn't have built the bug into your pattern in the first place.
I tried searching with as many different terms as I could and couldn't find exactly what I'm looking for.
I have a C++ Project developed in Visual Studio 2019 and I am trying to build and deploy it in Azure Pipelines. It uses Boost and OpenCV. I skipped trying to include these in Azure Artifacts because of a rabbit hole with Azure CLI errors that took me almost half a day.
So it seems that there is a task to publish pipeline artifacts in the .yml file. How do I do this when my project needs to reference a certain directory, instead of one specific file or .dll? Here are images for how this is configured in Visual Studio:
[Image: include directory settings for Boost]
[Image: include directory settings for OpenCV]
Edit: Still trying, see my comment. Thinking about switching over to CircleCI.
I found out what to do. Hopefully no one else wastes as much time as I did.
The key was MSBuild. One first needs to find out the values of $(IncludePath) and $(LibraryPath) by doing the following in Visual Studio:
Right-click on your project, choose "Properties"
Go to the Build Events tab, and click "Pre-Build Event"
Click on and expand the Command Line row, and click "Edit"
Now click the button that says "Macros>>"
You will see a bunch of different variables and their values. Find the values for LibraryPath and IncludePath, and copy and paste them into a text file.
Now, assuming you already set up a local agent, follow these steps:
Put the text file in the root folder of where your agent is installed. For me, this was "C:\agents"
Have the first line be "LibraryPath=value" and the second line be "IncludePath=value". Use double slashes for the directory paths (see the sketch after these steps).
Rename the file to .env. If the agent is currently running, restart it so it can read in the environment variables it will use during your build.
In the MSBuild task of your pipeline, specify arguments. For my case, it was simply this: /p:IncludePath="C:\Program Files\boost_1_77_0;$(IncludePath)" /p:LibraryPath="$(LibraryPath)"
Run the pipeline. You can check your completed build on the local machine. For me, the path it kept going to was "C:\agents\_work\2\s".
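To make the file's shape concrete, the two lines in the .env would look something like the following; the values here are made-up examples, and you would paste in the actual IncludePath and LibraryPath values copied from the Macros dialog:

    IncludePath=C:\\Program Files\\boost_1_77_0;C:\\opencv\\build\\include
    LibraryPath=C:\\boost_1_77_0\\lib64-msvc-14.2;C:\\opencv\\build\\x64\\vc15\\lib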
I ran Protobuild.exe, but no one has mentioned where it outputs to. Does anyone here know where the default folder is supposed to be?
Running Protobuild.exe generates the Visual Studio projects and solutions for every platform. The solutions are in the root directory, i.e. alongside Protobuild.exe.
Note that Protobuild does not build the framework - you have to open the solution and build it in Visual Studio (which generates its output in the MonoGame.Framework\bin subdirectory).
Is there a way to create a "build" but not actually compile the output of the site? Basically I want to push the files live from source control in TFS to the final IIS folder destination.
I have used CopyDirectory on my other project builds, but that requires a BuildDetail.DropLocation (compiled Build). Maybe there is another option for the CopyDirectory Source I could use that wouldn't require a build DropLocation.
To simplify: I want to copy files directly from TFS source control to a folder, using a build template, but without compiling the files. Is that possible?
While the default .xaml build workflow for Team Foundation Build is indeed a compilation build, it does not have to be. I usually recommend teams to have at least one compile and one deploy .xaml workflow.
1) The CompileMyStuff.xaml (DefaultBuildTemplate.xaml) should take my stuff from source control and do whatever is needed to create a Build Drop folder with my output. I may or may not need to actually compile before creating the drop, and it looks like you just want to copy to the drop location.
2) The DeployMyStuff.xaml should take a build number and deploy my code to an environment of my choice.
It looks like you want to skip the "Drop" and go straight to deploy, and while I would never recommend this, you do have a "BuildDetail.BuildLocation" for the local workspace where the build server has done a get of your code. You can just "CopyDirectory" from there to your server/host for the website.
If you are having a little trouble you could use the Community Build Extensions and fire up PowerShell to do your copy/deploy.
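As a rough illustration of that copy/deploy step (the workspace path and the IIS share below are made-up placeholders), the PowerShell could be as simple as:

    # Copy the already-checked-out site files from the build agent's workspace
    # straight to the target IIS folder. Both paths are hypothetical.
    $sources = "C:\Builds\1\MyTeamProject\MyWebsite\src"
    $target  = "\\webserver\wwwroot\MyWebsite"
    Copy-Item -Path (Join-Path $sources "*") -Destination $target -Recurse -Force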
I figured out a solution to this problem. I created a new .xaml file, and the only item that I put in the sequence was "DownloadFiles". Then I filled out the properties of the activity, ran a "build", and it worked.
I am building a C++ solution with Visual Studio 2005.
Sometimes I open the solution in Visual Studio and build it from within the development environment. Other times I build it from the command line using msbuild.exe. I'm wondering if there is a way to determine which of these two types of builds I'm using at compile time (for example, a macro or something like that). I want to change the path of my output files based on this determination. So, if I'm building from within Visual Studio I would put my output files in FolderA, but if I'm building from the command line I would put my output files in FolderB. Is this possible?
Perhaps you can pass in a command-line parameter when building from the command-line that would indicate you are building the solution from the command-line. Otherwise, you can assume you are building from within Visual Studio.
I don't have the answer to your general question, but in order to change the output path, have you thought of adding project configurations? You could copy project configurations and update the output path of the new ones.
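For example (the configuration name here is made up), you could copy the Release configuration to a new one whose Output Directory points at FolderB, and select it only when building from the command line:

    # Build from the command line with the copied configuration whose
    # Output Directory has been pointed at FolderB in the project settings.
    msbuild.exe MySolution.sln /p:Configuration=Release_CmdLine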
My team is developing a new DotNetNuke web application and would like to know what is recommended for setting up a development environment with source control and automated builds. We would like to keep the DNN source code separate from our custom modules and extensions source code.
The DotNetNuke Compiled Module template for Visual Studio wants us to store the source code in the DesktopModules directory of the DNN source code and output to the DNN source code bin directory. Is this the recommended structure? I would rather keep the files in different locations, but then it becomes more difficult to run and debug locally as it would require an install of the module for each change. Also, how should an automated build deploy any changes?
How have others set this up? Is there a recommended best practice?
For my source control, I develop modules in their own project. This contains the module code, test code, data provider code (if applicable) and anything else. This is checked into source control like any other project. Note that the module project contains no links to a specific DNN website; DNN references are made in the project to a common "bin" directory that holds your target build. For example, in my projects folder, I have \bin460, \bin480, \bin510, \bin520, etc. Each of these folders holds a set of binaries for a specific DNN version. That way you can build against a particular version but test against any version you like.
The problems with source-controlling a module in place in a DNN install are:
- sometimes not all of the module code is easily isolated under a single parent directory
- it doesn't lend itself well to a PA (Private Assembly) module approach
- it's not easy to shift the project to a different DNN version for development or testing
- it's easy to inadvertently source control parts of the DNN solution, particularly with integrated VS source control solutions.
This approach compiles quickly because you're not trying to compile the entire DNN solution. For test deployment I have a build script that copies the various parts of the module into a target website. This can be hooked into the compile (as a post-build step) or just run in a cmd window after you've had a successful compile. My build script has a 'target' environment switch, so that I can say 'dnn520' to deploy the build to my test dnn520 install. Note that you need to manually create the module configuration first before this will work, but this is a one-time effort, and you can use the export feature to create your .dnn module manifest.
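A stripped-down sketch of that kind of deploy script in PowerShell is below; the module name, folder layout and target paths are made-up examples rather than anything prescriptive:

    param([string]$Target = "dnn520")   # e.g. .\Deploy-Module.ps1 -Target dnn520

    # Map each target switch to a local test website (paths are hypothetical).
    $sites = @{
        dnn510 = "C:\websites\dnn510\Website"
        dnn520 = "C:\websites\dnn520\Website"
    }
    $site      = $sites[$Target]
    $moduleDir = Join-Path $site "DesktopModules\MyModule"

    # Copy the module pieces into place: compiled assembly plus the ascx controls.
    New-Item -ItemType Directory -Path $moduleDir -Force | Out-Null
    Copy-Item ".\bin\Release\MyModule.dll" (Join-Path $site "bin") -Force
    Copy-Item ".\MyModule\*.ascx" $moduleDir -Force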
To build your module package, invest the time in a comprehensive script which will take the various parts from your source directory, and zip them into an install package. Keep all of the parts in your source control folder, and copy them into a temp directory, then run a command-line zip utility (I use an ancient version of pkzip) to pack it into an installable file.
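And a similarly minimal sketch of the packaging step, substituting PowerShell's Compress-Archive (PowerShell 5+) for pkzip; the file names and manifest are placeholders:

    # Stage the installable parts in a temp folder, then zip them into a package.
    $temp = Join-Path $env:TEMP "MyModule_package"
    Remove-Item $temp -Recurse -Force -ErrorAction SilentlyContinue
    New-Item -ItemType Directory -Path $temp | Out-Null

    Copy-Item ".\MyModule.dnn"             $temp   # module manifest
    Copy-Item ".\bin\Release\MyModule.dll" $temp
    Copy-Item ".\MyModule\*.ascx"          $temp

    Compress-Archive -Path (Join-Path $temp "*") `
        -DestinationPath ".\MyModule_01.00.00_Install.zip" -Force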
The benefits of this approach are:
- separation of module code from installed code
- simple way of keeping only the module code in source control (don't have to exclude all the website code)
- ability to quickly test out modules in different dnn versions
- packaging script allows you to quickly and easily build a new version of a module for install testing/deployment
The drawbacks are:
- can't use the magic green 'go' button in VS (have to manually attach debugger)
- more setup time than developing in-place
We typically stick to keeping the module code in a folder under DesktopModules and building to the website's bin directory.
In source control, we just map the individual modules, rather than the entire website. Depending on what we're working on, a module may be an entire project in source control, or we may have multiple related modules in the same project, living next to each other.
Automatically deploying changes is somewhat difficult in DNN. It's highly recommended to have a build script that packages your module into an installable form. You can then copy installable packages into the website's Install/Module folder and request the URL /Install/Install.aspx?mode=InstallResources, which will install any packages in that folder.
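In script form, that last step might look something like the following; the package name, site path and URL are placeholder examples:

    # Drop the package into the site's Install/Module folder, then request the
    # InstallResources URL so DNN installs whatever packages it finds there.
    Copy-Item ".\MyModule_01.00.00_Install.zip" "C:\websites\dnn520\Website\Install\Module\" -Force
    Invoke-WebRequest "http://dnn520.local/Install/Install.aspx?mode=InstallResources" -UseBasicParsing | Out-Null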
In response to bduke's answer: you shouldn't, and don't want to, build projects in the DesktopModules folder.
That's where all of the source code for the site out of the box goes.
That's where your modules will be "installed", and thus if someone "updates" or re-installs one, it will be overwritten.
It can make upgrading your application far more difficult. Many developers don't understand the idea of not touching the original source code files to modify their behavior, because those files will just be overwritten when you perform an upgrade.
If you want to build modules, create a solution folder called Modules and place your separate module projects there.
If you want to debug them, make the target debug output point to the web\bin folder.
If you want to install/deploy them, build in Release mode and install them through the Module/Extension installer.