Our development uses lots of open-source code, and I'm trying to figure out the best way to manage these external dependencies.
Our current configuration:
We develop for both Linux and Windows.
We use svn for our own code.
External dependencies (Boost, log4cpp, etc.) are not stored in svn. Instead I put them under ./extern (or c:\extern on Windows). I don't want to put them in our repository because I won't be able to update them that way; some of these are constantly being updated.
My questions:
What to do if I need to modify external code?
Currently I have created a folder in my svn repository called extern_hacks, and that is where I put the modified external code. I then link (or copy, on Windows) the files into the external directory structure. This solution is problematic since it is hard to keep track of copying the files, and very hard to update from svn when files are sitting in two repositories (mine for the modified files, and the original project's, say on SourceForge).
How to manage versions of external dependencies?
I'm interested to hear how others deal with these issues. Thanks.
I keep them in svn and manage them as vendor branches. Keeping them loose externally makes it very hard to go back to a previous build, or to fix bugs in a previous build (especially if the bug comes from a change to an external dependency).
Keeping them in svn has saved me lots of headaches, and it also lets you get a new workstation working on your codebase quickly.
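For illustration, a minimal vendor-branch workflow could look like this (the repository URLs and the Boost version are made up):

    # Import the pristine upstream release into a vendor area of the repo
    svn import boost-1.46.0 http://svn.example.com/repo/vendor/boost/1.46.0 \
        -m "Import pristine Boost 1.46.0"
    svn copy http://svn.example.com/repo/vendor/boost/1.46.0 \
             http://svn.example.com/repo/vendor/boost/current \
             -m "Track the release we build against"

    # Copy the vendor code to where the project build expects it
    svn copy http://svn.example.com/repo/vendor/boost/current \
             http://svn.example.com/repo/trunk/extern/boost \
             -m "Bring Boost into trunk"

    # For an upgrade: import 1.47.0 under vendor/boost/, then merge the
    # upstream differences into trunk/extern/boost, keeping local changes.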
I do not understand why you say:
"I don't want to put them in our repository because I will not be able to update them that way. Some of these are constantly being updated."
You really need to:
1. Include external dependencies in your source control and periodically update them, and then test, test, test.
2. Coordinate your build process with the updates for the external dependencies.
If your code depends upon something, then you really need to have control over when it gets updated/modified. Coding in a space where these dependencies can get updated at any time is too painful, as you're no doubt finding out. I personally prefer option 1.
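If some dependencies have to stay in their upstream repositories, you can still control exactly when they change by pinning svn:externals to a fixed revision; a sketch with a made-up URL:

    # The build always gets revision 42 of the external until someone
    # deliberately bumps the pin and commits that change.
    svn propset svn:externals \
        "-r42 http://svn.example.org/log4cpp/trunk extern/log4cpp" .
    svn commit -m "Pin log4cpp external at r42"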
When I had to do something like this, I added the external source as an svn external and then applied a patch to it. The patch contains my modifications to the external source, so I actually only version-control my patches. Most of the time this works, as long as there are no "dramatic" changes in the external code.
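A bare-bones version of that patch workflow, assuming standard diff/patch tools and illustrative paths:

    # Keep a pristine upstream copy next to the modified tree and
    # version only the resulting patch file.
    cd extern
    diff -ruN log4cpp-pristine log4cpp > ../patches/log4cpp-local.patch

    # After refreshing extern/log4cpp from upstream, re-apply the local
    # changes and resolve any rejects by hand.
    cd log4cpp
    patch -p1 < ../../patches/log4cpp-local.patch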
Have you considered Maven? It's a build system that has excellent support for managing dependencies. For each project you can specify the required dependencies in an XML file as part of that project. The external libraries are held in a dependency repository (in our case Artifactory), which is separate from your version control system and can be just a network drive. It also allows managing different versions of projects.
I would be careful considering Maven because:
it is another repository, in a system where you already have a repository with your current version control system;
it (Maven) is based on the only "common version control" every developer has: the file system (which means no metadata or properties attached to the files, and no proper history in terms of who modified what and when).
Now, when dealing with third-party libraries, you can consider having them in your version control system, but in a packaged way: that is, in a very compact form, with sources and documentation zipped, in order to have the smallest possible number of files.
That way, you will manage the deployment of those (many) third-party libraries easily since the number of files to deploy is low.
Plus, having them under source control allows you to make a branch (say, a 'hack' branch) in which you will store the packaged (or zipped) version of the hacked library.
What you can store externally is the unzipped, complete set of files representing those libraries, since there is no real development on them beyond the occasional hack: normally your job is not to develop existing libraries, but to use them (even slightly modified) to implement features of your project faster.
If you need at some point to compare a hacked version with an official version, you can just pull the appropriate 'hacked' version out of svn, unzip it, and compare it with the official (and externally stored) version (with WinMerge, for instance).
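For example (library name and paths are only illustrative), the packaged-library idea could look like:

    # In a working copy of trunk/thirdparty: zip sources + docs so each
    # library is a single compact artifact, and version the archive only.
    zip -r boost-1.46.0.zip boost-1.46.0
    svn add boost-1.46.0.zip
    svn commit -m "Add packaged Boost 1.46.0"

    # Hacked variants go on a dedicated branch of the packaged libraries
    svn copy ^/trunk/thirdparty ^/branches/thirdparty-hack \
             -m "Branch for locally patched third-party packages"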
So I decided to add my referenced third-party DLLs to source control in a separate folder called lib, and then reference them from that directory.
This works just fine, but when I want to update the files, TFS seems completely oblivious to the fact that the files have actually changed. Even if I copy over the old files, there seems to be no way of checking in the newer ones. If I choose "Check In Pending Changes" from Source Control Explorer, I get an info box saying there are no changes. But if I run a compare on a single DLL between the latest and workspace versions, TFS does tell me the files are indeed different.
So is the only solution to delete the files from source control and then re-add them back as the newer versions, or could I just somehow update them?
Team Foundation Server (through 2010, and with 2012's "Server Workspaces") uses a "Checkout/Edit/Checkin" model for version control that differs from many other types of version control systems (e.g., "Edit/Merge/Commit" systems).
In order to update your binaries, you need to explicitly check them out and update the contents. You can then check them in. This type of system is tuned for dealing with large repositories and large files like binaries since it need not scan your disk to determine whether files have changed or not.
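With the tf.exe command line (or the equivalent Source Control Explorer actions), the update is roughly this; the paths and version number are just an example:

    rem Check the binary out, overwrite it with the new copy, check it in
    tf checkout lib\ThirdParty.dll
    copy /Y C:\downloads\ThirdParty-1.2.0\ThirdParty.dll lib\ThirdParty.dll
    tf checkin lib\ThirdParty.dll /comment:"Update ThirdParty.dll to 1.2.0"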
If you prefer to work with an Edit/Merge/Commit type system, which will scan your disk looking for changes and you need not explicitly check files out, this is available in TFS 2012 (as "Local Workspaces").
Have you tried checking out the file for edit before replacing it? It works here...
I store all SSIS packages in a Subversion repository, along with their configuration files. The configuration file is almost always stored in the same folder as the package.
The problem is that SSIS always seems to store the path to the configuration file (the one saved in the package itself) as an absolute path.
When someone else checks out the folder with the package to a location different from the one on my development PC, the configuration file is not detected (because my absolute path is stored and it doesn't exist on the other developer's PC). So the other developer has to remove this configuration and add it again from where it now sits on his local hard drive. The changed package is then saved, which causes a new version to be committed. When I get that version from SVN, it will no longer match the local path on my PC.
On a related note: another developer may want to change values in the configuration file as well. If I later get the latest version of everything from SVN, the package will no longer work on my PC.
How do you work around these inconveniences?
Another solution is to save your configuration in a database, with an environment variable as the first configuration to tell the package which database to look in; that's what we do. We have scripts in our source control to populate ssisconfig for each server, but the package uses the actual table data from the database named in the environment variable we are using.
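A sketch of the per-server piece of that setup; the variable name and connection string below are invented for illustration, and the package then uses an "environment variable" configuration entry to find this value before loading its SQL Server configurations:

    rem One-time setup per server: a machine-level environment variable
    rem tells the packages which database holds their configurations.
    setx SSIS_CONFIG_DB "Data Source=MYSERVER;Initial Catalog=SSISConfig;Integrated Security=SSPI;" /M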
Anyone who has heard my SQL Saturday presentations knows I don't much care for XML and this is one of the reasons. A trick to using XML configuration with varying locations is to use an environment variable (indirect configuration) to direct SSIS where it can look for that resource. The big, big downside to this approach is you'd generally need to create an environment variable for each set of configuration files or have a massive, honking .dtsconfig file which becomes painful for versioning.
If XML configuration is a must, the option I prefer is to remove the "variableness": developers and admins get together and everyone agrees "there will be a folder everywhere SSIS is done to hold configuration files, and that location is X", and then it's just a matter of solving for X. At a previous job, we used D:\ssisdata\configs.
@HLGEM's approach of a table for configurations is hands down my favorite approach to SSIS configuration (until you get to 2012 and its project deployment model, where configuration is an entirely different animal).
I add a folder called "config" under my project's folder, add it to source control, and maintain the config file in this folder. You can also add it to the SSIS project if you like.
I think it's a good solution because everybody can have this folder and download the config file.
When the package is deployed, it will read the config file from the location you specify in the deployment manifest, so this solution won't impact your development.
I am tasked with setting up a Mercurial version control system for our small team of developers (2-3 people). There was no version control system before, just shared folders and multiple copies. I don't have much experience in setting up version control systems except for personal projects; I just happen to be the most experienced person on our team in terms of version control. The code repository is in a shared folder on a central server; the top-level directory is the client name, and one level down is the project name for that client.
The problem is I haven't figured out how to deal with binary files in our code repository. From what I've read, binary files shouldn't be version tracked. But as the code repository is centralized on the server, shouldn't the binaries be in there as well? Otherwise, for things like image files and third-party DLLs, the project wouldn't build or run properly when cloned from the central server. Also, there is a nice feature of the Mercurial web interface where you can download the whole source package as a ZIP or BZ2 compressed file; without the necessary binary files, the downloaded project wouldn't run or compile.
I guess the solution is to include everything in version control except temporary files and files used for debugging, but other than that, most binary files should be included? Due to the limitations of version control systems, I don't think there is a way to track change sets for binary files, so I guess we just have to live with that.
Edit: After more research about how to set up a version-control repository, the more recommended way of using version control is to "store everything which is created manually, and nothing else", to quote Eric Sink.
You want to version control anything that you can't generate from other stuff in version control. That would be your source files, and your instances of third-party libraries, tools, etc. that your package relies on.
The binaries built from your project are something else entirely, and should be treated as different sorts of artifacts. If you want an easy-to-test downloadable archive, adapt your build process to provide that as a target: it should build the code, and then compress the source and built binary into the desired single file.
Binary files that are related to or required by the project must be included in version control; they can be tracked. The only thing version control can't do with binary files is compare and merge them.
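In Mercurial that usually comes down to an .hgignore that excludes generated output while required images and third-party DLLs stay tracked like any other file; a sketch (the patterns are only examples):

    # .hgignore -- ignore what the build can regenerate; keep the rest
    syntax: regexp
    (^|/)bin/
    (^|/)obj/
    \.pdb$
    \.suo$
    \.user$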
Is it common for a developer to keep their NAnt global configuration file (NAnt.exe.config) in version control?
And should the rest of the files in the NAnt installation be added to the version control system's ignore file or not?
One use of version control is as a backup. If the only copy of NAnt.exe.config is on a hard disk that dies, it will take some effort to reconstruct it (along with everything else that disappeared and wasn't backed up).
From the corporate perspective, having all of the work in progress backed up is a method for preserving assets. The corporate owner of the source code asset is assured that the asset will not be destroyed.
When there is another backup strategy, then sometimes the rule of thumb is not to put anything into version control that should not be shared with other developers. Such as customized data relevant only to one user and/or machine, or confidential information.
I keep a copy of the NAnt code for the version I'm using. This includes the .config file. This is so my build system is safe from "it disappeared from the internet" events (unlikely, but still).
Beyond that I see no reason to keep it around on your code repository, unless for some reason you've modified it somehow. Most everything in NAnt can be overridden in build files, like the target framework and so on.
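One way to arrange that (the folder name and NAnt version are only an example) is to vendor the whole NAnt release as a tool inside the repository:

    # Keep the exact NAnt release the build uses, config file included,
    # so a fresh checkout can build with no machine-level NAnt install.
    svn add tools/nant-0.92
    svn commit -m "Vendor NAnt 0.92 (including NAnt.exe.config) used by the build"
    # Build scripts then invoke tools/nant-0.92/bin/NAnt.exe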
My team is developing a new DotNetNuke web application and would like to know what is recommended for setting up a development environment with source control and automated builds. We would like to keep the DNN source code separate from the source code of our custom modules and extensions.
The DotNetNuke Compiled Module template for Visual Studio wants us to store the source code in the DesktopModules directory of the DNN source code and output to the DNN source code bin directory. Is this the recommended structure? I would rather keep the files in different locations, but then it becomes more difficult to run and debug locally as it would require an install of the module for each change. Also, how should an automated build deploy any changes?
How have others set this up? Is there a recommended best practice?
For my source control, I develop modules in their own project. This contains the module code, test code, data provider code (if applicable) and anything else. This is checked into source control like any other project. Note that the module project contains no links to a specific DNN website, and DNN references are made in the project to a common "bin" directory that references your target build. For example, in my projects folder, I have \bin460, \bin480, \bin510, \bin520, etc. Each of these folders holds a set of binaries for a specific DNN version. That way you can build against a particular version but test against any version you like.
The problems with source-controlling a module in place in a DNN install are:
- sometimes not all of the module code is easily isolated under a single parent directory
- it doesn't lend itself well to a PA (Private Assembly) module approach
- it's not easy to shift the project to a different DNN version for development or testing
- it's easy to inadvertently source-control parts of the DNN solution, particularly with integrated VS source control solutions.
This approach compiles quickly because you're not trying to compile the entire DNN solution. For test deployment I have a build script that copies the various parts of the module into a target website. This can be hooked into the compile (call the build script from it) or just run after a successful compile in a cmd window. My build script has a 'target' environment switch, so I can say 'dnn520' to deploy the build to my test dnn520 install. Note that you need to manually create the module configuration first before this will work, but that's a one-time effort, and you can use the export feature to create your .dnn module manifest.
To build your module package, invest the time in a comprehensive script which will take the various parts from your source directory, and zip them into an install package. Keep all of the parts in your source control folder, and copy them into a temp directory, then run a command-line zip utility (I use an ancient version of pkzip) to pack it into an installable file.
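A stripped-down sketch of both scripts; the module name, folder layout, dnn520 target, and the use of 7-Zip are all illustrative assumptions, not details from the original answer:

    @echo off
    rem ---- deploy.cmd <target> -------------------------------------------
    rem e.g.  deploy.cmd dnn520   copies the module parts into that test site
    set SITE=C:\websites\%1\Website
    copy  /Y bin\Release\MyModule.dll "%SITE%\bin\"
    xcopy /Y /E /I Content "%SITE%\DesktopModules\MyModule\"

    rem ---- package.cmd -----------------------------------------------------
    rem stage the installable parts in a temp folder, then zip them up
    set STAGE=%TEMP%\MyModule_pkg
    rmdir /S /Q "%STAGE%" 2>nul
    mkdir "%STAGE%"
    copy  /Y bin\Release\MyModule.dll "%STAGE%\"
    copy  /Y MyModule.dnn "%STAGE%\"
    xcopy /Y /E /I Content "%STAGE%\Content\"
    pushd "%STAGE%"
    7z a "%~dp0MyModule_01.00.00_Install.zip" *
    popd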
The benefits of this approach are:
- separation of module code from installed code
- simple way of keeping only the module code in source control (don't have to exclude all the website code)
- ability to quickly test out modules in different dnn versions
- packaging script allows you to quickly and easily build a new version of a module for install testing/deployment
The drawbacks are:
- can't use the magic green 'go' button in VS (have to manually attach debugger)
- more setup time than developing in-place
We typically stick to keeping the module code in a folder under DesktopModules and building to the website's bin directory.
In source control, we just map the individual modules, rather than the entire website. Depending on what we're working on, a module may be an entire project in source control, or we may have multiple related modules in the same project, living next to each other.
Automatically deploying changes is somewhat difficult in DNN. It's highly recommended to have a build script that packages your module into an installable form. You can then copy the installable packages into the website's Install/Module folder and hit the URL /Install/Install.aspx?mode=InstallResources, which will install any packages in that folder.
In response to bduke's answer: you shouldn't, and don't want to, build projects in the DesktopModules folder.
That's where all of the source code for the site out of the box goes.
That's where your modules will be "installed", and thus if someone "updates" or re-installs one, it will be overwritten.
It can make upgrading your application far more difficult. Many developers don't understand the idea of not touching the original source code files to modify their behavior, because those files will just be overwritten when you perform an upgrade.
If you want to build modules, create a solution folder called Modules and place your separate module projects there.
If you want to debug them, make the target debug output point to the web\bin folder.
If you want to install/deploy them, build them in release mode and install them through the Module/Extension filter.
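One way to wire up the debugging point above, if you'd rather not change the project's output path, is a post-build event that copies the debug build into the site's bin folder (a sketch; the relative path is an assumption about your layout):

    rem Visual Studio post-build event on the module project (Debug config)
    copy /Y "$(TargetDir)$(TargetName).dll" "$(SolutionDir)Website\bin\"
    copy /Y "$(TargetDir)$(TargetName).pdb" "$(SolutionDir)Website\bin\"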