Ensuring SSIS packages on server and source control match - version-control

Does anyone have any tips to ensure the SSIS packages on a production server are in sync with source control?
The only way I can think of to do that is to have a separate group that always deploys the SSIS packages, but that isn't feasible in my environment.
Thanks

You can check the version number of the package in source control and the version number on the server.
You can also export the package on the server to the file system of your development box and check to see if the .dtsx file is the same as the one you have in source control.

You could also compare/diff XMLs...
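For example, here is a minimal PowerShell sketch of that comparison, assuming you have already exported the deployed package to disk (both paths are placeholders):

$serverCopy = 'C:\exports\MyPackage.dtsx'   # package exported from the server
$sourceCopy = 'C:\src\MyPackage.dtsx'       # working copy from source control
$serverHash = (Get-FileHash $serverCopy -Algorithm SHA256).Hash
$sourceHash = (Get-FileHash $sourceCopy -Algorithm SHA256).Hash
if ($serverHash -eq $sourceHash) { 'Packages match' } else { 'Packages differ' }

Note that saving a package from a designer can touch metadata such as the version fields, so when the hashes differ, a diff of the XML is the better way to see whether the difference is meaningful.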

Related

Find task names of an SSIS file or find XML code of an SSIS package from SQL Server 2019 using T-SQL

There is a Windows Server at my company that runs a SQL Server 2019 instance. I have deployed an Integration Services solution (project deployment model) with some SSIS packages. I wondered if there is any way to get the data flow task names from the packages, or to get the XML file body in order to extract the names using T-SQL. If none of the above is possible, I would like to know in which directory the actual .dtsx files are stored on the Windows Server when you deploy a solution to SQL Server. I have searched a lot for the above but I cannot find any answer.
Thanks in advance.
How is the package deployed? If it's File System, you could likely do something like PowerShell and SQL. I can't, personally, remember if packages deployed in msdb are encrypted (I haven't used that deployment method since 2012); however, if they are deployed via SSISDB, you won't be able to query the packages stored in the database, as they are all encrypted. You'd need to inspect the source packages (in your source-controlled project).
From a comment by @Larnu
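Along the lines of that comment, here is a minimal PowerShell sketch that pulls task names out of the source-controlled .dtsx files, which are plain XML (the folder path is a placeholder):

$ns = @{ DTS = 'www.microsoft.com/SqlServer/Dts' }   # dtsx XML namespace
foreach ($pkg in Get-ChildItem 'C:\src\MyProject' -Filter *.dtsx) {
    "Package: $($pkg.Name)"
    # Every task, including Data Flow Tasks, is a DTS:Executable element
    Select-Xml -Path $pkg.FullName -Namespace $ns -XPath '//DTS:Executable' |
        ForEach-Object { '  ' + $_.Node.GetAttribute('DTS:ObjectName') }
}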

NuGet install-package from solution

Is there a way to add a package to a project when this package is already added to another project? I would like to avoid depending on the net because sometimes we are required to work in a disconnected scenario.
I know I can simply add a reference and browse, but I suspect I will lose the ability to automatically update the reference in the future.
One way to be able to work with NuGet in a non-connected state, is to supply an alternative package source, either locally on your own computer, or on a machine on your internal network. You can add new package sources in Visual Studio through Tools > Options > Package Manager > Package Sources.
We do this ourselves for two reasons:
With a local package source you can work without an internet connection.
With a local package source and the official NuGet package source disabled, we have better control over which packages are available. This way we will, for instance, avoid undesired updates in our development group until we've approved them.
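If you prefer to script it, the same source can be registered with nuget.exe; a small sketch (the names and paths are placeholders):

# Register a local folder as a package source, equivalent to the
# Tools > Options > Package Manager > Package Sources dialog
nuget sources add -Name "LocalPackages" -Source "C:\LocalPackages"
# Or point at a share on your internal network
nuget sources add -Name "TeamPackages" -Source "\\buildserver\nuget-packages"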

NuGet is returning 503 Server Unavailable

Is there a way to load a package from an alternative server when Visual Studio Package Manager (NuGet) is responding with a "The remote server returned an error: (503) Server Unavailable" message?
This is an obscure condition that will likely only occur in an "enterprisey" network environment. If these conditions apply to you:
you are required to access the Internet via an HTTP proxy server
the HTTP proxy server requires a valid user ID & password (or AD authentication) to allow requests to proceed
you've been messing with cool developer tools that were ported to Windows from a Linux/Unix environment
the new cool tool(s) work after adding the HTTP_PROXY (or possibly HTTPS_PROXY or both) environment variable(s)
you can access the NuGet servers from a browser without getting a 503 error
Then it's likely you broke NuGet by inadvertently invoking this configuration feature. I'm not sure exactly how the environment variable breaks NuGet, but I suspect NuGet is detecting and using the http_proxy URL while sending an empty user ID and password, which causes the HTTP proxy to reject the request.
Fix: remove the environment variable(s) you added and see if the cool tool can be configured to use an HTTP proxy without them.
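For instance, from PowerShell you can check for the variables and clear the persisted ones (this sketch assumes they were set at User scope; use 'Machine' if they were set machine-wide):

# List any proxy variables visible in the current session
Get-ChildItem Env: | Where-Object { $_.Name -match '^https?_proxy$' }
# Remove the persisted variables, then restart Visual Studio
[Environment]::SetEnvironmentVariable('HTTP_PROXY',  $null, 'User')
[Environment]::SetEnvironmentVariable('HTTPS_PROXY', $null, 'User')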
Update: Ran into a version of this issue with the NuGet config file referenced in the "this configuration feature" link above. Open this file:
%appdata%\nuget\nuget.config
in your favorite editor. If it contains elements with http_proxy or https_proxy then removing these elements may fix the issue too.
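If you would rather script that cleanup than hand-edit the XML, here is a minimal sketch; it assumes the proxy settings are <add> elements whose key starts with http_proxy or https_proxy, and it backs the file up first:

$cfg = Join-Path $env:APPDATA 'NuGet\NuGet.Config'
Copy-Item $cfg "$cfg.bak"   # keep a backup before editing
[xml]$xml = Get-Content $cfg
$nodes = @($xml.SelectNodes("//add[starts-with(@key,'http_proxy') or starts-with(@key,'https_proxy')]"))
foreach ($node in $nodes) { [void]$node.ParentNode.RemoveChild($node) }
$xml.Save($cfg)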
PS: Hopefully I'll get an up vote from Colonel Panic :-)
If you have used the package in the past it is probably in your cache. You can add the local cache as an available package source by going into the Library Package Manager Settings under the Tools menu in Visual Studio. For Visual Studio 2012, choose Tools, Library Package Manager, Package Manager Settings, and then click on Package Sources.
In the Available package sources section, type a name like "Cache" and then in for the source, browse to %LocalAppData%\NuGet\Cache. You may need to use Windows Explorer to translate %LocalAppData%\NuGet\Cache into the full path (usually C:\Users\YourAccountName\AppData\Local\NuGet\Cache).
Once you have the Cache as an available source, you can now use the Package Manager Console (found under the View menu under Other Windows or also under the Tools menu under Library Package Manager).
From the Console (which is a PowerShell window with commandlets for NuGet) you can type "get-help NuGet" to see available commands.
Then using Get-Package, you can get a list of package IDs. Make sure the "Package source" is set to "Cache" (or whatever you called it) and the Default project is set to the project you need to manipulate; both of these are dropdowns located at the top of the Package Manager Console. You can also use Get-Project to verify you are working against the correct project in your solution.
Finally, you can type Install-Package and when prompted enter the Package ID from the output of the Get-Package commandlet.
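Put together, a console session might look roughly like this ('Cache' is the source name from above; the package ID and project name are placeholders):

PM> Get-Package -ListAvailable -Source 'Cache'
PM> Install-Package Newtonsoft.Json -Source 'Cache' -ProjectName 'MyWebApp'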
I also had this problem; it was because of my network. If there is any blocking on your Internet connection (like on a company network), you may not be allowed to download the NuGet package. Try to download the package on another network; maybe that will help!
Talbott's answer did not work for me, as my cache was empty. However, if you have used the package in another solution, you can copy the items you want from the "packages" folder in the other solution to a packages folder in your target solution.
If you have no packages installed in the target solution, you may need to add the following to a repositories.xml file in the packages folder:
<?xml version="1.0" encoding="utf-8"?>
<repositories>
</repositories>
After doing that, the packages appeared to be installed in my solution and I was able to add them to projects.
Additional note: I had to use the "Manage NuGet Packages for Solution" option at the solution level to add the package to individual projects. Using Install-Package from the console still returns a 503 even though the package is already installed in the solution.
You can also get this error if you are using a VPN client (e.g. Cisco AnyConnect) and you have recently renewed your VPN certificate. The issue can occur after you have updated your certificate, but before you have rebooted. A reboot resolves the issue.
It is a pretty old question, but I have just encountered the same problem. In my case it occurred because I had more than one NuGet package source configured in the Visual Studio Package Manager. In my company we use NuGet to get mainstream packages and MyGet for our own stuff.
When I attempted to pull a pretty big package, it failed with a 503 code and the error link looked pretty odd: it had MyGet in it instead of NuGet. It turns out the Visual Studio package manager tried to pull it from another source despite having NuGet chosen as the current source. Disabling the other sources and then proceeding with the download fixed it.
Hopefully it will help somebody who stumbled upon this thread just like I did.
Another possible reason for receiving a 503: if you're using an Azure DevOps feed, NuGet packages are limited to 500 MB.

Using NuGet for Internal & External Dependencies in TFS

I'm currently looking at NuGet to solve my dependency problems in TFS and what I wanted to do is to host my own NuGet server that would take care of internal dependencies. I also want to use NuGet to handle my 3rd party dependencies as well. I'm trying to set up automated builds for our company and this is one roadblock I'm trying to overcome with NuGet.
So my question is how do I handle this scenario in which I have to retrieve my dependencies from different servers?
Is there a better way to handle internal dependencies? How is everyone else doing this?
Also, just as a note, I intend to use NuGet without committing packages to TFS. I planned on using the method outlined in this article:
http://blog.davidebbo.com/2011/08/easy-way-to-set-up-nuget-to-restore.html
Glad you're looking into the no commit scenario for NuGet packages on TFS. You can take a look at my blog post on this topic where I explain the concept.
EDIT (2012/06/13): NuGetPowerTools has been replaced by NuGet's built-in package restore functionality. However, the same concept of changing the PackageSources element in nuget.targets still applies.
You definitely should take a look at David Fowler's NuGetPowerTools.
After installing this package, you can run Enable-PackageRestore (a newly installed command in the Package Manager Console).
Enabling package restore will add MSBuild targets to your project files. These MSBuild targets will trigger nuget.exe in a pre-build step and fetch any packages required by your project.
No need to check in NuGet packages to source control; all you need is the packages.config and these MSBuild targets.
To configure multiple, different package sources, you need to set some settings to be used by these MSBuild targets. One of them is PackageSources. You can set it by editing the NuGet.targets file, which you will find in the .nuget folder once you have enabled package restore.
Regarding those package sources, you could set up different internal NuGet galleries, or simply set up different network shares to be used. This is a matter of requirements and preference, so you can choose. All you need to do is tell your MSBuild targets to use these package sources. The order in which you define them will be the order of package lookup as well.
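For illustration, in the versions of NuGet.targets I've seen, sources are declared as PackageSource items inside an ItemGroup, so a two-source setup would look roughly like this (both values are placeholders):

<ItemGroup Condition=" '$(PackageSources)' == '' ">
    <!-- Lookup order follows declaration order -->
    <PackageSource Include="https://nuget.org/api/v2/" />
    <PackageSource Include="\\fileserver\internal-nuget\" />
</ItemGroup>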
Good luck!
Xavier
A little update on the accepted answer and question:
When using TFS as a build machine without Visual Studio installed on it, you can do the following so the build machine automatically uses your custom package sources (more than one in the same solution) without any further configuration of package sources in your solution.
Create a machine default config by placing a NuGet.Config in the root (C:\NuGet.Config), using the sample from: http://docs.nuget.org/docs/reference/nuget-config-file
Comment out the line with: <add key="repositorypath" value="$\External\Packages" />
Otherwise your packages get expanded into C:\$\External\packages\. When that line is commented out, the config gets chained and the right directory will be used.
Configure your needed package source(s).
For more info about other options (e.g. user specific), see: http://docs.nuget.org/docs/reference/nuget-config-file (bottom of the page).
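For illustration, a minimal machine-wide C:\NuGet.Config along those lines might look like this (the source names and paths are placeholders, and repositorypath is deliberately omitted as described above):

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="InternalGallery" value="\\buildserver\nuget-packages" />
    <add key="NuGet official" value="https://nuget.org/api/v2/" />
  </packageSources>
</configuration>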

How to deploy: database, source and binary changes in 1 patch?

I'm part of a development team that works on many CMS based projects, using systems like Joomla and Drupal.
In our development process, all of our code changes are managed inside of Git. At the end of a sprint, we create a diff that we can apply via patch to the live site.
The problem is that most of the time, the changes include
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Git diff handles source code changes beautifully. Binary files are not included in the diff, except for a reference to the fact that the files have changed.
Database Schema Changes and Database Data Changes are a mess.
I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.
So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"
Ideally, this system would allow a dry run, like patch does, but for all four of the data types.
Edit:
Thank you everyone for the feedback that you provided, it was a starting point for my research in this area.
Here is what I found so far:
It's difficult to deploy PHP-based applications using a Linux packaging system, because the changes to the project happen iteratively rather than as releases.
It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL DB diffs (schema and data).
What is really missing for the deployment of PHP-based applications is a deployment manager that would be installed on the server and would be the interface for deploying the patches.
I started a Google Wave on this topic and produced a lot of information as a result.
If anyone is interested in reading this wave, please let me know and I will add you.
For handling installation and upgrade of our application, we use the Debian packaging system (.deb packages).
Context :
We are making a J2EE + Flex application, shipped and administered through a VPN.
So not so far from your situation.
Fresh installs and upgrades from one version to another are made through Puppet (a system for automating system administration tasks: it installs our .deb).
In the .deb we have:
our compiled source code
the schema of the database (handled by [db-config][1])
binary stuff
how to install, through apt, all the other applications needed (MySQL, Tomcat...)
=> All the stuff for a fresh install.
We also add the info to go from one version to another:
the scripts for upgrading the database (one for each version)
new binaries
new stuff to launch at machine start (e.g., a few weeks ago we added an ActiveMQ server)
=> Once the .deb is made correctly, we can install or upgrade seamlessly in one operation (it's done automatically, without any prompt).
There is one .deb per release; each .deb has a version number and a signature.
You can pick any of our .deb files and do a fresh install, or upgrade from the current version to the version number it holds.
The .deb is in our continuous integration system (we build a .deb each hour, as if we were about to release a new version).
What are the benefits?
Install/upgrade automatically, with confidence.
Roll back to a previous version.
Dry runs are natively supported.
In your precise case
* Database Schema Changes
* Database Data Changes
* Source Code changes
* Binary file changes (like images)
Database => you will have to write migration scripts, one for each version (e.g., 1.2-update.sql, 1.3-update.sql).
Source code and binaries => add them, and say in which version they have to be copied/used.
Edit: I'm not sure about source code. We are doing this with compiled code...
Some links to start :
https://wiki.ubuntu.com/PackagingGuide/Complete
http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)
[1]: http://pwet.fr/man/linux/formats/dbconfig dbconfig
[2]: http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html debian
I don't think you'll find a fail-safe mechanism.
I recommend that, when possible, you take into account compatibility with the current published source when making schema/data changes.
This way you can make a very simple tool that runs the database scripts committed to a particular SVN location (you don't want diffs of database changes, because if you need further modifications you need different statements).
With the above done, you can have a simple command that runs the database changes, then the binary & source code changes.
For the database there is also the option of schema & data comparison tools; these could be used to compare environments and make sure nothing unexpected is missing from the change scripts. They could also generate the change scripts, but as I said, you really want to make sure they won't break the current source.
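As a sketch of that "very simple tool" idea (PowerShell here purely for illustration, piping each committed script into the mysql client since the question mentions Joomla/Drupal; the paths, credentials, and database name are all placeholders):

# Apply the committed change scripts in name order (e.g. 001-..., 002-...)
$scripts = Get-ChildItem 'C:\deploy\db-changes' -Filter *.sql | Sort-Object Name
foreach ($script in $scripts) {
    Get-Content $script.FullName -Raw | & mysql.exe --user=deploy --password=secret mydb
    if ($LASTEXITCODE -ne 0) { throw "Script failed: $($script.Name)" }
}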
You can create a tool to do the migrations painlessly -- something similar to Peoplesoft's Patch Upgrade Assistant.
It is basically a standalone executable that reads an "Upgrade Template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks, or "steps". The steps could be: copy (for backing up or moving precompiled objects like classes and other binaries), database (for altering schema elements), and SQL scripts (for loading or transforming current data). The steps have some predicate-logic capability: if it is this, do this, else skip it and go to the next, etc.
The template is usually an XML file. It also provides for manual steps, with instructions for the manual actions. Each step specifies whether it is recoverable or not, and the tool validates whether each step has succeeded.
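To make the shape concrete, an entirely hypothetical fragment of such a template might look like this (every element and attribute name here is invented for illustration):

<upgrade from="1.2" to="1.3">
  <step type="copy" recoverable="true" description="Back up compiled objects">
    <from>app/classes</from>
    <to>backup/classes-1.2</to>
  </step>
  <step type="database" recoverable="false" if="schemaVersion &lt; 13">
    <script>1.3-update.sql</script>
  </step>
  <step type="manual" description="Update the load balancer configuration" />
</upgrade>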
It may be possible to build an open-source project around this requirement, which is quite common.
You need to save the Git commit objects to a local file and then import them into the other repo/branch.
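If you go that route, git bundle is one way to do it; a minimal sketch (the branch name and paths are placeholders):

# In the source repo: pack the commits on main into a single file
git bundle create changes.bundle main
# In the target repo: fetch the commits from that file into a branch
git fetch C:\transfer\changes.bundle main:imported-main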