Hi all, I am very new to SSIS. I have an SSIS package developed by someone else; it reads data from flat files and stores the data to a database after mapping.
Flow:
1) First, the package extracts records from the flat file and stores them in a table.
2) Then it calls a child package using an Execute Package Task.
3) The child package then does some calculations and updates the database table.
The SSIS package uses an environment variable to get the database information.
Everything is working fine, but now I want to deploy this package to my client's server.
Question: Do I need to copy the files from the bin folder and paste them on the client's machine?
What I tried: I copied the files from the bin folder and placed them on my local computer. Then I created a job in MSSQL and ran the job. The package ran perfectly. But later I changed the location of my project and the problems started: the job stopped working.
Issue: The error says the location of the child package is not available (as I changed the location of my project files).
Kindly suggest what to do.
I am going to make several assumptions here, so please correct me if I get any wrong.
The problem, I am guessing, is that on your Package.dtsx the connection manager for the child package is currently pointing to the package's location within the project folder. You want to move it to another location, but the connection manager is still pointing at the project location.
If I were you I would do the following:
Create a string variable:
PackageFolderPath = C:\CurrentPackagePath\DBPackage.dtsx
Now go to the package in the connection manager and, under the properties, add an expression on ConnectionString with the following: @[User::PackageFolderPath]. If you evaluate the expression, it should give you the location you set up in your variable.
Please note, however, that if you want this to work on the development system, then point the variable at the project location.
Once you have those set up, copy the files across to the new server, then in the SQL Agent job go to the Set Values tab and add the following:
\Package.Variables[User::PackageFolderPath].Properties[Value]
Under Value, put wherever the package is now located.
This should now pick up the new location of the package when it is run.
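If you ever want to test the same override outside of a SQL Agent job, dtexec's /Set switch accepts the identical property path on the command line (the file paths here are only examples):

dtexec /F "D:\Packages\Package.dtsx" /Set \Package.Variables[User::PackageFolderPath].Properties[Value];"D:\Packages\DBPackage.dtsx"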
A better way to do this would be to make use of the deployment utility and an XML configuration on the package. However, this way should work.
Related
I have an application that uses NuGet packages to manage versions of tooling available to an instance of the application.
For instance, a Test Manager: the actual tests that are executed by the Manager are contained in a .nupkg file. If you want to change which tests are executed, all you have to do is replace the .nupkg file. When the application runs, it finds the package and uses the package manager to get all the files out and load them into my app domain.
The package manager extracts all the files into a temp folder and returns the directory they were extracted to.
Unfortunately, every time the application runs it extracts to a new random location, because of this block of code in OptimizedZipPackage:
protected virtual string GetExpandedFolderPath()
{
    // Produces a brand-new random folder name on every call.
    return Path.GetRandomFileName();
}
Yes, it is virtual and I can override it in my own version of the package class, but in order to use my own version of the package I also have to have my own version of the LocalPackageRepository class, with an override.
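For reference, the override itself would be trivial; a minimal sketch, assuming NuGet.Core's OptimizedZipPackage with the constructor shown (the class name and folder-naming scheme are my own invention):

using NuGet;

// Hypothetical subclass: expands each package into a stable,
// deterministic folder instead of a random one.
public class StablePathZipPackage : OptimizedZipPackage
{
    public StablePathZipPackage(string fullPackagePath)
        : base(fullPackagePath)
    {
    }

    protected override string GetExpandedFolderPath()
    {
        // e.g. "MyTests.1.2.3" -- the same package/version always
        // lands in the same folder on every run.
        return Id + "." + Version;
    }
}

The catch, as noted above, is that this only takes effect if the repository that instantiates packages (LocalPackageRepository) is also overridden to construct it.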
What I'm hoping for is a way to get fixed folder names, so that it uses the same folder for the same package/version every time, without having to write override classes.
Is this possible?
I have a package that needs to be copied to three different servers. Each server is used for a different testing environment. All three servers have the same directory layout. The layout is as follows:
\\SERVER\ConfigFiles <- Here go the .dtsConfig files.
\\SERVER\Packages <- Here go the .dtsx files.
I want to be able to use the same package, copied to the three different servers, without any modification. The only difference among the three servers would be the content of the .dtsConfig file. The config files contain the directories for the Excel files, the logs, and the SQL Server connection for each environment.
For example, let's say I have a package called Cars.dtsx. This package is EXACTLY the same on all three servers. The package points to a .dtsConfig file in the ConfigFiles folder (which is found on all three servers). I want a way for the package to point to the ConfigFiles\Cars.dtsConfig file on each server, but I want to do it without having to provide the name of the server in the path.
The way I tried it is using "$(ProjectDir)..\ConfigFiles\Cars.dtsConfig", which seems to work if I run the package through the .sln file, but not through the .dtsx file directly.
I hope that wasn't too confusing. Let me know if you need any more info. Thanks.
Unless I'm missing some nuance, you don't need to do anything special.
Your package is going to have a hard-coded reference to D:\ConfigFiles\Cars.dtsConfig. It won't matter whether that package is being run from ServerA, ServerB, or ServerZ, as long as you have the same file structure on those servers.
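For illustration, the .dtsConfig itself is just XML that maps package properties to per-server values, so the same file name can hold different contents on each server (the connection manager name and connection string below are made up):

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[CarsDB].Properties[ConnectionString]"
                 ValueType="String">
    <!-- Only this value differs between the three servers. -->
    <ConfiguredValue>Data Source=ENV1SQL;Initial Catalog=Cars;Integrated Security=SSPI;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>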
By virtue of your asking the question, are you experiencing something different?
I store all SSIS packages in a Subversion repository, along with their configuration files. The configuration file is almost always stored in the same folder as the package.
The problem is that SSIS seems to always store the path to the configuration file (the one saved in the package itself) as an absolute path.
When someone else checks out the folder with the package to a location different from the one on my development PC, the configuration file is not detected (because my absolute path is stored and it doesn't exist on the other developer's PC). So the other developer has to remove this configuration and add it again from wherever it now sits on his local hard drive. The changed package is then saved, which causes a new version to be committed. When I get that version from SVN, it no longer matches the local path on my PC.
On a related note: another developer may want to change values in the configuration file as well. If I later get the latest version of everything from SVN, the package will no longer work on my PC.
How do you work around these inconveniences?
Another solution is to save your configuration in a database, with an environment variable as the first configuration to tell the package which database to look in; that's what we do. We keep scripts in source control to populate the SSIS configuration table for each server, but the package reads the actual table data from the database named in the environment variable we are using.
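For context, when you choose a SQL Server configuration, SSIS offers to generate a table along these lines (this is the default it proposes; the name and column sizes can be changed):

CREATE TABLE [dbo].[SSIS Configurations]
(
    ConfigurationFilter NVARCHAR(255) NOT NULL,
    ConfiguredValue     NVARCHAR(255) NULL,
    PackagePath         NVARCHAR(255) NOT NULL,
    ConfiguredValueType NVARCHAR(20)  NOT NULL
)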
Anyone who has heard my SQL Saturday presentations knows I don't much care for XML, and this is one of the reasons. A trick to using XML configuration with varying locations is to use an environment variable (indirect configuration) to tell SSIS where it can look for that resource. The big, big downside to this approach is that you'd generally need to create an environment variable for each set of configuration files, or have one massive, honking .dtsconfig file, which becomes painful for versioning.
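As a concrete example of the indirect approach, each server defines a machine-level environment variable holding the path to its own copy of the config file (the variable name here is invented):

setx CARS_CONFIG "D:\ssisdata\configs\Cars.dtsConfig" /M

The package's XML configuration is then set to read its location from CARS_CONFIG rather than from a hard-coded path.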
The option I prefer, if XML configuration is a must, is to remove the "variableness": developers and admins get together and everyone agrees "there will be a folder everywhere SSIS is done to hold configuration files, and that location is X", and then it's just a matter of solving for X. At a previous job, we used D:\ssisdata\configs.
@HLGEM's approach of a table for configurations is hands-down my favorite approach to SSIS configuration (until you get to 2012 and its project deployment model, where configuration is an entirely different animal).
I add a folder called "config" under my project folder, add it to source control, and maintain the config file in this folder. You can also add it to the SSIS project if you like.
I think it's a good solution because everybody can have this folder and download the config file.
When the package is deployed, it will read the config file from the location you specify in the deployment manifest, so this solution won't impact your development.
We currently carry out development on a mapped drive. When I run NUnit tests against a test assembly, NUnit picks up the assembly but does not recognise any of the tests.
If I move the solution etc. to a local drive and reference it again, then everything works fine.
What I would really like to know is why this happens, and how I can carry on using a network drive for development.
Per http://geekswithblogs.net/TimH/archive/2007/08/02/114340.aspx, NUnit apparently does not have appropriate permissions to access the assembly when it is on a network drive. The suggested fix is to add a post-build event that copies the assembly to a local temp directory and to run NUnit off that copied assembly:
Within VS, open the project properties.
Go to the Build Events tab and enter the following 'Post-build event command line':
del /q c:\temp\nunit\*.*
copy "$(TargetDir)*.*" c:\temp\nunit
A potential issue you may have as a result of this change relates to the AppBase, as per "Unable to load <mytest> because it is not located under AppBase". The answer there is to update the Settings element within the .nunit file to include an app base of C:\Temp\NUnit, then update the assembly element's path to remove any leading directory information.
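For illustration, the adjusted .nunit project file might look like this (the assembly name is hypothetical):

<NUnitProject>
  <Settings activeconfig="Default" appbase="C:\Temp\NUnit" />
  <Config name="Default" binpathtype="Auto">
    <assembly path="MyTests.dll" />
  </Config>
</NUnitProject>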
I have a small Windows app and am trying to use SQL CE for the local data store. I have had a couple of problems deploying it. I am using ClickOnce deployment.
First question:
In the Publish properties -> Application Files, I have it set to Data File (Auto), Required, Include. However, it doesn't seem to be included; when I navigate to the location that ClickOnce installs to, it's not there.
Second:
ClickOnce creates a new directory under User\Local\Apps with the app files and the SDF file in it. When I update the app and release a new version, I don't want to start with a new database. Won't all the data in the existing database be lost? That just doesn't seem to make sense.
What is the procedure around this?