Transfer SQL job to another machine (SQL Server 2008 R2)

How can I transfer my SQL Agent job (maintenance plan) from my server to another server?
I use SQL Server 2008 R2. My job runs a T-SQL script.

The following article explains how to do it: Migrating a Maintenance Plan from One SQL Server to Another.
Note that they're using SQL Server 2000, but it should work for SQL Server 2008 R2 as well.

I solved this problem by using SSIS to transfer the SQL Agent job (maintenance plan) from my server to the other server. These are the steps I use:
1. Open SSIS and go to Stored Packages --> MSDB --> Maintenance Plans, right-click your job, and choose Export Package.
2. Choose File System for the package location.
3. Choose the path for your package.
4. Copy the file to the new server.
5. Connect to the other machine's SSIS instance.
6. Choose Import Package under MSDB.
7. Go to Maintenance Plans and save your job, or run the CREATE script.

Microsoft SSIS Package running successfully manually from Visual Studio and SSMS Integration Services Catalog, but not via SQL Server Agent

I have a complex SSIS package that detects the file extension of files in a folder and loads each file into a SQL Server table, using a For Each Loop Container to iterate over all of the files in that folder location.
After each file is loaded into the SQL Server table, a File System Task in the Control Flow first creates an archive folder and then moves the file into that archive folder.
I am using Environment Variables in the SSMS Integration Services Catalog to map to the parameters in the SSIS package/project.
The entire process succeeds when I run the SSIS package manually from the SSMS Integration Services Catalog, but when I run it via a SQL Server Agent job, the data load and the File System Task's folder creation succeed while the File System Task's file move fails. (The Agent job is run as the SQL Server Agent Service Account.)
I get the following error in the execution report in the Integration Services Catalog in SSMS:
File System Task - Move Files:Error: An error occurred with the following error message: "Access to the path is denied."
While the SQL Server Agent is able to create a folder using a File System Task, it is not able to move the file into this new folder location.
Inside the SQL Server Agent History, I see this in the job step:
Execute as user: NT Service\SQLSERVERAGENT. Microsoft(R) SQL Server Execute Package Utility Version 14.0.2002.14 for 64-bit.
... Package execution on IS Server failed. Execution ID: 30449, Execution Status : 4.
I am not well versed in these SQL Server Agent permission issues. I have read about proxy settings and the like, but have not been able to make sense of them.
Is there a step-by-step solution you can provide to fix this issue?
The SQL Agent job executes the package using the SQL Agent service account, whereas when you run the package manually, it executes using the credentials you used to sign in. Most likely the SQL Agent service account does not have enough access to the directory, especially if it has just been created. Make sure the service account has "Full Control" of the directory the package is referencing.
To test whether it is an access issue, log on to the server using the service account credentials and manually run the package from the SSIS catalog. If it fails for the same reason, you know you need to look at file system access for the service account.

Azure Data Factory v2 Oracle on-premises to SQL Server (IaaS)

I am trying to create an ADF v2 pipeline that will copy data from an on-premises Oracle server to a SQL Server VM.
The network admins have set up an Integration Runtime for Oracle. Their idea was that we could simply use SQL Azure as the target, and that worked, but for other reasons we want to use SQL Server on a VM instead.
I figured that I needed to set up a stand-alone IR on the VM. Unfortunately, when I tried to run the pipeline I got an error saying that both source and target need to be on the same IR.
It is expected that the source and target are on the same self-hosted IR, so that the Copy activity can be executed on that IR. You can learn more about how the Copy activity works in this doc: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview.
For your case:
If your IR for Oracle can access the target SQL Server VM, you can just use that IR to copy data from Oracle to SQL Server.
Otherwise, you need two Copy activities: (1) copy from Oracle to a data store that both the source Oracle server and the target SQL Server VM can access (e.g. Blob Storage), running on the IR for Oracle; (2) copy from Blob Storage to SQL Server, running on the IR for the SQL Server VM.

How to query IIS web logs from T-SQL directly

I learned how to do basic queries of the IIS web log with LogParser Studio and to export the LogParser query into a PowerShell script, which is great because I can have the Windows Task Scheduler run the script on a schedule.
However, I was wondering if it might be possible to query the IIS web log directly with a T-SQL script rather than using my current procedure:
1. Use Task Scheduler to run a PowerShell script that runs the LogParser query against the IIS web log and exports the results as CSV.
2. Use SQL Agent to run an SSIS package that imports the CSV into a SQL Server table.
3. View the table results on an MVC webpage.
What I'd really like to do is have the user on the MVC webpage click a button that triggers (via an action link) a stored procedure which re-queries the IIS web log, runs the SSIS import package via a SQL Agent job, and displays the results on screen.
While I could have Task Scheduler and the Agent job run more frequently, triggering the refresh from a user click ensures the query runs only when someone actually needs the results, not during intervals when nobody is looking at them.
Is such a thing even possible with the current state of SQL Server? (I'm running version 2014 and IIS version 8.5.)
As @Malik mentioned, sp_start_job can be used to run an unscheduled Agent job.
In this case, my job has two steps:
1. Run the PowerShell script, with the script pasted directly into the Agent job step. (I had to make sure that I had a proxy for PowerShell set up.)
2. Run the SSIS package to import the CSV into the SQL table.
My stored procedure is very simple:
ALTER PROCEDURE [dbo].[RefreshErrorQuery]
-- No parameters
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Trigger the unscheduled job to run
    EXEC msdb.dbo.sp_start_job N'Refresh Web Query';
END

Is it possible to remotely process an SSAS cube through a script?

I have a SQL Server Analysis Services (SSAS) cube (developed with BIDS 2012) and I would like to give the users (who use the cube through PowerPivot) the opportunity to process the cube from their local machines.
I found some material on how to set up a scheduled job on the server through PowerShell or SQL Agent or SSIS, but no material on processing the cube remotely. Any advice?
The low-level method of processing a cube is issuing an XMLA statement to the database containing the cube. To see what this looks like, open SQL Server Management Studio, connect to the AS instance, right-click an AS database, and select "Process". Configure the processing settings, but instead of hitting OK, select "Script" from the top toolbar to have the XMLA process command generated for you, then leave the dialog with Cancel.
All methods that process a cube end up, in one way or another, sending a command like this to the AS database.
There are several options for triggering cube processing:
In Management Studio, by clicking OK in the dialog mentioned above.
In PowerShell (see http://technet.microsoft.com/en-us/library/hh510171.aspx).
In Integration Services, there is an Analysis Services processing task (http://msdn.microsoft.com/en-us/library/ms141779.aspx).
You can set up a SQL Server Agent job whose steps are either a direct XMLA step or an Integration Services step containing the processing task (among possibly other tasks); see the sketch below.
The question, however, is how the setups described above can be made available to end users. An important issue here is of course that the user executing the process task needs permission to process the cube. As you might not want to grant this permission directly, it may make sense to use some form of impersonation along the call path. With Management Studio (and, as far as I am aware, with PowerShell) this cannot easily be achieved.
Integration Services and Agent jobs, on the other hand, do offer impersonation. Integration Services packages are executed by the dtexec command-line tool (part of the SQL Server client tools). There is also a tool called dtexecui (available as "Execute Package Utility" in a standard SQL Server client tool installation), which lets you configure all settings in a dialog and then execute the package; it can also display the dtexec command line corresponding to your settings.
And to call a SQL Server Agent job, an easy interface is the set of msdb stored procedures (http://msdn.microsoft.com/en-us/library/ms187763.aspx), especially sp_start_job (note that this is asynchronous: it starts the job and returns without waiting for the job to complete), sp_help_jobactivity to ask for job status, and sp_help_jobhistory for details of past job runs.
All in all, I do not think there is a ready-made solution, but the building blocks mentioned above should let you put together your own, depending on the preferences in your environment.

Use data mining in SQL Server 2008 R2

I have SQL Server 2008 R2 on my computer and I want to use data mining with this version of SQL Server. My question is: how can I do this? I've read somewhere that data mining can be used in the SQL Server Evaluation edition. Can I use data mining in SQL Server 2008 R2?
I also have another problem: when I try to use the SQL Server 2008 Data Mining Add-ins, I can't connect to SQL Server, and it displays this error message:
Unable to connect to server 'localhost'. Please make sure user '' has at least read permission to some database on the server.
First, you should get SQL Server Data Tools, which runs in Visual Studio.
You will need Analysis Services installed; if you don't have it, just run the SQL Server installer again and look for the option to install it.
After that you can take a look at this post I wrote a few months ago:
http://www.sqlservercentral.com/Forums/Topic480010-147-1.aspx
I wrote it specifically targeting neural network models, but it covers several background steps you will need in any case.
Finally, since you're using an evaluation version, you may want to just go for SQL Server 2012 (that's what I use, so I know it works).