Using SQL Developer to automate several SQL scripts - oracle-sqldeveloper

I have several scripts that I run in SQL Developer every day. I have them set up as @script_1.sql; @script_2.sql; @script_n.sql; I hit F5 and let it run through all my scripts, then look at my results when it has finished running all of them. I want to automate this so it is finished running those scripts by the time I get into work. My system admins have disabled Task Scheduler. I have a few jobs set up that work fine, but when I tried turning this into a job it wouldn't work. How would I automate this using only SQL Developer?

You could create a package and convert those SQL files into its procedures. Then schedule their execution with the DBMS_JOB or DBMS_SCHEDULER package.
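A minimal PL/SQL sketch of the DBMS_SCHEDULER approach, assuming a hypothetical package daily_refresh whose run_all procedure wraps the logic from the individual scripts (job name and schedule are placeholders):

BEGIN
  DBMS_SCHEDULER.CREATE_JOB (
    job_name        => 'NIGHTLY_SCRIPTS',                     -- hypothetical job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN daily_refresh.run_all; END;',   -- hypothetical package/procedure
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=DAILY; BYHOUR=5; BYMINUTE=0',    -- every day at 05:00
    enabled         => TRUE,
    comments        => 'Runs the converted daily scripts before the workday starts');
END;
/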

Related

Run PowerShell script every hour on Azure

I have found this great script which backs up SQL Azure database to BLOB.
I want to run many different variations of this script - e.g. DB1 goes to Customer1Blob, DB2 goes to Customer2Blob.
I have looked at Scheduler Job Collections. However, I can only see options (Action settings) for HTTP(S) / Storage Queue / Service Bus.
Is it possible to run a specific .ps1 script (with commands) on a schedule?
You can definitely run a PowerShell script as a WebJob. If you want to run a script on a schedule, you can add a settings.job file containing a cron expression alongside your WebJob. The docs for doing so are here.
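For reference, settings.job is just a small JSON file deployed next to the WebJob's script; a minimal sketch for an hourly schedule (the cron expression has six fields: seconds, minutes, hours, day, month, day of week) could look like:

{
  "schedule": "0 0 * * * *"
}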
For this type of automation task, I prefer to use the Azure Automation service. You can create runbooks using PowerShell and then schedule them with the Azure scheduler. You can have them run "on Azure", so you do not need to use compute power that you pay for (rather, you pay by the minute the job runs), or you can configure them to run with a hybrid worker.
For more information, please see the documentation.
When exporting from SQL DB or from SQL Server, make sure you are exporting from a quiescent database. Exporting from a database with active transactions can result in data integrity issues - data being added to various tables while they are also being exported.

How to query IIS web logs from T-SQL directly

I learned how to do basic queries of the IIS web log with LogParser Studio and to export the LogParser query into a PowerShell script, which is great because I can have the Windows Task Scheduler run the script on a schedule.
However, I was wondering if it might be possible to query the IIS web log directly with a T-SQL script rather than using my current procedure:
1. Use Task Scheduler to run a PowerShell script that runs the LogParser query against the IIS web log and exports the results as CSV.
2. Use SQL Agent to run an SSIS package that imports the CSV into a SQL Server table.
3. View the table results on an MVC webpage.
What I'd really like to do is have the user on the MVC webpage click a button to trigger (via an action link) a stored procedure that re-queries the IIS web log, runs the SSIS import package via a SQL Agent job, and displays the results on screen.
While I could have the Task Scheduler and Agent jobs run more frequently, running it on demand (by user click) ensures the query runs only when someone actually wants the results, not during intervals when nobody needs them.
Is such a thing even possible with the current state of SQL Server? (I'm running SQL Server 2014 and IIS 8.5.)
As @Malik mentioned, sp_start_job can be used to run an unscheduled Agent job.
In this case, my job has two steps:
Run the PowerShell script, with the script pasted directly into the Agent job step. I had to make sure that I had a proxy for PowerShell set up.
Run the SSIS package to import the CSV into the SQL table.
My stored procedure is very simple:
ALTER PROCEDURE [dbo].[RefreshErrorQuery]
    -- No parameters
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Trigger the unscheduled job to run
    EXEC msdb.dbo.sp_start_job N'Refresh Web Query';
END

How to call SQL Agent Job Wizard from PowerShell?

I need to automate the modification of SQL Server scheduled jobs and want to reuse the functionality of the SSMS scheduled job wizard.
Is it possible to pop up the SSMS scheduled job wizard from PowerShell, use it, and then close it?
Update: I am aware of how to modify jobs using stored procedures and PowerShell. The wizard is just one step in an automated process. The next steps are to read the job settings and generate a job script per the company's standard (that is important, as the out-of-the-box Drop and Create script is not good enough), set the encoding of the file, etc.
You can start a job via the built-in SQL Server procedure sp_start_job, and call that SQL from PowerShell with Invoke-SqlCmd.
To modify or create the job itself, you have to use sp_add_job, sp_add_jobstep, sp_update_job and sp_add_jobschedule.
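A minimal T-SQL sketch of those calls, using hypothetical job, step and procedure names; the same batch could be sent from PowerShell via Invoke-SqlCmd -Query or -InputFile:

USE msdb;
GO
-- Create the job shell (name is hypothetical)
EXEC dbo.sp_add_job
     @job_name = N'NightlyLoad';

-- Add a T-SQL step that runs a hypothetical procedure
EXEC dbo.sp_add_jobstep
     @job_name  = N'NightlyLoad',
     @step_name = N'Run load procedure',
     @subsystem = N'TSQL',
     @command   = N'EXEC dbo.usp_NightlyLoad;';

-- Attach a daily 02:00 schedule
EXEC dbo.sp_add_jobschedule
     @job_name          = N'NightlyLoad',
     @name              = N'Daily 2am',
     @freq_type         = 4,         -- daily
     @freq_interval     = 1,
     @active_start_time = 020000;    -- HHMMSS

-- Register the job on the local server so the Agent will run it
EXEC dbo.sp_add_jobserver
     @job_name = N'NightlyLoad';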

Is it possible to remotely process an SSAS cube through script?

I have a SQL Server Analysis Services (SSAS) cube (developed with BIDS 2012) and I would like to give the users (who access the cube through PowerPivot) the ability to process the cube from their local machines.
I found some material on how to set up a scheduled job on the server through PowerShell, SQL Agent or SSIS, but no material on processing the cube remotely. Any advice?
There are several possibilities for triggering a cube processing. The low-level method is issuing an XMLA statement to the database containing the cube. To see what this looks like, open SQL Server Management Studio, connect to the AS instance, right-click an AS database, and select "Process". Configure the processing settings, but instead of hitting OK, select "Script" from the top toolbar to have the XMLA process command generated for you. Then leave the dialog with Cancel.
All methods that process a cube end, in one way or another, in sending a command like this to the AS database.
There are several options to trigger a cube processing:
In Management Studio, by clicking OK in the above-mentioned dialog.
In PowerShell (see http://technet.microsoft.com/en-us/library/hh510171.aspx).
In Integration Services, there is an Analysis Services processing task (http://msdn.microsoft.com/en-us/library/ms141779.aspx).
You can set up a SQL Server Agent job, job steps could either be a direct XMLA step, or an Integration Services step containing the process task (among possibly other tasks).
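As a rough sketch of the Agent option, assuming a job already created with sp_add_job, an XMLA step could be added like this (job, step and server names are placeholders, and the command itself would be the XMLA process command scripted from SSMS as described above):

EXEC msdb.dbo.sp_add_jobstep
     @job_name  = N'Process AS Database',            -- placeholder job name
     @step_name = N'Process cube',
     @subsystem = N'ANALYSISCOMMAND',                -- SQL Agent subsystem for XMLA commands
     @server    = N'MYSERVER\MYASINSTANCE',          -- placeholder AS instance
     @command   = N'... XMLA Process command scripted from SSMS ...';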
The question, however, is how the setups described above can be made accessible to end users. An important issue here is of course that the user executing the process task needs permission to process the cube. As you might not want to grant this permission directly, it might make sense to use some impersonation when calling it. With Management Studio - and, as far as I am aware, with PowerShell - this cannot easily be achieved.
Integration Services and Agent jobs offer the possibility of impersonation. Integration Services packages are executed by the dtexec command-line tool (part of the SQL Server client tools). There is also a tool called dtexecui (available as "Execute Package Utility" in a standard SQL Server client tools installation), which lets you configure all settings in a dialog and then execute a package; it can also display the dtexec command line corresponding to your settings.
And to call a SQL Server Agent job, an easy interface is the set of stored procedures (http://msdn.microsoft.com/en-us/library/ms187763.aspx), especially sp_start_job (note this is asynchronous: it starts the job and returns without waiting for it to complete), sp_help_jobactivity to ask for job status, and sp_help_jobhistory for details of jobs that have run.
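A rough T-SQL sketch of that pattern (the job name is a placeholder):

-- Start the job; this returns immediately, the job runs asynchronously
EXEC msdb.dbo.sp_start_job @job_name = N'Process AS Database';

-- Check the current activity of the job
EXEC msdb.dbo.sp_help_jobactivity @job_name = N'Process AS Database';

-- Inspect the history of completed runs
EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Process AS Database';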
All in all, I think there is no ready-made end-to-end solution, but the building blocks above should let you code your own, depending on the preferences in your environment.

Run SSIS Package from T-SQL

I noticed you can use the following stored procedures (in order) to schedule an SSIS package:
msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
msdb.dbo.sp_add_job ...
msdb.dbo.sp_add_jobstep ...
msdb.dbo.sp_update_job ...
msdb.dbo.sp_add_jobschedule ...
msdb.dbo.sp_add_jobserver ...
(You can see an example by right-clicking a scheduled job and selecting "Script Job as -> CREATE To".)
AND you can use sp_start_job to execute the job immediately, effectively running SSIS packages on demand.
Question: does anyone know of any msdb.dbo.[...] stored procedures that simply allow you to run SSIS packages on the fly without using xp_cmdshell directly, or some easier approach?
Well, you don't strictly need the sp_add_category, sp_update_job or sp_add_jobschedule calls. We do on-demand package execution in our app using SQL Agent with the following call sequence (a rough sketch follows below):
- sp_add_job
- sp_add_jobstep
- sp_add_jobserver
- sp_start_job
Getting the job status is a little tricky if you can't access the msdb..sysjobXXX tables, but our jobs start & run just fine.
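A rough sketch of that sequence, with a hypothetical job name and package path (the SSIS step's @command takes the same arguments as dtexec):

USE msdb;
GO
DECLARE @jobId UNIQUEIDENTIFIER;

-- Create a one-off job shell (name is hypothetical)
EXEC dbo.sp_add_job
     @job_name = N'RunMyPackage_OnDemand',
     @job_id   = @jobId OUTPUT;

-- Single SSIS step; the package path is a placeholder
EXEC dbo.sp_add_jobstep
     @job_id    = @jobId,
     @step_name = N'Execute package',
     @subsystem = N'SSIS',
     @command   = N'/FILE "C:\Packages\MyPackage.dtsx"';

-- Target the local server and start the job immediately
EXEC dbo.sp_add_jobserver @job_id = @jobId;
EXEC dbo.sp_start_job     @job_id = @jobId;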
EDIT: Other than xp_cmdshell, I'm not aware of another way to launch the SSIS handlers from within SQL Server. Anyone with permissions on the server can start the dtexec or dtutil executables; then you can use batch files, a job scheduler, etc.
Not really... you could try sp_OACreate but it's more complicated and may not do it.
Do you need to run them from SQL? They can be run from the command line, a .NET app, etc.
In SQL Server 2012+ it is possible to use the following stored procedures (found in the SSISDB database, not the msdb database) to create SSIS package executions, set their parameters, and start them:
[catalog].[create_execution]
[catalog].[set_execution_parameter_value]
[catalog].[start_execution]
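A rough sketch of using those three procedures, with placeholder folder, project and package names:

DECLARE @execution_id BIGINT;

-- Create an execution instance for the deployed package
EXEC SSISDB.catalog.create_execution
     @folder_name     = N'MyFolder',        -- placeholders
     @project_name    = N'MyProject',
     @package_name    = N'MyPackage.dtsx',
     @use32bitruntime = 0,
     @execution_id    = @execution_id OUTPUT;

-- Optional: make start_execution wait until the package finishes
EXEC SSISDB.catalog.set_execution_parameter_value
     @execution_id,
     @object_type     = 50,                 -- 50 = system parameter
     @parameter_name  = N'SYNCHRONIZED',
     @parameter_value = 1;

-- Run it
EXEC SSISDB.catalog.start_execution @execution_id;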