Problems with synchronization parameter in SSIS - T-SQL

I am using the SSIS synchronization parameter while running an SSIS package with the following stored procedure call:
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
    @execution_id,                     -- execution_id from catalog.create_execution
    @object_type = 50,
    @parameter_name = N'SYNCHRONIZED',
    @parameter_value = 1;              -- turn on synchronized execution
Is there any problem associated with the synchronization parameter? I have not faced any so far, but I want to know about possible issues.
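For context, with SYNCHRONIZED set to 1 the subsequent catalog.start_execution call does not return until the package has finished, so the outcome can be checked immediately afterwards. A minimal sketch, continuing from the execution_id created by catalog.create_execution above:

EXEC [SSISDB].[catalog].[start_execution] @execution_id;

-- Because the execution was synchronized, the status is already final here.
-- 7 = succeeded, 4 = failed, 3 = canceled
SELECT status
FROM [SSISDB].[catalog].[executions]
WHERE execution_id = @execution_id;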

Related

Recommended way of invoking pg_repack on Google Cloud

We have installed pg_repack on our postgresql database.
What is the best way to periodically invoke the pg_repack command using GCP infrastructure?
We tried running it using Cloud Run, but the 1 hour time limit often means that it times out before it can finish.
When it times out, we face the following error on subsequent runs:
WARNING: the table "public.<table name>" already has a trigger called "repack_trigger"
DETAIL: The trigger was probably installed during a previous attempt to run pg_repack on the table which was interrupted and for some reason failed to clean up the temporary objects. Please drop the trigger or drop and recreate the pg_repack extension altogether to remove all the temporary objects left over.
Which forces us to manually recreate the extension.
What is the easiest way to schedule pg_repack without the fear of it timing out? Alternatively, is there a way to gracefully shut down pg_repack, so that we can retry without having to recreate the extension?
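For reference, the manual cleanup we currently do follows the hint in the warning text above; a sketch (the table name is a placeholder):

-- Drop the leftover trigger on the affected table
DROP TRIGGER repack_trigger ON public.<table name>;
-- or, to remove all leftover temporary objects in one go, recreate the extension
DROP EXTENSION pg_repack CASCADE;
CREATE EXTENSION pg_repack;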
The pg_repack command takes a long time to complete when you run it on Cloud Run.
The best and easiest approach is to use a Google Cloud Function together with Google Cloud Scheduler to schedule the pg_repack command so it runs smoothly. Use Cloud Scheduler to trigger the Cloud Function at the required interval (such as once a day) so it runs periodically without the risk of timing out.
In the code below, the repack_table_felix function is used to repack the designated table. The force_inplace=true parameter forces the repack to take place in place, which might be faster but uses more disk space. This parameter can be changed to suit your specific use case.
The exception handler catches errors such as the "repack_trigger" warning, which allows the function to exit gracefully without leaving any objects behind.
import psycopg2

def repack_database(request):
    # Connect to the PostgreSQL instance
    conn = psycopg2.connect(host="<host name>", dbname="<database name>",
                            user="<username>", password="<password>")
    cur = conn.cursor()
    try:
        # Repack the designated table; force_inplace=true forces an in-place repack
        cur.execute("SELECT pg_repack.repack_table_felix('<your table name>', force_inplace=true)")
        conn.commit()
    except Exception as e:
        # Roll back so no temporary objects are left behind
        conn.rollback()
        print(e)
    finally:
        cur.close()
        conn.close()
    return "pg_repack completed successfully"

Recreate Azure SQL Database before E2E test

I have an application with an Azure SQL Server. I have a test environment where I deploy the app for end-to-end testing. I want to reset the database to a certain state before the tests. I have a BACPAC file with that state. My goal is to do this with a CI tool (AppVeyor). I am aware that I can drop the entire database with the Azure PowerShell API. Should I drop the database or should I empty it? Or am I thinking about this totally wrong?
You could use PowerShell to restore the BACPAC before the tests and drop the database after the tests. An example is available on Azure docs: https://learn.microsoft.com/en-us/azure/sql-database/scripts/sql-database-import-from-bacpac-powershell
An option that will probably execute faster is to directly run T-SQL against the database to set up your initial state before the tests and empty the database after the tests.
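A hypothetical sketch of that T-SQL reset approach, with placeholder table names standing in for whatever your schema actually contains:

-- Disable foreign keys so the deletes can run in any order
ALTER TABLE dbo.Orders NOCHECK CONSTRAINT ALL;
ALTER TABLE dbo.Customers NOCHECK CONSTRAINT ALL;

-- Clear the tables the tests touch
DELETE FROM dbo.Orders;
DELETE FROM dbo.Customers;

-- Re-enable and re-validate the constraints
ALTER TABLE dbo.Orders WITH CHECK CHECK CONSTRAINT ALL;
ALTER TABLE dbo.Customers WITH CHECK CHECK CONSTRAINT ALL;

-- Insert whatever known baseline data the E2E tests rely on
INSERT INTO dbo.Customers (CustomerId, Name) VALUES (1, N'Test customer');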
However, doing a full BACPAC restore and drop is much easier to set up in such a way that you can run multiple test runs in parallel, as it does not rely on a pre-existing database that is claimed for the duration of the run.
Edit: Changed link to en-us instead of nl-nl
Another option is to run the database as a Docker service, which may simplify your initial database setup.

How to query IIS web logs from T-SQL directly

I learned how to do basic queries of the IIS web log with LogParser Studio and to export the LogParser query into a PowerShell script, which is great because I can have the Windows Task Scheduler run the script on a schedule.
However, I was wondering if it might be possible to directly query the IIS web log using a T-SQL Script rather than using my current procedure:
Use Task Scheduler to run a PowerShell script that runs the LogParser query against the IIS web log and exports the results as CSV
Use SQL Agent to run an SSIS package that imports the CSV into a SQL Server table
View the table results on an MVC webpage
What I'd really like to do is have the user in the MVC webpage click a button to trigger (via actionlink) a stored procedure that re-queries the IIS weblog, runs the SSIS import package via SQL Agent Job, and displays the results on screen.
While I could have Task Scheduler and the Agent jobs run more frequently, running them on demand (by user click) ensures that the query is run only when needed, and not during intervals in which there is no demand for the query results.
Is such a thing even possible with the state of SQL Server? (I'm running version 2014 and IIS version 8.5).
As @Malik mentioned, sp_start_job can be used to run an unscheduled Agent job.
In this case, my job has two steps:
Run the PowerShell script, with the script pasted directly into the Agent job step. I had to make sure that I had a proxy for PowerShell set up.
Run the SSIS package to import the CSV into the SQL table.
My stored procedure is very simple:
ALTER PROCEDURE [dbo].[RefreshErrorQuery]
    -- No parameters
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Trigger the unscheduled job to run
    EXEC msdb.dbo.sp_start_job N'Refresh Web Query';
END

Why does a SQL Azure DACPAC upgrade (via a PowerShell script) consistently take 30min to complete

I created a PowerShell script to upgrade a SQL Azure instance with my latest DACPAC (taken from http://msdn.microsoft.com/en-us/library/ee634742.aspx).
What I have experienced when running my PowerShell script is that it consistently takes approximately 30 minutes to execute. The script is idle for almost half an hour waiting on $dacstore.IncrementalUpgrade($dacName, $dacType, $upgradeProperties) to return, and nothing is printed to the PowerShell console window. Only right at the end of the half hour does the incremental update start spitting out console messages informing me that the upgrade is taking place (essentially the script appears to hang for 30 minutes until it finally comes back alive, and it does this consistently every time).
Does it usually take this long for the IncrementalUpgrade to complete and is there supposed to be a 30min period of inactivity/waiting?
Note that I am running the PowerShell script from my local machine which is external to the Azure network.
Thanks for any insight you can give on this. I am hoping that I can reduce this incremental upgrade process to substantially less than 30 minutes so that my continuous integration build doesn't take so long.
According to Microsoft Support this is a known issue and will be fixed in SQL Server 2012 (code named Denali). Here are the details from Microsoft Support:
It's a known issue that using SSMS 2008 or PowerShell to update a DAC on
SQL Azure is very slow. SQL Server 2008 utilizes the old extraction engine,
which runs a query for every column and small object. This approach works well
against an on-premise server and meets SQL Server 2008's original design target.
However, when managing a SQL Azure database, the queries have to be
transferred over the internet, and network latency makes the old extraction
engine inefficient, especially when the network is not good.
Our SQL product team is aware of this issue and has designed a new extraction
engine to fix it. The new engine is integrated into SQL Server 2012
(code name Denali). Unfortunately, some of the engine's behavior may
introduce breaking changes to SQL Server 2008. We tried different approaches,
but we cannot remove the regression barrier when applying the new engine to
SQL Server 2008. Therefore, we have no plan to deliver the new
extraction engine as a hotfix for SQL Server 2008 so far, as that would impact
current on-premise users and operations.
Further details about how I architected the PowerShell script with a continuous integration (CI) process can be found here.

Run SSIS Package from T-SQL

I noticed you can use the following stored procedures (in order) to schedule an SSIS package:
msdb.dbo.sp_add_category @class=N'JOB', @type=N'LOCAL', @name=N'[Uncategorized (Local)]'
msdb.dbo.sp_add_job ...
msdb.dbo.sp_add_jobstep ...
msdb.dbo.sp_update_job ...
msdb.dbo.sp_add_jobschedule ...
msdb.dbo.sp_add_jobserver ...
(You can see an example by right-clicking a scheduled job and selecting "Script Job as -> CREATE To".)
AND you can use sp_start_job to execute the job immediately, effectively running SSIS packages on demand.
Question: does anyone know of any msdb.dbo.[...] stored procedures that simply allow you to run SSIS packages on the fly, without using xp_cmdshell directly, or some easier approach?
Well, you don't strictly need the sp_add_category, sp_update_job or sp_add_jobschedule calls. We do on-demand package execution in our app using SQL Agent with the following call sequence (a rough sketch follows the list):
- sp_add_job
- sp_add_jobstep
- sp_add_jobserver
- sp_start_job
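A rough sketch of that sequence, assuming a file-system package; the job name, package path, and server name are placeholders for your own environment:

DECLARE @jobId UNIQUEIDENTIFIER;

EXEC msdb.dbo.sp_add_job
    @job_name = N'RunMyPackage_OnDemand',
    @job_id = @jobId OUTPUT;

EXEC msdb.dbo.sp_add_jobstep
    @job_id = @jobId,
    @step_name = N'Run package',
    @subsystem = N'SSIS',
    @command = N'/FILE "C:\Packages\MyPackage.dtsx" /CHECKPOINTING OFF /REPORTING E';

EXEC msdb.dbo.sp_add_jobserver
    @job_id = @jobId,
    @server_name = N'(local)';

EXEC msdb.dbo.sp_start_job @job_id = @jobId;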
Getting the job status is a little tricky if you can't access the msdb..sysjobXXX tables, but our jobs start & run just fine.
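If you do have read access to msdb, something along these lines can be used to poll the outcome of the most recent run (using the placeholder job name from the sketch above):

SELECT TOP (1)
    j.name,
    h.run_status,          -- 1 = succeeded, 0 = failed
    h.run_date,
    h.run_time
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE j.name = N'RunMyPackage_OnDemand'
  AND h.step_id = 0        -- step 0 is the overall job outcome row
ORDER BY h.instance_id DESC;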
EDIT: Other than xp_cmdshell, I'm not aware of another way to launch the SSIS handlers from within SQL Server. Anyone with permissions on the server can start the dtexec or dtutil executables; from there you can use batch files, a job scheduler, etc.
Not really... you could try sp_OACreate but it's more complicated and may not do it.
Do you need to run them from SQL? They can be run from the command line, a .NET app, etc.
In SQL Server 2012+ it is possible to use the following stored procedures (found in the SSISDB database, not the msdb database) to create SSIS executions, prime their parameters, and start them:
[catalog].[create_execution]
[catalog].[set_execution_parameter_value]
[catalog].[start_execution]
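A minimal sketch of chaining the three calls together; the folder, project, and package names are placeholders:

DECLARE @execution_id BIGINT;

EXEC [SSISDB].[catalog].[create_execution]
    @folder_name = N'MyFolder',
    @project_name = N'MyProject',
    @package_name = N'MyPackage.dtsx',
    @use32bitruntime = 0,
    @reference_id = NULL,
    @execution_id = @execution_id OUTPUT;

-- Optionally set parameters before starting, e.g. the LOGGING_LEVEL system parameter
EXEC [SSISDB].[catalog].[set_execution_parameter_value]
    @execution_id,
    @object_type = 50,      -- 50 = system parameter
    @parameter_name = N'LOGGING_LEVEL',
    @parameter_value = 1;   -- 1 = basic logging

EXEC [SSISDB].[catalog].[start_execution] @execution_id;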