CI DACPAC not deploying to Azure SQL Database - azure-devops

I am using the DACPAC approach for automated database deployment to an Azure SQL Server database from Visual Studio 2019. I added a DACPAC project for my local database. The DACPAC file is built in the CI pipeline and released to the Azure SQL database using the 'Azure SQL DacpacTask', but the release fails with the error below.
*** Could not deploy package.
Error SQL72014: .Net SqlClient Data Provider: Msg 40515, Level 15, State 1, Procedure ddltrg_CREATE_Activity_LOG, Line 16 Reference to database and/or server name in 'TrackDBChanges..tblDDLEventLog' is not supported in this version of SQL Server.
Error SQL72045: Script execution error. The executed script:
CREATE TRIGGER [ddltrg_CREATE_Activity_LOG]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS SET NOCOUNT ON;
BEGIN
DECLARE @xmlEventData AS XML;
SET @xmlEventData = eventdata();
IF CONVERT (VARCHAR (250), @xmlEventData.query('data(/EVENT_INSTANCE/DatabaseName)')) NOT IN ('model')
INSERT INTO TrackDBChanges..tblDDLEventLog (EventTime, EventType, ServerName, DatabaseName, ObjectType, ObjectName, UserName, CommandText, CommandTextXML, HostName, LoginName)
SELECT REPLACE(CONVERT (VARCHAR (MAX), @xmlEventData.query('data(/EVENT_INSTANCE/PostTime)')), 'T', ' '),
CONVERT (VARCHAR (MAX), @xmlEventData.query('data(/EVENT_INSTANCE/EventType)')),
CONVERT (VARCHAR (MAX), @xmlEventData.query('data(/EVENT_INSTANCE/ServerName)')),
CONVERT (VARCHAR (MAX), @xmlEventData.query('data(/EVENT_INSTANCE/DatabaseName)')),
CONVERT (VARCHAR (MAX), @xmlEventDa
The structure of my DACPAC project:
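Since Azure SQL Database rejects three-part (database-qualified) names like TrackDBChanges..tblDDLEventLog (Msg 40515), one way to unblock the deployment is to have the trigger log to a table in the same database. A minimal sketch, assuming a hypothetical local table dbo.tblDDLEventLog with a reduced column list (not the asker's actual schema):

CREATE TRIGGER [ddltrg_CREATE_Activity_LOG]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @xmlEventData XML = EVENTDATA();
    -- Insert into a table in the current database: no database-qualified
    -- name, so the script is valid on Azure SQL Database.
    INSERT INTO dbo.tblDDLEventLog (EventTime, EventType, DatabaseName)
    SELECT REPLACE(CONVERT(VARCHAR(MAX), @xmlEventData.query('data(/EVENT_INSTANCE/PostTime)')), 'T', ' '),
           CONVERT(VARCHAR(MAX), @xmlEventData.query('data(/EVENT_INSTANCE/EventType)')),
           CONVERT(VARCHAR(MAX), @xmlEventData.query('data(/EVENT_INSTANCE/DatabaseName)'));
END;

Alternatively, if the trigger is only needed on-premises, it can be excluded from the DACPAC build so the release never carries it to Azure.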

Related

Trying to use cloud storage on Db2 external tables

I am trying to use Db2 external tables with the Swift or Amazon S3 object storage option, using my own OpenStack Swift server.
Here are the steps in detail:
[i1156@lat111 ~]$ swift --auth-version 3 --os-auth-url http://myip:5000/v3 --os-project-name myproject --os-project-domain-name default --os-username user1 --os-password mypass list container1
outfile
[i1156@lat111 ~]$ db2 "CREATE EXTERNAL TABLE TB_EXTERNAL(COL1 VARCHAR(5)) USING (FORMAT TEXT DELIMITER '|' QUOTEDVALUE DOUBLE CCSID 1208 NULLVALUE 'NULL' NOLOG TRUE DATAOBJECT 'outfile' SWIFT ('http://myip:5000/v3', 'user1', 'mypass', 'container1'))"
DB20000I The SQL command completed successfully.
[i1156@lat111 ~]$ db2 "select * from TB_EXTERNAL"
COL1
-----
SQL20569N The external table operation failed due to a problem with the
corresponding data file or diagnostic files. File name: "outfile". Reason
code: "1". SQLSTATE=428IB
The result is the same when I use AWS S3 storage. Any ideas?
Thanks

Error when creating external table in Redshift Spectrum with dbt: cross-database reference not supported

I want to create an external table in Redshift Spectrum from CSV files. When I try doing so with dbt, I get a strange error. But when I manually remove some double quotes from the SQL generated by dbt and run it directly, I get no such error.
First I run this in Redshift Query Editor v2 on default database dev in my cluster:
CREATE EXTERNAL SCHEMA example_schema
FROM DATA CATALOG
DATABASE 'example_db'
REGION 'us-east-1'
IAM_ROLE 'iam_role'
CREATE EXTERNAL DATABASE IF NOT EXISTS
;
Database dev now has an external schema named example_schema (and Glue catalog registers example_db).
I then upload example_file.csv to the S3 bucket s3://example_bucket. The file looks like this:
col1,col2
1,a,
2,b,
3,c
Then I run dbt run-operation stage_external_sources in my local dbt project and get this output with an error:
21:03:03 Running with dbt=1.0.1
21:03:03 [WARNING]: Configuration paths exist in your dbt_project.yml file which do not apply to any resources.
There are 1 unused configuration paths:
- models.example_project.example_models
21:03:03 1 of 1 START external source example_schema.example_table
21:03:03 1 of 1 (1) drop table if exists "example_db"."example_schema"."example_table" cascade
21:03:04 Encountered an error while running operation: Database Error
cross-database reference to database "example_db" is not supported
I try running the generated SQL in Query Editor:
DROP TABLE IF EXISTS "example_db"."example_schema"."example_table" CASCADE
and get the same error message:
ERROR: cross-database reference to database "example_db" is not supported
But when I run this SQL in Query Editor, it works:
DROP TABLE IF EXISTS "example_db.example_schema.example_table" CASCADE
Note that I just removed some quotes.
What's going on here? Is this a bug in dbt-core, dbt-redshift, or dbt_external_tables, or just a mistake on my part?
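The difference is easier to see side by side. As I read it, the first statement names three separate quoted identifiers (database, schema, table), which Redshift treats as a cross-database reference when example_db is not the connected database; the second is a single quoted identifier that merely contains dots, so no cross-database reference is involved (and IF EXISTS makes it succeed even if no table by that literal name exists):

-- Three quoted identifiers: database.schema.table, rejected as a
-- cross-database reference when connected to another database:
DROP TABLE IF EXISTS "example_db"."example_schema"."example_table" CASCADE;
-- One quoted identifier containing dots: not a cross-database
-- reference; IF EXISTS makes it a no-op if nothing matches:
DROP TABLE IF EXISTS "example_db.example_schema.example_table" CASCADE;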
To confirm, I can successfully create the external table by running this in Query Editor:
DROP SCHEMA IF EXISTS example_schema
DROP EXTERNAL DATABASE
CASCADE
;
CREATE EXTERNAL SCHEMA example_schema
FROM DATA CATALOG
DATABASE 'example_db'
REGION 'us-east-1'
IAM_ROLE 'iam_role'
CREATE EXTERNAL DATABASE IF NOT EXISTS
;
CREATE EXTERNAL TABLE example_schema.example_table (
col1 SMALLINT,
col2 CHAR(1)
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
STORED AS TEXTFILE
LOCATION 's3://example_bucket'
TABLE PROPERTIES ('skip.header.line.count'='1')
;
dbt config files
models/example/schema.yml (modeled after this example):
version: 2
sources:
  - name: example_source
    database: dev
    schema: example_schema
    loader: S3
    tables:
      - name: example_table
        external:
          location: 's3://example_bucket'
          row_format: >
            serde 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
            with serdeproperties (
              'strip.outer.array'='false'
            )
        columns:
          - name: col1
            data_type: smallint
          - name: col2
            data_type: char(1)
dbt_project.yml:
name: 'example_project'
version: '1.0.0'
config-version: 2
profile: 'example_profile'
model-paths: ["models"]
analysis-paths: ["analyses"]
test-paths: ["tests"]
seed-paths: ["seeds"]
macro-paths: ["macros"]
snapshot-paths: ["snapshots"]
target-path: "target"
clean-targets:
- "target"
- "dbt_packages"
models:
  example_project:
    example:
      +materialized: view
packages.yml:
packages:
  - package: dbt-labs/dbt_external_tables
    version: 0.8.0

PSQLEXCEPTION: Unexpected error code "55000" received from data source "FEDSER". Associated text and tokens are "This ResultSet is closed."

I am trying to do a differential data load from Db2 to a PostgreSQL table through InfoSphere Federation Server.
I followed the steps below and got this exception:
SQL1822N Unexpected error code "55000" received from data source "FEDSER".
Associated text and tokens are "This ResultSet is closed.".
These are the steps I followed:
create wrapper jdbc
DB20000I The SQL command completed successfully.
CREATE SERVER FEDSER TYPE JDBC VERSION '12' WRAPPER JDBC OPTIONS( ADD DRIVER_PACKAGE 'E:\Sandhya\postgresql-8.1-415.jdbc3.jar', URL 'jdbc:postgresql://localhost:5432/SCOPEDB', DRIVER_CLASS 'org.postgresql.Driver', DB2_IUD_ENABLE 'Y', db2_char_blankpadded_comparison 'Y', db2_varchar_blankpadded_comparison 'Y', VARCHAR_NO_TRAILING_BLANKS 'Y', JDBC_LOG 'Y')
DB20000I The SQL command completed successfully.
CREATE USER MAPPING FOR SANAGARW SERVER FEDSER OPTIONS (REMOTE_AUTHID 'postgres',REMOTE_PASSWORD '*****')
DB20000I The SQL command completed successfully
SELECT COUNT(*) FROM "SCOPE".EMPLOYEE
SQL1822N Unexpected error code "55000" received from data source "FEDSER".
Associated text and tokens are "This ResultSet is closed.".
I am using Postgres version 12, Java version "1.8.0_241"
Please help me resolve this issue; I can only create the nickname once the connection is established.
Consider using Db2 11.5 instead of InfoSphere Federation Server, which went out of support on 2017-09-30: https://www.ibm.com/support/lifecycle/#/search?q=InfoSphere%20Federation%20Server
Db2 11.5 includes built-in support for PostgreSQL federation in all Db2 editions, including Db2 Community Edition:
https://www.ibm.com/support/pages/data-source-support-matrix-federation-bundled-db2-luw-v115
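Once the connection works (in either product), the nickname step the question mentions looks the same. A minimal sketch against the FEDSER server defined above, using a hypothetical nickname:

-- Create a local nickname over the remote PostgreSQL table, then query it.
CREATE NICKNAME SCOPE_EMPLOYEE FOR FEDSER."SCOPE".EMPLOYEE;
SELECT COUNT(*) FROM SCOPE_EMPLOYEE;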

Status NOTRUN in automatic task scheduler

I created a DB2 task to run my stored procedure automatically at a specific time, using the ADMIN_TASK_ADD procedure:
CALL SYSPROC.ADMIN_TASK_ADD ( 'WR_AM_ADT_AUTO_CNRRM_SCHDLR',
NULL,
NULL,
NULL,
'05 16 * * *',
'ASPECT',
'WR_AM_ADT_AUTO_CNRRM',
'81930',NULL,NULL);
COMMIT;
I want the scheduled task to run every day at 04:05 PM, but it didn't work; the status is
NOTRUN, SQLCODE -104
So can anyone please tell me what I am doing wrong?
I also checked my task in the task list with the following query:
SELECT * from SYSTOOLS.ADMIN_TASK_LIST
I am using DB2 9.7 version on Windows.
The status of the task NOTRUN means an error prevented the scheduler from calling the task's procedure. The SQLCODE indicates the type of error.
I suggest the following:
Confirm the scheduler is enabled:
db2 > db2set
DB2_ATS_ENABLE=YES
ATS depends on the SYSTOOLSPACE tablespace to store historical data and configuration information. You can check if the tablespace exists in your system with the following query.
db2 select TBSPACE from SYSCAT.TABLESPACES where TBSPACE = 'SYSTOOLSPACE'
You can test the stored procedure in isolation:
CALL WR_AM_ADT_AUTO_CNRRM()
Then run your task in the scheduler.
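To see exactly what SQLCODE -104 is complaining about, the scheduler records the outcome of each attempted run. A minimal sketch, assuming the standard SYSTOOLS.ADMIN_TASK_STATUS view that accompanies ADMIN_TASK_LIST:

-- Inspect the result of the task's last run(s), including the SQLCODE
-- and message tokens from the failed invocation.
SELECT NAME, STATUS, SQLCODE, SQLSTATE, SQLERRMC
FROM SYSTOOLS.ADMIN_TASK_STATUS
WHERE NAME = 'WR_AM_ADT_AUTO_CNRRM_SCHDLR';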

Full-text search using Windows Search Service and SQL Server 2008 R2

Currently I'm trying to query the Windows Search service from a SQL Server 2008 R2 instance (also tested on SQL Server 2012). Windows Search is exposed as an OLE DB data source, giving me several options to query the search index. When configuring a new linked server in SQL Server, Management Studio offers the Microsoft OLE DB Provider for Search, implying that I should be able to connect to it from SQL Server. Getting this up and running turns out to be a challenge, however. Below you'll find the error message I stumbled upon.
OLE DB provider "Search.CollatorDSO" for linked server "TESTSERVER" returned message "Command was not prepared.".
Msg 7399, Level 16, State 1, Line 2
The OLE DB provider "Search.CollatorDSO" for linked server "TESTSERVER" reported an error. Command was not prepared.
Msg 7350, Level 16, State 2, Line 2
Cannot get the column information from OLE DB provider "Search.CollatorDSO" for linked server "TESTSERVER".
Things get even more interesting. Although the linked-server approach isn't working, I'm able to wrap code that queries Windows Search in a CLR function (following MSDN: Querying the Index Programmatically) and use it successfully within SQL Server. This is, however, less desirable because of the steps needed to set it up (deploying the library, configuring permissions, etc.). I've tried several parameter settings without any luck. I've also tried enabling some of the Search.CollatorDSO provider options, like allowing the provider to be instantiated as an in-process server. I'm currently using the settings below. For security, I'm using the login's current security context.
Provider: Microsoft OLE DB Provider for Search
Data source: (local)
Provider string: Provider=Search.CollatorDSO.1;EXTENDED PROPERTIES="Application=Windows"
Location: -
Additionally, I need to search network drives; can this be done using shared Windows libraries?
I'm aware more people have been struggling with this problem over the last few years. I'm wondering if someone has been able to get this up and running, or could point me in the right direction.
OLEDB Works
Normal ADO/OLEDB components can query the Windows Search service with the connection string:
provider=Search.CollatorDSO.1;EXTENDED PROPERTIES="Application=Windows"
And an example query:
SELECT TOP 100000 "System.ItemName",
"System.ItemNameDisplay",
"System.ItemType",
"System.ItemTypeText",
"System.Search.EntryID",
"System.Search.GatherTime",
"System.Search.HitCount",
"System.Search.Store",
"System.ItemUrl",
"System.Filename",
"System.FileExtension",
"System.ItemFolderPathDisplay",
"System.ItemPathDisplay",
"System.DateModified",
"System.ContentType",
"System.ApplicationName",
"System.KindText",
"System.ParsingName",
"System.SFGAOFlags",
"System.Size",
"System.ThumbnailCacheId"
FROM "SystemIndex"
WHERE CONTAINS(*,'"Contoso*"',1033)
You can try the query directly on SQL Server in SQL Server Management Studio by attempting to run:
SELECT *
FROM OPENROWSET(
'Search.CollatorDSO',
'Application=Windows',
'SELECT TOP 100 "System.ItemName", "System.FileName" FROM SystemIndex');
Which gives the errors:
OLE DB provider "Search.CollatorDSO" for linked server "(null)" returned message "Command was not prepared.".
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "Search.CollatorDSO" for linked server "(null)" reported an error. Command was not prepared.
Msg 7350, Level 16, State 2, Line 1
Cannot get the column information from OLE DB provider "Search.CollatorDSO" for linked server "(null)".
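Before digging further, it may be worth double-checking the provider options the question mentions. A sketch of how the in-process option is usually toggled, using the undocumented but widely used sp_MSset_oledb_prop procedure:

-- Allow the Search.CollatorDSO provider to be instantiated in-process
-- within SQL Server (run as a sysadmin).
EXEC master.dbo.sp_MSset_oledb_prop N'Search.CollatorDSO', N'AllowInProcess', 1;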
Bonus Reading
Connect to Windows Search from SQL Server using Linked Server (January 2011)
Link with SQL Server (May 2008 - December 2012)
Trying to access Windows Search from SQL Server: An Appeal (July 2007)
Calling Windows Search from SQL Server 2008 (March 2011)
OLE DB provider "Search.CollatorDSO" returns "Command was not prepared" (April 2014 - Suggests CLR workaround)
Linked Server to Windows Search (February 2011)
MSDN Blogs: Query to the SYSTEMINDEX to read the Microsoft search results fails when using Search.CollatorDSO provider (August 2009 - suggests CLR workaround)
Vista Search (February 2007)
Have a look at this code; it may help.
USE [YourDB]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROC [dbo].[SearchAllTables]
@SearchStr nvarchar(100)
AS
BEGIN
DECLARE @dml nvarchar(max) = N''
IF OBJECT_ID('tempdb.dbo.#Results') IS NOT NULL DROP TABLE dbo.#Results
CREATE TABLE dbo.#Results
([tablename] nvarchar(100),
[ColumnName] nvarchar(100),
[Value] nvarchar(max))
SELECT @dml += ' SELECT ''' + s.name + '.' + t.name + ''' AS [tablename], ''' +
c.name + ''' AS [ColumnName], CAST(' + QUOTENAME(c.name) +
' AS nvarchar(max)) AS [Value] FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) +
' (NOLOCK) WHERE CAST(' + QUOTENAME(c.name) + ' AS nvarchar(max)) LIKE ' + '''%' + @SearchStr + '%'''
FROM sys.schemas s JOIN sys.tables t ON s.schema_id = t.schema_id
JOIN sys.columns c ON t.object_id = c.object_id
JOIN sys.types ty ON c.system_type_id = ty.system_type_id AND c.user_type_id = ty.user_type_id
WHERE t.is_ms_shipped = 0 AND ty.name NOT IN ('timestamp', 'image', 'sql_variant')
INSERT dbo.#Results
EXEC sp_executesql @dml
SELECT *
FROM dbo.#Results
END
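A usage example, searching every string-castable column in the database for a hypothetical search term:

-- Returns one row per (table, column, value) match containing 'Contoso'.
EXEC dbo.SearchAllTables @SearchStr = N'Contoso';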