I have two tables:
create table Clients
(
id_client int not null identity(1,1) Primary Key,
name_client varchar(10) not null,
phone_client int not null
)
create table Sales
(
id_sale int not null identity(1,1) Primary Key,
date_sale date not null,
total_sale float not null,
id_clients int not null identity(1,1) Foreign Key references Clients
)
So, let's insert ('Ralph', 00000000) into Clients; the id_client is going to be 1 (obviously). The question is: how can I insert that 1 into Sales?
First of all - you cannot have two identity columns in a single table - you will get an error:
Msg 2744, Level 16, State 2, Line 1
Multiple identity columns specified for table 'Sales'. Only one identity column per table is allowed.
So you will not be able to actually create that Sales table.
The id_clients column in the Sales table references an identity column - but it should not itself be defined as identity - it simply stores whatever id_client value the referenced client has.
create table Sales
(
id_sale int not null identity(1,1) Primary Key,
date_sale date not null,
total_sale float not null,
id_clients int not null foreign key references Clients(id_client)
)
-- insert a new client - this will create an "id_client" for that entry
insert into dbo.Clients(name_client, phone_client)
values('John Doe', 4444444)
-- get that newly created "id_client" from the INSERT operation
declare @clientID INT = SCOPE_IDENTITY()
-- insert the new "id_client" into your sales table along with other values
insert into dbo.Sales(......, id_clients)
values( ......., @clientID)
This works for me like a charm. As far as I understand, if I migrate user or customer data etc. from an old copy of the database, the identity column and the UId (user id) or CId (customer id) that gives each row its own unique identity will become unsynced. Therefore I use a second column (TId/UId etc.) that contains a copy of the identity column's value, and relate my data purely through app logic.
When a migration takes place, I can offset the actual SQL identity column (IdColumn) by inserting and deleting enough dummy rows that, once the old data is loaded, the identity has been pushed past the largest number found in the old data's TId/CId column. The old rows keep their original identity values in the second column, so the other data (transactions etc.) in other tables stays in sync with the right customer/user, and new customers or users continue to get unique values from the identity column after the old data has been loaded - without duplicates, because the dummy inserts have pushed the IdColumn value past the old values.
So if I have, say, 920 user records and the largest UId is 950 because 30 got deleted, I will insert and delete 31 dummy rows in the new table before adding the old 920 records, to ensure the UIds will have no duplicates.
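For what it's worth, the same offset can also be achieved without dummy rows by reseeding the identity directly with DBCC CHECKIDENT - a minimal sketch using the 920/950 example and the dbo.UserTable defined further down:
-- reseed so the next IdColumn handed out is 951, past the largest old UId of 950
DBCC CHECKIDENT ('dbo.UserTable', RESEED, 950);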
It is a very roundabout approach, but it works for my limited understanding of SQL XD
What this also allows me to do is delete a migrated user/customer/transaction at a later time using the original UId/CId/TId (the copy of the identity), without worrying about deleting the wrong row - which could happen if I aimed at the actual identity column, since it is out of sync for migrated data (data inserted from the old database). Having a copy = happy days. I could use SET IDENTITY_INSERT [dbo].[UserTable] ON instead, but I will probably screw that up eventually, so this way is - FOR ME - fail safe.
Essentially, it makes the data instance-agnostic to an extent.
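For reference, the SET IDENTITY_INSERT route mentioned above would look roughly like this - a sketch only, with the column list shortened and values made up:
SET IDENTITY_INSERT [dbo].[UserTable] ON;

-- re-insert a migrated row, keeping its original id in the identity column
INSERT INTO dbo.UserTable (IdColumn, UId, [Name])
VALUES (950, 950, 'Migrated user');

SET IDENTITY_INSERT [dbo].[UserTable] OFF;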
I use OUTPUT inserted.IdColumn to save any related pictures with that value in the file name, so I can load them specifically in the picture viewer, and that data is also migration safe.
To do this, taking a copy specifically at insert time works great.
declare @userId INT = SCOPE_IDENTITY() - to store the new identity value
-UPDATE dbo.UserTable - to select where to update, all in this same transaction
-SET UId = @userId - to set it to my second column
-WHERE IdColumn = @userId; - to aim at the correct row
USE [DatabaseName]
GO
/****** Object: StoredProcedure [dbo].[spInsertNewUser] Script Date:
2022/02/04 12:09:19 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [dbo].[spInsertNewUser]
@Name varchar(50),
@PhoneNumber varchar(20),
@IDNumber varchar(50),
@Address varchar(200),
@Note varchar(400),
@UserPassword varchar(MAX),
@HideAccount int,
@UserType varchar(50),
@UserString varchar(MAX)
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for procedure here
INSERT INTO dbo.UserTable
([Name],
[PhoneNumber],
[IDNumber],
[Address],
[Note],
[UserPassword],
[HideAccount],
[UserType],
[IsDeleted],
[UserString])
OUTPUT inserted.IdColumn
VALUES
(@Name,
@PhoneNumber,
@IDNumber,
@Address,
@Note,
@UserPassword,
@HideAccount,
@UserType,
0, -- IsDeleted: assumed to start at 0 for a new user
@UserString);
declare @userId INT = SCOPE_IDENTITY()
UPDATE dbo.UserTable
SET UId = @userId
WHERE IdColumn = @userId;
END
Here is the CREATE script for the table, for testing:
CREATE TABLE [dbo].[UserTable](
[IdColumn] [int] IDENTITY(1,1) NOT NULL,
[UId] [int] NULL,
[Name] [varchar](50) NULL,
[PhoneNumber] [varchar](20) NULL,
[IDNumber] [varchar](50) NULL,
[Address] [varchar](200) NULL,
[Note] [varchar](400) NULL,
[UserPassword] [varchar](max) NULL,
[HideAccount] [int] NULL,
[UserType] [varchar](50) NULL,
[IsDeleted] [int] NULL,
[UserString] [varchar](max) NULL,
CONSTRAINT [PK_UserTable] PRIMARY KEY CLUSTERED
(
[IdColumn] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
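A quick way to exercise the procedure is a plain EXEC with test values (made up here); the OUTPUT inserted.IdColumn clause returns the new identity value to the caller as a one-row result set, which is what the application reads back:
EXEC dbo.spInsertNewUser
    @Name = 'Test User',
    @PhoneNumber = '000 000 0000',
    @IDNumber = 'ID-1',
    @Address = 'Somewhere',
    @Note = '',
    @UserPassword = 'password-hash-here',
    @HideAccount = 0,
    @UserType = 'Standard',
    @UserString = '';
-- returns a single row containing the IdColumn value of the newly inserted user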
CREATE TABLE instances(
ser_name VARCHAR(20) NOT NULL,
id INTEGER NOT NULL ,
ser_ip VARCHAR(16) NOT NULL,
status VARCHAR(10) NOT NULL,
creation_ts TIMESTAMP,
CONSTRAINT instance_id PRIMARY KEY(id)
);
CREATE TABLE characters(
nickname VARCHAR(15) NOT NULL,
type VARCHAR(10) NOT NULL,
c_level INTEGER NOT NULL,
game_data VARCHAR(40) NOT NULL,
start_ts TIMESTAMP ,
end_ts TIMESTAMP NULL ,
player_ip VARCHAR(16) NOT NULL,
instance_id INTEGER NOT NULL,
player_username VARCHAR(15),
CONSTRAINT chara_nick PRIMARY KEY(nickname)
);
ALTER TABLE
instances ADD CONSTRAINT ins_ser_name FOREIGN KEY(ser_name) REFERENCES servers(name);
ALTER TABLE
instances ADD CONSTRAINT ins_ser_ip FOREIGN KEY(ser_ip) REFERENCES servers(ip);
ALTER TABLE
characters ADD CONSTRAINT chara_inst_id FOREIGN KEY(instance_id) REFERENCES instances(id);
ALTER TABLE
characters ADD CONSTRAINT chara_player_username FOREIGN KEY(player_username) REFERENCES players(username);
insert into instances values
('serverA','1','138.201.233.18','active','2020-10-20'),
('serverB','2','138.201.233.19','active','2020-10-20'),
('serverE','3','138.201.233.14','active','2020-10-20');
insert into characters values
('characterA','typeA','1','Game data of characterA','2020-07-18 02:12:12','2020-07-18 02:32:30','192.188.11.1','1','nabin123'),
('characterB','typeB','3','Game data of characterB','2020-07-19 02:10:12',null,'192.180.12.1','2','rabin123'),
('characterC','typeC','1','Game data of characterC','2020-07-18 02:12:12',null,'192.189.10.1','3','sabin123'),
('characterD','typeA','1','Game data of characterD','2020-07-18 02:12:12','2020-07-18 02:32:30','192.178.11.1','2','nabin123'),
('characterE','typeB','3','Game data of characterE','2020-07-19 02:10:12',null,'192.190.12.1','1','rabin123'),
('characterF','typeC','1','Game data of characterF','2020-07-18 02:12:12',null,'192.188.10.1','3','sabin123'),
('characterG','typeD','1','Game data of characterG','2020-07-18 02:12:12',null,'192.188.13.1','1','nabin123'),
('characterH','typeD','3','Game data of characterH','2020-07-19 02:10:12',null,'192.180.17.1','2','bipin123'),
('characterI','typeD','1','Game data of characterI','2020-07-18 02:12:12','2020-07-18 02:32:30','192.189.18.1','3','dhiraj123'),
('characterJ','typeD','3','Game data of characterJ','2020-07-18 02:12:12',null,'192.178.19.1','2','prabin123'),
('characterK','typeB','4','Game data of characterK','2020-07-19 02:10:12','2020-07-19 02:11:30','192.190.20.1','1','rabin123'),
('characterL','typeC','2','Game data of characterL','2020-07-18 02:12:12',null,'192.192.11.1','3','sabin123'),
('characterM','typeC','3','Game data of characterM','2020-07-18 02:12:12',null,'192.192.11.1','2','sabin123');
Here I need a view that shows the name of the server, the id of the instance, and the number of active sessions (a session is active if the end timestamp is null). Is my code wrong, or is it something else? I am just starting to learn, so I am hoping for helpful answers.
My view:
create view active_sessions as
select i.ser_name, i.id, count(end_ts) as active
from instances i, characters c
where i.id=c.instance_id and c.end_ts = null
group by i.ser_name, i.id;
This does not do what you want:
where i.id = c.instance_id and c.end_ts = null
Nothing is equal to null. You need is null to check a value against null.
Also, count(end_ts) will always produce 0: the where clause already restricts end_ts to null values, and count() does not count nulls.
Finally, I would highly recommend using a standard join (with the on keyword), rather than an implicit join (with a comma in the from clause): this old syntax from decades ago should not be used in new code. I think that a left join is closer to what you want (it would also take into account instances that have no character at all).
So:
create view active_sessions as
select i.ser_name, i.id, count(c.nickname) as active
from instances i
left join characters c on i.id = c.instance_id and c.end_ts is null
group by i.ser_name, i.id;
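As a quick check (assuming the sample data above has been inserted):
select * from active_sessions order by id;
-- with the sample rows above this should report 2, 4 and 3 active
-- sessions for instances 1, 2 and 3 respectively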
I want to create a temp table using SELECT INTO syntax, like:
select top 0 * into #AffectedRecord from MyTable
MyTable has a primary key. When I insert records using MERGE INTO syntax, the primary key becomes a problem. How can I drop the PK constraint from the temp table?
The "SELECT TOP (0) INTO.." trick is clever but my recommendation is to script out the table yourself for reasons just like this. SELECT INTO when you're actually bringing in data, on the other hand, is often faster than creating the table and doing the insert. Especially on 2014+ systems.
The existence of a primary key has nothing to do with your problem. Key Constraints and indexes don't get created when using SELECT INTO from another table, the data type and NULLability does. Consider the following code and note my comments:
USE tempdb -- a good place for testing on non-prod servers.
GO
IF OBJECT_ID('dbo.t1') IS NOT NULL DROP TABLE dbo.t1;
IF OBJECT_ID('dbo.t2') IS NOT NULL DROP TABLE dbo.t2;
GO
CREATE TABLE dbo.t1
(
id int identity primary key clustered,
col1 varchar(10) NOT NULL,
col2 int NULL
);
GO
INSERT dbo.t1(col1) VALUES ('a'),('b');
SELECT TOP (0)
id, -- this creates the column including the identity property but NOT the primary key
CAST(id AS int) AS id2, -- this creates the column but it will be nullable. No identity
ISNULL(CAST(id AS int),0) AS id3, -- this creates the column and makes it NOT nullable. No identity.
col1,
col2
INTO dbo.t2
FROM t1;
Here's the (cleaned up for brevity) DDL for the new table I created:
-- New table
CREATE TABLE dbo.t2
(
id int IDENTITY(1,1) NOT NULL,
id2 int NULL,
id3 int NOT NULL,
col1 varchar(10) NOT NULL,
col2 int NULL
);
Notice that the primary key is gone. When I brought in id as-is it kept the identity. Casting the id column as an int (even though it already is an int) is how I got rid of the identity property. Wrapping the column in ISNULL is how to make it NOT nullable.
By default, IDENTITY_INSERT is set to OFF, so this query will fail:
INSERT dbo.t2 (id, id3, col1) VALUES (1, 1, 'x');
Msg 544, Level 16, State 1, Line 39
Cannot insert explicit value for identity column in table 't2' when IDENTITY_INSERT is set to OFF.
Setting identity insert on will fix the problem:
SET IDENTITY_INSERT dbo.t2 ON;
INSERT dbo.t2 (id, id3, col1) VALUES (1, 1, 'x');
But now you MUST provide a value for that column. Note the error here:
INSERT dbo.t2 (id3, col1) VALUES (1, 'x');
Msg 545, Level 16, State 1, Line 51
Explicit value must be specified for identity column in table 't2' either when IDENTITY_INSERT is set to ON
Hopefully this helps.
On a side note: this is a good way to play around with and understand how SELECT INTO works. I used a permanent table because it's easier to inspect.
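If you go the "script it out yourself" route mentioned at the top, a minimal version for the original question might look like this (the column list is hypothetical - copy the real definition of MyTable, minus the PRIMARY KEY and IDENTITY parts):
CREATE TABLE #AffectedRecord
(
    id   int         NOT NULL,   -- no IDENTITY, no PRIMARY KEY
    col1 varchar(10) NOT NULL,
    col2 int         NULL
);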
Well, I have two tables:
CREATE TABLE Temp(
TEMP_ID int IDENTITY(1,1) NOT NULL, ... )
CREATE TABLE TEMP1(
TEMP1_ID int IDENTITY(1,1) NOT NULL,
TEMP_ID int, ... )
they are linked with TEMP_ID foreign key.
In a stored procedure I need to create tons of
Temp and Temp1 rows and update them, so I created a table variable (@TEMP) that I work with, and finally make one big INSERT into Temp. My question is: how can I fill @Temp with the correct TEMP_IDs safely from multiple sessions, without inserting first?
You can use SCOPE_IDENTITY() to find the last inserted row, and the OUTPUT clause to find all newly inserted (or updated) rows.
create table #t1
(
id int primary key identity,
val int
)
Insert into #t1 (val)
output inserted.id, inserted.val
values (10), (20), (30)
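To also capture those generated ids for a second insert - closer to the Temp/Temp1 scenario in the question - OUTPUT ... INTO can write them to a table variable instead of returning them to the client. A sketch, assuming Temp has a data column called val (hypothetical) and Temp1's remaining columns are nullable:
declare @NewIds table (TEMP_ID int, val int)

insert into Temp (val)
output inserted.TEMP_ID, inserted.val into @NewIds (TEMP_ID, val)
values (10), (20), (30)

-- use the captured ids to fill the child table
insert into Temp1 (TEMP_ID)
select TEMP_ID from @NewIds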
T-SQL Insert into multiple linked tables using a condition and without using a cursor.
Hello,
I have the following tables
CREATE TABLE [dbo].[TestMergeQuote](
[uid] [uniqueidentifier] NOT NULL,
[otherData] [nvarchar](50) NULL,
CONSTRAINT [PK_TestMergeQuote] PRIMARY KEY CLUSTERED
(
[uid] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
ALTER TABLE [dbo].[TestMergeQuote] ADD CONSTRAINT [DF_TestMergeQuote_uid] DEFAULT (newid()) FOR [uid]
--=============
CREATE TABLE [dbo].[TestMergeClient](
[id] [int] IDENTITY(1,1) NOT NULL,
[otherData] [nvarchar](50) NULL,
CONSTRAINT [PK_TestMergeClient] PRIMARY KEY CLUSTERED
(
[id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
--==============
CREATE TABLE [dbo].[TestMergeDocument](
[id] [int] NOT NULL,
[uid_quote] [uniqueidentifier] NOT NULL,
[id_owner] [int] NOT NULL,
[id_keeper] [int] NULL,
[otherData] [nvarchar](50) NULL
) ON [PRIMARY]
GO
ALTER TABLE [dbo].[TestMergeDocument] WITH CHECK ADD CONSTRAINT [FK_TestMergeDocument_TestMergeClient_Keeper] FOREIGN KEY([id_keeper])
REFERENCES [dbo].[TestMergeClient] ([id])
GO
ALTER TABLE [dbo].[TestMergeDocument] CHECK CONSTRAINT [FK_TestMergeDocument_TestMergeClient_Keeper]
GO
ALTER TABLE [dbo].[TestMergeDocument] WITH CHECK ADD CONSTRAINT [FK_TestMergeDocument_TestMergeClient_Owner] FOREIGN KEY([id_owner])
REFERENCES [dbo].[TestMergeClient] ([id])
GO
ALTER TABLE [dbo].[TestMergeDocument] CHECK CONSTRAINT [FK_TestMergeDocument_TestMergeClient_Owner]
GO
ALTER TABLE [dbo].[TestMergeDocument] WITH CHECK ADD CONSTRAINT [FK_TestMergeDocument_TestMergeQuote] FOREIGN KEY([uid_quote])
REFERENCES [dbo].[TestMergeQuote] ([uid])
GO
ALTER TABLE [dbo].[TestMergeDocument] CHECK CONSTRAINT [FK_TestMergeDocument_TestMergeQuote]
GO
And also a table X with various other data.
I want to insert into these three tables the data that already exists in them, but giving it different ids, and also replacing some of the data within table X.
It's a sort of a "copy the data from last year", but add new info.
The condition is that id_keeper is sometimes null, and no insert should be done for it.
I am aware that I have to use OUTPUT and MERGE, but I have no idea how to achieve something this complex.
The CRUDE code for this using a cursor would be:
DECLARE @OldIdDocument INT, @NewIdDocument INT
DECLARE @OldIdOwner INT, @NewIdOwner INT
DECLARE @OldIdKeeper INT, @NewIdKeeper INT
DECLARE @OldIdQuote UNIQUEIDENTIFIER, @NewIdQuote UNIQUEIDENTIFIER
INSERT INTO TestMergeQuote(otherData)
SELECT TOP(1) otherData FROM TestMergeQuote WHERE uid = @OldIdQuote
SET @NewIdQuote = @@IDENTITY
INSERT INTO TestMergeClient(otherData)
SELECT TOP(1) otherData FROM TestMergeClient WHERE id = @OldIdOwner
SET @NewIdOwner = @@IDENTITY
IF(@OldIdKeeper IS NOT NULL)
BEGIN
INSERT INTO TestMergeClient(otherData)
SELECT TOP(1) otherData FROM TestMergeClient WHERE id = @OldIdKeeper
SET @NewIdKeeper = @@IDENTITY
END
INSERT INTO TestMergeDocument([uid_quote], [id_owner], [id_keeper], otherData)
SELECT TOP(1) @NewIdQuote, @NewIdOwner, @NewIdKeeper, otherData FROM TestMergeDocument WHERE id = @OldIdDocument
SET @NewIdDocument = @@IDENTITY
You shouldn't have to use a cursor. What I would try is to first pump the data out into separate tables so you can manipulate the data to your heart's content.
Something like this first:
select * into TestMergeQuote_Temp from TestMergeQuote
That will make a new table with the data you want to copy. Of course you can add a where clause to filter the data so you aren't copying a very large table.
Then you can add values, change values, delete values on the _Temp versions.
When you are ready you can insert the data back. If you want to keep the original ids on an auto-incrementing (identity) primary key, you will have to allow explicit values with SET IDENTITY_INSERT ... ON. Or, if you just want new ids and don't want to assign ids manually, you should be able to insert the new records just fine and have new ids created for you.
But as a start, try pumping the data into new tables and then worry about inserting after that.
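Since the question already mentions MERGE and OUTPUT: the usual cursor-free way to build an old-id to new-id mapping is a MERGE whose ON clause never matches, because (unlike INSERT ... OUTPUT) the OUTPUT clause of a MERGE can reference source columns. A sketch for TestMergeClient only; the source filter is a placeholder:
DECLARE @ClientMap TABLE (old_id int, new_id int);

MERGE dbo.TestMergeClient AS tgt
USING (SELECT id, otherData FROM dbo.TestMergeClient /* WHERE <rows to copy> */) AS src
    ON 1 = 0   -- never matches, so every source row falls into WHEN NOT MATCHED
WHEN NOT MATCHED THEN
    INSERT (otherData) VALUES (src.otherData)
OUTPUT src.id, inserted.id INTO @ClientMap (old_id, new_id);

-- @ClientMap can then be joined against the old TestMergeDocument rows to
-- translate id_owner / id_keeper to the newly created client ids.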
I've got a T-SQL script that converts a field to IDENTITY (in a weird way).
How do I convert it to PL/SQL? (And, probably, figure out whether there is a simpler way to do this - without creating a temporary table.)
The T-SQL script:
-- alter table ts_changes add TS_THREADID VARCHAR(100) NULL;
-- Change Field TS_ID TS_NOTIFICATIONEVENTS to IDENTITY
BEGIN TRANSACTION
GO
CREATE TABLE dbo.Tmp_TS_NOTIFICATIONEVENTS
(
TS_ID int NOT NULL IDENTITY (1, 1),
TS_TABLEID int NOT NULL,
TS_CASEID int NULL,
TS_WORKFLOWID int NULL,
TS_NOTIFICATIONID int NULL,
TS_PRIORITY int NULL,
TS_STARTDATE int NULL,
TS_TIME int NULL,
TS_WAITSTATUS int NULL,
TS_RECIPIENTID int NULL,
TS_LASTCHANGEDATE int NULL,
TS_ELAPSEDCYCLES int NULL
) ON [PRIMARY]
SET IDENTITY_INSERT dbo.Tmp_TS_NOTIFICATIONEVENTS ON
GO
IF EXISTS(SELECT * FROM dbo.TS_NOTIFICATIONEVENTS)
EXEC('INSERT INTO dbo.Tmp_TS_NOTIFICATIONEVENTS (TS_ID, TS_TABLEID, TS_CASEID, TS_WORKFLOWID, TS_NOTIFICATIONID, TS_PRIORITY, TS_STARTDATE, TS_TIME, TS_WAITSTATUS, TS_RECIPIENTID, TS_LASTCHANGEDATE, TS_ELAPSEDCYCLES)
SELECT TS_ID, TS_TABLEID, TS_CASEID, TS_WORKFLOWID, TS_NOTIFICATIONID, TS_PRIORITY, TS_STARTDATE, TS_TIME, TS_WAITSTATUS, TS_RECIPIENTID, TS_LASTCHANGEDATE, TS_ELAPSEDCYCLES FROM dbo.TS_NOTIFICATIONEVENTS WITH (HOLDLOCK TABLOCKX)')
GO
SET IDENTITY_INSERT dbo.Tmp_TS_NOTIFICATIONEVENTS OFF
GO
DROP TABLE dbo.TS_NOTIFICATIONEVENTS
GO
EXECUTE sp_rename N'dbo.Tmp_TS_NOTIFICATIONEVENTS', N'TS_NOTIFICATIONEVENTS', 'OBJECT'
GO
ALTER TABLE dbo.TS_NOTIFICATIONEVENTS ADD CONSTRAINT
aaaaaTS_NOTIFICATIONEVENTS_PK PRIMARY KEY NONCLUSTERED
(
TS_ID
) WITH( STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO
COMMIT
From version 12C, Oracle supports IDENTITY columns, e.g.:
CREATE TABLE Tmp_TS_NOTIFICATIONEVENTS
( TS_ID int NOT NULL GENERATED ALWAYS AS IDENTITY,
...
Prior to version 12C, Oracle doesn't have an IDENTITY data type, so there is no equivalent PL/SQL code for this. If you want to ensure that all future inserts automatically get assigned a unique value for TS_ID you can do this:
1) Find out the highest value currently used:
select max(ts_id) from TS_NOTIFICATIONEVENTS;
2) Create a sequence that starts with a value higher than that, e.g.:
create sequence ts_id_seq start with 100000;
3) Create a trigger to populate the column from the sequence on insert:
create or replace trigger ts_id_trig
before insert on TS_NOTIFICATIONEVENTS
for each row
begin
:new.ts_id := ts_id_seq.nextval;
-- or if pre 11G:
-- select ts_id_seq.nextval into :new.ts_id from dual;
end;
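A quick sanity check, assuming the sequence and trigger above are in place - inserting a row without TS_ID should pick its value up from the sequence:
INSERT INTO TS_NOTIFICATIONEVENTS (TS_TABLEID) VALUES (42);

SELECT TS_ID, TS_TABLEID
FROM TS_NOTIFICATIONEVENTS
WHERE TS_TABLEID = 42;  -- TS_ID comes from ts_id_seq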