I need to push the results from the code below into an undefined TEMP table. The temp table must be undefined because I won't know the column names of the result set.
declare @sql varchar(4000)
set @sql = 'Select * from #Test'
exec (@sql)
--Need to insert the final result set into #TempTableName because I need to use it in code lower down in my Stored Procedure.
Found the answer.
Needed to use a Global Temp table and that did it for me.
declare @sql varchar(4000)
set @sql = 'Select * INTO ##TempTableName from #Test'
exec (@sql)
Select * from ##TempTableName
The ## prefix makes it a global temp table, and that worked for me: a local # table created inside EXEC() is dropped as soon as that batch finishes, so only a global table survives for the rest of the procedure.
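One caveat worth noting: a global temp table persists until every session referencing it is done, so it can collide with a leftover copy. A minimal guard sketch for SQL Server (the table name is just the one from this example):
IF OBJECT_ID('tempdb..##TempTableName') IS NOT NULL
    DROP TABLE ##TempTableName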
I am creating a stored procedure that takes a table name as a parameter and does
"select @TableName from @TableName"
But Sybase ASE is not allowing me to do that. I am getting this message: "Cannot select from or insert/update variable '@TableName' because it is not a table variable."
Here is my stored procedure:
CREATE PROC Test_result @TableName VARCHAR(40)
as
BEGIN
CREATE TABLE #Results (TableName nvarchar(370))
INSERT INTO #Results select @TableName from @TableName
select * from #Results
END
EXEC Test_result 'sometablename'
This simulates my actual problem. I want to insert a table name into a results table if it matches some condition (I haven't mentioned it here because I don't want to confuse you).
**Note: I want to do a quick select query from the table name which I passed to the stored procedure. I don't want to create the table structure again, because the stored procedure may get another table name whose table DDL is different.**
Could anyone provide an alternative or a solution for this?
Sorry for the delay in responding. I have found a workaround, which I would like to share.
INSERT INTO #Results select @TableName from @TableName
To make this work, store the query in a variable and execute it with the EXEC statement in Sybase.
The workaround is:
BEGIN
DECLARE @sqlquery VARCHAR(4000)
-- the table name must be concatenated into the string, because variables
-- declared in the procedure are not visible inside EXEC()
SET @sqlquery = 'INSERT INTO #Results SELECT ''' + @TableName + ''' FROM ' + @TableName
EXEC(@sqlquery)
END
This solved my problem, since a @TableName variable can't be used directly in place of a table name.
If the objective is to insert the value of @TableName into #Results then either of the following should suffice:
INSERT INTO #Results select @TableName
INSERT INTO #Results values (@TableName)
If the intent is to insert @TableName into #Results but only if there's a user table with this name in the current database then try:
INSERT INTO #Results select name from sysobjects where type = 'U' and name = @TableName
If this doesn't answer the question then please update the question with more details as well as some examples of @TableName values that do and do not work.
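Putting the pieces together, a sketch of the full procedure built on the sysobjects check (assuming "a user table with that name exists" is the condition alluded to in the question):
CREATE PROC Test_result @TableName VARCHAR(40)
as
BEGIN
CREATE TABLE #Results (TableName nvarchar(370))
-- record the name only when a user table with that name exists
INSERT INTO #Results select name from sysobjects where type = 'U' and name = @TableName
select * from #Results
END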
I am trying to insert the result of dynamic SQL into a temp table. The important thing is that I don't know the column names in advance. Per the SO suggestion, the following should work:
INSERT into #T1 execute ('execute ' + @SQLString )
Also, omit the EXECUTE if the SQL string is something other than a procedure.
However, this is not working on SQL Server 2017:
CREATE TABLE Documents(DocumentID INT, Status NVARCHAR(10))
INSERT INTO Documents(DocumentID,Status)
VALUES
(1,'Active'),
(2,'Active'),
(3,'Active'),
(4,'Active')
DECLARE @SQLString NVARCHAR(MAX)
SET @SQLString = 'SELECT * FROM Documents'
INSERT into #T1 execute ('execute ' + @SQLString )
I get the error `Invalid object name '#T1'.`
Then I tried omitting EXECUTE:
INSERT into #T1 execute (@SQLString)
with the same error `Invalid object name '#T1'.`
I should be able to do
SELECT * FROM #T1
You cannot do an INSERT INTO without having the table predefined. What I believe you are asking for is a SELECT INTO. I am aware of two ways of doing it. The first uses OPENROWSET, though it has some security drawbacks. You could do the following:
-- 'Ad Hoc Distributed Queries' is an advanced option, so advanced options must be enabled first
sp_configure 'show advanced options', 1
GO
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO
SELECT *
INTO #T1
FROM OPENROWSET('SQLNCLI',
'Server=localhost;Trusted_Connection=yes;',
'SELECT * from <YOURDATABASE>.dbo.Documents')
Your second option is to create an inline TVF that will generate the table structure for you. So you could do the following:
CREATE FUNCTION getDocuments()
RETURNS TABLE
AS
RETURN
SELECT * from Documents
GO
SELECT * INTO #T1 FROM dbo.getDocuments() -- note the schema prefix; table-valued functions must be called with a two-part name
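If you want to clean up afterwards, or recreate the function when Documents changes shape (its result columns are fixed at creation time, like a view's), drop it with:
DROP FUNCTION dbo.getDocuments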
I want to take a backup of a table, with the timestamp embedded in the backup table's name, so that it is easy to figure out which date the backup belongs to. I am trying something like this, which obviously does not work.
Please suggest how to modify the table name at runtime.
Scenario:
Insert into original_table+'_'+Convert(varchar(10),GETDATE(),112)
select * from original_table
The output should be: a table named original_table_20141015, created with the data.
You can build a SQL string with the new table name, then execute it using sp_executesql.
Example:
DECLARE @sql nvarchar(MAX)
SET @sql = 'SELECT * INTO original_table_' +
CONVERT(varchar(8), GETDATE(), 112) +
' FROM original_table'
EXEC sp_executesql @sql
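If the generated name might ever need escaping, or you just want to be defensive, the same statement can be built with QUOTENAME (a variant sketch; not strictly required for the plain format-112 date suffix):
DECLARE @sql nvarchar(MAX)
SET @sql = 'SELECT * INTO ' +
QUOTENAME('original_table_' + CONVERT(varchar(8), GETDATE(), 112)) +
' FROM original_table'
EXEC sp_executesql @sql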
Is it possible to pass the table name as input parameter to the stored procedure?
For example:
create procedure test
@tablename char(10)
as
begin
select * from @tablename
end
go
I know this does not work, so what is the best way to pass the table name into the stored procedure?
Many thanks
The safest way to do this is via a view.
Create a view which unions all the tables you may wish to access (and which must all have the same column structure), and prefix the rows with the table name.
CREATE VIEW MultiTable
AS
SELECT 'table1' AS TableName, * FROM table1
UNION ALL
SELECT 'table2' AS TableName, * FROM table2
UNION ALL
SELECT 'table3' AS TableName, * FROM table3
Your stored procedure can now filter on the table name:
CREATE PROCEDURE test
@TableName varchar(100)
AS
SELECT * FROM MultiTable WHERE TableName = @TableName
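For example, to read just table1 through the view:
EXEC test @TableName = 'table1'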
This is safer than using dynamic SQL creation and execution.
You would need to use dynamic SQL, but you need to be aware of the potential SQL injection risks you open yourself up to: if @tablename contained something dodgy, you could end up in a world of pain.
e.g.
-- basic check to see if a table with this name exists
IF NOT EXISTS(SELECT * FROM sys.tables WHERE name = @tablename)
    RETURN
DECLARE @sql NVARCHAR(300) -- room for the quoted name plus the SELECT
SET @sql = 'SELECT * FROM ' + QUOTENAME(@tablename)
EXECUTE(@sql)
You need to be very careful with this approach; make sure you don't open up a can of security worms.
My other concern is that you may be trying to make generic data access sprocs, which is usually a bad idea. Obviously I don't know your use case.
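For completeness, here is the same pattern in the stored-procedure shape the question used (a sketch; sysname is a reasonable type for object names):
CREATE PROCEDURE test
    @tablename sysname
AS
BEGIN
    -- refuse anything that is not an existing table
    IF NOT EXISTS(SELECT * FROM sys.tables WHERE name = @tablename)
        RETURN
    DECLARE @sql NVARCHAR(300)
    SET @sql = 'SELECT * FROM ' + QUOTENAME(@tablename)
    EXECUTE(@sql)
END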
DECLARE @Name VARCHAR(50)
SET @Name = 'Company'
EXEC('SELECT * from ' + @Name)
Use this approach to get records from the database.
I installed SQL Server 2012, and attached a database originally generated by SQL Server 2008 R2.
Everything appeared to work perfectly, with one problem: merges dropped from 1000 per second to 10 per second (a 100x slowdown).
I'm surmising that it's because I am accessing a SQL Server 2008 R2 database from SQL Server 2012. Is there some way to convert the database to SQL Server 2012 format? Or is there something else that's going on that might explain the 100x slowdown in performance?
Please make sure that you set the compatibility level of the database to 110, and update statistics.
ALTER DATABASE MyDatabase SET COMPATIBILITY_LEVEL = 110;
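To confirm the change took effect, a quick check:
SELECT compatibility_level FROM sys.databases WHERE name = 'MyDatabase';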
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += CHAR(13) + CHAR(10) + 'UPDATE STATISTICS '
+ QUOTENAME(SCHEMA_NAME(schema_id))
+ '.' + QUOTENAME(name) + ' WITH FULLSCAN;'
FROM sys.tables;
PRINT @sql;
--EXEC sp_executesql @sql;
When I ran the SQL in the answer, the nvarchar overflowed. The problem is that when your database has too many tables, the SQL is too long for an nvarchar. My database had enough tables to overflow even a varchar (which holds twice as many characters as an nvarchar). So I edited the SQL to loop through the tables and execute a separate statement for each. This way you won't miss updating the stats on any of your tables.
ALTER DATABASE MyDatabase SET COMPATIBILITY_LEVEL = 110;
DECLARE @SQL NVARCHAR(MAX) = N'';
DECLARE @Tables table
    ([Schema] nvarchar(50)
    ,[TableName] nvarchar(100))
INSERT INTO @Tables
SELECT QUOTENAME(SCHEMA_NAME(schema_id)), QUOTENAME(name)
FROM sys.tables;
DECLARE @Schema nvarchar(50), @TableName nvarchar(100)
WHILE EXISTS (SELECT * FROM @Tables)
BEGIN
    SELECT TOP 1 @Schema = [Schema], @TableName = [TableName] FROM @Tables
    SET @SQL = 'UPDATE STATISTICS ' + @Schema + '.' + @TableName + ' WITH FULLSCAN;'
    BEGIN TRY
        -- sp_executesql's first parameter is named @stmt
        EXEC sp_executesql @stmt = @SQL
        PRINT 'Completed: ' + @SQL
    END TRY
    BEGIN CATCH
        DECLARE @ErrMsg nvarchar(4000)
        SELECT @ErrMsg = SUBSTRING(ERROR_MESSAGE(), 1, 900)
        SELECT GETDATE(), 'Failed updating stats on ' + @Schema + ' ' + @TableName + '. Error: ' + @ErrMsg
    END CATCH
    DELETE FROM @Tables WHERE [Schema] = @Schema AND [TableName] = @TableName
END
Updating the stats is a must when you detach and attach a database. Otherwise the query planner cannot generate efficient execution plans and you end up with long execution times. This is what I noticed.
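If FULLSCAN on every table is more than you need, a lighter-weight alternative sketch is the built-in sp_updatestats, which uses default sampling and only touches statistics that actually need updating:
USE MyDatabase;
GO
EXEC sp_updatestats;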
To upgrade a database file to use LocalDB:
1. In Server Explorer, choose the Connect to Database button.
2. In the Add Connection dialog box, specify the following information:
   Data Source: Microsoft SQL Server (SqlClient)
   Server Name: (LocalDB)\v11.0
   Attach a database file: Path, where Path is the physical path of the primary .mdf file.
   Logical Name: Name, where Name is the name that you want to use with the file.
3. Choose the OK button.
4. When prompted, choose the Yes button to upgrade the file.
Is this on the right track:
http://msdn.microsoft.com/en-us/library/ms189625.aspx
USE master;
GO
CREATE DATABASE MyDatabase
ON (FILENAME = 'C:\MySQLServer\MyDatabase.mdf'),
(FILENAME = 'C:\MySQLServer\Database.ldf')
FOR ATTACH;
GO