Update List
set Date = "2009-07-21T19:00:40"
SQL Server doesn't recognize this format. Is there a conversion function?
You can use CAST and CONVERT. You may need to replace the 'T' with a space before you can convert it (there are also string manipulation functions available for that).
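For example, a minimal sketch using the table and column names from the question (style 126 is the ISO 8601 format that includes the 'T'):
UPDATE List SET Date = CONVERT(datetime, '2009-07-21T19:00:40', 126)
-- or, replacing the 'T' with a space first:
UPDATE List SET Date = CONVERT(datetime, REPLACE('2009-07-21T19:00:40', 'T', ' '))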
Worked just fine for me (SQL Express 2005) except for the double quotes (which SQL Server interpreted as a column identifier); is that what's throwing the error? Thanks for the sample code, but can you post the actual error?
In other words,
DECLARE @List TABLE ( [date] DATETIME )
INSERT INTO @List
SELECT GETUTCDATE()
UPDATE @List SET Date = "2009-07-21T19:00:40"
produces
Msg 207, Level 16, State 1, Line 7
Invalid column name
'2009-07-21T19:00:40'.
Whereas
DECLARE @List TABLE ( [date] DATETIME )
INSERT INTO @List
SELECT GETUTCDATE()
UPDATE @List SET Date = '2009-07-21T19:00:40'
runs successfully.
Try this:
UPDATE List SET Date = '2009/07/21 19:00:40'
It worked for me too (2005 & 2008).
DECLARE @tbl TABLE ( [date] DATETIME )
INSERT INTO @tbl SELECT getdate()
select * from @tbl
UPDATE @tbl SET Date = '2009-07-21T19:00:40'
select * from @tbl
However, you can also give the SET DATEFORMAT command a shot. Something like this:
SET DATEFORMAT dmy
DECLARE @tbl TABLE ( [date] DATETIME )
INSERT INTO @tbl SELECT getdate()
select * from @tbl
UPDATE @tbl SET Date = '2009-07-21T19:00:40'
select * from @tbl
Related
I have a table T1 with alphanumeric codes (a varchar column) where the first three characters will always be numeric, like this:
001ABCD
100EFGH
541XYZZ
OTHER
NOTE: Please notice that I have ONE exception record which is all alpha (OTHER).
I also have a table T2 with 3-digit numbers (an int column), like this:
001
200
300
So when I run the following query:
SELECT * from T1
LEFT JOIN T2
ON SUBSTRING(T1.code1,1,3) = T2.code2
WHERE T1.code1 <> 'OTHER'
I get this error:
Conversion failed when converting the varchar value 'OTH' to data type int.
I know the issue but not how to fix it (it's trying to compare 'OTH' with the T2.code2 INT column).
I tried to filter it out with the WHERE clause, but that didn't work at all.
I cannot get rid of the 'OTHER' record, and converting the T2.code2 column from int to varchar is not an option. Any ideas?
Here are 3 different ways you can solve this. I would recommend the persisted computed column, since it only has to be calculated on insert and update, not every time you run the read query.
DROP TABLE IF EXISTS #T2;
DROP TABLE IF EXISTS #T1;
CREATE TABLE #T1
(
Code1 VARCHAR(10)
,Code2Computed AS TRY_CONVERT(INT,SUBSTRING(Code1,1,3)) PERSISTED
)
;
CREATE TABLE #T2
(
Code2 INT
)
;
INSERT INTO #T1
(Code1)
VALUES
('001ABCD')
,('100EFGH')
,('541XYZZ')
,('OTHER')
;
INSERT INTO #T2
(Code2)
VALUES
(001)
,(100)
,(200)
,(300)
,(541)
;
--Convert INT to 3 digit code
SELECT *
FROM #T1
LEFT JOIN #T2
ON SUBSTRING(#T1.Code1,1,3) = RIGHT(CONCAT('000',#T2.Code2),3)
;
--Convert 3 digit code to INT
SELECT *
FROM #T1
LEFT JOIN #T2
ON TRY_CONVERT(INT,SUBSTRING(#T1.Code1,1,3)) = #T2.Code2
;
--Use computed column
SELECT *
FROM #T1
LEFT JOIN #T2
ON #T1.Code2Computed = #T2.Code2
;
Hi all, I have a stored procedure with two parameters, @StartDate and @EndDate. When I execute the procedure, I get data.
Now I have added a third parameter that takes a list of values, so I added a split function and used it in the WHERE clause. After making the changes, when I execute my SP I do not get any data. If I comment the third parameter out of the WHERE clause, I see the data again. I'm not sure what is happening. Any advice is greatly appreciated.
I have tried different split functions, as well as CHARINDEX(',' + CAST(tableid AS varchar(8000)) + ',', @Ids) > 0, and nothing has worked.
Thanks
NOTE: The concatenation and splitting of parameter values is a poor design for performance reasons and, most importantly, very susceptible to SQL injection attacks. Please research some alternatives. If you must proceed down this path...
There are a great many split functions out there, but I used this one here to illustrate a possible solution.
CREATE FUNCTION [dbo].[fnSplitString]
(
    @string NVARCHAR(MAX),
    @delimiter CHAR(1)
)
RETURNS @output TABLE(splitdata NVARCHAR(MAX))
BEGIN
    DECLARE @start INT, @end INT
    SELECT @start = 1, @end = CHARINDEX(@delimiter, @string)
    WHILE @start < LEN(@string) + 1 BEGIN
        IF @end = 0
            SET @end = LEN(@string) + 1
        INSERT INTO @output (splitdata)
        VALUES(SUBSTRING(@string, @start, @end - @start))
        SET @start = @end + 1
        SET @end = CHARINDEX(@delimiter, @string, @start)
    END
    RETURN
END
GO
It's unclear from your question whether you need to filter your results based on an int, a varchar, or one of the various other data types available, but here are two options (probably the most common).
DECLARE @TableOfData TABLE
(
    ID_INT INT,
    ID_VAR VARCHAR(100),
    START_DATE DATETIME,
    END_DATE DATETIME
)
DECLARE @StartDate DATETIME
DECLARE @EndDate DATETIME
DECLARE @Ids VARCHAR(1000)
DECLARE @Delimiter VARCHAR(1)
SET @Delimiter = ','
SET @StartDate = GETDATE()
SET @EndDate = DATEADD(HH, 1, GETDATE())
SET @Ids = '1,2,4'
--Create some test data
INSERT INTO @TableOfData
SELECT 1, '1', GETDATE(), DATEADD(MI, 1, GETDATE()) --In our window of expected results (date + id)
UNION SELECT 2, '2', GETDATE(), DATEADD(D, 1, GETDATE()) --NOT in our window of expected results b/c of date
UNION SELECT 3, '3', GETDATE(), DATEADD(MI, 2, GETDATE()) --NOT in our expected results (id)
UNION SELECT 4, '4', GETDATE(), DATEADD(MI, 4, GETDATE()) --In our window of expected results (date + id)
--If querying by string, expect 2 results
SELECT TD.*
FROM @TableOfData TD
INNER JOIN dbo.fnSplitString(@Ids, @Delimiter) SS
    ON TD.ID_VAR = SS.splitdata
WHERE START_DATE >= @StartDate
    AND END_DATE <= @EndDate
--If querying by int, expect 2 results
SELECT TD.*
FROM @TableOfData TD
INNER JOIN dbo.fnSplitString(@Ids, @Delimiter) SS
    ON TD.ID_INT = CONVERT(int, SS.splitdata)
WHERE START_DATE >= @StartDate
    AND END_DATE <= @EndDate
You cannot use a parameter containing a list directly in your query filter. Try storing the separated values in a table variable or temp table and referencing that in your query, or use dynamic SQL to build the query if you don't want to use a table variable or temp table.
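For example, a minimal sketch of the table-variable approach, reusing the dbo.fnSplitString function from the answer above (dbo.YourTable and its columns are placeholders for your own):
-- Split the list once into a table variable
DECLARE @IdList TABLE (Id INT)
INSERT INTO @IdList (Id)
SELECT CONVERT(INT, splitdata)
FROM dbo.fnSplitString(@Ids, ',')
-- Then join against it in the main query
SELECT t.*
FROM dbo.YourTable t                          -- placeholder table
INNER JOIN @IdList i ON t.TableId = i.Id      -- placeholder id column
WHERE t.StartDate >= @StartDate               -- placeholder date columns
  AND t.EndDate <= @EndDate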
Once I run the query, how do I put them back together again?
I was able to run the following query to convert a date value into datetime and append the time part to it:
declare @date char(8), @time char(8)
select @date = '20101001', @time = '12:10:47'
select cast(@date as datetime) + @time
In the above method, the date value is converted to the datetime datatype and the time value is added to it.
-------------- Output ----------------------
(No column name)
2010-10-01 12:10:47.000
How can I convert back to the original date value (undo)? I ran the above query to convert to the datetime datatype with the time value added, and it worked well. Now I want to undo that and go back to the original date value.
It is not clear what the question is, but this is my guess. If you are trying to extract pieces of a datetime, use the DATEPART function:
declare @date char(8), @time char(8)
select @date = '20101001', @time = '12:10:47'
select cast(@date as datetime) + @time
select cast(cast(@date as datetime) + @time as datetime)
select DATEPART(mm, cast(cast(@date as datetime) + @time as datetime))
To extract the constituent parts of a datetime into a string of a specific format, use the CONVERT function and pass the desired style. To get back to where you started, use:
DECLARE @date CHAR(8),
        @time CHAR(8)
SELECT @date = '20101001',
       @time = '12:10:47'
DECLARE @dt DATETIME
SELECT @dt = CAST(@date AS DATETIME) + @time
SELECT CONVERT(CHAR(8), @dt, 112) AS [@date],
       CONVERT(CHAR(8), @dt, 108) AS [@time]
Which gives
@date    @time
-------- --------
20101001 12:10:47
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Question: How to get roles into a table view like this:
select 'Pub'
union
select 'RegUser'
union
select 'ServiceAdmin'
After posting this, I started playing with some dynamic SQL. This seems to work, but it feels like there might be some security risks in using dynamic SQL. Thoughts on this?
declare @rolesSql varchar(800)
select @rolesSql = 'select ''' + replace(@roles, ',', ''' union select ''') + ''''
exec(@rolesSql)
If you're working with SQL Server compatibility level 130 or higher, then the STRING_SPLIT function is the most succinct method available.
Reference link: https://msdn.microsoft.com/en-gb/library/mt684588.aspx
Usage:
SELECT * FROM string_split('Pub,RegUser,ServiceAdmin',',')
RESULT:
value
-----------
Pub
RegUser
ServiceAdmin
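STRING_SPLIT can also be joined directly against a table. A minimal sketch, where dbo.Users and RoleName are placeholder names:
SELECT u.*
FROM dbo.Users u                                             -- placeholder table
INNER JOIN STRING_SPLIT('Pub,RegUser,ServiceAdmin', ',') s
    ON u.RoleName = s.value                                  -- placeholder column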
See my answer from here
But basically you would:
Create this function in your DB:
CREATE FUNCTION dbo.Split(@origString varchar(max), @Delimiter char(1))
returns @temptable TABLE (items varchar(max))
as
begin
    declare @idx int
    declare @split varchar(max)
    select @idx = 1
    if len(@origString) < 1 or @origString is null return
    while @idx != 0
    begin
        set @idx = charindex(@Delimiter, @origString)
        if @idx != 0
            set @split = left(@origString, @idx - 1)
        else
            set @split = @origString
        if (len(@split) > 0)
            insert into @temptable(items) values(@split)
        set @origString = right(@origString, len(@origString) - @idx)
        if len(@origString) = 0 break
    end
    return
end
and then call the function and pass in the string you want to split.
Select * From dbo.Split(@roles, ',')
Here's a thorough discussion of your options:
Arrays and Lists in SQL Server
What I do in this case is use some string replacement to convert the CSV to JSON and then open the JSON like a table. It may not be suitable for every use case, but it is very simple to get running and works with strings and files. With files you just need to watch the line break character; mostly I find it to be Char(13)+Char(10).
declare @myCSV nvarchar(MAX)= N'"Id";"Duration";"PosX";"PosY"
"•P001";223;-30;35
"•P002";248;-28;35
"•P003";235;-26;35'
--CSV to JSON
--convert to JSON by replacing some stuff
declare @myJson nvarchar(MAX)= '[['+ replace(@myCSV, Char(13)+Char(10), '],[' ) +']]'
set @myJson = replace(@myJson, ';',',') -- Optional: ensure comma delimiters for JSON if the current delimiter differs
-- set @myJson = replace(@myJson, ',,',',null,') -- Optional: handle empty values in between
-- set @myJson = replace(@myJson, ',]',',null]') -- Optional: handle an empty value before a line break
SELECT
    ROW_NUMBER() OVER (ORDER BY (SELECT 0))-1 AS LineNumber, *
FROM OPENJSON( @myJson )
with (
    col0 varchar(255) '$[0]'
    ,col1 varchar(255) '$[1]'
    ,col2 varchar(255) '$[2]'
    ,col3 varchar(255) '$[3]'
    ,col4 varchar(255) '$[4]'
    ,col5 varchar(255) '$[5]'
    ,col6 varchar(255) '$[6]'
    ,col7 varchar(255) '$[7]'
    ,col8 varchar(255) '$[8]'
    ,col9 varchar(255) '$[9]'
    --any column names and any column count are possible
) csv
order by (SELECT 0) OFFSET 1 ROWS --hide the header row
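If the CSV lives in a file rather than in a string literal, one way to get it into the variable (a sketch only; the file path is a placeholder and the SQL Server service account must be able to read it) is OPENROWSET in BULK mode, used in place of the literal assignment above:
DECLARE @myCSV nvarchar(MAX)
SELECT @myCSV = BulkColumn
FROM OPENROWSET(BULK 'C:\data\positions.csv', SINGLE_CLOB) AS src   -- use SINGLE_NCLOB for a Unicode file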
Using SQL Server's built-in XML parsing is also an option. Of course, this glosses over all the nuances of an RFC 4180-compliant CSV.
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Here's the XML way
select split.csv.value('.', 'varchar(100)') as value
from (
select cast('<x>' + replace(@roles, ',', '</x><x>') + '</x>' as xml) as data
) as csv
cross apply data.nodes('/x') as split(csv)
If you are using SQL 2016+, using string_split is better, but this is a common way to do this prior to SQL 2016.
Using BULK INSERT, you can import a CSV file into your SQL table:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
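A minimal sketch of the syntax (the target table, file path, and options are placeholders; the right options depend on your file):
BULK INSERT dbo.TargetTable                -- placeholder target table
FROM 'C:\data\roles.csv'                   -- placeholder file path
WITH (
    FIELDTERMINATOR = ',',                 -- column delimiter
    ROWTERMINATOR = '\n',                  -- row delimiter
    FIRSTROW = 2                           -- skip a header row, if there is one
)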
The accepted answer works fine, but I found this function to be much faster, even for thousands of records. Create the function below and use it.
IF EXISTS (
SELECT 1
FROM Information_schema.Routines
WHERE Specific_schema = 'dbo'
AND specific_name = 'FN_CSVToStringListTable'
AND Routine_Type = 'FUNCTION'
)
BEGIN
DROP FUNCTION [dbo].[FN_CSVToStringListTable]
END
GO
CREATE FUNCTION [dbo].[FN_CSVToStringListTable] (@InStr VARCHAR(MAX))
RETURNS @TempTab TABLE (Id NVARCHAR(max) NOT NULL)
AS
BEGIN
    -- Append a trailing comma and collapse doubled commas so empty entries are skipped
    SET @InStr = REPLACE(@InStr + ',', ',,', ',')
    DECLARE @SP INT
    DECLARE @VALUE VARCHAR(1000)
    WHILE PATINDEX('%,%', @InStr) <> 0
    BEGIN
        SELECT @SP = PATINDEX('%,%', @InStr)
        SELECT @VALUE = LEFT(@InStr, @SP - 1)
        SELECT @InStr = STUFF(@InStr, 1, @SP, '')
        INSERT INTO @TempTab (Id)
        VALUES (@VALUE)
    END
    RETURN
END
GO
-- Test like this:
declare @v as NVARCHAR(max) = N'asdf,,as34df,234df,fs,,34v,5fghwer,56gfg,';
SELECT Id FROM dbo.FN_CSVToStringListTable(@v)
I was about to use the solution mentioned in the accepted answer, but doing more research led me to use table-valued types.
These are far more efficient, and you don't need a table-valued function (TVF) just to create a table from a CSV. You can use the type directly in your scripts or pass it to a stored procedure as a table-valued parameter. The type can be created as:
CREATE TYPE [UniqueIdentifiers] AS TABLE(
[Id] [varchar](20) NOT NULL
)
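A minimal usage sketch (dbo.Users, RoleName, and dbo.GetUsersByRole are placeholders; on the procedure side a table-valued parameter must be declared READONLY):
DECLARE @roleIds UniqueIdentifiers
INSERT INTO @roleIds (Id)
VALUES ('Pub'), ('RegUser'), ('ServiceAdmin')
-- Use it directly in a script...
SELECT u.*
FROM dbo.Users u                                 -- placeholder table
INNER JOIN @roleIds r ON u.RoleName = r.Id       -- placeholder column
-- ...or pass it to a stored procedure as a table-valued parameter
EXEC dbo.GetUsersByRole @RoleIds = @roleIds      -- placeholder procedure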
I have a stored procedure in an old SQL 2000 database that takes a comment column, which is formatted as a varchar, and exports it as a money value. At the time this table structure was set up, it was assumed this would be the only data going into the field. The current procedure is simply this:
SELECT CAST(dbo.member_category_assign.coment AS money)
FROM dbo.member_category_assign
WHERE member_id = @intMemberId
    AND dbo.member_category_assign.eff_date <= @dtmEndDate
    AND (
        dbo.member_category_assign.term_date >= @dtmBeginDate
        OR dbo.member_category_assign.term_date Is Null
    )
However, data is now being inserted into this column that is not parsable as money and is causing the procedure to fail. I am unable to remove the "bad" data (since this is a third-party product), but I need to update the stored procedure to test for a money-parsable entry and return only that.
How can I update this procedure so that it returns only values that are parsable as money? Do I create a temporary table and iterate through every item, or is there a cleverer way to do this? I'm stuck with legacy SQL 2000 (version 6.0), so the newer functions are unfortunately not available.
Checking with ISNUMERIC may help you: you can simply return a zero value, or return 'N/a' or some other string value if you prefer.
I created the sample below with the columns from your query.
The first query just returns all rows.
The second query returns a MONEY value.
The third one returns a string value with 'N/a' in place of the non-numeric value.
set nocount on
drop table #MoneyTest
create table #MoneyTest
(
MoneyTestId Int Identity (1, 1),
coment varchar (100),
member_id int,
eff_date datetime,
term_date datetime
)
insert into #MoneyTest (coment, member_id, eff_date, term_date)
values
(104, 1, '1/1/2008', '1/1/2009'),
(200, 1, '1/1/2008', '1/1/2009'),
(322, 1, '1/1/2008', '1/1/2009'),
(120, 1, '1/1/2008', '1/1/2009')
insert into #MoneyTest (coment, member_id, eff_date, term_date)
values ('XX', 1, '1/1/2008', '1/1/2009')
Select *
FROM #MoneyTest
declare @intMemberId int = 1
declare @dtmBeginDate DateTime = '1/1/2008'
declare @dtmEndDate DateTime = '1/1/2009'
SELECT
CASE WHEN ISNUMERIC (Coment)=1 THEN CAST(#MoneyTest.coment AS money) ELSE cast (0 as money) END MoneyValue
FROM #MoneyTest
WHERE member_id = @intMemberId
AND #MoneyTest.eff_date <= @dtmEndDate
AND
(
#MoneyTest.term_date >= @dtmBeginDate
OR
#MoneyTest.term_date Is Null
)
SELECT
CASE WHEN ISNUMERIC (Coment)=1 THEN CAST (CAST(#MoneyTest.coment AS money) AS VARCHAR) ELSE 'N/a' END StringValue
FROM #MoneyTest
WHERE member_id = @intMemberId
AND #MoneyTest.eff_date <= @dtmEndDate
AND
(
#MoneyTest.term_date >= @dtmBeginDate
OR
#MoneyTest.term_date Is Null
)
Apologies for making a new answer where a comment would suffice, but I lack the required permissions to do so. As to your question, I would only add that you should use ISNUMERIC carefully. While it works much as expected, it also treats things like '1.3E-2' as a valid numeric, which, strangely enough, you cannot cast to numeric or money without generating an exception. I generally end up using:
SELECT
CASE WHEN ISNUMERIC( some_value ) = 1 AND CHARINDEX( 'E', Upper( some_value ) ) = 0
THEN Cast( some_value as money )
ELSE Cast( 0 as money )
END as money_value