I am developing a stored procedure in SQL Server 2012.
In one part I would like to test whether "NAME" contains the substring @pattern, case-insensitively.
@pattern is a string with lower-case letters only.
I've tried
where CONTAINS(NAME, @pattern)
which is fine, but when I add LOWER()
where CONTAINS(LOWER(NAME), @pattern)
it gives me a syntax error.
I am new to T-SQL; can anyone help me out?
Why is it a syntax error, and how can I solve it?
Is there a better way to do exactly the same task?
Thanks!
If your table is not full-text indexed then CONTAINS will not work.
Other variants to solve your issue are below:
DECLARE @table AS TABLE ( SomeText VARCHAR(40) )
INSERT INTO @table
VALUES ( 'abcdfh' ),
( 'ghijkl' ),
( 'mnopq' );
DECLARE @pattern AS VARCHAR(10)
SET @pattern = '%abc%'
--variant using LIKE
SELECT T.SomeText
FROM @table AS T
WHERE T.SomeText LIKE @pattern
--variant using PATINDEX
SELECT T.SomeText
FROM @table AS T
WHERE PATINDEX(@pattern, T.SomeText) > 0
--variant using CHARINDEX, but it can't be used with '%' and '[ ]',
--just to find the first char position of the searched word in the text, as below
SET @pattern = 'abc'
SELECT T.SomeText
FROM @table AS T
WHERE CHARINDEX(@pattern, T.SomeText) > 0
What I want is to compare two strings and get how many characters both strings have in common.
For example:
I have declared a variable with the value Test1.
I get values from a table with a select query and compare them with the variable to count how many characters are the same, in order, starting from the first character of the variable.
I compare the variable against the values from the query.
Characters are compared case-insensitively (I use UPPER(string) to upper-case both the variable and the value from the select statement).
I will select the string with the MAX() count. From the output image I would select Test1 and NOT Test11, because Test11 exceeds the number of characters in the variable.
Output
Any suggestions?
You can use a recursive CTE for this...
For your next question: please do not post pictures. Rather, try to set up a stand-alone, self-running sample as I do here (DDL and INSERT).
DECLARE @tbl TABLE(ID INT IDENTITY, SomeValue VARCHAR(100));
INSERT INTO @tbl VALUES ('Test1')
,('Test11')
,('Test')
,('abc')
,('Tyes')
,('cest');
--This is the string we use to compare (casing depends on the underlying collation)
DECLARE @CheckString VARCHAR(100)='Test1';
--The query
WITH recCTE AS
(
SELECT t.ID
,t.SomeValue
,1 AS pos
--,SUBSTRING(@CheckString,1,1) AS LetterInCheckString
--,SUBSTRING(t.SomeValue,1,1) AS LetterInTableValue
,CASE WHEN SUBSTRING(@CheckString,1,1)=SUBSTRING(t.SomeValue,1,1) THEN 1 ELSE 0 END AS IsTheSame
FROM @tbl t
UNION ALL
SELECT recCTE.ID
,recCTE.SomeValue
,recCTE.pos+1
--,SUBSTRING(@CheckString,recCTE.pos+1,1)
--,SUBSTRING(recCTE.SomeValue,recCTE.pos+1,1)
,CASE WHEN SUBSTRING(@CheckString,recCTE.pos+1,1)=SUBSTRING(recCTE.SomeValue,recCTE.pos+1,1) THEN 1 ELSE 0 END
FROM recCTE
WHERE recCTE.IsTheSame=1 AND SUBSTRING(@CheckString,recCTE.pos+1,1) <>''
)
SELECT ID,SomeValue,SUM(IsTheSame)
FROM recCTE
GROUP BY ID,SomeValue
ORDER BY ID;
The idea in short:
We start with the recursion's anchor at position=1
We keep adding to it as long as the characters are the same and SUBSTRING() still returns a value.
The result is the SUM() of matching characters.
To be honest: T-SQL is the wrong tool for this...
I'm currently doing a data conversion project and need to strip all alphabetical characters from a string. Unfortunately I can't create or use a function, as we don't own the source machine, which makes the methods I've found in previous posts unusable.
What would be the best way to do this in a select statement? Speed isn't too much of an issue as this will only be running over 30,000 records or so and is a one-off statement.
You can do this in a single statement. You're not really creating a statement with 200+ REPLACEs are you?!
update tbl
set S = U.clean
from tbl
cross apply
(
select Substring(tbl.S,v.number,1)
-- this table will cater for strings up to length 2047
from master..spt_values v
where v.type='P' and v.number between 1 and len(tbl.S)
and Substring(tbl.S,v.number,1) like '[0-9]'
order by v.number
for xml path ('')
) U(clean)
Working SQL Fiddle showing this query with sample data
Replicated below for posterity:
create table tbl (ID int identity, S varchar(500))
insert tbl select 'asdlfj;390312hr9fasd9uhf012 3or h239ur ' + char(13) + 'asdfasf'
insert tbl select '123'
insert tbl select ''
insert tbl select null
insert tbl select '123 a 124'
Results
ID S
1 390312990123239
2 123
3 (null)
4 (null)
5 123124
A recursive CTE comes to the rescue here.
;WITH CTE AS
(
SELECT
[ProductNumber] AS OrigProductNumber
,CAST([ProductNumber] AS VARCHAR(100)) AS [ProductNumber]
FROM [AdventureWorks].[Production].[Product]
UNION ALL
SELECT OrigProductNumber
,CAST(STUFF([ProductNumber], PATINDEX('%[^0-9]%', [ProductNumber]), 1, '') AS VARCHAR(100) ) AS [ProductNumber]
FROM CTE WHERE PATINDEX('%[^0-9]%', [ProductNumber]) > 0
)
SELECT * FROM CTE
WHERE PATINDEX('%[^0-9]%', [ProductNumber]) = 0
OPTION (MAXRECURSION 0)
output:
OrigProductNumber ProductNumber
WB-H098 098
VE-C304-S 304
VE-C304-M 304
VE-C304-L 304
TT-T092 092
Here is RichardTheKiwi's script wrapped in a function, for use in selects without CROSS APPLY.
I also added the dot ('.') because in my case I use it for double and money values stored in a varchar field.
CREATE FUNCTION dbo.ReplaceNonNumericChars (@string VARCHAR(5000))
RETURNS VARCHAR(1000)
AS
BEGIN
SET @string = REPLACE(@string, ',', '.')
SET @string = (SELECT SUBSTRING(@string, v.number, 1)
FROM master..spt_values v
WHERE v.type = 'P'
AND v.number BETWEEN 1 AND LEN(@string)
AND (SUBSTRING(@string, v.number, 1) LIKE '[0-9]'
OR SUBSTRING(@string, v.number, 1) LIKE '[.]')
ORDER BY v.number
FOR XML PATH('')
)
RETURN @string
END
GO
Thanks RichardTheKiwi +1
Well if you really can't use a function, I suppose you could do something like this:
SELECT REPLACE(REPLACE(REPLACE(LOWER(col),'a',''),'b',''),'c','')
FROM dbo.table...
Obviously it would be a lot uglier than that, since I only handled the first three letters, but it should give the idea.
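If your server happens to be SQL Server 2017 or later (an assumption; the question doesn't state a version), the built-in TRANSLATE function can collapse the nested REPLACE chain without needing a user-defined function. A minimal sketch: map every letter to a single placeholder character, then strip the placeholder, leaving digits and other characters intact.
-- Sketch only: requires SQL Server 2017+ for TRANSLATE.
-- Caveat: any literal '#' already in the data would be stripped as well.
SELECT REPLACE(
TRANSLATE(LOWER(col), 'abcdefghijklmnopqrstuvwxyz', REPLICATE('#', 26)),
'#', '') AS stripped
FROM dbo.table...
LOWER() is only there so the single 26-letter map also catches upper-case input regardless of collation.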
Hi all, I have a query in Oracle as follows:
DECLARE in_variable Varchar;
Select Row_Number()
OVER
(
Order By
Decode(in_variable,'column_name ASC',t.column_name) Asc) b
From table t
Converted to SQL Server as follows:
DECLARE @in_variable NVARCHAR(100)
SELECT ROW_NUMBER() OVER
(
ORDER BY
IIF ( @in_sort_by <> '', 'column_name ASC', t.column_name ) ASC )
FROM table t
Is this correct, or am I doing something wrong? When I give a value for @in_variable I get a conversion exception in SQL Server, so can someone help me?
Rather than using either DECODE or IIF, you'd be better off using CASE. For SQL Server, this would be:
SELECT ROW_NUMBER() OVER
( ORDER BY
CASE WHEN @in_sort_by <> ''
THEN 'column_name ASC'
ELSE t.column_name END ASC )
FROM table t
If you're getting a type conversion error, that would imply that t.column_name is an int. SQL Server will try to convert the static string 'column_name ASC' to match the data type of the column it is being used in place of. To fix this, you can try using CAST to convert the column to VARCHAR:
SELECT ROW_NUMBER() OVER
( ORDER BY
CASE WHEN @in_sort_by <> ''
THEN 'column_name ASC'
ELSE CAST(t.column_name as varchar) END ASC )
FROM table t
However, I think you're probably pursuing the wrong solution here. It looks like you're trying to make the analytic function sort differently based on the variable provided. Providing the alternate column name and sort order as a string is not going to do that. You should probably look at questions related to dynamic sorting for how to do this correctly.
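For completeness, here is a rough sketch of one common dynamic-sort pattern (the column names name and created_date are hypothetical, purely for illustration). Each sort key gets its own CASE so every branch keeps a single, consistent data type:
-- Sketch: sort by a different column depending on @in_sort_by
SELECT ROW_NUMBER() OVER
( ORDER BY
CASE WHEN @in_sort_by = 'name' THEN t.name END ASC,
CASE WHEN @in_sort_by = 'created' THEN t.created_date END ASC,
t.column_name ASC ) -- default sort when no branch matches
FROM table t
When a CASE branch doesn't match, it evaluates to NULL for every row and effectively drops out of the sort, leaving the next key to decide the order.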
I have PostgreSQL 8.4.
I have a table and I want to find a string in one of its columns (character varying datatype) using a substring (character varying datatype) returned by a subquery:
SELECT uchastki.kadnum
FROM uchastki
WHERE kadnum LIKE (
SELECT str
FROM test
WHERE str IS NOT NULL)
But I get an error:
ERROR: more than one row returned by a subquery used as an expression
In the field test.str I have strings like 66:07:21 01 001, and in uchastki.kadnum strings like 66:07:21 01 001:27.
How can I find the substring using the results of the subquery?
UPDATE
Table test:
CREATE TABLE test
(
id serial NOT NULL,
str character varying(255)
)
WITH (
OIDS=FALSE
);
ALTER TABLE test OWNER TO postgres;
Table uchastki:
CREATE TABLE uchastki
(
fid serial NOT NULL,
the_geom geometry,
id_uch integer,
num_opora character varying,
kod_lep integer,
kadnum character varying,
sq real,
kod_type_opora character varying,
num_f11s integer,
num_opisanie character varying,
CONSTRAINT uchastki_pkey PRIMARY KEY (fid),
CONSTRAINT enforce_dims_the_geom CHECK (st_ndims(the_geom) = 2)
)
WITH (
OIDS=FALSE
);
ALTER TABLE uchastki OWNER TO postgres;
Use LIKE ANY:
SELECT uchastki.kadnum
FROM uchastki
WHERE kadnum LIKE ANY(
SELECT str
FROM test
WHERE str IS NOT NULL)
Or perhaps:
SELECT uchastki.kadnum
FROM uchastki
WHERE kadnum LIKE ANY(
SELECT '%' || str || '%'
FROM test
WHERE str IS NOT NULL)
This is a nice feature: you can use different operators, for example = ANY (SELECT ...) or <> ALL (SELECT ...).
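For example, re-using the tables from the question (the IS NOT NULL filter matters: a single NULL in test.str would stop <> ALL from ever being true):
-- kadnum equal to at least one value in test.str
SELECT kadnum FROM uchastki
WHERE kadnum = ANY (SELECT str FROM test WHERE str IS NOT NULL);
-- kadnum different from every value in test.str
SELECT kadnum FROM uchastki
WHERE kadnum <> ALL (SELECT str FROM test WHERE str IS NOT NULL);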
I'm going to take a wild stab in the dark and assume you mean that you want to match a string Sa from table A against one or more other strings S1 .. Sn from table B to find out if any of the other strings in S1 .. Sn is a substring of Sa.
A simple example to show what I mean (hint, hint):
Given:
CREATE TABLE tableA (string_a text);
INSERT INTO tableA(string_a) VALUES
('the manual is great'), ('Chicken chicken chicken'), ('bork');
CREATE TABLE tableB(candidate_str text);
INSERT INTO tableB(candidate_str) VALUES
('man'),('great'),('chicken');
I want the result set:
the manual is great
chicken chicken chicken
because the manual is great has man and great in it; and because chicken chicken chicken has chicken in it. There is no need to show the substring(s) that matched. bork doesn't match any substring so it is not found.
Here's a SQLFiddle with the sample data.
If so, shamelessly stealing @maniek's excellent suggestion, you would use:
SELECT string_a
FROM tableA
WHERE string_a LIKE ANY (SELECT '%'||candidate_str||'%' FROM tableB);
(Vote for @maniek please; I'm just illustrating how to clearly explain - I hope - what you want to achieve, with sample data, etc.)
(Note: This answer was written before further discussion clarified the poster's actual intentions)
It would appear highly likely that there is more than one str in test where str IS NOT NULL. That's why more than one row is returned by the subquery used as an expression, and, thus, why the statement fails.
Run the subquery stand-alone to see what it returns and you'll see. Perhaps you intended it to be a correlated subquery but forgot the outer column-reference? Or perhaps there's a column also called str in the outer table and you meant to write:
SELECT uchastki.kadnum
FROM uchastki
WHERE kadnum LIKE (
SELECT test.str
FROM test
WHERE uchastki.str IS NOT NULL)
?
(Hint: Consistently using table aliases on column references helps to avoid name-clash confusion).
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Question: How to get roles into a table view like this:
select 'Pub'
union
select 'RegUser'
union
select 'ServiceAdmin'
After posting this, I started playing with some dynamic SQL. This seems to work, but it seems like there might be some security risks with dynamic SQL - thoughts on this?
declare @rolesSql varchar(800)
select @rolesSql = 'select ''' + replace(@roles, ',', ''' union select ''') + ''''
exec(@rolesSql)
If you're working with SQL Server compatibility level 130 or higher, then the STRING_SPLIT function is now the most succinct method available.
Reference link: https://msdn.microsoft.com/en-gb/library/mt684588.aspx
Usage:
SELECT * FROM string_split('Pub,RegUser,ServiceAdmin',',')
RESULT:
value
-----------
Pub
RegUser
ServiceAdmin
See my answer from here
But basically you would:
Create this function in your DB:
CREATE FUNCTION dbo.Split(@origString varchar(max), @Delimiter char(1))
returns @temptable TABLE (items varchar(max))
as
begin
declare @idx int
declare @split varchar(max)
select @idx = 1
if len(@origString) < 1 or @origString is null return
while @idx != 0
begin
set @idx = charindex(@Delimiter, @origString)
if @idx != 0
set @split = left(@origString, @idx - 1)
else
set @split = @origString
if (len(@split) > 0)
insert into @temptable(items) values(@split)
set @origString = right(@origString, len(@origString) - @idx)
if len(@origString) = 0 break
end
return
end
and then call the function and pass in the string you want to split.
Select * From dbo.Split(@roles, ',')
Here's a thorough discussion of your options:
Arrays and Lists in SQL Server
What I do in this case is use some string replacement to convert the CSV to JSON and then open the JSON like a table. It may not be suitable for every use case, but it is very simple to get running and works with strings and files. With files you just need to watch your line-break character; mostly I find it to be Char(13)+Char(10).
declare @myCSV nvarchar(MAX)= N'"Id";"Duration";"PosX";"PosY"
"•P001";223;-30;35
"•P002";248;-28;35
"•P003";235;-26;35'
--CSV to JSON
--convert to json by replacing some stuff
declare @myJson nvarchar(MAX)= '[['+ replace(@myCSV, Char(13)+Char(10), '],[' ) +']]'
set @myJson = replace(@myJson, ';',',') -- Optional: ensure comma delimiters for json if the current delimiter differs
-- set @myJson = replace(@myJson, ',,',',null,') -- Optional: empty in between
-- set @myJson = replace(@myJson, ',]',',null]') -- Optional: empty before linebreak
SELECT
ROW_NUMBER() OVER (ORDER BY (SELECT 0))-1 AS LineNumber, *
FROM OPENJSON( @myJson )
with (
col0 varchar(255) '$[0]'
,col1 varchar(255) '$[1]'
,col2 varchar(255) '$[2]'
,col3 varchar(255) '$[3]'
,col4 varchar(255) '$[4]'
,col5 varchar(255) '$[5]'
,col6 varchar(255) '$[6]'
,col7 varchar(255) '$[7]'
,col8 varchar(255) '$[8]'
,col9 varchar(255) '$[9]'
--any name column count is possible
) csv
order by (SELECT 0) OFFSET 1 ROWS --hide header row
Using SQL Server's built in XML parsing is also an option. Of course, this glosses over all the nuances of an RFC-4180 compliant CSV.
-- Given a CSV string like this:
declare @roles varchar(800)
select @roles = 'Pub,RegUser,ServiceAdmin'
-- Here's the XML way
select split.csv.value('.', 'varchar(100)') as value
from (
select cast('<x>' + replace(@roles, ',', '</x><x>') + '</x>' as xml) as data
) as csv
cross apply data.nodes('/x') as split(csv)
If you are using SQL 2016+, using string_split is better, but this is a common way to do this prior to SQL 2016.
Using BULK INSERT you can import a CSV file into your SQL table:
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/
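A minimal sketch, assuming the CSV lives in a file that the SQL Server service account can read (the target table dbo.Roles and the file path are made up for illustration):
BULK INSERT dbo.Roles
FROM 'C:\import\roles.csv'
WITH (
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',
FIRSTROW = 1 -- set to 2 if the file has a header row
);
Note that this reads from a file on the server, so it answers a slightly different problem than splitting an in-memory variable.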
Even though the accepted answer works fine, I found this function much faster, even for thousands of records. Create the function below and use it.
IF EXISTS (
SELECT 1
FROM Information_schema.Routines
WHERE Specific_schema = 'dbo'
AND specific_name = 'FN_CSVToStringListTable'
AND Routine_Type = 'FUNCTION'
)
BEGIN
DROP FUNCTION [dbo].[FN_CSVToStringListTable]
END
GO
CREATE FUNCTION [dbo].[FN_CSVToStringListTable] (@InStr VARCHAR(MAX))
RETURNS @TempTab TABLE (Id NVARCHAR(max) NOT NULL)
AS
BEGIN
-- Ensure the input ends with a comma and collapse empty elements (',,')
SET @InStr = REPLACE(@InStr + ',', ',,', ',')
DECLARE @SP INT
DECLARE @VALUE VARCHAR(1000)
WHILE PATINDEX('%,%', @InStr) <> 0
BEGIN
SELECT @SP = PATINDEX('%,%', @InStr)
SELECT @VALUE = LEFT(@InStr, @SP - 1)
SELECT @InStr = STUFF(@InStr, 1, @SP, '')
INSERT INTO @TempTab (Id)
VALUES (@VALUE)
END
RETURN
END
GO
-- Test like this:
declare @v as NVARCHAR(max) = N'asdf,,as34df,234df,fs,,34v,5fghwer,56gfg,';
SELECT Id FROM dbo.FN_CSVToStringListTable(@v)
I was about to use the solution mentioned in the accepted answer, but doing more research led me to use table value types instead.
These are far more efficient, and you don't need a TVF (table-valued function) just to create a table from a CSV. You can use it directly in your scripts or pass it to a stored procedure as a table-valued parameter (see the usage sketch after the type definition). The type can be created as:
CREATE TYPE [UniqueIdentifiers] AS TABLE(
[Id] [varchar](20) NOT NULL
)
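A short usage sketch (dbo.MyProc is a hypothetical procedure name; table-valued parameters must be declared READONLY):
-- Fill a variable of the table type
DECLARE @ids UniqueIdentifiers;
INSERT INTO @ids (Id) VALUES ('Pub'), ('RegUser'), ('ServiceAdmin');
-- Use it directly in a query ...
SELECT Id FROM @ids;
-- ... or pass it to a (hypothetical) stored procedure:
-- CREATE PROCEDURE dbo.MyProc (@ids UniqueIdentifiers READONLY) AS ...
EXEC dbo.MyProc @ids = @ids;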