T-SQL - How can I get some data from a field or column in a SQL table

I need to extract some data from a column.
For example, I have data in a column like peter@msn.com, and I want to get msn.com.
How can I get that?

Try this:
DECLARE @a VARCHAR(100) = 'peter@msn.com'
SELECT RIGHT(@a, LEN(@a) - CHARINDEX('@', @a))
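Since the question is about a table column rather than a variable, the same expression applied to a column would look roughly like this (the table and column names here are placeholders, not from the question):
-- Hypothetical table/column names; adjust to your schema.
SELECT EmailColumn,
       RIGHT(EmailColumn, LEN(EmailColumn) - CHARINDEX('@', EmailColumn)) AS Domain
FROM YourTable
WHERE CHARINDEX('@', EmailColumn) > 0  -- skip values without an '@'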

Or, if the part before the @ is always the same length (here 'peter@' is six characters):
SELECT SUBSTRING(ColumnName, 7, LEN(ColumnName));

Related

Split the column value and make key as column name in postgres query

I have a table with a column whose values look like this:
data_as_of_date:20210202 unique_cc:3999
data_as_of_date:20220202 unique_cc:1999
I need to convert this column into this:
data_as_of_date  unique_cc
20210202         3999
20220202         1999
Sample data:
create table test (val varchar);
insert into test(val) values ('data_as_of_date:20210202 unique_cc:3999');
insert into test(val) values ('data_as_of_date:20220202 unique_cc:1999');
I have tried unnest with string_to_array and crosstab, but it is not working.
You don't need unnest or a crosstab for this. A simple regular expression should do the trick:
select substring(the_column from 'data_as_of_date:([0-9]{8})') as data_as_of_date,
       substring(the_column from 'unique_cc:([0-9]{4})') as unique_cc
from the_table;
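Adapted to the sample test table from the question (its column is val), the same query would be roughly:
select substring(val from 'data_as_of_date:([0-9]{8})') as data_as_of_date,
       substring(val from 'unique_cc:([0-9]{4})') as unique_cc
from test;
-- Expected output:
-- data_as_of_date | unique_cc
-- 20210202        | 3999
-- 20220202        | 1999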

I have a requirement where I need to dynamically convert column values into column headers in Azure SQL DW

The table structure is something like below (the total number of records goes up to 150).
After transposing, the result set should be like below, where .... represents n number of columns.
Basically, my idea is to create a temp table on the fly and have its column names defined from the select statement, to get the result set shown in the second picture.
The query should be something like:
SELECT * INTO #Cols FROM (select * of above resultset)A WHERE 1=2
Note: please refrain from using FOR XML PATH, as Azure SQL DW currently doesn't support this feature.
I have no way of validating this works; however, from my search-fu, STRING_AGG is available on Azure SQL Data Warehouse. I assume it has access to QUOTENAME, and it does have access to dynamic statements, so you can do something like this:
DECLARE @SQL_Start nvarchar(4000) = N'SELECT ',
        @SQL_Columns nvarchar(4000),
        @SQL_End nvarchar(4000) = N' INTO SomeTable FROM YourTable WHERE 1 = 2;';
SET @SQL_Columns = (SELECT STRING_AGG(QUOTENAME(ColumnName), ',') WITHIN GROUP (ORDER BY ColumnName)
                    FROM (SELECT DISTINCT ColumnName
                          FROM YourTable) YT);
EXEC(@SQL_Start + @SQL_Columns + @SQL_End);
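For illustration, assuming YourTable contained the distinct ColumnName values Col1, Col2 and Col3 (hypothetical names), the string handed to EXEC would be:
-- What the concatenated dynamic statement would look like:
SELECT [Col1],[Col2],[Col3] INTO SomeTable FROM YourTable WHERE 1 = 2;
-- WHERE 1 = 2 copies only the column structure, so SomeTable is created empty.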
But, again, the real solution is to fix your design.

How to query from the result of a changed column of a table in postgresql

So I have a string time column in a table, and now I want to change that column to a timestamp type and then query data for selected dates.
Is there a direct way to do so? One way I could think of is:
1) add a new column
2) insert values into it with the converted date
3) query using the new column
I am stuck at the 2nd step (the INSERT), so I need help with that:
ALTER TABLE "nds"."unacast_sample_august_2018"
ADD COLUMN new_date timestamp
-- Need correction in select statement that I don't understand
INSERT INTO "nds"."unacast_sample_august_2018" (new_date)
(SELECT new_date from_iso8601_date(substr(timestamp,1,10))
Could someone help me with the correction and, if possible, a better way of doing it?
I tried another way to do it in a single step, but it gives the error: column new_date does not exist.
SELECT *
FROM (SELECT from_iso8601_date(substr(timestamp,1,10)) FROM "db_name"."table_name") AS new_date
WHERE new_date > from_iso8601('2018-08-26') limit 10;
AND
SELECT new_date = (SELECT from_iso8601_date(substr(timestamp,1,10)))
FROM "db_name"."table_name"
WHERE new_date > from_iso8601('2018-08-26') limit 10;
Could someone correct these queries?
You don't need those steps; just use a USING CAST clause on your ALTER TABLE:
CREATE TABLE foobar (my_timestamp) AS
VALUES ('2018-09-20 00:00:00');
ALTER TABLE foobar
ALTER COLUMN my_timestamp TYPE timestamp USING CAST(my_timestamp AS TIMESTAMP);
If your string timestamps are in a correct format this should be enough.
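Once the column is a real timestamp, the date filtering the question asks about works directly. A minimal sketch against the foobar example above (the dates are just illustrative):
-- Rows for a selected date, relying on the converted timestamp column:
SELECT *
FROM foobar
WHERE my_timestamp >= TIMESTAMP '2018-08-26'
  AND my_timestamp <  TIMESTAMP '2018-08-27';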
Solved as follows:
select *
from
(
SELECT from_iso8601_date(substr(timestamp,1,10)) as day,*
FROM "db"."table"
)
WHERE day > date_parse('2018-08-26', '%Y-%m-%d')
limit 10

KDB Query for OR operator

What is the KDB equivalent (via the web interface) of this query:
SELECT * FROM TABLE
WHERE (COLA = 'A' AND COLB = 'B') OR (COLA = 'C' AND COLB = 'D')
http://kdbserver:5001/?select from table where _____________________
N.B.: cola and colb are of string datatype.
You can do:
select from table where ((COLA like "string1")&(COLB like "string2"))|((COLA like "string3")&(COLB like "string4"))
select from table where ([]colA;colB) in ([]colA:`A`C;colB:`B`D)
Connor is right and his answer is quite efficient. I just want to add a version using a list operation instead of a table:
tab:([]cola:("aaa";"bbb";"ccc");colb:("ddd";"eee";"fff"))
select from tab where (flip(cola;colb))in\:(("aaa";"ddd");("bbb";"eee"))
Execution speed is almost identical to Connor's.
Sometimes I prefer casting the columns to symbol to get results,
say,
tab1:([]a:string 10?`2;b:string 10?`2; c: string 10?`2)
--
select from tab1 where (((`$a)=`$"ci") & ((`$b)=`$"lf")) or (((`$a)=`$"en") & ((`$b)=`$"dl"))

Best way to search in Postgres by a group of keywords

Right now I have a keyword array like:
['key1', 'key2', 'key3', ...]; the keywords can be numbers or characters.
I want to search my table (a Postgres database) and find all records that contain any of the keywords in that array. How can I do that?
For example:
I have a table with a column called name and a column called description.
I need to find all records where either name or description contains any of the keywords in that array.
Thanks.
Maybe this example will be useful:
CREATE TABLE TEST(
FIELD_KEY TEXT);
INSERT INTO TEST VALUES('this is hello');
INSERT INTO TEST VALUES('hello');
INSERT INTO TEST VALUES('this');
INSERT INTO TEST VALUES('other message');
SELECT *
FROM TEST
WHERE FIELD_KEY LIKE ANY (array['%this%', '%hel%']);
This will return:
this is hello
hello
this
Here is another example:
SELECT *
FROM TEST
WHERE FIELD_KEY ~* 'this|HEL';
~* is case insensitive, ~ is case sensitive
select *
from t
where
  array[name] <@ my_array
  or array[description] <@ my_array
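Note that <@ checks whole-value membership rather than substring matches. A small sketch of the difference, reusing the table t and keywords from the question (both placeholders):
-- Exact membership: matches rows whose name is exactly 'key1', 'key2' or 'key3'
select * from t where array[name] <@ array['key1', 'key2', 'key3'];
-- Substring matching needs LIKE/ILIKE instead, as in the next answer:
select * from t where name ilike any (array['%key1%', '%key2%', '%key3%']);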
Couple the like operator with the any subquery expression:
select *
from t
where name like any (values ('%John%'), ('%Mary%'))
Or the array syntax:
where name like any (array['%John%', '%Mary%'])
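To cover both columns from the question (name and description) with one keyword array, the same pattern can be combined; a sketch with placeholder names:
select *
from t
where name like any (array['%key1%', '%key2%', '%key3%'])
   or description like any (array['%key1%', '%key2%', '%key3%']);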