Select data with single quote only - PostgreSQL

I'm using LIKE to select all rows that have a single quote in the name, like Jon's.
select * from users where file_name like '%'%';
I then want to remove the ' from all results.
Ideas?

Double the quotes in SQL to escape them:
select * from users where file_name like '%''%';
(For any vaguely recent PostgreSQL version; the non-standard escape-string phrasing E'%\'%' will work with even very old PostgreSQL versions, but not other databases.)
It sounds like you want to remove those characters from the file names. If so, something like this (untested):
update users
set file_name = replace(file_name, '''', '');
should do the trick.
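To see both pieces together, here is a minimal sketch on a throwaway table (the table and file names are illustrative):
-- set up sample data
create table demo_users (file_name text);
insert into demo_users values ('Jon''s report.txt'), ('plain.txt');
-- find rows containing a single quote (the quote is doubled inside the literal)
select * from demo_users where file_name like '%''%';
-- strip the quotes, then verify nothing matches anymore
update demo_users set file_name = replace(file_name, '''', '');
select * from demo_users where file_name like '%''%';  -- returns no rows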

Related

ADF dynamic content using concat - need to embed commas inside of string for long list of columns

The use case seems pretty simple: produce a SQL statement as part of a copy activity that includes a hard-coded column listing, concatenated with a parameter-provided database and table name (since the database and table names can change across environments such as dev/test/prod).
The problem is that if you use the concat function, it treats every comma as a new value to be concatenated. I was hoping for a way to escape the comma and treat it as part of the value, but nothing I've tried works.
For example, concatenate the following string...
SELECT event_date, event_timestamp,
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
FROM
to: pipeline().parameters.Database_Nm + '.' + pipeline().parameters.Table_Nm
The workaround has been to quote the beginning and end of every line so the comma is treated as data, making every column/line a separate concatenation, like this:
@concat('SELECT event_date,',
'(SELECT value.string_value FROM UNNEST(event_params) WHERE key = ''page_title'') AS page_title,',
'from ', pipeline().parameters.Database_Nm, '.', pipeline().parameters.Table_Nm)
That works, but I have over a hundred columns, so this is just a bit silly as a solution. Am I missing a simpler method? TIA!
When most of your string is hard-coded rather than expressions, you can use the following string interpolation expression format:
SELECT event_date,
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
from @{pipeline().parameters.Database_Nm}.@{pipeline().parameters.Table_Nm}
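Everything outside the @{...} markers is treated as literal text, so the commas and embedded single quotes need no escaping. As an illustration with made-up parameter values (Database_Nm = analytics, Table_Nm = events), the interpolated content would evaluate to:
SELECT event_date,
(SELECT value.string_value FROM UNNEST(event_params) WHERE key = 'page_title') AS page_title
from analytics.events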

PostgreSQL 11.16 equal comparison on varchar with two space characters not working anymore

We're experiencing weird behavior in our SELECT statements since we updated from PostgreSQL 11.12 to 11.16.
We are selecting rows using a WHERE condition on a pretty simple varchar column. The value we're looking for in the condition contains two consecutive space characters, something like this: WORD1  WORD2.
Our query for finding the necessary data looks like this:
SELECT * FROM table WHERE name = 'WORD1  WORD2';
While this query used to work fine (and still does on our older test systems), it no longer finds the given entry in our production environment. What does seem to work, though, is really wild:
-- LIKE with % after the words: working
SELECT * FROM table WHERE name LIKE 'WORD1  WORD2%';
-- LIKE with % before the words: working
SELECT * FROM table WHERE name LIKE '%WORD1  WORD2';
-- ILIKE without %: working
SELECT * FROM table WHERE name ILIKE 'WORD1  WORD2';
-- IS NOT DISTINCT FROM: working
SELECT * FROM table WHERE name IS NOT DISTINCT FROM 'WORD1  WORD2';
-- standard equals (=): NOT working
SELECT * FROM table WHERE name = 'WORD1  WORD2';
We double-checked for whitespace characters before and after the visible string, and even re-entered the value as plain text to make sure nothing strange and invisible was hiding in it. Nothing worked. We also checked other entries in the same table with only a single space between words, like WORD1 WORD2, which still work, so from our perspective it seems to have something to do with equals and two consecutive spaces in the varchar.
We are accessing the database through DbVisualizer, through a Java 8 driver, and through the psql shell, all with the same result: equals does not find the required entry.
Any help is greatly appreciated; we're a little out of ideas right now.
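One way to double-check the stored value at the byte level, which rules out invisible characters more reliably than visual inspection (my_table stands in for the question's placeholder table name):
-- show the exact bytes stored for the suspicious rows
SELECT name,
       octet_length(name) AS byte_len,
       encode(convert_to(name, 'UTF8'), 'hex') AS hex_bytes
FROM my_table
WHERE name LIKE '%WORD1%';
Two consecutive spaces should appear as 2020 in the hex output; anything else (for example c2a0, a non-breaking space) would explain a failed equality match.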

Find " ' " in a text Postgres

I want to find out if a varchar contains a ' using the LIKE option.
I think it's something like this:
select *
from table
where field like '%'%'
but a bit different.
Answering about escaping the single quote:
you can use dollar quoting:
select * from table where field like $thing$%'%$thing$;
or an escape-string constant (E before the opening quote):
select * from table where field like E'%\'%';
or two single quotes:
select * from table where field like '%''%';
https://www.postgresql.org/docs/current/static/sql-syntax-lexical.html
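A quick way to convince yourself the three forms are equivalent, using a throwaway table (names are illustrative):
create table quote_demo (field text);
insert into quote_demo values ('Jon''s'), ('no quote here');
-- each of these returns only the row containing the single quote
select * from quote_demo where field like $q$%'%$q$;
select * from quote_demo where field like E'%\'%';
select * from quote_demo where field like '%''%';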

How to escape underscores in PostgreSQL

When searching for underscores in PostgreSQL, literal use of the character _ doesn't work. For example, if you wanted to search all your tables for any columns that end in _by, for something like change log or activity information, e.g. updated_by, reviewed_by, etc., the following query almost works:
SELECT table_name, column_name FROM information_schema.columns
WHERE column_name LIKE '%_by'
The trouble is that LIKE treats the underscore as a wildcard matching any single character, so this returns results as if you'd searched for LIKE '%by'. This may not be a problem in all cases, but it has the potential to be one. How do you search for a literal underscore?
You need to use a backslash to escape the underscore. Change the example query to the following:
SELECT table_name, column_name FROM information_schema.columns
WHERE column_name LIKE '%\_by'
Just ran into the same issue, and the single backslash wasn't working either. (Whether one backslash is enough depends on the standard_conforming_strings setting: it is on by default since PostgreSQL 9.1, but with it off, as in older releases, the string parser consumes backslashes first.) I found this documentation on the PostgreSQL mailing list and it worked:
The correct way is to escape the underscore with a backslash. You
actually have to write two backslashes in your query:
select * from foo where bar like '%\\_baz'
The first backslash quotes the second one for the query parser, so
that what ends up inside the system is %\_baz, and then the LIKE
function knows what to do with that.
Therefore use something like this:
SELECT table_name, column_name FROM information_schema.columns
WHERE column_name LIKE '%\\_by'
Source Documentation: https://www.postgresql.org/message-id/10965.962991238%40sss.pgh.pa.us
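An alternative that sidesteps backslash handling entirely is the standard SQL ESCAPE clause, which lets you pick your own escape character (the ! here is an arbitrary choice):
SELECT table_name, column_name FROM information_schema.columns
WHERE column_name LIKE '%!_by' ESCAPE '!';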

exporting to csv from db2 with no delimiter

I need to export the contents of a DB2 table to a CSV file.
I read that nochardel would prevent the separator from appearing between each value, but that is not what happens.
Suppose I have a table
MY_TABLE
-----------------------
Field_A varchar(10)
Field_B varchar(10)
Field_C varchar(10)
I am using this command
export to myfile.csv of del modified by nochardel select * from MY_TABLE
I get this written into myfile.csv:
data1 ,data2 ,data3
but I would like no ',' separator, like below:
data1 data2 data3
Is there a way to do that?
You're asking how to eliminate the comma (,) in a comma separated values file? :-)
NOCHARDEL tells DB2 not to surround character-fields (CHAR and VARCHAR fields) with a character-field-delimiter (default is the double quote " character).
Anyway, when exporting from DB2 using the delimited format, you have to have some kind of column delimiter. There isn't a NOCOLDEL option for delimited files.
The EXPORT utility can't write fixed-length (positional) records; you would have to do this in one of the following ways:
Writing a program yourself,
Using a separate utility (IBM sells the High Performance Unload utility), or
Writing an SQL statement that concatenates the individual columns into a single string.
Here's an example of the last option (my_table and the column names are placeholders):
export to file.del
of del
modified by nochardel
select
cast(col1 as char(20)) ||
cast(intcol as char(10)) ||
cast(deccol as char(30))
from my_table;
This last option can be a pain since DB2 doesn't have an sprintf() function to help format strings nicely.
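Applied to the MY_TABLE from the question, the same approach would look something like this (untested sketch; each char(10) cast pads its value with spaces to a fixed width):
export to myfile.txt of del modified by nochardel
select cast(Field_A as char(10)) ||
       cast(Field_B as char(10)) ||
       cast(Field_C as char(10))
from MY_TABLE;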
Yes, there is another way of doing this. I always do it like this:
Put the select statement into a file (input.sql):
select
cast(col1 as char(20)),
cast(col2 as char(10)),
cast(col3 as char(30))
from my_table;
Call the db2 CLP like this (-x suppresses column headings, -t treats ; as the statement terminator, -f reads the statement from the file, and -r writes the result to a file):
db2 -x -tf input.sql -r result.txt
This will work for you because you cast the varchar columns to char. As Ian said, casting numbers or other data types to char might bring unexpected results.
PS: I think Ian is right about the difference between CSV and fixed-length format ;-)
Use "of asc" instead of "of del". Then you can specify the fixed column locations instead of delimiting.