I want to insert a string that contains double quotes and single quotes as below.
db.collectionName.insert({"filedName" : "my field contains "double quotes" and 'single quotes' how to insert"}) ;
When I try to insert the above, I get an error because my field contains double quotes. Can someone tell me if there is something like an escape sequence to insert the double quotes?
I can't do it as below either, because my field also contains single quotes.
db.collectionName.insert({"filedName" : 'my field contains "double quotes" and 'single quotes' how to insert'}) ;
I think it depends on the context in which you use your code.
If it's pure JS (Node.js, for example), you can escape the quote character with \, like this:
db.collectionName.insert({"filedName" : "my field contains \"double quotes\" and 'single quotes' how to insert"}) ;
But in an HTML context that's not possible; you need to replace the double quote with the proper XML entity representation, &quot;.
Consider using the grave accent/backtick (`, decimal ASCII code 96) to enclose the string.
JSON allows Unicode escapes in the \u1234 format according to ECMA-404, so you can use the following syntax to insert double quotes and single quotes from the MongoDB shell.
db.collectionName.insert({"filedName" : "my field contains \u0022double quotes\u0022 and \u0027single quotes\u0027 how to insert"})
Further, you can use this syntax to insert non-ASCII characters too. This special handling is only required if you are forming the JSON manually. If the JSON is serialized automatically from plain objects (e.g. POJOs) using a framework such as Gson or XStream, the required escaping happens automatically during conversion to JSON.
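Similarly, if you're inserting from application code rather than the shell, the driver serializes the document for you, so no manual escaping is needed at all. A minimal sketch assuming a Python/PyMongo client (the connection URI and database name are placeholders; the collection and field names are taken from the question):

from pymongo import MongoClient

# Hypothetical sketch: URI and database name are placeholders.
client = MongoClient("mongodb://localhost:27017")
collection = client["test"]["collectionName"]

# The Python string is serialized directly to BSON, so the embedded double
# and single quotes need no special handling.
collection.insert_one({
    "filedName": "my field contains \"double quotes\" and 'single quotes' how to insert"
})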
I have a Glue job in which I am reading a table from Salesforce (SF) using SOQL:
df = (
    spark.read.format("com.springml.spark.salesforce")
    .option("soql", sql)
    .option("queryAll", "true")
    .option("sfObject", sf_table)
    .option("bulk", bulk)
    .option("pkChunking", pkChunking)
    .option("version", "51.0")
    .option("timeout", "99999999")
    .option("username", login)
    .option("password", password)
    .load()
)
and whenever there is a combination of double-quotes and commas in the string it messes up my table schema, like so:
in source:

Column A | Column B           | Column C
000AB    | "text with, comma" | 123XX

read from SF in df:

Column A | Column B    | Column C
000AB    | "text with  | comma"
Is there any option to avoid cases where this comma is treated as a delimiter? I tried various options but nothing worked. And SOQL doesn't accept REPLACE or SUBSTRING functions; its text-manipulation functions are, well, basically nonexistent.
All the information I'm giving needs to be tested. I do not have the same environment, so it is difficult for me to try anything, but here is what I found.
When you check the official docs, you find that there is a field called metadataConfig. The documentation for this field can be found here: https://resources.docs.salesforce.com/sfdc/pdf/bi_dev_guide_ext_data_format.pdf
On page 2, under the CSV format, it says:
If a field value contains a control character or a new line the field value must be contained within double quotes (or your fieldsEscapedBy value). The default control characters (fieldsDelimitedBy, fieldsEnclosedBy, fieldsEscapedBy, or linesTerminatedBy) are comma and double quote. For example, "Director of Operations, Western Region".
which kinda sounds like your current problem.
By default, the values are comma and double quote, so I do not understand why it is failing. But apparently your output keeps the double quotes, so maybe it only considers single quotes.
You should try to enforce the format and add this to your code:
.option("metadataConfig", '{"fieldsEnclosedBy": "\"", "fieldsDelimitedBy": ","}')
# Or something similar - I couldn't test it, so you need to try it yourself
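To avoid hand-escaping the quote inside that JSON string, one option (untested, same caveat as above) is to build the metadataConfig value with json.dumps and keep the rest of the reader from the question unchanged:

import json

# Untested sketch: the key names come from the Salesforce external data format
# guide linked above; json.dumps handles the embedded double quote.
metadata_config = json.dumps({"fieldsDelimitedBy": ",", "fieldsEnclosedBy": '"'})

df = (
    spark.read.format("com.springml.spark.salesforce")
    .option("soql", sql)
    # ... the other .option(...) calls from the question ...
    .option("metadataConfig", metadata_config)
    .load()
)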
I was reading a fixed-width file from Hadoop, taking substrings, and converting it to a delimited file. The code works fine, but instead of empty values in the case of null it is returning \"\". Could you please suggest a fix?
snippet
df.select(
    df.value.substr(31, 1).alias('status'),
    df.value.substr(32, 1).alias('tin_cert'),
    df.value.substr(116, 1).alias('c_notice_flg'),
    df.value.substr(117, 2).alias('nbr_non_prime_trlrs'),
    df.value.substr(119, 3).alias('aw_related')
).write.option("delimiter", "|").csv(unixFile)
output
|\"\"|0|N|00|\"\"|199|
desired output
||0|N|00||199|
no quotes in the input file
000000000014999999999 281AAAA AAAAAAA AAAA 1NN00
000000000024 200BBBBB BBBBBBBBBBBBBBBBB 0NN00
000000000034 200 0NN00
000000000044 200 0NN00
I think the escaped quotes are added because of the default arguments of the pyspark.sql.DataFrameWriter.csv method. In fact, as you can see from the docs:
quote – sets a single character used for escaping quoted values where the separator can be part of the value. If None is set, it uses the default value, ". If an empty string is set, it uses u0000 (null character).
escape – sets a single character used for escaping quotes inside an already quoted value. If None is set, it uses the default value, \
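Since the CSV writer emits NULLs as truly empty fields by default, one workaround is to turn blank slices into NULLs before writing. This is a sketch under the assumption that the blank fields really are empty or whitespace-only substrings (on Spark 2.4+ the emptyValue write option is another knob worth experimenting with):

from pyspark.sql import functions as F

def blank_as_null(col):
    # Keep the value only when it is not blank; otherwise the column becomes NULL,
    # which the CSV writer emits as an empty field instead of \"\".
    return F.when(F.trim(col) != "", col)

(
    df.select(
        blank_as_null(df.value.substr(31, 1)).alias('status'),
        blank_as_null(df.value.substr(32, 1)).alias('tin_cert'),
        blank_as_null(df.value.substr(116, 1)).alias('c_notice_flg'),
        blank_as_null(df.value.substr(117, 2)).alias('nbr_non_prime_trlrs'),
        blank_as_null(df.value.substr(119, 3)).alias('aw_related'),
    )
    .write.option("delimiter", "|").csv(unixFile)
)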
I am going to write a stored procedure in PostgreSQL that accepts a variable (my knowledge of SQL is close to zero, so I apologize if the question is obvious). Since this variable will be used verbatim in the call, I want to ensure that it is properly escaped to avoid injection.
Is there a function I can wrap the variable in which would properly do the escaping?
I specifically would like to do that in SQL, as opposed to sanitizing the input (that variable) in the code which calls the SQL query (which would arguably have been easier).
I am surprised not to find any prominent documentation about such functionality, which leads me to believe that this is not standard practice. The closest I could get was the lexer source code of PostgreSQL, but it is beyond my capacity to tell whether that is the right escaping being referred to (and which would lead to the string being used as u&'stringuescape''', which looks quite barbaric).
There are several quoting functions in PostgreSQL, documented at https://www.postgresql.org/docs/current/functions-string.html
quote_ident(string text) → text
Return the given string suitably quoted to be used as an identifier in an SQL statement string. Quotes are added only if necessary (i.e., if the string contains non-identifier characters or would be case-folded). Embedded quotes are properly doubled. See also Example 40-1.
Example: quote_ident('Foo bar') → "Foo bar"

quote_literal(string text) → text
Return the given string suitably quoted to be used as a string literal in an SQL statement string. Embedded single-quotes and backslashes are properly doubled. Note that quote_literal returns null on null input; if the argument might be null, quote_nullable is often more suitable. See also Example 40-1.
Example: quote_literal(E'O\'Reilly') → 'O''Reilly'

quote_literal(value anyelement) → text
Coerce the given value to text and then quote it as a literal. Embedded single-quotes and backslashes are properly doubled.
Example: quote_literal(42.5) → '42.5'

quote_nullable(string text) → text
Return the given string suitably quoted to be used as a string literal in an SQL statement string; or, if the argument is null, return NULL. Embedded single-quotes and backslashes are properly doubled. See also Example 40-1.
Example: quote_nullable(NULL) → NULL

quote_nullable(value anyelement) → text
Coerce the given value to text and then quote it as a literal; or, if the argument is null, return NULL. Embedded single-quotes and backslashes are properly doubled.
Example: quote_nullable(42.5) → '42.5'
But if you're designing procedures that prepare SQL from a string, you should use query parameters instead.
PREPARE fooplan (int, text, bool, numeric) AS
INSERT INTO foo VALUES($1, $2, $3, $4);
EXECUTE fooplan(1, 'Hunter Valley', 't', 200.00);
Read more in https://www.postgresql.org/docs/current/sql-prepare.html
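If what varies is an identifier rather than a value, the client side has an analogue of quote_ident too. A hedged sketch assuming a Python/psycopg2 caller (the connection string, table, and column names are placeholders, not from the question): identifiers go through psycopg2.sql.Identifier, values go through bound parameters.

import psycopg2
from psycopg2 import sql

# Hypothetical sketch: connection string, table, and column names are placeholders.
conn = psycopg2.connect("dbname=mydb user=myuser")
with conn, conn.cursor() as cur:
    table = "foo"       # identifier: safely quoted by sql.Identifier
    value = "O'Reilly"  # value: sent as a bound parameter, no escaping needed
    query = sql.SQL("INSERT INTO {} (name) VALUES (%s)").format(sql.Identifier(table))
    cur.execute(query, (value,))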
I asked a similar question here about an hstore value with a space, and it was solved by user Clodoaldo Neto. Now I have come across the next case: a string containing a single quote.
SELECT 'k=>"name", v=>"St. Xavier's Academy"'::hstore;
I tried using a dollar-quoted string constant after reading http://www.postgresql.org/docs/current/static/sql-syntax-lexical.html#SQL-SYNTAX-CONSTANTS
SELECT 'k=>"name", v=>$$St. Xavier's Academy$$'::hstore;
But I couldn't get it right.
How do I make a PostgreSQL hstore from strings containing a single quote?
It seems like more such exceptions are possible for this query. How do I address them all at once?
You can escape the embedded single quote the same way you'd escape any other single quote inside a string literal: double it.
SELECT 'k=>"name", v=>"St. Xavier''s Academy"'::hstore;
-- ------------------------------^^
Alternatively, you could dollar quote the whole string:
SELECT $$k=>"name", v=>"St. Xavier's Academy"$$::hstore;
Whatever interface you're using to talk to PostgreSQL should be taking care of these quoting and escaping issues. If you're using manual string wrangling to build your SQL then you should be using your driver's quoting and placeholder methods.
hstore's internal parsing understands double quotes around keys and values:
Double-quote keys and values that include whitespace, commas, =s or >s.
Dollar quoting is, as you noted, for SQL string literals; hstore's parser doesn't know what dollar quotes mean.
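To illustrate the "let your driver handle it" point, a minimal sketch assuming a Python/psycopg2 client (the connection string is a placeholder and the hstore extension is assumed to be installed): the single quote travels inside a bound parameter, so neither doubling nor dollar quoting appears in the SQL text.

import psycopg2

# Hypothetical sketch: the connection string is a placeholder, and the hstore
# extension must already be installed in the target database.
conn = psycopg2.connect("dbname=mydb user=myuser")
with conn, conn.cursor() as cur:
    cur.execute("SELECT %s::hstore", ('k=>"name", v=>"St. Xavier\'s Academy"',))
    print(cur.fetchone()[0])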
I need to update a record, which contains literal percent signs, using PostgreSQL in Railo. The query looks like:
<cfquery>
update foo set bar = 'string with % in it %'
</cfquery>
It throws an error because ColdFusion normally interprets the percent sign as a wildcard character. I can escape it using the following query:
<cfquery>
update foo set bar = 'string with escaped \% in it \%'
</cfquery>
However, the record now contains "\%" in the database and will be displayed on the page as "\%".
I found documentation with an example of escaping a percent sign in a SELECT, but it does not work for me: syntax error at or near "ESCAPE".
SELECT emp_discount
FROM Benefits
WHERE emp_discount LIKE '10\%'
ESCAPE '\';
Is there a better way to achieve the same goal? The underlying database is PostgreSQL. Thanks!
Query parameters escape special characters. Yet another reason to use them.
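In Railo/CFML the usual tool for this is cfqueryparam. Purely for illustration, here is the same idea sketched with Python and psycopg2 (connection string and table are placeholders): the percent signs live inside the bound value, not in the SQL text, so nothing needs escaping and no LIKE/ESCAPE clause is involved.

import psycopg2

# Hypothetical sketch: connection string and table are placeholders.
# The literal % signs are part of the bound parameter, not the SQL text.
conn = psycopg2.connect("dbname=mydb user=myuser")
with conn, conn.cursor() as cur:
    cur.execute("UPDATE foo SET bar = %s", ("string with % in it %",))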