REPLACE and conversion returning NULL in SQL Server - tsql

I have a scenario in which I have to remove some text from a date string and then convert it to DATETIME, but when I use the method below it results in NULL output.
SELECT CAST(REPLACE('2017-09-21','',NULL) AS DATETIME)
The same output comes back when using CONVERT. Why is this happening?

It's documented:
Returns NULL if any one of the arguments is NULL.
You pass NULL as the last argument.
You could use NULLIF instead of REPLACE:
SELECT CAST(NULLIF('20170921', '') AS DATETIME)
This will return NULL if the string is empty, and the cast DATETIME otherwise.
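The NULL-propagation rule and the NULLIF alternative are easy to see from Python with SQLite, whose REPLACE and NULLIF behave the same way on these inputs. A quick illustration only, not SQL Server itself; note the non-empty search pattern, since behavior for an empty pattern can differ between engines:

```python
import sqlite3

cur = sqlite3.connect(":memory:").cursor()

# Any NULL argument makes REPLACE return NULL, so a surrounding CAST sees NULL.
print(cur.execute("SELECT REPLACE('2017-09-21', '-', NULL)").fetchone()[0])  # None

# NULLIF only yields NULL when the two arguments are equal.
print(cur.execute("SELECT NULLIF('20170921', '')").fetchone()[0])  # '20170921'
print(cur.execute("SELECT NULLIF('', '')").fetchone()[0])          # None
```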

You did specify NULL. In SQL (all products) NULL means UNKNOWN. Applying any function to an unknown value results in an unknown result, hence NULL. The exceptions are the functions specifically meant to deal with NULLs.
It's hard to understand how your expression can be fixed though, since you try to replace an empty string with NULL. What is the empty string supposed to match?

Postgres JSONB values are all strings

Somehow populating a database with a JSONB column ended up with every value in the column being a JSONB string instead of an object.
=> select specifications from checklist_item;
specifications
---------------------
"{}"
"{}"
"{\"x\": 2, \"y\": \"z\"}"
Is it possible to update, in a single statement, each of these values to JSONB objects as opposed to strings?
I tried to_jsonb(specifications) but that did not parse as expected. I've gone over the documentation, but all the examples seem to show ways to manipulate data that is already a jsonb array or a jsonb object, not a plain string.
I can write a script and do the parsing in Python, but there must surely be a way to do this in a single UPDATE command with a JSON function that I simply cannot find at the moment. Is there such a JSON function or operator that will "parse" my bad data?
to_jsonb(specifications) does to_jsonb(specifications::text), which just re-encodes the stored JSON text (the string literal) as a JSON string again. What you need is to extract the value of the JSON string literal, then cast that to jsonb:
UPDATE checklist_item
SET specifications = (specifications #>> '{}')::jsonb
-- or … = to_jsonb(specifications #>> '{}')
WHERE jsonb_typeof(specifications) = 'string';
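The double decoding involved can be illustrated outside Postgres with plain Python's json module: the column stores a JSON string literal whose contents are themselves JSON text, so one decode recovers the inner text (roughly what #>> '{}' does) and a second decode yields the object (roughly what the ::jsonb cast does). An analogy only, not the server-side code:

```python
import json

# What the broken column holds: a JSON string literal containing JSON text.
stored = '"{\\"x\\": 2, \\"y\\": \\"z\\"}"'

inner_text = json.loads(stored)   # analogous to: specifications #>> '{}'
print(inner_text)                 # {"x": 2, "y": "z"}  (still a str)

obj = json.loads(inner_text)      # analogous to: (...)::jsonb
print(obj["x"])                   # 2
```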

Optional where condition with List input variable

I'm trying to ignore a condition when the input is null. There are already a lot of threads on Stack Overflow dealing with this situation, but I've not been able to find one with a List input variable.
The mentioned solution does not work, as IN null raises an error. That's why I added a COALESCE:
:toto type : List<Integer>
SELECT *
FROM test_table
WHERE (:toto is null OR year IN (COALESCE(:toto, (1, 2))))
The problem is that COALESCE itself returns an error:
COALESCE types bytea and record cannot be matched
The strangest thing is that this query works with a null input if I run the raw query directly against the database. I suspect JPA does not pass a real null value for :toto.
Any solution that keeps the index usable and provides the correct behavior would be OK.
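For comparison, the usual way around this is to add the IN predicate to the query only when the list is actually non-empty (in JPA, typically via the Criteria API or a Specification) rather than encoding the null check in SQL. A minimal sketch of that shape with plain parameterized SQL and SQLite; the table and column names come from the question, everything else is illustrative:

```python
import sqlite3

def find_rows(conn, years=None):
    """Build the IN predicate only when a non-empty list is supplied,
    mirroring 'ignore the condition when the input is null'."""
    sql = "SELECT year FROM test_table"
    params = []
    if years:  # None or [] -> no filter at all
        placeholders = ",".join("?" * len(years))
        sql += f" WHERE year IN ({placeholders})"
        params = years
    return [r[0] for r in conn.execute(sql, params)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test_table (year INTEGER)")
conn.executemany("INSERT INTO test_table VALUES (?)", [(2020,), (2021,), (2022,)])

print(find_rows(conn))          # no filter: [2020, 2021, 2022]
print(find_rows(conn, [2021]))  # filtered:  [2021]
```

Because the predicate is absent rather than always-true, the planner can still use an index on year when the filter is present.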

How to prevent Entity Framework from converting empty strings to null in database-first approach

I have to insert empty strings into a non-nullable varchar field in an Oracle DB.
The property of the object I'm trying to save is set to an empty string, but when I call SaveChanges I get an error because EF converts my empty string to null.
I know that, in the code-first approach, you can use ConvertEmptyStringToNull=false: is there a way to achieve the same behavior with the database-first approach?
It appears that in Oracle (at least for now) the empty string is treated as null.
Therefore there is no way to save an empty string in a varchar field.
Note: Oracle Database currently treats a character value with a length of zero as null. However, this may not continue to be true in future releases, and Oracle recommends that you do not treat empty strings the same as nulls.
Source

How to capture rows that are not cast by a pyspark function?

I have a function which converts the datatype of a dataframe to a specified schema in PySpark. The cast function silently makes the entry null if it is not able to convert it to the respective datatype.
e.g. F.col(col_name).cast(IntegerType()) will typecast to Integer, and if the column value is a Long it will make that null.
Is there any way to capture the cases where it converts to null? In a data pipeline that runs daily, if those are not captured it will silently make them null and pass them on to downstream systems.
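A common way to capture such rows is to keep the source column next to the cast result and flag rows where the source was non-null but the cast came back null. Here is that comparison sketched in plain Python rather than on a Spark DataFrame; the column names are made up for illustration:

```python
def safe_int(value):
    """Mimic a cast-to-Integer that returns None instead of raising."""
    try:
        return int(value)
    except (TypeError, ValueError):
        return None

rows = [{"id": 1, "amount": "42"},
        {"id": 2, "amount": "oops"},  # will fail the cast
        {"id": 3, "amount": None}]    # already null, not a cast failure

cast_rows = [dict(r, amount_int=safe_int(r["amount"])) for r in rows]

# A cast *failure* is: source non-null, result null.
failures = [r for r in cast_rows
            if r["amount"] is not None and r["amount_int"] is None]
print([r["id"] for r in failures])  # [2]
```

In Spark the equivalent filter would be F.col("amount").isNotNull() & F.col("amount_int").isNull() on a DataFrame that carries both the original and the cast column.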

Null value in Database

Null value means
No value
Inapplicable, unassigned, unknown, or unavailable
Which is true?
It's all about the context in which it's used. A null means there is no value but the reason for this will depend on the domain in which it is being used. In many cases the items you've listed are all valid uses of a null.
It can mean any of those things (and it is not always obvious which), which is one argument against using nulls at all.
See: http://en.wikipedia.org/wiki/Null_(SQL)#Controversy
From Wikipedia:
Null is a special marker used in Structured Query Language (SQL) to indicate that a data value does not exist in the database. Introduced by the creator of the relational database model, E. F. Codd, SQL Null serves to fulfill the requirement that all true relational database management systems (RDBMS) support a representation of "missing information and inapplicable information". Codd also introduced the use of the lowercase Greek omega (ω) symbol to represent Null in database theory. NULL is also an SQL reserved keyword used to identify the Null special marker.
Obviously you have the DB definition of what null means; however, to an application it can mean anything. I once worked on a strange application (disclaimer: I didn't design it) that used null in a junction table to represent all of the options (allegedly it was designed this way to "save space"). This was a DB design for user and role management.
So null in this case meant the user was in all roles. That's one for The Daily WTF. :-)
Like many people I tend to avoid using nulls where realistically possible.
null indicates that a data value does not exist in the database, thus representing missing information.
It also allows for three-valued logic: true, false, and unknown.
The only answer supported by SQL semantics is "unknown." If it meant "no value," then
'Hi there' = NULL
would return FALSE, but it returns NULL. This is because the NULL value in the expression means an unknown value, and the unknown value could very well be 'Hi there' as far as the system knows.
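This three-valued behavior is easy to verify from Python with SQLite (the semantics shown here are standard SQL, not engine-specific):

```python
import sqlite3

cur = sqlite3.connect(":memory:").cursor()

# A comparison with NULL is neither true nor false: it is NULL (unknown).
print(cur.execute("SELECT 'Hi there' = NULL").fetchone()[0])  # None, not 0

# Which is why WHERE col = NULL matches nothing and IS NULL must be used.
cur.execute("CREATE TABLE t (v TEXT)")
cur.execute("INSERT INTO t VALUES (NULL), ('x')")
print(cur.execute("SELECT COUNT(*) FROM t WHERE v = NULL").fetchone()[0])   # 0
print(cur.execute("SELECT COUNT(*) FROM t WHERE v IS NULL").fetchone()[0])  # 1
```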
NULL is a representation that a field has not had a value set, or has been re-set to NULL.
It is not unknown or unavailable.
Note, that when looking for NULL values, do not use '=' in a where clause, use 'is', e.g.:
select * from User where username is NULL;
Not:
select * from User where username = NULL;
NULL, in the relational model, means Unknown. It's a mark that appears instead of a value wherever a value can appear in SQL.
Null means nothing, unknown, and no value.
It does not mean unavailable or inapplicable.
Null is a testable state of a column in a row, but it has no value itself.
By example:
An int can be only ..., 0, 1, 2, 3, ... and also NULL.
A datetime can be only a valid date... and also NULL.
A bit can be only 0 or 1... and also NULL.
A varchar can be a string... and also NULL.
See the pattern?
You can make a column NOT NULL-able so that you can force a column to take a value.
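A small SQLite session (driven from Python) shows both sides of this: every column type also admits NULL, until a NOT NULL constraint forbids it. A minimal sketch, not tied to any particular RDBMS:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a INTEGER, b INTEGER NOT NULL)")

# a is nullable, so storing NULL there is fine.
conn.execute("INSERT INTO t VALUES (NULL, 1)")

# b is NOT NULL, so storing NULL there is rejected by the engine.
try:
    conn.execute("INSERT INTO t VALUES (1, NULL)")
except sqlite3.IntegrityError as e:
    print(e)  # NOT NULL constraint failed: t.b
```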
The NULL SQL keyword is used to represent either a missing value or a value that is not applicable in a relational table
All of them :-)
if you want to add a semantic meaning to your field, add an ENUM
create TABLE myTable
(
myfield varchar(50),
myfieldType enum ('OK','NoValue','InApplicable','Unassigned','Unknown','Unavailable') NOT NULL
)