I am trying to insert a decimal value into a numeric column, but I keep getting an error. Below is my statement:
INSERT INTO blse VALUES (2082.7, 'Total Nonfarm' ,'Alabama','01/31/2020');
It basically says I need to cast this statement. I do not know what I am doing wrong; I am a beginner taking this class.
and this is the error:
It is highly recommended that you specify the columns in an INSERT statement:
INSERT INTO tab (col1, col2, col3)
VALUES (val1, val2, val3);
That way, you can be certain what value is inserted where.
Since you didn't do that, the first value in the VALUES clause gets inserted into the first table column, which is of type date. That causes the error you observe.
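A fixed version of the statement from the question could look like the sketch below. The column names here are assumptions, since the table definition isn't shown; match them to the actual definition of blse:

```sql
-- Column names are assumed; adjust to the real definition of blse.
INSERT INTO blse (employment, industry, state, report_date)
VALUES (2082.7, 'Total Nonfarm', 'Alabama', DATE '2020-01-31');
```

With the columns listed explicitly, each value is matched to a column by name rather than by table position, so the date literal can no longer collide with the numeric column.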
Related
I am inserting a NULL date with an INSERT ... SELECT FROM statement in SQL:
CREATE TABLE null_date (
id bigserial PRIMARY KEY
, some_date date
);
WITH date_data (some_date) AS (
VALUES (null)
)
INSERT INTO null_date (some_date)
SELECT some_date
FROM date_data;
and it fails with
ERROR: column "some_date" is of type date but expression is of type text
LINE 5: SELECT some_date
^
HINT: You will need to rewrite or cast the expression.
However, if I try to insert it directly, it works:
INSERT INTO null_date (some_date)
VALUES (null)
Can somebody please help me understand what's happening here? Here is the link to db<>fiddle. Thanks
The problem is that the VALUES statement, and consequently the WITH clause, treats the NULL value as type text, because PostgreSQL doesn't know which data type the NULL should be. You don't have that problem with INSERT INTO ... VALUES (...), because there PostgreSQL knows right away that the NULL value of unknown type will be inserted into a certain column, so it resolves it to the target data type.
In cases where PostgreSQL cannot guess the data type from context, it is best to use an explicit type cast:
WITH date_data (some_date) AS (
VALUES (CAST(null AS date))
)
INSERT INTO null_date (some_date)
SELECT some_date
FROM date_data;
PostgreSQL used to behave differently in cases like this, but commit 1e7c4bb0049 changed that in 2017. Read the commit message for an explanation.
I have to produce a dynamically generated T-SQL script that inserts records into various tables. I've done a bunch of searching and testing but can't seem to find the path I'm looking for.
I know that the following is valid SQL:
INSERT INTO [MyTable] ( [Col1], [Col2], [Col3] )
SELECT N'Val1', N'Val2', N'Val3';
But, is it at all possible to write something akin to this:
INSERT INTO [MyTable]
SELECT [Col1] = N'Val1', [Col2] = N'Val2', [Col3] = N'Val3';
By having the columns in the SELECT statement, I can do it all at once instead of writing two separate lines. Obviously my idea doesn't work; I'm trying to figure out whether something similar is possible or whether I need to stick with the first form.
Much appreciated.
Best practice for INSERT statements is to specify the column list in the INSERT clause, and for very good reasons:
It's far more readable. You know exactly what value goes into what column.
You don't have to provide values for nullable / default-valued columns.
You're not bound to the order of the columns in the table.
In case a column is added to the table, your insert statement might not break (it will if the newly added column is not nullable and has no default value).
In some cases, SQL Server demands that you specify the column list explicitly, such as when IDENTITY_INSERT is set to ON.
And in any case, the column names or aliases in the SELECT clause of an INSERT ... SELECT statement have no effect on which target column each value goes to. Values are matched to target columns based only on their position in the statement.
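To illustrate that last point, here is a sketch (table and values invented for the example) where the aliases in the SELECT list deliberately contradict the column order; SQL Server still assigns the values purely by position, ignoring the aliases:

```sql
CREATE TABLE MyTable ( [Col1] nvarchar(10), [Col2] nvarchar(10), [Col3] nvarchar(10) );

-- The aliases suggest a remapping, but they are ignored:
INSERT INTO MyTable ( [Col1], [Col2], [Col3] )
SELECT [Col3] = N'Val1', [Col1] = N'Val2', [Col2] = N'Val3';

-- Result row: Col1 = 'Val1', Col2 = 'Val2', Col3 = 'Val3' (positional, not by alias).
```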
I am trying to do an UPSERT in DB2 9.7 without creating a temporary table to merge. I am specifying values as parameters; however, I always get a syntax error for the comma separating the rows when I try to include more than one row of values.
MERGE INTO table_name AS tab
USING (VALUES
(?,?),
(?,?)
) AS merge (COL1, COL2)
ON tab.COL1 = merge.COL1
WHEN MATCHED THEN
UPDATE SET tab.COL1 = merge.COL1,
tab.COL2 = merge.COL2
WHEN NOT MATCHED THEN
INSERT (COL1, COL2)
VALUES (merge.COL1, merge.COL2)
I have also tried teknopaul's answer from Does DB2 have an “insert or update” statement, but have received another syntax error complaining about the use of SELECT.
Does anybody know how to correctly include a table with values in my merge, without actually creating/dropping one on the database?
I believe you need something like USING (SELECT * FROM VALUES ( ...) ) AS ...
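Spelled out, that suggestion would look something like the following untested sketch for DB2 9.7. The derived-table column list moves inside the USING clause, and the self-assignment of COL1 in the UPDATE branch (redundant, since the rows are matched on COL1) is dropped:

```sql
MERGE INTO table_name AS tab
USING (
    SELECT * FROM (VALUES
        (?, ?),
        (?, ?)
    ) AS v (COL1, COL2)
) AS merge
ON tab.COL1 = merge.COL1
WHEN MATCHED THEN
    UPDATE SET tab.COL2 = merge.COL2
WHEN NOT MATCHED THEN
    INSERT (COL1, COL2)
    VALUES (merge.COL1, merge.COL2)
```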
I have a table, and I wrote a plpgsql function that inserts a row into it:
INSERT INTO simpleTalbe (name,money) values('momo',1000) ;
This table has a serial column called id. After the insert, I want the function to know the id that the new row received.
I thought to use:
select nextval('serial');
before the insert. Is there a better solution?
Use the RETURNING clause. You need to save the result somewhere inside PL/pgSQL, with an appended INTO ...:
INSERT INTO simpleTalbe (name,money) values('momo',1000)
RETURNING id
INTO _my_id_variable;
_my_id_variable must have been declared with a matching data type.
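Inside a function, that could look like this minimal sketch (the function name, parameter names, and variable name are invented; the table and columns are from the question):

```sql
CREATE OR REPLACE FUNCTION insert_row(_name text, _money integer)
  RETURNS integer
  LANGUAGE plpgsql AS
$func$
DECLARE
   _my_id_variable integer;  -- must match the type of simpleTalbe.id
BEGIN
   INSERT INTO simpleTalbe (name, money)
   VALUES (_name, _money)
   RETURNING id
   INTO _my_id_variable;

   RETURN _my_id_variable;  -- the id generated for the new row
END
$func$;
```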
Related:
PostgreSQL next value of the sequences?
Depending on what you plan to do with it, there is often a better solution with pure SQL. Examples:
Combining INSERT statements in a data-modifying CTE with a CASE expression
PostgreSQL multi INSERT...RETURNING with multiple columns
select nextval('serial'); would not do what you want; nextval() actually increments the sequence, and then the INSERT would increment it again. (Also, 'serial' is not the name of the sequence your serial column uses.)
@Erwin's answer (INSERT ... RETURNING) is the best answer, as that syntax was introduced specifically for this situation, but you could also run
SELECT currval('simpletalbe_id_seq') INTO ...
any time after your INSERT to retrieve the current value of the sequence. (Note the sequence name format tablename_columnname_seq for the automatically-defined sequence backing the serial column.)
I have a table named "temp_table" with a column named "temp_column" of type varchar. The problem is that "temp_column" must be of type integer. If I simply convert the column to type integer, it will generate an error, since some rows contain non-numeric data.
I want a query that shows all rows where "temp_column" has non-numeric values (or the other way around) so I can UPDATE or SET the values accordingly. I'm having a hard time since ISNUMERIC is not available in PostgreSQL.
How can I do this?
This will show all rows where you have non-integer values in that column. It uses a regular expression to find all values that contain anything other than digits:
select *
from temp_table
where temp_column ~ '[^0-9]';
The same condition can also be used in an UPDATE statement:
update temp_table
set temp_column = null
where temp_column ~ '[^0-9]';
Note that this will also catch "numeric" values like 3.14, as those aren't integers.
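Once the offending rows have been cleaned up (or set to NULL as above), the column itself can be converted in place. A sketch of the PostgreSQL syntax:

```sql
-- Convert the column after all non-numeric values have been removed;
-- the USING clause tells PostgreSQL how to cast the existing data.
ALTER TABLE temp_table
  ALTER COLUMN temp_column TYPE integer
  USING temp_column::integer;
```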