Can't enter date into postgres field with datatype reltime - postgresql

I'm trying to run this insert on PostgreSQL 8.4.13:
insert into my_table (id, hour_memo) values (1,'17:30:00.000000 +01:00:00');
The hour_memo column has the reltime datatype.
When I execute the insert I get this error:
ERROR: invalid input syntax for type reltime: "17:30:00.000000 +01:00:00"
I have absolutely no idea how to fix this.

The answer is that reltime doesn't support time zones, so the "+01..." part is what's breaking it. Still, using the reltime type is a bad idea; it should be replaced by a normal type such as interval.
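For example, a minimal sketch of the interval-based alternative (assuming my_table can be recreated; the time-zone offset carries no meaning for a duration, so it is simply dropped):
create table my_table (id integer, hour_memo interval);
insert into my_table (id, hour_memo) values (1, interval '17:30:00');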

Related

Snowflake: Unsupported subquery type cannot be evaluated

I am using Snowflake as a data warehouse. I have a CSV file in AWS S3, and I am writing a merge SQL statement to merge the data received in the CSV into a table in Snowflake. I have a column in the time dimension table with data type NUMBER(38,0). This table holds all times of day; one example row has
time_id = 232 and time = 12:00
In the CSV I get a column labeled time with a value such as 12:00.
In the merge SQL I am fetching this value and trying to look up the time_id for it.
update table_name set start_time_dim_id = (select time_id from time_dim t where t.time_name = csv_data.start_time_dim_id)
On this statement I am getting the error "SQL compilation error: Unsupported subquery type cannot be evaluated".
While struggling to solve it, I searched and found one reference:
https://github.com/snowflakedb/snowflake-connector-python/issues/251
So I want to ask whether anyone else has encountered this issue; if so, I would appreciate any pointers.
It seems like a conversion issue. I suggest you check the data in the CSV file; maybe there is a wrong or missing value. Please check your data and make sure it returns numeric values. For example, a type mismatch fails like this:
create table simpleone ( id number );
insert into simpleone values ( True );
The last statement fails with:
SQL compilation error: Expression type does not match column data type, expecting NUMBER(38,0) but got BOOLEAN for column ID
If you provide sample data, and SQL to produce this error, maybe we can provide a solution.
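If you want to narrow down where the bad values are first, a rough sketch (assuming the CSV has been staged into a table named csv_data, as in the question, and that start_time_dim_id is the column that has to end up numeric) is to let TRY_TO_NUMBER flag the rows that cannot be converted:
select start_time_dim_id
from csv_data
where start_time_dim_id is not null
and try_to_number(start_time_dim_id) is null;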
Unfortunately, correlated and nested subqueries in Snowflake are a bit limited at this stage.
I would try running something like this instead (both time_dim and the staged CSV data need to appear in the FROM clause so the aliases resolve):
update table_name
set start_time_dim_id = t.time_id
from time_dim t, csv_data
where t.time_name = csv_data.start_time_dim_id

ERROR in sequence query in Postgres

I created a sequence in Postgres and ran the query mentioned below:
SELECT M_PRODUCTSEQ.NEXTVAL from DUAL;
but it gives me the error below:
ERROR: relation "dual" does not exist.
Kindly help me out. How can I get this working without the dual relation?
PostgreSQL does NOT support the from DUAL syntax. It does, however, make the from portion of a query like this optional, so to get the next value (nextval) of a sequence you would do something like this:
SELECT nextval('m_productseq');
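For example, to use the sequence directly in an insert (the m_product table and its columns here are hypothetical):
insert into m_product (m_product_id, name)
values (nextval('m_productseq'), 'sample product');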

Rails 4, migration to change datatype of column from daterange to tsrange causing PG::DatatypeMismatch: ERROR:

I'm trying to change a column of type daterange to tsrange (I realized I need time as well as date) using a vanilla Rails migration
def self.up
change_column :events, :when, :tsrange
end
After running rake db:migrate the error is
PG::DatatypeMismatch: ERROR: column "when" cannot be cast automatically to type tsrange
HINT: Specify a USING expression to perform the conversion.
: ALTER TABLE "events" ALTER COLUMN "when" TYPE tsrange
I tried following the hint and used the following
def self.up
change_column :events, :when, :tsrange, 'tsrange USING CAST(when AS tsrange)'
end
but then got
no implicit conversion of Symbol into Integer
From what I can tell, USING CAST is mainly meant for use with ints. Assuming I don't want to drop and then recreate the column, what do you have to specify to alter the type from daterange to tsrange?
I'm using
Rails 4.0.1
ruby-2.0.0-p247
psql (9.2.4)
Some background: daterange and tsrange were introduced to Rails 4 in the following PR: https://github.com/rails/rails/pull/7345. Thanks.
The USING clause is used to specify how to convert the old values to the new ones:
The optional USING clause specifies how to compute the new column value from the old; if omitted, the default conversion is the same as an assignment cast from old data type to new. A USING clause must be provided if there is no implicit or assignment cast from old to new type.
So USING shows up any time there is no default cast from the old type to the new type. Also note that USING is specified as USING expression so any expression (whose value is of the correct type) can be used with a USING, the most common is USING CAST(...) but the expression can be pretty much anything.
Hopefully that should clear up some confusion about USING.
So what's up with the ActiveRecord error? Well, change_column is expecting to see an options Hash in the fourth argument but you're sending in a string. If you look at the change_column source, you'll see things like options[:limit], but String#[] expects integer arguments, so your string argument is triggering odd-looking complaints about Symbols.
AFAIK there is no way to get AR to add a USING clause to the ALTER TABLE ... ALTER COLUMN that change_column generates. This leaves connection.execute(some_sql) if you need a USING clause. Of course this is further complicated by the (apparent) lack of a built-in cast from daterange to tsrange but building the necessary expression isn't terribly difficult if you pull the daterange apart with the upper and lower functions:
connection.execute(%q{
alter table events
alter column "when"
type tsrange using tsrange(lower("when"), upper("when"))
})
You can see the table change in action over here: http://sqlfiddle.com/#!15/fb047/2
That assumes that you're using the default half-open intervals ('[)') for your ranges; if you have ranges that aren't closed on the left and open on the right, then you'll have to build a more complicated USING expression using the other range functions to check whether the left and right ends of each range are open or closed.
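A sketch of what that could look like, wrapped in the same connection.execute call as above and assuming the bounds really do vary row by row: lower_inc and upper_inc report whether each end of the old range is inclusive, and the three-argument tsrange constructor accepts a bounds string built from them:
alter table events
alter column "when"
type tsrange using tsrange(
lower("when"),
upper("when"),
case when lower_inc("when") then '[' else '(' end ||
case when upper_inc("when") then ']' else ')' end
)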
BTW, when is a PostgreSQL keyword, so it isn't the best choice for an identifier: you'll have to write "when" every time you refer to that column in SQL snippets, and that might get tiring. I'd recommend using a different name for that column so that you don't have to worry about quoting.
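For example (happens_during is just a placeholder name; in a Rails app you would normally do this through a migration):
alter table events rename column "when" to happens_during;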

Implicit conversion from datatype 'TEXT' to 'VARCHAR' is not allowed. Use the CONVERT function to run this query

I am putting a few insert queries in a stored procedure. The insert queries work fine independently, without any issues such as:
Implicit conversion from datatype 'TEXT' to 'VARCHAR' is not allowed.
But when the sp is run, it gives the above error for 3 of the queries.
I checked all the columns; none of them are of TEXT type.
Has anyone faced this issue? Any clue would help.
It appears that the problem is not with the stored procedure at all. The error happens when the input exceeds 8000 characters. SQL Server 2000 doesn't have VARCHAR(MAX); the maximum length for VARCHAR is 8000. So, if you try to pass a longer string to your sp, it needs to be converted to TEXT, but that can't be an implicit conversion, so you need a parameter of type TEXT. Of course, you would need to change your sp, and there are many operations that can't be done on a value of this datatype, so you may be unable to do what you actually want.
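A rough sketch of that change, with hypothetical procedure, parameter, and table names (SQL Server 2000 accepts text for procedure parameters even though it disallows it for local variables):
create procedure dbo.save_note
@note text
as
insert into dbo.notes (body) values (@note)
go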

Rectifying time format in PostgreSQL

I am working with third-party data which I need to load into my PostgreSQL database. I am running into problems where I sometimes get the time '24:00:30' when it actually should be '00:00:30'. This causes the data to be rejected.
I tried to cast but it did not work.
insert into stop_times_test
select trip_id, cast(arrival_time as time), feed_id, status
from external_source;
Is there any way to convert to the correct one internally?
This may work for your case:
> select '0:0:0'::time + '24:00:30'::interval;
00:00:30
Cast to interval, then cast to time:
SELECT '24:00:30'::interval::time
If you want to bulk load the data with COPY or a mass INSERT, make the target column's data type interval and convert it to time later. This works out of the box:
ALTER TABLE mytable ALTER col1 TYPE time;
No, there is no magic way of doing it. No cast will help you. 24:00:30 is an invalid time. Period.
You could try loading that value into a varchar column and then using regular expressions to fix the wrong values before inserting them into the right columns. This sort of thing happens a lot when doing data transformation.
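A sketch of that idea, assuming arrival_time arrives as varchar in the external_source table from the question (the staging layout itself is an assumption):
update external_source
set arrival_time = regexp_replace(arrival_time, '^24:', '00:')
where arrival_time like '24:%';
After that cleanup, the cast(arrival_time as time) in the original insert should succeed, provided 24:xx:xx was the only malformed pattern.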