Mapping Data Flows time column to SQL time column - azure-data-factory

My SQL Server database has columns with the time data type:
[start_time] time NULL,
[end_time] time NULL,
but Data Flows doesn't have a conversion function for this type.
The only way I can think of doing this is a post SQL script (if you fully recreate the table):
ALTER TABLE dbo.[testTable] ALTER COLUMN [start_time] time(0);
ALTER TABLE dbo.[testTable] ALTER COLUMN [end_time] time(0);
I tried using a timestamp, but again it's not a matching data type:
toTimestamp(substring(start_date,12,9),'HH:mm:ss')
so this doesn't work.
Any help with understanding this would be great.
** Updating with screenshots
So this issue applies to Parquet or CSV sources going into SQL DB tables.
If you have a column that looks like a datetime, you need to keep it as a string, as there is no toDateTime function, only toTimestamp. Neither a string nor a timestamp can be converted to the DATETIME data type in the SQL DB sink; you will end up with NULLs in your column.
Sample before using an expression to change start_date to yyyy-MM-ddTHH:mm:ss:

You can simply map the DATETIME column to the target TIME column in the sink activity.
Make sure the option "Allow schema drift" in the sink activity is unchecked.
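For context, SQL Server itself converts DATETIME to TIME implicitly on assignment, which is plausibly why the plain sink mapping succeeds; a quick T-SQL sanity check (sample value made up):
DECLARE @dt datetime = '2021-10-14 12:34:56';
DECLARE @t  time     = @dt;   -- implicit datetime -> time, no CAST needed
SELECT @t;                    -- 12:34:56.0000000
SELECT CAST(@dt AS time(0));  -- explicit form, trims to 12:34:56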
My test schema:
-- source table
DROP TABLE IF EXISTS [dbo].[tempSourceTable]
CREATE TABLE [dbo].[tempSourceTable](
[id] int IDENTITY(1,1) NOT NULL,
[key] nvarchar(max) NULL,
[start_date] datetime NULL
)
INSERT INTO [dbo].[tempSourceTable] VALUES ('key1', '2021-10-14 12:34:56')
SELECT * FROM [dbo].[tempSourceTable]
-- target table
DROP TABLE IF EXISTS [dbo].[tempTargetTable]
CREATE TABLE [dbo].[tempTargetTable](
[id] int IDENTITY(1,1) NOT NULL,
[key] nvarchar(max) NULL,
[start_time] time NULL
)
Result after executing the data flow in a pipeline:

Here is my testing CSV input:
start_date,end_date,start_date_time,end_date_time,start_time,end_time
09/01/2020,09/01/2020,09/01/2020 11:01,09/01/2020 11:01,11:01:46,11:01:52
09/01/2020,,09/01/2020 11:01,,11:01:47,
09/01/2020,09/01/2020,09/01/2020 11:01,09/01/2020 11:50,11:01:49,11:50:41
09/01/2020,09/01/2020,09/01/2020 11:01,09/01/2020 11:01,11:01:51,11:01:55
09/01/2020,09/01/2020,09/01/2020 11:01,09/01/2020 11:01,11:01:52,11:01:56
You may specify the date/time/datetime formats for CSV source data:
You can see the correct parsing result in data preview:
After that, a simple sink activity should achieve what the OP wants to do:
The sink table schema I used for testing:
CREATE TABLE [dbo].[tempTargetTable](
[start_date] date NULL,
[end_date] date NULL,
[start_date_time] datetime NULL,
[end_date_time] datetime NULL,
[start_time] time NULL,
[end_time] time NULL
)
Result in DB:

Related

insertion of nested array of custom in table for postgres

command = (
    """CREATE TYPE belongings AS (
        item TEXT,
        quantity INTEGER
    )""",
    """CREATE TYPE student AS (
        name TEXT,
        id INTEGER,
        bag belongings[]
    )""",
    """CREATE TABLE studentclass (
        date DATE NOT NULL,
        time TIMESTAMPTZ NOT NULL,
        PRIMARY KEY (date, time),
        class student
    )""",
)
Can I ask how to do the insert for this in Postgres with psycopg2? Thank you.
When I put the insert as
insert_sql = "INSERT INTO studentclass (date, time, class) VALUES (%s,%s,%s)"
the error output is
DETAIL: Cannot cast type text[] to belongings[] in column
I don't think I can just cast it with "::belongings[]" in the INSERT statement, as it is nested.
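For what it's worth, one approach that works in plain SQL (and can be sent through psycopg2's cursor.execute as-is) is to build the nested value with ROW constructors and cast at each level; a sketch with made-up sample values:
INSERT INTO studentclass (date, time, class)
VALUES (
    '2021-10-14',
    '2021-10-14 12:34:56+00',
    ROW(
        'Alice',
        1,
        ARRAY[ROW('pencil', 2)::belongings, ROW('book', 1)::belongings]
    )::student
);
-- The ::belongings and ::student casts resolve the anonymous ROW records to
-- the declared composite types, which should avoid the text[] -> belongings[] error.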
My earlier question about a simpler table:
Unable to insert nested record in postgres

How to select the last value inserted in a column (like the LAST() function for Oracle DB)

I'm currently a student and I'm setting up a PostgreSQL DB for my AirsoftShop, plus some queries on it. I need a function similar to SELECT LAST(xx) FROM yy, which I think is usable on SQL Server and Oracle DB, to return the last inserted value in the column targeted by LAST().
I have this table:
CREATE TABLE munition.suivi_ammo (
type_ammo integer NOT NULL,
calibre integer NOT NULL,
event integer NOT NULL,
date_event date NOT NULL,
entrance integer NOT NULL,
exit integer NOT NULL,
inventory integer NOT NULL,
FOREIGN KEY (calibre) REFERENCES munition.index(numero),
FOREIGN KEY (event) REFERENCES munition.index(numero),
FOREIGN KEY (type_ammo) REFERENCES munition.index(numero)
);
and an index table that defines each numeric ID:
CREATE TABLE munition.index (
numero integer NOT NULL,
definition text NOT NULL,
PRIMARY KEY (numero)
);
I want to select the last inventory inserted in the table and calculate the current inventory according to the inflows and outflows made after that inventory.
It works when I do this type of query with specific dates to be sure to only get the last inventory, but I don't want to have to do that:
SELECT index.definition,
Sum(suivi_ammo.inventory) + Sum(suivi_ammo.entrance) - Sum(suivi_ammo.exit) AS Stock
FROM munition.suivi_ammo
INNER JOIN munition.index ON suivi_ammo.type_ammo = index.numero
WHERE date_event < '03/05/2019' AND date_event >= '2019-04-10'
GROUP BY index.definition;
I also tried the last_value() window function, but it doesn't work.
Thanks!
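For reference, PostgreSQL has no LAST(), but DISTINCT ON can pick the most recent row per group; a sketch, assuming "last" means the greatest date_event per ammo type:
-- newest inventory row per type_ammo: DISTINCT ON keeps the first row
-- of each group under this ORDER BY, i.e. the latest date_event
SELECT DISTINCT ON (type_ammo)
       type_ammo, date_event, inventory
FROM munition.suivi_ammo
ORDER BY type_ammo, date_event DESC;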

Average MySQL in new table

I have a database about weather that updates every second.
It contains temperature and wind speed.
This is my database:
CREATE TABLE `new_table`.`test` (
`id` INT(10) NOT NULL,
`date` DATETIME NOT NULL,
`temperature` VARCHAR(25) NOT NULL,
`wind_speed` INT(10) NOT NULL,
`humidity` FLOAT NOT NULL,
PRIMARY KEY (`id`))
ENGINE = InnoDB
DEFAULT CHARACTER SET = utf8
COLLATE = utf8_bin;
I need to find the average temperature every hour.
This is my code:
SELECT AVG(`temperature`), `date`
FROM `new_table`.`test`
GROUP BY HOUR(`date`)
My query works, but the problem is that I want to move the average value and its date into another table.
This is the table:
CREATE TABLE `new_table`.`table1` (
`idsea_state` INT(10) NOT NULL,
`dateavg` DATETIME NOT NULL,
`avg_temperature` VARCHAR(25) NOT NULL,
PRIMARY KEY (`idsea_state`))
ENGINE = InnoDB
DEFAULT CHARACTER SET = utf8
COLLATE = utf8_bin;
Is it possible? Can you give me the coding?
In order to insert new rows into a table based on data obtained from another table, you can set up an INSERT query targeting the destination table and have a sub-query pull the data from the source table; the result set returned by the sub-query then provides the values for the INSERT command.
Here is the basic structure, note that the VALUES keyword is not used:
INSERT INTO `table1`
(`dateavg`, `avg_temperature`)
SELECT `date` , avg(`temperature`)
FROM `test`;
It's also important to note that the columns returned by the result set are matched positionally to the respective INSERT fields of the outer query.
e.g. if you had the query
INSERT INTO table1 (`foo`, `bar`, `baz`)
SELECT `a`, `y`, `g` FROM table2
a would be inserted into foo
y would go into bar
g would go into baz
due to their respective positions
I have made a working demo - http://www.sqlfiddle.com/#!9/ff740/4
I made the changes below to simplify the example and just demonstrate the concept involved.
Here are the DDL changes I made to your original code:
CREATE TABLE `test` (
`id` INT(10) NOT NULL AUTO_INCREMENT,
`date` DATETIME NOT NULL,
`temperature` FLOAT NOT NULL,
`wind_speed` INT(10),
`humidity` FLOAT ,
PRIMARY KEY (`id`))
ENGINE = InnoDB
DEFAULT CHARACTER SET = utf8
COLLATE = utf8_bin;
CREATE TABLE `table1` (
`idsea_state` INT(10) NOT NULL AUTO_INCREMENT,
`dateavg` VARCHAR(55),
`avg_temperature` VARCHAR(25),
PRIMARY KEY (`idsea_state`))
ENGINE = InnoDB
DEFAULT CHARACTER SET = utf8
COLLATE = utf8_bin;
INSERT INTO `test`
(`date`, `temperature`) VALUES
('2013-05-03', 7.5),
('2013-06-12', 17.5),
('2013-10-12', 37.5);
INSERT INTO `table1`
(`dateavg`, `avg_temperature`)
SELECT `date` , avg(`temperature`)
FROM `test`;
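If the hourly grouping from the original question is still wanted, the same INSERT ... SELECT pattern works with the GROUP BY moved into the sub-query; a sketch (DATE_FORMAT truncates each timestamp to its hour):
-- one averaged row per hour instead of one overall average
INSERT INTO `table1`
(`dateavg`, `avg_temperature`)
SELECT DATE_FORMAT(`date`, '%Y-%m-%d %H:00:00'), AVG(`temperature`)
FROM `test`
GROUP BY DATE_FORMAT(`date`, '%Y-%m-%d %H:00:00');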

Varchar to Numeric Conversion

I have loaded the excel data into a table using SSIS.
Table Structure :
Monthly_Budget
SEQUENCE INT IDENTITY,
TRANSACTION_DATE VARCHAR(100),
TRANSACTION_REMARKS VARCHAR(1000),
WITHDRAWL_AMOUNT VARCHAR(100),
DEPOSIT_AMOUNT VARCHAR(100),
BALANCE_AMOUNT VARCHAR(100)
Values in WITHDRAWL_AMOUNT Column:
7,987.00
1,500.00
7,000.00
50.00
NULL
253.00
4,700.00
2,000.00
148.00
2,000.00
64.00
1,081.00
2,000.00
NULL
NULL
7,000.00
Now, I am trying to run a query to get the summation of values under WITHDRAWL_AMOUNT, but I am getting an error:
Error converting data type varchar to numeric.
My Query :
SELECT SUM(CAST(ISNULL(LTRIM(RTRIM(WITHDRAWL_AMOUNT)),0) AS NUMERIC(6,2))) AS NUM FROM MONTHLY_BUDGET
Try converting them like this:
SELECT SUM(CAST(LTRIM(RTRIM(REPLACE(WITHDRAWL_AMOUNT, ',', ''))) AS NUMERIC(6, 2))) FROM MONTHLY_BUDGET
It is much, much preferable to store the values in the proper types that you want. I can, however, understand putting external data into a staging table and then using logic such as the above to load the data into the final table.
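A sketch of that staging-to-final pattern, assuming a hypothetical MONTHLY_BUDGET_CLEAN table with proper types; TRY_CAST (SQL Server 2012+) returns NULL instead of raising an error on unconvertible input:
-- hypothetical final table; names and types are illustrative only
CREATE TABLE MONTHLY_BUDGET_CLEAN (
    SEQUENCE            INT,
    TRANSACTION_DATE    DATE,
    TRANSACTION_REMARKS VARCHAR(1000),
    WITHDRAWL_AMOUNT    NUMERIC(10, 2),
    DEPOSIT_AMOUNT      NUMERIC(10, 2),
    BALANCE_AMOUNT      NUMERIC(10, 2)
);
INSERT INTO MONTHLY_BUDGET_CLEAN
SELECT SEQUENCE,
       TRY_CAST(TRANSACTION_DATE AS DATE),
       TRANSACTION_REMARKS,
       TRY_CAST(REPLACE(WITHDRAWL_AMOUNT, ',', '') AS NUMERIC(10, 2)),
       TRY_CAST(REPLACE(DEPOSIT_AMOUNT, ',', '') AS NUMERIC(10, 2)),
       TRY_CAST(REPLACE(BALANCE_AMOUNT, ',', '') AS NUMERIC(10, 2))
FROM MONTHLY_BUDGET;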

T-SQL 'Cannot insert the value NULL' error with non-NULL values

I'm migrating a Simple Membership database to Identity 2.0. I'm copying a CreateDate datetime NULL column to a CreateDate datetime NOT NULL column. I've examined all the records in the Membership.CreateDate column in the data source table to verify that they contain valid DateTimes. This error is returned:
Cannot insert the value NULL into column 'CreateDate', table 'Settlement.dbo.AspNetUsers'; column does not allow nulls. INSERT fails.
The statement has been terminated.
I've also tried deleting all the records in Membership but one (its CreateDate column contains 2012-12-27 01:35:03.610). I get the same error.
I'm running the script in SSMS.
Migration script excerpts:
CREATE TABLE [dbo].[AspNetUsers] (
[Id] [nvarchar](60) NOT NULL,
[UserName] [nvarchar](15) NOT NULL,
[AcId] INT NOT NULL,
[LocId] INT NOT NULL,
[CreateDate] [datetime] NOT NULL,
[Email] [nvarchar](60),
[EmailConfirmed] [bit] Default ((0)) NOT NULL,
[PasswordHash] [nvarchar] (100),
[SecurityStamp] [nvarchar] (60),
[PhoneNumber] [nvarchar] (15),
[PhoneNumberConfirmed] [bit] Default ((0)) NOT NULL,
[TwoFactorEnabled] [bit] Default ((0)) NOT NULL,
[LockoutEndDateUtc] [datetime],
[LockoutEnabled] [bit] Default ((0)) NOT NULL,
[AccessFailedCount] [int] Default ((0)) NOT NULL,
CONSTRAINT [PK_dbo.AspNetUsers] PRIMARY KEY ([Id])
);
GO
INSERT INTO AspNetUsers(Id, UserName, AcId, LocId, PasswordHash, SecurityStamp, CreateDate )
SELECT UserProfile.UserId, UserProfile.UserName, UserProfile.BaId, UserProfile.OfcId,
webpages_Membership.Password, webpages_Membership.PasswordSalt, webpages_Membership.CreateDate
FROM UserProfile
LEFT OUTER JOIN webpages_Membership ON UserProfile.UserId = webpages_Membership.UserId
GO
If I change the AspNetUsers CreateDate column to allow NULL, it successfully copies the datetimes from table to table, so I know all the column names and types are correct.
(After doing this I ran
ALTER TABLE [dbo].[AspNetUsers] ALTER COLUMN [CreateDate] datetime NOT NULL
and got the same error)
This is happening with the Production copy of the database. I have a development copy of the database generated through the same EF code first code and it successfully copies the data into the CreateDate NOT NULL field.
I'm at my wits' end at this point. Why am I getting a "Cannot insert the value NULL into column" error when the source data contains valid datetimes? Or why doesn't it recognize the CreateDate column's data as valid datetimes?
You have some users in UserProfile that have no corresponding rows in webpages_Membership, so you are trying to insert users without any membership information. You must use an INNER JOIN instead of the LEFT OUTER JOIN, or provide default values for users that have no corresponding information, as sketched below.
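For illustration, both options in T-SQL, using the column names from the question (the GETUTCDATE() fallback is just one possible default):
-- Option 1: INNER JOIN silently drops UserProfile rows with no membership row
INSERT INTO AspNetUsers (Id, UserName, AcId, LocId, PasswordHash, SecurityStamp, CreateDate)
SELECT up.UserId, up.UserName, up.BaId, up.OfcId,
       m.Password, m.PasswordSalt, m.CreateDate
FROM UserProfile AS up
INNER JOIN webpages_Membership AS m ON up.UserId = m.UserId;
-- Option 2: keep the LEFT JOIN but substitute a default for the missing dates
INSERT INTO AspNetUsers (Id, UserName, AcId, LocId, PasswordHash, SecurityStamp, CreateDate)
SELECT up.UserId, up.UserName, up.BaId, up.OfcId,
       m.Password, m.PasswordSalt,
       COALESCE(m.CreateDate, GETUTCDATE())
FROM UserProfile AS up
LEFT OUTER JOIN webpages_Membership AS m ON up.UserId = m.UserId;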