New software that we have installed needs a specific ID to use as a reference token so it knows where it left off in the SQL database. At present, the other software that enters the data I am referencing does not supply a time component; values come through as 2013-05-20 00:00:00. I would like to combine that date/time stamp with my Pipesize and Period.
The data looks like this:
trandate = 2013-05-20 00:00:00 Pipesize = 30 Period = A
I need the data converted to look like 2013052000000030A, if that is possible, or something close.
I used the chart at w3schools.com/sql/func_convert.asp to determine the correct format, and the resulting SQL Fiddle is at sqlfiddle.com/#!3/d9654/3.
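For illustration, one possible way to build such a token (a sketch only, assuming SQL Server; "myTable" and the "ref_token" alias are placeholders, the columns are the ones from the sample above):

-- CONVERT style 120 gives 'yyyy-mm-dd hh:mi:ss'; stripping the separators and
-- appending the other two columns yields e.g. 2013052000000030A.
SELECT REPLACE(REPLACE(REPLACE(CONVERT(varchar(19), trandate, 120), '-', ''), ' ', ''), ':', '')
       + CAST(Pipesize AS varchar(10))
       + Period AS ref_token
FROM myTable;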
Good day all,
I am trying to filter today's results from a SQL table into a collection in Power Apps. The column "dt" is a datetime column in SQL.
This is my Power Apps filter:
ClearCollect(myCollectionName, Filter(myDatasource, Text(dt, "dd/mm/yyyy") = Text(Now(), "dd/mm/yyyy")));
The collection comes back empty even though there is data for today in SQL. May I know if my approach is the correct way to filter?
Short answer: the data is likely being changed based on the client time zone. To fix it, you can update it by applying the time zone offset to the data from the SQL table, something along the lines of:
ClearCollect(
    myCollectionName,
    Filter(
        myDatasource,
        Text(DateAdd(dt, TimeZoneOffset(dt), Minutes), "dd/mm/yyyy") =
            Text(Now(), "dd/mm/yyyy")))
Long(er) answer: the datetime type in SQL Server represents an absolute value of date and time. For example, the value '2021-12-23 09:30:00' represents 9:30 in the morning of the 23rd day of December, 2021, in any part of the world. The date/time type in Power Apps, however, represents a point in time, typically referring to the local time where the app is being executed (or created). For example, if I selected that value and I'm in the US Pacific Time Zone (UTC-08:00), it would represent the same instant as someone in London (UTC+00:00) selecting 2021-12-23 17:30:00.
Since the two types represent different concepts, we can get mismatches like the one you are facing. To fix this, we can either use a type in SQL Server that has the same semantics as Power Apps (for example, 'datetimeoffset'), or adjust the time when it is being transferred between SQL and Power Apps.
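For the first option, a minimal sketch of what such a column could look like (hypothetical table and column names, shown only to illustrate the idea):

-- Each stored value carries its own UTC offset instead of an ambiguous wall-clock time.
CREATE TABLE dbo.Readings (
    id int IDENTITY(1,1) PRIMARY KEY,
    dt datetimeoffset(0) NOT NULL  -- e.g. '2021-12-23 09:30:00 -08:00'
);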
The blog post at https://powerapps.microsoft.com/en-us/blog/working-with-datetime-values-in-sql explains in more detail how to work with date/time values in SQL and Power Apps.
I am very new to Azure Data Factory. I have created a simple pipeline using the same source and target table. The pipeline is supposed to take the date column from the source table (data type date, as shown in the schema below), apply an expression to it, and load 1 into the last_7_days column (as in the schema) if the date is within the last 7 days, or 0 otherwise.
The schema for both the source and target tables looks like this:
Now I am facing a challenge writing the expression in the Derived Column component. I have managed to work out the date 7 days ago with the expression: .
In summary, the idea is to load the last_7_days column in the target table with the value 1 if date >= current date - interval 7 day AND date <= current date, as in SQL. I would be very grateful for any tips and suggestions. If you require further information, please let me know.
Just for more information: the date column in the source/target table is static, holding 10 years of dates from 2020 to 2030 in yyyy-mm-dd format. The ETL should run every day and set last_7_days to 1 only for the last 7 days looking back from the current date; all other rows must receive the value 0.
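For reference, the SQL analogue of that logic (a sketch only, assuming SQL Server syntax; source_table is a placeholder name) would be:

SELECT [date],
       CASE
           WHEN [date] >= DATEADD(day, -7, CAST(GETDATE() AS date))
                AND [date] <= CAST(GETDATE() AS date)
           THEN 1
           ELSE 0
       END AS last_7_days
FROM source_table;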
You currently use the expression below:
case(date == currentDate(), 1, date >= subDays(currentDate(), 7), 1, date < subDays(currentDate(), 7), 0, date > currentDate(), 0)
If we were you, we would also choose the case() function to build the expression.
About your question in the comment: I'm afraid not, there isn't another, more elegant way. Data Flow expressions can be complex and may be composed of many functions; case() is the best one for this.
It's very clear and easy to understand.
Following [1], I started creating a spreadsheet which reads data from BigQuery, but I'm having an issue handling parameters related to date values.
In the first sheet, I created 2 cells with 2 parameters, the start and the end of a date interval, with proper values. Both cells are formatted as "Date" value.
In the second sheet I configured the BigQuery connector; for this example I'm using a public dataset with dates: bigquery-public-data.utility_eu.date_greg
From the BigQuery connector wizard I added:
"STARTDATE" as "PARAMETERS!B1"
"ENDDATE" as "PARAMETERS!B2"
After this configuration, this is the resulting query:
SELECT
date,
date_str,
date_int
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE date > DATE(#STARTDATE) AND date < DATE(#ENDDATE)
LIMIT 10
I'm getting an error directly from the editor with this message:
> Error BigQuery: No matching signature for function DATE for argument types: INT64. Supported signatures: DATE(TIMESTAMP, [STRING]); DATE(DATETIME); DATE(INT64, INT64, INT64) at [8:14]
As far as I can understand, the "date" cells are retrieved as a number, so the direct parse is not working. After a couple of tests, I understood that the given int value is the number I obtain if I change the cell format to "Number".
If you convert the cell value from DATE to NUMBER you get these values:
01/05/2019 -> 43.586
31/05/2019 -> 43.616
What is this number? It is not milliseconds; it increases by 1 each day. In order to create a proper query that can parse this int, I need to understand what it is (of course I could format the cell as "text" and write the timestamp value directly, but I would prefer to keep the native date format so I can use the built-in calendar picker).
My guess (with simple math) is that this number is the number of days since 30/12/1899, but that seems odd (also, every date BEFORE that day is always 0), so I'm asking directly how to handle this value. Based on my understanding of when the counter starts (30/12/1899), I created this query, which adds the number retrieved from the cell:
SELECT *
FROM `bigquery-public-data.utility_eu.date_greg`
WHERE
date >= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAINIZIO DAY)
AND date <= DATE_ADD(DATE("1899-12-30"), INTERVAL #DATAFINE DAY)
It is working... but I think I'm doing a workaround that is not the proper way of doing this.
Also, is there any full documentation for this BigQuery connection provided by Sheets? Besides the presentation in [1], I'm unable to find any specific documentation.
Spreadsheets (Google, Excel, ...) store dates as the number of days passed since a starting date, with a fractional part of a day representing the time.
From here: "Excel stores dates and times as a number representing the number of days since 1900-Jan-0, plus a fractional portion of a 24 hour day: ddddd.tttttt . This is called a serial date, or serial date-time."
Now, you have two ways to filter by date in your query:
In the query, use DATE_ADD to add your number of days (the cell value) to the base date. (Careful: DATE_ADD takes an INT, and the cell value comes through as a float, so it needs casting first.)
(preferred) In your spreadsheet, use TEXT(cell, "yyyy-mm-dd") so you can then use DATE() in the BigQuery query (see the sketch after these two options).
I use the second method: although you need an extra cell (unless you store the date directly as YYYY-MM-DD), it keeps the query cleaner than having a cast and DATE_ADD in there. It also saves you from the "1904 problem" explained in the link above.
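A sketch of the preferred approach, reusing the parameter names from the question and assuming the parameter cells now hold TEXT(B1, "yyyy-mm-dd") and TEXT(B2, "yyyy-mm-dd"); if DATE() still rejects the string, PARSE_DATE is one alternative:

SELECT
  date,
  date_str,
  date_int
FROM `bigquery-public-data.utility_eu.date_greg`
-- the parameters now arrive as 'yyyy-mm-dd' strings
WHERE date >= PARSE_DATE('%Y-%m-%d', #STARTDATE)
  AND date <= PARSE_DATE('%Y-%m-%d', #ENDDATE)
LIMIT 10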
> What is this number? It is not milliseconds; it increases by 1 each day.
This is the so-called serial number, which represents the number of days since the "very beginning".
Google Sheets' date calendar starts around 1900 (serial 0 corresponds to 1899-12-30), which is treated as the "very beginning".
> In order to create a proper query that can parse this int, I need to understand what it is.
Armed with the above info, you can adjust your date calculation to be in sync with what BigQuery expects.
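As a quick sanity check of that interpretation, using a serial number from the question and the 1899-12-30 base date the question already worked out:

-- 43586 is the serial number shown for 01/05/2019
SELECT DATE_ADD(DATE '1899-12-30', INTERVAL 43586 DAY) AS check_date  -- returns 2019-05-01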
You mentioned that your fields are already in Date format; maybe you are doing some extra parsing in your query.
Try to do it without the DATE functions.
Also, I found this other doc, not strictly about the connection, but it might be helpful: Getting info from Spreadsheets with BigQuery.
I want to get data from one date only, example: 2014-06-16
From the CMIS reference I know that we can use the = (equal) operator, but I think the time must then be specified precisely.
The alternative I thought of is to do it like below:
First:
SELECT * FROM cmis:document WHERE cmis:creationDate >= TIMESTAMP '2014-06-16T00:00:00.000Z' AND cmis:creationDate< TIMESTAMP '2014-06-17T00:00:00.000Z'
Second:
SELECT P.tsi:DATENUM as date_traitement, L.tsi:type as type, P.tsi:statut as statut
FROM tsi:lot AS L JOIN tsi:pli AS P ON L.cmis:name = P.tsi:lot
WHERE
(P.tsi:DATENUM >= TIMESTAMP '2014-06-16T00:00:00.000Z' AND P.tsi:DATENUM < TIMESTAMP '2014-06-17T00:00:00.000Z')
The first one runs perfectly; I get data from the 16th of June. BUT with the second one, I don't know why, I still get data from 2014-06-17.
Note: tsi:DATENUM type is datetime
So could you tell me what's wrong, OR how to get data from ONE date only?
The second one should work. The timestamps you are using are in GMT. If your timestamps are stored with a time zone offset, that could be the reason why you are seeing times from 6/17 when you expect to see only times from 6/16.
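For example, if the tsi:DATENUM values are effectively stored at UTC+02:00, the whole range would need to be shifted by that offset (a sketch only; the real offset depends on how the repository stores the property):

SELECT P.tsi:DATENUM as date_traitement, L.tsi:type as type, P.tsi:statut as statut
FROM tsi:lot AS L JOIN tsi:pli AS P ON L.cmis:name = P.tsi:lot
WHERE
(P.tsi:DATENUM >= TIMESTAMP '2014-06-15T22:00:00.000Z' AND P.tsi:DATENUM < TIMESTAMP '2014-06-16T22:00:00.000Z')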
I have a requirement to disregard records from the DB that are more than 10 minutes old. However, the DB server is in a different time zone than the app server. I tried to leverage the time zone details from the timestamp column value, but it seems the time zone is not stored in that column (bad design?). However, I have found a way to get this information for the DB instance using a query:
SELECT DBTIMEZONE FROM DUAL;
However, most of the time zone APIs in Java work with zone names rather than offsets. I need to be able to translate this offset to a time zone (EST, etc.) so that I do not miss any DST-related shifts in my calculations, like so:
// TimeZone.getTimeZone() takes a zone ID string, not a raw offset
TimeZone dbZone = TimeZone.getTimeZone("GMT+10:00"); // offset returned by dbtimezone, e.g. +10:00
Calendar cal = Calendar.getInstance(dbZone);
cal.setTime(new Date());
DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
df.setTimeZone(dbZone);
Date parsedDate = df.parse(df.format(cal.getTime())); // parse() declares ParseException
The plan is to convert the current client/app time to the DB-specific time zone and compute the difference between the two.
This cannot be done in a query due to some constraints, so please do not ask me to write a query to get the latest records, etc. It must be done in Java.
Any tips?
I am guessing this might point you in the right direction. You can try the following so you know the offset of a named zone (here US/Eastern) and can do the calculation accordingly:
SELECT TZ_OFFSET('US/Eastern') FROM DUAL;
So if DBTIMEZONE comes back 3:00 behind that value, the database is on Pacific (PST) time. Maybe you've already tried this?
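A quick way to see the relevant offsets side by side (a sketch; DBTIMEZONE is the same instance offset the question already retrieves):

SELECT DBTIMEZONE AS db_offset,
       TZ_OFFSET('US/Eastern') AS eastern_offset,
       TZ_OFFSET('US/Pacific') AS pacific_offset
FROM DUAL;

If db_offset matches one of the region offsets (keeping DST in mind), that tells you which named time zone to use on the Java side.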