Odoo12 tree record datetime, export values - odoo-12

The Odoo 12 tree view shows a different datetime for a record than the value that appears when the record is exported.
How do I solve this problem?

Related

SSAS Tabular - No relation between fact table and 2nd Date dimension

I have two date fields in my fact table: start_date and end_date.
The fact table is connected to the date dimension via start_date.
I need to slice data by the second date field, so I created another date dimension similar to the first one and connected its key to end_date in the fact table, but for some reason there is no relation between them when I browse the cube.
I made sure the keys are in the same format, data type, etc.
The new dim date is 'marked as date table'.
What am I missing?
Thanks a lot.
Try with TabularEditor:
https://github.com/otykier/TabularEditor/releases/tag/2.13.0
Open your model and check the relationships; if you don't see the one you want, you can just add a new one.
You can also create multiple relationships to the same table using different columns (marked as inactive), and you can use an inactive relationship in a calculation with USERELATIONSHIP.
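For example, a minimal DAX sketch (the measure, table, and column names below are placeholders, not from the original question) that activates such an inactive relationship on the end date:

Amount by End Date :=
CALCULATE (
    SUM ( FactTable[Amount] ),    -- any base aggregation over the fact table
    USERELATIONSHIP ( FactTable[end_date], 'DimEndDate'[DateKey] )    -- temporarily activate the inactive relationship for this calculation
)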

POWER QUERY APPEND date is missing

I have two tables with similar columns
When I apply an append, all details are fine except for the date column of the new table.
The old table's dates are available, but the new table's dates are missing and shown as "null".
I checked their formats; both are the same.
Does anyone know what the issue is?
Below is a screenshot for reference.
It looks like the new table has a different column name for the date.
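If that is the case, one fix (sketched here in Power Query M with placeholder table and column names) is to rename the date column in the new table so it matches the old one before appending:

let
    // rename the new table's date column so it matches the old table's "Date" column
    NewRenamed = Table.RenameColumns(NewTable, {{"TransactionDate", "Date"}}),
    // append the tables; columns with identical names now line up instead of filling with null
    Appended = Table.Combine({OldTable, NewRenamed})
in
    Appended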

Display date from Date Dimension in SSIS Derived Column

I created a derived column to include a fiscal year in an SSIS package. The package includes a DateDimension with a FiscalYear column. The data in the column is displayed as "SFY2018Q1". The column name is displayed as "[[$DATE_DIM].[FQUARTER]]".
The expression I created should display only the year "2018" from the DateDimension. However, it does not resolve (it shows as red) in the Derived Column editor. Below is the expression I created.
LEFT(RIGHT([$Date_DimFQuarter],3),2)
I also tried the expression without the "$" and with the table name DateDim added. Neither of those modifications worked.
Any assistance on what I am doing wrong is greatly appreciated.
Just double-click the Derived Column transformation, then drag and drop your column from the Columns tree; your expression should read LEFT(RIGHT([YourColumn],3),2).
So try not to type the column name yourself; just drag and drop it.
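As a side note, if the goal is the four-digit year "2018" from a value like "SFY2018Q1", LEFT(RIGHT([YourColumn],3),2) would return "8Q"; a SUBSTRING expression is probably closer to that intent. This sketch assumes the year always sits in characters 4 to 7 of the value:

SUBSTRING([YourColumn],4,4)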

MS Access importing dates

At the end of importing a .txt file with the help of the wizard, I get a message that some elements were not imported correctly. I have a column in the .txt file which should contain dates, but when I select that column and set its type to Date and Time, for some reason Access cannot recognize the values as dates. I think it's because of a language difference: I use dates like 1.1.2011, whereas Access expects 1/1/2011.
Where can I change the format?
You can change that in the Advanced section of the Import Wizard.
If that doesn't work, don't import the file; link it instead and specify the date field as text.
Then create a simple select query where you use the linked table as source. Select all the fields you need.
For the date field, use this expression:
TrueDate: CDate(Replace([YourTextDateField], ".", "/"))
Clean up other fields as well.
Now use this query for the further processing of the data.
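A minimal sketch of such a query, assuming the linked table is named tblLinkedImport and the text date field is named DateText (both names are placeholders):

SELECT tblLinkedImport.*,
       CDate(Replace([DateText], ".", "/")) AS TrueDate
FROM tblLinkedImport;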

Hive date based partitions

I have data in the following form on HDFS:
/basedir/yyyymmdd/fixedname/files
Here yyyymmdd is the date folder and files are the files added in that directory. I need a table in Hive to pick up data from the yyyymmdd/fixedname directory. This should also work when I add a new date, e.g. if I add something on 5th March 2013, all files added that day go to the 20130305/fixedname folder, and on 6th March 2013 all files go to the 20130306/fixedname folder.
How do I alter a Hive table to pick up data from the changing date folder while the folder inside it stays fixed?
Do you already have a partitioned table? Let's say the table is partitioned by a date column and you want to add new data. In that case, you add the data to the new directory and tell the Hive table (specifically the metastore) that it has a new partition using the ALTER TABLE ... ADD PARTITION command.
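For example, a HiveQL sketch with illustrative table and column names, keeping the existing folders and registering each day as a partition:

-- external table over the existing HDFS layout, partitioned by the date folder
CREATE EXTERNAL TABLE daily_data (
    line STRING
)
PARTITIONED BY (dt STRING);

-- tell the metastore where the data for 5th March 2013 lives
ALTER TABLE daily_data ADD PARTITION (dt='20130305')
LOCATION '/basedir/20130305/fixedname';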
Let's say you have not created any table yet. In that case you will have to create a partitioned table and then insert the data into it from queries. The magic happens when you set these two flags:
set hive.exec.dynamic.partition=true;
set hive.exec.dynamic.partition.mode=nonstrict;
These flags allow dynamic partitioning (for more details, read here).
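With those flags set, loading the partitioned table from a staging table could look like this sketch (table and column names are illustrative):

-- the partition column (dt) must come last in the SELECT list;
-- Hive creates the matching partitions on the fly
INSERT OVERWRITE TABLE daily_data PARTITION (dt)
SELECT line, dt
FROM staging_data;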
Remember that you will have directories like:
/date=YYYYMMDD/fixedname/files
So you have to tell Hive to pick up all the data in the subdirectories recursively. You should set the following flag (there is a better explanation here):
SET mapred.input.dir.recursive=true;
Finally, you will be able to query by date and get all the data in the subdirectories of the date you specify in the query (/date=YYYYMMDD/...).
Hope this helps you.