SSIS Data Flow overflowing on dates

I have a FoxPro data source, and the destination is SQL Server 2008.
On the FoxPro side, I have a column with the Date data type: a width of 8, with a minimum value of 0001-01-01 and a maximum of 9999-12-31. On the SQL Server side, I have a datetime: also a width of 8, but with a minimum value of 1753-01-01 and the same maximum of 9999-12-31.
In my SSIS Data Flow task, I have an OLE DB Source component that reads in the FoxPro table. The columns are mapped as DT_DBDATE in both the External and Output Columns. The OLE DB Destination to the SQL Server table takes that column and flows it into a DT_DBTIMESTAMP. I'm sure that DT_DBTIMESTAMP can handle these date ranges adequately, and reasonably sure about DT_DBDATE (though I'm having trouble finding it in the MS documentation).
The Problem
When I execute the task, it fails, complaining of "Invalid date format" and then "Conversion failed because the data value overflowed the specified type" when it encounters values ranging from around the early 1900s to past 2050. I'm not exactly sure where I've gone wrong.
Bonus Question
How can I handle overflows in my data flow task?

You might want to run a simple VFP query looking for the records that are beyond the range, such as:
select * from yourtable
where YourDate < date( 1753, 1, 1 )
or YourDate > date( 9999, 12, 31 )
Look at those records for problems. Additionally, look at other columns that may be causing your boundary issues.
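As for the bonus question: you can set a component's Error Output to "Redirect row" so that overflowing rows land in a separate path (e.g. a flat file or audit table) for inspection instead of failing the whole task, or you can keep bad values out of the flow entirely by cleaning them in the source query. A rough sketch of the latter in VFP SQL, assuming VFP 9 (for CAST) and the hypothetical table/column names from above, which nulls out anything below SQL Server's datetime floor:

select iif(YourDate < date(1753, 1, 1), cast(.NULL. as D), YourDate) as YourDate
from yourtable

An empty FoxPro date should also compare as earlier than any real date, so a guard like this ought to catch blank values too, which may be what triggers the "Invalid date format" errors.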

I think you need to decide what you want to do with your data.
If you want to load it as it is, you need to change the destination field's data type so it will be able to hold the data.
If instead you wish to correct the wrong data, you will need to validate and transform it.
If you are willing to spend some money, please consider using Advanced ETL Processor.
It works with Text, XML, Excel, Access, DBF, FoxPro, ODBC, OLE DB, MS SQL Server, Oracle, MySQL, PostgreSQL, Firebird, Interbase, SQLite, POP3, SMTP, File System, FTP, SSL and Unicode.

Related

Get value from Measure in OLAP Cube and Insert it in a SQL table

I need help with the following:
I got an OLAP Cube, let's call it "company_prod" on "server\instance"
That Cube has (among many) a calculated member called "[Measures].[Value]"
One of the Dimensions in the Cube is for time (Year, Month, Date and so on). e.g. [TIME].[Y_M_D].[YEAR].&[2020]
Our main frontend is Excel, where we retrieve Data from the Cube with CUBEELEMENT, CUBEVALUE etc.
We have some measures which, unfortunately, change over time: when I update the Excel report now and show numbers for last year, the result is different from what I get when I update that same report in a few weeks or months. This is something I won't be able to change, and in some reports it's the desired behaviour, because the underlying data from SAP is changed and sometimes valid_from and valid_to dates are adjusted retroactively.
Now I want to get the value from my "[Measures].[Value]" on a certain date, let's say April 1st. I then want to insert the value I get on April 1st for 2020 in a SQL table. This should be done by an agent job that executes a stored procedure or runs a dtsx package or anything else, whichever works.
I hope it's clear what I am trying to accomplish...
If you can create a linked server from your SQL Server to the SSAS server, then you can run your MDX query against the linked SSAS server, using OPENQUERY or EXECUTE ... AT, and save the result directly to the SQL Server table.
i.e.,
INSERT INTO <your table>
EXECUTE (<your mdx statement>) AT <linked server>
You can then run the above on a schedule via a SQL Agent job.
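A minimal sketch of the OPENQUERY variant, assuming a linked server named SSAS_PROD and a snapshot table dbo.MeasureSnapshot (both names hypothetical), with the cube and member names taken from the question:

INSERT INTO dbo.MeasureSnapshot (SnapshotDate, MeasureValue)
SELECT CAST(GETDATE() AS date),
       -- SSAS cell values usually arrive as ntext over a linked server, hence the double CAST
       CAST(CAST("[Measures].[Value]" AS nvarchar(50)) AS float)
FROM OPENQUERY(SSAS_PROD,
    'SELECT [Measures].[Value] ON COLUMNS
     FROM [company_prod]
     WHERE ([TIME].[Y_M_D].[YEAR].&[2020])');

Note that the column returned by OPENQUERY is literally named [Measures].[Value], so it has to be referenced with double quotes (QUOTED_IDENTIFIER).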

How to include columns with length greater than 4000 characters in SSAS Tabular Model?

A few of the columns in the source SQL view have a length greater than 4000 characters. These are columns which contain user comments and need to be included in my Tabular SSAS model.
But whenever the character length is greater than 4000 characters, I get an error while processing the model.
I found out that Tabular does not support column lengths greater than 4000 characters.
Is there any way to bypass this issue?
In Visual Studio 2017 in compatibility level 1400 (SQL 2017 and Azure Analysis Services) I was able to import this query both in legacy data source mode (regular SQL driver) and in the new Power Query mode.
select cast(replicate(cast('A23456789' as varchar(max)),1000) as varchar(max)) as str,
len(cast(replicate(cast('A23456789' as varchar(max)),1000) as varchar(max))) as len
What are you doing differently? Are you using an older compatibility level or something?
I would also question the analytical value of importing huge text strings into an SSAS Tabular model. Can you explain the value of those columns? Could you instead parse out the relevant pieces of info from the long strings, as in the sketch below?
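To illustrate that last idea, a hypothetical sketch (view and column names invented) that trims the comments to an excerpt and derives a compact flag, so the model never needs the full string:

select left(CommentText, 4000) as CommentExcerpt,
       case when CommentText like '%refund%' then 1 else 0 end as MentionsRefund
from dbo.SourceView;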

Power BI doesn't honour an SSAS OLAP cube date field

We have an SSAS OLAP cube, in production for a long time and thoroughly tested by dozens of users, with a normal Time dimension (two hierarchies, Months and Weeks, but this is irrelevant).
The Time dimension key is a date field. In the data view it is defined as DataType: System.DateTime; on the dimension, as Calendar -> Date, Usage: Key.
Using this date field in an Excel table accessing the OLAP cube is fully operational: it is a date, and "Date filter" options are available as expected.
But trying to use this field in Power BI defeated all our efforts. There is no way to have Power BI interpret the field as a date, so no date filters are available. Power BI thinks it's a text field, and nothing we can think of makes it change its behaviour.
We tested Power BI with an external Excel workbook, adding the SSAS OLAP dimension so we would be able to modify the field format, and, surprise: the field is interpreted as text. Changing the format to date makes it work.
But there is no way to change the format for an SSAS OLAP cube when it is accessed directly as the primary (and only) data source of the Power BI report.
Any idea how to define an SSAS OLAP date field so PowerBI understands it is a date?
Found it! One minute before going mad forever, I spotted the difference.
I was able to create two dimensions on the same cube, with the same field. One works, the other doesn't.
Define the date field as Order by Key, and Power BI treats it as a date.
Define the date field as Order by Name, and Power BI treats it as text.
I came across the same situation. I don't know the OP's calendar dimension properties, but their solution did not help me.
In the dimension design I used the ValueColumn property instead.
In Power BI this exposed a new field that worked nicely with the Slicer visual, while everything in Excel remained the same.

SAP HANA Decimal to timestamp or seconddate SLT

I am using SLT to load tables into our HANA DB. SLT uses the ABAP dictionary and sends timestamps as DECIMAL(15,0) to the HANA DB. Once in the HANA DB, via a calculated column in a calculation view, I am trying to convert the decimals to timestamps or seconddates.
I run a small SLT transformation to populate columns 27-30. The ABAP layer in SLT populates the columns based on the Database transactions.
The problem comes when I try to convert columns 28-30 to timestamps or seconddates, using syntax like this:
select to_timestamp(DELETE_TIME) from SLT_REP.AUSP
select to_seconddate(DELETE_TIME) from SLT_REP.AUSP
I get conversion errors, though confusingly it also works some of the time. Using the same syntax in the calculated column produces a similar error from the calculation view.
Has anyone found a good way to convert ABAP timestamps (Decimal (15,0)) to Timestamp or Seconddate in HANA?
There are conversion functions available that you can use here (unfortunately they are not very well documented):
select tstmp_to_seconddate(TO_DECIMAL(20110518082403, 15, 0)) from dummy;
TSTMP_TO_SECONDDATE(TO_DECIMAL(20110518082403,15,0))
2011-05-18 08:24:03.0
The problem was with the ABAP data type. I was declaring the target variable as DEC(15,0), and the ABAP extracting the data was, in some instances, rounding the timestamp up to the 60th second. Once in the target HANA system, to_timestamp(target_field) would come back invalid when a time looked like "20150101121060", the last two digits being the 60th second; that is not a valid time, so the conversion would fail. The base HANA layer did not care, as it was merely putting a length-14 value into a field. I changed the source variable to DEC(21,0). This eliminated the ABAP rounding and fixed my problem.
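If you cannot change the ABAP side immediately, a quick way to find the affected rows is to check the seconds component of the decimal directly. A sketch, assuming DELETE_TIME holds YYYYMMDDHHMMSS as DECIMAL(15,0), as in the question:

-- rows whose seconds part was rounded up to 60 will fail to_timestamp/to_seconddate
select DELETE_TIME
from SLT_REP.AUSP
where mod(DELETE_TIME, 100) >= 60;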

Date column is not displayed in SQL

I am using Oracle Express for database comparisons (against Teradata and Adaptive Server IQ).
When I connect to a database and try to display all the columns using a SELECT statement, all the columns except the date columns are displayed.
Can anyone suggest a solution for this?