How to include columns with length greater than 4000 characters in SSAS Tabular Model? - ssas-tabular

A few of the columns in the source SQL view have a length greater than 4000 characters. These columns contain user comments and need to be included in my Tabular SSAS model.
But whenever the character length exceeds 4000 characters, I get an error while processing the model.
I found out that Tabular does not support column lengths greater than 4000 characters.
Is there any way to work around this issue?

In Visual Studio 2017 in compatibility level 1400 (SQL 2017 and Azure Analysis Services) I was able to import this query both in legacy data source mode (regular SQL driver) and in the new Power Query mode.
select cast(replicate(cast('A23456789' as varchar(max)),1000) as varchar(max)) as str,
len(cast(replicate(cast('A23456789' as varchar(max)),1000) as varchar(max))) as len
What are you doing differently? Are you using an older compatibility level or something?
I would also question the analytical value of importing huge text strings into an SSAS Tabular model. Can you explain the value of those columns? Could you possibly parse out the relevant pieces of information from the long strings?
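If parsing out the relevant pieces is not feasible, one pragmatic option is to preprocess the comments before import. This is a minimal sketch (the function name and the truncation strategy are assumptions, not part of SSAS): cap each comment at 4000 characters and keep a flag so you know which rows were cut off.

```python
# Hypothetical preprocessing step: cap long comment strings at 4000
# characters before importing them into the Tabular model, and record
# whether each value was truncated.

def truncate_comment(comment, limit=4000):
    """Return the comment capped at `limit` characters, plus a truncation flag."""
    if len(comment) <= limit:
        return comment, False
    return comment[:limit], True

text, was_truncated = truncate_comment("A" * 5000)
print(len(text), was_truncated)  # 4000 True
```

You could run this in the view itself (e.g. with LEFT(comment, 4000) in T-SQL) or in a Power Query step; the Python above only illustrates the idea.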

Related

Does the XLSX engine in SAS scan all rows to determine variable lengths?

Does PROC IMPORT with the XLSX engine in SAS scan all rows to determine variable lengths? I have a whitepaper documenting that the XLSX engine scans all rows to determine the data type, but no paper so far confirms that SAS also scans all rows to determine variable lengths.
I would say that is the definitive answer on the subject, as Vince DelGobbo is the resident expert at SAS Institute on working with Excel. I don't know whether he personally wrote the XLSX engine, but he certainly knows it inside and out. So we can trust him here - and if it scans for data type, it also scans for length, since length is part of the data type.
However, this is trivially verifiable, and I did so: an Excel file with '1' in every row of column A except the last row (row 2**20), which contained a 240-character string.
PROC IMPORT returned a column with length 240, with no truncation issues.
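The verification above boils down to one property: the engine must examine every value, because the longest one can appear anywhere, including the very last row. A minimal sketch of that scan, mimicking the test file described (this is illustrative Python, not the SAS engine):

```python
# Mimic the test file: '1' in every row of column A except the last row
# (row 2**20), which holds a 240-character string. Finding the column
# length requires looking at every value.

rows = ["1"] * (2**20 - 1) + ["X" * 240]

max_len = max(len(v) for v in rows)
print(max_len)  # 240
```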

Oracle Identifier Maximum Length

I have a database in MS SQL Server.
I want to migrate it to Oracle as well.
But a few table and column names are more than 30 characters long.
Oracle does not accept any identifier that is more than 30 characters long.
Is there any option in Oracle that allows us to increase the length of the identifier?
Thanks
The maximum identifier length is 30 characters (only in Oracle 12.2 and later was the limit raised to 128 bytes).
This is the latest discussion on Stack Overflow showing that there is no way to increase the identifier length:
Change table/column/index names size in oracle 11g or 12c
There is also a discussion on the Oracle forum where users suggest increasing the length of the identifier:
https://community.oracle.com/ideas/3338
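Since the limit cannot be raised, the practical approach is to find the offending names before migration and rename them. A small illustrative helper (not part of Oracle or SQL Server; the function and names are assumptions for the sketch):

```python
# Illustrative pre-migration check: flag SQL Server identifiers that would
# exceed Oracle's 30-character limit so they can be renamed up front.

ORACLE_MAX_IDENTIFIER = 30

def needs_rename(identifier, limit=ORACLE_MAX_IDENTIFIER):
    return len(identifier) > limit

names = ["CUSTOMER_ID", "EXTREMELY_LONG_DESCRIPTIVE_COLUMN_NAME"]
too_long = [n for n in names if needs_rename(n)]
print(too_long)  # ['EXTREMELY_LONG_DESCRIPTIVE_COLUMN_NAME']
```

In practice you would feed this from the SQL Server catalog views (e.g. sys.tables, sys.columns) and maintain a rename mapping for the Oracle schema.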

Crystal Reports database field length

I'm running a Crystal Report connecting to a SQL Server 2005 database.
When I use the Standard Report Expert, add my SQL Server table, and click 'Browse data', the field length appears to be double what is in the database for strings (nvarchars and varchars). Any idea why?
IIRC a field that's nvarchar(2000), for example, will hold 2000 characters but has 4000 bytes reserved, since each character requires 2 bytes of storage. Could this be what you're seeing: SQL Server lists the size in characters and Crystal lists the size in bytes?
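The doubling follows directly from how nvarchar is stored: SQL Server keeps it as UTF-16, so every character reserves 2 bytes. A quick illustration of the character-count versus byte-count difference (plain Python, just to show the arithmetic):

```python
# nvarchar data is stored as UTF-16, so each character takes 2 bytes.
# SQL Server reports the character count; Crystal reports the byte size.

s = "hello"
print(len(s))                      # 5 characters
print(len(s.encode("utf-16-le")))  # 10 bytes - twice the character count
```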

SSRS 2008 passing multiple parameters Oracle 10g backend

After several years of using Cognos, we are in the process of testing the conversion of Cognos Reports (8.3) to SSRS 2008 reports. We use Oracle database version 10g. Many of the reports we are converting pass multiple values in parameters; however, I cannot get this working in SSRS pointing to the Oracle data source.
I have created a parameter and set it to allow multiple values. These columns are integer types. The SQL filter is set as follows, for example: where vendor_id IN (:Vendor_id). Yet when I test the SQL, I get errors. I enter parameter values as comma-separated, for example: 102, 105, 107. The error is as follows:
ORA-01722: invalid number
I've tried wrapping the values in single and double quotes with the same result. Is there a different format that meets Oracle's syntax requirements? Do multiple values only work for SQL Server databases?
Thanks in advance.
Joe
As pointed out in this post, multi-value parameters are concatenated and used as follows:
Select * from Table WHERE column1 in (:CommaSeparatedListOfValues)
http://consultingblogs.emc.com/stevewright/archive/2009/08/14/14763.aspx
So Vendor_id has to be VARCHAR2. I am guessing you currently have the data type of Vendor_id as integer?
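To see why ORA-01722 appears: SSRS passes the selected values as one concatenated string, so Oracle ends up comparing the NUMBER column vendor_id against the single string '102, 105, 107', which is not a valid number. Splitting that string back into individual numbers (which on the Oracle side would typically be done in a splitting function or pipelined table) is what makes the IN clause work. A sketch of both the failure and the fix, in illustrative Python rather than PL/SQL:

```python
# What Oracle receives is one string, not three numbers.
param = "102, 105, 107"

try:
    int(param)  # comparing a NUMBER column to the whole string fails
except ValueError:
    print("invalid number")  # analogue of ORA-01722

# Splitting the string into individual numeric values is the fix.
vendor_ids = [int(v.strip()) for v in param.split(",")]
print(vendor_ids)  # [102, 105, 107]
```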

SSIS Data Flow overflowing on dates

I have a FoxPro data source, and the destination is SQL Server 2008.
On the FoxPro side, I have a column with the Date data type. That's a width of 8, min value is 0001-01-01 and max is 9999-12-31. On the SQL Server side, I have a datetime. Also a width of 8, min value is 1753-01-01 and max is 9999-12-31.
In my SSIS Data Flow task, I have an OLE DB Source component that reads in the FoxPro table. The columns are mapped as DT_DBDATE in both External and Output Columns. The OLE DB Destination to the SQL Server table takes that column and flows it to a DT_DBTIMESTAMP. I'm sure that DT_DBTIMESTAMP can handle these date ranges adequately, and reasonably sure about DT_DBDATE (but am having trouble finding it in the MS documentation).
The Problem
When I execute the task, it fails, complaining of "Invalid date format" and then "Conversion failed because the data value overflowed the specified type" when it encounters values ranging from around the early 1900s to beyond 2050. I'm not exactly sure where I've gone wrong.
Bonus Question
How can I handle overflows in my data flow task?
You might want to run a simple VFP query looking for the records that are beyond the range, such as:
select * from yourtable
where YourDate < date( 1753, 1, 1 )
or YourDate > date( 9999, 12, 31 )
Look at those records for problems.
Additionally, look at other columns that may be causing your boundary issues.
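The same range check can be done on the extracted data outside VFP. A minimal sketch mirroring the query above, using SQL Server's datetime bounds of 1753-01-01 to 9999-12-31 (illustrative Python; not part of SSIS):

```python
# Flag dates that fall outside the SQL Server datetime range, mirroring
# the VFP query above.

from datetime import date

SQL_DATETIME_MIN = date(1753, 1, 1)
SQL_DATETIME_MAX = date(9999, 12, 31)

def out_of_range(d):
    return d < SQL_DATETIME_MIN or d > SQL_DATETIME_MAX

print(out_of_range(date(1, 1, 1)))      # True - FoxPro's minimum date
print(out_of_range(date(1905, 6, 15)))  # False - early 1900s is in range
```

Note that the early-1900s and post-2050 values from the error are themselves in range for datetime, so the rows that actually fail are likely corrupt or empty dates hiding alongside them.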
I think you need to decide what you want to do with your data.
If you want to load it as it is, you need to change the field data type so it will be able to hold the data.
Or, if you wish to correct the wrong data, you will need to validate and transform it.
If you are willing to spend some money, please consider using Advanced ETL Processor.
It works with Text, XML, Excel, Access, DBF, Foxpro, ODBC, OLE DB, MS Sql Server, Oracle, MySql, PostgreSQL, Firebird, Interbase, SQLite, POP3, SMTP, File System, FTP, SSL and Unicode.