I'm running a Crystal Report connecting to a SQL Server 2005 database.
When I use the Standard Report Expert, add my SQL Server table, and click 'Browse data', the field length appears to be double what is defined in the database for string fields (nvarchars and varchars). Any idea why?
IIRC, a field that's nvarchar(2000), for example, will hold 2000 characters but has 4000 bytes reserved, since each character requires 2 bytes of storage. Could this be what you're seeing: SQL Server lists the size in characters and Crystal lists it in bytes?
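You can see the character-versus-byte distinction directly in SQL Server with LEN versus DATALENGTH; a minimal sketch:

declare @s nvarchar(2000);
set @s = N'hello';
select len(@s)        as char_count,   -- 5 characters
       datalength(@s) as byte_count    -- 10 bytes: nvarchar stores 2 bytes per character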
A few of the columns in the source SQL view have lengths greater than 4000 characters. These are columns that contain user comments and need to be included in my Tabular SSAS model.
But whenever the character length is greater than 4000, I get an error while processing the model.
I found out that Tabular does not support column lengths greater than 4000 characters.
Is there any way to bypass this issue?
In Visual Studio 2017, at compatibility level 1400 (SQL Server 2017 and Azure Analysis Services), I was able to import this query both in legacy data source mode (the regular SQL driver) and in the new Power Query mode.
select cast(replicate(cast('A23456789' as varchar(max)), 1000) as varchar(max)) as str,
       len(cast(replicate(cast('A23456789' as varchar(max)), 1000) as varchar(max))) as len   -- 10,000 characters, well past the 4000 limit
What are you doing differently? Are you using an older compatibility level or something?
I would also question the analytical value of importing huge text strings into an SSAS Tabular model. Can you explain the value of those columns? Could you perhaps parse the relevant pieces of information out of the long strings?
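If the text really does have to come in, one hedged workaround is to split the column in the source view so each piece stays under the limit; the view and column names below are hypothetical:

-- Split the oversize comment column into pieces of at most 4000 characters
select left(UserComments, 4000)            as CommentsPart1,
       substring(UserComments, 4001, 4000) as CommentsPart2
from dbo.SourceView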
I have a complicated SQL query in Excel that generates a raw data table for a report from SQL Server via OLEDB. I would like to use Crystal Reports to generate a neat, formatted report. I can use the Excel raw data table as the Crystal Reports data source, but Excel needs to be refreshed every time a new report is required. How can I use the Excel SQL command text string as the Crystal Reports data source?
My Excel connect string: Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=True;Initial Catalog=MicrosoftDynamicsAX;Data Source=SQL2;Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=RD04;Use Encryption for Data=False;Tag with column collation when possible=False
I have Crystal Reports versions 8.5 and 11.
It's simple:
When creating the connection, don't select tables; instead, use 'Add Command', paste the query into the box, and use that command in your report.
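For example, the command text might look like the sketch below; the table and column names here are hypothetical, so paste your actual Excel query instead:

select AccountNum, Name, Balance
from dbo.CustTable
where DataAreaId = 'dat'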
I'm using Crystal Reports version 14.1.1.1036 to show an RTF field stored in my SQL Server as nvarchar(max). The problem is that my nvarchar(max) is interpreted as a Memo field in Crystal Reports, and only part of my data is being previewed (the text is truncated). The field's 'Can Grow' property is enabled, with 0 as the maximum number of lines. Thanks for any help.
After some research, it turned out that Crystal Reports has this limitation by design, and there is no direct way to fix it. The maximum size of a text field is 64KB, and the only workaround is to split very large fields into smaller strings.
Check this post for more information.
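If you control the data source, a sketch of the string-splitting approach on the SQL side looks like this (table and column names are hypothetical):

-- Expose the nvarchar(max) field as fixed-size chunks, each under the 64KB limit
select substring(RtfField, 1, 32000)     as RtfPart1,
       substring(RtfField, 32001, 32000) as RtfPart2,
       substring(RtfField, 64001, 32000) as RtfPart3
from dbo.Documents

The pieces can then be placed one after another in the report.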
After several years of using Cognos, we are in the process of converting our Cognos Reports (8.3) to SSRS 2008 reports. We use Oracle database version 10g. Many of the reports we are converting pass multiple values in parameters; however, I cannot get this working in SSRS pointing to the Oracle data source.
I have created a parameter and set it to allow multiple values. These columns are integer types. The SQL filter is set as follows, for example: where vendor_id IN (:Vendor_id). Yet when I test the SQL, I get errors. I enter the parameter values comma-separated, for example 102, 105, 107. The error is as follows:
ORA-01722: invalid number
I've tried wrapping the values in single and double quotes, with the same result. Is there a different format to meet Oracle's syntax requirements? Do multiple values only work for SQL Server databases?
Thanks in advance.
Joe
As pointed out in this post, multi-value parameters are concatenated and used as follows:
Select * from Table WHERE column1 in (:CommaSeparatedListOfValues)
http://consultingblogs.emc.com/stevewright/archive/2009/08/14/14763.aspx
So Vendor_id has to be VARCHAR2. I take it the data type of vendor_id is an integer?
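If the list really does arrive as one comma-separated string, a common Oracle-side workaround (a sketch, untested against your report) is to compare the numeric column against the delimited string as text:

Select *
From vendor_table    -- hypothetical table name
Where ',' || Replace(:Vendor_id, ' ', '') || ',' LIKE '%,' || TO_CHAR(vendor_id) || ',%'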
I have a FoxPro data source, and the destination is SQL Server 2008.
On the FoxPro side, I have a column with the Date data type: a width of 8, with a minimum value of 0001-01-01 and a maximum of 9999-12-31. On the SQL Server side, I have a datetime: also a width of 8, with a minimum value of 1753-01-01 and a maximum of 9999-12-31.
In my SSIS Data Flow task, I have an OLE DB Source component that reads the FoxPro table. The column is mapped as DT_DBDATE in both the External and Output Columns. The OLE DB Destination to the SQL Server table takes that column and flows it into a DT_DBTIMESTAMP. I'm sure that DT_DBTIMESTAMP can handle these date ranges adequately, and reasonably sure about DT_DBDATE (but am having trouble confirming it in the MS documentation).
The Problem
When I execute the task, it fails, complaining of "Invalid date format" and then "Conversion failed because the data value overflowed the specified type" when it encounters values ranging from around the early 1900s to past 2050. I'm not exactly sure where I've gone wrong.
Bonus Question
How can I handle overflows in my data flow task?
You might want to run a simple VFP query looking for records that are beyond SQL Server's datetime range, such as:
* find dates outside SQL Server's datetime range (1753-01-01 to 9999-12-31);
* a blank/empty FoxPro date will also show up here
select * ;
  from yourtable ;
  where YourDate < date( 1753, 1, 1 ) ;
     or YourDate > date( 9999, 12, 31 )
Look at those records for problems.
Additionally, look at other columns that may be causing your boundary issues.
I think you need to decide what you want to do with your data.
If you want to load it as it is, you need to change the destination field's data type so it can hold the data.
Or, if you wish to correct the wrong data, you will need to validate and transform it.
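For example, a hedged way to do the validation on the SQL Server side is to stage the dates as text and flag the rows that will not convert (table and column names here are hypothetical):

select YourDateText
from dbo.StagingTable
where isdate(YourDateText) = 0   -- 0 both for garbage values and for dates outside datetime's 1753-9999 range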
If you are willing to spend some money, please consider using Advanced ETL Processor.
It works with Text, XML, Excel, Access, DBF, FoxPro, ODBC, OLE DB, MS SQL Server, Oracle, MySQL, PostgreSQL, Firebird, InterBase, SQLite, POP3, SMTP, File System, FTP, SSL and Unicode.