XLSREAD unable to read file in MATLAB R2012

I have an Excel file of roughly 10,000 rows and around 800 KB in size.
When I try to import the data into MATLAB, either with the GUI import tool or using XLSREAD, I get the following message:
Could not open the spreadsheet. MATLAB reported the following error:
XLSREAD unable to read sheet "Sheet1"
File contains unexpected record length. Try saving as Excel 98
I tried saving as Excel 98, but it didn't help. The funny thing is, I can import other Excel files that are bigger than 10,000 rows and 800 KB in size.
Any ideas? My Excel file shouldn't contain anything special, just columns of numeric data with headers consisting of text.
UPDATE:
It seems this only happens when I use MATLAB on Ubuntu 12.10. When I tried it on Windows XP it worked just fine.

I know some time has passed, but I had the same problem with Ubuntu 16.04 and MATLAB R2016a. In my case deleting the columns didn't work.
My solution was to change the Excel file from .xls to .xlsx and try xlsread again (with the changed path, of course).
Please don't ask me why it works once it's saved in a Microsoft format.
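A minimal sketch of that workaround (the file name data.xls here is hypothetical): re-save the file as .xlsx in Excel or LibreOffice, then point xlsread at the new path.
% Hypothetical file: data.xls was re-saved as data.xlsx first.
[num, txt, raw] = xlsread('data.xlsx', 'Sheet1');  % numeric data, text headers, raw cells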

Try removing any empty columns/rows you have in your sheet, and also explicitly delete several columns/rows after the end of your data. MATLAB seems to have a problem with "empty" columns/rows.

Related

How to export large numbers from HeidiSQL to CSV

I use HeidiSQL to manage my database.
When I export grid rows to a CSV file, the large number 89610185002145111111 becomes 8.96102E+19.
How can I keep the number without the scientific-notation conversion?
HeidiSQL does not do such a conversion. I tried to reproduce this, but I get the unformatted number:
id;name
89610185002145111111;hey
That's when viewing the file in a text editor, by the way. If you open it in Excel, you may have to apply a different cell format to see the full number.

Import Flat File via SSMS to SQL Server fails

When importing a seemingly valid flat file (CSV, text, etc.) into a SQL Server database using the SSMS Import Flat File option, the following error appears:
Microsoft SQL Server Management Studio
Error inserting data into table. (Microsoft.SqlServer.Import.Wizard)
Error inserting data into table. (Microsoft.SqlServer.Prose.Import)
Object reference not set to an instance of an object. (Microsoft.SqlServer.Prose.Import)
The target table may contain rows that imported just fine. The first row that is not imported appears to have no formatting errors.
What's going wrong?
Check the following:
- there are no blank lines at the end of the file (leaving the last line's line terminator intact) - this seems to be the most common issue
- there are no unexpected blank columns
- there are no badly escaped quotes
It looks like the import process loads lines in chunks, so the lines following the last successfully loaded chunk may appear to have no errors. You need to look at the subsequent lines that are part of the failing chunk to find the offending line(s).
This cost me hours of hair pulling while dealing with large files. Hopefully this saves someone some time.
If the file you're importing is already open, SSMS will throw this error. Close the file and try again.
When you are creating your flat file, if you have text (varchar) values in any of your columns, do not make the file comma-delimited. Instead, choose the vertical bar "|" or some other character that you are sure cannot appear in those values; commas are very common inside nvarchar fields.
I had this issue, and none of the recommendations from the other answers helped me. It took me hours to figure out; I hope this saves someone some time.
None of the other suggestions worked for me, but this did:
When you import a flat file, SSMS gives you a brief summary of the data type of each column. Whenever you see an nvarchar in a column that actually holds int or double values, change it to int or double, and change all remaining nvarchars to nvarchar(MAX). This worked for me.
I've been working with CSV data for a long time. I ran into similar problems when I first started this job, but as a novice I couldn't get a precise fault out of the exceptions.
Here are a few things you should check before importing anything:
- your CSV file is not open in any other software, such as Excel
- your CSV cells do not include comma or quotation characters
- there are no unnecessary blanks at the end of your data
- no reserved term is used as data
Finally, open your file in Excel and save it as a new file.
After considering all the suggestions, if anyone is still having issues, check the length of the data type for your columns. It took me hours to figure this out, but increasing the nvarchar length from nvarchar(50) to nvarchar(100) worked for me.
One thing that worked for me: you can change the error range to 1 under "Modify Columns".
You then get an error message naming the specific problematic line in your file, instead of "ran out of memory".
I fixed these errors by playing around with the data types. For instance, I changed tinyint to smallint and smallint to int, and increased my nvarchar() lengths to reasonable values, otherwise setting them to nvarchar(MAX). Since most real-life data does have missing values, I also allowed missing values in all columns. Everything then worked, with a warning message.

Data Conversion Failed SQL

I am using the import and export wizard and imported a large csv file. I get the following error.
Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data
conversion for column "firms" returned status value 2 and status text "The
value could not be converted because of a potential loss of data.".
(SQL Server Import and Export Wizard)
Upon importing, I use the Advanced tab and make all of the adjustments. For the field in question, I set it to numeric(8,0). I have since gone through this process multiple times and tried precisions of 7, 8, 9, 10, and 11, to no avail. I imported the CSV into Excel and looked at the respective column, firms: it shows no entry with more than 5 characters. I thought about making it DT_STR, but I will eventually need to manipulate that column by averaging it. I have also searched for spaces or strange characters and found none.
Any other ideas?
1) Try changing the numeric precision to numeric(30,20) in both the source and the destination table.
2) Change the data type to DT_STR/DT_WSTR and adjust the output column width while importing; it will then run fine. This happened to me as well while loading a large CSV file of approximately 5 GB. After the load, use the TRY_CONVERT function to convert the column back to numeric and check which values became NULL during the conversion; that will show you the root cause.
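A minimal T-SQL sketch of that check (the staging table stage_firms is a hypothetical name, not from the question):
-- Hypothetical staging table: firms was imported as varchar.
-- List the values that cannot be converted to numeric(8,0).
SELECT firms
FROM stage_firms
WHERE firms IS NOT NULL
  AND TRY_CONVERT(numeric(8,0), firms) IS NULL;
The rows returned are exactly the ones that would go NULL during the conversion back to numeric.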

Loading multiple non-CSV tables into R and performing a function on each file

First day on R. I may be expecting too much from it, but here is what I'm looking for:
I have multiple files (140 tables), and each table has two columns (V1 = values and V2 = frequencies). I use the following code to get the average from each table:
sum(V1 * V2) / sum(V2)
I was wondering if it's possible to do this once instead of 140 times, i.e. to load all the files and get an exported file that shows the average of each table next to the original file name.
I use read.table to load the files, as read.csv doesn't work well for some reason.
I'll appreciate any input!
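A minimal R sketch of that loop, assuming all 140 files sit in one directory (the directory name and output file name here are hypothetical). Note that read.table's default column names are V1, V2, ...:
# Hypothetical directory; adjust the path to your files.
files <- list.files("tables", full.names = TRUE)

# Weighted average for one file: V1 = values, V2 = frequencies.
avg_one <- function(f) {
  d <- read.table(f)          # default column names: V1, V2
  sum(d$V1 * d$V2) / sum(d$V2)
}

result <- data.frame(file = basename(files),
                     avg  = sapply(files, avg_one))

# One exported file with the average next to each original file name.
write.csv(result, "averages.csv", row.names = FALSE)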

Excel to MATLAB structure using column names

I have to analyze data given in Excel format. I will use MATLAB, and I want to write code which automatically creates a structure using the column names.
The columns are formatted as follows:
Speed_55m.max Speed_55m.min Speed_55m.stdev Speed_55m.value
That set of 4 names repeats for different heights. I want to have a loop which reads the column names and creates a structure.
I have tried the following code:
[a,b]=xlsread('PP_RR.xlsx');
for icol=1:size(a,2)
char(b{icol})=a(:,icol);
end
But I received the following error:
Subscripted assignment dimension mismatch.
A workaround is to use the file browser in the MATLAB desktop. Double-click the Excel spreadsheet to open the import wizard and select "Import as Table". MATLAB will automatically create a 'table' variable with the same column names as your spreadsheet. If that does not work, convert the .xlsx to a .csv, and it will for sure.
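A programmatic sketch of the same idea using readtable (available in R2013b and later); note that readtable sanitizes invalid characters in headers, so dotted names like Speed_55m.max come out with the dots replaced:
% Read the sheet as a table; headers become variable names.
T = readtable('PP_RR.xlsx');

% Copy each column into a struct field named after the column,
% using dynamic field names instead of char(b{icol}) = ...
s = struct();
for icol = 1:width(T)
    name = T.Properties.VariableNames{icol};
    s.(name) = T.(name);
end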