I have to analyze data given in Excel format. I will use MATLAB, and I want to write code which automatically creates a structure using the column names.
The columns are formatted as follows:
Speed_55m.max Speed_55m.min Speed_55m.stdev Speed_55m.value
That set of four names repeats for different heights. I want to have a loop which reads the column names and creates a structure.
I have tried the following code:
[a,b]=xlsread('PP_RR.xlsx');
for icol=1:size(a,2)
    char(b{icol})=a(:,icol);
end
But I received the following error:
Subscripted assignment dimension mismatch.
A workaround is to use the file explorer window in MATLAB (the Current Folder browser). Double-click the Excel spreadsheet to open the Import Wizard and select "Import as table". MATLAB will automatically create a 'table' variable with the same column names as your spreadsheet. If that does not work, convert the .xlsx to a .csv first and the import will work reliably.
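If you would rather do it programmatically, here is a minimal sketch of what the loop could look like. It assumes the first row of PP_RR.xlsx holds the column headers (so they come back in the text output of xlsread) and that every data column is numeric; matlab.lang.makeValidName is used because '.' is not allowed in structure field names, so 'Speed_55m.max' becomes 'Speed_55m_max':
[a, b] = xlsread('PP_RR.xlsx');   % a = numeric data, b = text cells (headers assumed in row 1)
S = struct();
for icol = 1:size(a, 2)
    fname = matlab.lang.makeValidName(b{1, icol});   % turn the header into a legal field name
    S.(fname) = a(:, icol);                          % dynamic field name, one column per field
end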
I am using the GETPIVOTDATA function in Excel to source data from a pivot table generated by a Power BI query (everything was originally only in Excel, but the file got too large, so I stored the main tables in Power BI and kept the reports in Excel for management's sake).
=GETPIVOTDATA("[Measures].["&$A$100&"]",'PIVOT Table_test'!$A$126,"[Master].[field1]","[Master].[field1].&["&C$26&"]","[Master].[AsofDate]","[Master].[AsofDate].&[2022-04-30T00:00:00]")
I want to make the GETPIVOTDATA function as dynamic as possible, to avoid hardcoding fields/items for each table that feeds the charts we look at. However, when I reference the pivot table, the '[AsofDate]' field populates the static item as "...&[2022-04-30T00:00:00]")...
I have been trying to change that to reference a header row that contains a Short Date value (4/30/2022), like &["&$B&1"&"]")..., but I keep getting #REF! errors. Every other field accepts this "&&" concatenation method, and when I leave the hardcoded timestamp in the formula, it populates.
So it has to be that reference, but I do not understand what I am doing wrong. I have also tried changing the format of both the header row in Excel and the field within Power BI, but without success.
I found the answer on another site. The solution is to write the following inside the item brackets:
["&TEXT($A22,"yyyy-mm-dd""T00:00:00""")&"]
I am trying to use an array-type column in Dataprep, and it looks fine in the Dataprep display UI, as in the picture below.
But when I run the job with .csv output, there are invalid values in the array column.
Why does the .csv output differ from the Dataprep display?
Array in Dataprep display
Array in csv output
It looks like these two columns each contain the complete record...? I also see some non-English characters in there. I suspect something to do with line breaks and/or encoding.
What do you see if you open the CSV file in a plaintext editor, instead of Excel?
What edition of Dataprep are you using (click Help => About Dataprep => see the Edition heading)?
What version of Excel are you using to open the CSV file?
Assuming this is a straightforward flow with a single dataset and recipe, could you post a few rows of data and the recipe itself (which you can download) for testing purposes?
I am trying to create a data source in Tableau (10.0) where I am joining a table from SQL with an Excel file. The join happens on a site ID, but when reading the ID from the Excel source, Tableau strips the leading zeros (SQL keeps the leading zeros). I have seen this example of adding the leading zeros back as a new calculated field, but the join is still dropping rows because the ID is not properly formatted when making the join.
How do I get the Excel data source to read the column with the leading zeros so I can do the join?
Launch Excel and choose to open a new blank workbook.
Click the Data tab and select From Text.
Browse to the saved CSV file and select Import.
Ensure that Delimited is selected and click Next.
Leave Tab as the delimiter and click Next.
Select the column containing the data with leading zeros and click Text.
Repeat for each column which contains leading zeros.
Click Finish.
Click OK.
I've never heard of or used Tableau, but it sounds as though something (the Jet/ACE database driver being used to read the Excel file?) is determining the column to be numeric and parsing the data as numbers, losing the leading zeroes.
If your attempts at putting them back are giving you grief, I'd recommend trying the other direction instead: get SQL Server to convert its strings to numbers. Number matching should be more reliable than string matching, so long as the two systems don't handle rounding differently :)
If your Excel file was read in from a CSV and the Site ID is showing "Number Stored as Text", I think you can solve your problem by telling Tableau in the data source entry that the field is actually a string. In the data source preview, change the "#" (designating a number) to string so that the SQL source and the Excel source are both strings before doing the join.
This typically has to do with the way Excel stores values, as mentioned above. I would play around with the number formatting for the Site ID column in Excel itself, not Tableau, and change it to "Text" in Excel. You can verify whether Tableau will read the leading 0s properly by exporting your Excel file to CSV and checking in the CSV file whether the leading 0s are still there.
Suppose that we have this code in MATLAB:
data = [1,4,43,21,12];
I want to write this to an Excel file in row 5. We can use A5:E5, but the size of my data changes (the data above is just an example), and I need this data in row 5 of my Excel file each time. How can we do this using xlswrite or other related functions?
If you are OK with opening and closing the entire xlsx file each time, I would load the file into a table:
T = readtable('filename');
and subsequently access/modify data in the table using curly braces (treated similarly to a cell array), e.g.:
D = T{i,j};
Finally I would write the table back to an xlsx file:
writetable(T,filename);
Help on the functions:
http://uk.mathworks.com/help/matlab/ref/readtable.html
http://uk.mathworks.com/help/matlab/ref/writetable.html
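If you would rather keep using xlswrite directly, a minimal sketch ('myfile.xlsx' is just a placeholder name, and the data is assumed to be a row vector starting in column A of sheet 1): when you pass only the top-left cell of the range, xlswrite sizes the range to the data, so the changing length of the vector does not matter.
data = [1,4,43,21,12];
xlswrite('myfile.xlsx', data, 1, 'A5');   % writes the whole row starting at cell A5 of sheet 1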
How do I export the entire query results to xls format without the values being truncated (with the header intact)?
For example, the value is rounded up to 276408428673510000 when the actual value is supposed to be 276408428673508271.
Use this method:
After selecting the data with 'Ctrl+A' in the grid, use 'Ctrl+Shift+C' to copy the data with the header. After pasting the data into MS Excel, try changing the cell format to Number or Text. As you probably know, 'Format Cells' is available as a right-click option in MS Excel.
My solution was to use the LPAD or RPAD functions: you give the exact length that you want, and when you export the data, the values are not rounded.