MATLAB data pull and save [closed]

What I'm attempting to do is simple:
Given an Excel dataset and user-defined inputs, open the associated Excel file, pull the information that matches the user's input, and save it to a separate Excel file.
I've already developed a list of values, and the program recognizes user input with associated checks. I'm stuck on getting MATLAB to use this information to open the correct dataset: I don't know how to get MATLAB to pull a row/column from Excel with a silent open, and I don't know how to save that data into a separate Excel file. Any help would be appreciated, thank you.

Consider using the functions readtable and writetable if you have a recent MATLAB release (R2013b or later). The readtable function will 'silently' read data from a specific worksheet in an Excel file into a MATLAB table variable, without opening Excel. From there you can 'query' the table to find the specific rows you want and write the result to a new Excel file with writetable.
With readtable, you can specify where the data comes from using the 'Sheet' and 'Range' parameters:
requested_data = readtable(excel_file, ...
    'Sheet', input_sheet_name, ...
    'Range', input_data_range);
and write the data to another Excel file with
writetable(requested_data, output_excel_file, ...
    'Sheet', output_sheet_name, ...
    'Range', output_data_range);
Note: Remember to set the values of excel_file, input_sheet_name, input_data_range, output_excel_file, output_sheet_name, and output_data_range before running the commands above.
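For instance (the file names, sheet names, and ranges below are placeholders, not values from the question; substitute your own):
excel_file        = 'source_data.xlsx';      % placeholder source workbook
input_sheet_name  = 'Sheet1';
input_data_range  = 'A1:D100';
output_excel_file = 'extracted_data.xlsx';   % placeholder destination workbook
output_sheet_name = 'Results';
output_data_range = 'A1';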
To query the table and pull out the rows you need, one approach is to use ismember, as in this answer; a sketch follows below.
Finally, use writetable to store the values.
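For example, a minimal sketch of the query step, assuming the worksheet has a column named PartNumber and user_values holds the entries the user asked for (both names are placeholders, not from the original question):
requested_data = readtable(excel_file, 'Sheet', input_sheet_name);   % silent read, no Excel window
keep           = ismember(requested_data.PartNumber, user_values);   % logical index of matching rows
filtered_data  = requested_data(keep, :);                            % subset of the table
writetable(filtered_data, output_excel_file, 'Sheet', output_sheet_name);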
See also: sheetnames, detectImportOptions, and SpreadsheetImportOptions
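If the worksheet layout varies, a rough sketch of how those functions fit together (sheetnames requires R2019b or later; 'Var1' and 'Var2' are placeholder column names, not from the original question):
sheets = sheetnames(excel_file);                  % list the worksheets in the workbook
opts   = detectImportOptions(excel_file, 'Sheet', sheets(1));
opts.SelectedVariableNames = {'Var1', 'Var2'};    % import only these columns
subset = readtable(excel_file, opts);             % still a silent read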

Related

PowerShell: Compare two CSV files and export the data that is different

I have looked at many different questions on Stack Overflow to try to figure this out; unfortunately I have had zero luck. I would like to compare two CSV files and export the differences to a new sheet.
Both sheets have matching header fields, and the only column that will ever change is PRICE.
The result I ultimately want is for the new export sheet to look like this; any help would be appreciated.

Column to Row transpose in Talend [closed]

I want to perform a crosstab operation using Talend Open Studio.
My source looks like:
id   201601   201602   201603   ...
1    aa       bb       cc       ...
I want to get the output like below:
id   date     value
1    201601   aa
1    201602   bb
1    201603   cc
...  ...      ...
The column names depend on the date, so I need an automatic way to convert columns into rows.
You may use tSplitRow.
See the screenshot with the job, the tSplitRow configuration, and the schema.
I think you can try the tUnpivotRow component. However, you need to know that this is a custom component created by community member daztop.
The component is available to download from this link.
Under this link you will find instructions on how to use the component.
Also, if your data are stored in a database, you can transpose columns to rows directly in that database by running the proper SQL query via Talend (the query depends on the database engine).

Save an ordered list of data into Core Data [closed]

I have an NSTableView where the user can add rows.
Each row is saved directly into Core Data.
At the moment I can request the records from Core Data and sort them by object ID to get the correct order, because the object ID is nearly the same as an incremental number.
But now the user can reorder the rows.
How can I save this new order of rows into Core Data?
Using the id as a way to order is not guaranteed to work, so you shouldn't do that.
Instead, add a field that represents the order.
What we do is have a field called pos that is an integer, and we set it sparsely. The first record can be 100, the second 200, and so on. Then when we re-order, we set pos to the mid-point of the records right before and after. Every once in a while, you need to re-number the records. The more sparsely you space them, the less often you need to re-number.
When you add a new record, set its pos to the maximum value + 100 (or whatever spacing you are using).

Join two data sets by year in Tableau [closed]

I have two datasets which look like this:
# 2013_data.tsv
year   state   age
2013   CA      22,5
2013   OH      19,3
2013   IL      45,5
2013   TX      33
# 2012_data.tsv
year   state   age
2012   CA      23
2012   OH      21,5
2012   CA      44,3
2012   TX      34,4
I want to use year as a pager on the Tableau map.
How can I join these separate data sources?
You could blend on year, but if the year is always different in each data source then the blend will not match on anything and you will get no results.
I am guessing that each data source (TSV file) has the same format (same number of columns and column names). In that case you can extract each data source with Tableau Desktop and then add the data from each source to get a master extract (you are basically appending the data extracts),
and you will get all the data in one extract.
From here it is simple to combine the years in one visualization.
Also, since this is SO, I will point out that you can do this programmatically with the Extract API (see https://www.tableau.com/learn/tutorials/on-demand/extract-api-introduction).
Your best approach in this case is to put all the data into one table before using Tableau. (It looks like what you really want is a union instead of a join.)
Another approach is to put the two tables in the same database, or two tabs on the same spreadsheet, and use custom SQL to union them all together. Or you can append multiple tables into a single Tableau extract as emh described.
If you are conceptually joining tables instead of unioning them, you could also use data blending.

Read contents from txt file using T-SQL [duplicate]

Possible Duplicate:
SQL Server File Operations?
Is there any chance I could use T-SQL to read the first line of a txt file?
Actually, I have a CSV file and the first line contains the names of all of the hundreds of columns. I have already coded the part where I use that first line to generate a table with all those columns. So I really want to figure out how to do the reading part.
You could look at the BULK INSERT statement.