Exporting and importing a MATLAB map structure - matlab

I'm using the containers.Map class to store my data.
Is there an easy way to export the whole structure to a file and import it again at a later time?
A structure could be:
keys = {'six','seven','eight','nine'};
vals = {6,7,8,9};
Map = containers.Map(keys,vals);
and then say I want to export this structure and be able to import it at a later time (or in another script).
Thanks, regards
Rasmus

Use the save and load functions: http://www.mathworks.com/help/matlab/matlab_env/save-load-and-delete-workspace-variables.html:
save('MyFileName', 'Map');
load('MyFileName');

Related

Draw.io import diagrams from CSV using an API

In draw.io there is a very nice option to create a diagram using the CSV import utility (Arrange->Insert->Advanced->CSV). It is very simple and straightforward.
I was trying to find a way to do it using an API (REST for example), is there a way to do it?
One more question:
Does anybody know if there's a way to create a draw.io file with multiple pages using the CSV import utility?
Thanks
Danny
Absolutely possible. Working example here: https://github.com/GanizaniSitara/drawio/
pyMX.py is the file you want to look at first.
It creates the diagram as XML, then encodes it and packs it into the drawio format.
It needs input data as CSV in this format:
Level0,Level1,Level2,AppName,TC,StatusRAG,Status,HostingPercent,HostingPattern1,HostingPattern2,Arrow1,Arrow2,Link
Cool Division,Some Department,Some Department2,SomeString,Zero,25,red,green,0,Azure,Linux,up,up,http://www.gooogle.com
Rinse and repeat for anything else you need to create. It's rough code, ping me here or on GitHub if anything needs clarification.
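To make the packing step concrete, here is a minimal Python sketch of the encoding drawio expects (this is my own illustration, not code from pyMX.py; the XML content and file name are placeholders):
import base64
import zlib
from urllib.parse import quote

def pack_drawio(graph_xml):
    # drawio stores each page as base64( raw-deflate( url-encode( mxGraphModel xml ) ) )
    url_encoded = quote(graph_xml, safe='')
    compressor = zlib.compressobj(9, zlib.DEFLATED, -15)  # wbits=-15 -> raw deflate, no zlib header
    deflated = compressor.compress(url_encoded.encode('utf-8')) + compressor.flush()
    return base64.b64encode(deflated).decode('ascii')

graph_xml = '<mxGraphModel><root><mxCell id="0"/><mxCell id="1" parent="0"/></root></mxGraphModel>'
page = '<diagram name="Page-1">%s</diagram>' % pack_drawio(graph_xml)
# each additional <diagram> element inside <mxfile> becomes another page,
# which is one way to get a multi-page file out of a script
with open('diagram.drawio', 'w') as f:
    f.write('<mxfile>%s</mxfile>' % page)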

Store tMap flow data into HashMap

I have a scenario where I want to store data in a HashMap after processing it in a tMap component.
My flow is as follows:
tFileInputExcel------>tMap------>tJavaRow/tJavaFlex
In tJavaRow, I want to store all my data in a HashMap.
Any help on this would be appreciated.
In addition to Philipp's answer, I would like to add that you could use the tHash components (tHashInput / tHashOutput). Usually those are deactivated when Talend is installed and you need to activate them first.
Then they are very easy to use: just add the components as you normally would. That saves you from writing code.
It's not difficult to do this but it'll be less obvious to the "reader" of the code than the usage of the components.
For readability I'd recommend a tJavaFlex component over the tJavaRow. Assuming the Flow "toHashMap" exits your tMap with the fields "myKey" and "myValue", the code in the tJavaFlex would look like this:
Start code:
java.util.HashMap<Object, Object> myAwesomeMap = new java.util.HashMap<>(); // fully qualified, so no extra import is needed in the component settings
Main code:
myAwesomeMap.put(toHashMap.myKey, toHashMap.myValue);
End code:
/*
whatever you want to do with the aggregated data. You'll probably want to save it to the globalMap in any case.
*/
globalMap.put("myAwesomeMap", myAwesomeMap);

Export ALS recommendation model to a file

I am new to Apache Spark. I ran the sample ALS algorithm code present in the examples folder. I gave a CSV file as input. When I use model.save(path) to save the model, it is stored as a gz.parquet file.
When I tried to open this file, I get these errors
Now I want to store the recommendation model generated in a text or csv file for using it outside Spark.
I tried the following function to store the generated model in a file, but it did not help:
model.saveAsTextFile("path")
Please suggest a way to overcome this issue.
Let's say you have trained your model with something like this:
val model = ALS.train(ratings, rank, numIterations, 0.01)
All that you have to do is:
import org.apache.spark.mllib.recommendation.ALS
import org.apache.spark.mllib.recommendation.MatrixFactorizationModel
import org.apache.spark.mllib.recommendation.Rating
// Save
model.save(sc, "yourpath/yourmodel")
// Load Model
val sameModel = MatrixFactorizationModel.load(sc, "yourpath/yourmodel")
As it turns out, saveAsTextFile() only writes its output on the workers. Use collect() to gather the data from the workers so it can be saved locally on the driver. A solution can be found here.
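If you happen to be using PySpark instead of Scala, the same collect-and-write idea looks roughly like the sketch below. It assumes an already trained MatrixFactorizationModel called model (as above); the output file names are arbitrary:
# collect() pulls the factor matrices back to the driver, where they can be written as plain CSV
with open('user_features.csv', 'w') as out:
    for user_id, features in model.userFeatures().collect():
        out.write('%d,%s\n' % (user_id, ','.join(str(x) for x in features)))

with open('product_features.csv', 'w') as out:
    for product_id, features in model.productFeatures().collect():
        out.write('%d,%s\n' % (product_id, ','.join(str(x) for x in features)))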

How can I change the name of the file being saved without editing the code? MatLab

I am working on an experiment that takes a lot of data samples and saves them to different video files. For example, I take data and save it to a file called "slowmotion.avi", and then I take another set of data and save it to another file called "normalspeed.avi".
I am trying to find a way to change the name of the file being saved without editing the code. The way I am doing it right now requires me to open the code, change the file name inside it, and then save the code again. I want to make this a lot faster and easier to use.
I have tried the following code to fix this problem, but it doesn't work:
graph=input('Graph of experiment: ');
vidObj = VideoWriter('%3.1f.avi\n',graph);
.....
Hope I didn't confuse you.
A possible solution:
graph=input('Graph of experiment: ','s');
vidObj = VideoWriter([graph,'.avi']);
The 's' in the input() function indicates that the expected input is a string, and [graph,'.avi'] simply concatenates both strings.

What is the best file parsing solution for converting files?

I am looking for the best solution for custom file parsing for our enterprise import routines. I basically want to change one file format into a standard file format and have one routine that imports that data into the database. I need to be able to create custom scripts for each client, since it's difficult to get customers to comply with a standard or template format. I have looked at PowerShell and IronPython to do this so far, but I am not sure this is the route I want to go. I have also looked at tools such as Talend, a drag-and-drop style tool, which may or may not give me the flexibility I want. We are a .NET shop and have created custom code to do this in the past, but I need something that is quicker to create than coding custom parsing functions each time we get a new file format in.
Depending on the complexity and variability of your work, you should consider an ETL tool like SSIS (SQL Server Integration Services).
Python is wonderful for this kind of thing. That's why we use it. Each new customer transfer is a new adventure, and Python gives us the flexibility to respond quickly.
Edit: All Python scripts that read files are "custom file parsers". Without an actual sample of your files, it's not sensible to provide a detailed example:
with open( "some file", "r" ) as source:
for line in source:
process( line )
That's about all there is to a "custom file parser". If you're parsing .csv or .xml files, then Python has modules for that. If you're parsing fixed-format files, you'd use string slicing operations. If you're parsing other files (X12? JSON? YAML?) you'll need appropriate parsers.
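For the .csv case, a minimal sketch using the standard csv module (this example is mine, not part of the original answer, and the field names are hypothetical):
import csv

def process( record ):
    print( record['field1'], record['field2'] )

with open( "some file.csv", "r", newline="" ) as source:
    for record in csv.DictReader( source ):   # each row becomes a dict keyed by the header row
        process( record )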
Tab-Delim.
from collections import namedtuple
RecordLayout = namedtuple('RecordLayout',['field1','field2','field3',...])
def process( aLine ):
    record = RecordLayout( *aLine.rstrip('\n').split('\t') )   # unpack the split fields into the namedtuple
    ...
Fixed Layout.
from collections import namedtuple
RecordLayout = namedtuple('RecordLayout',['field1','field2','field3',...])
def process( aLine ):
    fields = ( aLine[:10], aLine[10:20], aLine[20:30], ... )
    record = RecordLayout( *fields )   # unpack the slices into the namedtuple
    ...