Socket programming in MATLAB? - matlab

I have two machines running MATLAB and I need to exchange information (numbers, images) between them. Is there a way I can do this in MATLAB?

If you have Parallel Computing Toolbox and MATLAB Distributed Computing Server, you can use MPI-style programming to send data between the two MATLAB processes. You can use functions like labSend and labReceive to send and receive data.
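For concreteness, here is a rough, untested sketch of how that exchange might look inside an spmd block (the pool size and the random matrix are just placeholders; newer releases rename these functions to spmdSend/spmdReceive):

```matlab
% Rough, untested sketch: worker 1 ships a matrix to worker 2 inside spmd.
% Across two machines the pool would come from an MJS/MDCS cluster profile
% rather than the local profile implied here.
parpool(2);
spmd
    if labindex == 1
        payload = rand(256);          % stand-in for an image or numbers
        labSend(payload, 2);          % send to worker 2
    elseif labindex == 2
        payload = labReceive(1);      % blocks until worker 1's data arrives
        disp(size(payload));
    end
end
```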

There are several possibilities without any extra toolboxes, depending on your specific needs. Check the Matlab help on external interfaces for details. For high performance, writing a custom C communication layer and calling it through MEX is probably your best option. Using shared files on network storage is an alternative that is easier to implement but less efficient, especially if you need frequent communication.
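As an illustration of the shared-file route, something along these lines is usually enough for two MATLAB sessions (the UNC path and file names are made up; adjust to your setup):

```matlab
% Rough sketch of the shared-file alternative between two MATLAB sessions.
% '\\server\share' is a placeholder for storage both machines can reach.
shared = '\\server\share';

% --- Machine A: write the payload, then an empty "ready" marker ---
data = rand(480, 640);                                   % e.g. an image
save(fullfile(shared, 'exchange.mat'), 'data');
fclose(fopen(fullfile(shared, 'exchange.ready'), 'w'));  % signals completion

% --- Machine B: wait for the marker, then load and clean up ---
while ~exist(fullfile(shared, 'exchange.ready'), 'file')
    pause(0.5);
end
s = load(fullfile(shared, 'exchange.mat'));
delete(fullfile(shared, 'exchange.ready'));
delete(fullfile(shared, 'exchange.mat'));
imagesc(s.data);
```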

For an example of socket programming in MATLAB using Java, see this related post.

Adding some specifics to other answers, here's an example of using Java for sockets. multicore and MatlabMPI use the filesystem, so I believe if you have a shared network filesystem you could use them across machines. And here is an old implementation in C++.
We have direct experience only with multicore, which is the least like real socket communication of the options above, but it gets the job done for coarse-grained parallel jobs.
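To give a flavour of the Java route mentioned above, here is a minimal, untested sketch of pushing a numeric vector from one plain MATLAB session to another over TCP (the host name and port are placeholders, and error handling is omitted):

```matlab
% --- Receiver (run first, on machine A) ---
ss   = java.net.ServerSocket(4000);
sock = ss.accept();                              % blocks until a client connects
in   = java.io.DataInputStream(sock.getInputStream());
n    = in.readInt();                             % number of doubles to expect
buf  = zeros(1, n);
for k = 1:n
    buf(k) = in.readDouble();
end
sock.close(); ss.close();

% --- Sender (run second, on machine B) ---
sock = java.net.Socket('machineA.example.com', 4000);    % placeholder host
out  = java.io.DataOutputStream(sock.getOutputStream());
v    = rand(1, 1000);                            % data to ship
out.writeInt(numel(v));
for k = 1:numel(v)
    out.writeDouble(v(k));
end
out.flush();
sock.close();
```

For larger payloads such as images, you would typically typecast the array to a byte vector and send it with a single write call rather than looping over elements.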

Related

Matlab: interfacing with Windows C++ executable

In my team, we are using a high-performance C++ program to read data from the network. We want to place such data in a shared memory buffer in our C++ process for reading in a separate Matlab process that will further asynchronously process the data and provide a display interface.
All this is running on Windows.
Which of the many cross-language mechanisms in Matlab is best suited to this purpose?
Thanks!
The best strategy is to use a memory-mapped file to provide data from one component and parse it from another. Matlab's locking primitives appear to be fairly limited, but fully worked examples are on the MathWorks website, including a simple chat application that passes data between two Matlab instances.
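As a rough illustration of the consuming side, something like the following could poll a mapped file that the C++ process writes into. The path, the 512x512 layout, and the counter-as-flag convention are assumptions you would replace with whatever layout the two processes agree on; this is not a real locking scheme.

```matlab
% Sketch of the memory-mapped-file idea from the MATLAB side.
m = memmapfile('C:\ipc\frame.dat', ...                      % placeholder path
               'Format', {'double', [1 1],     'counter'; ...
                          'double', [512 512], 'frame'}, ...
               'Writable', false);

lastSeen = -1;
while true
    c = m.Data(1).counter;
    if c ~= lastSeen                 % producer bumped the counter: new frame
        frame = m.Data(1).frame;     % copy the payload out of the mapping
        lastSeen = c;
        imagesc(frame); drawnow;     % display / further processing goes here
    end
    pause(0.01);
end
```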

Setting up a distributed computing grid on local network using .NET

In our firm we run complex simulations using our own software developed in .NET. These simulations are well-suited to parallel computation and we currently make much use of the various multi-threading features native to .NET. Even so, simulations often take hours or days.
We'd like to explore the potential of distributing computation over our local network of high-performance (24 core) workstations to access more CPU power. However we have no experience in this area.
Searching on Google reveals a few MPI-based options such as Pure MPI, MPI.NET, plus some commercial software such as Frontier.
Which solution should we consider for something that is ideally well-suited to a .NET environment and is relatively easy to set up?
Thanks!
Multithreading != grid computing, so you will need to rewrite some parts of your application regardless of what you choose in the end.
I don't know your network infrastructure, but it sounds like you want to use normal desktop workstations to distribute the code. I wouldn't use MPI for that. MPI was developed for clusters and supercomputers where the network provides high bandwidth and low latency; those are not the properties of a traditional office network (unless I misunderstood something).
The next thing you have to deal with is that users shouldn't turn off their machines while computations are running on them. No grid computing platform (including MPI) deals with this kind of issue, as such platforms usually run on server hardware that rarely fails and is on 24/7.
I don't think there is a simple and inexpensive solution to this. You could have a service running on each machine that executes code from DLLs with predefined parameters and sends back responses; those assemblies could be downloaded from some Windows share. But you want really large pieces of work to be distributed this way; you would hardly see any improvement if each unit of work runs for only a minute or less.
In the end you would also need a way to discover which of those services are online: some kind of in-memory DB where every service registers its IP address and online status, so that clients know where they can send work. This could be done with RavenDB (since you said you are working with .NET), Redis, or an application written specifically for this kind of problem, ZooKeeper.

Octave/Matlab solution for analyzing data coming from network

I am currently doing some research that involves analyzing data coming from different sensors. The data is provided via a network interface. I want to take advantage of the procedures already available in MATLAB/Octave (error computation, plotting, etc.).
Which of these is the better approach:
writing an application in another language and calling Octave/MATLAB functions with the data received from the network?
writing an application in Octave/MATLAB that itself handles incoming data from the network interface?
...
Any other solutions and experiences are highly appreciated.
Thank you,
Iulian
LATER EDIT:
I am more interested in using Octave than MATLAB, but for now I am just looking for a working method.
I've not used it, but there's a sockets package for Octave. If this works, writing the whole application in Octave seems easier to me than dealing with cross-language calling.
The "doing an application in octave/matlab which handles incoming data from network interface?" should be (fairly) easy, as you can easily use Java objects in MATLAB. You can use the Java network interfaces (or possibly a wrapper around them, if that makes it easier for you). I've worked on projects that take this approach to have Java threads handle all the networking and allowing MATLAB to grab results from the Java periodically and display/process them.

Linking Multiple Computers to Process a Task

I am unsure whether this question belongs here, so please feel free to migrate it if it doesn't.
My question is this: is it possible to combine many different PC units to work as one?
Take for example, buying 3 different HP desktop PCs. Then link the hardware so that they act as one PC.
If so, please point me to some resources I can use.
Thanks for your time.
Note
I am not referring to linking them over a network, but rather, making the actual hardware work together.
I am not sure this is even possible, so my Google search terms are probably not the right ones.
You should realize that linking them over a network does not prevent them from working together to complete a task. Most supercomputers and clusters today are interconnected via a network (albeit a very high-speed one like InfiniBand). The key is to have software that understands it is operating in a distributed environment (e.g. MPI libraries). You might also take a look at OpenMP or Hadoop. It really depends on what you want to do with it.
You cannot simply link a few computers together and have them behave like one. For that you would need special hardware designed to scale the number of CPUs working together (like a Cray).
If you are talking about writing an application that will be processed by those computers, you are probably looking for MPI.
You can use Open MPI for that; most languages nowadays have MPI bindings.
You can find more detailed information in the Wikipedia article on Parallel Computing.

Communication between applications written in different languages

I am looking at linking a few applications together (all written in different languages like C#, C++, Python) and I am not sure how to go about it.
What do I mean by linking? The system I am working on consists of small programs, each responsible for a particular processing task. I need to be able to transfer a data set from one application to another easily (the data set in question is not huge, probably a few megabytes), and I also need some way to track the current state of the operation (this is where a client-server model rings a bell).
It seems like sockets or maybe SOAP would be a universal solution but just wanted to get some opinions as to what people think about this subject.
Comments/suggestions will be appreciated, thanks!
I'm personally fond of ØMQ. It's a library with a familiar BSD-sockets-like interface for passing messages, and you'll find it implements interesting patterns for distributing tasks.
It sounds like you want to arrange several processes in a pipeline. ØMQ allows you to do that using push and pull sockets. (And afterwards, you'll find it's even possible to scale up across multiple processes and machines with little effort.) Take a look at the guide to get started, and the zmq_socket(3) manpage specifically for how push and pull work.
Bindings are available for all the languages you mention.
As for the contents of the message, ØMQ doesn't concern itself with that, they are just blocks of raw data. You can use any format that suits you, such as JSON, or perhaps Protocol Buffers.
What I'm not sure about is the ‘controlling state’ you mention. Are you interested in, for example, cancelling a job halfway through?
For C# to C# you can use Windows Communication Foundation. You may be able to use it with Python and C++ as well.
You may also want to check out named pipes.
I would think about moving to a model where you eliminate the issue by having centralized data that all of the applications look at. Keep "one source of the truth" so to speak.
Most outside software has trouble linking against C++ code because of the name-mangling algorithm it uses for its symbols. For that reason, when interfacing with programs written in other languages, it is often best to declare wrapper functions as extern "C" or place them inside an extern "C" { ... } block.
I need to be able to transfer a data set from one application to another easily (the data set in question is not huge, probably a few megabytes)
Use the file system.
and I also need some form of way to control the current state of the operation
Again, use the file system. A "current_state.json" file with a JSON serialized object is perfect for multiple languages to work with.
It seems like sockets or maybe SOAP would be a universal solution.
Perhaps. But it's overkill for this kind of thing. Your OS already has all the facilities you need. Just use the file system. It's very simple and very reliable.
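Since the rest of this page is MATLAB-centric, here is what that shared state file might look like when one of the participants happens to be MATLAB (jsonencode/jsondecode need R2016b or later; the field names are invented). Any of the languages mentioned above can read and write the same file.

```matlab
% Writer: publish the current state, writing to a temp name and renaming
% so readers are unlikely to see a half-written file.
state = struct('stage', 'preprocessing', 'progress', 0.42, 'cancel', false);
fid = fopen('current_state.tmp', 'w');
fprintf(fid, '%s', jsonencode(state));
fclose(fid);
movefile('current_state.tmp', 'current_state.json');

% Reader: any other process (in any language) just parses the same file.
state = jsondecode(fileread('current_state.json'));
fprintf('stage=%s progress=%.0f%%\n', state.stage, 100 * state.progress);
```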
There are many ways to do interprocess communication. As you said, sockets may be a universal solution; SOAP, I think, is somewhat overkill. You may also use mailslots; I wrote a C++ application using them a couple of years ago. Named pipes could also be a solution, but if you are coding on Windows, they may be difficult to work with.
In my opinion, sockets and mailslots are the best candidates.