Implementing a distributed discrete event simulator - distributed-computing

I have recently been exploring distributed simulation for a college term project. I have gone through a lot of reference material, including:
Parallel and Distributed Discrete Event Simulation: Algorithms and Applications, by Richard M. Fujimoto
Distributed Discrete-Event Simulation, by Jayadev Misra
Integrated Fluid and Packet Network Simulations, by George F. Riley, Talal M. Jaafar, Richard M. Fujimoto
Parallel Simulation of Telecommunication Networks: http://titania.ctie.monash.edu.au/pnetsim.html
Introduction to Discrete-Event Simulation and the SimPy Language, by Norm Matloff
The OMNeT++ Discrete Event Simulation System, by András Varga
I wish to begin building an actual distributed discrete event simulator, such as a digital logic simulator or a network simulator. I want to build such a system from scratch, and I would like some references for building this kind of system.
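Whichever reference you follow, the sequential core underneath any DES is small: a clock plus a time-ordered event queue. Here is a minimal sketch in Python; the class names and the toy inverter example are my own inventions, not taken from any of the references above:

```python
import heapq

class Simulator:
    """Minimal sequential discrete-event core: a time-ordered event queue.
    Distributed simulators (conservative or optimistic) build on this loop."""
    def __init__(self):
        self.now = 0.0
        self._queue = []   # heap of (time, seq, handler, payload)
        self._seq = 0      # tie-breaker for events at the same timestamp

    def schedule(self, delay, handler, payload=None):
        heapq.heappush(self._queue, (self.now + delay, self._seq, handler, payload))
        self._seq += 1

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, handler, payload = heapq.heappop(self._queue)
            handler(self, payload)

# Toy digital-logic example: an inverter fed back on itself with a 2 ns delay.
trace = []
def inverter(sim, value):
    out = 1 - value
    trace.append((sim.now, out))
    if sim.now < 10:                      # stop the oscillation eventually
        sim.schedule(2.0, inverter, out)  # feed the output back after the delay

sim = Simulator()
sim.schedule(0.0, inverter, 0)
sim.run()
print(trace)  # alternating outputs every 2 ns
```

Distributing this means partitioning the event queue across logical processes and adding a synchronization protocol (e.g. Chandy/Misra null messages, or Time Warp rollback) on top, which is where the references above come in.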

Related

Simscape, COMSOL compatibility in simulating electromagnetic range of inductive charging

I am currently trying to use Simscape to design and simulate the electrical circuit, and COMSOL Multiphysics to simulate the electromagnetic interaction between the coils. What I'm not certain of is whether we can successfully link the two software packages via MATLAB. They both have MATLAB support, though.
I am also researching the ANSYS suite. There may be some software we can use that would take the place of one or both of the previous packages.
Several years ago I had to make a connection between COMSOL and MATLAB as well. This is done with the COMSOL LiveLink for MATLAB.
However, I think (though I'm not sure) that the electrical circuit should be modeled in COMSOL. Using the LiveLink you can set parameters in COMSOL from MATLAB and extract the updated calculation. To my knowledge you cannot input an external model, do the calculations, and then extract the results.
You can look in the user's guide; maybe it says something about this.
Some other links which may be useful:
LiveLink™ for MATLAB®
Integrate COMSOL Multiphysics® with MATLAB® Scripting

Theoretically, can everyday computing tasks be broken down into ones solvable by a neural network?

MIT Technology Review recently published this article about a chip from IBM, which is more or less an artificial neural network: Why IBM’s New Brainlike Chip May Be “Historic” | MIT Technology Review
The article suggests that the chip might have borrowed a page from the future, and might mark the beginning of an era of new and evolved computing power. It also talks about programming for the chip:
One downside is that IBM’s chip requires an entirely new approach to programming. Although the company announced a suite of tools geared toward writing code for its forthcoming chip last year (see “IBM Scientists Show Blueprints for Brainlike Computing”), even the best programmers find learning to work with the chip bruising, says Modha: “It’s almost always a frustrating experience.” His team is working to create a library of ready-made blocks of code to make the process easier.
Which brings me to the question, can everyday computing tasks be broken down into ones solvable by a neural network (theoretically and/or practically)?
It depends on the task.
There are plenty of tasks for which von Neumann computers are good enough: calculating precise values of functions over some range, applying a filter to an image, storing text in a database and reading it back, storing the prices of some products, and so on. These are not areas where NNs are needed. Of course, it is theoretically possible to train an NN to choose where and how to save data, or even to do accounting. But for cases where a large array of data needs to be analyzed and current methods don't fit, an NN can be an option to consider. Speech recognition, or picking out pictures that will become masterpieces in the future, can be done with an NN.
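To make the distinction concrete, here is a minimal sketch of the kind of mapping that suits an NN rather than a hand-written rule: learning XOR, which no single linear function can represent. The network size, learning rate, and iteration count below are arbitrary choices for the illustration:

```python
import numpy as np

# A tiny two-layer network learning XOR by plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)   # hidden layer of 4 units
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)   # single output unit
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
losses = []
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)            # forward pass, hidden layer
    out = sigmoid(h @ W2 + b2)          # forward pass, output layer
    losses.append(float(np.mean((out - y) ** 2)))
    # backpropagation of the squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

print(losses[0], losses[-1])  # the loss should drop over training
```

The point is not that XOR matters in practice, but that the network learns the mapping from examples instead of from an explicit rule, which is exactly what makes NNs useful for recognition-style tasks and mostly pointless for exact bookkeeping.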

Ok to use Java to model quantum mechanical behavior with ANNs?

I am working on an independent project. I am studying chemistry in school, along with computer science, and would like to know if it is possible to model certain wave function phenomena (Schrödinger's equation, Hamiltonians, eigenvalues) using artificial neural networks.
My main questions are:
Would I be able to program and compute from my laptop? My laptop is an Asus Q200e.
If that's not possible from the laptop, would I be able to use my desktop, which has an i5 processor and a fast GPU?
Your questions
Yes, you may use your Asus Q200e to calculate your neural network.
Using a more powerful computer is always appreciated. If you are willing to go the extra mile and perform the calculations on your GPU, the process will be even faster.
Applying neural networks to quantum mechanics
There is actually some literature on how to proceed with creating such neural networks. See this link to get a few pointers:
Artificial neural network methods in quantum mechanics
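Before bringing an ANN into it, it helps to have an exact baseline to validate against. A finite-difference diagonalization of a simple 1D Hamiltonian (particle in a box, with hbar = m = L = 1) runs comfortably on a laptop; the grid size below is an arbitrary choice, and this is a baseline method, not an ANN:

```python
import numpy as np

# Finite-difference diagonalization of H = -(1/2) d^2/dx^2 on (0, 1)
# with zero boundary conditions (particle in a box, hbar = m = 1).
N = 500                 # interior grid points (arbitrary)
L = 1.0
dx = L / (N + 1)

# Three-point stencil: -(1/2)(psi[i+1] - 2*psi[i] + psi[i-1]) / dx^2
main = np.full(N, 1.0 / dx**2)          # diagonal entries
off = np.full(N - 1, -0.5 / dx**2)      # off-diagonal entries
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)               # sorted eigenvalues
exact = np.pi**2 / 2                    # analytic ground state: n^2 pi^2 / 2, n = 1
print(E[0], exact)
```

The numerical ground state should land very close to the analytic value pi^2/2, and an ANN-based eigenvalue estimate (in Java or anything else) can be checked against the same numbers.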

What are the available approaches to interconnecting simulation systems?

I am looking for a distributed simulation algorithm which allows me to couple multiple standalone systems. The systems I am targeting for interconnection use different formalisms, e.g. discrete-time and continuous simulation paradigms. So far, the only algorithms I have found come from the field of parallel discrete event simulation (PDES), such as the classical Chandy/Misra null-message protocol, which has some very undesirable problems. My question is: what other approaches to interconnecting simulation systems, besides PDES algorithms, are known and usable?
Not an algorithm, but there are two IEEE standards out there that define protocols intended to address your issue: High-Level Architecture (HLA) and Distributed Interactive Simulation (DIS). HLA has a much greater presence in the analytic discrete-event simulation community where I hang out; DIS tends to get more use in training applications. If you'd like to check out some applications papers, go to the Winter Simulation Conference / INFORMS-sponsored paper archive site and search for HLA; you'll get 448 hits.
Be forewarned, trying to make this stuff work in general requires some pretty weird plumbing, lots of kludges, and can be very fragile.
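For reference, the null-message protocol mentioned in the question fits in a few lines. The sketch below is a toy, not production code: two logical processes in a ring, an invented lookahead of 1.0, and a made-up workload that stops generating events after t = 3:

```python
import heapq

LOOKAHEAD = 1.0   # invented: minimum delay from consuming an event to any output

class LP:
    """One logical process with a single input channel (two-LP ring)."""
    def __init__(self, name):
        self.name = name
        self.clock = 0.0
        self.pending = []          # min-heap of local event timestamps
        self.channel_clock = 0.0   # lower bound on future arrivals from the peer
        self.peer = None
        self.log = []

    def receive(self, ts, is_null):
        # Any message, null or real, raises the bound on future arrivals.
        self.channel_clock = max(self.channel_clock, ts)
        if not is_null:
            heapq.heappush(self.pending, ts)

    def step(self):
        # Safe to process the next event only if no earlier message can arrive.
        if self.pending and self.pending[0] <= self.channel_clock:
            ts = heapq.heappop(self.pending)
            self.clock = ts
            self.log.append(ts)
            # React with a real event; stop generating new work after t = 3.
            self.peer.receive(ts + LOOKAHEAD, is_null=(ts >= 3))
        else:
            # Blocked: send a null message promising our earliest possible output.
            bound = min(self.pending[0] if self.pending else float("inf"),
                        self.channel_clock)
            self.peer.receive(bound + LOOKAHEAD, is_null=True)

a, b = LP("A"), LP("B")
a.peer, b.peer = b, a
heapq.heappush(a.pending, 0.5)   # one seed event at LP A

for _ in range(4):               # alternate the two LPs
    a.step(); b.step()
print(a.log, b.log)
```

Each LP only consumes an event once its channel clock guarantees nothing earlier can arrive; when blocked it emits a null message so the peer can still make progress. That null-message traffic is exactly the overhead (and small-lookahead pathology) the question alludes to, and part of what HLA's time-management services try to package up more robustly.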

Which open source physical simulation methods are worth porting to GPU

I am writing a report, and I would like to know, in your opinion, which open source physical simulation methods (like molecular dynamics, Brownian dynamics, etc.) that have not yet been ported would be worth porting to the GPU or to other special hardware that could potentially speed up the calculation.
Links to the projects would be really appreciated.
Thanks in advance
Any physical simulation technique, be it finite difference, finite element, or boundary element, could benefit from a port to the GPU. The same goes for Monte Carlo simulations of financial models. Anything that could use that smoking floating-point processing power, really.
I am currently working on a quantum chemistry application on the GPU. As far as I am aware, quantum chemistry is one of the most demanding areas in terms of total CPU time. There have been a number of papers on GPUs and quantum chemistry; you can research those.
As for methods, all of them are open source. Are you asking about a particular program? Then you can look at PyQuante or MPQC. For molecular dynamics, look at HOOMD. You can also Google QCD on GPU.
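To see why molecular dynamics in particular maps so well to GPUs, consider the O(N^2) pair loop at its core: every pair interaction is independent, so each GPU thread can take one pair (or one particle). A naive CPU sketch of a Lennard-Jones energy sum, in arbitrary units and with made-up names, shows the structure:

```python
import numpy as np

def lennard_jones_energy(pos, epsilon=1.0, sigma=1.0):
    """Total Lennard-Jones energy, U = 4*eps*((sigma/r)^12 - (sigma/r)^6),
    summed over all pairs. The doubly nested loop below is the part that
    parallelizes trivially on a GPU: every (i, j) term is independent."""
    n = len(pos)
    energy = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            r = np.linalg.norm(pos[i] - pos[j])
            sr6 = (sigma / r) ** 6
            energy += 4.0 * epsilon * (sr6 * sr6 - sr6)
    return energy

# Two particles at the potential minimum r = 2^(1/6) * sigma give U = -epsilon.
pos = np.array([[0.0, 0.0, 0.0], [2.0 ** (1 / 6), 0.0, 0.0]])
print(lennard_jones_energy(pos))
```

Projects like HOOMD replace exactly this loop with per-thread GPU kernels (plus neighbor lists to cut the O(N^2) cost), which is where the large speedups come from.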