Mathematica and GAP: Is there an interface?

Although the Abstract Algebra add-on is a beautiful package for Mathematica, IMO there is nothing that beats GAP, at least not for group theory. When I looked at Sage a few years ago, I found it was said to have an interface with Mathematica, but when I looked closer it turned out to be somewhat primitive (but operational). I know that work has been done at the Eindhoven University of Technology (OpenMath) on a platform for integrating/interfacing the major math tools; I even started to work on it, but that work got lost. Now to my question:
Is there, does anyone have, and are you willing to share, an interface between GAP and Mathematica?

The GAP FAQ seems to be rather pessimistic about this:
8.1: Can I call GAP functions from another programme?
The short answer is no. To explain a little more fully, essentially
all the algebraic functionality of the GAP system is written in the
GAP language, and so needs the GAP interpreter to run. The interpreter
is written in C, but does not coexist happily with other code in the
same process for a number of reasons, so there is no sensible way to
link GAP into a C, Java or other program as a subroutine library.
There is some hope, though:
What you can do is to run GAP in a child process and communicate with
it using pipes, pseudo-ttys, UNIX FIFOs or some similar device. We
have done this successfully in a number of projects, and you can
contact the support list for more detailed advice if you want to go
down this route.
Update
The FAQ now also reads:
Relatively recently, some of the SAGE developers have produced libGAP, which allows the entire GAP system to be embedded as a C library. One still can't embed individual functions by themselves though and the first call to libGAP still has to invoke the full GAP start-up sequence.
Alternatively, there are a number of ways of running GAP as a server process and calling it from C or C++ programs. See the SCSCP package for the GAP side. There are several C and C++ libraries that implement the client side.
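For illustration, here is a minimal sketch of that child-process approach in Python; the same pattern should carry over to Mathematica via StartProcess/RunProcess. It assumes a gap executable on the PATH, and the exact command-line flags vary between GAP versions, so treat it as a sketch rather than a recipe:

    # Run a one-shot GAP computation in a child process and read the
    # result back over a pipe, as the GAP FAQ suggests.
    import subprocess

    def run_gap(script: str) -> str:
        proc = subprocess.run(
            ["gap", "-q", "-b"],         # -q: quiet mode, -b: no banner
            input=script + "\nQUIT;\n",  # append QUIT; so GAP terminates
            capture_output=True, text=True, timeout=60,
        )
        return proc.stdout.strip()

    print(run_gap('Print(Size(SymmetricGroup(5)), "\\n");'))  # prints 120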

One option is to use Sage as an intermediary. Sage can interface with both Mathematica and GAP.
I believe that Sage interfaces with both programs by:
a) running their console interfaces in the background using pexpect
b) knowing how to translate most of the GAP and Mathematica syntax into its native syntax.
Note that I've done this a couple of times before and it works OK, but it means you don't get to use the Mathematica notebook interface.
A really useful tool would be to hook Mathematica up to expect (or pexpect) so that similar interfaces to console programs can be written for Mathematica. This is basically what the second quote in Sjoerd's answer is suggesting.
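For what it's worth, the Sage-as-middleman route looks roughly like this inside a Sage session. Sage ships pexpect-based interface objects named gap and mathematica (the latter needs a local Mathematica installation), and details vary between Sage versions, so this is only a sketch:

    from sage.all import gap, mathematica

    # Ask GAP a group-theory question through Sage's pexpect interface.
    G = gap('SymmetricGroup(5)')
    print(G.Size())                   # 120, computed by the GAP child process

    # Hand a computation to Mathematica through the same mechanism.
    n = mathematica('Factorial[10]')
    print(n)                          # 3628800, computed by Mathematica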

Related

MATLAB: which language interface to use when calling external library functions offered with various interfaces?

Still very new to MATLAB and to programming in general, I am stuck trying to understand how best to profit from what is documented in the chapter Calling External Functions of the official MATLAB documentation. I cannot yet judge which of the many offered pathways might be the most effective one to go for, in terms of how easily a beginner can learn it and which one can, in the long term, be applied in MATLAB code most easily and clearly, over and over again.
To put it as a very practical question: since, for instance, the third-party image-processing libraries ITK and OpenCV provide Java, C++, C and Python interfaces, and MATLAB has functionality to address such interfaces, which interface should a beginner in programming choose? Is one of them laid out in a clearer design, and thus easier to get warm with and quicker to learn to apply?
I am afraid I will now hear from everybody something like "well, it depends on what you want to do", and my answer could only be "I don't know yet; I am learning programming and prefer to first gain some general success by going for the more cleanly designed and easier-to-understand approach, and thus would like a recommendation on where to start".
Please let me add this to my question and concerns: the highly respected Yair Altman states on his website undocumentedmatlab.com, in the advertisement for his book Undocumented Secrets of MATLAB-Java Programming, that the MATLAB programming environment relies on Java for numerous tasks, including networking, data-processing algorithms and graphical user interfaces (GUIs). I conclude from this statement that learning to connect MATLAB to Java in particular will have significant advantages; The MathWorks itself seems to have decided to take advantage of such a connection when implementing MATLAB.
But I can also see that The MathWorks, by providing the MEX functionality for MATLAB, seems to lean towards tight C/C++ integration, offering MEX alongside the other ways to call external C functions.
For me as a beginner it is confusing to work out which route of connectivity to external languages should be taken as the "standard" or "first to be recommended" one. Would any of you experienced programmers have some arguments for me on which route to focus on first? It is a long journey to learn programming, and I would not like to waste time on poorly chosen pathways.
This question sounds like: "I am still learning how to drive and am not a very experienced driver yet. Please give me your tips on how to change a flat tire. What is the best tire to get a flat on, passenger side rear? What are the best places to get a flat tire: the mall, my office parking lot, or the middle of the street?"
Let me give you some tips:
Changing a tire will not make you a more knowledgeable driver. You will learn very few things from doing it; it is a frustrating experience, and it is not worth your time right now. Learn how to drive.
Explanation: Making MATLAB call Java/C++/C or whatever other language will not make you a better MATLAB programmer, and frankly it is of secondary importance. As long as the first sentence of your question is "I'm still new to MATLAB and programming in general", you're wasting your time. Like changing a flat tire, connecting MATLAB to other languages is not something cool or interesting; in fact it is the opposite: it is frustrating, error-prone and boring.
The day will come when you will have a flat tire. That day the location where you get it and which tire it is will become secondary. You will need to learn how to change it and you will. Trust me you will.
Explanation: You don't get to decide in what language the code that solves the exact problem you have right now is written. The same way you don't get to decide where you get the flat tire. The day will come when you already know C++ and need MATLAB to call into some C++ code (either your code or someone else's). That day you will need to learn how to write a mex file in C++ and compile it for your platform and invoke your code. Or the day will come when you need to invoke Java, and then you will learn how to call into Java.
Obsessing over this when you don't know what you need to do and you're clearly not technically equipped to do it is just a waste of time.
To start with, you can look at interfacing MATLAB and Java (it is the easiest way to learn).
Afterwards, go for interfacing MATLAB and C++:
1. Create classes and a gateway in C++ for MATLAB and build a MEX file
2. Create a MATLAB function and wrap it in C++ (the shared C/C++ library approach)
Afterwards, go for creating Excel add-ins and invoking those add-ins in Excel sheets.
Meanwhile, you can look at DLL referencing in C#.NET / VB.NET applications.

Custom Programming Language ~ How to interact with the Operating System

I am trying to create my own programming language, but I am already thinking ahead a little.
Of course, when I can first compile a little program, I won't have a standard library at that point,
and I'd have to create one myself. Now suppose, for example, I'd like to add some functionality to print a string to the screen; I am pretty sure I'd have to make a few system calls to the operating system to get it displayed.
So to the point: What would be the best way to interact with the Operating System?
Possibilities I came up with myself:
- Generate object files and link those to the (for example) C standard library
- Write the files with embedded assembly language containing system calls
I have a feeling there are better possibilities!
I hope you can help me,
Christian
EDIT: It's a compiled language I am creating!
You have basically two options, like you say yourself. You can link your standard library with the standard C library, so that the I/O functions in your standard library can use C functions. Alternatively you can make system calls to the operating system directly.
The second approach seems like it's going to be more work: The system calls will be different on each operating system, so you'll have to put a lot of work into porting your system. The system calls may not be well documented, causing many frustrations.
You could start by linking your standard library to the C standard library and worry about other aspects of your language for the moment. Later you can look into replacing the C functions you use with syscalls.
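To make the two options concrete, here is a toy Python sketch of a code generator for a "print a string" statement that emits x86-64 Linux assembly both ways. Everything in it is illustrative, not taken from any real compiler:

    # Option 1: lean on the C standard library by emitting a call to puts
    # and letting gcc link libc in.  Build with:  gcc out.s -o prog
    def gen_print_via_libc(text: str) -> str:
        return "\n".join([
            ".text",
            ".globl main",
            "main:",
            "    lea  msg(%rip), %rdi      # argument: pointer to the string",
            "    call puts                 # the C library does the I/O",
            "    xor  %eax, %eax           # return 0 from main",
            "    ret",
            ".section .rodata",
            f'msg: .asciz "{text}"',
            "",
        ])

    # Option 2: emit raw Linux write/exit system calls, no libc at all.
    # Build with:  gcc -nostdlib out.s -o prog
    def gen_print_via_syscall(text: str) -> str:
        return "\n".join([
            ".text",
            ".globl _start",
            "_start:",
            "    mov  $1, %rax             # syscall 1 = write",
            "    mov  $1, %rdi             # fd 1 = stdout",
            "    lea  msg(%rip), %rsi      # buffer address",
            f"    mov  ${len(text) + 1}, %rdx  # byte count (text + newline)",
            "    syscall",
            "    mov  $60, %rax            # syscall 60 = exit",
            "    xor  %rdi, %rdi           # exit status 0",
            "    syscall",
            ".section .rodata",
            f'msg: .ascii "{text}\\n"',
            "",
        ])

    with open("out.s", "w") as f:
        f.write(gen_print_via_syscall("Hello from my toy language"))

The libc variant is shorter and portable across operating systems that have a C library; the syscall variant is self-contained but has to be redone for every OS and architecture, which is exactly the trade-off described above.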
Stdin and stdout are pretty much the minimum; if you are using a Unix environment you can then gain access to the keyboard and command line, and also pipe text in and out of files.
If you are writing a new language, then stderr may be worth considering too!

Why people don't use LabVIEW for purposes other than data acquisition and virtualization? [closed]

This is marked as a subjective question; I hope I won't get too many downvotes, though.
LV seems to offer a nice graphical alternative to traditional text-based programming. As I understand it, it's not just a virtualization/data-acquisition programming language. Nonetheless, it seems to have that paradigm pegged to its creator's name.
My question comes up because it doesn't seem to be widely used for multi-purpose applications. I'm not an LV expert of any kind; I'm more of a learner, and I'm still getting used to LV.
Labview is fantastic if you have National Instruments hardware, and want to do something like acquire, plot and log the data.
When you start interfacing to custom devices, the wiring between modules gets complicated, since you have to do all the string-manipulation work for input and output to a device.
At my place of work, we found that we got annoyed with having to make massive, complicated VIs to interface to devices, and started writing them in .NET and interfacing them to LabVIEW.
In the end we ended up scrapping Labview all together and using the NI Measurement Studio for Visual Studio to give us all the lovely looking NI controls (waveform plot, tank, gauges, switches etc) with the flexibility of C#.
In summary, even with a couple of 24" screens, sometimes the wiring of LabVIEW code can get too complex, and it becomes impossible to comment, debug, and make extensible for any future changes. I suggest taking a look at Measurement Studio for Visual Studio and using your favourite .NET language with the pretty NI controls.
My two experiences with "graphic alternative[s] to traditional text based programming" have been dreadful. I find such languages to be slow to use, hard to edit, and inexpressive. Debugging them is a nightmare. And they offer no real advantages.
To be sure, it has been quite a long time since I looked at one, but the opinions of others I've asked have been only lukewarm, so I have never taken the time to look again. Reasons to look again are welcome and will be taken on board...
LabVIEW can be used to author large, complex software projects. LabVIEW is unquestionably much more fun to use than a syntax-based language. I have programmed mathematically dense, dynamic simulations using LabVIEW. Newer versions of LabVIEW include a lot of exciting features, especially for utilizing multiple processors. I like LabVIEW very much. But I don't recommend it to anyone.
Unfortunately, it's an absolute nightmare for anything other than simple acquisition and display. It may one day be sufficiently developed to be considered a viable alternative to text-based languages. However, the developers at NI have consistently opted to ignore the three fundamental problems that plague LabVIEW.
1) It is unstable and riddled with bugs. There are thousands of bugs posted to the LabVIEW support forums that are yet to be fixed. Some of these are quite serious, such as memory leaks or mathematical errors in basic functions.
2) The documentation is atrocious. More often than not, when you look for help with a LabVIEW function in the local help file, you'll find a sentence that merely restates the name of the item you are trying to find some detail on. E.g. a user looks up the help file on the texture filter mode setting and the only thing written there is "Texture Filter Mode: selects the mode used for texture filtering." Gee, thanks. That clears things right up, doesn't it? The problem goes much deeper: quite often, when you ask a technical representative from National Instruments to provide critical details about LabVIEW functionality or the specific behavior of mathematical functions, they simply don't know how the functions in their own library work. This may sound like an exaggeration, but trust me, it's not.
3) While it's not impossible to keep graphical code clean and well documented, LabVIEW is designed to make these tasks both difficult and inefficient. In order to keep your code from becoming a tangled, confusing mess, you must routinely (every few operations) employ structures like clusters, sub-VIs and giant type-defined controls (which can stretch over multiple screens in a large project). These structures eat memory and destroy performance by forcing LabVIEW to make multiple copies of data in memory and perform gratuitous operations, all for the sake of keeping the graphical diagram from looking like rainbow-colored spaghetti with no comments or text anywhere in sight. Programming in LabVIEW is like playing Pictionary with the devil. Imagine your giant software project written as a wall-sized flowchart with no words on it at all. Now imagine that all the lines cross each other a thousand times, so that tracing the data flow is completely impossible. You have just envisioned the most natural and most efficient way to program in LabVIEW.
LabVIEW is cool. LabVIEW is getting better with each new release. If National Instruments keeps improving it, it will one day be great as a general programming language. Right now, it's an extremely bad choice as a software development platform for large or logically complex projects.
I have been writing in LabVIEW for almost 20 years now. I develop automated test systems. I have developed RF, vision, high-speed digital and many different flavors of mixed-signal test systems. I was a "C" programmer before I switched to LabVIEW.
It's true that you can build some programs quickly in LabVIEW, but just like in any other language it takes a lot of training to learn to build a large application that is clean, easy to maintain, and has reusable code. In 20 years I have never had a LabVIEW bug stop me from finishing a project.
Back in the day, NIWeek would have a software shootout every year. LabVIEW and LabWindows (NI's version of "C") programmers would both be given the same problem and race to see which group finished first. Each and every year, all the LabVIEW programmers were done way before the first LabWindows person finished. I have challenged many of my dedicated text-based-programming friends to shootouts, and they all admit they don't stand a chance, even if I let them define the software problem.
So, I feel LabVIEW is a great programming tool. It's definitely the way to go if you're interfacing with any type of NI hardware. It's not the answer for everything, but I'm sure there are many people not using it just because they don't consider LabVIEW a "real programming language". After all, we just wire a bunch of blocks together, right? I do find it funny how many text-based programmers turn their noses up at it, as they are so proud of the mess of text code they have created that only they can understand. A good programmer in any language should write code that others can easily read. Writing overly complex code that is impossible to follow does not make the programmer a genius; it makes the programmer a "compliator" (someone who can take a simple problem and complicate it). I believe in the KISS principle (Keep It Simple, Stupid).
Anyway, there's my two cents' worth!
I thought LabVIEW was a dream for FPGA programming. Independent executable blocks just... work. In general, I use LabVIEW for various tasks interfacing with my DAQ and FPGA hardware, but that's about it. It seems (again to me) that this is LabVIEW's strong point and the reason it was built, but outside that arena it feels "cumbersome." As far as getting things done, it's like any other language with a learning curve - once you figure it out it's not too bad for getting work done. I've seen several people give up before that thinking the learning curve was permanent or something.
Picking up a 30" monitor made a huge difference.
I know one thing that people dislike is the version control integration.
Edit: LabVIEW/hardware is hella expensive for "just for fun" use. I dropped $10K on their hardware (student prices) and got the software for free from school for making toys around the house.
Our company has been using LabVIEW for the last 10 years for measuring, monitoring and reporting on our subject (trains).
Recently we have started using LabVIEW as a GUI for databases with lots of data; the power of LabVIEW together with its recent new features (classes, XControls) allows us to create these kinds of GUIs for a fraction of the development cost on other platforms, and without needing external programmers at consultancy rates.
Ton
I first started using LabVIEW in a college physics lab. Initially, I thought it was slow and cumbersome compared to other, text-based languages. It was too difficult to create complex logic, and code became sloppy real fast (wires everywhere).
Then, a few years later, I learned about using sub-VIs and bundles. What a difference! At that point, I was using LabVIEW for very high-level functions. I was taking raw input from a camera, using all kinds of image filters and processing to ultimately parse out the lines in a road so that a vehicle could drive itself down this road with no driver; it was for the DARPA Urban Challenge. I was also generating maps from text waypoint data, writing high-level parsing functions, and a slew of other applications that had nothing to do with processing data from input devices. It was really a lot of fun. And FAST.
After leaving college, I am now back to using text-based languages. I've been using PHP, JavaScript, VBA, C#, VBScript, VB.NET, MATLAB, Epson RC+, CodeIgniter, various APIs, and I'm sure some others. I often get very frustrated by the amount of syntax I have to memorize in order to program with any significant speed. I find it annoying to have to switch schools of thought based on the language I am using... when all programming languages essentially do the same thing! I need a second monitor just to keep the help up at all times so I can find the syntax for the same functions in different languages. I miss LabVIEW very much; it's too bad it's so expensive, otherwise I would use it for everything.
Graphical programming, I think, has huge potential. By not being constrained by syntax, you can focus on logic instead of code. LabVIEW itself may still be in its infancy in terms of support and debugging, but I believe it conceptually beats out the competition. It's simply a more intuitive way to program.
We use LabVIEW for running our end-of-line test equipment, and it is ideal for data acquisition and control. Typically measuring 15 to 80 differential voltages and controlling environmental chambers, mass flow controllers and various serial devices, LabVIEW is more than capable.
Interfacing with custom devices can be simplified greatly by using the NI instrument driver wizard to create reusable VIs, interfacing with custom DLLs if needed. On a number of projects we have created such drivers for custom hardware, and once created they are reusable in future projects with no modification.
Using event-driven structures, user interfaces are responsive, and we regularly use LabVIEW applications to interface with a database.
Whatever programming environment you choose, it's the process of designing the application that matters most. I agree that you can create some really horrible and unreadable block diagrams in LabVIEW, but then you can also create unreadable code in Visual Studio. With just a little thought and planning, a LabVIEW block diagram can be made to fit on a single 24" monitor with plenty of space to add comments.
I would use LabVIEW over Visual Studio for most projects.
But people do use LabVIEW for purposes other than data acquisition and virtualization. Of course, LabVIEW is mainly used in labs and production environments, because those are (or were) among NI's main customer targets.
However, you can do a lot of different things with LabVIEW, like programming a robot that performs a lot of image analysis and then tweets the results. Have a look at the videos from NI Week 2009 on YouTube and you'll see how powerful this tool is. For instance, it is possible to write code and deploy it to ARM MCUs (see this Dev Monkey article from 2009.08.10).
And finally, check out the LabVIEW DIY group.
I have been using LabVIEW for about two years for developing automation. Given due care and proper design, we certainly can develop maintainable and really good-looking applications in LabVIEW. I think this is the same for all the other languages out there. I have seen equally bad code in LabVIEW, primarily from people who use it only to develop quick-and-dirty working automation. IMHO, graphical programming is a lot easier to write and understand if done right. That said, text-based programming 'feels' more powerful!
LabVIEW is primarily marketed for industrial automation, has inherent support for a lot of NI hardware, and you can get third-party hardware working with it pretty quickly. I think that is the reason you see it only in the automation field. Moreover, it is pretty costly, and you are locked in with NI: you cannot even open your code if you do not buy the software from them!
I've been thinking about this question for decades (yes, since 1989...)
Like all programming languages, LabVIEW is a high-level tool used to manipulate the flow of electrons. Unless you are a purist and refuse to use anything other than a breadboard and wires; transistors, integrated circuits and programming languages are probably a good thing if you wish to build something of any consequence.
But like all high-level tools, just wielding one does not make you a professional craftsman. Back in the day of soldering irons, op-amps and UARTs it required a large amount of careful study before you could create a system that actually functioned. The modern realm of text-based languages is so overly dominated by syntax that the programmer must get it just right before it will compile and run. In order to write code that works, the programmer must increase their skill level to create systems much larger than "Hello World".
LabVIEW is not dominated by syntax, but by Data Flow. Back in the day, reaching for your flow charting template and developing the diagram of a well-balanced information system was the art and beauty part of the job. Only after you had the reviewed flowchart in hand would you even consider slogging through the drudgery of punching out the code. (yes... punch cards)
LabVIEW is a development system that allows the programmer to use flow charting tools to diagram the complete information system and press "run"..... LabVIEW "punches out the code" and compiles it for you. No need to fight through the syntax of text language A or language B.
With such a powerful tool, novices can build large, working programs rapidly -- implying some level of professional craftsmanship since it runs at all. However, if the system does not perform elegantly, or the source code diagram is a mess, it is not the fault of LabVIEW.
People often point to "LabVIEW is only good for developing large data acquisition systems." Perhaps those people should consider the professionalism of the scientists and engineers that are working in data acquisition. If they know enough to get the actual wires right for the sensors and transducers, it may be a good bet that they are expert at developing LabVIEW wiring diagrams as well.
I do use LabVIEW at home, as it is part of Lego Mindstorms, which my son loves. And I really like composing systems this way.
However, in my work (embedded systems), it is generally too restrictive. But here, too, I'm trying to move up in abstraction:
- control and state behavior: model-based design (e.g. Rhapsody)
- data algorithms etc.: Simulink
Sometimes a graphical model can require more clicks than a piece of code. But this also covers the work a good programmer needs to do in design and documentation, not just the typing of code. The graphical notation takes many hassles away and is generally much faster if the tool is powerful enough for the complexity at hand. So I expect these kinds of tools to gain more popularity in the coming years as they mature and people get familiar with them.
I have used LabVIEW for some 10 years. It's brilliant for scientific programming, i.e. like MATLAB or Simulink, but 10 times better. If you are having problems, then you are doing something wrong. It takes time to learn, like any language. As for using .NET instead: are these people even on the same planet? Why would you go to the trouble of writing everything from scratch when you can, say, pull up an FFT and use already-written code? .NET is fine for simple programs, but not so good for scientific processing; yes, you can do it, but not without oodles of add-ons for graphics etc. Programming in G is far easier than text-based programming for scientific problems. You can of course program in C if you are interfacing, and use the DLL. Now, there are things I would not use LabVIEW for; speech recognition, for example, may be a bit messy at present. More to the point, though: why do people like programming in outdated text form when there is an easy alternative? It is as if people want to make things complicated so as to justify their job in some way. Simplify, simplify!
Somebody said that LabVIEW is only used in the automation field. That is simply not right. It has applications in digital signal processing, control systems, communications, web-based development, mathematics, image processing and so on. It started as a data acquisition method, and NI invented the name Virtual Instrumentation, but it has gone far beyond that now. It is a scientific programming language with a second-to-none graphical interface. It is way beyond Simulink, and if you like MATLAB it has a kind of MATLAB-style scripting built in for those who like such ways of programming. It is evolving all the time. The one thing I found difficult was writing code for the CompactRIO: tricky, but far easier than the alternative. It's expensive, but you get a quality product. I personally have not found any bugs in ordinary programming. It is an engineer's language, but anybody could use it to program.

Any good library or software for queue networks simulation?

I have been trying, with no luck, to get EZSIM to work; it is software for building discrete-event simulators in a graphical DOS environment. In this software, my simulator and those of many others (the other people in the course I'm taking) don't work, but the teacher's simulator (and the examples in the downloaded files) does.
So I began to distrust the software.
Do you know any software that solves the same kind of problems but really works? It would be good if it were free, or if I could download an evaluation copy or something like that.
If you don't know any software, do you know of any library that might work? Preferably in C#, ANSI C, Java or Delphi.
This may be more than what you're looking for, but check out NS2. It's the standard for open source network simulations, and will allow you to simulate all kinds of network layer behavior.
I've also used JUNG in the past. It's very flexible, although it also doesn't offer much out of the box.
I used Möbius in my computer systems analysis class. It is free for educational use (which sounds like what you're doing). It's a Java GUI which generates C++ code.
The R package queuecomputer implements a computationally efficient method for simulating queues with arbitrary arrival and service times. There is a submitted paper on arXiv describing the algorithm used in the package. Examples can be found in the arXiv paper and the vignette. A web app based on the package is available at https://ace-ebert.shinyapps.io/queue_simulator_mmk/ .
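If no ready-made package fits, a single-server FIFO queue with arbitrary arrival and service times is also easy to hand-roll using the classic departure recursion depart[i] = max(arrive[i], depart[i-1]) + service[i]. This Python sketch illustrates the general idea only; it is not the queuecomputer algorithm itself (see the arXiv paper for that):

    import random

    def simulate_gg1(arrivals, services):
        # Departure times for a single-server FIFO (G/G/1) queue.
        departures, last_departure = [], 0.0
        for arrive, service in zip(arrivals, services):
            start = max(arrive, last_departure)   # wait if the server is busy
            last_departure = start + service
            departures.append(last_departure)
        return departures

    random.seed(1)
    # Exponential interarrival and service times make this an M/M/1 example.
    arrivals, t = [], 0.0
    for _ in range(5):
        t += random.expovariate(1.0)              # arrival rate 1
        arrivals.append(t)
    services = [random.expovariate(1.25) for _ in range(5)]
    for a, d in zip(arrivals, simulate_gg1(arrivals, services)):
        print(f"arrived {a:6.2f}  departed {d:6.2f}  in system {d - a:5.2f}")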

Modula-2 Developer?

I guess no new projects are implemented in languages like Modula, Ada or Oberon anymore (right?). But there are still legacy systems floating around, popping up here and there, looking for their creators. They can't find them, because the creators might be retired, sitting on a beach somewhere enjoying themselves.
Seriously:
1) I am wondering if there are still active (experienced) Modula programmers around?
2) Does anyone have experience with porting Modula code to a new hardware generation?
3) Does anyone know of a tool that can reverse-engineer, i.e. map procedures and .mod files in a graphical way? Such tools are available for, e.g., C programs.
Sure, Modula syntax is not that breathtaking compared to today's .NET and Java APIs with their thousands of methods, but if someone drops about 100,000 lines of almost undocumented source code on you (nicely mixed with some 8,000 lines of assembler), you had better know whether to reject it. I have such a request, and I am very resistant. (The options: port and keep the Modula source, or migrate to another language, in 9 months!)
cheers
1) I am wondering if there are still active (experienced) Modula programmers around?
There are plenty of them, but you have to do a bit of web searching to find them. If you search for "Curriculum Vitae" (or "Resume") and "Modula-2" there should be plenty of hits. Also, anybody who has experience in Oberon, Pascal or Delphi will be able to handle Modula-2.
Also there are active Modula-2 projects, most notably:
GNU Modula-2 at http://www.nongnu.org/gm2
Modula-2 R10 at http://modula-2.net/m2r10
Modula2JCC at http://code.google.com/p/modula2jcc
Modulipse at http://modulipse.sourceforge.net
Schwarzer Kaffee http://sourceforge.net/projects/schwarzerkaffee
2) Does anyone have experience with porting Modula code to a new hardware generation?
Ask on the GNU Modula-2 mailing list. Many GNU Modula-2 users have Modula-2 code from 16-bit DOS systems that they would like to port to modern platforms. The GNU Modula-2 website lists this as one important motivation for GM2. The GM2 mailing list is at:
http://lists.nongnu.org/mailman/listinfo/gm2
There is also the Modula-2 Usenet news group, you can reach it via the Google interface at
http://groups.google.com/group/comp.lang.modula2
Last but not least, there is a Modula-2 IRC channel at Freenode
irc://irc.freenode.net/#modula-2
3) How to assess 100,000 lines of Modula-2 source code
"if someone drops about 100,000 lines of almost undocumented source code on you (nicely mixed with some 8,000 lines of assembler), you had better know whether to reject it. I have such a request, and I am very resistant. (The options: port and keep the Modula source, or migrate to another language, in 9 months!)"
You may want to contact Rick Sutcliffe, a well-known Modula-2 scholar and book author who is also the maintainer of the Modula-2 FAQ, in which he states that he does get hired to do consulting work assessing Modula-2 source code in company takeover situations. It seems to me that your situation might be similar enough to justify hiring an expert to establish the value of the software that is offered to you.
The Modula-2 FAQ is at http://faq.modula2.net
1) I am wondering if there are still active (experienced) Modula programmers around?
Yes, I'm one. But I already have a job :-)
2) Does anyone have experience with porting Modula code to a new hardware generation?
Not clear if you meant porting code or porting a compiler. Porting Wirth's Modula-2 compiler (or Oberon compiler) should be easy. Ada and Modula-3 are another story.
3) Does anyone know of a tool that can reverse-engineer, i.e. map procedures and .mod files in a graphical way? Such tools are available for, e.g., C programs.
I don't understand the question. If you are looking to visualize the import graph of a Modula-2 program, you could easily write something to emit dot; a rough sketch follows below. Visualizing call graphs is another story.
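As a rough sketch of that dot-emitting idea, a crude regex scan in Python (not a real Modula-2 parser: it assumes one import statement per line and ignores comments) is enough for a first overview of the module import graph:

    import re
    import sys
    from pathlib import Path

    MODULE_RE = re.compile(r'\bMODULE\s+(\w+)')
    FROM_RE   = re.compile(r'\bFROM\s+(\w+)\s+IMPORT\b')
    IMPORT_RE = re.compile(r'\bIMPORT\s+([\w, ]+);')

    def import_graph(directory):
        edges = set()
        for path in sorted(Path(directory).glob("*.mod")):
            text = path.read_text(errors="replace")
            m = MODULE_RE.search(text)
            mod = m.group(1) if m else path.stem
            for line in text.splitlines():
                m = FROM_RE.search(line)
                if m:                              # FROM Foo IMPORT ...;
                    edges.add((mod, m.group(1)))
                    continue
                m = IMPORT_RE.search(line)
                if m:                              # IMPORT Foo, Bar;
                    for dep in m.group(1).split(","):
                        edges.add((mod, dep.strip()))
        return edges

    print("digraph imports {")
    for src, dst in sorted(import_graph(sys.argv[1])):
        print(f'  "{src}" -> "{dst}";')
    print("}")

Feeding the output to Graphviz (dot -Tsvg imports.dot -o imports.svg) then gives the graphical map the question asks about.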
Here's my bottom line on Modula-2 and Oberon:
Any C programmer worth his or her salt can quickly learn enough Modula-2 to maintain a large legacy application. Oberon's another story; its model of exported names and type extension is not like the object models found in other OO languages.
Wirth's genius as a language designer was to make things easy for the person writing the compiler. So if you need tools, any good compiler writer can produce them. Wirth's compiler should be available and easy to port.
Ada does not deserve to be mentioned in the same breath with Modula-2 and Oberon.
Ada is still a very active language. I have used it in my own research since 1995 and, since last year, in my lectures at a university.
I myself don't know much Modula; however, I worked at a research center in Brazil that had a packet-switching network project (Compac) that was created entirely in Modula-2. If I'm not mistaken, they even developed the compiler/linker themselves. Since I don't feel at liberty to point you to specific persons, I would suggest you do a Google search for "compac" and "cpqd", and I can pretty much guarantee you will find the names of people involved in it. It should come as no surprise that references to it are quite old, from the late '80s.
Modula-2 is architecturally not that dissimilar to C. A programmer familiar with C should have little trouble figuring out Modula-2. Given that your application has a significant body of assembler code, you will need someone with low-level skills anyway.
IIRC Modula-2's grammar is LL(1) or nearly so, so writing a parser to generate call graphs for a Modula-2 code base is not beyond the wit of man. Graphviz is your friend if you want a quick and easy way of visualising the call graphs. Again, this suggests that you're up for employing a 'real programmer' to do the porting work.
If you need a reasonably viable Modula-2 compiler, you could look at the Amsterdam Compiler Kit which does have a competent Modula-2 compiler that can target a wide variety of platforms, although it doesn't support Win32 IIRC.
I would suggest that documenting and porting the existing Modula-2 code base is probably easier than attempting to re-write it in C. However, if you need to move to a different processor architecture then you will have to re-write the assembly language bits anyway. This does rather change the value proposition of porting.
If it is you doing the porting, then you might consider doing it in two steps.
1. Arrange with the customer to write a utility that generates the call graph, and give them a feasibility study recommending what to do, with some estimate of the scope.
2. Do the port, either porting the code base or re-writing it. Bear in mind that you may not need a low-level language for the entire code base if you're running it on a modern computer. You may be able to do it in a mixture of (say) Python and C with less effort than a rewrite purely in C would need; a sketch of that mix follows below.
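As a concrete illustration of that mixed approach (all file and function names here are hypothetical): port only the hot or hardware-near routines, say an old assembler checksum, to C, compile them into a shared library, and keep the surrounding logic in Python:

    # Build the ported C routine first, e.g.:
    #   gcc -shared -fPIC -o libport.so checksum.c
    import ctypes

    lib = ctypes.CDLL("./libport.so")             # hypothetical ported library
    lib.checksum.argtypes = [ctypes.c_char_p, ctypes.c_size_t]
    lib.checksum.restype = ctypes.c_uint32

    def checksum(data: bytes) -> int:
        # Thin Python wrapper around the ported C routine.
        return lib.checksum(data, len(data))

    print(hex(checksum(b"legacy record")))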
Yup.
I realise that you asked this question quite some time ago, but I also know that projects that nobody likes to handle tend to get delayed...
I built several large systems in Modula-2 over a span of ten years and have this insane habit of taking on impossible tasks.
I have not touched it for about ten years but am absolutely certain that I can port your system for you to almost any other platform. Why not get in touch with me if you are still interested?
Oh yeah - better still, we are both in Singapore :-)
ADW Modula-2 has now been released as freeware: http://www.modula2.org/adwm2/ . Since it's free and supports 32- and 64-bit Windows applications (and I know Modula-2), I've picked it up and am using it for small utility work that I want to be a 64-bit Windows binary (most of my work is in Java and .NET, which are fine, but sometimes a pure binary is best; I use MASM32 for 32-bit binary Windows apps already).
Edit
There's also a project in the works (still very early on, not yet usable as of the date of this edit) to compile Modula-2 on the JVM (with a transpile-to-Java option): https://github.com/m2sf/m2j