Simulation speed of SystemC vs SystemVerilog

Does anyone have a pointer to measurements of how fast a behavioral model written in SystemC is compared to the same model written in SystemVerilog? Or at least first-hand experience of the relative simulation speed of the two when modeling the same thing?
To clarify, this is for high-level system simulation, where the model is the minimum code needed to mimic the behavior. It's decidedly not synthesizable. In SystemVerilog, it would use as many non-synthesizable behavioral constructs as possible, for maximum execution speed. The point is, we're asking about an apples-to-apples comparison that someone actually did, where they tried to do both: stay high level (non-synthesizable), and make the code as close to equivalent as possible between the two languages.
It would be great if they stated the SystemVerilog environment used and what machine each was run on, so the numbers can be translated to other setups. Hopefully this can be useful to a variety of people in choosing an approach.
Thanks!

As with other language benchmarks, this is impossible to answer in general, because it depends on multiple factors: modeling style, simulator/compiler vendor and version, and compilation options.
Also, there are tools to convert Verilog to C++ and C++ to Verilog, so you can often approximate the simulation performance of one language by converting to the other.
Verilog can be converted to C++/SystemC using Verilator: https://www.veripool.org/wiki/verilator
To convert C++/SystemC into Verilog, you can use HLS (high-level synthesis) tools.
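To give a flavor of what the SystemC side of such an apples-to-apples comparison looks like, here is a minimal sketch of a high-level, non-synthesizable behavioral model (a hypothetical free-running counter; only the standard sc_module/SC_THREAD API is assumed):

#include <systemc.h>

SC_MODULE(Counter) {
    sc_signal<int> value;

    SC_CTOR(Counter) {
        SC_THREAD(run);
    }

    void run() {
        int v = 0;
        while (true) {
            wait(10, SC_NS);   // advance simulated time; no clock is modeled
            value.write(++v);  // purely behavioral update
        }
    }
};

int sc_main(int argc, char* argv[]) {
    Counter c("counter");
    sc_start(1, SC_MS);        // simulate 1 ms of model time
    return 0;
}

The SystemVerilog equivalent would be a module whose always block uses #-delays instead of a clocked process; the benchmark question is then how fast the two event kernels execute the same workload.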

Related

Which language should I prefer working with if I want to use the Fast Artificial Neural Network Library (FANN)?

I am working on reducing the dimensionality of a set of (Boolean) vectors, with both the number and the dimensionality of the vectors tending to be on the order of 10^5-10^6, using autoencoders. Hence, even though speed is not of the essence (this is meant to be a pre-computation for a clustering algorithm), one would obviously expect the computations to take a reasonable amount of time. Seeing as the library itself is written in C++, would it be a good idea to stick with it, or to code in Java (since the rest of the code is written in Java)? Or would it not matter at all?
That question is difficult to answer. It depends on:
How computationally demanding will your code be? If the hard part is done by the library and your code only generates the input and post-processes the output, Java would be a valid choice. Compare it to Matlab: the language is very slow, but the built-in algorithms are super-fast.
How skilled are you (or your team, or your future students) in Java and C++? Consider that learning C++ takes a lot of time. If you have only a small-scale project, it could be easier to buy a bigger machine, or to wait two days instead of one for the results.
Do you have legacy code in one of the languages that you want to couple to, or maybe re-use?
Overall, I would advise you to set up a benchmark example in whichever language you like more, as sketched below. Then give it a try. If the speed is OK, stick with it. If you wait too long, think about alternatives (new hardware, parallel execution, a different language).
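A concrete way to do that with FANN's C API (callable from C++) might look like the following; the layer sizes, epoch count, and file name are placeholders to be replaced by your actual data:

#include <fann.h>    // FANN C API, assumed installed
#include <chrono>
#include <cstdio>

int main() {
    // autoencoder shape: input, bottleneck code, reconstructed output
    const unsigned int layers[3] = {1000, 64, 1000};
    struct fann *ann = fann_create_standard_array(3, layers);

    auto t0 = std::chrono::steady_clock::now();
    // train.data is a placeholder file in FANN's plain-text training format
    fann_train_on_file(ann, "train.data", 10 /*epochs*/, 1 /*reports*/, 0.01f);
    auto t1 = std::chrono::steady_clock::now();

    std::printf("training took %.1f s\n",
                std::chrono::duration<double>(t1 - t0).count());
    fann_destroy(ann);
    return 0;
}

If the time is dominated by the calls into the library, as suggested above, the same benchmark driven from Java via a FANN binding should perform comparably.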

Debug Modelica code

I wonder if there is a way to "debug" Modelica code - I mean stepping through the code line by line and watching how the variables change, that sort of thing.
I know that Modelica code is translated into C; I just want to know if there's a way to do this somehow. If there is, I believe it would be a great improvement for any of the simulation environments. Thanks.
HY
This is a good question and it comes up a lot. But first, let's step back for a second.
The idea of debugging "line by line" is something that comes from imperative programming languages. By "imperative" I mean that a program is simply a sequence of instructions to be carried out in the specified order.
When someone debugs Java or Python, this "line by line" approach makes sense because the statements are the fundamental way behavior is represented. This "line by line" approach could also be extended to modeling formalisms like block diagrams (e.g. Simulink) because, while graphical, they are also imperative (i.e. they constitute steps to be carried out in a specified order).
But Modelica is not an imperative language. There is no notion of steps, statements, or instructions. Instead, we have equations that hold everywhere and at all times. So thinking linearly about debugging doesn't work in Modelica. It is true that you could debug the C code generated from Modelica, but that is typically not very useful because it bears only a partial resemblance to the original equations.
So how do you debug Modelica code? Well, debugging Modelica code is really debugging Modelica equations. Normally, Modelica models are composed of components. The equations that arise when components are connected are generated automatically, so let's stipulate that the Modelica compiler generates those correctly. What's left is the equations in the component models.
The simplest way to approach this is to test each component individually (or at least in the smallest possible models). I often say that trying to debug Modelica components by throwing them all together in a big model is like listening to an orchestra and trying to figure out the one instrument that is out of tune. The fact that these equations in Modelica tend to form simultaneous systems of equations means that errors, when they occur, can propagate immediately to a number of variables.
So your best bet is to go through and create tests for each individual component and verify the behavior of the component. My experience is that when you do this, you can track down and eliminate bugs pretty easily.
Update: You shouldn't need to add outputs to other people's component models to debug them. An output can be created at any level, e.g.
model SystemModel
  SomeoneElsesComponent a;
  SomeOtherGuysComponent b;
end SystemModel;

model SystemModel_Debug
  extends SystemModel;
  output Real someNestedSignalFromA = a.someSubsystem.someSubcomponent.someSignal;
  output Real someOtherNestedSignalFromB = b.anotherSubsystem.anotherSignal;
end SystemModel_Debug;
Of course, this becomes impractical if you have multiple instantiations of a single component. In those cases, I admit that it is easier to modify the underlying model. But if the authors make their models replaceable, you can use the same trick as above (extend their model, add a bunch of custom outputs, and then redeclare your model in place of the original).
There is a transformation debugger in OpenModelica now. It lets you trace which variable is evaluated by which equation.

Matlab vs Aforge vs OpenCV

I am about to start a project in visual image processing and have had no experience with Matlab, AForge, or OpenCV, and I was wondering if anyone had any experiences with these different software packages.
I was also wondering which of the three packages is most efficient. I assume OpenCV, but has anyone had any experience with that?
Thanks
Jamie.
The question you need to ask yourself is which is more important - your time or the computer's time. If your task is really simple, you may be able to code it up in MATLAB and have it work right off the bat. MATLAB is by far the easiest for development - a scripted language with built-in memory management, a huge array of provided functions, and a great interface for displaying and manipulating data while debugging.
On the other hand, MATLAB is at least an order of magnitude slower than compiled OpenCV code for many tasks. This is especially true if you use the Intel Performance Primitives libraries.
If you know how to code in MATLAB, I would suggest writing and debugging your algorithms in that language, then porting them to C/C++ with OpenCV for speed. If there are only a couple of simple functions that you need to speed up, you can call C code from MATLAB (see the sketch below), but it's hard to get this working right the first few times you try, so you're probably better off just rewriting your finished code entirely in C/C++.
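For the "call C code from MATLAB" route, the mechanism is a MEX function. Here is a minimal sketch (the mexFunction gateway and the mx*/mex* calls are MATLAB's documented C API; the doubling operation is a stand-in for your real hot loop):

#include "mex.h"

// After compiling with `mex times_two.c`, call from MATLAB as: y = times_two(x)
void mexFunction(int nlhs, mxArray *plhs[], int nrhs, const mxArray *prhs[]) {
    if (nrhs != 1 || !mxIsDouble(prhs[0]))
        mexErrMsgTxt("expected one double array input");

    mwSize n = mxGetNumberOfElements(prhs[0]);
    plhs[0] = mxCreateDoubleMatrix(1, n, mxREAL);

    const double *in = mxGetPr(prhs[0]);
    double *out = mxGetPr(plhs[0]);
    for (mwSize i = 0; i < n; ++i)
        out[i] = 2.0 * in[i];  // stand-in for the ported computation
}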
First, please elaborate on your project's needs. They have the biggest impact on the choice, along with other factors such as your general programming knowledge (if you haven't dealt with .NET, but only with C++, AForge is not a good choice, for example).
Generally,
Both AForge and OpenCV have a built-in interface to .NET, and OpenCV also has interfaces to C++, Python, and more. Matlab might be more efficient, but if you don't have any experience with it, you will also have to learn its syntax. Take that into consideration.
Matlab probably has the largest variety of functions, but it is more complicated than the other packages. OpenCV and AForge themselves have some differences - see them described in this Stack Overflow question and its answers.
I worked last year on two similar projects involving cars on the highway. As far as I know, Matlab lets you process only one picture frame at a time (though you could of course build an algorithm that works through a stream frame by frame), whereas with Simulink you can process the stream directly.
On the other hand, I found AForge a lot friendlier and easier to use, since you can easily adjust the processing parameters from a GUI, which is not so fast or easy to do in Matlab/Simulink.
I'd go for AForge.NET. It's also fast enough if you're worried about processing speed (at 640x480).
If you are asking about using one of these from .NET, here is a quick rundown (each with a subjective score out of 100):
1. Matlab: mostly used for simulating projects, not for the end prototype; my score: 30.
2. AForge (which I have used in many projects): if you do not need a continuous loop, such as capturing images or recognizing things in an image stream, you'll find it very good; it is easy to use, but best for one-shot processing; my score: 50.
3. OpenCV: very fast and well suited to continuous processing; for example, you can capture images from a webcam and cartoonize them instantly, without any delay (a minimal capture loop is sketched below). It is not as easy to use as AForge, but I like it anyway because of its speed and the MANY functions it gives us, covering almost anything we need in programming; my score: 80.
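A minimal capture loop along those lines, in C++ (cv::VideoCapture and the other calls are OpenCV's documented API; a simple edge filter stands in for the cartoonize step):

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture cap(0);  // open the default webcam
    if (!cap.isOpened()) return 1;

    cv::Mat frame, gray, edges;
    while (true) {
        cap >> frame;  // grab the next frame
        if (frame.empty()) break;

        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::Canny(gray, edges, 80, 160);  // placeholder for a cartoonize filter

        cv::imshow("frames", edges);
        if (cv::waitKey(1) == 27) break;  // Esc quits
    }
    return 0;
}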
Dr.Taha - Tahasoft.net

What's the fastest vector/matrix math library in C for iPhone game?

As the title says, I'm looking for a vector/matrix math library in C optimized for the iPhone/iPod processors.
Or just one that is generally fast.
---(edit)---
I'm sorry for the unclear question.
I'm looking for a fast library for commercial iPhone/iPod games, so GPL libraries cannot be used.
However, I'll stop searching for the single fastest library; that may be meaningless.
As of now (2010-06-26), the Accelerate framework is included in iOS 4, so the vDSP/BLAS functions are available.
It utilizes hardware features (CPU or SIMD) to accelerate floating-point operations, so superior speed (2-4.5x on average, 8x maximum) and lower energy consumption (down to 0.25x) can be gained by using it.
Thanks, people, for the other answers.
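For reference, here is a minimal sketch of calling vDSP from the Accelerate framework (vDSP_vadd and its signature are Apple's documented API; the data is just a toy example):

#include <Accelerate/Accelerate.h>
#include <cstdio>

int main() {
    float a[4] = {1, 2, 3, 4};
    float b[4] = {10, 20, 30, 40};
    float c[4];

    // c[i] = a[i] + b[i]; the 1s are strides, 4 is the element count
    vDSP_vadd(a, 1, b, 1, c, 1, 4);

    std::printf("%f %f %f %f\n", c[0], c[1], c[2], c[3]);
    return 0;
}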
It depends very much on your needs. If you're just using straight floating-point math, you will probably find that the compiler emits software floating point, which will be very slow. So step one is making sure that you use the hardware floating-point unit that is available in the iPhone processor.
Step two is using an already well established library, there are several, Hassan already provided you with a link to the GNU GSL which is nice.
The next step would be to take advantage of the VFP's SIMD-like abilities. The VFP is not actually SIMD, but it does provide SIMD-like instructions whose individual operations are performed consecutively. The advantage of still using these instructions is that your program text will be shorter, allowing better use of the instruction cache and causing fewer problems from missed branch predictions and so forth. I am, however, not aware of any vector library taking advantage of the VFP; you'd have to do a good search and possibly write your own if nothing is available.
Finally, if you still need more speed, you'll want to use the true SIMD unit in the iPhone processor. However, this unit is not a floating-point unit but an integer unit. So, assuming you do want real numbers, you'll be stuck with fixed point; whether you can get away with that depends on your application. Again, I am not aware of any vector library providing fixed-point arithmetic using the SIMD unit in the iPhone processor, so once more you'd need a thorough search and possibly have to get your hands dirty yourself.
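For completeness, fixed-point arithmetic of the kind described above just stores reals as scaled integers. A minimal scalar sketch in Q16.16 format (the helper names are made up for illustration; a SIMD version would apply the same idea to whole vectors):

#include <cstdint>
#include <cstdio>

typedef int32_t fix16;  // Q16.16: 16 integer bits, 16 fractional bits

static fix16 fix16_from_float(float f) { return (fix16)(f * 65536.0f); }
static float fix16_to_float(fix16 x)   { return (float)x / 65536.0f; }

static fix16 fix16_mul(fix16 a, fix16 b) {
    // widen to 64 bits so the intermediate product cannot overflow
    return (fix16)(((int64_t)a * (int64_t)b) >> 16);
}

int main() {
    fix16 x = fix16_from_float(1.5f);
    fix16 y = fix16_from_float(2.25f);
    std::printf("%f\n", fix16_to_float(fix16_mul(x, y)));  // prints 3.375000
    return 0;
}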

Your experiences with Matlab/F#/R for data analysis and modeling algorithms

I've been using F# for a while now to model algorithms before coding them in C++, and also using it afterwards to check the results of the C++ code, and also against real-world recorded data.
For the modeling side of things, it's very handy, but for the 'data mashup' kind of stuff, pulling in data from CSV and other sources, generating statistics, drawing charts etc., my colleague teases me no end ("why are you coding that yourself? It's built in to MatLab").
And I have another colleague who swears by R, which also has charting stuff 'built-in'.
I know that MatLab, R and F# are not strictly comparable, so I'm not asking for a 'feature comparison shoot out'. I just wondered what other people are using for these kind of pre- and post-analysis scenarios, and how happy they are with it.
(If there's anyone out there working on wrapping Microsoft Charts into something F#-friendly, let me know, I'd be happy to participate...)
(Note: answers to this question will be subjective, but based on experience, please)
I have very little experience with F#, but regarding C++/Matlab/R: If the speed of your program's execution is the most important, use C++. If speed of implementation is the most important, use Matlab or R. This is true for a number of reasons, not the least of which is their massive libraries of math/stats packages.
Both Matlab and R can be sped up through parallelism: so generally, I think that speed and quality of implementation should be a bigger concern. That's where the real "value" of programming is taking place, in the design of the application. It's not a minor proposition if you can write 3 or 4 good R programs in the same time it takes you to write 1 good C++ program.
Regarding F#: insofar as it is part of Microsoft's framework, it must have a lot to offer. If you're developing in Visual Studio or working on a big .NET project (for instance), it might make sense to use F#. On the other hand, you can call both Matlab and R from .NET applications, so I would probably argue that their libraries should be a bigger concern. For instance, see this article as an example for R and the Matlab Builder.
Long story short: comparing F# and Matlab/R isn't a good comparison. F# is a general purpose programming language, while Matlab/R can be viewed as massive mathematical/data analysis toolkits. Some people call Matlab or R from F# in order to take advantage of each language's benefits (e.g. see this discussion, this article on Matlab/F#, or this article on R/F#).
So far as charting is concerned: R is extremely strong on this front. Have a look at the graphics view on CRAN and this series of posts on the LearnR blog about Lattice and ggplot2.
I've worked a bit with Matlab and Python/pylab for these purposes. What these tools have 'built in' is a programming environment, a shell, and GUI tools designed for quickly looking at data from a variety of sources.
In a few commands, you can go from having a csv file to interactive plots on the screen, then to an image export in just about any format. It takes a minute or two to go from data to visualization once you have the hang of it. I would imagine this is uncommon in the C++ world (although I have seen some professors with pretty impressive work-flows).
I've tried R, but I can't say much useful about it. It seems to offer about the same set of features, but it may be troublesome to Google for support.
If you are spending more than a couple minutes getting from data to plot using your current method, it's definitely worth learning one of these environments. The best choice depends on your colleagues, your work environment, experience, and your budget.
This is a reasonably close duplicate of the previous question on a suitable functional language for scientific/statistical computing, so you may want to peruse the long and detailed answers there.
The answer depends, as so often, on your experience and prior language training. I very much prefer R for data munging / modeling / visualization.
I use R because on the one hand it has everything built in and on the other hand you can still manipulate almost everything or start from scratch. Nevertheless, R is rather slow for heavy calculations (although I do all my Monte Carlo simulations in it).
I would say that Matlab is best for the general availability of mathematical functionality, R is best for data input/manipulation/visualisation/analysis/etc., and C++ for high-speed subroutines. You can, by the way, easily integrate C++ (or C, Fortran, ...) code in R. Why not read and manipulate the input data in R, apply the models in C++, and analyse/visualize the output back in R? (See the sketch below.)
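To illustrate that last point, here is a minimal sketch of embedding a C++ subroutine in R with the Rcpp package (Rcpp and sourceCpp are real and widely used; the column-sum function itself is just a placeholder for your model code):

#include <Rcpp.h>

// Load from R with Rcpp::sourceCpp("colsums2.cpp"); it then appears as colSums2()
// [[Rcpp::export]]
Rcpp::NumericVector colSums2(Rcpp::NumericMatrix m) {
    Rcpp::NumericVector out(m.ncol());
    for (int j = 0; j < m.ncol(); ++j) {
        double s = 0.0;
        for (int i = 0; i < m.nrow(); ++i) s += m(i, j);
        out[j] = s;
    }
    return out;
}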
I always prototype my models in MATLAB. If my prototype is fast enough, I refactor and it's done. If not, I go back and implement certain functions in C to be called by MATLAB. This requires knowledge of a low-level language, which I think will always be the case if you are doing anything technically challenging.
I'm intrigued with this Lisp flavor if it ever gets off the ground.