I am doing a project in OpenModelica and I have to simulate filters in it using active elements (op-amps). Modelica plots graphs with respect to time, but I want my graphs with respect to frequency so I can analyze the frequency response of the system. I searched the internet but couldn't find anything useful. Please reply as soon as possible.
If you want to plot a variable with respect to another variable, you can use plotParametric from OMShell (OpenModelica Shell). In OMEdit (OpenModelica Connection Editor) you can click on the parametric plot button x(y) and then select the two variables.
I assume that what you want is a Bode plot. If so, it is important to understand that such a plot does not arise from a transient simulation. It is necessary to transform your system into a linear, time-invariant representation in order to express the response of your system in the frequency domain.
I do not know what specific features OpenModelica has in this regard. But those are at least the kinds of things you should search the documentation for. If you have access to MATLAB, then all you really need to do is extract the linearized version of the model (normally expressed as the so-called "ABCD" matrices) and MATLAB can get you the rest of the way.
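For example, once you have the A, B, C, D matrices (the values below are placeholders), a couple of Control System Toolbox calls will produce the Bode plot; this is only a sketch, assuming that toolbox is available:

```matlab
A = [0 1; -1000 -10];   % placeholder state matrix for a second-order filter
B = [0; 1000];
C = [1 0];
D = 0;

sys = ss(A, B, C, D);   % linear time-invariant state-space model
bode(sys)               % magnitude and phase versus frequency
grid on
```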
There is also the Modelica_LinearSystems2 library which might be compatible with OpenModelica (I have no idea). It includes many types of operations you would typically perform on linear systems.
I have been all over Google trying to find a good function/package to perform multivariate regression (i.e. predict multiple continuous variables given another set of multiple continuous variables).
I wish to use something like fitlm(), since that also gives me p-value and R-squared statistics. Does anything like that exist?
MATLAB has a bundle of tools for this; see this page.
I believe that mvregress is the most well-rounded and mainstream tool. See this page for setting up an analysis with it.
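As a concrete starting point, here is a minimal sketch with made-up data, assuming the Statistics (and Machine Learning) Toolbox; the design-matrix construction gives each response its own intercept and slopes:

```matlab
% Made-up data: n observations, p predictors, d continuous responses.
n = 100; p = 3; d = 2;
X = randn(n, p);
Y = X * randn(p, d) + 0.1 * randn(n, d);    % synthetic linear relationship

% One d-by-K design matrix per observation (K = d*(p+1) coefficients in total).
Xcell = cell(n, 1);
for i = 1:n
    Xcell{i} = kron(eye(d), [1, X(i, :)]);
end

[beta, Sigma, ~, CovB] = mvregress(Xcell, Y);
tStats = beta ./ sqrt(diag(CovB));          % rough t-statistics for the coefficients
```

mvregress does not print a fitlm-style table, but beta, Sigma, and CovB give you the pieces to compute test statistics yourself.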
Also, a comment in this post may be useful for alternatives, if needed: it is possible to approach this via separate regression analyses, one for each response variable.
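If you go the separate-regressions route, a simple loop over the response columns (using the same X and Y as in the sketch above, or any n-by-p and n-by-d matrices) gets you the fitlm diagnostics you mentioned:

```matlab
% One fitlm model per response column; each gives p-values, R-squared, etc.
models = cell(1, size(Y, 2));
for k = 1:size(Y, 2)
    models{k} = fitlm(X, Y(:, k));
end
disp(models{1})                      % coefficient table with p-values
r2 = models{1}.Rsquared.Ordinary;    % R-squared for the first response
```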
I often find myself fitting a curve to a scatter plot, knowing that the 'true fit' should have only one inflection point. Any ideas for forcing a fit that will obey this?
I am using MATLAB and Microsoft Excel.
Many thanks
Option 1:
I like to use spline smoothing with the Akaike information criterion (AIC). Although it is a hyper-parametric fit with a large number of candidate analytic inflection points, the smoothed data at the sample points tends to reveal only what is actually in the data.
If your data doesn't actually have an inflection point, this shows up. If it does, it is also usually captured. In statistical jargon, an important cousin of this idea is the "non-informative prior".
Try slides 30-31 here: link.
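This is not the AIC-based selection described above, but here is a sketch of the same idea with the Curve Fitting Toolbox: smooth the data (the smoothing parameter is picked by hand here), then count sign changes of the second derivative at the sample points:

```matlab
x = linspace(0, 10, 50);
y = tanh(x - 5) + 0.05 * randn(size(x));   % made-up data with one inflection point

pp = csaps(x, y, 0.9);                     % cubic smoothing spline (Curve Fitting Toolbox)
d2 = fnval(fnder(pp, 2), x);               % second derivative at the sample points
nInflections = sum(diff(sign(d2)) ~= 0)    % number of sign changes
```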
Option 2:
If you have an older version of MATLAB, you can specify the exact model easily in cftool (not the same as sftool) and then generate an m-file that shows how to reproduce the fit in your own script. Pick a model appropriate to your data.
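The generated code boils down to fittype/fit calls along these lines (Curve Fitting Toolbox assumed; the cubic here is only an example, chosen because it has exactly one inflection point):

```matlab
x = linspace(-3, 3, 40).';
y = x.^3 - 2*x + 0.3*randn(size(x));            % made-up data

ft = fittype('a*x^3 + b*x^2 + c*x + d');        % the exact model you want to enforce
f  = fit(x, y, ft, 'StartPoint', [1 0 -1 0]);   % starting guesses are arbitrary
plot(f, x, y)
```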
I have a curve which looks roughly/qualitatively like the curves displayed in those three images.
The only thing I know is that, by the hardware design (a lin-log camera), the first part of the curve is supposed to be linear and the second part is some sort of logarithmic curve (it might be a combination of two logarithmic curves). But I can't tell the mathematical structure of the equation, e.g. whether it looks like a*log(b)+c or a*(log(c+b))^2, etc. Is there a way to find a good fit/regression for this type of curve, and is there a particular way to do this in MATLAB? :-) I've got the student version, i.e. all the toolboxes.
fminsearch is a very general way to find best-fit parameters once you have decided on a parametric equation, and the Optimization Toolbox has a range of more sophisticated methods.
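As a sketch (the model form below is only a guess at lin-log behaviour, and the data are made up), you write down a candidate equation and let fminsearch minimise the sum of squared errors:

```matlab
x = linspace(0, 100, 200).';
y = 0.5*x ./ (1 + 0.05*x) + 2*log(1 + 0.1*x) + 0.1*randn(size(x));   % made-up data

model = @(p, x) p(1)*x + p(2)*log(1 + abs(p(3))*x);   % abs() keeps the log argument positive
sse   = @(p) sum((y - model(p, x)).^2);
pBest = fminsearch(sse, [0.1, 1, 0.1])                % the initial guess is just a guess
```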
Comparing the merits of one parametric equation against another, however, is a deep topic. The main thing to be aware of is that you can always tweak the equation, adding another term or parameter or whatever, and get a better fit in terms of lower sum-squared-error or whatever other goodness-of-fit metric you decide is appropriate. That doesn't mean it's a good thing to keep adding parameters: your solution might be becoming overly complex. In the end the most reliable way to compare how well two different parametric models are doing is to cross-validate: optimize the parameters on a subset of the data, and evaluate only on data that the optimization procedure has not yet seen.
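A minimal holdout version of that cross-validation idea, reusing x, y, and model from the sketch above: fit on a random 70% of the points and score on the remaining 30%; the model with the lower test error is the fairer pick.

```matlab
n    = numel(x);
idx  = randperm(n);
nTrn = round(0.7 * n);
trn  = idx(1:nTrn);
tst  = idx(nTrn+1:end);

pFit    = fminsearch(@(p) sum((y(trn) - model(p, x(trn))).^2), [0.1, 1, 0.1]);
testSSE = sum((y(tst) - model(pFit, x(tst))).^2)   % evaluate only on unseen points
```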
You can try the "function finder" on my curve fitting web site zunzun.com and see what it comes up with - it is free. If you have any trouble please email me directly and I'll do my best to help.
James Phillips
zunzun#zunzun.com
I did some reading this afternoon about SVMs, and I'm hopeful that this is a very promising approach.
I am currently working on a problem where I'm looking for a pattern in the Fourier spectrum. What I mean is that I have been looking at spectra for days, hoping to find some repeating patterns. I found some criteria that match a certain pattern, but with the next sample the whole pattern can look slightly different. There is always slight deviation, which makes it hard to describe; or, put another way, I might be overlooking something. But I can clearly say which data belong to the training set.
I was hoping to use an SVM, train it, and have it predict the classification. That means if I have a new set of data, it would tell me whether it matches the training data or falls into the "other" group, which could be anything (no need to know what).
Is that something an SVM is able to do, or am I completely off? I couldn't find any good examples of input data to see whether my problem is something I could feed to an SVM.
I am currently using MATLAB.
There has actually been a lot of research on this particular topic, especially with the wavelet transform. Google "wavelet transform and SVM" and you will find a number of papers. From there, you can easily adapt your model from wavelets to the FFT spectrum.
I don't have experience with SVM, but I do have experience with related techniques, and here's what I can say:
In all likelihood, you can't simply go from a spectrum to an SVM to a decision. You need to determine what it is about the spectra that distinguishes your various inputs. For example, if it's the way the data change over time, or the relationship between the high and low frequencies, that makes the inputs different, you need to encode that as a single parameter. E.g., you could make a parameter that is the ratio of some of your higher frequencies to some of your lower frequencies. You may also want to use parameters like the spectral centroid and the zero-crossing rate, which are simpler than the spectrum but may still carry useful information (these are used in audio and speech; I'm not sure whether they apply to whatever you are looking at). Once you have these derived parameters, feed them to the SVM analysis, which will do the sorting.
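Here is a sketch of that "derived features first, SVM second" idea. The spectra and labels are made up (replace them with your FFT magnitudes), and fitcsvm is the classifier in newer Statistics and Machine Learning Toolbox releases (older versions used svmtrain):

```matlab
fs = 8000;                                   % assumed sampling rate
nBins = 257;
f = linspace(0, fs/2, nBins);                % frequency axis

nSamples = 60;
spectra = rand(nSamples, nBins);             % placeholder magnitude spectra
labels  = [ones(30, 1); zeros(30, 1)];       % 1 = "matches the pattern", 0 = "other"
spectra(labels == 1, f >= 1000) = spectra(labels == 1, f >= 1000) + 0.5;   % fake pattern

centroid = (spectra * f.') ./ sum(spectra, 2);                             % spectral centroid
ratio    = sum(spectra(:, f >= 1000), 2) ./ sum(spectra(:, f < 1000), 2);  % high/low energy

features = [centroid, ratio];
mdl = fitcsvm(features, labels, 'KernelFunction', 'rbf', 'Standardize', true);
trainError = resubLoss(mdl)                  % sanity check on the training data only
```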
Other techniques you might want to examine (which also have the same requirements) include HMM (Hidden Markov Models), K-Means, and Logistic Regression.
This may be the wrong place to ask this, but I can't find a better place on the SE network.
I've briefly worked with both MATLAB and Ansys, and from what I have learnt/can gather, MATLAB is a programming environment that has functions for common math, visualization, and analysis operations. You primarily write programs textually (.m files) or use Simulink to generate flow graphs (model-based development). Ansys, on the other hand, is primarily a simulation environment where quite a lot can be done with the GUI alone (3D models, physics domains, configuration, display settings), and you can add equations at various points in the simulation engine in order to modify the simulation flow.
My understanding is cursory and only serves as an overview. Can anyone give a suitable real-world comparison between MATLAB and Ansys (or any other simulation product such as COMSOL) that would help us understand when to use which, and the weaknesses of each system?
I haven't used Ansys, but Ansys is often compared with Comsol, and I've used Comsol and Matlab for years.
Matlab:
MATLAB is a programming language plus the environment that runs it, which means it can do anything that any other programming language can do. So what are its highlights compared to other languages?
Hundreds of built-in functions for working with matrices. For example, in one project I needed to do simple matrix algebra (add, multiply, scale matrices) and also needed the singular value decomposition. SVD is not something you can write in 50 lines of code, so I needed a ready-made library. At the time I used a library for Java and wrote my own code for representing matrices and doing matrix algebra on them. That's a few hundred lines of code. Had I used MATLAB, it would have been about ten lines of code, because all of it is there; I would only have needed to type help svd to find out how to use it. However, if you don't need any of that, stay away from MATLAB at all costs! There are much better languages that are free.
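To make the ten-lines claim concrete, roughly this is all it takes (the matrices are placeholders):

```matlab
A = magic(4);
B = 2*A + eye(4);                         % add, scale, multiply come for free
C = A * B;
[U, S, V] = svd(C);                       % singular value decomposition, no external library
reconstructionError = norm(C - U*S*V')    % essentially zero
```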
Great to use as a calculator that is always open on the desktop, and can do back-of-the-envelope style calculations.
Plotting graphs. Many academics recommend Matlab as the tool of choice for producing publication-quality graphics. These can be exported as PDF and imported into Inkscape for further editing. The best thing is that commands for plotting a graph could be put into a script file, and then parts of it can be changed later as needed, which can save a lot of work compared to manually drawing a graph (imagine you wanted to change the axes or symbols used to present the data points).
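A sketch of that "plot from a script, tweak later" workflow (the data, labels, and file name are arbitrary):

```matlab
x = linspace(0, 10, 200);
y = sin(x) .* exp(-0.2*x);

figure
plot(x, y, 'k-', 'LineWidth', 1.2)
xlabel('Time (s)')
ylabel('Amplitude')
title('Damped response')
print(gcf, '-dpdf', 'response.pdf')   % export to PDF, e.g. for further editing in Inkscape
```

Changing the axes or markers later means editing one line of the script and rerunning it, instead of redrawing the figure by hand.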
Personally, I also use it for curve-fitting. It has many toolboxes, one of which is a neat tool that allows me to find equations that model a set of data points.
Comsol:
Specialised tool for solving partial differential equations (PDEs) on complicated domains using the finite element method (FEM). This might sound obscure, but many real-world engineering needs reduce to this. Such things as:
Finding loads, stresses and strains in civil engineering structures with complicated real-world geometry (what happens when there is gusty wind blowing onto a building or bridge?)
How do currents flow in particular conductive objects?
Chemical reactions in various industrial reactors.
What is the power efficiency of a generator (magnet spinning in coil) design?
How to place aircon outlets in a nontrivially-shaped room to achieve both good temperature distribution and good efficiency?
Comsol, like any other FEM tool that can work with arbitrary equations, can do multiphysics, which means, for example, that one could solve for the chemistry of a battery as well as the temperature and pressure, and for how those feed back into the chemical reaction (speeding it up or slowing it down). Compared with a tool where you need to provide the equations yourself, in Comsol most of the things needed to solve most problems are already there; they just need to be selected and applied to the geometry, which is also built inside Comsol. Equations of arbitrary description can also be introduced.
The mathematical descriptions of how these physical quantities behave are the PDEs.
Once Comsol has finished solving a problem, the data can be exported to MATLAB for post-processing, since MATLAB has much more versatile tools for manipulating data and making plots.
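For example (the file name and column layout below are hypothetical, not a Comsol convention), a result exported as a plain-text table can be read and plotted in MATLAB like this:

```matlab
data = readmatrix('temperature_along_line.txt');   % e.g. column 1: arc length, column 2: temperature
data = data(~any(isnan(data), 2), :);              % drop any header rows that parsed as NaN
plot(data(:, 1), data(:, 2))
xlabel('Arc length (m)')
ylabel('Temperature (K)')
```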