I have a set of lines (image below) which should meet at a number of points. As you can see, the angular coefficients (slopes) don't vary noticeably, which makes the intersection points hard to find. What transformation would you suggest to make things easier (some kind of Hough transform?), or where do you think I should look for inspiration? Thanks!
I'm working in Matlab but I can switch to Python if needed.
I think you're looking for the houghpeaks function (http://www.mathworks.nl/help/images/ref/houghpeaks.html).
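A minimal sketch of how that could look, assuming BW is a binary image containing your lines (hough and houghpeaks are in the Image Processing Toolbox; the intersection step afterwards is my own addition, not something houghpeaks does for you):
% detect the dominant lines in the Hough accumulator
[H, theta, rho] = hough(BW);
peaks = houghpeaks(H, 10);               % keep up to the 10 strongest lines
t = theta(peaks(:, 2)) * pi/180;         % angle of each detected line (radians)
r = rho(peaks(:, 1));                    % signed distance of each detected line
% each line satisfies x*cos(t) + y*sin(t) = r, so two lines intersect where
% [cos(t1) sin(t1); cos(t2) sin(t2)] * [x; y] = [r1; r2]
for i = 1:numel(r)
    for j = i+1:numel(r)
        A = [cos(t(i)) sin(t(i)); cos(t(j)) sin(t(j))];
        if abs(det(A)) > 1e-6            % skip (nearly) parallel pairs
            p = A \ [r(i); r(j)];        % [x; y] of the intersection
            fprintf('intersection at (%.1f, %.1f)\n', p(1), p(2));
        end
    end
end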
I'm working with the MATLAB Curve Fitting tool for the very first time and I have a question. My fit is exponential with two terms and it looks pretty good. The problem is that it doesn't start from P(0,0), although my first measurement does.
Is it possible to force a start value onto my fit? Also, how does R-squared work? Is it safe to rely on?
Thank you so much
See a thorough description of this process here
In short, the most common method of fitting a polynomial in MATLAB, polyfit, does not allow forcing the fit through zero (or any other point), so a different function is required: lsqlin, for example.
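A rough sketch of the idea, assuming x and y are column vectors of your data and that a quadratic forced through the origin is acceptable (lsqlin is in the Optimization Toolbox; the variable names and polynomial degree are just placeholders):
% design matrix for p(x) = c1*x^2 + c2*x + c3
C = [x.^2, x, ones(size(x))];
% equality constraint: p(0) = c3 = 0
Aeq = [0 0 1];
beq = 0;
% least-squares fit subject to the constraint (no inequality constraints)
c = lsqlin(C, y, [], [], Aeq, beq);
yfit = polyval(c, x);                    % coefficients ordered like polyfit's output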
I am working with image processing in MATLAB. I have two different images whose histogram plots are as shown below.
Image 1:
and
Image 2:
I have multiple images like these, and the only distinguishing (separating) feature is that some have a single peak and others have two peaks.
In other words some can be thresholded (to generate good results) while others cannot. Is there any way I can separate the two images? Are there any functions that do so in MATLAB or any reference code that will help?
The function used is imhist()
If you mean "distinguish" by "separate", then yes: The property you describe is called bimodality, i.e. you have 2 peaks that can be seperated by one threshold. So your question is actually "how do I test for an underlying bimodal distribution?"
One option to do this programmatically is Binning. This is not the most robust method but the easiest. It might work, it might not.
Kernel smoothing is probably the more robust solution. You basically shift and scale a certain function (e.g. a Gaussian) to fit the data. This can be done with histfit in MATLAB.
There are more solutions to this problem, which you can research for yourself now that you know the terms to search for. Be aware, though, that your problem is not a trivial one if you want to do it properly.
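As a rough illustration of the binning idea (not a robust test), one could smooth the imhist counts and count significant local maxima. Here I is assumed to be your grayscale image, and the smoothing window and peak threshold are arbitrary choices:
counts = imhist(I);                           % 256-bin grayscale histogram
smoothed = conv(counts, ones(9,1)/9, 'same'); % crude moving-average smoothing
thresh = 0.05 * max(smoothed);                % ignore tiny bumps
ismax = smoothed(2:end-1) > smoothed(1:end-2) & ...
        smoothed(2:end-1) >= smoothed(3:end) & ...
        smoothed(2:end-1) > thresh;
numPeaks = sum(ismax);                        % 1 -> unimodal, 2 -> bimodal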
I have a set of 3D points, numbering around 1 million. I am looking to visualise these with MATLAB.
I have tried the following functions:
plot3
scatter3
But they are both very sluggish. Is there a more efficient way to visualise this many points in MATLAB? Maybe a way to mesh the points?
If not, can anyone suggest a plug-in or even a different program for visualising 3D points?
You're going to run into efficiency issues no matter what plugin/program you use if you want all million+ points to show up in a plot. My suggestion would be to downsample. Use the plot3 or scatter3 function on every other point, or every nth point, until you get a figure that is not sluggish. As long as the variance in your data isn't astronomical, downsampling a little bit shouldn't affect the overall distribution of points (given that you have a million+ points). And any software that is able to display that much data without being sluggish is most likely downsampling/binning or using some interpolation technique to do so (so you might as well have control over it).
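A minimal sketch of that downsampling, assuming pts is an N-by-3 matrix of your points (the step size of 10 is arbitrary; increase it until the figure feels responsive):
step = 10;                                   % keep every 10th point
idx = 1:step:size(pts, 1);
plot3(pts(idx,1), pts(idx,2), pts(idx,3), '.', 'MarkerSize', 1);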
fscatter3, from the File Exchange, does what you want.
Is there a specific reason to actually have it display that many points?
I Googled around a bit and found some people who have had similar issues (someone suggested Avizo as an alternate program but I've never used it):
http://www.mathworks.com/matlabcentral/newsreader/view_thread/308948
mathworks.com/matlabcentral/newsreader/view_thread/134022 (not clickable because I don't have enough rep to post more than two links)
An alternate solution would be to generate a histogram if you're more interested in the density of the data:
http://blogs.mathworks.com/videos/2010/01/22/advanced-making-a-2d-or-3d-histogram-to-visualize-data-density/
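For example, a crude 2-D density image of the x-y projection can be built with base MATLAB only (pts is again assumed to be an N-by-3 matrix, and 200 bins per axis is an arbitrary choice):
nb = 200;                                    % bins per axis
x = pts(:,1);  y = pts(:,2);
xi = min(nb, 1 + floor((x - min(x)) ./ (max(x) - min(x)) * nb));
yi = min(nb, 1 + floor((y - min(y)) ./ (max(y) - min(y)) * nb));
D = accumarray([yi xi], 1, [nb nb]);         % point count per grid cell
imagesc(D); axis xy; colorbar;               % brighter = denser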
If you know beforehand roughly the coordinates of the feature you are looking for, try passing the cloud through a simple pass-through filter, which essentially crops your point cloud. I.e. if you know that the feature is at an x-coordinate > 5, remove all points with x-coordinate < 5.
This filter could, for the first coordinate, be realized as
data = data(data(:,1) > 5, :);
provided that your 3D data is stored in an n-by-3 matrix.
This, together with downsampling, could help you out. If you still find the performance lagging, consider using something like the PCD viewer in the Point Cloud Library; see half-way down the page at
http://pointclouds.org/documentation/overview/visualization.php
It is a stand-alone app you could launch from MATLAB. I find its performance far better than the sluggish MATLAB plotting tools.
For anyone who is interested, I ended up finding a point cloud visualiser called CloudCompare. It is extremely fast and allows selection and segmentation as well as filtering of point clouds.
I need to construct an interpolating function from a 2D array of data. The reason I need something that returns an actual function is that I need to be able to evaluate the function as part of an expression that I need to numerically integrate.
For that reason, "interp2" doesn't cut it: it does not return a function.
I could use "TriScatteredInterp", but that's heavy-weight: my grid is equally spaced (and big); so I don't need the delaunay triangularisation.
Are there any alternatives?
(Apologies for the 'late' answer, but I have some suggestions that might help others if the existing answer doesn't help them)
It's not clear from your question how accurate the resulting function needs to be (or how big 'big' is), but one approach you could adopt is to regress the data points you have using a least-squares or Kalman filter-based method. You'd need to do this with a number of candidate function forms and then choose the one that is 'best', for example by using a measure such as MAE or MSE.
Of course this requires some idea of what form the underlying function could take, but your question isn't clear as to whether you have this kind of information.
Another approach that could work (and requires no knowledge of what the underlying function might be) is the use of the fuzzy transform (F-transform) to generate line segments that provide local approximations to the surface.
The method for this would be:
Define a 2D universe that includes the x and y domains of your input data
Create a 2D fuzzy partition of this universe, choosing partition sizes that give the accuracy you require
Apply the discrete F-transform using your input data to generate fuzzy data points in a 3D fuzzy space
Pass the inverse F-transform as a function handle (along with the fuzzy data points) to your integration function
If you're not familiar with the F-transform then I posted a blog a while ago about how the F-transform can be used as a universal approximator in a 1D case: http://iainism-blogism.blogspot.co.uk/2012/01/fuzzy-wuzzy-was.html
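As a concrete illustration of that 1-D case, here is a minimal sketch of the direct and inverse F-transform over a uniform triangular partition. It assumes p and f are vectors of sample locations and values with the same size and orientation, and that n (the number of basis functions, an accuracy knob) is small enough that every component's support contains at least one sample; the 2-D version in the steps above uses a tensor product of two such partitions:
a = min(p);  b = max(p);
n = 20;                                      % number of basis functions (accuracy knob)
h = (b - a) / (n - 1);                       % spacing of the partition nodes
xk = linspace(a, b, n);                      % node of each triangular basis function
A = @(x, k) max(0, 1 - abs(x - xk(k)) / h);  % k-th triangular membership function
Fk = zeros(1, n);
for k = 1:n
    w = A(p, k);                             % membership of each sample in component k
    Fk(k) = sum(w .* f) / sum(w);            % direct F-transform component
end
% inverse F-transform: a plain function handle you can hand to an integrator
fhat = @(x) arrayfun(@(xq) sum(Fk .* arrayfun(@(k) A(xq, k), 1:n)), x);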
To see the mathematics behind the method and extend it to a multidimensional case, the University of Ostrava has published a PhD thesis that explains its application to various engineering problems and also provides an example of how it is constructed for the case of a 2D universe: http://irafm.osu.cz/f/PhD_theses/Stepnicka.pdf
If you want a function handle, why not define f = @(xi,yi) interp2(X,Y,Z,xi,yi)?
It might be a little slow, but I think it should work.
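For instance, that handle can then go straight into a numerical integrator. A sketch, assuming X, Y, Z already hold your gridded data, and using quad2d, which expects a function of two array arguments (which interp2 provides):
f = @(xi, yi) interp2(X, Y, Z, xi, yi);      % interpolant wrapped as a function handle
Q = quad2d(f, min(X(:)), max(X(:)), min(Y(:)), max(Y(:)));   % integral over the grid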
If I understand you correctly, you want to perform a surface/line integral of 2-D data. There are ways to do it, but maybe not the way you want. I had the exact same problem and it's annoying! The only way I solved it was by using the Surface Fitting Tool (sftool) to create a surface and then integrating it.
After you create your fit using the tool (it has a GUI as well), it will generate an sfit object which you can then integrate (in 2-D) using quad2d.
I also tried your method of using interp2 and got similar results (to the sfit object), but I had no idea how to do a numerical (line/surface) integration with that data. Creating the sfit object and then integrating it was much faster.
It was the first time I had done something like this, so I confirmed it using a numerically evaluated line integral. According to Stokes' theorem, the surface integral and the line integral should be the same, and it did turn out to be the same.
I asked this question on the Mathematics Stack Exchange: I wanted to do a line integral of 2-D data, ended up doing a surface integral, and then confirmed the answer using a line integral!
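A rough sketch of that workflow in code, assuming x, y, z are column vectors of samples and that a 'poly23' surface happens to be an acceptable fit (the fit type and integration bounds are placeholders; fit/sftool need the Curve Fitting Toolbox):
sf = fit([x, y], z, 'poly23');               % sfit object from the Curve Fitting Toolbox
Q = quad2d(@(xq, yq) sf(xq, yq), ...         % wrap the sfit object in a function handle
           min(x), max(x), min(y), max(y));  % integrate over the sampled rectangle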
I have to use MATLAB to find a certain letter in a TIF text image, in the spatial domain. I have no idea how to do this, and can't find any documentation other than complex code that uses loops, and loops are forbidden. Yes, this is an assignment; I don't want the answer, just some direction on how to even start.
I want to use imfilter with a letter as a template (the filter) and do correlation, but from there I have no idea where to go and don't even know what questions to ask to find more info on the MathWorks site.
The write-up makes it seem simple, but I know nothing of this subject as I am a beginner, so to me this is hard.
thanks
If you have the Image Processing Toolbox, I would suggest using the function normxcorr2. It calculates the normalized cross-correlation between a template image and a larger image, which I think is what you want.
You don't need any for loops to use it, but the method itself probably uses for loops somewhere hidden in the code. I don't know if that counts..
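A minimal sketch of how that might look, assuming letter is a grayscale image of the template character and page is the grayscale text image (both hypothetical variable names):
c = normxcorr2(letter, page);                % normalized cross-correlation map
[~, idx] = max(c(:));                        % location of the strongest match
[ypeak, xpeak] = ind2sub(size(c), idx);
% top-left corner of the matched region in the page image
yoff = ypeak - size(letter, 1) + 1;
xoff = xpeak - size(letter, 2) + 1;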