A compass heading runs from 0° to 360°. Normal averaging routines produce a huge error when the value wraps from 359° to 0°, or back.
I haven't managed to come up with an algorithm that can handle this wrap-around.
The answer is here:
Angle average
I take no responsibility for the content behind this link.
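For reference, the usual trick is to average the headings as unit vectors and take the angle of the result. A minimal sketch (MATLAB, assuming headings in degrees) that handles the 359 → 0 wrap correctly:
headings = [358 2 5 359];                                        % example values near the wrap
meanAngle = atan2d(mean(sind(headings)), mean(cosd(headings)));  % circular mean via unit vectors
meanAngle = mod(meanAngle, 360);                                 % map the result back into 0..360
% meanAngle is about 1 degree here, whereas a naive mean would give ~181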
I have several images that I would like to correct for artifacts. They show different animals, but they look as if they were folded (see the attached image). The folds are straight and go through the wings as well; they are hard to see, but they are there. I would like to remove the folds while preserving the information in the picture (the structure and color of the wings).
I am using MATLAB right now and I have tried several methods, but nothing seems to work.
Initially I tried an FFT, but I do not see any structure in the spectrum that I could remove. I also tried several edge detection methods (Sobel, etc.), but the edge detection always finds the edges of the wings (because they are stronger)
rather than the straight lines. I was wondering if anyone has ideas about how to proceed with this problem? I am not attaching any code because none of the methods I have tried (and described) works.
Thanks in advance for the help.
I'll leave this bit here for anyone who knows how to erase those lines without affecting the quality of the image:
a = imread('https://i.stack.imgur.com/WpFAA.jpg');
b = abs(diff(double(a),1,2));                 % horizontal differences (double avoids uint8 saturation)
b = max(b,[],3);                              % strongest difference across the color channels
c = imerode(b,strel('rectangle',[200,1]));    % keep only responses that persist vertically, i.e. the straight fold lines
I think you should use a 2-dimensional Fast Fourier Transform.
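For what it's worth, a minimal MATLAB sketch of that suggestion (purely illustrative): straight, repeating folds would show up as localized bright spots or lines away from the center of the log-magnitude spectrum, which could then be notched out before inverting the transform.
img = double(rgb2gray(imread('https://i.stack.imgur.com/WpFAA.jpg')));
F = fftshift(fft2(img));                              % 2-D spectrum, DC moved to the center
imagesc(log(1 + abs(F))); axis image; colormap gray   % look for periodic stripe energy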
It might be easier to first try GIMP / Photoshop and see whether a filter there can resolve it.
I'm guessing the CCD sensor was faulty (it looks too good to be an old-scanner problem). Maybe there was electrical interference while the camera sensor was being read out; such signals in theory have a repeating nature.
I don't think this was caused by a wrong color-depth / color-space conversion.
If you like to code, you could also write a custom pixel-based filter in which you take a column of x vertical pixels (say 20 or so) and compare it to the next column of 20 pixels. Compare in HSL (L = lightness), not RGB.
Calculate the brightness changes this way for all pixels.
Then, per pixel, check whether H (hue) is within the range of the nearby pixels and take a sliding average of their brightness: e.g. take 30 pixels horizontally, calculate the average brightness of the first 10 and the last 10 pixels, and apply that brightness to the center pixel 15 (try 30 / 15 / 10 and see what works well).
Since whole strokes appear brighter/darker, such a filter would smooth that effect out; the difficulty is preserving the other patterns (the wings are less distorted). Knowing which color space the sensor used might allow a better choice than HSL, maybe HSV or so.
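A very rough MATLAB sketch of this idea, purely illustrative: the window sizes and the hue-variation threshold are guesses, and HSV is used instead of HSL simply because MATLAB has rgb2hsv built in.
rgb = im2double(imread('https://i.stack.imgur.com/WpFAA.jpg'));
hsv = rgb2hsv(rgb);
v = hsv(:,:,3);                                    % brightness channel
k = [ones(1,10) zeros(1,10) ones(1,10)] / 20;      % average of the outer 10+10 pixels of a 30-pixel window
vAvg = imfilter(v, k, 'replicate');                % horizontal sliding-window brightness
hueVar = stdfilt(hsv(:,:,1), ones(1,31));          % local hue variation along each row
mask = hueVar < 0.05;                              % "hue is within range of nearby pixels" (threshold is a guess)
v(mask) = vAvg(mask);                              % only correct brightness in flat-hue regions
hsv(:,:,3) = v;
imshow(hsv2rgb(hsv))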
I'm an R programmer, but I found that QGIS is better for some geo projects. I'm having a tough time with something and I've searched. Basically, I want to create a buffer around 5 points so that I can then sum the population within the buffer area. The issue I'm having is that the Buffer item is not in my Geoprocessing Tools; rather, I have a Fixed Buffer (QGIS v2.18.14, Mac). When I use the fixed buffer I get a gigantic circle that takes up the whole map. Any ideas where I might be going wrong?
Most probably the SRID of your points is 4326, whose units are degrees rather than meters. You should reproject the points to a meter-based SRID, say 3857, then apply the buffer in meters. Afterwards you can convert your points back to 4326.
The specs for the web audio API dynamic compressor node refer to some curve being drawn over various decibel values. How can I visualize that curve?
For filter nodes, the web audio API provides a getFrequencyResponse method that produces data that can be visualized on a 2D canvas.
Is there a similar method for the dynamics compressor node? Or are there well-known formulas for computing the magnitude of the node's effect at various dB values?
I'm not sure exactly how to calculate the curve for knee, but I'm pretty sure it shouldn't be super difficult. Ignoring the knee, here's what you'd need:
First, you'd start out with a line that has a slope of 1 (a 45-degree angle, up and to the right). Another way of saying that is output = input.
Then, when you hit threshold, you change the slope of the line to match your compression ratio. So if your ratio is 2.3:1, your slope above the threshold would be output = input / 2.3.
Anyway, I'm sure if you do some searching, you can figure out how to factor in the knee. It's probably just a parabola that joins the two slopes (with a vertex at the point where they would normally intersect if the knee were 0). Then you just need to figure out what the knee value does, but if you read the Web Audio spec, the unit for knee is dB – which leads me to believe this isn't really implementation-specific. I think there probably is a Right Way™ to do it.
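To make that concrete, here is a rough sketch of the static curve (MATLAB, just to plot the math; it ports directly to a canvas). The knee segment is one plausible parabolic blend whose slope goes smoothly from 1 to 1/ratio; as noted, the real knee shape is implementation-defined. The threshold/knee/ratio values used here are the node's defaults.
threshold = -24; knee = 30; ratio = 12;            % DynamicsCompressorNode default values, in dB
x = linspace(-100, 0, 1000);                       % input level in dB
y = zeros(size(x));
for n = 1:numel(x)
    d = x(n) - threshold;
    if d <= 0
        y(n) = x(n);                                          % below threshold: output = input
    elseif d < knee
        y(n) = x(n) + (1/ratio - 1) * d^2 / (2*knee);         % parabolic knee (slope 1 -> 1/ratio)
    else
        kneeEnd = threshold + knee + (1/ratio - 1) * knee/2;  % output level where the knee ends
        y(n) = kneeEnd + (d - knee) / ratio;                  % above the knee: slope = 1/ratio
    end
end
plot(x, y); xlabel('input (dB)'); ylabel('output (dB)'); grid on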
Unfortunately there is no easy way to examine the effect of the dynamics compressor node, and the actual implementation isn't specified in the WebAudio spec. The only way to know the effect is to examine the source code, or perhaps to feed sine waves of different frequencies to the node and examine the output to see, experimentally, what is happening. It might be hard to capture the effect of all of the parameters that way.
I'm trying to write some code that helps me in my biology work.
The idea is to analyze a video file of contracting cells in a tissue
Example 1
Example 2: youtube.com/watch?v=uG_WOdGw6Rk
And plot out the following:
Count of beats per min.
Strength of beat
Regularity of beating
So I wrote MATLAB code that loops through a video, compares each frame against the one that follows it, checks whether anything changed between the frames, and plots these changes as a curve.
Example of my code's results
Core of the current code I wrote:
amp = [];                                        % recorded frame-to-frame differences
time = [];                                       % corresponding time stamps in seconds
vid = im2bw(read(vidObj,1), graythresh(rgb2gray(read(vidObj,1))));   % first frame as the initial reference
for i = 2:totalframes
    compared = read(vidObj,i);
    ref = rgb2gray(compared);                    % convert to gray
    level = graythresh(ref);                     % calculate threshold
    compared = im2bw(compared,level);            % convert to binary
    differ = sum(sum(imabsdiff(vid,compared)));  % sum of differences between the 2 frames
    if (differ ~= 0) && ~any(amp == differ)      % 0 = no change happened, so don't record that
        amp(end+1) = differ;                     % save the difference to array amp
        time(end+1) = i/framerate;               % save the time in seconds (separate array so both can be filtered later)
        vid = compared;                          % current frame becomes the reference for the next comparison
    end
end
figure, plot(time,amp);
=====================
So that's my code. Is there a way I can improve it so that I get better results?
I get the feeling that imabsdiff is not exactly what I should use, because my video contains a lot of noise that strongly affects my results, and I suspect all my amp data is actually bogus!
Also, at the moment I can only extract the beating rate from this, by counting peaks. How can I improve my code so it can produce all of the required data?
Thanks, I really appreciate your help. This is a small portion of the code; if you need more info please let me know.
You say you are trying to write "simple code", but this is not really a simple problem. If you want to measure the motion accurately, you should use an optical flow algorithm or look at the deformation field from a registration algorithm.
EDIT: As Matt is saying, and as we see from your curve, your method is suitable for extracting the number of beats and the regularity. To accurately find the strength of the beats, however, you need to calculate the movement of the cells (more movement = stronger beat). Unfortunately, this is not straightforward, and that is why I gave you links to two algorithms that can calculate the movement for you.
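As a very rough illustration of the optical-flow approach in MATLAB (only a sketch: it assumes the Computer Vision Toolbox, the noise threshold is a guess, and vidObj / totalframes / framerate are the variables from the question's code):
opticFlow = opticalFlowLK('NoiseThreshold', 0.009);   % Lucas-Kanade optical flow object
strength = zeros(totalframes, 1);
for i = 1:totalframes
    frame = rgb2gray(read(vidObj, i));
    flow = estimateFlow(opticFlow, frame);            % per-pixel motion relative to the previous frame
    strength(i) = mean(flow.Magnitude(:));            % average motion = a rough "beat strength"
end
plot((1:totalframes)/framerate, strength)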
A few fairly simple things to try that might help:
I would look in detail at what your thresholding is doing, and whether that's really what you want to do. I don't know what graythresh does exactly, but it's possible it's lumping different features that you would want to distinguish into the same pixel values. Have you tried plotting the differences between images without thresholding? Or you could threshold into multiple classes, rather than just black and white.
If noise is the main problem, you could try smoothing the images before taking the difference, so that differences caused by noise would be evened out but differences in large features, caused by motion, would still be there (see the sketch at the end of this answer).
You could try edge-detecting your images before taking the difference.
As a previous answerer mentioned, you could also look into motion-tracking and registration algorithms, which would estimate the actual motion between each image, rather than just telling you whether the images are different or not. I think this is a decent summary on Wikipedia: http://en.wikipedia.org/wiki/Video_tracking. But they can be rather complicated.
I think if all you need is to find the time and period of contractions, though, then you wouldn't necessarily need to do a detailed motion tracking or deformable registration between images. All you need to know is when they change significantly. (The "strength" of a contraction is another matter, to define that rigorously you probably would need to know the actual motion going on.)
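For the smoothing suggestion, a rough MATLAB sketch (kernel size and sigma are guesses; vidObj, totalframes and framerate are the variables from the question's code): blur each frame before differencing so pixel noise largely cancels while real motion survives.
g = fspecial('gaussian', [9 9], 2);                   % Gaussian smoothing kernel
prev = imfilter(rgb2gray(read(vidObj,1)), g, 'replicate');
amp = zeros(totalframes-1, 1);
for i = 2:totalframes
    cur = imfilter(rgb2gray(read(vidObj,i)), g, 'replicate');
    amp(i-1) = sum(sum(imabsdiff(prev, cur)));        % change score between smoothed frames
    prev = cur;
end
plot((2:totalframes)/framerate, amp)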
What are the structures we see in the video? For example, what is the big dark object in the lower part of the image? This object would be relatively easy to track, but would data from this object be relevant for cell contraction?
Is this image from a light microscope? At what magnification? What is the scale?
From the video it looks like there are several motions and regions of motion. So should you focus on a smaller or larger area to get your measurements? Per-cell contraction or region contraction? From experience I know that changing what you do at the microscope might be much better than complex image processing ;)
I had success with Gunn and Nixon's dual snake for a similar problem:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.64.6831
I placed the first approximation in the first frame by hand and used the segmentation result as the starting curve for the next frame, and so on. My implementation of this is from 2000 and I only have it on paper, but if you find Gunn and Nixon's paper interesting I can probably find my code and scan it.
@Matt suggested smoothing and edge detection to improve your results. This is good advice. You can combine smoothing, thresholding and edge detection in one function call, the Canny edge detector. Then you can dilate the edges to get greater overlap between frames. Little overlap will probably mean a big movement between frames. You can use this the same way as before to find the beat. You can then make a second pass and add up all the dilated edge images belonging to one beat. This should give you an idea of the area traced out by the cells as they move through a contraction. Maybe this can be used as a useful measure of the contraction of a large cluster of cells.
I don't have access to Matlab and the Image Processing Toolbox now, so I can't give you tested code. Here are some hints: http://www.mathworks.se/help/toolbox/images/ref/edge.html , http://www.mathworks.se/help/toolbox/images/ref/imdilate.html and http://www.mathworks.se/help/toolbox/images/ref/imadd.html.
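Since I can't test it, take this only as a rough, untested sketch of what those calls might look like (variable names come from the question's code; the dilation radius is a guess):
se = strel('disk', 3);                                 % dilation radius: tune for cell size
prev = imdilate(edge(rgb2gray(read(vidObj,1)), 'canny'), se);
overlap = zeros(totalframes-1, 1);
accum = zeros(size(prev));                             % accumulated edge map, e.g. over one beat
for i = 2:totalframes
    cur = imdilate(edge(rgb2gray(read(vidObj,i)), 'canny'), se);
    overlap(i-1) = nnz(prev & cur) / max(nnz(prev | cur), 1);   % little overlap = big movement
    accum = imadd(accum, double(cur));                 % area traced out by the moving edges
    prev = cur;
end
plot((2:totalframes)/framerate, overlap)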
I want to create an application that can detect the number of spins when the user rotates the iPhone. Currently, I am using the Compass API to get the angle, and I have tried many ways to detect a spin. Below is the list of solutions I've tried:
1/ Create 2 angle traps (segments of the full circle) and detect whether the angle we get from the compass has passed them or not.
2/ Sum all the angular distances between successive compass updates (in the updateHeading function), then divide the summed angle by 360 => we get the number of spins.
The problem is: when the phone is rotated too fast, the compass cannot keep up with the speed of the phone, and it only returns the latest angle (not a continuous stream matching the real rotation).
We also tried to use the accelerometer to detect spins. However, this does not work when you rotate the phone on a flat surface.
If you have any solution or experience on this issue, please help me.
Thanks so much.
The iPhone 4 contains a MEMS gyroscope, so that's the most direct route.
As you've noticed, the magnetometer has sluggish response. This can be reduced by using an anticipatory algorithm that uses the sluggishness to make an educated guess about what the current direction really is.
First, you need to determine the actual performance of the sensor. To do this, you need to rotate it at a precise rate at each of several rotational speeds, and record the compass behavior. The rotational platform should have a way to read the instantaneous position.
At slower speeds, you will see a varying degree of fixed lag. As the speed increases, the lag will grow until it approaches 180 degrees, at which point the compass will suddenly flip. At higher speeds, all you will see is flipping, though it may appear to not flip when the flips repeat at the same value. At some of these higher speeds, the compass may appear to rotate backwards, opposite to the direction of rotation.
Getting a rotational table can be a hassle, and ensuring it doesn't affect the local magnetic field (making the compass useless) is a challenge. The ideal table will be made of aluminum, and if you need to use a steel table (most common), you will need to mount the phone on a non-magnetic platform to get it as far away from the steel as possible.
A local machine shop will be a good place to start: CNC machines are easily capable of doing what is needed.
Once you get the compass performance data, you will need to build a model of the observed readings vs. the actual orientation and rotational rate. Invert the model and apply it to the readings to obtain a guess of the actual readings.
A simple algorithm implementation will be to keep a history of the readings, and keep a list of the difference between sequential readings. Since we know there is compass lag, when a difference value is non-zero, we will know the current value has some degree of inaccuracy due to lag.
The next step is to create a list of 'corrected' readings, where the known lag of the prior actual values is used to generate an updated value that is added to the last value in the 'corrected' list and stored as the newest value.
When the cumulative correction (the difference between the latest values in the actual and corrected lists) exceeds 360 degrees, it means we basically don't know where the compass is pointing. Hopefully that point won't be reached, since most rotational motion should generally be of fairly short duration.
However, since your goal is only to count rotations, you will be off by less than a full rotation until the accumulated error reaches a substantially higher value. I'm not sure what this value will be, since it depends on both the actual compass lag and the actual rate of rotation. But if you care only about a small number of rotations (5 or so), you should be able to obtain usable results.
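Just to make the rotation-counting bookkeeping concrete, here is a tiny sketch of the accumulation idea (approach 2 in the question), written in MATLAB-style pseudocode for illustration only; the heading samples are made up, and on the device this would live in the heading-update callback.
headings = [350 355 2 10 25 40 60 90 130 180 230 280 330 10 50];   % made-up compass samples in degrees
total = 0;
for k = 2:numel(headings)
    d = headings(k) - headings(k-1);
    d = mod(d + 180, 360) - 180;     % wrap to [-180, 180) so 355 -> 2 counts as +7, not -353
    total = total + d;               % accumulated signed rotation in degrees
end
spins = total / 360;                 % number of full turns (about 1.2 for these samples)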
You could use the acceleration data to estimate how fast the phone is spinning and use that to fill in the blanks until the phone has stopped, at which point you could query the compass again.
If you're using an iPhone 4, the problem has been solved and you can use Core Motion to get rotational data.
For earlier devices, I think an interesting approach would be to try to detect wobbling as the device rotates, using UIAccelerometer on a very fine reporting interval. You might be able to get some reasonable patterns detected from the motion at right angles to the plane of rotation.