Flip face in obj file - unity3d

I'm dynamically creating a 3D model and writing an .obj file. I'm having a problem with flipping the visible side of faces.
I've made a simple example:
v 0.0 0.0 0.0
v 0.0 1.0 0.0
v 1.0 0.0 0.0
v 1.0 1.0 0.0
vn 0.0 0.0 -1.0
f 1//1 4//1 3//1
f 1//1 2//1 4//1
The above is a square divided into two triangles. The vn line is the face normal (the vector perpendicular to the face). I've read online that to flip a face, you can negate the normal vector. However, if I multiply the normal vector by -1 and try the following...
v 0.0 0.0 0.0
v 0.0 1.0 0.0
v 1.0 0.0 0.0
v 1.0 1.0 0.0
vn 0.0 0.0 1.0
f 1//1 4//1 3//1
f 1//1 2//1 4//1
It doesn't actually flip the visible side of the face when I import it into Unity. The lighting changes a little bit, but the same side is still visible, and when I orbit to the opposite side the face is still invisible.

The normal only influences the lighting. To flip a face, you need to reverse the index order (the winding) of the triangle, like below:
f 3//1 4//1 1//1
f 4//1 2//1 1//1
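For illustration, here is a minimal Swift sketch of one way a generator could emit its f lines with reversed winding; the faceLine helper and its (vertex, normal) pair representation are hypothetical, not tied to any particular library.

// Hypothetical helper: emit an OBJ face line, optionally with the winding
// reversed so that the opposite side becomes the front face.
// Indices are 1-based (vertex, normal) pairs, matching the f v//vn syntax.
func faceLine(_ corners: [(v: Int, vn: Int)], flipped: Bool) -> String {
    let ordered = flipped ? Array(corners.reversed()) : corners
    return "f " + ordered.map { "\($0.v)//\($0.vn)" }.joined(separator: " ")
}

// The two triangles from the question, flipped:
print(faceLine([(v: 1, vn: 1), (v: 4, vn: 1), (v: 3, vn: 1)], flipped: true))  // f 3//1 4//1 1//1
print(faceLine([(v: 1, vn: 1), (v: 2, vn: 1), (v: 4, vn: 1)], flipped: true))  // f 4//1 2//1 1//1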

Related

Transform midi pitch bend to 0 to 4 logarithmic scale

I have a MIDI pitch bend message which needs to be transformed from a linear scale between 0 and 16368 to a logarithmic scale between 0.0 and 4.0.
I know that when the pitch bend is at 12432 the value needs to be 1.0, and that at 16368 it should be 4.0.
How can I program a function in Swift to convert between these two scales?
I'm not sure what you want to achieve, but the logarithm has a vertical asymptote, so you should define the left and right bounds of the abscissa. One possible formula:
y = 4 * ln( (x + 1) * (e - 1) / 16368 + 1 )
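A minimal Swift sketch of this mapping (the function name is my own; note that, like the formula, it pins 0 near 0.0 and 16368 near 4.0 but does not force 12432 to land exactly on 1.0):

import Foundation

// Sketch of y = 4 * ln((x + 1) * (e - 1) / 16368 + 1)
func pitchBendToLog(_ bend: Double) -> Double {
    let e = exp(1.0)
    return 4.0 * log((bend + 1.0) * (e - 1.0) / 16368.0 + 1.0)
}

pitchBendToLog(0.0)      // ~0.0
pitchBendToLog(16368.0)  // ~4.0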

CGAffineTransform scale won't scale if using variables

This is driving me crazy.
I'm applying a scale transform. If I type the numbers statically it scales perfectly well, but if I use variables it always resets to 1.0, even though when I print the new transform the scale is applied.
This works:
let scaleTransform = origTranform.scaledBy(x: 0.28125, y: 0.28125)
This doesn't:
let s = finalVideoFrame.height/size.height;
let scaleTransform = origTranform.scaledBy(x: s, y: s)
If I print scaleTransform I get the right values in either case.
▿ CGAffineTransform
- a : 0.28125
- b : 0.0
- c : 0.0
- d : 0.28125
- tx : 0.0
- ty : 0.0
I'm using this transform to set it on an AVMutableVideoCompositionLayerInstruction.

Element by Element Matrix Multiplication in Scala

I have an input MLlib matrix like this:
matrix1: org.apache.spark.mllib.linalg.Matrix =
1.0 0.0 2.0 1.0
0.0 3.0 1.0 1.0
2.0 1.0 0.0 0.0
The dimensions of matrix1 are 3*4.
I need to do an element-by-element matrix multiplication with another matrix; the dimensions of the two matrices will be the same in all cases. Let us assume I have another matrix named matrix2 like
matrix2: org.apache.spark.mllib.linalg.Matrix =
3.0 0.0 2.0 1.0
1.0 9.0 5.0 1.0
2.0 5.0 0.0 0.0
with dimensions 3*4
My resultant matrix should be,
result: org.apache.spark.mllib.linalg.Matrix =
3.0 0.0 4.0 1.0
0.0 27.0 5.0 1.0
4.0 5.0 0.0 0.0
How can I achieve this in Scala? (Note: the built-in multiply function of Spark MLlib performs standard matrix multiplication, not element-wise multiplication.)
Below is one way of doing it. Here we iterate over both matrices column-wise and multiply their elements pairwise. This solution assumes that both matrices have the same dimensions.
First let's create the test matrices as given in the question.
import org.apache.spark.mllib.linalg.{DenseMatrix, Matrix}
import scala.collection.mutable.ArrayBuffer

// creating the example matrices as per the question (DenseMatrix stores values column-major)
val m1: Matrix = new DenseMatrix(3, 4, Array(1.0, 0.0, 2.0, 0.0, 3.0, 1.0, 2.0, 1.0, 0.0, 1.0, 1.0, 0.0))
val m2: Matrix = new DenseMatrix(3, 4, Array(3.0, 1.0, 2.0, 0.0, 9.0, 5.0, 2.0, 5.0, 0.0, 1.0, 1.0, 0.0))
Now let's define a function which takes two matrices and returns their element-wise product.
// define a function to calculate the element-wise multiplication
def elemWiseMultiply(m1: Matrix, m2: Matrix): Matrix = {
  val arr = new ArrayBuffer[Array[Double]]()
  val m1Itr = m1.colIter // operate on each column
  val m2Itr = m2.colIter
  while (m1Itr.hasNext) {
    // zip both columns and then multiply element by element
    arr += m1Itr.next.toArray.zip(m2Itr.next.toArray).map { case (a, b) => a * b }
  }
  // return the resultant matrix (columns flattened in column-major order)
  new DenseMatrix(m1.numRows, m1.numCols, arr.flatten.toArray)
}
You can then call this function for your element-wise multiplication.
// call the function on m1 and m2
elemWiseMultiply(m1, m2)
//output
//3.0 0.0 4.0 1.0
//0.0 27.0 5.0 1.0
//4.0 5.0 0.0 0.0

How to calculate the mean of the function in a Gaussian fit?

I'm using the curve fitting app in MATLAB. If I understand correctly, the "b1" coefficient in the left box is the mean of the function, i.e. the x point where y = 50%, and my x data is [-0.8 -0.7 -0.5 0 0.3 0.5 0.7], so why is this number so big in this example (631)?
General model Gauss1:
f(x) = a1*exp(-((x-b1)/c1)^2)
Coefficients (with 95% confidence bounds):
a1 = 3.862e+258 (-Inf, Inf)
b1 = 631.2 (-1.117e+06, 1.119e+06)
c1 = 25.83 (-2.287e+04, 2.292e+04)
Your data looks like a CDF and not a PDF. You can use this code for your solution:
xi=[-0.8,-0.7,-0.5, 0.0, 0.3, 0.5, 0.7];
yi= [0.2, 0.0, 0.2, 0.2, 0.5, 1.0, 1.0];
fun=@(v) normcdf(xi,v(1),v(2))-yi; % residuals: normal CDF with v = [mu, sigma] minus the data
[v]=lsqnonlin(fun,[1,1]); %[1,2]
mu=v(1); sigma=v(2);
x=linspace(-1.5,1.5,100);
y=normcdf(x,mu,sigma);
figure(1);clf;plot(xi,yi,'x',x,y);
annotation('textbox',[0.2,0.7,0.1,0.1], 'String',sprintf('mu=%f\nsigma=%f',mu,sigma),'FitBoxToText','on','FontSize',16);
You will get: mu = 0.24537, sigma = 0.213.
And if you still want to fit a PDF, just change the function 'normcdf' in 'fun' (and 'y') to 'normpdf'.

Hue to wavelength mapping

Is there an algorithm to find the wavelength of a color given its hue value (between 0 and 360 degrees)? Is there any built-in function in MATLAB for this?
While Mark Ransom and Franco Callari are completely right that you cannot recover the spectrum of a perceptual color, nor unambiguously map hue values to wavelengths, you could definitely piece something together if you just want the corresponding monochromatic wavelength.
The part of the hue cycle between 270 and 360 is another problem. There is nothing corresponding to pink or magenta in the light spectrum, so let's assume that we only use hue values between 0 and 270 degrees.
Estimating that the usable part of the visible spectrum is 400-650nm, with wavelength L (in nm) and hue value H (in degrees), you can improvise this:
L = 650 - 250 / 270 * H
650 is the maximum wavelength, 250 is the wavelength range and 270 is the hue range.
I think this should be in the right direction, but there may of course be room for improvement. You might be able to get better results by comparing input hues with the corresponding colors on a visible spectrum chart and adjusting the values somewhat.
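As a quick illustration, here is a minimal Swift sketch of the improvised formula above; the function name and the choice of returning nil outside 0°-270° are my own assumptions.

// Linear improvisation from the answer above: 650 nm at hue 0°, 400 nm at hue 270°.
// Hues above 270° (pink/magenta) have no single corresponding wavelength, so return nil.
func hueToWavelength(_ hue: Double) -> Double? {
    guard (0.0...270.0).contains(hue) else { return nil }
    return 650.0 - 250.0 / 270.0 * hue
}

hueToWavelength(0.0)    // 650.0 (red end)
hueToWavelength(120.0)  // ~538.9 (green)
hueToWavelength(270.0)  // 400.0 (violet end)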
I can't provide a simple solution, but there is something you need to consider:
The visible part of the spectrum is roughly between 380 nm (UV border) and 780 nm (IR border). But what you see (the hue) depends on which cone cells are triggered. Above 660 nm, the M cone is not triggered at all, so everything between 660 nm and 780 nm is hue 0°.
At 580 nm you have yellow with hue 60°, the purest green is at about 535 nm, so that is 120°, and the purest blue (240°) is at about 457 nm.
If you apply a linear function, yellow should be at 597 nm, which it is not, so you'd need a more complex approach.
Above blue, the red cone still gets triggered until we see violet, but we won't reach red again at higher frequencies, so you can't go above approximately 300°.
The hue range between 300° and 360° has no equivalent in the visible spectrum; it can only be simulated by mixing high-frequency light (blue or violet) with red light, which results in something between magenta and red on the purple line.
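One possible "more complex approach" along these lines is to interpolate piecewise-linearly between the anchor points just mentioned. The Swift sketch below is my own, and the 300° -> 380 nm endpoint is an assumption based on the UV border cited above.

// Piecewise-linear interpolation through the anchors from the answer above:
// hue 0° ~ 660 nm, 60° ~ 580 nm, 120° ~ 535 nm, 240° ~ 457 nm,
// plus an assumed 300° ~ 380 nm endpoint; purples above 300° get nil.
func approxWavelength(forHue hue: Double) -> Double? {
    let anchors: [(hue: Double, nm: Double)] = [
        (0, 660), (60, 580), (120, 535), (240, 457), (300, 380)
    ]
    guard hue >= 0, hue <= 300 else { return nil }
    for i in 0..<(anchors.count - 1) {
        let (h0, w0) = anchors[i]
        let (h1, w1) = anchors[i + 1]
        if hue <= h1 {
            // interpolate linearly within this segment
            let t = (hue - h0) / (h1 - h0)
            return w0 + t * (w1 - w0)
        }
    }
    return nil
}

approxWavelength(forHue: 60)   // 580 nm (yellow)
approxWavelength(forHue: 180)  // ~496 nm (blue-green)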
It is possible to find the dominant wavelength of a color/hue. But as said, most colors aren't monochromatic, and the same color can be constructed with different "mixes" of wavelengths, i.e. metamerism.
Also, for the extra-spectral magenta and violet colors only a complementary wavelength can be specified, i.e. the hue/dominant wavelength that additively mixes to white. The white point must also be specified, since there is no absolute white due to adaptation.
Also, psychologically our perception of hues doesn't follow dominant-wavelength lines. See the Munsell and NCS systems.
Here you can calculate the dominant wavelength from RGB values or different CIE systems: http://www.brucelindbloom.com/index.html?Calc.html
I don't have the formula though.
You can then transform RGB to/from HSL and similar, and to/from Munsell or NCS perceptual hues (NCS values are proprietary, so you have to pay and use their software).
Short answer: NO. A given hue can in general be produced by a triple infinity of wavelengths.
A "physical color" is a combination of pure spectral colors (in the visible range). In principle there exist infinitely many distinct spectral colors, and so the set of all physical colors may be thought of as an infinite-dimensional vector space (a Hilbert space). This space is typically notated Hcolor. More technically, the space of physical colors may be considered to be the topological cone over the simplex whose vertices are the spectral colors, with white at the centroid of the simplex, black at the apex of the cone, and the monochromatic color associated with any given vertex somewhere along the line from that vertex to the apex depending on its brightness.
. . .
This system implies that for any hue or non-spectral color not on the boundary of the chromaticity diagram, there are infinitely many distinct physical spectra that are all perceived as that hue or color. So, in general there is no such thing as the combination of spectral colors that we perceive as (say) a specific version of tan; instead there are infinitely many possibilities that produce that exact color. The boundary colors that are pure spectral colors can be perceived only in response to light that is purely at the associated wavelength, while the boundary colors on the "line of purples" can each only be generated by a specific ratio of the pure violet and the pure red at the ends of the visible spectral colors.
The CIE chromaticity diagram is horseshoe-shaped, with its curved edge corresponding to all spectral colors (the spectral locus), and the remaining straight edge corresponding to the most saturated purples, mixtures of red and violet.
(Source)
I found this site that converts a given wavelength to a hue. With a bit of work, you could actually reverse the process. It's not ideal, but I trust the guy who is a consultant in applied mathematics more than myself in solving this issue. That's that.
https://www.johndcook.com/wavelength_to_RGB.html
function convert(input) {
  var w = parseFloat(input);
  var r, g, b, factor; // declared here rather than relying on implicit globals
  if (w >= 380 && w < 440) {
    r = -(w - 440) / (440 - 380);
    g = 0.0;
    b = 1.0;
  } else if (w >= 440 && w < 490) {
    r = 0.0;
    g = (w - 440) / (490 - 440);
    b = 1.0;
  } else if (w >= 490 && w < 510) {
    r = 0.0;
    g = 1.0;
    b = -(w - 510) / (510 - 490);
  } else if (w >= 510 && w < 580) {
    r = (w - 510) / (580 - 510);
    g = 1.0;
    b = 0.0;
  } else if (w >= 580 && w < 645) {
    r = 1.0;
    g = -(w - 645) / (645 - 580);
    b = 0.0;
  } else if (w >= 645 && w < 781) {
    r = 1.0;
    g = 0.0;
    b = 0.0;
  } else {
    r = 0.0;
    g = 0.0;
    b = 0.0;
  }
  // Let the intensity fall off near the vision limits
  if (w >= 380 && w < 420)
    factor = 0.3 + 0.7 * (w - 380) / (420 - 380);
  else if (w >= 420 && w < 701)
    factor = 1.0;
  else if (w >= 701 && w < 781)
    factor = 0.3 + 0.7 * (780 - w) / (780 - 700);
  else
    factor = 0.0;
  var gamma = 0.80;
  var R = (r > 0 ? 255 * Math.pow(r * factor, gamma) : 0);
  var G = (g > 0 ? 255 * Math.pow(g * factor, gamma) : 0);
  var B = (b > 0 ? 255 * Math.pow(b * factor, gamma) : 0);
  return [R, G, B];
}
There's no conversion because they don't overlap.
Hue moves you around an RGB colour space, usually sRGB, which almost all consumer digital equipment uses. That's a subset of the colours our visual systems recognise under normal conditions (defined by CIE 1931), and it does not overlap the vibrant line of colours perceived at monochromatic wavelengths of light at all.
Hues from 0-120 (reddish orange to yellowish green) and near 240 (indigo) are reasonably close, though, and sRGB is quite functional if you don't care about all the washed-out greens and blues. You can fake the violet and red ends of the full spectrum by making them darker at hues around 270 or 330 respectively. The only place you can't really approximate is around 180: computer cyan just isn't close at all to the monochromatic vibrant blue-greens.