Smart pool swimming tracking - distance

I'm a competitive swimmer and would like more detailed feedback on my swims.
A regular smartwatch has a gyro/accelerometer/GPS/magnetometer, but it has a few issues:
Laps aren't counted instantly; there's often some delay after making a turn.
There's considerable deviation in lap time measurements; I would like to measure my lap times more accurately than with a 2-second deviation.
When I'm not moving my arms (e.g. swimming legs only), it doesn't count the lap.
When I take a break, it doesn't detect this fast enough.
These are just limitations of the currently used hardware.
One solution is to use a touchpad, as often used in competition, but when swimming with multiple people in a crowded lane this doesn't work well either.
I've thought about a solution for this, and a possible solution would be to have a "base-station" at one end of a pool, and somehow measure the distance between the base-station and the swimmer. If you know exactly how far away a swimmer is in the pool, you can more accurately count laps/measure lap times/detect pauses.
The only problem I'm facing now is: what method can I use to measure the distance between the base-station and the swimmer?
Some insight:
Sometimes the swimmer is above the water, other times they're underwater.
Sound-based localization is tricky since sound moves about 4x faster in water than in air.
Radio-based localization is tricky since the speed of light in water is about 75% of that in air (and radio waves are attenuated quickly in water).
A camera with an IR LED or swimming-cap color detection could work, but that's not very convenient.
Accuracy needed: ~0.5m
Placement of sensor: can be around wrist, or on head, or somewhere else
Anyone got any tips?

Related

Unity bouncing ball miraculously gains height

I have a sphere with bounciness set to 1
The ball has no drag and uses gravity
It hits a platform, which has bounciness set to 1 and no friction
Yet, the ball bounces higher on every bounce, going to infinity. How is such a thing possible, when I have not given it any extra momentum?
The issue comes from the fact that physics in games happens in discrete frames, and that a moving object will be "inside" another at the frame where there is a collision. The physics engine then has to separate the objects before the next frame, and figure out how much energy to "bounce" with.
One of the steps to do this involves figuring out how much the objects overlap, and that's where this phantom extra energy is coming from. Less error in the overlap equals less error in the energy.
Don't fiddle with the bounciness value; that's a naive workaround, and it sidesteps the issue rather than solving it.
What you should do is fix what's wrong with your collisions. That can be done in a number of ways (see the sketch after this list); the most appropriate/performant one depends on the specific game:
Increase your physics frame rate (decrease the fixed delta time). This reduces overlaps and makes physics frames "smoother". It doesn't really solve the phantom energy, though; it only makes its causes and effects smaller (maybe so small they become unnoticeable, which is all you need).
Set your sphere's collision detection method to Continuous Dynamic, and set your ground to static. If you need the sphere to collide with other objects that show similar issues and that need to be non-static, set their rigidbodies' collision detection method to Continuous. (This is the method I most often find to be the best, but I've had projects where others were better for various reasons.)
Increase your Default Solver Velocity Iterations
Change your solver type to Temporal Gauss Seidel
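For concreteness, here is a minimal Unity C# sketch applying the first three suggestions from code at startup. The class name and values are only illustrative, and the solver type still has to be switched to Temporal Gauss Seidel in Project Settings > Physics (I'm not aware of a runtime API for it):

using UnityEngine;

// Hypothetical setup helper; attach it to the bouncing sphere.
public class BouncySphereSetup : MonoBehaviour
{
    void Awake()
    {
        // Smaller fixed delta time = smaller overlaps per physics step (default is 0.02).
        Time.fixedDeltaTime = 0.01f;

        // Catch fast collisions against the (static) ground collider.
        GetComponent<Rigidbody>().collisionDetectionMode = CollisionDetectionMode.ContinuousDynamic;

        // More velocity iterations = more accurate contact/bounce resolution (default is 1).
        Physics.defaultSolverVelocityIterations = 4;
    }
}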
I have had a similar issue with JavaScript/HTML/CSS canvas animations. I have no explanation for it, but using a bounciness value just below 1, like 0.99999 or 0.969399, should do the trick. I do get what you mean, though; it's weird. Just get close to 1, that's all I can say. I hope this helps anyway.

HoloLens: How to stabilize holograms at far distances

I want to place virtual objects (holograms) at far distances (20+ meters) in the HoloLens 1. However, at such distances holograms become unstable and appear to "swim" in the display. Has anyone had success with this? What worked for you?
Some potential fixes include:
Ensure 60 FPS
Adjust Stabilization Plane
Employ visual markers (vuforia)
Use static room scan (may not scale well)
For me, frame rate is not an issue. And I am using Unity 2017.4.4f1. Currently, I have a single world anchor and all objects are set relative to this anchor.
20+ meters is a lot, and I am not sure whether this will work well enough.
Ensuring 60 fps, or at least 50/55+, is important, but this won't solve the swimming at this distance; a low frame rate might only cause additional swimming :)
Everything that should appear statically placed in the room should be on or very close to the stabilization plane. So what you want to avoid is having the far objects at very different distances from the user. That would otherwise cause the ones farthest off from the stabilization plane to swim.
If you only have the far-away object, try placing the stabilization plane at the same distance as the object. If the distance changes a lot, you can also update the stabilization plane at runtime so it is always set to the current distance to the object.
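A rough sketch of that runtime update, assuming the HoloLens 1 APIs available in Unity 2017.x (the field name targetObject is just a placeholder for your far-away hologram):

using UnityEngine;
using UnityEngine.XR.WSA; // HolographicSettings

// Keeps the stabilization plane on a chosen far-away hologram every frame.
public class StabilizationPlaneUpdater : MonoBehaviour
{
    public Transform targetObject; // the hologram that should stay stable

    void LateUpdate()
    {
        if (targetObject == null) return;

        // Place the plane at the object's position, with its normal facing the user.
        Vector3 normal = -Camera.main.transform.forward;
        HolographicSettings.SetFocusPointForFrame(targetObject.position, normal);
    }
}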
Would be interesting to hear if it worked out :)
One more thing: If I remember correctly, objects should ideally be placed directly on or in close proximity to their world anchor to help stabilization.
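A minimal sketch of that idea, assuming the built-in WorldAnchor component (Unity 2017.x, HoloLens 1): give a far-away hologram its own anchor at its position instead of offsetting everything from one distant anchor.

using UnityEngine;
using UnityEngine.XR.WSA; // WorldAnchor

// Locks this hologram to the real world at its own position,
// so its anchor is not many metres away from it.
public class AnchorHere : MonoBehaviour
{
    void Start()
    {
        if (GetComponent<WorldAnchor>() == null)
            gameObject.AddComponent<WorldAnchor>();
    }
}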
20 metres is too far. The docs say, under "Best practices": "When holograms cannot be placed at 2m and conflicts between convergence and accommodation cannot be avoided, the optimal zone for hologram placement is between 1.25m and 5m. In every case, designers should structure content to encourage users to interact 1+ m away (e.g. adjust content size and default placement parameters)."

Create natural disaster model in unity3D

I am working on an application where I would like to make a 3D terrain model of my country in Unity3D, inclusive of hills, mountains and rivers. So far I've been able to use Mapbox to import the country map, because the Unity WRLD SDK doesn't yet support my country.
However, the end goal is to create an application capable of representing natural disasters. For example, given the country, how would one go about causing rain to occur that would affect the "water levels" of the river and essentially show a flood? Basically, after I bring in the terrain, how do I "act" on it to cause a landslide?
Any help or tutorial pointing to such would be welcomed.
You will need a different model for each natural disaster. You will only ever get a rough estimate of what may happen, as your data will never represent the actual terrain exactly. (For example, with an earthquake you may be able to reproduce damage to structures, but you will never be able to predict whether there will be a shift in the earth itself.)
Rain/ Flood
A really simple simulation of rising ground water is to slowly move a "water" plane up. This crude approach will quite easily demonstrate which areas are going to end up under water. For a detailed flood simulation you will need a fluid simulation of some kind (there are quite a few on the Asset Store).
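A minimal sketch of that rising water plane, assuming a flat water mesh and made-up parameter names:

using UnityEngine;

// Moves a flat "water" plane upward over time; anything below it counts as flooded.
public class RisingWater : MonoBehaviour
{
    public float risePerSecond = 0.05f; // metres of water level gained per second
    public float maxWaterLevel = 10f;   // stop once the flood peaks

    void Update()
    {
        if (transform.position.y < maxWaterLevel)
            transform.position += Vector3.up * (risePerSecond * Time.deltaTime);
    }
}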
Avalanche
Treat it as a fluid system with strong resistance.
Volcano
Almost the same as a flood, just with more viscosity.
Earthquake
You may be able to simulate the damage of an earthquake if all your objects have some kind of break point and the earthquake adds force to an area. A given force has a certain chance to destroy each object in the area. (Think of it like any castle-destruction game, e.g. Angry Birds: the projectile is your local earthquake and the castle is your terrain plus buildings/trees.)
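A rough sketch of that "force in an area + break point" idea; the Breakable component and its threshold are made up for illustration, and both classes are shown in one snippet for brevity:

using UnityEngine;

// Illustrative break-point component: how much shaking this object survives.
public class Breakable : MonoBehaviour
{
    public float breakForce = 500f;
}

// Applies the "local earthquake" to everything in a radius and destroys
// any Breakable whose threshold is below the applied force.
public class EarthquakeZone : MonoBehaviour
{
    public float radius = 30f;
    public float force = 800f;

    public void Trigger()
    {
        foreach (var hit in Physics.OverlapSphere(transform.position, radius))
        {
            if (hit.attachedRigidbody != null)
                hit.attachedRigidbody.AddExplosionForce(force, transform.position, radius);

            var breakable = hit.GetComponent<Breakable>();
            if (breakable != null && force > breakable.breakForce)
                Destroy(breakable.gameObject);
        }
    }
}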
Fire
You will need something like a burn value. The higher the value, the longer it burns, the harder it is to put out, and the faster it spreads. If a fire starts at any given point, it grows outward. A river would have a value of 0, same as mountains; a forest would have a high value, a grass plain a low value. If you want to simulate a hot, dry summer, your terrain could add a fixed value to everything: grass gets drier and thus has a higher chance to spread fire.
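A toy grid-based sketch of that burn value idea. All names and the spread rule are illustrative; in practice you would fill the flammability grid from your terrain data:

using UnityEngine;

// Fire spreads to a neighbouring cell with a probability proportional to
// that cell's flammability (0 = river/rock, close to 1 = dry forest).
public class FireSpreadGrid : MonoBehaviour
{
    public int width = 64, height = 64;
    float[,] flammability;
    bool[,] burning;

    void Start()
    {
        flammability = new float[width, height];
        burning = new bool[width, height];
        // ... fill flammability from terrain data here, then ignite one cell:
        burning[width / 2, height / 2] = true;
    }

    void FixedUpdate()
    {
        var next = (bool[,])burning.Clone();
        for (int x = 1; x < width - 1; x++)
            for (int y = 1; y < height - 1; y++)
            {
                if (burning[x, y]) continue;
                bool neighbourOnFire = burning[x - 1, y] || burning[x + 1, y] ||
                                       burning[x, y - 1] || burning[x, y + 1];
                // A hot, dry summer could add a fixed bonus to flammability here.
                if (neighbourOnFire && Random.value < flammability[x, y])
                    next[x, y] = true;
            }
        burning = next;
    }
}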

Chipmunk objects fall through floor at high velocities. Help?

I'm using Chipmunk cocos2d for what will ultimately be a sound-generation game where colliding particles make noise. But right now, I'm running into a problem: my particles keep falling through the floor!
In the example "bouncing ball" templates, the multiplier on the incoming accelerometer stream is fairly low (around 100.0f) but to get things to really react quickly I am cranking it up:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)accel
{
    space->gravity = cpvmult(cpv(accel.x, accel.y), 10000.0f); // originally 100.0f
}
I have found that this can be ameliorated by making dt really small, polling the accelerometer around 1/240 of a second.
Is this the best way? Is there another way to say to Chipmunk "look out, these things move fast"?
In general many physics engines have difficulty with collisions between quickly moving objects that are small relative to their velocities. I do not know the specifics of Chipmunk, but your issue implies that at certain time intervals a check is performed for intersecting objects. If the objects move quickly and are small, it is possible for the time interval where a collision is happening to be skipped.
The two easiest solutions are to use a smaller time interval, or somehow make one of the two objects larger. How are you representing the floor? If you can represent it as a thick rectangle, that should also reduce the problem.
The harder solution is to use a more complicated intersection algorithm, such as using bounding capsules to represent the volume of space traversed by a sphere between two samples. If the API does not already support this, it is a pretty hefty amount of math and modification.

Falling Sand simulation

I'm trying to re-create a 'falling sand' simulation, similar to those various web toys that are out there doing the same thing - and I'm failing pretty hard. I'm not really sure where to begin. I'm trying to use cellular automata to model the behavior of the sand particles, but I'm having trouble figuring out how to make the direction in which I update the 'world' not matter...
For example, one of the particle types I'd like to have is Plant. When Plant comes in contact with Water, Plant turns that Water particle into another Plant particle. The problem here though is that if I'm updating the game world from top to bottom and left to right, then a Plant particle placed in the middle of a sea of Water particles will immediately cause all of the Water particles to the right and below that new Plant particle to turn into Plants. This is not the behavior I am expecting. =(
One straightforward solution is to not do each iteration in place. Instead, every time you update the world, create a copy of it, then look at the original but write your updates into the copy. That way the order of updating no longer matters, because none of the cells you read during the pass have been modified by your own updates.
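A minimal double-buffer sketch in plain C#; the cell types and the Plant-converts-Water rule are simplified placeholders, not your actual rules:

// Read from "current", write into "next", then swap the buffers after each step.
enum Cell { Empty, Sand, Water, Plant }

static class SandWorld
{
    public static void Step(Cell[,] current, Cell[,] next)
    {
        int w = current.GetLength(0), h = current.GetLength(1);
        System.Array.Copy(current, next, current.Length); // start from last frame's state

        for (int x = 0; x < w; x++)
            for (int y = 0; y < h; y++)
            {
                if (current[x, y] != Cell.Plant) continue;
                // Neighbours are read from "current" only, so a Plant placed this
                // frame cannot cascade through a whole sea of Water in one update.
                if (x > 0 && current[x - 1, y] == Cell.Water) next[x - 1, y] = Cell.Plant;
                if (x < w - 1 && current[x + 1, y] == Cell.Water) next[x + 1, y] = Cell.Plant;
                if (y > 0 && current[x, y - 1] == Cell.Water) next[x, y - 1] = Cell.Plant;
                if (y < h - 1 && current[x, y + 1] == Cell.Water) next[x, y + 1] = Cell.Plant;
            }
        // The caller swaps the roles of "current" and "next" after this returns.
    }
}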
Don't program it in a sequential way (looping over all particles); use real simulation programming techniques, in which every particle is treated as an individual object/agent that obeys the laws of physics and can act (run) asynchronously and respond to "events" (interactions with other particles).
If making every sand particle a separate object is too fine-grained, then divide the world into small blocks of, say, 1000 particles and simulate the behavior of these blocks instead.