I have a MetalZone MT-2w --> Mooer Groove Loop --> Mooer Radar --> Zoom H4n signal chain.
Is it better to set the individual pedal output dials to max and reduce any clipping by lowering the H4n's recording level, or to lower the outputs and increase the H4n's recording level?
Is it an 'artistic' decision (going by how it sounds), or is there an 'electrical' / acoustic (can't think of a better term) procedure?
Mike
I am working on a simulation which contains:
a bolt, welded upright to the world
a nut, connected to the bolt via a ScrewJoint. The mass of the nut is set to 0.02 kg; the inertia is diagonal, 1.1e-9 · I. This is configured via an .sdf file.
an iiwa manipulator, which is beside the point for now.
The problem is that the nut is very hard to manipulate, and I cannot find a parameter to adjust that could make it more lifelike. To be more specific:
I measure the ability of a force, applied tangentially in a horizontal plane to the nut, to cause the screwing motion of the joint that joins the nut to the bolt.
I'd like to get a greater amount of motion at lower forces, and so far I am failing to achieve that.
My interest in doing this is not idle; I am interested in more complicated simulations, which also fail when the iiwa comes into contact with this same joint; I've asked about those here and here (both answered partially). To sum those up: when the manipulator grips the nut, the nut resists the screwing in such a manner that the Schunk gripper is forced to unclasp, the iiwa is thrown off-track, and the nut remains stationary.
I attach below two simpler experiments to better illustrate the issue:
1. Applying a 200 N force tangentially in a horizontal plane using ExternallyAppliedSpatialForce.
Graph notation (here as well as below):
The left graph contains linear quantities (m, m/s, etc.) along the world's Z axis; the right graph contains angular quantities (deg, deg/s, etc.) around the world's Z axis. Legend entries with a trailing apostrophe use the secondary Y-axis scale; other legend entries use the primary Y-axis scale.
Experiment summary:
This works as expected: 200 N is enough to make the nut spin on the bolt, the nut traveling vertically along the bolt just under 1 centimeter and rotating through more than 90 degrees. Note: the externally applied force does not show up on my graph.
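For reference, the force was applied roughly like the sketch below (the body name "nut" and the 1 cm rim point stand in for my actual setup):

    #include <vector>
    #include "drake/multibody/plant/multibody_plant.h"

    // Fix a constant 200 N tangential force on the nut's rim.
    // `plant` is the finalized MultibodyPlant; `context` is the plant's
    // context extracted from the diagram context.
    void ApplyTangentialForce(
        const drake::multibody::MultibodyPlant<double>& plant,
        drake::systems::Context<double>* context) {
      drake::multibody::ExternallyAppliedSpatialForce<double> force;
      force.body_index = plant.GetBodyByName("nut").index();
      force.p_BoBq_B = Eigen::Vector3d(0.01, 0.0, 0.0);   // point on the rim, ~1 cm radius
      force.F_Bq_W = drake::multibody::SpatialForce<double>(
          Eigen::Vector3d::Zero(),             // no applied torque
          Eigen::Vector3d(0.0, 200.0, 0.0));   // 200 N tangentially, along world +Y
      plant.get_applied_spatial_force_input_port().FixValue(
          context,
          std::vector<drake::multibody::ExternallyAppliedSpatialForce<double>>{force});
    }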
2. Applying a force tangentially in a horizontal plane using the iiwa and a simple position controller.
Experiment summary:
The force here is approximately the same as before: 70 N along tz, but higher (170 N) along tx and ty, though it is now applied only for a brief moment. The nut rotates just a few degrees and travels only hundredths of a centimeter.
The video of this unsuccessful interaction is below; the contact forces are visualized using ContactVisualizer.
Please advise me on how to make this screw_joint more compliant.
I've tried varying the mass and inertia of the nut (by up to several orders of magnitude) in these experiments; this seems to scale the contact forces, but does not affect the acceleration or velocity of the nut after contact.
I like your experiment using ExternallyAppliedSpatialForce to get an idea of scales, though TBH I didn't quite get the details of this setup.
Things that caught my eye, though, are about scales, which you can estimate with pen and paper:
Your inertia is 1e-9 kg⋅m²?! Judging from your interaction with the iiwa I estimated a radius of 1 cm, and with that (I ≈ m·r² = 0.02 kg × (0.01 m)²) you'd get 2e-6 kg⋅m², three orders of magnitude larger.
A force of 200 N on a 20 gram nut would cause an acceleration of a = F/m = 200 N / 0.02 kg = 10,000 m/s². As a reference, that's 1000 times the acceleration of Earth's gravity!
Are these numbers correct? Also, if you happen to have fast interactions (do you?), you might want to estimate a time step that makes sense for your application.
Hopefully this helps!
The good thing is that I've fixed it; the bad thing is that I don't understand the fix.
Let me restate the problem that I was tackling: a manipulated object reacted quite predictably to ExternallyAppliedSpatialForce, but couldn't be moved via contact with the manipulator.
What was done:
I updated Drake from 2f340192a9dc79110410faf8a6d54a8615ddca92 (circa Aug 22, 2022) to 42448c0af1b39f0c46f760e7ae37d77097689ad3 (circa Nov 3, 2022).
After the update, my experimental setup broke with the assertion Actuation input port for model instance ... must be connected. (similar to the issue raised in this question). My fix was as follows:
    // Add the nut-and-bolt model and weld the bolt to the world.
    bolt_n_nut_ = internal::AddAndWeldModelFrom(sdf_path, "nut_and_bolt",
                                                plant_->world_frame(),
                                                "bolt", X_WC, plant_);
then later in ManipulationStation::Finalize:
    // Feed a constant zero torque into the screw joint's actuation input
    // port, so that the port is no longer left unconnected.
    auto zero_torque = builder.template AddSystem<
        systems::ConstantVectorSource<double>>(Eigen::VectorXd::Zero(1));
    builder.Connect(zero_torque->get_output_port(),
                    plant_->get_actuation_input_port(bolt_n_nut_));
With the changes above, the manipuland began to interact with the manipulator:
Things to note in the graphs:
the distance the manipuland has moved grew from fractions of a millimeter up to tens of centimeters. The video shows that the nut became manipulable.
This interaction violates the constraints of the ScrewJoint, i.e. the manipuland moves along its axis without the corresponding rotation.
I'm a competitive swimmer and would like more detailed feedback on my swims.
A regular smartwatch has a gyro/accelerometer/GPS/magnetometer, but it has a few issues:
Laps aren't counted instantly; there is often some delay after making a turn.
There's quite some deviation in lap-time measurements; I would like to measure my lap times more accurately than with a 2-second deviation.
When I'm not moving my arms (e.g. swimming legs only), it doesn't count the lap.
When I take breaks, it doesn't detect this fast enough.
These are just limitations of the hardware currently used.
One solution is to use a touchpad, which is often used in competition, but when swimming with multiple people in a crowded lane this doesn't work well either.
I've thought about this, and a possible solution would be to have a "base station" at one end of the pool and somehow measure the distance between the base station and the swimmer. If you know exactly how far away a swimmer is in the pool, you can more accurately count laps, measure lap times, and detect pauses.
The only problem I'm facing now is: what method can I use to measure the distance between the base station and the swimmer?
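(If such a base station produced swimmer-to-station distance samples a few times per second, counting laps would reduce to detecting direction reversals in that signal. A rough sketch, where the 1 m hysteresis is a made-up value to absorb measurement noise:)

    #include <vector>

    // Count laps from a stream of swimmer-to-base-station distances (meters).
    // A turn is registered when the distance trend reverses by more than
    // `hysteresis` meters; timestamps at each reversal would give lap times.
    int CountLaps(const std::vector<double>& dist, double hysteresis = 1.0) {
      if (dist.empty()) return 0;
      int laps = 0;
      int direction = +1;             // assume the swimmer starts by moving away
      double extreme = dist.front();  // farthest/nearest distance seen so far
      for (double d : dist) {
        if (direction > 0) {
          if (d > extreme) extreme = d;             // still moving away
          else if (extreme - d > hysteresis) {      // reversal: turn at the far wall
            ++laps; direction = -1; extreme = d;
          }
        } else {
          if (d < extreme) extreme = d;             // still moving toward
          else if (d - extreme > hysteresis) {      // reversal: turn at the near wall
            ++laps; direction = +1; extreme = d;
          }
        }
      }
      return laps;
    }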
Some insight:
Sometimes the swimmer is above the water, other times they're underwater.
Sound-based localization is tricky since sound moves about 4x faster in water than in air.
Radio-based localization is tricky since the speed of light in water is 75% of that in air (and radio waves attenuate quickly in water).
A camera with an IR LED or swimming-cap color detection could work, but it's not very convenient.
Accuracy needed: ~0.5 m.
Placement of the sensor: it can be around the wrist, on the head, or somewhere else.
Anyone got any tips?
As usual, the documentation is lacking some information that we have to gather somewhere else: Physics.defaultContactOffset.
Physics.defaultContactOffset is used by the collision detection system to predictively enforce the contact constraint.
Unity explains you should use 1 unit = 1 meter for physics simulation.
I needed a lot of small spheres and cubes: 10 cm wide, thus 0.1 "unit".
What they don't say is that when you're working at a small scale (I'm using objects 0.1 m = 10 cm wide), you have to change Physics.defaultContactOffset to a smaller value than the default one.
Hence my question: is Physics.defaultContactOffset important for calculations? I.e., if I change it to a very small value, does it have a negative impact on performance?
I have to change it from 0.001 to 0.00001 to get acceptable collision detection, and I'm worried about a negative impact on performance.
From Unity3D documentation on Default Contact Offset:
Use this to set the distance the collision detection system uses to generate collision contacts. The value must be positive, and if set too close to zero, it can cause jitter. This is set to 0.01 by default. Colliders only generate collision contacts if their distance is less than the sum of their contact offset values.
So we can assume the physics engine is calculating distances between colliders and checking whether a distance counts as a collision or not. I don't think it matters much for performance, as the calculation is done anyway.
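In other words, the rule from the docs presumably boils down to a predicate like this (a sketch of the documented rule, not Unity's actual source):

    // A contact pair is generated only when the gap between two colliders is
    // smaller than the sum of their contact offsets. Larger offsets generate
    // contacts earlier (while still apart); near-zero offsets generate them
    // at the last moment, which is what causes the jitter the docs warn about.
    bool GeneratesContact(double gap, double offsetA, double offsetB) {
      return gap < offsetA + offsetB;
    }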
With all this being said, Unity3D's physics engine doesn't really do well with tiny objects, so it's better to scale the spheres up to 1 unit and scale everything else to compensate. You will most likely run into issues with these tiny colliders.
While GPS works for long-distance changes, I would like to measure a shorter distance using the iPhone's accelerometer.
Say I want to measure the height of a box using an iPhone application. You'd start the application, press a button to start the measurement at the bottom of the box, move your iPhone to the top of the box, then press a button to stop the measurement. The application would then calculate and display the height of the box.
How would I use the accelerometer to perform this kind of measurement?
It is possible to do this, as I have implemented a more complex system on a SparkFun IMU.
There are a few components needed to do what you require accurately.
1) You need to filter the accelerometer's signal using a low-pass filter. This removes any noise that is not caused by your slow-moving arm.
2) Integrate the 3 separate acceleration values twice to go from acceleration to velocity, and then from velocity to distance.
3) The above method must be performed keeping the phone in the same plane when you move it from the bottom of the box to the top. Any pitch/roll/yaw will disrupt the measurement (hint).
4) Following on from the above, to compensate for pitch/roll/yaw you can include the built-in gyro =]. Use it to map the vector obtained from the accelerometer back to the starting frame. Using this methodology you can measure the distance "through an object" by walking around it. (Remember, this gyro needs filtering too.)
The final result depends on many factors, such as the effectiveness of your filter, the accuracy and sampling rates of your accelerometer and gyro, and the awesomeness of your mathematics and linear algebra skills.
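Here is a minimal one-axis sketch of steps 1 and 2 (the smoothing factor and sample period are assumptions, and a real implementation would also need the gyro compensation from steps 3 and 4):

    #include <vector>

    // Low-pass filter the raw accelerometer samples (step 1), then integrate
    // twice with the trapezoidal rule (step 2) to get displacement.
    // `dt` is the sample period in seconds, e.g. 0.01 for 100 Hz.
    double DisplacementFromAccel(const std::vector<double>& accel_raw, double dt) {
      const double kAlpha = 0.1;  // smoothing factor for the low-pass filter (assumed)
      double filtered = 0.0, velocity = 0.0, position = 0.0;
      for (double a : accel_raw) {
        double filtered_prev = filtered;
        filtered = kAlpha * a + (1.0 - kAlpha) * filtered;  // step 1: low-pass
        double velocity_prev = velocity;
        velocity += 0.5 * (filtered_prev + filtered) * dt;  // step 2a: accel -> velocity
        position += 0.5 * (velocity_prev + velocity) * dt;  // step 2b: velocity -> position
      }
      return position;
    }

Note that any bias left after filtering gets integrated twice, so the position error grows quadratically with time; keep the measurement short.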
Try photographing an object of a known size at the distance of interest.
Depending on the application, and how much accuracy you need, you may be able to use the new 6-axis gyro accelerometers in the iPhone 4 and iPod Touch 4th Gen. You could get the total displacement by integrating the acceleration vector.
When integrating acceleration to get displacement, any errors will be cumulative, so this may not be appropriate, but may be worth considering.
GPS is currently not accurate enough to measure a box: take into account that there may be an error of anywhere from about 10 meters up to a mile. You can retrieve the accuracy of the measurement from CLLocation.
Pythagorean Theorem: c^2 = a^2 + b^2 (http://en.wikipedia.org/wiki/Pythagorean_theorem)
In your case, if point A = (Ax, Ay) and B = (Bx, By), then you can compute the distance C by:
C = sqrt( (Bx-Ax)^2 + (By-Ay)^2 )
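In code this is a one-liner; std::hypot also avoids overflow in the intermediate squares:

    #include <cmath>

    // Straight-line distance between A = (ax, ay) and B = (bx, by).
    double Distance(double ax, double ay, double bx, double by) {
      return std::hypot(bx - ax, by - ay);
    }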
Users can sketch in my app using a very simple tool (move the mouse while holding the LMB). This results in a series of mousemove events, and I record the cursor location at each event. The resulting polyline tends to be rather dense, with points recorded almost every other pixel. I'd like to smooth this pixelated polyline, but I don't want to smooth away intended kinks. So how do I figure out where the kinks are?
The image shows the recorded trail (red pixels) and the 'implied' shape as a human would understand it. People tend to slow down near corners, so there is usually even more noise there than on the straight bits.
Polyline tracker http://www.freeimagehosting.net/uploads/c83c6b462a.png
What you're describing may be related to gesture recognition techniques, so you could search on them for ideas.
The obvious approach is to apply a curve fit, but that will have the effect of smoothing away all the interesting details and kinks. Another suggested approach is to look at speeds and accelerations, but that can get hairy (direction changes can be very fast, or very slow and deliberate).
A fairly basic but effective approach is to simplify the samples directly into a polyline.
For example, work your way through the samples, say from sample 1 to sample 4, and check whether all 4 samples lie within a reasonable error of the straight line between 1 and 4. If they do, extend this to points 1..5 and repeat until the straight line from the start point to the end point no longer provides a reasonable approximation of the curve defined by those samples. Create a line segment up to the previous sample point and start accumulating a new line segment.
You have to be careful about your thresholds when the samples are too close to each other, so you might want to adjust the sensitivity for samples fewer than 4-5 pixels apart.
This will give you a set of straight lines that will follow the original path fairly accurately.
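A sketch of that loop (the error measure is up to you; here I use the perpendicular distance to the chord, with eps as the threshold):

    #include <cmath>
    #include <vector>

    struct Pt { double x, y; };

    // Perpendicular distance from p to the line through a and b.
    double DistToLine(Pt p, Pt a, Pt b) {
      double dx = b.x - a.x, dy = b.y - a.y;
      double len = std::hypot(dx, dy);
      if (len == 0.0) return std::hypot(p.x - a.x, p.y - a.y);
      return std::abs(dy * (p.x - a.x) - dx * (p.y - a.y)) / len;
    }

    // Greedy simplification as described above: keep extending the candidate
    // segment until an intermediate sample strays more than eps from the
    // chord, then emit the previous sample as a vertex and start again.
    std::vector<Pt> Simplify(const std::vector<Pt>& samples, double eps) {
      if (samples.size() < 2) return samples;
      std::vector<Pt> out;
      out.push_back(samples.front());
      size_t start = 0;
      for (size_t end = start + 2; end < samples.size(); ++end) {
        for (size_t i = start + 1; i < end; ++i) {
          if (DistToLine(samples[i], samples[start], samples[end]) > eps) {
            out.push_back(samples[end - 1]);  // last sample that still fit
            start = end - 1;
            break;
          }
        }
      }
      out.push_back(samples.back());
      return out;
    }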
If you require additional smoothing, or want to create a scalable vector graphic, you can then curve-fit from the polyline. First, identify the kinks: the places in your polyline where the angle between one line and the next is sharp (e.g. anything over 140 degrees is considered a smooth curve, anything less is considered a kink), and break the polyline at those discontinuities. Then curve-fit each of these sub-sections of the original gesture to smooth them. This will have the effect of smoothing the smooth stuff and sharpening the kinks. (You could go further and insert small smooth corner fillets instead of these sharp joints to reduce the sharpness of the joins.)
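The kink test itself can be as simple as this (reusing Pt from the sketch above; 140 degrees is the same arbitrary threshold):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Indices of polyline vertices whose interior angle is sharper than
    // min_smooth_deg (180 deg = perfectly straight, small angle = sharp kink).
    std::vector<size_t> FindKinks(const std::vector<Pt>& poly,
                                  double min_smooth_deg = 140.0) {
      const double kRadToDeg = 180.0 / 3.14159265358979323846;
      std::vector<size_t> kinks;
      for (size_t i = 1; i + 1 < poly.size(); ++i) {
        double ax = poly[i - 1].x - poly[i].x, ay = poly[i - 1].y - poly[i].y;
        double bx = poly[i + 1].x - poly[i].x, by = poly[i + 1].y - poly[i].y;
        double na = std::hypot(ax, ay), nb = std::hypot(bx, by);
        if (na == 0.0 || nb == 0.0) continue;
        double c = (ax * bx + ay * by) / (na * nb);
        c = std::max(-1.0, std::min(1.0, c));  // clamp for acos
        if (std::acos(c) * kRadToDeg < min_smooth_deg)
          kinks.push_back(i);  // sharp corner: break the polyline here
      }
      return kinks;
    }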
Brute force, but it may just achieve what you want.
Rather than trying to do this from the resulting data, have you considered looking at the timing of the data as it comes in? If the mouse stops or slows noticeably, you use the trend since the last 'kink' (the last time the mouse slowed) to establish the direction of travel. If the user goes off in a new direction, you call it a kink; otherwise, you ignore the current slowing trend and start waiting for the next one.
Well, one way would be to use a true curve-fitting algorithm. Generate a Bezier curve (with exact endpoints, using Catmull-Rom or something similar), then optimize and recursively subdivide (using distance from the actual line points as a cost metric). This may be too complicated for your use case, though.
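If you go the Catmull-Rom route, each segment converts directly to a cubic Bezier. A sketch, assuming uniform parameterization:

    struct Pt { double x, y; };

    // Catmull-Rom segment from p1 to p2 (with p0/p3 as neighbors), expressed
    // as the four control points of the equivalent cubic Bezier.
    void CatmullRomToBezier(Pt p0, Pt p1, Pt p2, Pt p3, Pt out[4]) {
      out[0] = p1;
      out[1] = {p1.x + (p2.x - p0.x) / 6.0, p1.y + (p2.y - p0.y) / 6.0};
      out[2] = {p2.x - (p3.x - p1.x) / 6.0, p2.y - (p3.y - p1.y) / 6.0};
      out[3] = p2;
    }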
Record the order the pixels are drawn in. Then compute the slope between pixels that are "near" but not "close". I'm guessing a graph of the slope between pixel(i) and pixel(i+7) might exhibit easily identifiable "jumps" around kinks in the curve.
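A quick sketch of that idea (the stride of 7 is the guess from above):

    #include <cmath>
    #include <vector>

    struct Pt { double x, y; };

    // Heading (degrees) between sample i and sample i + stride. Plotting the
    // result should show plateaus on straight strokes and jumps at kinks.
    std::vector<double> Headings(const std::vector<Pt>& pts, size_t stride = 7) {
      const double kRadToDeg = 180.0 / 3.14159265358979323846;
      std::vector<double> out;
      for (size_t i = 0; i + stride < pts.size(); ++i) {
        double dx = pts[i + stride].x - pts[i].x;
        double dy = pts[i + stride].y - pts[i].y;
        out.push_back(std::atan2(dy, dx) * kRadToDeg);
      }
      return out;
    }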