MicroPython PID controller for a DC motor with encoder

Looking for example code or help creating a PID controller for shaft-position control of a DC motor with an encoder. I have seen numerous Arduino C++ examples, but nothing in MicroPython.

This is my implementation: https://github.com/rakesh-i/MicroPython-Encoder-motor. It is not perfect, but it works. I hope it works for you.
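In case the core loop itself is useful: below is a minimal sketch of PID shaft-position control in MicroPython, assuming an ESP32-style port (where PWM.duty() takes 0-1023) and an IN1/IN2-style H-bridge driver. The pin numbers, gains, and target are placeholders to adapt to your wiring; the linked repo wraps the same idea in a proper class.

```python
from machine import Pin, PWM
import time

# Quadrature encoder inputs (pin numbers are placeholders for your wiring)
enc_a = Pin(14, Pin.IN, Pin.PULL_UP)
enc_b = Pin(15, Pin.IN, Pin.PULL_UP)
position = 0  # shaft position in encoder counts

def on_a_rising(pin):
    # Direction comes from the phase of channel B at each A edge
    global position
    position += 1 if enc_b.value() else -1

enc_a.irq(trigger=Pin.IRQ_RISING, handler=on_a_rising)

# Two PWM channels driving an IN1/IN2 H-bridge (ESP32 duty range 0-1023)
in1 = PWM(Pin(16), freq=1000)
in2 = PWM(Pin(17), freq=1000)

def set_motor(u):
    u = max(-1023, min(1023, int(u)))  # clamp the control effort
    if u >= 0:
        in1.duty(u)
        in2.duty(0)
    else:
        in1.duty(0)
        in2.duty(-u)

# Gains are placeholders: tune for your motor, load, and supply voltage
KP, KI, KD = 5.0, 0.1, 0.8
target = 1000        # desired shaft position in encoder counts
integral = 0.0
last_error = 0
last_t = time.ticks_ms()

while True:
    now = time.ticks_ms()
    dt = time.ticks_diff(now, last_t) / 1000
    if dt >= 0.01:   # run the controller at roughly 100 Hz
        error = target - position
        integral += error * dt
        derivative = (error - last_error) / dt
        set_motor(KP * error + KI * integral + KD * derivative)
        last_error = error
        last_t = now
```

A common way to tune: raise KP until the shaft overshoots slightly, add KD to damp the oscillation, then add a small KI to remove any steady-state error.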

Related

How do I track the Unity position of physical objects the player is interacting with, using HoloLens 2 hand-tracking data?

Basically I am working on a mixed reality experience using the HoloLens 2 and Unity, where the player has several physical objects they need to interact with, as well as virtual objects. One of the physical objects is a gun controller with an IMU to detect acceleration and orientation. My main challenge is this: how do I get the physical object's position in Unity, in order to accurately fire virtual projectiles at a virtual enemy?
My current idea is to have the player position the physical weapon inside a virtual bounding box at the start of the game. I can then track the position of the virtual box through collision with the player's hands when they pick up the physical controller. Does OnCollisionEnter, or a similar method, work with the player's hands? (see attached image)
I am also looking into the use of spatial awareness / image recognition / pose estimation to accomplish this task, as well as researching the use of a tracking base station to determine object position (similar to the HTC Vive / Oculus Rift).
Any suggestions, resources, and assistance is greatly appreciated here. Thank you!
EDIT UPDATE 11/30/2020:
Hernando commented below suggesting QR codes; assume for this project we are not allowed to use QR codes, and we want orientation data that is as precise as possible. Thanks Hernando!
For locating the object, a QR code would definitely be the recommendation for finding it quickly with the HL2 device. I have seen the QR approach in multiple venues for VR LBE experiences like the one being described here; the QR code just sits on top of the device.
Otherwise, if the controller in question supports Bluetooth, you could possibly pair the device, and if the device has location information it could transmit where it is. Based on the above, if QR codes are out of the equation this would be a custom solution, highly dependent on the controller's abilities. I have seen some controller solutions start the user experience with something like touching the floor to get an initial reference point, or always picking up the gun from a specific location in the real world, as some location-based experiences do before starting.
Good luck with the project; this is just my advice from working with VR systems.
Are you allowed to attach multiple QR codes to the controller? If so, we recommend using QR code tracking to assist in locating your controller. If you prefer image recognition, object detection, or other technologies, you will need an Azure service or a third-party library; for more information, see the Computer Vision documentation.

Custom Spawn Functions - spawn rotation

I'm trying to do some object pooling in my networked game.
I'm following this piece of documentation.
The issue is: how do I set the right rotation on the pooled object?
The delegate gives me the spawn position, but no rotation.
The objects I'm pooling don't have synced transform.
Any solutions/ideas?
EDIT:
I gave Unity feedback regarding the SpawnDelegate signature that is the root of my issue.
https://feedback.unity3d.com/suggestions/spawndelegate-signature
EDIT2:
I read a bit of the decompiled UNET code, and maybe the solution could be customizing the serialization/deserialization of the objects and adding the rotation (OnSerialize/OnDeserialize). Adding a SyncVar would be the same, I think, but at a higher abstraction level.
From the software engineering standpoint I don't like the idea of adding a component for this basic functionality.
EDIT3:
This is the UNET decompiled repo. I cannot understand how the rotation is set correctly when spawning with default spawners. By default I mean when you register the prefabs with ClientScene.RegisterPrefab.

Lego Mindstorms EV3: Making a bot go straight with Gyro

I've been trying to use the Lego EV3 with a gyro sensor to make it go straight. I've tried following many videos, but they don't seem to work. Here's my code:
To start off, your multiplier seems a little off; I usually use something like 2.
You might want to refer to the image in the link below.
Gyro programming
I usually use this as my base gyro program. If you understand My Blocks, that is basically what I am using in my program; all I have to do is add in the values for direction, speed, and distance.
Feel free to ask me if you need further help!
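For anyone translating this out of the block language: the multiplier is just the gain of a proportional correction, steering by (target heading - gyro angle) * gain. Here is a minimal sketch of the same idea in Pybricks MicroPython for the EV3; the ports, wheel dimensions, speed, and the gain of 2 are assumptions to adapt to your bot.

```python
from pybricks.ev3devices import Motor, GyroSensor
from pybricks.parameters import Port
from pybricks.robotics import DriveBase
from pybricks.tools import wait

left = Motor(Port.B)
right = Motor(Port.C)
gyro = GyroSensor(Port.S2)

# Wheel diameter and axle track in mm; measure these on your own bot
robot = DriveBase(left, right, wheel_diameter=56, axle_track=114)

gyro.reset_angle(0)   # the current heading becomes the target heading
GAIN = 2              # deg/s of turn rate per degree of drift (the "multiplier")

while robot.distance() < 1000:             # drive 1000 mm
    correction = (0 - gyro.angle()) * GAIN
    robot.drive(150, correction)           # 150 mm/s forward, steer back on course
    wait(10)

robot.stop()
```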

Building a robotic arm manipulator in Unreal Engine

Really new to Unreal Engine. I'd like to learn the quickest way to build a robot manipulator arm without complex inverse kinematics: just set up the joints, arms, and gripper and control them directly.
This would be nice: https://www.youtube.com/watch?v=9DqRkLQ5Sv8. This would be even nicer: https://www.youtube.com/watch?v=UWsuBdhWqL0&t=24s.
I looked into the animation and rigging toolset, but that's just for humanoids (is it?).
I'd appreciate any pointers.
Thanks
I recommend that you look into the FABRIK node or the newer Two Bone IK node; you will be able to choose the effector bone, take its position, and move it as you like. Hope this helps.

How can I track a person and respond to their movement?

I was wondering if there is a way on a Raspberry Pi to track people and then communicate with the GPIO pins to turn servos so that something faces their direction. I know it is possible to track people using Simulink, but I am not sure how to act on the results. Thanks for any responses!
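Not a full answer, but one common way to act on the detections without Simulink is to run OpenCV's built-in HOG person detector on each camera frame and nudge a pan servo through RPi.GPIO's software PWM. A minimal sketch, where the GPIO pin, camera index, and step gain are all assumptions:

```python
import cv2
import RPi.GPIO as GPIO

SERVO_PIN = 18                 # BCM pin carrying the servo signal (assumption)
GPIO.setmode(GPIO.BCM)
GPIO.setup(SERVO_PIN, GPIO.OUT)
pwm = GPIO.PWM(SERVO_PIN, 50)  # standard 50 Hz hobby-servo signal
pwm.start(7.5)                 # roughly centred

# OpenCV's stock pedestrian detector (no training required)
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)      # first attached camera (assumption)
angle = 90.0                   # current pan angle in degrees

try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (320, 240))  # smaller frame = faster detection
        rects, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        if len(rects):
            x, y, w, h = rects[0]
            # Horizontal offset of the person from frame centre, in [-1, 1]
            offset = ((x + w / 2) - 160) / 160
            # Proportional step; flip the sign if the servo turns the wrong way
            angle = max(0.0, min(180.0, angle - offset * 5))
            # Map 0-180 degrees onto a 2.5-12.5 % duty cycle
            pwm.ChangeDutyCycle(2.5 + angle / 18)
finally:
    pwm.stop()
    GPIO.cleanup()
    cap.release()
```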