I have a Raspberry Pi 4 with a camera module and a pan-tilt HAT.
I've made a project that, when started, uses the feed from the RPi camera, detects a face, and centers on it. If the person moves, the camera tracks them.
When I run the .py file through the terminal, it works.
Now I want to use it with my PC. Therefore, I need to simultaneously run my project in the background AND stream the feed to my PC somehow.
From the methods I found online, I saw that it's possible to use Flask to get a URL that can be used like an IP camera.
My question is: is it possible to stream the camera feed while my project runs and tracks the face?
Thank you.
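For reference, the rough shape of what I have in mind with Flask is below. This is only a minimal sketch under assumptions: OpenCV can open the Pi camera over V4L2 as device 0, track_faces() is just a placeholder for the existing tracking loop, and the face detection and pan-tilt code itself is not shown.

    import threading
    import time

    import cv2
    from flask import Flask, Response

    app = Flask(__name__)
    latest_frame = None                # most recent frame produced by the tracking loop
    frame_lock = threading.Lock()

    def track_faces():
        """Placeholder for the existing face-tracking / pan-tilt loop."""
        global latest_frame
        cap = cv2.VideoCapture(0)      # Pi camera exposed through V4L2 (assumption)
        while True:
            ok, frame = cap.read()
            if not ok:
                continue
            # ... face detection and pan-tilt movement would happen here ...
            with frame_lock:
                latest_frame = frame

    def mjpeg():
        """Yield the latest frame as an MJPEG multipart stream."""
        while True:
            time.sleep(0.03)           # cap the stream at roughly 30 fps
            with frame_lock:
                frame = latest_frame
            if frame is None:
                continue
            ok, jpg = cv2.imencode('.jpg', frame)
            if not ok:
                continue
            yield (b'--frame\r\n'
                   b'Content-Type: image/jpeg\r\n\r\n' + jpg.tobytes() + b'\r\n')

    @app.route('/video')
    def video():
        return Response(mjpeg(), mimetype='multipart/x-mixed-replace; boundary=frame')

    if __name__ == '__main__':
        threading.Thread(target=track_faces, daemon=True).start()
        app.run(host='0.0.0.0', port=5000)

If something like this works, the PC should be able to open http://<pi-address>:5000/video in a browser or in VLC while the tracking thread keeps running in the background.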
Here's my issue:
My Raspberry Pi boots to console (Raspbian) and does the following:
- Runs a small Youtube-DL script that checks a playlist for new videos. If there are any, it pulls the video and copies it to a folder.
- After the video is saved, OMXPlayer plays that video from the command line, and it works perfectly (at 720p, which is great for my use).
- That's it. It doesn't do anything else.
Here's what I want to add: a small GUI where I can scroll through older videos, without booting to a desktop. I tried an Electron JS app, but it creates an entire instance of a web browser, which slows down OMXPlayer for me. I tried to mess around with Python Tkinter, but it won't let me load the script without first typing "startx", which boots up the Raspbian desktop.
I just want a small GUI (similar to Kodi) with nothing in it other than the videos and a short interface to scroll through the different files.
Any suggestions?
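One possible direction, sketched below with no claim that it's the best fit: a plain curses menu that runs straight in the console and hands the chosen file to omxplayer, so no X server is needed. The ~/Videos folder, the file extensions, and the -o hdmi audio output are all assumptions; adjust them to the actual setup.

    import curses
    import os
    import subprocess

    VIDEO_DIR = os.path.expanduser('~/Videos')   # assumed download folder

    def main(stdscr):
        curses.curs_set(0)                       # hide the cursor
        selected = 0
        while True:
            files = sorted(f for f in os.listdir(VIDEO_DIR)
                           if f.lower().endswith(('.mp4', '.mkv', '.avi')))
            stdscr.clear()
            stdscr.addstr(0, 0, 'Pick a video (Enter to play, q to quit)')
            visible = stdscr.getmaxyx()[0] - 3   # no scrolling in this sketch
            for i, name in enumerate(files[:visible]):
                marker = '>' if i == selected else ' '
                stdscr.addstr(i + 2, 0, marker + ' ' + name)
            stdscr.refresh()
            key = stdscr.getch()
            if key == ord('q'):
                break
            elif key == curses.KEY_UP and selected > 0:
                selected -= 1
            elif key == curses.KEY_DOWN and selected < min(len(files), visible) - 1:
                selected += 1
            elif key in (curses.KEY_ENTER, 10, 13) and files:
                # omxplayer renders through the GPU, so it works without X
                subprocess.call(['omxplayer', '-o', 'hdmi',
                                 os.path.join(VIDEO_DIR, files[selected])])

    curses.wrapper(main)

curses ships with Python, so nothing heavyweight like Electron or a desktop session gets pulled in; the only external piece is omxplayer itself.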
I'm using MovieTexture now, but when a video file is added to a Unity project, it is automatically imported and converted to the Ogg Theora format, and the quality is really bad.
I have tried changing the quality setting, and even on the highest setting the video still looks pretty bad. I have tried multiple file formats (.mov, .avi, .mpeg4, etc.), and I have even tried converting the video to .ogv myself to get around Unity converting it, and the quality is still poor. The platform is PC, and in the build the quality is the same as in the editor.
So the question is: how do I play high quality video in Unity, whether with MovieTexture or anything else, such as a plugin?
The Unity player on Windows only supports Ogg (Theora), which is why Unity is transcoding your videos.
I have used the Renderheads AVPro Quicktime plugin on Windows to play very high quality videos in kiosk setups. (They also have one for the Windows Media format, but I used the Quicktime one.)
Link: Renderheads AVPro (Quicktime)
I am not affiliated with them in any way, just a very happy customer, and here is the review I posted on the Unity Asset Store:
Great work on your plugin! I've used so many plugins that don't work well over multiple platforms, or require switching between platforms, or manual steps, or manual licensing, or DLL hell, etc. I have to say you nailed it.
I develop on a Mac (and your plugin runs in the Unity Editor), then deploy on Windows. It all worked straightforwardly and as documented. Even the events to detect when a video has loaded and is ready to play were just what I needed (as we are loading a large video file).
Additionally, the error messages are very precise and pinpoint the problem (missing file, bad format, etc.), which means less time spent debugging.
I would like to show the live video from a Microsoft Studio webcam on a Raspberry Pi. It is meant to be a reading aid for my grandma.
So I tried vlc v4l2:///dev/video0, but I always get just a single still picture, and after that the system is frozen; the only way out is to pull the power supply.
I don't know what I'm doing wrong. I also tried a smaller resolution.
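In case it's VLC itself that is struggling, a lighter-weight viewer can be put together with OpenCV in Python. This is only a sketch under assumptions: the camera shows up as /dev/video0, an X/desktop session is available for the display window, and 1280x720 is an acceptable resolution.

    import cv2

    cap = cv2.VideoCapture(0)                      # the webcam at /dev/video0
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)        # request a modest resolution
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)

    while True:
        ok, frame = cap.read()
        if not ok:                                 # camera stopped delivering frames
            break
        cv2.imshow('webcam', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):      # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()

On Raspbian this should only need the python3-opencv package from the repositories, then run it with python3.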
I'd like to write a script that, once run on Ubuntu, captures any mouse movement and immediately uses the installed webcam to save a snapshot to the desktop. I'd like to know which libraries are standard for this sort of programming.
Thanks!
Found the answer: OpenCV for Python (highgui) takes care of the webcam, and Xlib takes care of the events.
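To make that concrete, here is a rough sketch of the simplest version of the idea. Instead of hooking X events it just polls the pointer position with python-xlib; the ~/Desktop path and the 0.05 s poll interval are arbitrary choices.

    import os
    import time

    import cv2                        # pip install opencv-python
    from Xlib import display          # pip install python-xlib

    DESKTOP = os.path.expanduser('~/Desktop')

    disp = display.Display()
    root = disp.screen().root
    cap = cv2.VideoCapture(0)         # the first webcam

    def pointer_pos():
        """Current pointer position on the root window."""
        p = root.query_pointer()
        return p.root_x, p.root_y

    last = pointer_pos()
    while True:
        time.sleep(0.05)              # poll roughly 20 times per second
        pos = pointer_pos()
        if pos != last:               # the mouse has moved since the last poll
            last = pos
            ok, frame = cap.read()
            if ok:
                name = 'snapshot-%d.jpg' % int(time.time())
                cv2.imwrite(os.path.join(DESKTOP, name), frame)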
Use Python D-Bus to send a Lock signal to the screensaver.
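Something along these lines, as a minimal sketch with dbus-python, assuming the GNOME screensaver is the one running (other desktops expose the same Lock method under org.freedesktop.ScreenSaver instead):

    import dbus

    bus = dbus.SessionBus()
    saver = bus.get_object('org.gnome.ScreenSaver', '/org/gnome/ScreenSaver')
    dbus.Interface(saver, 'org.gnome.ScreenSaver').Lock()   # ask the screensaver to lock now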