I would like to extend my Raspberry Pi Camera Module V2.1 with a photo flash to take better pictures. With the Picamera library you could still activate a flash mode, but with its successor, Picamera2, this is unfortunately no longer possible.
As the flash I want to use a Braun Hobby 280 BC.
Can anyone help me?
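Since Picamera2 has no built-in flash mode, one common workaround (an assumption on my part, not something from the Picamera2 docs) is to trigger the flash yourself from a GPIO pin just before capturing. A minimal sketch of the coordination logic, using plain callables so it can be tested without hardware; the gpiozero/Picamera2 wiring mentioned in the comments is how you would plug it in for real:

```python
import time

def capture_with_flash(fire_flash, capture, lead_time_s=0.05):
    """Fire the flash slightly before capturing, then return the frame.

    fire_flash  -- callable that closes the flash sync contact
                   (e.g. pulsing a gpiozero OutputDevice that drives an optocoupler)
    capture     -- callable that grabs a frame
                   (e.g. Picamera2.capture_array on an already-started camera)
    lead_time_s -- delay between triggering the flash and capturing;
                   tune this to your flash's firing latency
    """
    fire_flash()
    time.sleep(lead_time_s)
    return capture()
```

On real hardware, `fire_flash` would pulse a GPIO pin through an optocoupler or transistor across the Braun's sync terminals (never connect an older flash's trigger voltage directly to the Pi's GPIO), and `capture` could be `picam2.capture_array()`.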
I have a camera (camera + TW8834 video processor) sending video data over the BT.656 protocol, but the camera cable has no I2C lines, so it isn't recognized by the Linux video driver. Is there some way to capture video from my camera in Linux without modifying the video driver?
I was advised to add a fake probe to the existing video driver (sun6i), but I don't think that is the optimal way. Maybe I can write a driver or something similar that intercepts the I2C messaging with my camera? Could I then use it with the default video driver that way?
Or maybe there is some other way?
What should I learn to solve my problem?
I solved my problem with a driver and Device Tree modification. The OS is Armbian: easy to build, easy to make and apply patches for, and it has a big community. A course on kernel hacking that was useful for me: https://courses.linuxchix.org/kernel-hacking-2002.html
My colleague and I spent about four days creating the solution, and neither of us had any prior experience with kernel modification. I hope this information helps someone with a similar problem.
I have a Raspberry Pi Model B+ and I was thinking of integrating it with Alexa Voice Service. I managed to get my Raspberry Pi and Alexa Voice Service working up to the point where Alexa says hello. To achieve this I also used a PC108 media USB external sound card, so I get both input and output through my plug-in microphone and the mini-jack audio output to the speaker. The thing is that something is still missing to make it work. What do I have to do to make Alexa listen?
Thank you in advance.
At re:Invent 2016 they had a workshop on doing this. Take a look at the slides from the session and the workshop instructions. We used a simple USB microphone, and sound output is built into the Pi. The sample app is still being updated, so it should be good to go.
This was with a Pi 3, but the basics should still be the same.
You can also use Picroft, an image of the open-source assistant Mycroft; just burn it to an SD card and use it:
https://mycroft.ai/mycroft-now-available-raspberry-pi-image/
If you want to create skills: https://docs.mycroft.ai/skill.creation
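To give a feel for what the skill-creation docs linked above cover, here is a minimal skill sketch. The intent file name and spoken text are made up for illustration, and the stand-in classes in the `except` branch exist only so the sketch can run without mycroft-core installed:

```python
try:
    from mycroft import MycroftSkill, intent_file_handler
except ImportError:
    # Stand-ins so the sketch can be inspected without mycroft-core installed
    class MycroftSkill:
        def speak(self, utterance):
            print(utterance)

    def intent_file_handler(intent_file):
        def decorator(handler):
            handler.intent_file = intent_file  # remember which intent triggers this handler
            return handler
        return decorator

class HelloWorldSkill(MycroftSkill):
    @intent_file_handler("hello.world.intent")  # fires on phrases listed in that intent file
    def handle_hello_world(self, message):
        self.speak("Hello from the Raspberry Pi")

def create_skill():
    # Mycroft's skill loader calls this factory when it loads the skill
    return HelloWorldSkill()
```

On a Picroft image the skill directory would also contain the `hello.world.intent` file listing example utterances; see the docs above for the full layout.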
As we know, the BeagleBone Black doesn't have a DSP on the SoC specifically for video processing, but is there any way we can achieve that by adding some extra DSP board?
I mean, the Raspberry Pi has video processing, so has anyone tried to integrate the two boards to make this work?
I know it's not the optimal way and the two boards are different, but I have only one BBB and one Raspberry Pi, and I am trying to achieve 1080p video streaming with better quality.
There is no DSP on the BeagleBone Black, so you need to implement the DSP functions in software.
If your input is audio, you can use ALSA.
When you say "don't have a DSP on the SoC specific for video processing", I think you mean what is usually called a VPU (Video Processing Unit), and indeed the BeagleBone Black's AM3358 processor doesn't have one (source: http://www.ti.com/lit/gpn/am3358).
x264 has ARM NEON optimizations, so it can encode video reasonably well in software; 640x480 at 30 fps should be fine, but 1920x1080 at 30 fps is likely out of reach (you may get 8-10 fps).
On the Raspberry Pi, you can use GStreamer with omxh264enc to take advantage of the onboard VPU to encode video. I think it is a bit rough (not as solid as raspivid etc.), but this should get you started: https://blankstechblog.wordpress.com/2015/01/25/hardware-video-encoding-progess-with-the-raspberry-pi/
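To make the GStreamer route concrete, here is a sketch that assembles a typical pipeline for streaming the VPU-encoded video over RTP/UDP. The device path, bitrate, and receiver host/port are assumptions to adapt to your setup:

```python
# Assemble a GStreamer pipeline that encodes camera frames with the
# Pi's VPU (omxh264enc) and streams them over RTP/UDP.
# Device path, bitrate, and receiver host/port are assumptions -- adapt them.
elements = [
    "v4l2src device=/dev/video0",
    "video/x-raw,width=1920,height=1080,framerate=30/1",  # caps filter
    "omxh264enc target-bitrate=4000000 control-rate=variable",
    "h264parse",
    "rtph264pay config-interval=1 pt=96",
    "udpsink host=192.168.0.10 port=5000",
]
pipeline = " ! ".join(elements)
print("gst-launch-1.0 " + pipeline)
```

Run the printed command on the Pi; on the receiving machine, a matching `udpsrc ! rtph264depay ! avdec_h264 ! autovideosink` pipeline can display the stream.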
I have been searching for how to access the iPhone camera from MATLAB, and I found that it can be accessed with an app called IP Cam over a local network. However, the IP Cam app from the App Store isn't working well for my application: I'm trying to build a real-time image-capture program using the iPhone's camera and MATLAB Mobile, with processing afterwards, and this method keeps MATLAB busy for as long as it displays the scene, while I still want MATLAB, not IP Cam, to run in the foreground.
So far I've downloaded MATLAB Mobile and the connector and connected the iPhone to MATLAB on my laptop. Does anyone know how to access the iPhone's camera from MATLAB Mobile and capture an image so that it can be stored in the MATLAB workspace for later processing? Or can anyone suggest tutorials or any material to help me through this? I'd appreciate your answer very much, and thank you in advance.
P.S.: A solution for Android devices would also work for me.
I am not aware of iPhone-based solutions, but here is an option for an Android-based system: I suggest you use Sensor EX. I have used it for some time to acquire accelerometer and gyroscope data sets along with live images. This tool has bindings for MATLAB, among other programming environments. Feel free to ask questions if you cannot figure out how this system works.
I am working on a project of mine which requires a webcam and MATLAB. I have a Logitech webcam, and I don't know whether I can talk to it through MATLAB. I'm trying to work with images and image processing, so I just want to know if there is a way to find out whether the webcam is compatible with MATLAB, or whether I need to get some other type of webcam to get the job done. If I do need another one, it would be helpful if you could suggest a camera that is cheap and widely available.
Thank you.
I've used several Logitech webcams with the Image Acquisition Toolbox on Windows. You'll find a list of supported hardware here.