MotionEyeOS without internet - raspberry-pi

I am trying to build a small system which includes a Raspberry Pi and a Raspberry Pi Zero. The Pi acts as a local WiFi hotspot and the Pi Zero connects to that hotspot; I then access the Zero's camera through the Pi.
It works fine as long as there is a LAN connection. Once I remove the LAN connection, motionEyeOS won't stream any data; it even disconnects from the WiFi and goes into a boot loop.
So my question is: is there any way to make motionEyeOS work without an actual internet connection?

The answer to your question: You can set link_watch="false" in /data/etc/watch.conf
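On the images I have seen, watch.conf is a plain shell-style config file on the writable /data partition, so a quick way to apply the setting over SSH is roughly the following (a minimal sketch; the exact file layout may differ between motionEyeOS releases):

    # SSH into the camera, then:
    mount -o remount,rw /data                      # make sure /data is writable
    # add the setting, or update it if a line already exists
    if grep -q '^link_watch=' /data/etc/watch.conf; then
        sed -i 's/^link_watch=.*/link_watch="false"/' /data/etc/watch.conf
    else
        echo 'link_watch="false"' >> /data/etc/watch.conf
    fi
    reboot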
But this setting comes with a few other problems:
If your camera boots without a network connection (internet), it has no time set, so your videos and images could get overwritten.
If your camera boots without, or loses, its network connection (internet), it does not try to reconnect, and your only remaining option is to connect a keyboard and monitor to it.
A hardware clock would help with the first problem, but I am still searching for a solution to the second. If you already have one, I would appreciate it if you could help me out.
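For reference, the usual hardware-clock route on a Pi is an I²C RTC module (a DS3231 is a common choice) enabled via a device-tree overlay. A rough sketch, assuming Raspbian-style handling of /boot/config.txt:

    # enable I2C and the RTC overlay, then reboot
    echo 'dtparam=i2c_arm=on' >> /boot/config.txt
    echo 'dtoverlay=i2c-rtc,ds3231' >> /boot/config.txt
    # after a reboot with a network-synced clock, store the time in the RTC once
    hwclock -w
    # on later offline boots, the system time can be restored from the RTC
    hwclock -s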

https://www.raspberrypi.org/documentation/usage/camera/raspicam/raspivid.md
This is what I use for an offline video recorder; it only needs the software, a power source, and of course the camera, but you get the idea. Keep your image (OS) as small as possible, as the recordings can fill an SD card fast. If I remember right, I used 800 x 600 and it would use about a gigabyte an hour.
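For context, the kind of invocation I mean is roughly this (the flag values are just an example; see the raspivid page linked above):

    # record 800x600 H.264 indefinitely, rolling over to a new file every hour
    raspivid -w 800 -h 600 -fps 25 -t 0 \
             -sg 3600000 -wr 24 \
             -o /home/pi/rec_%04d.h264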

Related

Solution to remote power cycle embedded devices

I am looking for a solution to remotely power cycle embedded devices that are connected via a USB hub to a server. I have a software solution (usbreset.c) that was posted in several responses on Stack Overflow for resetting the USB port. This works if the device is still alive and can be detected by the server but simply fails to communicate with it; I have had to use it a few times.
However, sometimes the device hangs or freezes on a GUI page and there is no serial communication at all. That requires physically power cycling the device using its on-off switch. Someone suggested a Raspberry Pi solution, but as per that solution I would need one Raspberry Pi per device, essentially using a Raspberry Pi as a relay. This is expensive and does not scale.
Could relays work in this situation, i.e. the power pins of each device connected to a relay, one relay per device? There are also remote IP power-cycle solutions, where a web interface lets you check which device is offline and power cycle just that one; such solutions are common in IT. Is that kind of solution viable for remotely power cycling embedded devices connected at the other end?
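For what it's worth, the usbreset.c approach mentioned above is normally used roughly like this (the bus and device numbers are placeholders taken from lsusb output):

    # compile the widely circulated usbreset.c once
    cc usbreset.c -o usbreset
    # find the bus and device number of the stuck device
    lsusb                                   # e.g. "Bus 002 Device 004: ..."
    # issue a port reset on that device node (needs root)
    sudo ./usbreset /dev/bus/usb/002/004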

How do I create a Near Edge computing system? (Send sensor data with Raspberry Pi/DHT11 sensor)

I am working on edge computing for IoT applications and am expected to create a system that acts as a near-edge computer using a Raspberry Pi hooked up to a DHT11 sensor. How do I send this data over to a computer that sits at the edge? Ideally I want to use my PC as that device, but I have no clue how to send the data over in real time.
So far I have created the circuit and can view the temperature and humidity readings on the Raspberry Pi in Python. I am unsure what the next steps are - I don't want to send this data to the cloud just yet.
Side note: I may be missing some knowledge here, but is the Raspberry Pi an edge device because it is hooked up to the sensor directly?
Any help is greatly appreciated.
You need to think this through a bit more. What will you do with the temperature and humidity data that you receive?
For example, if you're just experimenting and want to simply see the readings in a console on your PC, you can use netcat to send the console output of your Python program from the RPi to the PC. No software development needed; they just have to be on the same network. It's not particularly useful for anything else, though.
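A minimal sketch of that netcat approach (the IP address, port and script name are placeholders; the script is whatever already prints your DHT11 readings):

    # on the PC: listen on TCP port 5000 and print whatever arrives
    nc -l -p 5000                  # some nc variants want "nc -l 5000" instead
    # on the Raspberry Pi: pipe the script's stdout to the PC
    python3 read_dht11.py | nc 192.168.1.50 5000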
Otherwise you need to set up some client-server solution between the RPi and your PC. There are a ton of possible solutions, all depending on what you plan to do with the data. You can use MQTT, HTTP, a straight database connection (MySQL, PostgreSQL), etc. You have to supply both sides of the connection: the Python code on the client side which connects and sends data, and the server-side piece that accepts the samples and stores them somewhere, plus all the networking, authentication and so on.
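As a rough illustration of the MQTT route using the mosquitto command-line clients (broker address, topic and script name are placeholders):

    # on the PC: run a broker (e.g. mosquitto) and watch the topic
    mosquitto_sub -h localhost -t sensors/dht11 -v
    # on the Raspberry Pi: publish each line printed by the script as a message
    python3 read_dht11.py | mosquitto_pub -h 192.168.1.50 -t sensors/dht11 -l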
Or you can just download the Python client libraries for your favourite cloud solution and set that up according to a tutorial. TBH, this sounds like a lot less work to me.

Use Raspberry Pi like GoPro, Live Videostream over WiFi direct connection between Pi and Android

Over the last few weeks I have experimented with my Raspberry Pi B and the PiCamera. I had the idea of establishing a connection between the RasPi and an Android device, or (if it is easier) a Windows notebook, without an access point in between, just like the GoPro camera and its app. I would like a live stream from the PiCamera to the other device, plus the ability to start/stop recording a video or simply take a picture.
The app itself is not my problem, I have written some simple apps before. But I haven't yet found a tutorial or description of how to set up the communication and the stream.
I bought a WiFi dongle (Fritz!WLAN Stick N - by AVM) that supports WiFi direct and my phone (Samsung Galaxy S5 mini) does as well.
My first question is how to set up this stick on Raspbian - so far it is not recognised as a WiFi dongle - and the second is how to achieve what I described above.
Could anyone please describe what I can do?
Thanks in advance!
PS: I would prefer a bash-based description because I use SSH.
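For the streaming half alone, a rough sketch (assuming the two devices can already reach each other over some WiFi link, for example with the Pi acting as its own access point) is the classic raspivid-to-netcat pipe from the Raspberry Pi camera documentation:

    # on the Pi: stream the camera as raw H.264 over TCP port 5001
    raspivid -t 0 -w 1280 -h 720 -fps 25 -o - | nc -l -p 5001
    # on the notebook: connect and play the raw stream, e.g. with mplayer
    nc <pi-address> 5001 | mplayer -fps 30 -cache 1024 -demuxer h264es -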

vlc hardware encoding for real-time screen multicast

We are preparing a solution to multicast the teacher's screen to 40 student PCs.
Teacher and student PCs can be either Ubuntu or Windows.
Some solutions were tested:
iTalc ... not stable yet.
multicast a "vnc -viewonly" ... no solution found
capture the screen with VLC and multicast it.
The last one seems to work ... except that at a resolution like 1920x1200 it is just too CPU intensive.
One idea would be to capture only a quarter of the screen. The CPU is no longer saturated, but everything becomes really slow and the area is too small anyway.
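For reference, the kind of VLC pipeline meant here looks roughly like this (multicast address, port, frame rate and bitrate are placeholders):

    # on the teacher's PC: capture the desktop, encode to H.264 and multicast it
    cvlc screen:// --screen-fps 15 \
         --sout '#transcode{vcodec=h264,venc=x264{preset=ultrafast,tune=zerolatency},vb=2000}:rtp{mux=ts,dst=239.255.42.42,port=5004}' \
         --sout-keep --ttl 2
    # on each student PC:
    vlc rtp://@239.255.42.42:5004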
A second idea is to buy a PCI card (or something) which will be dedicated to real-time video encoding.
Does anyone have experience/knowledge of this?
Thanks!
Try the TightVNC project's TightProjector.
TightProjector is a program that can transmit the screen of a particular Windows computer to other computers in the same local-area network. The data is transmitted continuously, in real time.

Is using airplane mode an acceptable way to test a lack of connection?

We are in the process of developing a method of caching so that our app can continue to operate in an area with very little or no signal.
Obviously users will try to continue to use functions that require data and we need to handle the inevitable failure of these requests appropriately.
Essentially we sit in the office switching airplane mode on and off to simulate entering/exiting signal coverage, then adjust our app to fix any issues that arise.
What I'd like to know is: will using airplane mode give us a reasonable simulation of entering/exiting an area with no data, or are there other implications?
I've seen questions raising the issue that the 3G/EDGE connection may not always wake up after airplane mode is switched back off. While I appreciate this method is nowhere near as good as actually testing out in the field, if we can get a reasonable simulation and account for the majority of the problems that arise, then I think it is an acceptable tradeoff.
I apologise if this has been asked before; I did search here and on Google but couldn't find any appropriate results.
You should try the Network Link Conditioner
There is a WWDC 2012 session called Networking Best Practices that mentions it (though it does not explain how to use it in detail).
To get it, go to XCode / Open Developer Tool / More Developer Tools... and download the latest Hardware IO Tools for XCode.
Once you install it from the Hardware IO Tools package, "Network Link Conditioner" will appear in System Preferences.
You can then do something like 100% packet loss to simulate one of those routers that pretends you are connected but actually doesn't work.
On iOS, the network link conditioner is under Settings / Developer (you must have enabled Developer mode in XCode first to see it)
The main problem is that in Airplane Mode networking operations fail fast, while a spotty mobile signal leads to timeouts and a-few-bytes-an-hour speeds. This is usually a significant difference from the UI viewpoint. (It might be worth trying a bandwidth throttle to starve the testing machine and see how it behaves when the network starts to break.)
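If the test traffic can be routed through a Linux box (for example a laptop sharing its WiFi), a hedged sketch of that throttling idea using tc/netem (interface name and numbers are placeholders):

    # add 500ms +/- 200ms latency and 10% packet loss on the outgoing interface
    sudo tc qdisc add dev wlan0 root netem delay 500ms 200ms loss 10%
    # put things back to normal afterwards
    sudo tc qdisc del dev wlan0 root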
A few years back, when testing remote devices which used the cell network to communicate with the 'home base', we did things like move them into a makeshift shielded room, or place large shields on three of four sides to force them to connect to a certain tower (and therefore network). Brute-force physical methods. Since this actually cuts off the signal, it may be a more realistic approach.
You may also want to try this through your WLAN router. First, disable data roaming on your iPhone. Then let the iPhone connect to the internet through your WLAN. Then disconnect the gateway on your WLAN router while the iPhone is still connected to the WLAN network.
This depends on what failure modes you are trying to test.
I use Airplane mode as a first pass check to make sure an app submission isn't quickly rejected.
Other network failure handling checks might include:
3G only (no wifi).
WIFI only (in Airplane mode).
Pulling the power cord on the WIFI access point.
Pulling the network cable from the back of the WIFI access point after connecting to it (Reachability may falsely say yes).
Walking in and out of a basement elevator (or other Faraday cage) in the middle of a transfer.
Driving between 2 cell towers during a data transfer.
Walking between 2 enabled WIFI access points between connection and data transfer.
Starting the app after more than 30 minutes of device inactivity (radios may be idle).
Running the app while another app (Safari, Mail) is downloading in the background.
etc.