vlc hardware encoding for real-time screen multicast - encoding

We are preparing a solution to multicast the teacher's screen to 40 students' PCs.
Teacher and student PCs can be either Ubuntu or Windows.
Some solutions were tested:
iTalc ... not stable yet.
multicast a "vnc -viewonly" ... no solution found.
capture the screen with VLC and multicast it.
The latter seems to work ... except that at a resolution like 1920x1200 it is just too CPU intensive.
One idea would be to capture only a quarter of the screen. The CPU is no longer saturated, but everything becomes really slow and the visible area is too small anyway.
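For reference, the kind of VLC command line we mean looks roughly like this (the multicast address, bitrate and encoder options are only illustrative and exact option names vary between VLC versions; on Windows, vlc.exe takes the same options):

cvlc screen:// --screen-fps=15 \
  --sout '#transcode{vcodec=h264,venc=x264{preset=ultrafast,tune=zerolatency},vb=2000}:rtp{mux=ts,dst=239.255.1.1,port=5004,ttl=1}'

The quarter-screen test just adds the screen module's crop options, e.g. --screen-width=960 --screen-height=600 --screen-top=0 --screen-left=0, and the student PCs would open the stream with vlc rtp://@239.255.1.1:5004.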
A second idea is to buy a PCI card (or something similar) dedicated to real-time video encoding.
Does anyone have experience/knowledge of this?
Thanks!

Try the TightVNC project's TightProjector.
TightProjector is a program that can transmit the screen of a
particular Windows computer to other computers in the same local-area
network. The data is transmitted continuously, in real time.

Related

HoloLens WiFi card seems to buffer received UDP messages

Currently I'm building a simple collaboration room for the HoloLens 2; I want people to be able to see each other.
So I built a system using UDP sockets to share the user's hand across the network (client-server).
To prove that my system is not at fault, I tried a simple experiment:
I have 1 server and 2 clients; one client runs on the HoloLens 2 and the other is a Windows standalone build, and both are connected to the server via WiFi.
When both are connected they can see each other's avatar as well as their own (the computer simulates hands with the Microsoft Mixed Reality Toolkit).
When the user moves their hand inside the HoloLens, we can see the PC client receiving it almost instantly, with a really small delay;
however, the HoloLens receives its own movement much later (approximately a 0.5 s delay).
Same experiment with the PC moving its avatar: it receives its own movement instantly, yet the HoloLens once again receives it with a delay.
I also noticed that if I increase the message frequency from the server, the HoloLens starts to lose all the packets.
It's really strange because all the packets are received and all the movements are restored correctly; they just arrive with a big delay on the HoloLens. I suppose the network card is at fault and buffers the received messages, but I'm not knowledgeable enough about networking to really understand what's happening there.
Alright, here are a couple of things you can do.
Try TCP communication so you can keep track of the packets.
Don't send the pose values constantly; only send when the pose changes, and when you receive the data on the HoloLens, interpolate to get a smoother result. This will make it feel like you have less lag.
Late reply to this, but hopefully it helps someone:
The HoloLens seems to absolutely hate receiving UDP broadcasts. I was having an identical latency issue, even with all other devices disconnected from the network, simple packets, and a send rate far lower than the HoloLens frame rate. Try addressing the HoloLens IP only, not the broadcast address, even if it means you need a second server running for all the other devices.

MotionEyeOS without internet

I am trying to build a small system that includes an rPi and an rPi Zero. The rPi acts as a local WiFi hotspot and the rPi Zero connects to the rPi's hotspot; I then access the Zero's camera through the rPi.
It works totally fine as long as I have a LAN connection. Once I remove the LAN connection, motionEyeOS won't stream any data; it even disconnects from WiFi and goes into a boot loop.
So my question: is there any way to make motionEyeOS work without an actual internet connection?
The answer to your question: You can set link_watch="false" in /data/etc/watch.conf
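For clarity, the change is just this one setting, edited over SSH (reboot afterwards):

# /data/etc/watch.conf
link_watch="false"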
But this comes with a few other problems:
If your camera boots without a network connection (internet), it has no time set (your videos and images could get overwritten).
If your camera boots without a network connection (internet) or loses it, it does not try to reconnect, and you have no choice but to connect a keyboard and monitor to it.
A hardware clock would help with the first problem, but I am still searching for a solution for the second. If you already have one, I would appreciate it if you could help me out.
https://www.raspberrypi.org/documentation/usage/camera/raspicam/raspivid.md
This is what I use for an offline video recorder; it only needs the software and a power source (and the camera, of course, but you get the idea). Keep your image (OS) as small as possible, as this can fill an SD card fast. If I remember right, I used 800 x 600 and it would use about a gig an hour.
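If it helps, a raspivid invocation along those lines (flags as in the documentation linked above; the output path is just an example):

raspivid -w 800 -h 600 -fps 25 -t 0 -o /home/pi/rec/video.h264

-t 0 records until you stop it, so rotate or split the output files yourself to keep the SD card from filling up.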

How do I detect iPhone on network?

I am trying to detect if my iPhone is in the same network as my Raspberry Pi. I would like to execute a script when I am at home and my iPhone's presence is registered in my LAN.
It seems that when the phone is in standby, not even the iphone-sync port (62078/tcp) is found. "/usr/bin/nmap -n -sT -p62078 [my phone's local IP]" shows no host. I wonder what else I could scan for. Obviously the phone is online and ready to accept FaceTime calls (data via 3G is deactivated). Could I accomplish something with avahi, which I am using on my Raspberry Pi, or are there other ways?
I've just spent a week beating on this problem so I can refrain from sending SMS home alarms to my wife when she's at work.
Pinging won't work because the iPhone won't respond to ICMP when asleep. Reading the ARP cache won't work because a sleeping iPhone will come and go (check it every 30 seconds for a few minutes).
The only way I have found to 'reliably' determine when my two iPhones are on my local (home) network is to use the PCAP dotnet library to look for any packets originating from either of the phones' MAC addresses. For example, if you run Wireshark with the capture filter
ether src <iphone-mac-address>
you will see a surprising amount of network discovery/announcement traffic from the phone. It still has quiescent states, but so far the longest interval I have seen between captured packets is around 10 minutes. You would have to wait until you have not heard from the phone for some interval (I use 15 minutes) before declaring it not-home.
With this technique you will find a phone quickly when it rejoins the home network, assuming your phone is configured for DHCP. I also use port mirroring on my main Ethernet switch to include traffic from my wireless access points.
I don't have a Raspberry Pi solution for this, because my linux expertise is very limited, but someone else may be able to help you along those lines. I have a Windows Service using the PCAP library and so far it works reliably, with the limitation of waiting 15 minutes before deciding an iPhone has left the network.
* update 2-3-2018 *
I have this detection algorithm down to about 5 minutes, using a combination of ping/arp messages directed to each phone, about once per minute. Seems to work great.
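For a Raspberry Pi / Linux take on the same idea, something like this sketch should work (interface and MAC address are placeholders; it needs root):

# exit the if-branch as soon as any frame from the phone's MAC is seen, or give up after 15 minutes
PHONE_MAC=00:19:e3:ab:cd:ef
if timeout 900 tcpdump -i eth0 -c 1 -q "ether src $PHONE_MAC" > /dev/null 2>&1; then
    echo "phone is home"
else
    echo "no traffic from phone for 15 minutes"
fi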
You can find a list of devices on your network by investigating your arp cache.
arp -a
Simply write a bash script to run arp -a at a regular interval, and search for the mac address of your phone.
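A minimal sketch of that script (the MAC address is a placeholder; adjust the interval and the action to your needs):

#!/bin/bash
# poll the ARP cache every 30 seconds and look for the phone's MAC
PHONE_MAC="00:19:e3:ab:cd:ef"
while true; do
    if arp -a | grep -iq "$PHONE_MAC"; then
        echo "phone detected on the network"
        # trigger your action here
    fi
    sleep 30
done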
You could go even further with this and perform different actions depending on what brand of device is connected.
The first three octets (six hexadecimal digits) of a MAC address are the vendor ID.
Take the following mac address:
00:19:E3:AB:CD:EF
00:19:E3 is one of the MAC address prefixes registered to Apple devices.
By comparing the devices on your network against this list, you could detect when, for example, a '3Com' device or a 'Dell' device attaches to your network.
http://www.coffer.com/mac_find/?string=apple
You can use "arp-scan -l -r10" for that (tested this myself), but the problem is that, with mobile data enabled, the iPhone will suspend WiFi when the screen is locked to save battery. So you need to disable mobile data; then arp-scan will work.

WOL iOS Project

Are there free projects/examples of WOL (Wake-on-LAN) for iOS?
I've found this one, openwol, but it's old and shows no signs of recent updates.
I've also been digging into it and it's not working as expected; its main purpose is to wake a computer, but it doesn't. I've spent some time on it and it still doesn't work.
Maybe there are other examples, or someone else might join me in fixing/updating this code?
Wake-on-LAN generally doesn't work for machines on wireless networks, as the wireless hardware is typically powered down when the machine is off or asleep -- it's usually only supported for wired Ethernet. As such, wake-on-LAN is unlikely to give acceptable results on an iOS device, as it'll only work for some specific network configurations (i.e, networks where the wireless segment is bridged to a wired segment that the target system is connected to).

Best software product to simulate connectivity issues for mobile testing

I need a product to simulate network latency for testing mobile applications (in particular iPhone and Android). I plan to set up a WiFi router connected to a Linux box, and write a number of scripts to approximate different types of connectivity issues.
So far, I've taken a cursory look at Netem and ns-2 (or its offspring ns-3). Netem looks very easy to deploy and configure, but they both look like they'll require some in-depth investigation.
Does anyone have positive/negative experiences with either of those solutions that they could share? Or maybe used a different solution for this problem?
If anyone comes here looking for tips, I've found a solution that seems to work well.
Ubuntu comes with Netem installed, so I went ahead and just made use of that. Basically, I got a computer with two ethernet ports, forwarded one to the other and applied Netem latency settings to the connection. Then I attached a wireless router to one, and LAN to the other. Netem lets me play with all kinds of latency and packet loss settings.
Btw, I also tried to use a few different laptops and set the internal wireless card up as an ad-hoc wireless router. I got it working for the most part, but finding a laptop with an internal wireless card that plays nice with ad-hoc in Linux is tricky at best... can't recommend it.