I want to set up a point-to-point communication link between two Raspberry Pis using LoRa.
I know that for LoRaWAN there is (at least in Europe, where I live) a duty-cycle limitation, so the nodes can transmit only for an average of 30 seconds of uplink time on air, per day, per device.
Does this also apply to point-to-point LoRa communication? Because my sender keeps on sending.
I am using the code provided here.
Yes, this also applies to your LoRa application, since it is emitting radio waves. You can look up the limits for specific frequency bands in Europe in the ERC Recommendation 70-03 (page 7). On page 42 of the ERC Recommendation 70-03 you can then look up which of the frequency bands are allowed in each country.
Example
Let's say you live in Germany and you want to use the band from 869.400 MHz to 869.650 MHz (this frequency band is called h1.6):
A quick lookup on page 39 of the ERC Recommendation 70-03 shows that this band is allowed to be used in Germany.
Furthermore, this specific band allows your transmitter a 10% time-on-air (duty cycle). This basically means that for every 1 second of transmission you are obligated to pause for 9 seconds afterwards.
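As an illustration, here is a minimal sketch of how your sender loop could respect that limit; the `send` argument is a hypothetical stand-in for whatever transmit function your LoRa library provides, not a real API:

```python
import time

DUTY_CYCLE = 0.10  # 10% allowed in the h1.6 band from this example

def duty_cycle_send(send, payload):
    """Transmit once, then pause long enough to stay within the duty cycle.

    `send` is a hypothetical stand-in for your LoRa library's transmit
    call and is assumed to block until the transmission has finished.
    """
    start = time.monotonic()
    send(payload)
    time_on_air = time.monotonic() - start
    # Solve time_on_air / (time_on_air + pause) = DUTY_CYCLE for pause:
    pause = time_on_air * (1 - DUTY_CYCLE) / DUTY_CYCLE
    time.sleep(pause)  # e.g. 1 s on air => 9 s pause
```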
I'm working with Movesense 2.0.0 on an HR+ sensor, and I have to minimize the power consumption when the device is not worn.
I can't turn it completely off, since I need it to keep the correct time. So, to reduce battery usage, when I don't receive an HR notification for a certain amount of time, I unsubscribe from all sensors.
What's the most power-efficient way to determine when the device is worn again? I was thinking about subscribing to the accelerometer (as I understand it is the sensor with the lowest power consumption) and, when I detect movement, resubscribing to HR and checking for incoming data.
Is it a valid approach?
I also noticed that when the device isn't worn but is still connected to the strap, I sometimes receive incorrect HR notifications, as if the strap were acting as an antenna for electromagnetic noise. Is there a way to detect when the device is in that state, other than looking at the HR data to see whether it makes sense?
Your question is a bit vague about what you mean by "wearing a sensor" (I'm assuming you mean the HR strap on the chest). In that case, if you look at the power consumption documentation (see the PowerOff measurements compared to no-wakeup), you'll notice that:
HR wakeup (/System/States/2 (=Connector)) is ~0.2 uA
Movement wakeup (/System/States/0 (=Movement)) is ~4 uA
All other measurements are much higher, starting from 10 uA for Acc @ 13 Hz.
So the easiest and lowest-power determination is to SUBSCRIBE to /System/States/2.
If you base your firmware on version >=2.1 and you measure HR or ECG, you also get updates during the measurement when the connection is lost (so-called Leads-Off detection), so this should help to filter out the spurious HR detections. For firmware 2.0 and earlier you get Connector state 2 (=Unknown) when measuring.
Note: the leads-on detection (/System/States/2 when no HR measurement is ongoing) is very sensitive and can give a "connected" state when the HR strap is sweaty.
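The logic could look roughly like this. This is only a sketch: the `SensorClient` class, its `subscribe`/`unsubscribe` methods, and the assumed state values are hypothetical stand-ins for your real connectivity layer, not the actual Movesense API:

```python
# Hypothetical sketch of the wear-detection logic described above.

CONNECTED = 1  # assumed "leads on" value; per the note above, 2 means Unknown


class SensorClient:
    """Stub that only illustrates subscribe/unsubscribe bookkeeping."""

    def __init__(self):
        self.callbacks = {}

    def subscribe(self, path, callback):
        self.callbacks[path] = callback
        print("subscribed to", path)

    def unsubscribe(self, path):
        self.callbacks.pop(path, None)
        print("unsubscribed from", path)


client = SensorClient()


def on_hr(sample):
    print("HR sample:", sample)


def on_connector_state(state):
    if state == CONNECTED:
        # Strap contact detected: resume the expensive HR measurement.
        client.subscribe("/Meas/HR", on_hr)
    else:
        # Leads off: drop the HR subscription to save power.
        client.unsubscribe("/Meas/HR")


# Cheapest wakeup (~0.2 uA according to the power documentation):
client.subscribe("/System/States/2", on_connector_state)
```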
Full disclosure: I work for the Movesense team
I have a project that consists of transmitting data wirelessly from 15 tractors to a station; the maximum distance between a tractor and the station is 13 miles. I used a Raspberry Pi 3 to collect data from the tractors. With some research I found that there is no WiFi or GSM coverage, so the only solution is to use RF communication on VHF. Is that possible with a Raspberry Pi, or must I add a modem? If so, what is the criterion for choosing a modem? And please, if you have any other information, tell me.
And thank you for your time.
I had a similar issue but possibly a little more complex. I needed to cover a maximum distance of 22 kilometres and I wanted to monitor over 100 resources ranging from breeding stock to fences and gates etc. I too had no GSM access plus no direct line of sight access as the area is hilly and the breeders like the deep valleys. The solution I used was to make my own radio network using cheap radio repeaters. Everything was battery operated and was driven by the receivers powering up the transmitters. This means that the units consume only 40 micro amps on standby and when the transmitters transmit, in my case they consume around 100 to 200 milliamps.
In the house I have a little program that transmits a poll to the receivers every so often and waits for the units to reply. This gives me a big advantage because, via the repeater trail (each repeater the signal passes through adds its code to the returning message), I can actually determine where my stock are.
Now for the big issue: how long do the batteries last? Well, each unit has an 18650 battery. For the fence and gate controls this is charged by a small 5 volt solar panel, and after 2 years of running time I have not changed any of them. For the cattle units the length of time between charges depends solely on how often you poll the units (note each unit has its own code). With one exception (a bull who wants to roam and is a real escape artist), I only poll them once or twice a day, and I swap the battery every two weeks.
The frequency I use is 433 MHz and the radio transmitters and receivers are very cheap (less than 10 cents a pair if you buy them in Australia), with a very small ATtiny (I think) Arduino per unit (around 30 cents each) and a length of wire as an aerial (34.6 cm long for the cattle units and 69.2 cm for the repeaters). Note these lengths are based on the frequency used, i.e. 433 MHz.
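Those aerial lengths come straight from the wavelength at the chosen frequency; you can verify the arithmetic quickly:

```python
C = 299_792_458  # speed of light in m/s

freq_hz = 433e6
wavelength_cm = C / freq_hz * 100

print(round(wavelength_cm / 2, 1))  # 34.6 cm: half-wave aerial on the cattle units
print(round(wavelength_cm, 1))      # 69.2 cm: full-wave aerial on the repeaters
```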
As I had to install lots of the repeaters, I contacted an organisation in China (sorry, they no longer exist) and they created a tiny waterproof and rugged capsule that contained everything, while also improving on the design (range-wise, while reducing power) at a cost of $220 for 100 units, not including batteries. I bought one lot as a test, and now between myself and my neighbours we have bought another 2000 units for only $2750.
In my case this was paid for in less than three months when, during calving season, I knew exactly where they were calving and was on site to assist. The first time I used it we saved a mother who was having a real issue.
To end this long message I am not an expert but I had an idea and hired people who were and the repeater approach certainly works over long distances and large areas (42 square kilometres).
Following on from the comments above, I'm not sure where you are located, but spectrum around the 400 MHz range is licensed in many countries, so it would be worth checking exactly what you can use.
If this is your target then this is UHF rather than VHF so if you search for 'Raspberry PI UHF shield' or 'Raspberry PI UHF module' you will find some examples of cheap hardware you can add to your raspberry pi to support communication over these frequencies. Most of the results should include some software examples also.
There are also articles on using the pins on the Pi to transmit directly by modulating the voltage on them - this is almost certainly going to interfere with other communications, so I doubt it would meet your needs.
I have been following a tutorial that enables you to play around with the TXPOWER parameter of your wifi card / wifi adapter:
http://null-byte.wonderhowto.com/how-to/set-your-wi-fi-cards-tx-power-higher-than-30-dbm-0149606/
You can easily boost your WiFi range by increasing the TXPOWER.
Now, most people want to improve the WiFi signal strength of their home router, right? But in my case, I would like my home router (which runs on a Raspberry Pi) to have a relatively small WiFi signal radius (say, a radius of 2 meters), so that you actually need to physically look for the Pi home router when trying to connect to it.
I have learned that this tutorial does not affect the WiFi link quality and/or the WiFi signal level, and thus does not influence the WiFi radius of my Pi home router.
Do you guys have any ideas/thoughts about how to decrease the link quality and/or WiFi signal level (e.g. Link Quality = 12/70 and Signal level = -10 dBm)? Is this even possible?
I am using a TP-Link TL-WN722N IEEE 802.11n USB Wi-Fi adapter (Wireless Lite N, 150 Mbps, high gain, with one detachable external antenna).
First, I recommend reviewing this section from your link:
QUICK DECIBEL UNDERSTANDING:
Every 10 decibels is a 10X increase in power, starting from 0 dBm equal to 1 mW: 10 dBm equals 10 mW, 20 dBm equals 100 mW, 30 dBm equals 1000 mW, and so on. Every 3 decibels approximately doubles the prior power, so since 30 dBm is 1000 mW, adding 3 dB doubles the power, such that 33 dBm is about equal to 2000 mW.
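The conversion in that quote is simply mW = 10^(dBm/10), which you can check quickly in Python:

```python
def dbm_to_mw(dbm):
    """Convert a power level in dBm to milliwatts."""
    return 10 ** (dbm / 10)

print(dbm_to_mw(0))   # 1.0 mW
print(dbm_to_mw(10))  # 10.0 mW
print(dbm_to_mw(30))  # 1000.0 mW
print(dbm_to_mw(33))  # ~1995 mW, i.e. roughly double 30 dBm
```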
It appears to me that you are able to modify the transmit power of your adapter as the tutorial states. Are you saying this is not working? If you set your transmit power to something extremely low (-30 dBm, for example), you would effectively be turning off the transmitter. Keep increasing that value until you get your desired coverage radius.
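If the setting does work for you, here is a small sketch of how you might script it on Linux with `iw` (assuming your interface is named wlan0 and your driver actually honours the setting; `iw` takes the power in mBm, i.e. hundredths of a dBm, and needs root):

```python
import subprocess

def set_tx_power(interface, dbm):
    """Set a fixed transmit power via `iw`, which expects mBm (dBm * 100)."""
    subprocess.run(
        ["iw", "dev", interface, "set", "txpower", "fixed", str(dbm * 100)],
        check=True,
    )

set_tx_power("wlan0", 1)  # start very low, then raise until the radius fits
```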
If the transmit power parameter is not functioning as per the tutorial, then there are other means to achieve reduced coverage. The model you specified has a detachable antenna, so detach it. This would definitely reduce your coverage. However, if it reduces coverage too much, you could simply add an inline attenuator. Fortunately, your antenna uses an SMA connector, which is very common. You can find many SMA attenuators on eBay with different attenuation values. Experiment with different values until you get the desired coverage.
And if that doesn't work, just wrap a bunch of aluminum foil around the thing lol.
Hi,
In CDMA cellular networks, when an MS (Mobile Station) needs to change BS (Base Station) - that is, during hand-off - I know this is a soft hand-off (the MS makes a connection with a target BS before leaving the current BS). But since the MS stays connected to more than one BS for a time, I want to know: does the MS use the same CDMA code to communicate with all the BSs, or a different code for each BS?
Thanks in advance
For the benefit of everyone, I have touched upon a few points before coming to the main point.
Soft handoff is also termed "make-before-break" handoff. This technique falls under the category of MAHO (Mobile Assisted Handover). The key theme is having the MS maintain simultaneous communication links with two or more BSs to ensure an uninterrupted call.
In the DL direction, this is achieved by two or more BTSs using different transmission codes (transmitting the same bit stream) on different physical channels in the same frequency, wherein the CDMA phone simultaneously receives the signals from these two or more BTSs. In the active set there can be more than one pilot, as there could be three carriers involved in a soft handoff. There is also a rake receiver that performs maximal-ratio combining of the received signals.
In the UL direction, the MS operates on a candidate set in which there can be more than one pilot with sufficient signal strength for use, as reported by the MS. Each BTS tags the user's data with a frame reliability indicator that provides details about the transmission quality to the BSC. So, even though the signals (the MS's code channel) are received by both base stations, they are routed to the BSC along with information about the quality of the received signals; the BSC examines the quality based on the frame reliability indicator and chooses the best-quality stream, i.e. the best candidate.
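To illustrate the combining step numerically, here is a generic maximal-ratio combining sketch with made-up channel gains (not CDMA-specific code): each received copy is weighted by its conjugate channel gain so the stronger paths dominate the decision.

```python
import numpy as np

symbol = 1 + 0j  # the same symbol arriving via three paths / base stations
h = np.array([0.9 * np.exp(1j * 0.3), 0.5 * np.exp(-1j * 1.1), 0.2 + 0j])
noise = 0.05 * (np.random.randn(3) + 1j * np.random.randn(3))
r = h * symbol + noise  # the three received copies

# Maximal-ratio combining: weight by conjugate channel gains, then normalise.
combined = np.sum(np.conj(h) * r) / np.sum(np.abs(h) ** 2)
print(combined)  # close to the transmitted 1+0j
```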
I know the iPhone Bluetooth capabilities won't be accessible through the SDK until 3.0, but how long should it take to find devices in the area? Is it dependent on the number of devices in the area? If there are around 5 devices in range, should a scan to discover all of them take <5 seconds, or >30 seconds?
I know there are a lot of unknown factors, but I'm trying to determine if I can do a Bluetooth scan on startup if the time is minimal, or if I have to tell the user it is about to do a scan and there could be a long delay. I am unable to test this in the real world as the other Bluetooth devices aren't available, but I am trying to get a sense of how it could be designed.
Not sure what the API will let you do, but the Bluetooth Host Controller Interface (HCI) command underlying this is the 'Inquiry Command'.
This will let you inquire about devices either for a fixed time and/or a fixed number of responses.
I'm a Bluetooth neophyte, not an expert but...
To get at least 1 response from a Bluetooth device that is in a low-power mode takes 1.28 seconds, so the inquiry time is specified in multiples of that period, up to a maximum of 61.44 seconds; the range is therefore 1 period (1.28 seconds) to 48 periods (61.44 seconds).
There might be several devices that could respond in a single 1.28 second period though.
You can also specify the number of responses you will accept (1..255) or 0 for unlimited e.g. until the time runs out.
You can also cancel an inquiry, if you found a particular device you were looking for.
Unscientific test from my desk using a CSR bluetooth chip with Bluetooth 2.1 +EDR firmware running inquiry on the chip with debug output via the chip UART. Ran each inquiry 10 times and took an average of the results:
1 period inquiry time (1.28 seconds) yielded an average of 10 unique Bluetooth addresses.
5 period inquiry time (6.4 seconds) yielded an average of 23 unique Bluetooth addresses.
10 period inquiry time (12.8 seconds) yielded an average of 29 unique Bluetooth addresses.
I say 'unique'; actually, the results repeated a lot of the same addresses over and over. This may be implementation-dependent, though, and the Apple API may only return unique addresses.
However, this is not representative of the 'real world', as most of the Bluetooth devices around here (my office) are not in a low-power mode. I guess I could filter out PCs, laptops and test kit by Class of Device. That would leave mobile phones, headsets that were discoverable, etc...
Inquiry can also be combined with RSSI to get the devices with the strongest signal, but they may not necessarily be the closest.
For your scenario you might want to do an inquiry based on both time and number of devices, e.g. 4 * 1.28 seconds or 10 devices.
To summarise:
The shortest time you can do an inquiry for is 1.28 seconds, and that could get 10 (plus or minus?) devices in the area IF they are awake and nearby.
If you've got a saturated Bluetooth environment (or a microwave oven going in the same room) it could take longer to find all the devices within range.
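If you want to experiment with these parameters yourself from a PC (outside the iPhone SDK), PyBluez exposes the same underlying inquiry; its `duration` argument is given in those 1.28-second units:

```python
import bluetooth  # PyBluez

# duration=4 => 4 * 1.28 = ~5.12 seconds of inquiry
devices = bluetooth.discover_devices(duration=4, lookup_names=True)

for addr, name in devices:
    print(addr, name)
```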
I know this is an old question, but I thought I might add something for anyone who finds this question later.
As Simon Peverett mentions, device discovery is performed by an underlying "Inquiry Command" that is carried out by the Bluetooth Host Controller Interface. In the Bluetooth spec V4.0, Volume 2, Part E, Section 6.1.4, the spec says:
When general inquiry is initiated by a Bluetooth device, the INQUIRY state
shall last TGAP(100) or longer, unless the inquirer collects enough responses
and determines to abort the INQUIRY state earlier.
Elsewhere, TGAP(100) is explained to be 10.24 seconds and is described as the recommended value for the time span that a Bluetooth device performs device discovery.
In other words, a good baseline for the minimum amount of time to perform an inquiry is 10.24 seconds, or 8 of the 1.28 second periods that the Inquiry Command measures time by.