Domoticz virtual sensor: how to set initial value

I have a device that reads the P1 port of a smart meter. I have created a virtual sensor for the smart meter and, through a simple shell script, I can set the meter readings in Domoticz. This works well.
However, for both devices (electricity and gas), I get a large consumption peak in the first period. That is because the virtual sensor is initialized with a value of 0 for all counters, and this peak makes the graphs unusable.
Is there a way to initialize the meter reading to a non-zero value?

Domoticz uses an SQLite3 database. With sqlite3 domoticz.db '.dump' > domo.dump you can get a complete dump of the database. Searching for the index of the meter, I found the initial value (for the gas meter):
INSERT INTO Meter VALUES(1093,0,0,'2022-03-24 18:00:00');
Removing that row from the Meter table did the trick.
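For example, a minimal Python sketch of the same fix (the row ID 1093 is the gas meter's device index from the dump above; the column names DeviceRowID and Value are my reading of the dumped INSERT statement, and Domoticz should be stopped while you edit the database):

import sqlite3

con = sqlite3.connect("domoticz.db")
# Delete the bogus zero reading so the first real reading does not
# register as a huge consumption peak.
con.execute(
    "DELETE FROM Meter WHERE DeviceRowID = ? AND Value = 0",
    (1093,),
)
con.commit()
con.close()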

Related

How to measure change in altitude at high frequency with Apple Watch

I am trying to measure changes in altitude with an Apple Watch during a sport activity (kite surfing). Currently my app is just collecting data for analysis. I am recording barometric and GPS altitude for comparison, at a frequency of 10 measurements per second. Basically it works and data is recorded, but the data seems worthless: both measurements show sudden jumps of up to ±10 m, and the GPS readings spike by up to 75 m. Does anyone have an idea how to get reasonably accurate readings? I do not care about absolute altitude; I am only interested in the change of altitude.
Use startRelativeAltitudeUpdates(to:withHandler:), and when you're done, remember to call stopRelativeAltitudeUpdates().
Here is a link to the doc.
You can also ignore anomalies. For example, if the maximum plausible change in altitude in 100 milliseconds is 2 meters (a vertical speed of 72 km/h), then any change of more than 2 meters within 100 milliseconds can simply be discarded; just wait for the next reading.
Remember, when you ignore a reading, to account for the larger time difference to the next accepted one.
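As a sketch of that filter (plain Python for clarity; in the app this logic would sit in the CoreMotion update handler, and the 20 m/s threshold is an assumption to be tuned):

MAX_RATE = 20.0  # m/s; 2 m per 100 ms, anything faster is treated as a spike

def filter_spikes(samples):
    """samples: iterable of (timestamp_s, altitude_m) tuples.
    Returns the samples that pass the rate-limit check."""
    accepted = []
    for t, alt in samples:
        if accepted:
            last_t, last_alt = accepted[-1]
            dt = t - last_t
            # Compare against the last *accepted* sample so the allowed
            # jump grows with the time gap created by dropped readings.
            if dt > 0 and abs(alt - last_alt) / dt > MAX_RATE:
                continue  # spike: ignore it and wait for the next reading
        accepted.append((t, alt))
    return accepted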

CrossSim crossbar simulation software

Is anyone using CrossSim, the crossbar simulator from Sandia National Laboratories? The manual does not explain the input files reset.csv/set.csv used for lookup-table generation. I need to know about the rmp values in that file. What is rmp and how was it calculated?
Or is there any other crossbar-array simulation software that can be used for vector multiplication etc., mainly for memristor devices?
I am a graduate student.
I'm studying resistive memory devices for neuromorphic computing.
I am also using this CrossSim simulator (ver. 0.2). Maybe I can help you.
Generally, a memristor is a two-terminal device whose resistance value is modulated by voltage pulses. If the memristor sees a voltage higher than the threshold voltage (Vth), its state changes; otherwise it holds its state.
So we program it with a voltage higher than Vth and read its state by applying a voltage lower than Vth.
The manual has no specific explanation of what's in the reset.csv/set.csv files. They contain current values that were acquired experimentally, not calculated. After the lookup table is generated, the values become conductance values; that's why a read voltage is required in the create_lookup_table.py example: (conductance) = (current) / (voltage).
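As an illustration of that conversion, a minimal Python sketch (the one-current-value-per-row CSV layout and the 0.1 V read voltage are my assumptions, not CrossSim's documented format):

import csv

READ_VOLTAGE = 0.1  # volts; an assumed read voltage below Vth

def currents_to_conductances(path):
    """Read one measured current (in amps) per row and convert to siemens."""
    conductances = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            conductances.append(float(row[0]) / READ_VOLTAGE)  # G = I / V
    return conductances

g_set = currents_to_conductances("set.csv")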
The lookup table is there so that experimental device data can be verified once the memristors move to hardware. If you only want to simulate algorithmically, you don't need a lookup table; you can do that by adding the following line:
params.numeric_params.update_model = "ANALYTIC"
I hope this is helpful to you. :)

Trying to retrieve the road speed limit from the driver's current location

I am trying to retrieve the current speed limit of a given road. Location data is passed to the OpenStreetMap Overpass API every 2-3 seconds while the vehicle is in motion.
I have tried the code below using [around], however it only outputs results when the radius is at least 2500 metres.
I am looking for an area closer to 10 metres, to determine the speed limit at the vehicle's current location.
If there is an alternative endpoint that achieves the above more robustly and responsively, please advise.
I basically want to replicate what I see on navigation apps, where the speed limit value updates within 1-2 seconds of the vehicle passing a speed sign whose value differs from the previous speed zone.
I would appreciate any assistance to resolve this issue.
https://lz4.overpass-api.de/api/interpreter?
data=[out:json];node(around:2500,lat,lon)[maxspeed];out%20meta;
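One likely reason the small radius returns nothing is that maxspeed is normally tagged on ways (the roads themselves), not on nodes, so a node query only matches the occasional speed-sign node. A hedged Python sketch of a way-based variant (the coordinates are placeholders and the 10 m default is untested):

import requests

OVERPASS_URL = "https://lz4.overpass-api.de/api/interpreter"

def speed_limits_near(lat, lon, radius_m=10):
    """Return the maxspeed tags of ways within radius_m of the point."""
    query = f"""
    [out:json];
    way(around:{radius_m},{lat},{lon})[maxspeed];
    out tags;
    """
    resp = requests.get(OVERPASS_URL, params={"data": query})
    resp.raise_for_status()
    return [el["tags"]["maxspeed"] for el in resp.json()["elements"]]

print(speed_limits_near(51.5074, -0.1278))  # example coordinates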

Android app dev: Finding the best way to synchronize the timestamps of two sensors

There's already a good answer on the technical details and constraints of timing the gyro measurement:
Movesense, timestamp source of imu data, and timing issues in general
However, I would like to ask a more practical question from the perspective of an Android app developer working with two sensors and a requirement for highly accurate gyro measurement timing.
What would be the most accurate way to synchronize/consolidate the timestamps from two sensors and put the measurements on the same time axis?
The sensor SW version 1.7 introduced the Time/Detailed API to check the internal timestamp and the UTC time set on the sensor device. This is how I imagine it would play out with two sensors (see the sketch after these steps):
1. Before subscribing to anything, set the UTC time (in microseconds) on sensor1 and sensor2 based on the Android device time (PUT /Time).
2. Get the "time since sensor turned on" (in milliseconds) and the "UTC time set on the sensor" (in microseconds) from both sensors (GET /Time/Detailed).
3. Calculate the difference of these two timestamps (in milliseconds), for both sensors.
4. Get the gyro values with the sensor's internal timestamp, and add the offset from step 3 to the internal timestamp to get the correct global UTC time of each sample.
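In code, the offset arithmetic of steps 2-4 would look roughly like this (a Python sketch with invented values; the unit conversions are the easy part to get wrong):

# Values as returned by GET /Time/Detailed (numbers are made up):
internal_ms = 125_000                # ms since the sensor was turned on
utc_us = 1_700_000_000_123_000       # UTC set earlier via PUT /Time, in µs

# Step 3: offset between sensor-internal time and UTC, in milliseconds.
offset_ms = utc_us // 1000 - internal_ms

# Step 4: convert a gyro sample's internal timestamp to global UTC.
gyro_internal_ms = 126_500
gyro_utc_ms = gyro_internal_ms + offset_ms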
Is this procedure correct?
Is there a more efficient or accurate way to do this? E.g. the GATT service to set the time was mentioned in the linked post as the fastest way. Anything else?
How about the possible drift in the sensor time for gyro? Are there any tricks to limit the impact of the drift afterwards? Would it make sense to get the /Time/Detailed info during longer measurements and check if the internal clock has drifted/changed compared to the UTC time?
Thanks!
Very good question!
Looking at the accuracy of the crystals (±20 ppm), the typical drift between two sensors should be no more than 40 ppm. That translates to about 0.14 seconds over an hour. For longer measurements and/or better accuracy, better synchronization is needed.
Luckily, the clock drift should stay relatively constant unless the temperature of the sensor changes rapidly. Therefore it should be enough to compare the mobile phone clock and each sensor's UTC at the beginning and end of the measurement. Any drift of each sensor should then be visible, and the timestamps can easily be compensated.
If even more accurate timestamps are needed, taking regular samples of /Time/Detailed from each sensor and comparing them to the phone clock provides a way to estimate the sensor clock drift.
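A sketch of that endpoint-based compensation, as a linear correction between an offset measured at the start and one measured at the end of the session (all numbers invented):

def correct_timestamp(t_sensor_ms, start, end):
    """Rescale a sensor timestamp using (sensor_ms, phone_ms) offset
    pairs measured at the start and end of the measurement."""
    s0, p0 = start
    s1, p1 = end
    frac = (t_sensor_ms - s0) / (s1 - s0)  # fraction of session elapsed
    return p0 + frac * (p1 - p0)           # phone-clock time at that fraction

# One-hour session in which the sensor clock ran ~144 ms fast (40 ppm):
start = (0, 0)
end = (3_600_144, 3_600_000)
print(correct_timestamp(1_800_072, start, end))  # 1800000.0, mid-session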
Full Disclosure: I work for the Movesense team

Why did my temperature sensors suddenly become unstable?

I have four DS18B20 temperature sensors connected to my Raspberry Pi, using 1-Wire and a pull-up resistor.
I read the values directly via cat from the 1-Wire devices and put the values, without any calculation, into a gnuplot data file.
The setup had been running fine for weeks, measuring a cool box at different places in a range between about 0°C and 30°C. I got quite accurate readings and plots.
But suddenly the values of all sensors started to "flutter"; they became unstable. They also dropped, all four of them, by about a quarter of a °C. The flutter is about 0.1°C to 0.2°C. Two of the sensors are actually immersed in liquid (0.5 l and 25 l), so it is virtually impossible that they suddenly drop or flutter.
The time of the event coincides with me checking the cool box. I might have shifted or touched some of the sensor wiring. But can that explain the temperature shift? What happened? How can I fix it?
It seems the cause of the problem could be the resolution being lowered. This is a (volatile) setting stored in the sensor itself. It can be set to 9, 10, 11 or 12 bits; the higher the resolution, the more precise the measurements, at the cost of a longer conversion time.
According to the DS18B20 datasheet, the resolution defaults to 12 bits after power-up. In addition, the drivers handling 1-Wire communication with the sensor often also set the highest possible resolution during startup. This could explain why rebooting fixed the problem in the OP's case, but it doesn't explain why the change in resolution happened in the first place; that likely depends on the specific setup and will have to be resolved case by case.
Furthermore, to confirm that the measurements are indeed taken at a lower resolution, one can look at the numeric sample values and check the minimum step by which they change. For example, at 12-bit resolution the minimum delta is 0.0625 degrees, while at 9-bit resolution a reading can only change in steps of 0.5 degrees, with nothing in between.
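A quick Python sketch of that check, assuming the usual Raspberry Pi sysfs path for 1-Wire thermometers:

import glob
import time

def read_millicelsius(path):
    """Parse the t= field (milli-degrees C) from a w1_slave file."""
    with open(path) as f:
        return int(f.read().rsplit("t=", 1)[1])

# First DS18B20 found under the standard 1-Wire sysfs tree.
device = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]

readings = []
for _ in range(20):
    readings.append(read_millicelsius(device))
    time.sleep(1)

deltas = [abs(a - b) for a, b in zip(readings, readings[1:]) if a != b]
# At 12 bits the step is 62.5 m°C (reported as 62 or 63); at 9 bits
# the smallest possible step is 500 m°C.
print("smallest observed step (m°C):", min(deltas) if deltas else 0)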