LwM2M: Does Leshan provide persistence? - eclipse

I am wondering if the Leshan server can provide persistence.
Let's say I want to store the information events my device reports for some time.
Can I do that using Leshan, or do I have to implement it myself?
Thanks.

Persisting the received data is not in Leshan's scope. You need to do it yourself, by listening for the incoming data and putting it in a store.

Related

In socket.io, can the listener act as the emitter at the same time?

In my problem, I need to send some data from several devices to a centralized server, and then view the received data in a UI served from that centralized server.
I have sent data from a client to the server using socket.io and it worked. Now I want to send the data from the server on to a UI. Can I use sockets for that? If so, is it possible, and is it good practice, to code the server so that it acts as the listener and the emitter at the same time? If this approach is not good, please provide your suggestions.
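A single socket.io server routinely does both: it listens for events on one socket and emits to other sockets in the same handler. A minimal sketch in TypeScript with socket.io v4 (the "device_data" event and the "ui" room are illustrative names, not from the question):

    // One server: listens to device sockets and re-emits to UI sockets.
    import { Server } from "socket.io";

    const io = new Server(3000, { cors: { origin: "*" } });

    io.on("connection", (socket) => {
      // UI clients announce themselves and join a room
      socket.on("register_ui", () => socket.join("ui"));

      // Devices push their data; the server forwards it to every UI client
      socket.on("device_data", (payload) => {
        io.to("ui").emit("device_data", payload);
      });
    });

For a single centralized server this is a normal pattern, since every socket.io server is both a listener and an emitter by design.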

pub/sub pattern for a socket connection

I am developing an app with a C# client and a Go server. Now I would like to implement real-time update functionality, such as viewing a user's profile and receiving their data in real time, so that if that user changes it, the view updates without a manual reload.
From what I have been researching for this type of app, Redis is usually used with a publisher/subscriber pattern, but I have not found anything on how to implement this in an app that keeps its connection to the server open through sockets...
Since the connection is already bidirectional, could this real-time functionality be implemented in some other way?
If anyone knows anything about this, I would appreciate any information.
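For illustration only, here is roughly the shape of the pub/sub pattern being researched, sketched in TypeScript with ioredis (the asker's stack is Go/C#, and the same pattern applies with a Go Redis client): every server instance subscribes to a channel, profile writes publish to it, and each instance then pushes the update down its own already-open client sockets. The channel name and payload shape are assumptions.

    import Redis from "ioredis";

    const sub = new Redis(); // a connection in subscribe mode can't issue other commands,
    const pub = new Redis(); // so keep a separate connection for publishing

    async function main() {
      await sub.subscribe("profile:updates");

      sub.on("message", (_channel, message) => {
        const update = JSON.parse(message);
        // Here the server would look up the open sockets of clients currently
        // viewing update.userId and write the fresh data to them over the
        // existing bidirectional connection.
        console.log("push to watchers of user", update.userId, update);
      });

      // Wherever a profile edit is handled, publish the change:
      await pub.publish("profile:updates", JSON.stringify({ userId: 42, name: "new name" }));
    }

    main().catch(console.error);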

Flutter: How to update data in real time?

I am using REST APIs to get data, but I want it to update in real time, and I can't use Firebase for this. Please suggest a better way; calling the API again and again is not a good way to achieve it.
For real time you need either a real-time database with an SDK you can subscribe to (like Firebase), or you can use socket.io to stream your data from a standard backend; socket.io is a library that abstracts WebSocket connections.
If you use Node for the backend, I suggest reading this topic on NestJS gateways: https://docs.nestjs.com/websockets/gateways
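If you do go with socket.io on a Node backend, the linked NestJS gateways are essentially a thin wrapper around a socket.io server. A minimal sketch of such a gateway (the event names, room naming and publishUpdate helper are my own illustrative choices; the gateway still has to be registered as a provider in a Nest module, and the Flutter app would connect with a socket.io client package for Dart):

    import { WebSocketGateway, WebSocketServer, SubscribeMessage, MessageBody, ConnectedSocket } from "@nestjs/websockets";
    import { Server, Socket } from "socket.io";

    @WebSocketGateway({ cors: true })
    export class DataGateway {
      @WebSocketServer()
      server: Server;

      // The app asks to watch a resource instead of polling the REST API
      @SubscribeMessage("watch")
      handleWatch(@MessageBody() resourceId: string, @ConnectedSocket() client: Socket) {
        client.join(`resource:${resourceId}`);
      }

      // Call this from your service whenever the resource changes (e.g. after a REST write)
      publishUpdate(resourceId: string, data: unknown) {
        this.server.to(`resource:${resourceId}`).emit("update", data);
      }
    }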

redis - Should I use Redis to store chat messages?

So I am currently working on a chat, and I wonder if I could use Redis to store the chat messages. The messages will only live on the web, and I want a history of at least 20 messages for each private chat. The chat subscribers are already stored in MongoDB.
I mainly want to use Redis because it lets me skip the MongoDB layer, for more speed.
I already use Pub/Sub, but what about storing a copy in Redis Lists? Also, what about read statuses, how could I implement those?
Redis only loses data in the case of a power outage; if the system is shut down properly, it will save its data, and in that case data won't be lost.
It is a good approach to dump data from Redis to MongoDB/any other DB when a size limit is reached, or on a date basis (weekly or monthly), so that your real-time chat database stays lightweight.
Many modern systems nowadays prepare for power outages: a UPS will run and the system will shut down properly.
See: https://hackernoon.com/how-to-shutdown-your-servers-in-case-of-power-failure-ups-nut-co-34d22a08e92
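For the Redis Lists part of the question, a minimal sketch of keeping each private chat trimmed to its 20 most recent messages, alongside the Pub/Sub publish you already have (TypeScript with ioredis; the key names and message shape are illustrative):

    import Redis from "ioredis";

    const redis = new Redis();
    const HISTORY_LENGTH = 20;

    async function storeMessage(chatId: string, message: { from: string; text: string; sentAt: number }) {
      const key = `chat:${chatId}:history`;
      const payload = JSON.stringify(message);

      await redis
        .multi()
        .lpush(key, payload)                // newest message at the head of the list
        .ltrim(key, 0, HISTORY_LENGTH - 1)  // keep only the 20 most recent
        .publish(`chat:${chatId}`, payload) // the existing Pub/Sub fan-out
        .exec();
    }

    async function loadHistory(chatId: string) {
      const raw = await redis.lrange(`chat:${chatId}:history`, 0, HISTORY_LENGTH - 1);
      return raw.map((m) => JSON.parse(m)).reverse(); // oldest first for display
    }

Anything older than the trimmed window is what you would archive to MongoDB (or drop), per the size-limit/date-based dump suggested above.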
Also, what about read statuses, how could I implement those?
That depends on the protocol you are implementing; if you are using XMPP, see this.
Otherwise, you can use a property in the message model, e.g. "DeliveryStatus", and set it to one of your enum values (1. Sent, 2. Delivered, 3. Read). Mark a message as Sent as soon as it is received at the server. For Delivered and Read, your clients will send back packets indicating that the respective action has occurred.
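As an illustration of that model, a message shape with such an enum might look like this in TypeScript (the field names and the applyAck helper are assumptions, not from the answer):

    enum DeliveryStatus {
      Sent = 1,      // set by the server as soon as the message arrives
      Delivered = 2, // set when the recipient's client acknowledges receipt
      Read = 3,      // set when the recipient's client reports the message was opened
    }

    interface ChatMessage {
      id: string;
      chatId: string;
      from: string;
      text: string;
      sentAt: number;
      status: DeliveryStatus;
    }

    // Server-side handling of the acknowledgement packets clients send back
    function applyAck(message: ChatMessage, ack: "delivered" | "read"): ChatMessage {
      const next = ack === "read" ? DeliveryStatus.Read : DeliveryStatus.Delivered;
      // Statuses only move forward: a Read message never drops back to Delivered
      return next > message.status ? { ...message, status: next } : message;
    }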
As pointed out in the comment above, the important thing to consider here is the persistence model. Redis offers some persistence (with snapshots and AOF files). The important thing is to first understand what you need:
Can you afford to lose all the data? Can you afford to lose some of it? If the answer is no, then perhaps you should not bother with Redis.

Advice - Real time data processing from client to server

I am looking for advice/guidance on how to achieve the following:
I have a circuit mounted and connected to an Arduino and I am able to easily retrieve data from it, using Python and the pySerial module. It allows me to determine the value of an analog input over time.
At the moment I am storing that data in a file, with a timestamp and the corresponding value, and I would love to hear opinions and thoughts on how I could 'share' this data with a web server and 'play' it live.
Is it possible to 'stream' the values into the dump file and read from it at the same time through an AJAX request, or should I look into event-driven web servers like Tornado or Twisted?
I am a bit lost here. Just for the record, I am comfortable with PHP and JavaScript for the final output; I just don't have a clue how to constantly 'stream' the data I need.
Thanks in advance.
If you don't plan to update the Arduino device too much, then it would make sense to have the Python component keep collecting the data over the serial port and publish it in a way that can easily be consumed by a service that distributes the information in a more efficient, and probably more flexible, manner.
e.g.
1. Read the data from the serial port and publish messages onto a message queue. The message queue can then be read by any other component, and the data distributed to other applications/clients.
2. Make a web call to a server that can process each update and distribute it to other applications/clients (a minimal sketch of this option follows the data flow below).
You could use something like Pusher (who I work for) and make a call to the REST API to deliver each message to any connected clients. Whilst this is a good way of distributing your data, you would be publishing it even if no clients are listening, so I think you are best off getting the data to a component like a web server first.
Assuming you go with option 1 or 2, you can then use a realtime web solution to distribute the data to any number of clients. You could use Pusher here, or a self-hosted solution.
So, the data flow as I see it would be:
Arduino -> small Python app -> Queue (or HTTP request to web server) -> Realtime Web Technology -> Many clients
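A minimal sketch of option 2's receiving end, in TypeScript with Express and socket.io (the /reading endpoint, the "reading" event name and the field names are illustrative): the small Python app POSTs each timestamp/value pair, and the server broadcasts it to every connected browser.

    import express from "express";
    import { createServer } from "http";
    import { Server } from "socket.io";

    const app = express();
    app.use(express.json());

    const httpServer = createServer(app);
    const io = new Server(httpServer, { cors: { origin: "*" } });

    // The Python collector would POST JSON like {"timestamp": ..., "value": ...}
    app.post("/reading", (req, res) => {
      const { timestamp, value } = req.body;    // the same fields currently written to the file
      io.emit("reading", { timestamp, value }); // push to every connected UI client
      res.sendStatus(204);
    });

    httpServer.listen(3000, () => console.log("listening on :3000"));

The browser side then just listens for the "reading" event and plots each value as it arrives, instead of polling a dump file with AJAX.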