Why does a queue use another data structure, like a linked list?

Even though a queue is already a data structure, its implementation is still done with an array or a linked list. Why can't we use only the queue itself?
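For illustration, here is a minimal Java sketch of that split: Queue only names the FIFO behaviour (an abstract data type), and a concrete structure such as a linked list has to supply the actual storage.

import java.util.LinkedList;
import java.util.Queue;

public class QueueDemo {
    public static void main(String[] args) {
        // Queue defines WHAT the operations are (offer, poll, peek);
        // LinkedList defines HOW the elements are actually stored.
        Queue<String> queue = new LinkedList<>();
        queue.offer("first");
        queue.offer("second");
        System.out.println(queue.poll()); // prints "first": FIFO order
    }
}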

What if many Kafka streams update a domain model (a.k.a. materialized view)?

I have a materialized view that is updated from many streams. Each one enriches it partially. Order doesn't matter, and updates arrive at unspecified times. Is the following algorithm a good approach:
An update arrives; I check via get() what is stored in the materialized view, see that this is the initial one, so I enrich and save.
A second update arrives, and get() shows that a partial update exists, so I add the next piece of information.
... and I continue in the same style.
If there is a query/join, the stored object has a method, isValid(), that indicates whether the update is complete yet; this could be used in KafkaStreams#filter().
Could you please tell me whether this is a good plan? Is there a pattern in the Kafka Streams world that handles this case? Please advise.
Your plan looks good; you have the general idea, but you'll have to use the lower-level Kafka Streams API: the Processor API.
There is a .transform() operator that allows you to access a KeyValueStore. Inside the implementation of this operation you are free to decide whether your current aggregated value is valid or not, and therefore either send it downstream or return null while waiting for more information.
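A minimal sketch of that approach, assuming a ViewUpdate value type with mergeWith() and isValid() methods and a viewSerde for it (all placeholders, not from the question):

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.Stores;

StreamsBuilder builder = new StreamsBuilder();

// State store holding the partially aggregated view per key.
builder.addStateStore(Stores.keyValueStoreBuilder(
        Stores.persistentKeyValueStore("view-store"),
        Serdes.String(), viewSerde));

builder.<String, ViewUpdate>stream("updates")
    .transform(() -> new Transformer<String, ViewUpdate, KeyValue<String, ViewUpdate>>() {
        private KeyValueStore<String, ViewUpdate> store;

        @Override
        @SuppressWarnings("unchecked")
        public void init(ProcessorContext context) {
            store = (KeyValueStore<String, ViewUpdate>) context.getStateStore("view-store");
        }

        @Override
        public KeyValue<String, ViewUpdate> transform(String key, ViewUpdate update) {
            ViewUpdate current = store.get(key);
            ViewUpdate merged = (current == null) ? update : current.mergeWith(update);
            store.put(key, merged);
            // Send downstream only once the view is complete; otherwise
            // return null and keep waiting for more partial updates.
            return merged.isValid() ? KeyValue.pair(key, merged) : null;
        }

        @Override
        public void close() { }
    }, "view-store")
    .to("materialized-view");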

What's the best practice for the object pool pattern in Flutter/Dart?

Imagine a very simple application with two pages, PostList and PostDetail. On the former page, we show a list of posts, and on the latter, the details of a single post.
Now consider the following scenario. Our user clicks on the first PostItem and navigates to the PostDetail page. We fetch the full post data from the server. The likes_count of this post gets increased by one. Now if our user navigates back to the first page, the PostItem should be updated and show the new likes_count as well.
One possible solution to handle this is to create a pool of posts. Now when we fetch some new data from the server, instead of creating a new post object, we can update our corresponding pool instance object. For example, if we navigate to post with id=3, we can do something like this:
Post oldPost = PostPool.get(3); // look up the pooled instance by id
oldPost.updateFromJson(servers_new_json_for_post_3); // mutate it in place with the fresh server data
Since the same object is used on the PostDetail page, our PostItem on the PostList page will be updated as well.
Other approaches, which do not use a unique "single instance" of our Post objects across the application, would not be clean to implement and would require tricks to keep the UI in sync.
But the ObjectPool approach has its own problems and leads to memory leaks, since the size of the pool grows over time. To solve this, we would need to manually count the number of references to each pool instance and discard an instance when its count reaches zero. This manual bookkeeping is bug-prone, and I was wondering whether there are better ways to achieve this.
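To make that bookkeeping concrete, here is a rough illustrative sketch (in Java) of the manual reference counting described above; it is not a recommendation, and every acquire() must be paired with a release() or the leak returns:

import java.util.HashMap;
import java.util.Map;

class RefCountedPool<T> {
    private final Map<Integer, T> items = new HashMap<>();
    private final Map<Integer, Integer> refCounts = new HashMap<>();

    void put(int id, T item) {
        items.put(id, item);
    }

    T acquire(int id) {
        refCounts.merge(id, 1, Integer::sum); // one more holder of this instance
        return items.get(id);
    }

    void release(int id) {
        // Drop the instance once the last holder lets go. Forgetting a single
        // release() call is exactly the kind of bug the question worries about.
        if (refCounts.merge(id, -1, Integer::sum) <= 0) {
            refCounts.remove(id);
            items.remove(id);
        }
    }
}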
You can also solve this by using streams and StreamBuilders. First you create the stream and populate it with the initial data fetched from the API.
I like to use BehaviorSubject from rxdart, but you can use normal streams too.
final BehaviorSubject<List<Post>> postsStream = BehaviorSubject<List<Post>>.seeded(<Post>[]); // seeded() replays the empty list to every new listener right away
In the constructor body or in the initState function you would fetch the data and add it to the stream:
PostPage() {
  // Push the fetched posts to every current and future subscriber.
  _fetchPosts().then((posts) => postsStream.add(posts));
}
You can now subscribe to changes on this postsStream in both pages with StreamBuilder. For any update you need to make, you emit a new (updated) List<Post> to the stream, which triggers a rebuild in every StreamBuilder subscribed to it with the new values.
You can later dispose of the StreamController when it is no longer needed.

Parsing Kafka messages

My question is short and clear. I would like to parse JSON data coming from a Kafka topic, so my application will run as a Kafka consumer. I am only interested in some part of the JSON data. Do I need to process this data using a library, for example Apache Flink? Afterwards I will send the data somewhere else.
In the beginning you say "filter data", so it looks like you need a RecordFilterStrategy injected into the AbstractKafkaListenerContainerFactory. See the documentation on this matter: https://docs.spring.io/spring-kafka/docs/current/reference/html/#filtering-messages
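As a hedged sketch, wiring a RecordFilterStrategy into the container factory could look like this (the field name "orderId" is an assumption):

import org.springframework.context.annotation.Bean;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;

@Bean
public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
        ConsumerFactory<String, String> consumerFactory) {
    ConcurrentKafkaListenerContainerFactory<String, String> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // Returning true discards the record before it ever reaches the @KafkaListener.
    factory.setRecordFilterStrategy(record -> !record.value().contains("\"orderId\""));
    return factory;
}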
Then you say you are "interested in some part in JSON". That doesn't sound like record filtering; it sounds more like data projection. For that you can use a ProjectingMessageConverter to slice the data via a ProjectionFactory. See their JavaDocs for more info.
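For illustration, a projection is typically an interface whose getters point at the JSON paths you care about; the field names below are assumptions:

import org.springframework.data.web.JsonPath;

public interface OrderSummary {
    // Only these two paths are extracted from the full JSON payload.
    @JsonPath("$.orderId")
    String getOrderId();

    @JsonPath("$.customer.name")
    String getCustomerName();
}

A @KafkaListener method can then declare OrderSummary as its payload type once the ProjectingMessageConverter is configured on the listener container factory.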

Which way should I create a list of objects when sending a POST request?

When I create several objects of the same type and save them to the database, should I send a list of those objects in one request, or should I send each one individually?
For example, I would like to create a todo list. I can create multiple todos and then click save to send the whole list, or I can save each todo directly as soon as I finish editing it.
The first way reduces the number of requests: only one request is needed to create many objects. But is it RESTful? Everything I have read about creation in REST deals with creating a single object. On the other hand, will there be problems if the number of requests increases?
----Edit
Thank you guys for answering me.
For a more specific use case: I am using Django REST Framework. I created a Todo model and a corresponding serializer. I am wondering how I could create a list of Todos. I tried to send a list of Todos to the serializer, expecting it to loop through them automatically, the same way it does when serializing a list of instances, but that doesn't work. I know I could write a loop that calls the create method each time, but is there a better way to do it?
There is nothing in REST that tells you what kind of payload you are allowed to use. You can POST/PUT whatever you want: one entity representation or many representations, in lists, dictionaries, XML, URL-encoded key/values or JSON, whatever suits your use case best.
In your case you might even want to send a delta/diff list of changes from the client. Let's say, for instance, that your client loads three existing todo items. The user then modifies one of them, deletes another, and adds a new one. You can either do that in three requests or in one single request with add/modify/delete operations encoded in it. Both ways are valid, and the best solution depends on your use case and constraints such as bandwidth, processing power, and network round-trip time.
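To make the single-request variant concrete, here is one hypothetical shape for such a batch payload, sketched as Java records (the operation names are illustrative, not part of any REST standard, and the same idea carries over to any language):

import java.util.List;

// One entry per change; "op" is one of "add", "modify" or "delete".
record TodoOperation(String op, Long id, String title) { }

// The whole delta the client sends in a single POST body.
record TodoBatchRequest(List<TodoOperation> operations) { }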

Remove read data for authenticated user?

My requirement in DDS is this: I have many subscribers but a single publisher. Each subscriber reads data from DDS and checks whether the message is meant for that particular subscriber. Only if the check succeeds does it take the data and remove it from DDS. The message must remain in DDS until the authenticated subscriber takes its data. How can I achieve this using DDS (in a Java environment)?
First of all, you should be aware that with DDS, a Subscriber is never able to remove data from the global data space. Every Subscriber has its own cached copy of the distributed data and can only act on that copy. If one Subscriber takes data, then other Subscribers for the same Topic will not be influenced by that in any way. Only Publishers can remove data globally for every Subscriber. From your question, it is not clear whether you know this.
Independent of that, it seems like the use of a ContentFilteredTopic (CFT) is suitable here. According to the description, the Subscriber knows the file name that it is looking for. With a CFT, the Subscriber can indicate that it is only interested in samples that have a particular value for the file_name attribute. The infrastructure will take care of the filtering process and will ensure that the Subscriber will not receive any data with a different value for the attribute file_name. As a consequence, any take() action done on the DataReader will contain relevant information and there is no need to check the data first and then take it.
The API documentation should contain more detailed information about how to use a ContentFilteredTopic.
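A rough sketch in Java, following the RTI Connext style of the DDS API (other vendors differ slightly; the topic, the generated FileDataReader type, and the file name are all assumptions):

import com.rti.dds.infrastructure.StatusKind;
import com.rti.dds.infrastructure.StringSeq;
import com.rti.dds.subscription.Subscriber;
import com.rti.dds.topic.ContentFilteredTopic;

StringSeq params = new StringSeq();
params.add("'report.txt'"); // only samples whose file_name equals this value pass

ContentFilteredTopic filtered = participant.create_contentfilteredtopic(
        "FilteredFileTopic",   // name of the new filtered topic
        fileTopic,             // the existing Topic being filtered
        "file_name = %0",      // SQL-like DDS filter expression
        params);               // parameters substituted for %0, %1, ...

// Readers created on the filtered topic never see non-matching samples,
// so every take() already returns only relevant data.
FileDataReader reader = (FileDataReader) subscriber.create_datareader(
        filtered,
        Subscriber.DATAREADER_QOS_DEFAULT,
        null,                  // no listener
        StatusKind.STATUS_MASK_NONE);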