Below is a typical GCP pubsub model:
My question: is it possible for one subscriber (an application or job) to subscribe to multiple subscriptions? Like this:
I mean we can filter at the subscription level so that each subscription only receives one event type (A or B). I know it would be easier if we had two topics (Topic A and Topic B) and created two subscriptions, but again it boils down to the same question: is it possible for one subscriber to subscribe to multiple subscriptions?
The only alternative I can imagine is to classify the event type (A or B) at the subscriber level, but that requires the publisher to pass the attribute along to the topic.
I have control of the publisher, and I just want one subscriber instead of multiple subscribers.
Yes, an application can subscribe to multiple subscriptions. You would need to instantiate multiple subscriber client instances, one for each subscription from which you want to receive messages.
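For instance, a rough sketch in Java with the Cloud Pub/Sub client library (the project ID and subscription names are placeholders) that starts one subscriber client per subscription inside the same application:

import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;
import java.util.ArrayList;
import java.util.List;

public class MultiSubscriptionSubscriber {
  public static void main(String[] args) {
    String projectId = "my-project";  // placeholder
    List<Subscriber> subscribers = new ArrayList<>();

    // One subscriber client instance per subscription, all in the same application.
    for (String subscriptionId : List.of("subscription-a", "subscription-b")) {
      ProjectSubscriptionName subscriptionName =
          ProjectSubscriptionName.of(projectId, subscriptionId);

      MessageReceiver receiver =
          (PubsubMessage message, AckReplyConsumer consumer) -> {
            System.out.println(subscriptionId + ": " + message.getData().toStringUtf8());
            consumer.ack();
          };

      Subscriber subscriber = Subscriber.newBuilder(subscriptionName, receiver).build();
      subscriber.startAsync().awaitRunning();
      subscribers.add(subscriber);
    }

    // Keep the process alive; each client keeps streaming messages in the background.
    subscribers.forEach(Subscriber::awaitTerminated);
  }
}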
If you want the subscriber to be able to receive messages without knowing the names of all of the subscriptions, you could use push subscriptions and set the push endpoint of the different subscriptions to the same URL. Then the subscriber behind that URL would receive messages from all of those subscriptions.
Related
I'm making a social media app like Facebook using Flutter and Firebase.
I'm using Firebase Cloud Messaging to build the notification service.
I want users who joined a community or group to subscribe to their group's topic, so that I can send notifications to them using the subscribeToTopic() method. However, I don't know how to make all users in a community or group subscribe to certain topics.
If you know how to make all users subscribe to certain topics, please let me know. Thank you!
There is no API to get the current subscribers of a topic, nor is there an API that subscribes all subscribers of one topic to another topic.
If you already track group membership yourself, you can either let each client subscribe itself to the additional topic, or determine the list of device tokens for the group members and subscribe them to the topic on the server.
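For the server-side option, a rough sketch with the Firebase Admin SDK for Java (assuming the Admin SDK is already initialized and you have the members' registration tokens at hand; the topic naming convention here is just an example):

import com.google.firebase.messaging.FirebaseMessaging;
import com.google.firebase.messaging.FirebaseMessagingException;
import com.google.firebase.messaging.TopicManagementResponse;
import java.util.List;

public class GroupTopicSubscriber {

  // Subscribes every known member token of a group to that group's topic.
  public static void subscribeGroupMembers(List<String> registrationTokens, String groupId)
      throws FirebaseMessagingException {
    // Topic name derived from the group ID (placeholder convention).
    String topic = "group_" + groupId;

    // The Admin SDK accepts up to 1000 tokens per call, so batch larger groups.
    TopicManagementResponse response =
        FirebaseMessaging.getInstance().subscribeToTopic(registrationTokens, topic);

    System.out.println(response.getSuccessCount() + " tokens subscribed to " + topic);
  }
}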
In Kafka I know that I can subscribe my consumer to multiple topics, either by passing the list of topics I want to subscribe to directly, or by passing a pattern.
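For reference, those two variants look roughly like this with the Java KafkaConsumer (broker address, group id, topic names and the regex are placeholders):

import java.util.Arrays;
import java.util.Properties;
import java.util.regex.Pattern;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class StaticSubscriptionExample {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("group.id", "my-group");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);

    // Option 1: subscribe to an explicit list of topics.
    consumer.subscribe(Arrays.asList("topic1", "topic2"));

    // Option 2 (an alternative; a later subscribe() call replaces the earlier subscription):
    // subscribe to every topic whose name matches a pattern. Older client versions
    // require a ConsumerRebalanceListener argument with this overload.
    consumer.subscribe(Pattern.compile("my-prefix-.*"));
  }
}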
In my case I cannot pass a list of topics or a pattern directly, because the list of topics is not known in advance.
For example, a user might specify one topic in the morning through a UI and a second topic in the evening, so the consumer ends up subscribed to these two topics.
Is it possible to do this?
You can make use of the subscribe and unsubscribe APIs of the KafkaConsumer to dynamically change your consumer's subscriptions.
If you send a message to one of the subscribed topics with key=subscribe or key=unsubscribe and value=topic1/topic2/..., you could implement logic in your consumer that changes its subscriptions based on those messages.
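A minimal sketch of that idea in Java; the control topic name, the subscribe/unsubscribe keys and the slash-separated value format are assumptions of this answer, not an existing Kafka feature:

import java.time.Duration;
import java.util.HashSet;
import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class DynamicSubscriptionConsumer {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("group.id", "dynamic-consumer");
    props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
    props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

    KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);

    // The consumer always listens on a "control" topic plus whatever data topics
    // have been requested so far.
    Set<String> dataTopics = new HashSet<>();
    Set<String> subscription = new HashSet<>();
    subscription.add("control");
    consumer.subscribe(subscription);

    while (true) {
      ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
      boolean subscriptionChanged = false;

      for (ConsumerRecord<String, String> record : records) {
        if ("control".equals(record.topic())) {
          // Control messages: key = "subscribe" or "unsubscribe", value = "topic1/topic2/..."
          for (String topic : record.value().split("/")) {
            if ("subscribe".equals(record.key())) {
              subscriptionChanged |= dataTopics.add(topic);
            } else if ("unsubscribe".equals(record.key())) {
              subscriptionChanged |= dataTopics.remove(topic);
            }
          }
        } else {
          System.out.println(record.topic() + ": " + record.value());  // normal processing
        }
      }

      if (subscriptionChanged) {
        // Re-subscribe with the updated topic set; the control topic is always included.
        Set<String> newSubscription = new HashSet<>(dataTopics);
        newSubscription.add("control");
        consumer.subscribe(newSubscription);
      }
    }
  }
}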
Let's suppose a simplified scenario like this:
There are two Kafka topics, users and orders, and three microservices: user-service, order-service and shipping-service.
When an order is placed through the order service, an OrderCreated event is added to the orders topic and consumed by the shipping service. This service needs the user information to ship the order. According to my requirements I can't make a REST call to user-service, but must use a stateful approach instead. That is to say, the shipping service is a Kafka Streams application that listens to the users topic and maintains a KTable backed by a local store with the full user table information. Thus, when processing the order it already has the user information available locally.
However, one concern with this approach is the consistency of the local user information in the shipping service, e.g.:
A user updates their shipping address in the user-service, which updates its local SQL database and publishes a UserUpdated event to the users topic with this change.
The user places an order, so the order-service publishes it to the orders topic.
For whatever reason, the shipping service could process the OrderCreated event from the orders topic before reading the UserUpdated event from the users topic, so it would use an address that is no longer valid.
How could I guarantee that the shipping service always has up-to-date user information in this event-carried state transfer scenario?
If you need ordering guarantees, you would need to write both the user information update and the order into the same topic (and, in particular, into the same partition), because Kafka only guarantees order within a single partition.
You could call this topic "user_action", with a unique user-id as the key (both a user information update and a user order are user actions). In your case, all three services would consume the "user_action" topic. While the user-service only considers user updates and the order-service only considers orders, the shipping service considers both.
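A rough sketch of the producing side in Java (the payloads and IDs are placeholders, and in reality the two events would come from different services): because both records use the same user-id key, they land in the same partition of user_action and are therefore consumed in the order they were written.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class UserActionProducer {
  public static void main(String[] args) {
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
      String userId = "user-42";  // same key => same partition => ordered per user

      // user-service publishes the address change ...
      producer.send(new ProducerRecord<>("user_action", userId,
          "{\"type\":\"UserUpdated\",\"address\":\"New Street 1\"}"));

      // ... and order-service later publishes the order with the same key,
      // so the shipping service reads the address update before the order.
      producer.send(new ProducerRecord<>("user_action", userId,
          "{\"type\":\"OrderCreated\",\"orderId\":\"o-123\"}"));

      producer.flush();
    }
  }
}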
This blog post might help, too: https://www.confluent.io/blog/put-several-event-types-kafka-topic/
Problem:
We are trying to build a chat application using AWS AppSync, and we want to achieve the best performance, but we're facing a problem with real-time subscriptions in AppSync and GraphQL: in some cases a single user will need to handle hundreds of subscriptions, which we think is not the best solution. What do you suggest?
Problem Example:
type Mutation {
  addMessage(conversation_id: Int!, content: String!): Message
}

type Subscription {
  subscribeForNewMessages(conversation_id: Int!): Message
    @aws_subscribe(mutations: ["addMessage"])
}
The problem with this design is that the user needs to invoke this subscription and keep listening for every single conversation, which we expect to overwhelm the client if the number of conversations is huge.
Questions:
Q1:
What we are striving to achieve is one subscription for multiple conversation_ids. How would this be possible?
These folks (https://github.com/apollographql/apollo-client/issues/2633) are talking about something similar; we tested it and it doesn't work. Is it a valid solution?
Q2:
Regarding Amplify: will Amplify perform well when listening to hundreds of subscriptions simultaneously? Does it do some sort of merging of subscriptions and WebSockets, or does it handle them separately?
Q3:
What are your comments on this design, where there would be a service that broadcasts the messages to chat participants (by invoking mutations with the clients' IDs), and each client subscribes to only a single channel? Like the following:
src2 : AWS AppSync for chatting application
src2 : Subscribe to a List of Group / Private Chats in AWS AppSync
Q1/Q2
You'll have to make multiple subscriptions, and the AWS iOS/Android/Amplify SDKs can handle the subscription handshake protocols for real-time updates to data.
Take a look here
Q3
I recommend allowing clients to subscribe to specific channels (even if that means multiple subscriptions), so that the filtering logic can be done in the service rather than on the client side. This reduces client-side code, which also means you don't have to worry as much about maintenance or scalability.
Setup:
I have set up a pubsub service in which the publishers publish geolocation data at regular intervals.
The subscribers receive the location data of the publishers.
The subscribers are not presence-subscribed, in the sense that the subscribers are not in the publishers' rosters.
Problem:
The subscribers need to know the presence status of publishers.
Is there a way for the subscribers to know the presence status of publishers?
No, since there is no direct relationship between subscribers and publishers, which is typical of any pubsub design. To accomplish this, the subscribers would need to know who the publishers are, which is not good generic pubsub design.
It sounds like what you actually want is PEP (Personal Eventing Protocol), which is a subset of pubsub. In this case, the subscribers subscribe to nodes belonging to the actual user they are interested in. If they are subscribed to the user's presence, they automatically have access to the user's nodes.
NOTE: I have recently found out that the newer version of the spec does in fact support an attribute that identifies the publisher. This makes it feasible to get their presence, but you would still have to subscribe to or query for it.