Recently, I was tasked with building a simple chat app for iOS, using Swift. So, I have a Parse Server ready and running! All I want to know is how to use triggers.
Let's say I have opened a conversation and I just received a new message. How can I get it without constantly checking for new messages? I saw that Cloud Code is probably the way to go, but if so, is it practical? I mean, if I have 5000 users and they are constantly chatting, will it perform well?
Thanks in advance!
You want to use the Parse LiveQuery component.
Add LiveQuery to your server's config:
let api = new ParseServer({
  ...,
  liveQuery: {
    classNames: ['Test', 'TestAgain']
  }
});

// Initialize a LiveQuery server instance; app is the express app of your Parse Server
let httpServer = require('http').createServer(app);
httpServer.listen(port);
var parseLiveQueryServer = ParseServer.createLiveQueryServer(httpServer);
Install the Parse LiveQuery library as a pod in your project (pod 'ParseLiveQuery').
Subscribe to events:
let myQuery = Message.query()!.where("user", equalTo: PFUser.currentUser()!)
let subscription: Subscription<Message> = myQuery.subscribe()
Handle events:
subscription.handleEvent { query, event in
    // Handle event
    // This callback gets called every time an object is created, updated, deleted etc.
}
I want to delete all of a user's inserts in a collection when they stop watching a change stream from a React client. I'm using the Realm Web SDK for this.
Here's a summary of my code with what I want to do at the end of it:
import * as Realm from "realm-web";

const realmApp: Realm.App = new Realm.App({ id: realmAppId });
const credentials = Realm.Credentials.anonymous();
const user: Realm.User = await realmApp.logIn(credentials);
const mongodb = realmApp?.currentUser?.mongoClient("mongodb-atlas");
const users = mongodb?.db("users").collection("users");
const changeStream = users.watch();

for await (const change of changeStream) {
  switch (change.operationType) {
    case "insert": {
      ...
      break;
    }
    case ...
  }
}
// This pseudo-code shows what I want to do
changeStream.on("close", () => // delete all user's inserts)
changeStream.on("timeout", () => // delete all user's inserts)
changeStream.on("user closes app thus also closing stream", () => ... )
Realm Web SDK patterns seem rather different from the Node.js ones and do not seem to include a method for closing a stream or for running a callback when it closes. In any case, they don't fit my use case.
These MongoDB Realm Web docs lead to more docs about Realm. Unless I'm missing it, neither set talks about how to monitor for closing and timing out of a change stream watcher instantiated from the Realm Web SDK, or how to do something when that happens.
I thought another way to do this would be with Realm Triggers, but that doesn't seem possible from their docs.
Can this even be done from a front end client? Is there a way to do this on MongoDB itself in a "serverless" way?
If you want to delete the inserts specifically when a (client-side) listener of a change stream stops listening, you have to implement some logic on the client side. There is currently no way to get notified of such an event within MongoDB Realm.
Since a watcher could be closed because the app / browser is closed, I would recommend against running the deletion logic on your client. Instead, notify a server (or call a MongoDB Realm function / HTTP endpoint) to make the deletions.
You can use the Beacon API to reliably send a request to trigger the delete, even when the window unloads.
Client side
const inserts = [];

for await (const change of changeStream) {
  switch (change.operationType) {
    case 'insert': inserts.push(change);
  }
}

// This point is only reached if the generator returns / the stream closes
navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));

// Might also add a handler to catch users closing the app.
window.addEventListener('unload', () => {
  navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));
});
Note that the unload event is not reliable (see MDN), but there are some alternatives which may be good enough for your use case.
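For example, a rough sketch of one such alternative, using the visibilitychange event instead of unload (the endpoint URL is the same placeholder as above):

// Fire the beacon whenever the page becomes hidden (tab switch, browser/app close on mobile, etc.).
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden') {
    navigator.sendBeacon('url/to/endpoint', JSON.stringify(inserts));
  }
});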
Inside a Realm function you could then delete the documents.
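A minimal sketch of what such a function might look like when it backs the HTTP endpoint the beacon posts to, assuming the linked cluster service is named "mongodb-atlas" and the db/collection are the same as in your client snippet; the payload handling depends on how the endpoint is configured, so treat this as a starting point:

exports = async function(payload, response) {
  // The beacon body is the JSON array of insert change events collected on the client.
  const inserts = JSON.parse(payload.body.text());

  // Each insert change event carries the inserted document's _id in documentKey.
  // Depending on how the ids were serialized, you may need to rewrap them, e.g. new BSON.ObjectId(id).
  const ids = inserts.map((change) => change.documentKey._id);

  const collection = context.services
    .get("mongodb-atlas")
    .db("users")
    .collection("users");

  return collection.deleteMany({ _id: { $in: ids } });
};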
That being said, maybe there is a better way to do what you want to achieve. Is it really the timeout of the change stream listener that has to trigger the delete, or some other user event?
I am using the following:
Apollo iOS
Neo4j Database (with the GraphQL plugin)
GraphQL
Singleton Class for Connection
My code below is to connect to the GraphQL endpoint which happens to be my Neo4j Database.
class Network {
    static let shared = Network()

    private(set) lazy var apollo: ApolloClient = {
        let url = URL(string: "http://localhost:7474/graphql/")!
        let keychain = KeychainSwift()
        let configuration = URLSessionConfiguration.ephemeral
        configuration.httpAdditionalHeaders = ["Authorization": "\(String(describing: keychain.get("neo.auth")))"]
        return ApolloClient(
            networkTransport: HTTPNetworkTransport(url: url, session: URLSession(configuration: configuration))
        )
    }()
}
When I create data inside my database via my app, this works great. However, if I then query the database to return all the data, the newly created data is not found. The data is in the database, because I can see it in the browser, and also because if I restart my app and run the exact same code to return all the data, it returns it.
The other thing I have tried is running two instances of my app side by side in the simulator. Both apps have the same features and can import/export data. When I create new data via one instance, the data is created in the database successfully; however, upon importing the data in the other instance of the app, it returns nothing (both apps being run at the same time).
My Import Code
func importData() {
    let apollo = Network.shared.apollo
    // Import all data from the graph database
    apollo.fetch(query: GetAllQuery()) { result in
        guard let data = try? result.get().data else { return }
        print(data.jsonObject.values)
    }
}
The only thing I can think of is that the session is not updating when new data is created in the database. The reason I think this is that if I relaunch the app and run my import function, it actually returns all the new data. I need the connection to update when new data is created. Is there a way I can refresh the connection upon data creation?
I found a fix that is working so far for me:
apollo.clearCache()
I am using this before I run a query to the database, and so far it has pulled down all the correct data. My issue was that when I exported data to the database and then tried importing it (or reading it) directly afterwards, it never returned anything, or returned only data that already existed in the system.
I'm using a .NET Core app with a PostgreSQL database (with Npgsql), combined with SignalR to receive real-time data and the latest data entries. However, I am not receiving the latest entry, and sometimes the Clients.All.SendAsync method sends more than one entry to the client. Here is my code:
Hub method that sends new data to the client:
public async Task SendForexAsync(string name)
{
    var product = GetForex(name);
    await Clients.All.SendAsync("CurrentData", product);

    using (var conn = new NpgsqlConnection(ApplicationDbContext.GetConnectionString()))
    {
        conn.Open();
        var cmd = new NpgsqlCommand("LISTEN new_forex", conn).ExecuteNonQuery();

        conn.Notification += async (o, e) =>
        {
            var newProduct = GetForex(name);
            await Clients.All.SendAsync("NewData", newProduct);
        };

        while (true)
        {
            await conn.WaitAsync();
        }
    }
}
Console app that periodically polls for new data from an API:
var addedStocksDJI = FetchNewStocks("DJI");

if (addedStocksAAPL > 0 || addedStocksDJI > 0)
{
    using (var conn = new NpgsqlConnection(ApplicationDbContext.GetConnectionString()))
    {
        conn.Open();
        var cmd = new NpgsqlCommand("NOTIFY new_stocks", conn).ExecuteNonQuery();
    }
}
The other code of the app is most definitely correct, because I was receiving new and correct data before I tried implementing the LISTEN/NOTIFY feature. But now I get one (or more) entries of newProduct on my client, and it is the "old" product; that is, the query does not return the latest entries, but only the old ones, via SignalR. When I refresh the page manually, the new data is correctly displayed, though.
I believe it has something to do with a single connection being open, so I constantly receive only the "old" set of data. But even if that is the case, I am unable to figure out why I sometimes get more than one packet of data, even though I am only trying to send one and I am calling NOTIFY only once.
I figured it out. Hopefully this will help someone else who gets stuck with this in the future!
The issue was that I was declaring my dbContext via .NET Core's dependency injection in my Hub class, which created the context only once for that class, and therefore once per page or WebSocket connection. That is why I was unable to get the latest data, I assume, since the dbContext was "old" and unaware of changes.
I fixed the problem by creating a dbContext with a using block inside my methods: twice in my SendForexAsync method (once for every call of the GetForex function), as well as in the GetForex function itself. That way, a dbContext is created and disposed of immediately, so the next time I poll the database for new data via the GetForex function (when I get a notification from the database due to the NOTIFY from the console app), a new dbContext instance is created which can contain that new data.
I'm new to Xcode, Swift, and Realm, and I have to build an iOS application for my graduation project. I have a problem with handling multiple clients' requests. My application is supposed to get requests from multiple users, and I have to handle these requests on a server (start counters, a timer, or add, delete, update, etc.), and my server is using the Realm database. My question is: how do I communicate between a client and a server locally? And can I implement the server with Swift instead of JavaScript?
If you're using the Realm Mobile Platform for your client-to-server interactions, you should be able to use the event handling features of the Realm Object Server to detect and respond to requests triggered by users. You can download a trial of the Professional Edition (which should be enough for your needs as a private project).
The code for registering an event handler looks like this (taken from the Realm docs page):
var Realm = require('realm');

// Insert the Realm admin token here
// Linux: cat /etc/realm/admin_token.base64
// macOS: cat realm-object-server/admin_token.base64
var ADMIN_TOKEN = 'ADMIN_TOKEN';

// The URL to the Realm Object Server
var SERVER_URL = 'realm://127.0.0.1:9080';

// The regular expression you provide restricts the observed Realm files to only the subset you
// are actually interested in. This is done in a separate step to avoid the cost
// of computing the fine-grained change set if it's not necessary.
var NOTIFIER_PATH = '/^\/([0-9a-f]+)\/private$/';

// The handleChange callback is called for every observed Realm file whenever it
// has changes. It is called with a change event which contains the path, the Realm,
// a version of the Realm from before the change, and indexes indicating all objects
// which were added, deleted, or modified in this change.
function handleChange(changeEvent) {
  // Extract the user ID from the virtual path, assuming that we're using
  // a filter which only subscribes us to updates of user-scoped Realms.
  var matches = changeEvent.path.match(/^\/([0-9a-f]+)\/private$/);
  var userId = matches[1];

  var realm = changeEvent.realm;
  var coupons = realm.objects('Coupon');
  var couponIndexes = changeEvent.changes.Coupon.insertions;

  for (var couponIndex of couponIndexes) {
    var coupon = coupons[couponIndex];
    if (coupon.isValid !== undefined) {
      var isValid = verifyCouponForUser(coupon, userId);
      // Attention: Writes here will trigger a subsequent notification.
      // Take care that this doesn't cause infinite changes!
      realm.write(function() {
        coupon.isValid = isValid;
      });
    }
  }
}

// Create the admin user
var adminUser = Realm.Sync.User.adminUser(ADMIN_TOKEN);

// Register the event handler callback
Realm.Sync.addListener(SERVER_URL, adminUser, NOTIFIER_PATH, 'change', handleChange);

console.log('Listening for Realm changes');
Unfortunately, there's no support for Realm and Swift on the server at this point (unless it's a Mac server), since Realm Swift needs the Objective-C runtime to work, and this isn't available on non-Mac platforms. Node.js is the way to go. :)
I need help: in my custom receiver Chromecast app, I cannot fetch the media metadata with which the app was initialized.
I loaded media like this, after a successful session request:
var mediaInfo = new chrome.cast.media.MediaInfo('https://wse-wowaza01.playne.tv:443/webdrmorigin/1042a2W.smil/manifest.mpd');
mediaInfo.customData = {
  "userId": "mislav",
  "sessionId": "39BE906248F9F5C4A93C7",
  "merchant": "playnr"
};
mediaInfo.metadata = new chrome.cast.media.MovieMediaMetadata();
mediaInfo.metadata.metadataType = chrome.cast.media.MetadataType.MOVIE;

var img = new chrome.cast.Image('https://ottservice.playnr.tv/OTTranscoderHttps/get?url=http%asd9.168%2f20664_5b8df65c-67ff-4f13-b90d-b28c37f2310c.jpg&w=224&h=126');
mediaInfo.metadata.images = [img];
mediaInfo.contentType = 'video/mp4';

var request = new chrome.cast.media.LoadRequest(mediaInfo);
//this.playerState = this.PLAYER_STATE.LOADING;
this.session.loadMedia(request,
  this.onLoadMediaSuccess.bind(this, 'loadMedia'),
  this.onLoadMediaFailure.bind(this)
);
How can I access that metadata in the receiver app? I tried
cast.receiver.MediaManager.getInstance()
but no luck. Are there any steps I need to code on the receiver beforehand to make the data available?
Thank you for the help, it pointed me in the right direction.
Got it working; this was the problem. I am using a 3rd-party DRM JavaScript plugin for content protection. It encapsulates the cast_receiver and had already instantiated a MediaManager and a ReceiverManager, and I hadn't noticed that. I then instantiated a new MediaManager, but it wasn't bound to any data. Pause/play events were all handled by the plugin's MediaManager instance, so my instance was useless. As soon as I referenced the already instantiated MediaManager, the data was there and its events were working. Same with the ReceiverManager: I started an instance that was already started, and got problems. So, in conclusion, I don't need to instantiate either one; the DRM plugin takes care of everything, I just need to override its event handlers.
It depends on where on the receiver you want to access that info. For example, in a number of callbacks you have an "event" of type cast.receiver.MediaManager.Event, from which you can get, for example, a cast.receiver.MediaManager.LoadRequestData object via event.data. This data object then has your customData (data.customData).
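For instance, a rough sketch of reading it from the LOAD request on the receiver, assuming you hook the MediaManager instance the DRM plugin already created (referred to here as mediaManager, a placeholder name):

// Keep the existing load behaviour and just peek at the request data.
var originalOnLoad = mediaManager.onLoad.bind(mediaManager);
mediaManager.onLoad = function(event) {
  var data = event.data;                             // cast.receiver.MediaManager.LoadRequestData
  var customData = data.customData;                  // { userId, sessionId, merchant } from the sender
  var metadata = data.media && data.media.metadata;  // the MovieMediaMetadata set on the sender
  console.log('customData:', customData, 'metadata:', metadata);
  originalOnLoad(event);
};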