Is it possible to export PubNub chat messages to a PostgreSQL database?

I am prototyping a mobile app and I want to build it quickly. For that purpose, I am using PubNub ChatEngine.
However, I plan to migrate to a PostgreSQL database after the end of the beta test, and I don't want to lose the existing data already stored in PubNub. Is there a way to export my chat data to my own PostgreSQL database?

Export PubNub Chat Messages to your PostgreSQL Database
While many approaches exist, the most reliable is to use PubNub Functions: with an On After Publish event handler, you can asynchronously save every published message to your database. Your database needs to be accessible via a secured HTTPS endpoint.
Stack Overflow answer: PubNub: What is the right way to log all published messages to my db. Save your JSON messages to a private database using the method described in that link; example code is shown below.
export default request => {
    const xhr = require('xhr');
    // the request body must be a string, so serialize the message
    const post = { method: "POST", body: JSON.stringify(request.message) };
    const url = "https://my.company.com/save";
    // save the message asynchronously
    xhr.fetch(url, post).then(serverResponse => {
        // DB save succeeded
    }).catch(err => {
        // DB save failed; handle err
    });
    return request.ok();
};

Related

What is mongodb equivalent for firebase onSnapshot to get realtime updates

For Firebase, there is onSnapshot to get new messages in a chat application, and it's written in the client-side code inside useEffect.
The purpose is to let the server tell the client about new messages instead of the client asking the server every second.
useEffect(() =>
    onSnapshot(
        query(collection(db, 'matches', linkDetails.id, 'messages'), orderBy('timestamp', 'desc')),
        snapshot => setMessages(snapshot.docs.map(doc => ({
            id: doc.id,
            ...doc.data()
        })))
    )
, [linkDetails, db])
What is the equivalent for MongoDB to get new messages as soon as they are sent/received?
I serve the messages from a backend API at /api/messages/get.
This is for React Native, and I do not use Next.js (which has useSWR).
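For completeness: the closest MongoDB equivalent to onSnapshot is a change stream (collection.watch()), which requires a replica set and runs on the server; the server then pushes events to the React Native client over a socket or server-sent events. A minimal server-side sketch, with the connection URI, database, and collection names assumed:

```javascript
// Sketch: MongoDB change stream as an onSnapshot-style listener.
// Change streams require a replica set; URI/db/collection names are assumptions.

// Pure helper: shape an insert event into the { id, ...data } form
// the Firebase snippet above produces.
function toMessage(change) {
  const { _id, ...data } = change.fullDocument || {};
  return { id: String(_id), ...data };
}

async function listenForMessages(onMessage) {
  const { MongoClient } = require('mongodb'); // loaded here so the helper stays dependency-free
  const client = await MongoClient.connect('mongodb://localhost:27017');
  const messages = client.db('chat').collection('messages');
  // Only react to inserts, i.e. newly sent messages
  const stream = messages.watch([{ $match: { operationType: 'insert' } }]);
  stream.on('change', change => onMessage(toMessage(change)));
  return stream; // stream.close() stops listening, like Firebase's unsubscribe
}
```

The client would then subscribe to whatever channel the server forwards these events on (e.g. a socket.io event) rather than talking to MongoDB directly.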

Firebase/cloud firestore: onSnapshot() vs on()

I have been using onSnapshot successfully to alert my code to changes in underlying data, as in
// Set up to listen for changes to the "figures" collection, that is,
// someone has created a new figure that we will want to list on the screen.
setFiguresListener: function () {
    // `figuresCR` is a collection reference defined elsewhere
    return this.figuresCR.onSnapshot((iFigs) => {
        iFigs.forEach((fSnap) => {
            const aFigure = figureConverter.fromFirestore(fSnap, null);
            const dbid = aFigure.guts.dbid;  // ID of the "figure" in the database
            nos2.theFigures[dbid] = aFigure; // update the local copy of the data
        });
        nos2.ui.update();
        console.log(`Listener gets ${iFigs.size} figures`);
    });
},
But now I have read about on() in the docs. It explains:
[The on() function] Listens for data changes at a particular location.
This is the primary way to read data from a Database. Your callback
will be triggered for the initial data and again whenever the data
changes. Use off() to stop receiving updates. See Retrieve Data on
the Web for more details.
The syntax is a bit different, and on() seems to do much the same as onSnapshot().
So what is the real difference? Should we be using on() instead of onSnapshot()?
on() is an operation for reading from Firebase Realtime Database. That's a completely different database with different APIs than Firestore. They have essentially no overlap. There is no on() operation with Firestore.
If you're working with Firestore, ignore all the documentation about Realtime Database, and stick to using onSnapshot() for getting realtime updates.
A note for other tyros who fall into this tar pit: in the API doc pages, you might think that since Firestore is a database under Firebase, you could look for help under firebase.database. But no: look only in the next section, firebase.firestore.

Amplify datastore for android implementation with complex objects

I have an Android application that collects data in the form of text and images. I implemented an AWS Amplify integration: I am using Auth for logins, and I also added DataStore for online/offline synchronization of the collected data to the cloud. But I get error 400 because my item exceeds the 400 KB item size limit in DynamoDB. After research here, I discovered that it is possible to use Amplify DataStore to store complex objects like images, but they are stored in S3. The sample code that demonstrates this is for React, and I have failed to implement the same in native Android. Does anyone have a way of implementing this in Android?
Currently, Amplify only supports 'complex objects' when using the API package. This does not include the DataStore package, which handles AppSync differently.
complex object support: import { API } from '@aws-amplify/api'
no complex object support: import { DataStore } from '@aws-amplify/datastore'
Sources:
https://github.com/aws-amplify/amplify-js/issues/4579#issuecomment-566304446
https://docs.amplify.aws/lib/graphqlapi/advanced-workflows/q/platform/js#complex-objects
If you want to use DataStore, currently you need to put the file into S3 separately, and then you can store reference details to the S3 file in the DynamoDB record (i.e. bucket, region, key). This could be done with Amplify Storage module.
const { key } = await Storage.put(filename, file, { contentType: file.type });
const result = await DataStore.save({ /* an object with the S3 key/info */ });
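A sketch of that DataStore route in JavaScript (the answer's language): the Report model, bucket, and region are hypothetical, and the record stores only a reference to the S3 object rather than the file itself.

```javascript
// Sketch: upload the image with Amplify Storage, then save only an
// S3 reference ({ bucket, region, key }) in the DataStore record.
// `Report`, the bucket, and the region are assumptions for illustration.

// Pure helper: the reference object persisted instead of the file itself.
function s3Ref(bucket, region, key) {
  return { bucket, region, key };
}

async function saveReportWithImage(file, filename) {
  const { Storage } = require('@aws-amplify/storage');   // assumed package layout
  const { DataStore } = require('@aws-amplify/datastore');
  const { Report } = require('./models');                // hypothetical generated model

  const { key } = await Storage.put(filename, file, { contentType: file.type });
  return DataStore.save(new Report({
    // store the pointer to S3, not the image bytes
    image: JSON.stringify(s3Ref('my-app-bucket', 'us-east-1', key)),
  }));
}
```

The same shape applies on Android: upload via the Storage category, then save a model whose string field carries the bucket/region/key reference.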

MongoDB as Real-time database

I have experience with the real-time database Firestore. Currently, I'm working on an Ionic 3 app with MongoDB. In that app, we have to use a pull-to-refresh feature to update to the latest content; with a real-time DB, we wouldn't need such a feature. Because of this, my client now wants to use Firestore. But the key problem we have is data migration, that is, MongoDB to Firestore. This app is in production (i.e. in the app stores) with 500+ users, so converting the app to Firestore would be a very difficult task. So my question is: can't we use real-time DB features with MongoDB?
Note: I use Node.js/Express as a RESTful API.
What's your backend? How about using socket.io?
Since you're using MongoDB and Express already, here's a sample:
Server file:
var express = require('express');
var app = express();
var http = require('http').Server(app);
var io = require('socket.io')(http);

app.use(express.json()); // parse JSON request bodies

// `db` is your already-connected MongoDB database handle
app.post('/api/add', function(req, res){
    db.collection('quotes').insertOne(req.body, (err, result) => {
        if (err) return res.sendStatus(500);
        // send everyone the added data
        io.emit('ADDED_DATA', req.body);
        res.sendStatus(200);
    });
});

http.listen(3000, function(){
    console.log('listening on *:3000');
});
in your client:
<script src="/socket.io/socket.io.js"></script>
const socket = io('http://localhost:3000'); // IP and port of the server above
socket.on('ADDED_DATA', (data) => {
    // manipulate data
    // push to current list
    // or whatever you want
});
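On the receiving side, the handler usually just merges the broadcast record into the list driving the view. A small pure helper (names assumed) keeps that logic easy to test:

```javascript
// Sketch: merge a newly broadcast record into the local list without
// mutating the existing array (plays well with UI change detection).
function addQuote(quotes, data) {
  return [data, ...quotes]; // newest first
}

// Inside the socket handler:
// socket.on('ADDED_DATA', (data) => { this.quotes = addQuote(this.quotes, data); });
```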

Meteor mocha test subscriptions

I'm currently working on writing client-side tests for my application using practicalmeteor:mocha. Therefore, I need to insert a record into a Mongo collection before every step.
Following the concepts of Meteor, I wrote some Meteor methods to insert/update/delete the records. To test the expected behaviour, I assume I have to call the appropriate method (to insert a record, for example) and then wait for the subscription to synchronize the record to the client so I can verify the insert.
I tried a few things but none of them seems to work:
describe('a topic', function() {
    beforeEach(async function(done) {
        let res = await new Promise((resolve, reject) => {
            Meteor.call('topics.createNew', function(err, res) {
                if (err) return reject(err);
                resolve(res); // res is the _id of the generated record
            });
        });
        // subscribe to a single record (the one just created)
        let subscription = Meteor.subscribe('topics.details', res);
        // I assume this runs every time the publish function calls ready,
        // and therefore also runs when the new record is published
        Tracker.autorun((computation) => {
            let count = TopicsCollection.find({}).count();
            //console.log('topic count: ' + count);
            if (subscription.ready() && count === 1) {
                computation.stop();
                done();
            }
        });
    });
});
I logged the counted documents and had some luck when I wrapped the autorun function in a Promise, but in neither case was done() called, which caused a timeout in mocha.
I already had a look at this question, but I don't think it helps in my situation, since I wait for every possible callback and I'm working with a plain collection, while the asker in that question uses an accounts package.
Can anyone suggest a better/working solution for this problem?
Thanks in advance! :D
Summarizing your current setup:
calling a method on the client
inserting a doc on the server
subscribing to the pub from the client
verifying inserts via the subscribed data
That is a big black box to have in scope. Additionally, you are testing things that are already tested by the Meteor core team: the integration of methods and publications between server and client.
To avoid wasting time, you can split your setup into two tests:
Test the Meteor method as a unit
Test the Meteor publication as a unit
If those tests define the intended behavior with the correct permission levels, you can assume that a logged-in user with the correct permission level who subscribes to your publication will get the behavior you tested in the units. As pointed out before, the pub/sub system itself should be assumed to be working.
1. Test the method, server side only
Use hwillson:stub-collections to get a 'mock' of your collection, and practicalmeteor:sinon to mock the Meteor.user() object so you can test different permission levels.
2. Test the publication, server side only
Use johanbrook:publication-collector to test whether your publication is running correctly, and dburles:factory to create mocks of your collection's data.
No need then for a complex test setup on the client side.