How can I save a Superfeedr feed to a database?

I want to subscribe to several RSS feeds at once, and save the contents to a database. I have a Superfeedr account, so I can subscribe to the feeds that way, but I've read the Superfeedr docs and I can't figure out how I then access the aggregated feed to do anything with it.
I have an Azure account with a PostgreSQL database which I can use to save the information, but I'm not sure whether there's a 'best' way to do that - I'm happy to use PHP, C, JS or something else, but I don't really know where to actually put the code to make it work. Do I set up a cron job or some kind of timeout, or can I get Superfeedr to automatically send updates to a listener?

I created Superfeedr many years ago :)
Superfeedr uses webhooks, which means that once you have subscribed, your HTTP server will receive notifications that include the content of the updated feeds (you can use PHP, JS or even C if you feel adventurous!).
These notifications are POST requests, and you just have to parse the body.
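For example, here is a minimal sketch of such a listener in Node/TypeScript (Express plus node-postgres), assuming you subscribed with Superfeedr's JSON format; the table name and the exact payload fields (`items`, `permalinkUrl`, etc.) are assumptions to check against the schema docs:

```typescript
// Minimal Superfeedr webhook listener: receives pushed entries and stores
// them in PostgreSQL. Table name and payload field names are assumptions --
// adjust them to your schema and to the JSON format documented by Superfeedr.
import express from "express";
import { Pool } from "pg";

const app = express();
app.use(express.json()); // Superfeedr can push JSON bodies

const db = new Pool({ connectionString: process.env.DATABASE_URL });

// Superfeedr POSTs to the callback URL you registered when subscribing.
app.post("/superfeedr/callback", async (req, res) => {
  const items = req.body.items ?? []; // pushed feed entries
  for (const item of items) {
    await db.query(
      `INSERT INTO entries (guid, title, url, published, content)
       VALUES ($1, $2, $3, to_timestamp($4), $5)
       ON CONFLICT (guid) DO NOTHING`, // feeds often re-push the same entry
      [item.id, item.title, item.permalinkUrl, item.published, item.content]
    );
  }
  res.sendStatus(200); // ack quickly so Superfeedr doesn't retry
});

app.listen(3000);
```

With something like this deployed, the callback URL is what you give Superfeedr when subscribing; no cron job or polling is needed.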

Related

REST API to allow multiple users to edit the same form at the same time

I'm looking to make something similar to Google Docs, where everyone can update a form (with multiple input fields) at the same time using a REST API, and the form data will be stored in a database. Is this possible?
I can have the form send an update request whenever a user makes a change, but I still can't quite figure out the logic for retrieving data, updating form field contents, and resolving conflicts when users are editing the same field.
The best way is to use SignalR for the real-time communication as well as for pushing the updates to the other users belonging to the same group (call it the users of the same form). SignalR provides all the underlying infrastructure.
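As a rough client-side sketch in TypeScript (the hub URL, group name, and method names below are placeholders; the matching .NET hub would broadcast `FieldChanged` to the form's group):

```typescript
// Client sketch using the @microsoft/signalr package. The hub endpoint and
// the method/event names ("JoinForm", "UpdateField", "FieldChanged") are
// assumptions -- they must match whatever your .NET hub actually defines.
import { HubConnectionBuilder } from "@microsoft/signalr";

const connection = new HubConnectionBuilder()
  .withUrl("/hubs/form") // assumed hub endpoint
  .withAutomaticReconnect()
  .build();

// Re-render a field whenever another user in the same group edits it.
connection.on("FieldChanged", (fieldId: string, value: string) => {
  const input = document.getElementById(fieldId) as HTMLInputElement | null;
  if (input) input.value = value;
});

await connection.start();
await connection.invoke("JoinForm", "form-42"); // server adds us to the group

// Push our own edits to everyone else editing the same form.
document.querySelectorAll("input").forEach((el) =>
  el.addEventListener("input", () =>
    connection.invoke("UpdateField", "form-42", el.id, el.value)
  )
);
```

Your REST endpoints can still persist each change to the database; SignalR only takes over the fan-out to the other editors of the same form.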

How to send DokuWiki notification mails at a certain time?

In our DokuWiki Installation (Release 2018-04-22b "Greebo"), users can subscribe to daily notification mails, which is a core feature of DokuWiki.
For those daily emails, we would like to make sure that they arrive at a certain time.
In the documentation, I did not find anything about a script that could be started (e.g. from a cron job) to send out the mails.
I set up a cron job calling a freely accessible page in the wiki using curl (no login required for this page). This did not cause any emails to be sent.
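Roughly like this (the wiki URL is a placeholder):

```
# crontab entry: fetch the wiki page every day at 06:00
0 6 * * * curl -s "https://wiki.example.com/doku.php?id=start" > /dev/null
```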
Any hint on how to schedule the daily emails to be sent at a certain time would be helpful!
Update: I am aware of .../feed.php, which would theoretically allow getting information on wiki events via the RSS feed. This data could be used to send the notification mails. However, the RSS feed would need to be generated for every user in order to respect access rights. For this to work, some sort of user credentials - or a copy of each user's access rights - would need to be kept in a place outside of DokuWiki and held in sync.

How can I get my data from SendGrid?

I am using the SendGrid API to send various emails to my users. The SendGrid portal gives us statistics only in a limited way, and I can't search with the queries I want. That's why I want that data in my own database, so I can query it as I need.
My question is: is there any way to get the data/statistics of my account? Is there any SendGrid service that provides my data?
Thanks
If you enable our event webhook, your application can consume all the data/statistics itself and store it in your db. For example, SendGrid will post info back to your site on the following types of events: Processed, Dropped, Delivered, Deferred, Bounced, Opened, Link Clicked, Marked as Spam, or Unsubscribed.
You can get more info on our event webhook here:
Event Webhook Docs
Those docs show examples of the data we post to your site as well as the parameters we send for each type of event.
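For illustration, a small Node/TypeScript sketch of such a consumer (the endpoint path and table name are assumptions; the event field names follow the webhook docs):

```typescript
// Sketch of an Event Webhook consumer: SendGrid POSTs a JSON array of event
// objects to your endpoint; each one is written to Postgres. The table name
// is an assumption -- use whatever schema suits your reporting needs.
import express from "express";
import { Pool } from "pg";

const app = express();
app.use(express.json());

const db = new Pool({ connectionString: process.env.DATABASE_URL });

app.post("/sendgrid/events", async (req, res) => {
  // The body is an array of events: { email, timestamp, event, sg_event_id, ... }
  for (const ev of req.body as any[]) {
    await db.query(
      `INSERT INTO email_events (sg_event_id, email, event, occurred_at)
       VALUES ($1, $2, $3, to_timestamp($4))
       ON CONFLICT (sg_event_id) DO NOTHING`, // webhook deliveries can repeat
      [ev.sg_event_id, ev.email, ev.event, ev.timestamp]
    );
  }
  res.sendStatus(200); // a 2xx tells SendGrid not to retry this batch
});

app.listen(3000);
```

Deduplicating on `sg_event_id` matters because SendGrid will retry any batch your endpoint did not acknowledge with a 2xx.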

Is it possible to save a custom variable for each user on their account?

I just got started with programming a Facebook app. I already wrote an app for the VZ network, where they have something called 'Persistent Storage'. Basically it's an environment where you can save custom data on each user account; with your app you can read this data from the current user as well as from the user's friends. Now I want to port my app to Facebook, and my problem is that I haven't found such functionality here yet.
For now I would like to finish and launch this as soon as possible, so it would be nice if I could copy and paste as much of the code as possible.
Since the data contains information about participation, at some point I would like to use the Facebook event object. But I was wondering if that could cause problems, since it would require creating those events publicly in order to use them in my app. Couldn't that lead to legal problems with those who actually host the events in the real world? Would I have to ask the hosts to create those events, could I automate this process, or, in case they don't have a Facebook account, ask them to approve that the app creates the event for them?
I also need to know which events the user's friends participate in, so I can't simply save the information on my server, since I don't have the friend info there.
In any case, it seems much easier to me to simply use a list of EventIDs on each user account to check whether or not the user participates in an event.

Real-time app with Facebook

Does Facebook provide access to any real-time APIs so that you can respond to events as soon as they happen? If not, what alternatives are there and what are their limitations? For example, if I use polling instead, will they limit my API calls? And if I try using RSS feeds, about how much delay can I expect? Or maybe it would be possible to receive and process email notifications (if I could convince a user to forward mail to another email address), as they seem to be dispatched pretty promptly.
I've never tried polling user data, but I think it will work without issues. As far as I know there are no restrictions on the number of API calls you can make on Facebook.
As far as the queries are concerned, from what I have seen (and I think this is how they implement it), if your query asks for too much data - measured, I think, by how long it takes to process - the query will just fail.
E.g.: I had an app that would pull all the status messages of all the user's friends and display them in one place. I first queried for all the friends of the user - this worked okay. But if at the same time I ran a loop to get all the status messages for each friend, it would just fail.
I think you can make individual queries without issues; just be careful to query only the data you need, because if the queries are too big or too many they will just fail. The best way to find out is to run tests yourself.
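To illustrate the "query only what you need, one piece at a time" approach, here is a naive polling sketch in Node/TypeScript (the endpoints, fields, and interval are assumptions - run your own tests, as suggested above):

```typescript
// Naive polling sketch. Requires Node 18+ (global fetch) and a valid user
// access token; the endpoints and fields are assumptions to verify yourself.
const TOKEN = process.env.FB_ACCESS_TOKEN;
const GRAPH = "https://graph.facebook.com";

async function getJson(path: string): Promise<any> {
  const res = await fetch(`${GRAPH}${path}?access_token=${TOKEN}`);
  if (!res.ok) throw new Error(`Graph API error ${res.status} for ${path}`);
  return res.json();
}

// Fetch the friend list once, then poll statuses one friend at a time,
// so no single request asks for too much data and gets cut off.
const friends = (await getJson("/me/friends")).data as { id: string }[];

setInterval(async () => {
  for (const friend of friends) {
    const feed = await getJson(`/${friend.id}/statuses`);
    console.log(friend.id, feed.data?.[0]?.message ?? "(no status)");
  }
}, 60_000); // once a minute; back off if you start seeing failures
```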
The Facebook Graph API will allow you to subscribe to real-time changes. You can currently only subscribe to users, permissions and errors, but they promise to allow subscribing to more objects in the future.
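A minimal sketch of the callback endpoint such a subscription points at (Node/TypeScript; the route and token names are placeholders). Facebook first verifies the URL with a GET carrying hub.challenge, then POSTs change notifications; note the notifications only say which objects and fields changed, so you still fetch the new values yourself:

```typescript
// Callback endpoint sketch for Graph API real-time subscriptions.
// VERIFY_TOKEN is whatever value you chose when creating the subscription.
import express from "express";

const app = express();
app.use(express.json());

const VERIFY_TOKEN = process.env.FB_VERIFY_TOKEN;

// Subscription verification handshake: echo the challenge back.
app.get("/fb/callback", (req, res) => {
  if (
    req.query["hub.mode"] === "subscribe" &&
    req.query["hub.verify_token"] === VERIFY_TOKEN
  ) {
    res.send(String(req.query["hub.challenge"]));
  } else {
    res.sendStatus(403);
  }
});

// Change notifications: ids and changed fields, not the new values --
// follow up with a regular Graph API request for the updated objects.
app.post("/fb/callback", (req, res) => {
  console.log("changed:", JSON.stringify(req.body));
  res.sendStatus(200);
});

app.listen(3000);
```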