How do I create a user submission feed?

To make this as simple as possible:
All I want to do is make a user-submitted form that works like Facebook's news feed.

"I want to" [textbox for user input] (Post button)

The submitted text then appears right below:

[I want to ___________________]

I feel that this can't be too complicated. I'm trying to write simple code that lets a user submit text and have that text appear right below the submission area.
How should I approach this with AJAX? What else would I need? As you can tell, I'm a novice.

That's a very broad question. You'll probably want a database, though you could get away with flat text files, at least for a while. You could generate the RSS XML on request from the database or flat file with any number of scripting languages, or you could rewrite a static RSS file whenever a post is made. If you only need the RSS, the feed itself could be your storage.
Narrow it down for us a bit! Give us a language, a persistence mechanism, and what else you want to do with the data.
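In the meantime, purely for illustration, here is a minimal sketch of what the server side could look like, assuming Python with Flask and a flat text file as storage (the framework, route names, and file name are all assumptions, nothing from the question). The page would POST the textbox value with AJAX and insert the returned posts below the form:

from flask import Flask, request, jsonify

app = Flask(__name__)
FEED_FILE = "feed.txt"  # hypothetical flat-file store

@app.route("/posts", methods=["POST"])
def add_post():
    # The AJAX call sends the textbox value as a form field.
    text = request.form.get("text", "").strip()
    if text:
        with open(FEED_FILE, "a") as f:
            f.write(text + "\n")
    return jsonify(ok=bool(text))

@app.route("/posts", methods=["GET"])
def list_posts():
    try:
        with open(FEED_FILE) as f:
            posts = [line.rstrip("\n") for line in f]
    except IOError:
        posts = []
    # Newest first, like a news feed.
    return jsonify(posts=list(reversed(posts)))

Swap the flat file for a database once you know what else you want to do with the data.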

How to create a "waiting" queue when there are more than 2 visitors on the website

Hello,
I use a form on my website. This form contains options that can each be selected only once.
I would like to create some sort of "queue" when there is more than one visitor on the page.
So, for example, with two visitors:
visitor 1 -> fills in the form.
visitor 2 -> gets a popup saying "please wait".
visitor 1 -> sends the form.
The system processes the data from the form.
visitor 2 -> the popup is hidden and the form is shown.
Another option would be to check whether someone else has already checked the checkbox and show a message if so.
Well, as you didn't provide any code, I will just give you my humble thoughts on this.
First of all: may I ask why? This sounds like a horrible user experience...
If you really want to do it anyway, here's what you need: AJAX (plain JavaScript or jQuery) to drive the live popup feedback you mentioned. For tracking which visitor currently holds the form, there are many options, such as a file- or database-based lock. From there, you code the main queue logic in your favourite server-side language; I would recommend PHP with a breeze of JavaScript.
I hope this was at least a little bit helpful.
Kind regards
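To make the locking part concrete, here is a minimal sketch, in Python with Flask rather than the PHP suggested above; every route and name here is an assumption. The page's JavaScript would poll these endpoints and show or hide the "please wait" popup based on the answer:

import time
from flask import Flask, jsonify

app = Flask(__name__)
LOCK_TIMEOUT = 60  # seconds before an abandoned lock expires (assumption)
lock = {"holder": None, "taken_at": 0.0}

@app.route("/acquire/<visitor_id>", methods=["POST"])
def acquire(visitor_id):
    now = time.time()
    # Free the lock if the previous holder walked away.
    if lock["holder"] and now - lock["taken_at"] > LOCK_TIMEOUT:
        lock["holder"] = None
    if lock["holder"] in (None, visitor_id):
        lock["holder"], lock["taken_at"] = visitor_id, now
        return jsonify(granted=True)
    # Caller keeps the "please wait" popup visible and retries later.
    return jsonify(granted=False)

@app.route("/release/<visitor_id>", methods=["POST"])
def release(visitor_id):
    if lock["holder"] == visitor_id:
        lock["holder"] = None
    return jsonify(released=True)

Note that the in-memory dict only works in a single server process; a real deployment would keep the lock in a file or database, as mentioned above.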

Is there a way to embed an iframe in an email with text that updates?

I want to embed an iframe inside an email that contains the 10 most recent chat messages. Is there a way to make this iframe dynamic so that it always shows the latest 10 chats regardless of when the email is opened? If the iframe is not the correct way to do this, is there a better way?
You can embed an iframe in the email, but your mail will then be flagged as spam by many providers.
You should instead render your content dynamically into an image and embed that image in the email.
There kind of is a solution: dynamic CSS, with a dynamic image as a fallback. It is not really elegant, as some clients (e.g. Outlook) don't support it at all and will only display the initial information. It also relies on a linked style sheet, which severely limits which clients it works in.
The dynamic-image fallback has broader client support, but it is much harder to maintain: you need to build something that programmatically pulls the messages (an HTML web page, potentially) and then something that renders and hosts an image for the email to pull. This is not a short, simple thing to set up and may not be worth the back-end work for a simple email.
See this link for a bit more in-depth info on how this can be done for adding a live twitter feed into an email: https://litmus.com/blog/how-to-code-a-live-dynamic-twitter-feed-in-html-email
Since there was no accepted answer, I thought I would give my input as well.
Litmus did something similar for their live Twitter feed in emails some time ago.
The method I can think of is to create a PHP page which takes the ten chat images and composites them into one. PHP can loop over incrementing image numbers and, where an image exists, add it to the composite. When a new chat image is added, PHP drops the oldest one and picks up the new one in the loop.
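As a rough sketch of that dynamic-image idea, here is the same concept in Python with Flask and Pillow instead of PHP, rendering the latest messages to a PNG each time the endpoint is hit (all names and the data source are invented for illustration):

from io import BytesIO
from flask import Flask, send_file
from PIL import Image, ImageDraw

app = Flask(__name__)

def latest_chats(n=10):
    # Placeholder: fetch the newest n messages from wherever they live.
    return ["chat message %d" % i for i in range(n)]

@app.route("/chat.png")
def chat_image():
    img = Image.new("RGB", (400, 220), "white")
    draw = ImageDraw.Draw(img)
    for i, line in enumerate(latest_chats()):
        draw.text((10, 10 + 20 * i), line, fill="black")
    buf = BytesIO()
    img.save(buf, format="PNG")
    buf.seek(0)
    # The email simply contains <img src=".../chat.png">;
    # each open fetches a freshly rendered image.
    return send_file(buf, mimetype="image/png")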
For anyone reading this in 2022, this is possible with AMP.
Instead of an iframe, you can create a dynamic email easily.
Check out amp.dev.
Note: AMP is not supported by many email clients

Send the user to a page along with an error message

I want to set up a login page that I can send a user to from anywhere on the site, and that will display a custom message when they arrive. I could use a redirect and a msg query param, but is this the best way to do it?
I'm working with node.js but I'm interested in a universal solution.
If you are going for easy, you can just put GET data in the URL. But that doesn't look very nice if you want a rather long message; plus, GET has size restrictions, where POST (virtually) has none.
For POST data you could use the solution from the question "JavaScript post request like a form submit", but that gives rather messy source code (if you want a somewhat longer text).
You could also keep the messages in a database and send only the ID of the message to a PHP page, fetching the text from the database. (That's what I would do, but that doesn't mean it's a good idea; I'm just an amateur here!)
You can use jQuery or simply plain JavaScript to extract your message from the URL; the question "jquery get querystring from URL" links to detailed code.
Then, depending on how you want it displayed, apply the extracted string to your situation.
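As a sketch of the query-param approach end to end (in Python with Flask for illustration; the asker's Node stack would look much the same, and all route names are assumptions), with the message escaped before display so a crafted URL can't inject HTML:

from flask import Flask, request, redirect, url_for
from markupsafe import escape

app = Flask(__name__)

@app.route("/protected")
def protected():
    # Somewhere on the site: bounce the user to the login page with a message.
    return redirect(url_for("login", msg="Please log in to continue"))

@app.route("/login")
def login():
    msg = request.args.get("msg", "")
    # Escape the message so the URL can't be used to inject markup.
    banner = "<p>%s</p>" % escape(msg) if msg else ""
    return banner + "<form>...login form...</form>"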

RSS feed per tag

Suppose stackoverflow.com wanted to have an RSS feed per tag. It would presumably serve requests like stackoverflow.com/rss?tag=aspnet with the appropriate feed. This is the easy part.
Now, when a user requests stackoverflow.com/rss?tag=aspnet, he just sees raw XML. It would be better to show a page where the user can choose which RSS reader to subscribe with (just like feedburner.com).
My question is: is there any ready-made code (html+javascript) that I can copy-paste to create such a subscription page? Basically I want to copy feedburner.com's subscription page onto my own site.
PS - I would be happy using feedburner.com, but it would require me to create a feed for each tag manually, which is impractical.
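For what it's worth, the "easy part" above, serving a feed per tag, might look like this minimal sketch in Python with Flask (the data source and field names are made up; the subscription-chooser page itself is the part not shown here):

from flask import Flask, request, Response
from xml.sax.saxutils import escape

app = Flask(__name__)

def questions_for_tag(tag):
    # Placeholder: pull the latest questions for this tag from the database.
    return [{"title": "Example question about %s" % tag,
             "link": "http://example.com/q/1"}]

@app.route("/rss")
def rss():
    tag = request.args.get("tag", "")
    items = "".join(
        "<item><title>%s</title><link>%s</link></item>"
        % (escape(q["title"]), escape(q["link"]))
        for q in questions_for_tag(tag)
    )
    xml = ('<?xml version="1.0"?><rss version="2.0"><channel>'
           "<title>Questions tagged %s</title>"
           "<link>http://example.com/</link>"
           "<description>Latest questions</description>%s"
           "</channel></rss>") % (escape(tag), items)
    return Response(xml, mimetype="application/rss+xml")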

Best way to store data for Greasemonkey based crawler?

I want to crawl a site with Greasemonkey and wonder if there is a better way to temporarily store values than with GM_setValue.
What I want to do is crawl my contacts in a social network and extract the Twitter URLs from their profile pages.
My current plan is to open each profile in its own tab, so that it looks more like a normal browsing person (i.e. CSS, scripts, and images will be loaded by the browser), and then store the Twitter URL with GM_setValue. Once all profile pages have been crawled, I'd create a page using the stored values.
I am not so happy with the storage option, though. Maybe there is a better way?
I have considered inserting the user profiles into the current page so that I could process them all with the same script instance, but I am not sure whether XMLHttpRequest looks indistinguishable from a normal user-initiated request.
I've had a similar project where I needed to pull a whole lot of data (invoice lines) from a website and export it into an accounting database.
You could create a .aspx (or PHP etc) back end, which processes POST data and stores it in a database.
Any data you want from a single page can be stored in a form (hidden using style properties if you want), using field names or IDs to identify the data. Then all you need to do is point the form action at the .aspx page and submit the form using JavaScript.
(Alternatively you could add a submit button to the page, allowing you to check the form values before submitting to the database).
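The receiving end of that form POST could be as small as the following, a Python/SQLite stand-in for the .aspx or PHP page described above (the table, field, and route names are invented):

import sqlite3
from flask import Flask, request

app = Flask(__name__)

@app.route("/collect", methods=["POST"])
def collect():
    # Each hidden form field from the Greasemonkey-modified page
    # arrives here as ordinary POST data.
    url = request.form.get("twitter_url", "")
    conn = sqlite3.connect("crawl.db")
    conn.execute("CREATE TABLE IF NOT EXISTS urls (url TEXT)")
    conn.execute("INSERT INTO urls VALUES (?)", (url,))
    conn.commit()
    conn.close()
    return "ok"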
I think you should first ask yourself why you want to use Greasemonkey for your particular problem. Greasemonkey was developed as a way to modify one's browsing experience -- not as a web spider. While you might be able to get Greasemonkey to do this using GM_setValue, I think you will find your solution to be kludgy and hard to develop. That, and it will require many manual steps (like opening all of those tabs, clearing the Greasemonkey variables between runs of your script, etc).
Does anything you are doing require the JavaScript on the page to be executed? If so, you may want to consider using Perl and WWW::Mechanize::Plugin::JavaScript. Otherwise, I would recommend that you do all of this in a simple Python script. You will want to take a look at the urllib2 module. For example, take a look at the following code (note that it uses cookielib to support cookies, which you will most likely need if your script requires you to be logged into a site):
import urllib2
import cookielib

# Build an opener that carries cookies across requests
# (needed if the site requires you to be logged in).
opener = urllib2.build_opener(urllib2.HTTPCookieProcessor(cookielib.CookieJar()))

# Fetch the page and read the raw HTML as a string.
response = opener.open("http://twitter.com/someguy")
responseText = response.read()
Then you can do all of the processing you want using regular expressions.
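For example, pulling Twitter profile links out of the fetched HTML might look like this (the pattern is a rough assumption about the markup, not a robust parser):

import re

# Collect anything in the HTML that looks like a Twitter profile link,
# de-duplicated with a set.
twitter_urls = set(re.findall(r'https?://twitter\.com/\w+', responseText))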
Have you considered Google Gears? That would give you access to a local SQLite database which you can store large amounts of information in.
The reason for wanting Greasemonkey is that the page to be crawled does not really approve of robots. Greasemonkey seemed like the easiest way to make the crawler look legitimate.
Actually, routing your crawler through the browser does not make it any more legitimate. You are still breaking the site's terms of use! WWW::Mechanize, for example, is equally well suited to spoof your User-Agent string, but that, and the crawling itself, is still illegitimate if the site does not allow spiders/crawlers!
The reason for wanting Greasemonkey is that the page to be crawled does not really approve of robots. Greasemonkey seemed like the easiest way to make the crawler look legitimate.
I think this is the hardest way imaginable to make a crawler look legitimate. Spoofing a web browser is trivially easy with some basic understanding of HTTP headers.
Also, some sites have heuristics that look for clients behaving like spiders, so simply making requests look like a browser's doesn't mean they won't know what you are doing.