Automate clicking on a website and performing tasks - perl

I want to add my email as an allowed email address for the people who use my program. On Amazon I have to follow a set of steps to make sure that my email address is allowed, and I want to know whether it is possible to automate this process with a Perl module, since it is extremely repetitive.
I have to log in, click Manage Kindle, and then a few other buttons as a user. How would I automate this process using a Perl script?

Sounds like you want WWW::Mechanize.
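A minimal sketch of what that might look like; the sign-in URL, form field names, and link text below are assumptions that you would need to adjust against the real Amazon pages:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    my $mech = WWW::Mechanize->new( autocheck => 1 );   # die on HTTP errors

    # Log in (URL and field names are placeholders - inspect the real form).
    $mech->get('https://www.amazon.com/ap/signin');
    $mech->submit_form(
        with_fields => {
            email    => 'you@example.com',
            password => 'secret',
        },
    );

    # Click through to the Kindle management page (link text is a guess).
    $mech->follow_link( text_regex => qr/Manage Your Kindle/i );

    # Submit the form that adds an approved sender address (field name is a guess).
    $mech->submit_form(
        with_fields => { newEmail => 'sender@example.com' },
    );

    print $mech->success ? "Done\n" : "Something went wrong\n";

WWW::Mechanize keeps cookies between requests, so the later clicks happen inside the logged-in session.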

You have two options. You can either use Greasemonkey to run a browser script that automates the clicking for you in JavaScript, or you can use your programming language of choice with curl to fetch the page you need, scrape the data, and then resubmit that data via GET or POST, depending on what the site uses.

If you know the URL of the page with the final button, you can make a curl request to log in and then a second curl request that pretends you just clicked the button on that page. Most of the time you don't need to visit all the pages in between the two; sometimes, though, you do.

curl can be complex and a bit daunting for new users, but here is a curl module for Perl: http://search.cpan.org/perldoc?WWW::Curl. If you need to learn how to use it, this is a good resource: http://php.net/manual/en/ref.curl.php. It's written for PHP, but the functions map closely onto the Perl ones. Amazon shouldn't block curl requests, but if they do, just copy the headers from your browser into curl and use those. Good luck!
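If you do go the curl route from Perl, a rough sketch with WWW::Curl could look like the following; the login URL, POST fields, and user agent string are assumptions, and in practice you would copy the real values out of your browser:

    use strict;
    use warnings;
    use WWW::Curl::Easy;

    my $curl = WWW::Curl::Easy->new;

    # Look like a normal browser and keep cookies between requests.
    $curl->setopt(CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; MyScript/1.0)');
    $curl->setopt(CURLOPT_COOKIEJAR,  'cookies.txt');
    $curl->setopt(CURLOPT_COOKIEFILE, 'cookies.txt');
    $curl->setopt(CURLOPT_FOLLOWLOCATION, 1);

    # Log in by POSTing the same fields the login form would send.
    $curl->setopt(CURLOPT_URL, 'https://example.com/login');   # placeholder URL
    $curl->setopt(CURLOPT_POSTFIELDS, 'email=you@example.com&password=secret');

    my $response_body = '';
    open( my $fh, '>', \$response_body ) or die $!;
    $curl->setopt(CURLOPT_WRITEDATA, $fh);

    my $retcode = $curl->perform;
    die 'curl error: ' . $curl->strerror($retcode) . "\n" if $retcode != 0;

    print 'Logged in, got ' . length($response_body) . " bytes back\n";

A second perform with a new CURLOPT_URL (and the same cookie jar) is then the equivalent of clicking the final button.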

Related

Prevent form data from being cached, and re-accessing with back button

I am considering making a very simple form for clients to use in a sort of web browser kiosk fashion, where they submit some of their information through the computer in the lobby at their option instead of writing something out by hand. This would be used if they come in person rather than calling or going to the web site first. I already have a form on our site for clients to use from their home computers so this would be very similar but tailored for and only used for the in-person clients.
Since the form will just loop back to itself after every client (not really "back", just a link that leads to a fresh form), how can I ensure that nobody can hit Back a few times to see the previous client's info? It's not really sensitive data; I would just like to provide that bit of privacy. Of course, clients using the form on our web site from their own computers are responsible for their own privacy.
Apart from having customer service walk to the computer and close and reopen the browser, or using AJAX, what should I do?
The other topics I've read related to this all have someone basically saying "you're not supposed to do that, you bad person". This seems like a valid use case to me. Any ideas?
Thanks!
Disable autocomplete by adding autocomplete="off" to the input tags or form tag.

Do two actions with a form

Hey, I'm just wondering if it's possible to have an HTML form do two things on submit: have the action go to a URL like normal (PayPal), but also go to a PHP program that sends me an email.
You could use AJAX to submit the two requests individually.
Alternatively, just process the request in your server-side code and make the appropriate requests from there.
There may be better solutions depending on the exact context.
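To stay with the Perl theme of this page, here is a hedged sketch of that second option: the form's action points at your own script, which emails you and then redirects the browser on to PayPal. The URLs, field names, and sendmail path are all placeholders:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use CGI;

    my $q     = CGI->new;
    my $name  = $q->param('name')  // '';
    my $email = $q->param('email') // '';

    # 1. Send yourself a notification email (sendmail path is an assumption).
    open( my $mail, '|-', '/usr/sbin/sendmail -t' ) or die "sendmail: $!";
    print $mail "To: you\@example.com\n";
    print $mail "Subject: New form submission\n\n";
    print $mail "Name: $name\nEmail: $email\n";
    close $mail;

    # 2. Send the browser on to PayPal (URL and parameters are placeholders).
    print $q->redirect('https://www.paypal.com/cgi-bin/webscr?cmd=_xclick');

The same idea works in PHP with mail() followed by a Location header.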

Autofill an HTML form

What applications exist that can take a series of fields from my db (or csv output from my db) and insert them into a web-based form and then submit that form?
Big Picture Use Case:
I maintain an in-house registration management system for webinars that we produce/present. Currently we use GoToWebinar.com to host our events but they haven't always been (and may not always continue to be) our vendor.
GoToWebinar.com does not provide me an API for creating registrations for 3rd party individuals. So when someone decides to attend one of our events, they have to fill out two registration forms, mine and GoToWebinar.com's. I'd like to automate the task of filling in GoToWebinar's registration form.
I am looking into the same thing. I found some bits and pieces here and there and was able to decipher the URL to post to GTW:
https://www.gotowebinar.com/en_US/island/webinar/registration.flow?Template=island/webinar/registration.tmpl&Form=webinarRegistrationForm&WebinarKey=XXX_YOUR_WEBINAR_ID_XXX&Name_First=ViewersFirstName&Name_Last=ViewersLastName&Email=ViewersEmailAddress
If you are using cURL, then be sure to use CURLOPT_FOLLOWLOCATION because there are some redirections on the GTW side and cURL needs to follow them.
So far this seems to work for us.
Good luck!
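For anyone wanting to script that from Perl instead, a hedged sketch using WWW::Curl with CURLOPT_FOLLOWLOCATION turned on, built around the URL pattern above (the webinar key and viewer details are placeholders):

    use strict;
    use warnings;
    use WWW::Curl::Easy;
    use URI::Escape qw(uri_escape);

    # Assemble the registration URL from the pattern shown above.
    my $url = 'https://www.gotowebinar.com/en_US/island/webinar/registration.flow'
            . '?Template=island/webinar/registration.tmpl'
            . '&Form=webinarRegistrationForm'
            . '&WebinarKey=XXX_YOUR_WEBINAR_ID_XXX'
            . '&Name_First=' . uri_escape('ViewersFirstName')
            . '&Name_Last='  . uri_escape('ViewersLastName')
            . '&Email='      . uri_escape('viewer@example.com');

    my $curl = WWW::Curl::Easy->new;
    $curl->setopt(CURLOPT_URL, $url);
    $curl->setopt(CURLOPT_FOLLOWLOCATION, 1);   # GTW redirects a few times

    my $body = '';
    open( my $fh, '>', \$body ) or die $!;
    $curl->setopt(CURLOPT_WRITEDATA, $fh);

    my $retcode = $curl->perform;
    die 'curl error: ' . $curl->strerror($retcode) . "\n" if $retcode != 0;
    print 'Registration request returned ' . length($body) . " bytes\n";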
I'm late to the party, but let me offer a way to call the CITRIX API via PHP to register a new GotoWebinar attendee, in case somebody else hits this page looking for the answer to your question.

screenshot-grabbing email tool

I have a web site with various graphs embedded in it that are generated externally. Occasionally those graphs will fail to generate and I would like to catch that when it happens. These graphs are embedded in multiple pages and I would rather not check each page manually. Is there any kind of tool or perhaps a browser addon that could periodically take screenshots of different URLs and email them in a single email? It would be sufficient to have scaled-down screenshots of full pages emailed maybe once a day to me, allowing me to take a quick glance and see that all the graphs are there and look okay.
I'm a big fan of automation. Rather than have emails generated that you then have to look at, take a look at 'replacing custom missing images in jquery'. That will run a piece of JavaScript for each image that fails to load. Extending it to make a request to a URL that you control, passing along the broken image URL (or just the broken filename), would not be too hard. That URL would then generate an email and store the broken URL so that it doesn't send 5000 emails if there's a flurry of hits to your page; a rough sketch of such an endpoint is below.
Another idea, building on the above, is to effectively change the external 404 from the source site into a local one (e.g. /backend/missing-images/). The full path need not exist; you are just generating a local 404 record in your Apache logs. Logwatch will then email you a list of 404 pages from the Apache log daily (or more often, if you want).
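Here is what that reporting endpoint could look like in Perl; the log file path, email address, and query parameter name are assumptions:

    #!/usr/bin/perl
    # report-broken-image.cgi - called by the page's JavaScript error handler
    use strict;
    use warnings;
    use CGI;

    my $q         = CGI->new;
    my $broken    = $q->param('url') // 'unknown';
    my $seen_file = '/tmp/broken-images.log';     # assumption: pick a real path

    # Remember which URLs we've already reported so a flurry of hits
    # doesn't generate thousands of emails.
    my %seen;
    if ( open( my $in, '<', $seen_file ) ) {
        chomp( my @lines = <$in> );
        @seen{@lines} = ();
    }

    unless ( exists $seen{$broken} ) {
        open( my $out, '>>', $seen_file ) or die $!;
        print $out "$broken\n";
        close $out;

        open( my $mail, '|-', '/usr/sbin/sendmail -t' ) or die $!;
        print $mail "To: you\@example.com\nSubject: Broken graph image\n\n$broken\n";
        close $mail;
    }

    print $q->header('text/plain'), "ok\n";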

Advice needed - aweber form submission using curl?

Advice needed for backend form submission to AWeber and getting a response back.
Scenario
When a customer signs up on my form, I will:
1. insert the customer details into my own database,
2. send them a welcome email from my system,
3. at the same time, add their email address into AWeber (this should run in the background, so the customer doesn't need to fill in their details a second time).
If I use a plain PHP cURL call on its own, is that a good solution?
I want to submit the form values to AWeber so that AWeber adds the new email address into their system and then responds to my backend script.
I have seen many client libraries out there, including:
http://scripts.incutio.com/httpclient/
http://freshmeat.net/projects/curl_http_client/
http://snoopy.sourceforge.net/
Do they have any particular benefit over a normal PHP cURL call for passing in the data?
I have done this successfully before using just a cURL request, but I couldn't get AWeber to accept the submission without them sending the person a confirmation message of their own. Basically it acts as if you have "confirmed opt-in" turned on, even if you turn it off.
Also, AWeber's addlead.pl script doesn't return anything, so if you had something in mind for that response, it won't go anywhere.
Hope this helps!
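The thread is about PHP's cURL, but to match the Perl theme of this page, here is a hedged sketch of that kind of background submission using LWP::UserAgent. The addlead.pl URL and the hidden field names are placeholders; copy the real action URL and fields out of the form HTML that AWeber generates for your list:

    use strict;
    use warnings;
    use LWP::UserAgent;

    my $ua = LWP::UserAgent->new( timeout => 15 );
    push @{ $ua->requests_redirectable }, 'POST';   # addlead.pl redirects after POST

    # Field names are placeholders - copy them from your generated AWeber form.
    my $res = $ua->post(
        'https://www.aweber.com/scripts/addlead.pl',
        {
            listname => 'your_list_name',
            name     => 'New Customer',
            email    => 'customer@example.com',
            redirect => 'https://example.com/thanks',
        },
    );

    # As noted above, addlead.pl doesn't return anything useful, so just log
    # whether the HTTP request itself went through.
    warn 'AWeber submission failed: ' . $res->status_line . "\n"
        unless $res->is_success;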