TWiki page to Google Sites [closed] - perl

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
I've managed to extract the HTML source of a TWiki with its URL and Perl's LWP::Simple module.
What I want to do now is to use that HTML and put into my Google Site as a new page (via a program, NOT manually).
How do I go about doing this using Perl? Any ideas would be highly appreciated.

This question will probably be closed in its current form.
You should read https://developers.google.com/google-apps/sites/docs/1.0/developers_guide_protocol#CreatingContent and ask a specific question about what doesn't work for you.
The basics:
you should authenticate yourself to Google with the OAuth2 protocol (search MetaCPAN for oauth2)
and send a POST request to the specified URL (read the first link).
You will probably also need, at least, to:
change all URLs in your pages that point back to the TWiki
change all img src links
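The two steps above (OAuth2 auth, then a POST of an Atom entry to the content feed) can be sketched with LWP::UserAgent. This is only a sketch: it assumes you have already obtained an OAuth2 access token (e.g. with an OAuth2 module from CPAN), and the feed URL, token, and page title below are placeholders; check the protocol guide linked above for the exact feed format, since that Sites API has since been deprecated by Google.

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use LWP::UserAgent;

my $access_token = 'YOUR_OAUTH2_TOKEN';   # placeholder: obtain via OAuth2 first
my $feed_url = 'https://sites.google.com/feeds/content/site/yoursite';  # placeholder
my $html     = '<p>Imported from TWiki</p>';  # your extracted TWiki HTML

# The Sites protocol expects the page wrapped in an Atom entry.
my $entry = <<"XML";
<entry xmlns="http://www.w3.org/2005/Atom">
  <category scheme="http://schemas.google.com/g/2005#kind"
            term="http://schemas.google.com/sites/2008#webpage" label="webpage"/>
  <title>Imported TWiki page</title>
  <content type="xhtml">
    <div xmlns="http://www.w3.org/1999/xhtml">$html</div>
  </content>
</entry>
XML

my $ua  = LWP::UserAgent->new;
my $res = $ua->post(
    $feed_url,
    'Content-Type'  => 'application/atom+xml',
    'Authorization' => "Bearer $access_token",
    Content         => $entry,
);
die 'POST failed: ' . $res->status_line unless $res->is_success;
```

Before the POST you would rewrite the TWiki-internal links and img src attributes in $html, as noted above.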

Related

My entire website is not being displayed on github [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 1 year ago.
I'm not sure why my entire website is not displaying; I've tried everything that was recommended, but still nothing works. My banner image and text are not showing up when I publish the website on GitHub.
What my website should look like:
file:///Users/tamannahoque/Documents/Skincare/index.html#products
(if that works)
The code:
https://github.com/TamannaHoque/the-ordinary.github.io
The website when published on github:
https://tamannahoque.github.io/the-ordinary.github.io/
The basic website is being displayed on GitHub, but your images and the other items the website refers to use links that don't point to the GitHub-published locations, so all you are getting is the bare HTML.
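Concretely, the usual fix is to replace absolute local paths with paths relative to the repository root, so they resolve both on your machine and under the GitHub Pages URL. The file names below are hypothetical examples, not taken from the actual repository:

```html
<!-- Absolute local path: resolves only on your own machine -->
<img src="/Users/tamannahoque/Documents/Skincare/images/banner.jpg">
<link rel="stylesheet" href="/Users/tamannahoque/Documents/Skincare/style.css">

<!-- Relative path: resolves locally and on GitHub Pages -->
<img src="images/banner.jpg">
<link rel="stylesheet" href="style.css">
```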

Need help scraping a website in perl [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I am new to the world of Perl and right now I am trying to scrape a webpage. I have done some scraping before using WWW::Mechanize. The pages I scraped before were fairly simple, so I took the page source and then extracted the data I needed from there. But now I have a different website that seems to contain frames: http://www.usgbc-illinois.org/membership/directory/
I am not asking for any code, just some ideas or modules I could use to extract data from the website above.
Thanks
You may find some useful information on this website about web scraping, and you can also take a look at the Web::Scraper module.
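Since the page uses frames, the key idea is that the framed page itself holds no data: each frame/iframe src points at a separate URL that you fetch and parse on its own. A rough sketch with WWW::Mechanize (which treats frame and iframe src attributes as links) might look like this; note that if the directory is actually rendered by JavaScript rather than frames, a plain HTTP client won't see the data at all:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use WWW::Mechanize;

my $mech = WWW::Mechanize->new;
$mech->get('http://www.usgbc-illinois.org/membership/directory/');

# Find every <frame>/<iframe> and fetch the page each one points at.
for my $frame ( $mech->find_all_links( tag_regex => qr/^i?frame$/ ) ) {
    my $inner = WWW::Mechanize->new;
    $inner->get( $frame->url_abs );
    print $inner->content, "\n";   # raw HTML of the frame, ready for parsing
}
```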

How to track Emails sent to popular Email providers? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
There are a few services that let you know whether someone opened your email, and even how many times and when. As references, see MailTrack and Mixmax.
I wonder: how can one detect when someone opens an email?
[Update]
As @GabrielCliseru said, most email providers strip JavaScript from HTML emails, so that approach is off the table:
Gmail strips out any content between script tags before displaying the message. (Source)
JavaScript code sent inside an email is not executed; this is a security limitation.
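Since scripts are stripped but remote images can still load, services of this kind are generally understood to embed a unique 1x1 "tracking pixel": when the recipient's client fetches the image, the server logs an open. The exact implementations of MailTrack and Mixmax are not public, so this is a minimal sketch of the general technique; the URL, log path, and parameter name are placeholders, and note that clients that block remote images will never trigger it:

```perl
#!/usr/bin/env perl
# track.pl -- a minimal tracking-pixel endpoint (sketch).
# Embed in the outgoing mail as:
#   <img src="https://your.server/track.pl?id=MESSAGE_ID" width="1" height="1">
use strict;
use warnings;
use CGI;
use MIME::Base64 qw(decode_base64);

my $q  = CGI->new;
my $id = $q->param('id') // 'unknown';

# Each request to this script means the recipient rendered the image.
open my $log, '>>', '/tmp/opens.log' or die $!;
print {$log} scalar(localtime), " message=$id opened\n";
close $log;

# Serve a transparent 1x1 GIF.
print $q->header('image/gif');
print decode_base64('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
```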
What exactly do you want to do? Do you want to send jQuery code by email?
You can do that simply, or, to be safer, you can put it in a text file and attach it.

Does Google update redirect URLs? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
I have shortened the URLs on my e-commerce store to make them more SEO-friendly; however, some of my original URLs are in a good position on Google.
If I redirect my old URLs to my new URLs, will Google automatically update its index to display my new URLs?
Yes, if you use permanent (301) redirects. That's pretty much the full answer.
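On an Apache server the permanent redirects would typically go in the site config or an .htaccess file; the paths below are placeholders, not your actual store URLs:

```apache
# Simple one-to-one permanent redirect
Redirect 301 /old-product-page /new-product

# Or a pattern for a whole section, via mod_rewrite
RewriteEngine On
RewriteRule ^shop/old/(.*)$ /shop/$1 [R=301,L]
```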

Writing to a Wiki [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 8 years ago.
I need to write a script that writes (appends) data to an internal wiki that isn't public (it requires a username and password, but is served unencrypted over HTTP, not HTTPS). The script can be a shell script, a Perl script, or even a Java application (last resort). Any help would be appreciated. Let me know if any additional information is needed.
Right now, I'm only able to read from the wiki, using the LWP Perl library's getprint($url) function.
Thanks
If it's truly MediaWiki, then just use MediaWiki::API.
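Assuming it is MediaWiki, an append could be sketched with MediaWiki::API like this; the api.php URL, credentials, and page title are placeholders you would replace for your internal wiki:

```perl
#!/usr/bin/env perl
use strict;
use warnings;
use MediaWiki::API;

my $mw = MediaWiki::API->new(
    { api_url => 'http://wiki.example.com/w/api.php' }   # placeholder URL
);

$mw->login( { lgname => 'botuser', lgpassword => 'secret' } )
    or die $mw->{error}->{code} . ': ' . $mw->{error}->{details};

# Fetch the current wikitext so we append rather than overwrite.
my $page = $mw->get_page( { title => 'Some/Internal/Page' } );
my $text = $page->{'*'} // '';

$mw->edit( {
    action => 'edit',
    title  => 'Some/Internal/Page',
    text   => $text . "\n== Appended by script ==\nNew data here.\n",
} ) or die $mw->{error}->{code} . ': ' . $mw->{error}->{details};
```

Since the wiki is plain HTTP, the password travels in the clear; that matches the constraint stated in the question, but is worth keeping in mind.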