I'm just looking through some of the webmaster stats that Google provides, and noticed that the most common links to our website are to some research articles that we've put up in PDF format. The articles are also available on the site in HTML.
I was looking at the sites (mostly forums and blogs) which link to these articles and was thinking that none of the people clicking the links would actually get to see our website, and that we're giving something away for free and not even getting some page views in return.
I thought that maybe I could change my server settings to redirect external requests to these files to the HTML version. This way, the users still get the same content (albeit in an unexpected format), and we'd get these people to see our website and hopefully explore it some more. Requests coming from my site should be let through to the PDF. Though I don't know how to set this up just yet (keep an eye out for a follow-up question here), I'm sure this is technically possible. The only question is: is that a good idea?
What would you consider the downsides of redirecting traffic from external sources such that they see our site, not just get our content? Do they outweigh the benefits?
The only other option I can see is to make our branding and URL much more visible in the PDF files themselves. Any thoughts?
Hopefully your PDFs are equally well branded, so that readers feel compelled to explore your website further. That might be just as important as having visitors briefly stop over at your site.
I'm usually opposed to all such redirects as harmful to usability. In this case, however, a basic form of content-type negotiation is taking place, and that might be acceptable. Just make sure it doesn't break downloads of the PDF documents for users who have disabled referers in their browser (I do this, for one).
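To make that concrete, here is a rough sketch of how such a check might look in PHP. The file names, domain, and routing are placeholders, and it assumes requests for the PDF are passed through a script rather than served as a static file:

<?php
// Hypothetical sketch: serve the PDF to internal or blank referers,
// redirect everyone else to the HTML version of the article.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$ownHost = 'example.com';                                    // placeholder domain
$pdfFile = __DIR__ . '/papers/some-article.pdf';             // placeholder path
$htmlUrl = 'https://example.com/articles/some-article.html'; // placeholder URL

$internal = ($referer === '') || (parse_url($referer, PHP_URL_HOST) === $ownHost);

if ($internal) {
    // Own-site links and disabled/blank referers still get the PDF.
    header('Content-Type: application/pdf');
    readfile($pdfFile);
} else {
    // External links are sent to the HTML version instead.
    header('Location: ' . $htmlUrl, true, 302);
}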
Sure you could cut them off, but there is a bigger issue at play: Why aren't these people finding you before they are finding these moocher sites?
Possible reasons are:
a) they did find your site, but not the content they were looking for, even though it's obviously there, or
b) your site never appeared in their search results.
You may want to consider a site redesign in order to address those concerns before cutting off what appears to be a reliable source of information about your target audience (for you and the people who get your PDFs from elsewhere).
In the meantime, I would suggest you allow the traffic, add a cover page to each of your PDFs that is basically a full-page ad for your site, and enlarge the font on the copyright section of each page so the authorship is very prominent. You have a built-in audience now; they just don't know it yet. Show them where the source is.
Eventually, the traffic will come to you and know you as a reliable source for that information.
I would do it. It's your site and your data.
The hot-linkers are essentially 'guests' and you can make the rules for your guests.
If they don't like it, they don't have to link.
I would add a page at the beginning of each article with info about the website, the current article and links to other articles on your website.
I find it more convenient than redirecting the user to a page on your website (that's annoying). Most people right-click and download PDF files; what would they end up with when you redirect? ;)
I think the proper thing to do in this situation is to skip the redirects. Here's why:
There's nothing worse than expecting to go somewhere/get something and not getting it (the negative impact would outweigh the positive.)
Modify your content to add a footer such as: "Like what you saw? We've got more; check us out at www.url.com"
If your content is good, users will check out your website. These are the visitors you want, they're more likely to stick around and provide your site with value (whatever that may be.) Those that you've coerced may provide you with an extra click or two, but you will likely not see any value given back to your site.
Look at other successful sites that give something away for free: Joel on Software, Seth Godin, Tim Ferriss, 37Signals. The long term will provide better, more consistent value than the short term.
If you go for this solution, check whether redirecting to the HTML version also changes the file name the browser suggests when somebody uses "Save as" on the link; otherwise an HTML page would be saved with a .pdf extension. Apart from that, I can see no reason why you shouldn't do it.
As an alternative, see if you can add a link to your site at the top of the PDF file. That way readers are reminded where it comes from even if someone else sent it to them by email.
Full disclaimer: I am not a programmer; I am an SEO trying to learn how not to rely on my developer for every little question I have.
Currently my issue is this: I use Screaming Frog to crawl my sites to lay out the page titles, meta descriptions, h1, h2, etc., so I can more easily plan out my changes.
The other day I wanted to run a report for my client and my own company website and got the following back.
So I know robots.txt is a way to have pages on your site but not have Google crawl them. What I don't know is why an entire site would show this message, as opposed to just some pages.
Can anyone give advice on how to fix this, or links to how-tos? I get this issue a lot and would like to educate myself so I don't have to wait for someone else. I also get these when I try indexing websites in Google Search Console.
Many Thanks
"What I don't know is why an entire site would have this message as opposed to just some pages."
The robots.txt for your website has not been written properly if the intention is to index its content.
Or Screaming Frog might have a bug, if indeed the robots.txt file is written properly.
Or some webmaster decided the content was not worth indexing on Google, or that bots would eat too much bandwidth, and restricted access for all crawlers rather than being selective about it.
Checking the current robots.txt file on that website, I see this content:
User-Agent: *
Disallow:
Which means any page of that website is allowed to be crawled by any crawler (here is an explanation of that file's syntax: https://moz.com/learn/seo/robotstxt).
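For comparison, a robots.txt that blocks the entire site from all crawlers, which is the kind of file that would make every page show that message, would look like this instead:

User-Agent: *
Disallow: /

A single slash disallows everything, while the empty Disallow line in the file quoted above disallows nothing.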
So the current file should not cause the error the OP mentions. Seeing that this question is from June 30, 2017 and the robots.txt file was last modified on July 11, 2017, it seems the OP may have already fixed whatever problem they had since this question was opened.
I recently started learning about Drupal because I wanted to learn how to create sites that I can give to people with no HTML experience who want to be able to update their site. Through my research I learned that Drupal is the best-supported CMS. It really does have a lot of nice features and accomplishes the job, but it almost has too many features for what I want.
I'm assuming there is some kind of open-source software for this.
I am an aspiring web developer trying to build my portfolio and gain experience. What I've been trying to do is build sites for clients that I can then hand off completely; right now, when their store hours change and they have no HTML experience, I get emails asking me to update their site.
I figure there are three approaches: (tell me if there are more)
I write a php app that allows them to edit their site
I use a CMS (Drupal) to let them edit their site
I write scripts that embed text files formatted with {white-space: pre;}
I've so far implemented each method on 3 different sites, and they all work, each with drawbacks. I would prefer an open-source alternative to writing my own app, for stability and security. Drupal seems more oriented towards allowing multiple users to add content, whereas I only want one user to update existing content. The third option (sketched below) works well for computer-literate clients, but anyone who can navigate onto their server to change the file could probably figure out how to update the site without any of these approaches.
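For reference, by the third approach I mean something roughly like this (the file name is just a placeholder):

<?php
// The client edits a plain text file; the page dumps it out, preserving
// line breaks via white-space: pre.
$text = file_get_contents(__DIR__ . '/content/store-hours.txt');
?>
<div style="white-space: pre;"><?php echo htmlspecialchars($text); ?></div>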
To sum up my problem, can anyone tell me the term I am looking for? Content Management System refers to the site framework for sites with a growing number of content posts (correct me if I'm wrong). What is the term for the site framework for editing sites with predefined but editable pages? If you could please tell me that, then I can at least research this question on my own. Otherwise, if you have any advice or solutions, they are much appreciated!
Thanks
user1470887, you've asked a great question. The answer, unfortunately, is that too many of the existing CMS products overlook this use case. It doesn't have an exact name as far as I know.
The term "in-place editing" describes one version of this (user clicks text on web page, block of text becomes a form, user edits contents and presses submit button, new text is sent to webserver and saved, and the form becomes normal text again). But I gather you would be happy with anything that lets them edit-existing but not create-new.
I'm also guessing you don't want to build your own Drupal module or commission one.
I do not know Drupal well enough to know whether there's a Drupal module that meets your needs. I'd recommend a careful search, though, especially if you are already somewhat familiar with Drupal. (Yes, Drupal can seem like too much CMS at times.)
However ... if you can't find a Drupal solution or want an alternative to Drupal, MODX Revolution does have an answer: set it up and then install Bob Ray's NewsPublisher add-on. It will put an "edit" button on pages which a user has the right to edit, but not on pages where they don't have edit rights. (And of course users will only be able to edit the title, body content etc - not the entire page.)
Bob Ray has literally written the book on MODX (MODX: The Official Guide). I was able to successfully adapt NewsPublisher to a project last year similar to what you have described, with predefined pages that the user would only need to edit over time. The latest NewsPublisher version, untested by me, is said to be further improved and can now be styled much more easily using CSS. That should allow you to give your users a customised and consistent interface.
As andmag also notes, MODX is a very flexible system for web developers focused on the presentation layer. It has the best templating system going.
I recommend you try MODX. It gives you great flexibility to run your PHP or HTML code.
As a side project I tutor grandparents and other computer novices in Computer & Internet 101, from physically using a mouse to dealing with e-mail/searching/etc. Web development isn't really my area of focus - I do have reasonable HTML/CSS/Javascript etc skills, so I can throw together a decent-looking simple, static site - but occasionally I get asked to put together extremely simple websites for these people, that they can update themselves; that is, edit text-based content without giving Grandpa a heart attack by making him come face-to-face with HTML/Javascript.
I've waded through a mile-long list of CMS software - largely culled from the many other similar questions on SO - but they've all got something ruling it out: hosted, restricts the design (can't use w/existing CSS, looks "Word-press-y", etc), not free/FOSS, etc. I wonder if "CMS" is even the right word for what I'm looking for. What I need is a simple text editor for the client: that is, something that will give the client a text box of some variety, let them edit it, and update the content with that info. They can't mess with navigation, add new pages, change anything other than text. If it was really fancy, they could upload a picture.
I was planning to do this just with a couple of password-protected php forms, but thought I'd ask if there's anything already out there that might provide this functionality? Any suggestions on building my own version of this, in PHP or something else?
What I'm really interested in is:
1) the simplicity/customize-ability of the admin interface (or lack of admin interface, if the client could somehow edit directly in the page), and
2) ease of set up for me (not getting paid much if at all for this, don't want to wade through three million plugin options to figure out how to get some unwieldy, high learning-curve framework to do what I want).
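To make the "couple of password-protected PHP forms" idea above concrete, this is roughly the kind of thing I had in mind; everything here (file name, the hard-coded password) is a placeholder sketch rather than production code:

<?php
// Placeholder sketch: one shared password, one editable text block stored
// in a flat file next to the script.
$file = __DIR__ . '/content/about.txt';

if ($_SERVER['REQUEST_METHOD'] === 'POST'
    && isset($_POST['password'], $_POST['content'])
    && $_POST['password'] === 'change-me') {
    file_put_contents($file, $_POST['content']);
}
$current = file_exists($file) ? file_get_contents($file) : '';
?>
<form method="post">
  <textarea name="content" rows="10" cols="60"><?php echo htmlspecialchars($current); ?></textarea>
  <input type="password" name="password">
  <button type="submit">Save</button>
</form>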
Try pulsecms.
Here is another very simple CMS that has jQuery and Modernizr, HTML5 Boilerplate and TinyMCE.
I have my wife set up with Windows Live Writer:
http://explore.live.com/windows-live-writer?os=other
This means that she just builds her articles as if she were using a word processor (it's almost exactly the same) and then uploads the article to her blog. I use BlogEngine.NET to host the blog on a GoDaddy hosting solution.
BlogEngine comes with built-in support for Live Writer and only requires that you enter the address, username, and password.
I understand this is an old post, but I hope someone finds this of interest.
You could instruct the users to upload text files to the site, and then have the HTML/PHP/ASP pages load the contents of those .txt files.
Each web page should have a specifically named .txt file associated with it.
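A rough sketch of that idea in PHP (page and file names are placeholders):

<?php
// Each page loads the text file named after it, so the client only ever
// touches the .txt file (e.g. about.php reads content/about.txt).
$page = 'about';
$text = file_get_contents(__DIR__ . '/content/' . $page . '.txt');
echo nl2br(htmlspecialchars($text));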
Hello, I have a website with around 5-6 pages (plain HTML). There are areas in these pages that I need to update occasionally. Is there any free / open-source CMS to maintain these editable areas of an HTML page?
Thanks
Perch is excellent for small sites.
At its very simplest, Perch allows you to replace static content in an HTML file with placeholders. A simple GUI then allows you to edit those placeholder values for individual pages. So, for example, if you have a file containing this chunk of markup:
<h1>My site</h1>
you can change that to:
<h1><?php perch_content('Main heading'); ?></h1>
and you'll then be able to edit 'Main heading' through the GUI. Most CMS apps work in a similar way, but Perch is the first I've come across that does very little else, which is a huge plus for small projects.
I haven't used Perch for a while, and I'm sure they've added some features since I last did, but I'd still recommend you give it a try. It's cheap, too.
I think CouchCMS is a pretty good open-source alternative to the likes of CushyCMS and Perch.
I recommend cushy
http://www.cushycms.com/
http://drupal.org/ is very popular. Many people also use Wordpress - http://wordpress.org. Also try googling "simple cms".
The answer will obviously be dependent on the requirements of the software and the capabilities of your server.
You should also check out opensourcecms.com. You can try out various CMSs there until you find one you like.
For a five-page website, Drupal is probably overkill; I'd say WordPress is good enough (just define a page for each page of the website, copy and paste your content, choose a theme, and you're done). (You would want to use the blogging features of WP to take full advantage of it, though.)
If for some reason you really want to try out Drupal but don't want to invest a lot of time into figuring it out (it does take some ... well, a lot ... of time to figure out right out of the box), and you're not in a big hurry, you can wait a bit until it's possible to try out the new Drupal Gardens hosted CMS system (currently in beta). (You need a beta key to try it. Sign up for the beta on the site and then wait for your key.)
Since you're most likely a programmer, I would recommend GitHub's very own Jekyll:
Here are some sites powered by it:
https://github.com/mojombo/jekyll/wiki/sites
As a bonus you can use Github to provide you free hosting (your site will be a public repository that only you can edit).
Have you tried using mut8? They have pretty alright features.
http://mut8.me
I see them all the time and always ignore them. Can someone explain to me why they have become so prevalent? If I'm using a site that allows me to explore it via tags (e.g., this one, del.icio.us, etc.) that's what I will do. Why would I need a "cloud" of tags upon which to click? I can just type that tag(s) into a search box. What am I missing?
It's more of a browse assist than a search assist. If you see a large or bold tag in a tag cloud that interests you, it may lead to some knowledge discovery that wouldn't otherwise have come from a deliberate search. When I am browsing del.icio.us or stackoverflow I appreciate the tags, as they sometimes lead me to discover related topics.
Wikipedia has an interesting definition:
A tag cloud or word cloud (or weighted list in visual design) is a visual depiction of user-generated tags, or simply the word content of a site, used typically to describe the content of web sites. Tags are usually single words and are typically listed alphabetically, and the importance of a tag is shown with font size or color. Thus both finding a tag by alphabet and by popularity is possible. The tags are usually hyperlinks that lead to a collection of items that are associated with a tag.
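The "importance shown with font size" part of that definition is straightforward to implement; as a rough sketch (made-up tag counts, placeholder URLs), a cloud can simply scale each tag's font size by its count:

<?php
// Scale font size between 12px and 32px according to how often a tag is used.
$tags = array('php' => 120, 'css' => 45, 'drupal' => 10);   // made-up counts
$max  = max($tags);
foreach ($tags as $tag => $count) {
    $size = 12 + (int) round(20 * $count / $max);
    echo '<a href="/tags/' . $tag . '" style="font-size: ' . $size . 'px">'
       . htmlspecialchars($tag) . '</a> ';
}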
It's an easy mechanism for seeing which tags are most popular and how heavily a given tag is used (the number of items carrying it).
It's just an intuitive interface; I'm fairly certain that's one of the bigger reasons why they are so popular. That, and they are very Web 2.0.
Why would I need a "cloud" of tags upon which to click? I can just type that tag(s) into a search box. What am I missing?
How do you know what tags are available to type without a lot of trial and error? Even if you know what tags are available, how do you know which are most popular without a bunch more trial and error?
The thing that makes a tag cloud really useful (at least a well implemented tag cloud IMO) is the ability to drill into a topic deeper and deeper.
For example, I could click "Topic A" and then I can see the items in the tag cloud for all tags within the "Topic A" items. I can then drill into one of those sub topic and narrow the items even further.
The stackoverflow tag cloud doesn't do this (which is too bad), but if it did, I could click something like "visualstudio" to drill into the threads tagged visualstudio then click "asp.net" to drill into that, then "javascript". The end result would be a list of all items tagged all three "visualstudio", "asp.net" and "javascript". This is where a tag cloud becomes really useful. Unfortunately, not all tag clouds work this way (but IMO they should).
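A rough sketch of that drill-down mechanic, with made-up data: each tag the user clicks intersects the current result set with the set of items carrying that tag.

<?php
// Made-up example: only items tagged with every selected tag survive each step.
$itemsByTag = array(
    'visualstudio' => array(1, 2, 3, 5),
    'asp.net'      => array(2, 3, 5, 8),
    'javascript'   => array(3, 5, 9),
);
$selected = array('visualstudio', 'asp.net', 'javascript');
$result = null;
foreach ($selected as $tag) {
    $result = ($result === null) ? $itemsByTag[$tag]
                                 : array_intersect($result, $itemsByTag[$tag]);
}
print_r(array_values($result));   // items tagged with all three: 3 and 5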
Because searching for php is not the same as viewing all posts that the owner has tagged as php. Try it.
It helps you understand the focus of the page or site that you're looking at. What topics are being discussed the most? What kinds of information will I find here?
If you search for something related to Java and land on two sites, one with a tag cloud showing 'Java' is prominent, and one where Java is almost invisible but 'C#' is prominent it's pretty easy to quickly decide which site is most valuable to you.
Tags give a way of explicitly labelling something with what it is about instead of relying on computers to extract this information.
For example, you might be interested in questions about stackoverflow. If you search for "stackoverflow" you will get all kinds of questions that are not about stackoverflow at all (e.g. they only contain the word "stackoverflow" because there is some link to another question). By selecting questions that are tagged with "stackoverflow" you get only those posts that people have explicitly identified as being about stackoverflow.