About a month ago we implemented Rich Snippets on the product detail pages for our e-commerce site (example).
We used the http://schema.org/ syntax for the structured data, as it seems to be the direction Google is taking going forward.
The data validates in the Rich Snippet Testing Tool and has started to appear in Google Webmaster Tools.
However, the data has yet to show up in the SERPs.
We have followed Google's rich snippets guide to the letter and still see no results. Is this simply a case of waiting?
One additional piece of information makes this all the more puzzling: we initially went with a Microformats implementation, and within 24 hours the data started showing up in the SERPs. However, we moved away from that because the schema.org approach seemed a better bet.
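For reference, the markup on each product page is roughly of the following shape (the names, prices and URLs here are illustrative placeholders, not our actual data):

    <!-- Illustrative schema.org Product/Offer microdata; all values are placeholders -->
    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Example Widget</span>
      <img itemprop="image" src="/images/example-widget.jpg" alt="Example Widget" />
      <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <span itemprop="priceCurrency" content="GBP">£</span>
        <span itemprop="price" content="19.99">19.99</span>
        <link itemprop="availability" href="http://schema.org/InStock" />In stock
      </div>
    </div>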
I suppose it is one of the reasons explained in my Wiki post at
http://wiki.goodrelations-vocabulary.org/FFAQ#Why_is_Google_not_showing_rich_snippets_for_my_pages.3F
While that one refers to GoodRelations markup, the situation should be the same for schema.org.
Martin
Quote:
If you have added GoodRelations (manually or via a shop extension module) to your shop and still do not get rich snippets in Google search results, this can have one of the following reasons:
Google has not yet re-crawled your page or pages. Google dedicates just a limited amount of crawling time to a site, depending on its global relevance. It may be that Google has simply not yet re-indexed your page. Wait 2 - 8 weeks ;-)
The markup is invalid. Try the Google Validator. If that shows a rich snippet in the preview, you may just have to wait 4 - 12 weeks until Google notices and white-lists your pages. If it does not show a rich snippet, you either do not have valid GoodRelations markup in the page, you are missing properties that Google requires (e.g. gr:validThrough for prices), the price of the item has expired, or you use markup for which Google does not show rich snippets. Currently, Google shows snippets only for products and offers.
Google cannot see that your page changed. Your XML sitemap (http://example.com/sitemap.xml or similar) does not contain a lastmod attribute or the lastmod attribute was not updated after you added GoodRelations/schema.org. This attribute is important for crawlers to notice which pages need to be reindexed.
Low ranking of your item pages. Your item pages have a low ranking and what you see in your Google results are category pages or other pages summarizing multiple items. GoodRelations shop extensions add markup only to the "deep" item pages, because those are best for rich snippets. Use the title / product name of one of your products and restrict the Google search to your site with the additional statement site:www.example.com.
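To illustrate the lastmod point above, a minimal sitemap entry looks like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://example.com/products/example-widget</loc>
        <!-- update lastmod whenever the page's markup changes -->
        <lastmod>2012-05-01</lastmod>
      </url>
    </urlset>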
Related
I have duplicate content on my home page.
Google Webmaster Tools tells me that I have a problem with duplicate content:
For example:
www.example.com/page1/
www.example.com/page2/
www.example.com/page2/
How can I remove it?
What that page says is not that you have duplicate pages but that you have several pages with the same meta description:
Meta descriptions are HTML attributes that provide concise summaries of webpages. They commonly appear underneath the blue clickable links in a search engine results page.
Usually each page should have its own meta description that describes its content (that is why Google warns you about duplicates), but sometimes it's fine for several pages to share the same description.
For example, based on your screenshot your site appears to be about mobile phones. Let's say one of the two-page duplicates consists of a phone's summary page and its technical specifications page (I'm guessing, as I don't understand Arabic). The meta descriptions of those two pages could be similar, but not identical, because each should reflect the different aspect of the phone it covers (summary vs. technical specs).
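For instance, with hypothetical wording, the two pages' descriptions might read:

    <!-- Summary page (hypothetical wording) -->
    <meta name="description"
          content="Overview of the Example X1 phone: design, camera and battery life at a glance.">

    <!-- Technical specifications page (hypothetical wording) -->
    <meta name="description"
          content="Full technical specifications of the Example X1 phone: display, CPU, memory, connectivity.">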
On the other hand, the duplicate spanning 14 pages appears to be a paginated product list, perhaps phones sharing the same tag. If that is correct, then it's fine for all of those pages to have the same meta description, as they are just parts of the same topic split across several pages.
I am introducing rich snippets on my site and have some questions I can't find solutions to:
Do I need to put the main company markup only on the main page, or on all pages (contacts, social networks, etc.)? I mean, should I copy the code onto every page?
How do I get the nice two-column snippet with the main site links, and how do I define what those main links are? For example, when we search for Facebook we see: Facebook Login, Facebook Register, Facebook Profile, etc., each with a brief description below. Are those separate pages that contain the snippet, with Google identifying the most relevant ones? What code should I put on each page?
If you are trying to add Knowledge Graph items, you only need to mark up your homepage. For individual rich snippet items like star ratings, breadcrumbs, etc., you'll have to mark up each page for them to show up across all of your search results.
Contact and social profiles, as you mentioned, are Knowledge Graph items.
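One common way to do that homepage-only markup is a JSON-LD block along these lines; every name, URL and number below is a placeholder, so treat this as a sketch rather than a drop-in snippet:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Organization",
      "name": "Example Company",
      "url": "http://www.example.com",
      "logo": "http://www.example.com/logo.png",
      "sameAs": [
        "https://www.facebook.com/examplecompany",
        "https://twitter.com/examplecompany"
      ],
      "contactPoint": [{
        "@type": "ContactPoint",
        "telephone": "+1-800-555-0100",
        "contactType": "customer service"
      }]
    }
    </script>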
As for the second part, I am assuming you are referring to the links below the branded search result. Those are not rich snippets; they are called sitelinks, and they are generated when you have good on-site structure and internal linking. Sitelinks are picked by Google itself, and you have little control over them.
I am looking to generate a list of URLs or FB IDs for a set of existing Facebook Pages. Ultimately, the initial query I am looking to run is simple: find all NEW Facebook Pages created in city XYZ.
The term NEW is open to interpretation. It could mean "created this month", or newer relative to others in a set (these details are not important at the moment). Also, "Pages" refers to fan pages (not user profiles).
I have identified three possible approaches, all of which I am hoping to get some input on regarding feasibility and process.
Option 1) Somehow leverage Facebook's Graph API and develop some kind of web application to generate a list of all Pages, then filter by city, then filter by date created.
Option 2) (Best case) Write or generate a custom Graph Search URL with embedded search criteria and leverage FB's existing search feature to get results. A great example of this approach is the tool searchisback.com, though that tool does advanced searches on people, whereas I need advanced searches on Pages.
Option 3) Locate a tool that already does this that I can use.
Again, I am hoping to get some input and possibly some direction/recommendations.
I should also mention that I actually know very little about Facebook APIs and Facebook development. My position right now is that of someone who knows what they want to do, but has no idea how to do it.
Option 1: Not possible. You can only search for Pages by name; Graph Search is not available via the API. These are all the available search options: https://developers.facebook.com/docs/graph-api/using-graph-api#search (see the sketch below this list).
Option 2: See answer to option 1.
Option 3: There are tools that list Pages, but they all have to add them manually. So there is not really a tool that does what you want to achieve.
In short: What you want to do is not possible.
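For completeness, the name-based Page search the API does support (per the link in Option 1) looks roughly like this, with a placeholder token:

    GET https://graph.facebook.com/search?q=coffee&type=page&access_token=YOUR_ACCESS_TOKEN

Note that there is no parameter for filtering by city or creation date, which is exactly the filtering you would need.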
I'm trying to determine the best way to "merge" my Orchard blog into my existing website. Currently the blog is accessed outside the site.
I threw together a quick view in my MVC site that just loads the blog into an iframe. Any other ideas?
The blog is tuned up with a great theme and tons of mods & styling that matches my main site design to a T.
On the home page of my site, I'm using the RSS feed to output a list of the last 3 blog posts. My idea is that the user will click on a blog post link and go directly to the view that hosts the blog in the inline frame.
I guess the only variable I haven't handled yet is how to load the correct page in the blog based on the link the user clicked on my main site's home page.
I've read other posts on this subject, and the solution that is always offered is to merge all the code from the main website into Orchard, which seems insane... I have a very large auction-based website; taking all that logic & content and putting it into Orchard is not an option.
Hope all that makes sense, thanks for the input. I can't think it would be a huge issue to "seamlessly" integrate my blog with my MVC site.
Orchard was never designed to be integrated into an existing application, so something like what you've done is what you have to do. The iframe however has a number of problems, such as its fixed size, and awkward navigation. It's better to integrate data than markup. It's now easy to build WebAPI controllers to expose Orchard data. You could consume that data in your application and render it there. That enables you to manipulate the data before rendering, which is of course easier than manipulating rendered HTML. For example, you can build your own link URLs so that clicking on a post's title goes to an action on your site that fetches the post contents rather than the Orchard post URL.
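A minimal sketch of that idea, assuming a hypothetical Web API controller inside your own Orchard module (the class name, route and projection are illustrative):

    using System.Collections.Generic;
    using System.Linq;
    using System.Web.Http;
    using Orchard.ContentManagement;
    using Orchard.Core.Title.Models;

    // Hypothetical controller in a custom Orchard module: exposes published
    // blog posts as plain data that the main MVC site can fetch and render.
    public class BlogPostsController : ApiController
    {
        private readonly IContentManager _contentManager;

        public BlogPostsController(IContentManager contentManager)
        {
            _contentManager = contentManager;
        }

        // GET <orchard-site>/api/<your-module>/BlogPosts
        public IEnumerable<object> Get()
        {
            return _contentManager
                .Query(VersionOptions.Published, "BlogPost")
                .List()
                .Select(post => new
                {
                    post.Id,
                    Title = post.As<TitlePart>().Title
                });
        }
    }

Your main site can then fetch that JSON, render the posts in its own layout, and generate its own link URLs (e.g. /blog/{id}) that resolve to a local action instead of the Orchard post URL.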
One final comment: It is a little weird that an auction website would need to integrate a blog in the middle of its own rendering. Shouldn't the blog be a separate section of the site?
My blog was successfully migrated to Octopress and GitHub Pages. My problem, though, is that the website's search uses Google search, and the results, as you can see, point to the old (WordPress) links. These links have now changed structure, following the default Octopress structure.
I don't understand why this is happening. Is it possible that Google has the old links stored in its index (my blog was on the first page for some searches, but gathered just 3,000 hits a month... not much by internet standards) and this will change with time, or is it something I can change myself somehow?
Thanks.
1. You can wait for Google to crawl and re-index your pages, or you can use the URL Removal Request tool to expedite removal of old pages from the index:
http://www.google.com/support/webmasters/bin/answer.py?answer=61062
According to that page, the removal process "usually takes 3-5 business days."
Consider submitting a Sitemap:
http://www.google.com/support/webmasters/bin/answer.py?answer=40318
You can also resubmit an existing sitemap from Webmaster Tools.
More information about Sitemaps:
http://www.google.com/support/webmasters/bin/answer.py?answer=34575
http://www.google.com/support/webmasters/bin/topic.py?topic=8467
http://www.google.com/support/webmasters/bin/topic.py?topic=8477
https://www.google.com/webmasters/tools/docs/en/protocol.html
2. Perhaps your company might consider the Google Mini? You could set up the Mini to crawl the site every night or even 'continuously':
http://www.google.com/enterprise/mini/
According to the US pricing page, the Mini currently starts at $1995 for a 50,000-document license with a year of support.
Here is the Google Mini discussion group:
http://groups.google.com/group/Google-Mini
http://www.google.com/enterprise/hosted_vs_appliance.html
(Click: "show all descriptions")
http://www.google.com/support/mini/
(Google Mini detailed FAQ)