Is it possible to embed (chart) visualizations created by the Google Visualization API in a Blogger post?
I tried stripping out the <head> and <body> tags (and closing tags) from the pie chart example; however, the pie chart visualization fails to render, even on a published post.
NOTE: I'm asking about the Visualization API, rather than the Google Image Charts (Charts API).
I managed to add a timeline chart; it's not a very straightforward process, but it worked.
Add this to the template of your blog in the <head>:
<script type='text/javascript' src='http://www.google.com/jsapi'></script>
Then, when you go to your post, write everything out and click Edit HTML. Paste the rest of the JavaScript in, but remove any line breaks, or they will automatically be replaced by tags inserted by the Blogger editor. Save it and you should be ready to go.
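To give a concrete idea, the script you paste into the post would look roughly like this, going by Google's standard pie chart example (the chart_div id and the sample data are just placeholders, and you'd collapse the whole thing onto as few lines as possible before saving):
<script type="text/javascript">
  // Load the Visualization API and the corechart package (jsapi is already in the template <head>)
  google.load('visualization', '1', {packages: ['corechart']});
  google.setOnLoadCallback(drawChart);

  function drawChart() {
    // Sample data from the standard pie chart example
    var data = google.visualization.arrayToDataTable([
      ['Task', 'Hours per Day'],
      ['Work', 11],
      ['Eat', 2],
      ['Sleep', 7]
    ]);
    var chart = new google.visualization.PieChart(document.getElementById('chart_div'));
    chart.draw(data, {title: 'My Daily Activities', width: 450, height: 300});
  }
</script>
<div id="chart_div"></div>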
Please let me know if you have a specific chart you want to show and I'll try to make it work.
Related
Is there any direct way to integrate Google Tag Manager JS snippets in AEM pages?
The AEM tools only provide for a Google Analytics snippet via the Cloud Services component. But GTM needs two code snippets on a page: one in the <head> and one in the <body> section of the page.
Hard-coding the JS snippets in my HTML is not a good idea.
GTM scripts are used for website-specific tracking. You can create a site-level configuration for the GTM container ID (thinking ahead), then add the header and footer scripts in your page component with a dynamic container ID read from that site-level config, so that it is available across all pages.
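For reference, the two standard GTM snippets that the page component would render look like this; GTM-XXXX stands in for the container ID read from your site-level configuration:
<!-- Google Tag Manager: goes as high in the <head> as possible -->
<script>(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXX');</script>
<!-- End Google Tag Manager -->

<!-- Google Tag Manager (noscript): goes immediately after the opening <body> tag -->
<noscript><iframe src="https://www.googletagmanager.com/ns.html?id=GTM-XXXX"
height="0" width="0" style="display:none;visibility:hidden"></iframe></noscript>
<!-- End Google Tag Manager (noscript) -->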
After unsuccessfully trying a few different packages to help with updating meta and og tags, I am exploring other options for integrating Facebook sharing or commenting into my Ember site. But I seem to be unable to dynamically generate the data-href value in my HBS templates.
<div class="fb-comments" data-href="http://www.whatever.com/reviews/{{model.id}}" data-width="510px" data-numposts="10"></div>
Whenever you access the first review, the comments load fine. But once you go to another, even though the href is updated, the comments plugin no longer loads. Is there a way to make this work with dynamic URLs?
I have a pure GWT-based website, and as we know, search engines cannot index pure GWT-based websites. I have therefore created an alternate web page, shown below, which is stored as a separate HTML file in the war folder. The purpose of this page is to list details about my website for indexing; it is never displayed on the site itself. The URL leading to this page is part of my Sitemap.xml, so I am assuming the HTML below will be indexed because it is part of the sitemap. So here are my questions:
Will the content I put in the div with id "crawler" be indexed, given that it is scheduled for removal on load and that the browser is redirected to another URL on load?
Is there a better way to get the content indexed for a pure GWT website which does not have any HTML-based user interface?
I could also have URLs that invoke a servlet and return a response meant only for indexing. But then the same URL would be displayed in the search results, which is not useful. In other words, I am trying to figure out a way in which the content gets indexed, but when users click the search result they are redirected to the home page instead of being shown the indexed content.
<head>
  <script>
    function load() {
      // Remove the crawler-only content, then send the visitor to the real site
      var element = document.getElementById("crawler");
      element.parentNode.removeChild(element);
      window.location.href = 'http://<mysite>.com';
    }
  </script>
</head>
<body onload='load()'>
  <div id="crawler">
    <CONTENT TO BE INDEXED>......
  </div>
</body>
As you can see, the div (crawler) that contains all the content meant for indexing is removed as soon as the body loads. Apart from this, the page also redirects to the home page of the site on load.
The crawler will read in the entire contents of the page for indexing, so it will have no trouble picking up the portion within the div. The onload handler is not executed by the crawler before it reads the page.
A method I have used in the past was to generate static HTML versions of the pages and reference them through the sitemap.xml. Users landing on the HTML page are then directed to the equivalent dynamic page when they click a link (e.g. Buy or Specifications). This worked well for search engine placement, with many pages appearing in the top ten.
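For illustration, the sitemap.xml simply listed the static HTML versions, something along these lines (the URLs are placeholders):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/static/product-1.html</loc>
  </url>
  <url>
    <loc>http://www.example.com/static/product-2.html</loc>
  </url>
</urlset>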
The best way to notify search engines about an otherwise undiscoverable website's content is to create an HTML version of the site (as you did). If you serve redirects based on detecting the crawler, search engines will not love you. I think you should fill your HTML pages with relevant content and add a
<link rel="canonical" href="https://gwtsite.com/exact_url"/>
tag to each page's head section. This tells the search engines that the other URL should appear in the SERPs instead of the HTML one.
I added a Google Custom Search Element to my website. I have already modified the look and feel of the CSE to match my site, but I can't modify the look of the ads element since it is inside an iframe.
Does anybody know how to set up the style of the ads iframe? I want to put in a background color and change the font.
Thanks!
This feature seems to be available only to Google Site Search customers. If you are one, you will be able to receive your search results as an XML feed and apply your own presentation layer on top of the raw data. All the details on how to do this are available on the official Google Custom Search site.
You can try this API for AdSense search - https://developers.google.com/custom-search-ads/docs/implementation-guide?hl=en - but you can't make the ads look like the search results.
So I created a nice six-page website, hutchspropertyandtree.co.nr, using freedomain.co.nr and my Dropbox public folder. Everything was working and updating properly until I updated it with iWeb's SEO Tool. I added meta and title tags as well as a description, etc. The PROBLEM is that even though my .html files in Dropbox are correct and show all the new code and tags, when I open my domain hutchspropertyandtree.co.nr it doesn't show any of my recent SEO Tool updates.
I'm thinking that the cheap domain name from .co.nr is the problem. Is it possible that the default tags, titles, and keywords entered into the co.nr website creation boxes are overriding the newer ones in the HTML within my Dropbox?
But that still doesn't explain why the stat counter code and Google Analytics code in the footer and header, respectively, do not show up when I view the source in my browser.
PLEASE PLEASE HELP.
It's because the page at hutchspropertyandtree.co.nr uses a frame to show the content from another location. The meta information comes from the page containing the frame, not the page inside the frame. You should be able to see the content of the frame using an inspector (which comes with all browsers these days) or "View frame source", if your browser offers that.
Note that any search engine hits to your pages will link to the Dropbox URL, not the frame page (which has essentially no content from a search engine's point of view). If you want search engine results to show up under that domain, you'll have to get hosting that lets you point the domain directly at your content.
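For a picture of what's happening, the .co.nr forwarding page is roughly something like this (illustrative only; the title and description are whatever was typed into the co.nr setup boxes, and the frame src is a placeholder for your Dropbox URL):
<html>
<head>
  <title>Title entered in the co.nr setup form</title>
  <meta name="description" content="Description entered in the co.nr setup form">
</head>
<!-- Your actual site is only ever shown inside this frame -->
<frameset rows="100%,*" frameborder="no" border="0">
  <frame src="http://dl.dropbox.com/u/XXXXXXX/index.html">
</frameset>
</html>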