Bing Search API: limit results to certain sites

Can I make the Bing Search API return results only from a specific section of a site, for example www.domain.com/dir/? This is similar to what Google Custom Search does, but Google CSE has a hard limit of 10k queries per day. Is there an alternative search API available for this purpose?

You can simply put this into the query field, like this:
site:domain.com/dir/ whatIAmLookingFor
The query field can contain anything the Bing engine supports; although the operator is called site:, it apparently also accepts URL paths.
You can read the Bing Search API docs here.

madmuffin is correct, but I'm expanding on his answer to show how multiple sites can be searched. You can use the advanced search operator "site:" as part of your query "q" param.
Single Site Example
"heart disease" site:www.example.com
Multiple Site Example
"heart disease" (site:www.example1.com OR site:www.example2.com)

Related

How to get EXTRA (complimentary) results from bing-search API

When we search for a term on www.bing.com (e.g. 'newyork'), we get some extra information along with the search results on the rightmost side of the page. For 'newyork', for example, you would see some content from Wikipedia, a map location, its Twitter page, etc.
Is there a way to get this information from current BING SEARCH API?
Some of this information is available in the Bing Entity Search API here: https://azure.microsoft.com/en-us/services/cognitive-services/bing-entity-search-api/.
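For reference, a minimal sketch of calling it is shown below; it assumes the v7 REST endpoint, a placeholder subscription key, and the entities.value response field, all of which should be checked against the current documentation:

    import requests  # third-party HTTP client

    # Assumed values: Bing Entity Search API v7 endpoint and a placeholder key.
    SUBSCRIPTION_KEY = "YOUR_ENTITY_SEARCH_KEY"
    ENDPOINT = "https://api.bing.microsoft.com/v7.0/entities"

    response = requests.get(
        ENDPOINT,
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
        params={"q": "new york", "mkt": "en-US"},  # mkt (market) is required for entity search
    )
    response.raise_for_status()

    # Each entity carries the kind of sidebar data shown on bing.com:
    # a name, a short description and, when available, a URL.
    for entity in response.json().get("entities", {}).get("value", []):
        print(entity.get("name"), "-", entity.get("description"))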

Facebook URL Graph Search - Page Queries

I am looking to generate a list of URLs or FB IDs for a set of existing Facebook Pages. Ultimately, the initial query I am looking to run is simple: find all NEW Facebook Pages created in city XYZ.
The term NEW is open to interpretation. It could mean "created this month", or simply newer relative to the others in a set (these details are not important at the moment). Also, "Pages" refers to fan pages (not user profiles).
I have identified three possible approaches, and I am hoping to get some input on their feasibility and process.
Option 1) Somehow leverage Facebook's Graph API and develop some kind of web application to generate a list of all Pages, then filter by city, then filter by creation date.
Option 2) (Best Case) Write or generate a custom Graph Search URL with embedded search criteria and leverage FB's existing search feature to get results. A great example of this approach is the tool searchisback.com, except that tool does advanced searches on People, whereas I need advanced searches on Pages.
Option 3) Locate a tool that already does this that I can use.
Again, I am hoping to get some input and possibly some direction/recommendations.
I should also mention that I actually know very little about Facebook APIs and Facebook development. My position right now is that of someone who knows what they want to do but has no idea how to do it.
Option 1: Not possible; you can only search for Pages by name, and Graph Search is not available through the API. These are all the available search options: https://developers.facebook.com/docs/graph-api/using-graph-api#search
Option 2: See the answer to Option 1.
Option 3: There are tools that list Pages, but they all add them manually, so there is not really a tool that does what you want to achieve.
In short: What you want to do is not possible.
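For completeness, the name-based Page search that is available looks roughly like the sketch below. It assumes the /search endpoint with type=page from the linked docs and a placeholder access token (newer Graph API versions expose this as /pages/search instead); note that there is no parameter for city or creation date, which is exactly the limitation described above.

    import requests  # third-party HTTP client

    # Assumed values: the search endpoint from the linked docs and a placeholder token.
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
    GRAPH_URL = "https://graph.facebook.com/search"

    response = requests.get(
        GRAPH_URL,
        params={
            "q": "coffee",                 # Pages can only be matched by name
            "type": "page",
            "fields": "id,name,location",  # location appears only if the Page has set one
            "access_token": ACCESS_TOKEN,
        },
    )
    response.raise_for_status()

    # Any city filtering has to happen on your side, and only for Pages
    # that chose to publish a location.
    for page in response.json().get("data", []):
        print(page["id"], page["name"], page.get("location"))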

Grouping for website pages on google search

I have a website, slateone.com. When I search for slateone.com on Google, it shows a result with a group of links displayed together as a single entry, whereas when I search for slateone it shows the same links separately, without any grouping.
I tried the same kind of search for popular websites like Facebook, and Google shows the same grouped result for both facebook.com and facebook.
Please guide me on this issue.
Thanks in Advance!!
-Vinay
You are actually referring to sitelinks; they are completely automated and not editable.
First of all, your website needs to have the title attribute defined in its tags; this will help Google's algorithm find the links, but note that this alone is not guaranteed to be the solution.
The exact algorithm is known only to Google; the fix I mentioned will make the links more readable for the bots, which should help.
The official information can be found here: https://support.google.com/webmasters/answer/47334?hl=en

Google Analytics API, filter by parameter

Our app uses the Google Analytics REST API. We'd like to get the number of page views generated by different links to the site.
For example, one link to our site might be:
http://oursite.com?linknum=12345
and another might be:
http://oursite.com?linknum=23456
We'd like to track the number of page views by all visitors who click on each link, so we need a way to filter by parameter.
So far, we just get the number of page views for all visitors without any filters:
curl 'https://www.googleapis.com/analytics/v3/data/ga?ids=ga:(our id)&metrics=ga:pageviews&start-date=2011-12-08&end-date=2014-04-26&access_token=(our access token)'
The best way to learn the API is to use the query explorer at
http://ga-dev-tools.appspot.com/explorer/
For your analysis, add dimensions=ga:pagePath and sort=ga:pageviews.
In addition, you can ask only for pages which match a filter expression.
For example, filters=ga:pagePath=@linknum to only include pages whose path contains linknum (=@ is the "contains substring" operator).
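Putting those pieces together, a request with the dimension and filter applied might look like the sketch below (assuming the Core Reporting API v3 endpoint from the question, a placeholder view ID, an OAuth access token obtained elsewhere, and an example date range):

    import requests  # third-party HTTP client

    # Assumed values: Core Reporting API v3 (as in the curl example above),
    # a placeholder view/profile ID and a placeholder OAuth 2.0 access token.
    PROFILE_ID = "ga:XXXXXXXX"
    ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

    response = requests.get(
        "https://www.googleapis.com/analytics/v3/data/ga",
        params={
            "ids": PROFILE_ID,
            "metrics": "ga:pageviews",
            "dimensions": "ga:pagePath",        # break pageviews down per URL, query string included
            "filters": "ga:pagePath=@linknum",  # =@ keeps only paths containing "linknum"
            "sort": "ga:pageviews",             # sort by page views (prefix with '-' for descending)
            "start-date": "2014-04-26",         # example date range
            "end-date": "2014-05-26",
            "access_token": ACCESS_TOKEN,
        },
    )
    response.raise_for_status()

    # Each row is [pagePath, pageviews] because we asked for one dimension and one metric.
    for path, pageviews in response.json().get("rows", []):
        print(path, pageviews)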

Facebook Graph API SEO Comments and Profanity Filter

I'm trying to integrate the Facebook comments left on our site in a way in which the content can be crawled by search engines and also for people (although I highly doubt there will be many) who don't have Javascript enabled on their browser.
Currently our Facebook comments are displayed via the Facebook Comments social plugin (using the <fb:comments href="MY_URL" num_posts="50" width="665"></fb:comments> tag). This ends up rendering an iFrame (which is mostly ignored by search engine crawlers), so the plan is to render this information and format it with basic HTML. To do this, the comments are pulled using the Graph API; this is then only displayed to crawlers and to people with JavaScript disabled.
This all works nicely using the Graph API call (https://graph.facebook.com/comments/?ids=MY_URL), parsing the JSON result and displaying it on the page. The problem is that the <fb:comments> approach filters our results based on a blacklist we have set up on one of our Facebook Apps. The AppId with the relevant blacklist is stored on the page using metadata (<meta property="fb:app_id" content="APP_ID"/>) which the <fb:comments> control obviously must somehow use to filter the comments.
The problem is that the Graph API method does not filter any results, as I guess no blacklist (or App ID containing a blacklist) is specified. Does anyone know how to specify a Facebook App ID in the API call URL, or another way to avoid fetching comments that violate the terms of the blacklist?
On a side note, I know the debate about filtering content in comments rages on, but it is a management decision to implement the blacklist, and one that I have no influence in changing - just in case anyone felt the need to explain the reasons why content filtering is or isn't a good idea!
Any thoughts on a solution?
Unfortunately, there's no way to access a filtered list of comments using the API. It might be a reasonable request to have this added to the API, so you could file a wishlist item in Facebook's bug tracker.
Otherwise, the only solution I can think of is to implement your own filter on your side when retrieving and displaying the comments from the API.
According to the Comments plugin documentation the filter on Facebook's side is implemented as a simple substring match, so it should be trivial to implement.
A fairly simple regular expression match should be able to check each comment against a relatively long list quickly.
(Unfortunately, the tradeoff here is that implementing a filter is easy, but you'd also need to write an interface so that whoever's updating the list of disallowed words can maintain the list for both the Facebook plugin, and your own filtering.)
Quote from docs:
The comment is checked via substring matching. This means if you blacklist the
word 'at', if the comment contains the sequence 'a' 't' anywhere it will be
marked with limited visibility; e.g. if the comment contained the words 'bat',
'hat', 'attend', etc it would be caught.
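A rough sketch of that do-it-yourself filter is shown below. It assumes the same graph.facebook.com/comments/?ids=MY_URL call described in the question (the exact response shape may differ between API versions) and a hypothetical blacklist that you maintain on your side:

    import requests  # third-party HTTP client

    # Hypothetical blacklist maintained on your side; the docs describe Facebook's
    # own filter as simple substring matching, so this mirrors that behaviour
    # (here as a case-insensitive variant).
    BLACKLIST = ["badword1", "badword2"]

    PAGE_URL = "http://www.example.com/article"  # stand-in for MY_URL

    def is_blacklisted(message):
        """Return True if any blacklisted term appears anywhere in the comment text."""
        text = message.lower()
        return any(term.lower() in text for term in BLACKLIST)

    # Same Graph API call as in the question; the response is assumed to be keyed
    # by the requested URL, with the comments in a "data" list.
    response = requests.get("https://graph.facebook.com/comments/", params={"ids": PAGE_URL})
    response.raise_for_status()

    for comment in response.json().get(PAGE_URL, {}).get("data", []):
        if not is_blacklisted(comment.get("message", "")):
            # Only comments that pass the blacklist end up in the crawler-friendly HTML.
            print(comment["from"]["name"], ":", comment["message"])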
I'm pretty sure there is no current way of doing this from the Graph API; the only thing I can suggest is taking the blacklist and building your own filter.