Algolia datacenter for graph.cool

I want to use Algolia with graph.cool. Which region should I choose when I create a new app in Algolia? If I'm not wrong, the graph.cool backend is located in Dublin. Should I pick Europe (DE), Europe (FR), or something else?

You can get a good idea of the latency between AWS Dublin and Algolia's regions by looking at their status website:
Here is the monitoring for a Europe (FR) cluster of machines: https://status.algolia.com/clusters/c4-fr
Here is the monitoring for a Europe (DE) cluster of machines: https://status.algolia.com/clusters/c1-de
tl;dr: FR is 16 ms away, DE is 25 ms.

You're right, the Graphcool backend is hosted in Ireland, eu-west-1. (I asked on their community slack.)
When you choose the region for your Algolia app, the most important thing to consider is where your users are going to be. The latency of the API calls from their browser/device to Algolia is the most important thing to reduce, so the search feels super fast.
If your users will be mostly in the EU, then Europe (DE) and Europe (FR) are good choices. If users will be split between the US and the EU, you might consider US-East.
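To make that tradeoff concrete, here is a toy sketch of weighing candidate regions by where your traffic comes from. All the latency numbers and region keys below are illustrative placeholders, not Algolia measurements:

```javascript
// Pick the region with the lowest traffic-weighted latency.
// Latencies (ms) are made-up placeholder values for illustration only.
const assumedLatencyMs = {
  'eu-fr':   { EU: 30, US: 110, APAC: 250 },
  'eu-de':   { EU: 35, US: 120, APAC: 240 },
  'us-east': { EU: 90, US: 40,  APAC: 200 },
};

function bestRegion(userShare) {
  // userShare: fraction of traffic per area, e.g. { EU: 0.7, US: 0.3 }
  let best = null;
  let bestScore = Infinity;
  for (const [region, latency] of Object.entries(assumedLatencyMs)) {
    let score = 0;
    for (const [area, share] of Object.entries(userShare)) {
      score += share * latency[area];
    }
    if (score < bestScore) {
      bestScore = score;
      best = region;
    }
  }
  return best;
}

bestRegion({ EU: 1.0 });          // → 'eu-fr'
bestRegion({ EU: 0.5, US: 0.5 }); // → 'us-east'
```

Note that with an even EU/US split, a US region can win on the weighted average even though it is worse for every EU user individually, which is why a mostly-EU audience should tip you toward an EU cluster.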

Related

Is it a bad idea to host a REST API on a CDN?

I'm new to server architecture and have been reading around a lot, but I don't yet have a solid opinion on whether the setup below is good practice. I was hoping someone more experienced could confirm whether I'm setting up my architecture correctly:
Use Angular Universal to Pre render html to CDN (e.g. Cloudflare)
Cloudinary for Image assets
One or a few strong machines running nginx, handling the load and routing requests to the other servers listed below (all hosted on DigitalOcean):
Rest API (Express Server)
Database MongoDB
I'm really concerned about the speed of my REST API, as the regions offered by DigitalOcean seem significantly fewer compared to a CDN like Cloudflare. How much does this matter for my API's speed?
I know this might sound ridiculous, but the region issue makes me wonder if hosting a REST API Express server on a CDN would be better than a place like DigitalOcean. (My instincts tell me I shouldn't do this on a CDN, but I'm at a loss for reasons and hope someone can provide clear reasons why I should or shouldn't host an Express REST API server there.)
From my knowledge I would do this a little differently.
A CDN is used to serve content, hence the name (Content Delivery Network). The CDN itself doesn't serve the content; it routes the user to a server which serves it. For example, if you have servers in the US, France, and Asia, and you request a website with images hosted on those servers from the UK, the CDN will direct you to the closest/best server for you. In this case, that would be the server in France.
So to answer your question: it isn't a bad idea to host the RESTful API behind the CDN, but you would need multiple servers around the world (if you are going for worldwide coverage) and use the Cloudflare CDN to direct your traffic.
This is what I would do:
If you're not expecting loads of traffic (like millions of requests), just have 1-2 servers in each location: 1-2 in North America, South America, France (EU), Asia, and maybe Australia. This will give you decent coverage. Then, when you set up your CDN, it should handle who goes where. Using Node and nginx will help you a lot here; because they are pretty lightweight, you can get by with cheaper, less powerful servers.
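The "direct you to the closest/best server" step is exactly what the CDN automates for you. As a rough sketch of the underlying idea, here is nearest-origin selection by great-circle distance; the server list and coordinates are made-up placeholders:

```javascript
// Sketch of "route to the closest origin server".
// Server names and coordinates are illustrative placeholders.
const servers = [
  { name: 'us-east',   lat: 39.0,  lon: -77.5 },
  { name: 'france',    lat: 48.9,  lon: 2.4 },
  { name: 'singapore', lat: 1.35,  lon: 103.8 },
];

// Great-circle distance between two points (haversine formula), in km.
function distanceKm(aLat, aLon, bLat, bLon) {
  const toRad = (d) => (d * Math.PI) / 180;
  const R = 6371; // Earth radius, km
  const dLat = toRad(bLat - aLat);
  const dLon = toRad(bLon - aLon);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(aLat)) * Math.cos(toRad(bLat)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

function closestServer(userLat, userLon) {
  return servers.reduce((best, s) =>
    distanceKm(userLat, userLon, s.lat, s.lon) <
    distanceKm(userLat, userLon, best.lat, best.lon)
      ? s
      : best
  );
}

// A user in London (51.5 N, 0.1 W) is routed to the France origin:
closestServer(51.5, -0.1).name; // → 'france'
```

Real CDNs use anycast routing and measured network latency rather than raw geographic distance, but the effect is the same: each user talks to a nearby origin.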
Now for your database you can do one of two things: have one dedicated instance somewhere central, like France (EU), so latency stays reasonable for all regions, or have multiple databases and have them sync. Having multiple databases which sync will be more work and will require quite a bit of research; having one server is a lot easier to manage.
The database will be your biggest decision: go with one and deal with the latency, or with multiple and have to manage them and keep them in sync. Keep in mind you could go with a cloud hosting platform to host your database; a lot of platforms offer worldwide coverage as well as synchronised databases, which would help with this issue. You will, however, run into the cost issue when using cloud platforms.
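If you do go the multi-database route with MongoDB, the usual shape is a replica set spread across regions. A minimal sketch of the connection string, with placeholder hostnames: `readPreference=nearest` lets each app server read from its closest member, while writes still go to the primary, and `w=majority` waits for acknowledgement from a majority of members, so writes pay the inter-region latency.

```javascript
// Hypothetical multi-region replica set; hostnames are placeholders.
// Reads go to the nearest member; writes wait for a majority of members.
const uri =
  'mongodb://db-eu.example.com,db-us.example.com,db-ap.example.com/app' +
  '?replicaSet=rs0&readPreference=nearest&w=majority';
```

This gives you fast local reads everywhere at the cost of slower writes and eventual consistency on secondary reads, which is the tradeoff described above.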
Hope this answers your questions and provides you with the knowledge you need!

MongoDB: is each cluster on a different server, or are they all on one?

I am starting to use MongoDB and am developing my first project with it. I can't predict the volume of clients and usage it is going to receive, but I want to design it from the beginning to handle high volume.
I have heard about clusters, and I saw the demonstrations on MongoDB's official website.
And here are my questions (split into small sub-questions):
Are clusters different servers, or are they just pieces of one big server?
Maybe this seems a bit unrelated, but how does Facebook, or any huge database, handle its data across countries? I mean, they have users in Asia and in America, surely served by different servers. How does the system know how to host, aggregate, and deliver from the right server? Is it automatic, or is it a tool that a third party supplies to such large databases?
If I am using clusters, can I still just insert data into the database and let Mongo distribute it across the cluster on its own, or do I have to do that manually?
I have a cloud VPS. Should I continue working with this for Mongo, or should I seriously consider AWS / Google Cloud Platform / etc.?
And another important thing: I'm from Israel, and the clouds I mentioned above are at best in Europe, or even farther away.
That will probably cause high latency, won't it?
Thanks.

What region should I choose when I host my database?

I am using Amazon S3 service in my project and I put the region as shown below in my code:
region: 'ap-southeast-1',
because I am from Malaysia and Singapore is the nearest region.
Now I want to host my database using MongoLab, but they only provide 4 options for the region, as shown in the picture below.
Can I still host my database on MongoLab or should I look for an alternative?
Sydney would be the closest geo for you. Technically, the turnaround time should not be drastically different. mLab is a pretty solid solution, used by a wide range of industry leaders. Considering that your Singapore AWS instance will be transacting with Sydney, it's not really that much latency, AFAIK.
Singapore and Mumbai are fairly recent locations in AWS. You can hope for mLab to open up data centers in SE Asia sometime soon, considering the booming emerging economies, and migrate your data at that point. As of now, Sydney seems to be your best bet if you want to go for a hosted solution.
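To put "not drastically different" into numbers: the extra latency from a cross-region database is roughly the number of sequential queries per request times the inter-region round trip. A back-of-envelope sketch, where the ~100 ms Singapore to Sydney round trip is an assumed figure, not a measurement:

```javascript
// Added per-request latency from hosting the DB in another region.
// regionRttMs is an assumed round-trip time, not a measured value.
function addedDbLatencyMs(roundTripsPerRequest, regionRttMs) {
  return roundTripsPerRequest * regionRttMs;
}

// e.g. 3 sequential queries per API call at an assumed ~100 ms RTT:
addedDbLatencyMs(3, 100); // → 300 ms added per request
```

So a couple of sequential queries per request is tolerable, but a chatty request handler multiplies the regional penalty quickly; batching queries matters more when the database is far away.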

Integration of Lucene.NET / Solr with AspDotNetStorefront (ASPDNSF)

I need to integrate Lucene.NET or Solr with ASPDNSF, with XMLPackages. Does anyone have any idea how to do so? Any idea would help me meet my needs.
Also, is there any other way to improve searching across the 200k+ products in my application?
Thank You,
dL
FULL DISCLOSURE: We are the makers of Smart Search for AspDotNetStorefront.
To date I have not come across a marketed Solr solution for AspDotNetStorefront. We started down that path in 2007, but in the end we didn't want to maintain mixed (Apache, IIS) environments and turned to Lucene.NET to write our own information retrieval (IR) layer for AspDotNetStorefront. We needed to be able to search hundreds of thousands of products, and we have since grown our solution well beyond what Lucene.NET alone can provide.
As far as managing that many products goes, there are a handful of information retrieval options out there that work with Storefront. Here are a few, saving the best for last ;)
SERVICES:
Nextopia http://nextopia.com/ecommerce-site-search-aspdotnetstorefront.html
SLI Systems http://www.sli-systems.com/solutions/site-search
I know services can run around $1,000 per year for search results only; if you want one to power your navigation as well, the price goes up roughly tenfold per year for lower-volume sites.
HOST YOUR OWN: aspdotnetstorefront addons.
Compunix xSearch http://www.aspdotnetstorefront.com/p-961-xsearch-extreme-search-for-aspdotnetstorefront.aspx
Morrison Consulting Smart Search for AspDotNetStorefront http://store.morrisonconsulting.com/p-5-moco-smart-search-for-aspdotnetstorefront.aspx
Smart Search is much more than just search: we power the entire browsing experience with it, since AspDotNetStorefront itself starts slowing down beyond 100k products.

GT.M, any experience with it?

Looking into NoSQL engines, I found out about GT.M here:
http://www.slideshare.net/robtweed/gtm-a-tried-and-tested-schemaless-database
At first look it seems good, with SQL/ODBC support. But I wonder: is there real-world experience with this? Has anybody used it?
GT.M is a very flexible engine that allows for NoSQL operations, is also fast at caching disk to memory, and has extensive enterprise-level support.
I suggest you read the discussion by Rob Tweed at http://www.mgateway.com/docs/universalNoSQL.pdf, which should help you understand its capabilities without the jargon.
I don't have experience with it myself, but there is a list of some users of this database here: http://fisglobal.com/Products/TechnologyPlatforms/GTM/index.htm
From GT.M's Wikipedia page:
GT.M is used as the backend of the FIS Profile banking application, and it powers ING DIRECT banks in the United States, Canada, Spain, France, Italy, and the UK. It is also used as an open-source backend for the Electronic Health Record system WorldVistA and other open-source EHRs such as Medsphere's OpenVista.
and
GT.M is predominantly used in healthcare and financial services industry. The first production use of GT.M was in 1986 at the Elvis Presley Memorial Trauma Center in Memphis, Tennessee.
From the History of GT.M:
GT.M is licensed for use at over 1,000 institutions worldwide, ranging from small, community healthcare facilities and large teaching hospitals to some of the largest financial institutions in the world.
Also, from a recent presentation made by K.S. Bhaskar :
System of record for the two largest real-time core banking systems in the world that we know of:
Production database sizes of a few TB
Serving around 10,000 concurrent online users + ATMs, voice response unit, web and mobile access
1000s of online banking transactions per second, with full ACID properties.
Increasingly used in health care for electronic health records
Operating database for at least one multi-sourced "big-data" project.