REST API performance issue in Magento 2

Is anyone here heavily using the Magento REST API for catalog data (products/categories/attributes/sets/groups) persistence? Not just once in a while, but as part of a system integration. How is the performance in general, what throughput could you achieve, and do you make use of the async/bulk endpoints?
I am getting responses to a single POST/PUT in about 6-7 seconds, which I think is too slow given that I have about 25,000 products in my store.
Do you have any suggestions on what can be done to improve the performance of the REST APIs?
Any help is appreciated.

I found that this was caused by the B2B shared catalogs feature being enabled. I disabled shared catalogs in the admin configuration and REST API performance improved drastically.
Shared catalogs effectively maintain per-product permissions for every shared catalog group you have created, and that is why requests take so long to respond.
We customized the shared catalogs module to bypass the permissions logic when products are created via the REST API.
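On the throughput question: for integration-style loads, the asynchronous bulk endpoints (available since Magento 2.3 and requiring the message-queue consumers to be running) let you queue many writes in one request instead of paying the full request cost per product. A minimal sketch with Python's requests library; the store URL, token, and product payloads are placeholders:

```python
import requests

BASE_URL = "https://magentostore.example/rest"  # placeholder store URL
TOKEN = "<integration-token>"                   # placeholder integration token

# Each item wraps a normal POST /V1/products payload; the async bulk route
# accepts an array of them and queues the writes for background consumers.
payload = [
    {"product": {"sku": f"DEMO-{i}", "name": f"Demo product {i}",
                 "attribute_set_id": 4, "price": 9.99, "type_id": "simple"}}
    for i in range(100)
]

resp = requests.post(
    f"{BASE_URL}/async/bulk/V1/products",
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=60,
)
resp.raise_for_status()
# The response carries a bulk UUID; GET /V1/bulk/<uuid>/status tracks progress.
print(resp.json()["bulk_uuid"])
```

This returns as soon as the operations are queued, so the 6-7 s synchronous cost is no longer paid per product; actual indexing happens in the consumers.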

Related

Choosing the correct approach to build a multi-tenant architecture with Azure Cosmos DB (MongoDB)

I am a little confused about choosing the suitable approach for creating databases/collections for a multi-tenant system using the MongoDB API on Cosmos DB.
I would have 500 tenants for my application, where each tenant's data may grow to 3-5 GB and each tenant initially needs only the minimum throughput (400 RU/s).
For this use case I have a few options to go with:
1. Partition key (per tenant)
2. Container with shared throughput (per tenant)
3. Container with dedicated throughput (per tenant)
4. Database account (per tenant)
Considering performance isolation, cost, availability, and security, which option would be suitable for the mentioned use case?
Please let me know your inputs, as I have little exposure to NoSQL and the Cosmos track.
The answer is potentially a combination of options; it depends on your specific tenant use cases.
Tenant/Partition is the least expensive, with a marginal cost per tenant of zero. This is a great option for providing a "free tier" in your app, but you can scale it up to a paid tier for your customers too. Max storage size per partition is 20 GB. With this scheme you will need to implement your own resource governance: you must ensure customers are not "running hot" and consuming throughput and storage drastically out of line with other users. However, if you're building a multi-tenant app, resource governance is something you should already be doing.
Tenant/Container is more expensive, at a minimum of 400 RU/s per container (about $25/month). This is ideal when you have tenants that are very large and require isolation from the others in the previous tier.
Tenant/Account has the same marginal cost as Tenant/Container. It is useful if you have customers with GDPR requirements that prevent, or require, replication into specific Azure regions.
Note that I do NOT recommend Tenant/Container using shared database throughput. With that scheme all containers share the same throughput, which is what you get with Tenant/Partition anyway, but performance is not predictable with shared database throughput, so it is not a good choice. Additionally, you are limited to 25 containers per database, further making it a poor choice.
Finally, for your app you will need to implement a mechanism to migrate customers from one tier to another. You will also, of course, require some sort of auth-n/auth-z mechanism. For Cosmos DB you can optionally use our native users and permissions with resource tokens to secure access to data.
We did a presentation on this last year at BUILD with a customer of ours, Citrix, who built their own cloud offering on top of Azure using Cosmos DB as their user metadata store. It is definitely worth checking out and will give you more details and insights: Mission-critical multi-tenant apps with Cosmos DB.
PS: if you are building a new service on Cosmos DB, I recommend using our Core (SQL) API rather than MongoDB. It is our native service and you will get the best performance and features. Our MongoDB API is the best choice for customers who are looking to migrate and want a fully managed MongoDB experience.
Hope this is helpful.
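To make the Tenant/Partition option concrete, here is a minimal sketch using the azure-cosmos Python SDK (the Core/SQL API mentioned in the PS). The endpoint, key, and the /tenantId property are illustrative placeholders:

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholders: endpoint and key come from your Cosmos DB account.
client = CosmosClient("https://<account>.documents.azure.com:443/",
                      credential="<key>")
db = client.create_database_if_not_exists("saasapp")

# One shared container; every document carries its tenant id as the partition key.
container = db.create_container_if_not_exists(
    id="tenant_data",
    partition_key=PartitionKey(path="/tenantId"),
    offer_throughput=400,  # minimum dedicated throughput for a container
)

container.upsert_item({"id": "order-1", "tenantId": "tenant-42", "total": 99.0})

# Scoping the query to one partition keeps it cheap; tenant isolation still
# depends on your app layer always supplying the correct tenantId (this is
# the "resource governance is on you" caveat from the answer above).
items = container.query_items(
    query="SELECT * FROM c WHERE c.tenantId = @t",
    parameters=[{"name": "@t", "value": "tenant-42"}],
    partition_key="tenant-42",
)
for item in items:
    print(item["id"])
```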

Publishing a headless ecommerce site. Which costs do I need to consider?

I'm developing an ecommerce website.
It's for a "brick-and-mortar" clothing store that has so far sold only via third-party platforms,
and now wants its own website.
I started with WordPress + WooCommerce.
Then I tried a ZEIT Now + Next.js + GraphQL + React version.
It connects to the WordPress + WooCommerce database via GraphQL queries (see the sketch below).
It uses ZEIT Now to avoid implementing a real Node + Express server myself.
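As a reference for that GraphQL connection, here is a minimal sketch of querying products from Python; it assumes the WordPress site exposes a /graphql endpoint via the WPGraphQL plugin (plus WooGraphQL for the products field), and the URL is a placeholder:

```python
import requests

# Placeholder endpoint; requires WPGraphQL (+ WooGraphQL) on the WordPress side.
GRAPHQL_URL = "https://myshop.example/graphql"

query = """
{
  products(first: 10) {
    nodes { id name }
  }
}
"""

resp = requests.post(GRAPHQL_URL, json={"query": query}, timeout=30)
resp.raise_for_status()
for product in resp.json()["data"]["products"]["nodes"]:
    print(product["name"])
```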
Which path should I choose to complete the website and publish it? My doubts are mainly about COSTS.
If I choose the classic WP + WooCommerce way I need:
0-20 EUR/year for the domain name.
~120 EUR/year for a classic web server (PHP + MySQL) hosting plan to host WordPress + WooCommerce.
If I choose the second option, based on what I currently know I need:
0-20 EUR/year for the domain name.
~120 EUR/year for a classic web server (PHP + MySQL) hosting plan to host the WordPress + WooCommerce "head" part of my project.
0 EUR/year for serverless ZEIT Now (free plan).
But where do I need to place the "app" (ZEIT Now + Next.js + GraphQL + React)?
Another web server (with Node)?
So another 120 EUR/year plan?
Or, because it's serverless, can I just deploy to ZEIT Now and point my domain at ZEIT Now?
It's not clear to me.
I found things like Netlify, Firebase, Heroku, and AWS on the web.
Are they all equivalent to ZEIT Now?
I would like to publish a website with the benefits of the WooCommerce CMS,
like adding products, managing stock, handling discount plans, and access to the PayPal and Stripe payment integrations (I don't trust myself enough to build a payment integration on my own, due to the security risks).
I would also like to keep the benefits of using React on the front end, like (at least perceived) performance for the final user and updating the cart and wishlist without full page reloads.
And what about working out whether my project needs a paid plan on ZEIT Now/Netlify/AWS to handle the requests? How can I calculate that?
Sorry for the high number of questions, but understanding how these things coexist is overwhelming for me!
Thanks.
You will always need a paid plan on any platform if you are running a for-profit endeavor.
You may need ZEIT Now to host your frontend and another server for the GraphQL API unless you really want to go DevOps-less by using serverless functions.
Here are very relevant pages for calculating costs:
ZEIT Now pricing page. Note the "Serverless Execution" and "Bandwidth" prices.
Netlify.
AWS.
In the end, you will need to deploy a "Proof-of-Concept" and be really careful with the metrics. It is impossible to pinpoint an exact number with a custom solution because depending on your implementation, it can be more or less expensive to make API calls. Solutions like Shopify may be the best approach for your type of app. I only recommend that you develop your own stack if you want to customize, scale, and prepare the base for a team of developers later.
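On "how can I calculate that": a rough back-of-the-envelope sketch of the arithmetic. Every unit price is left as a placeholder to be filled in from the pricing pages above, and the billing dimensions vary per provider, so treat this as a template rather than a formula for any specific plan:

```python
# Estimates: take these from your Proof-of-Concept metrics.
requests_per_month = 200_000        # placeholder traffic estimate
avg_execution_seconds = 0.2         # avg serverless function duration
gb_transferred_per_month = 5.0      # outbound bandwidth

# Placeholder unit prices: fill in from the provider's pricing page.
price_per_gb_second = 0.0           # serverless execution price (per GB-s)
function_memory_gb = 1.0            # memory size the function runs with
price_per_gb_bandwidth = 0.0        # bandwidth price per GB

execution_cost = (requests_per_month * avg_execution_seconds
                  * function_memory_gb * price_per_gb_second)
bandwidth_cost = gb_transferred_per_month * price_per_gb_bandwidth

print(f"~{execution_cost + bandwidth_cost:.2f}/month before free-tier credits")
```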
Disclaimer: I work for ZEIT at the moment.

Should visualization tools like Tableau or Looker be used for multi-tenant systems?

Visualization tools like Tableau, Looker, and Apache Superset are not supposed to be used for multi-tenant products.
For example: a product with thousands of users would like to offer analytics on their data. This needs to be secure, so company A cannot see company B's visualizations. For this to work, these tools need to understand whether a user has the privileges to view the data, which is usually established through cookies after the user has logged in.
To ensure data is only accessed by authorized users, these third-party tools should not be used. Instead, sticking to Ruby on Rails with D3.js, Highcharts, etc. is the best option. The data can be managed much more easily through the same authentication you use for login, so the data stays secure.
Actually, Looker handles multi-tenant data situations just fine; it is quite a common use case for Looker.
You can bind attributes to users that force the right SQL to be written, guaranteeing that each user only sees the appropriate data:
https://docs.looker.com/reference/explore-params/access_filter
We've got lots of customers building extranets for their businesses this way.
Disclosure: I work at Looker.
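For readers unfamiliar with access filters, here is a conceptual sketch of the idea in plain Python and SQL. This is not LookML (Looker configures access_filter in the model); it only illustrates binding a tenant attribute from the authenticated session into every query:

```python
import sqlite3

# Toy data store standing in for the analytics database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL, tenant_id TEXT)")
conn.execute("INSERT INTO orders VALUES (1, 99.0, 'tenant-a'), (2, 10.0, 'tenant-b')")

def orders_for_user(user_tenant_id: str):
    # The tenant id comes from the authenticated session (a "user attribute"),
    # never from request input, and is bound as a parameter in every query.
    return conn.execute(
        "SELECT id, total FROM orders WHERE tenant_id = ?",
        (user_tenant_id,),
    ).fetchall()

print(orders_for_user("tenant-a"))  # only tenant-a rows, whatever the request asks for
```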
The complexity of multi-tenant deployments goes far beyond setting up a filter:
Data privacy - with filters, you are one typo away from a data privacy breach. You should use the database's security and privacy capabilities to isolate your tenants.
Performance - you need to scale the underlying database to handle the load of concurrent users.
Customization - your tenants might need to load and analyze their own custom data, and they need custom reports, etc.
Take a look at gooddata.com and their workspaces.
Disclosure: I work at GoodData.

Change Magento 2 REST API

Hey guys, I wonder if anyone can help with this.
I am facing a problem at my company. We are developing a Magento 2 Community multistore for our customers.
The idea is to have several stores in the same Magento 2 installation, where each store belongs to an independent company. The problem is the integration with our ERP system. Through the REST API we have full control over the installation, even when not using the master admin credentials. If we run a request like this in Postman: https://magentostore.com/rest/V1/orders?searchCriteria
we get all the orders in the installation, from all stores. So the companies, with their own credentials, would have the same control, which is a very bad security problem: the stores would have access to each other's data.
We have tried extensions for advanced permissions, like Aitoc and Amasty, but they only work at the frontend level and have no effect on the REST API. We know that Magento was not made for this kind of thing, so my question is:
is it possible to change the REST API to filter the queries by store? And where can I find these REST API queries?
Thank you so much in advance.
You can override API calls using the webapi.xml file in your module: point it at your own service interface and change the ACL if you want. In your service implementation, inject the original one and add your filter before calling the original.
The second approach is to write a plugin on OrderRepositoryInterface and add the filter there (though the first solution is better, because this service is used not only in the API, so you may not want to restrict all calls).
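Whichever server-side approach you take, the condition you need to enforce is a store_id searchCriteria filter. For reference, this is what a store-scoped orders query looks like from a client (Python requests; the URL is from the question, the token a placeholder). Note this only illustrates the filter: a client can simply omit it, so the restriction must be enforced server-side via the webapi.xml override or the plugin described above:

```python
import requests

BASE_URL = "https://magentostore.com/rest"  # from the question
TOKEN = "<integration-token>"               # placeholder credential

# searchCriteria filter by store_id; this is what the overridden service or
# the OrderRepositoryInterface plugin should inject for every tenant request.
params = {
    "searchCriteria[filter_groups][0][filters][0][field]": "store_id",
    "searchCriteria[filter_groups][0][filters][0][value]": "2",
    "searchCriteria[filter_groups][0][filters][0][condition_type]": "eq",
}
resp = requests.get(f"{BASE_URL}/V1/orders", params=params,
                    headers={"Authorization": f"Bearer {TOKEN}"}, timeout=30)
resp.raise_for_status()
print(resp.json()["total_count"])  # orders visible to this store only
```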

Best scalable model for a website serving millions of users every day

I want to develop a website that will serve millions of pages every day, including to mobile devices. The site will have strong social features and thus will require lots of reads/writes. It will also suggest things to users based on their social behavior (likes, dislikes, etc.) and their friends' behavior. After considering many elements I have come up with:
a NoSQL database (MongoDB or Cassandra) - not sure which one is the right one
memcached (see the cache-aside sketch after this list)
Varnish or Squid for HTTP acceleration
PHP and Python (not sure if PHP is that scalable)
nginx or Apache as the web server
Any recommendations?
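A minimal cache-aside sketch for the memcached layer, using the pymemcache client; the host, key scheme, and load_profile_from_db helper are hypothetical stand-ins for your real data store:

```python
import json
from pymemcache.client.base import Client

cache = Client(("localhost", 11211))  # placeholder memcached host

def get_user_profile(user_id: int) -> dict:
    """Cache-aside: try memcached first, fall back to the database on a miss."""
    key = f"user:{user_id}"
    cached = cache.get(key)
    if cached is not None:
        return json.loads(cached)
    profile = load_profile_from_db(user_id)          # hypothetical DB call
    cache.set(key, json.dumps(profile), expire=300)  # 5-minute TTL
    return profile

def load_profile_from_db(user_id: int) -> dict:
    # Stand-in for the real MongoDB/Cassandra read.
    return {"id": user_id, "name": "example"}

print(get_user_profile(1))  # first call hits the DB, repeats hit the cache
```

The same pattern applies in front of either MongoDB or Cassandra; it is where most of the read scalability in a social workload usually comes from.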
There are NoSQL databases that have an integrated web service and can handle many more web requests per second (including database transaction time) than traditional web services requesting data from an external data source. Using this kind of solution increases performance, saves a lot of implementation time, and simplifies scaling your website.
The recommendation depends on how you plan to implement the solution: server-side rendering or client-side rendering? Will you have an MVVM-style implementation that makes the communication chatty? Also, what server environment do you have in mind, Microsoft or Linux?
Take a look at the Starcounter database, which has a web server component integrated into the database engine, and see if it could help you.