Orchard 1.9 warmup module on Azure to permanently serve a static version of a page

I created a page in Orchard 1.9.2 that ends up querying the db so many times that it can take several minutes to load.
I saw the warmup module and enabled it (there isn't much documentation on exactly what it does, but it suggests it can create static versions of individual pages - ideal!).
I set up my page to be generated every hour with warmup, and a new file for the page is appearing in the Warmup directory on Azure, but there is still no improvement in performance. This leads me to think that either the static page file isn't being served from that location, or that the static file is only retrieved while the server isn't ready (i.e. is warming up).
Can anyone explain how warmup should work? Many thanks

Related

Grafana 8 - slow performance loading dashboards

I have been having performance issues since updating Grafana from version 7.5.7 to version 8.2.3. I have several large dashboards, with more than around 200 panels and about 10 variables each. These large dashboards now take several minutes to load, making the user experience quite poor, when they took only a few seconds to load on the previous version.
I investigated the dashboard's webpage performance when opening one of them, and noticed that for the same dashboard:
In version 7:
the activity Run Microtasks took 796ms
total blocking time = 2726ms
In version 8:
the activity Run Microtasks took 8562ms
total blocking time = 9620ms
I couldn't find anything in Grafana's documentation about what Grafana is doing differently in version 8 that would impact performance in the way described here, or whether the issue is something I can avoid.
Any help will be highly appreciated!!

What is the difference between `next export` and `next build` in Next.js?

I have developed a web app that uses a Next.js app for the frontend and a Python (Django) API server as the backend. Most of my front-end pages make API calls to my backend server (in componentDidMount or in response to user actions like button clicks).
I want to deploy this app to my server. I am using Nginx as the reverse proxy. The backend deployment is sorted; I am confused about deploying the Next.js app.
After reading the docs I figured there are 2 ways to do this:
Run next build and then next start. This will start a Node.js server on port 3000. I can channel traffic to this port using Nginx.
Run next export. This will generate an out directory. I can channel incoming traffic to this directory using a reverse proxy like Nginx.
Which of the 2 options should I use? What are the differences?
Answering my own question with the help of the answer I received on the Next.js discussions forum (link here).
Basics of next build, start and export
next build builds the production application in the .next folder. You need to run this command irrespective of whether you want to run next start or next export.
After building, next start starts a Node.js server that supports hybrid pages, serving both statically generated and server-side rendered pages.
next export will export all your pages to static HTML files that you can serve with any host. This is similar to what create-react-app does, but you can still build dynamic pages at build-time with exportPathMap.
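For illustration, here is a minimal next.config.js sketch of exportPathMap, mirroring the shape of the usual docs example; the page names and query values here are made up:

```js
// next.config.js - a minimal sketch; the routes and titles are hypothetical.
module.exports = {
  exportPathMap: async function (defaultPathMap) {
    return {
      // Static routes exported as-is.
      '/': { page: '/' },
      '/about': { page: '/about' },
      // "Dynamic" pages resolved at build time: each entry is rendered once,
      // with the given query, into its own static HTML file.
      '/p/hello-nextjs': { page: '/post', query: { title: 'hello-nextjs' } },
      '/p/learn-nextjs': { page: '/post', query: { title: 'learn-nextjs' } },
    };
  },
};
```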
Note: Statically generated pages, using next export, are still reactive, i.e. any dynamic JS code which updates your pages at run time will continue to run as usual (like in any other React app). Next.js will hydrate your application client-side to give it full interactivity. It's just that the dynamic sections of the page will only be rendered in the browser at run time, and hence they won't be available to search engine crawlers.
When to use next export?
next export is recommended if you have some dynamic pages which need some data to be fetched only at 'build' time. It is perfect for pages like landing pages, blogs, or news articles, or for other kinds of pages which are updated only during code builds. You can set up a background task to build your code at regular intervals if your page data needs to be updated more frequently. This will speed up your 'first paint' time and also provide the entire page data to search engine crawlers for indexing.
When to use next start?
If you do not need any data fetching at build time, i.e. your pages are rendered dynamically at run time, then you should use next build and then next start. This will also generate static HTML for the sections of your pages which are not dynamic, which speeds up the 'first paint' time of your pages. See Automatic Static Optimization.
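As a rough illustration of which pages get statically optimized (a sketch; the page files and the API endpoint are hypothetical):

```js
// pages/about.js - no blocking data requirements, so `next build`
// pre-renders this page to static HTML automatically.
export default function About() {
  return <h1>About us</h1>;
}

// pages/dashboard.js - getServerSideProps forces per-request rendering,
// so this page needs the Node.js server started by `next start`.
export async function getServerSideProps() {
  const res = await fetch('https://api.example.com/stats'); // hypothetical backend endpoint
  return { props: { stats: await res.json() } };
}

export default function Dashboard({ stats }) {
  return <pre>{JSON.stringify(stats, null, 2)}</pre>;
}
```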
next export also produces an optimized, production-ready build, but it builds a fully static app, which means only HTML, CSS, and JavaScript, with no server-side code.
Since there is no server-side code, you do not need a server to host your app. This makes hosting your app easier and cheaper, because you don't need to maintain and scale a server; there are plenty of hosts on the market offering very good pricing.
Since next export builds a static app, you cannot use API routes or server-side functions, and therefore you cannot revalidate. Revalidation means your app re-checks the data source at an interval that you specify. Say you have a blogging app that saves posts to a database: with server-side code you fetch the posts on the server, pass them to the client, and the client renders them; with the revalidation option set, your app automatically checks the database for changed content. Since you cannot use this feature with next export, any time the content of your app changes you have to redeploy it.
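For contrast, here is a minimal sketch of what the revalidation option looks like with next build / next start (Incremental Static Regeneration); the lib/posts helpers are hypothetical data-access functions:

```js
// pages/posts/[slug].js - only works under `next start`, not `next export`.
import { fetchPostFromDatabase, fetchAllSlugs } from '../../lib/posts'; // hypothetical helpers

export async function getStaticPaths() {
  const slugs = await fetchAllSlugs();
  return { paths: slugs.map((slug) => ({ params: { slug } })), fallback: 'blocking' };
}

export async function getStaticProps({ params }) {
  const post = await fetchPostFromDatabase(params.slug);
  return {
    props: { post },
    // Re-generate this page in the background at most once every 60 seconds,
    // so changed content shows up without a redeploy.
    revalidate: 60,
  };
}

export default function Post({ post }) {
  return <article>{post.body}</article>;
}
```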

How to do file versioning with a CDN and a load balancer?

So I'm using a very simple CDN service. You point it at your website, and if you call it through their hostname they'll cache the response for you after the first call.
I use this for all my static content, like JavaScript files and images.
This all works perfect - and I like that it has very little maintenance or setup cost.
The problem starts when rolling out new versions of JavaScript files. A JavaScript file automatically gets a new hash when it changes.
Because the rollout over multiple instances is not simultaneous, a problem occurs. I tried to model it in this diagram:
In words:
A request hits a server with the new version
The browser requests the JS file with the new version hash
The CDN correctly detects that the file is not cached
The CDN requests the original file with the new hash from the load balancer
The load balancer serves the CDN's request from a random server, accidentally serving from a server with the old version
The CDN caches the old version under the new hash
Everyone gets served the old version from the CDN
There are some ways I know how to fix this, i.e. manually uploading files to a separate storage location with the hash baked in, etc.
But this needs extra code and has more "moving parts", which makes maintenance more complicated.
I would prefer to have something that works as seamlessly as the normal CDN behavior.
I guess this is a common problem for sites that are running on multiple instances, but I can't find a lot of information about this.
What is the common way to solve this?
Edit
I think another solution would be to somehow force the CDN to go to the same instance for the .js file as for the original HTML file - but how?
Here are a few ideas from my solutions in the past, though the CDN you are using will rule out some of these:
Exclude .js files from the CDN caching service, which prevents them being cached in the first place.
Poke the CDN with a request to invalidate the cache for a specific file at the time of release.
In your build/deploy script, change the name of the .js file and reference the new file in your HTML.
Use query parameters after the .js file name, which are ignored but cached under a different address reference, e.g. /mysite/myscript.js?build1234 (see the sketch below).
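A minimal sketch of the query-parameter idea, assuming the deploy script injects a build identifier into the server's environment; BUILD_ID and assetUrl are hypothetical names:

```js
// Hypothetical helper for a Node.js-rendered page: append the current build
// identifier to static asset URLs, so the CDN caches each release under a
// distinct address.
const BUILD_ID = process.env.BUILD_ID || 'build1234'; // injected by the deploy script (assumption)

function assetUrl(path) {
  return `${path}?${BUILD_ID}`;
}

// In a server-rendered template:
//   <script src="${assetUrl('/mysite/myscript.js')}"></script>
// renders as:
//   <script src="/mysite/myscript.js?build1234"></script>
```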
The problem with this kind of issue is that cache control resides on the browser side, so you cannot do much from the server side.
The most common way I know is basically the one you mention about adding some hash to the file names or the URLs you use to get them.
The thing is that you should not do this manually. You should use a web application builder, like webpack, to automate this process; the details will depend on the technologies you are using. I saw this for the first time using GWT 13 years ago, and all the recent projects I have worked on, using AngularJS or React, were integrated with builders that do what you need automatically.
Once it's implemented, your users will get the latest version, and resources will be cached correctly to speed up your site.
If you can also automate the full pipeline to remove old resources from the CDN once their configured expiration has been reached, even better.
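For example, a minimal webpack 5 sketch of content-hash based naming (the entry path is an assumption):

```js
// webpack.config.js - each build emits e.g. main.3b7c1a9e.js; a file keeps
// its hash as long as its content is unchanged, so the CDN can cache it
// indefinitely while changed code automatically gets a new URL.
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js', // assumed entry point
  output: {
    filename: '[name].[contenthash].js',
    path: path.resolve(__dirname, 'dist'),
    clean: true, // remove stale bundles from previous builds
  },
};
```

Paired with a plugin such as html-webpack-plugin, the HTML references are rewritten to the new file names on every build, so nothing has to be updated by hand.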
I fixed this in the end by only referencing the CDN version after a few minutes of runtime.
So if the runtime is less than 5 minutes, it refers to:
/scripts/example.js?v=351
After 5 minutes it refers to the CDN version:
https://cdn.example.com/scripts/example.js?v=351
After 5 minutes we are pretty sure that all instances are running the new version, so we don't accidentally cache an old version under the new hash.
The downside is that at very busy moments you lose the advantage of the CDN right after a redeploy, but I haven't seen a better alternative yet.
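A minimal sketch of that delayed switchover, assuming a Node.js server; the CDN host, grace period, and scriptUrl helper are all made up for illustration:

```js
// Serve assets locally while this instance is freshly deployed, so the CDN
// cannot cache an old file under the new version number.
const GRACE_PERIOD_MS = 5 * 60 * 1000; // assume all instances are updated within 5 minutes
const CDN_HOST = 'https://cdn.example.com';
const startedAt = Date.now();

function scriptUrl(path, version) {
  const base = Date.now() - startedAt < GRACE_PERIOD_MS ? '' : CDN_HOST;
  return `${base}${path}?v=${version}`;
}

// scriptUrl('/scripts/example.js', 351)
//   -> '/scripts/example.js?v=351'                         (first 5 minutes)
//   -> 'https://cdn.example.com/scripts/example.js?v=351'  (afterwards)
```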

Performance issue with Entity Framework 6.0

I am getting a performance issue with Entity Framework 6.0.
In my website, one page has a performance problem (this page runs long logic against the database).
What have I checked?
Case
The page works fine on localhost (page load takes 7 to 8 seconds).
But it is very slow on live (page load takes 1.5 to 2 minutes). (Sometimes it works fine on live, with a page load of 11 to 15 seconds.)
All other pages work fine, but they don't have long database logic.
If I compare the other pages' loading times between local and live, there is not much difference.
The data in the database is the same on live and local.
Does anyone have an idea why it is sometimes so slow?
I found the cause of the performance issue in my case: I was not calling Dispose() on any of the Entity Framework objects used during processing.

How To Deploy Web Application

We have an internal web system that handles the majority of our company's business. Hundreds of users use it throughout the day; it's very high priority and must always be running. We're looking at moving to ASP.NET MVC 2; at the moment we use web forms. The beauty of using web forms is that we can instantaneously release a single web page, as opposed to deploying the entire application.
I'm interested to know how others are deploying their applications whilst still making them accessible to the user. Using the deployment tool in Visual Studio would supposedly cause a halt. I'm looking for a method that's super quick.
If you had high priority bug fixes for example, would it be wise to perhaps mix web forms with MVC and instead replace the view with a code-behind web form until you make the next proper release which isn't a web form?
I've also seen other solutions on the same server of having the same web application run side-by-side and either change the root directory in IIS or change the web.config to point to a different folder, but the problem with this is that you have to do an entire build and deploy even if it were for a simple bug fix.
EDIT: To elaborate, how do you deploy the application without causing any disruption to users?
How is everyone else doing it?
I guess you can also run the MVC application uncompiled, and just replace .cs files, views, and such on the fly.
A web setup uninstall/install is very quick, but it kills the application pool, which might cause problems depending on how your site is built.
The smoothest way is to run it on two servers and store the sessions in SQL Server or shared state. Then you can bring S1 down and patch it, bring S1 back up again, then bring S2 down, patch it, and bring it up again. Although this might not work if you make any major changes to the session parts of the code.
Have multiple instances of your website running on multiple servers. The best way to do it is to have a production environment, a test environment, and a development environment. You can create test cases and run the load every time you have a new build; if it can get through all the tests, move the version into production ;).
You could have two physical servers each running IIS and hosting a copy of the site. OR you could run two copies of the site under different IIS endpoints on the SAME server.
Either way you cut it you are going to need at least two copies of the site in production.
I call this an A<->B switch method.
Firstly, have each production site on a different IP address. In your company's DNS, add an entry set to one of the IPs and give it a really short TTL. Then you can update site B and also pre-test/warm-up the site by hitting the IP address. When it's ready to go, get your DNS switched to the new site B. Once your TTL has expired you can take down site A and update it.
Using a shared session state will help to minimise the transition of users between sites.