How can I load test a JupyterHub instance?

We have several JupyterHub instances we use for training purposes. Frequently, we discover problems only when 30+ students are logged in, so we'd like a solution for automated testing. We'd like to log in, then run notebooks in parallel. Thanks.
I've toyed with doing this by hand using the requests package, but the right HTTP requests to load and run the cells of a notebook are not obvious. Can Locust or some other tool do what I'd like to do? I'd like to see examples, if possible.

Maybe try recording a session in your browser and converting it into a locustfile using https://github.com/SvenskaSpel/har2locust
That should work reasonably well, assuming the calls are regular HTTP/REST calls (I don't really know JupyterHub, so I'm guessing a bit).
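For what it's worth, a minimal locustfile sketch of the login step might look like the following. It assumes a password-based authenticator (e.g. PAM or the dummy authenticator) with pre-created test accounts; endpoint details and any XSRF-token handling vary by JupyterHub version, so treat it as a starting point rather than a working script.

# Hypothetical locustfile: log simulated users into JupyterHub and
# trigger their single-user servers. Usernames/password are placeholders.
# Run with: locust -f this_file.py --host https://your-hub.example.com
import itertools
from locust import HttpUser, task, between

_user_ids = itertools.count(1)

class JupyterHubUser(HttpUser):
    wait_time = between(5, 15)

    def on_start(self):
        # Each simulated user logs in once; self.client keeps the
        # session cookie for all subsequent requests.
        self.username = f"testuser{next(_user_ids)}"
        self.client.post("/hub/login", data={
            "username": self.username,
            "password": "test-password",
        })

    @task
    def open_server(self):
        # /hub/spawn starts (or redirects to) this user's notebook server.
        self.client.get("/hub/spawn")

Note that actually executing notebook cells happens over a WebSocket to the kernel (the api/kernels/&lt;id&gt;/channels endpoint under the user's server), which Locust's standard HTTP client does not speak, so a recorded HAR session as suggested above is the most reliable way to discover the exact call sequence for your deployment.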

Can I use PageSpeed Insights for my localhost website or offline?
Yes.
Use the "Lighthouse" tab from your google chrome dev tools.
This is a great starter tutorial on how to do that:
https://www.youtube.com/watch?v=5fLW5Q5ODiE
Edit: user izogfif pointed out the "Audit" tab was replaced by "Lighthouse".
An alternative way to run Lighthouse
Although this is an old question, there is an alternative way to run Lighthouse (the engine behind PageSpeed Insights) locally that may be useful to people in some circumstances.
You can install the Lighthouse Command Line Interface (CLI) locally on your machine quite easily.
This gives you some significant advantages over using the "Lighthouse" tab in Developer tools.
Automation
Firstly, you can automate it. You could have it run on every significant change or commit to check you haven't broken anything.
Or, if you want to check every page on your website, you can automate that too; very useful if you have hundreds of pages.
Storing results
Secondly, you get the full JSON response (or a CSV or HTML report, your choice), so you can store some (or all) of the audit results in a database for each page and see whether any pages are performing poorly, and whether you are improving or degrading your page performance over time.
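As a concrete illustration of both points, here is a small Python sketch that shells out to the globally installed lighthouse binary (npm install -g lighthouse) and appends each page's category scores to a CSV file. The URLs and file names are placeholders, and the rows could just as easily go into a database.

# Hypothetical batch runner for the Lighthouse CLI.
import json
import subprocess

urls = ["https://example.com/", "https://example.com/about"]

with open("scores.csv", "w") as out:
    out.write("url,performance,accessibility\n")
    for url in urls:
        # --output-path=stdout makes the CLI print the JSON report,
        # so no intermediate file is needed.
        result = subprocess.run(
            ["lighthouse", url, "--output=json", "--output-path=stdout",
             "--quiet", "--chrome-flags=--headless"],
            capture_output=True, text=True, check=True)
        categories = json.loads(result.stdout)["categories"]
        out.write(f"{url},{categories['performance']['score']},"
                  f"{categories['accessibility']['score']}\n")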
Customisation
You can also set your own parameters when running tests.
For example, I like to set the "cpuSlowdownMultiplier" very high (8 or 10), as I have a decent CPU and I want to catch any bottlenecks / long tasks that I might miss on slower devices. This is great for making you realise how sloppy your (my!) JavaScript is!
You can also pass headers, set cookies (slightly difficult at the moment but something they are working on) etc. before a run.
You can even use --disable-storage-reset to see how the site responds on a subsequent page visit where the user has already cached images etc. (you can do this in the Lighthouse tab in Developer tools so maybe not that good a reason).
Because you get the raw timings data you can also set your own criteria if you want.
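Applied to the sketch above, those customisations are just extra flags on the same command (the values here are illustrative, and the Authorization header is made up):

cmd = ["lighthouse", url, "--output=json", "--output-path=stdout", "--quiet",
       "--chrome-flags=--headless",
       "--throttling.cpuSlowdownMultiplier=8",  # punish slow JavaScript harder
       "--extra-headers", '{"Authorization": "Bearer test-token"}',
       "--disable-storage-reset"]  # keep caches: simulates a repeat visit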
Puppeteer
The icing on the cake is that you can use Puppeteer (or similar) to automate complex tasks.
Let's say you want to check a page that is only accessible when you have logged in: use Puppeteer to log in, then run Lighthouse.
So which should I use?
I would advocate the CLI if you are going to test regularly or want to automate testing, and the Developer tools version for quick and dirty checks / infrequent testing.
Personally it took me about an hour to install and get used to Lighthouse, but I also had to install Node.js and learn how to use the npm command line to install Lighthouse into my project (yes, I am a slow learner!).
If I didn't have to learn all that, it would probably be 5 minutes to install it and run your first test.
It is actually really simple to use the CLI once you have it installed.
The only downside is that you need to update it every few months, whereas the browser version updates automatically. Even that can be a positive, though: if you are comparing results over time, being able to stay on an older version is useful.
Oh, and you can run it against remote sites as well, so you can test the production site automatically from your own machine (useful if you are located a long way from the PSI server and want an idea of how your site performs in your local community).
This is also really useful if you have a staging server that only allows whitelisted IP addresses and you want to test there (yet again, this can be done with the Developer tools Lighthouse tab, but the CLI is handier for bulk testing, etc.).

How to profile a Yii2 based API?

I have an API application written in the Yii2 framework, and I don't know how to measure and track its performance; I am keen to see what is happening behind the curtain. My API uses MongoDB, and I would also like to see the queries somehow; I just don't know where to start. Yii2 has its own integrated debug panel, which is great, but it works only in a browser, so I can't get its benefits when using Postman to perform API calls, for instance.
How do you guys do it in dev and live environments?
Cheers
For basic measuring, Yii has some built-in profiling: you can call \Yii::beginProfile() and \Yii::endProfile() and view the results via the debug toolbar.
For development you can also use Xdebug. It supports profiling as well.
For production, that's something else. You want a solution with as little performance impact as possible, and you want it to run regularly and automatically. You should keep track of routes and their profiling results so you can see whether your code improves (or not) over time.
I worked on a couple of very high-traffic sites, and what we used was xhprof, activated randomly.
For example, in your index.php you can do something like:
if (rand(1, 100) === 50) {
    xhprof_enable();
    register_shutdown_function(function () {
        // after the request: store the current route and the results
        $results = xhprof_disable();
    });
}
Obviously whatever you need may vary, but perhaps this gives you some idea of which direction to look in.

Anyone attempted to perform automated tasks through the PCOMM or x3270 using Perl?

Has anyone attempted to perform automated tasks through PCOMM or x3270 using Perl? I am doing some operations on the mainframe through PCOMM and x3270. Since some tasks involve many repetitive operations, I am trying to find an easy way to automate them.
BTW, Perl is my favorite language, which is why I mention it here.
I am not a mainframe guy, but check this out:
http://www.perlmonks.org/?node=611038
"I automate 3270 applications from Perl by using the IBM Personal Communications 3270 terminal emulator on Win32 via Win32::OLE. It is very well documented and it works very good."
This one has example code: http://www.perlmonks.org/?node_id=674214
Using IPC to drive the session:
http://www.linuxquestions.org/questions/linux-software-2/how-do-i-use-s3270-x3270-for-scripting-767067/
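As a rough illustration of the IPC approach, the same conversation can be driven from Python (a Perl version would be structurally identical): s3270 reads actions on stdin and answers on stdout, with each response ending in an ok or error line. The host name and screen interactions below are placeholders.

# Hypothetical sketch: drive s3270 over stdin/stdout.
import subprocess

s3270 = subprocess.Popen(["s3270"], stdin=subprocess.PIPE,
                         stdout=subprocess.PIPE, text=True)

def cmd(action):
    # Send one action and collect output up to the ok/error terminator.
    s3270.stdin.write(action + "\n")
    s3270.stdin.flush()
    lines = []
    while True:
        line = s3270.stdout.readline().rstrip("\n")
        lines.append(line)
        if line in ("ok", "error"):
            return "\n".join(lines)

cmd("Connect(mainframe.example.com)")
cmd("Wait(InputField)")        # wait until the host accepts input
cmd('String("LOGON MYUSER")')  # type into the current field
cmd("Enter")
print(cmd("Ascii()"))          # dump the whole screen as text
cmd("Disconnect")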
I hope this helps.
Regards,
You should do some research on QUOTE SITE FILETYPE=JES. This lets you FTP batch jobs straight into the JES spool. I do this dozens of times a day (maybe hundreds) to get my PC to accomplish tasks on the mainframe. If it can be done in batch, then this is a great way to do it. And of course, Perl is an excellent way to create and manipulate the JCL before it's submitted.
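To make the idea concrete, here is a small Python sketch of that FTP conversation (the poster scripts it from Perl; the FTP commands are the same either way). The host, credentials, and the trivial IEFBR14 job are placeholders.

# Hypothetical sketch: submit a batch job to JES over FTP.
from ftplib import FTP
from io import BytesIO

jcl = b"""//MYJOB    JOB (ACCT),'SUBMITTED VIA FTP',CLASS=A,MSGCLASS=X
//STEP1    EXEC PGM=IEFBR14
"""

ftp = FTP("mainframe.example.com")
ftp.login("tsouser", "secret")
ftp.sendcmd("SITE FILETYPE=JES")            # switch from dataset mode to JES
reply = ftp.storlines("STOR MYJOB.JCL", BytesIO(jcl))
print(reply)                                # reply normally includes the job id
ftp.quit()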
Another thing to look into: if you Telnet to the mainframe, it opens a TSO command dialog (just like option 6 in TSO). There are many things you can do there too. Of course, if you're doing IPLs and the like, you already know this.
My trouble is that I am not a systems programmer, so I cannot control the settings of the mainframe. There are many settings that my company's systems guys are too lazy to look into, so they just shut them down out of hand. I discovered the Telnet thing about a year ago, and was using it to see if a job had finished (that's the hard part of this... knowing when the job is done). Next thing I knew, Telnet access had been disabled.
I have tons of tools that let me do things on the mainframe via Perl. Hit me up and I'd love to share them with you.

First web server questions

Just looking for some help/suggestions with this. I need my own server for an upcoming project that will host users' websites. I want to build a control panel the user can log into to modify their website, which will be stored elsewhere on the server. This all seems easy enough; it's just managing domains and email that confuses me.
What should I look at to manage domain names and point them to the correct website, and what would be the best way to manage email accounts / set up new ones, etc.? I want to avoid cPanel/WHM if possible; I'm looking to control most things through the control panel I will be building. Any suggestions on this would be useful as well, since I will want to add email accounts through PHP (which can be done using a shell, I assume?).
I will also want to measure the bandwidth used by the websites in each user's directory; any suggestions on making this possible?
I'm really looking for some suggestions on what software to use to set this up, any advice would be really helpful!
Thanks,
Graeme
It sounds like you've got a lot of creative room. May I suggest a web framework? Django. With it you can build a nice control panel, and its template system is clean and concise. It's also based on Python, and that's why I suggest it: if there is a Python module for something, you can use it in Django, so things like altering and creating local data/files are a breeze. You simply use Python (you can even forget it's "Django"), crunch your data, and then spit it out (into Django, out to templates, to display to the user).
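As a tiny sketch of that pattern (the paths, log format, and template name here are hypothetical, not part of any real panel): plain Python does the work, and Django just renders the result.

# Hypothetical Django view: read a user's bandwidth log with ordinary
# Python and hand the total to a template.
from django.shortcuts import render

def bandwidth(request):
    log_path = f"/var/log/bandwidth/{request.user.username}.log"
    with open(log_path) as f:
        # assume each line is "<timestamp> <bytes>"
        total_bytes = sum(int(line.split()[1]) for line in f)
    return render(request, "panel/bandwidth.html", {"total": total_bytes})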
You'll likely want some AJAX-y business; there is a nice Django app for that, Dajax. Django has a rich and helpful community and tons of resources. Just hop on GitHub.com and search for Django; you'll find tons of stuff.
I'm building a DNS control panel with it, which sounds like a minimal version of what you're doing.

How should I create an automated deployment script?

I need to create some way to get a local WAR file deployed on a Linux server. What I have been doing until now is the following process:
1. Upload the WAR using WinSCP.
2. SSH into the server using PuTTY.
3. Move/rename/delete certain files and folders to prepare for the WAR explosion.
4. Explode the WAR.
5. Send an email notifying users of the restart.
6. Stop the Tomcat server.
7. Use tail to make sure the server stopped correctly.
8. Change the symlink to point to the exploded WAR.
9. Start Tomcat.
10. Use tail to make sure the server started correctly.
11. Send an email notifying users of the completed restart.
This stuff is all relatively straightforward, and I'm sure there are a million and one different ways to do it. I'd like to hear about some options. My first thought was a Bash script. I have very little experience with scripting in general, but thought this would be a good way to learn. I would also be interested in doing this with Ruby/Python or something similarly current, as I have little to no experience with these languages. I think that as a young developer, I should definitely get some sort of scripting language under my belt. I may also be interested in some sort of software solution that could do this for me, although I think scripting would be a better way to go for the sake of ease and customizability (I might have just made that word up).
Some actual questions for those that made it this far: What language would you recommend for automating the process I've listed above? Would this be a good opportunity for me to learn Bash/Ruby/Python/something else, or should I simply take the 10 minutes to do this by hand 2-3 times a week? (I would think the answer to that is obviously no.) Can I automate these things from my own computer, or will I need to set up the scripts to run on the Linux server? Is the email something I can automate, or am I better off doing that part myself?
More questions will almost certainly come up as I do this so thanks to all in advance.
UPDATE
I should mention that I am using Maven to build the WAR, so if I can do all of this with Maven, please let me know.
This might be too heavy-duty for your needs, but have you looked at build automation tools such as CruiseControl or Hudson? You might also want to look at Integrity, which is more lightweight and written in Ruby (instead of Java, like the other two I mentioned). These tools can do everything you said you need in your question, plus way, way more.
Edit
Since you want this to be more of a learning exercise in scripting languages than a practical solution, here's an idea for you. Instead of manually uploading your WAR to your server each time, set up a Mercurial repository on the server and create a hook (see here, here, and especially here) that executes a Ruby (or Ant, or Maven) script each time a changeset is pushed from a remote computer (i.e. your local workstation). You would write the script so it performs all the action items in your list above. That way, you get to learn three new things: a distributed version control paradigm, how to customize said tool, and how to write Ruby scripts that interact with your operating system (since your actions are very filesystem-heavy).
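Whichever trigger you choose, the script itself can be quite short. As a rough sketch, here it is in Python using Fabric (pip install fabric) for the SSH work, rather than the Ruby suggested above; the host, paths, and Tomcat locations are all placeholders, and the two email steps could be added with smtplib.

# Hypothetical deployment script covering the steps from the question.
from fabric import Connection

WAR = "target/myapp.war"         # built locally, e.g. by `mvn package`
RELEASES = "/opt/tomcat/releases"
CURRENT = "/opt/tomcat/current"  # symlink that Tomcat serves from

c = Connection("deploy@server.example.com")
c.put(WAR, f"{RELEASES}/myapp.war")                                     # step 1
c.run(f"cd {RELEASES} && rm -rf myapp && unzip -q myapp.war -d myapp")  # 3-4
c.run("/opt/tomcat/bin/shutdown.sh")                                    # step 6
c.run(f"ln -sfn {RELEASES}/myapp {CURRENT}")                            # step 8
c.run("/opt/tomcat/bin/startup.sh")                                     # step 9
c.run("tail -n 20 /opt/tomcat/logs/catalina.out")                       # step 10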
The most common tool in my experience is Ant. It's worth learning: it's all pretty simple and very useful.
You should definitely automate this, and you should aim to have it happen in one step.
What are you using to build the WAR file itself? There's some advantage to using the same tool for build and deployment. On several projects I've used Ant to build a Java project and deploy it to the servers.