How to integrate JMeter scripts with UFT or QTP - plugins

I want to run a JMeter script from UFT/QTP. Please let me know if I need to install a plugin for JMeter in UFT, and the required setup for the same.

Integrate in which direction? UFT is Unified Functional Testing, while JMeter is a load testing tool that falls into non-functional testing, so there is no straightforward integration; you will have to use the underlying operating system as a proxy.
You can run QTP/UFT scripts from JMeter using the OS Process Sampler and CScript.exe.
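For that direction, a minimal sketch of the command line you would put in the OS Process Sampler (the driver-script path is hypothetical; such a script would typically drive UFT through its COM automation object model):
cscript.exe //B //Nologo C:\uft-tests\run_uft_test.vbs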
You can run JMeter tests from a QTP/UFT script like this:
Dim myCmd
' Create a Windows shell object and launch JMeter in non-GUI mode
Set myCmd = CreateObject("WScript.Shell")
' 1 = normal window; True = wait for the JMeter run to finish before continuing
myCmd.Run "cmd /c jmeter -n -t test.jmx -l result.jtl", 1, True
Set myCmd = Nothing

Why do you need integration at all? If you want to keep your environment clean, you should not mix two different tools and technologies that have nothing to do with each other.
Let your CI system (Jenkins, HPE ALM / Octane, etc.) trigger the execution of the different types of tests. If you have full UFT licenses, you may also consider API tests (based on C#/.NET) and load tests based on the HPE tool suite.

Related

How can I trigger a Jenkins build/job via an external Perl script?

I am aiming to write a Perl script in a Unix environment which triggers a Jenkins build/job by passing the URL for the Jenkins master and other parameters.
How could this be done?
Here are some detailed questions I had:
What Perl libraries would I need?
What functions in Perl are needed?
How can I pass other build parameters to Jenkins from the script?
How do I get the results back from Jenkins?
You would normally need the LWP module and maybe its submodules.
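A minimal sketch with LWP::UserAgent (the Jenkins URL and token are placeholders; newer Jenkins versions want a POST and may additionally require an API token or CSRF crumb):

use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
# /job/<name>/build is Jenkins' standard remote-trigger endpoint
my $res = $ua->post('http://jenkins.example.com:8080/job/my-job/build?token=MY_TOKEN');
die 'Trigger failed: ' . $res->status_line unless $res->is_success;
print "Build triggered\n";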
I am normally the person who will say "Don't use system when you can use a Perl solution".
However, I will make an exception in this case since it is way easier just to make that one system call to the wget command:
system qq(wget -q $build_trigger_url);
How can I pass other build parameters to Jenkins from the script?
Passing parameters to the Jenkins build involves either setting environment variables by modifying the %ENV hash, or modifying the URL to include these parameters (via a GET request). Different plugins and configurations require different ways of doing this.
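For instance, with Jenkins' standard buildWithParameters endpoint (the parameter name TARGET_ENV is invented for illustration), the same LWP approach becomes:

use strict;
use warnings;
use LWP::UserAgent;

my $ua  = LWP::UserAgent->new;
# Parameters are sent as form data to /job/<name>/buildWithParameters
my $res = $ua->post(
    'http://jenkins.example.com:8080/job/my-job/buildWithParameters',
    { token => 'MY_TOKEN', TARGET_ENV => 'staging' },
);
die 'Trigger failed: ' . $res->status_line unless $res->is_success;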
If your build machine is a Windows system, you can download the wget command.
How do I get the results back from Jenkins?
Jenkins has a built-in RESTful API; just click the REST API links at the bottom of each page. You can use the REST::Client module to make REST calls, or just use system calls to wget.
The RESTful API will return data in either JSON or XML format, so you should get a JSON or XML module to help read it. Sometimes the RESTful API returns plain text, such as when getting the latest build number or timestamp.
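A sketch of reading a build result via the JSON API (the job name is a placeholder; /api/json is Jenkins' standard REST suffix):

use strict;
use warnings;
use LWP::UserAgent;
use JSON;   # provides decode_json; JSON::PP ships with modern Perls

my $ua  = LWP::UserAgent->new;
my $res = $ua->get('http://jenkins.example.com:8080/job/my-job/lastBuild/api/json');
die 'Request failed: ' . $res->status_line unless $res->is_success;

# Pick a couple of fields out of the decoded JSON hash
my $build = decode_json($res->decoded_content);
print "Build $build->{number} finished with result: $build->{result}\n";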

Deployment not in a domain - psexec.exe or PowerShell remoting

I am working on an automated deployment process for a web application. The deployment will need to:
Deploy DB changes to database using sqlpackage.exe
Deploy reporting services reports to the reports server using the web service
Deploy web app to web server(s)
Deploy fonts for reports
among other things
The first two are reasonably straightforward to run from the web server, as the web service and db are contactable, and the tools to deploy run over the network.
From reading, it appears that PowerShell remoting should be the way to go, and internally this would not be a problem. However, the production deployment will be carried out in a datacentre, where the machines (2 web, 1 db) are not on a domain at all. I'd like to come up with a generic process that can run both internally and externally with the appropriate configuration. PowerShell remoting with machines not in a domain appears to require a fair bit of configuration (HTTPS and so on), as NT credentials can't be validated.
Should I battle through configuring PowerShell remoting, or would it be better to copy the deployment artifacts to a drop folder on the remote machine and use psexec to execute a PowerShell script directly on it?
psexec seems to "just work", while PowerShell remoting appears to come with a lot more pain.
Why not use psexec then? You can restrict its role to just getting you onto the remote machine, and not let it infect your scripts. I have not attempted to get PS remoting working on non-domain machines, but in general I have found it to be fairly high effort to get going. psexec, as you say, can often be simpler.
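A hypothetical psexec invocation along those lines (machine name, credentials, and paths are all placeholders):
psexec \\webserver01 -u deployadmin -p secret powershell -ExecutionPolicy Bypass -File C:\drop\deploy.ps1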
Excuse the peddling, but the open-source framework I helped build, called PowerUp, essentially does all this for you. It uses a model in which the PowerShell (well, psake) scripts can move execution to another machine by calling a specific function. This can be done either with PowerShell remoting or psexec; you wouldn't need to change the script, it just requires a per-environment setting to say which one to use.
Check out the sample at https://github.com/AffinityID/PowerUpSamples/tree/master/SimpleWebsite.
Hopefully that shows you enough, but if not let me know and we can go into more detail.

Jenkins - How to schedule a Jenkins job when another job completes on a remote machine

I have two remote machines: A and B. Both have Jenkins installed.
A: This will build from trunk.
B: This will trigger automation.
How can I configure the Jenkins job on Machine B to run when the build is successful on Machine A?
I had the same requirement because one of the servers I was using belonged to a different company, so while altering their Jenkins set-up would have been possible, it was clearly going to take a long time to get buy-in, even though I was allowed to monitor it and its outputs. If you don't have these restrictions, then you should definitely use the master-slave configuration to address this instead. That said, here is the solution I came up with; I hope to go down the master-slave route myself when possible.
Install the ScriptTrigger plug-in for Jenkins; you can then watch the remote Jenkins instance with a script similar to the following:
# Ask the remote Jenkins for the number of the last successful build
LAST_SUCCESSFUL_UPSTREAM_BUILD=`curl -s http://my.remote.jenkins.instance.com:8080/job/remoteJobName/lastSuccessfulBuild/buildNumber`
# Read the build number we saw last time (defaults to 0 on the first run)
LAST_KNOWN_UPSTREAM_BUILD=`cat $WORKSPACE/../lastKnownUpstreamBuild || echo 0`
# Remember the current number for the next poll
echo $LAST_SUCCESSFUL_UPSTREAM_BUILD > $WORKSPACE/../lastKnownUpstreamBuild
# Exit 1 (trigger a build) only if a newer successful build has appeared
exit $(( $LAST_SUCCESSFUL_UPSTREAM_BUILD > $LAST_KNOWN_UPSTREAM_BUILD ))
Get ScriptTrigger to schedule a build whenever the exit code is '1'. Set up a suitable polling interval and there you have it.
This will obviously only schedule a build if the upstream job succeeds. Use "lastBuild" or "lastFailedBuild" instead of "lastSuccessfulBuild" in the URL above as your requirements dictate.
NOTE: Implemented using a Bash shell. It may work in other UNIX shells, but won't work on Windows.

How to run same tests on different servers using prove?

I am using the Perl prove testing utility (TAP::Harness) to test my program.
I need to run the same tests first on a local computer, then on a remote computer.
(The test programs should connect to localhost or to the remote host, respectively.)
How can I pass parameters (e.g. the test server) to tests using prove? Should I use the environment, or is there a better solution?
An environment variable sounds good, since you do not have easy access to higher-abstraction means of passing data, such as command-line options or function parameters.
There is already prior art in the variables TEST_VERBOSE, AUTOMATED_TESTING and RELEASE_TESTING, which influence how tests are run.
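A minimal sketch of this approach (the TEST_SERVER variable name and port 8080 are invented for illustration):

# In t/connect.t
use strict;
use warnings;
use IO::Socket::INET;
use Test::More;

my $server = $ENV{TEST_SERVER} || 'localhost';

# Example check: can we open a TCP connection to the service under test?
my $sock = IO::Socket::INET->new(PeerAddr => $server, PeerPort => 8080, Timeout => 5);
ok( $sock, "can connect to $server" );
done_testing();

Then run the suite twice:
prove t/                                  # local run, defaults to localhost
TEST_SERVER=remote.example.com prove t/   # same tests against the remote host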
Depending on your larger goals, you may wish to approach the problem differently. We use Jenkins to control test suite runs that ultimately run "prove". It's set up to run tests on multiple servers and offers a number of features to manage "builds", which can be test suite runs.

Scheduled Tasks for Web Applications

What are the different approaches for creating scheduled tasks for web applications, with or without a separate web/desktop application?
If we're talking about the Microsoft platform, then I'd always develop a separate Windows Service to handle such batch tasks.
You can always reference the same assemblies that are being used by your web application to avoid any nasty code duplication.
Jeff discussed this on the Stack Overflow blog:
https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/
Basically, Jeff proposed using the CacheItemRemovedCallback as a timer for calling certain tasks.
I personally believe that automated tasks should be handled as a service, a Windows scheduled task, or a job in SQL Server.
Under Linux, check out cron.
I think Stack Overflow itself is using an ApplicationCache expiration to run background code at intervals.
If you're on a Linux host, you'll almost certainly be using cron.
Under Linux you can use cron jobs (http://www.unixgeeks.org/security/newbie/unix/cron-1.html) to schedule tasks.
Use URL fetchers like wget or curl to make HTTP GET requests, and secure your URLs with authentication so that no one can execute the tasks without knowing the user/password.
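A hypothetical crontab entry along these lines (URL and credentials are placeholders):

# Run the task every night at 02:30, authenticating with HTTP basic auth
30 2 * * * curl -s -u taskuser:secret https://example.com/tasks/nightly >/dev/null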
I think Windows' built-in Task Scheduler is the suggested tool for this job, although that does require an outside application.
This may or may not be what you're looking for, but read the article "Simulate a Windows Service using ASP.NET to run scheduled jobs". I think Stack Overflow may use this method, or at least it was discussed.
A very simple method that we've used where I work is this:
Set up a webservice/web method that executes the task. This webservice can be secured with username/pass if desired.
Create a console app that calls this web service. If desired, you can have the console app send parameters and/or get back some sort of metrics for output to the console or external logging.
Schedule this executable in the task scheduler of choice.
It's not pretty, but it is simple and reliable. Since the console app is essentially just a heartbeat to tell the app to go do its work, it does not need to share any libraries with the application. Another plus of this methodology is that it's fairly trivial to kick off manually when needed.
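On Windows, the scheduling step might look like this (the task name and executable path are invented):

rem Register the heartbeat console app to run daily at 02:00
schtasks /create /tn "NightlyWebTask" /tr "C:\jobs\TaskHeartbeat.exe" /sc daily /st 02:00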
You can also tell cron to run PHP scripts directly, for example. Set the permissions on the PHP file to prevent other people accessing it, or better yet, don't keep these utility scripts in a web-accessible directory at all.
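For instance (the script path is a placeholder):

# Run a maintenance script every 15 minutes via the PHP command-line interpreter
*/15 * * * * php /home/user/scripts/cleanup.php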
Java and Spring -- use Quartz. Very nice and reliable -- http://static.springframework.org/spring/docs/1.2.x/reference/scheduling.html
I think there are easier ways than using cron (Linux) or Task Scheduler (Windows). You can build this into your web app using:
(a) the Quartz scheduler,
or, if you don't want to integrate another third-party library into your application:
(b) a thread created on startup which uses the standard Java java.util.Timer class to run your tasks.
I recently worked on a project that does exactly this (obviously it is an external service, but I thought I would share):
https://anticipated.io/
You can receive a webhook or an SQS event at a specific scheduled time. Dealing with these schedulers can be a pain, so I thought I'd share in case someone is looking to offload that concern.