Sauce Labs upload files with nightwatch.js

I have tests written with Nightwatch.js that are run against the remote Sauce Labs Selenium service (aka ondemand.saucelabs.com).
In the test flow, I need to upload a local file.
When I run tests locally, I use the setValue method, but this approach does not seem to work well with Sauce Labs.
Is there a proper way to upload local files with Nightwatch and Sauce Labs?

I came up with this solution: https://github.com/nightwatchjs/nightwatch/issues/890.
Unfortunately, there is a PR for this feature, but it has not been merged yet. So I forked the nightwatch repo and added an uploadFileToSeleniumServer method.

It looks like the solution is a custom command created for this purpose: https://github.com/RohanImmanuel/NightwatchJS-Remote-File-Upload.
You can also see a full repo of nightwatch examples here: https://github.com/saucelabs-training/demo-js/tree/main/nightwatch
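For reference, that custom command boils down to the Selenium remote file-upload endpoint (POST /session/:sessionId/file), which takes a base64-encoded zip of the local file and returns the file's path on the remote machine; you then pass that path to setValue. A rough sketch of the raw call, assuming an already-open session (SESSION_ID, SAUCE_USERNAME, and SAUCE_ACCESS_KEY are placeholders you must set):

# Zip the file first -- the endpoint expects a base64-encoded zip archive.
zip /tmp/upload.zip ./local-file.txt
PAYLOAD=$(base64 < /tmp/upload.zip | tr -d '\n')

# POST to the remote session's file-upload endpoint; the response value is the
# path of the file on the Sauce Labs VM, to be passed to setValue afterwards.
curl -s -X POST \
  -H 'Content-Type: application/json' \
  -d "{\"file\": \"${PAYLOAD}\"}" \
  "https://${SAUCE_USERNAME}:${SAUCE_ACCESS_KEY}@ondemand.saucelabs.com/wd/hub/session/${SESSION_ID}/file"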


GitHub Repositories (How to Run)

I have read the following answer here about how to run a specific file.
However, let's say I want to run all the code in the entire repository here, which uses MathJax, without downloading it.
How would one figure that out and do that? Is it a single JavaScript source file that you run?
If so, how do you figure out the URL that you run?
If you really don't want to download a repository, you might consider using a GitHub Action.
It accesses your code on GitHub's side and can execute whatever you need.
GitHub Actions has an API and uses GitHub-hosted runners (on GitHub's side, so no download on your part) as opposed to self-hosted runners.
A workflow can be anything you need; for instance, github-action-build builds your project in a repository-specific fashion.
As an example, github-action-for-latex compiles LaTeX documents using a Docker image (xu-cheng/latex-docker).
You would need a similar approach, using a Docker image where you can clone that repository, and execute it (because the Docker image would have everything needed to run your project).
And that would be done entirely on GitHub's (Azure-based) side.
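A rough sketch of such a workflow, assuming a Node-based project (the image, commands, and file names are placeholder assumptions; substitute whatever that repository actually needs):

# .github/workflows/run.yml -- runs on every push, entirely on GitHub's side
name: run-project
on: [push]
jobs:
  run:
    runs-on: ubuntu-latest
    container: node:20              # placeholder image with the project's toolchain
    steps:
      - uses: actions/checkout@v4   # clones the repository onto the runner
      - run: npm install            # placeholder: install dependencies
      - run: npm test               # placeholder: the command that actually runs the code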

Reusing PowerShell Scripts in Azure DevOps

I have a PowerShell script that I want to re-use across multiple build pipelines. My question is, is there a way I can "store" or "save" my PowerShell script at the project or organization scope so that I can use it in my other build pipelines? If so, how? I can't seem to find a way to do this. It would be super handy though.
It is now possible to check out multiple repositories in one YAML pipeline. You could place your script in one repository and check it out in a pipeline of any other repository. You could then reference the script directly from the pipeline workspace.
More info here.
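A sketch of what that looks like, with placeholder project, repository, and script names:

# azure-pipelines.yml in the consuming repository
resources:
  repositories:
    - repository: scripts              # alias used by the checkout step below
      type: git                        # an Azure Repos Git repository
      name: MyProject/shared-scripts   # placeholder: project/repository

steps:
  - checkout: self
  - checkout: scripts
  # With multiple checkout steps, each repository lands in its own folder
  # under $(Build.SourcesDirectory).
  - pwsh: ./shared-scripts/my-script.ps1   # placeholder script path
    workingDirectory: $(Build.SourcesDirectory)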
Depending on how big these scripts are, you can create task groups that contain PowerShell tasks with the script as inline PowerShell. But this only works at project scope.
Another approach I'd try would be to create a repo containing your PowerShell scripts, add this repo as a submodule to the repository you are trying to build, and then call the scripts from the submodule folder (see the sketch below). But this only works when using Git repos.
Or you could create a custom build task that contains your script.
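A sketch of the submodule variant (the repository URL and paths are placeholders):

# In the repository you are building: add the scripts repo as a submodule.
git submodule add https://dev.azure.com/MyOrg/MyProject/_git/shared-scripts tools/scripts
git commit -m "Add shared PowerShell scripts as a submodule"

# In the build, after a checkout that also fetches submodules
# (submodules: true on the checkout step in YAML), call the script:
pwsh ./tools/scripts/my-script.ps1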
From what I have seen, no.
A few different options I have explored are:
If using a non-hosted agent, saving the file onto the build server. Admittedly this doesn't scale well, but it is better than copy/pasting the script all over. I was able to put these scripts into version control and deploy them via their own pipeline, so that might be a solution for scaling (if necessary).
Cloning another repository that has these shared scripts during the process.
I've been asking for this feature for a bit, but it seems the Azure DevOps team has higher priorities.
How about putting the PowerShell script in a NuGet package and installing that in dependent projects?
I just discovered YAML templates (https://learn.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azdevops#step-re-use).
I think it may help you in this case (depending on how large your file is): you can put an inline PowerShell script in that template YAML and reuse it in your main YAML.
Documentation is pretty straightforward.
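A minimal sketch, with hypothetical file and parameter names:

# templates/run-script.yml -- a reusable step template with an inline script
parameters:
  - name: message
    default: 'hello from the template'

steps:
  - pwsh: |
      # ... the shared PowerShell body goes here ...
      Write-Host "${{ parameters.message }}"

# azure-pipelines.yml -- the main pipeline consuming the template
steps:
  - template: templates/run-script.yml
    parameters:
      message: 'building $(Build.Repository.Name)'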

Automatically mirroring a Gitlab repo onto Github on push

I'm looking for a way to automatically mirror my Gitlab repos to Github, on push. I use Gitlab repos as my main repos, and would rather have to push to only one remote. But, I want my code to be browsable on Github also.
I found similar questions on StackOverflow, such as this one.
But the answers are always the same: one should add a custom post-receive git hook to the GitLab repo. This requires shell access to the server running GitLab. As I'm hosting a community edition GitLab for many users, not just me, they can't easily be given shell access (and this isn't the most user-friendly way to do it anyway), so it does not fit my needs.
I thought about two ways to implement it:
Either a MirrorOnPush project service, implementing such a git hook in Ruby, as the EmailOnPush project service currently does.
Or use a custom server to clone and push the repo, using a webhook.
The first one seems cleaner to me, but I can't find any doc about GitLab project services and their code structure… On the other hand, the second is a bad and ugly hack, but is almost straightforward.
I'd rather implement a project service to handle it. Do you have any docs or leads on how to write a project service for GitLab (without having to read all the GitLab source code, as there seems to be no dev doc…)?
Thanks!
one should add a custom post-receive git hook to the gitlab repo.
Actually, that was the best solution up until GitLab 7.x, as I detailed in "Gitlab repository mirroring".
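For completeness, the hook those answers rely on is tiny; a sketch, assuming the GitHub mirror already exists and is reachable over SSH (the remote URL is a placeholder):

#!/bin/sh
# post-receive hook in the bare repository on the GitLab server
# (location depends on the GitLab version, e.g. custom_hooks/post-receive):
# mirror every branch and tag to GitHub after each push.
git push --mirror git@github.com:youruser/yourrepo.git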
A true project service for repo mirroring has been requested, but not voted up enough: suggestion 4614663.
The main documentation remains:
the app models project services folder,
the spec models project services folder,
the doc/project_services,
the project services scenarios.
This isn't much, as the OP noted before.
That leaves you with the hack approach.

Easy Deployment with Github?

I searched the web now for several hours but couldn't get around this:
Is there an easy way to deploy a private repository from GitHub to a staging/development server on each push (or at least manually)? (Ideally, only the development server's FTP credentials would be needed for this.)
I found this: How can I automatically deploy my app after a git push (GitHub and node.js)? But this kind of "tutorial" in the best answer stops at the point of what exactly to insert into the build.sh. And what is needed for this on the development server? SSH, Git, Ruby? Maybe this sounds stupid to you, or my thinking is wrong, because I found no answer to this anywhere on the net.
The problem is that, most of the time, the server to which the contents of the master branch should be deployed is a shared hosting server, where you don't always have the SSH, Git, Python, Ruby, etc. on which most solutions for deploying from GitHub seem to rely... :/
http://beanstalkapp.com/ is really great at this: you can just enter FTP credentials and deploy automatically or manually for chosen repositories and branches. So I wondered why I couldn't find a similarly easy way to deploy from GitHub?
Thank you very much in advance!
Jonas
It isn't really clear what type of project you have, but here are a couple of ideas.
If your code is written in a compiled language, then you could:
Have a Jenkins server as mentioned in the other comment
Write a simple bash script that does a git pull and compiles, and add a cron job for it (see the sketch after this list).
Use an automation framework like Chef or Puppet which would automatically keep the compiled binary up to date.
If your code is an interpreted language (like HTML & JavaScript), then you could:
Use Vagrant for local testing. The biggest advantage is that changes are live on your local system; it only takes a git push on your machine and a git pull on the production server to make your changes live globally.
Your best bet is probably going to be #2, the bash script with a cron job.
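A minimal sketch of that option, assuming a checkout under /srv/app and a Maven build (all paths and commands are placeholders):

#!/bin/sh
# /usr/local/bin/deploy.sh -- pull the latest master and rebuild.
cd /srv/app || exit 1
git pull origin master
mvn package        # or make, gradle, ... whatever compiles the project

# Install it as a cron job, e.g. every five minutes:
# */5 * * * * /usr/local/bin/deploy.sh >> /var/log/deploy.log 2>&1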

Can I download artifacts built by BuildHive?

I have started using the free Jenkins build service on BuildHive for one of my GitHub projects. This is also my first try doing anything with Maven. I have succeeded in building my project using this script on BuildHive:
cd base_dir
mvn package
The build log shows that the resulting JAR has been built. Now I would like to offer the JAR to my project's users as a download artifact because GitHub has discontinued the feature of manually uploading binaries in a separate download section.
Is there any way I can download an artifact, referencing it by a URL? If so, how do I construct the URL, knowing only the artifact's local path from the build log?
Alternatively, is there a way in which I can push the artifact to another place by adding a command to my build shell script after mvn package? I was thinking of something like a curl or ftpput command.
The best thing I was able to come up with as a quick workaround was to upload the artifacts in question to my FTP server via curl, as suggested in my original question. It works, but the downside is that the FTP credentials appear in the public build log. I have counterbalanced that with a shell script on my DSL router which checks for FTP storage abuse every few minutes.
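The upload itself is a one-liner; a sketch with placeholder host, path, and file names (credentials kept in variables so they are at least not hard-coded in the script):

# Push the built JAR to an FTP server after mvn package.
curl -T target/myproject-1.0.jar \
     --user "$FTP_USER:$FTP_PASS" \
     ftp://ftp.example.com/downloads/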
As an alternative, I found that after creating a free CloudBees account for my little open source project, I got my own Jenkins build configuration as well as my own artifact repository to which I can deploy my build artifacts. This is much more elegant and does not involve posting any FTP credentials to a public server.
I am still open for BuildHive-only solutions if anyone has a smart idea. :-)