Does Device Farm have local data that can be used for web testing? - aws-device-farm

I'm using Device Farm for web testing. The tests need to upload certain attachments (PNG/JPG). Does the Device Farm environment have local files with data that can be used for uploading? If so, how can I access these files, or how can I log their path/directory? If not, is there a way to upload data while running the test?
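I'm not aware of documented sample files inside the Device Farm browser environment. One common workaround in Selenium-based web tests is to generate the attachment at runtime on the machine running the test and hand its path to the page's file input; with a remote grid you also need Selenium's file detector so the local file is transferred to the remote browser. A minimal TypeScript sketch, assuming the `selenium-webdriver` package, a page with an `<input type="file">`, and that the grid supports Selenium's file upload endpoint (grid URL, page URL and selector are placeholders):

```typescript
// Hypothetical sketch: generate a small attachment at test time and feed it to a file input.
import * as fs from "fs";
import * as os from "os";
import * as path from "path";
import { Builder, By } from "selenium-webdriver";
import * as remote from "selenium-webdriver/remote";

async function uploadGeneratedAttachment(gridUrl: string): Promise<void> {
  // Create a throwaway file locally; a real test would write valid PNG/JPG bytes.
  const attachment = path.join(os.tmpdir(), "test-attachment.png");
  fs.writeFileSync(attachment, Buffer.from("placeholder image data"));

  const driver = await new Builder()
    .usingServer(gridUrl)       // remote WebDriver hub URL provided by the testing service
    .forBrowser("chrome")
    .build();
  try {
    // The file detector ships the local file to the remote browser node.
    driver.setFileDetector(new remote.FileDetector());
    await driver.get("https://example.com/upload-page");   // placeholder URL
    await driver.findElement(By.css('input[type="file"]')).sendKeys(attachment);
  } finally {
    await driver.quit();
  }
}
```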

Related

Upload image error -> blob:http://localhost:3000/48c7da66-42c0-4ed3-8691-2dedd5ce4984:1 Failed to load resource: net::ERR_FILE_NOT_FOUND [duplicate]

I built a MERN app and hosted it on Heroku.
I save the user's images on the server with multer, and it works fine for a while, i.e. an uploaded image is fetched successfully.
But after the application has been closed for a long time, that image is no longer available on the server.
On searching I found that each dyno on Heroku boots with a clean copy of the filesystem from the most recent deploy.
But then how and where should I save images?
The dyno filesystem is ephemeral, so you need to store the file on external storage (e.g. S3, Dropbox) or use a Heroku plugin (e.g. for FTP).
Check Files on Heroku to understand (free) options for storing/managing files (the examples are in Python, but the concept is valid for other stacks too).
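For the common S3 option, here is a minimal sketch of what the multer side could look like, assuming the `multer-s3` and `@aws-sdk/client-s3` packages (bucket name, region and route are placeholders). Files then survive dyno restarts because they live in S3, not on the dyno filesystem:

```typescript
// Sketch only: store uploads in S3 instead of the ephemeral dyno filesystem.
import express from "express";
import multer from "multer";
import multerS3 from "multer-s3";
import { S3Client } from "@aws-sdk/client-s3";

const app = express();
const s3 = new S3Client({ region: "us-east-1" });

const upload = multer({
  storage: multerS3({
    s3,
    bucket: "my-app-uploads",                                  // placeholder bucket
    key: (_req, file, cb) => cb(null, `${Date.now()}-${file.originalname}`),
  }),
});

// The uploaded file is written to S3; multer-s3 exposes its URL as req.file.location.
app.post("/api/images", upload.single("image"), (req, res) => {
  res.json({ url: (req.file as any).location });
});

app.listen(Number(process.env.PORT) || 3000);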

Is there a way to sync PWA web client IndexedDB data to OneDrive?

I am developing a PWA web client app. The data is stored in local IndexedDB.
Is there a way to sync the IndexedDB data to OneDrive with its API, or do I have to develop my own server for syncing the data?
I have checked the OneDrive API, and I know it supports file sync. I don't know how draw.io does it; it seems to save the file to OneDrive.
Right now I am considering changing the app to a Chrome extension with Google sync, or making an Electron app that uses OneDrive to sync the data.
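One possible approach (not a confirmed description of how draw.io does it) is to treat the IndexedDB contents as a document: export them to JSON and push that file to OneDrive through the Microsoft Graph files API, then pull and re-import on other devices. A rough TypeScript sketch, assuming you already obtained an access token with the Files.ReadWrite scope (e.g. via MSAL) and that the export/import helpers exist in your app:

```typescript
// Sketch: sync an IndexedDB export as a single JSON file in the user's OneDrive.
// getAccessToken() and the export/import helpers are assumed to exist in your app.
const FILE_PATH = "my-pwa/state.json"; // placeholder path inside OneDrive

async function pushToOneDrive(accessToken: string, json: string): Promise<void> {
  // Simple upload endpoint for small files (< 4 MB); larger files need an upload session.
  const res = await fetch(
    `https://graph.microsoft.com/v1.0/me/drive/root:/${FILE_PATH}:/content`,
    {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: json,
    }
  );
  if (!res.ok) throw new Error(`OneDrive upload failed: ${res.status}`);
}

async function pullFromOneDrive(accessToken: string): Promise<string | null> {
  const res = await fetch(
    `https://graph.microsoft.com/v1.0/me/drive/root:/${FILE_PATH}:/content`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (res.status === 404) return null; // nothing synced yet
  if (!res.ok) throw new Error(`OneDrive download failed: ${res.status}`);
  return res.text();
}
```

Conflict handling (e.g. comparing the drive item's lastModifiedDateTime or eTag before overwriting) would still be up to you, which is the part a dedicated sync server usually solves.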

Azure - Use of dynamically generated Auth-token in Load Test requests

I'm quite new to Azure/Azure DevOps and its load test app.
I need to execute a load test for a scenario where I log in as a user and execute a few requests while logged in.
The initial request returns an auth token in its response, and that auth token is used as one of the header parameters in all subsequent requests.
What I cannot figure out (nor find on the Internet) is:
1. How to get the auth token from the first response;
2. How to use this token (dynamically) in all other requests in the load test.
Any help is greatly appreciated.
I guess you intend to run a web test against your web application.
Have you created your load test project with Visual Studio?
As for getting the auth token and using it in other requests, I don't think it is achievable with the Azure DevOps load test app.
I believe it should be done in your web test code.
You can create your web test as introduced in this document, or write your test in code with Selenium or other tools, and then add the web test to a load test project. To deal with dynamic parameters which cannot be detected by your web test, you can check this document.
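Whatever tool you end up using, the pattern is the same: issue the login request, extract the token from its response, and inject it into the headers of the subsequent requests. A minimal TypeScript sketch of that pattern (endpoint paths and the token field name are placeholders; in a Visual Studio web test the equivalent is an extraction rule plus a context parameter):

```typescript
// Sketch of the token-handoff pattern; URLs and JSON field names are placeholders.
async function runScenario(baseUrl: string, user: string, password: string): Promise<void> {
  // 1. Log in and extract the auth token from the first response.
  const loginRes = await fetch(`${baseUrl}/api/login`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ user, password }),
  });
  if (!loginRes.ok) throw new Error(`login failed: ${loginRes.status}`);
  const { authToken } = await loginRes.json(); // assumes the token is returned as "authToken"

  // 2. Reuse the token in the headers of every subsequent request.
  const authHeaders = { Authorization: `Bearer ${authToken}` };
  await fetch(`${baseUrl}/api/orders`, { headers: authHeaders });
  await fetch(`${baseUrl}/api/profile`, { headers: authHeaders });
}
```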
After you have created your load test project, you can upload it to the Azure DevOps load test service as shown in the picture below.

Spring Tool Suite - log files getting exported

My company doesn't allow copying of any files from its intranet to the outside world. However, after I installed Spring Tool Suite, it automatically tries to send some log files outside, and I keep getting emails from my company's internal system that tracks this kind of activity.
How do I switch this off so that no files are exported automatically?
If you have control over your firewall (or at least an outgoing packet filter on your workstation), you could simply block outgoing connections for that software.

Auto upload remote files into Google cloud storage via FTP?

I download a lot of CSV files via FTP from different sources on a daily basis. I then upload these files into Google Cloud Storage.
Are there any programs/APIs/tools to automate this?
I'm looking for the best way, if possible, to load these files directly into Google Cloud Storage without having to download them locally. Something I can deploy on Google Compute Engine, so I don't need to run local programs like FileZilla/CrossFTP. The program/tool would keep checking the remote location on a regular basis, load new files into Google Cloud Storage, and ensure a checksum match.
I apologize in advance if this is too vague/generic a question.
Sorry, no. Automatically importing objects from a remote FTP server is not currently a feature of GCS.
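If you do end up rolling your own on a small Compute Engine instance, the core loop is simple: list the FTP directory, download anything new, and push it to a bucket. A rough TypeScript sketch, assuming the `basic-ftp` and `@google-cloud/storage` packages, placeholder host/bucket names, and default application credentials on the VM (the `validation` option asks the client library to verify the upload's checksum against GCS):

```typescript
// Sketch of a "poll FTP, copy new files to GCS" job; host, credentials and bucket are placeholders.
import * as path from "path";
import * as os from "os";
import { Client } from "basic-ftp";
import { Storage } from "@google-cloud/storage";

const bucket = new Storage().bucket("my-csv-landing-zone"); // placeholder bucket

async function syncOnce(): Promise<void> {
  const ftp = new Client();
  try {
    await ftp.access({ host: "ftp.example.com", user: "user", password: "secret" }); // placeholders
    for (const entry of await ftp.list("/exports")) {
      if (!entry.name.endsWith(".csv")) continue;

      // Skip files that already exist in the bucket.
      const [exists] = await bucket.file(entry.name).exists();
      if (exists) continue;

      // Stage via a temp file on the VM, then upload with checksum validation.
      const tmp = path.join(os.tmpdir(), entry.name);
      await ftp.downloadTo(tmp, `/exports/${entry.name}`);
      await bucket.upload(tmp, { destination: entry.name, validation: "crc32c" });
    }
  } finally {
    ftp.close();
  }
}

// Run on a schedule, e.g. from cron on the VM.
syncOnce().catch(console.error);
```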