FileNotFoundException error on AWS Device Farm while fetching data from a .csv file - aws-device-farm

I am new to Device Farm. I am fetching the username/password from a .csv file and the test works in my local environment. When I zipped and uploaded it to Device Farm, it throws a FileNotFoundException. I've placed the csv file under src/test/resources/csvfile and written the code below:
CSVReader reader = new CSVReader(new FileReader("src/test/resources/com/testdata.csv"));
This code works well in my local environment; however, it does not on Device Farm.
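One likely cause is that Device Farm runs the packaged test JAR from a different working directory, so a relative src/test/... path does not exist on the device host. Here is a minimal sketch of loading the file from the test classpath instead, assuming opencsv and that the CSV is packaged as com/testdata.csv inside the test JAR (the TestDataLoader class name and resource path are placeholders, adjust them to your project):
import com.opencsv.CSVReader;
import java.io.FileNotFoundException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class TestDataLoader {
    // Loads the CSV from the test classpath so the same code works locally and on Device Farm.
    public static CSVReader openTestData() throws FileNotFoundException {
        InputStream in = TestDataLoader.class.getClassLoader().getResourceAsStream("com/testdata.csv");
        if (in == null) {
            throw new FileNotFoundException("com/testdata.csv not found on the test classpath");
        }
        return new CSVReader(new InputStreamReader(in, StandardCharsets.UTF_8));
    }
}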

Related

Firestore: Copy/Import data from local emulator to Cloud DB

I need to copy Firestore DB data from my local Firebase emulator to the cloud instance. I can move data from the Cloud DB to the local DB fine, using the EXPORT functionality in the Firebase admin console. We have been working on the local database instance for 3-4 months and now we need to move it back to the Cloud. I have tried to move the local "--export-on-exit" files back to my storage bucket and then IMPORT from there to the Cloud DB, but it fails every time.
I have seen one comment by Doug at https://stackoverflow.com/a/71819566/20390759 saying that this is not possible, and that the best solution is to write a program to copy from local to cloud. I've started working on that, but I can't find a way to have both databases, local and cloud, open at the same time. They both use the same project ID, app key, etc.
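For illustration, a rough sketch of what such a copy program might look like with the Java client library, assuming FirestoreOptions.Builder.setEmulatorHost (or the FIRESTORE_EMULATOR_HOST environment variable) can point one client at the emulator while a second client talks to the cloud endpoint; the project ID, emulator host, and collection name are placeholders:
import com.google.api.core.ApiFuture;
import com.google.cloud.firestore.Firestore;
import com.google.cloud.firestore.FirestoreOptions;
import com.google.cloud.firestore.QueryDocumentSnapshot;
import com.google.cloud.firestore.QuerySnapshot;

public class EmulatorToCloudCopy {
    public static void main(String[] args) throws Exception {
        // Client pointed at the local emulator.
        Firestore local = FirestoreOptions.newBuilder()
                .setProjectId("my-project-id")        // placeholder project ID
                .setEmulatorHost("localhost:8080")    // placeholder emulator host:port
                .build()
                .getService();

        // Client pointed at the cloud database (uses application default credentials).
        Firestore cloud = FirestoreOptions.newBuilder()
                .setProjectId("my-project-id")        // placeholder project ID
                .build()
                .getService();

        // Copy one collection document by document; repeat per collection (and subcollection) as needed.
        ApiFuture<QuerySnapshot> snapshot = local.collection("users").get();   // placeholder collection
        for (QueryDocumentSnapshot doc : snapshot.get().getDocuments()) {
            cloud.collection("users").document(doc.getId()).set(doc.getData()).get();
        }

        local.close();
        cloud.close();
    }
}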
Attempted IMPORT: I copied the files created by "--export-on-exit" in the emulator to my cloud storage bucket. Then, I selected IMPORT and chose the file I copied up to the bucket. I get this error message:
Google Cloud Storage file does not exist: /xxxxxx.appspot.com/2022-12-05 from local/2022-12-05 from local.overall_export_metadata
So, I renamed the metadata file to match the directory name from the local system, and the IMPORT claims to initiate successfully, but then fails with no error message.
I've been using Firestore for several years, but we just started using the emulator this year. Overall I like it, but if I can't easily move data back to the Cloud, I can't use it. Thanks for any ideas/help.

Export a CSV file from AS400 to my PC through a CL program

I want to export a database file that is created through a query from the AS400 machine to my PC in the form of a CSV file.
Is there a way to create that connection between the AS400 and my PC through a CL program?
An idea of what I want to do can be derived from the following code:
CLRPFM DTABASENAME
RUNQRY QRY(QRYTEST1)
CHGVAR VAR(&PATH) VALUE('C:\TESTS')
CHGVAR VAR(&PATH1) VALUE('C:\TESTS')
CHGVAR VAR(&CMD) VALUE(%TRIM(&PATH) *CAT '/DTABASENAME.CSV' !> &PATH !> &PATH1)
STRPCO PCTA(*YES)
STRPCCMD PCCMD(&CMD) PAUSE(*YES)
where I somehow get my database file, give the path on my PC where I want it saved, and lastly run the PC command accordingly.
Take a look at Copy From Query File (CPYFRMQRYF), which will allow you to create a database physical file from the query.
You may also want to look at Copy To Import File (CPYTOIMPF), which will copy data from a database physical file to an Integrated File System (IFS) stream file (such as a .CSV); these are the type of files you'd find on a PC.
For example:
CPYTOIMPF FROMFILE(MYLIB/MYPF) TOSTMF('/home/myuser/DTABASENAME.CSV') RCDDLM(*CRLF) DTAFMT(*DLM) STRDLM(*DBLQUOTE) STRESCCHR(*STRDLM) RMVBLANK(*TRAILING) FLDDLM(',')
However, there's no single command to transfer data to your PC. Well, technically, I suppose that's not true. If you configure an SMB or NFS file share on your PC and configure the IBM i SMB or NFS client, you could in fact CPYTOIMPF directly to that file share, or use the Copy Object (CPY) command to copy from the IFS to the network share.
If your PC has an FTP server available, you could send the data via the IBM i's FTP client. Similarly, if you have an SSH server on your PC, OpenSSH is available via PASE, and SFTP or SCP could be used. You could also email the file from the i.
Instead of trying to send the file to your PC from the i, an easier solution would be to kick off a process on the PC that runs the download. My preference would be an Access Client Solutions (ACS) data transfer.
You configure and save the transfer (as a .dtfx file), and then you can kick it off with:
STRPCCMD PCCMD('java -jar C:\ACS\acsbundle.jar /plugin=download C:\testacs.dtfx')
More detailed information can be found in the Automating ACS Data Transfer document.
The ACS download component is SQL based, so you could probably remove the need to use Query/400 at all.
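Since the transfer is SQL based, an alternative that illustrates the idea (just a sketch, not the ACS transfer itself) is a small JDBC program run on the PC with the JTOpen (jt400) driver; the system name, credentials, library, file, and output path below are placeholders:
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.Statement;

public class DownloadToCsv {
    public static void main(String[] args) throws Exception {
        // JTOpen (jt400) JDBC driver for IBM i.
        Class.forName("com.ibm.as400.access.AS400JDBCDriver");
        try (Connection conn = DriverManager.getConnection("jdbc:as400://MYSYSTEM", "MYUSER", "MYPASSWORD");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT * FROM MYLIB.MYPF");
             PrintWriter out = new PrintWriter("C:\\TESTS\\DTABASENAME.CSV")) {
            ResultSetMetaData md = rs.getMetaData();
            while (rs.next()) {
                StringBuilder line = new StringBuilder();
                for (int i = 1; i <= md.getColumnCount(); i++) {
                    if (i > 1) line.append(',');
                    line.append(rs.getString(i));   // no quoting or escaping; fine for simple data only
                }
                out.println(line);
            }
        }
    }
}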
Assuming that you have the IFS QNTC file system mapped to your network domain, you could use the CPYTOIMPF command to copy the data directly from an IBM i DB2 file to a network directory.
This sample would result in a CSV file.
CPYTOIMPF FROMFILE(file) TOSTMF('//QNTC/servername or ip/path/filename.csv') STMFCCSID(*PCASCII) RCDDLM(*CRLF) STRDLM(*NONE)
Add the FLDDLM(';') option to produce semicolon-separated values, or omit it to use a comma as the value separator.

Crystal Reports Logon Failed Error With Network Path But OK With Mapped Drive

I have a VB6 program that gets installed locally on workstations. It uses an Access database, usually over a peer-to-peer network. I have Crystal Reports embedded in the program, and it accesses its .rpt files from the same folder on the "server" that the database is located in. The report files are created with "same as report" selected as the data source path. I pass the actual database path at runtime when the report is called up.
This has worked perfectly for a very long time, with only one hitch I'd like to fix: I have to pass Crystal Reports a mapped drive; it won't work with a network path. If I use T:\Towtrack.mdb it runs happily. If I use \\SERVER\Towtrack\Towtrack.mdb it returns a "logon failed" error. Since Microsoft has broken DAO and mapped drives produce constant database corruption, I need to get away from them.
(database(s) and .rpt files are all in the same folder on the server and the entire folder has full control permissions with an "everyone" user)

How to download a blob file from Azure Storage and save it to an FTP server using Powershell?

I am trying to access a blob file in PowerShell and want to save it directly to an FTP server. How can this be done? Can this be done via the blob file URL? Or can I somehow have the file created in Powershell memory and then use $webclient.UploadFile to save it to FTP folder?
Another question related to this same download: is there a way to copy just the file without the subdirectories being recreated as well? For example, I have a blob like dataload/files/my_blob_file. When I use the command Get-AzureStorageBlobContent -Destination $destination_path, it saves the file in the same subfolder structure, but can I instead give a custom path, or drop the subfolders, and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file? I would want to accomplish the same on the FTP server above.
I am trying to access a blob file in PowerShell and want to save it directly to an FTP server.
If your FTP server is Windows based, then you can just run the script on the FTP server and download the blob into the local path of this FTP server.
Can this be done via the blob file URL?
The command "Get-AzureStorageBlobContent" doesn't accept URL as parameter. That means you need to write the code or script to achieve that. Here is a simple demo written by me:
$url = "https://accountname.blob.core.windows.net/testcontainer/Test01.txt"
$separator = "://"
$option = [System.StringSplitOptions]::RemoveEmptyEntries
$temp = $url.Split($separator,4,$option)
$Protocol = $temp[0]
$HostName = $temp[1]
$Container = $temp[2]
$BlobName = $temp[3]
Or can I somehow have the file created in Powershell memory and then use $webclient.UploadFile to save it to FTP folder?
Storing a file in RAM is not a good idea even if we can achieve it. As mentioned above, if your FTP server is Windows based, please run the script on the FTP server directly.
If the script can't be run on the server for any reason, then please try to share the folder used by the FTP service and map it as a network drive on the computer that will run the script, so that you will be able to store the file on this network drive.
but can I instead have a custom path or remove the subfolders and save it as c:\myfolder\my_blob_file instead of c:\myfolder\dataload\files\my_blob_file?
Of course, just specify the path and file name as the Destination parameter, for example something like this (adjust the container and blob names to your own, with $storageContext as your storage context):
Get-AzureStorageBlobContent -Container "dataload" -Blob "files/my_blob_file" -Destination "C:\myfolder\my_blob_file" -Context $storageContext
Note: Actually, there is no concept of "folder" in Azure storage. The path is part of the blob name. When you download the blob, you can rename it by specifying the file name in the destination path, so that the additional folders are not created locally.
=========================================================================
Update:
This script is to be run from Azure as part of Azure Automation. But when I try to call the FTP server (which is currently my local machine) I get "Unable to connect to remote server" error.
You may need a Hybrid Runbook Worker to achieve your goal.
Runbooks in Azure Automation cannot access resources in your local data center since they run in the Azure cloud. The Hybrid Runbook Worker feature of Azure Automation allows you to run runbooks on machines located in your data center to manage local resources.
I'm using the default 21 port and I also tried using my public IP address
Exposing your FTP server to the Internet is not recommended. I would suggest using a Hybrid Runbook Worker instead.
Also, how can I get the content of my blob file into a Powershell variable to work with it in the script?
To my knowledge, Get-AzureStorageBlobContent does not support returning the blob content as an in-memory object. You need to download the content first, then use Get-Content to read the file content. If you use the Hybrid Runbook Worker, you'll be able to store the file locally.
==============================================================================
Update 2:
I am trying to understand as to how to call any external FTP server (which is currently on my machine for dev/test purpose, but may reside on any other external server in production), and I have to run it from a Powershell script in Azure Automation. So your reply: You may need Hybrid Runbook Worker to achieve your goal... will not work for me right?
The Hybrid Runbook Worker works for you, and it makes things easier, because with a Hybrid Runbook Worker the runbook runs on your local machine.
I'm able to download the blobs into my local machine and upload them to the public FTP server without any issue.
Are you saying that currently there is no way to upload and download files from external FTP server from Azure Powershell Automation?
I didn't manage to upload the blob to the public FTP server: an exception occurs when I try to upload the blob, and only empty files with the name of the blob end up on the FTP server. It might be a permission issue, since the PowerShell script runs in a sandbox. That's why I said that a Hybrid Runbook Worker makes things easier.
In the end, please note: FTP authenticates users and transfers data in plaintext, which may cause security issues. FTPS and SFTP are more secure than FTP.
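For completeness, here is a rough sketch of the download-then-upload pattern in Java rather than PowerShell, assuming the azure-storage-blob v12 SDK and Apache Commons Net; the connection string, container, blob, host, and credentials are placeholders, and per the note above an SFTP or FTPS client would be preferable in practice:
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobClientBuilder;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class BlobToFtp {
    public static void main(String[] args) throws Exception {
        // Download the blob to a flat local path (no dataload/files/ subfolders).
        BlobClient blob = new BlobClientBuilder()
                .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                .containerName("dataload")           // placeholder container
                .blobName("files/my_blob_file")      // placeholder blob path
                .buildClient();
        Path local = Path.of("C:\\myfolder\\my_blob_file");
        blob.downloadToFile(local.toString(), true); // overwrite if the file already exists

        // Upload the local copy to the FTP server.
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com", 21);          // placeholder host and port
        ftp.login("ftpuser", "ftppassword");         // placeholder credentials
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);
        try (InputStream in = Files.newInputStream(local)) {
            ftp.storeFile("my_blob_file", in);
        }
        ftp.logout();
        ftp.disconnect();
    }
}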

Windows Service run by domain account cannot access file despite full control

I have created a C# service that:
- Picks up and opens a local text file
- Opens an Excel file used as a template (saved locally)
- Fills in the data from the text file into the Excel file
- Saves the Excel file to a network folder.
The service runs using a domain account (our network admin will not give the local system account rights on the network...). When the service tries to open the template, I get an access denied error:
Microsoft Excel cannot access the file 'C:\BloxVacation\Template\BloxTemplate.xlsm'. There are several possible reasons:
• The file name or path does not exist.
• The file is being used by another program.
• The workbook you are trying to save has the same name as a currently open workbook.
The file does exist and the path is correct.
The file is not used by another user or program.
I try to OPEN the workbook (no other workbook is open), not SAVE it.
I have received the same error using the system account. The reason for this is that, when using interop services, the system account needs a Desktop folder (a bug in Windows 7: http://forums.asp.net/t/1585488.aspx):
C:\Windows\System32\config\systemprofile\Desktop
C:\Windows\SysWOW64\config\systemprofile\Desktop
Create those two folders and the error disappears for the system account.
I have given the domain user rights to those folders and the error disappears as well; however, the service hangs on the line of code where I open the Excel file. When I execute the exact same code with the system account, the code executes fine (note: I save the file locally).
objXL.Workbooks.Open(BloxVacationService.ExcelTemplateFilePath)
Does anybody have an idea how to solve this issue without having to rewrite the entire service in OpenXML? Thank you very much in advance.
If you have done all the things described in the question and it still doesn't work (as was the case for me), the answer is pretty simple:
Make the domain user a local admin on the machine that runs the service. That solved the problem.