Access Bucket File from Perl in Google Cloud Shell - perl

How does one open a file in a bucket from a Perl program running in Google Cloud Shell in the same project?
One can upload a file into the shell file system and open it there, and also put a copy in a bucket for access by others, but that seems counter-productive, never mind that the two copies will be out of sync a day later.
I've tried various forms of
open($fh, '<', "gs://bucketname/filename");
without any luck.

Mount the bucket into the file system with Cloud Storage FUSE (gcsfuse).
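For example, here is a minimal sketch, assuming gcsfuse is available in the Cloud Shell and that "bucketname" and "filename" are the placeholders from the question. It mounts the bucket onto a local directory and then reads the object with an ordinary open:

use strict;
use warnings;

my $mount_point = "$ENV{HOME}/bucket";    # local directory to expose the bucket under
mkdir $mount_point unless -d $mount_point;

# Mount the bucket with Cloud Storage FUSE.
system('gcsfuse', 'bucketname', $mount_point) == 0
    or die "gcsfuse mount failed: $?";

# The object is now visible as an ordinary file under the mount point.
open(my $fh, '<', "$mount_point/filename")
    or die "Cannot open filename: $!";
while (my $line = <$fh>) {
    print $line;
}
close($fh);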

Related

Execute a file in Google Cloud Platform (GCP) bucket using Scala

I'm looking to execute Scala code in a text file on GCP using the Spark shell.
Using GCP (Google Cloud Platform), I've done the following:
Created a DataProc instance and named it gcp-cluster-091122.
Created a Cloud Bucket and named it gcp-bucket-091122p.
Created a simple text file called 1.txt and uploaded the file into the recently created GCP bucket, gcp-bucket-091122.
Logged onto the VM instance via SSH-in-browser and entered the command spark-shell to access the scala> prompt.
From here, how does one read/execute a particular file uploaded into a GCP bucket? I've researched this topic, but I've been unsuccessful.
I've also used "GCS Fuse" plug-in code to successfully mount the GCP bucket, gcp-bucket-091122 onto a created local file directory in SSH called lfs-directory-091122.
So an additional question would be how to execute a file located in the local file directory using Spark Shell?

Pull from and Push to S3 using Perl

Hello, everyone! I have what I assume to be a simple problem, but I could use a hand digging in. I have a server that preprocesses data before translation. This is done by a series of Perl scripts developed over a decade ago (but they work!). This virtual server is being lifted into AWS. The change this makes for my scripts is that the locations they pull from and write to will now be S3 buckets.
The workflow is: copy all files in the source location to the local drive, preprocess the data file by file, and when complete move the preprocessed files to a final destination.
process_file($workingDir, $dirEntry);   # preprocess one file in the working directory
final_move;
# Archive the original download, then remove the working copy.
move("$downloadDir/$dirEntry", "$archiveDir") or die "ERROR: Archive file $downloadDir/$dirEntry -> $archiveDir FAILED $!\n";
unlink("$workingDir/$dirEntry");
So, in this case $downloadDir and $archiveDir are S3 buckets.
Any advice on adapting this is appreciated.
TIA,
VtR
You have a few options.
Use a system like s3fs-fuse to mount your S3 bucket as a local drive. This would presumably require the smallest changes to your existing code.
Use the AWS Command Line Interface to copy your files to your S3 bucket (a short sketch of this approach follows below).
Use the Amazon API (through something like Paws) to upload your files to S3.
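As a rough sketch of the AWS CLI option, assuming the aws command is installed and configured with credentials on the instance, and that the bucket names and paths below are placeholders rather than your real ones, the copy and move() steps could be wrapped like this:

use strict;
use warnings;

# Hypothetical bucket names and paths, for illustration only.
my $srcBucket  = 's3://source-bucket/incoming';
my $archBucket = 's3://archive-bucket';
my $workingDir = '/tmp/preprocess';

# Pull all files from the source bucket to the local working directory.
system('aws', 's3', 'cp', $srcBucket, $workingDir, '--recursive') == 0
    or die "ERROR: download from $srcBucket failed: $?";

# ... preprocess each file in $workingDir exactly as before ...

# Push a finished file to the archive bucket instead of move(), then clean up.
sub archive_file {
    my ($dirEntry) = @_;
    system('aws', 's3', 'cp', "$workingDir/$dirEntry", "$archBucket/$dirEntry") == 0
        or die "ERROR: Archive file $workingDir/$dirEntry -> $archBucket FAILED: $?";
    unlink("$workingDir/$dirEntry");
}

The s3fs-fuse option would need even less change, since the existing move() calls could keep working against the mounted paths.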

Running sdbinst on a .sdb in a network share location

I want to run the sdbinst command on an .sdb database file, as well as open it in the Compatibility Administrator. I have no problem doing this locally when the .sdb is stored on the machine I'm using, but I'd like to be able to open it and run sdbinst on it when the file is stored in a network share location.
Is this possible?
Yes, according to the MS Help files within the MS Compatibility Toolkit.
See: "Mitigating Issues by using Compatibility Fixes". There is an example of a network deployment workflow: "Deploying the Contoso.sdb Database to your environment".
The basic pattern is to place the .sdb on a network share, create a one-line deployment script that references a path to that share (sdbinst "\\SomePath\Ex.sdb" -q), and then either push the deployment script to each target computer in your environment or execute it on them.

Windows service run by domain account cannot access file despite full control

I have created a C# service that:
- Picks up and opens a local text file
- Opens an Excel-file used as template (saved locally)
- Fills in the data from the text file in the Excel file
- Saves the Excel file to a network folder.
The service runs using a domain account (our network admin will not grant the local system account rights on the network...). When the service tries to open the template, I get an access denied error:
Microsoft Excel cannot access the file 'C:\BloxVacation\Template\BloxTemplate.xlsm'. There are several possible reasons:
• The file name or path does not exist.
• The file is being used by another program.
• The workbook you are trying to save has the same name as a currently open workbook.
The file does exist and the path is correct.
The file is not used by another user or program.
I try to OPEN the workbook (no other workbook is open), not SAVE it.
I have received the same error using the system account. The reason for this is that, when using InteropServices, the system account needs a desktop folder (bug in Windows 7: http://forums.asp.net/t/1585488.aspx).
C:\Windows\System32\config\systemprofile\Desktop
C:\Windows\SysWOW64\config\systemprofile\Desktop
Create those two folders and the error disappears for the system account.
I have given the domain user rights to those folders and the error disappears as well; however, the service now hangs on the line of code where I open the Excel file. When I execute the exact same code with the system account, the code executes fine (note: I save the file locally).
objXL.Workbooks.Open(BloxVacationService.ExcelTemplateFilePath)
Has anybody an idea how to solve this issue without having to rewrite the entire service in OpenXML? Thank you very much in advance.
If you have done all the things described in the question and it still doesn't work (as it was with me), the answer is pretty simple:
Make the domain user local admin on the machine that runs the service. It solved the problem.

Best way to stage file from cloud storage to windows machine

I am wanting to store a data file for Quickbooks in the cloud. I understand that the data file is more of a database-in-a-file, so I know that I don't want to simply have the data file itself in a cloud directory.
When I say 'cloud', I'm meaning something like Google Drive or box.com.
What I see working is that I want to write a script (bat file, or do they have something new and improved for Windows XP, like some .net nonsense or something?)
The script would:
1) Download the latest copy of the data file from cloud storage and put it in a directory on the local machine
2) Launch Quickbooks with that data file
3) When the user exits Quickbooks, copy the data file back up into the cloud storage.
4) Rejoice.
So, my question(s)... Is there something that already does this? Is there an easily scriptable interface to work with the cloud storage options? In my ideal world, I'd be able to say 'scp google-drive://blah/blah.dat localdir' and have it copy the file down, and do the opposite after running QB. I'm guessing I'm not going to get that.
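For what it's worth, here is a rough sketch of that wrapper, written in Perl only for illustration (a .bat file would follow the same shape), and assuming a command-line sync tool such as rclone has already been set up with a remote for the cloud account. The remote name, paths, and QuickBooks location below are all placeholders:

use strict;
use warnings;

# Placeholder paths; adjust for the real remote name, data file, and QuickBooks install.
my $remote_dir = 'gdrive:QuickBooks';
my $local_dir  = 'C:\\QBData';
my $data_file  = 'company.qbw';
my $quickbooks = 'C:\\Program Files\\Intuit\\QuickBooks\\QBW32.EXE';

# 1) Download the latest copy of the data file from cloud storage.
system('rclone', 'copy', "$remote_dir/$data_file", $local_dir) == 0
    or die "Download failed: $?";

# 2) Launch QuickBooks with that data file and wait for the user to exit.
system($quickbooks, "$local_dir\\$data_file") == 0
    or warn "QuickBooks exited with status $?\n";

# 3) Copy the data file back up into cloud storage.
system('rclone', 'copy', "$local_dir\\$data_file", $remote_dir) == 0
    or die "Upload failed: $?";

# 4) Rejoice.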
Intuit already provides a product to do this. It is called Intuit Data Protect and it backs up your Quickbooks company file to the cloud for you.
http://appcenter.intuit.com/intuitdataprotect
regards,
Jarred