Has anyone tried to do a PS script to upload CSV files to BigQuery? - powershell

I was able to create a Python script to upload files to BigQuery, but has anyone tried it with PowerShell?
I tried to find an API call for PowerShell but couldn't find anything.

Yes, there are a few ways:
1) Use the Google Cloud Tools for PowerShell (this is in beta)
2) Load data using BigQuery's web API
3) Load data using the .NET client library
Option 1 is probably your best bet. Check out Add-BqTableRow (a quick sketch follows below):
Add-BqTableRow takes CSV, JSON, and AVRO files to import into BigQuery.
Option 3: you'll find that the .NET examples are mostly in C#; convert what you see to PowerShell.
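If you go with Option 1, a load might look roughly like the sketch below. Treat it as an illustration only: it assumes the GoogleCloud PowerShell module (beta) is installed and authenticated, and that Add-BqTableRow accepts a source type, a file path, and a table object from the pipeline as the beta docs describe; the dataset, table, and file names are placeholders.
# Sketch only: module, dataset, table, and file names are placeholder assumptions
Import-Module GoogleCloud
$table = Get-BqDataset "my_dataset" | Get-BqTable "my_table"
$table | Add-BqTableRow CSV "C:\data\customers.csv" -SkipLeadingRows 1 -WriteMode WriteAppend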

Quick and easy CSV file loader to BigQuery for the community.
You will need a service account linked to your BigQuery project.
Authentication:
gcloud auth activate-service-account SERVICE_ACCOUNT@EMAIL.COM --key-file=PATH_TO_SVC_ACCOUNT_KEY.json
BigQuery CSV file loader:
bq load --source_format=CSV --skip_leading_rows=1 DATASET.TABLE_NAME CSVFILE.CSV
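If you want to drive those two commands from PowerShell, a minimal wrapper might look like this (sketch only; the folder, dataset, and table names are placeholders):
# Sketch: authenticate once, then load every CSV in a folder (paths and names are placeholders)
gcloud auth activate-service-account SERVICE_ACCOUNT@EMAIL.COM --key-file=PATH_TO_SVC_ACCOUNT_KEY.json
Get-ChildItem -Path "C:\exports" -Filter *.csv | ForEach-Object {
    bq load --source_format=CSV --skip_leading_rows=1 DATASET.TABLE_NAME $_.FullName
}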
Thanks guys!
Have a nice day!

Related

Using PowerShell to upload to AWS S3

Hopefully this is a quick fix (most likely user error). I am using PowerShell to upload to AWS S3: I'm attempting to copy a number of .mp4 files from a folder to an S3 location. I'm able to copy individual files successfully using the command below:
aws s3 cp .\video1.mp4 s3://bucketname/root/source/
But when I try to copy all the files within that directory I get an error:
aws s3 cp F:\folder1\folder2\folder3\folder4\* s3://bucketname/root/source/
The user-provided path F:\folder1\folder2\folder3\folder4\* does not exist.
I've tried multiple variations on the above: no path, just *, *.mp4, .*.mp4, and (coming from a Linux background) quotation marks, etc., but I can't seem to get it working.
I was initially using this documentation: https://www.tutorialspoint.com/how-to-copy-folder-contents-in-powershell-with-recurse-parameter I feel the answer is probably very simple, but I couldn't see what I was doing wrong.
Any help would be appreciated.
Thanks.
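(Not an authoritative fix, but for reference: aws s3 cp does not expand wildcards itself, so the usual workaround is either --recursive with --exclude/--include filters or a plain PowerShell loop. A sketch using the paths from the question:)
# Copy only the .mp4 files from the folder, letting the CLI do the filtering
aws s3 cp F:\folder1\folder2\folder3\folder4\ s3://bucketname/root/source/ --recursive --exclude "*" --include "*.mp4"
# Or let PowerShell expand the wildcard and copy file by file
Get-ChildItem F:\folder1\folder2\folder3\folder4\*.mp4 | ForEach-Object {
    aws s3 cp $_.FullName s3://bucketname/root/source/
}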

CSV Importing With Rails, Postgres, and Sidekiq

I'm building a customer management system using Rails that requires CSV files containing customer information to be imported into/diffed with a Postgres database. I'm hosting the application on Heroku. I moved the import work to the background with Sidekiq, but I need advice on where to upload the file in the first place. Is hosting the file on S3 really the best solution, or is there a simpler solution that doesn't use a third-party storage service? The application will be used daily by up to 10 employees, and the largest CSV file being uploaded is around 100,000 rows.
Thanks.
Yes, I do think S3 is the best solution.
We faced the same problem at Storemapper (we use Resque instead of Sidekiq, but that's not a problem). The limiting factor here is the Heroku request timeout: you only have 30s to finish your upload to Heroku, which puts a hard limit on how big your CSV can be. This is where S3 comes in. Basically, what we do is:
1) The user uploads the CSV directly to S3 via JavaScript, bypassing our app server on Heroku.
2) Once the upload completes, the JavaScript makes a request to the app server that launches a background worker, telling the worker where the file is on S3.
3) The worker downloads the CSV from S3, then processes it as necessary.
I found the carrierwave_direct gem to be very helpful for steps 1 and 2. For step 3, I use the smarter_csv gem. Check out our complete story here:
https://tylertringas.com/very-large-csv-import-in-rails-on-heroku/

Redirecting output to a text file located on Azure Storage - Using Powershell

Using PowerShell, what is the best way of writing output to a text file located in an Azure Storage container? Thank you.
Simply, you can't.
While this capability exists within Azure Storage with the (relatively) new Append blob, it hasn't yet filtered down to PowerShell.
To implement this, you would either need to create a new C# cmdlet that encapsulates the functionality, or you would need to redirect the output to a standard file and then use the usual Azure Storage cmdlets to upload that file to Azure Storage.
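A sketch of the second approach, writing to a local file and then uploading it with the Azure Storage cmdlets; the account name, key, container, and paths are placeholders:
# Sketch: redirect output to a local file, then upload it as a block blob (names are placeholders)
Get-Process | Out-File C:\temp\output.txt
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account-key>"
Set-AzureStorageBlobContent -File C:\temp\output.txt -Container "logs" -Blob "output.txt" -Context $ctx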

Batch file uploading to cloud storage

Could anyone cut and paste a working request to upload several files to Cloud Storage in a batch? I am really struggling to get it working; there are no examples of file uploads and I'm really stuck. I could probably work it out if I had a working starting point. I'm starting to go crazy, so any help would be much appreciated.
You can find an example at [1] and consult this other answer at [2] as reference.
I would suggest you use gsutil to copy files, even as an external call from your application (PHP exec() or system()), since this tool is optimised for parallel file transfer (the -m option) and recursive folder copy (the -R option), making it very simple and efficient.
For more help on the gsutil copy command: gsutil help cp
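For example, invoked from PowerShell (the bucket name and local path are placeholders):
# Parallel (-m), recursive (-R) copy of a local folder to a bucket; names are placeholders
gsutil -m cp -R C:\uploads gs://my-bucket/uploads/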
Links:
[1] - https://cloud.google.com/storage/docs/json_api/v1/how-tos/batch#example
[2] - Batch upload requests to Google Cloud Storage using javascript
Regards
Paolo

Best way to stage file from cloud storage to windows machine

I want to store a data file for QuickBooks in the cloud. I understand that the data file is more of a database-in-a-file, so I know that I don't want to simply have the data file itself in a cloud directory.
When I say 'cloud', I'm meaning something like Google Drive or box.com.
What I see working is that I want to write a script (a bat file, or do they have something new and improved for Windows XP, like some .NET nonsense or something?).
The script would:
1) Download the latest copy of the data file from cloud storage and put it in a directory on the local machine
2) Launch Quickbooks with that data file
3) When the user exits Quickbooks, copy the data file back up into the cloud storage.
4) Rejoice.
So, my question(s)... Is there something that already does this? Is there an easily scriptable interface to work with the cloud storage options? In my ideal world, I'd be able to say 'scp google-drive://blah/blah.dat localdir' and have it copy the file down, and do the opposite after running QB. I'm guessing I'm not going to get that.
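(For illustration only, here is a rough PowerShell sketch of that flow, assuming the cloud storage is synced to a local folder, e.g. by the Google Drive desktop client; every path is a hypothetical placeholder:)
# Sketch: pull the company file from a locally-synced cloud folder, run QuickBooks, push it back
# All paths below are hypothetical placeholders
$cloud = "C:\Users\me\Google Drive\QB\company.qbw"
$local = "C:\QBData\company.qbw"
Copy-Item $cloud $local -Force
Start-Process "C:\Program Files\Intuit\QuickBooks\QBW32.exe" -ArgumentList $local -Wait
Copy-Item $local $cloud -Force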
Intuit already provides a product to do this. It is called Intuit Data Protect, and it backs up your QuickBooks company file to the cloud for you.
http://appcenter.intuit.com/intuitdataprotect
regards,
Jarred