How to import Google Workspace data automatically into a BigQuery database?

How can I automatically import Google Workspace data into a BigQuery database every day?
I'm new to BigQuery. I can do the import manually, but I want to automate the process. Thanks.

With BigQuery you can create external tables that let you query data stored in your Google Drive (CSV, Avro, JSON, or Google Sheets documents). Because an external table reads the source file at query time, the data stays current without a scheduled daily load.
You can find a nice how-to here.
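As a rough sketch with the bq command-line tool (the spreadsheet URL, dataset, and table names are placeholders), a Sheets-backed external table can be set up like this:

# Build an external table definition from a Google Sheets URL (placeholder ID).
bq mkdef --autodetect --source_format=GOOGLE_SHEETS \
  "https://docs.google.com/spreadsheets/d/SPREADSHEET_ID" > sheet_def.json

# Create the external table in an existing dataset.
bq mk --external_table_definition=sheet_def.json mydataset.workspace_data

# Query it like any other table; each query reads the current sheet contents.
bq query --use_legacy_sql=false \
  'SELECT * FROM mydataset.workspace_data LIMIT 10'

Note that the credentials used to query a Drive-backed table need access to the underlying spreadsheet.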

Related

How do I load a Google Cloud Storage Firebase export into BigQuery?

I have a simple, single collection in Firestore: museum visitor names and dates, with some other fields.
I successfully ran a gcloud export:
gcloud beta firestore export gs://climatemuseumexhibition2019gov.appspot.com --collection-ids=visitors
and the collection is now sitting in a bucket in Cloud Storage.
I tried downloading the data, which appears to be in several chunks, but it's in .dms format and I have no idea what that is.
Also, I've successfully linked BigQuery to my Firestore project, but I don't see the collection, and I don't see the Cloud Storage object at all.
I gather the idea is to create a dataset from the Cloud Storage data, then create a table from the dataset. I'd appreciate specific details on how to do that.
I've read the manuals and they are opaque on this topic, so I'd appreciate some first hand experience. Thank you.
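One common way to load an export produced by that gcloud command is bq load with --source_format=DATASTORE_BACKUP, pointed at the collection's .export_metadata file inside the export folder; the export prefix and the dataset/table names below are placeholders:

# Create a dataset to hold the table, if you don't already have one.
bq mk visitors_dataset

# Load the Firestore export; EXPORT_PREFIX is the timestamped folder the export created.
# Adjust the path to match the .export_metadata file your export actually produced.
bq load --source_format=DATASTORE_BACKUP \
  visitors_dataset.visitors \
  "gs://climatemuseumexhibition2019gov.appspot.com/EXPORT_PREFIX/all_namespaces/kind_visitors/all_namespaces_kind_visitors.export_metadata"

There's no need to download or convert the chunk files; BigQuery reads them from the bucket via the metadata file.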

Is there any way to integrate mainframe files with MongoDB or any Hadoop component?

I am using Exstream Dialogue Pub to generate PDF documents. Next, I would like to store these PDFs in Hive or MongoDB and retrieve them whenever I want.
Is there a way to integrate the mainframe with Hive or MongoDB?
How can I bring data from VSAM files into Hive or MongoDB?
If not, please suggest another Hadoop component.
There is a MongoDB interface to VSAM. It's called the Data Virtualization Service (DVS):
https://www.ibm.com/support/knowledgecenter/en/SS4NKG_1.1.0/kc_welcome_all.html

Import Firebase analytics data to MongoDB

I will try to explain the context of the project I'm working on and the problem I'm currently trying to overcome. I would like to collect mobile data for analytics purposes, e.g. using Firebase from Google. After collecting that data, I'd like to store it in a local database such as PostgreSQL or MongoDB. The problem is that mobile data collection platforms such as Firebase mostly don't support connecting to a local database. I've found out that it's possible to export raw data from Firebase and import it into a local database, but I have no detailed information about this, and I couldn't find anything in the documentation I searched through. Has anyone ever had this kind of problem and can give clear guidance on exporting data from Firebase and importing it into a local database? Thanks in advance.
Google Analytics for Firebase collects an enormous amount of data and sends it to its servers in batches (bundles). Three things to note about your design:
The SDK collects that data and does not expose it for you to parse in the app and store in a local DB.
By storing that much data locally you would be re-inventing the wheel, and it could also cause performance issues.
If you need all the data collected by Firebase in raw form, just link Firebase to BigQuery and all of it will be available to you; from there you can move it on to a local database, as sketched below.
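If the raw events still need to end up in a local database afterwards, one possible route (a sketch; the dataset, table, bucket, and database names are placeholders) is to export a day's events table from the linked BigQuery dataset and import the resulting newline-delimited JSON into MongoDB:

# Export one day of Firebase Analytics events from BigQuery to Cloud Storage as NDJSON.
bq extract --destination_format=NEWLINE_DELIMITED_JSON \
  analytics_123456789.events_20190101 \
  gs://my-export-bucket/events_20190101.json

# Copy the file down and load it into a local MongoDB collection.
gsutil cp gs://my-export-bucket/events_20190101.json .
mongoimport --db analytics --collection events --file events_20190101.json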

Loading data into MarkLogic from MySQL

How do I load data into MarkLogic from a MySQL database?
Also, how do I create a document database and build a search application on top of it? (PDFs would be the source for the document database.)
-Thanks & Regards
Swapneel
There are two main options for loading data from MySQL: you can either export flat CSV files and import the rows as documents using MLCP, or connect directly with MLSAM. You'll have to look and see which best suits your situation.
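As a sketch of the MLCP route (host, port, credentials, and file paths are placeholders), a CSV exported from MySQL can be loaded so that each row becomes a JSON document:

# Load a CSV export from MySQL; MLCP treats the first line as the column names.
mlcp.sh import -host localhost -port 8000 \
  -username admin -password admin \
  -input_file_path /data/customers.csv \
  -input_file_type delimited_text \
  -document_type json \
  -output_uri_prefix /customers/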
In terms of design, I'd recommend looking at the MarkLogic developer website and going through the tutorials; there's a good overview of data modeling.
You will probably want to look at the CPF documentation for information about PDF conversion as well.
Hope that's been useful, Ed

Export data from a Google Spreadsheet to Google Cloud SQL

I have a big Google Spreadsheet (about 50 columns and about 50 sheets). Can I export data from its tables to a SQL database? Maybe there are scripting options or other approaches...
Thanks in advance, Dmitry.
I think the best way to do this would be to use Google Apps Script.
The Spreadsheet service will easily open and read the spreadsheet, and you can then use the JDBC service to connect to Cloud SQL.
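If a one-off or cron-driven export is enough, a different route from the Apps Script approach above is to download each sheet as CSV, stage it in Cloud Storage, and load it with gcloud sql import csv; the spreadsheet ID, bucket, instance, database, and table names below are placeholders, and the sheet must be readable by whoever downloads it:

# Download one sheet as CSV (gid selects the sheet/tab).
curl -L -o sheet1.csv \
  "https://docs.google.com/spreadsheets/d/SPREADSHEET_ID/export?format=csv&gid=0"

# Strip the header row so it isn't imported as data.
tail -n +2 sheet1.csv > sheet1_rows.csv

# Stage the file in Cloud Storage and import it into an existing Cloud SQL table.
gsutil cp sheet1_rows.csv gs://my-staging-bucket/sheet1_rows.csv
gcloud sql import csv my-instance gs://my-staging-bucket/sheet1_rows.csv \
  --database=mydb --table=mytable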