Google Cloud SQL export - google-cloud-sql

I am trying to export CSV files from Google Cloud SQL to a Cloud Storage bucket. I would like help understanding whether there is any bulk export mechanism; currently it looks like each instance can only export one CSV at a time.
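For context, a single CSV export can be triggered through the Cloud SQL Admin API, and an instance only runs one export operation at a time, so any "bulk" export ends up being a sequence of such calls. Below is a minimal sketch, assuming the google-api-python-client library, application-default credentials, and placeholder project, instance, bucket, and table names:

```python
# Minimal sketch: trigger one Cloud SQL CSV export via the Cloud SQL Admin API.
# Assumes google-api-python-client and application-default credentials are configured;
# the project, instance, bucket, and table names below are placeholders.
from googleapiclient import discovery

sqladmin = discovery.build("sqladmin", "v1beta4")

body = {
    "exportContext": {
        "fileType": "CSV",
        "uri": "gs://my-bucket/exports/my_table.csv",  # destination object in the bucket
        "csvExportOptions": {"selectQuery": "SELECT * FROM my_table"},
    }
}

# Starts an asynchronous export operation; because the instance accepts only one
# export at a time, a bulk export means waiting for each operation before the next.
operation = sqladmin.instances().export(
    project="my-project", instance="my-instance", body=body
).execute()
print(operation["name"])
```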

Related

Delete a file in SharePoint using Azure Data Factory Delete Activity

I am trying to delete a file that is located in a SharePoint directory after a successful copy activity. The Delete Activity has the following properties:
Linked Service : HTTP
DataSet : Excel
Additional Header: #{concat('Authorization: Bearer ',activity('GetToken').output.access_token)}
Here, GetToken is the Web Activity in ADF that generates a token for accessing SharePoint.
When I run the pipeline, I get the below error:
Invalid delete activity payload with 'folderPath' that is required and cannot be empty.
I have no clue how to tackle this.
As per my understanding, you are trying to delete a file in SharePoint Online using Azure Data Factory.
Currently, the Delete activity in ADF only supports the data stores listed below, not SharePoint Online, which is why you are receiving the above error.
Azure Blob storage
Azure Data Lake Storage Gen1
Azure Data Lake Storage Gen2
Azure Files
File System
FTP
SFTP
Amazon S3
Amazon S3 Compatible Storage
Google Cloud Storage
Oracle Cloud Storage
HDFS
Image: Delete activity supported data stores
Ref: Delete activity supported data sources
As a workaround, you may try exploring the HTTP connector, or you can use a Custom activity and write your own code to delete files from SharePoint (see the sketch below).
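For illustration, here is a minimal sketch of the custom-code route, assuming you already have a valid bearer token (such as the one produced by the GetToken activity); the site URL and server-relative file path are hypothetical. It calls the standard SharePoint REST endpoint for deleting a file:

```python
# Minimal sketch: delete a SharePoint Online file via the SharePoint REST API.
# The site URL, server-relative file path, and token source are placeholders.
import requests

site_url = "https://contoso.sharepoint.com/sites/mysite"      # hypothetical site
file_path = "/sites/mysite/Shared Documents/report.xlsx"      # server-relative path
access_token = "<bearer token from your token endpoint>"

response = requests.post(
    f"{site_url}/_api/web/GetFileByServerRelativeUrl('{file_path}')",
    headers={
        "Authorization": f"Bearer {access_token}",
        "X-HTTP-Method": "DELETE",   # SharePoint tunnels DELETE over POST this way
        "IF-MATCH": "*",             # delete regardless of the file's current ETag
    },
)
response.raise_for_status()
```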
Hope this info helps.

Reading data from QVD using python and databricks

I am new to Python. Can you help me with the details of how a QVD file can be read into an Azure Databricks DataFrame using Python?
I need the detailed syntax (authentication with an access key) for accessing the QVD from the data lake and then reading it using qvd_reader.
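One possible approach, sketched roughly: configure the storage account key in Spark, copy the QVD file down to the driver's local disk, and read it with qvd_reader into pandas. The storage account, container, secret scope, and file names below are placeholders, and this assumes the qvd package is installed on the cluster:

```python
# Rough sketch: read a QVD from Azure Data Lake Storage into a DataFrame on Databricks.
# Storage account, container, secret, and path names are placeholders; assumes `pip install qvd`.
from qvd import qvd_reader

storage_account = "mystorageaccount"
container = "mycontainer"

# Authenticate to the storage account with its access key (kept in a secret scope).
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.dfs.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="storage-access-key"),
)

# qvd_reader works on local files, so copy the QVD to the driver's local disk first.
remote_path = f"abfss://{container}@{storage_account}.dfs.core.windows.net/data/sales.qvd"
dbutils.fs.cp(remote_path, "file:/tmp/sales.qvd")

pdf = qvd_reader.read("/tmp/sales.qvd")   # pandas DataFrame
df = spark.createDataFrame(pdf)           # optional: convert to a Spark DataFrame
display(df)
```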

Is it possible to copy GeoJson data into PostGIS with Azure Data Factory?

I am looking into whether it is possible to transition from Airflow to Azure Data Factory.
I have a REST API from which I extract GeoJSON and would like to export this to a Postgres Database with PostGIS. I tried to do this with the Copy Data activity, but this only provides a simple mapping between the GeoJSON fields and similar fields in my table.
Normally I would use ogr2ogr to do this, but am not sure how to approach this with Azure Data Factory.
Does anyone know if my use case would be possible? If yes, how would you suggest to do it?
I fixed my own question. I created an Azure Function that runs Python in a custom Docker container (one of the options in Azure Functions). I installed GDAL in the standard Azure Functions Python Docker image and use subprocess.run() to execute ogr2ogr with the parameters I pass to it via the body of the Azure Function's POST request. I can run this Azure Function from Azure Data Factory. A sketch of the idea follows.
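A minimal sketch of what such a function body might look like, assuming the Python v1 programming model and GDAL already present in the container; the request-body field names and connection string format are hypothetical:

```python
# Minimal sketch of an HTTP-triggered Azure Function (__init__.py) that runs ogr2ogr.
# Assumes GDAL/ogr2ogr is installed in the custom container; body fields are placeholders.
import subprocess

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    body = req.get_json()
    # Expected (hypothetical) body, e.g.:
    # {"source_url": "https://.../data.geojson",
    #  "pg_conn": "PG:host=myhost dbname=gis user=me password=secret",
    #  "table": "my_table"}
    cmd = [
        "ogr2ogr",
        "-f", "PostgreSQL",
        body["pg_conn"],        # PostGIS connection string
        body["source_url"],     # GeoJSON source (URL or path)
        "-nln", body["table"],  # target table name
        "-overwrite",
    ]
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        return func.HttpResponse(result.stderr, status_code=500)
    return func.HttpResponse("ogr2ogr completed", status_code=200)
```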
Hope this can help anyone else searching for a similar approach.

Google Data Studio JDBC connection to PostgreSQL integration with Sheets?

Google Apps Script JDBC doesn't support a connection to PostgreSQL directly but Google Data Studio supports a connection to PostgreSQL to pull data and build reports. I've also heard they support a low-key export to .csv option. Is it then possible to exploit the Data Studio Service in Google Apps Script to populate Google Sheets with that data, effectively creating a workaround?
All I need is a one-way access from PostgreSQL into Google Sheets by means of Google Apps Script, I do NOT expect to import anything back into my database.
Looking at the reference documentation, the built-in Apps Script service for Data Studio does not allow you to pull data from a connected data source. It can be used to create connectors, but it does not allow direct access to connected data sources.
However, you can try creating a custom API or serverless micro-service in a language that supports PostgreSQL, and then expose that service as HTTP endpoints that you can call via UrlFetchApp. You can leverage Google Cloud Functions to do this and write the micro-service in back-end JavaScript (Node.js), Python, or Go. This approach takes you well outside the bounds of a typical GAS script, but it is a viable option.
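As a rough sketch of that idea, a Cloud Function written in Python could query PostgreSQL and return JSON; the environment-variable names, table, and query below are placeholders, and psycopg2 would need to be listed in the function's requirements:

```python
# Rough sketch: a Google Cloud Function (Python) that queries PostgreSQL and returns JSON.
# Connection details come from placeholder environment variables; the table/query are hypothetical.
import json
import os

import psycopg2


def query_postgres(request):
    conn = psycopg2.connect(
        host=os.environ["PG_HOST"],
        dbname=os.environ["PG_DB"],
        user=os.environ["PG_USER"],
        password=os.environ["PG_PASSWORD"],
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, name FROM my_table LIMIT 100")  # hypothetical table
        rows = cur.fetchall()
    return (json.dumps(rows), 200, {"Content-Type": "application/json"})
```

On the Apps Script side you would then call the deployed endpoint with UrlFetchApp.fetch(), parse the JSON, and write the rows into the sheet.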

Connect to Azure SQL Database from Databricks Notebook

I want to load data from Azure Blob Storage to an Azure SQL Database using a Databricks notebook. Could anyone help me with this?
I'm new to this, so I cannot comment, but why use Databricks for this? It would be much easier and cheaper to use Azure Data Factory.
https://learn.microsoft.com/en-us/azure/data-factory/tutorial-copy-data-dot-net
If you really need to use Databricks, you would need to either mount your Blob Storage account, or access it directly from your Databricks notebook or JAR, as described in the documentation (https://docs.azuredatabricks.net/spark/latest/data-sources/azure/azure-storage.html).
You can then read the files into DataFrames in whatever format they are in, and use the SQL JDBC connector to create a connection for writing the data to SQL (https://docs.azuredatabricks.net/spark/latest/data-sources/sql-databases.html).
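To make that concrete, here is a minimal sketch of the direct-access route, assuming CSV files in Blob Storage; the account, container, secret scope, server, and table names are placeholders:

```python
# Minimal sketch: read CSVs from Azure Blob Storage and write them to Azure SQL via JDBC.
# Account, container, server, database, table, and credential names are placeholders.
storage_account = "mystorageaccount"
container = "mycontainer"

# Direct access to Blob Storage using the storage account key (kept in a secret scope).
spark.conf.set(
    f"fs.azure.account.key.{storage_account}.blob.core.windows.net",
    dbutils.secrets.get(scope="my-scope", key="blob-access-key"),
)

df = (
    spark.read
    .option("header", "true")
    .csv(f"wasbs://{container}@{storage_account}.blob.core.windows.net/input/")
)

# Write the DataFrame to Azure SQL Database over JDBC.
jdbc_url = (
    "jdbc:sqlserver://myserver.database.windows.net:1433;"
    "database=mydatabase;encrypt=true;"
)
(
    df.write
    .format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.my_table")
    .option("user", "myuser")
    .option("password", dbutils.secrets.get(scope="my-scope", key="sql-password"))
    .mode("append")
    .save()
)
```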