What is the maximum size of Azure Easy Tables? - azure-mobile-services

I searched on the internet and did not find any answer related to it.
Could someone tell me the maximum size of data we can accommodate with Azure Easy Tables?
What is the maximum size we can have a backup of?
Thanks

Before using Azure Easy Tables, we need to configure connections to Azure SQL and Azure Storage. The data behind Azure Easy Tables is stored in the Azure SQL database and in Azure Storage.
Could someone tell me the maximum size of data we can accommodate with Azure Easy Tables?
So the maximum size of data you can accommodate with Azure Easy Tables depends on the Azure SQL database (the tier and configured max size) that you choose.
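If you want to see which limit applies to your own backend, a small hedged sketch in T-SQL, run against the database that Easy Tables points at (the properties are standard DATABASEPROPERTYEX properties for Azure SQL Database):

    -- Returns the configured maximum size (in bytes), the edition, and the
    -- service objective of the Azure SQL database backing the Easy Tables endpoint.
    SELECT
        DATABASEPROPERTYEX(DB_NAME(), 'MaxSizeInBytes')   AS max_size_bytes,
        DATABASEPROPERTYEX(DB_NAME(), 'Edition')          AS edition,
        DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS service_objective;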

Related

What is the difference between the PolyBase, bulk insert, and COPY methods in Azure Data Factory, and when should each be used?

Can someone please elaborate on when to use PolyBase versus bulk insert in Azure Data Factory? What are the differences between these copy methods?
The two options labeled “PolyBase” and the “COPY command” are only applicable to Azure Synapse Analytics (formerly Azure SQL Data Warehouse). Both are fast loading methods that stage the data in Azure Storage (if it isn’t already there) and then use a fast, highly parallel load from storage into each compute node. Especially on large tables these options are preferred because of their scalability, but they do come with some restrictions, documented in the Microsoft docs.
In contrast, on Azure Synapse Analytics a bulk insert is a slower load method that pushes data through the control node and is not as parallel or performant; it is an order of magnitude slower on large files. But it can be more forgiving in terms of data types and file formatting.
On other Azure SQL databases, bulk insert is the preferred and fast method.
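To make the Synapse-only options concrete, here is a hedged sketch of a COPY-command load; the account, container, table name, and SAS secret are placeholders:

    -- Loads CSV files staged in blob storage straight into a Synapse table.
    -- The COPY command parallelizes the read across the compute nodes.
    COPY INTO dbo.StagingSales
    FROM 'https://<account>.blob.core.windows.net/<container>/sales/*.csv'
    WITH (
        FILE_TYPE = 'CSV',
        CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>'),
        FIRSTROW = 2,            -- skip the header row
        FIELDTERMINATOR = ','
    );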

Ways to import data into Azure SQL PaaS from Azure Blob Storage

All,
I have to BULK INSERT data into Azure SQL from an Azure Blob Storage account. I know one way is to use SAS keys, but are there more secure ways to load data from T-SQL?
For example, is there a way to use the user's AAD account to connect to the storage? Would Managed Identity work? I have not come across an example on the internet that uses anything other than SAS keys.
Gopi
Azure Data Factory generally serves this purpose. You can build a pipeline that grabs data from Blob Storage and massages it / loads it into SQL; that is essentially what it is designed for. However, if you do not wish to use that, the recommended way is SAS, because a SAS token can be temporary and revoked at any time. Why do you think SAS is less secure?
As per the documentation: https://learn.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=sql-server-ver15#credential--credential_name — if you create an external data source of the BLOB_STORAGE type, the identity/credential MUST be SAS, as no other authentication type is supported. That means you cannot use any other auth method to reach Blob Storage from T-SQL.
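To make the SAS route concrete, a minimal T-SQL sketch; the credential, data source, table, and file names are placeholders, and the SAS token is supplied without the leading '?':

    -- A master key must exist before a database scoped credential can be created.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    -- Store the SAS token as a database scoped credential.
    CREATE DATABASE SCOPED CREDENTIAL BlobSasCredential
    WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
         SECRET   = '<sas-token-without-leading-?>';

    -- External data source of type BLOB_STORAGE, which is what BULK INSERT needs.
    CREATE EXTERNAL DATA SOURCE MyBlobSource
    WITH (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://<account>.blob.core.windows.net/<container>',
        CREDENTIAL = BlobSasCredential
    );

    -- Bulk load a CSV that sits in the container referenced above.
    BULK INSERT dbo.TargetTable
    FROM 'data/file.csv'
    WITH (DATA_SOURCE = 'MyBlobSource', FORMAT = 'CSV', FIRSTROW = 2);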

How do I efficiently migrate the BigQuery Tables to On-Prem Postgres?

I need to migrate tables from BigQuery to an on-prem Postgres database.
How can I efficiently achieve that?
Some thoughts that come to mind:
I will use Google APIs to export the data from the tables,
store it locally,
and finally, import it into Postgres.
But I am not sure whether that can be done for a huge amount of data, in the TBs. Also, how can I automate this process? Can I use Jenkins for that?
Exporting the data from BigQuery, storing it, and importing it into PostgreSQL is a good approach. Here are two other alternatives that you can consider:
1) There's a PostgreSQL wrapper for BigQuery that allows you to query BigQuery directly. Depending on your scenario this might be the easiest way to transfer the data, although for TBs it might not be the best approach. This suggestion was made by @David in this SO question.
2) Using Dataflow. You can create an ETL process using Apache Beam to perform the transfer. Take a look at this how-to for transferring data from BigQuery to Cloud SQL. You would need to adapt it for an on-prem PostgreSQL instance, but the idea remains the same.
Here's another SO answer that gives more context on this approach.
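If you go with the export/store/import route, the PostgreSQL side of the final step can be a plain COPY per exported file. A hedged sketch, with a made-up table and file path, assuming the CSV columns line up with whatever you exported from BigQuery:

    -- Hypothetical target table; columns must match the exported BigQuery schema.
    CREATE TABLE IF NOT EXISTS sales_events (
        event_id   BIGINT,
        event_time TIMESTAMPTZ,
        amount     NUMERIC
    );

    -- Server-side load of one exported CSV shard. From a client machine,
    -- psql's \copy does the same thing over the connection instead.
    COPY sales_events
    FROM '/data/bq_export/sales_events_000000000000.csv'
    WITH (FORMAT csv, HEADER true);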

Parallelism in Azure Data Factory v2 copy activity

We are implementing a solution to achieve functionality similar to SSIS packages that copy data from one database to another (on-premises to Azure SQL). In SSIS we have options to set up parallel processing in different ways. We can also transfer data in chunks.
Similarly, what is the best way to achieve parallelism in Azure Data Factory version 2? Please consider the scenario of transferring data for only one table.
Have a look at the Copy Activity Performance and Tuning Guide for ways to optimize transferring data into the Cloud with ADF: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-performance
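One technique from that guide that maps well to the single-table scenario is partitioning the source reads so the copy activity can run several of them in parallel. A hedged sketch of the kind of range-bounded source query involved; dbo.SourceTable, the Id column, and the @RangeStart/@RangeEnd parameters are placeholders that the pipeline (or the copy activity's partition settings) would supply per slice:

    -- Each parallel copy slice reads one contiguous range of the table.
    -- @RangeStart / @RangeEnd are placeholder parameters supplied per slice.
    SELECT *
    FROM dbo.SourceTable
    WHERE Id >= @RangeStart
      AND Id <  @RangeEnd;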

Azure Data Factory and PowerShell

[attached image: the three Azure Data Factory migration methods]
Hi Stack team,
I actually plan to migrate my SQL Server 2012 databases to Azure SQL Data Warehouse with the Azure Data Factory approach...
But the problems are:
1) my database size is 4.5 TB;
2) in this approach there are 3 methods, whose details are listed in the attached image. My problem is that I planned to use the 3rd method for the migration (3. Using Azure Data Factory and PowerShell (entire database - ADF)).
So please point me to links related to the above method and tell me whether the migration is possible or not. If it is possible, please tell me how to do it...
Please refer to the link below; there's a UX solution in Azure Data Factory that specifically targets migration from SQL Server to Azure SQL Data Warehouse. It's a wizard-based UX which is very easy to follow. It will automatically create tables in Azure SQL Data Warehouse and migrate the data using PolyBase, which is the most efficient way to load data into SQL DW.
https://learn.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-with-data-factory
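For context on why the PolyBase path is the fast one, here is a hedged sketch of what a manual PolyBase load into SQL DW roughly looks like; the wizard does the equivalent for you, every name below is a placeholder, and it assumes a database scoped credential named StorageCredential for the storage account already exists:

    -- External data source over the blob container that holds the staged files.
    CREATE EXTERNAL DATA SOURCE StagingBlob
    WITH (
        TYPE = HADOOP,
        LOCATION = 'wasbs://<container>@<account>.blob.core.windows.net',
        CREDENTIAL = StorageCredential   -- existing database scoped credential with the storage key
    );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (FORMAT_TYPE = DELIMITEDTEXT, FORMAT_OPTIONS (FIELD_TERMINATOR = ','));

    -- External table pointing at the staged files.
    CREATE EXTERNAL TABLE dbo.Orders_ext (OrderId INT, Amount MONEY)
    WITH (LOCATION = '/orders/', DATA_SOURCE = StagingBlob, FILE_FORMAT = CsvFormat);

    -- CTAS pulls the data into a distributed SQL DW table in parallel on every node.
    CREATE TABLE dbo.Orders
    WITH (DISTRIBUTION = HASH(OrderId))
    AS SELECT * FROM dbo.Orders_ext;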