My model has an entity with a property of type 'blob'. At the moment a database column of type "varbinary(max)" is generated. However, I want to load/save my blob from/to Azure blob storage. What property type do I have to use in my model? What configuration do I need in my web.config file? Do you have an example project?
In the CodeFluent.Runtime.Azure dll you will find the AzureBinaryLargeObject class and its related configuration, AzureBinaryServicesEntityConfiguration.
See Azure BLOB
This should do the trick.
However, it relies on Microsoft.WindowsAzure.StorageClient.dll, which is not included in the newest versions of the Azure SDK. You can get it from this specific NuGet package.
Related
My Azure Databricks workspace was decommissioned. I forgot to copy files stored in the DatabricksRoot storage (dbfs:/FileStore/...).
Can the workspace be recommissioned/restored? Is there any way to get my data back?
Unfortunately, end users cannot restore a Databricks workspace themselves.
It can only be done by raising a support ticket here.
It is best practice not to store any data in the root Azure Blob storage that backs root DBFS access for the workspace; root DBFS storage is not supported for production customer data. You might, however, store other objects there, such as libraries, configuration files, and init scripts. Either develop an automated process to replicate these objects, or have a process in place to update the secondary deployment manually.
Refer - https://learn.microsoft.com/en-us/azure/databricks/administration-guide/disaster-recovery#general-best-practices
I am doing a comparison between the .bim files of my UAT and PROD environments. I want to download the .bim file from both the PROD and UAT AAS (Azure Analysis Services) instances and compare them, but I am unable to do so. I tried Backup-ASDatabase, but it does not download the actual .bim file; instead it creates some kind of compressed file in which I can't see the .bim file's contents.
I have seen a few links, but they only cover backing up data models for disaster recovery.
Please help me.
Unfortunately, you cannot back up and download an Azure Analysis Services data model (.bim) file using PowerShell.
Reason: by default, the Backup-ASDatabase cmdlet creates a <filename>.abf file from which you cannot extract the model, because the file is compressed; it can only be used with Restore-ASDatabase for disaster recovery.
You can download Azure Analysis Services Data models (.bim) file using Azure portal.
Select your Analysis Services instance => under Models, select Manage => select the model whose .bim file you want, and choose Open in Visual Studio.
The downloaded zip folder contains the "SMPROJ" and "BIM" files.
So I am facing the following problem: I have a bunch of Azure Data Factory V1 pipelines in one specific data factory, and these pipelines have around 400 datasets between them.
I need to move all of them to a new resource group / environment and put their json definition in a git repo.
So my question is: how can I download all the pipeline definitions and all the dataset definitions for a data factory, in their JSON format, from Azure?
I don't want to click each one and copy-paste from the Azure UI, as it will take ages.
Calling the REST API is a good way for both V1 and V2. See this doc.
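As a sketch of the REST approach for V1, the management API exposes list endpoints for pipelines and datasets (api-version 2015-10-01), and each returned item carries its full JSON definition. The subscription, resource group, and factory names below are placeholders, and you need a bearer token (e.g. from `az account get-access-token`):

```python
import json
import os
import pathlib
import urllib.request

# Placeholders -- fill in your own subscription, resource group, and factory.
SUBSCRIPTION = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
FACTORY = "my-data-factory"
TOKEN = os.environ.get("AZURE_TOKEN", "<bearer-token>")

def v1_url(resource):
    """Build the ADF V1 management URL for 'datapipelines' or 'datasets'."""
    return (
        "https://management.azure.com"
        f"/subscriptions/{SUBSCRIPTION}/resourcegroups/{RESOURCE_GROUP}"
        f"/providers/Microsoft.DataFactory/datafactories/{FACTORY}"
        f"/{resource}?api-version=2015-10-01"
    )

def dump_definitions(resource, out_dir):
    """List every pipeline/dataset and write each JSON definition to disk."""
    req = urllib.request.Request(
        v1_url(resource), headers={"Authorization": f"Bearer {TOKEN}"}
    )
    with urllib.request.urlopen(req) as resp:
        items = json.load(resp).get("value", [])
    pathlib.Path(out_dir).mkdir(parents=True, exist_ok=True)
    for item in items:
        out = pathlib.Path(out_dir, f"{item['name']}.json")
        out.write_text(json.dumps(item, indent=2))

# Usage (requires a valid token; commits the files to your git repo afterwards):
# dump_definitions("datapipelines", "repo/pipelines")
# dump_definitions("datasets", "repo/datasets")
```

The resulting per-object .json files are exactly what you would otherwise copy-paste from the portal, ready to commit to a git repo.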
For ADF V1, you can try using Visual Studio.
Connect via Cloud Explorer to your data factory.
Select the data factory and choose Export to New Data Factory Project
This is documented on SQL Server Central.
Another thing to try is to have Azure script out an ARM template of the Data Factory.
Is there any possibility of uploading data to SharePoint Online using the Migration API without uploading the package to Azure Storage? I have looked at some blogs, and they all suggest uploading the package to Azure Storage via PowerShell.
The closest thing you can use is direct migration using PowerShell; it will not use the Migration API and, of course, no Azure Storage will be required.
I've created a custom solution for migration; it supports metadata and allows you to migrate file shares and local folders. Not sure if it will be helpful for you, but take a look:
https://github.com/MrDrSushi/SPOPSM
When I try to deploy an ADF project from Visual Studio to Azure, I get the error:
21.02.2017 13:03:32- Publishing Project 'MyProj.DataFactory'....
21.02.2017 13:03:32- Validating 10 json files
21.02.2017 13:03:37- Publishing Project 'MyProj.DataFactory' to Data Factory 'MyProjDataFactory'
21.02.2017 13:03:37- Starting upload of Dependency D:\Sources\MyProjDataFactory\Dependencies\ParseMyData.usql
The dependency is Azure Data Lake Analytics U-SQL script.
Where are the dependencies stored in Azure?
UPDATE:
When I try to orchestrate a U-SQL stored procedure instead of a script, the Visual Studio validator throws this error on build:
You have a couple of options here.
1) Store the U-SQL file in Azure Blob Storage. In that case you'll need a linked service in your Azure Data Factory pointing to blob storage. Then upload the file manually or add it to your Visual Studio project's Data Factory dependencies. Unfortunately this means the U-SQL becomes static in the ADF project and is not linked in any way to your ADL project, so be careful.
2) The simplest way is just to inline the U-SQL code directly in the ADF JSON. Again, this means you'll need to manually refresh the code from the ADL project.
3) My preferred approach: create the U-SQL as a stored procedure in the Azure Data Lake Analytics service, then reference the proc in the JSON using [database].[schema].[procname]. You can also pass parameters to the proc from ADF, for example the time slice. This also assumes you already have ADL set up as a linked service in ADF.
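As a rough illustration of option 3, the activity's typeProperties can carry the proc call in its script property, with the time slice passed as a parameter. The database, proc, and linked service names here are made up, and the activity's inputs, outputs, and policy are omitted for brevity:

```json
{
  "name": "RunParseMyDataProc",
  "type": "DataLakeAnalyticsU-SQL",
  "linkedServiceName": "AzureDataLakeAnalyticsLinkedService",
  "typeProperties": {
    "script": "MyDb.dbo.ParseMyData(@SliceStart);",
    "parameters": {
      "SliceStart": "$$Text.Format('{0:yyyy-MM-ddTHH:mm:ss}', SliceStart)"
    }
  }
}
```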
Hope this helps.
I have a blog post about the third option, including passing parameters, here if you're interested: http://www.purplefrogsystems.com/paul/2017/02/passing-parameters-to-u-sql-from-azure-data-factory/
Thanks