How to take a backup of / download the Azure Analysis Services data model (.bim) file using PowerShell - powershell

I am comparing the .bim files of the UAT and PROD environments. I want to download the .bim file from both the Prod and UAT AAS (Azure Analysis Services) instances and perform the comparison, but I am unable to do so. I tried Backup-ASDatabase, but it does not download the actual .bim file; instead it creates some kind of compressed file in which I can't see the actual .bim code.
I have seen a few links, but they only cover backing up data models for disaster recovery.
Please help me.

Unfortunately, you cannot take a backup of and download the Azure Analysis Services data model (.bim) file using PowerShell.
Reason: by default, the Backup-ASDatabase cmdlet creates a <filename>.abf file which you cannot extract the model from, because the file is compressed (with Unicode encoding) and can only be used with Restore-ASDatabase in case of disaster recovery.
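For reference, a minimal sketch of the backup you can take with PowerShell (server, database and file names are placeholders); it produces the compressed .abf described above, not a .bim file:

```powershell
# Requires the Analysis Services cmdlets (shipped in the SqlServer module /
# with SSMS). Server, database and file names below are placeholders, and the
# AAS server must already have a backup storage account configured.
Import-Module SqlServer

$cred = Get-Credential   # an account with admin rights on the AAS server

# Produces a compressed .abf usable only with Restore-ASDatabase -
# it is not the model's .bim definition.
Backup-ASDatabase -Server "asazure://westeurope.asazure.windows.net/myaasserver" `
                  -Name "AdventureWorks" `
                  -BackupFile "AdventureWorks.abf" `
                  -AllowOverwrite `
                  -Credential $cred
```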
You can, however, download the Azure Analysis Services data model (.bim) file from the Azure portal.
Select your Analysis Services instance => under Models, select Manage => select the model whose (.bim) file you want to download and choose Open in Visual Studio.
The downloaded zip folder contains the .smproj and .bim files.

Related

Move files in FTP with Logic apps

On an FTP server I need to move files from a folder to an archive folder once they are deposited. I've built previous pipelines in Azure Data Factory, but since FTP is not supported in Copy Data I resorted to Logic Apps, but I don't know which tasks to use. I also need to trigger the Logic App from ADF.
Thank you,
There are several ways to implement the workflow you are trying to achieve with the SFTP/FTP connector, depending on how frequently files are added and how big they are. After that you can use Azure Blob Storage to archive the files from the FTP folder.
The following steps give an overall outline you can follow.
In the Azure portal, search for Logic App and create one. Open the Logic App, under DEVELOPMENT TOOLS select Logic App Designer, from the list of templates click Blank Logic App, and search for the FTP trigger "When a file is added or modified".
Then provide the connection details for the remote FTP (or SFTP) server you wish to connect to.
Once you have created the connection, specify the folder in which the files will reside.
Then click New step and Add an action. Now configure the target Blob storage account to transfer the FTP file to: search for Blob and select AzureBlobStorage – Create blob.
This way you can archive the FTP files. You should also refer to this article for more information on how to copy files from FTP to Blob Storage in a Logic App.
There is also a Quick Start template from Microsoft for a Copy FTP files to Azure Blob logic app. This template creates a Logic App that triggers on files in an FTP server and copies them to an Azure Blob container.
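If you prefer to stand that quick-start template up from PowerShell rather than the portal, a minimal sketch with the Az module looks like this; the template URI and parameter names are placeholders you would take from the quick-start gallery page:

```powershell
# Sketch: deploy the "Copy FTP files to Azure Blob" quick-start template.
# The template URI and the parameter names are placeholders - copy the real
# values from the quick-start gallery page.
Connect-AzAccount

New-AzResourceGroup -Name "rg-ftp-archive" -Location "westeurope"

New-AzResourceGroupDeployment `
    -ResourceGroupName "rg-ftp-archive" `
    -TemplateUri "<raw URL of the template's azuredeploy.json>" `
    -TemplateParameterObject @{ logicAppName = "ftp-to-blob" }  # parameters vary per template
```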
And for your second problem:
I also need to trigger the logic app from ADF
Check this Execute Logic Apps in Azure Data Factory (V2) Microsoft document.
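The pattern in that document is an ADF Web Activity calling a Logic App that has an HTTP Request trigger. Assuming that setup, the sketch below makes the same call from PowerShell, which is handy for testing the trigger outside ADF; the callback URL is a placeholder for the one shown on the Request trigger after you save the Logic App:

```powershell
# Sketch: invoke a Logic App HTTP Request trigger - the same POST an ADF Web
# Activity would make. The callback URL (with its SAS signature) is a
# placeholder; copy the real one from the trigger in the designer.
$logicAppUrl = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=...&sig=..."

# The payload shape is whatever your workflow expects; this one is made up.
$body = @{ fileName = "sample.csv" } | ConvertTo-Json

Invoke-RestMethod -Method Post -Uri $logicAppUrl -Body $body -ContentType "application/json"
```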

copy files from azure file storage to azure website after release

I have files that need to be copied over to my website (an Azure website) after a deployment has been made. Usually these files are server specific (I have multiple different servers for different releases), and in the past, before I used Azure, I just had a backup folder with these files and a PowerShell script that I ran after deployment to copy them over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure File storage, and then, in my release task after the Azure website deployment, copying them from that file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Is there a release task I can use with this in mind?
Yes, we could use the Azure File Copy task. I also did a demo copying the zip file to Azure Storage, and it works correctly on my side.
Note: If you don't want to zip the files, you could remove the Archive File task.
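For reference, a rough PowerShell equivalent of what the Azure File Copy task does for a blob destination, runnable from an Azure PowerShell release step; the resource group, storage account and container names are placeholders:

```powershell
# Sketch: upload the release artifacts to a blob container, roughly what the
# Azure File Copy task automates. All names below are placeholders.
$ctx = (Get-AzStorageAccount -ResourceGroupName "rg-web" -Name "mystorageacct").Context

# Push every file from the artifact staging folder into the container.
Get-ChildItem $env:BUILD_ARTIFACTSTAGINGDIRECTORY -File -Recurse | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName `
                             -Container "release-files" `
                             -Blob $_.Name `
                             -Context $ctx `
                             -Force
}
```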

Deploy Azure Data Factory by just providing project file (Not coding Linked services, Datasets separately)

I want to know how to deploy an ADF from a Visual Studio project file in which only the resources (e.g. Azure SQL) are defined.
I want the linked services to be generated automatically for the specified resources.

An error when I try to deploy an ADF project with dependencies

When I try to deploy an ADF project from Visual Studio to Azure, I get the error:
21.02.2017 13:03:32- Publishing Project 'MyProj.DataFactory'....
21.02.2017 13:03:32- Validating 10 json files
21.02.2017 13:03:37- Publishing Project 'MyProj.DataFactory' to Data Factory 'MyProjDataFactory'
21.02.2017 13:03:37- Starting upload of Dependency D:\Sources\MyProjDataFactory\Dependencies\ParseMyData.usql
The dependency is an Azure Data Lake Analytics U-SQL script.
Where are the dependencies stored in Azure?
UPDATE:
When I try to orchestrate a U-SQL stored procedure instead of a script, the Visual Studio validator throws this error on build:
You have a couple of options here.
1) Store the U-SQL file in Azure Blob Storage. In that case you'll need a linked service to Blob storage in your Azure Data Factory. Then upload the file manually, or add the file to your Visual Studio Data Factory project dependencies. Unfortunately, this means the U-SQL becomes static in the ADF project and is not linked in any way to your ADL project, so be careful.
2) The simplest way is just to inline the U-SQL code directly in the ADF JSON. Again, this means you'll need to manually refresh the code from the ADL project.
3) My preferred approach: create the U-SQL as a stored procedure in the Azure Data Lake Analytics service, then reference the proc in the JSON using [database].[schema].[procname]. You can also pass parameters to the proc from ADF, for example the time slice (see the sketch below). This also assumes you already have ADLA set up as a linked service in ADF.
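As a rough illustration of the third option (every database, proc and parameter name here is made up, not taken from your project), the activity's typeProperties can hold a one-line script that calls the proc and pass the time slice in through the parameters block; it is kept in a here-string so it can be merged into the pipeline JSON:

```powershell
# Hypothetical fragment of an ADF (v1) DataLakeAnalyticsU-SQL activity that
# calls an ADLA stored procedure and passes the slice start as a parameter.
# Database, schema, proc and parameter names are invented for illustration.
$uSqlActivityFragment = @'
"type": "DataLakeAnalyticsU-SQL",
"typeProperties": {
    "script": "[MyAdlaDb].[dbo].[ParseMyData](@sliceStart);",
    "degreeOfParallelism": 3,
    "parameters": {
        "sliceStart": "$$Text.Format('{0:yyyy-MM-ddTHH:mm:ss}', SliceStart)"
    }
}
'@

# Save it next to the project so it can be merged into the pipeline definition.
Set-Content -Path .\USqlProcActivity.fragment.json -Value $uSqlActivityFragment
```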
Hope this helps.
I have a blog post about the third option and passing parameters here if you're interested: http://www.purplefrogsystems.com/paul/2017/02/passing-parameters-to-u-sql-from-azure-data-factory/
Thanks

Programmatically run a load test without having the app in VSTS source control

We're using an on-prem VCS and CI pipeline, and don't have plans to switch to VSTS right now. However, I'd be very interested in running cloud-based load-tests against our app as part of our CI pipeline. In order to do this, I'd have to be able to programmatically upload the loadtest script and invoke it from VSTS.
Is this possible?
Yes, it is possible. The workflow is as follows:
1. Create a valid loadtest file. You can use the load test file from an earlier run through Visual Studio for this.
2. Create a location to upload the file(s) to. This location is a drop folder on Azure Blob storage and is referred to below as the "TestDrop".
3. Upload the loadtest file and any other files required for the run (the webtest files, settings file, etc.) to this "TestDrop".
4. Create a test run using the TestDrop from the previous step, as all the files required for a run are now available at the drop location.
5. Start the run.
6. Once finished, download the results to your local machine. This will be a gzip file; uncompress it to get the results file.
7. Use Visual Studio to view the downloaded results.
For more information, you can refer to this article, which contains samples.
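A sketch of the first upload step (creating the TestDrop) with PowerShell and a personal access token is below; the vsclt endpoint and body are assumptions based on the cloud load testing REST API samples, so verify them against the linked article:

```powershell
# Sketch of step 2: authenticate with a PAT and create a TestDrop. The
# vsclt endpoint and payload are assumptions based on the CLT REST API
# samples - verify against the linked article. Account name is a placeholder.
$account = "myaccount"
$pat     = "<personal access token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

$drop = Invoke-RestMethod -Method Post `
    -Uri "https://$account.vsclt.visualstudio.com/_apis/clt/testdrops?api-version=1.0" `
    -Headers $headers -ContentType "application/json" `
    -Body (@{ dropType = "TestServiceBlobDrop" } | ConvertTo-Json)

# The response is expected to contain a blob container URL (with SAS) to which
# the loadtest/webtest files are uploaded before creating and queuing the run.
$drop.accessData.dropContainerUrl
```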