Move files in FTP with Logic apps - azure-data-factory

On an FTP server I need to move files from a folder to an archive folder once they are deposited. I've built previous pipelines in Azure Data Factory, but since FTP is not supported in the Copy Data activity I resorted to Logic Apps, but I don't know which tasks to use. I also need to trigger the Logic App from ADF.
Thank you,

There are several ways to implement the workflow you are trying to achieve with the SFTP/FTP connector, depending on how frequently the files are added and how large they are. You can then use Azure Blob Storage to archive the files from the FTP folder.
The following steps give you an overall outline of what to do.
In the Azure portal, search for Logic App and create one. Open the Logic App, and under DEVELOPMENT TOOLS select Logic App Designer; from the list of templates click Blank Logic App and search for the FTP – When a file is added or modified trigger.
Then provide the connection details for the remote FTP (or SFTP) server you wish to connect to.
Once the connection is created, specify the folder in which the files will reside.
Then click New step and Add an action. Now configure the target Blob storage account to transfer the FTP file to: search for Blob and select Azure Blob Storage – Create blob.
This way you will be able to archive the FTP files. You should also refer to this article for more information on how to copy files from FTP to Blob Storage with a Logic App.
There is also a Quickstart template from Microsoft, Copy FTP files to Azure Blob logic app. This template creates a Logic App that triggers on files in an FTP server and copies them to an Azure Blob container.
And for your second problem:
I also need to trigger the logic app from ADF
Check this Execute Logic Apps in Azure Data Factory (V2) Microsoft document.
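The usual pattern is to give the Logic App a "When a HTTP request is received" trigger and call its callback URL from an ADF Web activity with a POST. If you want to verify the trigger endpoint outside ADF first, here is a minimal sketch in Scala; the callback URL and the body field are placeholders you would copy/define from your own Logic App:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object TriggerLogicApp {
  def main(args: Array[String]): Unit = {
    // Placeholder: copy the callback URL from the Logic App's
    // "When a HTTP request is received" trigger.
    val callbackUrl = "https://<your-logic-app-http-trigger-callback-url>"

    // Hypothetical payload -- send whatever your Logic App expects.
    val body = """{ "folder": "/inbox" }"""

    val request = HttpRequest.newBuilder()
      .uri(URI.create(callbackUrl))
      .header("Content-Type", "application/json")
      .POST(HttpRequest.BodyPublishers.ofString(body))
      .build()

    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())

    // A 2xx status (typically 200 or 202) means the Logic App run was triggered.
    println(s"Status: ${response.statusCode()}")
  }
}

The ADF Web activity performs the same POST: paste the callback URL into its URL field and set the method to POST.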

Related

How to take backup / Download Azure Analysis Services Data models (.bim) file using PowerShell

I am doing a comparison between the .bim files of the UAT and PROD environments. I want to download the .bim file from the PROD and UAT AAS (Azure Analysis Services) instances and perform the comparison, but I am unable to do so. I tried Backup-ASDatabase; it does not download the actual .bim file, in fact it creates some kind of compressed file in which I can't see the actual .bim file code.
I have seen a few links, but they only cover backing up data models for disaster recovery.
Please help me.
Unfortunately, you cannot back up and download the Azure Analysis Services data model (.bim) file using PowerShell.
Reason: by default, the Backup-ASDatabase cmdlet creates a <filename>.abf which does not allow you to extract the files, because the file is compressed with Unicode encoding; this file can only be used with Restore-ASDatabase in case of disaster recovery.
You can download the Azure Analysis Services data model (.bim) file using the Azure portal.
Select your Analysis Services instance => under Models => select Manage => select the model whose (.bim) file you want to download and use Open in Visual Studio.
The downloaded zip folder contains the .smproj and .bim files.

copy files from azure file storage to azure website after release

I have files that need to be copied over to my website (Azure website) after a deployment has been made. Usually these files are server specific (I have multiple different servers for different releases), and in the past, before I used Azure, I just had a backup folder with these files and a PowerShell script that I ran after deployment that copied those files right over.
Now that I'm moving to Azure, I'd like to keep this functionality. I'm interested in copying these files into Azure file storage, and then in my release task after the Azure website deployment, just copying from that file storage into the site\wwwroot folder. I'm not really seeing an easy way to do this. Is there a release task I can use with this in mind?
Is there a release task i can use with this in mind?
Yes, you could use the Azure File Copy task. I also did a demo copying a zip file to Azure Storage, and it works correctly on my side.
Note: If you don't want to zip the files, you could remove the Archive Files task.

how to move files located in on-premises windows based file share using scala?

Background: I am using Azure ADF V2 to move data from a file share to ADLS; after the file is moved successfully I want to archive the file within the file share location.
How do I connect to an on-premises Windows-based file share and move the files from one folder to another within the file share using Scala? I am not sure how to establish the connectivity to a file share.
You can use a file system linked service to establish the connectivity to a file share:
1. Create a self-hosted integration runtime in ADF and install it on your machine.
2. Create a file system linked service, and for the "Connect via integration runtime" field, choose the self-hosted IR you created in step 1.
3. Configure your linked service and dataset as the doc instructs: https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system
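If you also want to do the in-share archiving from Scala itself rather than from an ADF activity, the JVM can reach a Windows file share directly through its UNC path, as long as the process runs under an account that has permission on the share (for example, on the same machine that hosts the self-hosted integration runtime). A minimal sketch, assuming Scala 2.13 and hypothetical server/folder names:

import java.nio.file.{Files, Paths, StandardCopyOption}
import scala.jdk.CollectionConverters._

object ArchiveShareFiles {
  def main(args: Array[String]): Unit = {
    // Hypothetical UNC paths -- replace with your actual file share folders.
    val incoming = Paths.get("""\\fileserver\data\incoming""")
    val archive  = Paths.get("""\\fileserver\data\archive""")

    Files.createDirectories(archive)

    // Move every regular file from the incoming folder into the archive folder.
    Files.list(incoming).iterator().asScala
      .filter(Files.isRegularFile(_))
      .foreach { file =>
        Files.move(file, archive.resolve(file.getFileName), StandardCopyOption.REPLACE_EXISTING)
      }
  }
}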

Programmatically run a load test without having the app in VSTS source control

We're using an on-prem VCS and CI pipeline, and don't have plans to switch to VSTS right now. However, I'd be very interested in running cloud-based load-tests against our app as part of our CI pipeline. In order to do this, I'd have to be able to programmatically upload the loadtest script and invoke it from VSTS.
Is this possible?
Yes, it is possible. The workflow is like this:
1. Create a valid .loadtest file. You can use the load test file from an earlier run through Visual Studio for this.
2. Create a location to upload the file(s). This location is a drop folder on Azure Blob storage and is referred to below as a "TestDrop".
3. Upload the .loadtest file and any other files required for the run (the .webtest files, settings file, etc.) to this location, or "TestDrop".
4. Create a test run using the TestDrop from the previous step, as all the files required for a run are now available at the drop location.
5. Start the run.
6. Once finished, download the results to your local machine. This will be a gzip file; uncompress it to get the results file.
7. Use Visual Studio to view the downloaded results.
For more information, you can refer to this article, which contains samples; a rough sketch of the calls involved is below.
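As a rough illustration of steps 2, 4 and 5: each call is just an authenticated REST request against the cloud load testing endpoint of your VSTS account, using a personal access token. The endpoint shape and request bodies below are assumptions based on the pattern in the linked samples, so verify them there before relying on them. A sketch in Scala:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.util.Base64

object CloudLoadTestSketch {
  def main(args: Array[String]): Unit = {
    val account = "your-account"          // VSTS account name (placeholder)
    val pat     = sys.env("VSTS_PAT")     // personal access token
    val auth    = "Basic " + Base64.getEncoder.encodeToString((":" + pat).getBytes("UTF-8"))
    val client  = HttpClient.newHttpClient()

    // Helper: POST a JSON body to the cloud load testing service.
    def post(path: String, json: String): String = {
      val req = HttpRequest.newBuilder()
        .uri(URI.create(s"https://$account.vsclt.visualstudio.com/_apis/clt/$path?api-version=1.0"))
        .header("Authorization", auth)
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(json))
        .build()
      client.send(req, HttpResponse.BodyHandlers.ofString()).body()
    }

    // Step 2: create a test drop; the response contains the blob location
    // (SAS URL) where the .loadtest/.webtest files are uploaded in step 3.
    println(post("testdrops", "{ }"))   // request body fields per the linked samples

    // Steps 4-5 (after uploading the files): create a test run that references
    // the drop id returned above, then queue it -- again, payloads per the samples.
  }
}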

Azure Continuous Integration Overwriting App_Data even with WebDeploy file specified to "exclude app data"

I have a Windows Azure Website and I've set up Azure Continuous Integration with hosted Team Foundation Server. I make a change to my local copy, commit to TFS, and it gets published to Azure. This is great; the problem is that I have an Access database in the ~\App_Data\ folder, and when I check in, the copy on Azure gets overwritten.
I set up a Web Deploy publish profile with "Exclude App_Data" and configured the build task to use that publish profile, and now it DELETES my ~\App_Data\ folder.
Is there a way to configure Azure Continuous Integration to deploy everything and leave the App_Data alone?
I use the 'Publish Web' tool within Visual Studio, but I think the principles are the same:
if you modify a file locally and publish, it will overwrite whatever's on the web
if you have no file locally - but the file exists on the web - it will still exist on the web after publishing
The App_Data folder gets no special treatment in this behaviour by default. Which makes sense - if you modified an .aspx or .jpg file locally, you would want the latest version to go on the web, right?
I also use App_Data to store some files which I want the web server (ASP.NET code) to modify and have it stay current on the web.
The solution is to:
Allow the web publishing to upload App_Data, no exclusions.
Don't store files in App_Data (locally) that you want to modify on the web.
Let the web server be in charge of creating and modifying the files exclusively.
Ideally you would not have to change any code and the server can create a blank file if necessary to get started.
However if you must start off with some content, say, a new blank .mdf file, you could do the following:
Locally/in source repository, create App_Data/blank.mdf (this is going to be a starting point, not the working file).
In Global.asax, modify "Application_Start" to create the real working .mdf file from the blank starting file:
// In Global.asax.cs -- needs "using System.IO;" and "using System.Web;" at the top.
protected void Application_Start(object sender, EventArgs e)
{
    // If the real working file doesn't exist yet (first run),
    // create it using a copy of the blank placeholder.
    // If it already exists, we re-use the existing file.
    string real_file = HttpContext.Current.Server.MapPath("~/App_Data/working.mdf");
    if (!File.Exists(real_file))
        File.Copy(HttpContext.Current.Server.MapPath("~/App_Data/blank.mdf"), real_file);
}