Is there a way to overwrite a file in the Downloads section on GitHub when uploading a file with the same filename (e.g. via the developer API, a Ruby script, etc.)? The reason is that I want to keep track of the number of downloads. Thanks!
I haven't tried this, but it's possible that you could replace the file on Amazon S3. I don't know whether it will work, or whether the upload token you get is one-time only with no possibility of deleting the file.
See the API documentation for uploading a file on GitHub (which includes using the Amazon S3 REST API to upload the file):
http://developer.github.com/v3/repos/downloads/
And the API documentation for deleting a file on Amazon S3:
http://docs.amazonwebservices.com/AmazonS3/latest/API/RESTObjectDELETE.html
And the API documentation for putting a file on Amazon S3:
http://docs.amazonwebservices.com/AmazonS3/latest/API/RESTObjectPUT.html
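Putting the two linked docs together, the documented flow is a two-step upload: register the download through the GitHub API, then POST the bytes to GitHub's S3 bucket using the signed form fields the API returns. Below is a minimal sketch in Python; the owner, repo, token and filename are placeholders, and the exact S3 form field names are taken from the downloads documentation, so verify them against the response you actually get. (The downloads API also appears to document deleting an existing download resource, which you would presumably need to do first if the filename collides, though that would likely reset the download counter.)

```python
import requests

# Hypothetical values for illustration only.
OWNER, REPO = "user", "repo"
TOKEN = "<personal-access-token>"
FILENAME = "build.zip"

with open(FILENAME, "rb") as f:
    payload = f.read()

# Step 1: register the download with GitHub (see the downloads API docs above).
r = requests.post(
    f"https://api.github.com/repos/{OWNER}/{REPO}/downloads",
    headers={"Authorization": f"token {TOKEN}"},
    json={"name": FILENAME, "size": len(payload), "content_type": "application/zip"},
)
r.raise_for_status()
meta = r.json()  # contains the signed S3 form fields (path, acl, policy, signature, ...)

# Step 2: POST the file to GitHub's S3 bucket with the fields from step 1.
# Field names follow the linked docs; double-check them against the actual response.
form = {
    "key": meta["path"],
    "acl": meta["acl"],
    "success_action_status": "201",
    "Filename": meta["name"],
    "AWSAccessKeyId": meta["accesskeyid"],
    "Policy": meta["policy"],
    "Signature": meta["signature"],
    "Content-Type": meta["mime_type"],
}
s3 = requests.post(meta["s3_url"], data=form, files={"file": (FILENAME, payload)})
print(s3.status_code)  # 201 means the upload succeeded
```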
On an FTP server, I need to move files from a folder to an archive folder once they are deposited. I've built previous pipelines in Azure Data Factory, but since FTP is not supported in the Copy Data activity, I resorted to Logic Apps, and I don't know which tasks to use. I also need to trigger the Logic App from ADF.
Thank you,
There are several ways to implement the workflow you are trying to achieve with the SFTP/FTP connector, depending on how frequently files are added and how large they are. You can then use Azure Blob Storage to archive the files from the FTP folder.
The following steps give you an overall outline of what to do.
In the Azure portal, search for Logic App and create one. Open the Logic App, and under DEVELOPMENT TOOLS select Logic App Designer; from the list of templates, click Blank Logic App and search for the FTP – When a file is added or modified trigger.
Then provide the connection details for the remote FTP server you wish to connect to (the same applies to an SFTP server).
Once you have the connection created, specify the folder in which the files will reside.
Then click New step and Add an action. Now you need to configure the target Blob Storage account to transfer the FTP file to: search for Blob and select Azure Blob Storage – Create blob.
This way you will be able to archive the FTP files. You should also refer to this article for more information on how to copy files from FTP to Blob Storage in a Logic App.
There is also a Quickstart template from Microsoft for a Copy FTP files to Azure Blob Logic App. This template lets you create a Logic App that triggers on files in an FTP server and copies them to an Azure Blob container.
And for your second problem:
I also need to trigger the logic app from ADF
Check this Execute Logic Apps in Azure Data Factory (V2) Microsoft document.
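In short, that document's approach is to put an HTTP Request trigger on the Logic App and call its callback URL from a Web activity in the ADF pipeline. What ADF ends up sending is just an HTTP POST, so here is a minimal sketch of that call in Python; the callback URL and the payload shape are placeholders for illustration:

```python
import requests

# Placeholder: paste the "HTTP POST URL" shown on the Logic App's Request trigger.
LOGIC_APP_URL = (
    "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>"
    "/triggers/manual/paths/invoke?api-version=2016-10-01&sig=<signature>"
)

# Optional payload the Logic App can read via triggerBody(); the shape is up to you.
payload = {"sourceFolder": "/inbox", "archiveFolder": "/archive"}

resp = requests.post(LOGIC_APP_URL, json=payload, timeout=60)
resp.raise_for_status()
print(resp.status_code)  # typically 200 or 202, depending on whether the Logic App sends a Response
```

In ADF itself you would not write code at all: configure a Web activity with that same URL and method POST, and chain it after the other activities in your pipeline.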
We recently migrated from Bitbucket to ADO, and one of our platforms requires a link that will return the raw file for its deployment process. Bitbucket has a "raw" link available when viewing source files in the web UI, but I haven't found anything like that in ADO; the closest thing is a download link, but I need a link that simply returns/displays the raw source file contents, without a download dialog box. Is this possible?
I found that using the following in the API URL got me what I needed:
_apis/sourceProviders/TfsGit
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/source-providers/get-file-contents?view=azure-devops-rest-5.0
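For reference, that is the Source Providers – Get File Contents call. A minimal sketch of hitting it with a personal access token (the organization, project, repository, branch and path below are placeholders; the query parameter names come from the linked docs, so double-check them there):

```python
import requests

# Placeholders for illustration only.
ORG, PROJECT = "my-org", "my-project"
PAT = "<personal-access-token>"

url = f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/sourceProviders/TfsGit/filecontents"
params = {
    "repository": "my-repo",
    "commitOrBranch": "main",
    "path": "deploy/install.ps1",
    "api-version": "5.0",
}

# Azure DevOps accepts a PAT as the basic-auth password; the username can be empty.
resp = requests.get(url, params=params, auth=("", PAT))
resp.raise_for_status()
print(resp.text)  # the raw file contents, with no download dialog involved
```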
I want to upload a zip file to my GitHub repository.
How can I do this via the API using PHP + cURL?
I tried to find a description in the documentation, but I could not find how to send the file.
Is there any possibility of uploading data to SharePoint Online using the Migration API without uploading the package to Azure Storage? I have referred to some blogs, and they suggest uploading the package to Azure Storage via PowerShell.
The closest thing you can use is direct migration using PowerShell; it will not use the Migration API and, of course, no Azure storage will be required.
I've created a custom solution for migration; it supports metadata and allows you to migrate file shares and local folders. Not sure if it will be helpful for you, but take a look:
https://github.com/MrDrSushi/SPOPSM
After I added S3 credentials to Filepicker, all the files 404. What's up with that? I assume this is because Filepicker is trying to get the files from S3 instead of their original location, without trying to move them?
How do I ensure that Filepicker migrates the files?
We have a fix coming out for this in the next two days; in the interim, we have a script that can migrate the files into your S3 bucket if you need them moved.