I have set up an Automation account and a graphical PowerShell workflow to back up API Management to blob storage.
I realised that Azure Automation was missing the required APIM cmdlets, so I imported them. I can now see these within my Automation assets:
Here is my graphical PowerShell workflow:
However, Backup-AzureRmApiManagement is not available:
Can anyone tell me why?
thanks
Russ
OK,
I didn't think there was a valid reason for this. I deleted my imported modules (even though Azure said they were available and had imported correctly) and re-imported them.
Now Backup-AzureRmApiManagement is available:
Just be careful when importing PowerShell modules into Azure. They may look like they imported correctly, but some things might be missing or broken.
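For anyone hitting the same problem, the delete-and-re-import can also be scripted. Here is a minimal sketch using the AzureRM.Automation cmdlets; the resource group and account names are made up for illustration:

$rg = "my-resource-group"          # hypothetical names
$acct = "my-automation-account"
$module = "AzureRM.ApiManagement"

# Remove the broken import, then re-import from the PowerShell Gallery
Remove-AzureRmAutomationModule -ResourceGroupName $rg -AutomationAccountName $acct -Name $module -Force
New-AzureRmAutomationModule -ResourceGroupName $rg -AutomationAccountName $acct -Name $module `
    -ContentLink "https://www.powershellgallery.com/api/v2/package/AzureRM.ApiManagement"

# Poll until provisioning finishes; only "Succeeded" means the cmdlets were actually extracted
do {
    Start-Sleep -Seconds 10
    $m = Get-AzureRmAutomationModule -ResourceGroupName $rg -AutomationAccountName $acct -Name $module
} while ($m.ProvisioningState -notin @("Succeeded", "Failed"))
$m.ProvisioningState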
I'm asking for an opinion or direction on a current problem.
We are using Bitbucket Pipelines to deploy CI/CD web applications to Azure. What remains is the database, which is also hosted on Azure.
From my research, everything on SQL Database Project deployments usually utilizes Azure DevOps pipelines (it connects to a GitHub repo, allows multiple environments, and has a built-in SQL agent that deploys the SQL database to the target server via a DACPAC file). It allows CI with every check-in, every time you push changes. Nice!
But what if we cannot (for some reason) use Azure DevOps and have to utilize Bitbucket Pipelines instead? Is that possible? How? Via scripting? A tool to call on the command line? Any help is highly appreciated.
It's true that it is easier to deploy an (Azure) SQL database in Azure DevOps, as Azure DevOps offers many tasks (including third-party custom tasks you can find in the Microsoft Marketplace).
However, no matter what tool you use, you should be able to do the same once you know how deployment of the specific service works.
I don't know Bitbucket very well, but I bet the product has the capability to execute commands, including PowerShell commands. If so, you need two steps in your pipeline to publish an Azure SQL database (a rough sketch follows these steps):
1) Create the server and an (empty) database. Perhaps Bitbucket offers a task for creating services in Azure (from an ARM template or another way); if not, you can always use the CLI or PowerShell to do so. More info: az cli server
2) Deploy the database or changes to it. This step always compares a DACPAC file (which is the compiled version of a SQL Server database project) to the target database on the server. The result is a (differential) T-SQL script which is executed against the target database. There is only one way to do so - sqlpackage.exe, a tool provided by Microsoft. You can find the whole documentation here and plenty of examples of how to use it on the Internet.
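To make that concrete, here is a minimal sketch of both steps in PowerShell, which a Bitbucket Pipelines script step could invoke. It assumes the Az PowerShell module and sqlpackage.exe are available on the build image; the resource group, server, and database names are made up:

# Hypothetical names; adjust for your subscription
$rg = "my-rg"
$server = "my-sql-server"
$db = "MyDatabase"
$cred = Get-Credential    # SQL admin login

# Step 1: create the server and an empty database
New-AzSqlServer -ResourceGroupName $rg -ServerName $server -Location "westeurope" `
    -SqlAdministratorCredentials $cred
New-AzSqlDatabase -ResourceGroupName $rg -ServerName $server -DatabaseName $db

# Step 2: compare the DACPAC to the target database and apply the differential script
& sqlpackage.exe /Action:Publish `
    /SourceFile:"MyDatabase.dacpac" `
    /TargetServerName:"$server.database.windows.net" `
    /TargetDatabaseName:$db `
    /TargetUser:$($cred.UserName) `
    /TargetPassword:$($cred.GetNetworkCredential().Password)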
Let me know if that helps.
I am new to Azure DevOps and hoping this is a simple fix. I have a PowerShell script that uses Tabular Editor to deploy a .bim file to Azure Analysis Services. This works great on my local machine, but I have tried to get it working in the DevOps pipelines with no luck. I haven't found a way to install the software on the hosted agent. Question 1) Can I install software on a hosted agent, e.g. on Hosted VS2017?
Failing being able to install software on Microsoft's hosted agent, I checked the TabularEditor.exe file into the source code (I know this isn't best practice). The executable gets put into the build artifact and published. Then, in the release, when my PowerShell script is called it just hangs; the script gets stuck there. The PowerShell script reads from a config file and also uses the path to the Tabular Editor executable.
The script I am using works fine on a self-hosted machine, assuming the agent has the correct permissions.
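For reference, a .bim deployment via Tabular Editor's documented command-line switches looks roughly like this (the server, database, and path names below are made up; note that passing credentials with -L avoids the interactive login prompt, which is one common reason a hosted agent appears to hang):

# Hypothetical paths and names
$tabularEditor = "$env:BUILD_SOURCESDIRECTORY\tools\TabularEditor.exe"
$bimFile = "$env:BUILD_SOURCESDIRECTORY\Model\Model.bim"
$asServer = "asazure://westeurope.asazure.windows.net/myasserver"

# -D deploys to <server> <database>; -L supplies credentials; -O allows overwrite;
# -P, -R and -M also deploy partitions, roles and role members
& $tabularEditor $bimFile -D $asServer "MyTabularModel" `
    -L $env:AS_USER $env:AS_PASSWORD -O -P -R -M

# Tabular Editor returns a non-zero exit code on failure
if ($LASTEXITCODE -ne 0) { throw "Deployment failed with exit code $LASTEXITCODE" }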
I have another Analysis Services script that is ready and works, provided someone creates an XMLA of the model first; we then provide that as an input instead of a .bim file. But this is not quite the automated route I am looking for.
Also, I am aware that there is a third-party task that does Azure Analysis Services deployment, but I want to avoid using that.
In summary, I am looking to find out:
1) if I can indeed install software on Microsoft's hosted agent
2) whether I should be able to use the executable in my build artifact instead
3) whether there is a better way to deploy Analysis Services with a .bim file
I appreciate this is long-winded and slightly unique, but any insight or information would be appreciated.
Thanks
I am doing some research on PowerApps integration with Azure DevOps.
However, there is limited information on it.
Is it possible to integrate PowerApps inside a task for Azure DevOps?
The premise: we have a .zip file with the PowerApp, and we want to create a build and release/deploy for several environments.
Thank You.
Is it possible to integrate PowerApps inside a task for Azure DevOps?
Yes it is.
You can leverage the Solution concept of the Microsoft Power Platform and the Power Apps BuildTools (preview) extension for Azure DevOps.
Update 11/2020: This is now GA and called Power Platform Build Tools
I've written a complete step-by-step guide on this topic:
A Continuous Delivery Approach for No-Code Solutions in Microsoft’s Power Platform
Bottom line:
With this build tool, you can automatically check a Solution into source control and deploy it using a continuous delivery approach with the help of Azure DevOps. See the screenshot for a sample configuration of the Export and Import Solution tasks (a rough command-line equivalent follows the list below).
It works for everything you can organize inside a Solution, e.g.:
Power Apps
Power Automate Flows
AI Builder Models
Common Data Service Entities
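As a sketch of what the Export Solution task plus a source-control check-in amounts to, assuming a recent Power Platform CLI (pac) and a made-up environment URL and solution name:

# Authenticate against the environment and export the solution (hypothetical names)
pac auth create --url "https://myenv.crm.dynamics.com"
pac solution export --name "MySolution" --path ".\MySolution.zip"

# Check the exported zip into source control
git add .\MySolution.zip
git commit -m "Automated solution export"
git push origin main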
Is it possible to integrate PowerApps inside a task for Azure DevOps?
I am afraid there is no such task to integrate PowerApps with Azure DevOps at this moment.
If you want to integrate PowerApps with Azure DevOps, you can follow this guide step by step:
Microsoft Teams – Integration with Visual Studio Team Services using PowerApps.
Besides, AFAIK, PowerApps should not be "Build/Deployed" through Azure DevOps.
When you are developing with PowerApps, there is no way to do Source Control. There are no source files. The only artifact you can version control is the .zip file that you can export.
And
In PowerApps, you don't have to build your code. Any change you make to the application is live for you to test it. In that way it is very productive. To publish the application you just click on the publish button and it is live.
Check this great blog: PowerApps From A DevOps Perspective for some more details.
Hope this helps.
Solutions are a way to package your components in a single zip file and use PowerApps Build Tools to import your solution into a different environment or tenant.
It is still an improvement over manually exporting each app or environment variable and then importing it into the target system, but it lacks what we call automation of deployment.
To give an example, I will explain what I have done, and what still constitutes a manual task:
I created an enterprise-level app using the PowerApps canvas model. My app consumes data from around 20 APIs. These API calls are implemented in Power Automate.
We have 4 environments: dev, sit, uat and prod. I can't keep importing flows into each environment and changing their API URLs to point to the deployed environment, so I used environment variables which store the API URLs for each environment. This can be done under a solution.
Under the same solution, I added my app. So now my solution has 2 things: my app and the environment variable which holds the API URLs.
I then use PowerApps Build Tools to move this solution from dev to sit.
Steps: use Build Tools tasks to perform the following (a command-line sketch of the same flow appears after the list):
Export solution
Unpack it in git
Pack it
Import the solution.
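For reference, here is a rough sketch of those four steps with the Power Platform CLI (pac), assuming a recent version that supports the solution commands; the environment URLs and solution name are made up:

$solution = "MySolution"    # hypothetical solution name

# Authenticate against the source (dev) environment and export
pac auth create --url "https://mydev.crm.dynamics.com"
pac solution export --name $solution --path ".\$solution.zip"

# Unpack into source-controlled folders, then pack again for import
pac solution unpack --zipfile ".\$solution.zip" --folder ".\src\$solution"
pac solution pack --zipfile ".\out\$solution.zip" --folder ".\src\$solution"

# Authenticate against the target (sit) environment and import
pac auth create --url "https://mysit.crm.dynamics.com"
pac solution import --path ".\out\$solution.zip"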
This successfully moves my solution to sit.
But the solution environment variable still points to the dev url.
So I have to override environment variables to store sit URLs.
This manual intervention to edit environment variable is as good as doing all the tasks manually.
This was the case when PowerApps was first announced; however, this is no longer the case.
While it is technically true that there is no actual code to be managed and deployed with a PowerApp or Flow, that doesn't mean you cannot use the power of Azure DevOps. Additionally, when creating a PowerApp / Flow you would also be creating entities and even model-driven apps - and these use solutions - which naturally deploy well within Azure DevOps.
Microsoft is building out this whole construct to enable all of these to deploy...
While the incorporation of PowerApps and Flows into solutions is not fully baked yet, they are targeting to have this ready around the October time frame this year.
We have been talking to Microsoft about also enabling PowerApps and Flows to follow the same expansion that solutions do, so that they can take advantage of the full branching strategy.
So even though you would simply be exporting zip files into your repo, you can still take advantage of the full DevOps pipeline, which is highly recommended.
Use this component; it's still in preview mode, but it is working fine on my side:
https://marketplace.visualstudio.com/items?itemName=microsoft-IsvExpTools.PowerApps-BuildTools
Is it possible to browse the file system in Azure DevOps, like when using SSH to connect to a server, or to browse it using Explorer?
It would really simplify things if I could see which files were created and where they ended up after builds.
Right now I don't feel I have any way of knowing which files ended up where after the builds are done.
Thanks!
I don't think so. You may add build steps (Build and release tasks - Utility) and create a cmd or batch file (or a PowerShell step, sketched below) to browse the file system of the build server.
As an alternative, you may use your own build server (self-hosted agent) on an Azure VM, and you will have full control.
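For the build-step approach, an inline PowerShell script along these lines dumps the agent's folder tree into the build log (these environment variables are predefined on Azure DevOps agents):

# List everything the build produced in the agent's working directory
Get-ChildItem -Path $env:AGENT_BUILDDIRECTORY -Recurse |
    Select-Object -ExpandProperty FullName

# Or just the staged artifacts
Get-ChildItem -Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY -Recurse |
    Select-Object -ExpandProperty FullName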
Is there a way to create Cosmos DB collections via Azure templates or via PowerShell?
All I have got so far are examples with the Azure CLI, but those do not fit my requirements.
I would also like to avoid REST calls, since that seems like a lot of overhead compared to a possible PowerShell solution.
Based on the information provided here, you can only perform account-related operations with PowerShell. It is not possible to manage data inside an account using PowerShell as of today.
The following table includes links to sample Azure PowerShell scripts for Azure Cosmos DB. At this time you can only manage the Azure Cosmos DB account layer via PowerShell; other resources such as databases and collections cannot be managed via PowerShell.
Also, looking at the Azure Feedback site here, it is still unplanned, but someone has started a project on GitHub for this. Do take a look at that project here: https://github.com/secabstraction/PoshDocs.
You can use the Cosmos DB PowerShell module available on the PowerShell Gallery. You can find the documentation and examples of how to use it on GitHub in the project repository.
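A minimal sketch with that community module (the account, database, key, and collection names below are placeholders):

# Install the community CosmosDB module from the PowerShell Gallery
Install-Module -Name CosmosDB -Scope CurrentUser

# Hypothetical account details
$key = ConvertTo-SecureString -String "<primary-key>" -AsPlainText -Force
$context = New-CosmosDbContext -Account "myaccount" -Database "MyDatabase" -Key $key

# Create a collection with a partition key and fixed throughput
New-CosmosDbCollection -Context $context -Id "MyCollection" `
    -PartitionKey "id" -OfferThroughput 400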
You can try calling the REST API from PowerShell to achieve your goal; see the REST API documentation here (a rough sketch follows).
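Very roughly, creating a collection that way means building the documented master-key authorization header and POSTing to the colls endpoint; the account, database, and key below are placeholders:

# Hypothetical account details; the master key is the account's primary key
$account = "myaccount"
$masterKey = "<primary-key>"
$dbName = "MyDatabase"

# Build the master-key authorization signature for "create collection"
$verb = "POST"; $resourceType = "colls"; $resourceLink = "dbs/$dbName"
$date = [DateTime]::UtcNow.ToString("r")
$hmac = New-Object System.Security.Cryptography.HMACSHA256 `
    -ArgumentList @(,[Convert]::FromBase64String($masterKey))
$payload = "$($verb.ToLower())`n$($resourceType.ToLower())`n$resourceLink`n$($date.ToLower())`n`n"
$signature = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes($payload)))
$auth = [Uri]::EscapeDataString("type=master&ver=1.0&sig=$signature")

$headers = @{
    "Authorization" = $auth
    "x-ms-date" = $date
    "x-ms-version" = "2017-02-22"
}

# Create the collection
Invoke-RestMethod -Method Post `
    -Uri "https://$account.documents.azure.com/dbs/$dbName/colls" `
    -Headers $headers `
    -Body (@{ id = "MyCollection" } | ConvertTo-Json) `
    -ContentType "application/json"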