How to suppress default outputs in serverless CloudFormation YAML? - aws-cloudformation

I'm using serverless-stack-output to save my serverless output to a file with some custom values that I set up. It works well, but serverless has some other default outputs such as these:
FunctionQualifiedArn (one for each function)
ServiceEndpoint
ServerlessDeploymentBucketName
I don't want these to show up in my file; how can I stop serverless/CloudFormation from outputting them?

This is not possible at this stage.
I've dug through the code and there's no switch to suppress these outputs.
Unfortunate, as I have the exact same requirement.
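For reference, a minimal serverless-stack-output setup that produces such a file looks roughly like this; the file path and output name below are examples, not taken from the question:

plugins:
  - serverless-stack-output

custom:
  output:
    file: .serverless/output.json   # where the stack outputs (including the defaults above) get written

resources:
  Outputs:
    MyCustomValue:                  # a custom output you define yourself
      Value: some-value

The plugin writes every stack output it finds, which is why the defaults end up in the file alongside the custom ones.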

Related

Question regarding Parameter Store and CloudFormation integration

I was trying this scenario but am unable to figure it out.
I have a name and value in Parameter Store in SSM. I am running the CF template from the CLI using CodePipeline, and I want the CF template to take the values directly from Parameter Store instead of prompting on screen for them.
I tried this, but it still prompts me:
AWS::SSM::Parameter::Value
This prompts when I upload the template on screen. How can I avoid the prompt and make the script take the value from Parameter Store directly?
You have two choices for that:
Provide a default value for the parameter (see the sketch below).
Use ParameterOverrides in your CodePipeline to provide the required value for the parameter.
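For the first choice, a minimal sketch of the template parameter; the parameter name /myapp/db/endpoint is hypothetical:

Parameters:
  DbEndpoint:
    Type: AWS::SSM::Parameter::Value<String>
    Default: /myapp/db/endpoint   # name of the SSM parameter to resolve at deploy time

With a default supplied, CloudFormation resolves the latest value from Parameter Store when the stack is created or updated, and nothing prompts for it.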

Azure DevOps - Can we reuse the value of a key in the same variable group?

I have lots of URL values and their keys, but there is no way to batch-import the variables, and the "value" controls on the Variable Groups page are not text boxes, so Chrome-extension-assisted find and replace won't work.
If this is possible, what is the syntax to refer to the key?
As in, I have a variable App.URL : www.contoso.com.
I am using the key to substitute its value in my next variable, like this: Login.URL : $(App.URL)\Login, and this doesn't work.
GitHub link : https://github.com/MicrosoftDocs/vsts-docs/issues/3902#issuecomment-489694654
This isn't currently available, and I'm not sure if it ever will be. Can you create a task early in your pipeline that sets the variables you need in subsequent tasks/steps? This gives you more control, as you can store the script along with your source. You could then use a pipeline variable for the environment you're in and let your script use that to set values appropriately.
See Set variables in scripts in the MS docs.
If it's not possible to re-architect your app to concatenate the URL strings in the application, what the previous commenter said about creating a simple script to do that for you would be the way to go, e.g.:
#!/bin/bash
# Azure Pipelines maps the pipeline variables App.URL and LoginSuffix to the
# environment variables APP_URL and LOGINSUFFIX (dots become underscores,
# letters are upper-cased).
fullLoginUrl="${APP_URL}/${LOGINSUFFIX}"
# make Login.URL available to subsequent tasks via a logging command
echo "##vso[task.setvariable variable=Login.URL]$fullLoginUrl"
Otherwise, perhaps playing around with run-time vs. compile-time variables in YAML pipelines might be worth trying (sketched below).
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
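A minimal sketch of the difference; the variable value is an example:

variables:
  App.URL: 'www.contoso.com'

steps:
- script: echo "$(App.URL)/Login"                   # macro syntax, expanded at run time
- script: echo "${{ variables['App.URL'] }}/Login"  # template expression, expanded at compile time

Template expressions are resolved when the pipeline YAML is compiled, so they only work for values known at that point; macro syntax is substituted just before a task runs.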

Dataflow: set DataflowPipelineDebugOptions

My pipeline constantly fails with OOM errors, so I read the following answer and tried to set --dumpHeapOnOOM and --saveHeapDumpsToGcsPath. But it seems these options do not work. Do I need to change my code or modify something else?
Memory profiling on Google Cloud Dataflow
You will want to check configuring-pipeline-options.
The current way in Apache Beam (2.9.0) to configure pipeline options on the command line is --<option>=<value>.
In your case, you can set --dumpHeapOnOOM=true --saveHeapDumpsToGcsPath="gs://foo"
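For instance, if you launch the pipeline with Maven, the invocation might look like this; the main class, project, and bucket are placeholders:

mvn compile exec:java \
  -Dexec.mainClass=com.example.MyPipeline \
  -Dexec.args="--runner=DataflowRunner \
    --project=my-gcp-project \
    --dumpHeapOnOOM=true \
    --saveHeapDumpsToGcsPath=gs://my-bucket/heap-dumps"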

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine on the workspace (let's call it WorkSpace1).
Now I need to move this experiment as-is to another workspace (call it WorkSpace2) using PowerShell and need to be able to run the experiment.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem :
When I copy the experiment using 'Copy-AmlExperiment' from WorkSpace1 to WorkSpace2, the experiment and all its properties get copied except the Azure Table account key.
Now, this experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from WorkSpace2 as JSON, insert the account key into the JSON file, and import (Import-AmlExperiment) it back into WorkSpace2, the experiment fails to run.
On PowerShell I get an "Internal Server Error : 500".
While running on studio.azureml.net, I get the notification as "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there anyway to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem is something to do with how the experiment handles the account key. When I enter it manually, it's converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the account key, it remains unchanged and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment in the Cortana Gallery is one of the most useful options. Only people with the link can see and add the experiment from the gallery. At the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the password is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that this is a plain-text password, not an encrypted password that you are setting. If you export the experiment in JSON format, you can easily figure this out.
I think I found why you are unable to inject the credentials back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will notice that the credentials are stored as 'Placeholders' instead of 'Literals'; hence you need to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration of changing the Placeholder to a Literal.
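A hedged PowerShell sketch of that traversal; the property names below (ModuleNodes, ParameterValues, ValueType) are assumptions based on the description above, not the documented schema, so inspect your own exported JSON first:

# Sketch only: adjust the property names to match your exported graph.
$path  = '.\experiment.json'
$graph = Get-Content $path -Raw | ConvertFrom-Json

foreach ($node in $graph.Graph.ModuleNodes) {
    foreach ($param in $node.ParameterValues) {
        if ($param.Name -eq 'Account Key') {
            $param.ValueType = 'Literal'        # was 'Placeholder'
            $param.Value     = '<storage-account-key>'
        }
    }
}

$graph | ConvertTo-Json -Depth 32 | Set-Content $path

After rewriting the file, import it back with Import-AmlExperiment as described above.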

Can I enable / Disable an Azure Service Bus Topic using Powershell

I have spent a couple of hours searching for a solution to disable my Azure Service Bus Topics using PowerShell.
The background for this is we want to force a manual failover to our other region.
Obviously I could click in the Portal, but I want to have a script to do this.
Here is my current attempt:
Any help would be great.
Assuming you're sure your $topic contains the full topic description, modify the Status property and then pass the object back using the UpdateTopic method. I'm afraid I can't test this at present.
$topic.Status = "Disabled"
$topicdesc = $NamespaceManager.UpdateTopic($topic)
I don't think you'll need to set the entity type for the Status, nor do you require semi-colons after each line of code in your loop.
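Putting it together, a sketch of the full sequence might look like this, assuming Microsoft.ServiceBus.dll is loadable locally and substituting your own namespace, key, and topic name:

Add-Type -Path '.\Microsoft.ServiceBus.dll'   # path is an assumption; point it at your SDK copy

$connectionString = 'Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=<key>'
$NamespaceManager = [Microsoft.ServiceBus.NamespaceManager]::CreateFromConnectionString($connectionString)

$topic = $NamespaceManager.GetTopic('mytopic')   # 'mytopic' is a placeholder
$topic.Status = 'Disabled'                       # PowerShell coerces the string to EntityStatus.Disabled
$topicdesc = $NamespaceManager.UpdateTopic($topic)

To fail back later, set the status to 'Active' and call UpdateTopic again.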
References
PowerShell Service Bus creation sample script (which this appears to be based on): https://blogs.msdn.microsoft.com/paolos/2014/12/02/how-to-create-service-bus-queues-topics-and-subscriptions-using-a-powershell-script/
UpdateTopic method: https://msdn.microsoft.com/en-us/library/azure/microsoft.servicebus.namespacemanager.updatetopic.aspx
Additional note: please don't screenshot the code - paste it in. I'd rather copy-and-paste than type things out.