In my Node-RED app in Bluemix, I added a user-defined environment variable.
I'm not sure how to get that variable from a Function node in my Node-RED application.
Thanks.
You will need to edit the bluemix-settings.js file to include the process built-in (or other variables) in functionGlobalContext:
functionGlobalContext: {
    process: process
}
Once redeployed, you can access process in a Function node as:
context.global.process
https://developer.ibm.com/answers/questions/170246/how-do-i-get-at-my-vcap-variables-from-node-red.html
To get a user-defined variable from Bluemix, you must use something like this (after configuring functionGlobalContext: { process: process } in bluemix-settings.js, as described above):

var myVar = context.global.process.env['USR_DEFINED_VAR'];
Changing the value works the same way:
context.global.process.env['USR_DEFINED_VAR'] = value;
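Putting it together, a minimal Function node body might look like this (USR_DEFINED_VAR is a placeholder for your variable's name):

// Read the user-defined variable exposed through functionGlobalContext
msg.payload = context.global.process.env['USR_DEFINED_VAR'];
return msg;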
I've been using Terraform for some time, but I'm new to Terraform Cloud. I have a piece of code that, when run locally, creates a .tf file under a folder I specify, but when I run it with the Terraform CLI against Terraform Cloud this doesn't happen. I'll show it here so it's clearer for everyone.
resource "genesyscloud_tf_export" "export" {
directory = "../Folder/"
resource_types = []
include_state_file = false
export_as_hcl = true
log_permission_errors = true
}
So basically, when I run this code with terraform apply locally, it creates a .tf file with everything I need. Where? It goes up one folder and stores the file under the folder "Folder".
But when I execute the same code on Terraform Cloud, this doesn't happen. Do any of you have a workaround for this kind of problem? How can I store this file, for example in a GitHub repo, when executing GitHub Actions? Thanks in advance.
The Terraform Cloud remote execution environment has an ephemeral filesystem that is discarded after a run is complete. Any files you instruct Terraform to create there during the run will therefore be lost after the run is complete.
If you want to make use of this information after the run is complete then you will need to arrange to either store it somewhere else (using additional resources that will write the data to somewhere like Amazon S3) or export the relevant information as root module output values so you can access it via Terraform Cloud's API or UI.
I'm not familiar with genesyscloud_tf_export, but from its documentation it sounds like it will create either one or two files in the given directory:
genesyscloud.tf or genesyscloud.tf.json, depending on whether you set export_as_hcl. (You did, so I assume it'll generate genesyscloud.tf.)
terraform.tfstate if you set include_state_file. (You didn't, so I assume that file isn't important in your case.)
Based on that, I think you could use the hashicorp/local provider's local_file data source to read the generated file into memory once the MyPureCloud/genesyscloud provider has created it, like this:
resource "genesyscloud_tf_export" "export" {
directory = "../Folder"
resource_types = []
include_state_file = false
export_as_hcl = true
log_permission_errors = true
}
data "local_file" "export_config" {
filename = "${genesyscloud_tf_export.export.directory}/genesyscloud.tf"
}
You can then refer to data.local_file.export_config.content to obtain the content of the file elsewhere in your module and declare that it should be written into some other location that will persist after your run is complete.
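For example, here is a minimal sketch of both options (the output name, repository name, and branch are assumptions, not something from your configuration):

# Expose the generated configuration via Terraform Cloud's UI/API
output "export_config" {
  value = data.local_file.export_config.content
}

# Hypothetical: commit the export to a GitHub repository using the
# integrations/github provider (assumes a configured github provider)
resource "github_repository_file" "export" {
  repository = "my-exports"
  branch     = "main"
  file       = "genesyscloud.tf"
  content    = data.local_file.export_config.content
}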
This genesyscloud_tf_export resource type seems unusual in that it modifies data on local disk and so its result presumably can't survive from one run to the next in Terraform Cloud. There might therefore be some problems on the next run if Terraform thinks that genesyscloud_tf_export.export.directory still exists but the files on disk don't, but hopefully the developers of this provider have accounted for that somehow in the provider logic.
I am using googleapiclient.discovery as the client to connect to GCP. Ideally, I would like to retrieve a virtual machine by its:
zone
project
name
I am having a difficult time finding code samples that do this. I am initializing the client like so:
client = googleapiclient.discovery.build('compute', 'v1')
I've exported the environment variable GOOGLE_APPLICATION_CREDENTIALS and I am able to successfully connect to GCP. However, I am unable to fetch an instance by its name. I am looking for a method like:
instance = client.compute.instances().get("project","zone","instance_name")
Any help with this would be greatly appreciated.
You just need to set up a client with discovery, like so:
# Credentials are picked up from GOOGLE_APPLICATION_CREDENTIALS when omitted
compute = discovery.build('compute', 'v1')
instance = compute.instances().get(project=project_id, zone=zone, instance=instance_name).execute()
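For completeness, a self-contained sketch (the project, zone, and instance values are placeholders):

import googleapiclient.discovery

# Uses Application Default Credentials via GOOGLE_APPLICATION_CREDENTIALS
compute = googleapiclient.discovery.build('compute', 'v1')

instance = compute.instances().get(
    project='my-project',     # placeholder: your project ID
    zone='us-central1-a',     # placeholder: the instance's zone
    instance='my-instance'    # placeholder: the instance name
).execute()

print(instance['name'], instance['status'])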
After I got this demo app working inside my account on Bluemix cloud:
https://github.com/eGlobeBizCom/food-coach,
I created another workspace inside the above Watson service instance, and I want to connect the above JS app to the second WORKSPACE_ID. Inside Bluemix, in the Runtime section, is there any way to change the WORKSPACE_ID quickly? Or do we have to change the WORKSPACE_ID manually in the manifest.yml mentioned below:
Update conversation service workspace without changing workspace ID
After many searches on the web, I have found no info that answers the above question. Any suggestions are warmly welcome.
In this case, you can:
1. Edit the workspace ID inside the app.js file at line #61, if this link is really the repository you are using.
But don't forget to replace the username and password with the Service Credentials from your Conversation service.
Make sure the workspace ID has this format if you replace <workspace_id> inside the workspace variable:
var workspace = "4235254-546563g-sfgsg-sgs-ggsfsegs" //test
var workspace = '4235254-546563g-sfgsg-sgs-ggsfsegs' //test
2. You can see the line that has the code process.env.WORKSPACE_ID. This is because the repository uses the dotenv package, so you can simply edit the .env file and replace the value of WORKSPACE_ID (see the sketch after this list);
3. You can simply add the environment variable in IBM Bluemix too! Try:
In this case, click on Runtime and you will see the "Environment Variables" section. Set the name to WORKSPACE_ID and the value to your workspace_id. Afterwards, restart your application.
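For reference, a minimal sketch of the dotenv pattern the repository follows (the fallback value is a placeholder):

// dotenv loads values from .env into process.env without overriding
// variables already set in the runtime (such as Bluemix environment variables)
require('dotenv').config();
var workspace = process.env.WORKSPACE_ID || '<workspace_id>';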
When I upload a zip file to an Azure Function App using the Kudu REST API, an error is thrown when I try to view the C# code in the Function App editor in the browser. The error is:
"Error:
Function ($Source) Error: Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.Source'. Microsoft.Azure.WebJobs.Host: Value cannot be null.
Parameter name: hostAccount.
Session Id: xxxxxxxxxxx
Timestamp: 2016-12-02T18:35:00.083Z"
Please note that I have automated Application Insights end to end, from creating a resource group through exporting the multi-step web test results to our Splunk, all using PowerShell.
As part of this automation, I build a storage connection string, set it in the Function App's app settings, and then reference that key in my function.json binding.
But still I get this error.
Here is the issue I created in the Azure Function App - Git: https://github.com/Azure/azure-webjobs-sdk-script/issues/1020
The error points to missing host configuration (e.g. the host storage account).
If you're automating the creation process, you need to make sure the required environment variables are properly set. The recommended way would be to use an ARM template to manage that for you.
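In particular, the "hostAccount" message usually means the host storage connection string is missing. As a rough sketch, the app settings your automation produces should end up including something like this (account name and key are placeholders):

"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>",
"AzureWebJobsDashboard": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"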
I have some details on how you can get the ARM template for a resource (which you could use to look at the required settings for your Function App) here.
We also have some sample templates you can use, linked here.
I hope this helps!
I am using Node-RED and want to parse the Bluemix VCAP_SERVICES, but I am getting an error. My code is:
var services = context.global.VCAP_SERVICES;
var env_cloudint = services['CloudIntegration'][0].credentials;
but I get this error:
TypeError: Cannot read property 'CloudIntegration' of undefined
I do have CloudIntegration in my VCAP_SERVICES. Do I need anything extra in my code to exploit VCAP_SERVICES?
By default, environment variables are not added to the Function global context object. To access the Bluemix VCAP_SERVICES environment variable from a Node-RED flow, you will need to add it to the Function node's global context.
Edit bluemix-settings.js and add an entry to the functionGlobalContext property:
functionGlobalContext: { VCAP_SERVICES: JSON.parse(process.env.VCAP_SERVICES) }
Then redeploy your app. Once redeployed, you can access VCAP_SERVICES in a Function node as the context.global.VCAP_SERVICES variable.
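With that in place, the code from the question should work unchanged in a Function node:

var services = context.global.VCAP_SERVICES;
var env_cloudint = services['CloudIntegration'][0].credentials;
msg.payload = env_cloudint;
return msg;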