I've updated the Google Cloud SDK (gcloud), both in general with:
gcloud components update
And just for PowerShell with:
gcloud components update powershell
Running gcloud --version shows:
Google Cloud SDK 122.0.0
beta 2016.01.12
bq 2.0.24
bq-win 2.0.24
bundled-python 2.7.10
core 2016.08.16
core-win 2016.08.05
gcloud
gsutil 4.20
gsutil-win 4.20
powershell 0.1.3
windows-ssh-tools 2016.05.13
When I open PowerShell and try Get-GcsBucket though, I see this error:
Get-GcsBucket : The term 'Get-GcsBucket' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
How can I get PowerShell to recognize the gcloud cmdlets?
The simplest fix for most people will be to uninstall and reinstall the Google Cloud SDK. This doesn't lose any configuration information, so you don't need to run gcloud init or gcloud auth afterwards.
As an alternative, you can run the command that the installer would normally run. Find your Google Cloud SDK installation directory (e.g. %AppData%\..\Local\Google\Cloud SDK or %ProgramFiles(x86)%\Google\Cloud SDK). Within that directory, find google-cloud-sdk\platform\GoogleCloudPowerShell. That folder contains a script called AppendPsModulePath.ps1. Run it, and it will modify the PSModulePath environment variable for your current user. From then on, new PowerShell windows will have the cmdlets available.
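If you want to do it by hand, a minimal sketch (assuming the per-user install location mentioned above; adjust the path if your SDK lives elsewhere):
# Run the script the installer would normally run (per-user install path assumed)
& "$env:LOCALAPPDATA\Google\Cloud SDK\google-cloud-sdk\platform\GoogleCloudPowerShell\AppendPsModulePath.ps1"
# Open a new PowerShell window, then check that the cmdlets are now discoverable
Get-Command Get-GcsBucket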
Related
I am able to configure Fastlane locally and it works well from the terminal, but when I try to run it with Jenkins (I have configured Jenkins locally on my MacBook) it fails every time (I have installed Ruby 2.5.0 again).
Any help on the same would be highly appreciated.
I am attaching a screenshot for your reference.
Jenkins runs its build scripts as a dedicated user, 'jenkins'. You might want to check whether the 'jenkins' user has installed the dependencies required to run fastlane, e.g. Ruby.
Have you set up your PATH in Jenkins? In the configuration of your node, in the environment variables section, you'll want to include /usr/local/bin/ in Jenkins's PATH by entering /usr/local/bin/:$PATH.
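For example, the environment variable entry in the node configuration would look something like this (exact field labels vary between Jenkins versions):
Name:  PATH
Value: /usr/local/bin/:$PATH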
Using node v 8.9.0 and this tutorial
When I try to debug my HTTP Google Cloud Function in DevTools:
C:_Users_Matt_AppData_Roaming_nvm_v8.9.0_node_modules_#google-cloud_functions-emulator_src_supervis
I get a filesystem permission denied error. How can I debug my cloud functions?
I also got the filesystem permission denied error, and the issue was that you need to accept the permissions prompt from Chrome to be able to access that filesystem. Initially I didn't see the permissions prompt, but then I found it on a different tab (which was kind of weird behavior). Just look for that permissions prompt; it should be right below your address bar.
I see that you are referring to a C: directory, which means that you are trying this on Windows. I will put the steps below, with documentation links, on how to properly set up the configuration. Those steps worked for me without any issues, so I suggest you follow them one by one and see if that helps.
Run Google Cloud Functions Emulator on Windows OS:
Install and set up Google Cloud SDK for Windows. Link and documentation here
Install Node.js and npm for Windows. Tutorial here
Right click on Google Cloud SDK Shell and select Run as administrator.
Execute $ node --version; you should get the version of Node.js without any additional errors
Execute $ npm --version; you should get the version of npm without any additional errors
The tutorial that you are referring to is part of Google Cloud Functions Tutorial Series
First, install and set up the functions emulator by running $ npm install -g @google-cloud/functions-emulator, as mentioned in Google Cloud Functions Tutorial : Setting up a Local Development Environment
Set up the project for the functions with $ functions config set projectId PROJECT_ID, as mentioned in the Start and Stop the Emulator documentation.
Start the emulator by executing $ functions start. Same documentation as above.
Download the source code as mentioned in the documentation you are referring to. The GitHub repository is here.
Clone the project locally. $ git clone https://github.com/rominirani/googlecloudfunctions-training.git
Navigate to the folder $ cd googlecloudfunctions-training/helloworld-http
Follow the rest of the Google Cloud Functions Tutorial : Debugging Local Functions documentation.
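Putting the commands from these steps together, a condensed sketch (PROJECT_ID is a placeholder for your own project ID):
# one-time setup: install the emulator and point it at your project
npm install -g @google-cloud/functions-emulator
functions config set projectId PROJECT_ID
# start the emulator, then fetch the tutorial sample and move into it
functions start
git clone https://github.com/rominirani/googlecloudfunctions-training.git
cd googlecloudfunctions-training/helloworld-http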
NOTE: Whenever you run / execute / call the Cloud Function, a blank Node.js window will pop up. Keep it open, as it is the executable that runs your code.
I have tested the tutorial with the setup described above and it worked for me. You have to be an administrator of your account: since the Functions Emulator and the code run locally, you need full permissions on the directories that will be used, and you should run all the software as administrator.
I am trying to automate deployment to Azure Service Fabric with Jenkins and ServiceFabric PowerShell extension. Jenkins ServiceFabric plugin is not a good option in my case due to lack of control and flexibility over deployment process.
I've faced the following issue - Jenkins can't recognize the SF PowerShell cmdlets:
Connect-ServiceFabricCluster : The term 'Connect-ServiceFabricCluster'
is not recognized as the name of a cmdlet, function, script file, or
operable program. Check the spelling of the name, or if a path was
included, verify that the path is correct and try again
The Service Fabric setup is correct, because it works like a charm when I run the script locally from PowerShell.
So, I've tried to run Jenkins locally instead of in service mode, as suggested in different posts around the internet, but this hasn't resolved the issue.
The other things I've tried:
run the script with self-elevation to admin
run x86/x64 powershell modes
run the script by calling PowerShell.exe from the cmd runner instead of the PowerShell plugin
forcing "unrestricted" mode
double-dot before script name
I'm still receiving the same result.
So I tried the Service Fabric Python CLI as an alternative, but faced another issue - it returns "Bad SSL handshake" on "sfctl cluster select" with a certificate that worked with the Service Fabric PowerShell cmdlets locally.
Any ideas?
This is similar to Azure/service-fabric-issues issue 491 which was about a mismatch between the Azure Service Fabric SDK and the Service Fabric runtime.
For instance:
The 2.7 SDK will work against a version 6.0 cluster, but the task will not work with the 2.8 SDK installed on the agent.
Plus:
Service Fabric PowerShell cmdlets require PowerShell 3.0 or higher.
Service Fabric uses Windows PowerShell scripts for creating a local development cluster and for deploying applications from Visual Studio. By default, Windows blocks these scripts from running.
To enable them, you must modify your PowerShell execution policy. Open PowerShell as an administrator and enter the following command:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted -Force -Scope CurrentUser
So: if that script is working locally, but not through a Jenkins job on a Jenkins agent, look for differences between the local execution environment (where it is working) and the Jenkins one (where it fails).
The user might not be the same and/or the runtime version might not be compatible with the SDK version.
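One quick way to spot the difference is to have the Jenkins job print the same diagnostics you would check in a local session, for example as a PowerShell build step (a sketch, not tied to any particular plugin):
# What PowerShell, which user, and which module paths does the Jenkins agent actually see?
$PSVersionTable.PSVersion                  # Service Fabric cmdlets need 3.0 or higher
whoami                                     # the account the agent runs under
$env:PSModulePath                          # must include the Service Fabric SDK module folder
Get-Module -ListAvailable ServiceFabric    # empty output means the module is not visible to this session
Comparing that output with the same commands run in your working local session usually shows whether it is a user, PSModulePath, or version mismatch.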
Do you have the Jenkins PowerShell Plugin installed on your system?
If so, can you add your commands into the PowerShell build step and see if it works? :)
To use any gcloud component, I install it on Cloud Shell just once, and I can use it each time I open Cloud Shell. But with the cbt component for Bigtable, I don't know what is happening: each time I close the browser, the cbt tool is no longer installed and I have to re-install it. The problem does not appear immediately; generally I install it once a day, it stays among the installed components for the whole day, and the day after I see it is no longer installed!
Any idea?
This problem is caused by Google terminating Cloud Shell instances when they are idle. Termination happens after about 60 minutes of non-use.
Only data stored in the $HOME directory persists after a new Cloud Shell is launched.
To install cbt the following steps are recommended:
gcloud components update
gcloud components install cbt
Since these components are not being installed in $HOME, they do not persist after Cloud Shell is terminated.
There are two methods that I recommend to solve this problem:
Google Cloud Shell is a Docker container. You can modify the Docker image to customize it to fit your needs. This method will allow you to install packages, tools, etc. that are not located in your $HOME directory.
Modify .bashrc to run a script located in the $HOME directory to install cbt each time a new instance is created.
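For the second method, a minimal sketch of what could go in $HOME/.bashrc, reusing the install command from above (it re-installs cbt once per fresh instance):
# re-install cbt on a fresh Cloud Shell instance, skipping it if already present
if ! command -v cbt >/dev/null 2>&1; then
  gcloud components install cbt --quiet
fi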
Note: It appears as of now that cbt is included in the default Cloud Shell instance. This answer should help others understand what is happening and be able to install other programs, tools, etc. persistently.
Does anyone know where I can download a Windows version of the Cloud SQL Proxy?
I see an example command line on the support page, but there's no indication of where you could get a binary from. It's not on GitHub.
Thanks
There is now a pre-compiled proxy version released; see the doc page for the download link: https://cloud.google.com/sql/docs/sql-proxy.
Note that you must run the program in a command prompt; there's a feature request to allow a web-UI configuration rather than using a command prompt.
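Once downloaded, starting it from a command prompt looks roughly like this (the instance connection name and key file path below are placeholders for your own values):
cloud_sql_proxy.exe -instances=my-project:us-central1:my-instance=tcp:3306 -credential_file=C:\keys\service-account.json
Your client then connects to 127.0.0.1:3306 as if it were the Cloud SQL instance.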
If you want to compile the code yourself from source, it is relatively straightforward:
Install Go (use the .msi installer)
Execute the following in a command prompt (requires installation of git):
go get github.com/GoogleCloudPlatform/cloudsql-proxy/cmd/cloud_sql_proxy
The proxy binary should be located in %GOPATH%\bin (you should be able to do cd %GOPATH%\bin in a command prompt and then use dir to see the cloud_sql_proxy.exe file).
It's been a while since I've used Windows for development, so let me know if there are any troubles.