I was trying to publish a Tableau worksheet using the tabcmd.exe utility but got an error:
*** 411 "Length Required"
The tabcmd log file has the following entries:
[11888] DEBUG Nov-05 05:11:42,437: ====>> Starting Tabcmd 8.0 at Tue Nov 05 05:11:42 UTC 2013 <<====
[11888] DEBUG Nov-05 05:11:42,437: Build 8000.13.1001.1930
[11888] DEBUG Nov-05 05:11:45,195: run as: tabcmd publish "tabcmd_publish.twbx" -n "tabcmd_publish" -r "Default" --db-user "username" --db-password "password" -o --no-certcheck
[11888] INFO Nov-05 05:11:45,210: Continuing previous session
[11888] INFO Nov-05 05:11:45,242: Server: https://someurl.tableauserver.com
[11888] INFO Nov-05 05:11:45,242: Username: <username>
[11888] INFO Nov-05 05:11:45,260: Connecting to server...
[11888] DEBUG Nov-05 05:11:49,690: Authenticity token: v2x4ltoI1rlcNYP875pO6ciYhWH8P3Vs7W15PMdWrWo=
[11888] DEBUG Nov-05 05:11:49,113: upload options: {}
[11888] DEBUG Nov-05 05:11:50,661: /manual/file_uploads/create.xml
[11888] DEBUG Nov-05 05:11:50,680: <!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN""http://www.w3.org/TR/html4/strict.dtd">
<HTML><HEAD><TITLE>Length Required</TITLE>
<META HTTP-EQUIV="Content-Type" Content="text/html; charset=us-ascii"></HEAD>
<BODY><h2>Length Required</h2>
<hr><p>HTTP Error 411. The request must be chunked or have a content length.</p>
</BODY></HTML>
[11888] FATAL Nov-05 05:11:50,682: 411 "Length Required"
I think the cause of this issue is that the data stream sent to the server should contain a Content-Length value, but I am not sure how to set it.
Any help on this is highly appreciated.
One of our collaborators has made some data available on AWS, and I was trying to get it into our Google Cloud bucket using gsutil (only some of the files are of use to us, so I don't want to use the GUI provided on GCS). The collaborators have provided us with the AWS bucket ID, the AWS access key ID, and the AWS secret access key.
I looked through the documentation on GCE and edited the ~/.boto file so that the access keys are incorporated. I restarted my terminal and tried to do an 'ls', but got the following error:
gsutil ls s3://cccc-ffff-03210/
AccessDeniedException: 403 AccessDenied
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied
Do I need to configure or run something else too?
Thanks!
EDITS:
Thanks for the replies!
I installed the Cloud SDK and I can access and run all gsutil commands on my Google Cloud Storage project. My problem is in trying to access (e.g. with the 'ls' command) the Amazon S3 bucket that is being shared with me.
I uncommented two lines in the ~/.boto file and put the access keys:
# To add HMAC aws credentials for "s3://" URIs, edit and uncomment the
# following two lines:
aws_access_key_id = my_access_key
aws_secret_access_key = my_secret_access_key
Output of 'gsutil version -l':
| => gsutil version -l
my_gc_id
gsutil version: 4.27
checksum: 5224e55e2df3a2d37eefde57 (OK)
boto version: 2.47.0
python version: 2.7.10 (default, Oct 23 2015, 19:19:21) [GCC 4.2.1 Compatible Apple LLVM 7.0.0 (clang-700.0.59.5)]
OS: Darwin 15.4.0
multiprocessing available: True
using cloud sdk: True
pass cloud sdk credentials to gsutil: True
config path(s): /Users/pc/.boto, /Users/pc/.config/gcloud/legacy_credentials/pc#gmail.com/.boto
gsutil path: /Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gsutil
compiled crcmod: True
installed via package manager: False
editable install: False
The output with the -DD option is:
=> gsutil -DD ls s3://my_amazon_bucket_id
multiprocessing available: True
using cloud sdk: True
pass cloud sdk credentials to gsutil: True
config path(s): /Users/pc/.boto, /Users/pc/.config/gcloud/legacy_credentials/pc#gmail.com/.boto
gsutil path: /Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gsutil
compiled crcmod: True
installed via package manager: False
editable install: False
Command being run: /Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gsutil -o GSUtil:default_project_id=my_gc_id -DD ls s3://my_amazon_bucket_id
config_file_list: ['/Users/pc/.boto', '/Users/pc/.config/gcloud/legacy_credentials/pc#gmail.com/.boto']
config: [('debug', '0'), ('working_dir', '/mnt/pyami'), ('https_validate_certificates', 'True'), ('debug', '0'), ('working_dir', '/mnt/pyami'), ('content_language', 'en'), ('default_api_version', '2'), ('default_project_id', 'my_gc_id')]
DEBUG 1103 08:42:34.664643 provider.py] Using access key found in shared credential file.
DEBUG 1103 08:42:34.664919 provider.py] Using secret key found in shared credential file.
DEBUG 1103 08:42:34.665841 connection.py] path=/
DEBUG 1103 08:42:34.665967 connection.py] auth_path=/my_amazon_bucket_id/
DEBUG 1103 08:42:34.666115 connection.py] path=/?delimiter=/
DEBUG 1103 08:42:34.666200 connection.py] auth_path=/my_amazon_bucket_id/?delimiter=/
DEBUG 1103 08:42:34.666504 connection.py] Method: GET
DEBUG 1103 08:42:34.666589 connection.py] Path: /?delimiter=/
DEBUG 1103 08:42:34.666668 connection.py] Data:
DEBUG 1103 08:42:34.666724 connection.py] Headers: {}
DEBUG 1103 08:42:34.666776 connection.py] Host: my_amazon_bucket_id.s3.amazonaws.com
DEBUG 1103 08:42:34.666831 connection.py] Port: 443
DEBUG 1103 08:42:34.666882 connection.py] Params: {}
DEBUG 1103 08:42:34.666975 connection.py] establishing HTTPS connection: host=my_amazon_bucket_id.s3.amazonaws.com, kwargs={'port': 443, 'timeout': 70}
DEBUG 1103 08:42:34.667128 connection.py] Token: None
DEBUG 1103 08:42:34.667476 auth.py] StringToSign:
GET
Fri, 03 Nov 2017 12:42:34 GMT
/my_amazon_bucket_id/
DEBUG 1103 08:42:34.667600 auth.py] Signature:
AWS RN8=
DEBUG 1103 08:42:34.667705 connection.py] Final headers: {'Date': 'Fri, 03 Nov 2017 12:42:34 GMT', 'Content-Length': '0', 'Authorization': u'AWS AK6GJQ:EFVB8F7rtGN8=', 'User-Agent': 'Boto/2.47.0 Python/2.7.10 Darwin/15.4.0 gsutil/4.27 (darwin) google-cloud-sdk/164.0.0'}
DEBUG 1103 08:42:35.179369 https_connection.py] wrapping ssl socket; CA certificate file=/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/third_party/boto/boto/cacerts/cacerts.txt
DEBUG 1103 08:42:35.247599 https_connection.py] validating server certificate: hostname=my_amazon_bucket_id.s3.amazonaws.com, certificate hosts=['*.s3.amazonaws.com', 's3.amazonaws.com']
send: u'GET /?delimiter=/ HTTP/1.1\r\nHost: my_amazon_bucket_id.s3.amazonaws.com\r\nAccept-Encoding: identity\r\nDate: Fri, 03 Nov 2017 12:42:34 GMT\r\nContent-Length: 0\r\nAuthorization: AWS AN8=\r\nUser-Agent: Boto/2.47.0 Python/2.7.10 Darwin/15.4.0 gsutil/4.27 (darwin) google-cloud-sdk/164.0.0\r\n\r\n'
reply: 'HTTP/1.1 403 Forbidden\r\n'
header: x-amz-bucket-region: us-east-1
header: x-amz-request-id: 60A164AAB3971508
header: x-amz-id-2: +iPxKzrW8MiqDkWZ0E=
header: Content-Type: application/xml
header: Transfer-Encoding: chunked
header: Date: Fri, 03 Nov 2017 12:42:34 GMT
header: Server: AmazonS3
DEBUG 1103 08:42:35.326652 connection.py] Response headers: [('date', 'Fri, 03 Nov 2017 12:42:34 GMT'), ('x-amz-id-2', '+iPxKz1dPdgDxpnWZ0E='), ('server', 'AmazonS3'), ('transfer-encoding', 'chunked'), ('x-amz-request-id', '60A164AAB3971508'), ('x-amz-bucket-region', 'us-east-1'), ('content-type', 'application/xml')]
DEBUG 1103 08:42:35.327029 bucket.py] <?xml version="1.0" encoding="UTF-8"?>
<Error><Code>AccessDenied</Code><Message>Access Denied</Message><RequestId>6097164508</RequestId><HostId>+iPxKzrWWZ0E=</HostId></Error>
DEBUG: Exception stack trace:
Traceback (most recent call last):
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/__main__.py", line 577, in _RunNamedCommandAndHandleExceptions
collect_analytics=True)
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/command_runner.py", line 317, in RunNamedCommand
return_code = command_inst.RunCommand()
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/commands/ls.py", line 548, in RunCommand
exp_dirs, exp_objs, exp_bytes = ls_helper.ExpandUrlAndPrint(storage_url)
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/ls_helper.py", line 180, in ExpandUrlAndPrint
print_initial_newline=False)
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/ls_helper.py", line 252, in _RecurseExpandUrlAndPrint
bucket_listing_fields=self.bucket_listing_fields):
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/wildcard_iterator.py", line 476, in IterAll
expand_top_level_buckets=expand_top_level_buckets):
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/wildcard_iterator.py", line 157, in __iter__
fields=bucket_listing_fields):
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/boto_translation.py", line 413, in ListObjects
self._TranslateExceptionAndRaise(e, bucket_name=bucket_name)
File "/Users/pc/Documents/programs/google-cloud-sdk/platform/gsutil/gslib/boto_translation.py", line 1471, in _TranslateExceptionAndRaise
raise translated_exception
AccessDeniedException: AccessDeniedException: 403 AccessDenied
AccessDeniedException: 403 AccessDenied
I'll assume that you are able to set up gcloud credentials using gcloud init and gcloud auth login or gcloud auth activate-service-account, and can list/write objects to GCS successfully.
From there, you need two things: a properly configured AWS IAM policy applied to the AWS user you're using, and a properly configured ~/.boto file.
AWS S3 IAM policy for bucket access
A policy like this must be applied, either by a role granted to your user or an inline policy attached to the user.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::some-s3-bucket/*",
"arn:aws:s3:::some-s3-bucket"
]
}
]
}
The important part is that you have the ListBucket and GetObject actions, and that the resource scope for these includes at least the bucket (or a prefix thereof) that you wish to read from.
.boto file configuration
Interoperation between service providers is always a bit tricky. At the time of this writing, in order to support AWS Signature V4 (the only version supported universally by all AWS regions), you have to add a couple of extra properties to your ~/.boto file beyond just the credentials, in an [s3] group.
[Credentials]
aws_access_key_id = [YOUR AKID]
aws_secret_access_key = [YOUR SECRET AK]
[s3]
use-sigv4=True
host=s3.us-east-2.amazonaws.com
The use-sigv4 property cues Boto, via gsutil, to use AWS Signature V4 for requests. Currently this requires that the host be specified in the configuration, unfortunately. It is pretty easy to figure out the host name, as it follows the pattern s3.[BUCKET REGION].amazonaws.com.
If you need rsync/cp to work across multiple S3 regions, you could handle it a few ways. You can set an environment variable like BOTO_CONFIG before running the command to switch between multiple config files. Or you can override the setting on each run using a top-level argument, like:
gsutil -o s3:host=s3.us-east-2.amazonaws.com ls s3://some-s3-bucket
Edit:
Just want to add... another cool way to do this job is rclone.
1. Generate your GCS credentials
If you download the Cloud SDK, then run gcloud init and gcloud auth login, gcloud should configure the OAuth2 credentials for the account you logged in with, allowing you to access your GCS bucket (it does this by creating a boto file that gets loaded in addition to your ~/.boto file, if it exists).
If you're using standalone gsutil, run gsutil config to generate a config file at ~/.boto.
2. Add your AWS credentials to the file ~/.boto
The [Credentials] section of your ~/.boto file should have these two lines populated and uncommented:
aws_access_key_id = IDHERE
aws_secret_access_key = KEYHERE
If you've done that:
Make sure that you didn't accidentally swap the values for the key and the ID.
Verify you're loading the correct boto file(s); you can do this by running gsutil version -l and looking for the "config path(s):" line.
If you still receive a 403, it's possible that they've given you either the wrong bucket name, or a key and ID corresponding to an account that doesn't have permission to list the contents of that bucket.
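One more way to narrow it down: if you happen to have the AWS CLI installed (a separate tool, not required by gsutil), you can try the same key pair directly against S3. That tells you whether the problem is the credentials/permissions themselves or your boto configuration. The placeholder values below are just illustrations; substitute the real ones.

```shell
# Hypothetical placeholders -- use the real key and ID you were given.
export AWS_ACCESS_KEY_ID=IDHERE
export AWS_SECRET_ACCESS_KEY=KEYHERE

# If this also fails with AccessDenied, the problem is on the AWS side
# (wrong bucket name, or the key lacks s3:ListBucket on that bucket),
# not in your .boto file.
aws s3 ls s3://cccc-ffff-03210/ || echo "listing failed -- check bucket name and IAM permissions"
```

If the AWS CLI can list the bucket but gsutil cannot, recheck the .boto config paths and the [s3] settings described above.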
I created a project marvin-xyz, and a JSON action file marvin.json.
When I try to upload it with the CLI tool from Google, gactions,
I get the following error:
$gactions test --action_package marvin.json --project marvin-xyz
Pushing the app for the Assistant for testing...
ERROR: Failed to test the app for the Assistant
ERROR: Internal error encountered.
2017/09/28 10:35:28 Server did not return HTTP 200
xyz is just a placeholder here; the project exists, and in both cases the real name has been used.
Any advice?
EDIT: here is the update with the verbose option:
$gactions --verbose test --action_package marvin.json --project marvin-xyz
Checking for updates...
Successfully fetched update metadata
Finished checking for updates -- no updates available
Pushing the app for the Assistant for testing...
POST /v2/users/me/previews/marvin-xyz:updateFromAgentDraft?updateMask=previewActionPackage.actionPackage.actions%2CpreviewActionPackage.actionPackage.conversations%2CpreviewActionPackage.actionPackage.types%2CpreviewActionPackage.startTimestamp%2CpreviewActionPackage.endTimestamp HTTP/1.1
Host: actions.googleapis.com
User-Agent: Gactions-CLI/2.0.7 (darwin; amd64; stable/XYZ)
Content-Length: 697
Content-Type: application/json
Accept-Encoding: gzip
{"name":"users/me/previews/marvin-xyz","previewActionPackage":{"actionPackage":{"actions":[{"description":"Default Welcome Intent","fulfillment":{"conversationName":"marvin"},"intent":{"name":"actions.intent.MAIN","trigger":{"queryPatterns":[{"queryPatterns":"I want to talk to marvin"},{"queryPatterns":"talk to marvin"},{"queryPatterns":"wake up marvin"},{"queryPatterns":"is marvin there?"},{"queryPatterns":"where is marvin?"},{"queryPatterns":"hello marvin"},{"queryPatterns":"pass me to marvin"},{"queryPatterns":"marvin please"}]}},"name":"MAIN"}],"conversations":{"marvin":{"name":"marvin","url":"https://example.com/sidekick/api/v1/"}}},"name":"users/me/previews/marvin-xyz"}}
Reading credentials from: creds.data
ERROR: Failed to test the app for the Assistant
ERROR: Internal error encountered.
2017/09/29 23:37:09 Server did not return HTTP 200
I am trying to register a build agent with VSTS by following the instructions here. I created a PAT in my security settings and used the following PowerShell command to register the agent:
PS>.\config.cmd --url 'https://[account].visualstudio.com/' --auth 'PAT' --token '[Token]' --pool '[Pool Name]' --agent '[Agent Name]'
I get the following information on the output:
>> Connect:
Connecting to server ...
>> Register Agent:
Scanning for tool capabilities.
Connecting to the server.
Successfully added the agent
Testing agent connection.
But it then gets stuck on this step and never completes. When I look at the agent pool in VSTS I see the new agent, but its state is 'Offline'.
I looked in the _diag folder and checked the latest log. There is an error there:
[2017-07-11 12:32:10Z WARN VisualStudioServices] Authentication failed with status code 401.
Date: Tue, 11 Jul 2017 12:32:10 GMT
P3P: CP="CAO DSP COR ADMa DEV CONo TELo CUR PSA PSD TAI IVDo OUR SAMi BUS DEM NAV STA UNI COM INT PHY ONL FIN PUR LOC CNT"
Server: Microsoft-IIS/10.0
WWW-Authenticate: Bearer authorization_uri=https://login.microsoftonline.com/b8f712c7-d223-4cfb-b165-6267fc789086, Basic realm="https://tfsprodweu2.app.visualstudio.com/", TFS-Federated
X-TFS-ProcessId: f4c5d148-0e01-488f-ab27-69c753e38911
Strict-Transport-Security: max-age=31536000; includeSubDomains
ActivityId: b85558aa-d41a-4763-a311-f2498ddb2dc0
X-TFS-Session: 8242b35d-d5e9-4242-8f52-b7c1f14c451c
X-VSS-E2EID: 6494b165-d486-4064-a5fd-92ae7f201867
X-FRAME-OPTIONS: SAMEORIGIN
X-TFS-FedAuthRealm: https://tfsprodweu2.app.visualstudio.com/
X-TFS-FedAuthIssuer: https://rr-ffes.visualstudio.com/
X-VSS-ResourceTenant: b8f712c7-d223-4cfb-b165-6267fc789086
X-TFS-SoapException: %3C%3Fxml%20version%3D%221.0%22%20encoding%3D%22utf-8%22%3F%3E%3Csoap%3AEnvelope%20xmlns%3Asoap%3D%22http%3A%2F%2Fwww.w3.org%2F2003%2F05%2Fsoap-envelope%22%3E%3Csoap%3ABody%3E%3Csoap%3AFault%3E%3Csoap%3ACode%3E%3Csoap%3AValue%3Esoap%3AReceiver%3C%2Fsoap%3AValue%3E%3Csoap%3ASubcode%3E%3Csoap%3AValue%3EUnauthorizedRequestException%3C%2Fsoap%3AValue%3E%3C%2Fsoap%3ASubcode%3E%3C%2Fsoap%3ACode%3E%3Csoap%3AReason%3E%3Csoap%3AText%20xml%3Alang%3D%22en%22%3ETF400813%3A%20Resource%20not%20available%20for%20anonymous%20access.%20Client%20authentication%20required.%3C%2Fsoap%3AText%3E%3C%2Fsoap%3AReason%3E%3C%2Fsoap%3AFault%3E%3C%2Fsoap%3ABody%3E%3C%2Fsoap%3AEnvelope%3E
X-TFS-ServiceError: TF400813%3A%20Resource%20not%20available%20for%20anonymous%20access.%20Client%20authentication%20required.
X-VSS-S2STargetService: 00000002-0000-8888-8000-000000000000/visualstudio.com
X-TFS-FedAuthRedirect: https://app.vssps.visualstudio.com/_signin?realm=rr-ffes.visualstudio.com&reply_to=https%3A%2F%2Frr-ffes.visualstudio.com%2F_apis%2FconnectionData%3FconnectOptions%3D1%26lastChangeId%3D-1%26lastChangeId64%3D-1&redirect=1&context=eyJodCI6MiwiaGlkIjoiNDEzNzE0YzItZWQ2OS00MWRkLWJmMTItNzc0ZTI1ZGEzOTdmIiwicXMiOnt9LCJyciI6IiIsInZoIjoiIiwiY3YiOiIiLCJjcyI6IiJ90#ctx=eyJTaWduSW5Db29raWVEb21haW5zIjpbImh0dHBzOi8vbG9naW4ubWljcm9zb2Z0b25saW5lLmNvbSIsImh0dHBzOi8vbG9naW4ubWljcm9zb2Z0b25saW5lLmNvbSJdfQ2
X-Powered-By: ASP.NET
X-Content-Type-Options: nosniff
I am sure I entered the PAT correctly, so what is the problem?
I found the answer. The authentication failure was a red herring. The actual problem is that the agent was waiting for user input, but PowerShell was not writing out the prompt before waiting for the answer. Here is the line in the log file:
[2017-07-11 14:03:20Z INFO Terminal] WRITE: Enter work folder (press enter for _work) >
[2017-07-11 14:03:20Z INFO Terminal] READ LINE
To get around this problem, I made sure I specified all the required arguments and used --unattended to ensure it wasn't going to ask me anything else. In the end it was like this:
PS>.\config.cmd --url $url --auth 'PAT' --token $patKey --pool $poolId --agent $serverId --work '_work' --runasservice --unattended
I just want to share our story. We had the same exception, but my log was different.
[2017-12-04 11:20:37Z WARN VisualStudioServices] Authentication failed with status code 401.
After some struggling we figured out that the application pools running TFS need administrator-level access to be able to use Certificate Management.
To fix the issue, I had to go to Administrative Tools > Services, and change the VSTS Agent service account from Network service to Local system.
The service was then able to start and work as expected.
So I've successfully installed the newest Google Cloud SDK on my Mac, and all my gcloud and gsutil command-line tools are up to date.
However, whenever I try a gsutil command, it times out. For example, when I run:
gsutil mb gs://cloud-storage-analysis
it starts to run, printing:
Creating gs://cloud-storage-analysis/...
But then it never stops. I stop it using Control-C and it prints out this:
Caught signal 2 - exiting
Traceback (most recent call last):
File "/Users/jazz/google-cloud-sdk/bin/bootstrapping/gsutil.py", line 71, in <module>
main()
File "/Users/jazz/google-cloud-sdk/bin/bootstrapping/gsutil.py", line 54, in main
'platform/gsutil', 'gsutil', *args)
File "/Users/jazz/google-cloud-sdk/bin/bootstrapping/bootstrapping.py", line 45, in ExecutePythonTool
execution_utils.ArgsForPythonTool(_FullPath(tool_dir, exec_name), *args))
File "/Users/jazz/google-cloud-sdk/bin/bootstrapping/bootstrapping.py", line 86, in _ExecuteTool
execution_utils.Exec(args + sys.argv[1:], env=_GetToolEnv())
File "/Users/jazz/google-cloud-sdk/bin/bootstrapping/../../lib/googlecloudsdk/core/util/execution_utils.py", line 146, in Exec
ret_val = p.wait()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 1357, in wait
pid, sts = _eintr_retry_call(os.waitpid, self.pid, 0)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/subprocess.py", line 478, in _eintr_retry_call
return func(*args)
KeyboardInterrupt
The one time I let it run for a long time, it actually timed out and stopped. This happens for every gsutil command I try.
My bq commands work (but they're really slow).
I don't know what's wrong.
Thanks for any help.
Edit:
So my problem hasn't entirely gone away, but gsutil does sometimes work. It is very intermittent and seemingly random when it works and when it doesn't. It seems like refreshing the shell and/or quitting and reopening Terminal helps, but not every time. I'd still like to get to the bottom of it.
So as Misha suggested, I ran gsutil -D ls as a test.
It got to here and then stopped for a while (maybe 2-3 minutes): (some info has been [removed])
gsutil version: 4.13
checksum: [key] (OK)
boto version: 2.38.0
python version: 2.7.5 (default, Mar 9 2014, 22:15:05) [GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)]
OS: Darwin 13.4.0
multiprocessing available: True
using cloud sdk: True
config path: [path-to-home]/.boto
gsutil path: [path-to-home]/google-cloud-sdk/platform/gsutil/gsutil
compiled crcmod: True
installed via package manager: False
editable install: False
Command being run: [path-to-home]/google-cloud-sdk/platform/gsutil/gsutil -o GSUtil:default_project_id=storagelogstest -D ls
config_file_list: ['[path-to-home]/.config/gcloud/legacy_credentials/[email]#gmail.com/.boto', '/.boto']
config: [('debug', '0'), ('working_dir', '/mnt/pyami'), ('https_validate_certificates', 'True'), ('debug', '0'), ('working_dir', '/mnt/pyami'), ('content_language', 'en'), ('default_api_version', '2'), ('default_project_id', 'storagelogstest')]
DEBUG 0626 10:43:29.972406 oauth2_client.py] GetAccessToken: checking cache for key 9886bbc6e7e67cf6d2b8b3a707046f2a326dcceb
DEBUG 0626 10:43:29.972712 oauth2_client.py] FileSystemTokenCache.GetToken: key=9886bbc6e7e67cf6d2b8b3a707046f2a326dcceb not present (cache_file=/var/folders/25/zs7lm5jd7dg5jljd4qdxjpnc0000gq/T/oauth2_client-tokencache.503.9886bbc6e7e67cf6d2b8b3a707046f2a326dcceb)
DEBUG 0626 10:43:29.972882 oauth2_client.py] GetAccessToken: token from cache: None
DEBUG 0626 10:43:29.973030 oauth2_client.py] GetAccessToken: fetching fresh access token...
INFO 0626 10:43:29.973551 client.py] Refreshing access_token
Then it output this:
connect fail: (accounts.google.com, 443)
connect: (accounts.google.com, 443)
send: 'POST /o/oauth2/token HTTP/1.1\r\nHost: accounts.google.com\r\nContent-Length: 195\r\ncontent-type: application/x-www-form-urlencoded\r\naccept-encoding: gzip, deflate\r\nuser-agent: Python-httplib2/0.7.7 (gzip)\r\n\r\nclient_secret=ZmssLNjJy2998hD4CTg2ejr2&grant_type=refresh_token&refresh_token=1%2FUl4EXn6N5jPCjFVy6-U5HwIKNApkGmYEEQPZO654NxHBactUREZofsF9C7PrpE-j&client_id=32555940559.apps.googleusercontent.com'
reply: 'HTTP/1.1 200 OK\r\n'
header: Content-Type: application/json; charset=utf-8
header: Cache-Control: no-cache, no-store, max-age=0, must-revalidate
header: Pragma: no-cache
header: Expires: Fri, 01 Jan 1990 00:00:00 GMT
header: Date: Fri, 26 Jun 2015 17:44:45 GMT
header: Content-Disposition: attachment; filename="json.txt"; filename*=UTF-8''json.txt
header: Content-Encoding: gzip
header: P3P: CP="This is not a P3P policy! See http://www.google.com/support/accounts/bin/answer.py?hl=en&answer=151657 for more info."
header: X-Content-Type-Options: nosniff
header: X-Frame-Options: SAMEORIGIN
header: X-XSS-Protection: 1; mode=block
header: Server: GSE
header: Set-Cookie: NID=68=PfDga1cpnMr8ho-0tlBrWNhgLzQsThRzV31vn8QD1cV45H8C4-ydGoMI0ITI0lPJHvKhN_uPSisTQwIzM2LEFKqjXZlgsJ-9l0HiflLdl1UGMevAQ2GFxFqa369vQxZG;Domain=.google.com;Path=/;Expires=Sat, 26-Dec-2015 17:44:45 GMT;HttpOnly
header: Alternate-Protocol: 443:quic,p=1
header: Transfer-Encoding: chunked
DEBUG 0626 10:44:45.808030 oauth2_client.py] GetAccessToken: fresh access token: AccessToken(token=ya29.ngEY7SR5AbdDZ2pWMtBJzAnJGYGUgXB6hKcAwE8I274ieyLmEpuD1WypFJ8jAZN9LS5zCZ3ldGL4MA, expiry=2015-06-26 18:44:45.806928Z)
DEBUG 0626 10:44:45.808321 oauth2_client.py] FileSystemTokenCache.PutToken: key=9886bbc6e7e67cf6d2b8b3a707046f2a326dcceb, cache_file=/var/folders/25/zs7lm5jd7dg5jljd4qdxjpnc0000gq/T/oauth2_client-tokencache.503.9886bbc6e7e67cf6d2b8b3a707046f2a326dcceb
INFO 0626 10:44:45.813867 base_api.py] Calling method storage.buckets.list with StorageBucketsListRequest: <StorageBucketsListRequest
maxResults: 1000
project: 'storagelogstest'
projection: ProjectionValueValuesEnum(full, 0)>
INFO 0626 10:44:45.814872 base_api.py] Making http GET to https://www.googleapis.com/storage/v1/b?project=storagelogstest&fields=nextPageToken%2Citems%2Fid&alt=json&projection=full&maxResults=1000
INFO 0626 10:44:45.815298 base_api.py] Headers: {'accept': 'application/json',
'accept-encoding': 'gzip, deflate',
'content-length': '0',
'user-agent': 'apitools gsutil/4.13 (darwin) Cloud SDK Command Line Tool 0.9.66'}
INFO 0626 10:44:45.815390 base_api.py] Body: (none)
DEBUG 0626 10:45:45.846443 http_wrapper.py] Caught socket error, retrying: timed out
It tried to reconnect a few times but it timed out each time and then I stopped it.
I have not tried this on another machine, but I have talked to a coworker who has the same problem (frequently it won't work but some times it does).
So, if you have this problem, most likely you'll be able to get gsutil to work if you just keep trying. Try restarting the shell (exec -l $SHELL) and quitting/reopening the command line, and keep trying; it eventually worked for me. This is not a permanent fix, and it still times out about two thirds of the time for me, but you'll at least be able to run your commands.
Hopefully Google can address this problem.
I have checked similar questions asked, but none seem to match the circumstances of this one.
This page is returning a 404 error in Facebook's Object Debugger tool. Other pages on the site work okay, so it shouldn't be an issue of missing meta tags.
Some of the page content is hidden, but only some; the majority of the page content is available, so surely this shouldn't be causing the issue. If it is, that would have to be regarded as a bug, no?
Does anyone have any idea what the issue might be and/or how to fix it?
The error message is accurate: your URL is returning an error when the Facebook crawler attempts to get the metadata.
You'll need to check your server settings or the code which renders that URL to see why it's doing so. Here's the output when I made the same request Facebook makes, from my own laptop:
$ curl -A "facebookexternalhit/1.1" -i 'http://austparents.edu.au/webinars/parent-webinar-on-the-australian-curriculum-with-rob-randall-ceo-acara/'
HTTP/1.1 403 Forbidden
Date: Tue, 23 Sep 2014 00:03:36 GMT
Server: Apache/2.2.14 (Ubuntu)
Vary: Accept-Encoding
Content-Length: 366
Content-Type: text/html; charset=iso-8859-1
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access /webinars/parent-webinar-on-the-australian-curriculum-with-rob-randall-ceo-acara/
on this server.</p>
<hr>
<address>Apache/2.2.14 (Ubuntu) Server at austparents.edu.au Port 80</address>
</body></html>