403 Access Denied error occurs when I click a URL. I want help with this issue - access-denied

How do I get images from a 403 Access Denied page?
For example: http://hhsrv.n.dk/chat/gfx/items/%7BACF45E42-9B9E-426F-89A6-EC5AA54C8802%7D.gif
I need all images from: http://hhsrv.n.dk/chat/gfx/items/

A 403 generally occurs in the case of a permission problem. If you want the images to be publicly accessible, give them 755 permissions:
chmod -R 755 gfx/items/
Alternatively, if the problem persists, check your Apache configuration for the proper permissions and settings for this endpoint.
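As a quick sanity check, the effect of the recursive chmod can be demonstrated locally; the directory below is a hypothetical stand-in for the real web root:

```shell
# Recreate a gfx/items-style layout in a scratch directory (hypothetical paths)
mkdir -p /tmp/webdemo/gfx/items
touch /tmp/webdemo/gfx/items/sample.gif
# 755 = owner rwx, group/world r-x; directories need the execute bit
# so the web server can traverse them
chmod -R 755 /tmp/webdemo/gfx
# Both should now report mode 755
stat -c '%a' /tmp/webdemo/gfx/items
stat -c '%a' /tmp/webdemo/gfx/items/sample.gif
```

Note that 755 also sets the execute bit on plain .gif files, which is harmless but unnecessary; 644 on files and 755 on directories is the more conventional split.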

Related

GCloud gcsfuse permission denied

Anyone successfully using gcsfuse?
I've tried removing all default permissions from the bucket and setting up a service account:
gcloud auth activate-service-account (to activate serviceaccname)
And then running:
gcsfuse --debug_gcs --foreground cloudbuckethere /backup
gcs: Req 0x0: -> ListObjects() (307.880239ms): googleapi: Error 403: xxxxxx-compute@developer.gserviceaccount.com does not have storage.objects.list access
It's weird that it's complaining about the user xxxxx-compute, which is not my activated service account:
gcloud auth list
Does show my current service account is active...
I've also granted Admin Owner, Admin Object Owner, Write Object, and Read Object on the bucket to my serviceaccname.
If I grant xxxxx-compute all the permissions on my bucket, including legacy permissions, listing seems to work, but writing any file to the directory fails with:
googleapi: Error 403: Insufficient Permission, insufficientPermissions
Anyone have any luck?
I found a solution; I'm not sure it's a good one, but it works.
Set up a service account and download its JSON key file.
Grant the service account access to the bucket as a bucket admin.
Then set an environment variable pointing to the path of the service account JSON file:
GOOGLE_APPLICATION_CREDENTIALS=/path-to-json/gcloud.json gcsfuse --debug_gcs --foreground bucketname /path-to-mount
Also take note that it may use a large amount of space in the temp directory by default. Adding the flag:
... --temp-dir=/someotherpath
will really help if you have limited space in /tmp.
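Putting the steps above together, a minimal sketch (the key path, bucket name, and mount point are the placeholders from the answer; the gcsfuse call itself is commented out since it needs a real bucket):

```shell
# Point Application Default Credentials at the downloaded service-account key
export GOOGLE_APPLICATION_CREDENTIALS=/path-to-json/gcloud.json
# Mount with an alternate temp dir so large transfers don't fill /tmp:
#   gcsfuse --debug_gcs --foreground --temp-dir=/someotherpath bucketname /path-to-mount
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```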

Access Denied Exception message when pulling down files from google bucket

I have been running a batch file to pull files from a Google bucket; it was created by someone else and had been working in the past. However, now I'm getting an error message stating:
"ACCESS DENIED EXCEPTION: 403 tim@gmail.com does not have storage.objects.list access to dcct_-dcm_account870"
What can I do to resolve it?
I just found the solution to this issue.
I noticed that ****@gmail.com had left the company, and I had to reconfigure gsutil to give access to myself, using the previous answer at the link below:
gsutil cors set command returns 403 AccessDeniedException

AccessDeniedException: 403 Forbidden

gsutil -m acl -r set public-read gs://my_bucket/
command gives an AccessDeniedException: 403 Forbidden error even though I have provided full access to my email ID as owner of my_bucket. I am using the Blobstore API to upload the file in my project. How do I solve this problem?
You probably need to set up Cloud API access for your virtual machine. Currently it needs to be set during the VM creation process, by enabling:
Allow full access to all Cloud APIs
To provide access for a VM where you haven't chosen the above setting, you need to recreate the instance with full access, but there is a pending improvement:
Google Cloud Platform Ability to change API access scopes
Once that is done, we will be able to change the setting after shutting down the instance.
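For reference, a hedged sketch of recreating the instance with the full-access scope; the instance name and zone are hypothetical, and the command is assembled rather than run since it acts on a live project:

```shell
# cloud-platform is the scope behind "Allow full access to all Cloud APIs"
SCOPE="https://www.googleapis.com/auth/cloud-platform"
# Instance name and zone below are hypothetical
CMD="gcloud compute instances create my-instance --zone=us-central1-a --scopes=$SCOPE"
echo "$CMD"
```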

Total Import permission denied

When I try to access my import tool for products (Tools -> Total Import Pro), a message appears:
You do not have permission to access this page, please refer to your system administrator
I tried going to System -> Users -> User Groups -> Administrator -> Modify Permission,
but I can't find the permission for Total Import.
Make sure you search for the module under both Access Permission: and Modify Permission:. Normally the module name will start with module/your_module_name.
If it still doesn't appear, try uploading the module again (some files may be missing) or contact the developer.
I had the same issue, Daniel.
Double-check that all the files associated with the Export/Import tool are installed correctly. If the controller file is missing, the system won't show it in the permissions list.
If the files were not uploaded, you can upload them manually via FTP. I've done that and it worked for me.
After you've done that, make sure you give the admin permission to use the tool: go to System > User Group > Admin and grant access and modify permission to tool/export_import
Go to System > Users > User Groups.
Edit the Administrator group, and make sure you are logged in as an administrator.
There are two boxes:
- Access Permission
- Modify Permission
You need to tick the 'tool/export_import' checkbox in both "Access Permission" and "Modify Permission". In your case it is Modify Permission.

how to grant read permission on google cloud storage to another service account

Our team creates some data on Google Cloud Storage so another team can copy/download/read it from there, but when they tried, they always got a 403 Forbidden message. I tried to edit the permissions on that bucket and added a new permission as 'Project', 'viewers-(other team's project id)', and 'Reader', but they still got the same error when they ran this command:
gsutil cp -R gs://our-bucket gs://their-bucket
I also tried with their client ID and email account; still the same.
I'm not sure one can define another project's collection of users with a given access right (readers, in this case) and apply it to an object in a different project.
An alternative would be to control bucket access via Google Groups: simply set up a group for readers, adding the users you wish to grant this right to. Then you can use that group to control access to the bucket and/or its contents. Further information, and a use-case scenario, here: https://cloud.google.com/storage/docs/collaboration#group
try:
gsutil acl ch -u serviceaccount@google.com:R gs://your-bucket
This ch (change) updates the permission on your-bucket for u (user) serviceaccount@google.com to R (Reader).
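For the original cross-project question, the same acl ch subcommand also accepts project roles and Google Groups; the project number and group address below are hypothetical, and the commands are assembled rather than executed since they need a live bucket:

```shell
# -p grants to a project role: all viewers of project 123456789 get Reader
PROJ_CMD="gsutil acl ch -p viewers-123456789:R gs://our-bucket"
# -g grants to a Google Group
GROUP_CMD="gsutil acl ch -g gs-readers@example.com:R gs://our-bucket"
printf '%s\n%s\n' "$PROJ_CMD" "$GROUP_CMD"
```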