Problem:
I need to be able to remove all link decoration from the download URL that is generated for images in Firebase Storage.
However, when all link decoration is stripped away, the resulting link returns a JSON document of the image's metadata instead of the image itself.
Context:
The flow goes as follows:
An image is uploaded to Firebase from an iOS app. Once that is done, the download URL is sent in a POST request to an external server.
The server that the URL is being sent to doesn't accept link decoration when submitting image URLs.
Goal:
Alter the Firebase Storage download URL so that it is stripped of all link decoration, like so:
https://firebasestorage.googleapis.com/v0/b/example.appspot.com/o/[FOLDER_NAME]%[IMAGE_NAME].jpg
Notes:
The problem is really twofold. First, the link needs to be manipulated to remove all link decoration. Then the behavior of the link needs to be changed, since in order to return an image you need ?alt=media following the file extension, in this case .jpg. Currently, without link decoration, using the link with my desired structure returns a JSON document of the metadata.
The current link structure is as follows:
https://firebasestorage.googleapis.com/v0/b/example.appspot.com/o/[FOLDER_NAME]%[IMAGE_NAME].jpg?alt=media&token=[TOKEN]
Desired link structure:
https://firebasestorage.googleapis.com/v0/b/example.appspot.com/o/[FOLDER_NAME]%[IMAGE_NAME].jpg
The token is necessary for accessing the image depending on the security rules in place, but it can be ignored with the proper read permissions. I can adjust the rules as needed, but I still need to be able to remove the ?alt=media and still return an image.
Building on Frank's answer: if you go into the associated Google Cloud Platform project, find the bucket in the Storage tab, and make that bucket public, you will be able to get the image in the format you want. That is, you will not be accessing it through Firebase at
https://firebasestorage.googleapis.com/v0/b/example.appspot.com/o/[FOLDER_NAME]%[IMAGE_NAME].jpg
but through Google Cloud Storage, with a link like
https://storage.googleapis.com/[bucket_name]/[path_to_image]
Once in your GCP project console, access the Storage bucket with the same name as the one in your Firebase project; they are the same bucket. Then make the bucket public by following these steps. After that, you will be able to construct your links as mentioned above, and they will be accessible with no token and no alt=media param. If you do not want to make the bucket public to everyone, you can adjust the permissions there as you wish.
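If it helps, here is a rough sketch of that rewrite (C# top-level statements, used purely for illustration since the same string handling works in Swift; the bucket and object names are the placeholders from the question, and %2F is how Firebase percent-encodes the folder separator in the download URL):

using System;

// Firebase-style download URL: the object path after /o/ is percent-encoded.
var downloadUrl = "https://firebasestorage.googleapis.com/v0/b/example.appspot.com/o/FOLDER_NAME%2FIMAGE_NAME.jpg?alt=media&token=TOKEN";

// Drop the query string, then pull out the bucket (after /b/) and the object path (after /o/).
var withoutQuery = downloadUrl.Split('?')[0];
var bucket = withoutQuery.Split(new[] { "/b/", "/o/" }, StringSplitOptions.None)[1];
var encodedObject = withoutQuery.Split(new[] { "/o/" }, StringSplitOptions.None)[1];

// Decode the percent-encoded folder separator and build the public Cloud Storage URL.
var publicUrl = $"https://storage.googleapis.com/{bucket}/{Uri.UnescapeDataString(encodedObject)}";
Console.WriteLine(publicUrl); // https://storage.googleapis.com/example.appspot.com/FOLDER_NAME/IMAGE_NAME.jpg

Remember this URL only works once the bucket (or object) is readable without authentication, as described above.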
You could split the URL string into two halves using components(separatedBy:):
Storage.storage().reference().child(filePath).downloadURL { url, error in
    guard let urlString = url?.absoluteString else { return }
    let urlStringWithoutQueryString = urlString.components(separatedBy: "?").first!
}
Calling .downloadURL on a StorageReference returns that URL, but this approach can be used to remove the query string from any URL. components(separatedBy:) breaks a String into an array of Strings, splitting it at each occurrence of the given separator, in this case ?.
NOTE this method assumes that ? occurs only once within the url string, which I believe is the case for all Firebase Storage urls.
You should treat the download URL that you get back from Firebase as an opaque string. There's no way to strip the parameters from a download URL without breaking that download URL.
If you want to allow public access to the files in your bucket with simpler URLs, consider making the objects in your bucket (or even the entire bucket) public.
Related
Dynamic links work great for 98% of our users. However, there is still a group of users who have difficulty with them or do not know how to use them.
I want to add a feature which would let users paste their link into the app, and then we extract the data from the link and handle it normally. This will also serve as a backup for when the links are down or misbehaving. It will also allow our customer service team to get data from a link when customers share them with us.
The problem is, there doesn't seem to be a way to manually pass in a dynamic link to retrieve the dynamic data.
Does anyone know how this can be achieved?
Here is my attempt at your question.
I am assuming that by the dynamic data you mean the underlying deep link along with the parameters associated with it.
Future<void> dynamicLinkToDeepLink(String dynamicLinkString) async {
  final details = await FirebaseDynamicLinks.instance
      .getDynamicLink(Uri.parse(dynamicLinkString));
  // Your deep link (and its parameters) can be accessed as follows:
  final deepLink = details!.link;
}
You have to safeguard the above code as you see fit when you use it: wrap the FirebaseDynamicLinks.instance.getDynamicLink call in a try/catch block, and check that the returned value is not null before accessing details!.link.
I need to create a SAS so I can create an Azure SQL Extended Event session. The event session needs a file data storage target via SAS and I can't create one that works. Here's what I've tried:
Identified a storage account that's not blob; just general. I'm pretty sure I need general so I can create files directly.
Created a file share therein.
Using azure storage explorer, right clicked on that file share and selected, "Get Shared Access Signature."
Checked Read, Write, List and created.
This gives me the URL https://mystorageacct.file.core.windows.net/xevents?st=2018-12-25T16%3A29%3A51Z&se=2018-12-29T16%3A29%3A00Z&sp=rwl&sv=2018-03-28&sr=s&sig=mysig
If I just try to follow this URL or create a CloudFile object with it in code, I get the oft-seen error, Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. Signature did not match. String to sign used was rwl 2018-12-25T16:29:51Z 2018-12-29T16:29:00Z /file/cs7f0fbc5104d4ax435dx883/$root 2018-03-28
Tried adding in comp=list&restype=container as suggested here. No joy.
Ensured I have no access policy in use.
Went to the azure portal and created a different SAS at the storage account level (couldn't see a way to create it on the file share). That gave me this "File service SAS URL": https://mystorageacct.file.core.windows.net/?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-12-30T01:25:16Z&st=2018-12-26T17:25:16Z&spr=https&sig=mysig
If I try that URL I get Value for one of the query parameters specified in the request URI is invalid. I don't know which parameter is in question; they look fine to me, but I don't know what the value srt=sco indicates. Based on this doc, srt is the resource type, but I don't know what the value sco means.
Very confused, looking for suggestions.
For any future readers: extended event sessions confusingly (because they write a file) require blob containers, not file shares or queues. At least I could only get them to work that way.
You are probably confused by how the SAS URLs are presented. In fact, the SAS URLs you got just show how to use the SAS token; they can't be used directly as-is, hence the errors you saw.
Service-level SAS URL, i.e. the one you got from Storage Explorer.
It's in the format of fileEndPoint/fileShareName?SASToken. The SASToken gives us permission to operate on all files inside the specified file share. To leverage the token, we need to add the fileName to the URL, i.e. fileEndPoint/fileShareName/fileName?SASToken.
comp=list&restype=container is for listing blobs in a blob container; it does not apply to a file share.
Account-level SAS URL, i.e. the one you got from the Azure portal.
It's in the format of fileEndPoint/?SASToken. Likewise, we need to complete the URL to make it valid, i.e. fileEndPoint/fileShareName/fileName?SASToken. Note that this SASToken has all permissions on all Storage resources because all the choices were checked.
sco means we have permission to operate on the service, container, and object, i.e. it indicates the scope of the permission; check the doc for details.
I am not familiar with Azure SQL Extended Event sessions, but if you only need to work with files inside one file share, the first option is enough.
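For completeness, here is a minimal sketch of that first option in C#, assuming the Microsoft.Azure.Storage.File package (the older WindowsAzure.Storage exposes the same CloudFile type); the account, share, file name, and token are just the placeholders from the question:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.File;

class Program
{
    static async Task Main()
    {
        // Share-level SAS token copied from Storage Explorer (placeholder values from the question).
        var sasToken = "?st=2018-12-25T16%3A29%3A51Z&se=2018-12-29T16%3A29%3A00Z&sp=rwl&sv=2018-03-28&sr=s&sig=mysig";

        // Insert the file name between the share name and the token to get a usable per-file URL.
        var fileUrl = "https://mystorageacct.file.core.windows.net/xevents/session.xel" + sasToken;

        // The URI already carries the credentials, so no separate StorageCredentials object is needed.
        var file = new CloudFile(new Uri(fileUrl));
        Console.WriteLine(await file.ExistsAsync());
    }
}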
I'm working on an ASP.NET Core 2 web API for files hosted in Google Cloud Storage. The files hosted there are not public, so I can't use the MediaLink property of the object. I tried to make a download endpoint using MemoryStream, but when there are many users downloading large files at once I run into memory issues.
My question is: is there a way to create something like a one-time download link for a file, or something similar?
I'm also trying to implement what's described in this link but I'd need to give the bearer token to the user. I can't do that.
Any tips?
Yes. Google Cloud Storage offers a feature called "signed URLs" that is what you described: a URL that is only good for a short while to download a single file. The idea is that you craft a download URL, then use the private key of a service account to "sign" the URL. Anyone holding that final URL can use it to act as that service account for the purpose of downloading that one object.
Take a look: https://cloud.google.com/storage/docs/access-control/#Signed-URLs
Writing code to generate the signed URL is a bit tricky, but the client libraries provide helper methods in several languages to do it for you. You can also generate one with the gsutil command: gsutil signurl -d 10m privatekey.p12 gs://bucket/foo
There is a code sample for generating the signed URLs programmatically on their GitHub project: Signed URLs
I managed to create it using C#. I'm posting it here because it may be useful to someone else:
1 - Create your private key
2 - Declare a UrlSigner field:
private readonly UrlSigner _urlSigner;
3 - In your class constructor:
using (var stream = File.OpenRead(_googleSettings.StorageAuthJson))
{
    _urlSigner = UrlSigner.FromServiceAccountData(stream);
}
_googleSettings.StorageAuthJson holds the physical path of the JSON file you downloaded when creating your key.
4 - Method to get the URL:
public string GetSignedUrl(string bucketName, string objectName, TimeSpan duration)
{
    var url = _urlSigner.Sign(bucketName, objectName, duration, null);
    return url;
}
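As a usage note, here is a hedged sketch of how that method can sit in a controller so the API never buffers the file in memory; the controller, route, bucket name, and key path below are made up for illustration:

using System;
using Google.Cloud.Storage.V1;
using Microsoft.AspNetCore.Mvc;

[Route("api/files")]
public class FilesController : ControllerBase
{
    private readonly UrlSigner _urlSigner;

    public FilesController()
    {
        // Placeholder path to the service account JSON downloaded when creating the key.
        using (var stream = System.IO.File.OpenRead("path/to/service-account.json"))
        {
            _urlSigner = UrlSigner.FromServiceAccountData(stream);
        }
    }

    [HttpGet("{objectName}")]
    public IActionResult Download(string objectName)
    {
        // Redirect the client straight to the short-lived signed URL instead of
        // streaming the object through a MemoryStream on the API server.
        var url = _urlSigner.Sign("my-bucket", objectName, TimeSpan.FromMinutes(10), null);
        return Redirect(url);
    }
}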
I've been trying to get file uploads to work, following the instructions for both Dropbox and S3, but each time I just get this message:
File Upload URL not provided
It doesn't seem to be making any calls to the server. I've found this mention of a bug around file uploads:
https://github.com/formio/ngFormio/issues/322
But I suspect that applies if you're hosting it yourself. I'm using the cloud version.
I've configured it with e.g. the S3 bucket's URL, authentication etc.
What does this error actually mean?
Update: here's the syntax I'm using:
<formio form="https://formview.io/#/xxxxxxxxxxxxxxxxxxx/applicationform" url="'https://formview.io/#/xxxxxxxxxxxxxxxxxxx/applicationform'"></formio>
Thanks
In order to make the uploads work, you need to provide the URL of your form, which is used to generate the upload token to upload the files to the 3rd party providers. This can be done in one of two ways.
<formio src="'https://examples.form.io/example'"></formio>
You would use the above if you wish to render the form from the JSON REST API of the form. In many cases, you may wish to provide the actual form object (which I suspect is what you are doing), like so.
<formio form="{...}"></formio>
This works fine for rendering the form, but it does not provide the URL context for file uploads. For this reason, we have the url parameter which you can include along with your form object for file uploads to work.
<formio form="{...}" url="'https://examples.form.io/example'"></formio>
Providing the url this way is passive. The form will not try to submit to that url, but rather just use it as the url configuration for file uploads.
I'm trying to integrate my Rails app with SugarCRM. Is it possible to fetch the Contact picture from SugarCRM using the REST API? Please let me know.
To get the profile image for a user do the following:
Call the login method through REST
Call the get_entry_list method through REST, with the following parameters:
Module: Users
Query: users.user_name = 'xxxx'
Select_fields: picture
The response contains the filename for the profile image, which is stored in /uploads.
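For illustration, here is a rough sketch of that get_entry_list call against the v4_1 REST endpoint (the CRM URL and session id are placeholders, the session id coming from the login step above; C# is used only as an example, and the same form-encoded POST can be made from a Rails app with any HTTP client):

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // Placeholder CRM URL; the session id comes from the earlier login call.
        var restUrl = "https://crm.example.com/service/v4_1/rest.php";

        var restData = JsonSerializer.Serialize(new
        {
            session = "SESSION_ID_FROM_LOGIN",
            module_name = "Users",
            query = "users.user_name = 'xxxx'",
            order_by = "",
            offset = 0,
            select_fields = new[] { "picture" },
            max_results = 1
        });

        using var http = new HttpClient();
        var response = await http.PostAsync(restUrl, new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["method"] = "get_entry_list",
            ["input_type"] = "JSON",
            ["response_type"] = "JSON",
            ["rest_data"] = restData
        }));

        // The "picture" field in the returned entry is the file name mentioned above.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}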
However, it is not possible to view the image in that folder due to .htaccess restrictions for security reasons, but other options exist:
Extend the REST API with a method to serve profile images (similar to get_document_revision)
Login on the server from your rails app and get the image
Create a simple entrypoint+module in SugarCRM, which can show the picture
Remove the .htaccess restriction for images (if it doesn't create a security risk in your setup)
In such a scenario, I faced a problem where the upload folder stores the file under the name of the record id, i.e. the GUID, without a file extension.
To cope with this, I wrote a function that copies the file to the same location but with its extension appended.
Example:
A PNG file in the upload folder named, say, '32sdft-tg35f-Tuhis-675rtyf-77666-46dgc' will end up as '32sdft-tg35f-Tuhis-675rtyf-77666-46dgc.png'.
Now only the path is required to render the image.
Everything else applies as suggested by our friend, Kare!