Is there a way to change the google storage signed url to not include the name of the file? - google-cloud-storage

I have a method that gets a signed URL for a blob in a Google Cloud Storage bucket and then returns it to users. Ideally, I could also change the name of the file shown. Is this possible?
An example is:
https://storage.googleapis.com/<bucket>/<path to file.mp4>?Expires=1580050133&GoogleAccessId=<access-id>&Signature=<signature>
The part that I'd like to set myself is <path to file.mp4>.

The only way I can think of is having something in the middle that is responsible for the name "swap".
For example, Google App Engine with an HTTP trigger, or a Cloud Function with a storage trigger, that fetches the object whenever you need it, renames it, and either serves it to the user directly or stores it under the new name in another bucket.
Keep in mind that anything you want to store temporarily in GAE or Cloud Functions needs to go in the "/tmp" directory.
Then for renaming, if you are using GAE, you could use something like:
import os
# YOUR_SHELL_COMMAND is a placeholder string, e.g. "mv /tmp/old.mp4 /tmp/new.mp4"
os.system(YOUR_SHELL_COMMAND)
However, the easiest (though more costly) approach is to set up a Cloud Function with a storage trigger so that whenever an object is uploaded, it stores a copy of it under the desired new name in a different bucket that you then use for the users.
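A minimal sketch of that approach, assuming the google-cloud-storage Python client; the destination bucket and the new file name are placeholders:

from datetime import timedelta
from google.cloud import storage

client = storage.Client()

def on_upload(event, context):
    # Background Cloud Function with a storage trigger: copy each finalized
    # object into a second bucket under the name users should see.
    src_bucket = client.bucket(event["bucket"])
    src_blob = src_bucket.blob(event["name"])
    dst_bucket = client.bucket("bucket-for-users")  # assumed destination bucket
    new_blob = src_bucket.copy_blob(src_blob, dst_bucket, "video-for-user.mp4")
    # A signed URL for the copy now carries the new object name.
    return new_blob.generate_signed_url(expiration=timedelta(hours=1))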

Related

How to set a Blob Storage property before a web activity drops a file into Blob Storage?

I have created a pipeline in Azure Data Factory in which I have one web activity that copies an Excel (xlsx) file from the Dropbox App console and another web activity that copies the file into Blob Storage. The pipeline executes successfully and copies the file in the same xlsx format into Blob Storage, but when I open the Excel file from Blob Storage I get the error "The file myfilename.xlsx may not render correctly as it contains an unrecognized extension".
When the web activity copies the file I see it has Content-Type = application/octet-stream. I did try to change the content type to application/vnd.openxmlformats-officedocument.spreadsheetml.sheet. Any help with setting the Blob Storage property before I copy my Excel file from the web activity output would be appreciated.
The web activity's associated linked service and dataset properties don't get applied here. I'll give you a solution that I know works in the interim: use two web activities. The first retrieves the data from the blob, and its output is used as the body of the second web activity. In the example below, I substitute another blob for the website (it essentially duplicates the blob).
Here, the URL to obtain the blob looks like this:
https://<storage account>.blob.core.windows.net/<container>/<blob name>
I also use MSI authentication. For this to work, you must grant the Data Factory rights in the storage account's (or container's) access control.
You must then add the header x-ms-version with the value 2017-11-09. (A screenshot from an earlier solution illustrates this.)
You should use https://storage.azure.com as the resource.
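For illustration, the second web activity essentially issues a Put Blob call like the following sketch (here in Python with requests; the token, account, container, and blob names are placeholders, and inside Data Factory the MSI setting obtains the token for you):

import requests

# Placeholders: a bearer token for resource https://storage.azure.com obtained
# via managed identity, and the target account/container/blob.
token = "<bearer token>"
url = "https://<storage account>.blob.core.windows.net/<container>/<blob name>"
headers = {
    "Authorization": "Bearer " + token,
    "x-ms-version": "2017-11-09",
    "x-ms-blob-type": "BlockBlob",  # required by the Put Blob operation
    # Set the content type here so the blob renders correctly as xlsx.
    "Content-Type": "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
}
body = open("myfilename.xlsx", "rb").read()  # stands in for the first activity's output
requests.put(url, headers=headers, data=body).raise_for_status()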
If you get the error
AuthorizationPermissionMismatch
(this request is not authorized to perform the operation), you may need to go to your storage account and grant additional permissions, as shown below. Allow a few minutes for the permissions to propagate; otherwise you risk a false negative.

VS Code workspace storing additional script data

I want to build an external editor for scripts stored in a CRM.
I have an API which exposes the script engine, returning a list of scripts containing script id, script name, content...
The editor will be Visual Studio Code plus my extension. I managed to build the extension, get the data, store the scripts in a folder, and name them from the response. The problem is with updates: to send data back to the server I need to use the id, not the name.
Question: is there any kind of container in the workspace where I can store all the data related to every script, such as its id?
Something like settings for every script, containing scriptName: id, date, owner...
Maybe https://code.visualstudio.com/api/extension-capabilities/common-capabilities#data-storage can help? You can store data either locally in the workspace or globally in the user's storage space. It can be simple key/value pairs or your own format.
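If you go with your own format, one simple shape is a sidecar metadata file kept next to the downloaded scripts that maps each script name to its id and other attributes. In the actual extension this logic would be TypeScript (e.g. using workspaceState), but purely to illustrate the data shape, here is a sketch in Python; all file and field names are assumptions:

import json

METADATA_FILE = "scripts/.metadata.json"  # assumed location next to the scripts

def save_metadata(scripts):
    # scripts: the list returned by your CRM API, e.g.
    # [{"id": "42", "name": "onLoad.js", "owner": "jane", "date": "2020-01-01"}]
    index = {s["name"]: {"id": s["id"], "owner": s["owner"], "date": s["date"]}
             for s in scripts}
    with open(METADATA_FILE, "w") as f:
        json.dump(index, f, indent=2)

def id_for(script_name):
    # Look up the server-side id when pushing an update back.
    with open(METADATA_FILE) as f:
        return json.load(f)[script_name]["id"]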

Setting up a BigQuery to Google Cloud Storage pipeline with overwriting

I am trying to set up a really simple pipeline in Data Fusion which takes a table from BigQuery and then stores that data in Google Cloud Storage. With the pipeline setup below it's fairly easy: we first read the BigQuery table and schema, then sink the data into a Google Cloud Storage bucket. This works, but the problem is that a new folder and a new file get created for each transfer that I run. What I would like is to overwrite a single file at the same file path with each new transfer.
What I ran into is that in this setup, a new folder and a new file get created within Google Cloud Storage using a timestamp suffix. Looking at the sink configuration below, indeed, by default you see a timestamp.
Alright, that would mean that if I remove the suffix, a new folder shouldn't be created. The hover-over confirms this: "If not specified, nothing will be appended to the path".
However, when I clear this value and then save, the full time format automatically pops up again. I can't use a static value either, because that results in errors. For example, I just tried creating a folder named "12" in Google Cloud Storage and setting the suffix to that, but as you would guess this doesn't work. Is anyone else running into this problem? How do I get rid of the path suffix so I don't get a new timestamped folder in Google Cloud Storage?
This seems to be an issue with the Data Fusion UI. I have filed a JIRA for it: https://issues.cask.co/browse/CDAP-16129.
I understand this can be confusing when you open the configuration again. The reason it happens is that whenever you open the configuration modal, we pre-populate fields with default values from the plugin widget JSON (if no value is present).
As a workaround, you can try one of the following:
Export the pipeline - once you have configured all the properties in the plugins, you can export the pipeline. This downloads a JSON file in which you can locate the property and remove it (see the sketch below), then import the pipeline and publish it without opening the specific plugin.
Or, simply remove the property from the plugin configuration modal, then close the modal and publish the pipeline directly. The UI will re-populate the value every time you open the plugin configuration, but once you delete it and close the modal, that state is retained until you open the configuration again.
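For illustration, a hedged sketch of editing the exported JSON (the plugin name "GCS" and the property key "suffix" are assumptions; check the names in your own export):

import json

with open("pipeline.json") as f:          # the exported pipeline
    pipeline = json.load(f)

for stage in pipeline["config"]["stages"]:
    if stage["plugin"]["name"] == "GCS":  # the Cloud Storage sink
        # Drop the path suffix so no timestamped folder is appended.
        stage["plugin"]["properties"].pop("suffix", None)

with open("pipeline.json", "w") as f:     # re-import this file, then publish
    json.dump(pipeline, f, indent=2)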
Hope this helps.

Can I pass a FilePicker a custom source using their API

Essentially I'm trying to see if I can use Filepicker to manage user assets.
With their Accelerate bundle you can specify a custom S3 source, but only in their dashboard.
I want users to pick files and store them in their own folders (which appears to be possible),
but then also be able to re-use the files they have already picked and stored using Filepicker.
Is this possible? Reading through the docs, it appears not.
Such a feature is not available so far.
I think the only solution here would be to create a database of the users' Filepicker file links.
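A minimal sketch of that idea, using SQLite; the table and column names are assumptions for illustration:

import sqlite3

conn = sqlite3.connect("assets.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS user_files ("
    " user_id TEXT NOT NULL,"
    " file_url TEXT NOT NULL,"
    " picked_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)"
)

def save_pick(user_id, file_url):
    # Record the link Filepicker returns when the user picks/stores a file.
    conn.execute("INSERT INTO user_files (user_id, file_url) VALUES (?, ?)",
                 (user_id, file_url))
    conn.commit()

def list_picks(user_id):
    # Return the links this user has stored before, so they can be re-used.
    rows = conn.execute("SELECT file_url FROM user_files WHERE user_id = ?",
                        (user_id,))
    return [row[0] for row in rows]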

Store and Use Configuration values in Alfresco Share

I have a custom-developed Share dashlet that uses hardcoded values (for example, the name of a workspace to use as a default).
I would rather place those values in a configuration file and have the dashlet read the file at server start and work from there. I have two questions:
Which file can I use, and where should I place it?
How can I get the Share dashlet to read the file contents and load the data into local variables?
I would advise you to leverage the PreferenceService, which can be accessed from a Share web script by consuming the following API exposed by the repository:
GET /cms-repository-5.0.0/service/api/people/{userid}/preferences?pf={preferencefilter?}
POST /cms-repository-5.0.0/service/api/people/{userid}/preferences?pf={preferencefilter?}
DELETE /cms-repository-5.0.0/service/api/people/{userid}/preferences?pf={preferencefilter?}
This way, you will probably need to define some hard-coded, meaningful defaults, then let the user customize their experience and store the customized settings as user preferences.
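For example, a client could read and write those preferences along these lines (a sketch with the requests library; the host, credentials, and preference names are assumptions):

import requests

BASE = "http://localhost:8080/alfresco/service/api/people"  # assumed repository endpoint
AUTH = ("admin", "admin")  # assumed credentials

def get_preferences(userid, pf=None):
    # GET the user's preferences, optionally filtered, e.g. pf="my.dashlet"
    params = {"pf": pf} if pf else {}
    r = requests.get(BASE + "/" + userid + "/preferences", params=params, auth=AUTH)
    r.raise_for_status()
    return r.json()

def set_preferences(userid, prefs):
    # POST a JSON object of preference names and values, e.g.
    # {"my.dashlet.defaultWorkspace": "myWorkspaceName"}
    r = requests.post(BASE + "/" + userid + "/preferences", json=prefs, auth=AUTH)
    r.raise_for_status()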