How to check whether a filename contains another string in Firebase Cloud Storage?

I want to check whether the name of an uploaded image contains "thumb_". If the filename contains "thumb_", the user should not be allowed to upload the image.
Something like this:
match /users/{id}/{image} {
  allow write: if request.resource.contentType.matches('image/.*') &&
                  request.resource.size < 1024 * 1024 * 2 &&
                  image.size() < 32 &&
                  !image.includes('thumb_');
}
but it gives this error:
Error running simulation — Error: simulator.rules line [13], column [27]. Function not found error: Name: [includes].
So we cannot use an includes() function in security rules. What should I use instead of includes()?

According to the API documentation for security rules for Cloud Storage, there is no method called "includes" on string data. There is a method called "matches" which lets you perform regular expression matching on strings. Note that matches() tests the pattern against the whole string, so to see if the string in image contains the substring "thumb_", wrap the pattern in wildcards:
image.matches('.*thumb_.*')
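Putting it all together, the rule from the question could look like the sketch below (the wildcards around thumb_ are needed because matches() tests the whole string, not a substring):

match /users/{id}/{image} {
  allow write: if request.resource.contentType.matches('image/.*') &&
                  request.resource.size < 1024 * 1024 * 2 &&
                  image.size() < 32 &&
                  !image.matches('.*thumb_.*');
}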

Related

Invalid hashing in Firebase Cloud Storage Rules Playground

I am testing hashing in the Rules Playground:
This returns "CRexOpCRkV1UtjNvRZCVOczkUrNmGyHzhkGKJXiDswo=", the correct hash of the string "SECRET":
let expected = hashing.sha256("SECRET");
But this returns "SECRETpath/to/the/file.mp4", the argument itself instead of its hash:
let expected = hashing.sha256("SECRET" + request.resource.name);
Is it a bug in the rules playground?
Can hashing functions be used on dynamic values or is it intentionally prevented?
The strange rules playground behavior has been mentioned here before, this time with Firestore security rules: Firestore rules hashing returns identity
Firebaser here!
There are a few issues at play here. I think the primary source of confusion is that the hashing.sha256 function returns a rules.Bytes type. It appears that the Rules Playground in the Firebase Console incorrectly shows a string value when debugging the bytes type, but that is unrelated to behavior in production. For example, this Rule will always deny:
allow write: if hashing.sha256("SECRET" + request.resource.name) ==
"SECRET" + request.resource.name;
To get the behavior you're looking for, you need to use one of the conversion functions for the rules.Bytes type. Based on your question, you'll probably want the toBase64() function, but toHexString() is also an option. If you try these functions in your Rules, the Playground should start behaving correctly and the Rules will work as expected in production as well. So to put it all together, you'd write:
let expected = hashing.sha256("SECRET" + request.resource.name).toBase64();
For example, the rules listed below would allow you to upload a file called "foo/bar" (as Gqot1HkcleDFQ5770UsfmKDKQxt_-Jp4DRkTNmXL9m4= is the Base64 SHA-256 hash of "SECRETfoo/bar"):
allow write: if hashing.sha256('SECRET' + request.resource.name).toBase64() ==
"Gqot1HkcleDFQ5770UsfmKDKQxt_-Jp4DRkTNmXL9m4=";
I hope this helps clear things up! Separately, we will look into addressing the wrong debugging output in the Playground.
After trying with the emulators and the deployed app, it seems that hashing.sha256 does not work on dynamic data in any environment. The behavior is consistent, so I filed a feature request to add this capability to Storage security rules. It would be useful because it would allow passing signed data to the security rule for each file (for example, an upload authorization obtained via a Cloud Function).
For now, the workaround I imagine is putting data in the user's custom token (or custom claims), so I can pass signed data to the security rule. It is not ideal because I need to re-sign a custom token for every file upload.
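To illustrate that workaround, a rule gated on a custom claim might look like the sketch below (uploadAllowed is a hypothetical claim that server-side code, e.g. a Cloud Function using the Admin SDK, would set before the upload):

match /users/{id}/{file} {
  // uploadAllowed is a hypothetical custom claim signed into the user's
  // token server-side; the rule only has to check it.
  allow write: if request.auth != null &&
                  request.auth.token.uploadAllowed == true;
}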

How to use json_extract_path_text?

I am facing an issue with JSON extraction using JSON_EXTRACT_PATH_TEXT in Redshift.
I have two separate JSON columns: one contains the modem the customer is using, and the other contains the recharge details.
{"Mnfcr": "Technicolor","Model_Name":"Technicolor ABC1243","Smart_Modem":"Y"}
For the above, I have no issue extracting the Model_Name using JSON_EXTRACT_PATH_TEXT(COLUMN_NAME, 'Model_Name') AS model_name.
[{"Date":"2021-12-24 21:42:01","Amt":50.00}]
This one is causing me trouble. I used the same method as above and it did not work; it gave me the error below:
ERROR: JSON parsing error Detail: ----------------------------------------------- error: JSON parsing error code: 8001 context: invalid json object [{"Date":"2021-07-03 17:12:16","Amt":50.00
Can I please get assistance on how to extract this using json_extract_path_text?
One other method I found that worked was to use REGEXP_SUBSTR.
This second string is a JSON array (square brackets), not an object (curly braces). The array contains a single element, which is an object. So you need to extract the object from the array before using JSON_EXTRACT_PATH_TEXT().
The function for this is JSON_EXTRACT_ARRAY_ELEMENT_TEXT().
Putting this all together we get:
JSON_EXTRACT_PATH_TEXT(
    JSON_EXTRACT_ARRAY_ELEMENT_TEXT(<column>, 0),
    'Amt')
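In a full query that might look like the following sketch (recharge_json and customer_recharges are hypothetical column and table names):

SELECT
    JSON_EXTRACT_PATH_TEXT(
        JSON_EXTRACT_ARRAY_ELEMENT_TEXT(recharge_json, 0),
        'Date') AS recharge_date,
    JSON_EXTRACT_PATH_TEXT(
        JSON_EXTRACT_ARRAY_ELEMENT_TEXT(recharge_json, 0),
        'Amt') AS recharge_amt
FROM customer_recharges;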
You can use json_extract_path_text as in the example below:
json_extract_path_text(json_columnName, json_keyName) = compareValue
For more, you can refer to the documentation:
https://docs.aws.amazon.com/redshift/latest/dg/JSON_EXTRACT_PATH_TEXT.html

Compare String using tMap

I am using Talend to prepare a data warehouse.
I want to compare a string with the contents of a column using the tMap component and create a variable to store in the DB. The problem is that the == operator does not give the right result (example: row2.recipient == "text" ? "text" : "" always returns ""), and if I use .equals I get errors at runtime.
You will get an error if row2.recipient is null, and == should not be used to compare strings in Java: it compares object references, not contents.
The correct syntax would be:
"text".equals(row2.recipient) ? "text" : ""
Calling equals on the literal first prevents NullPointerExceptions.
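A standalone sketch of why the literal-first form is null-safe (plain Java, outside Talend):

public class EqualsDemo {
    public static void main(String[] args) {
        String recipient = null; // simulates a null row2.recipient

        // Literal-first comparison never throws; it simply returns false on null.
        System.out.println("text".equals(recipient) ? "text" : ""); // prints ""

        // recipient.equals("text") would throw a NullPointerException here,
        // and recipient == "text" compares references, not contents.
    }
}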

Use CSV values in JMeter as request path

I have a JMeter User Defined Variable holding a comma-separated value: ${countries} = IN,US,CA,ALL.
(I first tried to get it as a list/array: [IN,US,CA,ALL].)
I want to use the variable to test a web service: GET /${country}/info. Is this possible using a ForEach Controller or Loop Controller?
The only catch is that I want to save or read the values as IN,US,...,ALL and use them in the request path.
Thanks
The CSV should be in the format shown in the attached image.
Refer to this link for how to use a CSV file with JMeter: http://ivetetecedor.com/how-to-use-a-csv-file-with-jmeter/
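A one-column CSV along the lines of the sketch below would fit that setup; with the Variable Names field left empty, a CSV Data Set Config reads the header line and fills ${country} from each row:

country
IN
US
CA
ALL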
Thread Group settings:
No. of threads: 1
Ramp-up period: 1
Loop count: 4
Hope this will help.
The CSV config is a red herring; you don't need it.
You can use a Regular Expression Extractor to split the variable into another variable (e.g. MyVar), using something like:
(.+?)[,\n]
This tries to match each item before a comma or newline. It will place the values in variables like MyVar_1, MyVar_2, etc. This is as close to an array as JMeter natively understands.
You can then loop over the matches using MyVar_matchNr, and MyVar_1 to MyVar_n (you will need the __V() function to access the 'array' contents).
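Putting the pieces together, a sketch of the test plan (element names and the country variable are illustrative, and the extractor must sit under a sampler, since post-processors only run after one):

Regular Expression Extractor
    Apply to: JMeter Variable Name to use -> countries
    Reference Name: MyVar
    Regular Expression: (.+?)(,|$)   <- (,|$) also captures the final item, which has no trailing comma
    Match No.: -1                    <- -1 stores every match as MyVar_1..MyVar_n plus MyVar_matchNr

ForEach Controller
    Input variable prefix: MyVar
    Output variable name: country
    HTTP Request inside the controller, with Path: /${country}/info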

Using changestamp in GoogleDocs (null changestamp)

I am trying to find the max changestamp so I can start using it. I tried the following:
URL url = new URL("https://docs.google.com/feeds/default/private/changes?v=3");
ChangelogFeed foo = service.getFeed(url, ChangelogFeed.class);
LargestChangestamp stamp = foo.getLargestChangestamp();
stamp is always null.
Is this the way to get the largest changestamp, or do I need to set it first in order to use it?
The largest changestamp is also available in the user metadata feed; see the "docs:largestChangestamp" element shown on the protocol tab of the metadata feed's documented response.
I'm not sure the Java API exposes the largestChangestamp property directly yet; last time I checked it was hidden in the xmlBlob property, and I had to parse the XML to grab it out.
This seems to be a bug in the API. I got the changestamps by reading the ChangelogEntry objects from the ChangelogFeed:
List<ChangelogEntry> entries = foo.getEntries();
for (ChangelogEntry entry : entries) {
    String blob = entry.getXmlBlob().getBlob();
    System.out.println("Blob: " + blob);
}
The changestamp for an entry is contained in its blob.
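To pull the value out programmatically, a minimal regex sketch (this assumes the blob contains an element along the lines of <docs:changestamp value="1234"/>; inspect your actual payload first, as the element name may differ):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

class ChangestampParser {
    // Hypothetical helper: extracts the first changestamp value from an entry's XML blob.
    static Long extractChangestamp(String blob) {
        Matcher m = Pattern.compile("changestamp[^>]*value=\"(\\d+)\"").matcher(blob);
        return m.find() ? Long.parseLong(m.group(1)) : null;
    }
}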