Firebase Storage Rules - How to implement different read and write rules

I'm trying to implement rules for loading images to the gallery in my app. All users should be able to read, but only users with type 'admin' should be able to upload / write images to storage.
I've been able to get the rules to work with several other directories inside Storage, but always where the read and write rules are the same.
For example, in a directory where users upload their own documents, I've implemented a rule which says that, to read or write a file in the directory, the directory name must match the user's uid OR the user must have admin rights.
The code which is working is:
match /userUploadedDocuments/{uid}/{allPaths=**} {
  allow read, write: if (root.child('CustomerDatabase').child(request.auth.uid).child('userType').val() == 'admin')
    || request.auth.uid == uid;
}
The code which isn't working is:
match /gallery {
  match /{allImages=**} {
    allow read: if request.auth.uid != null;
    allow write: if (root.child('CustomerDatabase').child(request.auth.uid).child('userType').val() == 'admin');
  }
}
Firebase will accept this code, but it fails whenever I try to write to the directory.
Your help would be very welcome.
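Note that Cloud Storage security rules cannot read the Realtime Database; the root.child(...).val() pattern is Realtime Database rules syntax rather than Storage rules syntax. One common alternative, sketched below under the assumption that admin accounts carry a custom admin auth claim (set via the Admin SDK), is to check the claim through request.auth.token:
// Illustrative sketch only: assumes an "admin" custom claim has been set on admin users.
match /gallery/{allImages=**} {
  // any signed-in user may read
  allow read: if request.auth != null;
  // only users whose ID token carries the custom claim may write
  allow write: if request.auth.token.admin == true;
}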

Related

Powershell: FTP download not working despite having permissions [duplicate]

What is the best way to download all files in a remote directory using C# and FTP and save them to a local directory?
Thanks.
Downloading all files in a specific folder seems to be an easy task. However, there are some issues that have to be solved. To name a few:
How to get the list of files (System.Net.FtpWebRequest gives you an unparsed list, and the directory listing format is not standardized in any RFC)
What if the remote directory has both files and subdirectories? Do we have to dive into the subdirectories and download their content?
What if some of the remote files already exist on the local computer? Should they be overwritten? Skipped? Should we overwrite older files only?
What if the local file is not writable? Should the whole transfer fail? Should we skip the file and continue to the next?
How to handle files on a remote disk which are unreadable because we don’t have sufficient access rights?
How are symlinks, hard links and junction points handled? Links can easily be used to create an infinitely recursive directory tree. Consider folder A with subfolder B which in fact is not a real folder but a *nix hard link pointing back to folder A. The naive approach ends up as an application which never terminates (at least until somebody manages to pull the plug).
A decent third-party FTP component should have a method for handling these issues. The following code uses our Rebex FTP for .NET.
using (Ftp client = new Ftp())
{
    // connect and login to the FTP site
    client.Connect("mirror.aarnet.edu.au");
    client.Login("anonymous", "my#password");

    // download all files
    client.GetFiles(
        "/pub/fedora/linux/development/i386/os/EFI/*",
        "c:\\temp\\download",
        FtpBatchTransferOptions.Recursive,
        FtpActionOnExistingFiles.OverwriteAll);

    client.Disconnect();
}
The code is taken from my blog post available at blog.rebex.net. The blog post also references a sample which shows how to ask the user how to handle each problem (e.g. Overwrite / Overwrite older / Skip / Skip all).
Using C#'s FtpWebRequest and FtpWebResponse, you can use the following recursion (make sure the folder strings terminate in '\'):
public void GetAllDirectoriesAndFiles(string getFolder, string putFolder)
{
    List<string> dirItems = DirectoryListing(getFolder);
    foreach (var item in dirItems)
    {
        if (item.Contains('.'))
        {
            GetFile(getFolder + item, putFolder + item);
        }
        else
        {
            var subDirPut = new DirectoryInfo(putFolder + "\\" + item);
            subDirPut.Create();
            GetAllDirectoriesAndFiles(getFolder + item + "\\", subDirPut.FullName + "\\");
        }
    }
}
The "item.Contains('.')" is a bit primitive, but has worked for my purposes. Post a comment if you need an example of the methods:
GetFile(string getFileAndPath, string putFileAndPath)
or
DirectoryListing(getFolder)
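In case it helps, here is a hedged sketch of how those two helpers might look with FtpWebRequest (the credentials are placeholders, and getFolder is assumed to be a full ftp:// URL that already ends in the folder separator used by the recursion above):
// Sketch only; needs using System.Net; using System.IO; using System.Collections.Generic;

// Returns the plain entry names in the remote folder (NLST-style listing).
private List<string> DirectoryListing(string getFolder)
{
    var request = (FtpWebRequest)WebRequest.Create(getFolder);
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = new NetworkCredential("user", "password");

    var items = new List<string>();
    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        while (!reader.EndOfStream)
        {
            items.Add(reader.ReadLine());
        }
    }
    return items;
}

// Downloads a single remote file to the given local path.
private void GetFile(string getFileAndPath, string putFileAndPath)
{
    var request = (FtpWebRequest)WebRequest.Create(getFileAndPath);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.UseBinary = true;
    request.Credentials = new NetworkCredential("user", "password");

    using (var response = (FtpWebResponse)request.GetResponse())
    using (var ftpStream = response.GetResponseStream())
    using (var fileStream = File.Create(putFileAndPath))
    {
        ftpStream.CopyTo(fileStream);
    }
}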
For the FTP protocol, you can use the FtpWebRequest class from the .NET Framework, though it does not have any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to tell files from subdirectories. There's no way to do that portably with FtpWebRequest. The FtpWebRequest unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for a file and succeed for a directory (or vice versa), i.e. you can try to download the "name": if that succeeds, it's a file; if that fails, it's a directory. But that can become a performance problem when you have a large number of entries.
You may be lucky and in your specific case, you can tell a file from a directory by a file name (i.e. all your files have an extension, while subdirectories do not)
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
void DownloadFtpDirectory(
    string url, NetworkCredential credentials, string localPath)
{
    FtpWebRequest listRequest = (FtpWebRequest)WebRequest.Create(url);
    listRequest.UsePassive = true;
    listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
    listRequest.Credentials = credentials;

    List<string> lines = new List<string>();

    using (WebResponse listResponse = listRequest.GetResponse())
    using (Stream listStream = listResponse.GetResponseStream())
    using (StreamReader listReader = new StreamReader(listStream))
    {
        while (!listReader.EndOfStream)
        {
            lines.Add(listReader.ReadLine());
        }
    }

    foreach (string line in lines)
    {
        string[] tokens =
            line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
        string name = tokens[8];
        string permissions = tokens[0];

        string localFilePath = Path.Combine(localPath, name);
        string fileUrl = url + name;

        if (permissions[0] == 'd')
        {
            Directory.CreateDirectory(localFilePath);
            DownloadFtpDirectory(fileUrl + "/", credentials, localFilePath);
        }
        else
        {
            var downloadRequest = (FtpWebRequest)WebRequest.Create(fileUrl);
            downloadRequest.UsePassive = true;
            downloadRequest.UseBinary = true;
            downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
            downloadRequest.Credentials = credentials;

            var response = downloadRequest.GetResponse();
            using (Stream ftpStream = response.GetResponseStream())
            using (Stream fileStream = File.Create(localFilePath))
            {
                ftpStream.CopyTo(fileStream);
            }
        }
    }
}
The url must be like:
ftp://example.com/ or
ftp://example.com/path/
Or use a 3rd party library that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "mypassword",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    // Download files
    session.GetFiles("/home/user/*", @"d:\download\").Check();
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
You could use System.Net.WebClient.DownloadFile(), which supports FTP; see the MSDN documentation for details.
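A minimal sketch of that approach (host, path and credentials are placeholders); note that WebClient.DownloadFile transfers a single file per call, so it does not recurse into folders:
// Sketch only; needs using System.Net;
using (var client = new WebClient())
{
    client.Credentials = new NetworkCredential("user", "password");
    // Downloads one file over FTP to a local path.
    client.DownloadFile("ftp://example.com/path/file.txt", @"c:\temp\file.txt");
}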
You can use FTPClient from laedit.net. It's under the Apache license and easy to use.
It uses FtpWebRequest:
first you need to use WebRequestMethods.Ftp.ListDirectoryDetails to get the detailed listing of the folder
then for each file you need to use WebRequestMethods.Ftp.DownloadFile to download it to a local folder

Saving user data when application closes in Flutter

I want to automatically save the username and password of the user when they connect to the login page of my app, and then autofill the username and password fields with that data on the next connection.
To do this I use a class which I called ContextManager where I store data I need to use in my app. I also use SharedPreferences.
Here is the code which is executed when the user logs in to the app:
if (ContextManager.isConnected == false) {
  return 'Le nom d\'utilisateur ou le mot de passe est incorrect';
} else {
  SharedPreferences prefs = await SharedPreferences.getInstance();
  prefs.setString('email', data.name);
  prefs.setString('password', data.password);
  ContextManager.savedEmail = prefs.getString('email').toString();
  ContextManager.savedPassword = prefs.getString('password').toString();
  return null;
}
And then in the build of my LoginPage widget I return a FlutterLogin widget (from the flutter_login package) with ContextManager.savedPassword for its savedPassword attribute and ContextManager.savedEmail for its savedEmail attribute.
It works while the app is running: if I log out while still in the app, the data is stored and autofilled correctly, but when I stop the app and launch it again, all the data is gone.
How could I permanently store the data and simply retrieve it in the app at any moment?
Thanks.
From my experience, I know three different packages for storing data between app restarts.
The first one I use is the sqflite package, which, as the name suggests, gives you a full SQLite database (but I don't know if that fits your definition of "simply retrieve it").
A simpler tool is the shared_preferences package, which allows you to store data as key-value pairs (stored somewhere in a file unreachable by the user, if I'm correct).
The last one I know of, which I used for a desktop app, is the ini package, which lets you create a configuration file like the .ini files often found in desktop applications. It's less user-friendly than the other two, but it allows you to choose where to put the file and lets the user access that file if needed.
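As a minimal sketch of the shared_preferences route (the key names are just examples), values written with setString persist across restarts; they just need to be read back, e.g. at startup, once SharedPreferences.getInstance() completes:
import 'package:shared_preferences/shared_preferences.dart';

// Hypothetical helper: loads any previously saved credentials before
// the login page is built; returns empty strings if nothing was saved.
Future<Map<String, String>> loadSavedCredentials() async {
  final prefs = await SharedPreferences.getInstance();
  return {
    'email': prefs.getString('email') ?? '',
    'password': prefs.getString('password') ?? '',
  };
}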

How to cache a value returned by a Future?

For my Flutter app I'm trying to access a List of Notebook objects; this list is parsed from a file by the Api. I would like to avoid reading/parsing the file every time I call my notebooks getter, hence I would like something like this:
if the file has already been parsed once, return the previously parsed List<Notebook>;
else, read and parse the file, save the List<Notebook> and return it.
I imagine something like this:
List<Notebook> get notebooks {
  if (_notebooks != null) {
    return _notebooks;
  } else {
    return await _api.getNotebooks();
  }
}
This code is not correct because I would need to mark the getter async and return a Future, which I don't want to. Do you know any solution to this problem, like caching the value? I also want to avoid initializing this list at app startup; it should only be parsed the first time the value is needed.
You can create a singleton and store the data in a property once you have read it; from then on you can use the value from that property.
You can refer to this question on how to create a singleton:
How do you build a Singleton in Dart?
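A minimal sketch of that idea, where Api and Notebook stand in for the question's types; caching the Future rather than the list also means that concurrent callers during the first read share the same parse:
class NotebookStore {
  // Singleton: one shared instance for the whole app.
  NotebookStore._();
  static final NotebookStore instance = NotebookStore._();

  final Api _api = Api(); // the question's file-parsing API (assumed)
  Future<List<Notebook>>? _notebooks;

  // The file is read and parsed only on the first call; every later
  // call returns the already cached Future.
  Future<List<Notebook>> get notebooks => _notebooks ??= _api.getNotebooks();
}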

How can I use Chrome App fileSystem API

I want to use the chrome.fileSystem API to create a new file in C:/ or anywhere else, but I cannot figure out how to use this API.
I cannot find any argument for a file path; the only thing is a fileEntry, but how can I generate a fileEntry from something like C://a/b/c?
Chrome apps have limitations - for security reasons - on what files can be accessed. Basically, the user needs to approve access from your app to the files and directories that are accessed.
The only way to get access to files outside of the app's sandbox is through a user gesture - that is, you need to ask the user for a file. You do this with chrome.fileSystem.chooseEntry.
If this isn't clear or obvious, maybe you could explain more about what you are trying to do with the files and we can give advice on the best way to do it. Usually chrome.fileSystem is not the best choice for data storage - there are other, more convenient sandboxed alternatives like chrome.storage.
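For example, a minimal sketch of the chooseEntry flow for saving a file the user picks (this assumes the manifest declares the "fileSystem" permission with "write"; the file name and contents are just examples):
// Ask the user where to save; the app only gets access to the entry
// the user picks in this dialog.
chrome.fileSystem.chooseEntry({type: 'saveFile', suggestedName: 'notes.txt'}, function(entry) {
  if (!entry) {
    return; // the user cancelled the dialog
  }
  entry.createWriter(function(writer) {
    writer.onerror = function(e) { console.error('write failed', e); };
    writer.onwriteend = function() { console.log('saved'); };
    writer.write(new Blob(['hello'], {type: 'text/plain'}));
  });
});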
It's a bit tricky to work with. But it follows the same model other languages use, except it's even trickier because of all the callbacks. This is a function I wrote to get a nested file entry, creating directories as it goes along. Maybe this can help you get started.
For this function, you'd pass in a FileSystem that you'd get from something like chrome.fileSystem.chooseEntry with a directory type option, and path would be, in your example, ['a', 'b', 'c']:
function recursiveGetEntry(filesystem, path, callback) {
  function recurse(e) {
    if (path.length == 0) {
      if (e.isFile) {
        callback(e)
      } else {
        callback({error: 'file exists'})
      }
    } else if (e.isDirectory) {
      if (path.length > 1) {
        e.getDirectory(path.shift(), {create: true}, recurse, recurse)
      } else {
        e.getFile(path.shift(), {create: true}, recurse, recurse)
      }
    } else {
      callback({error: 'file exists'})
    }
  }
  recurse(filesystem)
}
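A hedged usage sketch of the function above, assuming the manifest's fileSystem permission includes "directory" (and "write", since entries are created): pick a base directory with chooseEntry and then create/fetch a/b/c beneath it:
chrome.fileSystem.chooseEntry({type: 'openDirectory'}, function(dirEntry) {
  recursiveGetEntry(dirEntry, ['a', 'b', 'c'], function(entry) {
    if (entry.error) {
      console.error(entry.error);
    } else {
      console.log('got file entry for', entry.fullPath);
    }
  });
});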

HTML5 File API in Firefox Addon SDK

Is there a way to access the HTML5 File API in the Firefox Add-on SDK, in a content script?
This is needed in order to store user-added words and their meanings. The data can grow large, so local storage isn't an option.
window.requestFileSystem3 = window.requestFileSystem || window.webkitRequestFileSystem;
gives me the error TypeError: window.requestFileSystem3 is not a function.
I am asking this because I am porting this code from a Google Chrome extension, which allows accessing the File API in a content script.
Additional Questions
1) If the HTML5 File API is not allowed, should I use the file module?
2) Does the file module allow access to any file on the file system, as opposed to the HTML5 File API which only gives sandboxed access to the file system?
3) Assuming I have to use the file module, what would be the best location to store my files (like the user profile directory or the extension directory) and how would I get this path in code?
I apologize for so many sub-questions inside this question. Google wasn't very helpful regarding this topic.
Any sample code would be very helpful.
Firefox doesn't support writing files via the File API yet, and even when this is added it will probably be accessible to web pages only, not extensions. In other words: yes, if you absolutely need to write to files then you should use the low-level APIs. You want to store your data in the user profile directory (there is no extension directory; your extension is usually installed as a single packed file). Something like this should work to write a file:
var file = require("sdk/io/file");
var profilePath = require("sdk/system").pathFor("ProfD");
var filePath = file.join(profilePath, "foo.txt");

var writer = file.open(filePath, "w");
writer.writeAsync("foo!", function(error)
{
  if (error)
    console.log("Error: " + error);
  else
    console.log("Success!");
});
For reference: sdk/io/file, sdk/system
You could use TextReader.read() or file.read() to read the file. Unfortunately, the Add-on SDK doesn't seem to support asynchronous file reading, so the read will block the Firefox UI. The only alternative would be importing NetUtil and FileUtils via the chrome authority, something like this:
var {components, Cu} = require("chrome");
var {NetUtil} = Cu.import("resource://gre/modules/NetUtil.jsm", null);
var {FileUtils} = Cu.import("resource://gre/modules/FileUtils.jsm", null);

NetUtil.asyncFetch(new FileUtils.File(filePath), function(stream, result)
{
  if (components.isSuccessCode(result))
  {
    var data = NetUtil.readInputStreamToString(stream, stream.available());
    console.log("Success: " + data);
  }
  else
    console.log("Error: " + result);
});