How can I have software access files on the cloud

So I have a small company with plenty of documents and I want to set up an archiving system. I have several employees with different levels of permission to access the files on the server. This will serve as an archive system plus a management system: employees can read and write files (depending on their permissions) for a certain project, and the admin can later revoke access to a certain directory (i.e. project).
So after some research I think the best idea is to have a cloud-based NAS that a user can mount locally by giving the correct username and password. Then a piece of software will access these files (which are now local) and can display some data (e.g. project progress, minutes of meetings), or the user can access the files directly.
Does any of this make sense? I mean, is that what a NAS can actually do, and can it be done in the cloud? And can users access the file system (with restrictions) given a username and password (much as if it were a local network)? Is there a better alternative for my purposes?
To the best of my knowledge, I could instead create software that accesses the cloud directly, but how can I let users write files and have them stored in the cloud? Won't that be more complicated to implement? Can I use an RDBMS for it? I've used one before, but never for files.

If I understand your use case correctly, all you really want is to have access to different files for different roles within your company, is this correct?
To the best of my knowledge, Google provides corporate accounts which are quite affordable and which should have access control schemes suiting what you need (after all, storing files on scalable storage, with various access controls, in an offsite location and with redundancy, is partly what the cloud is for).
If not, or if this solution isn't appealing to you and you would prefer to use your NAS, the best way to do this would be to use Google's Backup and Sync application (you can download this by clicking the cog icon on Drive and selecting it). If you install and run this on an admin computer that is always on (and always connected to your NAS, with the store mounted), you can set a root folder on the NAS as your Drive sync folder. Any files added to this folder will be uploaded to Drive, and any added to Drive will be automatically downloaded. After this you can configure the access control on the NAS using various user accounts and roles, and have each employee mount the store using their own credentials, revealing only the files they have access to.
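By way of illustration, here is a minimal sketch of the NAS-side access control, assuming a Linux-based NAS where the Drive sync root is mounted at a hypothetical path and each project folder is restricted to its own Unix group; every name below is a placeholder:
import grp
import os
import stat
from pathlib import Path

# Hypothetical layout: one folder per project under the Drive sync root,
# each readable and writable only by the project's Unix group.
SYNC_ROOT = Path("/mnt/nas/drive-sync")
PROJECT_GROUPS = {"project-alpha": "proj_alpha", "project-beta": "proj_beta"}

for folder, group_name in PROJECT_GROUPS.items():
    path = SYNC_ROOT / folder
    path.mkdir(parents=True, exist_ok=True)
    gid = grp.getgrnam(group_name).gr_gid
    os.chown(path, -1, gid)                   # keep the owner, set the project group
    path.chmod(stat.S_IRWXU | stat.S_IRWXG)   # 0o770: admin and group members only
Employees mounting the share with their own credentials would then only see the project folders whose group they belong to.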

Related

Sharing TeamDrive with Contractors?

I am going to migrate to Google Workspace. I will create several Team Drives there, where my contractors (photographer etc.) will upload the files they create.
They will use their own Gmail accounts, not email addresses on my company's domain.
In personal Google Drive, when someone shares a folder/file with me, those are still owned by them. Therefore, when they delete it on their own personal account, those files are also removed from my side. As per my understanding, Team Drive sorts out this issue. When they upload a file/folder into a Team Drive using their own Gmail account, ownership of it is taken by the Team Drive and it is protected from deletion in the future.
Can someone help me to clarify these questions?
Thanks
According to the documentation available about Shared Drives, which I think is the feature you are referring to, if a user outside of your organization contributes to your organization's shared drives, the content they upload, create or edit is transferred to, and owned by, the domain that created the shared drive.
The documentation says the following: "Any work an external user contributes (for example, edits to, creating, or uploading a file) is transferred to and owned by the domain that created the shared drive."
Here is the link in case you need it.
In addition, the external user shouldn't be able to remove the file unless the privilege given is Manager or Content Manager; to upload, the Contributor privilege is enough. Check the documentation about the access levels.
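If you ever need to grant those access levels programmatically rather than through the sharing UI, here is a rough sketch using the Drive v3 API in Python; the credentials file, shared drive ID and email address are placeholders, and the "writer" role corresponds to the Contributor access level:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive"]
creds = service_account.Credentials.from_service_account_file("admin-creds.json", scopes=SCOPES)
drive = build("drive", "v3", credentials=creds)

# Grant a contractor's personal Gmail account Contributor access to the shared drive,
# so they can upload files but not delete other people's content.
# SHARED_DRIVE_ID and the email address are placeholders.
drive.permissions().create(
    fileId="SHARED_DRIVE_ID",     # a shared drive is addressed by its ID
    supportsAllDrives=True,       # required for shared drive operations
    body={
        "type": "user",
        "role": "writer",         # "writer" == Contributor
        "emailAddress": "photographer@gmail.com",
    },
).execute()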
Google Shared Drive (formerly Team Drive) is suitable for transferring ownership of Google Drive files. Files on a Google Shared Drive do not have an individual owner.
Another possible option is to use Google Forms to upload files to Google Drive. Using Apps Script or a Google Forms add-on, these files can be automatically renamed and organized into folders on Google Drive and Google Shared Drive.
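As a sketch of what that automation could look like outside of Apps Script, the Drive v3 API can rename an uploaded file and move it into a project folder; all IDs and the file name below are hypothetical:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/drive"]
creds = service_account.Credentials.from_service_account_file("creds.json", scopes=SCOPES)
drive = build("drive", "v3", credentials=creds)

uploaded_id = "UPLOADED_FILE_ID"      # placeholder: the file created by the Forms upload
project_folder = "PROJECT_FOLDER_ID"  # placeholder: destination folder on the shared drive

# Find the file's current parent (the Forms upload folder), then rename and move it.
meta = drive.files().get(fileId=uploaded_id, fields="parents", supportsAllDrives=True).execute()
drive.files().update(
    fileId=uploaded_id,
    body={"name": "2024-06-photos.zip"},
    addParents=project_folder,
    removeParents=",".join(meta.get("parents", [])),
    supportsAllDrives=True,
).execute()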

How to store and organize uploaded images on webserver?

I am writing a server that allows users to upload images. It appears that most people tend to store those files on the filesystem directly.
My question is whether that really is the right way to do it. I'm not familiar with the capacities of a server, but what I'm curious about is, for example, how to make sure that the server does not run out of disk space.
I would also like to know how one would organize those files for many different users. Is it enough to store them like war/images/<user-database-id>/<uuid-for-image>.(jpeg|png), just using the user ID from the database, or are there a lot more things to consider when it comes to storing images?
I think your best bet would be to use a cloud storage system such as Amazon S3, Google Cloud Storage, Rackspace, or MS Azure.
Using a path like the one you suggested ought to be possible, but you could also omit the user-database-id if the database already gives you a list of objects owned by each user.
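For what it's worth, a minimal sketch of generating such a key in Python (the helper name and layout are just illustrative):
import uuid
from pathlib import Path

def build_image_key(user_id: int, original_filename: str) -> str:
    # Produces a path like images/<user-database-id>/<uuid-for-image>.<ext>
    ext = Path(original_filename).suffix.lower().lstrip(".") or "bin"
    return f"images/{user_id}/{uuid.uuid4()}.{ext}"

print(build_image_key(42, "holiday.JPEG"))   # e.g. images/42/3f2b...-....jpeg
The same key works whether the object ends up on the local filesystem or in a bucket on S3 or Google Cloud Storage, which keeps a later migration cheap.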

Suggestions about file storage in Amazon AWS

I'm developing an ASP.NET MVC project that will be hosted on Amazon AWS, but I have some questions about the storage of clients' files. The documentation from Amazon is not clear to me, and I'm looking for some direction and experiences here.
1 - Each client has a few files with low disk space requirements and low update frequency but very high access frequency (like brand images and even sensitive files such as certificates). Is it appropriate to store these files in the app_data folder on the web server?
2 - The most critical to me are the sensitive documents (from hundreds to dozens of thousands per client, mostly signed XML files). These files have a medium read access frequency but a very high demand for creation. One solution I found is MongoDB, which gives me some freedom to manage the storage policy and makes external backups easy, but I'm not sure about that. Other options are to use Amazon storage and handle all these files and GBs there with a lot of folders, or maybe use a regular database and save the files as XML or binary.
My concerns are about the amount of data, security, and reliability in case of disaster, as most of these documents have legal value.
You could, but storing them locally violates the shared-nothing architecture and would limit your scaling options. Amazon S3 is a good option here. You can make some files public and serve them directly from S3 (or through CloudFront), and keep others private and provide access via signed URLs.
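For the private files, a minimal sketch of a signed URL with boto3 (bucket and key names are placeholders, and AWS credentials are assumed to be configured in the environment):
import boto3

s3 = boto3.client("s3")

# Time-limited link to a private object; public assets can instead be served
# directly from S3 or through a CloudFront distribution.
# "my-private-docs" and the key are made-up names.
signed_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-private-docs", "Key": "clients/123/certificate.xml"},
    ExpiresIn=3600,   # valid for one hour
)
print(signed_url)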
Again, you can put the files on S3 and make them private. You will still probably store references to the files in your database. Generally it's not a great idea to store large blob files in a database, since databases are often not well optimized for accessing them.
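A rough sketch of that pattern, with the blob in S3 and only a reference in the database (the table, bucket and key names are made up):
import sqlite3
import boto3

s3 = boto3.client("s3")
db = sqlite3.connect("app.db")
db.execute("CREATE TABLE IF NOT EXISTS documents (id INTEGER PRIMARY KEY, client_id INTEGER, s3_key TEXT)")

def save_document(client_id: int, local_path: str, s3_key: str) -> None:
    # The file itself lives in S3; the database row only holds the key.
    s3.upload_file(local_path, "my-private-docs", s3_key)
    db.execute("INSERT INTO documents (client_id, s3_key) VALUES (?, ?)", (client_id, s3_key))
    db.commit()

save_document(123, "invoice-0001.xml", "clients/123/invoice-0001.xml")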

Packaged App: syncFileSystem / fileSystem API - For *large* files

I am looking to develop a Chrome Packaged App that will (at a very simple level) provide a dynamic form filling UI - but allow users to attach large attachments to the forms (could be upwards of 10 files of 10MB each). I would like to have the ability to save and share the form data and the attachment via Google Drive. The forms will be completed collaboratively by multiple team members who also need to all see the attachments. Imagine a form front-end/metadata that sits on top of a shared Google Drive folder...
I have read the documentation, and learnt that the syncFileSystem API is not intended for use for general and/or large files to be stored in Google Drive, but rather for small configuration data.
I then looked at the fileSystem API - hoping that I could include the app's sandboxed folder in the folders that the Google Drive client app syncs (so that the files get synced automatically) - but it doesn't look like the sandbox is meant to be accessed externally.
My current thinking is to recreate a Windows Explorer-type UI in the packaged app (it can use drag and drop), then store the files in the sandbox using the fileSystem API. I can reuse the code from the Google Drive sample packaged app to implement cloud syncing. Good idea?
Two questions stem from this:
How persistent is the fileSystem API? The documentation mentions that the user can purge all stored files - is this done through 'clearing all browser history'? In which case they could very easily accidentally wipe many hundreds of MB of useful information that I am storing in the packaged app.
I have read that you can use 3rd-party authentication services (which I want to do). If I use a non-Google account to authenticate my users, how would the Google Drive authentication work? Would I be able to use a different Google account to perform the cloud storage (i.e. unrelated to the actual end user, who may or may not already have a Google account - which may already be signed in)?
It seems like waiting for this https://code.google.com/p/chromium/issues/detail?id=148486 (getting read access to non-sandbox directories) would be the easiest way forward.
I don't think clearing browser history deletes temporary sandbox filesystem files; they're supposed to be sort of automatically garbage collected when space is required. It would make sense if that were another checkbox in the "Clear browsing data" section of Chrome's options. Perhaps that would make the answer to your first question more clear :-)
On the second point, I am not sure how to do this, but it looks like you have already figured something out? At least that's what this page https://groups.google.com/a/chromium.org/forum/#!topic/chromium-apps/hOYu75Cv0AE seems to indicate.

Using Google Cloud Storage from a PC application

Hopefully I don't sound too stupid asking this. My wife and I run a small business out of our home. We want to share the accounting data, but I'm at another location often. We use a PC version of Sage Peachtree Premium Accounting that has networking capabilities, so the data files can be stored in a common place. Is it possible to share this file using something like Google Cloud Storage?
Google Drive is certainly the cheaper option as it is optimized for consumer usage patterns. Google Cloud Storage is optimized for applications that demand highly available and replicated storage with strong global consistency.
Here are a few ways that Google Cloud Storage attempts to improve team collaboration:
Resources are owned by a project team composed of multiple people.
It is possible to share files with a group.
It is possible to change the default acl applied to new objects.
Collaborate with a team
Each bucket is owned by a project, and by default everyone on your team can read new objects uploaded to those buckets.
You manage the people on your team in the following manner:
Go to https://code.google.com/apis/console
Click on teams on the sidebar.
Add the email addresses of other people you want to collaborate with.
Use the drop-down list to give them more permissions.
Use the x to remove team members.
Permissions are concentric:
Everyone with "can view" access will be able to read files that do not specify an ACL.
Everyone with "can edit" access will also be able to create and delete buckets as well as upload new objects.
Everyone with "is owner" access will also be able to add other viewers, editors and owners.
Share to a Google Group
Google Cloud Storage allows you to share files with a Google group. Users gain access to these files when you add them to the group and lose access when you remove them from the group.
First download the object's ACL:
gsutil getacl gs://bucket/obj > acl.xml
vim acl.xml
Now add the following acl entry inside the <Entries/> tag:
<Entry>
<Scope type="GroupByEmail">
<!-- Give everyone in the gs-discussion group READ access. -->
<EmailAddress>gs-discussion@googlegroups.com</EmailAddress>
</Scope>
<Permission>READ</Permission>
</Entry>
Now update the acl:
gsutil setacl acl.xml gs://bucket/obj
See the online documentation for further information about access control https://developers.google.com/storage/docs/accesscontrol#applyacls
You can create a Google group at google.com/groups
Change the default object acl
By default everyone on the team can read objects you upload. However you can configure this to be more or less permissive. You could make objects publicly-readable by default or only viewable by the owner and a Google group.
Changing the default object acl is similar to changing object acls. Just use the getdefacl and setdefacl commands.
Some predefined configurations do not require editing an xml file:
# Team members can view new objects.
gsutil setdefacl project-private gs://bucket
# Anonymous internet users can view new objects.
gsutil setdefacl public-read gs://bucket
Otherwise you can edit the acl xml:
gsutil getdefacl gs://bucket > def_acl.xml
vim def_acl.xml
# Add whichever UserByEmail, GroupByEmail, AllUsers, etc grants you want.
gsutil setdefacl def_acl.xml gs://bucket
New objects receive the default object ACL:
gsutil cp foo gs://bucket # This object will receive the def_acl.xml acls.
It is easy to override the default object acl with a predefined acl for a particular object:
# Ignore the default acl. Use public-read.
gsutil cp -a public-read foo gs://bucket
The full list of predefined acls is available at developers.google.com/storage/docs/accesscontrol#extension
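If you would rather do this from code than with gsutil, roughly the same operations are available through the google-cloud-storage Python client; the bucket name and group address below are placeholders:
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")   # placeholder bucket name

# Rough equivalent of getdefacl/setdefacl: give a Google group READ on new objects by default.
bucket.default_object_acl.group("gs-discussion@googlegroups.com").grant_read()
bucket.default_object_acl.save()

# Rough equivalent of `gsutil cp -a public-read`: override the default ACL for one upload.
bucket.blob("foo").upload_from_filename("foo", predefined_acl="publicRead")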
Google Cloud Storage is probably overkill for what you're talking about. Cloud Storage is something web developers use to deliver assets like images, videos, and documents to a large number of users around the world.
However something like Google Drive or Dropbox would probably work well for this. If you both have Gmail accounts then Google Drive is a natural choice. Both of these solutions have a service which runs on each PC and automatically syncs changed files in a specified folder to all other computers using that folder.
So if one of you makes changes to the file, it will show up in the other location automatically. However the real question is how your software will handle this. I'm not familiar with Peachtree Accounting but it probably isn't possible for you to both be making changes at the same time, unless the software is specifically designed for that use case.
If you can post a link or description for the "networking capabilities" (that is a rather vague term on its own) it may be possible to tell for sure.