How to deploy a large deep learning model using Streamlit? - deployment

I have saved an .h5 model with a size of about 85 GB. I am trying to build a webapp using Streamlit. However, I face the problem of loading my model, as I cannot upload it to GitHub. I tried to shrink the size of the saved model using TensorFlow Lite, but it only reduced the size to about 80 GB. Is there a way to deploy my model for free using Streamlit or other tools with my saved model?
The code and the issue are at https://github.com/kenanmorani/COVID-19Deployment.
Thank you

Even when using Git Large File Storage, you can't use files larger than 5 GB with GitHub. I think the easiest way for you would be to store your model on Google Drive and provide the URL to your Streamlit application.
An easy way to download public files from Google Drive in Python is the gdown library.
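For the public case, a minimal sketch could look like the following (the Drive file ID and file name are placeholders, and st.cache_resource assumes a recent Streamlit version; older versions would use st.cache instead):

    import gdown
    import streamlit as st
    import tensorflow as tf

    GDRIVE_FILE_ID = "YOUR_FILE_ID"  # placeholder: ID of the publicly shared .h5 file
    MODEL_PATH = "model.h5"

    @st.cache_resource  # cache so the large download/load runs only once per server
    def load_model():
        # gdown resolves the direct-download URL from the sharing link's file ID
        gdown.download(f"https://drive.google.com/uc?id={GDRIVE_FILE_ID}", MODEL_PATH, quiet=False)
        return tf.keras.models.load_model(MODEL_PATH)

    model = load_model()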
If the model should remain private, you would need to rely on the Google Drive API and use Streamlit's secrets to pass the credentials.
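As a rough sketch of the private case (the gcp_service_account key name is an arbitrary choice for the secrets entry, and the file must be shared with the service account):

    import streamlit as st
    from google.oauth2.service_account import Credentials
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaIoBaseDownload

    FILE_ID = "YOUR_PRIVATE_FILE_ID"  # placeholder Drive file ID

    # Service-account credentials read from .streamlit/secrets.toml
    creds = Credentials.from_service_account_info(
        dict(st.secrets["gcp_service_account"]),
        scopes=["https://www.googleapis.com/auth/drive.readonly"],
    )
    drive = build("drive", "v3", credentials=creds)

    # Stream the model to disk in chunks instead of holding it all in memory
    request = drive.files().get_media(fileId=FILE_ID)
    with open("model.h5", "wb") as fh:
        downloader = MediaIoBaseDownload(fh, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()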

Related

Is there any way to split data files into smaller chunks in Unity WebGL builds?

The size of the web applications I can build is limited by the "Build\application.data" file.
I.e. if it's over a certain size I cannot upload it to certain hosts, GitHub, etc.
Ideally I would like to split the application into multiple data files under a certain size, while keeping the application executable.
How would this be possible? Is this something I can do from Unity build configuration?
Can I do it after the build is done?
Can I split the file into chunks by archiving it with zero compression, and somehow still execute it from the browser? There is a file called Build.Loader.js; is it something that can be edited for this purpose?
This is for the purposes of using the application after it has been uploaded, not sharing it. I do not want to compress it into separate archives or use Git LFS; I've tested this and the application does not work from the browser with GitHub and Git LFS.
Thanks
Unity has two technologies for splitting the data file:
Asset bundle
An AssetBundle is an archive file that contains platform-specific non-code Assets (such as Models, Textures, Prefabs, Audio clips, and even entire Scenes) that Unity can load at run time.
Addressables
The Addressable Asset System allows the developer to ask for an asset via its address. Once an asset (e.g. a prefab) is marked "addressable", it generates an address which can be called from anywhere. Wherever the asset resides (local or remote), the system will locate it and its dependencies, then return it.
Both technologies create separate files that you can host on a server and download as needed. Addressables is the newer technology, and the one the Unity team recommends.
The total size of the bundles will probably grow, but the user will be able to download only the necessary assets, so the amount of data transferred to the user may decrease.
If you do not use Unity's solutions, you can divide the data file into parts yourself (see the splitting sketch below). But on the client side (JavaScript) you will need to download all the parts, concatenate them, and pass the result to the Unity loader. You probably won't be able to use the browser's built-in gzip or brotli decompression (not sure). It seems to be quite difficult.
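As a rough sketch of the splitting step only (the chunk size is arbitrary, chosen here to stay under GitHub's 100 MB per-file limit; the JavaScript that downloads and concatenates the parts before handing them to the Unity loader is not shown):

    # Split Build/application.data into fixed-size part files; the client must
    # later fetch the parts in order and concatenate them to rebuild the bytes.
    CHUNK_SIZE = 90 * 1024 * 1024  # 90 MB

    def split_file(path: str, chunk_size: int = CHUNK_SIZE) -> int:
        index = 0
        with open(path, "rb") as src:
            while True:
                chunk = src.read(chunk_size)
                if not chunk:
                    break
                with open(f"{path}.part{index}", "wb") as dst:
                    dst.write(chunk)
                index += 1
        return index  # number of parts written

    if __name__ == "__main__":
        print(split_file("Build/application.data"), "parts written")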

Flutter asset folder path

I want to build an app with multiple flavors, which works very well. But now I have the requirement to include static JSON files in my preview app, because the backend sometimes differs in development speed. That way I can use the JSON files within my app as fake/mock data.
Hard requirements are:
The preview flavor has the JSON files included
The production flavor should be released without any unused files (no JSON files)
What's the best way to do it?
I think it depends on the data. If the JSON will not change anymore, or rarely changes, you can just load it from a local asset. Otherwise, getting the JSON data from the cloud is a better way; it's more flexible.

flutterfire-web: How to upload large files to firebase storage in web based flutter app

Apparently, the web implementation of FlutterFire storageref.putFile does not support the dart:html File object.
Because of this I cannot find a simple way to upload large files, since all the other options require loading the data into the browser first.
Am I missing something here?

Meteor 1.4 - General approach to file system + /public activity

I've done some digging around, and a lot of the threads regarding the file system and how it works with Meteor seem to be pretty outdated, not to mention the packages related to file storage/serving (e.g. CollectionFS). I was wondering if anyone here has deep experience with handling files in light of 1.4 or even 1.3 (I am currently on 1.4.1.1).
My questions are as follows:
Did Meteor 1.3/1.4 come with any changes regarding fs?
What is the general best approach to storing and serving static assets in light of Meteor 1.4?
I've seen many threads saying that dynamically storing files in /public triggers a server upload, but I've tested this locally by manually copy/pasting a .png file into /public, and it only triggers a client refresh with the console message Client modified -- refreshing. Would this hold true for files added at runtime, and would it hold true in production?
Currently I am trying to steer clear of S3 or any other third-party CDNs to keep a low budget, and also trying to avoid storing files in Mongo.
Thanks for any and all opinions!
What about setting up a shared or NFS folder, having your Meteor app handle the file upload and write the file to that location, and configuring Nginx (or whatever you are using as the load balancer) to serve those files? If you are worried about the browser refreshing when a file is put into the public folder: you do not need to write files to the public folder, right?

Which is the best method to get a local file URI and save it online?

I'm working on a web project, but the scenario has some restrictions for a specific use case. We have been investigating a web-only solution and a Dropbox-like native way to solve this.
The main restriction is that we shouldn't upload local files to a cloud. We can only track local URIs.
The use cases are:
As a developer, I should be able to link the URI of a local file to a webapp. Thus, I can click on a webapp element and the local file should be opened.
As a user, I should be able to add a directory and view the same structure on the webapp (clicking opens the file). The files are not uploaded.
Possible solutions:
We started trying the FileSystem API, but when the specs were fully defined we figured out that a local sandbox was not enough, and we can't access the local URI due to security restrictions.
We are considering a Dropbox-like native app. The Invision Sync App is closer to what we want.
The least optimal solution would be a completely native application.
The question:
Which is the most efficient way to achieve this? Any ideas on native libraries for doing this faster? Any web-only workaround?
Thanks in advance.