How can I increase this "File Usage" limit?
Per their web hosting page (https://www.godaddy.com/hosting/web-hosting#compare), the maximum number of files you can have on any plan is 250,000. You either need to delete files or find another host.
I'm building a Flutter app that needs to open a binary file, display its contents to the user, and let them edit and save it. File sizes would be between 10 KB and 10 MB. The file also needs to live in the cloud so it can be shared and accessed from other devices.

To minimise both remote network egress charges and the user's local mobile data charges, I'm envisaging that when the user saves the file it would be saved locally rather than written to the cloud, and only written to the cloud when the user closes the file, at regular intervals, or after a period of inactivity. To keep network data charges down I would like to keep a permanent local copy, and the remote copy of the file would have a small supporting file that records who last wrote the remote file and when. When the app starts, it checks whether its local copy is up to date by reading that supporting file. The data does not need high security.
The app will run on Android, iOS, and the web, and preferably on the desktop, though I know that the Google Firebase SDK for Windows is incomplete/unavailable.
Is Google Firebase Cloud Storage the easiest and best way to do this? If not, what is the easiest way?
Are there any cloud storage providers that don't charge for network egress, just for storage?
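For what it's worth, Firebase Storage already records an updated timestamp in each object's metadata, which may let you skip the separate supporting file entirely. Below is a minimal sketch of that staleness check using the Firebase Web SDK; the path and the stored sync time are illustrative, and the Flutter/Dart API differs in names but has the same shape:

    import { getStorage, ref, getMetadata, getBytes } from "firebase/storage";

    // Returns fresh bytes if the remote copy is newer than our last sync,
    // or null if the local copy is already current (so no egress is incurred).
    async function syncIfStale(
      remotePath: string,   // e.g. "docs/myfile.bin" (illustrative)
      lastSyncedMs: number, // when we last pulled the file, stored locally
    ): Promise<Uint8Array | null> {
      const fileRef = ref(getStorage(), remotePath);
      const meta = await getMetadata(fileRef); // tiny request, no file payload
      if (Date.parse(meta.updated) <= lastSyncedMs) {
        return null; // local copy is up to date
      }
      return new Uint8Array(await getBytes(fileRef));
    }

The metadata read is a small request, so sizeable egress only happens when the file has actually changed, which matches the scheme you describe.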
Some context:
I have Strapi deployed successfully on Heroku with a MongoDB backend, and can add/edit entries. My issue comes when I upload an image using the Media Library plugin. I'm able to upload an image and have my frontend access and display it initially. After some time, an hour or so or the next day, the file's history is still present, as can be seen with this endpoint:
https://blog-back-end-green.herokuapp.com/upload/files/
However, the URL endpoint to access the media no longer works as it used to, and I get a 404 error when I follow it, e.g.
https://blog-back-end-green.herokuapp.com/uploads/avatarperson_32889bfac5.png
I'm new to Strapi, so any help/guidance is appreciated.
The docs address your question directly:
Like with project updates on Heroku, the file system doesn't support local uploading of files as they will be wiped when Heroku "cycles" the dyno. This type of file system is called ephemeral, which means the file system only lasts until the dyno is restarted (with Heroku this happens any time you redeploy or during their regular restart, which can happen every few hours or every day).

Due to Heroku's filesystem you will need to use an upload provider such as AWS S3, Cloudinary, or Rackspace. You can view the documentation for installing providers here, and you can see a list of providers from both Strapi and the community on npmjs.com.
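For example, with the AWS S3 provider on Strapi v3 (the MongoDB-era releases), the documented setup is to install strapi-provider-upload-aws-s3 and point config/plugins.js at your bucket. Roughly, per the v3 provider docs (the env variable names are whatever you configure on Heroku, and the exact shape varies by Strapi version):

    // config/plugins.js
    module.exports = ({ env }) => ({
      upload: {
        provider: 'aws-s3',
        providerOptions: {
          accessKeyId: env('AWS_ACCESS_KEY_ID'),
          secretAccessKey: env('AWS_ACCESS_SECRET'),
          region: env('AWS_REGION'),
          params: {
            Bucket: env('AWS_BUCKET'),
          },
        },
      },
    });

Note that files uploaded before the switch lived on the ephemeral disk and are already gone, so you will need to re-upload them.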
When your app runs, it consumes Heroku dyno hours.
When your app idles (automatically, after 30 minutes of inactivity) it stops consuming dyno hours, and as long as you still have dyno hours remaining, your app stays live and publicly accessible.
Generally, authentication failures return a 401 (Unauthorized) error, but on some platforms a 404 can be returned instead.
Check that your second request actually carries the correct Authorization header; a sketch of the check follows.
Also check out the roles-permissions documentation.
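A quick way to verify is to replay the failing request yourself with the header attached. A sketch (the URL is from the question above; the token variable is a placeholder):

    const jwt = process.env.STRAPI_JWT; // token your auth flow returned, e.g. from Strapi's /auth/local

    const res = await fetch("https://blog-back-end-green.herokuapp.com/upload/files/", {
      headers: { Authorization: `Bearer ${jwt}` },
    });
    console.log(res.status); // 401 = bad/missing token; 404 may hide a permissions issue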
I have previously configured Firebase Hosting successfully, but now want to change the domain name I am using.
It appears you can edit the current domain name, but as far as I can see this doesn't do anything when I enter a new domain name.
Not sure how to proceed.
Thanks,
Craig.
So I sent an email to Firebase support: the process involves removing your current domain and adding the new one, which incurs some downtime while they procure a new SSL certificate, etc.
If anyone is attempting to do this and does not want to incur any downtime, it's not too difficult. The high-level process I followed to move from olddomain.com to newdomain.com without downtime is detailed below:
Procure another temporary server (e.g. Amazon or whatever) and bring up nginx.
Deploy your static files (CSS, JS, HTML, images, etc.) to this temporary server.
Procure a cert for olddomain.com and deploy it on your server. You can get some free ones for a month if you have a search (I don't want to endorse any particular product here).
Ensure the site is running as olddomain.com on your temp server (hack your hosts file to force the domain name to point at the new temp server).
If all is good, modify your DNS for olddomain.com so it points at your temp server.
Wait a few hours to ensure all traffic is going to your temp server (look at W3C-style logs to confirm traffic is coming in).
You can now safely remove olddomain.com from Firebase Hosting and set up newdomain.com there without losing traffic to olddomain.com.
Once newdomain.com is set up and running on Firebase Hosting, configure olddomain.com to redirect to newdomain.com (a minimal redirect sketch follows these steps). You may want to leave this up for a while, depending on how much traffic you are expecting to olddomain.com.
Job done without any downtime :)
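For the final redirect step, if you would rather not keep nginx around, a few lines of Node on the temp server do the same job; a sketch (newdomain.com as in the steps above):

    import http from "node:http";

    // Permanently redirect every request for olddomain.com to newdomain.com,
    // preserving the path and query string. Needs to listen on port 80 (and
    // 443 with your olddomain.com cert if you want to catch HTTPS traffic).
    http
      .createServer((req, res) => {
        res.writeHead(301, { Location: "https://newdomain.com" + (req.url ?? "/") });
        res.end();
      })
      .listen(80);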
Hope these steps are of use to others.
Thanks.
I created a new project with the desired domain name, then switched to it using the firebase use <project_id> command from the Firebase CLI.
I have been using the Google Cloud Storage Manager link on the Google APIs console in order to upload my files.
This works great for most files: 1 KB, 10 KB, 1 MB, 10 MB, 100 MB. However, yesterday I could not upload a 3 GB file. Any idea what is wrong?
What is the best way to upload large files to Google Cloud Storage?
The web UI only supports uploads smaller than 2^32 bytes (4 gigabytes). I believe this is a JavaScript limitation.
If you need to transfer many or large files consider using gsutil:
gsutil uploads and downloads files of any size.
gsutil resumes uploads and downloads that fail partway through.
gsutil calculates an MD5 checksum to verify that the contents of each file transferred correctly.
gsutil can upload and download many files at the same time.
gsutil -m cp /path/to/*thousands-of-files* gs://my-bucket/
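If you need to do this from code rather than the shell, the official Node.js client library performs the same kind of resumable, checksummed uploads; a sketch (bucket name and file path are placeholders):

    import { Storage } from "@google-cloud/storage";

    const storage = new Storage(); // uses Application Default Credentials

    // Resumable uploads survive interruptions; crc32c validation catches corruption.
    await storage.bucket("my-bucket").upload("/path/to/big-file.bin", {
      resumable: true,
      validation: "crc32c",
    });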
In my experience, the accepted answer is not correct - maybe it once was, but something has changed.
I just uploaded a file of size 2.2GB to GCS using the web interface on Chrome 42 on Windows 8.1.
I would also point out that the question is about files larger than 2 GB; the answer derives its limit from 2^32 bytes, which is 4 GB, not 2 GB. So maybe the limit really is 2^32 bytes (4 GB) - I haven't tried anything that big.
(It is still a good idea to use gsutil for large files.)
I want to limit the browser's internet speed for local files, in order to test my website locally. Is there any way/tool/extension with which I can set the browsing speed to 56K, 256K, etc.?
I found tools like NetLimiter and SpeedLimit, but they only work for remote files.
Note that "remote files" can also include "a locally running Apache" or "a locally running nginx". You can just install a server on your local system and then use whatever bandwidth-limiting proxies you want to rate limit your bandwidth to your local web server.