Store data inside an NFT using Solana - metadata

I'm trying to store the metadata of an image or a video in a Solana NFT without going for IPFS or other cloud services. Is it possible to do this?

Related

How would you design a video upload system using GCP and Go?

I want to build a tiny story system where users can upload videos.
I'm using Firebase and the frontend will be in Flutter.
I'm struggling a bit to design the flow from the frontend to my Go backend. What's the simplest way to achieve this?
From what I understand, I could use two different flows:
1. The frontend asks the Go backend for a signed upload URL.
2. The backend generates a GCP Storage signed URL.
3. The frontend uploads the video.
4. The frontend sends the link to the backend.
5. The backend transcodes the video.
6. The backend stores the link in Firestore.
Or:
1. The frontend uses Firebase Storage directly.
2. The frontend sends the link to the backend.
What are the benefits of using a signed upload URL vs. using Firebase Storage directly?
Thanks in advance
What are the benefits of using a signed upload URL vs. using Firebase Storage directly?
Firebase Storage offers the simplicity of security rules to restrict access, while using GCS directly will require a backend to generate signed URLs. I would prefer signed URLs when the system does not use Firebase Authentication, or when you want some validation before the file is uploaded in the first place. However, most of that can be done using security rules as well.
When using Firebase Storage, the upload is simpler: you just call the uploadBytes() function, while signed URLs require some additional code. An example can be found in this
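As a rough sketch (not the example linked above): assuming a Go backend using the cloud.google.com/go/storage client, handing out a V4 signed PUT URL could look something like this; the bucket name, object naming, credentials file and service-account email are all placeholders.

```go
package main

import (
	"fmt"
	"net/http"
	"os"
	"time"

	"cloud.google.com/go/storage"
)

// signedUploadURL returns a V4 signed URL that lets the frontend PUT one
// object into the bucket for the next 15 minutes.
func signedUploadURL(bucket, object string) (string, error) {
	key, err := os.ReadFile("service-account.pem") // placeholder credentials
	if err != nil {
		return "", err
	}
	return storage.SignedURL(bucket, object, &storage.SignedURLOptions{
		Scheme:         storage.SigningSchemeV4,
		Method:         "PUT",
		Expires:        time.Now().Add(15 * time.Minute),
		ContentType:    "video/mp4",
		GoogleAccessID: "uploader@my-project.iam.gserviceaccount.com", // placeholder
		PrivateKey:     key,
	})
}

func main() {
	http.HandleFunc("/upload-url", func(w http.ResponseWriter, r *http.Request) {
		url, err := signedUploadURL("my-videos-bucket", "videos/"+r.URL.Query().Get("name"))
		if err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
			return
		}
		fmt.Fprint(w, url)
	})
	http.ListenAndServe(":8080", nil)
}
```

The frontend then PUTs the file straight to that URL with the matching Content-Type, so no Google credentials ever reach the client.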
I am not sure what you mean by 'transcode the video', but you can use Cloud Storage triggers for Cloud Functions to run actions such as adding the URL to Firestore or processing the video once a file is uploaded.
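For the "do something once the file lands" part, here is a minimal sketch of a Go CloudEvent function triggered by the bucket, assuming the functions-framework-go and Firestore clients; the project ID, collection and field names are made up:

```go
package videofn

import (
	"context"
	"fmt"

	"cloud.google.com/go/firestore"
	"github.com/GoogleCloudPlatform/functions-framework-go/functions"
	"github.com/cloudevents/sdk-go/v2/event"
)

// Fields of the Cloud Storage object payload we care about.
type storageObject struct {
	Bucket      string `json:"bucket"`
	Name        string `json:"name"`
	ContentType string `json:"contentType"`
}

func init() {
	functions.CloudEvent("OnVideoUploaded", onVideoUploaded)
}

// onVideoUploaded runs when a new object is finalized in the bucket and
// records its location in Firestore (transcoding could be kicked off here too).
func onVideoUploaded(ctx context.Context, e event.Event) error {
	var obj storageObject
	if err := e.DataAs(&obj); err != nil {
		return err
	}
	fs, err := firestore.NewClient(ctx, "my-project") // placeholder project ID
	if err != nil {
		return err
	}
	defer fs.Close()
	_, err = fs.Collection("videos").NewDoc().Set(ctx, map[string]interface{}{
		"url":         fmt.Sprintf("gs://%s/%s", obj.Bucket, obj.Name),
		"contentType": obj.ContentType,
	})
	return err
}
```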

Hosting on Firebase

I was trying to host a website on Firebase and this error keeps coming up (screenshot of the Firebase CLI error).
If you want to initialize a Firebase project using the CLI (like you're doing), and you want to use Firestore or Storage, you first have to go into the Firebase admin console, click on Firestore and/or Storage, and select a location/region for your database/storage buckets. The error you're getting is that you're trying to set up Storage but you haven't yet set a Storage location for your buckets.

OpenStack Swift Object Storage File Access?

I'm setting up OpenStack Swift object storage via IBM Bluemix for a few needs in our application. First of all, I need a place to securely upload customer files via our API, which this is a perfect solution for.
The portion I'm struggling with is the public piece. Our SaaS product has certain images that are uploaded during account provisioning. These files need to be publicly accessible via a URL.
I'm able to get the Swift SDK to retrieve files using both token and username/password methods. However, I'm not able to find a way to generate a public URL, or to set public access on objects or containers. The documentation seems to be lacking on this too.
Is this even possible? Should I be using a different method for storing public assets?
It is. You can either create temporary URLs for the resources you want to access, or change the settings for your Object Store container to allow read access.
To modify the container ACL, follow Public URLs For Objects In Bluemix Object Storage Service
Hints for creating URLs just for some files are at How to create temporary URL for Swift object storage using REST API?
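As a rough illustration of the temp-URL route (assuming a key has already been set on the account via X-Account-Meta-Temp-URL-Key), the signature is just an HMAC-SHA1 over the method, expiry and object path; the host, path and key below are placeholders:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha1"
	"fmt"
	"time"
)

// tempURL builds a Swift temporary URL for a GET on objectPath, e.g.
// "/v1/AUTH_account/container/object", signed with the account's temp-URL key.
func tempURL(host, objectPath, tempURLKey string, ttl time.Duration) string {
	expires := time.Now().Add(ttl).Unix()
	mac := hmac.New(sha1.New, []byte(tempURLKey))
	fmt.Fprintf(mac, "GET\n%d\n%s", expires, objectPath)
	return fmt.Sprintf("https://%s%s?temp_url_sig=%x&temp_url_expires=%d",
		host, objectPath, mac.Sum(nil), expires)
}

func main() {
	fmt.Println(tempURL("dal.objectstorage.open.softlayer.com",
		"/v1/AUTH_abc123/logos/customer-42.png", "my-temp-url-key", time.Hour))
}
```

For assets that should always be public, the container-ACL route from the first link is simpler: a read ACL such as .r:* on the container removes the need for per-object signing.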

Image upload from browser to amazon S3 using perl

I want to upload an image from the browser to an Amazon S3 bucket using Perl. I have found some links about saving the image to our server and then migrating it to the S3 bucket. I want to know whether there is any option to upload the image directly.
I Googled "browser upload to Amazon S3". The first result was this link - Browser-Based Uploads Using POST (AWS Signature Version 2). The actual details are on this page. Basically, you set the form's action to the URL of your S3 bucket and put all of the authentication stuff in the form's hidden fields.
But wouldn't you be worried that a user could incur huge S3 charges on your behalf?
JavaScript and the AWS.S3 class would be a good bet if you want to do that from a browser.
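The question asks about Perl, but the signing step is small in any language; here is a hedged Go sketch of the Signature Version 2 policy signing that page describes (bucket name, key prefix, ACL and size limit are made up), and the same base64 + HMAC-SHA1 steps port directly to Perl. The policy's conditions, together with its expiration, are also what bound what a visitor can upload with your credentials.

```go
package main

import (
	"crypto/hmac"
	"crypto/sha1"
	"encoding/base64"
	"fmt"
	"time"
)

// signPostPolicy base64-encodes the POST policy document and signs it with
// the AWS secret key (Signature Version 2 browser-based uploads).
func signPostPolicy(secretKey string) (policyB64, signature string) {
	policy := fmt.Sprintf(`{
  "expiration": "%s",
  "conditions": [
    {"bucket": "my-upload-bucket"},
    ["starts-with", "$key", "uploads/"],
    {"acl": "private"},
    ["content-length-range", 0, 10485760]
  ]
}`, time.Now().Add(time.Hour).UTC().Format("2006-01-02T15:04:05Z"))

	policyB64 = base64.StdEncoding.EncodeToString([]byte(policy))
	mac := hmac.New(sha1.New, []byte(secretKey))
	mac.Write([]byte(policyB64))
	signature = base64.StdEncoding.EncodeToString(mac.Sum(nil))
	return
}

func main() {
	policy, sig := signPostPolicy("YOUR_SECRET_KEY")
	// These two values go into the form's hidden "policy" and "signature"
	// fields, next to "AWSAccessKeyId", "key" and "acl".
	fmt.Println(policy, sig)
}
```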

How to upload Files to Cloud Storage?

I have a Google Cloud Endpoints API which is using Cloud SQL to store data. I want to provide a file upload for clients, and the files should be stored in Cloud Storage, but I also want to store the file metadata and the storage URL in Cloud SQL.
What's the best way to do this?
Can I upload files through Cloud Endpoints, or do I need an extra upload servlet?
How can I update my database entities, which need a reference to the uploaded files?
Any examples on how to combine those 3 technologies?
Assuming your clients are not added to your Google Cloud project (which is typically the case), your users don't have write access to your GCS bucket. You can either submit files to your application and move them to GCS from there (not recommended, as it consumes more network and CPU), or, better, submit to GCS directly.
To let the client write to your GCS bucket directly, you will need to either:
1. put your access key on the client for write access (not recommended, unless the client is used by a limited set of trusted people), or
2. generate a time-bound token and put it on the client as a signed URL to upload directly.
Endpoints APIs themselves cannot do this, but you can generate the signed GCS URL on the server, fetch it via Endpoints on the client, set it as the form action (on a web client; other clients have similar ways to do a signed upload), and submit the form to upload the file:
<form action="SIGNED_URL_FROM_ENDPOINTS" method="post" enctype="multipart/form-data">
  <input type="file" name="file"> <input type="submit" value="Upload"> </form>
I don't see open-source code out there doing exactly this, but the closest is this project, which does generate the signed URL with a timeout (the only unintuitive part).
The best way to update the metadata in your database is to watch the GCS bucket using 'Object Change Notifications'. Another way is to send the metadata to your server from the client itself, which can be an Endpoints call. You can also use a mix of both, where the metadata goes to the server via Endpoints even before the file is uploaded, and the notification updates the record to confirm that the file is available to serve.
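A bare-bones sketch of that notification side, assuming Object Change Notifications have been configured to POST to your application; the header name and payload fields follow the GCS notification format, while the route and the database update are placeholders:

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// gcsObject holds the fields of the notification payload we care about.
type gcsObject struct {
	Bucket      string `json:"bucket"`
	Name        string `json:"name"`
	MediaLink   string `json:"mediaLink"`
	ContentType string `json:"contentType"`
}

func notificationHandler(w http.ResponseWriter, r *http.Request) {
	state := r.Header.Get("X-Goog-Resource-State") // "sync", "exists" or "not_exists"
	if state == "sync" {                           // initial handshake, nothing to process
		w.WriteHeader(http.StatusOK)
		return
	}
	var obj gcsObject
	if err := json.NewDecoder(r.Body).Decode(&obj); err != nil {
		http.Error(w, "bad payload", http.StatusBadRequest)
		return
	}
	// TODO: update the Cloud SQL record for obj.Name with obj.MediaLink here.
	log.Printf("object %s in bucket %s is now %s", obj.Name, obj.Bucket, state)
	w.WriteHeader(http.StatusOK)
}

func main() {
	http.HandleFunc("/gcs-notifications", notificationHandler)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```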