Can I upload InputStream data to GCS? If so, how? And what do BlobWriteOption and BlobTargetOption mean in the create function?
public void upload(String objectName, InputStream stream) throws IOException {
    Credentials credentials = GoogleCredentials.fromStream(new FileInputStream(jsonkey));
    Storage storage = StorageOptions.newBuilder()
            .setCredentials(credentials)
            .setProjectId(projectId)
            .build()
            .getService();
    BlobId blobId = BlobId.of(bucketName, objectName);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).build();
    // storage.create(blobInfo, stream, ??);
    storage.create(blobInfo, content, options);
    log.info("File uploaded in {}", objectName);
}
But I found this deprecation warning:
The method create(BlobInfo, InputStream, Storage.BlobWriteOption...) from the type Storage is deprecated
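On the naming question: Storage.BlobTargetOption applies to the create overloads that take the whole content in memory (for example create(BlobInfo, byte[], BlobTargetOption...)), while Storage.BlobWriteOption applies to streaming writes (writer(...) and createFrom(...)); both carry write preconditions such as doesNotExist() or generationMatch(). Since the InputStream overload of create is deprecated, createFrom is its documented replacement. A minimal sketch, assuming jsonkey, projectId, and bucketName are fields as in the snippet above:

public void upload(String objectName, InputStream stream) throws IOException {
    Credentials credentials = GoogleCredentials.fromStream(new FileInputStream(jsonkey));
    Storage storage = StorageOptions.newBuilder()
            .setCredentials(credentials)
            .setProjectId(projectId)
            .build()
            .getService();
    BlobInfo blobInfo = BlobInfo.newBuilder(BlobId.of(bucketName, objectName)).build();
    // createFrom streams the content to GCS and replaces the deprecated
    // create(BlobInfo, InputStream, BlobWriteOption...) overload; the
    // doesNotExist() precondition makes the upload fail if the object already exists.
    storage.createFrom(blobInfo, stream, Storage.BlobWriteOption.doesNotExist());
    log.info("File uploaded in {}", objectName);
}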
I want to upload an image from the Java Firebase Admin SDK, save its path to Firestore, and retrieve it later from the saved URL in a Flutter application. Below is the code with which I upload the image to Cloud Storage.
public String import_to_storage(BufferedImage bi, String imgName, int folderName) {
    byte[] bytes = Compression.compress_image(bi);
    Bucket bucket = StorageClient.getInstance().bucket();
    // Create an empty placeholder object so the "folder" exists
    Blob folderCreated = bucket.create(String.valueOf(folderName) + "/", "".getBytes());
    // bucket.get("images/").delete(); // Delete the created folder
    Blob importImage = bucket.create(String.valueOf(folderName) + "/" + imgName, bytes, "image/jpeg");
    return importImage.getSelfLink();
}
The code to retrieve the image from the saved URL is:

final httpsReference = FirebaseStorage.instance
    .refFromURL("https://firebasestorage.googleapis.com/b/YOUR_BUCKET/o/images%20stars.jpg")
    .getDownloadUrl();

but the app crashes immediately.
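A side note on the crash (a guess, not a confirmed diagnosis): getSelfLink() returns the Cloud Storage JSON API link, not a tokenized Firebase download URL, so refFromURL may be rejecting the stored value. One option on the Java side is to store a signed URL instead; a minimal sketch using the google-cloud-storage Blob.signUrl method:

// Requires: import java.net.URL; import java.util.concurrent.TimeUnit;
// Instead of returning importImage.getSelfLink(), return a time-limited
// signed URL that the Flutter app can load directly (e.g. Image.network),
// with no refFromURL call needed. Assumes the Admin SDK's service account
// credentials are able to sign URLs.
URL signedUrl = importImage.signUrl(7, TimeUnit.DAYS);
return signedUrl.toString();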
How can I upload an image to S3 with public access using Flutter Amplify?
In my current Flutter project, I can't pass the ACL: public-read property while uploading files to S3 using Amplify.
Because of this, whenever I upload a new file to S3, I have to make it public manually.
I just want to upload new files with public read access for everyone.
I found some solutions for JavaScript projects but not for Flutter.
Below is the method I'm using to upload.
Future<String> uploadFile(String fileName, File local) async {
  try {
    Map<String, String> metadata = <String, String>{};
    metadata['name'] = 'filename';
    metadata['desc'] = 'A file';
    S3UploadFileOptions options = S3UploadFileOptions(
        accessLevel: StorageAccessLevel.guest, metadata: metadata);
    UploadFileResult result = await Amplify.Storage.uploadFile(
        key: fileName, local: local, options: options);
    return result.key;
  } catch (e) {
    print('UploadFile Err: ' + e.toString());
  }
  return null;
}
I think you should be using Dio to declare the client object that will be used to post the request.
You can find example code in the following answer.
So far, Flutter Amplify does not provide any option to upload images with public access; it always uploads with private read access.
So I changed a few things in my project, as described below.
Before the Amplify integration, I uploaded images to S3 and stored their URLs on my server; wherever I had to display an image, I fetched its URL from my server and loaded it.
Now I store the key (the one Amplify uses to upload the image to S3) on my server, and to display an image I get its URL from Amplify using that key.
Amplify adds a token to the image URL, with a default validity of 7 days:
Future<String> getUrl(String key) async {
  try {
    S3GetUrlOptions options = S3GetUrlOptions(
        accessLevel: StorageAccessLevel.guest, expires: 10000);
    GetUrlResult result = await Amplify.Storage.getUrl(key: key, options: options);
    String url = result.url;
    return url;
  } catch (e) {
    print('GetUrl Err: ' + e.toString());
  }
  return null;
}
So it can be displayed in an ImageView.
I have a folder called "myfolder" within a Cloud Storage bucket. It has files like a.log, b.log, etc. How can I programmatically delete all of these files from the folder in the bucket?
I would like some Java example code to do it.
I framed this sample by taking snippets from our production code base. Please note the usage of setPrefix(folder) to filter the contents of a bucket.
So the logic is: list all contents of the bucket filtered by the folder name as a prefix, then delete each object.
import java.io.IOException;

import com.google.api.client.googleapis.auth.oauth2.GoogleCredential;
import com.google.api.client.http.*;
import com.google.api.client.http.javanet.NetHttpTransport;
import com.google.api.client.json.JsonFactory;
import com.google.api.client.json.jackson2.JacksonFactory;
import com.google.api.services.storage.Storage;
import com.google.api.services.storage.model.*;

public void deleteFolder(String bucketName, String folder) throws IOException {
    HttpTransport httpTransport = new NetHttpTransport();
    JsonFactory jsonFactory = new JacksonFactory();
    GoogleCredential.Builder credentialBuilder = new GoogleCredential.Builder();
    final GoogleCredential credential = credentialBuilder
            .setTransport(httpTransport).setJsonFactory(jsonFactory)
            .setServiceAccountId("YourServiceAccountId")
            // .setServiceAccountPrivateKeyFromP12File(new File("PrivateKeyFile"))
            .build();
    Storage storage = new Storage(httpTransport, jsonFactory,
            new HttpRequestInitializer() {
                public void initialize(HttpRequest request) throws IOException {
                    credential.initialize(request);
                }
            });
    // PLEASE NOTE THE USAGE OF setPrefix(folder) TO FILTER ITEMS IN THE FOLDER
    Objects objectsInFolder = storage.objects().list(bucketName).setPrefix(folder).execute();
    for (StorageObject object : objectsInFolder.getItems()) {
        storage.objects().delete(bucketName, object.getName()).execute();
    }
}
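For completeness, the same prefix-then-delete logic with the newer google-cloud-storage client is shorter; a sketch assuming application-default credentials. Note that iterateAll() also follows pagination, which the sample above does not:

import com.google.api.gax.paging.Page;
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public void deleteFolder(String bucketName, String folder) {
    Storage storage = StorageOptions.getDefaultInstance().getService();
    // List only the objects whose names start with the folder prefix.
    Page<Blob> blobs = storage.list(bucketName, Storage.BlobListOption.prefix(folder + "/"));
    for (Blob blob : blobs.iterateAll()) {
        blob.delete();
    }
}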
I'm trying to simply upload a new blob to an Azure Storage container using WebClient, like this:
var sas = "[a new generated sas with Read, Write, List & Delete permissions]";
var sData = "This is a test!";
var sEndPoint = "http://myaccount.blob.core.windows.net/mycontainer/MyTest.txt" + sas;
var clt = new WebClient();
var res = await clt.UploadStringTaskAsync(sEndPoint, "PUT", sData);
This is giving me a "(400) Bad Request." error. Am I doing anything wrong here?
Thanks
(By the way, I need to use REST instead of the client library since I'm in a Silverlight project.)
You would need to define a request header (x-ms-blob-type) for the blob type and set its value to BlockBlob. Also, for PUT requests you would need to define the Content-Length request header as well. I wrote a blog post on Shared Access Signatures and performing some blob operations with them (with both the REST API and the Storage Client library), which you can read here: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/.
Here's the code from that post for uploading a blob. It uses HttpWebRequest/HttpWebResponse instead of WebClient:
static void UploadBlobWithRestAPISasPermissionOnBlobContainer(string blobContainerSasUri)
{
    string blobName = "sample.txt";
    string sampleContent = "This is sample text.";
    int contentLength = Encoding.UTF8.GetByteCount(sampleContent);
    string queryString = (new Uri(blobContainerSasUri)).Query;
    string blobContainerUri = blobContainerSasUri.Substring(0, blobContainerSasUri.Length - queryString.Length);
    string requestUri = string.Format(CultureInfo.InvariantCulture, "{0}/{1}{2}", blobContainerUri, blobName, queryString);
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(requestUri);
    request.Method = "PUT";
    request.Headers.Add("x-ms-blob-type", "BlockBlob");
    request.ContentLength = contentLength;
    using (Stream requestStream = request.GetRequestStream())
    {
        requestStream.Write(Encoding.UTF8.GetBytes(sampleContent), 0, contentLength);
    }
    using (HttpWebResponse resp = (HttpWebResponse)request.GetResponse())
    {
    }
}
When testing against the blob emulator, this is the code I needed to get it working:
var connection = ConfigurationManager.AppSettings["AzureStorageConnectionString"];
var storageAccount = CloudStorageAccount.Parse(connection);
var client = new WebClient();
client.Headers.Add("x-ms-blob-type", "BlockBlob");
client.Headers.Add("x-ms-version", "2012-02-12");
client.UploadData(string.Format(@"{0}/$root/{1}{2}", storageAccount.BlobEndpoint, myFileName, sharedAccessSignature), "PUT", _content);
I have created a bucket on Amazon S3 and kept some images in a folder inside this bucket. All the images are private, and I am using Zend's Zend_Service_Amazon_S3 class.
How can I access the private images?
You can do this by generating a signed URL for the private object, like this:
public function get_s3_signed_url($bucket, $resource, $AWS_S3_KEY, $AWS_s3_secret_key, $expire_seconds) {
    $expires = time() + $expire_seconds;
    // S3 signed URL creation
    $string_to_sign = "GET\n\n\n{$expires}\n/".str_replace(".s3.amazonaws.com", "", $bucket)."/$resource";
    $signature = urlencode(base64_encode(hash_hmac("sha1", utf8_encode($string_to_sign), $AWS_s3_secret_key, TRUE)));
    $authentication_params = "AWSAccessKeyId=".$AWS_S3_KEY;
    $authentication_params .= "&Expires={$expires}";
    $authentication_params .= "&Signature={$signature}";
    return $link = "http://s3.amazonaws.com/{$bucket}/{$resource}?{$authentication_params}";
}
Now use this URL to access the object.
Try this; it returns the binary data for the file stored in the Amazon S3 bucket:
require_once 'Zend/Service/Amazon/S3.php';
$s3 = new Zend_Service_Amazon_S3($my_aws_key, $my_aws_secret_key);
echo $s3->getObject("my-own-bucket/myobject");
Documentation is here: http://framework.zend.com/manual/de/zend.service.amazon.s3.html