How to create a volume in SoftLayer using the REST API
There is no option to create a volume.
These examples may help you (a REST ordering sketch follows the links):
Create performance and endurance storage:
API for Performance and Endurance storage (Block storage)
List network storage / get the Network_Storage id:
How can we capture storage_Id while placing order for storage (either performance or endurance)?
Also, to cancel network storage, please see:
How to cancel SoftLayer order for Block Storage?
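For illustration, here is a rough Python sketch of placing such an order through the SoftLayer REST API. Treat it as a sketch under assumptions: the package id, location id, and price ids are placeholders you must look up yourself (via SoftLayer_Product_Package, as the linked answers describe), and you can call verifyOrder instead of placeOrder to dry-run the order first:

```python
# Sketch: order a performance block storage volume via the SoftLayer
# REST API. All ids below are placeholders; look up real values first.
import requests

SL_USERNAME = "your_softlayer_user"   # placeholder credentials
SL_API_KEY = "your_api_key"

order = {
    "parameters": [{
        "complexType": "SoftLayer_Container_Product_Order_Network_PerformanceStorage_Iscsi",
        "packageId": 222,         # placeholder: performance storage package
        "location": 265592,       # placeholder: datacenter id
        "quantity": 1,
        "prices": [{"id": 40672}, {"id": 40742}],  # placeholder price ids
        "osFormatType": {"id": 12, "keyName": "LINUX"},
    }]
}

url = "https://api.softlayer.com/rest/v3/SoftLayer_Product_Order/placeOrder.json"
resp = requests.post(url, auth=(SL_USERNAME, SL_API_KEY), json=order)
resp.raise_for_status()
print(resp.json())
```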
Regards.
Currently, it is not possible to create a volume in SoftLayer through either the API or the SoftLayer portal.
Regards
Instead of using Google Cloud or AWS storage buckets, how do we create our own scalable storage bucket?
For example, what would the options be if a photo were hit 1 billion times a day? Assume the photo is user generated, not image/app generated.
If I have asked this in the wrong place, please redirect me.
As an alternative to GKE or AWS object storage, you could consider using something like MinIO.
It's easy to set up, and it can run in Kubernetes. All you need is a PersistentVolumeClaim to write your data to, although you could evaluate the solution with emptyDirs and ephemeral storage.
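To give an idea of what application code against MinIO looks like, here is a minimal sketch assuming the MinIO Python client (pip install minio); the endpoint, credentials, and bucket/object names are placeholders:

```python
# Minimal sketch using the MinIO Python client.
# Endpoint, credentials, and bucket/object names are placeholders.
from datetime import timedelta

from minio import Minio

client = Minio(
    "minio.example.com:9000",        # placeholder endpoint
    access_key="YOUR_ACCESS_KEY",    # placeholder credentials
    secret_key="YOUR_SECRET_KEY",
    secure=True,
)

# Create the bucket on first use.
if not client.bucket_exists("photos"):
    client.make_bucket("photos")

# Upload a user-generated photo from local disk.
client.fput_object("photos", "user123/avatar.jpg", "/tmp/avatar.jpg",
                   content_type="image/jpeg")

# Hand out a time-limited download URL that clients hit directly,
# keeping the application server out of the data path.
url = client.presigned_get_object("photos", "user123/avatar.jpg",
                                  expires=timedelta(hours=1))
print(url)
```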
A less obvious alternative would be something like Ceph. It's more complicated to set up, but it goes beyond object storage: if you also need block storage for your Kubernetes cluster, Ceph can provide it (Rados Block Devices) whilst offering object storage as well (Rados Gateways).
The system I am building currently stores videos in Google Cloud Storage; my server returns the link from Google Cloud Storage, which is used to play the video on mobile platforms. Is there a limit on how many users can access that link at the same time? Thank you!
All of the known limits for Cloud Storage are listed in the documentation. It says:
There is no limit to reads of objects in a bucket, which includes reading object data, reading object metadata, and listing objects. Buckets initially support roughly 5000 object reads per second and then scale as needed.
So, no, there are effectively no limits to the number of concurrent downloads.
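As a side note on serving those links: if the bucket is not public, a common pattern is to have the server return a time-limited signed URL instead. A minimal sketch, assuming the google-cloud-storage Python library and placeholder bucket/object names (V4 signing requires service-account credentials):

```python
# Sketch: return a time-limited download link for a stored video.
# Bucket and object names are placeholders.
from datetime import timedelta

from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-video-bucket").blob("videos/clip.mp4")

# V4 signed URLs are valid for at most 7 days; any number of clients
# can download through the resulting link concurrently.
url = blob.generate_signed_url(version="v4", expiration=timedelta(hours=1))
print(url)
```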
Is there a way to apply an upload limit for a Google storage bucket per day/month/year?
Is there a way to apply a limit on the amount of network traffic?
Is there a way to apply a limit on Class A operations?
Is there a way to apply a limit on Class B operations?
I found only "Queries per 100 seconds per user" and "Queries per day" using the
https://cloud.google.com/docs/quota instructions, but those are JSON API quotas.
(I am not even sure what kind of API is used inside the StorageClient C# client class.)
To define quotas (and, for that matter, SLOs), you need an SLI, a service level indicator: metrics on what you want to observe.
That is not the case here. Cloud Storage has no indicator for the volume of data per day, so you have no built-in indicator, no metric, and no quota.
If you want one, you have to build it yourself: wrap all the Cloud Storage calls in a service that counts the volume of blobs per day, and then apply your own rules to this custom indicator (a sketch follows below).
Of course, to prevent any bypass, you have to deny direct access to the buckets and grant only your "indicator service" access to them. The same goes for bucket creation, so that new buckets are registered in your service.
Not an easy task...
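Here is a hypothetical sketch of that indicator-service idea: every upload goes through a wrapper that counts bytes per day and rejects uploads once a self-imposed quota is hit. The quota logic and names like DAILY_LIMIT_BYTES are made up for illustration; only the final upload call is the real google-cloud-storage API:

```python
# Hypothetical "indicator service": count upload volume per day and
# enforce a custom quota before forwarding to Cloud Storage.
import datetime
from collections import defaultdict

from google.cloud import storage

DAILY_LIMIT_BYTES = 100 * 1024**3   # placeholder: 100 GiB per day
_uploaded = defaultdict(int)        # date -> bytes (use a real DB in practice)
_client = storage.Client()

def quota_checked_upload(bucket_name: str, object_name: str, data: bytes):
    """Upload only if today's cumulative volume stays under the quota."""
    today = datetime.date.today()
    if _uploaded[today] + len(data) > DAILY_LIMIT_BYTES:
        raise RuntimeError("daily upload quota exceeded")
    _client.bucket(bucket_name).blob(object_name).upload_from_string(data)
    _uploaded[today] += len(data)
```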
The sync tool takes an s3:// format address - should this work with Digital Ocean?
It seems that the more appropriate way to achieve this sync task would be the Storage Transfer Service provided by Google Cloud Platform; instead of configuring the URL for Spaces, you can create a job and specify that your transfer will be done using a URL list (via the Cloud Console or the Storage Transfer API).
If you have a lot of objects in your Spaces buckets, you could use the Spaces API to list a bucket's contents and, if you prefer, use the outcome of that API to create a transfer job using the Storage Transfer API (you can take a look at the transfer spec here).
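Since Spaces is S3-compatible, one plausible way to produce that URL list is with boto3 pointed at the Spaces endpoint. A sketch, with region, bucket, and credentials as placeholders (check the transfer spec for the exact URL-list file format):

```python
# Sketch: list a DigitalOcean Space via its S3-compatible API and print
# public object URLs, which can feed a Storage Transfer URL-list job.
import boto3

session = boto3.session.Session()
s3 = session.client(
    "s3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",  # placeholder region
    aws_access_key_id="SPACES_KEY",                      # placeholder creds
    aws_secret_access_key="SPACES_SECRET",
)

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-space"):       # placeholder bucket
    for obj in page.get("Contents", []):
        print(f"https://my-space.nyc3.digitaloceanspaces.com/{obj['Key']}")
```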
Is there a way, through either the IBM Cloud API or the SoftLayer API, to programmatically run/schedule/set up snapshots on an endurance storage device (i.e. an iSCSI drive)?
I've looked through the documentation, but have not found anything.
You need to take a look at these methods:
https://sldn.softlayer.com/reference/services/softlayer_network_storage/createsnapshot
The method above will allow you to create a new manual snapshot.
https://sldn.softlayer.com/reference/services/softlayer_network_storage/enablesnapshots
The method above will allow you to schedule the snapshots.
See below for some code examples:
https://softlayer.github.io/php/enableSnapshots/
https://softlayer.github.io/rest/createsnapshot/
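And here is a rough Python/requests sketch of those same two REST calls. The credentials and volume id are placeholders, and the enableSnapshots parameter order is assumed from the SLDN reference linked above, so double-check it there:

```python
# Sketch: manual and scheduled snapshots on a SoftLayer endurance volume
# via the REST API. Credentials and volume id are placeholders.
import requests

SL_USERNAME = "your_softlayer_user"
SL_API_KEY = "your_api_key"
VOLUME_ID = 12345678  # id of the endurance (iSCSI) volume
BASE = "https://api.softlayer.com/rest/v3"
auth = (SL_USERNAME, SL_API_KEY)

# One-off manual snapshot (createSnapshot takes a notes string).
resp = requests.post(
    f"{BASE}/SoftLayer_Network_Storage/{VOLUME_ID}/createSnapshot.json",
    auth=auth, json={"parameters": ["manual snapshot via REST"]})
resp.raise_for_status()
print(resp.json())

# Scheduled snapshots: daily at 02:30, keeping 7 snapshots.
# Assumed parameter order: scheduleType, retentionCount, minute, hour,
# dayOfWeek (dayOfWeek only applies to WEEKLY schedules).
resp = requests.post(
    f"{BASE}/SoftLayer_Network_Storage/{VOLUME_ID}/enableSnapshots.json",
    auth=auth, json={"parameters": ["DAILY", 7, 30, 2, "SUNDAY"]})
resp.raise_for_status()
print(resp.json())
```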