GCP Storage API upload files using Java 1.6 - google-cloud-storage

We need to upload files to a GCP storage bucket using Java 1.6. Since storage SDK library support starts at Java 1.7 or above, please let me know a way forward to work with a GCP storage bucket.
I tried the Apache HTTP library, following the guidelines for setting the required HTTP header with a token, but ran into issues obtaining and refreshing the OAuth token without using the GCP SDK.
Please provide sample code or any reference for working with a GCP storage bucket from Java 1.6.

As you have seen in the documentation, the client library supports Java 7 and above. In addition, the Java version you are trying to use was deprecated more than a year ago, as explained in this post. Therefore, I would recommend switching to either Java 7 or Java 8.

There are a few different client libraries available for Google Cloud for Java, but I believe they all require a minimum of Java 7 due to Oracle dropping support for Java 6.
The good news is that Google's OAuth client library for Java seems to only require Java 6. The bad news is that it depends on the Google HTTP Client for Java library, whose current version requires Java 7. However, that library only dropped Java 6 support about 6 months ago, so you could most likely grab an older release and get the OAuth pieces working on Java 6, although I haven't tried it.
That wouldn't get you a rich SDK client, but it would at least take care of your OAuth authentication for manually-crafted API requests.
Another option would be to use Google Cloud Storage's compatibility API, the XML API, which works with a variety of third party blob storage clients.
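To make the manually-crafted-request approach concrete, here is a minimal sketch of a PUT upload against the XML API using only java.net classes that exist in Java 6. The bucket name, object name, and the way you obtain the OAuth access token are assumptions on my part — obtaining and refreshing the token is exactly the part the older OAuth client library release would handle for you:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class GcsXmlUpload {

    // XML API object URL: https://BUCKET.storage.googleapis.com/OBJECT
    static String objectUrl(String bucket, String object) {
        return "https://" + bucket + ".storage.googleapis.com/" + object;
    }

    // PUT the given bytes to the bucket, authenticating with an
    // OAuth2 bearer token obtained elsewhere. Returns the HTTP status
    // code (200 on success).
    static int upload(String bucket, String object, String token,
                      byte[] data) throws Exception {
        URL url = new URL(objectUrl(bucket, object));
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setRequestProperty("Authorization", "Bearer " + token);
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setDoOutput(true);
        OutputStream out = conn.getOutputStream();
        try {
            out.write(data);
        } finally {
            out.close();
        }
        return conn.getResponseCode();
    }
}
```

You would call something like `upload("my-bucket", "reports/out.csv", accessToken, bytes)` once you have a valid token; both names here are placeholders.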

Related

Issue: aws-encryption-sdk-java version 2.3.3 issues with credentials

We are trying to test decryption logic locally using a profile-based credentials setup. I was using KMSMasterKeyProvider.withCredentials along with ProfileCredentialsProvider.
In doing so, I realised that aws-encryption-sdk-java version 2.3.3 internally uses aws-java-sdk 1.x. This is something of a deadlock, as we are already on aws-java-sdk 2.x.
Any suggestions on how we can overcome this? It is interesting that Amazon still hasn't migrated the AWS Encryption SDK completely to 2.x.
The AWS Encryption SDK for Java is NOT part of the AWS SDK for Java — neither 1.x nor 2.x. They have different release cycles, so version 2.3.3 of the AWS Encryption SDK for Java doesn't mean you are using AWS SDK for Java 2.x.
Even in the README they say:
You don't need an Amazon Web Services (AWS) account to use the AWS Encryption SDK, but some of the example code require an AWS account, an AWS KMS key, and the AWS SDK for Java 1.x. (The AWS Encryption SDK for Java does not support the AWS SDK for Java 2.x.)
The docs repeat that:
Prerequisites
Before you install the AWS Encryption SDK for Java, be sure you have
the following prerequisites.
…
AWS SDK for Java (Optional)
The AWS Encryption SDK for Java does not require the AWS SDK for Java.
However, the AWS SDK for Java 1.x is required to use AWS Key Management Service (AWS KMS) as a master key provider.
It's also required for some of the Java code examples in this guide.
The AWS Encryption SDK for Java supports only the 1.x version of the AWS SDK for Java.
To install the AWS SDK for Java 1.x, use Apache Maven.
To import the entire AWS SDK for Java as a dependency, declare it in your pom.xml file.
To create a dependency only for the AWS KMS module, follow the instructions for specifying particular modules, and set the artifactId to aws-java-sdk-kms.
Alas, you have no choice but to use the AWS SDK for Java 1.x. But it should not be a deadlock or a conflict: 1.x and 2.x use different packages (com.amazonaws vs. software.amazon.awssdk), so both can live on the same classpath.
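To make the coexistence concrete, a pom.xml can declare both SDK generations side by side. The version numbers below are illustrative, not a recommendation — check the latest releases:

```xml
<dependencies>
  <!-- AWS Encryption SDK for Java (uses AWS SDK for Java 1.x
       internally for the KMS master key provider) -->
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-encryption-sdk-java</artifactId>
    <version>2.3.3</version>
  </dependency>
  <!-- AWS SDK for Java 1.x KMS module, required by the Encryption SDK -->
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-kms</artifactId>
    <version>1.12.150</version>
  </dependency>
  <!-- Your existing AWS SDK for Java 2.x dependencies live in the
       software.amazon.awssdk namespace, so they do not clash -->
  <dependency>
    <groupId>software.amazon.awssdk</groupId>
    <artifactId>s3</artifactId>
    <version>2.17.100</version>
  </dependency>
</dependencies>
```

The 1.x and 2.x artifacts never share class names, so the only real cost is carrying both SDKs in your build.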

Confluence migration from cloud to server

We have migrated a space from a cloud instance to a server instance. In the cloud instance we were using "PlantUML Diagrams for Confluence", but on the server we are using the "Confluence PlantUML Plugin", so the macro names differ between cloud and server: the macro name on cloud is "plantumlcloud", but on server it is "plantuml". So, after migration, pages show that "plantumlcloud" is not a valid macro. Kindly help to resolve this.
In general, migrating Confluence spaces to another application that is not running the same plugins will break any functionality provided by those plugins.
If you migrate hosting platforms, and have the equivalent version of the plugin for your new platform, created by the same developer, in most cases you will retain functionality, however there will often be differences between versions.
These differences are found especially when downgrading, and moving from cloud to server is a very definite example of a downgrade, as cloud will always run the latest version.
In general I would recommend against a migration from cloud to server; when it must be done, time should be spent ensuring compatibility with all plugins, and migration guides and plans should be made and followed.
As commented by #tgdavies, there seems to be an equivalent version of the plugin you were using on cloud, so hopefully that can resolve your issue.

IBM Bluemix - Kitura Swift - is missing a required environment variable: 'OPENAPI_SPEC'

I am stuck at the moment. I cannot obtain the source code for the mobile project and I do not know where the problem is. All my research has been without a positive result.
My intention with IBM Bluemix is to develop myself a small project only in Swift (server side + iOS) because I am iOS mobile developer.
When I try to get the code for the mobile project (iOS) I get this error:
Error Notification:
The Cloud Foundry App 'XXX' is missing a required environment variable: 'OPENAPI_SPEC'.
I want to use OpenWhisk SDK for iOS. I do not know where to set the variable OPENAPI_SPEC and what value to put in it.
I have set up a Cloud Foundry App started from "Runtime for Swift - Kitura" and a mobile project started from "Code Starter - OpenWhisk".
Can you help me with some advice or some sample?
Thank you!
If you added a Swift server side Compute to your mobile project, you will need to add an environment variable called OPENAPI_SPEC to your backend to point to a valid Open API swagger document outlining the API.
This way when you download the project, it will auto-generate an SDK corresponding with your backend's Open API.
For instance, you can set the environment variable in your app's runtime settings in the Bluemix dashboard, or with the Cloud Foundry CLI.
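If you prefer the command line, and assuming the Cloud Foundry CLI is installed and your app is named MyKituraApp (a placeholder), setting the variable looks like this — the value must point at your backend's swagger document:

```shell
# point OPENAPI_SPEC at the app's Open API (swagger) document
cf set-env MyKituraApp OPENAPI_SPEC "https://MyKituraApp.mybluemix.net/explorer/swagger.json"

# restage so the running app picks up the new environment variable
cf restage MyKituraApp
```

After the restage, downloading the mobile project again should find the spec and generate the SDK.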
And here is a valid API doc that it's using (albeit not in the most elegant Open API compliant format yet but it works).
https://updatesdk.mybluemix.net/explorer/swagger.json
The idea is that the "project" concept takes an abstracted view of a Compute runtime (Cloud Foundry, Docker, etc.) and only cares that it exposes an API compliant with the Open API specification. Using that defined Open API spec, you can dynamically generate an SDK for a "project" when it's downloaded (for iOS, Android, etc.).
If your backend Compute exposes no Open API specification at this time, and you just want to download the code of OpenWhisk for iOS, you can just deassociate that backend Compute from your mobile project for now, and it should download the code. If you ever build on top of that backend and want to reconnect it in the future, you can add it and redownload at a later time (doing a git diff or using the Bluemix CLI SDK plugin to download an SDK from your Open API specification later in your project's lifecycle).

Bluemix: Can I scan a Java ReST API using Application Security on Cloud

I am planning to use Bluemix for a ReST API development using Java. I wanted to use Application Security on Cloud for scanning the application to eliminate security concern.
Can I use it? Is there something more appropriate?
You can use the Static analysis feature of Application Security on Cloud to scan Java applications for security vulnerabilities. To accomplish this, a small utility needs to be downloaded to convert the application byte code files into an Intermediate Representation (IRX) of the code. This IRX file is uploaded to the server and scanned using trace analysis to find security vulnerabilities (the IRX file is encrypted to keep your data safe). IRX files can be generated using a small client command-line interface (CLI) that you need only download and extract to your local disk. In addition, you can run a small installer that adds static analysis plug-ins to Eclipse or Maven. Note that the Client Utility and cloud service versions must be compatible.
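For reference, generating the IRX with the client utility looks roughly like the following. The exact script name and flags are an assumption from memory of the downloadable client utility and may differ between versions, so verify them against the current documentation:

```shell
# from the bin/ directory of the extracted client utility,
# run in the directory containing your compiled .war/.jar/.class files
appscan.sh prepare -n myapp.irx
# the resulting myapp.irx file is then uploaded to the service for scanning
```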
Take a look at Getting started with Application Security on Cloud for more information.

Heroku-like services for Scala?

I love Heroku but I would prefer to develop in Scala rather than Ruby on Rails.
Does anyone know of any services like Heroku that work with Scala?
UPDATE: Heroku now officially supports Scala - see answers below for links
As of October 3rd 2011, Heroku officially supports Scala, Akka and sbt.
http://blog.heroku.com/archives/2011/10/3/scala/
Update
Heroku has just announced support for Java.
Update 2
Heroku has just announced support for Scala
Also
Check out Amazon Elastic Beanstalk.
To deploy Java applications using Elastic Beanstalk, you simply:
- Create your application as you normally would using any editor or IDE (e.g. Eclipse).
- Package your deployable code into a standard Java Web Application Archive (WAR file).
- Upload your WAR file to Elastic Beanstalk using the AWS Management Console, the AWS Toolkit for Eclipse, the web service APIs, or the Command Line Tools.
- Deploy your application.

Behind the scenes, Elastic Beanstalk handles the provisioning of a load balancer and the deployment of your WAR file to one or more EC2 instances running the Apache Tomcat application server. Within a few minutes you will be able to access your application at a customized URL (e.g. http://myapp.elasticbeanstalk.com/).

Once an application is running, Elastic Beanstalk provides several management features, such as:
- Easily deploy new application versions to running environments (or roll back to a previous version).
- Access built-in CloudWatch monitoring metrics such as average CPU utilization, request count, and average latency.
- Receive e-mail notifications through Amazon Simple Notification Service when application health changes or application servers are added or removed.
- Access Tomcat server log files without needing to log in to the application servers.
- Quickly restart the application servers on all EC2 instances with a single command.
Another strong contender is Cloud Foundry. One of the nice features of Cloud Foundry is the ability to have a local version of "the cloud" running on your laptop so you can deploy and test offline.
I started working on exactly the same thing a few weeks ago. I use Lift, which is a great framework with a lot of potential, on top of a Linux chroot environment.
I'm done with a demo version, but Linux chroot is not that stable (nor secure), so I'm now switching to FreeBSD jail on Amazon EC2, and hopefully it'll be done soon.
http://lifthub.net/
There are also other Java hosting environments, including VMForce, mentioned above.
If you are looking for a custom setup which also has the ease of deployment that heroku offers: http://dotcloud.com. They are invite only right now but I was given access in under three days. I am working on a Lift/MongoDB project there and it works well.
Off the top of my head, only VMForce comes to mind, but it's not available yet. It will be a Java-oriented service, so you'll probably have to spend a wee bit of time figuring out how to package the app.
For more discussion, there was a debate about this in 2008.
I'm not entirely sure if it's really suitable or not, but people have deployed Scala applications to Google App Engine, for example http://mawson.wordpress.com/2009/04/10/first-steps-with-scala-on-google-app-engine/
Actually, you can run Scala on Heroku right now. Don't believe it?
https://github.com/lstoll/heroku-playframework-scala
I'm not sure the tricks lstoll has used are legit, but using the new Cedar platform, where you can run custom processes, and some ingenious Gemfile hacking, he has managed to bootstrap the Java Play platform into a process. It seems to work, as he has a live site running a test page.
The Stax cloud service offers a preconfigured Lift project skeleton. Also, there is a tutorial on how to deploy a Lift project to App Engine.