Can an Apache Storm setup be done on ECS Fargate? - amazon-ecs

I have an application that uses Apache Storm and I want to move it to the AWS cloud. I'm looking for information on whether it is possible to migrate/deploy this on ECS with Fargate.
If not, is there an equivalent AWS service?

Related

Can I run multiple services at the same time on the same ECS Fargate cluster

I have an ECS Fargate cluster up and running, and it has one service and one task definition attached to it.
The task definition already describes two container images. This cluster is up and running.
Can I create a new service for another application and configure it with this existing ECS cluster?
If yes, will both services run simultaneously?
From the AWS documentation regarding Amazon ECS clusters:
An Amazon ECS cluster is a logical grouping of tasks or services. Your tasks and services are run on infrastructure that is registered to a cluster.
So you should be able to run multiple services in a cluster, each attached to its related task definition.
Source: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/clusters.html
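To make this concrete, here is a minimal sketch of the parameters you might pass to the ECS create-service API to register a second service on the same cluster. The cluster, service, and task-definition names are assumptions for illustration:

```python
# Hypothetical request parameters for registering a second service on an
# existing ECS cluster; you would pass these to boto3's
# ecs_client.create_service(**second_service_params) or map them to the
# equivalent `aws ecs create-service` CLI flags.
second_service_params = {
    "cluster": "my-existing-cluster",          # assumed name of the running cluster
    "serviceName": "second-app-service",       # assumed name for the new service
    "taskDefinition": "second-app-taskdef:1",  # assumed task definition family:revision
    "desiredCount": 1,
    "launchType": "FARGATE",
}
```

Both services then run side by side: ECS schedules each service's tasks independently within the same logical cluster.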

How to connect to AWS ECS cluster?

I have successfully created an ECS cluster (EC2 Linux + Networking). Is it possible to log in to the cluster to perform some administrative tasks? I have not deployed any containers or tasks to it yet. I can't find any hints for it in the AWS console or AWS documentation.
The "cluster" is just a logical grouping of resources; it isn't a server you can log into. You perform actions on the cluster via the AWS console or the AWS API. You can, however, connect to the EC2 instances managed by the ECS cluster individually, via the standard SSH method you would use for any other EC2 Linux server.
ECS takes care of most of the administrative work for you; you simply deploy and manage your applications on ECS. If you set up ECS correctly, you will never have to connect to the instances.
Follow these instructions to deploy your service (docker image): https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-service.html
You can also use CloudWatch to store container logs, so that you don't have to connect to the instances to check logs: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/using_awslogs.html
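As a sketch of the CloudWatch logging piece: the awslogs driver is configured per container inside the task definition. The container name, image, log group, and region below are assumptions:

```python
# Hypothetical container definition fragment from an ECS task definition,
# enabling the awslogs log driver so the container's stdout/stderr is
# shipped to CloudWatch Logs instead of staying on the instance.
container_definition = {
    "name": "my-app",          # assumed container name
    "image": "my-app:latest",  # assumed image
    "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
            "awslogs-group": "/ecs/my-app",   # assumed log group (must exist or be auto-created)
            "awslogs-region": "us-east-1",    # assumed region
            "awslogs-stream-prefix": "ecs",
        },
    },
}
```

With this in place, you read logs in the CloudWatch console rather than by SSHing into instances.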

Connect to a DB hosted within a Kubernetes engine cluster from a PySpark Dataproc job

I am a new Dataproc user and I am trying to run a PySpark job that uses the MongoDB connector to retrieve data from a MongoDB replica set hosted within a Google Kubernetes Engine cluster.
Is there a way to achieve this, given that my replica set is not supposed to be accessible from the outside without a port-forward or something similar?
In this case I assume that by "outside" you mean the internet or networks other than your GKE cluster's. If you deploy your Dataproc cluster on the same network as your GKE cluster and expose the MongoDB service to the internal network, you should be able to connect to the database from your Dataproc job without exposing it outside the network.
You can find more information in this link on how to create a Cloud Dataproc cluster with internal IP addresses.
Just expose your MongoDB service in GKE and you should be able to reach it from within the same VPC network.
Take a look at this post for reference.
You should also be able to automate the service exposure through an init script.
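As an illustration: once MongoDB is exposed on an address reachable inside the VPC (for example via an internal load balancer in front of the GKE pods, since Dataproc VMs cannot resolve in-cluster DNS names), the PySpark job only needs an ordinary connection URI. The IP, database, and collection names below are assumptions:

```python
# Hypothetical internal address for the MongoDB replica set, e.g. the IP
# of an internal LoadBalancer Service exposing the GKE pods to the VPC.
MONGO_HOST = "10.128.0.42"  # assumed internal IP, reachable from Dataproc
MONGO_PORT = 27017
DATABASE = "mydb"           # assumed database name
COLLECTION = "mycoll"       # assumed collection name

mongo_uri = f"mongodb://{MONGO_HOST}:{MONGO_PORT}/{DATABASE}.{COLLECTION}"

# Usage sketch with the MongoDB Spark connector (requires pyspark and the
# connector jar available on the Dataproc cluster):
# from pyspark.sql import SparkSession
# spark = (SparkSession.builder
#          .config("spark.mongodb.input.uri", mongo_uri)
#          .getOrCreate())
# df = spark.read.format("mongo").load()
```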

Deploy container images to kubernetes to google cloud from java/node js

I am trying to do some experiments with Kubernetes in Google Cloud.
I have a Docker image in Google Container Registry and need to deploy that image to a Kubernetes cluster.
Here are the steps I need to perform:
Create a Kubernetes cluster.
Pull the image from GCR and deploy it to the Kubernetes cluster.
Expose the deployment to the internet via a load balancer.
I know it is possible to do this via the Google Cloud SDK CLI. Is there a way to do these steps via Java/Node.js?
There is a RESTful Kubernetes Engine API:
https://cloud.google.com/kubernetes-engine/docs/reference/api-organization
e.g. create a cluster:
https://cloud.google.com/kubernetes-engine/docs/reference/rest/v1beta1/projects.zones.clusters/create
The container registry speaks the standard Docker Registry API.
Both Java and Node have Kubernetes clients:
https://github.com/kubernetes-client/java
https://github.com/godaddy/kubernetes-client
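For example, the clusters.create method linked above takes a JSON body that either client ultimately sends over HTTPS. A minimal sketch of the endpoint and request body, where the project ID, zone, and cluster settings are assumptions:

```python
# Hypothetical request for the GKE REST API's clusters.create method
# (v1 zonal endpoint).
PROJECT_ID = "my-project"  # assumed GCP project ID
ZONE = "us-central1-a"     # assumed zone

create_cluster_url = (
    f"https://container.googleapis.com/v1/projects/{PROJECT_ID}"
    f"/zones/{ZONE}/clusters"
)
create_cluster_body = {
    "cluster": {
        "name": "demo-cluster",   # assumed cluster name
        "initialNodeCount": 3,
    }
}
# You would POST create_cluster_body as JSON to create_cluster_url with an
# OAuth2 bearer token, then use a Kubernetes client (Java or Node, as
# linked above) to create the Deployment and a LoadBalancer Service.
```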

How to access Apache Ignite Service Grid services from a J2EE application or from outside the Ignite cluster

I want to access services running on an Apache Ignite cluster from a J2EE application running on a WildFly 10 application server. Is there an option to achieve this integration?
How do we expose Apache Ignite Service Grid services to the outside world, outside the Ignite cluster?
I would recommend creating a singleton session bean (as it is never passivated) with an Ignite client in it. With a client node you will connect to the Ignite cluster. Here is the documentation about clients and servers.
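Besides running a client node inside the application server, Ignite also exposes a binary thin-client protocol for callers entirely outside the cluster; that is a different technique from the singleton-bean approach above. A minimal sketch in Python using the third-party pyignite package, where the host and cache name are assumptions (the usage lines are commented out because they need a reachable cluster):

```python
# Hypothetical thin-client connection settings for an Ignite cluster;
# 10800 is Ignite's default thin-client port.
IGNITE_NODES = [("ignite-node-1.example.com", 10800)]  # assumed hostname
CACHE_NAME = "my-cache"                                # assumed cache name

# Usage sketch (requires `pip install pyignite` and a running cluster):
# from pyignite import Client
# client = Client()
# client.connect(IGNITE_NODES)
# cache = client.get_or_create_cache(CACHE_NAME)
# cache.put("key", "value")
```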