Deploy Python REST API on Amazon EC2 instances

I have created a script which runs on localhost, port 5006, on an EC2 instance. I plan to keep it running in the background even after I log out of the SSH terminal. The problem is that I cannot reach my script from my browser or Postman with the following link:
http://ec2-52-15-176-255.us-east-2.compute.amazonaws.com:5006/main?<myparameters>
The steps I have done are:
1.) Created an EC2 instance of a Linux flavor (which is available in the Free Tier)
2.) Started the Python script in a virtualenv folder, listening on the port,
3.) Now trying to reach the IP and the port as mentioned above!
Apart from that, I haven't done anything else!
Please help me understand the concepts, because there are no tutorials available that cover this in a straightforward way.
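The asker's script isn't shown, but one detail worth checking in any such API is the bind address: a server bound to 127.0.0.1 is only reachable from the instance itself, never from a browser outside it. A minimal standard-library sketch (the /main route and parameter handling here are assumptions, not the asker's actual code):

```python
# Minimal HTTP API using only the standard library.
# Key detail for EC2: bind to 0.0.0.0, not 127.0.0.1, or the
# server will only accept connections from the instance itself.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs


class MainHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parsed = urlparse(self.path)
        if parsed.path != "/main":
            self.send_error(404)
            return
        params = parse_qs(parsed.query)  # the query parameters from the URL
        body = f"received: {params}".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def make_server(host="0.0.0.0", port=5006):
    """Build the server; 0.0.0.0 accepts connections from any interface."""
    return HTTPServer((host, port), MainHandler)
```

Calling `make_server().serve_forever()` then makes the script reachable at `http://<instance-dns>:5006/main?...`, provided the security group allows the port (see the answer below the question).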

Agreed that the question is very broad, so starting from a basic step: have you created a security group for your instance that allows access to port 5006 from whichever network you are connecting from?
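In case it helps, the inbound rule can also be added programmatically. A hedged sketch with boto3, building the arguments for EC2's `authorize_security_group_ingress` call (the group ID and CIDR below are placeholders, not values from the question):

```python
def open_port(group_id, port, cidr="0.0.0.0/0"):
    """Build the kwargs for EC2's authorize_security_group_ingress call."""
    return {
        "GroupId": group_id,
        "IpPermissions": [{
            "IpProtocol": "tcp",
            "FromPort": port,  # single-port rule: FromPort == ToPort
            "ToPort": port,
            "IpRanges": [{"CidrIp": cidr, "Description": "API port"}],
        }],
    }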

Related

How to set up the Litmus Chaos backend on the local machine?

As part of a project, setting up the Litmus Chaos portal on the local machine is one of the requirements.
I could run the Litmus Chaos frontend on localhost by following this guide: https://github.com/litmuschaos/litmus/wiki/Litmus-Portal-Development-Guide
In the above case we are starting the backend servers using kubectl commands.
But I am facing issues setting up my local environment so that I'll be able to start the backend servers from my local IDE (IntelliJ, for instance) without using kubectl commands.
It would be great if anyone could help me with this. Thank you in advance.

Airflow network access through Raspberry Pi 4

I have been struggling a bit to get access to Airflow from outside my network.
I have Airflow running on a Raspberry Pi 4, and all seems to be working just fine; I can access it via http://localhost:8080.
I have also set forwarding rules in my home router pointing to the Raspberry Pi.
The port forwarding is working, since I can see the port open using an external tool, and I have also set up SSH port forwarding to access my Raspberry Pi from outside my local network.
So accessing http://my-public-ip:123 should take me to my Airflow web UI, but instead I just get "This site can't be reached".
Can anybody spot what I have done wrong, or whether I skipped any step of the process?
Thanks in advance.
Are you running Airflow with Docker, or do you have it installed directly? If using Docker, setting the webserver service's port mapping to 127.0.0.1:8080:8080 is the first step.
Secondly, you might need to look into the webserver config options. You will need to set the AIRFLOW__WEBSERVER__BASE_URL: 'http[s]://your_address/airflow' environment variable and install a reverse proxy (I suggest nginx); you can find more details here. After you've done this, don't forget to also set the environment variable AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX: 'true'.
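Put together, the Docker side of this might look like the following compose fragment (the service name and address are assumptions, not taken from the question):

```yaml
services:
  airflow-webserver:
    ports:
      - "127.0.0.1:8080:8080"   # reachable only by the local reverse proxy
    environment:
      AIRFLOW__WEBSERVER__BASE_URL: 'https://your_address/airflow'
      AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX: 'true'
```

nginx (or another reverse proxy) then terminates the public connection and forwards it to 127.0.0.1:8080.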

Is it possible to SSH into a Virtual Machine instance using google-api-client rather than the command line?

I want to automate the entire process from starting an instance to running a program on that instance.
So, just as running a Python program on a local computer requires only one command on the command line, I would like to run my program on a remote VM instance with just one command.
It seems, though, that in order to SSH into a remote VM instance I have to use the command line, and I have to answer some yes/no or multiple-choice questions. Admittedly you can use the subprocess module, but I have not yet figured out how to answer the yes/no questions.
Before I do more research however, I need to know if what I'm doing is even possible. So I would like to build a python program using google-api-client which automates the entire process from starting the instance to connecting the instance to a drive, to running a program.
It seems though I cannot SSH into a remote VM instance with python but have to do this with command line. Is this right?
You can use ssh from Python (link).
Minimally, any ssh client will require the IP address of the remote machine (running sshd), the port on which sshd is listening and some credentials that authenticate the client to the remote machine (generally you'll want to use ssh-keys).
Google's SDKs (including the API Client Libraries for Python) help you interact with Google's services. You can use Google's Compute Engine API (and Python library) to provision (create) VMs including Linux VMs. You'll need to ensure the image you use runs sshd and the machine will need a (probably public) IP address. Once the command succeeds, you have a Linux VM mostly like any other (created on GCP, AWS, Azure etc.). You can query Compute Engine for the public IP address of an instance.
As long as you've ssh'd into a Google Compute Engine instance using gcloud compute ssh ... from your machine (any project), the command will have created an ssh key-pair for you that is copied to your instances upon creation. You can then use the private key (.ssh/google_compute_engine) to authenticate your Python ssh client to the Compute Engine instance. You'll need to provide the IP address to the client too (it'll default to port :22).
NB gcloud compute ssh ... uses your machine's ssh client. It does not reimplement ssh. You can prove this to yourself by running gcloud compute ssh --ssh-flag="-vvv", or by temporarily making the ssh binary inaccessible to the gcloud command and trying gcloud compute ssh ... again; it won't work.
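To make this concrete, here is a hedged sketch of driving ssh from Python with the standard library's subprocess module. The -o StrictHostKeyChecking=accept-new option (OpenSSH 7.6+) is what sidesteps the interactive yes/no host-key question the asker mentions:

```python
import os
import subprocess


def ssh_command(host, remote_cmd, user=None,
                key="~/.ssh/google_compute_engine"):
    """Build an ssh invocation that never stops at an interactive prompt."""
    target = f"{user}@{host}" if user else host
    return [
        "ssh",
        "-i", os.path.expanduser(key),
        "-o", "StrictHostKeyChecking=accept-new",  # auto-accept new host keys
        "-o", "BatchMode=yes",                     # fail instead of prompting
        target,
        remote_cmd,
    ]


def run_remote(host, remote_cmd, **kwargs):
    """Run a command on the remote VM and return its captured output."""
    return subprocess.run(ssh_command(host, remote_cmd, **kwargs),
                          capture_output=True, text=True)
```

For example, `run_remote("203.0.113.7", "python3 job.py", user="me")` — the IP, username, and script name here are hypothetical. Combined with the Compute Engine API call that creates the VM and returns its public IP, this gives the one-command automation the asker wants.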
It seems though I cannot SSH into a remote VM instance with python but have to do this with command line. Is this right?
Yes, this is correct. You can automate the process up to a point, but not the whole of it as you have described.
One alternative I can think of is to use OS Login, which manages SSH access to your VM instances using IAM, without your having to create and manage individual SSH keys.

Google Cloud SQL VM refusing connection

I have been stuck trying to figure out why my Cloud SQL VM is refusing the connection from my machine (whose IP address I have added as a subnet). I can SSH into the VM, but I cannot access it from a browser to run SQL queries. I have scoured the internet for days trying to find a fix, but I cannot seem to get past this point. My Apache listens on port 80. I'd also like to add that I have been connecting to my MySQL db for months through PHP and running SQL queries, so I do not believe the problem is with Apache. However, if it is, please point me to where I should be looking.
It sounds like you have MySQL running on a GCE VM, not an actual Cloud SQL instance (that is a different service from GCE). Is that right?
If so, and you are trying to connect from your local machine directly to the MySQL instance, you are probably being blocked by the firewall. Go to the Networks tab (under Compute Engine) in the Cloud Console and see what firewall rules you have enabled. You might need to add one for 3306, or whatever port you are using.
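As a quick way to tell a firewall problem from a server problem, a small stdlib check of whether the port is reachable at all (the host and port you pass are whatever applies in your setup):

```python
import socket


def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

If `port_open("<your-vm-ip>", 3306)` returns False from your machine but True when run on the VM itself, the firewall rule (not MySQL) is the likely culprit.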

Linux/CMD environment and terminal on website

I am looking for a way to incorporate a command-line interface into my website. Specifically, I have 2 servers, one running a Linux distro and the other Windows. People can request accounts, and if I approve them they get a user partition on one of the servers.
They can then sign in on the website and access the servers through a command-line interface. I saw a couple of repos that do something similar for Amazon EC2 servers, but was wondering if there is anything more general?
You can use shellinabox. It runs a daemon on the server which can be accessed through a specified port. You simply enter the IP of your server and the port number, and you can log in from a browser.