Connecting to an OrientDB distributed server - orientdb

How does one connect to a distributed setup in OrientDB? I have a 3-server cluster set up and running. The servers are communicating, and data from one is available from the others. But how do I connect to them as a cluster in my program?
In MongoDB there is the connection URI:
mongodb://[username:password@]host1[:port1][,...hostN[:portN]][/[database][?options]]
Is there something similar in OrientDB? Currently I connect to just one server and insert my data, but the program stops when that server goes down for some reason. Can I automatically have the program connect to one of the other servers and continue? All 3 are masters in this case; there are no replica servers.

You need to first authenticate the user, using basic auth, with the /connect/ API, which returns an OSESSIONID as a cookie.
Then pass the OSESSIONID as a header to the other APIs (in Postman it is automatically passed along).
Some of the OrientDB REST APIs require basic auth; they document this as "Requires additional authentication to the server."
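A minimal sketch of that flow with curl (the port 2480, the database name demo, and the admin credentials are assumptions based on OrientDB defaults; adjust to your setup):
curl -i -u admin:admin http://localhost:2480/connect/demo
The response carries a Set-Cookie: OSESSIONID=... header. Pass that value on subsequent calls, e.g. a query:
curl -H "Cookie: OSESSIONID=<value from the response above>" http://localhost:2480/query/demo/sql/select%20from%20V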

Related

PostgreSQL requests proxied by an HTTP server

I am using a mobile application that connects directly to the database instance (Postgres); as such, I have to keep the ports open for traffic generated from the internet (4G, mobile app).
This mobile app (QFIELD, the mobile version of QGIS) has a direct connection to the database. This is why the database is reachable from the internet on a public IP, but it is a critical issue for the security of the data and the requests that can be sent to the database.
I would like to proxy the requests so that the database is only available to local machines and not open for direct connections.
The mobile app would send the request to an HTTP URL, which would forward the request to the local IP and port; this way I would avoid having the database exposed on the internet.
Ideally, I would like to go from this app (which uses a postgres connection string to connect to the server) to an HTTP server that routes the request locally, as such:
APP connects to https://myproxy/postgres
Request is proxied to a local server
Can I do this with Apache2? Any ideas?
At the moment I cannot write a middleware that proxies requests from the app to the local Postgres.
If your application expects to connect directly to a PostgreSQL database and you don't want to change that, then you need to connect to something that "speaks" PostgreSQL's client protocol.
You can place a proxy such as PgBouncer or Pgpool in front of it, but they aren't a guarantee of greater security just by themselves. This is the same problem as with any proxy: it just forwards requests and responses to your actual server, so any vulnerability is still exposed.
What you can do is:
restrict the number of connections at the proxy point
restrict which users can connect non-locally to your PostgreSQL cluster
restrict where they can connect from to just your proxy
restrict those users permissions within the database(s)
That last point is particularly important: assume that any user account your application uses can be abused. Restrict the account to prevent mass updating or deleting of data, and take special care to restrict access to other users' data.
If I were forced to allow access like this, I would want at the very least one PostgreSQL user account per actual user. In practice I wouldn't get to this point with a production application.
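As a hedged sketch of those restrictions (the role appuser, database appdb, proxy address 10.0.0.5, and table name are all hypothetical):
At the proxy point, cap the connection count, e.g. in PgBouncer's pgbouncer.ini:
max_client_conn = 50
In pg_hba.conf, allow the account to connect only from the proxy host:
host  appdb  appuser  10.0.0.5/32  scram-sha-256
Within the database, strip the account down to what the application actually needs:
REVOKE ALL ON ALL TABLES IN SCHEMA public FROM appuser;
GRANT SELECT, INSERT, UPDATE ON measurements TO appuser;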

Google Cloud SQL - PostgreSQL database connection from QGIS for third parties

I have a Google Cloud SQL PostgreSQL database to which I can connect by using SSL and by entering my IP address in the allowed-connections settings. However, I do not want to list all the IP addresses that are going to connect to this database (because I do not know all of them). I have around 15 people whom I want to log in to my database using QGIS, and they should be able to change the data, as this is research. Security is not a big issue, as this database will be online for a very short period of time. What connection method can you suggest? The users are not very proficient, so I need to set up everything.
I hope you're doing fine.
I would suggest setting up the connections with the Cloud SQL Proxy, as it will provide the security needed without using SSL or needing to authorize any network. Basically, the setup is to:
Enable the API
Install the proxy client on your local machine
Determine how you will authenticate the proxy
If required by your authentication method, create a service account
You can also find the steps in "Connecting to Cloud SQL from external applications".
I hope this works for you. I have never used it with QGIS, but since you are going through a proxy, it shouldn't be hard from there to use it with QGIS as if you were connecting to a local server.
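For reference, a typical invocation of the proxy client looks something like this (the instance connection name and key path are placeholders):
./cloud_sql_proxy -instances=my-project:europe-west1:my-instance=tcp:5432 -credential_file=/path/to/service-account-key.json
After that, QGIS can connect to localhost:5432 as if the database were running locally.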

Authentication server microservice: should I use different services for different user functionalities?

I have an authentication server using oauth2.
I use it for:
Authentication from the other services, subscription, password change and retrieval, etc.
As a resource server, to store and retrieve more user and group information. I have a ManyToMany relationship between users and groups.
Should I separate the second set of functionalities into another standalone service that works as a resource server only, and keep only the authentication part on the authorization server?
That way I could horizontally scale these two services separately.
Yes, the better idea would be to have the configuration as a separate standalone service running in the cloud. With a configuration server as a separate service, you can add all the authorization and other sorts of details (DB details, API details, messaging-queue configuration, etc.) and connect any number of services to it.

Securing access to the REST API of Kafka Connect

The REST API for Kafka Connect is not secured or authenticated.
Since it's not authenticated, the configuration for a connector or its tasks is easily accessible by anyone. Since these configurations may contain details about how to access the source system (in the case of a SourceConnector) or the destination system (in the case of a SinkConnector), is there a standard way to restrict access to these APIs?
In Kafka 2.1.0, it is possible to configure HTTP basic authentication for the REST interface of Kafka Connect without writing any custom code.
This became possible due to the implementation of the REST extensions mechanism (see KIP-285).
In short, the configuration procedure is as follows:
Add the extension class to the worker configuration file:
rest.extension.classes = org.apache.kafka.connect.rest.basic.auth.extension.BasicAuthSecurityRestExtension
Create a JAAS config file (e.g. connect_jaas.conf) for the application name 'KafkaConnect':
KafkaConnect {
org.apache.kafka.connect.rest.basic.auth.extension.PropertyFileLoginModule required
file="/your/path/rest-credentials.properties";
};
Create the rest-credentials.properties file in the above-mentioned directory:
user=password
Finally, tell the JVM about your JAAS config file, for example by adding a command-line property:
-Djava.security.auth.login.config=/your/path/connect_jaas.conf
After restarting Kafka Connect, you will be unable to use the REST API without basic authentication.
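To verify, a request without credentials should now be rejected, while one that passes the pair from rest-credentials.properties succeeds (assuming the default REST port 8083):
curl http://localhost:8083/connectors
curl -u user:password http://localhost:8083/connectors
The first call should return 401 Unauthorized; the second should list the configured connectors.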
Please keep in mind that these classes are examples rather than production-ready features.
Links:
Connect configuration
BasicAuthSecurityRestExtension
JaasBasicAuthFilter
PropertyFileLoginModule
This is a known area in need of future improvement, but for now you should use a firewall on the Kafka Connect machines and either an API management tool (Apigee, etc.) or a reverse proxy (HAProxy, nginx, etc.) to ensure that HTTPS is terminated at an endpoint where you can configure access-control rules, and then have the firewall accept connections only from the secure proxy. With some products, the firewall, access control, and SSL/TLS termination can all be handled by a smaller number of products.
As of Kafka 1.1.0, you can set up SSL and SSL client authentication for the Kafka Connect REST API. See KIP-208 for the details.
Now you are able to enable certificate-based authentication for client access to the REST API of Kafka Connect.
An example is here: https://github.com/sudar-path/kc-rest-mtls
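A hedged sketch of the relevant worker properties (the host, paths, and password are placeholders; KIP-208 also documents listeners.https.-prefixed variants of the SSL options):
listeners=https://myhost:8443
ssl.keystore.location=/path/to/keystore.jks
ssl.keystore.password=<keystore password>
ssl.client.auth=required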

Is using Redis a violation of REST principles?

I am creating a webapp for data analysis. I want to use Redis to store the data that the user has uploaded so that I can send it to other pages/views. This data is only valid during the session and should expire when the session expires.
Is this a violation of REST principles? Or is this only a problem if I use some value that I have stored server side as session key/identifier?
With your updates, what you can do is upload the data, generate a key for it, and place it in Redis, keeping it in a hash (with metadata) or a list (if there could be more than one upload). The list/hash key could be identified by the user id.
Then, moving forward, let the client refer to this object using the generated id.
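A minimal sketch of that layout in redis-cli terms (the key names, user id 42, and 30-minute TTL are made up for illustration):
HSET upload:42:abc123 filename "data.csv" rows "10000"
LPUSH uploads:user:42 upload:42:abc123
EXPIRE upload:42:abc123 1800
The EXPIRE call makes the entry disappear on its own, matching the requirement that the data expire with the session.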
Actually, one of the best practices for using Redis over the internet is to expose a REST API and handle all communication through your web server. Redis is always kept in a secure network, since Redis doesn't provide much security by itself.
From the Redis website:
Network security
Access to the Redis port should be denied to everybody but trusted clients in the network, so the servers running Redis should be directly accessible only by the computers implementing the application using Redis.
In the common case of a single computer directly exposed to the internet, such as a virtualized Linux instance (Linode, EC2, ...), the Redis port should be firewalled to prevent access from the outside. Clients will still be able to access Redis using the loopback interface.
This is also a basic practice when using traditional databases.
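In redis.conf terms, the usual minimum is to bind to the loopback interface and require a password (the password value is a placeholder):
bind 127.0.0.1
requirepass <your-strong-password>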