I've created an instance on Google Cloud with PostgreSQL and I've connected Data Studio to this database by adding all the addresses in the whitelist specified at the link below:
https://support.google.com/datastudio/answer/7288010?hl=en
With that solution I have to open access to my database to a lot of addresses. This, combined with the fact that SSL is not supported, is a serious security gap.
Is there a different way to use Google Data Studio for reports?
Maybe by using the Cloud SQL Proxy and treating Google Data Studio as an application external to the GC environment?
Thanks for your cooperation,
Michele
I am assuming you are concerned about data being exposed due to the lack of support for SSL. Though that is a valid concern in a lot of cases, for your specific use case, it should not matter:
All the IP addresses that you have to whitelist here are Google server/infrastructure addresses.
Data Studio as an application runs on Google's servers, so the communication between Google Cloud SQL and Google Data Studio stays entirely within Google's network. Even if it is not SSL, that traffic should not be exposed to the outside world.
The connection between any client computer (where the report is being viewed) and Data Studio will always be HTTPS.
However, if you still want to have an SSL connection, you can create a Community Connector in Apps Script that uses the JDBC service to connect to databases using SSL.
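To make that more concrete, here is a very rough sketch of the data-fetching part of such a connector. The connection URL, credentials and PEM constants are placeholders, and you should double-check that the Apps Script Jdbc service supports your database type before relying on this approach.

function fetchRows() {
  // Sketch only: open an SSL JDBC connection from Apps Script.
  // The PEM constants would hold the contents of the certificate files
  // downloaded from the Cloud SQL instance.
  var connection = Jdbc.getConnection('jdbc:mysql://<INSTANCE_IP>:3306/<DB_NAME>', {
    user: '<DB_USER>',
    password: '<DB_PASSWORD>',
    _serverSslCertificate: SERVER_CA_PEM,
    _clientSslCertificate: CLIENT_CERT_PEM,
    _clientSslKey: CLIENT_KEY_PEM
  });
  var results = connection.createStatement().executeQuery('SELECT 1');
  var rows = [];
  while (results.next()) {
    rows.push(results.getString(1));
  }
  results.close();
  connection.close();
  return rows;
}

A full Community Connector also needs the usual getAuthType/getConfig/getSchema/getData functions wired around something like this.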
Try using client.key in both client fields.
The solution posted at the link below helped in my case:
https://support.google.com/datastudio/thread/8739014?hl=en
So I possibly have a dumb question about MongoDB hosting. I'm learning the MERN stack and can't figure out how to host my app. Most of the tutorials I've seen use Heroku I believe, but it's just yet another service or thing to learn or manage. I've used Postman to verify the code works. And yes I've googled this, which only confused me more.
I have several Dreamhost domains, but can't find much info on using it to host MongoDB. Is it possible to use my current host or do I HAVE to point the DNS or whatever to another server/service, or just plain move my domain to a different provider?
Also, I've got a client/front-end directory and an api/server directory in my root folder. Is that standard practice, do I need to upload them to different hosts, merge them or what? I cannot for the life of me get the backend to work.
Edit/Update: Thank you for the response! Sorry I'm just now answering; it was a crazy week. The code itself works. I built a portfolio blog with a login/register system using Express/MongoDB to store users and posts. ALL my domains are on Dreamhost and I didn't want to spread out across service providers if I could help it. I've built websites with PHP and SQL on there and it was easy. But from what I could find out, MongoDB cannot be used on Dreamhost servers. I ended up using Heroku, which worked, although I haven't been able to point my DNS from my Dreamhost domain to it yet. Currently it has a domain name of ***.herokuapp.com and is hosted on Heroku. So that's where my problem is now, but I still want to understand the why and how a little better. How is MongoDB different from SQL other than the relational aspect, and why does it need something like Heroku as opposed to Dreamhost or Bluehost or GoDaddy?
So first things first: you should know that MongoDB has to be hosted on an OS somewhere; that can be your own machine, a cloud server, or a service provider.
A domain name is nothing but a pointer to your actual server. So you have to host your MongoDB somewhere, whether that is a service like MongoDB Atlas or your own server spun up on DigitalOcean, AWS, GCP, etc.
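For example, once you have a cluster on Atlas (or Mongo running on a server you control), the Express/API part of a MERN app just points at its connection string, no matter where the app itself is deployed (Heroku, a VPS, etc.). Here is a minimal sketch with the official Node.js driver; the URI is a placeholder that Atlas generates for you under "Connect your application".

// Sketch: connect a Node/Express backend to a MongoDB Atlas cluster.
// The connection string below is a placeholder.
import { MongoClient } from "mongodb";

const uri = "mongodb+srv://<user>:<password>@<cluster>.mongodb.net/blog";

async function main() {
  const client = new MongoClient(uri);
  await client.connect();
  const posts = client.db("blog").collection("posts");
  console.log(await posts.countDocuments()); // quick sanity check
  await client.close();
}

main().catch(console.error);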
As for the client/api directory question, I'd need to see what you are actually doing; I can't comment without having a look at your code. If you're not comfortable sharing all the code online, you can chat with me personally.
I have a Google Cloud SQL PostgreSQL database to which I can connect by using SSL and by entering my IP address in the allowed connections settings. However, I do not want to list all the IP addresses that are going to connect to this database (because I do not know all of them). I have around 15 people whom I want to log in to my database using QGIS, and they should be able to change the data, as this is for research. Security is not a big issue, as this database will be online only for a very short period of time. What connection method can you suggest? The users are not very proficient, so I need to set up everything.
I hope you're doing fine.
I would like to suggest setting up the connections with the Cloud SQL Proxy, as it provides the security needed without using SSL or having to authorize any network. So basically the setup is to:
Enable the API
Install the proxy client on your local machine
Determine how you will authenticate the proxy
If required by your authentication method, create a service account
You can also find the steps in "Connecting to Cloud SQL from external applications".
I hope this works for you. I have never used it with QGIS, but since you are going through a proxy, it should not be hard to connect from QGIS as if you were connecting to a local server; see the sketch below.
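This is not QGIS-specific, but just to illustrate the "local server" idea: once the proxy is running on a machine and listening on, say, 127.0.0.1:5432, any Postgres client on that machine connects to localhost instead of the instance's public IP. A minimal sketch with node-postgres follows; the credentials, database name and proxy port are placeholders/assumptions.

// Sketch: connect through a running Cloud SQL Proxy as if the DB were local.
// Assumes the proxy was started listening on tcp:5432 on this machine.
import { Client } from "pg";

async function main() {
  const client = new Client({
    host: "127.0.0.1",        // the proxy endpoint, not the instance's public IP
    port: 5432,
    user: "<DB_USER>",
    password: "<DB_PASSWORD>",
    database: "<DB_NAME>",
  });
  await client.connect();
  const res = await client.query("SELECT version()");
  console.log(res.rows[0]);
  await client.end();
}

main().catch(console.error);

In QGIS the equivalent would be a new PostgreSQL/PostGIS connection pointing at 127.0.0.1 and the proxy's port rather than the instance's public address.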
I have a web app that I have built and am hosting with my own provider. I am wanting to connect this to a Google Cloud SQL database. What is the best way to do this?
I spoke with my hosting provider and they stated that they have no IP addresses / ranges they can give me to set up with GCSQL.
Any help on this would be appreciated. Sorry if this question is completely simple and not complicated at all.
Connect using the instance IP address provided by Google Cloud SQL (link). Be sure to use SSL.
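The idea is the same in any stack; for illustration, here is a sketch in Node with node-postgres, assuming a PostgreSQL instance. The IP, credentials and certificate file names are placeholders (the three PEM files are the ones Cloud SQL lets you download for the instance).

// Sketch: connect to a Cloud SQL instance over its public IP using SSL.
// All values below are placeholders.
import { readFileSync } from "fs";
import { Client } from "pg";

const client = new Client({
  host: "<INSTANCE_PUBLIC_IP>",
  port: 5432,
  user: "<DB_USER>",
  password: "<DB_PASSWORD>",
  database: "<DB_NAME>",
  ssl: {
    ca: readFileSync("server-ca.pem").toString(),
    cert: readFileSync("client-cert.pem").toString(),
    key: readFileSync("client-key.pem").toString(),
  },
});

client.connect()
  .then(() => client.query("SELECT now()"))
  .then((res) => console.log(res.rows[0]))
  .finally(() => client.end());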
I've got a Spring Boot app which is connected to MongoDB Atlas.
Everything is working locally.
I now want to publish this to Pivotal Cloud Foundry.
Secure connection between PCF and Atlas
In MongoDB Atlas I need to open up the firewall and allow certain IP numbers.
How should I configure MongoDB Atlas to connect to PCF in the most secure way?
Auto-configuration getting in the way
Cloud Foundry is overriding my connection URLs to point to localhost:27017 instead of my Atlas cluster.
What is the recommended way to connect to MongoDB Atlas?
In MongoDB Atlas I need to open up the firewall and allow certain IP numbers. How should I configure MongoDB Atlas to connect to PCF in the most secure way?
Whitelisting IP addresses for applications that run on CF is not particularly effective. The reason is that you don't know the IP address from which you'll be connecting, because it depends on where Diego decides to run your application, in other words, on the Cell where your application is told to run. To compound matters, that will change when you restart or restage your application.
Because the IP can vary, what you end up needing to do is whitelist all of your Cells. The problem with this, and why it's not effective, is that you've then effectively whitelisted every app running on the platform.
What you can do to improve security a bit is to make use of application security groups (ASGs). ASGs can be used to limit outgoing traffic, and you can control them at the space level. That means you can configure your default running security group to not allow access to your MongoDB server, but allow access for individual spaces by binding an ASG only to those spaces with apps that need to talk to your MongoDB servers.
The downside of this approach is that it requires you to be a platform administrator, which means it will only work if you own your CF installation (not going to work for public providers).
More on ASG's here: https://docs.cloudfoundry.org/adminguide/app-sec-groups.html
For public providers, you can use a proxy. To make this work, you need to have your application configured to talk through a proxy when it attempts to access your Mongodb servers. You control the proxies, which have fixed IPs, so you can white list the proxies to allow access to just your app. If you don't want to run your own proxy servers, there are public proxy providers that you can use.
Cloud Foundry is overriding my connection URLs to point to localhost:27017 instead of my Atlas cluster. What is the recommended way to connect to MongoDB Atlas?
It's possible to disable auto configuration. One way is described in the docs here. If you include the Spring Cloud Connectors dependencies and use them manually, then the auto configuration will not run.
https://docs.cloudfoundry.org/buildpacks/java/spring-service-bindings.html#manual
The other option is to tell the Java build pack not to install the auto configuration. You can do that by setting the following environment variable for your application, either with cf set-env or via a manifest.yml file.
Ex: JBP_CONFIG_SPRING_AUTO_RECONFIGURATION='[enabled: false]'
Be careful if you do this as it will disable everything provided by the auto reconfiguration, which includes setting the "cloud" profile for your app. If you use this option to disable auto reconfiguration, you'll probably also want to set SPRING_PROFILES_ACTIVE='cloud' to manually enable the cloud profile.
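For example, a minimal manifest.yml combining both settings might look like the sketch below (the app name is a placeholder):

applications:
- name: my-spring-app
  env:
    JBP_CONFIG_SPRING_AUTO_RECONFIGURATION: '[enabled: false]'
    SPRING_PROFILES_ACTIVE: 'cloud'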
I suppose your other option is to simply embrace the auto configuration. It's a little confusing / magical at first, but I've found that this article explains it very well.
https://spring.io/blog/2015/04/27/binding-to-data-services-with-spring-boot-in-cloud-foundry
Hope that helps!
How are Cassandra clusters usually built from a security point of view? Should they always be kept local, or are there security features that make it reasonable to open the cluster up to external connections? As far as I understand, it seems like Cassandra doesn't have any "inbuilt security engine" for handling these kinds of things. I'm planning on building a service to talk to Cassandra; should that connection be made locally (on the same network as the cluster) or externally using DNS?
Cassandra has supported built-in password authentication and authorization since version 1.2.
User credentials and privileges are kept internally, in the system auth tables. This can be viewed as its "inbuilt security engine".
As for protecting connections (encryption), since version 1.2 there's SSL support for both internode and client-to-node communication. The DataStax Enterprise platform additionally extends that with Kerberos/LDAP support to allow single sign-on.
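To make that concrete, here is a rough sketch of a client connecting with password authentication plus client-to-node SSL, using the Node.js cassandra-driver. The contact point, data center name, credentials and CA file are placeholders, and the server side is assumed to have authenticator: PasswordAuthenticator and client_encryption_options enabled in cassandra.yaml.

// Sketch: authenticated, SSL-encrypted client connection to Cassandra.
// All names and credentials below are placeholders.
import { readFileSync } from "fs";
import { Client } from "cassandra-driver";

const client = new Client({
  contactPoints: ["cassandra.example.com"],
  localDataCenter: "datacenter1",
  credentials: { username: "<user>", password: "<password>" },
  sslOptions: {
    ca: [readFileSync("cluster-ca.pem")],
    rejectUnauthorized: true,   // verify the node's certificate
  },
});

client.connect()
  .then(() => client.execute("SELECT release_version FROM system.local"))
  .then((rs) => console.log(rs.rows[0]))
  .finally(() => client.shutdown());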
Configure a stateful firewall to allow incoming connections, but only allow outgoing traffic in response to requests made to the server. Also, C* has inbuilt SSL support, but not all APIs can use SSL, so you'll have to pick a compatible one.