I'm using ownCloud on a personal server for my personal data, and I need to connect to a business-related server for business data. Server-to-server sharing is unappealing because it wastes costly hosted storage (among other reasons). Is there a way to make the Windows client sync both servers simultaneously?
Such a feature doesn't currently exist. Two possible "workarounds" are listed here:
https://forum.owncloud.org/viewtopic.php?f=17&t=20521
An implementation of this feature without workarounds is planned for 1.9:
https://github.com/owncloud/client/issues/43
So I possibly have a dumb question about MongoDB hosting. I'm learning the MERN stack and can't figure out how to host my app. Most of the tutorials I've seen use Heroku I believe, but it's just yet another service or thing to learn or manage. I've used Postman to verify the code works. And yes I've googled this, which only confused me more.
I have several Dreamhost domains, but can't find much info on using it to host MongoDB. Is it possible to use my current host or do I HAVE to point the DNS or whatever to another server/service, or just plain move my domain to a different provider?
Also, I've got a client/front-end directory and an api/server directory in my root folder. Is that standard practice, do I need to upload them to different hosts, merge them or what? I cannot for the life of me get the backend to work.
Edit/Update: Thank you for the response! Sorry I'm just now answering; it was a crazy week. The code itself works. I built a portfolio blog with a login/register system using Express/MongoDB to store users and posts. ALL my domains are on Dreamhost and I didn't want to spread out across service providers if I could help it. I've built websites with PHP and SQL there and it was easy, but from what I could find out, MongoDB cannot be used on Dreamhost servers. I ended up using Heroku, which worked, although I haven't been able to point my DNS from my Dreamhost domain to it yet. Currently it has a domain name of ***.herokuapp.com and is hosted on Heroku. So that's where my problem is now, but I still want to understand the why and how a little better. How is MongoDB different from SQL other than the relational aspect, and why does it need something like Heroku as opposed to Dreamhost or Bluehost or GoDaddy?
So first things first: you should know that MongoDB runs on an operating system, and that can be your own machine, a cloud server, or a managed service provider.
A domain name is nothing but a pointer to your actual server. So you have to host your MongoDB somewhere, whether that is a service like MongoDB Atlas or your own server spun up on DigitalOcean, AWS, GCP, etc.
As for your client/front-end and api/server directories, I'd need to see what you are actually doing; I can't comment without having a look at your code. If you're not comfortable sharing all the code online, you can chat with me personally.
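If it helps, here is a minimal sketch of what the server side could look like once the database is hosted somewhere like Atlas. The connection string, database name, and collection name below are placeholders, not taken from your project:

// Minimal Express + MongoDB sketch (official 'mongodb' Node driver).
// Replace the URI with the one your hosting service gives you.
const express = require('express');
const { MongoClient } = require('mongodb');

const uri = 'mongodb+srv://user:password@cluster0.example.mongodb.net/blog';
const app = express();

async function start() {
  const client = new MongoClient(uri);
  await client.connect();               // connect once at startup
  const db = client.db('blog');

  app.get('/api/posts', async (req, res) => {
    // read all posts from the (hypothetical) 'posts' collection
    res.json(await db.collection('posts').find().toArray());
  });

  app.listen(3000, () => console.log('API listening on port 3000'));
}

start().catch(console.error);

The point is that the Node/Express code doesn't care where MongoDB lives; only the connection string changes between local, Atlas, or a VPS.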
I have a working Postfix SMTP server on my Ubuntu 20.04 cloud machine. I can send/receive emails using the standard command-line "mail" client. I am now looking for a way to do the same via a web browser. I am already running nginx on the server.
It seems there are various apps such as RoundCube and SquirrelMail available on Ubuntu. However, they seem to require additional POP3/IMAP server packages to be installed.
As the webmail client is intended to be on the same machine as my SMTP server, I do not see why additional POP3/IMAP packages need to be installed.
Wondering if there is a simpler way to look at emails via web browser. Regards.
You need to install a web server, PHP (or whatever is required to run the webmail app of your choosing), and an IMAP server.
mail is an email client that knows how to directly access your messages on the filesystem, something that a web app has no capability to do. Also note that it is executed from the context of you having already logged in to your server as a particular user.
It's a Very Bad Idea to give your web server read/write access to parts of the filesystem outside the directories where your web-related files are kept (write access can and should be even more strict).
It's technically feasible to create a webmail app that does what you want (I think there may have been some attempts in the distant past), but it would be limited to systems with a very specific mail system setup and require some questionable permission tweaking. IMAP is the layer that abstracts your particular mail system setup from any of the various mail clients you may want to use to access your messages. It also helps make sure users and apps are not able to access things they should not.
"Wondering if there is a simpler way to look at emails via web browser"
Not that I can think of. Fortunately, this will get you most of the way there:
apt-get install dovecot-imapd
Dovecot will need minimal configuration in your case, and more time will be spent installing and tweaking whatever webmail client you choose (or you can try Thunderbird). And remember that the IMAP server can be limited to local clients (webmail counts as such) and need not be exposed to the Internet.
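In case it helps, here is a rough sketch of the two Dovecot settings that usually matter, assuming Postfix delivers to per-user Maildirs (adjust mail_location if yours delivers to mbox files in /var/mail):

# /etc/dovecot/dovecot.conf -- listen on localhost only, since the
# webmail app will run on the same machine as the IMAP server
listen = 127.0.0.1

# /etc/dovecot/conf.d/10-mail.conf -- tell Dovecot where the mail lives
mail_location = maildir:~/Maildir

With that in place, the webmail client connects to IMAP at 127.0.0.1:143 and nothing is exposed externally.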
I've created an instance on Google Cloud with PostgreSQL, and I've connected Data Studio to this database by adding all the addresses in the whitelist specified at the link below:
https://support.google.com/datastudio/answer/7288010?hl=en
With that solution I have to open access to my database to a lot of addresses. This issue, combined with the fact that SSL is not supported, is a big lack of security.
Is there any different way to use Google Data Studio for reports?
Maybe using the Cloud SQL Proxy and treating Google Data Studio as an external application outside the GCP environment?
Thanks for your cooperation,
Michele
I am assuming you are concerned about data being exposed due to the lack of support for SSL. Though that is a valid concern in a lot of cases, for your specific use case, it should not matter:
All the IP addresses that you have to whitelist here are Google server/infrastructure addresses.
Data Studio as an application runs on Google's servers, so the communication between Google Cloud SQL and Google Data Studio happens entirely within Google's network. Even without SSL, that traffic should not be exposed to the outside world.
The connection between any client computer (where report is being viewed) and Data Studio will always be HTTPS.
However, if you still want to have an SSL connection, you can create a Community Connector in Apps Script that uses the JDBC service to connect to databases using SSL.
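As a rough sketch, the SSL connection inside such a connector could look like the snippet below. The URL, credentials, and PEM strings are placeholders, and note that the Apps Script Jdbc service only supports certain database engines (the URL here uses MySQL purely as an illustration), so check that yours is covered:

// Apps Script Community Connector: open a JDBC connection with SSL.
function getSslConnection() {
  // Placeholders: paste the PEM contents downloaded from Cloud SQL here.
  var serverCa = '-----BEGIN CERTIFICATE-----...';
  var clientCert = '-----BEGIN CERTIFICATE-----...';
  var clientKey = '-----BEGIN RSA PRIVATE KEY-----...';
  return Jdbc.getConnection('jdbc:mysql://1.2.3.4:3306/reports', {
    user: 'datastudio',
    password: 'secret',
    _serverSslCertificate: serverCa,   // server CA certificate
    _clientSslCertificate: clientCert, // client certificate
    _clientSslKey: clientKey           // client private key
  });
}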
Try using client.key in both client fields.
The solution posted in the thread below helped here:
https://support.google.com/datastudio/thread/8739014?hl=en
I've been reading security articles for several days, but have no formal training in the field. I am developing a configuration and management application for an IoT device. It is meant to be run either on an internal network, or accessed over the web.
My application will be used by IT admins, managers, and factory-floor workers. Depending on the installation, there will be varying levels of infrastructure in place. It could run on a laptop on the floor itself, on a server, or hosted in the cloud. For this reason, we cannot assume that our clients will have the kind of infrastructure you might find at a datacenter or in the cloud, for example CAS or NTP.
Our application provides a REST API for client applications to gather data. We'd like to use roles to restrict what data users can access. I've gathered that a common solution for authentication is to encode the username/password in the request header. However, this is completely insecure unless it is sent over a secure channel.
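For concreteness, this is the sort of request I mean (the IP and credentials here are made up):

// HTTP Basic auth just base64-encodes the credentials, so anyone who
// can see the traffic can decode them. (Node 18+ / browser fetch.)
const creds = Buffer.from('admin:secret').toString('base64');
fetch('https://192.0.2.10/api/data', {
  headers: { Authorization: 'Basic ' + creds }
});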
As I understand it, SSL certificate authorities grant certs for a specific domain. Our application will have no set domain, and a different IP depending on the installation. Many web clients and applications do not trust self-signed certs. It's not clear to me whether a self-signed certificate is good enough for a typical application developer who will be consuming our interface.
With this being the case:
1) What are my options to set up a secure channel, internally or via the web?
2) Am I making assumptions about how our product will be used that damage our users' security unnecessarily?
Well, you can use custom encryption to encrypt the data being sent to the applications.
You can also use JSON Web Tokens (JWT) to secure your REST API: https://en.wikipedia.org/wiki/JSON_Web_Token. The tokens could be generated by a centralized authentication server and included in all requests sent by the client applications to the server.
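To make that concrete, here is a minimal sketch of role-based access with JWTs in Express, using the jsonwebtoken npm package. The secret, role names, and route are placeholders:

// Sketch: issue a JWT with a role claim, then check it per-route.
const express = require('express');
const jwt = require('jsonwebtoken');

const SECRET = 'replace-with-a-real-secret';
const app = express();

// The auth server would issue a token like this after verifying a login:
function issueToken(username, role) {
  return jwt.sign({ sub: username, role: role }, SECRET, { expiresIn: '1h' });
}

// Middleware: verify the token and check the caller's role.
function requireRole(role) {
  return (req, res, next) => {
    try {
      const token = (req.headers.authorization || '').replace('Bearer ', '');
      const claims = jwt.verify(token, SECRET); // throws if invalid/expired
      if (claims.role !== role) return res.status(403).end();
      req.user = claims;
      next();
    } catch (e) {
      res.status(401).end();
    }
  };
}

// Example: only 'admin' tokens may read device configuration.
app.get('/api/admin/config', requireRole('admin'), (req, res) => {
  res.json({ ok: true });
});

app.listen(8080);

The signature means the server does not need to store session state, but the token must still travel over TLS, since anyone holding it can replay it.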
How are Cassandra clusters usually built, security-wise? Should they always be kept local, or are there security features that make it reasonable to open the cluster up to external connections? As far as I understand, it seems like Cassandra doesn't have any "inbuilt security engine" for handling these kinds of things. I'm planning on building a service to talk to Cassandra; should that connection be made locally (on the same network as the cluster) or externally using DNS?
Cassandra has supported built-in password authentication and authorisation since version 1.2.
User credentials and privileges are kept internally, in system auth tables. This can be viewed as its "inbuilt security engine".
As for protecting connections (encryption), since version 1.2, there's SSL support for both internode and client-to-node communication. DataStax Enterprise platform additionally extends that with Kerberos/LDAP support to allow single-sign-on.
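For reference, turning these on in cassandra.yaml looks roughly like the sketch below. Keystore paths and passwords are placeholders, and exact option names can vary between versions:

# cassandra.yaml -- enable the built-in authenticator/authorizer
authenticator: PasswordAuthenticator
authorizer: CassandraAuthorizer

# client-to-node encryption
client_encryption_options:
    enabled: true
    keystore: /etc/cassandra/.keystore
    keystore_password: changeit

# node-to-node encryption
server_encryption_options:
    internode_encryption: all
    keystore: /etc/cassandra/.keystore
    keystore_password: changeit
    truststore: /etc/cassandra/.truststore
    truststore_password: changeit

After a restart you log in with the default cassandra/cassandra superuser and use CQL to create less-privileged accounts and change the default password.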
Configure a stateful firewall: allow incoming connections only on the ports you need, and allow return traffic only for connections that a client actually established with the server. Also, C* has inbuilt SSL support, but not all client APIs can use SSL, so you'll have to pick a compatible one.
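As an illustration with iptables (the 10.0.0.5 address stands in for your service host; 9042 is Cassandra's native CQL port):

# allow replies to connections this host already has open
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
# allow CQL only from the application server
iptables -A INPUT -p tcp -s 10.0.0.5 --dport 9042 -j ACCEPT
# keep SSH reachable for administration
iptables -A INPUT -p tcp --dport 22 -j ACCEPT
# drop everything else inbound by default
iptables -P INPUT DROP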