I am trying to set up SFTP so that I have access to the root folder on my Raspberry Pi 2. Essentially, I want to land in / by default rather than ~/ when connecting over SFTP. That way, reaching my Apache server and its associated files is much easier from the get-go. I am looking for a simple way to achieve this. Please suggest.
This ended up working
sudo chown -R your-user-name /var/www/
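If it is the ownership change that unblocks things, you can verify it took effect before reconnecting. A minimal sketch using a scratch directory instead of the real /var/www/ (the path is illustrative; prefix chown with sudo if you are not root):

```shell
# Stand-in for /var/www/ so nothing real is touched
mkdir -p /tmp/www-demo/html

# Recursively hand the tree to the current user (substitute your-user-name)
chown -R "$(id -un)" /tmp/www-demo

# stat prints the owning user of each entry; both lines should show your user
stat -c '%U %n' /tmp/www-demo /tmp/www-demo/html
```

Note also that sftp accepts a start directory on the command line (e.g. `sftp pi@raspberrypi.local:/`), which addresses the "start in / instead of ~/" part of the question directly.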
Related
I cannot execute any command using sudo. I get this error
-sh: sudo: command not found
First: if you are already root, you do not need sudo.
Second: if this is a Yocto-based image, as the question tag suggests, then there is no apt-get either. apt-get is the Debian way of installing things and does not apply to prebuilt-image distributions such as those Yocto produces. So you have two options:
Change to Ubuntu or Debian (or any derivative thereof); then this approach will apply.
Use the Yocto/OpenEmbedded way of installing things. This is unfortunately not exactly trivial, so you had best get started here: Yocto Project Quick Start
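For completeness, the usual Yocto route is to add the package to the image and rebuild. A minimal sketch, assuming a recent Poky release (older releases use the underscore form `IMAGE_INSTALL_append`):

```
# conf/local.conf -- add sudo (and any other tools) to the generated image
IMAGE_INSTALL:append = " sudo"
```

After rebuilding your image (e.g. `bitbake core-image-minimal`, or whatever image recipe you use) and reflashing, sudo will be present on the target.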
Maybe you need to check which user you are logged in as.
If you are root, you already have superuser rights and do not need sudo.
If not, you can set a root password in your Yocto project configuration like this:
EXTRA_USERS_PARAMS = "\
usermod -p 'password' root; \
"
I'm trying to automate deployment via ftp via bitbucket pipelines.
Path is:
/var/www/vhosts/maindomain.com/subdomain.maindomain.com
Tried it with and without the leading forward slash. I also checked the default path shown when you connect, which is maindomain.com/subdomain.maindomain.com, and tried that too, but I get the same error.
Code looks like this:
image: node:9.8.0
pipelines:
  default:
    - step:
        name: Deployment
        script:
          - apt-get update
          - apt-get install -y ncftp
          - ncftpput -v -u "$FTP_USERNAME" -p "$FTP_PASSWORD" -R "$FTP_HOST" "$FTP_SITE_ROOT" dist/*
          - echo Finished uploading /dist files to $FTP_HOST$FTP_SITE_ROOT
But the problem is that ncftpput rejects the upload path no matter what I try. I have been using the path shown in FileZilla after navigating to that folder while connected with the exact same credentials.
How can I track down the right path, or troubleshoot this further?
I think the problem is that my server only accepts SFTP connections, and I can't set the port to 22 because NcFTP does not support SSH. I'm currently looking at lftp as an alternative and will post the syntax here if I figure it out.
Edit: Does not scale well, will be pursuing different avenues for continuous deployment.
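For anyone following the lftp route, a sketch of the changed script section, assuming the same pipeline variables as above (mirror -R pushes the local dist/ tree up to the remote directory; the sftp:auto-confirm setting accepts the host key non-interactively, which you may prefer to replace with a pinned known_hosts file in CI):

```yaml
script:
  - apt-get update && apt-get install -y lftp
  - lftp -u "$FTP_USERNAME","$FTP_PASSWORD" -e "set sftp:auto-confirm yes; mirror -R dist/ $FTP_SITE_ROOT; quit" sftp://$FTP_HOST
```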
You don't need the full path of the FTP site; just use the path as below:
-R /maindomain.com/subdomain.maindomain.com dist/*
To check the physical path of the site, go to Site -> Manage FTP Site -> Advanced Settings.
There you will find the physical path, which must not be included when using the CLI.
I am using the following configuration: Ubuntu 16.04, Apache 2, PHP 7.0, ownCloud 10.0.3. I think I made an error when I set up ownCloud. The data directory lives in /var/www/owncloud/data (I believe owncloud.log resides in this folder). I have deployed fail2ban, and the issue I am having is that fail2ban cannot access the data folder because I ran sudo chown -R www-data:www-data /var/www/owncloud/. The only way I can access the log file is through the ownCloud GUI under Settings > General > Log, where I can see my own failed login attempts. I cannot seem to get fail2ban to read the ownCloud log.
I am new to Ubuntu and ownCloud; can anyone advise how to rectify this issue? ownCloud is working fine, and I am using IP addresses to restrict access to it. fail2ban was supposed to make the server secure enough that I could open ownCloud up to the internet.
Regards
Steve
You should change the permissions of the log file so that it can be read by everyone but written only by the PHP process. Do a 'chmod 644 /var/log/owncloud/owncloud.log' (use the actual location of your log file; given your setup it may be /var/www/owncloud/data/owncloud.log).
By the way, I suggest that you migrate from ownCloud to Nextcloud. It is a full replacement, fully open source, with more features and better security. And it has fail2ban-equivalent brute-force protection already built in :-)
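If you stay with fail2ban, making the log readable is only half the job; you also need a jail and a matching filter. A minimal sketch, assuming the log path from the question; the filter name and failregex below are assumptions (nothing shipped by default), so verify the regex against a real failed-login line from your own log before enabling:

```
# /etc/fail2ban/jail.local
[owncloud]
enabled  = true
port     = 80,443
filter   = owncloud
logpath  = /var/www/owncloud/data/owncloud.log
maxretry = 3

# /etc/fail2ban/filter.d/owncloud.conf  (separate file)
[Definition]
failregex = Login failed:.*\(Remote IP: '<HOST>'\)
```

You can test the filter against the live log with fail2ban-regex /var/www/owncloud/data/owncloud.log /etc/fail2ban/filter.d/owncloud.conf before restarting fail2ban.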
I am a junior front-end developer, and I am working on improving my process using the command line. I would like to push my local changes to my server without having to use an FTP client like FileZilla. I am currently dragging and dropping files manually in the client and would like to learn how developers perform this process. I am building a static site using SiteLeaf on a Mac. Thanks in advance for help with this workflow.
If your target has SSH installed you can use SCP:
$ scp -r /local/dir/to/transfer your_remote_user@remote_address:/path/to/save/dir
This can also be used to transfer single files: just remove the -r (recursive) option and specify file paths instead of directories.
I am trying to automate an application deployment, and as part of this I need to upload a file to a server. I have created a minimal user and configured chroot for the SFTP server, but I can't work out how to upload a file non-interactively.
At present I am doing scp myfile buildUser@myserver.com:newBuilds/
I tried sftp buildUser@myserver.com myfile (newBuilds is the chroot dir), but this didn't upload anything, although it did connect.
The reason for favouring this approach and NOT using scp is that it's a lot more difficult to restrict scp access (from what I have learned).
If you are using OpenSSH server, chrooting works for both SCP and SFTP.
For instructions see:
https://www.techrepublic.com/article/chroot-users-with-openssh-an-easier-way-to-confine-users-to-their-home-directories/
So I believe your concern is moot.
Anyway, sftp (assuming OpenSSH) is not really designed for command-line-only upload. You typically use the -b switch to specify a batch file containing a put command.
sftp -b batchfile buildUser@myserver.com
With the batchfile containing:
put /local/path /remote/path
If you really need command-line-only upload, see:
Single line sftp from terminal or
Using sftp like scp
So basically, you can use various forms of input redirection like:
sftp buildUser@myserver.com <<< 'put /local/path /remote/path'
Or simply use scp instead of sftp. Most servers support both. And OpenSSH scp has actually supported the SFTP protocol since 8.7.
Since OpenSSH 9.0, it even uses SFTP by default. In 8.7 through 8.9, SFTP has to be selected via the -s parameter. See my answer to the already mentioned Single line sftp from terminal.
You can pass inline commands to SFTP like this:
sftp -o PasswordAuthentication=no user@host <<END
lcd /path/to/local/dir
cd /path/to/remote/dir
put file
END
I resolved this issue by approaching it from a different side. I tried configuring chroot for SFTP but could not get it to work. My solution was to use rssh and only allow scp. This works for me because the user I am trying to restrict is a known and authenticated user.
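For reference, the rssh side of that setup is a small change. A sketch of /etc/rssh.conf, assuming the stock Debian/Ubuntu package layout (rssh also supports per-user overrides in the same file):

```
# /etc/rssh.conf -- globally allow scp only; sftp and shell access stay denied
allowscp
```

You then point the restricted account at rssh as its login shell, e.g. `chsh -s /usr/bin/rssh buildUser`, so that scp works but interactive logins are refused.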