laravel - cannot open paths.php on server - centos

This one's a weird one. For some reason, out of the blue, every time I create a new project and upload it to my server, it won't allow me to edit the paths.php file through FTP.
I accessed the server through the command line earlier today to install a bundle and noticed the paths.php file was green and had a star next to it. Does anyone know what this means and whether it is preventing me from opening this file?
regards

The permissions of the file are 755, which means:
755 = rwx r-x r-x
Owner has Read, Write and Execute
Group has Read and Execute only
Other has Read and Execute only
Looking at the picture, qsradmin is the owner of the file, so that user is the only one who can write to or edit the file.
In order to change the owner of the file, use the chown command like this:
chown NameOfTheUser paths.php
For more information, check out Unix File permissions
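If your FTP client logs in as a different account from qsradmin, a quick way to confirm and fix this on the server is to check the owner and reassign it. A minimal sketch, assuming a hypothetical FTP account named ftpuser and the project living in /var/www/myproject:
# check the current owner and permissions
ls -l /var/www/myproject/paths.php
# hand ownership to the account your FTP client uses (run as root)
chown ftpuser /var/www/myproject/paths.php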

Using p4 zip and unzip to export files from one perforce server to another

I was trying to export files along with their revision history inside my depot folder from a 2015.2 to a 2019 Perforce server. Also, I would want Perforce to create new users on my new server corresponding to the committers/submitters on my original 2015 repo.
Perforce replication looked like overkill for my current task, and then I came across this article on Perforce's website that mentioned p4 zip.
It looked like it would solve my problem, but the article has a few issues I could not understand.
Let's say I am moving data from server1_ip:port --> server2_ip:port
I am currently following these steps
Making a zip of the folder to be copied using
p4 remote my_remote_spec, setting
Address: server1_ip:port
DepotMap: //depot/... //depot2/...
p4 -p server1_ip:port zip -o test.zip -r my_remote_spec -A //depot/...
But on this step I get a permission denied error. This is weird to me because the user, although not super/admin, has access to the files I ask to get zipped.
Also, when I did try with a super user, I could not find test.zip, even though I was not prompted with any errors.
Isn't the above command supposed to generate a zip file inside the directory from which I run it?
Is the unzip command supposed to be run after a p4 login from a user of the second server?
Lastly, from the document, why is a third port, 1667, mentioned in the transfer of files between servers running on 1666 and 1777?
On this step I get a permission denied error. This is weird to me because the user, although not super/admin, has access to the files I ask to get zipped.
This is expected:
C:\Perforce\test>p4 help zip
zip -- Package a set of files and their history for use by p4 unzip
...
The zip command requires super permission granted by p4 protect.
Isn't the above command supposed to generate a zip file inside the directory from which I run it?
Similar to p4 admin checkpoint, the zip file is written to the server machine (relative to the server root, if you don't specify an absolute path), rather than being transferred to the local client directory. This is not explicitly stated in the documentation (which seems like an oversight), but if you look in the root directory of the server where you ran the zip, you should find your test.zip there.
Is the unzip command supposed to be run after a p4 login from a user of the second server?
Yes, any time you run a command against a particular server, you will need to be logged in to that server. In the case of p4 unzip you will need at least admin permission on the second server.
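Putting that together, the end-to-end flow would look roughly like this. This is only a sketch reusing the remote spec and archive names from the question; check p4 help zip and p4 help unzip on your server versions for the exact flags and for where the archive path is resolved.
# on server1 (requires super permission there)
p4 -p server1_ip:port zip -o test.zip -r my_remote_spec -A //depot/...
# test.zip ends up relative to server1's server root, not your client directory
# then log in to the second server (requires at least admin there) and import
p4 -p server2_ip:port login
p4 -p server2_ip:port unzip -i test.zip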
Lastly, from the document, why is a third port, 1667, mentioned in the transfer of files between servers running on 1666 and 1777?
I'm pretty sure that's a typo; whoever wrote the article started off using ports 1666 and 1777, changed their mind halfway through, and didn't proofread. :)

Moodle: PDFs are empty

Many PDFs from different courses appear to have been corrupted or something. We first noticed when trying to view one in Chrome and getting the error "Failed to load PDF document." In Internet Explorer the page just shows up empty.
When viewing the file in the "Updating file in" area, it says the following: "Either the file does not exist or there is a permission problem." It has a file size, but when I click on Download, the file is 0 KB.
Where are the files saved? Why are they corrupted?
Update: I've narrowed it down to the fact that /moodledata/filedir has lost all the references. The folders are there, as well as the files. Is there any way to fix this without having to reupload all PDFs?
I am on version 3.6.3 on Windows.
The content/path hash is stored in the mdl_files table - maybe have a look in there to see if you can match up the files. The hash should match the folder/file name.
SELECT *
FROM mdl_files
WHERE filename LIKE '%pdf%'
OR mimetype LIKE '%pdf%'
OR source LIKE '%pdf%'
Also, check the file permissions. I don't use Windows, so I'm not sure how it works there. But on Linux, the web server should have access to the data folder.
Something like:
sudo chown -R www-data:www-data /pathto/moodledata/
sudo chmod -R 02777 /pathto/moodledata/
see https://docs.moodle.org/38/en/Security_recommendations#Most_secure.2Fparanoid_file_permissions
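To match the database rows to the files on disk: in a standard Moodle install, each file is stored under filedir in two levels of subfolders taken from the first four characters of its contenthash. A rough check, using a made-up hash purely for illustration:
# suppose mdl_files reports contenthash = 1a2b3c4d5e6f7a8b9c0d1a2b3c4d5e6f7a8b9c0d for a missing PDF
# the file on disk should then live at:
ls -l /pathto/moodledata/filedir/1a/2b/1a2b3c4d5e6f7a8b9c0d1a2b3c4d5e6f7a8b9c0d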

Total Commander external editor emacs "permission denied" on editing files

I configured my Total Commander so I can open a file (e.g. *.txt) within emacs.
Therefore I set up my editor as:
D:\Tools\emacs\bin\emacsclientw.exe "%1"
When I now open my file everything is OK. But when I edit it and save it, emacs tells me the following:
Saving file c:/log.txt...
basic-save-buffer-2: Opening output file: permission denied, c:/log.txt
How do I make it run so it can actually edit files?
By default you should not be saving anything to the root of C:\.
It is just bad practice, and by default a normal user does not have permission to write there.
Instead, create a working directory in your Documents folder and a log directory inside that; then you will have something like:
C:\Users\Frank\Documents\Working\logs\log.txt
This should not produce any permission errors.
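For example, from a Command Prompt you could create that structure and then point your logging at it instead of C:\ (a sketch; the folder names simply follow the example above):
mkdir "%USERPROFILE%\Documents\Working\logs"
rem then save to %USERPROFILE%\Documents\Working\logs\log.txt instead of c:/log.txt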

Capistrano v3 not able to cleanup old releases

Since I'm running my Rails app as root, it creates files in the tmp directory that are owned by root. Because of this,
cap production deploy:cleanup
can't remove old releases, because it is not run as root.
I've looked at the Capistrano v3 code, but I don't see a way to run the cleanup command as root. Is this option missing, or is this problem occurring because I'm doing something wrong in another place of the deployment flow?
I start the app as root because I need to bind to port 80.
What you can also do is trigger a task just before cleaning up the old releases:
namespace :deploy do
  before :cleanup, :cleanup_permissions

  desc 'Set permissions on old releases before cleanup'
  task :cleanup_permissions do
    on release_roles :all do |host|
      releases = capture(:ls, '-x', releases_path).split
      if releases.count >= fetch(:keep_releases)
        info "Cleaning permissions on old releases"
        directories = (releases - releases.last(1))
        if directories.any?
          directories.each do |release|
            within releases_path.join(release) do
              execute :sudo, :chown, '-R', 'deployuser', 'path/to/your/files/written/by/root'
            end
          end
        else
          info t(:no_old_releases, host: host.to_s, keep_releases: fetch(:keep_releases))
        end
      end
    end
  end
end
Note that you'll need to give your deployment user the right to execute this specific sudo command (with a sudoers definition file).
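A minimal sketch of such a sudoers entry, assuming the deploy user is called deployuser and chown lives at /bin/chown (check with which chown, and always edit via visudo):
# /etc/sudoers.d/deployuser
deployuser ALL=(root) NOPASSWD: /bin/chown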
I've looked at the Capistrano v3 code, but I don't see a way to run the cleanup command as root. Is this option missing, or is this problem occurring because I'm doing something wrong in another place of the deployment flow?
There is no secret sauce in Capistrano; we rely on you having correctly set up the permissions for your deploy user, as documented at http://www.capistranorb.com/
Removing directories requires write permissions on the parent directory, that is to say, given the following directory structure:
/var/www/releases/
\- 20131015180000
\- 20131015181500
\- 20131015183000
You need write permission on the /var/www/releases/ directory, because the list of files and directories in that directory is itself stored in the directory.
From a similar Stack Overflow question:
In UNIX and Linux, the ability to remove a file is not determined by the access bits of that file. It is determined by the access bits of the directory which contains the file.
From the Wikipedia article on Unix File Permissions:
The write permission grants the ability to modify a file. When set for a directory, this permission grants the ability to modify entries in the directory. This includes creating files, deleting files, and renaming files.
One of the things you may want to do is to create a group called app or web on your Linux box and add root and the deploy user to it. Then, as part of your deployment, chmod the release_path to g+s (setgid), which ensures that any new files created by the root user inherit that group; combined with group write permission, the deploy user keeps access to them.
You should then be able to remove the old folders as deploy user.
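A sketch of that setup, assuming a group named app, a deploy user named deploy, and /var/www/releases as the release path (all three are placeholders, substitute your own):
sudo groupadd app
sudo usermod -a -G app root
sudo usermod -a -G app deploy
sudo chgrp -R app /var/www/releases
sudo chmod -R g+w /var/www/releases
# set the setgid bit on directories so newly created files inherit the app group
sudo find /var/www/releases -type d -exec chmod g+s {} +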
I was running into similar issues, so, to confirm, I logged into my web server via SSH and tried rm -rf [directory], which also failed due to the same permissions issues, even when logged in as admin. Running chmod -R 755 [directory]/ and then rm -rf [directory]/ did work, though.
To fix it, in the project's silverstripe.rake file, I changed the command being run from:
execute :chown, "-R [user]:[group] /path/to/project"
to:
execute :chmod, "-R 755 /path/to/project"
So far, no more issues with deleting the oldest release when running cap [release name] deploy

Postgres ERROR: could not open file for reading: Permission denied

Computer: Mac OS X, version 10.8
Database: Postgres
Trying to import a CSV file into Postgres.
pg> copy items_ordered from '/users/darchcruise/desktop/items_ordered.csv' with CSV;
ERROR: could not open file "/users/darchcruise/desktop/items_ordered.csv" for reading: Permission denied
Then I tried
$> chown postgres /users/darchcruise/desktop/items_ordered.csv
chown: /users/darchcruise/desktop/items_ordered.csv: Operation not permitted
Lastly, I tried
$> ls -l
-rw-r--r-- 1 darchcruise staff 1016 Oct 18 21:04 items_ordered.csv
Any help is much appreciated!
Assuming the psql command-line tool, you may use \copy instead of copy.
\copy opens the file and feeds the contents to the server, whereas copy tells the server to open the file itself and read it, which may be problematic permission-wise, or even impossible if client and server run on different machines with no file sharing in between.
Under the hood, \copy is implemented as COPY FROM stdin and accepts the same options as the server-side COPY.
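In this case the original command only needs the backslash added, since the rest of the syntax carries over (run from inside psql):
\copy items_ordered from '/users/darchcruise/desktop/items_ordered.csv' with CSV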
Copy the CSV file to /tmp
For me this solved the issue.
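For example (this works because the server process can normally traverse and read /tmp; note that on macOS /tmp resolves to /private/tmp, as a later answer shows):
cp /users/darchcruise/desktop/items_ordered.csv /tmp/
then, back in psql:
copy items_ordered from '/tmp/items_ordered.csv' with CSV;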
chmod a+rX /users/darchcruise/ /users/darchcruise/desktop /users/darchcruise/desktop/items_ordered.csv
This will change access rights for your folder. Note that everyone will be able to read your file.
You can't use chown as a user without administrative rights.
Also consider learning umask to ease creation of shared files.
Copy your CSV file into the /tmp folder
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
I had the issue when I was trying to export data from a remote server onto my local disk. I hadn't realised that the SQL COPY command is actually executed on the server and tries to write to a folder on the server. Instead, the correct thing to do was to use \copy, which is the psql command and writes to the local file system as I expected. http://www.postgresql.org/message-id/CAFjNrYsE4Za_KWzmfgN1_-MG7GTw_vpMRxPk=OEjAiLqLskxdA#mail.gmail.com
Perhaps that might be useful to someone else too.
Another way to do this, if you have pgAdmin and are comfortable using the GUI, is to go to the table in the schema, right-click on the table you wish to import the file into, and select "Import". Browse your computer for the file, select the file type and the columns you want the data imported into, and then select Import.
That was done using pgAdmin III and the 9.4 version of PostgreSQL.
I resolved the same issue with a recursive chown on the parent folder:
sudo chown -R postgres:postgres /home/my_user/export_folder
(my export being in /home/my_user/export_folder/export_1.csv)
For a MacBook, I first opened a terminal and typed
open /tmp
or, in Finder, press Command+Shift+G and type /tmp in the "Go to the folder" dialog.
That opens the temp folder in Finder. I then pasted the copied CSV file into this folder, went back to the Postgres terminal, and typed the command below, which copied my CSV data into the db table:
\copy recharge_operator FROM '/private/tmp/operator.csv' DELIMITER ',' CSV;
COPY your table (Name, Latitude, Longitude) FROM 'C:\Temp\your file.csv' DELIMITERS ',' CSV HEADER;
That is, place the file under C:\Temp\ and point COPY at that path.
For me it worked to simply to add sudo (or run as root) for the chown command:
sudo chown postgres /users/darchcruise/desktop/items_ordered.csv
You must grant the pg_read_server_files role to the user if you are not connecting as the postgres superuser.
Example:
GRANT pg_read_server_files TO my_user WITH ADMIN OPTION;
Just in case you're facing this problem under Windows 10, add the user group "yourcomputer\Users" on the Security tab and grant it Full Control; that solved my issue.
I had the same error message but was using psycopg2 to communicate with PostgreSQL. I fixed the permission issues by using the functions copy_from and copy_expert that will open the file on the client side as the user running the python script and feed the data to the database over STDIN.
Refer to this link for further information.
This answer is only for Linux Beginners.
Assuming initially the DB user didn't have file/folder(directory) permission on the client side.
Let's constrain ourselves to the following:
User: postgres
Purpose: You wanted to (write to / read from) a specific folder
Tool: psql
Connected to a specific database: YES
FILE_PATH: /home/user/training/sql/csv_example.csv
Query: \copy (SELECT * FROM table_name) TO FILE_PATH DELIMITER ',' CSV HEADER;
Actual Results: After running the query you got an error: Permission Denied
Expected Results: COPY COUNT_OF_ROWS_COPIED
Here are the steps I'd follow to try and resolve it.
Confirm the FILE_PATH permissions on your File system.
Inside a terminal to view the permissions for a file/folder you need to long list them by entering the command ls -l.
The output has a section that shows something like this -> drwxrwxr-x
which is interpreted in the following way:
TYPE | OWNER RIGHTS | GROUP RIGHTS | OTHER RIGHTS
rwx (r: read, w: write, x: execute)
TYPE (1 char) = d: directory, -: file
OWNER RIGHTS (3 chars after TYPE)
GROUP RIGHTS (3 chars after OWNER)
OTHER RIGHTS (3 chars after GROUP)
If the permissions are not enough, ensure that the relevant user can at least enter (x) every folder in the path.
This means that for FILE_PATH, all the directories (home, user, training, sql) should have at least an x in the rights that apply to that user.
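A quick way to check every directory along that path at once (a sketch using the example FILE_PATH above):
ls -ld /home /home/user /home/user/training /home/user/training/sql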
Change permissions for all parent folders that you need to enter so that they have an x. You can use chmod rights_you_want parent_folder
Assuming /training/ didn't have an execute permission:
I'd go to the user folder and enter chmod a+x training
Change the destination folder/directory to have a w if you want to write to it, or at least an r if you want to read from it.
Assuming /sql didn't have a write permission:
I would now run chmod a+w sql
Restart the PostgreSQL server: sudo systemctl restart postgresql
Try again.
This would most probably help you now get a successful expected result.
On Linux you can fix this by giving the postgres user read/write/execute permissions on the target directory. E.g.:
setfacl -m u:postgres:rwx /home/hi
I just copied the source CSV file to another folder with more permissive access (C:/temp), and it worked fine.
Maybe you are using pgAdmin connected to a remote host and trying to import a file from your own system, but the server searches for that file in the remote system's file system. That's the error I faced; maybe it applies to you as well, so check it.