Parallels Plesk file permissions - web server

I'm trying to install a Joomla site in Parallels Plesk Panel via Akeeba Backup, and I'm running into a file permission issue.
An error occurred:
Could not open /var/www/vhosts/xyz.com/httpdocs/pearl_new/jquery.min.js for writing.
I searched all over, including the Plesk forum, and found that this is a very common problem. Some suggested that installing mod_suphp would solve it. I tried, but I don't know whether it was installed successfully or not.
Then I created a new service plan and, in its hosting parameters, selected Run PHP as FastCGI.
After that I moved my domain to that service plan, thinking it would solve the problem, but I am still getting the same error. Can anyone help, please?

On the ssh command line try:
find /var/www/vhosts/xyz.com/httpdocs/ -type f -exec chmod 664 {} \;
find /var/www/vhosts/xyz.com/httpdocs/ -type d -exec chmod 775 {} \;
These will set the permissions correctly for writing by user and group, for files (f) and directories (d). You also need to make sure that apache is in the psacln and psaserv groups in /etc/group; the lines should look like this:
psaserv:x:504:apache,psaftp,psaadm
psacln:x:505:apache
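If apache is missing from those groups, you should be able to add it with usermod; a minimal sketch, assuming the web server user really is apache (on Debian-based systems it is often www-data instead):
# -a -G appends the supplementary groups without removing existing ones
usermod -a -G psaserv,psacln apache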
Then you can run the command:
chown -R siteusername:psacln /var/www/vhosts/xyz.com/httpdocs/*
where "siteusername" is the username of the site's files.
Hope this helps.

This is a common issue on Linux shared hosting, and the fix is simple.
If you have already selected PHP running as FastCGI, follow these steps:
Open File Manager.
Create a new folder, e.g. "abc".
Click "All" on the right side to view all files in the tree.
Select all files and folders except "plesk-stats".
Click the Copy/Move button.
In the path field, type /httpdocs/abc/
Click Move.
Once all files are moved, open the "abc" folder.
Select all files and folders.
Click the Copy/Move button.
In the path field, type /httpdocs/
Click Move. That's it; the issue is sorted out. (Moving files through the Plesk File Manager recreates them under the subscription's system user, which resets the ownership.)
I have used these steps for many clients.
I hope this helps someone.


Using p4 zip and unzip to export files from one Perforce server to another

I am trying to export files, along with their revision history, inside my depot folder from a 2015.2 Perforce server to a 2019 server. I would also like Perforce to create users on my new server corresponding to the committers/submitters on my original 2015 repo.
Perforce replication looked like overkill for my current task, and then I came across an article on Perforce's website that mentioned p4 zip.
It looked like it would solve my problem, but there are a few things in the article I could not understand.
Let's say I am moving data from server1_ip:port --> server2_ip:port
I am currently following these steps:
Making a zip of the folder to be copied using p4 remote my_remote_spec, setting:
Address: server1_ip:port
DepotMap: //depot/... //depot2/...
Running p4 -p server1_ip:port zip -o test.zip -r my_remote_spec -A //depot/...
But at this step I get a permission denied error. This is weird to me, because the user, although not super/admin, has access to the files I am asking to get zipped.
Also, when I tried with a super user, I could not find test.zip, even though no errors were reported.
Isn't the above command supposed to generate a zip file inside the directory I run it from?
Is the unzip command supposed to be run after a p4 login as a user of the second server?
Lastly, why does the document mention a third port, 1667, in the transfer of files between servers running on 1666 and 1777?
At this step I get a permission denied error. This is weird to me, because the user, although not super/admin, has access to the files I am asking to get zipped.
This is expected:
C:\Perforce\test>p4 help zip
zip -- Package a set of files and their history for use by p4 unzip
...
The zip command requires super permission granted by p4 protect.
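In other words, a superuser has to grant you super access before zip will run. A minimal sketch of the protections entry a superuser would add by running p4 protect (yourname is a placeholder for your actual account):
# added at the end of the Protections table in the p4 protect spec
super user yourname * //...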
Isn't the above command supposed to generate a zip file inside the directory I run it from?
Similar to p4 admin checkpoint, the zip file is written to the server machine (relative to the server root, if you don't specify an absolute path), rather than being transferred to the local client directory. This is not explicitly stated in the documentation (which seems like an oversight), but if you look in the root directory of the server where you ran the zip, you should find your test.zip there.
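If you want the output somewhere predictable, you can pass -o an absolute path, which is interpreted on the server machine; a sketch, where /tmp/test.zip is just an example location:
p4 -p server1_ip:port zip -o /tmp/test.zip -r my_remote_spec -A //depot/...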
Is the unzip command supposed to be run after a p4 login as a user of the second server?
Yes, any time you run a command against a particular server, you will need to be logged in to that server. In the case of p4 unzip you will need at least admin permission on the second server.
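Putting that together, the sequence against the second server would look something like this (a sketch; it assumes test.zip is accessible from where you run the command, and the -A mirrors the flag used when zipping):
p4 -p server2_ip:port login
p4 -p server2_ip:port unzip -A -i test.zip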
Lastly, why does the document mention a third port, 1667, in the transfer of files between servers running on 1666 and 1777?
I'm pretty sure that's a typo; whoever wrote the article started off using ports 1666 and 1777, changed their mind halfway through, and didn't proofread. :)

Moodle: PDFs are empty

Many PDFs from different courses appear to have been corrupted. We first noticed when trying to view them in Chrome and getting the error "Failed to load PDF document." In Internet Explorer the page just shows up empty.
When viewing a file in the "Updating file in" area, it says the following: "Either the file does not exist or there is a permission problem." It has a file size, but when I click Download, the file is 0 KB.
Where are the files saved? Why are they corrupted?
Update: I've narrowed it down: /moodledata/filedir seems to have lost all its references. The folders are still there, as are the files. Is there any way to fix this without having to reupload all the PDFs?
I am on Moodle 3.6.3 on Windows.
The content hash is stored in the contenthash column of the mdl_files table; maybe have a look in there to see if you can match up the files. The hash should match the folder/file names on disk: Moodle stores each file in moodledata/filedir under the first two characters of its contenthash, then the next two, then the full hash as the file name.
SELECT *
FROM mdl_files
WHERE filename LIKE '%pdf%'
OR mimetype LIKE '%pdf%'
OR source LIKE '%pdf%'
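Once you have a contenthash from that query, you can check whether the corresponding file is still on disk; a sketch with a made-up hash (Linux-style path, so adjust for Windows):
# a file with contenthash 1a2b3c4d... lives two directory levels deep
ls -l /pathto/moodledata/filedir/1a/2b/1a2b3c4d...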
Also, check the file permissions. I don't use Windows, so I'm not sure how it works there, but on Linux the web server should have access to the data folder.
Something like:
sudo chown -R www-data:www-data /pathto/moodledata/
sudo chmod -R 02777 /pathto/moodledata/
see https://docs.moodle.org/38/en/Security_recommendations#Most_secure.2Fparanoid_file_permissions

Need help writing a basic command line

I'm using Windows 10, if it matters, and I'm trying to feed a file to the oeminst app, which will convert it from .EDR to .CCSS. According to the app's website, its usage summary is this:
oeminst [-options] [inputfiles]
-v Verbose
-n Don't install, show where files would be installed
-c Don't install, save files to current directory
-S d Specify the install scope: u = user (def.), l = local system
infile Manufacturers setup.exe install file(s) or .dll(s) containing install files
infile.[edr|ccss|ccmx] EDR file(s) to translate and install or CCSS or CCMX files to install
If no file is provided, oeminst will look for the install CD.
More info can be found here: https://www.argyllcms.com/doc/oeminst.html
So far I have tried this:
C:\Users\PC>oeminst infile. [C:\Users\PC\testfile.edr]
oeminst: Error - Unable to load file 'infile [C:\Users\PC\testfile]'
I'd appreciate it if someone could at least tell me whether I'm doing it right or not.
Try this: oeminst C:\Users\PC\testfile.edr (in the usage summary, infile is just a placeholder for your file's path, not something you type literally).
Never mind, I got it:
C:\Users\PC>oeminst C:\Users\PC\testfile.edr
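If you only want the converted .ccss written to the current directory instead of installed, the -c option from the usage summary above should do it; a sketch:
oeminst -c C:\Users\PC\testfile.edr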

CTools do not show up in Pentaho UI

I am using Pentaho CE 5 on Windows. I would like to use CTools, but I can't make them show up in the File -> New menu.
Being behind a proxy, I cannot use the Marketplace plugin, so I have tried a manual installation.
First, I tried to use ctools-installer.sh. I ran the following command line in Cygwin (wget and unzip are installed):
./ctools-installer.sh -s /cygdrive/d/Users/[user]/Mes\ Programmes/pentaho/biserver-ce/pentaho-solutions/ -w /cygdrive/d/Users/[user]/Mes\ programmes/pentaho/biserver-ce/tomcat/webapps/pentaho/
The script starts, asks me what module I want to install, and begins the downloads.
For each module, I get output like this (set -x added to the script):
+ echo -n 'Downloading CDF...'
Downloading CDF...
+ wget -q --no-check-certificate 'http://ci.analytical-labs.com/job/Webdetails-CDF-5-Release/lastSuccessfulBuild/artifact/bi-platform-v2-plugin/dist/zip/dist.zip' -O .tmp/cdf/dist.zip
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
+ '[' '!' -z '' ']'
+ rm -f .tmp/dist/marketplace.xml
+ unzip -o .tmp/cdf/dist.zip -d .tmp
End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
unzip: cannot find zipfile directory in .tmp/cdf/dist.zip, and cannot find .tmp/cdf/dist.zip.zip, period.
+ chmod -R u+rwx .tmp
+ echo Done
Done
Then the script ends. I have seen on this page (pentaho-bi-suite) that this is the normal output. Nevertheless, it seems a bit strange to me, and when I start my Pentaho server (login: admin/password), I cannot see any new tools in the menus.
After looking at a few other tutorials and at the script itself, I downloaded the .zip snapshots for every tool and unzipped them into the system directory of my Pentaho server. Same result.
I would like to make the .sh script work; what can I try or adjust?
Thanks
EDIT 05/06/2014
I checked the dist.zip files downloaded by the script, and they are all empty. It seems that wget cannot fetch the zip files, and therefore the installation fails.
When I try to fetch any web page through wget, it fails. I think it is because of the proxy.
Here is my .wgetrc file, located in my user's cygwin home folder:
use_proxy=on
http_proxy=http://[url]:[port]
https_proxy=http://[url]:[port]
proxy_user=[user]
proxy_password=[password]
How could I make this work?
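A quick way to confirm whether wget itself can get through the proxy at all, independent of the installer (a sketch; any external URL will do):
# discard the page; just report whether the fetch succeeded
wget -q -O /dev/null http://www.google.com && echo "proxy OK" || echo "proxy problem"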
EDIT 10/06/2014
In the end, I have changed my network connection settings to bypass the proxy. It seems that there is an offline mode for the installer, so one can download all needed files on a proxy-free environment and then run the script offline.
I guess this is related to the -r option.
I consider this post solved, since it is no longer a CTools issue.
It is difficult to identify the issue in the above procedure, but you can refer to this blog; its author is a key member of Pentaho itself.
You can manually install the components from http://www.webdetails.pt/ctools/, or, if you have Pentaho 5.1 or above, you can add the following parameters to the CATALINA_OPTS option (in start-pentaho.bat or start-pentaho.sh):
-Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"
http://docs.treasuredata.com/articles/pentaho-dataintegration#tips-how-can-i-use-pentaho-through-a-proxy
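For instance, in start-pentaho.sh the option might be set along these lines (a sketch; proxy.example.com and 8080 are placeholders for your actual proxy):
# extend the existing options rather than overwriting them
export CATALINA_OPTS="$CATALINA_OPTS -Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080"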

My Joomla site is showing blank

I have exported my Joomla site from its beta release, nvpccbeta.com, to what is to become nvpcc.org, and it is showing blank. I know configuration.php is running, so I checked the error log, and it is showing me:
PHP Warning: main() [function.main]: SAFE MODE Restriction in effect. The script whose uid is 10006 is not allowed to access /var/www/vhosts/nvpcc.org/httpdocs/includes/defines.php owned by uid 0 in /var/www/vhosts/nvpcc.org/httpdocs/index.php on line 21
[client 68.224.6.162] PHP Warning: main(/var/www/vhosts/nvpcc.org/httpdocs/includes/defines.php) [function.main]: failed to open stream: Success in /var/www/vhosts/nvpcc.org/httpdocs/index.php on line 21
I have changed safe mode to off in my php.ini and it is still showing the same problem. I need help, thank you.
It could be the Live Site variable in your configuration.php file. Did you change that? If you didn't, it will give you problems.
The error listed above says that the user your web server runs as can't access /var/www/vhosts/nvpcc.org/httpdocs/includes/defines.php: the files are owned by the wrong uid. This is an issue on the server.
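You can check the numeric ids for yourself; ls -ln prints raw uid/gid instead of names, so you can compare them against the 10006 and 0 from the warning:
ls -ln /var/www/vhosts/nvpcc.org/httpdocs/includes/defines.php /var/www/vhosts/nvpcc.org/httpdocs/index.php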
The following commands must be run from the web root.
I would change everything in the web root to the correct user:
chown -R user:group *
Then verify folder and file permissions:
find . -type d -exec chmod 755 {} \;
find . -type f -exec chmod 644 {} \;
Also: after changing safe_mode to off, did you restart your web server?
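The restart step is easy to forget; on a typical Plesk box the web server is Apache on a Red Hat-style system, so something like this (a sketch; the service may be named apache2 on other distributions):
service httpd restart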