edited cgi file and then the website stopped [closed] - perl

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
First of all, I am not an expert.
I have been working on a website.
I have looked at a CGI file:
eval {
# configuration
$ui_title="Something";
$ui_title_abbrev="Something";
$ui_dblue="#400080";
$ui_blue ="#6464FF";
$ui_lblue="#d0d9f4";
...
}
I changed the value of the parameter $ui_title from "Something" to "Something_else".
Then the website stopped working.
I changed the value back from "Something_else" to "Something",
but the website still does not return to its previous state.
Please advise on the next step.

If you really want to get to the bottom of the error, look at the web server's error log file. It will state the exact reason for the error. If you don't understand the error, searching for it on Google will give you more details and probably the exact solution to the problem. Things never just stop working for no reason.
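A quick first check, assuming you have shell access to the server and perl is on the path: ask perl to compile the edited script without running it. (The demo.cgi file below is a stand-in for the real script, not its actual contents.)

```shell
# A small stand-in for the edited CGI file
cat > demo.cgi <<'EOF'
$ui_title = "Something_else";
print "Content-type: text/html\n\n";
EOF

# -c compiles the script without executing it and reports any syntax error,
# which is the most likely breakage after a hand edit
perl -c demo.cgi    # prints "demo.cgi syntax OK" when the edit is clean

rm -f demo.cgi
```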
The answers below assume that your website is hosted on a Linux-based web host.
In my experience, the most common error in this kind of scenario is that the file's permissions changed when you edited or updated it. Perl/CGI files need a permission setting of 755 to work; in other words, the file must be executable. If you are using FTP to update or transfer the file, your FTP program will normally have an option to set the file's permissions to 755 on the server. If you are in a Linux terminal, you can use chmod to set the permissions. Also double-check that the directory holding the file has a permission setting of 755.
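For example, from a Linux shell (a temporary file stands in for your real .cgi path here):

```shell
# Temporary file standing in for the CGI script
cgi_file=$(mktemp)

# CGI scripts must be executable by the web server: mode 755 (rwxr-xr-x)
chmod 755 "$cgi_file"

# Verify the octal mode
stat -c '%a' "$cgi_file"    # prints 755

rm -f "$cgi_file"
```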
This is rare, but it happens: if you are using FTP to transfer the file to the server, it might not have transferred properly (broken connections, sluggish internet, etc.). Try re-transferring the file. You can also compare the size of your offline copy against the file on the server to make sure they are the same.
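One way to check for a corrupted transfer, assuming you can fetch the server's copy back down (the two files below are stand-ins for the local original and the downloaded copy):

```shell
# Stand-ins for the local original and the copy downloaded back from the server
printf 'print "hello";\n' > original.cgi
printf 'print "hello";\n' > downloaded.cgi

# cmp is silent and exits 0 when the files are byte-for-byte identical
cmp original.cgi downloaded.cgi && echo "files match"   # prints "files match"

# Checksums are handy when the two copies sit on different machines
md5sum original.cgi downloaded.cgi

rm -f original.cgi downloaded.cgi
```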
This answer assumes that you edited the file on Windows and have a Linux-based server. If you opened the file in a text editor and your web host or web server is Linux-based, make sure that you saved the file in the Unix file format. There are many good or more advanced text editors out there that can read and save files in the Unix format; get one such as Textpad (free to try), open the file, and save it in the Unix file format.
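To illustrate the problem and one fix (dos2unix does the same job if it is installed; the sed call below is the stock GNU alternative):

```shell
# A file saved with Windows (CRLF) line endings
printf 'line one\r\nline two\r\n' > script.cgi

file script.cgi    # reports "ASCII text, with CRLF line terminators"

# Strip the carriage returns in place, leaving Unix (LF) endings
sed -i 's/\r$//' script.cgi

file script.cgi    # now reports plain "ASCII text"
rm -f script.cgi
```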
If you're new to Perl/CGI scripts, the best advice is probably to make a backup of the file before you change it. The three issues above are some of the most common causes when there is nothing wrong with the code itself.


ftp-kr - Cannot edit in read-only editor

I cannot figure out how to use the ftp-kr extension for VS Code properly. I have read and re-read the GitHub Wiki and the README documentation and cannot find any other help for my issue. I posted this question to the project Issues on GitHub but have yet to receive a response.
ftp-kr is just a simple extension for editing files via FTP that are located on a remote server. Many other users seem to be using the extension with no issue.
I installed the extension, and then I successfully edited the connection settings (in ftp-kr.json) and successfully made a connection to the remote server I am trying to edit files on.
I opened the FTP-KR: EXPLORER pane to look at all of the files on the server, but if I double-click any of the files to open them and then try to edit them, I am unable to type and just get a warning in VS Code that says "Cannot edit in read-only editor".
I have tried right-clicking on files and clicking the "Download This" button, thinking that maybe I need to download a local copy of the files to edit before uploading the changes. However, whenever I click the "Download This" option on any file in the ftp-kr Explorer, it just gives me an error message that says "[file_name] is not in remotePath".
I tried running the >ftp-kr: Download All command, but it just spits out a notice that says "Nothing to DO".
How do I edit files located on the remote server and save those changes to the server?
P.S. I have tried the solutions found in this question but unfortunately none of them seem to work. Particularly, "code-runner: Run in terminal" is not in my settings and "Edit in Local" is not a context menu item that appears in my editor.
After a number of months, I finally have a solution to this thanks to the developer eventually responding on GitHub. There are a few things going on here.
Any file that a user wishes to modify must be downloaded as a local copy on the user's machine first and then that copy can be uploaded to the webserver via FTP.
ftp-kr cannot auto-download individual files (either when they are double-clicked or through any other method). It can only download entire directories, and those directories can be changed by using the localPath and ignore options in the configuration file, then stopping and restarting the FTP connection.
The "Download This" context menu option that appears is a piece of non-implemented code. It will not do anything.
>ftp-kr: Download All is the preferred way (by the developer's intent) to download the remote files onto the user's computer. The fact that it was returning an error before was a bug which has now been fixed.
Confusingly, the user can view the filenames and contents of every file on the remote file system using a convenient tree view; you just cannot download any of those files individually.
In all, this plugin does not provide the functionality I hoped it would have (namely, being able to easily download, modify, and upload individual files). So although I now know how to use it properly, I will be switching to a different plugin for my purposes.

Saving a text file from Citrix session to client machine

I've a requirement to save a text file from the citrix session to the client machine. So the published machine runs an executable and that executable 'knows' where the client machine is and can save the text file onto it.
I've tried to save it using the equivalent of the \\tsclient share as outlined in this pretty old article. I couldn't get it to work - the published executable couldn't see the file path back to the client.
Failing this, a programmatic solution would be good - we own the published executable, so we can amend it.
Many thanks for any/all help.
EDIT
Just to emphasize- I don't expect anyone to write this code for me, but some pointers to how it would be possible with decent references would be extremely helpful.

Symlink when the directory already exists [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 3 years ago.
We have added an NFS share onto our web servers and need to set-up a symlink - however, we are not sure how to do this when the directory already exists.
Webserver directory : /home/example/public_html/images
NFS target : /nfsshare
We want to have a symlink from /home/example/public_html/images -> /nfsshare
The NFS share has an up-to-date copy of everything in /home/example/public_html/images. We want it so that any time someone goes to https://www.example.com/images/123.jpg, or references /images/123.jpg in HTML, it serves the file /nfsshare/123.jpg, for example.
When we create the symlink
ln -s /nfsshare /home/example/public_html/images/
it appears to work fine, but when we browse to /images it still shows the original content, and the nfsshare folder ends up inside the images folder:
[root@host01 images]# ls
index.html nfsshare image.png
Is there any other way when you want to in effect point a folder to somewhere else rather than its current location?
The point of a symlink is to create a "virtual" folder (or file) that points to an existing folder (or file). But you can't create a folder or file when it already exists -- more precisely, you can't create a "virtual" folder (symlink) where a physical directory already exists.
... and if I understand you correctly, you already have an existing /home/example/public_html/images directory. You can't have both; you'd need to remove (or rename) the existing web server directory.
As an aside, the symlink got created inside the folder because you gave an existing directory as the link location, so ln placed the new link inside it.
Update:
Assuming a sane set of permissions on the server, you probably need to do this as root or using sudo. You should probably also verify that the permissions on the NFS folder restrict your web app(s) to read-only access.
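Putting the steps together as a sketch (run in a scratch directory here; on the real server the paths would be /home/example/public_html/images and /nfsshare, run as root or via sudo):

```shell
# Scratch setup standing in for the real server layout
cd "$(mktemp -d)"
mkdir -p nfsshare public_html/images
touch nfsshare/123.jpg

# 1. Move the existing physical directory out of the way (doubles as a backup)
mv public_html/images public_html/images.bak

# 2. Create the symlink: target first, then the link name.
#    Note: no trailing slash, and the link path must not already exist.
ln -s "$PWD/nfsshare" public_html/images

# 3. Verify: the link now resolves into the share
ls public_html/images/       # shows 123.jpg
readlink public_html/images  # prints the /nfsshare path
```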

How to find out current directory and go to a directory in MySQL console?

I have the following instruction I need to perform to run a web app that I have received:
"Go to the directory where the app is unpacked and type 'gradle jettyRun'."
Sounds simple enough, if you know the commands for finding out your current directory and changing it. The problem is, searching for these basic things only nets a huge amount of irrelevant answers to much more advanced questions where the same terms are used with a slightly different meaning. So what exactly do they mean by this, and how do I achieve it? It sounds so simple that I'm almost embarrassed to have to ask, yet I'm still dumbfounded enough by the MySQL command line to have to.
This has nothing to do with the MySQL command line (the mysql> prompt), or MySQL itself. This is simply saying:
Open your terminal or shell. In Windows, this is called Command Prompt.
Change the directory to where the files are located, you do this with the cd (change directory) command.
Next you simply type gradle jettyRun.
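As a concrete sketch (the directory here is a made-up stand-in; substitute wherever you actually unpacked the app):

```shell
# Stand-in for the unpacked app directory
app_dir=$(mktemp -d)

cd "$app_dir"    # "go to the directory" is just cd
pwd              # prints the current directory, confirming where you are

# From here, the instruction is literally the next command (not run in this demo):
# gradle jettyRun
```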

Best way to edit files through FTP client and save a backup automatically

It is a habit I have to edit files online. I have many working websites, and I don't want to back up all the files on them - only those that I have edited through FTP client software.
What is the best way to have a version tracker for these files? Something like GitHub.
I am not comfortable with editing websites on localhost and then moving them online. I am looking for a way to synchronize local and web files so that I always have the latest version of specific files.
Try something like WinSCP, or set up XAMPP, work locally while pushing to Bitbucket or GitHub, and then upload all the files through FTP once you are done. WinSCP is for Windows and allows you to edit files without having to download them, edit them, and re-upload them - it lets you edit them while they are live. However, the XAMPP route is the better way to go if you plan to work on other people's websites at any point.
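If you go the Git route, the local version-tracking part is only a few commands (the directory, file, and commit message here are illustrative):

```shell
# Scratch directory standing in for the local copy of one website
site=$(mktemp -d)
cd "$site"

git init -q                          # start tracking this directory
echo '<h1>v1</h1>' > index.html      # the file you are about to edit via FTP

git add index.html
git -c user.name=demo -c user.email=demo@example.com \
    commit -q -m "snapshot before editing"

git log --oneline                    # one line per saved version
```

Each commit is a restorable backup of exactly the files you changed, which is the selective tracking the question asks for.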