Scripted FTP Upload from Container - server

I am trying to upload a file from a container field to a location on FTP as a server-side script. I have been trying to use the Base Elements BE_FTP_Upload function, as I'm led to believe this works in a server script, but I simply cannot get it to work. I've managed to get a file onto the FTP server, but it's always blank, missing its content.
I should also add that the BE_Curl_Trace feedback shows a successful connection to the FTP server, so it seems to be my method of moving the file rather than a bad connection. Script attached. (Excuse the squiggles; data protection and whatnot.)

After all of this, simply changing "filewin:" to "file:" solved my problem; I am now exporting from FM to FTP via a scheduled server script :)
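For anyone doing something similar outside FileMaker, the same upload expressed with Python's ftplib looks roughly like the sketch below. This is only an illustration of the idea, not the Base Elements plugin call; the host, credentials, and file names are placeholders, not values from the original script.

    from ftplib import FTP

    # Placeholder connection details; substitute your own FTP host and credentials.
    HOST = "ftp.example.com"
    USER = "username"
    PASSWORD = "password"

    # Path to the file previously exported from the container field.
    local_path = "/tmp/exported_file.pdf"

    ftp = FTP(HOST)
    ftp.login(USER, PASSWORD)
    with open(local_path, "rb") as f:
        # STOR uploads the file's actual bytes, so the remote copy is not empty.
        ftp.storbinary("STOR exported_file.pdf", f)
    ftp.quit()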

Related

Using WGET to retrieve information from PLC - Error 400 Bad Request

I'm attempting to use the wget program to retrieve and save a list of data from my Siemens S7-1200 PLC. Using a batch file I had written, I was able to drill down the folder path to my wget.exe file. Upon running the wget executable I get the error message seen in the attached screenshot, labeled "Command Prompt Screenshot".
The command prompt shows me that I've "connected", and I know the username and password are correct because I can log into the PLC using my web browser. That's why I'm stumped about what the problem is.
Has anyone seen this before or can anyone point me in the right direction?
Thanks for the response, Ken. I was actually able to get it working with the assistance of Siemens technical support. Apparently my computer didn't like the way I was trying to pass it the username and password login credentials. Through the Siemens TIA Portal software I was able to remove the login restrictions, allowing all users access to reading data off the PLC, and it works now. I've attached a copy of the exact batch file I used. Also, to make sure I'm adding as much detail as possible, I have the batch file and the wget.exe file saved to a folder on my C:\ drive. Functional wget batch file
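If you ever want to do the same retrieval from a script instead of wget, a request with HTTP basic authentication might look like the sketch below. The URL, username, and password are placeholders for illustration, not the actual PLC settings, and the PLC's web server may use a different login scheme.

    import requests

    # Hypothetical PLC web server URL and credentials.
    url = "http://192.168.0.1/DataLogs/datalog.csv"
    user = "plc_user"
    password = "plc_password"

    # Pass the credentials as HTTP Basic auth rather than embedding them in the URL.
    response = requests.get(url, auth=(user, password), timeout=10)
    response.raise_for_status()

    with open("datalog.csv", "wb") as f:
        f.write(response.content)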

SFTP file uploading and downloading at same time

A cronjob runs every 3 hours to download a file using SFTP. The scheduled program is written in Perl and the module used is Net::SFTP::Foreign.
Can Net::SFTP::Foreign download files that are only partially uploaded over SFTP?
If so, do we need to check the file's modified date to determine whether the copy has completed?
Suppose someone is uploading a new file over SFTP and the upload/copy is still in progress. If a download is attempted at the same time, do I need to code for the possibility of fetching only part of a file?
It's not a question of the SFTP client you use; that's irrelevant. What matters is how the SFTP server handles the situation.
Some SFTP servers may lock the file being uploaded, preventing you from accessing it while the upload is still in progress. But most SFTP servers, particularly the common OpenSSH SFTP server, won't lock the file.
There's no generic solution to this problem. Checking for timestamp or size changes may work for you, but it's hardly reliable.
There are some common workarounds to the problem:
Have the uploader upload a "done" file once the upload finishes. Make your program wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the uploader (atomically) move the uploaded file to a "done" folder. Make your program look at the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the uploader (atomically) rename the file to its final name after the upload. Make your program ignore the ".filepart" files (see the sketch below).
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some FTP servers have this functionality built in. For example, ProFTPD with its HiddenStores directive.
A gross hack is to periodically check the file attributes (size and time) and consider the upload finished if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell when you have downloaded an incomplete file.
For details, see my answer to SFTP file lock mechanism.
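As an illustration of the ".filepart" convention above, a download loop that skips in-progress files might look like this in Python with paramiko (not the Perl module from the question; the host, credentials, and folder are placeholders):

    import paramiko

    # Hypothetical connection details.
    host, user, password = "sftp.example.com", "username", "password"
    remote_dir = "/upload/done"

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username=user, password=password)
    sftp = client.open_sftp()

    for name in sftp.listdir(remote_dir):
        # Skip files the uploader has not finished and renamed yet.
        if name.endswith(".filepart"):
            continue
        sftp.get(f"{remote_dir}/{name}", name)

    sftp.close()
    client.close()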
The easiest way to do that, when the upload process is also under your control, is to upload files using temporary names (for instance, foo-20170809.tgz.temp) and, once the upload finishes, rename them (the Net::SFTP::Foreign put method supports the atomic option, which does just that). Then, on the download side, filter out the files whose names correspond to temporary files.
Anyway, the Net::SFTP::Foreign get and rget methods can be instructed to resume a transfer by passing the option resume => 1.
Also, if you have full SSH access to the SFTP server, you could check whether some other process is still writing to the file to be downloaded using fuser or some similar tool (though note that even then, the file may be incomplete if, for instance, there is a network issue and the uploader needs to reconnect before resuming the transfer).
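The upload side of that temporary-name approach, sketched in Python with paramiko rather than Net::SFTP::Foreign (the host, credentials, and paths are illustrative only):

    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("sftp.example.com", username="username", password="password")
    sftp = client.open_sftp()

    final_name = "/upload/foo-20170809.tgz"
    temp_name = final_name + ".temp"

    # Upload under a temporary name, then rename once the transfer has finished,
    # so downloaders never see a partially written file under its final name.
    sftp.put("foo-20170809.tgz", temp_name)
    sftp.rename(temp_name, final_name)

    sftp.close()
    client.close()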
You can check the size of the file.
1. Connect to SFTP.
2. Check the file size.
3. Sleep for 5-10 seconds.
4. Check the file size again.
5. If the size did not change, download the file; if it changed, go back to step 3.
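A rough version of that polling approach in Python with paramiko (the interval and connection details are placeholders; as noted above, this is a heuristic rather than a guarantee):

    import time
    import paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect("sftp.example.com", username="username", password="password")
    sftp = client.open_sftp()

    remote_path = "/upload/data.csv"

    # Wait until the size stops changing between two consecutive checks.
    size = sftp.stat(remote_path).st_size
    while True:
        time.sleep(10)
        new_size = sftp.stat(remote_path).st_size
        if new_size == size:
            break
        size = new_size

    sftp.get(remote_path, "data.csv")
    sftp.close()
    client.close()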

How can I download a perl script from a CGI server instead of running it?

I am trying to download the perl scripts from this site:
http://pages.cs.wisc.edu/~david/courses/cs552/S12/handouts/bins/
When I open or download any of them, the scripts execute. However, I want the text in the scripts. Is there any way I can do this?
For the non .pl files,
The server is actually returning the script (not its output), so you must be executing it on your end. Right-click on the link and choose Save Link As.
For the .pl files,
The server is actually executing these scripts and returning the output. You would need to use a different URL to get the script itself. No such URL is likely to exist.
Contact your prof and advise him of the issue.
If the scripts are executing, then the web server is configured to execute scripts in that directory, or it is configured to execute all .pl or .cgi files wherever they reside. It is normal for a web server to execute a script. If you want the script source, the web server must be configured not to execute the scripts, in which case it would deliver the source you are after. Contact the administrator to configure it correctly, assuming the scripts shouldn't be executing.
Otherwise, you would need filesystem access to ~david/courses/cs552/S12/handouts/bins/, be it via FTP or whatever, to download them. Basically, access them however you can without going over HTTP.
Just right-click it and choose "Save Link As..." (this works only if script processing is not enabled on the server side).
Normally you can't do this, because when you request a script it is processed server-side and only the result is shown in the browser.
If you want to configure your httpd to let users download scripts, you should disable script handling by extension, or just change the extension to .txt, for example.

PHP Slow to process soap request via browser but fine on the command line

I am trying to connect to an external SOAP service using PHP and have written a small PHP test script that just connects to the service and performs a simple request to check everything is working.
This all works correctly, but when I run it via a browser request, it is very slow, taking somewhere in the region of 40 seconds to establish the initial connection. When I do the same request using the exact same script on the command line, it goes through straight away.
Does anyone have any ideas as to why this might be?
Cheers
PHP caches the WSDL in /tmp. If you run from the command line first, the cache file will be owned by whatever user you're running the script as, and Apache won't be able to read the cache. The WSDL will have to be downloaded and parsed every time, which will be slow.
Check the permissions of /tmp/wsdl*.
Maybe the external SOAP service is trying to check your IP, and your server has ICMP allowed while your local network does not.
Anyway, this question might be answered more clearly by the administrator of the external SOAP service :)
Is there a difference between the php.inis that are being used?
On a standard Ubuntu server installation:
diff /etc/php5/apache2/php.ini /etc/php5/cli/php.ini
//edit:
Another difference might be in the include paths. I had this trouble myself on a local test server: it didn't actually use the SOAP class that was included (it didn't include anything, because the search paths weren't valid), but it used the built-in SoapClient class instead.

FTP a site from local host to server

I know this is a very basic question. I am new to web programming and I'm working with a CMS. My client has asked me to 'FTP' the site that I am manipulating on my local machine, so that he can view the changes too. He also gave me a link, on clicking which the site pops up in its original form. I understand that it's hosted on a server and I am supposed to make it look like the one I have modified locally. How do I do this? Using an FTP client? What about the database?
And also, what if something goes wrong during the process? Is it undoable?
I would have done much more research before asking this question, but I have so little time to figure this out. Thanks
Encourage your client to use scp or sftp instead. It'll encrypt the login and traffic.
Get an FTP program like WSFTP.
What about the database? You need a copy of the database on the server (which is presumably where the link goes).
Get the login/pass from the client.
It can be undone if you have a copy or backup of the original.
FTP copies files from one machine to another. Sounds like you need to install the CMS on the server.
Need more information: what CMS, is it already on the server, what database?