PHP script works from browser but not from Windows Server Task Scheduler or from the CMD/PowerShell command line

I have this simple script called webcam.php to acquire snapshots from some webcams:
<?php
$d=date('YmdHis');
$url = 'http://xxx:40801/snap.jpeg?'.$d;
$img = 'camera_east.jpg';
echo file_put_contents($img, file_get_contents($url));
$url = 'http://xxx:40802/snap.jpeg?'.$d;
$img = 'camera_west.jpg';
echo file_put_contents($img, file_get_contents($url));
echo $d;
?>
and if I call http://xxx/webcam.php from the browser, everything's OK:
I find the two pictures in the folder, and the script echoes the lengths of the files and the timestamp.
I tried to have this script executed by the Windows Task Scheduler, but although the task result is 0x0, the pictures are not updated.
(I also tried unlinking the images first, and using cURL instead, but nothing changes.)
Then I tried to run the PHP script from the command line (also from PowerShell):
something like:
C:\Program Files\PHP\v7.2\php.exe -f C:\webcam.php
but again, although it seems to work, since it returns the lengths of the two files and the timestamp, the pictures are not updated, and if I add an unlink command the files are not deleted.
Clearly the folder has all the required permissions...
I don't have much experience with PHP... :-(
What can be wrong?
Thanks!

Obviously, running the script from CMD/PowerShell or the Task Scheduler requires the full path to the image files,
while running it from the browser can accept a relative path, because the web server's working directory is the script's own folder.
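For instance, a minimal sketch of that fix, assuming the pictures are meant to live next to webcam.php itself (adjust the target directory if they belong elsewhere):

<?php
// Build absolute paths from the script's own directory, so the result is the
// same whether the script is started by the web server, CMD, PowerShell or
// the Task Scheduler.
$d   = date('YmdHis');
$dir = __DIR__ . DIRECTORY_SEPARATOR;

$url = 'http://xxx:40801/snap.jpeg?' . $d;
echo file_put_contents($dir . 'camera_east.jpg', file_get_contents($url));

$url = 'http://xxx:40802/snap.jpeg?' . $d;
echo file_put_contents($dir . 'camera_west.jpg', file_get_contents($url));

echo $d;
?>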

Related

How can I export cached files saved in a browser using CacheStorage?

I have a website which uses the CacheStorage API to save various files using a Service Worker. For reasons beyond my control, lots of these files have been lost from the server they get loaded from. However, I have just realised that several hundred of the files have been cached locally in a browser which had accessed the site lots over a period of years (Luckily the site hadn't been clearing up the cache after itself properly). I can preview the files using chrome's dev tools, but when I click "download" it attempts to download a copy from the server (which no longer exists), rather than giving me the locally cached version.
What's the simplest way to do a one-off export of these files (bearing in mind there's a few hundred of them)? I have full access to the computer the browser is running on, and the domain that the site / service worker is running on. It doesn't need to be a pretty solution, as once the files are restored I plan to learn plenty of lessons to prevent something similar happening in future.
The CacheStorage API can be accessed from normal web page JavaScript, as well as a service worker, so if you create a web page on the server that accesses window.caches, you should be able to fetch things out of the cache and do whatever you want. Once you have cache.keys() you could loop over that and use match() which returns the response for that request. You could then print them out for copy and paste (presumably not ideal), POST each one to a server that saves them, or similar.
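As a rough sketch of that idea (the /restore endpoint and the X-Original-URL header are made-up names here; any server-side script that saves each body under its original URL would do), something along these lines could be run from a page on the same origin:

<script>
// For every cached request, read the response body and POST it back to the
// server together with the URL it was originally cached under.
caches.open("pages")
  .then(cache => cache.keys()
    .then(reqs => Promise.all(reqs.map(req =>
      cache.match(req)
        .then(res => res.blob())
        .then(body => fetch("/restore", {
          method: "POST",
          headers: { "X-Original-URL": req.url },
          body: body
        }))))))
  .then(() => console.log("export finished"));
</script>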
Here is some normal JS I have on traintimes.org.uk; it only displays a list of offline pages, but it could presumably fetch the actual cache entries if it needed to.
<script>
// Open the page cache
caches.open("pages")
// Fetch its keys (cached requests)
.then(cache => cache.keys())
// We only want the URLs of each request
.then(reqs => reqs.map(r => r.url))
// We want most recent one first (reverse is in-place)
.then(urls => (urls.reverse(), urls))
// We don't care about the domain name
.then(urls => urls.map(u => u.replace(/^.*?uk/, '')))
// We want them to be clickable links
.then(urls => urls.map(u => [
'<a href="', u, '">',
u.replace(/\?cookie=[^;&]*/, ''),
'</a>'].join("")))
// We want them to be visible on the page
.then(urls =>
document.getElementById('offline-list').innerHTML =
'<li>' + urls.join('</li><li>') + '</li>'
);
</script>
Responses added to the CacheStorage API are stored on disk. For example, Chrome on Mac OS X stores them in
~/Library/Application Support/Google/Chrome/Default/Service Worker/CacheStorage. Inside this directory, there is a directory for each domain, and within those, separate directories for each particular cache used by that domain. The names of these directories (at both levels) don't appear to be human-readable, so you may need to search the contents to find the specific cache you're looking for.
Within the directory for each cache, every response is saved in a different file. These are binary files and contain various bits of info, including the URL requested (near the top) and the HTTP response headers (towards the end). Between these, you'll find the body of the HTTP response.
The exact logic for extracting the bodies and saving them to files usable elsewhere will vary based on URL schemas, file formats, etc. This bash script worked for me:
#!/bin/bash
# Pull cached image bodies out of Chrome's CacheStorage entry files.
# The /music/images/artists/542x305/*.jpg pattern is specific to my site's
# URL schema -- adjust the sed expressions for your own URLs.
mkdir -p export
for file in *_0
do
    # The request URL near the top of the entry gives us the output filename
    output=`LC_ALL=C sed -nE 's%^.*/music/images/artists/542x305/([^\.]*\.jpg).*%\1%p;/jpg/q' "$file"`
    if [ -z "$output" ]
    then
        echo "file $file missing music URL"
        continue
    fi
    # Skip entries whose cached response was a 404
    if [[ $(LC_ALL=C sed -n '/x-backend-status.*404/,/.*/p' "$file") ]]
    then
        echo "$file returned a 404"
        continue
    fi
    path="export/$output"
    # Keep only the bytes between the request URL and the trailing headers
    cat "$file" | LC_ALL=C sed -n '/music\/images\/artists/,$p' | LC_ALL=C sed 's%^.*/music/images/artists/542x305/[^\.]*\.jpg%%g' | LC_ALL=C sed -n '/GET.*$/q;p' > "$path"
    echo "$file -> $path"
done

CTools do not show up in the Pentaho UI

I am using Pentaho CE 5 on Windows. I would like to use CTools, but I can't make them show up in the File -> New menu.
Being behind a proxy, I cannot use the Marketplace plugin, so I have tried a manual installation.
First, I tried to use ctools-installer.sh. I ran the following command line in Cygwin (wget and unzip are installed):
./ctools-installer.sh -s /cygdrive/d/Users/[user]/Mes\ Programmes/pentaho/biserver-ce/pentaho-solutions/ -w /cygdrive/d/Users/[user]/Mes\ programmes/pentaho/biserver-ce/tomcat/webapps/pentaho/
The script starts, asks me which modules I want to install, and begins the downloads.
For each module, I get output like this (set -x added to the script):
echo -n 'Downloading CDF...' Downloading CDF...
+ wget -q --no-check-certificate 'http://ci.analytical-labs.com/job/Webdetails-CDF-5-Release/lastSuccessfulBuild/artifact/bi-platform-v2-plugin/dist/zip/dist.zip' -O .tmp/cdf/dist.zip
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
'[' '!' -z '' ']'
rm -f .tmp/dist/marketplace.xml
unzip -o .tmp/cdf/dist.zip -d .tmp
End-of-central-directory signature not found. Either this file is not a zipfile, or it constitutes one disk of a multi-part archive. In the latter case the central directory and zipfile comment will be found on the last disk(s) of this archive.
unzip: cannot find zipfile directory in .tmp/cdf/dist.zip, and cannot find .tmp/cdf/dist.zip.zip, period.
chmod -R u+rwx .tmp
echo Done
Done
Then the script ends. I have seen on this page (pentaho-bi-suite) that this is the normal output. Nevertheless, it seems a bit strange to me, and when I start my Pentaho server (login: admin/password), I cannot see any new tools in the menus.
After a look at a few other tutorials and at the script itself, I downloaded the .zip snapshots for every tool and unzipped them into the system directory of my Pentaho server. Same result.
I would like to make the .sh script work; what can I try or adjust?
Thanks
EDIT 05/06/2014
I checked the dist.zip files downloaded by the script and they are all empty. It seems that wget cannot fetch the zip files, and therefore the installation fails.
When I try to get any webpage through wget, it fails. I think it is because of the proxy.
Here is my .wgetrc file, located in my user's cygwin home folder:
use_proxy=on
http_proxy=http://[url]:[port]
https_proxy=http://[url]:[port]
proxy_user=[user]
proxy_password=[password]
How could I make this work?
EDIT 10/06/2014
In the end, I changed my network connection settings to bypass the proxy. It seems that there is an offline mode for the installer, so one can download all the needed files in a proxy-free environment and then run the script offline.
I guess this is related to the -r option.
I consider this post solved, since it is not a CTools issue anymore.
It is difficult to identify the issue in the above procedure,
but you can refer to this blog; he is a key member of Pentaho itself.
You can manually install the components from http://www.webdetails.pt/ctools/, or, if you have Pentaho 5.1 or above, you can add the following parameters to the CATALINA_OPTS option (in start-pentaho.bat or start-pentaho.sh):
-Dhttp.proxyHost= -Dhttp.proxyPort= -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"
http://docs.treasuredata.com/articles/pentaho-dataintegration#tips-how-can-i-use-pentaho-through-a-proxy
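For example, a hedged sketch of what that might look like in start-pentaho.bat (myproxy.example.com and 8080 are placeholders for your own proxy host and port):

rem Placeholder proxy values -- replace with your real proxy host and port
set CATALINA_OPTS=%CATALINA_OPTS% -Dhttp.proxyHost=myproxy.example.com -Dhttp.proxyPort=8080 -Dhttp.nonProxyHosts="localhost|127.0.0.1|10.*.*.*"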

file for saving cookie data not found when using HTTP::Cookies in Perl script

I have some questions about the Perl module HTTP::Cookies. The example on CPAN is like the one below:
$cookie_jar = HTTP::Cookies->new( file => '$ENV{\'HOME\'}/lwp_cookies.dat', autosave => 1);
The lwp_cookies.dat file is used to save cookie data on my local machine, as I understand it. On my machine, '$ENV{\'HOME\'}' is an empty path. The script runs fine, yet after execution I can't find any file named "lwp_cookies.dat" on my machine. I changed '$ENV{\'HOME\'}' to '$ENV{\'TMP\'}', which is a path that really exists, as I verified with a Perl print. Still I can't find "lwp_cookies.dat" in my TEMP folder. My first question is how HTTP::Cookies works with the "lwp_cookies.dat" file.
On the other hand, on one of my systems (they're all Windows systems, as mentioned here), the same code produces the error message below:
Can't open $ENV{'HOME'}/lwp_cookies.dat: No such file or directory
So it's strange to me. On the good system, even though the file and path do not exist, the script runs well, so I supposed the file was created in some temporary memory instead; on the bad system, the code example doesn't work at all.
If you want the $ENV{'HOME'} variable to interpolate into the string, you need double quotes; single quotes don't interpolate variables:
file => "$ENV{'HOME'}/lwp_cookies.dat",
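A minimal, self-contained sketch of the corrected call (falling back to $ENV{TMP} is my own addition, for Windows machines where HOME is not set):

use strict;
use warnings;
use HTTP::Cookies;

# Double quotes interpolate the environment variable; with single quotes the
# file name would literally contain the text $ENV{'HOME'}.
my $home = $ENV{HOME} || $ENV{TMP};
my $cookie_jar = HTTP::Cookies->new(
    file     => "$home/lwp_cookies.dat",
    autosave => 1,
);
print "cookie jar file: $home/lwp_cookies.dat\n";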

Perl Statistics::R generates blank plot image (jpeg)

I am currently using ActiveState Perl 5.14 and the R project version 2.13.2. Within Perl I am using Statistics::R version 0.08. According to ActiveState the more recent versions of Statistics::R (through 0.24) failed to pass scrutiny and are therefore not available through the PPM.
History: I have been successfully using Perl to access R for some time to perform analysis. Now I want to generate JPEG images of the results of the analysis for easy visualization.
Here's the problem: I can generate the images successfully from within the R console. However, when I run the same commands through Perl I only get a blank image. My console code includes (simplified, of course):
x<-c(1,2,3,4,5)
y<-c(5,4,3,2,1)
jpeg("C:/temp.jpg")
plot(x,y)
dev.off()
And my Perl commands include (also simplified):
$R = Statistics::R->new();
$R->start_sharedR;
$R->send("x<-c(1,2,3,4,5)");
$R->send("y<-c(5,4,3,2,1)");
$R->send('jpeg("C:/temp.jpg")');
$R->send("plot(x,y)");
$R->send("dev.off()");
Any suggestions? I know that there are other plotting options accessible to Perl. I have eliminated some (GD Graph) because X-axis data is not treated as numeric. I'd prefer to keep it in R if at all possible since I'm already interacting in that package for the analysis. Thanks!
Forget Statistics::R. Just use a system call. At least that's what I do!
my $path_to_r = "C:/Program Files/R/bin/Rscript.exe";
my $cmd = "x<-c(1,2,3,4,5);";
$cmd .= "y<-c(5,4,3,2,1);";
$cmd .= 'jpeg("C:/temp.jpg");';
$cmd .= "plot(x,y);";
$cmd .= "dev.off()";
# The list form of system() avoids shell-quoting trouble with the space in
# "Program Files" and with the quotes inside the R commands.
system($path_to_r, "-e", $cmd);
If your R script grows a bit, or if it takes input from parameters, write it to a file and run that file with Rscript.exe.
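For instance, a hedged sketch of that file-based approach (the Rscript.exe path assumes a default Windows install and may need adjusting):

use strict;
use warnings;
use File::Temp qw(tempfile);

# Write the R commands to a temporary .R file, then run it with Rscript.exe.
my ($fh, $script) = tempfile(SUFFIX => '.R', UNLINK => 1);
print $fh <<'END_R';
x <- c(1,2,3,4,5)
y <- c(5,4,3,2,1)
jpeg("C:/temp.jpg")
plot(x, y)
dev.off()
END_R
close $fh;

system("C:/Program Files/R/bin/Rscript.exe", $script) == 0
    or die "Rscript failed: $?";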
It works fine for me with Statistics::R 0.27, but not with 0.08, the only version I could find in ActivePerl's package manager. In order to install 0.27 I had to use the cpan command line. make test fails, but make install was fine. Bit of a life-saver.
(By the way, I'm a relative noob; using the cpan command line was pretty easy, however.
Type i /Statistics-R/ from the cpan command line, then
install FANGLY/Statistics-R-0.27.tar.gz (or whatever the relevant file is). I'm using a Windows system, so RSPerl is annoyingly not an option for me. I note that the latest Statistics::R version is dated March 2012, so perhaps some of the previously documented (piping?) problems have been solved. You may also need to install a 'maker'; in my case it was 'dmake', not 'nmake'. Pretty easy: you can get a version of make from the M$ website and copy that + the .err file into the PERL\bin dir. But help on this is available elsewhere. Hope this helps!)

Where does CGI.pm normally create temporary files?

On all of my Windows servers except one, when I execute the following code to allocate a temporary file:
use CGI;
my $tmpfile = new CGITempFile(1);
print "tmpfile='", $tmpfile->as_string(), "'\n";
The variable $tmpfile is assigned the value '.\CGItemp1' and this is what I want. But on one of my servers it's incorrectly set to C:\temp\CGItemp1.
All the servers are running Windows 2003 Standard Edition, IIS6 and ActivePerl 5.8.8.822 (upgrading to later version of Perl not an option). The result is always the same when running a script from the command line or in IIS as a CGI script (where scriptmap .pl = c:\perl\bin\perl.exe "%s" %s).
How can I fix this Perl installation and force it to return '.\CGItemp1' by default?
I've even copied the whole Perl folder from one of the working servers to this machine but no joy.
#Hometoast:
I checked the 'TMP' and 'TEMP' environment variables and also $ENV{TMP} and $ENV{TEMP} and they're identical.
From command line they point to the user profile directory, for example:
C:\DOCUME~1\[USERNAME]\LOCALS~1\Temp\1
When run under IIS as a CGI script they both point to:
c:\windows\temp
In registry key HKEY_USERS/.DEFAULT/Environment, both servers have:
%USERPROFILE%\Local Settings\Temp
The ActiveState implementation of CGITempFile() is clearly using an alternative mechanism to determine how it should generate the temporary folder.
#Ranguard:
The real problem is with the CGI.pm module and attachment handling. Whenever a file is uploaded to the site CGI.pm needs to store it somewhere temporary. To do this CGITempFile() is called within CGI.pm to allocate a temporary folder. So unfortunately I can't use File::Temp. Thanks anyway.
#Chris:
That helped a bunch. I did have a quick scan through the CGI.pm source earlier but your suggestion made me go back and look at it more studiously to understand the underlying algorithm. I got things working, but the oddest thing is that there was originally no c:\temp folder on the server.
To obtain a temporary fix I created a c:\temp folder and set the relevant permissions for the website's anonymous user account. But because this is a shared box I couldn't leave things that way, even though the temp files were being deleted. To cut a long story short, I renamed the c:\temp folder to something different and magically the correct '.\' folder path was being returned. I also noticed that the customer had enabled FrontPage extensions on the site, which removes write access for the anonymous user account on the website folders, so this permission needed re-applying. I'm still at a loss as to why at the start of this issue CGITempFile() was returning c:\temp, even though that folder didn't exist, and why it magically started working again.
The name of the temporary directory is held in $CGITempFile::TMPDIRECTORY and initialised in the find_tempdir function in CGI.pm.
The algorithm for choosing the temporary directory is described in the CGI.pm documentation (search for -private_tempfiles).
IIUC, if a C:\Temp folder exists on the server, CGI.pm will use it. If none of the directories checked in find_tempdir exist, then the current directory "." is used.
I hope this helps.
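As a quick diagnostic sketch based on the above (it pokes at CGI.pm's internal CGITempFile package, which your code is already using, so treat it as a debugging aid rather than a supported API):

use strict;
use warnings;
use CGI;

# Allocate one temp file, then print the directory CGI.pm settled on,
# which find_tempdir stores in $CGITempFile::TMPDIRECTORY.
my $tmp = CGITempFile->new(1);
print "temp file:      ", $tmp->as_string, "\n";
print "temp directory: $CGITempFile::TMPDIRECTORY\n";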
Not the direct answer to your question, but have you tried using File::Temp?
It is specifically designed to work on any OS.
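A minimal sketch, for comparison:

use strict;
use warnings;
use File::Temp qw(tempfile);

# File::Temp picks a platform-appropriate temporary directory itself
# (honouring TMPDIR/TEMP/TMP) and can clean the file up automatically.
my ($fh, $filename) = tempfile(UNLINK => 1);
print "temporary file: $filename\n";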
If you're running this script as yourself, check the %TEMP% environment variable to see if it differs.
If IIS is executing the script, check the values in the registry for TMP and TEMP under
HKEY_USERS/.DEFAULT/Environment
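For reference, a sketch of those checks from a command prompt (run them in the same context the script runs in):

rem Per-user values the script sees from the command line
echo %TEMP%
echo %TMP%
rem Machine-wide defaults that IIS may be using
reg query "HKEY_USERS\.DEFAULT\Environment" /v TMP
reg query "HKEY_USERS\.DEFAULT\Environment" /v TEMP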