In mojolicious, how to protect images from public view - perl

Can somebody help me, please? I have an app in Mojolicious::Lite.
I want to block access to the images for anyone without a login session,
e.g. when someone types http://m.y.i.p:3000/images/imageX.jpg.
I want to show the images only to users with a login session.
My HTML code is:
show image 1

Same way as any other content. Set up a handler for requests, render (or don't render) the content.
get '/images/*img' => sub {
    my $c = shift;
    if (!$c->session("is_authenticated")) {
        return $c->render( text => "Forbidden", status => 403 );
    }
    my $file = $c->param("img");
    # $IMAGE_DIR is wherever the protected images are stored
    open(my $fh, '<', "$IMAGE_DIR/$file")
        or return $c->render( text => "Not found", status => 404 );
    binmode $fh;
    my $data = do { local $/; <$fh> };
    close $fh;
    $c->render( data => $data, format => 'jpg' );
};
Your request handler will take priority over the default handler that serves content from public folders, but once you have this handler in place, you don't need to store the files it serves in a public folder.
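If you have several protected routes, a related approach (a minimal sketch, not from the original answer, assuming a Mojolicious::Lite app and the same is_authenticated session flag) is to put the check in an under block, so every route defined after it requires a login:
use Mojolicious::Lite;

# Routes declared after this "under" are only dispatched to when the
# callback returns a true value.
under sub {
    my $c = shift;
    return 1 if $c->session('is_authenticated');
    $c->render(text => 'Forbidden', status => 403);
    return undef;
};

# Protected: only reachable with a login session
get '/images/*img' => sub {
    my $c = shift;
    $c->render(text => 'serve the image here, as in the handler above');
};

app->start;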

Another solution is:
plugin 'RenderFile';   # load the plugin once, at startup

get '/pays/*img' => sub {
    my $self = shift;
    my $img  = $self->param('img');
    my @imgext = split(/\./, $img);
    my $ext = $imgext[-1];
    $self->render_file(
        'filepath' => "/directory/$img",
        'format'   => "$ext",              # will change Content-Type "application/x-download" to e.g. "application/pdf"
        'content_disposition' => 'inline', # will change Content-Disposition from "attachment" to "inline"
        # delete file after completed
    );
};
This approach uses Mojolicious::Plugin::RenderFile, loaded with plugin 'RenderFile' as shown above.

Related

Delayed response to slash command with Mojolicious in Perl

I am trying to create a Slack application in Perl with Mojolicious and I have the following use case:
Slack sends a request to my API from a slash command and needs a response within a 3 second timeframe. However, Slack also gives me the opportunity to send up to 5 more responses in a 30 minute timeframe, but it still needs the initial response within 3 seconds (it just sends a "late_response_url" in the initial callback so that I can POST something to that URL later on). In my case I would like to send an initial response to Slack to inform the user that the operation is "running", and after a while send the actual outcome of my slow function to Slack.
Currently, I can do this by spawning a second process using fork(), using one process to respond immediately as Slack dictates and the second to do the rest of the work and respond later on.
I am trying to do this with Mojolicious' subprocesses to avoid using fork(). However, I can't find a way to get this to work.
A sample of the code I am already using with fork looks like this:
sub withpath
{
    my $c = shift;
    my $user = $c->param('user_name');
    my $response_url = $c->param('response_url');   # Slack's late_response_url
    my $response_body = {
        response_type => "ephemeral",
        text          => "Running for $user:",
        attachments   => [
            { text => 'analyze' },
        ],
    };
    my $pid = fork();
    if ($pid != 0) {
        $c->render( json => $response_body );
    } else {
        my $output = do_time_consuming_things();
        $response_body = {
            response_type => "in-channel",
            text          => "Result for $user:",
            attachments   => [
                { text => $output },
            ],
        };
        my $ua = Mojo::UserAgent->new;
        my $tx = $ua->post(
            $response_url,
            { Accept => '*/*' },
            json => $response_body,
        );
        if ( my $res = $tx->success )
        {
            print "\n success \n";
        }
        else
        {
            my $err = $tx->error;
            print "$err->{code} response: $err->{message}\n" if $err->{code};
            print "Connection error: $err->{message}\n";
        }
    }
}
So the problem is that no matter what I tried, I couldn't replicate the exact same code with Mojolicious' subprocesses. Any ideas?
Thanks in advance!
Actually I just found a solution to my problem!
So here is my solution:
my $c = shift;                              # receive request
my $user = $c->param('user_name');          # get parameters
my $response_url = $c->param('response_url');
my $text = $c->param('text');
my $response_body = {                       # create the immediate response that Slack is waiting for
    response_type => "ephemeral",
    text          => "Running for $user:",
    attachments   => [
        { text => 'analyze' },
    ],
};
my $subprocess = Mojo::IOLoop::Subprocess->new;  # create the subprocess
$subprocess->run(
    sub { do_time_consuming_things($user, $response_url, $text) }, # this callback is the
    # actual subprocess that will run in the background and contains the POST request
    # from my "fork" code (with the output) that should send a late response to Slack
    sub {   # this is a dummy callback doing nothing, as one is required by Mojo
        my ($subprocess, $err, @results) = @_;
        say $err if $err;
        say "\n\nok\n\n";
    }
);
# and here is the actual immediate response, outside of the subprocess, in order
# to avoid making the server wait for the subprocess to finish before responding!
$c->render( json => $response_body );
So I actually simply had to put my do_time_consuming_things code in the first callback (in order for it to run as a subprocess), use the second callback (which actually runs in the parent process) as a dummy one, and keep my "immediate" response in the main body of the whole function instead of putting it inside one of the callbacks. See the code comments for more information!
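For what it's worth (a sketch, not part of the original answer), recent Mojolicious versions also let you run the same pair of callbacks through Mojo::IOLoop->subprocess, which saves constructing the object by hand:
use Mojo::IOLoop;

# The first callback runs in a forked child process; the second runs back in
# the parent once the child has finished (or failed), just like in the code above.
Mojo::IOLoop->subprocess(
    sub { do_time_consuming_things($user, $response_url, $text) },
    sub {
        my ($subprocess, $err, @results) = @_;
        say $err if $err;
    },
);

# The immediate response still goes out right away, outside the subprocess
$c->render( json => $response_body );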

How do I do a chunked transfer-encoding upload with WWW::Mechanize?

I'm attempting to use a particular web service, and I can successfully perform the upload with the following command:
curl -X POST --header "Transfer-Encoding: chunked" -d @Downloads/file.pdf https://some.webservice/upload
I get back a JSON response indicating success.
However, I'm unable to figure out how to do the same with WWW::Mechanize.
$mech->post("https://" . $server . "/upload", Content_Type => 'multipart/form-data', Content => [upID => $upid, name => $dlfile, userID => 0, userK => 0, file_0 => [$dlfile]]);
This receives a similar json response with a big fat error message in it.
Do I need to explicitly set the Transfer-Encoding header first? Is there some other trick to it? Google's not shedding much light on this, Perlmonks neither, and the documentation's a little obtuse.
You can do it using HTTP::Request::StreamingUpload
# $url, $file and $ua (a WWW::Mechanize or LWP::UserAgent object) are assumed
# to be set up already
my $starttime = time();
my $size = -s $file;    # total upload size, used for the progress display below
my $req = HTTP::Request::StreamingUpload->new(
    POST    => $url,
    path    => $file,
    headers => HTTP::Headers->new(
        'Transfer-Encoding' => 'chunked'
    ),
);
my $gen = $req->content;
die unless ref($gen) eq "CODE";
my $total = 0;
$req->content(sub {
    my $chunk = &$gen();
    $total += length($chunk);
    print "\r$total / $size bytes ("
        . int($total/$size*100)
        . "%) sent, "
        . int($total/1000/(time()-$starttime+1))
        . " k / sec ";
    return $chunk;
});
my $resp = $ua->request($req);
print "\n";
unless ($resp->is_success) {
    die "Failed uploading the file: ", $resp->status_line;
}
my $con = $resp->content;
return $con;
Do you really need WWW::Mechanize? It is a subclass of LWP::UserAgent with additional browser-like functionality: filling in and submitting forms, clicking links, a page history with a "back" operation, etc. If you don't need all of that then you may as well use LWP::UserAgent directly.
Either way, the post method is inherited unchanged from LWP::UserAgent, and it's fine to use it directly as you have done
The way to send a chunked POST is to set the Content to a reference to a subroutine. The subroutine must return the next chunk of data each time it is called, and finally an empty string or undef when there is no more to send.
Is the data supposed to be a JSON string?
It's easiest to write a factory subroutine that returns a closure, like this
sub make_callback {
    my $data = shift;
    sub { substr($data, 0, 512, "") }
}
Then you can call post like this
my $payload = to_json(...);

$mech->post(
    "https://$server/upload",
    Content_Type => 'multipart/form-data',
    Content      => make_callback($payload)
);
Please be aware that all of this is untested

How to upload a file using Mojolicious?

I have been trying out the Mojolicious web framework for Perl, and I have tried to develop a full application instead of the Lite version. The problem I am facing is that I am trying to upload files to the server, but the code below is not working.
Please guide me as to what is wrong with it. Also, if the file does get uploaded, is it in the public folder of the application or some place else?
Thanks in advance.
sub posted {
    my $self = shift;
    my $logger = $self->app->log;

    my $filetype     = $self->req->param('filetype');
    my $fileuploaded = $self->req->upload('upload');

    $logger->debug("filetype: $filetype");
    $logger->debug("upload: $fileuploaded");

    return $self->render(message => 'File is not available.')
        unless ($fileuploaded);

    return $self->render(message => 'File is too big.', status => 200)
        if $self->req->is_limit_exceeded;

    # Render template "example/posted.html.ep" with message
    $self->render(message => 'Stuff Uploaded in this website.');
}
(First, you need an HTML form with method="post" and enctype="multipart/form-data", and an input type="file" with name="upload". Just to be sure.)
If there were no errors, $fileuploaded would be a Mojo::Upload. Then you could check its size, its headers, you could slurp it or move it, with $fileuploaded->move_to('path/file.ext').
Taken from a strange example.
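As a minimal, self-contained sketch of those two points (hypothetical paths and route names, untested), a Lite app with a matching form and a handler that moves the upload could look like this:
use Mojolicious::Lite;

get '/' => 'form';    # show the upload form below

post '/posted' => sub {
    my $self = shift;
    my $upload = $self->req->upload('upload');    # a Mojo::Upload object
    return $self->render(text => 'File is not available.')
        unless $upload && $upload->size;
    # move it out of the temporary upload area; '/tmp' is just a placeholder
    $upload->move_to('/tmp/' . $upload->filename);
    $self->render(text => 'Stuff Uploaded in this website.');
};

app->start;

__DATA__
@@ form.html.ep
<form action="/posted" method="post" enctype="multipart/form-data">
  <input type="file" name="upload">
  <input type="submit" value="Upload">
</form>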
To process uploaded files you should use $c->req->uploads:
post '/' => sub {
    my $c = shift;
    my @files;
    for my $file (@{$c->req->uploads('files')}) {
        my $size = $file->size;
        my $name = $file->filename;
        push @files, "$name ($size)";
        $file->move_to("C:\\Program Files\\Apache Software Foundation\\Apache24\\htdocs\\ProcessingFolder\\" . $name);
    }
    $c->render(text => "@files");
} => 'save';
See full code here: https://stackoverflow.com/a/28605563/4632019
You can use Mojolicious::Plugin::RenderFile.
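A rough sketch (untested, with placeholder paths and names) of how that plugin is typically wired into a full, non-Lite app:
package MyApp;
use Mojo::Base 'Mojolicious';

sub startup {
    my $self = shift;
    $self->plugin('RenderFile');    # adds the render_file helper
    $self->routes->get('/download')->to('files#download');
}

package MyApp::Controller::Files;
use Mojo::Base 'Mojolicious::Controller';

sub download {
    my $self = shift;
    # placeholder path; render_file serves a file from disk instead of a template
    $self->render_file(
        filepath            => '/path/to/uploads/report.pdf',
        content_disposition => 'attachment',
    );
}

1;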

WWW::Curl - how to upload (post) large files

I use WWW::Curl to upload files:
use WWW::Curl::Easy 4.14;
use WWW::Curl::Form;

my $url = 'http://example.com/backups/?sid=12313qwed323';
my $params = {
    name    => 'upload',
    action  => 'keep',
    backup1 => [ '/tmp/backup1.zip' ], # 1st file for upload
};

my $form = WWW::Curl::Form->new();
foreach my $k (keys %{$params}) {
    if (ref $params->{$k}) {
        $form->formaddfile(@{$params->{$k}}[0], $k, 'multipart/form-data');
    } else {
        $form->formadd($k, $params->{$k});
    }
}

my $curl = WWW::Curl::Easy->new() or die $!;
$curl->setopt(CURLOPT_HTTPPOST, $form);
$curl->setopt(CURLOPT_URL, $url);

my $body;
$curl->setopt(CURLOPT_WRITEDATA, \$body);
my $retcode = $curl->perform();
my $response_code = $curl->getinfo(CURLINFO_HTTP_CODE);
Nothing special here, and this code works well.
I want to upload large files without preloading everything into memory, which, as far as I have heard, is what libcurl does.
CURLOPT_READFUNCTION accepts a callback which returns parts of the content. That means that I cannot use WWW::Curl::Form to set POST parameters, but that I have to return the whole content through this callback. Is that right?
I think the code could look like this:
use WWW::Curl::Easy 4.14;
use URI::Escape qw(uri_escape_utf8);

my $url = 'http://example.com/backups/?sid=12313qwed323';
my $params = {
    name    => 'upload',
    action  => 'keep',
    backup1 => [ '/tmp/backup1.zip' ], # 1st file for upload
};

my $fields;
foreach my $k (keys %{$params}) {
    $fields .= "$k=".(ref $params->{$k} ? '@'.@{$params->{$k}}[0] : uri_escape_utf8($params->{$k}))."&";
}
chop($fields);

my $curl = WWW::Curl::Easy->new() or die $!;
$curl->setopt(CURLOPT_POST, 1);
$curl->setopt(CURLOPT_POSTFIELDS, $fields); # is it needed with READFUNCTION??
$curl->setopt(CURLOPT_URL, $url);

my @header = ('Content-type: multipart/form-data', 'Transfer-Encoding: chunked');
$curl->setopt(CURLOPT_HTTPHEADER, \@header);
#$curl->setopt(CURLOPT_INFILESIZE, $size);
$curl->setopt(CURLOPT_READFUNCTION, sub {
    # which data to return here?
    # $params (without file) + file content?
    return 0;
});
Which data does the CURLOPT_READFUNCTION callback have to return? $params plus the file(s) content? In which format?
Do I really have to create the data (returned by CURLOPT_READFUNCTION) by myself or is there a simple way to create it in the right format?
Thanks
Test 16formpost.t is relevant. As you can see, it's completely disabled. This fact, and my fruitless experiments with various return values for the callback function, lead me to believe the CURLOPT_READFUNCTION feature is known to be broken in the Perl binding.
I have to return the whole content through this callback. Is that right?
No, you can feed it the request body piecewise, suitable for chunked encoding. The callback will necessarily be called several times, according to the limit set in CURLOPT_INFILESIZE.
Which data does CURLOPT_READFUNCTION callback have to return?
An HTTP request body. Since you are doing a file upload, this means Content-Type multipart/form-data. The following is an example using HTTP::Message. CURLOPT_HTTPPOST is another way to construct this format.
use HTTP::Request::Common qw(POST);
use WWW::Curl::Easy 4.14;

my $curl = WWW::Curl::Easy->new or die $!;
$curl->setopt(CURLOPT_POST, 1);
$curl->setopt(CURLOPT_URL, 'http://localhost:5000');
$curl->setopt(CURLOPT_HTTPHEADER, [
    'Content-type: multipart/form-data', 'Transfer-Encoding: chunked'
]);

# Build the multipart body once, then hand it out piecewise; returning an
# empty string tells libcurl that the body is finished.
my $body = POST(undef, Content_Type => 'multipart/form-data', Content => [
    name    => 'upload',
    action  => 'keep',
    backup1 => [ '/tmp/backup1.zip' ], # 1st file for upload
])->content;
$curl->setopt(CURLOPT_READFUNCTION, sub {
    return substr($body, 0, 8192, '');
});
my $r = $curl->perform;
The CURLOPT_READFUNCTION callback is only used for chunked transfer encoding. It may work, but I haven't been able to get it to, and I found that doing so wasn't necessary anyway.
My use case was for upload of data to AWS, where it's not OK to upload the data as multipart form data. Instead, it's a straight POST of the data. It does require that you know how much data you're sending the server, though. This seems to work for me:
my $infile = 'file-to-upload.json';
my $size   = -s $infile;
open( IN, '<', $infile ) or die("Cannot open file - $infile. $! \n");

my $curl = WWW::Curl::Easy->new;
$curl->setopt(CURLOPT_HEADER, 1);
$curl->setopt(CURLOPT_NOPROGRESS, 1);
$curl->setopt(CURLOPT_POST, 1);
$curl->setopt(CURLOPT_URL, $myPostUrl);
$curl->setopt(CURLOPT_HTTPHEADER,
    ['Content-Type: application/json']); # For my use case
$curl->setopt(CURLOPT_POSTFIELDSIZE_LARGE, $size);
$curl->setopt(CURLOPT_READDATA, \*IN);

my $retcode = $curl->perform;
if ($retcode == 0) {
    print("File upload success\n");
}
else {
    print("An error happened: $retcode ".$curl->strerror($retcode)."\n");
}
The key is providing an open filehandle reference to CURLOPT_READDATA. After that, the core curl library handles the reads from it without any need for callbacks.

Connection keeps closing?

So I'm having a problem trying to automatically log in to an internal website. I'm able to send a POST request, but in the response I always get the header Connection: close. I've tried to pass Connection: keep-alive through the POST request, but it still seems to respond with Connection: close. I want to be able to navigate through the website, so I need Connection: keep-alive so that I can send more requests. Could anyone tell me what I'm doing wrong? Here's the code:
#!/usr/bin/perl
# NetTelnet.pl
use strict; use warnings;

# Sign into cfxint Unix something...
use Net::Telnet;

# Create a new instance of Net::Telnet
my $telnetCon = new Net::Telnet (Timeout => 10,
                                 Prompt  => '/bash\$ $/') or die "Could not make connection.";
my $hostname = 'cfxint';

# Connect to the host of the user's choice
$telnetCon->open(Host => $hostname,
                 Port => 23) or die "Could not connect to $hostname.";

use WWW::Mechanize;
my $mech = WWW::Mechanize->new(cookie_jar => {});

&login_alfresco;

sub login_cxfint {
    # get username and password from user
    my $CXusername = '';
    my $CXpassword = '';

    # Recreate the login
    # Wait for the login: message and then enter the username
    $telnetCon->waitfor(match => '/login:/i');
    # this method adds a \n to the end of the username; it mimics hitting the enter key after entering your username
    $telnetCon->print($CXusername);
    # does the same as the previous command but for the password
    $telnetCon->print($CXpassword);
    # Wait for the login successful message
    $telnetCon->waitfor();
}

sub login_alfresco {
    my $ALusername = '';
    my $ALpassword = '';

    $mech->get('http://documents.ifds.group:8080/alfresco/faces/jsp/login.jsp');
    my $res = $mech->res;
    my $idfaces = '';
    if ($res->is_success) {
        my $ff = $res->content;
        if ($ff =~ /id="javax.faces.ViewState" value="(.*?)"/) {
            $idfaces = $1;
        }
        else {
            print "javax.faces /Regex error?\n";
            die;
        }
    }
    print $idfaces, "\n";

    # Send the get request for Alfresco
    $mech->post('http://documents.ifds.group:8080/alfresco/faces/jsp/login.jsp', [
        'loginForm:rediretURL' => ,
        'loginForm:user-name' => $ALusername,
        'loginForm:user-password' => $ALpassword,
        'loginForm:submit' => 'Login',
        'loginForm_SUBMIT' => '1',
        'loginForm:_idcl' => ,
        'loginForm:_link_hidden_' => ,
        'javax.faces.ViewState' => $idfaces], 'Connection' => 'keep-alive');
    $res = $mech->res;

    open ALF, ">Alfresco.html";
    print ALF $mech->response->as_string;

    if ($res->is_success) {
        my $ff = $res->content;
        if ($ff =~ /id="javax.faces.ViewState" value="(.*?)"/) {
            $idfaces = $1;
        }
        else {
            print "javax.faces /Regex error?\n";
            die;
        }
    }
    print $idfaces, "\n";

    # Logout
    $mech->post('http://documents.ifds.group:8080/alfresco/faces/jsp/extension/browse/browse.jsp', [
        'browse:serach:_option' => '0',
        'browse:search' => ,
        'browse:spaces-pages' => '20',
        'browse:content-pages' => '50',
        'browse_SUBMIT' => '1',
        'id' => ,
        'browse:modelist' => '',
        'ref' => '',
        'browse:spacesList:sort' => ,
        'browse:_idJsp7' => ,
        'browse:sidebar-body:navigator' => ,
        'browse:contentRichList:sort' => ,
        'browse:act' => 'browse:logout',
        'outcome' => 'logout',
        'browse:panel' => ,
        'javax.faces.ViewState' => $idfaces,
    ]);
}
You can enable keep-alive by using a connection cache:
use LWP::ConnCache;
...
$mech->conn_cache(LWP::ConnCache->new);
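Alternatively (a sketch, assuming a reasonably recent LWP), you can ask for the same connection cache when constructing the object, since WWW::Mechanize passes unknown constructor options through to LWP::UserAgent:
use WWW::Mechanize;

# keep_alive => 1 makes LWP::UserAgent set up a connection cache for you,
# so persistent connections are reused where the server allows it
my $mech = WWW::Mechanize->new(
    cookie_jar => {},
    keep_alive => 1,
);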
All that header means is that the connection will be closed upon completion of the request, instead of being kept open for possible further requests. This is perfectly normal and should not interfere with sending the request.
EDIT: If you're sending a Connection:Keep-Alive and the server is still responding with Connection:Close, then the server configuration needs to be changed. The default for HTTP/1.1 is persistent connections, so the server must explicitly be configured to send Connection:Close. See Section 8 of RFC2616.