I have a Perl CGI application in which I want to take the user's request headers and turn them into an LWP::UserAgent GET request. Basically the goal is to replicate the incoming user's headers and use them to make a separate request.
I've tried to create the headers myself, but when I display the CGI headers and then my cloned UserAgent headers, they aren't exactly the same. Here's what I've got:
my $cgi = CGI->new;
my %headers = map { $_ => $cgi->http($_) } $cgi->http;
my $req_headers = HTTP::Headers->new( %headers );
my $ua = LWP::UserAgent->new( default_headers => $req_headers );
print Dumper $ua->default_headers;
Basically, %headers and $ua->default_headers are not identical. $ua->default_headers has an agent that identifies itself as a perl script. I can manually set $ua->agent("") but there are other imperfections and the headers still aren't identical.
What's the best way to do what I want? There's got to be an easier solution...
It looks like the problem has to do with the naming of the incoming HTTP headers compared to what HTTP::Headers uses.
The incoming keys all have an HTTP_ prefix, while HTTP::Headers doesn't use that naming convention (which makes sense). Also, from a quick read of the code, it looks like HTTP::Headers does the right thing in converting '_' into '-' for its own use.
I would recommend changing your map to the following, which removes the prefix:
# remove leading HTTP_ from keys, note: this assumes all keys have pattern
# HTTP_*
%headers = map { ( /^HTTP_(.*?)$/ ) => $cgi->http($_) } $cgi->http;
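For example, HTTP_USER_AGENT becomes USER_AGENT, which HTTP::Headers then normalizes to the User-Agent header.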
Here is the debugging script I used:
use strict;
use warnings;
use CGI;
use Data::Dumper;
use HTTP::Headers;
use LWP::UserAgent;

my $cgi = CGI->new;
my %headers = map { $_ => $cgi->http($_) } $cgi->http;
my $req_headers = HTTP::Headers->new( %headers );
my $ua = LWP::UserAgent->new( default_headers => $req_headers );
print "Content-type: text/plain\n\n";
print Dumper($ua->default_headers);
print Dumper( \%headers );
# remove HTTP_ from $_
%headers = map { ( /^HTTP_(.*?)$/ ) => $cgi->http($_) } $cgi->http;
$req_headers = HTTP::Headers->new( %headers );
$ua = LWP::UserAgent->new( default_headers => $req_headers );
print "headers part deux:\n";
print Dumper( $ua );
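If the default LWP agent string still leaks through, you can also copy the client's own User-Agent explicitly; a minimal sketch, assuming the browser actually sent one:
# reuse the incoming User-Agent so LWP doesn't announce itself as a Perl script
$ua->agent( $cgi->http('User-Agent') || '' );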
Hope that helps.
I am writing a Perl script to POST an attachment to JIRA, using REST::Client to access the API, but I am getting an error.
use REST::Client;
use warnings;
use strict;
use File::Slurp;
use MIME::Base64;
my $user = 'user';
my $pass = 'pass';
my $url = "http://******/rest/api/2/issue/BugID/attachments";
my $client = REST::Client->new();
$client->addHeader( 'Authorization', 'Basic' . encode_base64( $user . ':' . $pass ) );
$client->addHeader( 'X-Atlassian-Token', 'no-check' );
$client->setHost( $url );
# my %header = ('Authorization' => 'Basic'. encode_base64($user . ':' . $pass),'X-Atlassian-Token' => 'no-check');
my $attachment = "C:\\Folder\\Test.txt";
$client->POST(
$url,
'Content_Type' => 'form-data',
'Content' => [ 'file' => [$attachment] ]
);
if ( $client->responseCode() eq '200' ) {
print "Updated\n";
}
# print the result
print $client->responseContent() . "\n";
The error I get is
REST::Client exception: headers must be presented as a hashref at C:\Users\a\filename.pl line 24.
As shown in the code, I have tried setting headers in different ways, but I still get the same error.
Please suggest another method if there is one.
I have tried using the JIRA module, but it gives an error too.
According to the documentation, the POST method:
Takes an optional body content and hashref of custom request headers.
You need to put your headers in a hashref, e.g.:
$client->POST($url, $content, {
foo => 'bar',
baz => 'qux'
});
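With the headers from the question, that would look something like the sketch below (note the space after 'Basic' and the empty string passed to encode_base64, which suppresses the trailing newline it adds by default):
$client->POST( $url, $content, {
    'Authorization'     => 'Basic ' . encode_base64( "$user:$pass", '' ),
    'X-Atlassian-Token' => 'no-check',
} );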
But...it looks like you're expecting REST::Client to use HTTP::Request::Common to construct a multipart/form-data request. Unfortunately, that's not the case, so you'll have to build the content by hand.
You could use HTTP::Request::Common directly like this:
use strict;
use warnings 'all';
use 5.010;
use HTTP::Request::Common;
use REST::Client;
my $client = REST::Client->new;
my $url = 'http://www.example.com';
my $req = POST($url,
Content_Type => 'form-data',
Content => [ file => [ 'foo.txt' ] ]
);
$client->POST($url, $req->content(), {
$req->headers->flatten()
});
But this is a bit convoluted; I would recommend dropping REST::Client and using LWP::UserAgent instead. REST::Client is just a thin wrapper for LWP::UserAgent with a few convenience features, like prepending a default host to all requests. In this case, it's just getting in the way and I don't think the conveniences are worth the trouble.
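If you go that route, here is a minimal sketch of the same upload with LWP::UserAgent, reusing the $url, $user, $pass and $attachment values from the question (the header values come from the question and are not verified against your JIRA instance):
use strict;
use warnings;
use LWP::UserAgent;
use MIME::Base64 qw(encode_base64);

my $user       = 'user';
my $pass       = 'pass';
my $url        = 'http://******/rest/api/2/issue/BugID/attachments';
my $attachment = 'C:\\Folder\\Test.txt';

my $ua = LWP::UserAgent->new;

# With Content_Type => 'form-data', LWP::UserAgent hands the request off to
# HTTP::Request::Common, which builds the multipart/form-data body for us.
my $response = $ua->post(
    $url,
    'Authorization'     => 'Basic ' . encode_base64( "$user:$pass", '' ),
    'X-Atlassian-Token' => 'no-check',
    'Content_Type'      => 'form-data',
    'Content'           => [ file => [ $attachment ] ],
);

print $response->status_line, "\n";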
From the documentation:
POST ( $url, [$body_content, %$headers] )
And you're doing:
$client->POST(
$url,
'Content_Type' => 'form-data',
'Content' => [ 'file' => [$attachment] ]
);
So you're passing a list of scalars, with an arrayref at the end.
Perhaps you want something like:
$client->POST(
$url,
$attachment,
{ 'Content-Type' => 'form-data' }
);
Note the {} to construct an anonymous hash for the headers.
Although you probably want to open and include the 'attachment', because there's nothing in REST::Client about opening files and sending them automagically.
Here is my issue with argument handling in Perl. I need to pass whatever arguments are given to the Perl script on to an HTTP request (a web service), for example:
perl wsgrep.pl -name=john -weight -employeeid -cardtype=physical
In wsgrep.pl, I need to pass the above arguments as HTTP POST parameters, like below:
http://example.com/query?name=john&weight&employeeid&cardtype=physical.
I am using LWP to request this URL and get the response.
Is there a good approach for doing this?
Updated:
Inside wsgrep.pl:
use strict;
use warnings;
use LWP::UserAgent;

my ( %args, %config );
my $ws_url  = "http://example.com/query";
my $browser = LWP::UserAgent->new;
# The POST parameters are currently hard-coded, but they should be built
# dynamically from the command-line arguments.
my $response = $browser->post(
$ws_url,
[
'name' => 'john',
'cardtype' => 'physical'
],
);
if ( $response->is_success ) {
print $response->content;
}
else {
print "Failed to query webservice";
return 0;
}
I need to construct the POST parameter part from the given arguments:
[
'name' => 'john',
'cardtype' => 'physical'
],
Normally, to url-encode params, I'd use the following:
use URI;
my $url = URI->new('http://example.com/query');
$url->query_form(%params);
say $url;
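For example, with %params = ( name => 'john', cardtype => 'physical' ), that prints http://example.com/query?name=john&cardtype=physical (the parameter order depends on the hash).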
Your needs are more elaborate.
use strict;
use warnings;
use feature qw( say );
use URI qw( );
use URI::Escape qw( uri_escape );

my $url = URI->new('http://example.com/query');

my @escaped_args;
for (@ARGV) {
    my ($arg) = /^-(.*)/s
        or die("usage");

    push @escaped_args,
        join '=',
        map uri_escape($_),
        split /=/, $arg, 2;
}

$url->query(@escaped_args ? join('&', @escaped_args) : undef);

say $url;
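If you would rather keep the $browser->post call from your update and just build its form pairs from @ARGV, here is a minimal sketch; the usage message and the empty value given to flag-style arguments such as -weight are my own assumptions:
use strict;
use warnings;
use LWP::UserAgent;

my @form;
for my $arg (@ARGV) {
    my ($pair) = $arg =~ /^-(.*)/s
        or die "usage: $0 -name=value -flag ...\n";
    # split into key and optional value; flags get an empty value
    my ( $key, $value ) = split /=/, $pair, 2;
    push @form, $key => defined $value ? $value : '';
}

my $browser  = LWP::UserAgent->new;
my $response = $browser->post( 'http://example.com/query', \@form );
print $response->is_success ? $response->content : "Failed to query webservice\n";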
I have to play with some misconfigured web servers, so I started processing the HTML meta tags to feed information back into the web user-agent object. I tried a variety of ways of doing this in Mojolicious and settled on listening for a "finish" event on the response. My goal was to make this mostly invisible to the rest of the code so the process wasn't even aware this was happening.
Still, this just doesn't sit right with me for a reason I can't quite put my finger on. Aside from the particular code in process_meta_options, is there a more Mojolicious way to do this? For example, Mojo::UserAgent get() with a user-defined callback uses the read event, but I tend to think that might interfere with things. Or I could just be over-thinking it.
use v5.20;
use feature qw(signatures);
no warnings qw(experimental::signatures);
use Data::Dumper;
use Mojo::UserAgent;
my $ua = Mojo::UserAgent->new;
my $tx = $ua->build_tx( GET => 'http://blogs.perl.org' );
$tx->res->on(
finish => \&process_meta_options
);
$tx = $ua->start( $tx );
say "At end, charset is ", $tx->res->content->charset;
sub process_meta_options ( $res ) {
$res
->dom
->find( 'head meta[charset]' ) # HTML 5
->map( sub {
my $content_type = $res->headers->header( 'Content-type' );
return unless my $meta_charset = $_->{charset};
$content_type =~ s/;.*//;
$res->headers->header( 'Content-type', "$content_type; charset=$_->{charset}" );
} );
}
I think the answer is just what I came up with above. I haven't found anything that I liked better.
I'm trying to debug a weird warning that is showing up in server logs when a Plack::Request is being parsed. In some cases, a broken UserAgent will send a Content-Length header that looks something like "6375, 6375", which is obviously wrong.
To fix this properly, I need to be able to reproduce the warning. I'd like to include this in a unit test so that I can ensure there are no regressions after the warning is silenced. However, I'm having trouble doing this with Perl. I know this can be done using netcat and socat, but I don't want the unit test to have to rely on other binaries to be installed.
Here is what I've tried:
#!/usr/bin/env perl
use strict;
use warnings;
use JSON::XS qw( encode_json );
use WWW::Mechanize;
my $mech = WWW::Mechanize->new;
$mech->add_handler(
request_prepare => sub {
my ( $req, $ua, $h ) = @_;
$req->headers->header( 'Content-Length' => 9999 );
return;
}
);
my $json = encode_json( { foo => 'bar' } );
$mech->post(
'http://example.com/url',
'Content-Length' => 999,
Content => $json
);
Output is:
Content-Length header value was wrong, fixed at /opt/perl5.16.3/lib/site_perl/5.16.3/LWP/Protocol/http.pm line 260.
200
That's entirely too helpful for me. :)
If I use HTTP::Request and LWP::UserAgent, it's the same end result.
So, I tried HTTP::Tiny.
#!/usr/bin/env perl
use strict;
use warnings;
use DDP;
use HTTP::Tiny;
use JSON::XS qw( encode_json );
my $http = HTTP::Tiny->new;
my $json = encode_json( { foo => 'bar' } );
my $response = $http->request(
'POST',
'http://example.com/url',
{ headers => { 'Content-Length' => 999, },
content => $json,
}
);
p $response;
The output is:
{
    content => "Content-Length missmatch (got: 13 expected: 999)
",
    headers => {
        "content-length" => 49,
        "content-type"   => "text/plain",
    },
    reason  => "Internal Exception",
    status  => 599,
    success => "",
    url     => "http://example.com/url",
}
Again, too helpful. At this point, I could use a few suggestions.
It seems like the higher-level APIs are fixing your error. Here's an example using raw sockets that gets around this:
#!/usr/bin/env perl
use strict 'vars';
use warnings;
use Socket;
# initialize host and port
my $host = 'www.example.com';
my $port = 80;
# contact the server
open_tcp(F, $host, $port)
or die 'Could not connect to server';
# Send request data
while ( my $request = <DATA> ) {
print F $request;
}
# Get Response
while ( my $response = <F> ) {
print "Response:> $response";
}
close(F);
# TCP Helper
sub open_tcp
{
# get parameters
my ($FS, $dest, $port) = @_;
my $proto = getprotobyname('tcp');
socket($FS, PF_INET, SOCK_STREAM, $proto);
my $sin = sockaddr_in($port,inet_aton($dest));
connect($FS,$sin);
my $old_fh = select($FS);
$| = 1; # don't buffer output
select($old_fh);
}
__DATA__
GET / HTTP/1.1
Host: example.com
Content-Length: 999
-END-
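Another option, since the goal is a unit test for Plack::Request rather than a real client, is to skip the network entirely and hand Plack::Request a PSGI environment that already contains the broken header. This is only a sketch, under the assumption that the warning fires when the request body is read:
#!/usr/bin/env perl
use strict;
use warnings;
use Plack::Request;

# Build a minimal PSGI env by hand; Plack::Request only needs the hashref,
# so no socket, netcat or socat is involved.
my $body = '{"foo":"bar"}';
open my $input, '<', \$body or die $!;

my %env = (
    REQUEST_METHOD  => 'POST',
    SCRIPT_NAME     => '',
    PATH_INFO       => '/',
    QUERY_STRING    => '',
    SERVER_NAME     => 'localhost',
    SERVER_PORT     => 80,
    SERVER_PROTOCOL => 'HTTP/1.1',
    CONTENT_TYPE    => 'application/json',
    CONTENT_LENGTH  => '6375, 6375',    # the malformed value from the logs
    'psgi.input'    => $input,
    'psgi.errors'   => \*STDERR,
);

my $req     = Plack::Request->new( \%env );
my $content = $req->content;            # reading the body should provoke the warning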
#use SOAP::Lite ( +trace => all, maptype => {} );
use SOAP::Lite maptype => {};
use LWP::UserAgent;
use HTTP::Request::Common;
use MIME::Base64 qw(encode_base64);
#credentials' file
require "c:\\test\\pass.pl";
my $userAgent = LWP::UserAgent->new(keep_alive => 1);
sub SOAP::Transport::HTTP::Client::get_basic_credentials {
return $username => $password;
}
my $soap
= SOAP::Lite
->uri('<mysite>/_vti_bin/lists.asmx')
->on_action(sub {join '/', 'http://schemas.microsoft.com/sharepoint/soap/CopyIntoItemsLocal', $_[1]})
->proxy('<mysite>/_layouts/viewlsts.aspx?BaseType=0', keep_alive => 1);
# my $method = SOAP::Data->name('CopyIntoItemsLocal')
#     ->attr({xmlns => 'http://schemas.microsoft.com/sharepoint/soap/'});
# my @params = (SOAP::Data->name(SourceUrl => $source),
#               SOAP::Data->name(DestinationUrl => $destination) );
# print $soap->call($method => @params)->result;
my $fileName = 'c:\test\abc.txt';
my $destDir = "<mysite>/Lists/sharepoint1/";
#load and encode Data
my ( $data, $buf );
open(FILE, $fileName) or die "$!";
# read in multiples of 57 bytes so encode_base64 never pads in the middle of the data
while (read(FILE, $buf, 60 * 57)) {
$data .= encode_base64($buf);
}
close(FILE);
#make the call
print "uploading $fileName...";
$lists = $soap->GetList();
my $method = SOAP::Data->name('CopyIntoItemsLocal')->attr({xmlns => 'http://schemas.microsoft.com/sharepoint/soap/'});
my @params = (
SOAP::Data->name('SourceUrl')->value($fileName)->type(''),
SOAP::Data->name('DestinationUrls')->type('')->value(
\SOAP::Data->name('string')->type('')->value($destDir . $fileName)
),
SOAP::Data->name('Fields')->type('')->value(
\SOAP::Data->name('FieldInformation')->type('')->attr({Type => 'File'})->value('')
),
SOAP::Data->name('Stream')->value("$data")->type('')
);
#print Results
print $soap->call($method => @params)->result;
#print $response->headerof('//CopyResult')->attr->{ErrorCode};
#use SOAP::Lite ( +trace => all, maptype => {} );
use SOAP::Lite maptype => {};
use LWP::UserAgent;
use HTTP::Request::Common;
use MIME::Base64 qw(encode_base64);
require "c:\\test\\pass.pl";
my $userAgent = LWP::UserAgent->new(keep_alive=>1);
#setup connection
sub SOAP::Transport::HTTP::Client::get_basic_credentials {
return $username => $password;
}
my $soap = SOAP::Lite
-> uri('http://<mysite>')
-> on_action( sub{ join '/', 'http://schemas.microsoft.com/sharepoint/soap', $_[1] })
-> proxy('http://<mysite>/_vti_bin/lists.asmx',keep_alive => 1);
$lists = $soap->GetListCollection();
quit(1, $lists->faultstring()) if defined $lists->fault();
my @result = $lists->dataof('//GetListCollectionResult/Lists/List');
foreach my $data (@result) {
my $attr = $data->attr;
foreach my $a (qw(Title Description DefaultViewUrl Name ID WebId ItemCount)) {
printf "%-16s %s\n", $a, $attr->{$a};
}
print "\n";
}
The authentication seems to be working. At first I thought the GetListCollection web service was working because, when I made a call using it, it returned a page, but I think the call was just returning the page I specified in the proxy argument.
I am able to get the collection of lists on a particular SharePoint site.
I have used GetListCollection. However, I did not really understand the code that prints the list; I just copied it from squish.net. Now I am trying to use the CopyIntoItemsLocal web service.
We have a repository of files on one server (SVN), and I have to write a Perl script which, when executed, will copy the files and directories from SVN to SharePoint along with the directory structure.
I would appreciate any help or tips. Since it is a big task, I am doing it in modules.
I would start by using soapUI (formerly by Eviware, now by SmartBear), an open-source SOAP testing tool. It will let you send SOAP transactions back and forth without any other user interface. Once you are sure your transactions work and you can parse the data to get what you want, I would then move to Perl to automate those transactions.
This helps you eliminate errors in your requests early on, figure out how to parse responses, and familiarize yourself with the API.