How to connect RapidApp to PostgreSQL with UTF-8 enabled - perl

I'm creating a simple CRUD interface to a database, and I'm trying RapidApp.
I have an existing database, which I connect to with existing Moose-based code. There is a complication in that there is UTF-8 text in the database (eg 'Encyclopédie médico-chirurgicale. Técnicas quirúrgicas. Aparato digestivo')
My Moose-based code works just fine: data goes in & data comes out... and everyone is happy.
In my existing Moose code, the connector is:
$schema = My::Service::Schema->connect(
    'dbi:Pg:dbname=my_db;host=my.host.name;port=1234',
    'me',
    'secret',
    { pg_enable_utf8 => 1 },
);
When I set about connecting RapidApp, I first tried a simple rdbic.pl command, but that doesn't pick up the UTF-8 strings. In an attempt to enforce UTF-8-ness, I've created the following:
use Plack::Runner;
use Plack::App::RapidApp::rDbic;

my $cnf = {
    connect_info => {
        dsn      => 'dbi:Pg:dbname=my_db;host=my.host.name;port=1234',
        user     => 'me',
        password => 'secret',
        { pg_enable_utf8 => 1 },
    },
    schema_class => 'My::Service::Schema',
};

my $App = Plack::App::RapidApp::rDbic->new( $cnf );
my $psgi = $App->to_app;

my $runner = Plack::Runner->new;
$runner->parse_options('--port', '5678');
$runner->run($psgi);
(which is pretty much rdbic.pl, compressed to one specific thing)
However, I'm getting malformed strings (eg: 'EncyclopÃ©die mÃ©dico-chirurgicale. TÃ©cnicas quirÃºrgicas. Aparato digestivo')
Having fought to get the correct text INTO the database, I know the database is correct... so how do I connect RapidApp to get UTF-8 back out?

Your schema will need to be configured to support UTF-8. Here's a helpful set of things to try:
How to properly use UTF-8-encoded data from Schema inside Catalyst app?
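In particular, the nested { pg_enable_utf8 => 1 } hashref in the question is likely the culprit: with the hashref ("new") style of connect_info, driver attributes belong at the top level rather than in their own hashref. A minimal sketch of the corrected config (same hypothetical DSN and credentials as the question):

```perl
my $cnf = {
    connect_info => {
        dsn            => 'dbi:Pg:dbname=my_db;host=my.host.name;port=1234',
        user           => 'me',
        password       => 'secret',
        pg_enable_utf8 => 1,    # top-level key, not nested in its own hashref
    },
    schema_class => 'My::Service::Schema',
};
```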

Related

jdbc output plugin logstash using UPSERT function

I'm using this plugin as the output for my Logstash logs.
I need to use the upsert function: if a row exists, update it; if it doesn't exist, simply add it.
I'm using PostgreSQL as the db and it supports the usage of UPSERT, described very well here. As input, the logs are coming from Elasticsearch.
The problem with my configuration is that I correctly add new rows to my table but cannot update an existing one.
Here's my configuration:
jdbc {
    driver_jar_path => '/home/vittorio/Downloads/postgresql-42.1.1.jre6.jar'
    connection_test => false
    connection_string => 'jdbc:postgresql://127.0.0.1:5432/postgres'
    statement => ["
        INSERT INTO userstate VALUES(?,?,?,?,?) on conflict (username)
        do update set (business_name, iban, status, timestamp) = ('%{[resource][response_attributes][business_name]}','%{[resource][response_attributes][iban]}','%{[resource][response_attributes][status]}','%{#timestamp}')
        where userstate.username = '%{[request][username]}';", "%{[request][username]}","%{[resource][response_attributes][business_name]}","%{[resource][response_attributes][iban]}","%{[resource][response_attributes][status]}","%{#timestamp}"
    ]
    username => "myuser"
    password => "mypass"
}
Am I doing something wrong?
Thanks
I managed to make it work by myself, and this is what I've done so far:
jdbc {
    driver_jar_path => '/home/vittorio/Downloads/postgresql-42.1.1.jre6.jar'
    connection_test => false
    connection_string => 'jdbc:postgresql://127.0.0.1:5432/postgres'
    statement => ["
        INSERT INTO userstate VALUES(?,?,?,?,?)
        on conflict (username)
        do update set (business_name, iban, status, timestamp) = (?,?,?,?)
        where userstate.username = ?"
        , "%{[request][username]}","%{[resource][response_attributes][business_name]}","%{[resource][response_attributes][iban]}","%{[resource][response_attributes][status]}","%{#timestamp}","%{[resource][response_attributes][business_name]}","%{[resource][response_attributes][iban]}","%{[resource][response_attributes][status]}","%{#timestamp}","%{[request][username]}"
    ]
    username => "myusername"
    password => "mypass"
}
Basically, I've changed the where statement to use ? instead of %{[request][username]}, and then mapped each ? to the corresponding value from the log. I know, it's a pretty long list after the comma, but this is the only way I found to make it work. If anyone knows a better way to do it, please let me know.
Thank you
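One possible shorter variant (an untested sketch against the same userstate table): PostgreSQL's EXCLUDED pseudo-table refers to the row proposed for insertion, so the DO UPDATE clause can reuse the five INSERT parameters instead of binding each value twice:

```sql
INSERT INTO userstate VALUES (?, ?, ?, ?, ?)
ON CONFLICT (username)
DO UPDATE SET (business_name, iban, status, timestamp) =
    (EXCLUDED.business_name, EXCLUDED.iban, EXCLUDED.status, EXCLUDED.timestamp);
```

With this form, the statement array in the jdbc block would only need the original five %{...} parameters, and the trailing WHERE clause becomes unnecessary because ON CONFLICT (username) already pins down the row.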

SugarCRM Mass removal of custom fields

I have a few hundred custom fields to delete in an older version of SugarCRM. It is very labor intensive to delete them through the web interface...
Can this be done directly by deleting files in the installation (vardefs, anything else?)
This is similar to an earlier question (revert the custom fields made by sugarCRM), but that one was solved by using the web interface for a few fields.
I can easily write a script to then remove the fields from the {table_name}_cstm tables...
You can try something like this (it should be executed in a SugarCRM environment, e.g. an entryPoint, and with an admin user):
$fieldsByModule = array(
    'Accounts' => array(
        'field_1_c',
        'field_2_c',
    ),
    'Contacts' => array(
        'field_1_c',
        'field_2_c',
    ),
);

require_once('modules/DynamicFields/DynamicField.php');

foreach ($fieldsByModule as $moduleName => $fields) {
    foreach ($fields as $field) {
        $dyField = new DynamicField();
        $dyField->bean = BeanFactory::getBean($moduleName);
        $dyField->module = $moduleName;
        $dyField->deleteField($field);
    }
}
This is live coding without testing, but the core of the process should be close to that.

How to properly use UTF-8-encoded data from Schema inside Catalyst app?

Data defined inside the Catalyst app or in templates has correct encoding and is displayed well, but from the database everything non-Latin1 is converted to ?. I suppose the problem is in the model class, which is as follows:
use strict;
use base 'Catalyst::Model::DBIC::Schema';

__PACKAGE__->config(
    schema_class => 'vhinnad::Schema::DB',
    connect_info => {
        dsn      => 'dbi:mysql:test',
        user     => 'user',
        password => 'password',
        {
            AutoCommit        => 1,
            RaiseError        => 1,
            mysql_enable_utf8 => 1,
        },
        'on_connect_do' => [
            'SET NAMES utf8',
        ],
    }
);

1;
I see no flaws here, but something must be wrong. I also used my schema with test scripts, and there the data was well encoded and the output was correct, but inside the Catalyst app I did not get the encoding right. Where might the problem be?
EDIT
For future reference I put the solution here: I mixed old and new style in connect_info.
The old style is (dsn, username, password, hashref_options, hashref_other_options).
The new style is (dsn => dsn, username => username, etc.), so the right way is to use:
connect_info => {
    dsn      => 'dbi:mysql:test',
    user     => 'user',
    password => 'password',
    AutoCommit => 1,
    RaiseError => 1,
    mysql_enable_utf8 => 1,
    on_connect_do => [
        'SET NAMES utf8',
    ],
}
In a typical Catalyst setup with Catalyst::View::TT and Catalyst::Model::DBIC::Schema you'll need several things for UTF-8 to work:
add Catalyst::Plugin::Unicode::Encoding to your Catalyst app
add encoding => 'UTF-8' to your app config
add ENCODING => 'utf-8' to your TT view config
add <meta http-equiv="Content-type" content="text/html; charset=UTF-8"/> to the <head> section of your html to satisfy old IEs which don't care about the Content-Type:text/html; charset=utf-8 http header set by Catalyst::Plugin::Unicode::Encoding
make sure your text editor saves your templates in UTF-8 if they include non-ASCII characters
configure your DBIC model according to DBIx::Class::Manual::Cookbook#Using Unicode
if you use Catalyst::Authentication::Store::LDAP configure your LDAP stores to return UTF-8 by adding ldap_server_options => { raw => 'dn' }
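The first three items might look like this in practice (a sketch for an older Catalyst with the Unicode::Encoding plugin; MyApp is a placeholder app name):

```perl
# in lib/MyApp.pm
use Catalyst qw/ Unicode::Encoding /;
__PACKAGE__->config( encoding => 'UTF-8' );

# in lib/MyApp/View/TT.pm
__PACKAGE__->config( ENCODING => 'utf-8' );
```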
According to Catalyst::Model::DBIC::Schema#connect_info:
The old arrayref style with hashrefs for DBI then DBIx::Class options is also supported.
But you are already using the 'new' style so you shouldn't nest the dbi attributes:
connect_info => {
    dsn      => 'dbi:mysql:test',
    user     => 'user',
    password => 'password',
    AutoCommit => 1,
    RaiseError => 1,
    mysql_enable_utf8 => 1,
    on_connect_do => [
        'SET NAMES utf8',
    ],
}
This advice assumes you have fairly up to date versions of DBIC and Catalyst.
This is not necessary: on_connect_do => [ 'SET NAMES utf8' ]
Ensure the table|column charsets are UTF-8 in your DB. You can achieve things that sometimes look right even when parts are broken. The DB must be saving the character data as UTF-8 if you expect the entire chain to work.
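To check the charsets, something along these lines should work in MySQL (table and schema names are placeholders; the Collation column implies the charset):

```sql
-- per-column collations for one table
SHOW FULL COLUMNS FROM my_table;

-- default charset of the schema
SELECT default_character_set_name
FROM information_schema.schemata
WHERE schema_name = 'my_db';
```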
Ensure you're using and configuring Catalyst::Plugin::Unicode::Encoding in your Catalyst app. It did have serious-ish bugs in the not too distant past so get the newest.

How can I get attributes from complex SOAP::DATA structure in SOAP::Lite?

I can't get a simple attribute value from SOAP response using SOAP::Lite.
Below the code and output of SOAP::Data. I'm trying to get value of the attribute //response/dirn/attr/uuid
use SOAP::Lite;
use Data::Dumper;

my $cm = new SOAP::Lite
    uri   => 'http://www.cisco.com/AXL/API/1.0',
    proxy => "https://10.0.0.1:8443/axl/";

my $res = $cm->getPhone(
    SOAP::Data->name(phoneName => 'SEP00270D3D7A4C'),
);

for my $i ($res->valueof('//device/lines/line')) {
    print Dumper($i);
    #print $i->{dirn}->{attr}->{'uuid'}."\n"; # this line gives me an error
}
Here is the output of Data::Dumper. I actually have the requested value, but I can't get to it through SOAP::Data:
$VAR1 = \bless( {
    '_signature' => [],
    '_value' => [
        bless( {
            '_name' => 'dirn',
            '_signature' => [],
            '_value' => [
                ''
            ],
            '_prefix' => '',
            '_attr' => {
                'uuid' => '{615C3550-1EFD-56C7-3788-2AA8725880E3}' #!!!!!!!!!!!!!!!!!!!!!!!!!!
            }
        }, 'SOAP::Data' ),
    ],
    '_attr' => {}
}, 'SOAP::Data' );
I spent several hours trying to get this attribute value. I'm already thinking about parsing the output of Data::Dumper to get the value, as a fast and dirty hack.
Thanks in advance
P.S.: SOAP Server is Cisco CUCM 6.1.5
$$i->value->attr->{uuid}
$i->{'_value'}[0]{'uuid'}
I think, though, I'm not sure about the [0].
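In context, the first suggestion would be used inside the loop from the question (a sketch; $i there is a reference to a SOAP::Data object):

```perl
for my $i ($res->valueof('//device/lines/line')) {
    # $i is a reference to a SOAP::Data object, so dereference it first;
    # ->value returns the inner 'dirn' SOAP::Data node, and ->attr
    # returns its attribute hashref, which holds the uuid.
    my $uuid = $$i->value->attr->{uuid};
    print "$uuid\n";
}
```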
I have the same issue, but cannot find a "quick and easy" solution to it. I developed a Perl library module to use a certain vendor Web Service (WSDL). I had done many such Web Service interfaces, but until now all of the data was returned as XML "elements". By contrast, this particular Web Service returns most of the data as XML elements, but also sets some as XML attributes. I cannot get the values returned as attributes, since the SOAP::Data methods (valueof(), body(), etc.) only return the values of XML elements, not the associated attributes.
This problem is a little different from the one posted before, in that I do not know up front the XML structure that is being returned (the web service provides many different methods, and each has a different response).
So the question is: how is it possible to get all of the XML data (both elements and attributes) from a generic SOAP response?
I went through the same thing recently and found the answer, refer to my question and my updated answer in the comments section.
Extract specific XML element in CDATA taken from SOAP::Lite Response Hash

How do I set up my POE::Filter to receive the entire chunk of data returned from the server?

I tried the following:
my $filter = POE::Filter::Line->new(OutputLiteral => '');

my $wheel = POE::Wheel::ReadWrite->new(
    Handle       => $socket,
    Filter       => $filter,
    InputEvent   => 'on_input',
    ErrorEvent   => 'on_error',
    FlushedEvent => 'on_flush',
);
But on_input is called several times, with each line separately in ARG0. How do I get it all together? Doesn't setting OutputLiteral to '' change the filter's understanding of what a "line" is?
First of all, you are reading from the filter, so it's InputLiteral which is important here, not OutputLiteral. Second, you can't have an empty InputLiteral (if you try, it will just autodetect the input literal). Consequently, you can't use POE::Filter::Line to get all the data, because it is made for parsing line-terminated records. Use POE::Filter::Stream instead.
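A sketch of the suggested change, keeping the rest of the wheel setup from the question:

```perl
use POE::Filter::Stream;

# POE::Filter::Stream passes bytes through unparsed, so on_input receives
# each chunk as it arrives from the socket instead of one line at a time.
my $filter = POE::Filter::Stream->new;

my $wheel = POE::Wheel::ReadWrite->new(
    Handle       => $socket,    # $socket as in the question
    Filter       => $filter,
    InputEvent   => 'on_input',
    ErrorEvent   => 'on_error',
    FlushedEvent => 'on_flush',
);
```

Note that "all the data together" then depends on your protocol: the stream filter delivers whatever the socket produced per read, so if a logical record can span reads, you still need to accumulate chunks in on_input until you detect the end of the record.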