Is there a ready-made anonymizer engine/library (browser-in-browser) for personal use?

I want to be able to quickly open sites that are blocked in my country. My server already runs OpenVPN and I use it, but it is quite inconvenient on someone else's PC: installing an OpenVPN client, feeding it an .ovpn file, and then deleting it all afterwards is not a great solution. I could, of course, set up tunnels, but that is not exactly quick either.
That is why I want my own browser in the browser. I have used Guacamole and xrdp, but since my server has no graphics card it is hard to call that comfortable to work with.
I sketched this for the test:
<?php
$url = "https://www.google.com"; // full URL with scheme; "Google.com" alone fails
$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_CONNECTTIMEOUT => 20,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_FOLLOWLOCATION => 1,
]);
$result = curl_exec($ch);
curl_close($ch);
?>
<div><?= $result ?></div>
And it does not work the way I'd like. I understand that I would have to configure headers, handle certificates and redirects properly, and generally rewrite all the links on the page, so I want to ask: is there a ready-made engine of this kind (though it is hard to call it an engine) that would give me this browser in the browser out of the box?
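Since the question mentions rebuilding the links, here is a minimal sketch of one common piece of that: injecting a <base> tag so relative URLs on the fetched page resolve against the original site rather than the proxy. The inject_base() helper and the User-Agent string are my own illustration, not from any ready-made engine:

```php
<?php
// Hypothetical helper: insert a <base> tag right after the opening <head>
// so that relative links and assets on the proxied page resolve against
// the original site instead of this server.
function inject_base(string $html, string $baseUrl): string
{
    $tag = '<base href="' . htmlspecialchars($baseUrl, ENT_QUOTES) . '">';
    $count = 0;
    $patched = preg_replace('/<head([^>]*)>/i', '<head$1>' . $tag, $html, 1, $count);
    return $count ? $patched : $tag . $html; // no <head>? just prepend the tag
}

$url = "https://www.google.com"; // full URL with scheme
$ch = curl_init($url);
curl_setopt_array($ch, [
    CURLOPT_CONNECTTIMEOUT => 10,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_FOLLOWLOCATION => 1,
    // Some sites serve different markup (or refuse) without a browser-like User-Agent.
    CURLOPT_USERAGENT      => 'Mozilla/5.0 (X11; Linux x86_64)',
]);
$result = curl_exec($ch);
$error  = curl_error($ch);
curl_close($ch);

echo $result === false ? "curl error: $error" : inject_base($result, $url);
```

This only fixes relative URLs; absolute links, cookies, JavaScript-generated requests, and HTTPS certificates would all still need handling, which is exactly why ready-made proxy scripts exist.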

Related

Slim Framework and Auth0

I haven't worked with PHP for close on 10 years now, so I'm very out of touch. I have a project I am working on that requires a web front end with secure authentication. There is no need for APIs at all.
Auth0 meets the requirements from an authentication point of view, and provides a lot of options.
What I can't find is how to integrate it with Slim Framework. Can anyone point me in the right direction?
Background on the app: I am collating information from multiple API sources into a database and want to display it and add some more functionality. Currently most of this is displayed on Grafana dashboards around the office, but there are some new requirements that can't be solved with dashboards.
Slim looks like the right tool for me; I need something that lets me create pages quite easily, where I will in effect be displaying a few graphs but mostly tables and forms to interact with the data. If Slim is not the right fit, I'm happy to look elsewhere.
Thanks
According to the official Auth0 documentation I would try a setup in Slim 3 like this:
Installation
composer require auth0/auth0-php
Container Setup
Add a new container factory entry:
use Auth0\SDK\Auth0;
use Psr\Container\ContainerInterface as Container;
//...
$container[Auth0::class] = function (Container $container) {
    return new Auth0([
        'domain' => 'YOUR_DOMAIN',
        'client_id' => 'YOUR_CLIENT_ID',
        'client_secret' => 'YOUR_CLIENT_SECRET',
        'redirect_uri' => 'https://YOUR_APP/callback',
        'audience' => 'https://YOUR_DOMAIN/userinfo',
        'scope' => 'openid profile',
        'persist_id_token' => true,
        'persist_access_token' => true,
        'persist_refresh_token' => true,
    ]);
};
Usage
The user's information is stored in the session. Each time you call getUser(), it retrieves the information from the session.
use Auth0\SDK\Auth0;
$auth0 = $container->get(Auth0::class);
$userInfo = $auth0->getUser();
if (!$userInfo) {
    // We have no user info
    // redirect to login
} else {
    // User is authenticated
    // Say hello to $userInfo['name']
    // print a logout button
}
Note: don't fetch services from the container directly like this in real code; prefer dependency injection.
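As a sketch of what that could look like: an invokable action class that receives the Auth0 service through its constructor. The ProfileAction class, the /profile route, and the use of the SDK's login() redirect are my own illustration, not from the Auth0 docs:

```php
<?php
// Illustrative invokable Slim action that receives Auth0 via its
// constructor instead of pulling it out of the container in the route.
use Auth0\SDK\Auth0;

class ProfileAction
{
    private $auth0;

    public function __construct(Auth0 $auth0)
    {
        $this->auth0 = $auth0;
    }

    public function __invoke($request, $response)
    {
        $userInfo = $this->auth0->getUser();

        if (!$userInfo) {
            // Not authenticated: redirect the visitor to the login page.
            $this->auth0->login();
            return $response;
        }

        return $response->write('Hello ' . htmlspecialchars($userInfo['name']));
    }
}

// Wired up in the container and routed roughly like this:
// $container[ProfileAction::class] = function ($c) {
//     return new ProfileAction($c->get(Auth0::class));
// };
// $app->get('/profile', ProfileAction::class);
```

This way the route code never touches the container; swapping or mocking the Auth0 dependency for tests only requires changing the factory.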
"but mostly tables and forms to interact with the data"
Aside from the graphs, if the above is the main requirement then I would also recommend looking at the Yii Framework (a PHP framework), and in particular at Gii, a code generator that builds CRUD forms and tables exceptionally quickly.

Perl miss and hit with cookie grabber

Below is a script which grabs a cookie from "example.com" and encodes it as base64.
It usually works, although for some reason there are random days where it acts up and does not grab any cookies at all.
I've checked at times when the script was failing, and the site would still send a cookie to the client.
Same method, nothing changed on the site's behalf, and nothing changed in the script, yet it would still act up sometimes.
Does anyone know what this could possibly be?
Do I have to change my method of grabbing cookies, in case this method is obsolete or ancient?
use strict;
use warnings;
use LWP::UserAgent;
use MIME::Base64;    # encode_base64() comes from here

my $ua = LWP::UserAgent->new;
$ua->cookie_jar({});    # empty hashref tells LWP to create an in-memory cookie jar
$ua->get("http://www.example.com");
my $cookie = encode_base64( $ua->cookie_jar->as_string );
Other info: it's part of a Perl CGI script, hosted on a website.
I'm still unsure of the error that was causing this. However, when I turned off CloudFlare for my website, the problem resolved itself immediately.
Oh well, there go 3 hours of my life over CloudFlare...

Magento post from controller

Lately I've been using cURL to post data back from a custom Magento controller to a custom page on the same website.
However, the way I do it somehow breaks Magento's log in data. So I've tried another way. Magento has cURL functionality built into it (Varien_Http_Adapter_Curl).
I've tried to Post through this, but so far it has been over my head and documentation on the web is fairly sparse. I need help with this. I've got a string with all the $_POST data ready to go. Please can someone tell me how to send it?
This:
$url = "<URL>";
$curl = new Varien_Http_Adapter_Curl();
$curl->setConfig(array('timeout' => 15));
$curl->write(Zend_Http_Client::POST, $url, '1.1', array(), $poststring);
$result = $curl->read();
$curl->close();
...isn't sending the data.
Edit:
I've tried plain (non-Magento) cURL, but didn't know about the session data. I still have no idea how to send session data either.
I've now tried session variables, but the result is that I can set and read data on one page, yet when changing pages the data is lost. So this can't currently be used between the controller and the view.
You could use Magento's sessions instead. Here is a guide on how to get, set and unset them:
http://magento-rohan.blogspot.in/2012/03/magento-get-set-unset-session.html
You need to give us more information about what you are trying to achieve. Basically: where are you sending the POST request to? Another Magento instance, or even the same Magento website? Do you expect the user to keep the same session it has now? Once you give us more info I will edit my answer; for now I will try to guess what is bothering you based on the input you gave.
When you submit a POST request with cURL from the server side, the user is no longer interacting with the "page" you are posting to.
If the user is not interacting with it, the request does not carry the user's session information.
Basically it looks like this:
Normally
Eric ->(Request with session info)-> Server (Oh it's you Eric, here is
the response just for you)
What are you trying to do
Eric ->(Request with session info)-> Server ->(Request without session
info)-> Server (This server doesn't know about Eric)
So, if I have guessed your problem correctly, the fix is to pass the session information to the second server along with your request.
I will add more info if you tell me I am on the right track in understanding your problem.
---UPDATE---
You didn't explain your situation well. I am telling you this because the whole cURL approach may be a bad decision from the start. For example, if you are trying to execute code that lives in the same Magento codebase but is trapped inside some controller, you could refactor it, encapsulate that logic in a model, and call it directly.
But here is an example of passing session information over cURL in plain PHP:
$strCookie = 'PHPSESSID=' . $_COOKIE['PHPSESSID']; // the Cookie request header carries only name=value pairs
curl_setopt($curl, CURLOPT_COOKIE, $strCookie);
In Magento the cookie should probably be called "frontend" instead of PHPSESSID. I also checked Varien_Http_Adapter_Curl: it has no method for setting the CURLOPT_COOKIE option, so I suggest you go with a plain cURL setup. Alternatively, you can extend the adapter and add that option yourself by overriding its "_applyConfig" method.
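A plain-cURL sketch of that idea, with the session cookie forwarded server-side. The URL, the POST fields, and the build_cookie_header() helper are placeholders of mine; "frontend" is Magento's default frontend session cookie name:

```php
<?php
// Build a Cookie request header value from name => value pairs.
function build_cookie_header(array $cookies): string
{
    $pairs = [];
    foreach ($cookies as $name => $value) {
        $pairs[] = $name . '=' . $value;
    }
    return implode('; ', $pairs);
}

// Placeholder target and payload.
$url        = 'https://example.com/custom/page';
$poststring = http_build_query(['key' => 'value']);

$curl = curl_init($url);
curl_setopt_array($curl, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $poststring,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_TIMEOUT        => 15,
    // Forward the visitor's Magento session cookie ("frontend") so the
    // target page sees the same session as the visitor's browser.
    CURLOPT_COOKIE         => build_cookie_header([
        'frontend' => isset($_COOKIE['frontend']) ? $_COOKIE['frontend'] : '',
    ]),
]);
$result = curl_exec($curl);
curl_close($curl);
```

Forwarding the visitor's cookie like this only makes sense while handling that visitor's request, since $_COOKIE is only populated then.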

WWW::Mechanize won't work after submit

At my work I build a lot of WordPress sites and do a lot of cutting and pasting. To streamline this process I'm trying to make a crawler that can fill out and submit form information to WordPress. However, I can't get the crawler to operate correctly in the WordPress admin panel once I'm past the login.
I know the login form submission works, because I've gotten the page back before. But this script doesn't seem to return the "Settings" page, which is what I want. I've been using this site as a guide for how to use Mechanize: www.higherpass.com/Perl/Tutorials/Using-Www-mechanize/3/ but I could use some additional pointers. Here is my Perl script; I've tried a few variations, I just need to be pointed in the right direction.
Thanks!
use strict;
use warnings;
use WWW::Mechanize;

my $m = WWW::Mechanize->new();
my $url  = 'http://www.moversbatonrougela.com/wp-admin';
my $url2 = 'http://www.moversbatonrougela.com/wp-admin/options-general.php';
$m->get($url);
$m->form_name('loginform');
$m->set_fields( 'username' => 'user', 'password' => 'password' );
$m->submit();
my $response = $m->get($url2);
print $response->decoded_content();
Put the lines below just before $m->submit();. Since WWW::Mechanize is a subclass of LWP::UserAgent, you can use any of LWP's methods.
$m->add_handler("request_send", sub { shift->dump; return });
$m->add_handler("response_done", sub { shift->dump; return });
The above enables logging in your code. Look out for the request/response status codes, i.e. 200 (OK), 302 (Redirect), etc. The $m->get() request is probably being redirected, or the machine's IP is blocked by the server. If it is a redirect, you can use $m->redirect_ok() to follow the redirect URL; if you don't want redirects followed, use $m->requests_redirectable (an LWP method). The logs should show something like the following:
HTTP/1.1 200 OK
OR
HTTP/1.1 302 Found
If none of the above works, try the following alternative to $m->submit():
my $inputobject = $m->current_form()->find_input( undef, 'submit' );
$m->click_button( input => $inputobject );

Perl WWW::Mechanize::Firefox POST() Implementation

Can I get some help on how to submit a POST with the necessary variables using WWW::Mechanize::Firefox? I've installed all the Perl modules and the Firefox plugin, and tested that I can connect to a given host and get responses... my question is how to submit a POST request. In the documentation Corion says he may never implement it. This seems odd; I was hoping I could rely on the interface inherited from WWW::Mechanize, but I can't find any examples. A simple example would help me tremendously.
my $mech = WWW::Mechanize::Firefox->new();
$mech->allow( javascript =>1); # enable javascript
# http
$mech->get("http://www.example.com");
my $c = $mech->content;
Is there a $mech->post() option I am simply missing?
Many thanks in advance.
R
Normally you would just set the fields and submit the form like this:
$mech->get('http://www.website.com');
$mech->submit_form(
    with_fields => {
        user => 'me',
        pass => 'secret',
    }
);
Get a page, fill out the form, submit it.
If you are going to skip those steps by POSTing directly, you don't need WWW::Mechanize::Firefox at all; plain WWW::Mechanize (which inherits post() from LWP::UserAgent) will do.