I am trying to write a little SwiftUI app that connects to my server so I can display files and visually use SFTP (kind of like FileZilla). I have a function called ssh() that executes on a button press; here it is:
func ssh(user: String, domain: String, password: String) {
    let arg = "ssh " + user + "@" + domain
    let task = Process()
    task.launchPath = "/usr/bin/ssh"
    task.arguments = [arg]
    task.launch()
    task.waitUntilExit()
}
When I call the function, I get this nasty error message in the Xcode console:
2020-11-25 11:05:35.122998-0800 SFTP App[25287:1434441] Metal API Validation Enabled
2020-11-25 11:05:49.056839-0800 ssh[25291:1434628] [] nw_resolver_can_use_dns_xpc_block_invoke Sandbox does not allow access to com.apple.dnssd.service
2020-11-25 11:05:49.057250-0800 ssh[25291:1434628] dnssd_clientstub ConnectToServer: connect() failed path:/var/run/mDNSResponder Socket:4 Err:-1 Errno:1 Operation not permitted
2020-11-25 11:05:49.057351-0800 ssh[25291:1434628] [connection] nw_resolver_create_dns_service_locked [C1] DNSServiceCreateDelegateConnection failed: ServiceNotRunning(-65563)
ssh: Could not resolve hostname connect.markregg.com: -65563
I am able to connect to my server using the following terminal command and then entering my password:
ssh mark@connect.markregg.com
I would like to establish a TLS-encrypted connection to a PostgreSQL 11 database using Tokio as the framework, Deadpool as the connection pooler, and rustls as the TLS library.
I developed/modified the following code:
let pool = if let Some(ca_cert) = settings.db_ca_cert {
    let mut tls_config = ClientConfig::new();
    let cert_file = File::open(&ca_cert)?;
    let mut buf = BufReader::new(cert_file);
    tls_config.root_store.add_pem_file(&mut buf).map_err(|_| {
        anyhow::anyhow!("failed to read database root certificate: {}", ca_cert)
    })?;
    let tls = MakeRustlsConnect::new(tls_config);
    settings.pg.create_pool(tls)?
} else {
    settings.pg.create_pool(NoTls)?
};
My test scenario is taken from here:
PostgreSQL 11 docker container (including TLS turned on)
TLS was already tested successfully with the psql client
I now get the following error message and can't explain the problem. I already checked the access rights and other parameters.
/usr/local/bin/cargo run --color=always
Finished dev [unoptimized + debuginfo] target(s) in 0.20s
Running `target/debug/tokio-postgres-rustls-connection-pool-demo`
DEBUG tokio_postgres_rustls_connection_pool_demo > settings: Settings { pg: Config { user: Some("postgres"), password: Some("postgres"), dbname: Some("postgres"), options: Some("sslrootcert=/xxx/tokio-postgres-rustls-connection-pool-demo/docker/files/cert/ca.pem"), application_name: None, ssl_mode: None, host: Some("127.0.0.1"), hosts: None, port: Some(6432), ports: None, connect_timeout: None, keepalives: None, keepalives_idle: None, target_session_attrs: None, channel_binding: None, manager: None, pool: None }, db_ca_cert: None }
Error: Backend(Error { kind: Connect, cause: Some(Os { code: 2, kind: NotFound, message: "No such file or directory" }) })
I looked at the logs of the database and could identify the following error:
[86] LOG: XX000: could not accept SSL connection: Success
[86] LOCATION: be_tls_open_server, be-secure-openssl.c:408
How can I solve the problem?
I am trying to connect to a Postgres database in Rust using the sqlx crate.
main.rs:
use dotenv;
use sqlx::Pool;
use sqlx::PgPool;
use sqlx::query;

#[async_std::main]
async fn main() -> Result<(), Error> {
    dotenv::dotenv().ok();
    pretty_env_logger::init();

    let url = std::env::var("DATABASE_URL").unwrap();
    dbg!(url);

    let db_url = std::env::var("DATABASE_URL")?;
    let db_pool: PgPool = Pool::new(&db_url).await?;

    let rows = query!("select 1 as one").fetch_one(&db_pool).await?;
    dbg!(rows);

    let mut app = tide::new();
    app.at("/").get(|_| async move { Ok("Hello Rustacean!") });
    app.listen("127.0.0.1:8080").await?;
    Ok(())
}

#[derive(thiserror::Error, Debug)]
enum Error {
    #[error(transparent)]
    DbError(#[from] sqlx::Error),
    #[error(transparent)]
    IoError(#[from] std::io::Error),
    #[error(transparent)]
    VarError(#[from] std::env::VarError),
}
Here is my .env file:
DATABASE_URL=postgres://localhost/twitter
RUST_LOG=trace
Error log:
error: failed to connect to database: password authentication failed for user "ayman"
--> src/main.rs:19:16
|
19 | let rows = query!("select 1 as one").fetch_one(&db_pool).await?;
| ^^^^^^^^^^^^^^^^^^^^^^^^^
|
= note: this error originates in a macro (in Nightly builds, run with -Z macro-backtrace for more info)
error: aborting due to previous error
error: could not compile `backend`.
Note:
There exists a database called twitter.
I have included the macros feature in the sqlx dependency:
sqlx = {version="0.3.5", features = ["runtime-async-std", "macros", "chrono", "json", "postgres", "uuid"]}
Am I missing some level of authentication for connecting to the database? I could not find anything about it in the docs for the sqlx::query! macro.
The reason it is unable to authenticate is that you must provide credentials before accessing the database.
There are two ways to do it.
Option 1: Change your URL to contain the credentials. For instance:
DATABASE_URL=postgres://localhost?dbname=mydb&user=postgres&password=postgres
Option 2: Use PgConnectOptions. For instance:
use sqlx::postgres::{PgConnectOptions, PgPool, Postgres};
use sqlx::Pool;

let pool_options = PgConnectOptions::new()
    .host("localhost")
    .port(5432)
    .username("dbuser")
    .database("dbtest")
    .password("dbpassword");

let pool: PgPool = Pool::<Postgres>::connect_with(pool_options).await?;
Note: The sqlx version that I am using is sqlx = {version="0.5.1"}
For more information, refer to the docs: https://docs.rs/sqlx/0.5.1/sqlx/postgres/struct.PgConnectOptions.html#method.password
Hope this helps you.
I already have a users schema with an authentication key and wanted to do authentication via that. I tried implementing authentication via SQL, but due to the different structure of my schema I was getting errors, so I implemented the external authentication method. The technologies and OS used in my application are:
Node.JS
Ejabberd as XMPP server
MySQL Database
React-Native (Front-End)
OS - Ubuntu 18.04
I implemented the external authentication configuration as described in https://docs.ejabberd.im/admin/configuration/#external-script and took the PHP script https://www.ejabberd.im/files/efiles/check_mysql.php.txt as an example. But I am getting the error mentioned below in error.log. In ejabberd.yml I have the following configuration:
...
host_config:
  "example.org.co":
    auth_method: [external]
    extauth_program: "/usr/local/etc/ejabberd/JabberAuth.class.php"
    auth_use_cache: false
...
Also, is there any external auth javascript script?
Here are the relevant error.log and ejabberd.log entries:
error.log
2019-03-19 07:19:16.814 [error] <0.524.0>@ejabberd_auth_external:failure:103 External authentication program failed when calling 'check_password' for admin@example.org.co: disconnected
ejabberd.log
2019-03-19 07:19:16.811 [debug] <0.524.0>@ejabberd_http:init:151 S: [{[<<"api">>],mod_http_api},{[<<"admin">>],ejabberd_web_admin}]
2019-03-19 07:19:16.811 [debug] <0.524.0>@ejabberd_http:process_header:307 (#Port<0.13811>) http query: 'POST' <<"/api/register">>
2019-03-19 07:19:16.811 [debug] <0.524.0>@ejabberd_http:process:394 [<<"api">>,<<"register">>] matches [<<"api">>]
2019-03-19 07:19:16.811 [info] <0.364.0>@ejabberd_listener:accept:238 (<0.524.0>) Accepted connection ::ffff:ip -> ::ffff:ip
2019-03-19 07:19:16.814 [info] <0.524.0>@mod_http_api:log:548 API call register [{<<"user">>,<<"test">>},{<<"host">>,<<"example.org.co">>},{<<"password">>,<<"test">>}] from ::ffff:ip
2019-03-19 07:19:16.814 [error] <0.524.0>@ejabberd_auth_external:failure:103 External authentication program failed when calling 'check_password' for admin@example.org.co: disconnected
2019-03-19 07:19:16.814 [debug] <0.524.0>@mod_http_api:extract_auth:171 Invalid auth data: {error,invalid_auth}
Any help regarding this topic will be appreciated.
1) Your auth_method config looks good.
2) Here is a Python script I've used and adapted to do external authentication for ejabberd.
#!/usr/bin/env python3
import sys
import struct

def openAuth(args):
    (user, server, password) = args
    # Implement your interactions with your service / database here.
    # Return True or False.
    return True

def openIsuser(args):
    (user, server) = args
    # Implement your interactions with your service / database here.
    # Return True or False.
    return True

def from_ejabberd():
    # ejabberd sends a 2-byte big-endian length, then "op:user:server[:password]".
    input_length = sys.stdin.buffer.read(2)
    (size,) = struct.unpack('>h', input_length)
    return sys.stdin.buffer.read(size).decode().split(':')

def to_ejabberd(result):
    # The reply is a 2-byte length (always 2) followed by 0 or 1.
    if result:
        sys.stdout.buffer.write(b'\x00\x02\x00\x01')
    else:
        sys.stdout.buffer.write(b'\x00\x02\x00\x00')
    sys.stdout.buffer.flush()

def loop():
    switcher = {
        "auth": openAuth,
        "isuser": openIsuser,
        "setpass": lambda args: True,
        "tryregister": lambda args: False,
        "removeuser": lambda args: False,
        "removeuser3": lambda args: False,
    }
    while True:
        data = from_ejabberd()
        to_ejabberd(switcher.get(data[0], lambda args: False)(data[1:]))

if __name__ == "__main__":
    try:
        loop()
    except struct.error:
        pass
I didn't create the from_ejabberd() and to_ejabberd() communication functions myself, and unfortunately I can't find the original source anymore.
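Regarding the question about a JavaScript version: I don't know of an official one, but as an untested sketch, the same wire protocol (a 2-byte big-endian length prefix, colon-separated fields, and a 4-byte reply of 0x0002 followed by 0x0000 or 0x0001) could be implemented in Node.js like this. The auth/isUser stubs are placeholders you would replace with your own MySQL lookups:

#!/usr/bin/env node
// Untested sketch of an ejabberd extauth script in Node.js, mirroring the
// protocol of the Python script above. Replace the stubs with real checks
// against your users table.
function auth(user, server, password) { return true; }
function isUser(user, server) { return true; }

let buffer = Buffer.alloc(0);

process.stdin.on('data', (chunk) => {
  buffer = Buffer.concat([buffer, chunk]);
  // Handle as many complete requests as the buffer currently holds.
  while (buffer.length >= 2) {
    const size = buffer.readUInt16BE(0);
    if (buffer.length < 2 + size) break;
    const fields = buffer.slice(2, 2 + size).toString().split(':');
    buffer = buffer.slice(2 + size);

    const [op, ...args] = fields;
    let ok = false;
    if (op === 'auth') ok = auth(args[0], args[1], args[2]);
    if (op === 'isuser') ok = isUser(args[0], args[1]);

    // Reply: 2-byte length (always 2) followed by 0 or 1.
    process.stdout.write(Buffer.from([0x00, 0x02, 0x00, ok ? 0x01 : 0x00]));
  }
});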
Here is my code:
const mongodb = require('mongodb')
const MongoClient = mongodb.MongoClient

const url = 'mongodb://localhost:27017/edx-course-db'

MongoClient.connect(url, (err, db) => {
  console.log('Kudos. Connected successfully to server')
  db.close()
})
Error:
TypeError: Cannot read property 'close' of null
at MongoClient.connect (/home/akshay/nodeJs/node-edx/mongoDB/mongodb-script-project/server.js:11:6)
at args.push (/home/akshay/nodeJs/node-edx/mongoDB/mongodb-script-project/node_modules/mongodb/lib/utils.js:404:25)
at /home/akshay/nodeJs/node-edx/mongoDB/mongodb-script-project/node_modules/mongodb/lib/mongo_client.js:270:21
at connectCallback (/home/akshay/nodeJs/node-edx/mongoDB/mongodb-script-project/node_modules/mongodb/lib/mongo_client.js:935:5)
at /home/akshay/nodeJs/node-edx/mongoDB/mongodb-script-project/node_modules/mongodb/lib/mongo_client.js:784:11
at _combinedTickCallback (internal/process/next_tick.js:131:7)
at process._tickCallback (internal/process/next_tick.js:180:9)
You aren't ever checking the err object to see if it successfully connected. That callback will be invoked whether the connection succeeds or not; callbacks are simply invoked when the function completes its call, which does not imply success or failure.
Check the err object for more details, but I suspect mongod is not running on that port.
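For illustration, here is the connect call from the question with an err guard added (a minimal sketch against the callback-style 2.x driver API the question appears to use):

const MongoClient = require('mongodb').MongoClient

const url = 'mongodb://localhost:27017/edx-course-db'

MongoClient.connect(url, (err, db) => {
  if (err) {
    // Connection failed: db is null here, which is why db.close() threw.
    console.error('Failed to connect to server:', err.message)
    return
  }
  console.log('Kudos. Connected successfully to server')
  db.close()
})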
Try running the following commands to verify the mongod is running.
netstat -tulpn | grep mongod
mongo --eval "db.version()"
I'm using docker-compose and I'm running into issues with the execution order of services. The main issue happens when my Express app tries to connect to mongod before it is ready.
The issue can be reproduced easily by first running the Node.js application without mongod (manually forcing this case).
My app uses Mongoose and tries to establish a connection to mongod. Because mongod is not up and running, the app throws an error about it.
$ nodemon server/app.js
24 Apr 21:42:05 - [nodemon] v1.7.0
24 Apr 21:42:05 - [nodemon] to restart at any time, enter `rs`
24 Apr 21:42:05 - [nodemon] watching: *.*
24 Apr 21:42:05 - [nodemon] starting `node server/app.js`
Listening on port 8000
disconnected
connection error: { [MongoError: connect ECONNREFUSED] name: 'MongoError', message: 'connect ECONNREFUSED' }
Starting mongod later seems to trigger a reconnect:
24 Apr 21:51:28 - [nodemon] v1.7.0
24 Apr 21:51:28 - [nodemon] to restart at any time, enter `rs`
24 Apr 21:51:28 - [nodemon] watching: *.*
24 Apr 21:51:28 - [nodemon] starting `node server/app.js`
Listening on port 8000
disconnected
connection error: { [MongoError: connect ECONNREFUSED] name: 'MongoError', message: 'connect ECONNREFUSED' }
connected
reconnected
Despite that, operations that require access to Mongo will not come through... and no error is shown either.
This is the code to connect to mongo using mongoose:
// Starting mongo
mongoose.connect(config.database, {
  server: {
    auto_reconnect: true,
    reconnectTries: 10,
    reconnectInterval: 5000,
  }
});
// Listening for connection
var mongo = {};
var db = mongoose.connection;
db.on('connected', console.error.bind(console, 'connected'));
db.on('error', console.error.bind(console, 'connection error:'));
db.on('close', console.error.bind(console, 'connection close.'));
db.once('open', function() {
  console.log("We are alive");
});
db.on('reconnected', function(){
  console.error('reconnected');
});
db.on('disconnected', console.error.bind(console, 'disconnected'));
And here is the route that will try to get data from mongo but fail.
router.post('/auth', function(req, res){
  User.findOne({name: req.body.name})
    .then(function(user){
      if(!user)
      {
        res.status(401).send({ success: false, message: 'Authentication failed. User not found.' });
      }
      ...
How can I recover from running Node.js before Mongo is ready?
In my case, I created a separate function just for the mongoose connect call:
const connect = () => {
  mongoose.connect('mongodb://localhost:27017/myapp', {
    useNewUrlParser: true,
    reconnectTries: Number.MAX_VALUE,
    reconnectInterval: 500,
    poolSize: 10,
  });
};
I'm calling it at app start. I also added an event handler for the error event:
mongoose.connection.on('error', (e) => {
  console.log('[MongoDB] Something went super wrong!', e);
  setTimeout(() => {
    connect();
  }, 10000);
});
If Mongoose fails to connect because MongoDB is not running, the error event handler fires and setTimeout schedules a "custom" reconnect.
Hope it helps.
How long does it take before mongod is ready? This seems like an edge case: mongod might take a couple of seconds to get ready, and once Mongoose is connected it serves requests as expected. I'm just trying to understand why the slight delay (probably only a few seconds) needs resolving.
But here is a solution anyway:
You could set up an express middleware to check if mongoose is ready and throw an error if not:
app.use(function(req, res, next){
  if (mongoose.Connection.STATES.connected === mongoose.connection.readyState){
    next();
  } else {
    res.status(503).send({ success: false, message: 'DB not ready' });
  }
});
This should go before you inject your router.
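For instance, the wiring could look like this (a sketch; the router path is hypothetical, and the middleware body is the one shown above):

const express = require('express');
const mongoose = require('mongoose');

const app = express();

// Readiness check first, so every request is gated on the DB connection.
app.use(function (req, res, next) {
  if (mongoose.Connection.STATES.connected === mongoose.connection.readyState) {
    next();
  } else {
    res.status(503).send({ success: false, message: 'DB not ready' });
  }
});

// Then mount the router that contains the /auth route from the question.
const router = require('./routes/auth'); // hypothetical path
app.use('/', router);

app.listen(8000);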
I had the same issue with Mongoose 5+. I was able to get this working by creating a retry function using setTimeout.
const mongoose = require('mongoose');

const {
  MONGO_USERNAME,
  MONGO_PASSWORD,
  MONGO_HOSTNAME,
  MONGO_PORT,
  MONGO_DB,
  MONGO_DEBUG,
  MONGO_RECONNECT_TRIES,
  MONGO_RECONNECT_INTERVAL,
  MONGO_TIMEOUT_MS,
} = process.env;

if (MONGO_DEBUG) {
  console.log(`********* MongoDB DEBUG MODE *********`);
  mongoose.set('debug', true);
}

const DB_OPTIONS = {
  useNewUrlParser: true,
  reconnectTries: MONGO_RECONNECT_TRIES,
  reconnectInterval: MONGO_RECONNECT_INTERVAL,
  connectTimeoutMS: MONGO_TIMEOUT_MS,
};

const DB_URL = `mongodb://${MONGO_USERNAME}:${MONGO_PASSWORD}@${MONGO_HOSTNAME}:${MONGO_PORT}/${MONGO_DB}?authSource=admin`;

// Initialize connection retry counter
let reconnectTriesAlready = 1;

// Connect to database with timeout and retry
const connectWithRetry = () => {
  mongoose.connect(DB_URL, DB_OPTIONS).then(() => {
    // Connected successfully
    console.log('********* MongoDB connected successfully *********');
    // Reset retry counter
    reconnectTriesAlready = 1;
  }).catch(err => {
    // Connection failed
    console.error(`********* ERROR: MongoDB connection failed ${err} *********`);
    // Compare retries made already to maximum retry count
    if (reconnectTriesAlready <= DB_OPTIONS.reconnectTries) {
      // Increment retry counter
      reconnectTriesAlready = reconnectTriesAlready + 1;
      // Retries made so far have not exceeded the maximum retry count
      console.log(`********* MongoDB connection retry after ${MONGO_RECONNECT_INTERVAL / 1000} seconds *********`);
      // Connection retry
      setTimeout(connectWithRetry, MONGO_RECONNECT_INTERVAL);
    } else {
      // Retries made so far have exceeded the maximum retry count
      console.error(`********* ERROR: MongoDB maximum connection retry attempts have been made already ${DB_OPTIONS.reconnectTries} stopping *********`);
    }
  });
};

connectWithRetry();