I'm trying to set up FTP in VS Code using the sftp extension, but I can't get it to connect; it times out every time. What puzzles me is that my SFTP config from Sublime Text has always worked on this specific server (an FTPS connection on port 990). I've never had a single issue.
The config that has been working in Sublime Text:
"type": "ftps",
"save_before_upload": true,
"upload_on_save": true,
"host": "ftpweb.user.srv",
"user": "ftpweb.user.srv|user1",
"password": "redacted",
"port": "990",
"remote_path": "/",
"connect_timeout": 5,
"keepalive": 120,
"ftp_passive_mode": true,
"ftp_obey_passive_host": true,
"allow_config_upload": true,
The new config for VS Code that is timing out:
{
  "name": "test",
  "host": "ftpweb.user.srv",
  "protocol": "ftp",
  "secure": true,
  "secureOptions": {
    "rejectUnauthorized": false
  },
  "port": 990,
  "username": "ftpweb.user.srv|user1",
  "password": "redacted",
  "remotepath": "/",
  "uploadOnSave": true
}
What am I missing in VS Code? I know the extension doesn't truly support FTPS, but I thought there would be a workaround using ftp as the protocol; I just can't seem to find one. Has anyone been able to connect to an FTPS server on port 990 in VS Code?
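One thing worth checking, as an assumption rather than a confirmed fix: port 990 is implicit FTPS, while "secure": true in this kind of config is typically negotiated as explicit FTPS (AUTH TLS) on a plain connection, which would show up exactly as a timeout. If the extension forwards the option to the underlying Node FTP client, which accepts "implicit" for an implicitly encrypted control connection, a config along these lines might work (the "implicit" value is the thing to verify against the extension's documentation):

{
  "name": "test",
  "host": "ftpweb.user.srv",
  "protocol": "ftp",
  "secure": "implicit", /* assumption: forwarded to the FTP client for implicit TLS on port 990 */
  "secureOptions": {
    "rejectUnauthorized": false
  },
  "port": 990,
  "username": "ftpweb.user.srv|user1",
  "password": "redacted",
  "remotePath": "/",
  "uploadOnSave": true
}

Note also that the extension documents the key as remotePath (capital P), not remotepath as in the config above.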
Related
I am trying to connect to Postgres 13 from LoopBack 4, but it's showing this error:
Cannot migrate database schema Error: Timeout in connecting after 5000 ms
at Timeout._onTimeout (/Users/mohammad/Work/prithu/prithuPlatform/node_modules/loopback-datasource-juggler/lib/datasource.js:2654:10)
at listOnTimeout (internal/timers.js:554:17)
at processTimers (internal/timers.js:497:7)
I tried connecting with pgAdmin and it connects fine, and I also tried connecting with Express and that works too. It was connecting on my earlier system, but after changing systems and installing Postgres 13 it's not working.
Here is a snapshot of app-db.datasource.config.json:
{
  "name": "app_db",
  "connector": "postgresql",
  "debug": true,
  "host": "localhost",
  "port": 5432,
  "user": "adminuser",
  "password": "password123",
  "database": "testdb",
  "ssl": false
}
Finally I figured out what the problem was. It was because I was using Node 14; there was no problem with the database. I then installed Node 12 and it's working now.
I faced the same issue.
I was putting the wrong password for the postgres user in the datasource config, so check that.
{
  "name": "app_db",
  "connector": "postgresql",
  "debug": true,
  "host": "localhost",
  "port": 5432,
  "user": "adminuser",
  "password": "password123", /* maybe this password is wrong */
  "database": "testdb",
  "ssl": false
}
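A quick way to rule credentials in or out is to connect with the pg driver directly, outside of LoopBack. This is only a sketch using the values from the datasource config above; it assumes the pg package is installed (npm i pg):

// check-connection.js -- standalone sanity check of the Postgres credentials
const { Client } = require('pg');

const client = new Client({
  host: 'localhost',
  port: 5432,
  user: 'adminuser',
  password: 'password123',
  database: 'testdb'
});

client.connect()
  .then(() => client.query('SELECT 1'))
  .then(() => {
    console.log('credentials OK');
    return client.end();
  })
  .catch((err) => {
    console.error('connection failed:', err.message);
    process.exit(1);
  });

If this script connects but LoopBack still times out, the Node 14 explanation in the answer above becomes the more likely culprit.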
I'm using Visual Studio Code with the ftp-kr plugin, and I have a question about its ignore option. I currently have this configuration:
{
  "host": "x.x.x.x",
  "username": "foo",
  "password": "foofoo",
  "remotePath": "/public_html/test",
  "protocol": "ftp",
  "port": 0,
  "fileNameEncoding": "utf8",
  "autoUpload": true,
  "autoDelete": true,
  "autoDownload": false,
  "ignoreRemoteModification": true,
  "ignore": [
    ".git",
    "/.vscode"
  ]
}
As you can see, I have set the following directories to be ignored:
.git
/.vscode
Is it possible to ignore the JavaScript files that do not end with .min.js, i.e. all the files that end with just .js?
Kind regards.
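A possible approach, with a big caveat: this assumes ftp-kr's ignore list accepts gitignore-style glob patterns including ! negation, which is not confirmed here and should be checked against the extension's documentation. Under that assumption, the idea would be to ignore every .js file and then re-include the minified ones:

"ignore": [
  ".git",
  "/.vscode",
  "**/*.js",      /* assumption: glob patterns are honored here */
  "!**/*.min.js"  /* assumption: ! negation re-includes the minified files */
]

If negation turns out not to be supported, the usual fallback is to keep the plain .js sources in a directory that can be ignored as a whole.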
I have this sftp.json file:
{
  "name": "projectname",
  "remotePath": "/htdocs/",
  "host": "myhostdomain",
  "protocol": "ftp",
  "username": "user",
  "password": "pass",
  "passive": true,
  "watcher": {
    "files": "**/*",
    "autoUpload": true,
    "ignore": [
      ".vscode",
      ".git",
      ".gitignore",
      "config.php"
    ]
  }
}
It really does ignore the .vscode folder and the config.php file, but not .git and .gitignore.
How can I fix this? The changes inside the .git folder generate massive network usage.
It's been three years since this question and I'm not sure if anyone still needs this answer.
Since I also fell into the same trap, I'm leaving a possible solution here hoping it helps others. :)
According to the project's wiki page on GitHub, the developer has updated the setting field name, so your sftp settings need to be changed to:
{
  "name": "projectname",
  "remotePath": "/htdocs/",
  "host": "myhostdomain",
  "protocol": "ftp",
  "username": "user",
  "password": "pass",
  "passive": true,
  "watcher": {
    "files": "**/*",
    "autoUpload": true
  },
  "remoteExplorer": {
    "filesExclude": [
      ".vscode",
      ".git",
      ".gitignore",
      "config.php"
    ]
  }
}
I have tried this, and it works for me.
The following configuration is valid for me:
{
  "name": "My Server",
  "uploadOnSave": true,
  "ignore": [
    ".git",
    ".gitignore"
  ]
}
restraunt.json file
{
  "name": "restraunt",
  "base": "PersistedModel",
  "idInjection": true,
  "options": {
    "validateUpsert": true
  },
  "properties": {
    "name": {
      "type": "string",
      "required": true
    },
    "location": {
      "type": "string",
      "required": true
    }
  },
  "validations": [],
  "relations": {},
  "acls": [],
  "methods": {}
}
restraunt.js file
module.exports = function(Restraunt) {
  Restraunt.find({ where: { id: 1 } }, function(data) {
    console.log(data);
  });
};
model-config.json file
`"restraunt": {
"dataSource": "restrauntManagement"
}`
datasources.json file
{
  "db": {
    "name": "db",
    "connector": "memory"
  },
  "restrauntManagement": {
    "host": "localhost",
    "port": 0,
    "url": "",
    "database": "restraunt-management",
    "password": "restraunt-management",
    "name": "restrauntManagement",
    "user": "rohit",
    "connector": "mysql"
  }
}
I am able to GET, PUT and POST from the explorer, which means the SQL database has been set up properly, but I am not able to call find from the restraunt.js file. It throws this error:
"Error: Cannot call restraunt.find(). The find method has not been setup. The PersistedModel has not been correctly attached to a DataSource"
Besides executing code in the boot folder, there is also the possibility of using the event emitted after the model is attached.
You can write your code right in the model's .js file instead of the boot folder.
It looks like:
Model.once("attached", function () {})
where Model is your model class (Accounts, for example).
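Applied to the restraunt model from the question, that could look roughly like this. It's only a sketch (not tested against this project, and it assumes the default common/models layout); note that the find callback receives (err, data):

// common/models/restraunt.js -- run the query only once the model has been attached
module.exports = function(Restraunt) {
  Restraunt.once('attached', function() {
    Restraunt.find({ where: { id: 1 } }, function(err, data) {
      if (err) return console.error(err);
      console.log(data);
    });
  });
};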
I know this is an old topic, but maybe this helps someone else.
Try installing the mysql connector again:
npm i -S loopback-connector-mysql
Take a look at your datasources.json: MySQL's port might be wrong (the default is 3306), and you could also try changing localhost to 0.0.0.0.
"restrauntManagement": {
"host": "localhost", /* if you're using docker, you need to set it to 0.0.0.0 instead of localhost */
"port": 0, /* default port is 3306 */
"url": "",
"database": "restraunt-management",
"password": "restraunt-management",
"name": "restrauntManagement",
"user": "rohit",
"connector": "mysql"
}
model-config.json must be:
"restraunt": {
"dataSource": "restrauntManagement" /* this name must be the same name in datasources object key (in your case it is restrauntManagement not the connector name which is mysql) */
}
You also need to run the migration for the restraunt model:
Create migration.js in /server/boot and add this:
'use strict';
module.exports = function(server) {
  // the datasource name must match the key in datasources.json ("restrauntManagement" here, not "mysql")
  var ds = server.dataSources.restrauntManagement;
  ds.autoupdate('restraunt');
};
You need to migrate every single model you'll use, and you also need to migrate the default models (ACL, AccessToken, etc.) if you're going to attach them to a datasource.
Also, the docs say you can't perform any operation inside the model's .js file, because at that point the system is not fully loaded. Any operation you need to execute must be inside a .js file in the /boot directory, because the system is completely loaded there; you can also perform operations inside remote methods, since the system is loaded by then. A boot-script version of the query from the question is sketched below.
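For completeness, this is roughly what that boot script could look like. It's a sketch, not a verified part of this project, and it assumes the model is registered under the name restraunt in model-config.json as shown in the question:

// server/boot/find-restraunt.js -- query the model after the app has fully booted
module.exports = function(server) {
  var Restraunt = server.models.restraunt;
  Restraunt.find({ where: { id: 1 } }, function(err, data) {
    if (err) return console.error(err);
    console.log(data);
  });
};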
I am trying to get Sensu working.
The following is the sensu-client.log:
ubuntu#ip:~$ sudo tail -f /var/log/sensu/sensu-client.log
{"timestamp":"2016-09-27T16:07:37.628182-0400","level":"info","message":"completing checks in progress","checks_in_progress":[]}
{"timestamp":"2016-09-27T16:07:38.128912-0400","level":"info","message":"closing client tcp and udp sockets"}
{"timestamp":"2016-09-27T16:07:38.129275-0400","level":"warn","message":"stopping reactor"}
{"timestamp":"2016-09-27T16:07:39.224377-0400","level":"warn","message":"loading config file","file":"/etc/sensu/config.json"}
{"timestamp":"2016-09-27T16:07:39.224487-0400","level":"warn","message":"loading config files from directory","directory":"/etc/sensu/conf.d"}
{"timestamp":"2016-09-27T16:07:39.224528-0400","level":"warn","message":"loading config file","file":"/etc/sensu/conf.d/check_mem.json"}
{"timestamp":"2016-09-27T16:07:39.224573-0400","level":"warn","message":"config file applied changes","file":"/etc/sensu/conf.d/check_mem.json","changes":{}}
{"timestamp":"2016-09-27T16:07:39.224618-0400","level":"warn","message":"applied sensu client overrides","client":{"name":"localhost","address":"127.0.0.1","subscriptions":["test","client:localhost"]}}
{"timestamp":"2016-09-27T16:07:39.230963-0400","level":"warn","message":"loading extension files from directory","directory":"/etc/sensu/extensions"}
{"timestamp":"2016-09-27T16:07:39.231048-0400","level":"info","message":"configuring sensu spawn","settings":{"limit":12}}
/etc/sensu/client.json contains
{
  "rabbitmq": {
    "host": "ipaddressofsensuserver",
    "port": 5672,
    "user": "username",
    "password": "password",
    "vhost": "/sensu"
  },
  "api": {
    "host": "localhost",
    "port": 4567
  },
  "checks": {
    "test": {
      "command": "echo -n OK",
      "subscribers": [
        "test"
      ],
      "interval": 60
    },
    "memory-percentage": {
      "command": "check-memory-percent.sh -w 50 -c 70",
      "interval": 10,
      "subscribers": [
        "test"
      ]
    }
  },
  "client": {
    "name": "localhost",
    "address": "127.0.0.1",
    "subscriptions": [
      "test"
    ]
  }
}
I have copied check-memory-percent.sh into the /etc/sensu/conf.d folder.
I was expecting to see check-memory-percent run every 10 seconds in the log file. What am I missing here?
The Sensu client cannot operate entirely independently of the server, but it can schedule its own checks and have the results sent to the server through the transport (RabbitMQ in this case). You'll have to add "standalone": true to the check configuration for this to take effect, and then restart the sensu-client service.
So, the file /etc/sensu/conf.d/check_mem.json should look something like:
"checks": {
"memory-percentage": {
"command": "/etc/sensu/conf.d/check-memory-percent.sh -w 50 -c 70",
"interval": 10,
"standalone": true
}
}
Remember to remove the block from /etc/sensu/client.json as well, as you may get unexpected results if you have the same check name defined multiple times.
In client.json, under "client", you need to add the subscriptions, as in the sketch below; they should match the "subscribers" list defined for your check.
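For reference, a client.json shaped like the one in the question, with the checks block moved out to conf.d as suggested above and a subscription matching the check's "test" subscriber, would look roughly like this (a sketch assembled from the values already shown in the question):

{
  "rabbitmq": {
    "host": "ipaddressofsensuserver",
    "port": 5672,
    "user": "username",
    "password": "password",
    "vhost": "/sensu"
  },
  "api": {
    "host": "localhost",
    "port": 4567
  },
  "client": {
    "name": "localhost",
    "address": "127.0.0.1",
    "subscriptions": [
      "test"
    ]
  }
}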