Remotely read locale file - server

I want to read a file from a remote server and use it to set a locale variable on my server. That is, I want to fetch the file's contents from the remote server into a variable or buffer, and then use that value to set the locale of my CentOS server.

Related

Not able to import CSV file in MySQL Workbench through load data local infile query [duplicate]

I tried to upload a .txt file into MySQL Workbench, but I have the following issue:
Error Code: 3948. Loading local data is disabled; this must be enabled on both the client and server sides
Workbench uses a MySQL feature called LOAD DATA LOCAL for this .txt file import operation. Because that feature exposes some security problems on the server, it is disabled by default: the operator of the server needs to enable it by running the MySQL server software (mysqld) with the local_infile system variable turned on. Your error message means that flag is not enabled.
You can try enabling it at runtime before you do your upload operation. Try this SQL statement.
SET @@GLOBAL.local_infile = 1;
If that doesn't work, you will need to ask the person who runs your server to enable it.
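For example, here is a minimal sketch of the whole sequence, assuming a hypothetical table mytable and a hypothetical file path (the statement Workbench issues under the hood is roughly the LOAD DATA LOCAL INFILE shown here):
-- requires a privileged account (e.g. root)
SET @@GLOBAL.local_infile = 1;
-- hypothetical table and file path; adjust the delimiters to match your .txt file
LOAD DATA LOCAL INFILE 'C:/data/mytable.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';
Note that the error message mentions both sides: the client side may also need to allow the feature, which in Workbench is typically done by adding OPT_LOCAL_INFILE=1 to the connection's advanced options.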

Is there a way to keep Windows EFS encryption metadata in place when uploading a file to Linux?

I am trying to copy an EFS-encrypted zip file from Windows to a Linux server (through OpenSSH scp). It was encrypted using the PowerShell .Encrypt() method. Unfortunately, when I later download the file from the Linux server to a Windows machine, it can't be opened: Windows does not detect that it's EFS-encrypted and just regards it as an unreadable zip file.
I have exported the EFS key from the first computer and installed it on the computer that opens the file. When I instead move the file around on a USB key, it is successfully detected as an EFS-encrypted file and can be opened properly.
The PowerShell script that I'm trying to create should be invisible to the user. Another question is: could creating and mounting a VHDX file still be part of a script that doesn't interrupt the normal workflow of the user?

DB2 LOAD (client is remote | file is on the DB2 server)

I'm using the DB2 client on Windows to connect to a Linux DB2 server.
I'm trying to upload data using my client, but the data file is in the /tmp/ directory on the host server.
If I use LOAD FROM "/tmp/file.txt" OF .. it fails with the message SQL2036N The path for the file, named pipe, or device "/tmp/file.txt" is not valid.
Is it possible to do this without running db2 connect from the server itself?
Regards
Per comment thread: the solution was to ensure that the Db2-instance owner has read access to the file on the server.
When you use LOAD FROM, the specified file must reside on the Db2 server, and the Db2 instance owner (e.g. db2inst1) on the server must have read access to it. Double-check the permissions and ownership. If the file is on your workstation, use LOAD CLIENT FROM instead.
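As a sketch of the two variants, with a hypothetical target table and delimited text files (OF DEL assumes a delimited format; adjust to yours):
-- file resides on the Db2 server; the instance owner (e.g. db2inst1) must be able to read it
LOAD FROM "/tmp/file.txt" OF DEL INSERT INTO mytable
-- file resides on the client workstation; Db2 transfers it to the server for you
LOAD CLIENT FROM "C:\data\file.txt" OF DEL INSERT INTO mytable
A quick ls -l /tmp/file.txt on the server is enough to verify the read permissions for the first variant.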

Where is the DATA_PUMP_DIR in SQL Developer

I'm trying to import a .dmp file using the Data Pump Import tool in Oracle SQL Developer.
I'm connected to an Oracle database running in a container on my local machine.
When I get to the step where I specify where the dump file is to import, where should I place the .dmp file?
DATA_PUMP_DIR is a default Oracle directory object. It isn't part of SQL Developer; the import tool is really just giving you a GUI equivalent of running impdp from the command line.
You can find the operating system location that Oracle directory object points to by querying the data dictionary:
select directory_path from all_directories where directory_name = 'DATA_PUMP_DIR';
The path that query returns is on the database server (in your case that will be inside your container too), and your dump file needs to go there.
You might want to create additional directory objects pointing to other locations, and grant suitable privileges to users to be able to access them; but they all need to be on the DB server and read/writable by the Oracle process owner on that server.
(They could be remote filesystems mounted on the server, they don't necessarily have to be local storage, but that's another issue and more operating-system specific. Again, in your case, you might be able to share a folder on your local machine with the container, if you don't want to copy the file into the container.)
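If you do create an additional directory object, a minimal sketch looks like this, run as a privileged user such as SYSTEM, with a hypothetical path and grantee:
CREATE DIRECTORY my_dp_dir AS '/opt/oracle/dumpfiles';  -- path on the DB server, not your client
GRANT READ, WRITE ON DIRECTORY my_dp_dir TO scott;      -- hypothetical user running the import
Oracle does not validate the path when the directory object is created, so make sure it exists and is readable and writable by the Oracle process owner before pointing the import at it.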

Export Postgres table to csv

I am trying to export my Postgres table to a csv on my desktop and I get this error:
ERROR: could not open file "C:\Users\blah\Desktop\countyreport.csv" for writing: Permission denied
SQL state: 42501
This is my query, which I believe uses the correct syntax:
COPY countyreport TO 'C:\\Users\\blah\\Desktop\\countyreport.csv' DELIMITER ',' CSV HEADER;
According to the user manual:
Files named in a COPY command are read or written directly by the
server, not by the client application.
https://www.postgresql.org/docs/current/static/sql-copy.html
The common mistake is to believe that filesystem access will happen as the (client) user, but it doesn't. It is normal to run the PostgreSQL server as its own OS user, usually postgres, so actions carried out by the server are performed as a different OS user than the client.
Assuming that you are running the server on your local machine, the simplest fix is to give the postgres user access to your home directory or desktop. This can be done by changing the Windows security settings on those folders.
Before you do this... stop and think. Is this what you are looking for? If the server is in development, will it always run on the user's machine? If not, you may need to have COPY write to STDOUT so that the client receives the data. See the manual for information on this.
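If the file should be written as the client user instead, a sketch using psql's client-side \copy meta-command works (it accepts the same options as COPY, but the file is opened by psql on your machine, so no server-side permissions are involved):
\copy countyreport TO 'C:\Users\blah\Desktop\countyreport.csv' DELIMITER ',' CSV HEADER
Because \copy reads the path itself rather than as an SQL string literal, single backslashes in the Windows path are fine here.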