Remote desktop sessions - Track/get the current location of user - PowerShell

I have been playing around with PSTerminalServices for a while, and I was wondering whether it is possible to track, or get, the current location (working directory) of a user who is holding a remote desktop session, e.g. what folders and files the user has open. Is this possible to achieve with scripting such as PowerShell?
OS: Windows Server 2008 R2
Best regards.
DA.

I don't know how to do it with PowerShell, but you can use psfile (part of the excellent PsTools suite from Mark Russinovich):
psfile.exe [\\RemoteComputer [-u Username [-p Password]]] [[Id | path] [-c]]
-u      Specifies optional user name for login to remote computer.
-p      Specifies password for user name.
Id      Id of file to print information for or close.
Path    Full or partial path of files to match.
-c      Closes file identified by file Id.
Omitting a file identifier has PsFile list all files opened remotely.
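For example, following the usage above (SERVER01, the account, and the path are placeholders; the same commands can be run from a PowerShell prompt):
# list all files opened remotely on SERVER01
psfile.exe \\SERVER01 -u administrator -p MyPassword
# list only files whose path matches a partial path
psfile.exe \\SERVER01 -u administrator -p MyPassword C:\Shares\Projects
# close the file with Id 48 (careful: unsaved changes are lost)
psfile.exe \\SERVER01 -u administrator -p MyPassword 48 -c
This shows which remotely opened files and folders each session holds, which is about as close to "where the user currently is" as you can get this way.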

Using p4 zip and unzip to export files from one perforce server to another

I was trying to export files along with their revision history inside my depot folder from a 2015.2 to a 2019 Perforce server. I would also want Perforce to create new users on my new server corresponding to the committers/submitters on my original 2015 repo.
Perforce replication looked like overkill for my current task, and then I came across a read on Perforce's website that mentioned p4 zip.
This looked like it would solve my problem, but the article has a few issues I could not understand.
Let's say I am moving data from server1_ip:port --> server2_ip:port
I am currently following these steps
Making a zip of the folder to be copied: I create a remote spec with
p4 remote my_remote_spec, setting
Address: server1_ip:port
DepotMap: //depot/... //depot2/...
and then run
p4 -p server1_ip:port zip -o test.zip -r my_remote_spec -A //depot/...
But on this step I get a permission denied error. This is weird to me because the user, although not super/admin, has access to the files I ask to get zipped.
Also, when I tried with a super user, I could not find test.zip even though I was not shown any errors.
Isn't the above command supposed to generate a zip file inside the directory I run it from?
Is the unzip command supposed to be run after a p4 login as a user of the second server?
Lastly, why does the document mention a third port, 1667, in the transfer of files between servers running on 1666 and 1777?
On this step I get a permission denied error. This is weird to me because the user, although not super/admin, has access to the files I ask to get zipped.
This is expected:
C:\Perforce\test>p4 help zip
zip -- Package a set of files and their history for use by p4 unzip
...
The zip command requires super permission granted by p4 protect.
Isn't the above command supposed to generate a zip file inside the directory I run it from?
Similar to p4 admin checkpoint, the zip file is written to the server machine (relative to the server root, if you don't specify an absolute path), rather than being transferred to the local client directory. This is not explicitly stated in the documentation (which seems like an oversight), but if you look in the root directory of the server where you ran the zip, you should find your test.zip there.
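For instance, reusing the command from the question (run by a user with super; P4ROOT below stands for whatever root directory the first server was started with):
p4 -p server1_ip:port zip -o test.zip -r my_remote_spec -A //depot/...
# on the server1 machine, not on the client:
ls $P4ROOT/test.zip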
Is the unzip command supposed to be run after a p4 login as a user of the second server?
Yes, any time you run a command against a particular server, you will need to be logged in to that server. In the case of p4 unzip you will need at least admin permission on the second server.
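So the import side would look roughly like this (a sketch; if the flags differ on your server version, p4 help unzip will show the exact syntax):
p4 -p server2_ip:port login
p4 -p server2_ip:port unzip -i test.zip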
Lastly, why does the document mention a third port, 1667, in the transfer of files between servers running on 1666 and 1777?
I'm pretty sure that's a typo; whoever wrote the article started off using ports 1666 and 1777, changed their mind halfway through, and didn't proofread. :)

.pgpass file for more than one user

If I set PGPASSFILE to an explicit path like /home/user/.pgpass then it works fine, and when logged in as the user that owns that file I can use psql with the entries in .pgpass.conf.
The problem I have is that I need multiple accounts to use psql. If I change PGPASSFILE to a per-user path like ~/.pgpass.conf, the file isn't read and I get a password error.
Because I can only specify one file, only the owner of that file can run the commands I need to run.
I am running on Ubuntu 18.04 and I need root & www-data to have a .pgpass.conf file.
How do I do this?
If you have system users corresponding to your DB users (root and www-data in your case), give each its own separate .pgpass file in its respective home directory and set each up accordingly.
Then simply do not set PGPASSFILE at all. The manual:
PGPASSFILE behaves the same as the passfile connection parameter.
And:
passfile
Specifies the name of the file used to store passwords (see Section 33.15). Defaults to ~/.pgpass, or %APPDATA%\postgresql\pgpass.conf on Microsoft Windows. (No error is reported if this file does not exist.)
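For example, a minimal sketch (database name, role names, and passwords are placeholders; on Ubuntu, www-data's home directory is /var/www):
/root/.pgpass:
localhost:5432:mydb:root:root_db_password
/var/www/.pgpass (owned by www-data):
localhost:5432:mydb:www-data:www_db_password
Each file must be readable only by its owner, or libpq will ignore it:
chown www-data:www-data /var/www/.pgpass
chmod 600 /root/.pgpass /var/www/.pgpass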
Related:
Run batch file with psql command without password

Owncloud Calendar ICS Backup

I wanted to have a regular backup of my Owncloud calendars as ICS files, in case the server runs into a problem that I don't have time to fix right away. For this purpose I wrote a little script, which can be run as a cronjob.
Any feedback, improvements, alterations are welcome!
I have been using this script for quite a while. It was a big help in having a backup for calendars and contacts from my ownCloud installation. Thanks!
However, one thing really bugged me about the script from envyrus: new calendars/addressbooks need to be shared manually with the "backup user" whose calendars will be backed up. This made the script basically useless for me, because my wife creates and deletes her calendars and task lists quite often.
There is a script which can automatically deal with additionally created/deleted calendars, since it fetches all data from the database and not via HTTP request (like the script from envyrus). It simply creates a backup of every single calendar/addressbook existing in the database. Giving a username/password combination is not necessary when using this script, there is no need to share the calendars to be backed up with a certain user, and, last but not least, the script doesn't require root privileges.
From the script's README:
This Bash script exports calendars and addressbooks from
ownCloud/Nextcloud to .ics and .vcf files and saves them to a
compressed file. Additional options are available.
Starting with version 0.8.0, there is no need anymore for a file with
user credentials because all data is fetched directly from the
database. If only calendars/addressbooks of certain users shall be
backed up, list them in users.txt without any passwords.
Maybe this is also a help for others: calcardbackup
DISCLAIMER: I created this script for a little Owncloud instance that I run for myself and 1-2 other friends - it is not meant for any "serious business", so to speak. I used the scripts from this and this site as a starting point - thank you!
To create ICS backups of all the user calendars, I created an Owncloud user called "calendarBackup", with whom the other users can share their calendars. I wrote a little script that loops through all those calendars and downloads the ICS files. They are then put into a shared folder owned by calendarBackup, and the backup is distributed across users. (An easy adjustment could be made so that each user gets his own calendar files.)
The advantage of this approach is that the script doesn't need to know all the user passwords.
Here is the code:
#!/bin/bash
#owncloud login data for calendar backup user
OCuser=owncloudUserName
OCpassword="owncloudUserPassword"
OCpath="/var/www/owncloud/"
OCbaseURL="https://localhost/owncloud/"
OCdatabase="owncloudDatabaseName"
#destination folder for calendar backups
dest="/var/www/owncloud/data/owncloudUserName/files/Backup/"
#mysql user data with access to owncloud database
MSQLuser=owncloudMysqlUser
MSQLpassword="owncloudMysqlUserPassword"
#timestamp used as backup name
timeStamp=$(date +%Y%m%d%H%M%S)
archivePassword="passwordForArchivedCalendars"
#apache user and group
apacheUser="apacheUser"
apacheGroup="apacheGroup"
#create folder for new backup files
mkdir "$dest$timeStamp"
#create array of calendar names from Owncloud database query
calendars=($(mysql -B -N -u $MSQLuser -p$MSQLpassword -e "SELECT uri FROM $OCdatabase.oc_calendars"))
calendarCount=${#calendars[@]}
#create array of calendar owners from Owncloud database query
owners=($(mysql -B -N -u $MSQLuser -p$MSQLpassword -e "SELECT principaluri FROM $OCdatabase.oc_calendars"))
loopCount=0
#loop through all calendars
while [ $loopCount -lt $calendarCount ]
do
#see if owner starts with "principals/users/"
#(this part of the script assumes that principaluri for normal users looks like this: principals/users/USERNAME )
if [ "${owners[$loopCount]:0:17}" = "principals/users/" ]
then
#concatenate download url
url=$OCbaseURL"remote.php/dav/calendars/$OCuser/${calendars[$loopCount]}_shared_by_${owners[$loopCount]:17}?export"
#echo $url
#download the ics files (if download fails, delete file)
wget \
--output-document="$dest$timeStamp/${owners[$loopCount]:17}${calendars[$loopCount]}.ics" \
--no-check-certificate --auth-no-challenge \
--http-user=$OCuser --http-password="$OCpassword" \
"$url" || rm "$dest$timeStamp/${owners[$loopCount]:17}${calendars[$loopCount]}.ics"
#echo ${owners[$loopCount]:17}
fi
#echo "${calendars[$loopCount]} ${owners[$loopCount]}"
loopCount=$(($loopCount + 1))
done
#zip backed up ics files and remove the folder (this could easily be left out, change the chown command though)
zip -r -m -j -P $archivePassword "$dest$timeStamp" "$dest$timeStamp"
rm -R "$dest$timeStamp"
#chown needed so owncloud can access backup file
chown $apacheUser:$apacheGroup "$dest$timeStamp.zip"
#update owncloud database of calendar backup user
sudo -u $apacheUser php "$OCpath"occ files:scan $OCuser
A few notes on the script:
It was written for bash on Debian.
It works for Owncloud 9.1 with MySQL.
It assumes the download URL for a shared calendar looks like this:
OwncloudURL/remote.php/dav/calendars/LoggedInOwncloudUser/CalendarName_shared_by_CalendarOwner?export
To check for the correct URL, simply download a shared calendar in the web interface and check the download URL.
It assumes that the calendar names are stored in the column "uri" of the table "oc_calendars".
It assumes that the calendar owner is stored in the column "principaluri" of the table "oc_calendars" and that all normal users are prefixed with "principals/users/".
It needs sudo permission to update Owncloud file structure.
It needs zip to be installed.
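Since the script is meant to run as a cronjob, an entry along these lines would do (script path, schedule, and log file are placeholders; it goes in root's crontab because of the sudo/chown steps):
# run the calendar backup every night at 03:00
0 3 * * * /root/scripts/owncloud-calendar-backup.sh >> /var/log/calendar-backup.log 2>&1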

Postgres ERROR: could not open file for reading: Permission denied

Computer: Mac OS X, version 10.8
Database: Postgres
Trying to import a CSV file into Postgres.
pg> copy items_ordered from '/users/darchcruise/desktop/items_ordered.csv' with CSV;
ERROR: could not open file "/users/darchcruise/desktop/items_ordered.csv" for reading: Permission denied
Then I tried
$> chown postgres /users/darchcruise/desktop/items_ordered.csv
chown: /users/darchcruise/desktop/items_ordered.csv: Operation not permitted
Lastly, I tried
$> ls -l
-rw-r--r-- 1 darchcruise staff 1016 Oct 18 21:04 items_ordered.csv
Any help is much appreciated!
Assuming the psql command-line tool, you may use \copy instead of copy.
\copy opens the file and feeds the contents to the server, whereas copy tells the server to open the file itself and read it, which may be problematic permission-wise, or even impossible if client and server run on different machines with no file sharing in between.
Under the hood, \copy is implemented as COPY FROM STDIN and accepts the same options as the server-side COPY.
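For example, with the table and path from the question, run inside psql:
\copy items_ordered from '/users/darchcruise/desktop/items_ordered.csv' with csv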
Copy the CSV file to /tmp
For me this solved the issue.
chmod a+rX /users/darchcruise/ /users/darchcruise/desktop /users/darchcruise/desktop/items_ordered.csv
This will change access rights for your folder. Note that everyone will be able to read your file.
You can't use chown as a user without administrative rights.
Also consider learning umask to ease creation of shared files.
Copy your CSV file into the /tmp folder
Files named in a COPY command are read or written directly by the server, not by the client application. Therefore, they must reside on or be accessible to the database server machine, not the client. They must be accessible to and readable or writable by the PostgreSQL user (the user ID the server runs as), not the client. COPY naming a file is only allowed to database superusers, since it allows reading or writing any file that the server has privileges to access.
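So a workaround in the spirit of this answer (paths taken from the question) is to put a world-readable copy where the server can reach it:
$> cp /users/darchcruise/desktop/items_ordered.csv /tmp/
$> chmod a+r /tmp/items_ordered.csv
pg> copy items_ordered from '/tmp/items_ordered.csv' with CSV;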
I had the issue when I was trying to export data from a remote server onto the local disk. I hadn't realised that SQL COPY is actually executed on the server and tries to write to a folder on the server. The correct thing to do was to use \copy, which is the psql command that writes to the local file system as I expected. http://www.postgresql.org/message-id/CAFjNrYsE4Za_KWzmfgN1_-MG7GTw_vpMRxPk=OEjAiLqLskxdA#mail.gmail.com
Perhaps that might be useful to someone else too.
Another way to do this, if you have pgAdmin and are comfortable using the GUI, is to go to the table in the schema, right-click on the table you wish to import the file into, and select "Import". Browse your computer for the file, select the file type and the columns you want the data to be imported into, and then select Import.
That was done using pgAdmin III and the 9.4 version of PostgreSQL
I resolved the same issue with a recursive chown on the parent folder:
sudo chown -R postgres:postgres /home/my_user/export_folder
(my export being in /home/my_user/export_folder/export_1.csv)
On a MacBook, I first opened Terminal and typed
open /tmp
or, in Finder, press Command+Shift+G and enter /tmp in "Go to the folder".
This opens the tmp folder in Finder. I then pasted the copied CSV file into this folder, went back to the psql terminal, and typed the command below, which copied my CSV data into the DB table:
\copy recharge_operator FROM '/private/tmp/operator.csv' DELIMITER ',' CSV;
COPY your_table (Name, Latitude, Longitude) FROM 'C:\Temp\your file.csv' DELIMITERS ',' CSV HEADER;
Put the file under C:\Temp\ and use that path.
For me it worked to simply to add sudo (or run as root) for the chown command:
sudo chown postgres /users/darchcruise/desktop/items_ordered.csv
You must grant the pg_read_server_files role to the user if you are not using the postgres superuser.
Example:
GRANT pg_read_server_files TO my_user WITH ADMIN OPTION;
Just in case you're facing this problem under Windows 10: add the group "yourcomputer\Users" on the Security tab and grant it full control; that solved my issue.
I had the same error message but was using psycopg2 to communicate with PostgreSQL. I fixed the permission issues by using the functions copy_from and copy_expert, which open the file on the client side as the user running the Python script and feed the data to the database over STDIN.
Refer to this link for further information.
This answer is only for Linux Beginners.
Assuming initially the DB user didn't have file/folder(directory) permission on the client side.
Let's constrain ourselves to the following:
User: postgres
Purpose: You wanted to (write to / read from) a specific folder
Tool: psql
Connected to a specific database: YES
FILE_PATH: /home/user/training/sql/csv_example.csv
Query: \copy (SELECT * FROM table_name) TO 'FILE_PATH' DELIMITER ',' CSV HEADER
Actual result: after running the query you got the error: Permission Denied
Expected result: COPY COUNT_OF_ROWS_COPIED
Here are the steps I'd follow to try and resolve it.
Confirm the FILE_PATH permissions on your File system.
Inside a terminal, to view the permissions for a file/folder you need to long-list it with the command ls -l.
The output has a section that looks something like this: drwxrwxr-x
This is interpreted in the following way:
TYPE | OWNER RIGHTS | GROUP RIGHTS | OTHERS' RIGHTS
rwx (r: read, w: write, x: execute)
TYPE (1 char) = d: directory, -: file
OWNER RIGHTS (3 chars after TYPE)
GROUP RIGHTS (3 chars after OWNER)
OTHERS' RIGHTS (3 chars after GROUP)
If the permissions are not enough, make sure the postgres user can at least enter (x) every folder in the path.
This means that for FILE_PATH, all the directories (home, user, training, sql) should have at least an x in the OTHERS' RIGHTS.
Change permissions on every parent folder that needs to be entered so that it has an x. You can use chmod rights_you_want parent_folder.
Assuming /training/ didn't have execute permission, I'd go to the user folder and enter chmod a+x training.
Change the destination folder/directory to have a w if you want to write to it, or at least an r if you want to read from it.
Assuming /sql didn't have write permission, I would now run chmod a+w sql.
Restart the PostgreSQL server: sudo systemctl restart postgresql
Try again.
This should most probably get you the expected result.
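A quick way to check every directory in the path at once is namei (part of util-linux on most distributions), which prints each path component with its permissions so you can spot the one missing the x bit:
namei -l /home/user/training/sql/csv_example.csv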
On Linux you can fix this by giving the postgres user read/write/execute permissions on the target directory. Eg:
setfacl -m u:postgres:rwx /home/hi
I just copied the source CSV file to a folder with more open permissions (C:/temp), and it worked fine.
Maybe you are using pgAdmin connected to a remote host and trying to load the file from your own system, but the server looks for that file in the remote system's file system. That's the error I faced; maybe it applies to you too, so check it.

Gathering Files from multiple computers into one

I am trying to gather files/folders from multiple computers on my network into one centralized folder on the command console (this is the name of the pseudo-server for this set of computers).
Basically, what I need is to collect a certain file from all the computers connected to my network and back it up on the console.
Example:
* data.txt // this is the file that I need to back up; it is located in the same location on all the computers
* \\console\users\administrator\desktop\backup\%computername% // I need each computer to create a folder with its computer name on the command console's desktop so I can keep track of which file belongs to which computer
I was trying to use psexec to do this using the following code:
psexec #cart.txt -u administrator -p <password> cmd /c (^net use \\console /USER:administrator <password> ^& mkdir \\console\users\Administrator\Desktop\backup\%computername% ^& copy c:\data.txt \\console\USERS\Administrator\DESKTOP\backup\%computername%\)
Any other suggestions? I'm having trouble with this command.
Just use the copy command; it's much easier.
Take a look:
for /F %%a in (computerslist.txt) do (
    rem create the destination folder for this computer, then pull its files
    if not exist c:\mycollecteddata\%%a mkdir c:\mycollecteddata\%%a
    copy \\%%a\c$\users\administrator\desktop\%%a\*.txt c:\mycollecteddata\%%a
)
That will copy all *.txt files from every computer listed in computerslist.txt; the copy runs with the current credentials. Save the code in a *.cmd file and execute it as the right user; you can create a scheduled task to run it as a user that is common to all computers.
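For example, a task could be registered along these lines (task name, script path, schedule, and account are placeholders; /RP * prompts for the password):
schtasks /Create /TN "CollectData" /TR "C:\scripts\collectdata.cmd" /SC DAILY /ST 02:00 /RU yourdomain\backupuser /RP *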
Good work.