What is PowerShell command output default location?

I am following this official Azure tutorial. When I run the following PowerShell command (mentioned in the "Create a project ZIP file" section of the tutorial), it runs successfully, but I don't know where the zip file created by the command is located.
Compress-Archive -Path * -DestinationPath myAppFiles.zip
I don't see the file in the following location either: C:\Windows\System32\WindowsPowerShell\v1.0. What is the default output location for a PowerShell command?

The default output location in this case is your current working directory. Running that cmdlet as posted will compress everything in the current directory into $CurrentDirectory\myAppFiles.zip.
Note that since you're not specifying a full path, you'll want to Set-Location to the directory that contains the items you're trying to compress. When you do that, the .zip file will end up in that directory.
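For example, a minimal sketch (the folder path here is hypothetical):
# Move into the folder that holds the files to be zipped
Set-Location C:\projects\myApp
Compress-Archive -Path * -DestinationPath myAppFiles.zip
# The archive lands in the current directory; confirm with:
Get-ChildItem .\myAppFiles.zip | Select-Object FullName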

Related

My script can read a text file when run manually through ISE, but it looks in a different directory when run through Task Scheduler

PowerShell noob here.
I have a script for copying PDF documents and CSV files. The script gets the CSV data from a URL defined in a .txt file in the same directory as the script. In the script, the file is determined like this:
$publishedCSV = Get-Content .\DriveURL.txt -Raw
When I run this script in ISE, it works fine and retrieves all the CSV data. However, when I run it through Task Scheduler, it tries to find the DriveURL file in System32 rather than in the path that is specified (I used a transcript to find out what was happening).
I figured that out and defined the FULL path of DriveURL, rather than just using the .\ notation. It works, but I don't know why it works.
What I did:
Specified the proper path of DriveURL, and now my script works. I don't understand why it previously worked with .\DriveURL.txt rather than the full path when I ran it in ISE, but didn't when run through Scheduler. It's the same script.
If you use relative paths, then you must also either set your working directory, or have the script change to the appropriate directory before referencing those relative paths. Alternatively, you can use full paths, as you have already discovered.
A simple use of cd or pushd with the automatic $PSScriptRoot variable will change your working directory to wherever the script is saved:
pushd $PSScriptRoot
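In the asker's script, for example (a sketch; assumes DriveURL.txt sits next to the .ps1 file, as the question states):
# Resolve relative paths against the script's folder, not System32
pushd $PSScriptRoot
$publishedCSV = Get-Content .\DriveURL.txt -Raw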

PowerShell - can't find long path but file exists

I have an issue while trying to run a PowerShell script.
When I try to obtain a file's content (e.g. Get-Content ".\d\e\file.xml") from a script located a few directories below this .xml, it works.
But if I run this script from a different directory (e.g. c:\users\x\desktop) to get the file content, it is not able to read it. I have tried Set-Location -Path "C:\a\b\c\" so it can get it by relative path.
This is the error that I'm getting:
Cannot find path 'C:\a\b\c\d\e\file.xml' because it does not exist. (path has 263 characters, too long)
PowerShell version: 5.1.18362.628
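One common workaround for the legacy 260-character MAX_PATH limit in Windows PowerShell 5.1 (a sketch, not from the original thread; the path is the one from the error, and support can depend on the Windows build) is to prefix the full absolute path with \\?\ and use -LiteralPath:
# The \\?\ prefix bypasses the MAX_PATH check; it requires a full, absolute path
Get-Content -LiteralPath "\\?\C:\a\b\c\d\e\file.xml"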

Command for Robocopy to log created date

I'm using Robocopy's log feature to get file information regardless of folder depth (from Learn-PowerShell).
I was able to get a file's timestamp using the /TS option of Robocopy, but this timestamp is the modified date of the file.
I also need to log the created date.
So how can I log the created date of a file?
And another one: how can I log the modified date and created date of folders too, using Robocopy?
If you can find a computer with PowerShell v5, you can install a module called PSAlphaFS.
Install-Module PSAlphaFS
Copy the module to your system running PowerShell 4 and import it.
Import-Module PSAlphaFS
Then run the following command, which is similar to Get-ChildItem but without the 260-character path limitation.
Get-LongChildItem -Path C:\temp -Recurse |
Select-Object Name,FullName,CreationTime,LastWriteTime
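To actually write those dates to a log file, the output can be piped to Export-Csv (a sketch; the output path is illustrative):
Get-LongChildItem -Path C:\temp -Recurse |
    Select-Object Name,FullName,CreationTime,LastWriteTime |
    Export-Csv -Path C:\temp\file-dates.csv -NoTypeInformation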

Powershell - Log Off Script to copy two files

Very much new to PowerShell scripting. Usually I am running scripts I find somewhere else, not trying to create my own. I am using Remote Desktop Services. I have a batch file that copies two directories to a user path at logon. When the user logs out, I only want to copy 2 specific files back to the file share. Here is the PowerShell script I (along with help from another online community) came up with.
I would like to assign this to a Log Off Script for the users. It fails at $unc. More information:
The W: drive points to the user's directory on the file share.
I have also tried specifying it as \\fileshare\Share\$username.
CODE:
function copydir ($user){
    $userpath = $env:HOMEPATH
    $unc = W:
    copy-item $userpath\Data\App\DMS\DMS.accde $user\Data\App\DMS\ -recurse
    copy-item $userpath\Dynamics\GP2010\dex.ini $user\Dynamics\GP2010\ -recurse
}
$username = $env:username
copydir $username
Error:
The term \\fileshare\Share\$username is not recognized as the name of a cmdlet, function, script file....
So, obviously it doesn't like the $unc parameter. I tried running it with just W: instead, since that points to the user profile on the file server, and got a similar error.
But the basic premise of what I'm trying to accomplish is to copy Data\App\DMS\DMS.accde and Dynamics\GP2010\dex.ini to a user's roaming profile on a file share.
Thanks!
Can it be that you need to have the double slashes at the front for your UNC path?
$unc = "\\fileshare\Share\$user"

Powershell Copy-item command not working in script although it does work when run from a command line

I am on a Windows 7 machine trying to execute a PowerShell script to copy a template directory to another directory. The command I am executing looks like:
Copy-Item -path "$projectsFolder$SourceFolder" -destination "$Test" -recurse -verbose;
The parameters are as follows:
Path: C:\Users\username\Documents\Visual Studio 2010\Projects\TemplateSolution\Source
Destination: C:\Users\username\Documents\Visual Studio 2010\Projects\test\source\main
When I run this command at a PowerShell prompt, the files are copied correctly. If I try to execute the command in the script with verbose enabled, it appears to copy the files and directories, but only the top-level directory is created in the file system. I am not sure why this happens, and I would appreciate any guidance or troubleshooting steps.
Make sure you put quotes around the directory names if they have spaces in them. Also, you may need the -Force parameter to create destination directories if they do not exist.
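For example, a sketch using the variables from the question (the quotes guard the spaces in "Visual Studio 2010"; -Force is the addition suggested above):
# Quotes protect paths with spaces; -Force creates missing destination directories
Copy-Item -Path "$projectsFolder$SourceFolder" -Destination "$Test" -Recurse -Force -Verbose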