Compact and Repair all Access databases in a directory using PowerShell

I am looking for a way to compact and repair all the Access databases in a certain directory using a PowerShell script.
The VBA code below works, but I need an equivalent for PowerShell:
Find all Access databases, and Compact and Repair
I am new to PowerShell, so I would be grateful for the assistance.
Thanks

You may try this.
Add-Type -AssemblyName Microsoft.Office.Interop.Access
$rootfolder = 'c:\some\folder'
$createlog = $true # change to false if no log desired

$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1

Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
    $newname = Join-Path $file.Directory ("{0}_compacted{1}" -f $file.BaseName, $file.Extension)
    $message = @"
Current file: {0}
Output file: {1}
"@ -f $file.FullName, $newname
    Write-Host $message -ForegroundColor Cyan
    $access.CompactRepair($file.FullName, $newname, $createlog)
}
$access.Quit()
This will output each compacted database using the original file name with _compacted appended (before the extension). I have tested this in every way except actually compacting databases.
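If you would rather have each compacted copy replace its original file, a small variation could go inside the ForEach-Object block instead of the bare CompactRepair call (a sketch; it assumes CompactRepair returns $true on success):

if ($access.CompactRepair($file.FullName, $newname, $createlog)) {
    # Swap the compacted copy in under the original name.
    Remove-Item -LiteralPath $file.FullName
    Rename-Item -LiteralPath $newname -NewName $file.Name
}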
Edit
Regarding your comment, a few minor changes should achieve the desired result. Keep in mind that this will put all new files in the same folder. This may not be an issue for your case but if there are duplicate file names you will have problems.
$rootfolder = 'c:\some\folder'
$destination = 'c:\some\other\folder'
$todaysdate = Get-Date -Format '_dd_MM_yyyy'

Add-Type -AssemblyName Microsoft.Office.Interop.Access
$createlog = $true # change to false if no log desired

$access = New-Object -ComObject access.application
$access.Visible = $false
$access.AutomationSecurity = 1

Get-ChildItem -Path $rootfolder -File -Filter *.accdb -Recurse -PipelineVariable file | ForEach-Object {
    $newname = Join-Path $destination ("{0}$todaysdate{1}" -f $file.BaseName, $file.Extension)
    $message = @"
Current file: {0}
Output file: {1}
"@ -f $file.FullName, $newname
    Write-Host $message -ForegroundColor Cyan
    $access.CompactRepair($file.FullName, $newname, $createlog)
}
$access.Quit()
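Not part of the original answer, but with either variant it can also help to release the COM reference after $access.Quit() so the background MSACCESS.EXE process exits promptly; a minimal sketch:

# Release the Access COM object once you are finished with it.
[void][System.Runtime.InteropServices.Marshal]::ReleaseComObject($access)
[GC]::Collect()
[GC]::WaitForPendingFinalizers()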

Related

Temp files in error with 7-Zip in PowerShell

I am currently struggling with a simple PowerShell script to archive files.
I have thousands of old files in a folder and I want to archive them, based on the month/year of their creation date, into archives named "YYYYMM".
I use the code below:
Get-ChildItem -Path $sourcePath -Filter $filter |
    Where-Object { ($_.CreationTime -le $dateCriteria) -and ($_.PSIsContainer -eq $false) } |
    ForEach-Object {
        $archive = "{0:yyyy}{0:MM}.7z" -f $_.CreationTime
        $archivePath = Join-Path -Path $destinationFolder -ChildPath $archive
        & "C:\Program Files\7-Zip\7z.exe" a -mx9 -t7z -m0=lzma2 -sdel $archivePath $_.FullName | Out-Null
    }
The logic seems fine, as it creates files like
201809.7z
201810.7z
...
in my destination folder.
The problem is that I see errors in the console:
System ERROR:
The file exists
or
System ERROR:
Access denied
or
System ERROR:
The file exists
ERROR: ********\202011.7z
Can not open the file as archive
As a result, in my destination folder, in addition to the expected archive files, I have files like "201810.7z.tmp1".
I changed the working directory to isolate those files by adding -w"{WORK_PATH}"
to the command line.
I also added Start-Sleep -Milliseconds 1,
as it looked like concurrent access, even though my script is single-threaded (maybe 7-Zip doesn't end properly), but it didn't work.
With Start-Sleep -Milliseconds 500 it seems to work, but for obvious reasons I don't want to use that. What would be the proper way to do this?
EDIT 1
Following MisterSmith's answer, I changed my code to:
Get-ChildItem -Path $emplacementSource -Filter $filtreNomFichiers |
    Where-Object { ($_.CreationTime -le $dernierJour) -and ($_.PSIsContainer -eq $false) } |
    ForEach-Object {
        $archive = "{0:yyyy}{0:MM}.7z" -f $_.CreationTime
        $cheminArchive = Join-Path -Path $dossierCible -ChildPath $archive
        [Array]$arguments = "a", "-w$workDir", "-mx9", "-t7z", "-m0=lzma2", "-sdel", $cheminArchive, $_.FullName

        $pinfo = New-Object System.Diagnostics.ProcessStartInfo
        $pinfo.FileName = $sevenZip
        $pinfo.RedirectStandardError = $true
        $pinfo.CreateNoWindow = $true
        $pinfo.UseShellExecute = $false
        $pinfo.Arguments = "$arguments"

        $process = New-Object System.Diagnostics.Process
        $process.StartInfo = $pinfo
        $process.Start()
        $output = $process.StandardError.ReadToEnd()
        $process.WaitForExit()

        if (0 -ne $process.ExitCode) {
            Write-Output "$(Get-TimeStamp) Erreur: $output" | Out-File $fichierLogs -Append
        }
    }
I still have File exists errors and .tmpX archives in my temp folder.
Instead of using &, use Start-Process and pass the -Wait switch, or use the -PassThru switch and check the returned System.Diagnostics.Process yourself to see when the process has finished. Either will get the same result as your Start-Sleep -Milliseconds 500 test, but will only wait for the time 7z.exe actually takes to complete.
Side note - you can append multiple files at once. That would probably work out quicker overall than adding each file separately.
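As a rough sketch of the Start-Process approach (reusing the variable names from the question; the 7-Zip path is an assumption taken from the original command line):

$sevenZip = 'C:\Program Files\7-Zip\7z.exe'   # assumed install path, as in the question

Get-ChildItem -Path $sourcePath -Filter $filter |
    Where-Object { ($_.CreationTime -le $dateCriteria) -and (-not $_.PSIsContainer) } |
    ForEach-Object {
        $archive     = "{0:yyyy}{0:MM}.7z" -f $_.CreationTime
        $archivePath = Join-Path -Path $destinationFolder -ChildPath $archive

        # -Wait blocks until 7z.exe has exited, so the next file is only added
        # after the previous archive operation has completely finished.
        $p = Start-Process -FilePath $sevenZip `
                           -ArgumentList @('a', '-mx9', '-t7z', '-m0=lzma2', '-sdel', "`"$archivePath`"", "`"$($_.FullName)`"") `
                           -NoNewWindow -Wait -PassThru
        if ($p.ExitCode -ne 0) {
            Write-Warning "7-Zip exited with code $($p.ExitCode) for $($_.FullName)"
        }
    }

For the side note about appending several files at once, the same call works if you group the files by month first and pass all of their paths after the archive name in a single invocation.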

Why is PowerShell switching directories mid-code on Copy-Item when the variable is already defined?

I have some code that checks a target file, waits for a change, and I want it to only move the most recent files based on their LastWriteTime value. However, every time I change a file within the target directory nothing is copied over, and the Copy-Item directory changes to "C:\Users\run".
It recognizes that there are files to copy and even states their file names when throwing the error. What can I do in this situation to make sure my Copy-Item command is copying from my target directory?
Code for Reference:
$File = "C:\Users\run\Desktop\Target"
$destinationFolder = "c:\users\run\desktop\dest"
$maxDays = "-1"
$maxMins = "20"
$date = Get-Date
Write-Host "Waiting For File To Change in Job Cloud..."
$Action = '
dateChecker
Write-Host "Moving Files From Job Cloud To Server Shares... Please Do Not Disrupt This Service"
write-host "files copied to job cloud..."
exit
'
$global:FileChanged = $false
function dateChecker {
Foreach($File in (Get-ChildItem -Path $File)){
if($File.LastWriteTime -lt ($date).AddMinutes($maxMins)){
Write-Host "Moving Files From Job Cloud To Server Shares... Please Do Not Disrupt This Service"
Copy-Item -Path $File -Destination $destinationFolder -Recurse #-ErrorAction SilentlyContinue
}
}
}
while($true) {
function Wait-FileChange {
param(
[string]$File,
[string]$Action
)
$FilePath = Split-Path $File -Parent
$FileName = Split-Path $File -Leaf
$ScriptBlock = [scriptblock]::Create($Action)
$Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
while ($global:FileChanged -eq $false){
Start-Sleep -Milliseconds 100
}
& $ScriptBlock
Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
}
PowerShell is not switching directories - although I can certainly see why you'd think that based on the behavior. The explanation is closer than you might think though:
The -Path parameter takes a [string] argument.
$File is not a string - it's a [FileInfo] object - and PowerShell therefore converts it to a string before passing it to Copy-Item -Path. Unfortunately, this results in the name of the file (not the full path) being passed as the argument, and Copy-Item therefore has to resolve the full path, and does so relative to the current working directory.
You can fix this by passing the full path explicitly to Copy-Item -LiteralPath:
Copy-Item -LiteralPath $File.FullName ... |...
or you can let the pipeline parameter binder do it for you by piping the $File object to Copy-Item:
$File |Copy-Item ... |...
Why -LiteralPath instead of -Path? -Path accepts wildcard patterns like filenameprefix[0-9] and tries to resolve it to a file on disk, meaning if you have to operate on files with [ or ] in the name, it'll result in some unexpected behavior :)
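A small illustration of the difference, using hypothetical file and folder names:

$destinationFolder = 'C:\Users\run\Desktop\dest'                                # hypothetical destination
$File = Get-ChildItem 'C:\Users\run\Desktop\Target' | Select-Object -First 1   # a [FileInfo] object

# $File is stringified (to the bare file name here), which Copy-Item then resolves
# against the current working directory and treats as a wildcard pattern.
Copy-Item -Path $File -Destination $destinationFolder

# Passes the full path verbatim, with no wildcard interpretation.
Copy-Item -LiteralPath $File.FullName -Destination $destinationFolder

# Or let the pipeline parameter binder supply the path.
$File | Copy-Item -Destination $destinationFolder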

Copy multiple workbooks into a single workbook with PowerShell

I am trying to copy multiple Excel workbooks into a single Excel workbook with the code below, but it is only copying 6 columns when I have 35.
#Get a list of files to copy from
$Files = GCI 'C:\Users\bob\Desktop\Und' | ?{ $_.Extension -Match "xlsx?" } | select -ExpandProperty FullName

#Launch Excel, and make it do as it's told (suppress confirmations)
$Excel = New-Object -ComObject Excel.Application
$Excel.Visible = $True
$Excel.DisplayAlerts = $False

#Open up a new workbook
$Dest = $Excel.Workbooks.Add()

ForEach ($File in $Files[0..4]) {
    $Source = $Excel.Workbooks.Open($File, $true, $true)
    If (($Dest.ActiveSheet.UsedRange.Count -eq 1) -and ([String]::IsNullOrEmpty($Dest.ActiveSheet.Range("A1").Value2))) {
        #If there is only 1 used cell and it is blank, select A1
        [void]$Source.ActiveSheet.Range("A1", "F$(($Source.ActiveSheet.UsedRange.Rows | Select -Last 1).Row)").Copy()
        [void]$Dest.Activate()
        [void]$Dest.ActiveSheet.Range("A1").Select()
    }
    Else {
        #If there is data, go to the next empty row and select column A
        [void]$Source.ActiveSheet.Range("A2", "F$(($Source.ActiveSheet.UsedRange.Rows | Select -Last 1).Row)").Copy()
        [void]$Dest.Activate()
        [void]$Dest.ActiveSheet.Range("A$(($Dest.ActiveSheet.UsedRange.Rows | Select -Last 1).Row + 1)").Select()
    }
    [void]$Dest.ActiveSheet.Paste()
    $Source.Close()
}

$Dest.SaveAs("C:\Users\bob\Desktop\Und\combo\Combined.xlsx", 51)
$Dest.Close()
$Excel.Quit()
A solution with the excellent PowerShell module ImportExcel (requires PowerShell 5 or later).
First, install the module:
in an Administrator PowerShell console: Install-Module -Name ImportExcel
in a non-Administrator PowerShell console: Install-Module -Name ImportExcel -Scope CurrentUser
Then, use the following code:
$source = 'C:\Users\bob\Desktop\Und'
$destination = 'C:\Users\bob\Desktop\Und\combo\Combined.xlsx'

$fileList = Get-ChildItem -Path $source -Filter '*.xlsx'

foreach ($file in $fileList) {
    $fileContent = Import-Excel -Path $file.FullName

    $excelParameters = @{
        Path          = $destination
        WorkSheetname = 'Combined'
    }
    if ((Test-Path -Path $destination) -and (Import-Excel @excelParameters)) {
        $excelParameters.Append = $true
    }

    $fileContent | Export-Excel @excelParameters
}
This code assumes that all your Excel source files have the same headers and that you want all your data in the same worksheet, but it can be adapted to support other scenarios.
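For example, a minimal sketch that instead puts each source workbook on its own worksheet, named after the file (paths assumed from the question):

$source = 'C:\Users\bob\Desktop\Und'
$destination = 'C:\Users\bob\Desktop\Und\combo\Combined.xlsx'

Get-ChildItem -Path $source -Filter '*.xlsx' | ForEach-Object {
    # One worksheet per source file, named after the file (without the extension).
    Import-Excel -Path $_.FullName |
        Export-Excel -Path $destination -WorksheetName $_.BaseName
}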

PowerShell: OpenFileDialog and Set-Content issue. Cannot access file; in use

What I'm trying to do is use the OpenFileDialog, select an .ini file, and make line changes to it at the end of this script with Set-Content. But I keep getting the error Set-Content : The process cannot access the file, saying that it's in use.
$a = $env:userprofile
Function Get-FileName($InitialDirectory)
{
Get-FileName -InitialDirectory "$a\AppData\Roaming\Milliman"
}#end function Get-FileName
[System.Reflection.Assembly]::LoadWithPartialName("System.windows.forms") | Out-Null
$dialog = New-Object System.Windows.Forms.OpenFileDialog
$dialog.DefaultExt = '.*'
$dialog.Filter = 'All Files|*.*'
$dialog.FilterIndex = 0
$dialog.InitialDirectory = $InitialDirectory
$dialog.Multiselect = $false
$dialog.RestoreDirectory = $true
$dialog.Title = "Select a file"
$dialog.ValidateNames = $true
$dialog.ShowHelp = $true
$dialog.ShowDialog()
$dialog.FileName
##Folder Dialog
$dir = new-object -com Shell.Application
$aldir = $dir.BrowseForFolder(0, "AL Dir", 0, "C:\Program Files\Milliman\")
if ($aldir.Self.Path -ne "") {write-host "You selected " $aldir.Self.Path}
## Grid Integration Steps
Copy-Item -path "\\ap102aric\alfaadmin$\Ver70andAbove\DataSynapse\*" -destination "C:\Program Files\Common Files\Milliman\MG-ALFA Shared\DataSynapse" -Force
Copy-Item -path "\\ap102aric\alfaadmin$\Ver70andAbove\JobOptions-RPRic\*" -destination "C:\Program Files\Common Files\Milliman\MG-ALFA Shared\DataSynapse" -Force
Copy-Item -path "\\ap102aric\alfaadmin$\Ver70andAbove\GSDLL\dsdrv.dll" -Destination $aldir.Self.Path -Force
## Set Environment Variable
[Environment]::SetEnvironmentVariable("DSDRIVER_DIR","C:\Program Files\Common Files\Milliman\MG-ALFA Shared\DataSynapse\Config","Machine")
## Edit Config UI.ini to set SDP LOGON for Datasynapse
#Write-Host $dialog.FileName
Get-Content $dialog.FileName | ForEach-Object {
$_ -replace 'SDPAvailable=*','SDPAvailable=DataSynapse'
-replace 'SDPFolder=*','SDPFolder=C:\Program Files\Common Files\Milliman\MG-ALFA Shared\DataSynapse'
-replace 'SDPLogon=*','SDPAvailable=Yes'
} | Set-Content $dialog.FileName
Try destroying the $aldir object. It may be holding a handle to the file. I'm not sure how to do that. Maybe set it to $null after you grab the path the user selected.
You can also try using Process Monitor to figure out what process is locking the file.
Finally, you can't pipe the output from Get-Content directly back into Set-Content for the same file, e.g.
Get-Content $Path | Set-Content $Path
Items are sent down the PowerShell pipeline immediately, so when Get-Content reads a line, it is immediately passed to Set-Content, which won't work because Get-Content still has the file open. Instead, try saving the contents of the file first:
$file = Get-Content $path
# Modify $file
$file | Set-Content $path
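Applied to the script above, that could look something like this (a sketch; the original replacement patterns are kept verbatim, and the chained -replace operators are joined into one statement with backticks):

# Read the whole file into memory first, so the file handle is released
# before Set-Content opens the file for writing.
$content = Get-Content $dialog.FileName

# -replace works element-by-element on the array of lines; note that the
# patterns are treated as regular expressions.
$content -replace 'SDPAvailable=*','SDPAvailable=DataSynapse' `
         -replace 'SDPFolder=*','SDPFolder=C:\Program Files\Common Files\Milliman\MG-ALFA Shared\DataSynapse' `
         -replace 'SDPLogon=*','SDPAvailable=Yes' |
    Set-Content $dialog.FileName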

Powershell - "The process cannot access the file because it is being used by another process"

Below is a script that monitors a directory and its subfolders for deposited files. Every 10 minutes or so, I look for new files and then match them against a database table that tells me where they need to be moved; the script then copies the files to a local archive, moves them to the locations they need to go to, and inserts a record into another database table with the file's attributes and where it came from and went. If there is no match in the database, or there is a script error, it sends me an email.
However, since files are getting deposited into the directory constantly, it's possible that a file is still being written when the script executes. As a result, I get the error The process cannot access the file because it is being used by another process. emailed to me all the time. In addition, because I'm not dealing with the error up front, it goes through the loop and a false entry is inserted into my log table in the database with incorrect file attributes. When the file finally frees up, it gets inserted again.
I'm looking for a way to identify files that have processes attached to them and skip them when the script executes, but several days of web searches and some testing haven't yielded an answer yet.
## CLEAR ERROR LOG
$error.clear()
Write-Host "***File Transfer Script***"
## PARAMETERS
$source_path = "D:\Files\In\"
$xferfail_path = "D:\Files\XferFailed\"
$archive_path = "D:\Files\XferArchive\"
$email_from = "SQLMail <SQLMail#bar.com>"
$email_recip = [STRING]"foo#bar.com"
$smtp_server = "email.bar.com"
$secpasswd = ConvertTo-SecureString "Pa$$w0rd" -AsPlainText -Force
$smtp_cred = New-Object System.Management.Automation.PSCredential ("BAR\SQLAdmin", $secpasswd)
## SQL LOG FUNCTION
function Run-SQL ([string]$filename, [string]$filepath, [int]$filesize, [int]$rowcount, [string]$xferpath)
{
$date = get-date -format G
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=SQLSERVER;Database=DATABASE;Uid=SQLAdmin;Pwd=Pa$$w0rd;"
$SqlConnection.Open()
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "INSERT INTO DATABASE..Table VALUES ('$date','$filename','$filepath',$filesize,$rowcount,'$xferpath',0)"
$SqlCmd.Connection = $SqlConnection
$SqlCmd.ExecuteNonQuery()
$SqlConnection.Close()
}
## DETERMINE IF THERE ARE ANY FILES TO PROCESS
$file_count = Get-ChildItem -path $source_path |? {$_.PSIsContainer} `
| Get-ChildItem -path {$_.FullName} -Recurse | Where {$_.psIsContainer -eq $false} | Where {$_.Fullname -notlike "D:\Files\In\MCI\*"} `
| Measure-Object | Select Count
If ($file_count.Count -gt 0)
{
Write-Host $file_count.Count "File(s) Found - Processing."
Start-Sleep -s 5
## CREATE LIST OF DIRECTORIES
$dirs = Get-ChildItem -path $source_path -Recurse | Where {$_.psIsContainer -eq $true} | Where {$_.Fullname -ne "D:\Files\In\MCI"} `
| Where {$_.Fullname -notlike "D:\Files\In\MCI\*"}
## CREATE LIST OF FILES IN ALL DIRECTORIES
$files = ForEach ($item in $dirs)
{
Get-ChildItem -path $item.FullName | Where {$_.psIsContainer -eq $false} | Sort-Object -Property lastWriteTime -Descending
}
## START LOOPING THROUGH FILE LIST
ForEach ($item in $files)
{
## QUERY DATABASE FOR FILENAME MATCH, AND RETURN TRANSFER DIRECTORY
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=SQLSERVER;Database=DATABASE;Uid=SQLAdmin;Pwd=Pa$$w0rd;"
$SqlConnection.Open()
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.CommandText = "SELECT F.DirTransfer FROM DATABASE..Files F WHERE '$item.Name.Trim()' LIKE F.FileName"
$SqlCmd.Connection = $SqlConnection
$DirTransfer = $SqlCmd.ExecuteScalar()
$SqlConnection.Close()
If ($DirTransfer) # if there is a match
{
Write-Host $item.FullName"`t->`t"$DirTransfer
$filename = $item.Name
$filepath = $item.FullName
$filesize = $item.Length
If (!($filesize))
{
$filesize = 0
}
$rowcount = (Get-Content -Path $item.FullName).Length
If (!($rowcount))
{
$rowcount = 0
}
$xferpath = $DirTransfer
Run-SQL -filename "$filename" -filepath "$filepath" -filesize "$filesize" -rowcount "$rowcount" -xferpath "$DirTransfer"
Copy-Item -path $item.FullName -destination $DirTransfer -force -erroraction "silentlycontinue"
Move-Item -path $item.FullName -destination $archive_path -force -erroraction "silentlycontinue"
#Write-Host "$filename $filepath $filesize $rowcount $xferpath"
}
Else # if there is no match
{
Write-Host $item.FullName "does not have a mapping"
Move-Item -path $item.FullName -destination $xferfail_path -force
$filename = $item.FullName
$email_body = "$filename `r`n`r`n does not have a file transfer mapping setup"
Send-MailMessage -To $email_recip `
-From $email_from `
-SmtpServer $smtp_server `
-Subject "File Transfer Error - $item" `
-Body $email_body `
-Priority "High" `
-Credential $smtp_cred
}
}
}
## IF NO FILES, THEN CLOSE
Else
{
Write-Host "No File(s) Found - Aborting."
Start-Sleep -s 5
}
## SEND EMAIL NOTIFICATION IF SCRIPT ERROR
If ($error.count -gt 0)
{
$email_body = "$error"
Send-MailMessage -To $email_recip `
-From $email_from `
-SmtpServer $smtp_server `
-Subject "File Transfer Error - Script" `
-Body $email_body `
-Priority "High" `
-Credential $smtp_cred
}
You can use the Sysinternals handle.exe to find the open handles on a file. The exe can be downloaded from http://live.sysinternals.com/.
$targetfile = "C:\Users\me\Downloads\The-DSC-Book.docx"
$result = Invoke-Expression "C:\Users\me\Downloads\handle.exe $targetfile" | Select-String ([System.IO.Path]::GetFileNameWithoutExtension($targetfile))
$result
Outputs:
WINWORD.EXE pid: 3744 type: File 1A0: C:\Users\me\Downloads\The-DSC-Book.docx
Alternatively, you can check for errors, either via try/catch or by looking at the $error collection after the Move-Item attempt, and then handle the condition appropriately.
$error.Clear()
Move-Item -path $item.FullName -destination $xferfail_path -force -ea 0
if($error.Count -eq 0) {
# do something useful
}
else {
# do something that doesn't involve spamming oneself
}
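And a sketch of the try/catch variant of the same idea; -ErrorAction Stop turns the non-terminating error into one that can be caught:

try {
    Move-Item -Path $item.FullName -Destination $xferfail_path -Force -ErrorAction Stop
    # do something useful
}
catch {
    # The file is most likely still locked by the writer; skip it for this run
    # instead of emailing yourself, and let the next scheduled run retry it.
    Write-Host "Skipping $($item.FullName): $($_.Exception.Message)"
}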
To expand on Arluin's answer: it fails if there are spaces in either the handle.exe path or the $targetfile path.
This will work with spaces in both, and it also formats the result to give you just the program name, e.g. FreeCommander.exe.
$targetfile = "W:\Apps Folder\File.json"
$result = & "W:\Apps (Portable)\handle.exe" "$targetfile" | Select-String ([System.IO.Path]::GetFileNameWithoutExtension($targetfile))
$result = $result -replace '\s+pid\:.+'
$result
# PS> FreeCommander.exe
One way to avoid file locks caused by running the script on a timer is to use an event-driven approach with a FileSystemWatcher. It can execute code when an event, such as a new file being created, occurs in the folder you are monitoring.
To run code when the file has finished copying, you need to listen for the Changed event. There is a slight issue with this event in that it fires once when the file begins copying and again when it is finished. I got an idea for working around this chicken/egg problem after checking out the module Mike linked to in the comments. I've updated the code below so that it will only fire off code when a file has fully been written.
To try it, change $folderToMonitor to the folder you want to monitor and add some code to process the file.
$processFile = {
    try {
        $filePath = $event.SourceEventArgs.FullPath
        [IO.File]::OpenRead($filePath).Close()

        # A way to prevent false positives for really small files.
        if (-not ($newFiles -contains $filePath)) {
            $newFiles += $filePath

            # Process $filePath here...
        }
    } catch {
        # File is still being created; we wait for the next event.
    }
}

$folderToMonitor = 'C:\Folder_To_Monitor'

$watcher = New-Object System.IO.FileSystemWatcher -Property @{
    Path                  = $folderToMonitor
    Filter                = $null
    IncludeSubdirectories = $true
    EnableRaisingEvents   = $true
    NotifyFilter          = [System.IO.NotifyFilters]'FileName,LastWrite'
}

$script:newFiles = @()

Register-ObjectEvent $watcher -EventName Changed -Action $processFile > $null
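One assumption worth calling out: when this runs as a standalone script rather than in an interactive console, something has to keep the process alive so the subscription keeps firing. A common pattern (not part of the original answer) is:

# Keep the script alive so the watcher's Changed events keep being handled.
try {
    while ($true) {
        Wait-Event -Timeout 5 | Out-Null
    }
}
finally {
    # Clean up the event subscription and the watcher when the script stops.
    Get-EventSubscriber | Unregister-Event
    $watcher.Dispose()
}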