PowerShell script ignores files with <whitespace>H

I inherited this script from my predecessor; it is supposed to copy a bunch of files from one location into a JFrog Artifactory.
It works fine except for files with a space followed by an H in the name.
For example, Test My Client.exe will be copied correctly, but Test My Host.exe will be ignored by the script.
Does anybody have an idea what the reason for this strange behaviour is?
param
(
[string]$RootPath="D:\_temp\upload\",
[string]$Repo="repo-snapshot",
[string]$Product="SEARCHER",
[string]$Version="1.23.45678.9012",
[string]$ArtifactoryRoot
)
$AF_USER = "myUser"
$AF_PWD = "pA55VVoRd" #ConvertTo-SecureString "pA55VVoRd" -AsPlainText -Force
$CREDS = New-Object System.Management.Automation.PSCredential ($AF_USER, $AF_PWD)
$MajorMinor = $Version.Split('.')[0] + "." + $Version.Split('.')[1]
$BuildRevision = $Version.Split('.')[2] + "." + $Version.Split('.')[3]
$uri = "https://artifactory.test.com/artifactory/$Repo/TEST/$Product/$MajorMinor/$BuildRevision/$ArtifactoryRoot/"
$curl = "$PSScriptRoot\..\..\bin\ReleaseU\curl.exe"
Write-Host $MajorMinor
write-host $BuildRevision
$filesToUpload = Get-ChildItem $RootPath -Recurse -File
foreach($file in $filesToUpload)
{
$artifactoryAddPath = $file.FullName.Remove(0,$RootPath.Length).Replace('\','/')
$completeUrl = $uri+$artifactoryAddPath
write-host $completeUrl
$FileSize = (Get-Item $file.FullName).Length /1MB
Write-Host "Filesize: "$FileSize" MB"
$preTime = Get-Date
write-host "$curl --user $AF_USER`:$AF_PWD --upload-file " $file.FullName "$completeUrl --insecure"
. $curl --user $AF_USER`:$AF_PWD --upload-file $file.FullName $completeUrl --insecure
$postTime = Get-Date
Write-Host "Upload finished"
$timeDiff = $postTime - $preTime
$avgSpeed = $FileSize / $timeDiff.TotalSeconds
Write-Host -Object ("Upload speed is: {0:N2}MB/sec" -f ($avgSpeed));
}
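One thing worth ruling out is how the arguments reach curl.exe. A minimal sketch, assuming (not confirming) that argument splitting or quoting is involved: build the argument list as an array and splat it, so a path or URL containing spaces always arrives as a single argument.
# Sketch only: each array element is passed to curl.exe as one argument,
# so spaces in $file.FullName or $completeUrl cannot split it into several.
$curlArgs = @(
    '--user', "$AF_USER`:$AF_PWD",
    '--upload-file', $file.FullName,
    $completeUrl,                      # [uri]::EscapeUriString($completeUrl) would percent-encode the spaces
    '--insecure'
)
& $curl @curlArgs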

Related

Comparing Server IDs in two tools via PowerShell script

I am making a PowerShell script that is supposed to retrieve and compare the server IDs in two tools that we are using, Octopus and MOSS (the idea is to check that all servers in Octopus are also registered in MOSS). Octopus is accessed from PowerShell via a MySQL query and MOSS is accessed via an API call. Currently I am able to retrieve the SQL query results and format them as JSON so that MOSS can read them. However, I don't know how to check whether the server IDs are also present in MOSS. All the script does so far is retrieve the IDs from the Octopus SQL, parse them to JSON and make an empty call to MOSS. I would highly appreciate it if anyone knows how to make MOSS calls from PowerShell.
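For the comparison itself, a minimal sketch, assuming both sides can be reduced to plain arrays of server IDs ($octopusIds and $mossIds are hypothetical placeholders for the SQL result and the MOSS response):
# Hypothetical ID arrays; in the real script these would come from the SQL query and the MOSS API call
$octopusIds = @('srv-001', 'srv-002', 'srv-003')
$mossIds    = @('srv-001', 'srv-003')
# Servers known to Octopus but not registered in MOSS
$missing = $octopusIds | Where-Object { $mossIds -notcontains $_ }
# Equivalent with Compare-Object ('<=' marks items present only in the reference set)
Compare-Object -ReferenceObject $octopusIds -DifferenceObject $mossIds |
    Where-Object { $_.SideIndicator -eq '<=' } |
    Select-Object -ExpandProperty InputObject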
The current script is:
# Extract RDS and servers from Octopus
# Define log file path
$date = $(get-date).tostring()
$currentdate = get-date -format yyyy-MM-dd
$log_file_path = "C:\Program Files\test\logs\"+$currentdate+"_extract_rds_and_servers_from_octopus.log"
$errorlog_file_path = "C:\Program Files\test\logs\errors\errors.log"
# 0. Exclude Admin Users before getting the RDS licenses which need to be reported
#& ((Split-Path $MyInvocation.InvocationName) + "\exclude_users.ps1") -log_file_path $log_file_path -errorlog_file_path $errorlog_file_path
# 1. Extract ObjectID from Octopus API for current month for each RDP user
try {
$month = (Get-Date -UFormat "%Y%m")
$UrlHost = "https://octopus.mos-windows.eu01.stackit.cloud/api/workspace/55cd5c70-d188-4ac3-b946-f1afec8764ad/report/licensing/spla-usage-reseller?&payload[month_id]=$month&payload[with_itemized]=1&_format=json&_token=j8FE4wZDmBITewHUc7lyYeX9XVVjt3dqz0ID4S6A9KQjkMeKfO7_EcgV7Qshuuw1&_tenant=TlXcM&_language=en&payload[flat_structure]=1"
$HostResponse = Invoke-RestMethod -Uri $UrlHost -Method Get
$users = $HostResponse.itemized
$objectids = @()
foreach ($user in $users.PSObject.Properties) {
if ($user.Value.readable_label -eq "Windows Server Remote Desktop Services")
{
$objectid = $user.Value.object_id
$objectids += $objectid
}
}
}
catch {
$date+" - Error in Octopus API Call: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 2. Get access device ids from Octopus Database
Import-Module -Name "C:\Program Files\test\request_database.ps1" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
$get_users_devices_query =
#"
select user_id, access_device_ids from oc_reporter.ws_installed_software where user_id is not null;
"#
$ui_ad = execute_db_query $get_users_devices_query
$access_device_ids = $ui_ad.access_device_ids
$user_ids = $ui_ad.user_id
# 3. Get all openstack server id from Octopus Database 
$get_access_device_server_ids_query =
#"
select id, lower(SUBSTRING_INDEX(SUBSTRING_INDEX(ref_id, "-", -6), "-", 5)) as server_id, ref_id, label from ws_device
where type_id = "vm" and operating_system like "%Windows%" and created > 1644820951;
"#
$ad_si = execute_db_query $get_access_device_server_ids_query
# 4. Map the users/objectids with access device ids and server ids
# Create array with UserID filtered by ObjectID and map each AccessDeviceID(s) to corresponding ServerID
try {
$filteredsi = @()
foreach ($userid in $ui_ad)
{
if ($objectids -contains $userid.user_id)
{
$filteredad = $userid
foreach ($id in $ad_si)
{
if ($filteredad.access_device_ids.split(',') -contains $id.id)
{
$filteredsi += [PSCustomObject]@{"userid" = $filteredad.user_id; "serverid" = $id.server_id}
}
}
}
}
}
catch {
$date+" - Error in Mapping userIDs/objectIDs/accessdeviceIDs/serverIDs: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# Preparation for MOSS
# Create JSON contentblock with looped $filteredsi array
try {
$myArray = $filteredsi
$uniqueUsers = [System.Collections.ArrayList]::new()
for($i = 0; $i -lt $myArray.Count; $i++){
if(!$uniqueUsers.Contains($myArray[$i].userid)){
$uniqueUsers.Add($myArray[$i].userid)
}
}
$allMappings = [System.Collections.ArrayList]::new()
for($i = 0; $i -lt $uniqueUsers.Count; $i++){
$singleMapping = [PSCustomObject]@{id = $uniqueUsers[$i]; servers = $null}
$serverids = [System.Collections.ArrayList]::new()
for($j = 0; $j -lt $myArray.Count; $j++){
if($myArray[$j].userid -eq $uniqueUsers[$i]){
$serverids.Add($myArray[$j].serverid)
}
}
$singleMapping.servers = $serverids
$allMappings.Add($singleMapping)
}
$mosscontent = $allMappings | ConvertTo-Json
$mosscontent
}
catch {
$date+" - Error in creating content block for MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# Create complete array including contentblock for MOSS API call
try {
$moss = #"
{
"type": "server.usage",
"data": {
"users": $mosscontent
}
}
"#
}
catch {
$date+" - Error in creating array for POST to MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 5. Call MOSS prod, dev and qa with the whole list of servers and users
# Authenticating to MOSS
$query_file_path_dev_pw = "~\Documents\MOSSDevEncryptedPassword_"
$query_file_path_qa_pw = "~\Documents\MOSSQaEncryptedPassword_"
$query_file_path_prod_pw = "~\Documents\MOSSProdEncryptedPassword_"
# Function to store credentials
function get_encrypted_content {
param (
[String] $file_path,
[String] $password
)
# Check if the credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($password) {
dev {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-dev-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
qa {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-qa-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
prod {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-prod-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$clientid_dev = "mos-windows-us-dev"
$clientid_qa = "mos-windows-us-qa"
$clientid_prod = "mos-windows-us-prod"
$dev_pass = get_encrypted_content $query_file_path_dev_pw "dev"
$qa_pass = get_encrypted_content $query_file_path_qa_pw "qa"
$prod_pass = get_encrypted_content $query_file_path_prod_pw "prod"
[System.Security.SecureString]$clientsecret_dev = ConvertTo-SecureString -String $dev_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_qa = ConvertTo-SecureString -String $qa_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_prod = ConvertTo-SecureString -String $prod_pass -AsPlainText -Force
#Prepare static variables
$MOSSToken_dev = 'https://auth.00.idp.eu01.stackit.cloud/oauth/token'
$MOSSToken_qa = 'https://auth.01.idp.eu01.stackit.cloud/oauth/token'
$MOSSToken_prod = 'https://auth.01.idp.eu01.stackit.cloud/oauth/token'
$MOSSUrl_dev = "https://stackit-service-mos-dev.apps.01.cf.eu01.stackit.cloud/v1/events"
$MOSSUrl_qa = "https://stackit-service-mos-qa.apps.01.cf.eu01.stackit.cloud/v1/events"
$MOSSUrl_prod = "https://stackit-service-mos.apps.01.cf.eu01.stackit.cloud/v1/events"
$body = @{grant_type='client_credentials'}
#Set function to get all customerinfo from all portals
function call_moss {
param (
[String] $clientid,
[SecureString] $clientsecret,
[String] $MOSSToken,
[String] $MOSSUrl
)
$cred = New-Object -typename System.Management.Automation.PSCredential -ArgumentList $clientid, $clientsecret
#Get Token from MOSS
$Response = Invoke-RestMethod -Uri $MOSSToken -Method Post -Credential $cred -Body $body -ContentType "application/x-www-form-urlencoded"
$Token = $Response.access_token
$Tokenfinal = "Bearer " + $Token
#Post Content to MOSS
Invoke-RestMethod -Uri $MOSSUrl -Method Post -Headers @{'Authorization' = $Tokenfinal } -Body $moss -ContentType "application/json"
}
#Call function to Call MOSS
try
{
Write-Host "Call to MOSS Dev.."
call_moss $clientid_dev $clientsecret_dev $MOSSToken_dev $MOSSUrl_dev
Write-Host "Call to MOSS QA.."
call_moss $clientid_qa $clientsecret_qa $MOSSToken_qa $MOSSUrl_qa
Write-Host "Call to MOSS Prod"
call_moss $clientid_prod $clientsecret_prod $MOSSToken_prod $MOSSUrl_prod
}
catch
{
$date+" - Error in calling MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
# 6. Create daily logs with users reported to MOSS
$date = $(get-date).tostring()
$log = foreach ($item in $filteredsi)
{
$item
}
echo $date $log | Out-file -Append $log_file_path
# delete logs older than 60 days
$limit = (Get-Date).AddDays(-60)
$path = "C:\Program Files\test\logs"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
# Create activity file to check if script is working
$temp_file_path = "C:\Program Files\test\tempfile*"
if (Test-Path $temp_file_path)
{
Remove-Item $temp_file_path
}
[string]$filePath = "C:\Program Files\test\tempfile";
[string]$directory = [System.IO.Path]::GetDirectoryName($filePath);
[string]$strippedFileName = [System.IO.Path]::GetFileNameWithoutExtension($filePath);
[string]$extension = [System.IO.Path]::GetExtension($filePath);
[string]$newFileName = $strippedFileName + "_" + (Get-Date).ToString('MM-dd-yyyy') + $extension;
[string]$newFilePath = [System.IO.Path]::Combine($directory, $newFileName);
New-Item $newFilePath
Additional scripts that are being called:
-> request_database.ps1
# 1. Get path for log files
param(
[string]$log_file_path = "C:\Program Files\test\logs\request_database.log",
[string]$errorlog_file_path = "C:\Program Files\test\logs\request_database_errors.log"
)
# 2. Get credentials to access Octopus DB
try {
# Define username and password security string files
$mysql_user_file_path = "~\Documents\ue_"
$mysql_pass_file_path = "~\Documents\pe_"
# Functions
function get_encrypted_content {
param (
[String] $file_path,
[String] $user_or_pass
)
# Check if the credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($user_or_pass) {
msqu {
# Get credentials
Read-Host -Prompt "Please, enter MySQL username" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
msqp {
# Get credentials
Read-Host -Prompt "Please, enter MySQL password" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$my_sql_user = get_encrypted_content $mysql_user_file_path "msqu"
$my_sql_pass = get_encrypted_content $mysql_pass_file_path "msqp"
[System.Security.SecureString]$SecPwd = ConvertTo-SecureString -String $my_sql_pass -AsPlainText -Force
$Credential = new-object -typename System.Management.Automation.PSCredential -argumentlist @($my_sql_user,$SecPwd)
}
catch {
$(get-date).tostring() +" - Error in fetching SQL user/pwd: "+$_ | Out-File -Append $errorlog_file_path
exit
}
$error_string = ""
# 3. Execute DB query
# If you want the script to continue on error you have to provide $true as second parameter
# Example: execute_db_query $exclude_stackitadmin_users_query $true
function execute_db_query {
param (
[String] $query,
[bool] $continueOnError = $false
)
try {
# Query Octopus DB
Connect-MySqlServer -Credential $Credential -Server localhost
if ($continueOnError) {
$query_results = Invoke-MySqlQuery -Query $query
} else {
$query_results = Invoke-MySqlQuery -Query $query -ErrorAction Stop
}
Disconnect-MySqlServer
return $query_results
}
catch {
$error_message = $(get-date).tostring() + " - Error in DB Query 1: " + $_
$error_message | Out-File -Append $errorlog_file_path
Write-Error $error_message
exit
}
}
-> exclude_users.ps1 (probably not needed for the task but the overall script doesn't work without it)
# 1. Get path for log files or use default
param(
[string]$log_file_path = "C:\Program Files\test\logs\exclude_users.log",
[string]$errorlog_file_path = "C:\Program Files\test\logs\errors\exclude_users_errors.log"
)
Import-Module -Name "C:\Program Files\test\request_database.ps1" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
# Exclude StackitAdmin users
$exclude_stackitadmin_users_query =
#"
update oc_reporter.ws_user as updated_user,
(
select id from oc_reporter.ws_user
where username = "StackITAdmin" and exclude_spla = "no"
) as us
set
exclude_spla = "yes",
exclude_spla_reason = "nh"
where updated_user.id = us.id;
"#
execute_db_query($exclude_stackitadmin_users_query)
# Exclude users on dev & qa environment
$exclude_users_on_dev_qa_query =
#"
update oc_reporter.ws_installed_software as updated_software,
(
select ws.access_device_ids, min(ws.access_device_labels), min(ws.user_label), excluded_users, min(ws.id) as id from oc_reporter.ws_user
join oc_reporter.ws_installed_software as ws
on ws.user_id = ws_user.id
join oc_reporter.ws_customer as wc
on wc.id = ws_user.customer_id
left join
(select access_device_ids, count(1) as excluded_users from oc_reporter.ws_user as wu
join oc_reporter.ws_customer as wc
on wc.id = wu.customer_id
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
where
(internal_id like "d-%" or internal_id like "q-%") and
locate(',', access_device_ids) = 0 and
ws.exclude_spla = "yes" and
ws.label = "Microsoft Remote Desktop Services" and
wu.username != "StackitAdmin"
group by access_device_ids, wu.exclude_spla) as servers
on servers.access_device_ids = ws.access_device_ids
where
ws.exclude_spla = "no" and
ws.label = "Microsoft Remote Desktop Services" and
(internal_id like "d-%" or internal_id like "q-%") and
locate(',', ws.access_device_ids) = 0
group by ws.access_device_ids
having (excluded_users = 1 or excluded_users is null)
) as us
set
exclude_spla = "yes",
exclude_spla_reason = "admin"
where updated_software.id = us.id;
"#
# run twice to exclude 2 users per VM
execute_db_query($exclude_users_on_dev_qa_query)
execute_db_query($exclude_users_on_dev_qa_query)
# Exclude users from our mos-windows-2 project
$exclude_users_from_our_projects =
#"
update oc_reporter.ws_installed_software as ins,
(
select ws.access_device_ids, min(ws.access_device_labels), min(ws.user_label), excluded_users, min(ws.id) as id, min(wu.id) from oc_reporter.ws_user as wu
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
join oc_reporter.ws_device as wd
on wd.id = ws.access_device_ids
left join (
select ws.access_device_ids, min(ws.id), min(wu.id), count(1) as excluded_users from oc_reporter.ws_user as wu
join oc_reporter.ws_installed_software as ws
on ws.user_id = wu.id
join oc_reporter.ws_device as wd
on wd.id = ws.access_device_ids
where
ws.exclude_spla = "yes" and
ws.label = "Microsoft Remote Desktop Services" and
LOCATE(',',access_device_ids) = 0 and
(
hkey like "%d57abb0200304506879bd8037f7a49cb%" or
hkey like "%fce60e2c938c49e4a37687492a45b652%" or
hkey like "%8eb91d45f25b45978b71abb0e06a0443%" or
hkey like "%66ad75e4ff624f7e940dc363549c8404%" or
hkey like "%351aa84fb9b54be896112b36ae15dd48%" or
hkey like "%64edd6c19e17417d86094e6a02610eed%"
) and
wu.username != "StackitAdmin"
group by ws.access_device_ids
) as excluded_ws
on
excluded_ws.access_device_ids = ws.access_device_ids
where
ws.exclude_spla = "no" and
ws.label = "Microsoft Remote Desktop Services" and
LOCATE(',',ws.access_device_ids) = 0 and
(
hkey like "%d57abb0200304506879bd8037f7a49cb%" or
hkey like "%fce60e2c938c49e4a37687492a45b652%" or
hkey like "%8eb91d45f25b45978b71abb0e06a0443%" or
hkey like "%66ad75e4ff624f7e940dc363549c8404%" or
hkey like "%351aa84fb9b54be896112b36ae15dd48%" or
hkey like "%64edd6c19e17417d86094e6a02610eed%"
) and
wu.domain not like "%HOP01%" and
wu.domain not like "%WSUS01%" and
wu.domain not like "%OCKMS%" and
wu.domain not like "%AZDVOP%"
group by ws.access_device_ids
having (excluded_users = 1 or excluded_users is null)
) as rds
set
exclude_spla = "yes",
exclude_spla_reason = "admin"
where ins.id = rds.id;
"#
# run twice to exclude 2 users per VM
execute_db_query($exclude_users_from_our_projects)
execute_db_query($exclude_users_from_our_projects)
The order was wrong, and there were lots of unnecessary elements causing errors and crashes.
The current working code is:
# Compare the data between the MOSS and the Octopus
# Define log file path
$date = $(get-date).tostring()
$currentdate = get-date -format yyyy-MM-dd
$log_file_path = "[FILE PATH]"
$errorlog_file_path = "[FILE PATH]"
# 1. Call MOSS (dev, qa, prod) to get the data for all servers created in the last 48 hours
# Authenticating to MOSS
$query_file_path_dev_pw = "[FILE PATH]"
$query_file_path_qa_pw = "[FILE PATH]"
$query_file_path_prod_pw = "[FILE PATH]"
# Function to store credentials
function get_encrypted_content {
param (
[String] $file_path,
[String] $password
)
# Check if the credentials file exists
if ( -Not (Test-Path -Path $file_path)) {
switch ($password) {
dev {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-dev-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
qa {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-qa-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
prod {
# Get credentials
Read-Host -Prompt "Enter password for mos-windows-us-prod-client-id" -AsSecureString | ConvertFrom-SecureString | Out-File -FilePath $file_path
}
}
}
# Read credentials from file
$Encrypted_value = Get-Content -Path $file_path
# Decrypt credentials from file
return [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR((ConvertTo-SecureString $Encrypted_value)))
}
# Define username and password
$clientid_dev = "[USERNAME]"
$clientid_qa = "[USERNAME]"
$clientid_prod = "[USERNAME]"
$dev_pass = get_encrypted_content $query_file_path_dev_pw "dev"
$qa_pass = get_encrypted_content $query_file_path_qa_pw "qa"
$prod_pass = get_encrypted_content $query_file_path_prod_pw "prod"
[System.Security.SecureString]$clientsecret_dev = ConvertTo-SecureString -String $dev_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_qa = ConvertTo-SecureString -String $qa_pass -AsPlainText -Force
[System.Security.SecureString]$clientsecret_prod = ConvertTo-SecureString -String $prod_pass -AsPlainText -Force
# Time variable
$date48h = ("{0:yyyy-MM-ddThh:mm:ss}" -f ((get-date).Addhours(-48))).split("T").split(":")
$date = $date48h[0]
$hour = $date48h[1]
$min = $date48h[2]
$sec = $date48h[3]
#Prepare static variables
$MOSSToken_dev = '[URL]'
$MOSSToken_qa = '[URL]'
$MOSSToken_prod = '[URL]'
$MOSSUrl_dev = "[URL]"
$MOSSUrl_qa = "[URL]"
$MOSSUrl_prod = "[URL]"
$body = @{grant_type='client_credentials'}
#Set function to get all customerinfo from all portals
function call_moss {
param (
[String] $clientid,
[SecureString] $clientsecret,
[String] $MOSSToken,
[String] $MOSSUrl
)
$cred = New-Object -typename System.Management.Automation.PSCredential -ArgumentList $clientid, $clientsecret
#Get Token from MOSS
$Response = Invoke-RestMethod -Uri $MOSSToken -Method Post -Credential $cred -Body $body -ContentType "application/x-www-form-urlencoded"
$Token = $Response.access_token
$Tokenfinal = "Bearer " + $Token
#Post Content to MOSS
Invoke-RestMethod -Uri $MOSSUrl -Method Get -Headers @{'Authorization' = $Tokenfinal } -ContentType "application/json"
}
#Call function to Call MOSS
try
{
Write-Host "Call to MOSS Dev.."
$get_moss_dev = call_moss $clientid_dev $clientsecret_dev $MOSSToken_dev $MOSSUrl_dev
Write-Host "Call to MOSS QA.."
$get_moss_qa = call_moss $clientid_qa $clientsecret_qa $MOSSToken_qa $MOSSUrl_qa
Write-Host "Call to MOSS Prod"
$get_moss_prod = call_moss $clientid_prod $clientsecret_prod $MOSSToken_prod $MOSSUrl_prod
}
catch
{
$date+" - Error in calling MOSS: "+$_ | Out-File -Append $errorlog_file_path
exit
}
$moss_dev_serverids = $get_moss_dev.items.id
$moss_qa_serverids = $get_moss_qa.items.id
$moss_prod_serverids = $get_moss_prod.items.id
$moss_serverid_arr = @($moss_dev_serverids, $moss_qa_serverids, $moss_prod_serverids)
# 2. Call Octopus to get the data for new servers created in the last 36 hours
Import-Module -Name "[FILE PATH]" -ArgumentList $log_file_path, $errorlog_file_path -Verbose
# Calculate timestamp
$DateTime = Get-Date #or any other command to get DateTime object
$CurrentUnixTime = ([DateTimeOffset]$DateTime).ToUnixTimeSeconds()
$queryTime = $CurrentUnixTime - (36 * 3600)
$get_new_servers_query_oc =
#"
select id, lower(SUBSTRING_INDEX(SUBSTRING_INDEX(ref_id, "-", -6), "-", 5)) as server_id, ref_id, label from oc_reporter.ws_device where type_id = "vm" and operating_system like "%Windows%" and created > $queryTime;
"#
$query = execute_db_query $get_new_servers_query_oc
$serverid_oc = $query.server_id
$serverid_oc_arr = @($serverid_oc)
# 3. Compare the properties in MOSS and Octopus
$unmatching_serverids = $serverid_oc_arr | Where {$moss_serverid_arr -NotContains $_}
$error_report = @($unmatching_serverids)
# Create daily logs with servers in Octopus that are unregistered in MOSS
$date = $(get-date).tostring()
$log = $error_report
echo $date $log | Out-file -Append $log_file_path
# delete logs older than 60 days
$limit = (Get-Date).AddDays(-60)
$path = "[FILE PATH]"
# Delete files older than the $limit.
Get-ChildItem -Path $path -Recurse -Force | Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } | Remove-Item -Force
# 4. Generate summary with all errors and send a notification if there is an error. Schedule a task to check once per day.
If ($error_report.Count -eq 0) {
exit
}
else {
$JSONBody = [PSCustomObject][Ordered] @{
"type" = "MessageCard"
"title" = "Octopus Alerts"
"text" = "Servers located in Octopus, that are not registered in MOSS. <br>
Please check logs."
}
$TeamsMessageBody = ConvertTo-Json $JSONBody
$parameters = @{
"URI" = '[URL]'
"Method" = 'POST'
"Body" = $TeamsMessageBody
"ContentType" = 'application/json'
}
Invoke-RestMethod @parameters
}
# Create activity file to check if script is working
$temp_file_path = "[FILE PATH]"
if (Test-Path $temp_file_path)
{
Remove-Item $temp_file_path
}
[string]$filePath = "[FILE PATH]";
[string]$directory = [System.IO.Path]::GetDirectoryName($filePath);
[string]$strippedFileName = [System.IO.Path]::GetFileNameWithoutExtension($filePath);
[string]$extension = [System.IO.Path]::GetExtension($filePath);
[string]$newFileName = $strippedFileName + "_" + (Get-Date).ToString('MM-dd-yyyy') + $extension;
[string]$newFilePath = [System.IO.Path]::Combine($directory, $newFileName);
New-Item $newFilePath
Also, as already mentioned, the exclude_users script was not needed at all. The only additional script included is the request_database script.

Problem downloading file using Powershell

I have this simple script that works on a laptop:
[Net.ServicePointManager]::SecurityProtocol
[enum]::GetNames([Net.SecurityProtocolType])
$url = "https://www.contextures.com/SampleData.zip"
wget -Uri $url -OutFile "C:\temp\temp.zip"
But when I'm trying to run it on a server I'm always getting this error:
wget : The request was aborted: Could not create SSL/TLS secure
channel.
Any ideas what might be causing this? Any help would be appreciated.
I already tried this and am still getting the same error message:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
Set it to TLS 1.2:
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
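Note that the assignment only affects requests made later in the same session, so it has to run before the download; a minimal sketch combining it with the request from the question (wget is just an alias for Invoke-WebRequest in Windows PowerShell):
# Force TLS 1.2 for this session, then retry the download
[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12
$url = "https://www.contextures.com/SampleData.zip"
Invoke-WebRequest -Uri $url -OutFile "C:\temp\temp.zip"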
You can give this example a try; it adds the following line:
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12';
cls
$Folder = "$Env:Temp\DownloadFolder\"
# Create a subfolder named "DownloadFolder" in the temporary %Temp% directory if it doesn't exist yet
If ((Test-Path -Path $Folder) -eq 0) { New-Item -Path $Folder -ItemType Directory | Out-Null }
$urls = @('https://www.contextures.com/SampleData.zip',
'https://cdn2.unrealengine.com/Fortnite%2FBoogieDown_GIF-1f2be97208316867da7d3cf5217c2486da3c2fe6.gif'
)
$start_time = Get-Date
$AllProtocols = [System.Net.SecurityProtocolType]'Ssl3,Tls,Tls11,Tls12';
[System.Net.ServicePointManager]::SecurityProtocol = $AllProtocols;
$i = 0
foreach($url in $urls)
{
$i = $i + 1
$output = $Folder + $url.Split("/")[-1]
Try {
Write-Host "$i - Downloading" $output.Split("\")[-1] from $url
(New-Object System.Net.WebClient).DownloadFile($url,$output)
}
Catch
{
Write-Host "Error in : " $url -ForegroundColor Red -BackgroundColor Yellow
Write-Host "Message: [$($_.Exception.Message)"] -ForegroundColor Red -BackgroundColor Yellow
$i = $i - 1
}
}
Write-Output "Time taken: $((Get-Date).Subtract($start_time).Seconds) second(s) to download $i files"
ii $Folder

Deploying an SSIS package with dtutil

I am trying to deploy a package with dtutil, however I am getting an error that reads:
Option "-xxxx" is not valid.
This correlates to the SOURCESERVER parameter I am providing, which has a hyphen in it.
$conString = "xxx-xxxx";
$dtUtilQueryToExecute = '/SOURCESERVER "' + $conString + '" /sql "my path" /exists';
$result = dtutil $dtUtilQueryToExecute
I think that dtutil is expecting a new parameter when it reaches the hyphen and cannot escape the hyphen even though it is in double/single quotation marks.
The reason why I am using dtutil is that I need to be able to deploy the same package to many other servers from one location.
So either I need to figure out how to escape the hyphen in the server name, or I need to use an alternative deployment method.
How do I escape the hyphen, or what alternative deployment method can I use?
I believe PowerShell is interpreting the hyphen in the variable. It looks like the wisdom of the internet suggests permutations like
$conString = 'xxx-xxxx'
$conString = "xxx`-xxxx"
Also, no trailing semicolon is required in PowerShell.
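Another option, sketched below under the assumption that the /SOURCESERVER, /SQL and /exists switches from the question are what you need: pass each switch and value as its own argument (or use the stop-parsing token) instead of handing dtutil one big string, so the hyphenated server name stays a single, properly quoted argument.
# Sketch: each array element reaches dtutil.exe as a separate argument
$conString = 'xxx-xxxx'
$dtutilArgs = @(
    '/SourceServer', $conString,
    '/SQL', 'my path',
    '/Exists'
)
$result = & dtutil @dtutilArgs
# Alternative: the stop-parsing token (PowerShell 3.0+) passes everything after it verbatim,
# but note that variables are not expanded after --%
$result = dtutil --% /SourceServer "xxx-xxxx" /SQL "my path" /Exists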
As an alternative method, why not use SMO with PowerShell? That way you can create a script to deploy your package and also create folders, projects, environments and SQL Agent jobs.
Here is an example of something I would use:
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$target_server_name = 'target_server_name'
$source_server_name = 'source_server_name'
$target_database_name = 'target_database_name'
$source_database_name = 'source_database_name'
$environment_name = 'enviornment_name'
$folder_name = 'folder_name'
$project_name = 'project_name'
$ispac_name = 'ispac_name'
$variables = @{
email_alert = 'xx@xx.com'
email_error = 'xx@xx.com'
email_reply = 'noreply@xx.com'
smtp_server = 'outlook.xx.local'
tax_rate = '20'
}
function Write-Message ($msg) {
Write-Host ' [+] ' -ForegroundColor Yellow -NoNewline
Write-Host $msg
}
Write-Host 'Starting deployment' -ForegroundColor DarkGray
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null
Write-Message ("Connecting to integration services on '$source_server_name'")
$conn_str = "Data Source={0};Initial Catalog=master;Integrated Security=SSPI;" -f $source_server_name
$conn = New-Object System.Data.SqlClient.SqlConnection $conn_str
$namespace = 'Microsoft.SqlServer.Management.IntegrationServices'
$intg_serv = New-Object "$namespace.IntegrationServices" $conn
$catalog = $intg_serv.Catalogs['SSISDB']
$folder = $catalog.Folders[$folder_name]
if(!$folder) {
Write-Message 'Creating folder ...'
$folder = New-Object "$($namespace).CatalogFolder" ($catalog, $Folder_name, $Folder_name)
$folder.Create()
} else {
Write-Message 'Folder found ...'
}
Write-Message 'Deploying project file ...'
$path = "$here\..\path_to_your_project\bin\Development\$($ispac_name).ispac"
if(Test-Path $path) {Copy-Item (Resolve-Path $path) $here}
[byte[]]$project_file = [System.IO.File]::ReadAllBytes("$here\crm_imports.ispac")
$dummy = $folder.DeployProject($project_name, $project_file)
$project = $folder.Projects[$project_name]
Write-Message 'Removing old environment ...'
$old = $folder.Environments | Select -ExpandProperty name
$old | % {$folder.Environments[$_].Drop()}
Write-Message "Creating environment '$envionment_name'"
$environment = New-Object "$namespace.EnvironmentInfo" ($folder, $environment_name, $environment_name)
$environment.Create()
$variables.GetEnumerator() | % {
$n = $_.Name
$v = $_.Value
Write-Message " -> adding environment variable $($n): $v"
$environment.Variables.Add($n,[System.TypeCode]::String, $v, $false, $n)
$environment.Alter()
$project.Packages | % {
if($_.Parameters[$n] -ne $null)
{
$_.Parameters[$n].Set([Microsoft.SqlServer.Management.IntegrationServices.ParameterInfo+ParameterValueType]::Referenced, $n)
$_.Alter()
}
}
}
$reference = $project.References[$environment.name, $folder.Name]
if(!$reference) {
$project.References.Add($environment.name, $folder.Name)
$project.Alter()
}
$environment_ref_id = $project.References[0].ReferenceId
$job_name = 'job_name'
$server = New-Object microsoft.sqlserver.management.smo.server $source_server_name
$Job_server = $server.JobServer
$job = $job_server.Jobs[$job_name]
if($job) {$job.Drop()}
Write-Message "Creating SQL agent job '$job_name'"
$job = New-Object Microsoft.SqlServer.Management.SMO.Agent.Job -ArgumentList $Job_server, $job_name
$job.Create()
$job.OwnerLoginName = 'sa'
$job.ApplyToTargetServer($source_server_name)
$job.Alter()
$job_step = New-Object Microsoft.SqlServer.Management.SMO.Agent.JobStep -ArgumentList $job, 'Run SSIS import_donations package'
$job_step.SubSystem = 'Ssis'
$job_step.command = ('/ISSERVER "\"\SSISDB\{3}\{4}\{5}.dtsx\"" ' +
'/SERVER {0} /ENVREFERENCE {1} ' +
'/Par "\"CM.source.ConnectionString\"";' +
'"\"Data Source={0};Initial Catalog={6};Integrated Security=SSPI;Application Name=my_app;\"" ' +
'/Par "\"CM.source.InitialCatalog\"";"\"{6}\"" ' +
'/Par "\"CM.source.ServerName\"";{0} ' +
'/Par "\"CM.target.ConnectionString\"";' +
'"\"Data Source={2};Initial Catalog={7};Integrated Security=SSPI;Application Name=my_app;\"" ' +
'/Par "\"CM.target.InitialCatalog\"";"\"{7}\"" ' +
'/Par "\"CM.target.ServerName\"";{2} ' +
'/Par "\"$ServerOption::LOGGING_LEVEL(Int16)\"";3 ' +
'/Par "\"$ServerOption::SYNCHRONIZED(Boolean)\"";True /CALLERINFO SQLAGENT /REPORTING E') -f $source_server_name, $reference.ReferenceId, $target_server_name, $folder_name, $project_name, $ispac_name, $source_database_name, $target_database_name
$job_step.Create()
$conn.Close()

How to create directories from powershell on FTP server?

I want to run a PS script when I want to publish to an FTP server. I took this script as a starting structure: structure script.
I have a very simple folder:
C:\Uploadftp\Files\doc.txt
C:\Uploadftp\Files\Files2
C:\Uploadftp\Files\Files2\doc2.txt
nothing fancy there.
Here is my script :
cd C:\Uploadftp
$location = Get-Location
"We are here: $location"
$user = "test" # Change
$pass = "test" # Change
## Get files
$files = Get-ChildItem -recurse
## Get ftp object
$ftp_client = New-Object System.Net.WebClient
$ftp_client.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$ftp_address = "ftp://test/TestFolder"
## Make uploads
foreach($file in $files)
{
$directory = "";
$source = $($file.DirectoryName + "/" + $file);
if ($file.DirectoryName.Length -gt 0)
{
$directory = $file.DirectoryName.Replace($location,"")
}
$directory = $directory.Replace("\","/")
$source = $source.Replace("\","/")
$directory += "/";
$ftp_command = $($ftp_address + $directory + $file)
# Write-Host $source
$uri = New-Object System.Uri($ftp_command)
"Command is " + $uri + " file is $source"
$ftp_client.UploadFile($uri, $source)
}
I keep getting this error :
Exception calling "UploadFile" with "2" argument(s): "An exception occurred during a WebClient request."
If I hardcode a specific folder for $uri and point the source at some specific folder on my computer, this script doesn't create a directory, it creates a file. What am I doing wrong?
P.S. Don't hit me too hard, it's my first time ever doing something in PowerShell.
Try the "Create-FtpDirectory" function from https://github.com/stej/PoshSupport/blob/master/Ftp.psm1
function Create-FtpDirectory {
param(
[Parameter(Mandatory=$true)]
[string]
$sourceuri,
[Parameter(Mandatory=$true)]
[string]
$username,
[Parameter(Mandatory=$true)]
[string]
$password
)
if ($sourceUri -match '\\$|\\\w+$') { throw 'sourceuri should end with a file name' }
$ftprequest = [System.Net.FtpWebRequest]::Create($sourceuri);
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::MakeDirectory
$ftprequest.UseBinary = $true
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$response = $ftprequest.GetResponse();
Write-Host Upload File Complete, status $response.StatusDescription
$response.Close();
}
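A hypothetical call, with the server URI, folder name and credentials as placeholders:
# Creates the "newfolder" directory on the FTP server
Create-FtpDirectory -sourceuri "ftp://ftp.example.com/newfolder" -username "test" -password "test"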

Upload files with FTP using PowerShell

I want to use PowerShell to transfer files with FTP to an anonymous FTP server. I would not use any extra packages. How?
I am not sure you can 100% bulletproof the script against hanging or crashing, as there are things outside your control (what if the server loses power mid-upload?), but this should provide a solid foundation for getting you started:
# create the FtpWebRequest and configure it
$ftp = [System.Net.FtpWebRequest]::Create("ftp://localhost/me.png")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("anonymous","anonymous@localhost")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("C:\me.png")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
There are some other ways too. I have used the following script:
$File = "D:\Dev\somefilename.zip";
$ftp = "ftp://username:password#example.com/pub/incoming/somefilename.zip";
Write-Host -Object "ftp url: $ftp";
$webclient = New-Object -TypeName System.Net.WebClient;
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp;
Write-Host -Object "Uploading $File...";
$webclient.UploadFile($uri, $File);
And you could run a script against the Windows FTP command-line utility using the following command:
ftp -s:script.txt
(Check out this article)
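A minimal script.txt sketch for that approach (host, credentials and paths are placeholders); running it as ftp -n -s:script.txt lets the user line in the script supply the credentials instead of the auto-login prompt:
open ftp.example.com
user anonymous anonymous@example.com
binary
put C:\local\path\file.zip file.zip
quit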
The following question on SO also answers this: How to script FTP upload and download?
I'm not gonna claim that this is more elegant than the highest-voted solution...but this is cool (well, at least in my mind LOL) in its own way:
$server = "ftp.lolcats.com"
$filelist = "file1.txt file2.txt"
"open $server
user $user $password
binary
cd $dir
" +
($filelist.split(' ') | %{ "put ""$_""`n" }) | ftp -i -in
As you can see, it uses that dinky built-in Windows FTP client. Much shorter and more straightforward, too. Yes, I've actually used this and it works!
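The snippet assumes $user, $password and $dir are already defined alongside $server and $filelist; a hypothetical setup:
# Placeholder values for the variables the snippet expects
$user     = "anonymous"
$password = "anonymous@example.com"
$dir      = "incoming"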
Easiest way
The most trivial way to upload a binary file to an FTP server using PowerShell is using WebClient.UploadFile:
$client = New-Object System.Net.WebClient
$client.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$client.UploadFile(
"ftp://ftp.example.com/remote/path/file.zip", "C:\local\path\file.zip")
Advanced options
If you need greater control that WebClient does not offer (like TLS/SSL encryption, etc.), use FtpWebRequest. An easy way is to just copy a FileStream to the FTP stream using Stream.CopyTo:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$fileStream.CopyTo($ftpStream)
$ftpStream.Dispose()
$fileStream.Dispose()
Progress monitoring
If you need to monitor an upload progress, you have to copy the contents by chunks yourself:
$request = [Net.WebRequest]::Create("ftp://ftp.example.com/remote/path/file.zip")
$request.Credentials =
New-Object System.Net.NetworkCredential("username", "password")
$request.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$fileStream = [System.IO.File]::OpenRead("C:\local\path\file.zip")
$ftpStream = $request.GetRequestStream()
$buffer = New-Object Byte[] 10240
while (($read = $fileStream.Read($buffer, 0, $buffer.Length)) -gt 0)
{
$ftpStream.Write($buffer, 0, $read)
$pct = ($fileStream.Position / $fileStream.Length)
Write-Progress `
-Activity "Uploading" -Status ("{0:P0} complete:" -f $pct) `
-PercentComplete ($pct * 100)
}
$ftpStream.Dispose()
$fileStream.Dispose()
Uploading folder
If you want to upload all files from a folder, see
PowerShell Script to upload an entire folder to FTP
I recently wrote several PowerShell functions for communicating with FTP, see https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1. With the second function below, you can send a whole local folder to FTP. The module even has functions for removing/adding/reading folders and files recursively.
#Add-FtpFile -ftpFilePath "ftp://myHost.com/folder/somewhere/uploaded.txt" -localFile "C:\temp\file.txt" -userName "User" -password "pw"
function Add-FtpFile($ftpFilePath, $localFile, $username, $password) {
$ftprequest = New-FtpRequest -sourceUri $ftpFilePath -method ([System.Net.WebRequestMethods+Ftp]::UploadFile) -username $username -password $password
Write-Host "$($ftpRequest.Method) for '$($ftpRequest.RequestUri)' complete'"
$content = [System.IO.File]::ReadAllBytes($localFile)
$ftprequest.ContentLength = $content.Length
$requestStream = $ftprequest.GetRequestStream()
$requestStream.Write($content, 0, $content.Length)
$requestStream.Close()
$requestStream.Dispose()
}
#Add-FtpFolderWithFiles -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/somewhere/" -userName "User" -password "pw"
function Add-FtpFolderWithFiles($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpDirectory $destinationFolder $userName $password
$files = Get-ChildItem $sourceFolder -File
foreach($file in $files) {
$uploadUrl ="$destinationFolder/$($file.Name)"
Add-FtpFile -ftpFilePath $uploadUrl -localFile $file.FullName -username $userName -password $password
}
}
#Add-FtpFolderWithFilesRecursive -sourceFolder "C:\temp\" -destinationFolder "ftp://myHost.com/folder/" -userName "User" -password "pw"
function Add-FtpFolderWithFilesRecursive($sourceFolder, $destinationFolder, $userName, $password) {
Add-FtpFolderWithFiles -sourceFolder $sourceFolder -destinationFolder $destinationFolder -userName $userName -password $password
$subDirectories = Get-ChildItem $sourceFolder -Directory
$fromUri = new-object System.Uri($sourceFolder)
foreach($subDirectory in $subDirectories) {
$toUri = new-object System.Uri($subDirectory.FullName)
$relativeUrl = $fromUri.MakeRelativeUri($toUri)
$relativePath = [System.Uri]::UnescapeDataString($relativeUrl.ToString())
$lastFolder = $relativePath.Substring($relativePath.LastIndexOf("/")+1)
Add-FtpFolderWithFilesRecursive -sourceFolder $subDirectory.FullName -destinationFolder "$destinationFolder/$lastFolder" -userName $userName -password $password
}
}
Here's my super cool version BECAUSE IT HAS A PROGRESS BAR :-)
Which is a completely useless feature, I know, but it still looks cool \m/ \m/
$webclient = New-Object System.Net.WebClient
Register-ObjectEvent -InputObject $webclient -EventName "UploadProgressChanged" -Action { Write-Progress -Activity "Upload progress..." -Status "Uploading" -PercentComplete $EventArgs.ProgressPercentage } > $null
$File = "filename.zip"
$ftp = "ftp://user:password#server/filename.zip"
$uri = New-Object System.Uri($ftp)
try{
$webclient.UploadFileAsync($uri, $File)
}
catch [Net.WebException]
{
Write-Host $_.Exception.ToString() -foregroundcolor red
}
while ($webclient.IsBusy) { continue }
PS. It helps a lot when I'm wondering "did it stop working, or is it just my slow ADSL connection?"
You can simply handle file uploads through PowerShell, like this.
Complete project is available on Github here https://github.com/edouardkombo/PowerShellFtp
#Directory where to find pictures to upload
$Dir= 'c:\fff\medias\'
#Directory where to save uploaded pictures
$saveDir = 'c:\fff\save\'
#ftp server params
$ftp = 'ftp://10.0.1.11:21/'
$user = 'user'
$pass = 'pass'
#Connect to ftp webclient
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
#Initialize var for infinite loop
$i=0
#Infinite loop
while($i -eq 0){
#Pause 1 seconde before continue
Start-Sleep -sec 1
#Search for pictures in directory
foreach($item in (dir $Dir "*.jpg"))
{
#Set default network status to 1
$onNetwork = "1"
#Get picture creation dateTime...
$pictureDateTime = (Get-ChildItem $item.fullName).CreationTime
#Convert dateTime to timeStamp
$pictureTimeStamp = (Get-Date $pictureDateTime).ToFileTime()
#Get actual timeStamp
$timeStamp = (Get-Date).ToFileTime()
#Get picture lifeTime
$pictureLifeTime = $timeStamp - $pictureTimeStamp
#We only treat pictures that are fully written on the disk
#So, we put a 2 second delay to ensure even big pictures have been fully written to disk
if($pictureLifeTime -gt "2") {
#If upload fails, we set network status at 0
try{
$uri = New-Object System.Uri($ftp+$item.Name)
$webclient.UploadFile($uri, $item.FullName)
} catch [Exception] {
$onNetwork = "0"
write-host $_.Exception.Message;
}
#If upload succeeded, we do further actions
if($onNetwork -eq "1"){
"Copying $item..."
Copy-Item -path $item.fullName -destination $saveDir$item
"Deleting $item..."
Remove-Item $item.fullName
}
}
}
}
You can use this function :
function SendByFTP {
param (
$userFTP = "anonymous",
$passFTP = "anonymous",
[Parameter(Mandatory=$True)]$serverFTP,
[Parameter(Mandatory=$True)]$localFile,
[Parameter(Mandatory=$True)]$remotePath
)
if(Test-Path $localFile){
$remoteFile = $localFile.Split("\")[-1]
$remotePath = Join-Path -Path $remotePath -ChildPath $remoteFile
$ftpAddr = "ftp://${userFTP}:${passFTP}#${serverFTP}/$remotePath"
$browser = New-Object System.Net.WebClient
$url = New-Object System.Uri($ftpAddr)
$browser.UploadFile($url, $localFile)
}
else{
Return "Unable to find $localFile"
}
}
This function sends the specified file by FTP.
You must call the function with these parameters:
userFTP = "anonymous" by default or your username
passFTP = "anonymous" by default or your password
serverFTP = IP address of the FTP server
localFile = File to send
remotePath = the path on the FTP server
For example :
SendByFTP -userFTP "USERNAME" -passFTP "PASSWORD" -serverFTP "MYSERVER" -localFile "toto.zip" -remotePath "path/on/the/FTP/"
Goyuix's solution works great, but as presented it gives me this error: "The requested FTP command is not supported when using HTTP proxy."
Adding this line after $ftp.UsePassive = $true fixed the problem for me:
$ftp.Proxy = $null;
Simple solution if you can install curl.
curl.exe -p --insecure "ftp://<ftp_server>" --user "user:password" -T "local_file_full_path"