Stop shared folders except the default shares, remote IPC and remote admin - PowerShell

All my Windows clients have default shared folders like this:
C$
D$
IPC$
ADMIN$
But I want to delete or stop shared folders other than those. How can I delete or stop the shared folders, except the default share folders, from the command line?
Or maybe there is a script I can use in PowerShell.

Continuing from my comments.
This is not a PowerShell code issue, or even a cmd.exe one. It is a Windows OS configuration misunderstanding.
As per the MS Docs (this article talks about WS2K8, but it applies to all server and client versions):
How to remove administrative shares in Windows Server 2008
https://learn.microsoft.com/en-us/troubleshoot/windows-server/networking/remove-administrative-shares
By default, Windows Server 2008 automatically creates
special hidden administrative shares that administrators, programs,
and services can use to manage the computer environment or network.
These special shared resources aren't visible in Windows Explorer or
in My Computer. However, you can view them by using the Shared Folders
tool in Computer Management. Depending on the configuration of your
computer, some or all of the following special shared resources may be
listed in the Shares folder in Shared Folders:
DriveLetter$: It's a shared root partition or volume. Shared root
partitions and volumes are displayed as the drive letter name appended
with the dollar sign ($). For example, when drive letters C and D are
shared, they're displayed as C$ and D$.
ADMIN$: It's a resource that is used during remote administration of a
computer.
IPC$: It's a resource that shares the named pipes that you must have
for communication between programs. This resource cannot be deleted.
NETLOGON: It's a resource that is used on domain controllers.
SYSVOL: It's a resource that is used on domain controllers.
PRINT$: It's a resource that is used during the remote administration
of printers.
FAX$: It's a shared folder on a server that is used by fax clients
during fax transmission.
Generally, we recommend that you don't modify these special shared resources. However, if you want to remove the special shared
resources and prevent them from being created automatically, you can
do it by editing the registry.
So, if you choose to do this, you can do it via the registry, using cmd.exe or PowerShell cmdlets. But you really need to understand the impact before you do.
Again, as per MS:
Generally, we recommend that you don't modify these special shared resources.
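If you do go down that road, here is a minimal PowerShell sketch of the registry change that article describes (run it elevated; AutoShareWks is the value for client SKUs, AutoShareServer the equivalent for server SKUs, and IPC$ cannot be removed this way):
# Stop Windows from recreating the automatic administrative shares (C$, D$, ADMIN$).
$params = 'HKLM:\SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters'
New-ItemProperty -Path $params -Name AutoShareWks -PropertyType DWord -Value 0 -Force
# The change only takes effect once the Server service restarts (or after a reboot).
Restart-Service -Name LanmanServer -Force
# If what you actually want is to keep the defaults and only drop the extra shares
# you created yourself, Remove-SmbShare is enough and no registry edit is needed:
Get-SmbShare -Special $false | Remove-SmbShare -WhatIf   # review the output, then remove -WhatIf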

Related

Firebase hosting: The remote web server hosts what may be a publicly accessible .bash_history file

We host our website on Firebase. We fail a security check for the following reason:
The remote web server hosts publicly available files whose contents may be indicative of a typical bash history. Such files may contain sensitive information that should not be disclosed to the public.
The following .bash_history files are available on the remote server:
- /.bash_history
- /cgi-bin/.bash_history
- /scripts/.bash_history
For each file the scanner adds the same note: it is being flagged because the scan is set to 'Paranoid', and its contents have not been inspected to see whether they contain any of the common Linux commands one might expect in a typical .bash_history file.
The problem is that we don't have an easy way to get access to the hosting machine and delete these files.
Does anybody know how this can be solved?
If you are using Firebase Hosting, you should check the directory (usually public) that you are uploading via the firebase deploy command. Hosting serves only those files (plus a couple of auto-generated ones under the reserved __/ path for auto-configuration).
If you have a .bash_history, cgi-bin/.bash_history or scripts/.bash_history in that public directory, then it will be uploaded to and served by Hosting. There are no automatically served files with those names.
You can check your public directory, and update the list of files to ignore on the next deploy using the firebase.json file (see this doc). You can also download all the files that Firebase Hosting is serving for you using this script.
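For reference, a minimal firebase.json along those lines could look like the snippet below; the public folder name and the ignore patterns are just the defaults that firebase init generates, and the "**/.*" pattern is what keeps dotfiles such as .bash_history out of the deploy:
{
  "hosting": {
    "public": "public",
    "ignore": [
      "firebase.json",
      "**/.*",
      "**/node_modules/**"
    ]
  }
}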

Unix permissions needed when running Powershell script

As a final step in our AD account creation process, which is being moved to a PowerShell script, a few folders need to be created on the filer for users, and I am coming unstuck with permissions.
I am just using the basic New-Item command to create the folders, but the locations need Unix permissions (775) set before anything can be created. I can't go there and right-click in Windows Explorer and click New, and the PowerShell script is being bounced due to permissions as well.
The reasoning from one of the tech guys here is that I am trying to create a subfolder via an SMB mount from Windows using NTFS permissions. There is no correlation to Unix permissions, and none of our Linux users will be able to access or use the location created for them.
Sorry if that is a clumsy way of explaining it; I am not a systems engineer, just the guy trying to translate a whole heap of Perl scripts into a new PowerShell process.
Thank you
S.
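To make the mismatch concrete, this is roughly what that step of the script does today (the filer path and user name are placeholders, not our real ones):
# Placeholder path and user: the real script derives these from the new AD account.
$userFolder = '\\filer\home\jsmith'
# This is the call that gets bounced: the create goes over the SMB mount and only
# carries Windows/NTFS permissions with it, and the target needs Unix 775 first.
New-Item -Path $userFolder -ItemType Directory
# Even if it succeeded, no Unix mode bits would be set on the underlying export,
# which is the objection the tech guys raised; that part has to happen on the filer.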

How to use Powershell to script a domain user's temporary file location

I am writing deployment scripts using PowerShell to install Scheduled Tasks, Windows Services and IIS App Pools.
Each of these items will be run under the identity of an Active Directory domain user. My issue is that the business rules enforced on the servers state that no process or user can write to the C drive.
Therefore I need to direct each installed object to use the E drive for temporary storage of any kind.
How can I assign the temp directory environment variable of a domain user using PowerShell, on a server that will have no 'knowledge' of that domain user until I instantiate the installed objects?
When it comes to the IIS app pools, I have found a (hacky) solution that could potentially work:
https://serverfault.com/questions/711470/applicationpoolidentity-environment-variables-iis
It requires me to set the app pool to load the user profile, fire up the pool, snoop the registry keys to obtain the SID, and then modify registry keys to set the environment variable for the temp drive.
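For what it's worth, the registry part of that approach can be scripted roughly like this; the account name and target path are placeholders, and it assumes the account's profile has already been created on the server so its hive is loaded under HKEY_USERS:
# Resolve the SID of the domain account (placeholder name).
$account = New-Object System.Security.Principal.NTAccount('CONTOSO', 'svc-deploy')
$sid     = $account.Translate([System.Security.Principal.SecurityIdentifier]).Value
# Expose HKEY_USERS as a drive so the user's loaded hive is reachable.
New-PSDrive -Name HKU -PSProvider Registry -Root HKEY_USERS -ErrorAction SilentlyContinue | Out-Null
# Point the per-user TEMP/TMP variables at the E drive.
$envKey = "HKU:\$sid\Environment"
Set-ItemProperty -Path $envKey -Name TEMP -Value 'E:\Temp'
Set-ItemProperty -Path $envKey -Name TMP  -Value 'E:\Temp'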
Is there an easier way? And how could I do this for services and scheduled tasks?
Pie in the sky: I write one PowerShell script to modify the temp environment variable for this one domain account before installing any of these objects, and then when they are installed it "just works".
Any suggestions?

Get client connections on cifs share from command line equal to Computer Management view

I would like to use either the command line or PowerShell to gather the number of client connections on different shares from our NASes (multiple vendors).
Connecting "Computer Management" to a NAS device gives us a nice overview of the connections per share under System Tools -> Shared Folders -> Shares. Is it possible to get the same information using a cmd or PowerShell script?
Get-SmbShare, as mentioned by PetSerAl, can list file shares on Windows machines (Windows 8/Server 2012 and later). I do not know whether it can enumerate shares on other vendors' products that are visible as SMB shares to a Windows machine.
Get-SmbShare | Select-Object Name, ScopeName, Path, CurrentUsers
For NetApp products, the DataONTAP PowerShell Toolkit has scripts for managing shares, but I'm not sure of the specific commands you would need for the desired output.
For Windows 7/Server 2008 R2, this article discusses a way to use WMI directly to query file shares.
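As a rough sketch of that WMI route (assuming the target is a Windows box, or a device that genuinely exposes these classes; add -ComputerName for a remote machine):
# Enumerate the shares themselves.
Get-WmiObject -Class Win32_Share | Select-Object Name, Path, Description
# Each client connection to a share is one Win32_ServerConnection instance,
# so grouping by share name gives a connection count per share.
Get-WmiObject -Class Win32_ServerConnection |
    Group-Object -Property ShareName |
    Select-Object Name, Count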
For other products, you may want to contact their respective support teams.

How to push an enterprise-wide PowerShell profile customization?

As I've explained in my other question, I'm busy setting up a PowerShell module repository in my enterprise.
My plan is to have a master repository (r/w access to a limited group of people) and slave repositories (read only access to everyone). I need multiple repositories because clients are located in different security zones and I can't have a central location reachable by all clients.
For this reason, I need to configure the PowerShell profile of the clients so that they can point to the correct repository to find the modules. I would like to define a $PowerShellRepositoryPath environment variable for this purpose.
Also, the profile needs to be customized in order for it to execute a script located in the repository (thus where $PowerShellRepositoryPath points to) when PowerShell starts (my goal here is to automatically add the latest module versions to the PSModulePath of the clients on startup).
We have a mixed environment with domain members and stand-alone servers in different network zones.
How would you proceed? Is it possible to push that variable and the profile via a GPO for domain members? Would customizing the $Profile variable via GPO be an option?
What about the standalone servers?
Edit:
I think that for creating the environment variable, I'll just use a GPO to create it and consume it in PowerShell via $env:variableName. For non-domain situations, I'll probably have to use a script though.
I am not sure about pushing $profile via GPO, but I'd simply use a logon script that copies the profile script from a network location based on the user's group/security membership.
Well, if you're going to change the path to the modules, I'd keep a file in the repository (say current.txt) that contains the name of the current module (or the current file path, whichever you are changing). Then have the $profile script read the contents of that file and set the variable accordingly. This way you don't have to mess around with updating the profile scripts: just update current.txt in the central repository with the path (or the partial path, the part that changes, or the filename, or whatever), and when it replicates to the client repositories, every PowerShell profile picks up the latest modules the next time the profile script runs.
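A minimal sketch of that profile logic, assuming the $PowerShellRepositoryPath environment variable from the question and a current.txt that holds a relative module path (both of those are assumptions, adjust to your layout):
# Repository location pushed out to the client (via GPO or script).
$repo = $env:PowerShellRepositoryPath
if ($repo -and (Test-Path (Join-Path $repo 'current.txt'))) {
    # current.txt contains the relative path of the current module set, e.g. Modules\2.4.1
    $current    = (Get-Content (Join-Path $repo 'current.txt') -TotalCount 1).Trim()
    $modulePath = Join-Path $repo $current
    # Prepend it to PSModulePath once, so the latest modules resolve first.
    if ($env:PSModulePath -notlike "*$modulePath*") {
        $env:PSModulePath = "$modulePath;$env:PSModulePath"
    }
}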
Out of curiosity, why not just overwrite the module files in the client repositories with the latest version? If you did it that way, all clients would always have the latest versions, and you wouldn't have to update the $profile scripts.
Alternatively, you could always write another script to replace the $profile script on all machines. I think the first route I suggested would be the cleanest way of doing what you are after.
As far as the GPO approach goes, I don't believe you can do this. There is no GPO setting that controls what is in the profile script. You could maybe do it with a custom ADM file, but the profile script path is not controlled by the registry, so no go there.