I often have to move a large number of files from one part of SharePoint to another, and the GUI often has issues with data loss or duplication, as well as being extremely repetitive and time-intensive.
Ideally I'd like a way to navigate SharePoint files as if they were any other files in a file system on the command line. Is such a thing even possible? If not, is there at least a way to cp files from one directory into another?
Tutorials like this one make it seem easy, except the file paths never actually match up with any path I'd expect:
https://www.sharepointdiary.com/2018/03/sharepoint-online-move-files-using-powershell.html
When attempting to use the SPO or PnP module in PowerShell, the documentation is pretty unclear. Get-PnPFile always returns "file not found" if I try to use /Documents/Foldername, as one would expect to work. Even if I right-click, copy the link, get that messy URL, and make sure to deal with the ampersands, it still doesn't work. For example:
Get-PnPFile -Url "https://domain.sharepoint.com/sites/team/Documents/file"
I would expect this to, well, return an object pointing to the file, but it never works.
One possible complication is that MFA is required in the environment I'm using, which seems to require the -UseWebLogin flag. That appears to work without errors, but it also appeared to work when I mistyped the team name in the URL passed to Connect-PnPOnline, so maybe there is an issue there?
First, for MFA it is better to use Connect-PnPOnline -Url "https://domain.sharepoint.com" -Interactive, as the comment on your original post suggests.
As for the "file not found" error, it seems you are not using the correct URL. Try the site-relative URL instead.
Here is what I have tested:
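A sketch of that approach, with the site, library, folder and file names as placeholders (adjust them to your tenant):

# Interactive login handles MFA
Connect-PnPOnline -Url "https://domain.sharepoint.com/sites/team" -Interactive

# Use the server-relative URL of the file; note the library's internal name,
# which is often "Shared Documents" even when it displays as "Documents"
Get-PnPFile -Url "/sites/team/Shared Documents/Foldername/file.docx" -AsFile -Path "C:\Temp" -FileName "file.docx"

# Copying within SharePoint can then be done server-side as well
Copy-PnPFile -SourceUrl "/sites/team/Shared Documents/Foldername/file.docx" -TargetUrl "/sites/team/Shared Documents/OtherFolder" -Force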
So, the solution ended up being the "sync" feature, which creates a local alias that can be manipulated like a regular file on the machine using PowerShell. There still doesn't seem to be a straightforward way to interact with the SharePoint file system directly from the command line, which is bonkers.
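Once the library is synced, the ordinary file cmdlets work against the local sync folder; the exact local path varies by tenant, site and user, so the one in this sketch is only an example:

# Copy a file between two folders of the synced library; the sync path is machine-specific
$library = "$env:USERPROFILE\Contoso\Team Site - Documents"
Copy-Item -Path "$library\Foldername\file.docx" -Destination "$library\OtherFolder\" -Force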
About half the time when I click on WGET script following a CMIP6 data search on the ESGF (LLNL node), I get a wget script that only points to one, unrelated file. It's always the same one, too. Here's the relevant line that shows up in each wget file:
download_files="$(cat <<EOF--dataset.file.url.chksum_type.chksum
'famipc5_ne120_v0.3_00001_01_198001_198401_climo.nc' 'http://esgf.anl.gov/thredds/fileServer/esg_dataroot/ACME/climo/amip/v0_3/atm/mon/native/ne120/ens1/famipc5_ne120_v0.3_00001_01_198001_198401_climo.nc' 'SHA256' 'e5040c5df9d080437418943f02a41e84712dbe1c4a69982447712d7c7334241d'
EOF--dataset.file.url.chksum_type.chksum
)"
This happens with a wide variety of datasets. Here's one file where that happens, for example:
CMIP6.CMIP.CCCma.CanESM5.amip.r1i1p1f1.day.pr.gn
I've been searching for a reason, so far without success. A workaround is to hit the "download HTML" button 1,000 times, once for each individual file I need (or to set up a Globus endpoint for the files where that's possible), but that's very inconvenient and doesn't provide the functionality of a bash script.
Does anyone know what may be causing this? Is there some sort of limit on how many wget scripts an ESGF user can generate per day, with these placeholder scripts served once the limit is hit?
Grateful for any insight!
PS: I apologize for the cdo tag; I know this isn't a cdo problem, but it's hard to find relevant tags for this, and I figured that community may know what's up.
Turns out this is a browser issue. Repeating the search in Chrome fixed it.
(Also, Stack Overflow may not have been the right venue for this question, but I want it to be searchable somewhere at least.)
After trying to use ConvertTo-WebApplication and New-WebApplication, I've had no luck converting a folder to an app when it's located on a network share.
I receive a "Path doesn't belong to 'WebAdministration' provider" error.
I've scoured the internet trying to find a solution, including several links from stackoverflow.
Basically I'm making a GUI script to create a website and convert folders to web apps. I have two comboboxes for selecting existing websites and AppPools, and the GUI also lets me create AppPools. The GUI works great; I'm just trying to get this one feature of it going.
This feature of the script is something that is required by management.
New-WebApplication -Name $Name2 -Site $SiteN -PhysicalPath $Path -ApplicationPool $AppP
(also)
ConvertTo-WebApplication -PSPath "\\domain.local\webstuff\WebSites\Dev\Internet\foldername\scriptdevsite\scriptdevsite2\SiteApp1"
Now, I realize that ConvertTo-WebApplication is meant for converting virtual directories to apps, but I wanted to try it and see whether it would work.
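For reference, as far as I can tell the provider error means the -PSPath has to be an IIS: drive path rather than the folder's UNC path on disk; here is a sketch of both forms, where the site and pool names are placeholders:

Import-Module WebAdministration

# Creating the application directly, with the physical path on the share
New-WebApplication -Name "SiteApp1" -Site "Dev" -ApplicationPool "DevPool" -PhysicalPath "\\domain.local\webstuff\WebSites\Dev\Internet\foldername\scriptdevsite\scriptdevsite2\SiteApp1"

# Converting an existing folder under the site; -PSPath is the IIS: provider path, not the UNC path
ConvertTo-WebApplication -PSPath "IIS:\Sites\Dev\scriptdevsite\scriptdevsite2\SiteApp1" -ApplicationPool "DevPool"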
The New-Website command works great, but it makes the apps and leaves the existing folders, so I end up with scriptdevsite2\SiteApp1\SiteApp1, which isn't going to work.
We have websites that have physical folders that sometimes need to be converted to an application and occasionally removed for testing.
I'm likely going to have to implement some C# for this; I've been looking into the Application class, but it hasn't been very helpful, at least not to me.
I'm still new to this site, so if there's anything I'm leaving out, I apologize in advance.
Thank you for your help
I am writing a script with a lot of modules, but I don't really want the user to see my source code, so I figured I'd encode everything in Base64; even though that's basic, the user won't be able to decode it.
I tried to somehow add an encoded module but no luck.
So my question is -
Is it possible to import a Base64-encoded module into the main script file?
If you have any better solutions to hide source code please share, I would be more than happy to try them out.
P.S. I tried to find some info on making .dll files, but found out I would have to rewrite the script in C# (if I didn't miss anything).
Also, I tried to put all the modules into one encoded file, but then the file gets too big and PowerShell is no longer able to process it.
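For what it's worth, the kind of import I have in mind would decode the module text and load it in memory; a sketch, with the file and module names as placeholders:

# Read the Base64-encoded module text, decode it, and import it as an in-memory module
$encoded = Get-Content -Path ".\MyModule.b64" -Raw
$decoded = [System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String($encoded))
New-Module -Name MyModule -ScriptBlock ([ScriptBlock]::Create($decoded)) | Import-Module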
You've got two options, which can be combined if you would like to be extremely sure that no one will be able to access your code.
Making your code into an exe was already mentioned; there are several projects that do this, but This one is nice as it is wholly contained within PS.
The other, imo better, method is to use an obfuscator, which will take your code, replace variable names with nonsense strings, and make other changes that leave the code very difficult to read. It's still possible to work out what it does, but generally not worth the effort; you can find a working one Here.
I do have to add that obfuscating your code really goes against the PowerShell ethos, and I recommend against it unless you have some sort of requirement being passed down from management. Please also note that this is NOT an acceptable way to obscure code that includes passwords, API keys, or any other information that needs to be secured, as all of those are quite easy to extract from code obfuscated this way.
You could change your ps1 to an exe file by using
https://ps2exe.codeplex.com/
You'd still be able to get at the code if you tried, but it would prevent a casual look.
Why do you want to hide the modules?
I have been given a task that involves downloading a single file every day from a website. Let's call it "https://test.example.com". I have credentials that allow me to log in to the site, where a Flash interface then presents the files that are available for download. After the file is downloaded, it is processed in a variety of ways. I have already put together the PowerShell that handles all of that; I am just having a hard time automating the actual download of the file.
I used the Flash interface to download a few files while watching the network activity, and found that it is actually pulling the file from this URL:
https://test.example.com/link/EBDB7F67EF3B28XX99NCAD9920160423/file.zip
Therefore, I was able to put this together in order to automatically get the file via my PS script:
$url = 'https://test.example.com/link/EBDB7F67EF3B28XX99NCAD9920160423/file.zip'
$output = "C:\Downloads\file.zip"
Invoke-WebRequest -Uri $url -OutFile $output
However, the long string of numbers in the URL changes every day. The only discernible pattern I can find is that the last eight digits are always the date on which that particular file is posted.
Is there a good way to approach this? I've been experimenting with wildcards and patterns, as well as checking the HTML for elements that I can filter, but I am having a hard time finding the correct solution.
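At minimum, the date portion of the URL can be generated in the script; the leading token below is just the one from the example above, used as a placeholder, since I don't know how the rest of the string is produced:

# Build today's URL from the known pattern (the last eight digits are yyyyMMdd);
# the leading token is a placeholder -- the real one changes daily
$prefix = "EBDB7F67EF3B28XX99NCAD99"
$date   = Get-Date -Format 'yyyyMMdd'
$url    = "https://test.example.com/link/$prefix$date/file.zip"
Invoke-WebRequest -Uri $url -OutFile "C:\Downloads\file.zip"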
This is very hard to automate. You can't drive Flash from a script unless it is specifically designed for that. As I see it, your only options are:
1. Contact the site devs if possible; maybe they can give you details on the function that generates the link. This gives me an idea - perhaps you can reverse engineer the Flash code to find that function yourself, using a Flash decompiler.
2. Simulate a user browsing the Flash site. This can be done in one of the following ways:
- AutoHotkey - you can record mouse clicks relative to the browser window and replay the script. Unless the Flash interface is too dynamic and unpredictable, it will work.
- Sikuli - another automation tool, which relies on recognizing image segments on screen.
Both approaches under 2 produce fragile automation code, as they depend on browser settings (zoom, theme) and even OS settings. For this reason you will in all probability need to dedicate one machine to it (a virtual machine, of course). Decompiling the Flash code and re-implementing the URL-generating logic in PowerShell is the only way to make it 100% reliable.
As somebody said in the comments, this is not really a PowerShell question but a browser-automation question.
I'm hoping someone can help. I've started using the Community TFS Build Extensions, in particular the FTP activity. I followed the documentation here and got to grips with it pretty easily. I'm encountering one major problem though.
My Web app has a basic enough structure:
I start by creating the FindMatchingFiles activity, which places the files in the drop location into an IEnumerable variable called FilesToFTP:
String.Format("{0}\**\*.*", BuildDetail.DropLocation)
When I iterate through the variable and print out the results, all seems correct:
G:\builds\Build.1203\CredentialManagement\bin\BusLogic.dll
G:\builds\Build.1203\CredentialManagement\css\style.css
G:\builds\Build.1203\CredentialManagement\AppError.aspx
......
G:\builds\Build.1203\CredentialManagement\Web.config
etc etc.
The problem is, when I pass that IEnumerable to the Ftp activity (converting it to a string array), it uploads all the files to the server, but it doesn't keep the directory structure of my web app. It just piles all the output (dlls, aspx, etc.) into one directory. See the following two screenshots.
Is there any way I can use the FTP activity to upload all the output from the drop location recursively? I feel like I'm doing something simple wrong.
The FTP activity in TFS Build Extensions doesn't upload files recursively.
I think it would be a good addition to the activity. Please create a request on the project and we will add it. For now, you can work around it by calling the Ftp activity once per directory and setting the RemoteDirectory for each one.
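If you would rather script the workaround than chain activities, the same per-directory idea can be done outside the build workflow with .NET's FtpWebRequest; the server, credentials and drop path below are placeholders, and the remote folders are assumed to exist already (they can be created with the MakeDirectory FTP method if not):

# Mirror the drop location's folder structure on the FTP server
$drop    = "G:\builds\Build.1203\CredentialManagement"
$ftpRoot = "ftp://ftp.example.com/site"
$cred    = New-Object System.Net.NetworkCredential("user", "password")

Get-ChildItem -Path $drop -Recurse -File | ForEach-Object {
    # Rebuild the path relative to the drop location so the remote tree matches the local one
    $relative = $_.FullName.Substring($drop.Length).TrimStart('\').Replace('\', '/')
    $request  = [System.Net.FtpWebRequest]::Create("$ftpRoot/$relative")
    $request.Method      = [System.Net.WebRequestMethods+Ftp]::UploadFile
    $request.Credentials = $cred
    $bytes  = [System.IO.File]::ReadAllBytes($_.FullName)
    $stream = $request.GetRequestStream()
    $stream.Write($bytes, 0, $bytes.Length)
    $stream.Close()
    $request.GetResponse().Close()
}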