update merge path in word doc - merge

I have 2000 documents that were created by an incorrectly configured application and now point to an invalid merge data path.
e.g. c:\APPNAME\WP\MERGE.TXT
The files now live in H:\MERGE.TXT so all users can access them.
Is there a way to update this path without opening each file in MS Word and reselecting the data source?
Looking forward to your replies.
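One possible approach (an untested sketch: it assumes Word is installed on the machine running it, that the documents live under a single folder, which is hypothetical here, and that they are mail merge main documents) is to drive Word through COM from PowerShell and repoint each data source:
#hypothetical folder holding the 2000 documents
$docs = Get-ChildItem "C:\DocsToFix" -Include *.doc,*.docx -Recurse
$word = New-Object -ComObject Word.Application
$word.Visible = $false
$word.DisplayAlerts = 0   #suppress prompts while batch processing
foreach ($file in $docs) {
    $doc = $word.Documents.Open($file.FullName)
    #-1 = wdNotAMergeDocument; skip documents that are not merge mains
    if ($doc.MailMerge.MainDocumentType -ne -1) {
        $doc.MailMerge.OpenDataSource("H:\MERGE.TXT")
        $doc.Save()
    }
    $doc.Close($false)   #already saved; do not prompt again
}
$word.Quit()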

Split Sharepoint Content Database

I have a single SharePoint content database (SharePoint 2019, on-premises) that is over 100 GB, and I would like to split the SP sites between some new content databases that I will create.
I have created the new content databases but I have no idea on how to move the subsites to them.
Based on research that I have done, it seems I need to:
Create Content Databases
Create site collections in those databases
Move subsites into the new site collections in the new databases.
Question 1 - are the above steps correct or do I have this wrong?
Question 2 - How in the heck do I move subsites out of the almost-full content database into a new content database? Do I move them to the site collection in the new database? If so, how?!?
Thank you for your brainpower and help
Tried moving subsites and failed
It wasn't clear to me whether you wish to transfer just some subsites or a complete site collection, so I will cover both approaches below.
I would strongly suggest that you test in a sandbox environment before proceeding with any of the scripts below, just in case anything has been misunderstood.
Before any transfers are performed, you should create the content databases that you will be targeting. You can do this either through Central Administration (GUI) or with PowerShell:
#get the web application under which you will create the content database
$WebApp = Get-SPWebApplication http://web-app/
#create the new content database
New-SPContentDatabase "<Name_of_new_Content_db>" -DatabaseServer "<db_server>" -WebApplication $WebApp
#alternatively, pass the web application URL directly:
#New-SPContentDatabase "<Name_of_new_Content_db>" -DatabaseServer "<db_server>" -WebApplication http://web-app/
If you wish to transfer whole site collections, or clone them onto different content databases, there are three ways to achieve this.
Copy Site Collection: the Copy-SPSite cmdlet makes a copy of a site collection from its source content database into a specified destination content database.
The copy of the site collection has a new URL and a new SiteID.
Copy-SPSite http://web-app/sites/original -DestinationDatabase <Name_of_new_Content_db> -TargetUrl http://web-app/sites/copyfromoriginal
Move Site Collection: the Move-SPSite cmdlet moves the data in the specified site collection from its current content database to the content database specified by the DestinationDatabase parameter.
A no-access lock is applied to the site collection to prevent users from altering data within the site collection while the move is taking place.
Once the move is complete, the site collection is returned to its original lock state. The original URL is preserved, in contrast with Copy-SPSite where you generate a new one.
In my test environment, before executing the below script, each content database was hosting at least one site collection.
Move-SPSite http://web-app/sites/originalbeforemove -DestinationDatabase <Name_of_new_Content_db>
After the execution, the site collection has been transferred from the last content database to the second, preserving its original URL.
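If you want to check from PowerShell which site collections each content database hosts before and after a move, the standard cmdlets can list them, for example:
#list every content database in the web application together with its site collections
Get-SPContentDatabase -WebApplication http://web-app/ | ForEach-Object {
    Write-Host $_.Name
    Get-SPSite -ContentDatabase $_ -Limit All | Select-Object -ExpandProperty Url
}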
Backup and Restore Site Collection: this combination saves the site collection to disk and afterwards restores it onto a new content database. The Restore-SPSite cmdlet performs a restoration of the site collection to a location specified by the Identity parameter. A content database may only contain one copy of a site collection. If a site collection is backed up and restored to a different URL location within the same web application, an additional content database must be available to hold the restored copy of the site collection.
Backup-SPSite http://web-app/sites/original -Path C:\Backup\original.bak
Restore-SPSite http://web-app/sites/originalrestored -Path C:\Backup\original.bak -ContentDatabase <Name_of_new_Content_db>
Once I executed the above commands, a new site was restored on the third content database, which was basically a clone of the original site. Keep in mind that with this approach you preserve the original site and can work on the newly restored copy.
If you wish to transfer just one subsite onto a different content database, you can follow the strategy below.
Use the -Force flag in case of the below error.
File C:\Backup\export.cmp already exists. To overwrite the existing file use the -Force parameter.
You can import sites only into sites that are based on the same template as the exported site. This refers to the site collection, not the subsite:
Import-SPWeb : Cannot import site. The exported site is based on the template STS#3 but the destination site is based on the template STS#0. You can import sites only
into sites that are based on same template as the exported site.
#Create Site Collection in targeted Content Database first
New-SPSite http://web-app/sites/subsiterestoration2 -OwnerAlias "DOMAIN\user" -Language 1033 -Template STS#3 -ContentDatabase <Name_of_new_Content_db>
#export Web object, use force to overwrite the .cmp file
Export-SPWeb http://web-app/sites/original/subsitetomove -Path "C:\Backup\export.cmp" -Force
#Create a new Web under the new Site Collection; this is not strictly necessary, since you can always restore onto the RootWeb. I created the new Web object just to preserve the previous architecture.
New-SPWeb http://web-app/sites/subsiterestoration2/subsitemoved -Template "STS#3"
#Finally, import the exported Web Object on to the Targeted Web
Import-SPWeb http://web-app/sites/subsiterestoration2/subsitemoved -Path "C:\Backup\export.cmp" -UpdateVersions Overwrite
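One step to add: Export-SPWeb only copies the subsite, so if the goal is a true move rather than a clone, you can delete the original once you have verified the import (do test this in the sandbox first):
#after verifying the imported content, remove the original subsite
Remove-SPWeb http://web-app/sites/original/subsitetomove -Confirm:$false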
Final Notes
Keep in mind that all of the transfers were performed on sites that did not have any customizations on them, such as Nintex workflows or custom event receivers. These were just plain sites with several lists and document libraries.
Always make sure that while you are performing the above tasks, users are not altering data that currently exists within the site collections in question.
To briefly answer your question: yes, you have the correct idea of what needs to be done if you wish to transfer just a subsite; you simply have to pick whichever of the above methods suits you best.
Also pay attention that most of the methods alter the URL which points to a subsite; be cautious about this if any third-party automations are reading or updating data on SharePoint via these URLs.
I will try to keep this answer updated with ways of transferring a subsite, in case anything else comes up.

Github multi-dimensional query

I am trying to query the GitHub REST API (versioned via the X-GitHub-Api-Version header) to find line changes (additions, deletions), user, date, and repository_name for any repositories in an organization that have files of a particular file extension. I am not seeing a clear path forward with this: do I need to find all files using {{baseUrl}}/search/code?q=org:{{org}}+extension:tf, then iterate over those outputs, query the commits for each file, and extract the details from the commit details?
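That two-step approach (code search, then per-file commit queries) is essentially what the API offers; I am not aware of a single endpoint that returns all of this at once. A rough PowerShell sketch of the loop, with a hypothetical org name and token, and with pagination and the search API's result cap glossed over:
#hypothetical values: replace with your org and a token that can read the repos
$org = "my-org"
$token = "<personal_access_token>"
$headers = @{
    Authorization          = "Bearer $token"
    Accept                 = "application/vnd.github+json"
    "X-GitHub-Api-Version" = "2022-11-28"
}
#1. find files with the extension across the org (first page only here)
$search = Invoke-RestMethod -Uri "https://api.github.com/search/code?q=org:$org+extension:tf&per_page=100" -Headers $headers
foreach ($item in $search.items) {
    $repo = $item.repository.full_name
    $path = $item.path
    #2. list the commits that touched this file (paths with special characters need URL encoding)
    $commits = Invoke-RestMethod -Uri "https://api.github.com/repos/$repo/commits?path=$path&per_page=100" -Headers $headers
    foreach ($c in $commits) {
        #3. fetch the individual commit to get per-file addition/deletion counts
        $detail = Invoke-RestMethod -Uri "https://api.github.com/repos/$repo/commits/$($c.sha)" -Headers $headers
        $file = $detail.files | Where-Object { $_.filename -eq $path }
        [pscustomobject]@{
            repository = $repo
            file       = $path
            user       = $c.commit.author.name
            date       = $c.commit.author.date
            additions  = $file.additions
            deletions  = $file.deletions
        }
    }
}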

Is there a way to have Harmon.ie keep original modified date?

When uploading a file from a drive/fileshare through Harmon.ie, is there a way to have SharePoint keep the original file modified date? Looks like the default is to update Modified with the current date and time.
Thanks!
Troy
The Modified field you'd like to keep is an attribute of the file system, not of the document. The Modified field in SharePoint is a system field that is set by the SharePoint server (not by harmon.ie) at the time the document is uploaded and each time it gets modified. As a result, the modified date you see on your file won't be reflected in SharePoint. This is why you get the same behavior when uploading documents with the SharePoint web interface in IE.
----- Jean

How do I do bulk file storage with IBM Object Storage?

I'm using IBM Object Storage to store huge amounts of very small files,
say more than 1500 small files in one hour. (Total size of the 1500 files is about 5 MB.)
I'm using the object store API to post the files, one file at a time.
The problem is that storing the 1500 small files takes about 15 minutes in total, including setting up and closing the connection to the object store.
Is there a way to do a sort of bulk post, to send more than one file in one post?
Regards,
Look at the archive auto-extract feature available within OpenStack Swift (Bluemix Object Storage). I assume that you are familiar with obtaining the X-Auth-Token and Storage_URL from Bluemix Object Storage; if not, my post about large file manifests explains the process. From the docs, the constraints include:
You must use the tar utility to create the tar archive file.
You can upload regular files but you cannot upload other items (for example, empty directories or symbolic links).
You must UTF-8-encode the member names.
Basic steps would be:
Confirm that IBM Bluemix supports this feature by viewing the info details for the service at https://dal.objectstorage.open.softlayer.com/info . You'll see a JSON section within the response similar to:
"bulk_upload": {
"max_failed_extractions": 1000,
"max_containers_per_extraction": 10000
}
Create a tar archive of your desired file set. tar gzip is most common.
Upload this tar archive to object storage with a special parameter that tells Swift to auto-extract the contents into the container for you:
PUT /v1/AUTH_myaccount/my_backups/?extract-archive=tar.gz
From the docs: to upload an archive file, make a PUT request, add the extract-archive=format query parameter to indicate that you are uploading a tar archive file instead of normal content, and include the contents of the local file backup.tar.gz in the request body. (A concrete sketch follows these steps.)
Something like:
AUTH_myaccount/my_backups/etc/config1.conf
AUTH_myaccount/my_backups/etc/cool.jpg
AUTH_myaccount/my_backups/home/john/bluemix.conf
...
Inspect the results. Any top-level directory in the archive should create a new container in your Swift object-storage account.
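As a concrete end-to-end example of steps 2 and 3, here is a hedged PowerShell sketch (the storage URL, account, container, and token are placeholders; error handling is omitted):
#create the archive; tar.exe ships with recent Windows, or use any tar utility
tar -czf backup.tar.gz -C C:\data\smallfiles .
#upload it in a single PUT; extract-archive tells Swift to unpack it server-side
$storageUrl = "https://dal.objectstorage.open.softlayer.com/v1/AUTH_myaccount"   #placeholder
$token      = "<X-Auth-Token>"                                                   #placeholder
Invoke-RestMethod -Method Put `
    -Uri "$storageUrl/my_backups/?extract-archive=tar.gz" `
    -Headers @{ "X-Auth-Token" = $token } `
    -InFile "backup.tar.gz"
Swift's response to this request summarizes the number of files created and lists any per-file errors.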
Voila! Bulk upload. Hope this helps.

Where is the Word 2007 schema library stored?

Word 2007 allows XML schemas to be attached to a document (under the Developer toolbar | XML group | Schema button). Where is this schema library information stored?
I have documents that I have created with custom XML tags based on a schema but when I pass on the document and the schema to someone else the schema is marked as unavailable, presumably because the file location of the schema is different.
Is there some way to edit this information to change the path to a given schema?
The schema itself is not stored with the docx; just the path to it is stored. So passing a document around will almost always break the link. VSTO can get around this by embedding the XSD as a resource in the app.
But for VBA it's trickier: you need a path you can rely on on each user's computer and then deploy your XSD there. One way is to handle the Document_Open event (or just use AutoOpen) so that when a user opens the document (warning: macro security needs to be dinked around with), you write out the XSD, hard-coded as a string in the code-behind, to a file and then attach that file with a routine like:
Dim objSchema As XMLNamespace
'register the schema file in Word's schema library
Set objSchema = Application.XMLNamespaces.Add("c:\something\mynewlycreated.xsd")
'attach the registered schema to the active document
objSchema.AttachToDocument ActiveDocument
So that you're not leaving behind artifacts, you could then delete that XSD from the user's computer in Document_Close or AutoClose.