Word 2007 allows XML schemas to be attached to a document (under the Developer tab | XML group | Schema button). Where is this schema library information stored?
I have documents that I have created with custom XML tags based on a schema, but when I pass the document and the schema on to someone else, the schema is marked as unavailable, presumably because the file location of the schema is different.
Is there some way to edit this information to change the path to a given schema?
The schema itself isn't stored in the docx; only the path to it is. So passing a document around will almost always break the link. VSTO can get around this by embedding the XSD as a resource in the app.
For VBA it's trickier: you need a path you can rely on being present on each user's computer and then deploy your XSD there. One way is to hook the Document_Open event (or just use AutoOpen) so that when a user opens the document (warning: macro security needs to be fiddled with), you write out the XSD, which is hard-coded as a string in the code-behind, to a file and then attach that file with a routine like:
Dim objSchema As XMLNamespace
Set objSchema = Application.XMLNamespaces.Add("c:\something\mynewlycreated.xsd")
objSchema.AttachToDocument ActiveDocument
So that you're not leaving behind artifacts, you can then delete that XSD from the user's computer in Document_Close or AutoClose.
I have a single SharePoint content database (SharePoint 2019, on-premises) that is over 100 GB, and I would like to split the SP sites between some new content databases that I will create.
I have created the new content databases but I have no idea how to move the subsites to them.
Based on research that I have done, it seems I need to:
Create Content Databases
Create site collections in those databases
Move subsites into the new site collections in the new databases.
Question 1 - are the above steps correct or do I have this wrong?
Question 2 - How in the heck do I move subsites out of the almost-full content database into a new content database? Do I move them into the site collection in the new database? If so, how?
Thank you for your brainpower and help.
I tried moving subsites and failed.
Unfortunately, I could not tell whether you wish to transfer just some subsites or complete site collections, so I will cover both approaches below.
I would strongly suggest that you create a sandbox environment before proceeding with any of the below scripts, just in case you have misunderstood anything.
Before any transfers are performed, you should create the content databases that you will be targeting. You can do this either via the Central Administration panel (GUI) or via a PowerShell script, using the command below:
#get the web application under which you will create the content db (specify its URL if the farm has more than one web application)
$WebApp = Get-SPWebApplication http://web-app/
#create the new content database
New-SPContentDatabase "<Name_of_new_Content_db>" -DatabaseServer "<db_server>" -WebApplication $WebApp
#you can also use the command below, which points directly at the web application.
#New-SPContentDatabase "<Name_of_new_Content_db>" -DatabaseServer "<db_server>" -WebApplication http://web-app/
In case you wish to transfer whole site collections, or clone them onto different content databases, there are three ways to achieve this.
Copy Site Collection: use the Copy-SPSite cmdlet to make a copy of a site collection from an implied source content database to a specified destination content database.
The copy of the site collection has a new URL and a new SiteID.
Copy-SPSite http://web-app/sites/original -DestinationDatabase <Name_of_new_Content_db> -TargetUrl http://web-app/sites/copyfromoriginal
Move Site Collection: the Move-SPSite cmdlet moves the data in the specified site collection from its current content database to the content database specified by the DestinationDatabase parameter.
A no-access lock is applied to the site collection to prevent users from altering data within the site collection while the move is taking place.
Once the move is complete, the site collection is returned to its original lock state. The original URL is preserved, in contrast with Copy-SPSite where you generate a new one.
In my environment, before executing the script below, each content database was hosting at least one site collection.
Move-SPSite http://web-app/sites/originalbeforemove -DestinationDatabase <Name_of_new_Content_db>
After the execution, the site collection was transferred to the new content database, preserving its original URL.
Backup and Restore Site Collection: this combination will save the site collection to disk and afterwards restore it onto a new content database. The Restore-SPSite cmdlet performs a restoration of the site collection to the location specified by the Identity parameter. A content database may only contain one copy of a site collection, so if a site collection is backed up and restored to a different URL location within the same web application, an additional content database must be available to hold the restored copy of the site collection.
Backup-SPSite http://web-app/sites/original -Path C:\Backup\original.bak
Restore-SPSite http://web-app/sites/originalrestored -Path C:\Backup\original.bak -ContentDatabase <Name_of_new_Content_db>
Once I executed the above commands, a new site was restored on the third content database; it is basically a clone of the original site. Keep in mind that with this approach you preserve the original site and can work on the newly restored copy.
In case you wish to transfer just one subsite onto a different content database, you can follow the strategy below.
Use the -Force flag if you encounter the following error:
File C:\Backup\export.cmp already exists. To overwrite the existing file use the -Force parameter.
You can import sites only into sites that are based on the same template as the exported site. This refers to the site collection, not the subsite:
Import-SPWeb : Cannot import site. The exported site is based on the template STS#3 but the destination site is based on the template STS#0. You can import sites only
into sites that are based on same template as the exported site.
#Create Site Collection in targeted Content Database first
New-SPSite http://web-app/sites/subsiterestoration2 -OwnerAlias "DOMAIN\user" -Language 1033 -Template STS#3 -ContentDatabase <Name_of_new_Content_db>
#export Web object, use force to overwrite the .cmp file
Export-SPWeb http://web-app/sites/original/subsitetomove -Path "C:\Backup\export.cmp" -Force
#Create a new Web under the new Site Collection. This is not strictly necessary and you can always restore onto the RootWeb; I created the new Web object just to preserve the previous architecture.
New-SPWeb http://web-app/sites/subsiterestoration2/subsitemoved -Template "STS#3"
#Finally, import the exported Web Object on to the Targeted Web
Import-SPWeb http://web-app/sites/subsiterestoration2/subsitemoved -Path "C:\Backup\export.cmp" -UpdateVersions Overwrite
Final Notes
Keep in mind that all of the transfers were performed on sites that did not have any kind of customizations on them, like Nintex workflows or custom event receivers. These were just plain sites with several lists and document libraries.
Always make sure that, while you are performing the above tasks, users are not altering data that currently exists within the site collections in question.
To briefly answer your question: yes, you have the correct idea of what needs to be done in case you wish to transfer just a subsite, but you must pick whichever of the above methods suits you best.
Always pay attention that most of the methods alter the URL that points to a subsite; be cautious about this if any third-party automations are reading and updating data in SharePoint using these URLs.
I will try to keep this answer updated with ways of transferring a subsite, in case anything else comes up.
We need to retrieve an ID that uniquely identifies a document, so that when a user opens the same document in different sessions (even a year apart) we can identify this in the logs.
In the API I found DocumentURL but this could change (if the document is moved?) and it might even be empty (if the document is never stored online?). We could hash a combination of properties like Author and Date Created but these too can change and thus can't be fully relied upon.
How do we access the ID of a document? Ideally we're looking for a solution that works for any type of document, but if currently there is only such a property for a Word document then that is sufficient as well.
EDIT: Adding scenarios that need to work because otherwise my request seems too simple (hence the down-votes?):
The user can open, edit, save, etc. other documents and the ID should ALWAYS be the same PER document. Similarly, if a user shares a document with someone else, the ID read by the other user (when running our add-in) should be the same as for the owner of that document.
The add-in needs to be portable and usable on multiple platforms. When a user opens the same document on Word Online and Win 32, on different computers, etc. the ID must always be the same for that document.
To create a unique ID, it takes only a little JavaScript to generate a GUID. See this SO post for an example: Create GUID/UUID in JavaScript
To store the ID, you could use a custom setting or custom property. See Persist State and Settings.
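A minimal sketch of how those two pieces could fit together in an Office.js add-in. The setting name docTrackingId and the generateGuid helper are made-up names for illustration, not part of any official API:

// Sketch only: assumes the shared Office.js Settings API is available in the host.
function generateGuid() {
  // Simple v4-style GUID; see the linked SO post for more robust variants.
  return "xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx".replace(/[xy]/g, function (c) {
    var r = (Math.random() * 16) | 0;
    var v = c === "x" ? r : (r & 0x3) | 0x8;
    return v.toString(16);
  });
}

function getOrCreateDocumentId(callback) {
  var settings = Office.context.document.settings;
  var id = settings.get("docTrackingId");
  if (id === null) {
    // First time any copy of the add-in sees this document: mint an ID and
    // persist it in the document so it travels with the file.
    id = generateGuid();
    settings.set("docTrackingId", id);
    settings.saveAsync(function (asyncResult) {
      callback(id, asyncResult.status);
    });
  } else {
    callback(id, "existing");
  }
}

Settings saved this way are persisted inside the document itself, which is what would keep the ID stable across sessions, users and platforms.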
I have 2000 documents which were created by an incorrectly configured application which now uses an invalid merge data path.
e.g. c:\APPNAME\WP\MERGE.TXT
The files now live in H:\MERGE.TXT so all users can access them.
Is there a way to update this path without opening each file in MS Word and reselecting the data source?
Looking forward to your replies.
When we define custom metadata for components, it is my understanding that this user-given metadata is stored in SQL Server and is not visible in the component XML. Can anyone explain how exactly metadata linked to a component actually gets stored?
A Component definition in Tridion has two types of fields: Content fields and Metadata fields. Both field types are stored in the Content Manager database (either SQL Server or Oracle). And both field types are retrieved whenever you read the Component back from Tridion through any of its APIs (TOM, TOM.NET or Core Service).
Only the Content fields are shown in the Source tab of a Component edit window, but the Metadata fields are visible on the Metadata tab of that same window.
If you want to have a single view of the XML of both Metadata and Content fields (as well as many other properties of your Component in Tridion), consider installing the PowerTools or the Item XML extension.
I think you may be confusing things a bit.
The Metadata is always stored as part of the component - under tcm:Metadata. When you publish this component, then the metadata fields will also be available for querying on the Content Delivery Data Store.
Whether these fields get displayed as part of the component presentation depends on your templates. There's nothing stopping you from including these values in the output of your template (typical use case for SEO, for instance).
In summary:
In the CM, the Metadata is stored together with the Component.
In the CD, the Metadata is stored as part of the CUSTOM_META associated with this Component.
Just a note,
There is also metadata that is not stored in metadata fields: system metadata, such as the Last Modified Date or the user that last modified the component. That is metadata in the CMS. There is likewise system metadata on the front end (broker or file system metadata) that gets published when you publish a given component, such as the Last Published Date.
You can leverage/use the system metadata in your templates as well.
I'm writing a CMS-like application and I would like my images to be stored as attachments in CouchDB.
The problem is in naming the attachments, because I don't want all my images to be named the same (e.g. /db/doc_id/thumb.jpg).
Ideally, attachment names should depend on the doc.name field. To make this work I would have to rename the attachment each time the user changed the name (description|alt) of the current photo document.
So my question is: how do I change an attachment's name? Or maybe I should solve my problem another way?
Szymon,
I'd suggest using UUIDs for the document names and storing the attachments under something "static" like "original", "thumb", "300wide", etc. The name given by the user when they upload the file can be stored on the document and used as the key of a MapReduce view, so you can retrieve the image/file by that name later.
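As a rough sketch, the map function for such a view could look like this (the doc.type and doc.name fields and the view name are assumptions about your documents, not anything CouchDB requires):

// Map function for a CouchDB view (sketch only).
function (doc) {
  // Emit the user-given name as the key so the photo document can be looked
  // up later by that name.
  if (doc.type === "photo" && doc.name) {
    emit(doc.name, null);
  }
}

Saved in a design document (say _design/images with a view named by_name), you could query it with ?key="<user-given name>"&include_docs=true and read the "original" or "thumb" attachment off the returned document.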
If you go that route, though, you'll have to come at the "duplicates" problem a bit differently, as you could easily upload the same image multiple times with the same user-provided name and there would be no conflict.
Depending on what you're building, making the user provide a unique name is generally unwise; Flickr (among many others) doesn't, for instance.
If you really do need to make the doc_id == the name given by the user, then it would still be wise to store the attachments under static names, so you don't have to update the attachment name.
Lastly, if you feel you really must change the attachment name (and there are certainly cases where you need to), the simplest way is to GET the attachment from the old location (or with the document), PUT it as the new name, and DELETE the attachment with the old name.
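A rough sketch of that GET/PUT/DELETE sequence in JavaScript, using Node 18+'s built-in fetch; the database name, document ID and attachment names are placeholders, and error handling is omitted:

// Sketch only: rename an attachment by copying it to a new name and deleting the old one.
const base = "http://localhost:5984/db";

async function renameAttachment(docId, oldName, newName, contentType) {
  // 1. Get the current document revision.
  const doc = await (await fetch(`${base}/${docId}`)).json();

  // 2. GET the attachment bytes from the old name.
  const bytes = await (await fetch(`${base}/${docId}/${oldName}`)).arrayBuffer();

  // 3. PUT the same bytes under the new name (returns a new revision).
  const putResult = await (
    await fetch(`${base}/${docId}/${newName}?rev=${doc._rev}`, {
      method: "PUT",
      headers: { "Content-Type": contentType },
      body: bytes,
    })
  ).json();

  // 4. DELETE the attachment under the old name, using the newest revision.
  await fetch(`${base}/${docId}/${oldName}?rev=${putResult.rev}`, {
    method: "DELETE",
  });
}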
Hope that helps!
Use this command:
curl -v http://localhost:5984/database/DocumentID/OldFileName?rev=RevisionID -X MOVE -H "Destination: NewFileName"