TeamSite Cleanup - content-management-system

I was wondering what the best way is to do a clean-up of Autonomy TeamSite 7. I basically have loads of DCR files that are no longer in use and want to delete them. What's the best way to search through the CMS and identify which .page files and DCRs are published on the live site and which are unused and can be deleted, so I don't have to go through them manually?

Template-based pages use extended attributes to associate a generated page with a DCR and Presentation template.
In the past, I have written Perl scripts that would generate a list of the active DCRs. You can then compare the DCRs in the workarea to that list and delete those that aren't on the list.
I basically set it up as a cron job that cleaned up the "templatedata" directory once a month.
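For illustration, here is a minimal sketch of that comparison step in PHP (the original scripts were Perl); the list file and the workarea path are made-up placeholders:

<?php
// Sketch: flag workarea DCRs that are not on the active list.
// Both paths below are hypothetical placeholders.
$active = array_flip(array_map('trim', file('/tmp/active_dcrs.txt')));
$iter = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/iwmnt/default/main/dev/WORKAREA/shared/templatedata')
);
foreach ($iter as $file) {
    if ($file->isFile() && !isset($active[$file->getPathname()])) {
        echo "would delete: {$file->getPathname()}\n"; // swap in unlink() after verifying
    }
}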

If I had this particular scenario, my first thought would be to run a simulated comparison deployment with OpenDeploy to generate a list of the files that are in the workarea but not on the web server. You could then parse the resulting log with a Perl script (or whatever your favorite language is) to remove those files.
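As a hedged sketch, that log-parsing step could look like the following; the log path and line format are assumptions, so check your actual OpenDeploy comparison log first:

<?php
// Sketch: collect files flagged as present in the workarea but not on the web server.
// The "Source only:" line format is an assumption about the log.
$candidates = [];
foreach (file('/var/log/opendeploy/compare.log') as $line) {
    if (preg_match('/^Source only:\s+(\S+)/', $line, $m)) {
        $candidates[] = $m[1];
    }
}
foreach ($candidates as $path) {
    echo "candidate for deletion: $path\n"; // review the list before deleting anything
}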


Copy and paste Typo3 Sites between 2 backends

I am managing around 12 TYPO3 backends with almost identical content. Is it possible to copy and paste a created site between independent backends? Right now I'm creating the same site by hand in all 12 backends. There has to be an easier way.
Well, there is not much I could try. Within TYPO3 I don't see any option to export/import sites from other backends.
First of all, you should merge those 12 sites into one backend with multiple root pages and trees. Then you can easily handle different domains and/or languages via the site configuration for each of those roots.
Of course, you can then make use of shared sys_folder pages that contain the content elements that should be available to multiple sites. To make them available on a specific site, you can then use references.
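As a rough sketch, a per-root site configuration (typo3conf/sites/<identifier>/config.yaml) might look like this; the domain and the page uid are made up:

base: 'https://www.site-one.example/'
rootPageId: 12
languages:
  -
    title: English
    enabled: true
    languageId: 0
    base: /
    locale: en_US.UTF-8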
You can export a page tree and import it into another instance.
On the other hand you can duplicate an instance by copying the complete database and original files.
That necessarily includes the fileadmin/ and uploads/ folders.
typo3conf/ should be duplicated by deployment but might differ in the files typo3conf/LocalConfiguration.php and typo3conf/AdditionalConfiguration.php (e.g. each instance should use its own database).
You can use the core extension impexp to import/export content parts. There is even a context menu entry for it.
Be aware of some drawbacks:
if assets are exported, they are exported and imported as well; with many or large assets you can hit your memory_limit
take extra care about which uids are used, e.g. forcing uids can lead to conflicts
Of course there are other options as well like:
create a custom extension which exports/imports the content on the fly, using either a custom endpoint or fetching directly from the DB if possible (see the sketch after this list)
use one installation, as discussed already
if you use e.g. ext:news, you can use something like RSS feeds for a poor man's import/export with ext:news_importicsxml
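For the custom-extension route mentioned above, here is a minimal sketch of the "fetch directly from the DB" idea; the credentials and the source page uid are placeholders:

<?php
// Sketch: read content elements from a source instance's database.
// DSN, credentials and the pid are hypothetical; error handling omitted.
$pdo = new PDO('mysql:host=localhost;dbname=typo3_source', 'user', 'pass');
$stmt = $pdo->prepare('SELECT uid, header FROM tt_content WHERE pid = ? AND deleted = 0');
$stmt->execute([42]);
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    // here you would insert the row into the target instance,
    // taking the uid caveats above into account
    echo "{$row['uid']}: {$row['header']}\n";
}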

Can I use symbolic links in Dokuwiki's page files?

I'm planning to try using dokuwiki to manage my large collection of notes, and one of the major attractions is its flat file basis that'll allow me to edit via scripts etc. I had a question - suppose a page's material fits into multiple namespaces. If I were to create the file in one namespace and then create symlinks in the other namespace directories, would that work? Or would that screw up revisions etc?
Yes, you can do that. But yes, this will mess with your revisions a bit:
when DokuWiki saves a page, it copies the data of the old page to the attic
the name of the attic file is the same as the page that was edited, but with a timestamp appended
because new attic files are created you can't work with symlinks in the attic
Imagine you have the following setup:
data/pages/original.txt
data/pages/copy.txt -> original.txt
You can now edit the pages original and copy in your wiki and they will both always be the same. However, old revisions of the pages will be split between the two, depending on which page you edited.
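For illustration, after one edit to each page the attic might contain something like this (timestamps invented; files may be gzipped or not depending on your compression setting):

data/attic/original.1400000000.txt.gz   (saved when you edited "original")
data/attic/copy.1400000100.txt.gz       (saved when you edited "copy")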
Instead of messing with the file level, consider:
the Include plugin to share content between pages.
Creating some 'commons' namespace for such pages to keep things DRY.
Namespace templates (plus an additional plugin).
Pulling content from the page side instead of pushing it to pages. This might be a good place to start. You can always include some PHP code or even write your own plugin.

TFS 2013 build - uploading build output to server via FTP

I'm hoping someone can help. I've started using the Community TFS Build Extensions, in particular the FTP activity. I followed the documentation here and got to grips with it pretty easily. I'm encountering one major problem though.
My Web app has a basic enough structure:
I start by creating the FindMatchingFiles activity, which places the files in the drop location into an IEnumerable variable called FilesToFTP:
String.Format("{0}\**\*.*", BuildDetail.DropLocation)
When I iterate through the variable and print out the results, all seems correct:
G:\builds\Build.1203\CredentialManagement\bin\BusLogic.dll
G:\builds\Build.1203\CredentialManagement\css\style.css
G:\builds\Build.1203\CredentialManagement\AppError.aspx
......
G:\builds\Build.1203\CredentialManagement\Web.config
etc etc.
The problem is, when I pass that IEnumerable to the Ftp activity (converting it to a string array), it uploads all the files to the server, but it doesn't keep the directory structure of my web app. It just piles all the output (DLLs, ASPX files, etc.) into one directory.
Is there any way I can use the FTP activity to upload all the output from the drop location recursively? I feel like I'm doing something simple wrong.
The FTP activity in the TFS Build Extensions doesn't upload files recursively.
I think that would be a good addition to the activity. Please create a feature request for the project and we will add it. For now, you can work around it by calling the Ftp activity recursively for each directory and setting the RemoteDirectory for each; the sketch below illustrates the idea.
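Here is the idea of that workaround as a standalone PHP sketch rather than as a build activity (host, credentials and paths are placeholders):

<?php
// Sketch: upload a directory tree recursively, preserving its structure.
function ftp_put_recursive($conn, string $local, string $remote): void
{
    @ftp_mkdir($conn, $remote); // ignore "directory already exists" errors
    foreach (scandir($local) as $entry) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        if (is_dir("$local/$entry")) {
            ftp_put_recursive($conn, "$local/$entry", "$remote/$entry");
        } else {
            ftp_put($conn, "$remote/$entry", "$local/$entry", FTP_BINARY);
        }
    }
}

$conn = ftp_connect('ftp.example.com'); // placeholder host
ftp_login($conn, 'user', 'pass');       // placeholder credentials
ftp_put_recursive($conn, 'G:/builds/Build.1203/CredentialManagement', '/site');
ftp_close($conn);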

How do I edit files in place that were uploaded to Moodle?

I would like a better workflow for debugging uploaded SCOs. As things are, I must edit a file in the activity, repackage, upload, and test. Often, I just need to change a single line of code. It would be VERY nice to be able to edit that file, that line of code, on the server. So far, all I've found is that Moodle manages the files, so it seems impractical to locate and decipher the renamed files after upload.
Is there a way to configure Moodle so that it doesn't rename and relocate files in SCOs upon extraction? Actually, I'm open to any suggestions on the best, fastest workflow for debugging SCOs.
Problem background
Since Moodle 2.0, files are no longer stored on the server in the conventional /this/is/the/path/to/my.file way. Instead, files are hashed and stored in repositories (i.e. spread all over the moodledata folder as a collection of seemingly random data). This increases security and cross-OS compatibility but complicates things for people who would like to simply upload a SCORM zip package via FTP. Here's more information on file handling in Moodle 2.0.
Path to the solution
Let's locate the file you want to update, then update it.
Run phpmyadmin, go to mdl_files table, find your file by name in the filename field (let's say it's portrait.jpg)
Look at the contenthash field, it'll look like abcde1234567890. This means your file is stored in moodledata/filedir/ab/cd/ folder under the name abcde1234567890.
Rename the updated portrait.jpg to abcde1234567890, upload and overwrite.
Go back to phpmyadmin and update the filesize field in record for portrait.jpg with the size of the updated file.
Obviously, this process can be automated. You'll have to write a script that allows you to upload a file, then it'll search for that file in mdl_files, save it to the correct folder and update all fields accordingly.
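A hedged sketch of such a script follows; the DB credentials and the moodledata path are assumptions, and filenames in mdl_files are not necessarily unique, so narrow the query for real use:

<?php
// Sketch: replace an uploaded file's content and fix its recorded size.
// Credentials and paths are hypothetical; error handling omitted.
$pdo = new PDO('mysql:host=localhost;dbname=moodle', 'user', 'pass');
$stmt = $pdo->prepare('SELECT contenthash FROM mdl_files WHERE filename = ?');
$stmt->execute(['portrait.jpg']);
$hash = $stmt->fetchColumn();

$updated = '/tmp/portrait.jpg'; // your edited file
$dest = sprintf('/var/moodledata/filedir/%s/%s/%s',
    substr($hash, 0, 2), substr($hash, 2, 2), $hash); // full hash is the filename
copy($updated, $dest);

$pdo->prepare('UPDATE mdl_files SET filesize = ? WHERE filename = ?')
    ->execute([filesize($updated), 'portrait.jpg']);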
Alternative idea
Enable the external package type (and also enable 'Update on every launch'). Go to Site administration / Plugins / Activities / SCORM and check the box down below. Now you'll be able to launch SCORM packages directly from another server, so Moodle won't mess with them. Of course, you can run into other (probably cross-domain related) problems.
Sergey's answer is very good, with one caveat:
In his example with the contenthash of abcde1234567890, the file is stored in the moodledata/filedir/ab/cd/ folder under the name abcde1234567890. Moodle uses the full contenthash to name the file.

Show license agreement before download

I have to solve the following task for our university homepage:
Whenever a pdf is requested the user has to accept a license, which pops up.
On Agree the download starts. If not, no download is possible.
I searched through the extensions but did not find any extension doing the job. Maybe you know one...
So I tried to implement my own extension. Taking the strengths of securelinks (Allows access control to files from a configurable directory ... presents a license acceptation prior to download) and naw_securedl ("Secure Download": Apply TYPO3 access rights to ALL file assets (PDFs, TGZs or JPGs etc. - configurable) - protect them from direct access.) I wanted to combine both extensions to have one that:
whenever a pdf file is requested (naw_securedl)
a license is shown and in case of ACCEPT a redirect to the file happens (securelinks).
This task sounds very easy, since I only have to combine both tasks. Anyway, I failed.
How do you solve this problem?
Do you know some extension doing the job?
Is anyone interested in a cooperation in which we try to create an extension that does the job?
Thanks for your help in advance!
Assuming that all downloads are stored in one folder, I'd recommend writing your own little extension that replaces every link with a link to an intermediate site, like this:
www.mydomain.com/acceptlicense.html?downloadfile=myhighqualitycontent.pdf.
On the accept license page, users need to check the accept license checkbox, then click a submit button, which leads them to the download page, still carrying the GET parameter:
www.mydomain.com/download.html?downloadfile=myhighqualitycontent.pdf.
If not all files are in the same folder, you can replace slashes in the file path with other characters (they need to be URL-safe). Or you might need a database table that indexes the files, so you can use IDs for the download files:
www.mydomain.com/acceptlicense.html?downloadfileID=99
If you don't know at all how to write TYPO3 extensions, consider using standalone PHP/HTML files outside of the TYPO3 context.
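A minimal standalone sketch of the two pages, outside the TYPO3 context; the downloads folder and file names are made up:

<?php
// acceptlicense.php - shows the license checkbox and forwards the file name.
$file = basename($_GET['downloadfile'] ?? ''); // basename() blocks path traversal
?>
<form action="download.php" method="get">
  <input type="hidden" name="downloadfile" value="<?= htmlspecialchars($file) ?>">
  <label><input type="checkbox" name="accept" required> I accept the license</label>
  <button type="submit">Download</button>
</form>

<?php
// download.php - only serves the file if the license was accepted.
$file = basename($_GET['downloadfile'] ?? '');
$path = '/var/www/downloads/' . $file; // hypothetical download folder
if (empty($_GET['accept']) || !is_file($path)) {
    http_response_code(403);
    exit('License not accepted or file not found.');
}
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="' . $file . '"');
readfile($path);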