How to set StaticFileServer() using Kitura? - swift

I want to see files in my localhost directory using Kitura.
I've written:
router.all("/test/*", middleware: StaticFileServer())
but it didn't seem to work.
I want to list all the files in my directory, similar to directoryIndex.

You can pass the directory to serve to StaticFileServer via its path parameter; by default it is "public":
router.all("/test/", middleware: StaticFileServer(path: "MyDirectoryWithStaticFiles"))
Then you will be able to access the files in this directory, but not the directory itself. E.g., you will be able to perform GET /test/someFile.html, but not GET /test/. You will be able to GET /test/ only if your directory contains an index.html.
See https://github.com/IBM-Swift/Kitura-Sample for an example of using StaticFileServer.
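
For context, here is a minimal, self-contained sketch of how the pieces fit together. The directory name MyDirectoryWithStaticFiles comes from the answer above; the root route and port 8080 are just examples, not from the question:

import Kitura

let router = Router()

// Default: serves files from ./public at the root path.
router.all("/", middleware: StaticFileServer())

// Custom directory under /test/. GET /test/someFile.html will work;
// GET /test/ will only work if the directory contains an index.html.
router.all("/test/", middleware: StaticFileServer(path: "MyDirectoryWithStaticFiles"))

Kitura.addHTTPServer(onPort: 8080, with: router)
Kitura.run()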

Related

How to access a file within a subdirectory using the Github API?

Through the Github API, I can access files at the root of the repository just fine, using this URL structure:
https://api.github.com/repos/kevindecent/decent-salesforce/contents/README.md
But let's say README.md is within a subdirectory called 'files'. This does not work (I get a 'not found' error):
https://api.github.com/repos/kevindecent/decent-salesforce/contents/files/README.md
How can I access files in subdirectories?
EDIT: hitting the following gives me all files at the root, and only one of the 10 directories:
https://api.github.com/repos/kevindecent/decent-salesforce/contents
Very odd.
You have to hit the raw file. It looks like your example is just that, so here's a real file:
https://raw.githubusercontent.com/stefangabos/Zebra_Mptt/master/composer.json
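
If you are fetching such a file from code (e.g. from Swift, as in the question at the top of this page), a rough sketch using URLSession against the raw URL pattern https://raw.githubusercontent.com/<owner>/<repo>/<branch>/<path/to/file> might look like this. The URL is the composer.json example above; the short run-loop wait is only there to keep a command-line tool alive while the request completes:

import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession lives here on Linux
#endif

// Raw file URL: https://raw.githubusercontent.com/<owner>/<repo>/<branch>/<path/to/file>
let rawURL = URL(string: "https://raw.githubusercontent.com/stefangabos/Zebra_Mptt/master/composer.json")!

let task = URLSession.shared.dataTask(with: rawURL) { data, _, error in
    if let data = data, let text = String(data: data, encoding: .utf8) {
        print(text)   // the file contents
    } else {
        print("Request failed: \(error?.localizedDescription ?? "unknown error")")
    }
}
task.resume()

// Keep a command-line process alive long enough for the async request to finish.
RunLoop.main.run(until: Date().addingTimeInterval(10))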

Transfer folder using pscp (Putty) from Windows to Linux

Using PuTTY's pscp -r folder\to\copy\* user@server:/path/to/copy/folder/to, it only copies the contents of folder\to\copy\* and does not include the "main" folder that the subfiles and subdirectories are in.
What I need is that the folder itself is also copied such that I get a folder with the same name as the one I copied with the content inside.
I know I could just create a parent folder for the one I want to copy and pass that as folder\to\copy\*, but that is not what I want.
Just use pscp -r folder\to\copy user@server:/path/to/copy/folder/to (i.e. without the trailing \*).

gsutil - is it possible to list only folders?

Is it possible to list only the folders in a bucket using the gsutil tool?
I can't see anything about this listed in the documentation.
For example, I'd like to list only the top-level folders in a given bucket.
Folders don't actually exist. gsutil and the Storage Browser do some magic under the covers to give the impression that folders exist.
You could filter your gsutil results to show only those that end with a forward slash, but this may not show all the "folders". It will only show "folders" that were created manually (i.e., not those that exist implicitly because an object name contains slashes):
gsutil ls gs://bucket/ | grep -e "/$"
Just to add here: if you drag a folder tree directly into the Google Cloud Storage web GUI, you don't really get an object for the parent folder; in fact each file name is a fully qualified path, e.g. "/blah/foo/bar.txt", instead of a folder hierarchy blah > foo > bar.txt.
The trick here is to first use the GUI to create a folder called blah and then create another folder called foo inside (using the button in the GUI) and finally drag the files in it.
When you now list the files you will get separate entries for
blah/
foo/
bar.txt
rather than only one
blah/foo/bar.txt

Is it possible to use relative paths in sphinx.conf?

I'm using Sphinx on a Linux production server as well as a Windows dev machine running WampServer.
The index configurations in sphinx.conf each require a path setting for the output file name. Because the filesystems on the production server and dev machine are different, I have to have two lines and then comment one out depending on which server I'm using.
#path = /path/to/folder/name #LIVE
path = C:\wamp\www\site\path\to\folder\name #LOCALHOST
Since I have lots of indexes, it gets really old having to constantly comment and uncomment dozens of lines every time I need to update the file.
Using relative paths would be the ideal solution, but when I tried that I received the following error when running the indexer:
FATAL: failed to open ../folder/name.tmp.spl: Invalid argument, will not index. Try --rotate option.
Is it possible to use relative paths in sphinx.conf?
You can use relative paths, but it's kind of tricky because the various utilities will have different working directories.
E.g. on Windows the searchd service will start, IIRC, with a working directory of $WINDIR$\System32.
On Linux, via crontab, I think it inherits whatever working directory was left over from before, so you would have to change the folder in the actual command line.
... i.e. it's not relative to the config file, it's relative to the current working directory.
Personally I use a version control system (SVN actually) to manage it. The version from dev is always the one committed to the repository; the 'working copy' on the LIVE server has had the paths edited to the right location. So when you 'update' to the latest file, only the changes are merged, leaving the local file paths intact.
Other people use a dynamic config file. The config file can be a script (php/python/perl etc) - but this only works on Linux so won't help you.
Or you can just have a 'publish' script. Basically, you edit a 'master' config file, one that can be freely copied to all servers. Then a 'publish' script writes the appropriate local path. It could do it with some pretty simple search and replace, e.g.:
<?php
// Pick the right base path depending on which machine the script is running on.
if (trim(`hostname`) == 'live') {
    $path = '/path/to/folder/';
} else {
    // Note: the trailing backslash has to be escaped inside a single-quoted PHP string.
    $path = 'C:\wamp\www\site\path\to\folder\\';
}
// Replace the $path placeholder in the master config and write the real sphinx.conf.
$contents = file_get_contents('sphinx.conf.master');
$contents = str_replace('$path', $path, $contents);
file_put_contents('sphinx.conf', $contents);
Then have path = $path\name in the master config file; the placeholder will be replaced with the proper path when you run the script on the local machine.

Why are file mounts outside of fileadmin not visible for users?

I tried to create a file mount to set access rights for a user.
Directory:
typoroot/uploads/
I have set the file mount as absolute, with an absolute path to that directory, with and without trailing and leading slashes.
Then I set the file mount for that user.
The file mount is not visible for the user. Another file mount created the same way, but in the fileadmin folder works.
Why do the file mounts outside of fileadmin not work?
I have searched for hours to get this one solved:
For file mounts to work outside of the fileadmin folder, you have to set a config variable in the install tool of TYPO3.
Display the full configuration and search for:
"lockRootPath"
Then enter an absolute path to the directory that you want to lock the user into. In most cases you will want to use the root of your TYPO3 installation.
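For illustration, the variable lives under [BE][lockRootPath] in the TYPO3 configuration. Assuming the installation sits at /var/www/typoroot/ (a made-up location, not from the question), the value would simply be that absolute path:
[BE][lockRootPath] = /var/www/typoroot/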