SVG files not displaying with Cloudberry PSSnapIn - PowerShell

I am uploading a folder with subfolders containing SVG files. The MIME type is set to image/svg+xml.
When I transfer the folder using the Cloudberry GUI there is no problem, and the publication with the SVG files displays perfectly.
When I transfer the same folder using PowerShell with the Cloudberry PSSnapIn, with -CheckFileConsistency set, the folder uploads but the SVG files do not render.
I believe the files uploaded with the snap-in are being served as 'application/octet-stream', whilst those transferred manually are correctly served as image/svg+xml. I've tried transferring with TNTDrive and again the files display with no issue.

Found the answer: it is not enough to set the .svg content type using the Cloudberry GUI. You have to set it in the PowerShell script as well, using Add-CloudContentType.
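For anyone hitting the same thing, here is a rough sketch of how the upload script could look with the content type registered in-script. Add-CloudContentType is the cmdlet named above, but its parameter names (-Connection, -Extension, -ContentType) are my assumption from the cmdlet name, and the keys, bucket and paths are placeholders, so check Get-Help Add-CloudContentType before relying on it:

# Load the CloudBerry snap-in and open an S3 connection (keys are placeholders).
Add-PSSnapin CloudBerryLab.Explorer.PSSnapIn
$conn = Get-CloudS3Connection -Key "AKIA_PLACEHOLDER" -Secret "SECRET_PLACEHOLDER"

# Register the MIME type for .svg in the script itself.
# NOTE: parameter names are assumed here; verify with Get-Help Add-CloudContentType.
Add-CloudContentType -Connection $conn -Extension "svg" -ContentType "image/svg+xml"

# Upload the folder as before (placeholder bucket and local path).
$dest = $conn | Select-CloudFolder -Path "mybucket/publication"
$src  = Get-CloudFilesystemConnection | Select-CloudFolder -Path "C:\publication"
$src | Copy-CloudItem $dest -Filter "*" -CheckFileConsistency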

Related

How to display a PowerPoint file and a Word file in Flutter after getting the file path

I listed all docx and pptx files from device storage, and now I want to open them using the file path. I tried different packages but I am getting a lot of issues. Can someone provide me with material related to opening document files, or any package that can open files using a file path? Thanks.
You can try this package; it will open the files in the default application from their path.
https://pub.dev/packages/open_file_plus

File selected in Windows Explorer with the Preview Pane locks the file so PowerShell cannot output to that file

I have a scheduled script that outputs a bunch of HTML files with static names to a remote location. I noticed that if I have one of those files selected in Windows Explorer, so that its contents are shown in the Preview Pane, then PowerShell cannot overwrite that file and skips updating it.
This only happens if the output files are in a remote location; it works just fine if the files are local.
How do I force PowerShell to overwrite remote files in this situation? Lots of users work with those reports, and if one of them leaves a Windows Explorer window with one of those files highlighted overnight when the script runs, the file is not going to be updated.
Move the HTML files to a web server; that will solve your problem entirely. IIS setup on Windows Server is next, next, next. You can leave a link to the new file location (https://....) in the old place, so users can easily navigate to the new location. Possibly this link can be automated (not sure, because of modern security standards).
Try [System.IO.File]::Delete($path) just before writing the file. This removes the file's entry from the filesystem but leaves the file open for anyone who currently has it open, so your script writes a new file with the same name. The old file still exists without a name (deleted) but stays open until everyone closes it. Check that it was actually deleted with a refresh! See the sketch after this list.
Try [System.IO.File]::Move($path, $someTrashFullName) just before writing the file. $someTrashFullName probably must be on the same drive. Same as Delete, but it renames the file instead. Some self-updating software uses this strategy: the file is renamed but is still kept open under the new name.
Try replacing the file with a shortcut to another file. You can generate files with different names and update the shortcut programmatically.
HTML files that change location using JS? They read a nearby JSON file (generated by the export script) and look up the new filename there. So the user opens a static, unchanged A.html, the JS inside looks up the new name in A.json and redirects the user to A-2020-08-11.html. I'm not sure browsers allow JS to read JSON files when the files are opened from a network drive.
The only way left is to stop the network share and/or close the open files server-side.
Maybe some fun with disabling the Preview Pane for this folder, or disabling it completely?
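A minimal sketch of the delete-then-write approach from the second suggestion; the path and the report content are placeholders, and the Move variant is shown commented out:

$path = '\\server\share\reports\report.html'
$html = Get-Process | ConvertTo-Html   # stand-in for whatever builds the report

# Drop the directory entry first; anyone who already has the file open keeps
# their handle on the now-nameless old file, so the Preview Pane lock should
# no longer block the write.
[System.IO.File]::Delete($path)

# Writing again under the same name creates a brand-new file.
$html | Out-File -FilePath $path -Encoding utf8

# Variant from the next suggestion: rename instead of delete (placeholder path).
# [System.IO.File]::Move($path, '\\server\share\reports\trash\report.html.bak')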
Try with -Force, but to me it seems to be more of a permission issue.
Remove-Item -Path '\\server\share\file' -Force

Unable to attach certain file types to MoinMoin page?

I configured a local MoinMoin server and am trying to attach an Excel file to a page. Uploading cpp, ods and txt files works fine, but pdf, exe, doc, xls and xlsx files DON'T get attached. Once I click the 'upload' button, I get redirected back to the wiki page. When I go back to the Attachments section I don't see the file attached to the page.
(Running MoinMoin 1.9.3 + Apache2.2 on Windows XP.)
Workaround:
It seems to be an issue only when the moin Python scripts are called as CGI scripts. I switched to using WSGI in Apache (following the instructions from http://code.google.com/p/modwsgi/wiki/IntegrationWithMoinMoin) and am now able to upload any file type.

Files on my WebDAV-mapped drive show rendered output in IDEs instead of the actual content

On my Mac I mounted a shared drive using WebDAV by going to "Finder > Go > Connect to server".
Now, when I view the files using TextWrangler or TextEdit, I can see the PHP code that I want to edit.
However, if I try to use an IDE like NetBeans/Eclipse/TextMate and create a new project with my shared drive as the "Existing sources" folder, I cannot see the PHP code.
Instead I see the HTML output of the files, as if I were viewing them through a web browser. Also, if I try to view a file that isn't normally accessible (a command line script), I see its output as if it had been called from the command line.
But the weird thing is that if I use TextMate to edit a single file from the shared drive, I can see the PHP code I am trying to edit. It just doesn't work as a project.
Any suggestions or solutions for how I can use an IDE to edit files over WebDAV? And why do my IDEs display the rendered content instead of the actual file on the file system?
I'm not a specialist at all, but I seem to remember that WebDAV clients do send GET requests.
If I'm correct, your server may not be able to discriminate between an HTTP GET and a WebDAV GET, and is thus rendering your .php files. Why it would work one way with a project and another way with individual files is not clear, though.
Do you get rendered files when you add files to your project manually as well?

cURL ftp transfer scenario

I'm trying to automate uploading and downloading from an FTP site using cURL inside MATLAB, but I'm having difficulties. Essentially I want one computer continuously uploading new files to the FTP site, yet since there is a disk quota on it, I want another computer continuously downloading and removing those same files from the FTP site.
Easy enough, but my problem arises from wanting to make sure that I don't download a file that is still being uploaded, thereby resulting in an incomplete file.
First off, is there a way in cURL to make it so that a file isn't available for download from the FTP site until the entire file has been uploaded?
One way around this is that I could upload files to one directory and, once they have finished uploading, transfer them to a "Finished" directory on the FTP site. Then the download program would only look for files inside that "Finished" directory. However, I don't know how to transfer files within an FTP site using cURL.
Is it possible to transfer files between directories on an FTP site using cURL without having to download the file first?
And if anyone else has better ideas on how to perform this task, I'd love to hear em!
Thanks!
You can upload the files using a special name and then rename them when done, and have the download client only download files with that special "upload completed" name style.
Or you move them between directories just as you say (which is essentially a rename as well, just changing the directory too).
With command-line curl, you can perform "raw" commands after the upload with the -Q option, and you can even find a tiny example in the curl FAQ: http://curl.haxx.se/docs/faq.html#Can_I_use_curl_to_delete_rename
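For example, called from PowerShell (the host, credentials and directory names are all placeholders): the file is uploaded under a temporary name with -T, and the dash-prefixed -Q commands are sent after the transfer to rename it into the "Finished" directory.

$local = 'C:\outgoing\data_001.dat'

# Upload under a temporary name, then rename server-side once the transfer completes.
& curl.exe -T $local "ftp://user:password@ftp.example.com/incoming/data_001.dat.part" `
    -Q "-RNFR incoming/data_001.dat.part" `
    -Q "-RNTO finished/data_001.dat"

# The downloading side then fetches and deletes only files under finished/
# (or, if you rely on the naming scheme alone, only files without the .part suffix).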