I am just curious about this.
I had a network folder open on one computer, viewing the files in it. From another computer I opened the same folder on the network and deleted a file. On the first computer the deleted file immediately disappeared from the list.
The only way I can think of that it could know is by constantly checking the contents of the open folder. But that sounds like it would waste a lot of resources, and I cannot think of any other way it could be done. So I was just wondering... how does that work?
Thanks.
It's probably a push notification. Rather than the client computer constantly checking, the server sends a message to the client when a change is made.
You never specified what platform you're interested in. In general, the only thing that is portable is polling to see when a file or directory has been updated. Polling once a second or so is generally not too expensive, though over a network file system it may be too much.
Various platforms offer a variety of solutions for being notified when filesystems change. Modern versions of Linux provide inotify. Mac OS X provides FSEvents. On Windows there is a directory change notification API (ReadDirectoryChangesW).
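For what it's worth, Node's fs.watch is a thin wrapper over exactly those platform facilities (inotify on Linux, FSEvents on macOS, ReadDirectoryChangesW on Windows), so it's a convenient way to see push-style change notification in action. A minimal sketch, with the watched path as a placeholder:

```typescript
import { watch } from 'fs';

// Watch a directory for changes. On local filesystems this is backed by
// the kernel notification APIs (inotify / FSEvents / ReadDirectoryChangesW),
// so no polling loop is involved.
const watcher = watch('/path/to/folder', (eventType, filename) => {
  // eventType is 'rename' (created/deleted) or 'change' (contents modified)
  console.log(`${eventType}: ${filename ?? '(unknown)'}`);
});

// Stop watching when done:
// watcher.close();
```

Note that on network mounts these kernel notifications often do not fire for changes made by other machines, which is why watchers there typically fall back to polling.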
When I wanted to export the model I was working on as a Java application, I encountered an error regarding the databases I had loaded into the model. When I clicked OK on the error, I realized that all the files in the folder where I wanted to create the Java application had been deleted. That folder was the desktop, by the way.
Right now all the files (I mean all of them!) on my desktop are deleted, and they don't even show up in the recycle bin. How are we going to solve this situation? How can AnyLogic have the authority to delete all the files in that folder? How is this authority not shared with me, and why was there no warning beforehand?
When you work with software in general, you need to have version control in place that will allow you to recover your information. These problems occur, and if AnyLogic has access to your computer, it's because you granted the permission, and it needs that permission. If you made your desktop your project folder, then I would say you are to blame... why would you do that?
Using Git, as Ben commented, is always a good idea... but it requires you to be conscious about when you commit a version.
What I do is keep all my projects in a Dropbox folder... the good thing is that Dropbox always saves all the files in the folder automatically. This has saved my life multiple times, and I suggest you do something like that in the future. So on one hand you have AnyLogic's autosaving features, which are useful, but when you erase everything by mistake the autosave doesn't help, whereas Dropbox saves everything no matter what.
I have been trying to figure this out, and cannot determine if it is possible or not.
Essentially, I commonly work with a VSCode window containing many files located on an external network drive (CIFS mount in Linux). When these files are changed "on-disk", they do not update in the editor until I switch focus to each file by changing the active editor tab. This means I have to switch tabs, wait for the update to process, and then repeat for all open tabs (could be 10 or 20 tabs).
Is there any way to force all open editors to refresh or revert at once? That would ease my workflow a lot when examining differences between these open files on the fly. There's a command to "Revert File", but that only works on the active file rather than all currently open ones. I've looked in the settings and browsed for an extension, but I can't find anything to accomplish this task.
Well, you can try mapping the external network drive to a local disk and setting the appropriate read and write permissions.
If your computer has a firewall or antivirus installed, you should exclude VS Code from firewall/antivirus inspection.
Otherwise, you can also try to improve your network adapter performance: buffers, throughput, packet latency, etc.
Alternatively, you can use any source control, so your code persists locally and can be synchronized from/to the source control server.
Hope this helps.
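If you're open to writing a tiny extension yourself, the sketch below registers a hypothetical command (the command ID is an assumption, not an existing extension) that walks every open text tab and runs the built-in "Revert File" command on each:

```typescript
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // Hypothetical command ID; bind it to a keybinding or run it from the palette.
  const cmd = vscode.commands.registerCommand('revertAll.openEditors', async () => {
    for (const group of vscode.window.tabGroups.all) {
      for (const tab of group.tabs) {
        if (tab.input instanceof vscode.TabInputText) {
          // Focus the document, then revert it; the built-in revert command
          // only acts on the active editor, hence the tab-by-tab loop.
          const doc = await vscode.workspace.openTextDocument(tab.input.uri);
          await vscode.window.showTextDocument(doc, { preview: false });
          await vscode.commands.executeCommand('workbench.action.files.revert');
        }
      }
    }
  });
  context.subscriptions.push(cmd);
}
```

This is essentially the manual tab-switching workflow automated; it assumes a VS Code version new enough to have the tabGroups API.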
I have a simple task for which some simple solution should exist, yet I cannot come across one.
I have a huge file tree on computer A (development). I have the same (multiple) file trees on computer B (let's call it production). Computer B runs FTP and PHP, not much else.
I need to move the changed files from the tree on A to the tree on B as efficiently as possible; i.e., if just one file changes, only that one file is transferred. It would be enough to compare the local and remote trees using last modification dates; nothing else is needed.
I tried to use the good old Ant for it, but that really does not work, as its FTP task is a really bad one (it does not preserve modification dates on PUT, and so on). What other options are there if I do not want to write the code for such a task myself? I'd expect there is some tool out there that would fetch a remote directory listing, compare it against the local tree, select only the changed files, and transfer them to the destination. Do you know how I could do it? Some sort of FTP- or PHP-based distributed robocopy?
EDIT: I should have added that I mean doing this on a Windows 10 computer, syncing to an FTP/PHP server using an automated command-line script, not a GUI.
Actually, I solved the issue using WinSCP. I managed to integrate it into Ant by calling it through a task and using WinSCP's synchronize command. For my current folder size it is fast enough; let's see how it goes later. The FTP task in Ant was not useful, since it does not preserve the modification dates.
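For reference, a minimal WinSCP script doing the same thing from a Windows command line might look like this (host, credentials, and both paths are placeholders):

```bat
winscp.com /log=sync.log /command ^
    "open ftp://user:password@example.com/" ^
    "synchronize remote C:\dev\mytree /htdocs/mytree" ^
    "exit"
```

synchronize remote compares the trees (by modification time, by default) and uploads only the files that differ, which is exactly the robocopy-like behaviour asked for above.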
There are a few questions similar to this, so I'll try to be as clear as possible.
We have an existing, fairly large and complex, GWT webgame I have been asked to make work offline. It has to be offline in pretty much the strictest sense.
Imagine we have been told to make it work off a CD-ROM.
So installation is allowed, but we can't expect the users to go to a Chrome/Firefox store and install it from there. It would need to be off the disc.
Likewise, altering the browser's start-up flags would be unreasonable to expect of users.
Ideally, it would be nice if they just clicked an HTML file for the start page and it opened in their browser of choice.
We successfully got it working this way in Firefox by adding:
"<add-linker name='xsiframe' />"
to our gwt.xml settings. This seems to solve any security issues Firefox has with local file access.
However, this does not solve the problem for Chrome.
The main game starts up, but various file requests are blocked due to security issues like this one:
XMLHttpRequest cannot load file:///E:/Game%20projects/[Thorn]%20Game/ThornGame/text/messages_en.properties. Cross origin requests are only supported for protocol schemes: http, data, chrome, chrome-extension, https, chrome-extension-resource.MyApplication-0.js:34053 com_google_gwt_http_client_RequestBuilder_$doSend__Lcom_google_gwt_http_client_RequestBuilder_2Ljava_lang_String_2Lcom_google_gwt_http_client_RequestCallback_2Lcom_google_gwt_http_client_Request_2 MyApplication-0.js:34053
Now, I was aware that same-origin policy issues might pop up, as during development we often tested locally using flags in Chrome to bypass them.
Thing is... now I don't know how to get around them when we can't use startup flags.
Obviously, in the example given it's just the .properties file GWT uses to get some language-related text. I could dump that inline in one way or another.
However, it's only one of many, many, many files being blocked.
The whole game was made to run off *.txt game scripts on the server, to allow easy updating by non-coders. Really, the actual GWT code is just an "engine", and all the XMLHttpRequested files supply the actual "game".
These files are of various types: csv, txt, ntlist, jam.
The last two are custom extensions for what are really just txt files.
All these files are blocked by Chrome's security. From what I can make out, it seems only images are allowed to be accessed locally.
Having all these files compiled in would just be impossible, as they are not fixed in number (i.e., one central .txt file determines various scene .txt files, which in turn determine various object files and directories...).
Putting all this into a bundle would be a nightmare to create and maintain.
So in essence I need some way to supply an offline version of a GWT project that can access a large number of various files in its subdirectories without security issues.
So far, all I can think of is:
A) There's something I can tell Chrome via HTML or GWT that allows these files to be read in Chrome like Firefox can. (I suspect this isn't possible.)
An alternative to XMLHttpRequest, maybe?
B) I need to somehow package the game plus a web browser in an executable package that has permission to access files in its directories (http://www.appcelerator.com/titanium ???).
C) I need to package, and have the user run, a full webserver that can then deliver all these files in an XMLHttp-accessible way.
D) Bit of a funny one... we can't tell the user to add flags to the browser start-up, but maybe I could write a game installer which just detects whether they have Chrome or Firefox. It would then open the game's HTML in their browser with the correct flags for them. This would open up security issues if they browse elsewhere with that instance, though, so I'd presumably need other flags to disable the URL bar, if that's possible.
I am happy to make various changes to our code to achieve any of this, but as mentioned above, there's no way to determine all the files needing to be accessed at compile time.
And finally, of course, it all has to be as easy as possible for the end user.
Ideally just clicking an HTML file, or installing something no more complex than a standard Windows program.
Thanks for reading this rather long explanation; any pointers and ideas would be very welcome. I will especially appreciate multiple different options, or feedback from anyone who's done this.
========================================
I accepted the suggestion to use Chromium Embedded below.
This works and does what I need (and much, much more).
To help others who might want to use it, I specifically made two critical changes to the example project:
Because CEF needs an absolute path to the web app's local HTML, I wrote a C++ function to get the directory the .exe was launched from. This was a platform-specific implementation, so if you're supporting a few OSes (which CEF does), be sure to write dedicated code for each.
Because my web app will make use of local files, I enabled the Chrome flag for this by changing the browser settings:
browser_settings.file_access_from_file_urls = STATE_ENABLED;
These two changes were enough to get my app working, but they are obviously the bare minimum for a working application. Hopefully my findings will help others.
I'd suggest going the wrapper route. That is, provide a minimal browser implementation that opens your files directly. One option is Chromium Embedded [1]. If the nature of the application absolutely requires the files to be served as non-file URLs, then bundle a minimal webserver, have the on-disk executable start the server, and open the bundled browser with whatever startup arguments you want.
[1] https://bitbucket.org/chromiumembedded/cef
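If the bundled-webserver route fits better, the server itself can be very small. A sketch of a minimal static file server in Node.js (the folder name, port, and extension-to-MIME map are assumptions to adapt):

```typescript
import { createServer } from 'http';
import { createReadStream, existsSync, statSync } from 'fs';
import { extname, join, normalize } from 'path';

const ROOT = join(__dirname, 'webapp'); // hypothetical folder with the compiled GWT output
const TYPES: Record<string, string> = {
  '.html': 'text/html', '.js': 'text/javascript', '.css': 'text/css',
  '.txt': 'text/plain', '.csv': 'text/csv',
  '.ntlist': 'text/plain', '.jam': 'text/plain', // the custom text formats
};

createServer((req, res) => {
  // Map the URL onto ROOT, refusing anything that escapes it (../ tricks).
  const url = decodeURIComponent((req.url ?? '/').split('?')[0]);
  const path = normalize(join(ROOT, url === '/' ? '/index.html' : url));
  if (!path.startsWith(ROOT) || !existsSync(path) || !statSync(path).isFile()) {
    res.writeHead(404);
    res.end('Not found');
    return;
  }
  res.writeHead(200, { 'Content-Type': TYPES[extname(path)] ?? 'application/octet-stream' });
  createReadStream(path).pipe(res);
}).listen(8000, () => console.log('Serving on http://localhost:8000/'));
```

Because everything is then served over http://localhost, the XMLHttpRequest calls work unchanged, and the loose *.txt/.ntlist/.jam script files can stay as plain files on disk.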
Whenever I alter (or even just re-save without altering) a Perl file, it completely takes down our backend. I have no idea what the problem could be. Permissions are correct. The encoding is correct: UTF-8. The transfer mode was ASCII.
I may not deal with Perl too much, but I have no idea what the problem could be, and neither does the network admin hosting our website.
Text editors I tried: Dreamweaver, TextMate, Vim
Operating systems I tried: Mac OS X, Linux (Ubuntu)
FTP clients I tried: Transmit (Mac), Filezilla (Linux (Ubuntu))
It's not that it's bad code; I even tried opening a file and simply saving it, and my backend still goes down.
The network admin told me that he ran the files through a dos2unix converter and it worked immediately. I of course tried this and it did not work. Moreover, it wouldn't make any sense, since I tried this in some of the most respected editors, and I don't think they would make such drastic changes to the file format without any user input (when I say respected editors, Dreamweaver is not included in that sentiment).
I personally think it is some sort of server-side issue, because I have crossed my t's and dotted my i's with regard to any possible client-side issue, and I have tried everything. Any opinions as to the root of this problem, and any possible solutions? Thanks in advance.
Try setting binary mode in your FTP client. That will allow you to experiment with different line endings (dos2unix) on the client side, without worrying about them being translated during transfer.
I've had this problem in the past and line-feeds were indeed the culprit.
Your editor and/or FTP program may be mangling the linefeeds.
Running dos2unix on the server is a good way to confirm that line endings are the problem, but it doesn't tell you where they are being introduced.
Generate an MD5 hash of the file after each step of saving and transferring it, to find where it changes.
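Alongside hashing, it can help to count the line endings directly at each stage; if the CRLF count changes between your editor's copy, the uploaded copy, and the server copy, you've found the step that mangles them. A small sketch (the file path argument is a placeholder):

```typescript
import { readFileSync } from 'fs';

// Count CRLF vs bare-LF line endings in a file, e.g. before and after upload.
const data = readFileSync(process.argv[2] ?? 'script.pl', 'latin1');
const crlf = (data.match(/\r\n/g) ?? []).length;
const lf = (data.match(/\n/g) ?? []).length - crlf;
console.log(`CRLF: ${crlf}, LF-only: ${lf}`);
```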
You do not say what kind of framework/server you are using.
Maybe the server reloads the file while it is still being written by FTP or whatever? (I.e. that the file is not complete when the server reads it?)
Will a server restart fix the problem once the file is uploaded?
It sounds like you are using dos2unix before the transfer, but the network admin is using it after. Perhaps it does something different in that case.
How many lines are in the file? What is the file size before and after you save it, after you transfer it, and after transfer and running dos2unix on it?
If this is just a line ending problem, you might point your network admin at http://www.perlmonks.org/?node_id=586942.
Response to rebra: No frameworks are used, and I don't know what kind of server this is on. This is basically a one-man project on a shared host which was pretty horribly maintained, and I'm trying to clean house.
Yeah, that does make sense, and I asked the server people about that (one of my first questions, actually), but even if that is the case, I can't reboot via Plesk (which is kind of like cPanel). But thanks for that; you put into technical words what I was thinking the whole time.