I have a Samba host computer where the remote files are located. I want to edit those files from my laptop without copying them first, just like on Windows. I am using VS Code on Ubuntu and on Windows, with both OSes in a dual-boot setup. I am in the process of transitioning my workspace to Linux, but the problem is that I can't edit files directly because VS Code won't display Samba mount points. I figured out that gedit can edit the files directly, but who uses that as a code editor?
I am looking for a service or application that mounts the Samba shares in a way that VS Code can see, so I can edit the files directly. I don't know the right keywords to search for a solution, so any help would save me time.
Answering my own question
It turns out that I just had to install this package:
gvfs-fuse
Now I can edit files directly on my Samba share regardless of which application I use.
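For anyone else hitting this: with gvfs-fuse installed, GVFS mounts also show up as ordinary paths under /run/user/<uid>/gvfs, so any editor can open them. A minimal sketch of opening such a mount in VS Code (the server and share names below are made up, and the exact mount path can vary by distribution):

    import os
    import subprocess
    from pathlib import Path

    # gvfs-fuse exposes GVFS mounts as regular directories here (path may vary by distro).
    gvfs_root = Path(f"/run/user/{os.getuid()}/gvfs")

    # Hypothetical share smb://fileserver/projects, mounted e.g. via the Files app or gio.
    share = gvfs_root / "smb-share:server=fileserver,share=projects"

    if share.is_dir():
        subprocess.run(["code", str(share)], check=True)  # open the share directly in VS Code
    else:
        print(f"Share not mounted; expected it at {share}")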
I have a machine without internet access on which I'd like to install a list of VSCode extensions.
Is there an automated way of downloading the extensions while online, so I can copy the files to the offline machine and install them there? Ideally I'd like to be able to re-run the process to download updates.
I'm aware it's possible to manually download each extension, but given the number of extensions and their frequency of updates, I'd ideally like a repeatable process.
Would something like running a script in Portable Mode help?
Thanks!
If you can install VS Code on the machine that has internet access, you can use it to download the extensions and copy them to the other machine.
Let's say machine A has an internet connection and machine B has no internet access.
Download the portable version of VS Code (the .zip version) from https://code.visualstudio.com/Download
After unzipping the VS Code download on machine A, create a data folder within VS Code's folder.
The data folder can be moved to other VS Code installations.
Copy the complete VS Code directory from machine A to machine B; this way you will have a portable version of VS Code there.
Now, whenever you need to update the extensions on machine B, update them on machine A and copy the data folder (or, more precisely, the data/extensions folder) over to machine B.
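If you want to script that sync step, something along these lines works (a rough sketch only; the source and destination paths are examples, e.g. a USB stick as the transfer medium):

    import shutil
    from pathlib import Path

    # Example paths only: adjust to wherever your portable VS Code and
    # transfer medium (USB stick, network share, ...) actually are.
    src = Path(r"C:\tools\VSCode\data\extensions")     # machine A, portable install
    dst = Path(r"E:\vscode-transfer\extensions")       # e.g. a USB stick

    # Replace the old copy so removed or outdated extensions don't linger on machine B.
    if dst.exists():
        shutil.rmtree(dst)
    shutil.copytree(src, dst)
    print(f"Copied {sum(1 for p in dst.iterdir() if p.is_dir())} extension folders to {dst}")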
FYR, I just copied extensions from the folder below to the private-network machine and they worked:
C:\Users\{xxx}\.vscode\extensions
I'm developing a Jupyter Notebook for my team to use to catalogue and analyse some proprietary data. I'm ready to share it with the team for on-going execution and development. The team generally have Windows 10 workstations and are skilled engineers, though not data scientists. No one currently uses Jupyter.
I now realise I might have thoroughly misjudged Jupyter's ability to support this sort of working environment.
Option 1: Individual installations
This is the worst-case scenario. Anyone who wants to run or modify the notebook needs to install Jupyter. Anaconda is probably the best way to go, but it's a big, ugly, scary install. Worse, every user will have to install and manage additional libraries. Any notebook change that requires a kernel change will have to be manually applied to each installation.
Surely, given that Jupyter is client-server, this is not how it is intended to be used.
Option 2: One server, many clients
The obvious alternative is to host the Jupyter server on a network accessible computer and have all users connect to it with a browser. That way there's only one shared installation to manage and each user just needs a URL to access it.
But there's a gotcha - the server expects the notebook to be on its own file system! So every user will access the same notebook file. This makes version control very problematic - no one can check out their own copy of the notebook for independent edit and commit sessions. Instead, changes will overwrite the only copy, and commits/reverts/diffs will have to be done on the server (or by mounting the server's file system).
Option 3: Server in Docker image, each user runs a container
Docker to the rescue? That way we can build/maintain one server image (and even version control it) and each user only needs to have a Docker engine installed to instantiate the image (which is a friendly 8GB download!!). They connect to their own container which, with a bit of scripting trickery, will be pointing at their own copy of the notebook.
This option only took 20 hours to investigate before I discovered that it fundamentally sucks. Working with the kernel is tricky and requires a lot of new skills. But more showstopping: nothing that shows a Qt window will work. The qtconsole we can do without, but part of our notebook shows a File Open dialog, and the best way to do that is with a Qt widget. With the server in a Docker container expecting an X Window environment, and the client in a Windows browser, the widget cannot be shown.
The Qt issue was the last of many, many issues trying to get the Docker option running. Everything from matplotlib to path mapping, from os library calls to ipywidgets needed to be investigated, tweaked, Googled, chopped and changed to work. I'm fairly convinced that these dramas would be on-going.
Conclusion
There are lots of discussions around Jupyter version control. There are lots of options for read-only sharing. And there's even a project that builds a Docker container at runtime to provide executable access to a notebook. But there is scant advice on using Jupyter in a team environment.
Given the endless complications when the server is not natively running on the same machine as the client, I'm starting to believe Option 1 is the only sane way to go. Before I go to my colleagues with the crappy news, are there any other suggestions?
I ended up having a fruitful discussion on the Jupyter Google Group and have confirmed that, out of the box, Jupyter does not support this sort of working environment. Indeed, crucially, Jupyter expects the server to have a single user.
The most promising DIY solution was firstly to deploy JupyterHub, for two reasons:
It launches a new server instance for each user, preventing any multi-user per server issues.
It prompts for users to identify themselves, allowing different actions to be taken depending on the user.
And secondly, to have the server mount each user's file system (or an equivalent network architecture), so it can point the user back to their own local files.
I have not implemented this strategy (making do with Option 1 for now!) but it certainly makes sense.
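For reference, a minimal jupyterhub_config.py along those lines might look like the sketch below. I have not tested it; it assumes the users already have system accounts on the host, and the notebook_dir value is just an illustration of pointing each user at their own files.

    # jupyterhub_config.py - minimal sketch, not a tested deployment.
    c = get_config()  # provided by JupyterHub when it loads this file

    # Have users identify themselves against local system accounts.
    c.JupyterHub.authenticator_class = "jupyterhub.auth.PAMAuthenticator"

    # Launch one single-user server per user as that system user...
    c.JupyterHub.spawner_class = "jupyterhub.spawner.LocalProcessSpawner"

    # ...and start it in that user's own directory (e.g. a mounted home or network share).
    c.Spawner.notebook_dir = "~/notebooks"  # '~' expands per spawned user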
I'm working remotely and I need to access files on a server that's only accessible through ssh from another machine.
For example, if my files are on server2, I need to ssh me@server1 and then, once I'm on that machine, ssh me@server2.
Is there a way to set up remote systems in Eclipse (I'm using Zend Studio) to get access to my files?
Thanks.
The short answer is no; however, you do have some options...
Eclipse is only aware of files in an Eclipse workspace (with some exceptions), so you need to make your files available to your Eclipse instance. To do so, you could download the remote files and keep them locally, or make the files accessible via a network share.
All in all, you need to make those files available on a file system visible to the local Eclipse instance. Once that is done, you can add/import the file into your Eclipse workspace.
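One concrete way to do that is sshfs, which can hop through the intermediate machine using OpenSSH's ProxyJump option. A rough sketch wrapping it in Python (hostnames and paths are placeholders; it assumes sshfs and a reasonably recent OpenSSH client are installed):

    import subprocess
    from pathlib import Path

    jump_host = "me@server1"             # the machine you can reach directly
    target = "me@server2:/var/www/app"   # placeholder path to your files on server2
    mount_point = Path.home() / "mnt" / "server2-app"
    mount_point.mkdir(parents=True, exist_ok=True)

    # sshfs passes unknown -o options through to ssh, so ProxyJump routes via server1.
    subprocess.run(
        ["sshfs", "-o", f"ProxyJump={jump_host}", target, str(mount_point)],
        check=True,
    )
    print(f"Mounted; point your Eclipse/Zend Studio project at {mount_point}")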
Is it possible to set up a remote NetBeans C++ project where the source files are only accessible via SSH?
My project needs to build on a Linux box, but I'd like to develop it on a Windows machine.
Checking out the code via SVN to my Windows machine is not an option, since there are a few files that differ only by case, and NTFS is not case-sensitive (unfortunately, I cannot change them).
I'm well aware that Windows can be kind of forced to be case-aware and that the ideal solution is to just rename those files to something sane.
However, I'm just trying to solve this using NetBeans. Since it's a remote project anyway, why bother keeping any files locally?
Thanks
Currently, no. In general, having program files whose names differ only by case is bad practice.
You can enable case sensitivity in Windows - you may need to have a Professional version or better.
For Windows XP: http://support.microsoft.com/kb/817921
For Windows 7: http://technet.microsoft.com/en-us/library/cc732389.aspx
See also: Windows Services for Unix
Another solution would be to set up VNC/RDP on the remote Unix system. The overall solution should be to conform to a better file naming convention:
Programmer 1: "Hey man, take a look at noCamelCase.cs - I just rewrote it."
Programmer 2: "Um, nocamelcase.cs is blank."
There are two ways of doing remote builds with NetBeans. In the first, the project is stored locally. You just create a regular project, and on the second page of the wizard you specify the network directory with the source and the remote build host. I've used this from a Solaris client to a Linux server, but not from Windows, as we don't have the mounts exported via SMB. This uses ssh and some shared-library interposers to get the build info.
The second way is to create a remote project. In this case the project is created on the remote host and data is copied on demand to the client. I've only done a few tests with this, as I preferred the first method for its much better latency.
Lastly, you could either use VNC or install an X server on your Windows machine and do everything on the Linux machine.
What options are there for saving and retrieving documents to and from the cloud, from within Emacs?
I use Emacs at work, on a Windows machine, and at home, on a Linux box, so ideally I would want a solution that works more or less out of the box for both operating systems.
I tried g-client, but could not quite get it to work. Obviously, if there are no other, simpler options, I'm just going to have to spend a couple more hours on it.
Many thanks,
Andreas
Dropbox is pretty universal. I even store my Emacs config files there. Works on Windows, Linux, OS X, and iPhone. Syncs automatically. Stores history. Is free. What else do you want? :-)
Two options that I can think of:
If you have access to a server somewhere that runs ssh, then use ssh with Tramp. You can also run an ssh server on your home Linux box and access your home files from work. Tramp works perfectly fine on Windows with ssh from Cygwin. It will automatically grab a file (provided that you give Emacs something like /ssh:yourusername@yourserverhost:~/yourfile), put it in a temporary file on your computer, then copy it back to the host when you save it.
Use a source control system like SVN or Git. Again, you can host the server at home or find online hosts (most are for open source and are thus public, but some are free and private; I use unfuddle.com). You would have to commit/update regularly, but you can easily automate that if you want, and the source control system gives you a nice history of your files and a safety net in case you do something very wrong.
Emacs has excellent integration with source control systems. If you find the built-in one insufficient (it is quite generic and thus does not offer an interface to some specific features of a particular source control system), there are plenty of good alternatives (psvn for SVN and magit for Git, for example).
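For example, the commit/update step could be automated with a tiny script run from cron or a keybinding (a sketch assuming Git; the repository path is a placeholder):

    import subprocess
    from pathlib import Path

    REPO = Path.home() / "documents"  # placeholder: your synced Git repository

    def git(*args):
        subprocess.run(["git", *args], cwd=REPO, check=True)

    git("add", "-A")
    # Only commit if something is actually staged (non-zero exit means changes exist).
    staged = subprocess.run(["git", "diff", "--cached", "--quiet"], cwd=REPO)
    if staged.returncode != 0:
        git("commit", "-m", "auto-sync")
    git("pull", "--rebase")
    git("push")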
sshfs, if you have a good connection speed.
Otherwise there's always tramp-mode for Emacs.
Edit: Just saw you are using Windows.
It's been some years since I used Windows as my desktop, but I used WebDrive back then. It sort of works, although it was always a bit unstable.
Emacs has great support for remote file systems via Tramp, so the real question is what you should use as the remote FS. There are a bunch of them, and as long as there is a way to mount them or log in via ssh (for Tramp), you should be OK.
I use JungleDisk - works great for Windows, Linux and Mac. Starts around $2 per month and there's a cap of around $90 per year. You can back up to S3 or to Rackspace.
It integrates at the file-system level, so you can either read/write directly to it or create links from it to your local file system. I use that to share my .emacs, .bash, etc. between multiple machines.
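As an illustration of the linking approach, assuming the JungleDisk (or any synced) volume is mounted somewhere under your home directory (all the paths below are hypothetical):

    import os
    from pathlib import Path

    shared = Path.home() / "jungledisk" / "dotfiles"  # hypothetical mount location
    home = Path.home()

    for name in (".emacs", ".bashrc"):                 # the shared copies live in `shared`
        target = shared / name
        link = home / name
        if link.exists() or link.is_symlink():
            continue                                   # don't clobber an existing config
        os.symlink(target, link)                       # on Windows this may need extra privileges
        print(f"linked {link} -> {target}")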
Chris