I am trying to download a large dataset from my organization's Box account onto a remote computing cluster using wget (or even curl). How do I go about doing this? Box's help section is very unclear.
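For what it's worth, one hedged sketch of a command-line download goes through Box's v2 file-download endpoint. FILE_ID, ACCESS_TOKEN, and dataset.tar.gz below are placeholders, and the token would have to come from a Box app / developer token, so treat this as a sketch rather than the definitive recipe:

    # curl: -L follows the redirect Box returns to the actual download URL
    curl -L -H "Authorization: Bearer ACCESS_TOKEN" \
         -o dataset.tar.gz \
         "https://api.box.com/2.0/files/FILE_ID/content"

    # roughly the wget equivalent
    wget --header="Authorization: Bearer ACCESS_TOKEN" -O dataset.tar.gz \
         "https://api.box.com/2.0/files/FILE_ID/content"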
Related
I am trying to run the first example from the zguide (ZeroMQ). The project has two files, a client and a server, and I am working over VS Code Remote-SSH on a RHEL 7 machine.
Here is the structure of the folder and how it looks in the VS Code explorer:
ZEROMQ [SSH: remote_machine]
|_ .vscode
|  |_ tasks.json
|  |_ c_cpp_properties.json
|_ client.cpp
|_ server.cpp
Once in a while I receive this warning:
Unable to watch for file changes in this large workspace folder. Please follow the instructions link to resolve this issue.
The instructions, of course, tell me to increase the file watcher limit, which I don't think is the issue here with such a small project.
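For reference, what those instructions boil down to on a Linux host such as RHEL 7 is raising the inotify watch limit on the remote machine; a sketch (524288 is just the commonly suggested value, not something from the original warning):

    # check the current per-user watch limit on the remote machine
    cat /proc/sys/fs/inotify/max_user_watches
    # raise it persistently and reload sysctl settings
    echo fs.inotify.max_user_watches=524288 | sudo tee -a /etc/sysctl.conf
    sudo sysctl -p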
So, any ideas what's happening and how to resolve this?
I am using the Amazon WorkSpaces product. I built an environment and want to take an image of it, but it fails. When I run the included image checker tool, it says I have multiple profiles.
Looking in Windows Users I only see one profile. I do have a couple of entries under the (hidden) C:\Users for different .NET packages I have installed; is that what is causing it?
I am struggling to transfer a file from my hard disk to the Magnolia server provided by my university. I have posted about the problem here. Someone suggested that I go through this link.
I added PuTTY to my PATH. Then I opened cmd and did the same thing as suggested at the above link. The file I want to transfer to the Magnolia cluster is a C file named mpi_hello.c, and it has to go inside csc510/mpi. The csc510 folder is on the server, and inside csc510 there is another folder, mpi, which is also on the Magnolia server. I can see these two folders when I log in to Magnolia.
As the post suggested, I typed pscp mpi_hello.c#magnolia.universityname.edu:csc510/mpi/mpi_hello.c
I got the result shown below.
Also, nothing was transferred to the csc510/mpi directory of the Magnolia cluster. How can I transfer the file from my hard disk to the Magnolia server?
Thank you.
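For comparison, the usual pscp form puts a username and an @ before the host rather than a #; yourusername below is a placeholder for the account used to log in to Magnolia:

    pscp mpi_hello.c yourusername@magnolia.universityname.edu:csc510/mpi/mpi_hello.c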
I'm trying to introduce the IPython Notebook at my work. One of the ways I want to do that is by sharing my own work as notebooks with my colleagues, so they can see how easy it is to create sophisticated reports and share them.
I obviously can't use the public Notebook Viewer, since most of our work is confidential, so I'm trying to set up a notebook viewer locally. I read this question and followed the instructions there, but now that nbconvert is part of IPython, those instructions are no longer valid.
Can anybody help with that?
You have a couple of options:
As described above, convert to HTML and then serve the files using a simple server, e.g. python -m "SimpleHTTPServer". You can even set up a little Python script that "listens" on one directory: if notebooks are changed or added there, the script runs nbconvert and moves the HTML output to the folder you are serving from. To reach the server, go to yourip:port, e.g. 10.0.0.2:8888 (see the output when you run the ipython notebook command); a command-line sketch follows this list. (If you can serve over the network, you might just as well look into point 2 below.)
If your computers are networked, you can serve your work over the LAN by sharing your IP address and port with your colleagues. This will, however, give them editing access, though that may not be a problem. It means they will navigate to your IPython server, see the notebook dashboard, and be able to run your files.
Host your notebooks on an online server such as Linode; entry-level servers are cheap. Some work is needed to add a password, though.
Convert to PDF and mail it to them.
Convert to a slideshow (now possible in version 1.0) and serve it via option 1 or 2, or just share the HTML file with them.
Let them all run ipython notebook and check your files into a private repo at Bitbucket (it offers free private Git repos). They can then get your files from there and run them on their own machines, or you can just mail the files to them. Better yet, if they won't make changes, share a Dropbox folder with everyone; if they run ipython notebook in that folder, they will see your files (dangerous, though).
Get them in a boardroom and show them. :)
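As a concrete sketch of options 1 and 5 (assuming IPython 1.x, where nbconvert became a subcommand of ipython; my_report.ipynb and the port are placeholders):

    # option 1: convert a notebook to static HTML
    ipython nbconvert --to html my_report.ipynb
    # option 5: convert it to a reveal.js slideshow instead
    ipython nbconvert --to slides my_report.ipynb
    # serve the generated files to colleagues on the LAN (Python 2 built-in server)
    python -m SimpleHTTPServer 8888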
I require a small online space (free) where I can upload/download a few files automatically using a script. The space requirement is around 50 MB. It needs to be something I can automate so it runs without manual interaction, i.e. no GUI. I have a dynamic IP and no technical background in setting up a server.
Any help would be appreciated. Thanks.
A number of online storage services provide 1-2 GB of space for free, and several of them have command-line clients. For example, SpiderOak, which I use, has a client that can run in a headless (non-GUI) mode to upload files, and there's even a way to download files from it with wget or curl.
You set things up once in GUI mode, then put files into the configured directory and run SpiderOak with the right options; the files get uploaded. You can then download ('restore') all or some of the files via another SpiderOak call, or fetch them over HTTP.
About the same applies to Dropbox, but I have no experience with that.
www.bshellz.net gives you a free shell running Linux. I think everyone gets 50 MB, so you're in luck!