I am running a Perl script that accesses an Iomega ZIP drive on Ubuntu Linux, and it returns the following error:
new: /dev/mo: Bad file descriptor
This happens when running either of the following commands:
akailist
or
for i in *.a3s; do akaiwrite -d /breaks/ "${i}"; done
The script I am running is from AKAITOOLS, to save audio files in AKAI .a3s format to an AKAI-formatted Iomega ZIP disk, for use in an Iomega ZIP drive connected to an S3000XL sampler.
The AKAITOOLS Perl scripts are available here:
http://www.lsnl.jp/~ohsaki/software/akaitools/
A description of the process for successfully running this can be read here:
http://www.mpc-forums.com/viewtopic.php?f=42&t=178525
These scripts are pretty old (1998), and I have successfully used them in the past.
I would be grateful for any suggestions or help to resolve this issue.
Create a symbolic link (symlink) at /dev/mo pointing to the device file of your ZIP drive, as sketched below.
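For example, if the ZIP drive shows up as /dev/sdb (that device name is an assumption; check the output of lsblk or dmesg to find the right one on your system), the link could be created like this:

# assuming the ZIP drive is /dev/sdb -- verify with lsblk or dmesg first
sudo ln -s /dev/sdb /dev/mo

After that, akailist should be able to open the drive through /dev/mo.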
So I am attempting to install some files from SendGrid via Composer, using the command line.
I am following this tutorial: https://www.youtube.com/watch?v=fEobqi3N7zw
The guy in the video has no problem using Composer via the command line in the Windows Command Prompt, but when I input the command $ go_www, my PC whines and stamps its feet, giving me the following line:
'$' is not recognized as an internal or external command,
operable program or batch file.
In a nutshell, why?
System Information:
Windows 10 x64
I have looked at other posts on here, tried opening the command line as System Administrator, tried restarting the system, and I can confirm I have Composer installed in the correct directory, all to no avail.
The $ in a shell is the prompt, indicating that the shell is not owned by a superuser; it is not part of the command. Try running go_www. Also, the video you linked seems to be using a bash shell, whereas, judging by the error message you included in your question, you appear to be running a Windows command prompt, which might be a problem too.
In any case, go_www is an alias the video author uses to quickly navigate to the folder of interest. Try manually navigating there using cd, as shown below.
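For example, in the Windows command prompt (the path is a placeholder; substitute wherever your web project actually lives):

cd C:\wamp64\www

Once in the right folder, run the Composer commands from the video without the leading $.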
When I attempt to export data (I want to export all my requests and import them on a different PC) as per the manual (https://docs.insomnia.rest/insomnia/import-export-data), I keep failing to achieve that. I suppose it should store the export on my hard drive (Linux, so in my home directory), but when I run the export, the file dialog only shows an empty Home folder.
When I try to store the exported data again, I can already see the previously stored export. However, when I attempt to locate the file on my hard drive using the find command, it comes up empty. It appears as if Insomnia is storing the export on some kind of virtual drive that I can't actually access. I couldn't find anything about this issue online; the few articles related to Insomnia export implicitly suggest that the export gets automatically stored on the real hard drive. Unfortunately, that is not my case. Also, when I open the import dialog on the target PC, it also opens an empty Home folder, so the problem is not restricted to just one PC.
Please, how do I get the export to work with my normal hard drive? Thanks a lot in advance!
https://github.com/flathub/rest.insomnia.Insomnia/issues/4
"It seems like the flatpak exec command should include the additional argument --filesystem=home.
A temporary workaround is to edit the file at /var/lib/flatpak/[...]/rest.insomnia.Insomnia.desktop and add the argument to the Exec section."
[Desktop Entry]
Name=Insomnia
Exec=/usr/bin/flatpak run --branch=stable --arch=x86_64 --filesystem=home --command=/app/bin/insomnia --file-forwarding rest.insomnia.Insomnia --no-sandbox @@u %U @@
...
Note: I added the --filesystem=home argument to the Exec line.
Then restart the PC.
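Alternatively, instead of editing the .desktop file by hand (which may be overwritten when the Flatpak updates), the same permission can be granted through flatpak's standard override mechanism. I have not tested this against this exact app, but the command form is:

flatpak override --user --filesystem=home rest.insomnia.Insomnia

After restarting Insomnia, the export dialog should then be able to see your real home directory.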
I'm having problems uploading some folders from my local machine to a server.
When I run the command
scp -r -i pathtokey.txt pathtomyfiles pathtotheserver
For some folders the transfer doesn't succeed. I noticed that this happens for the folders where I have R projects. It transfers only a few usually hidden files, such as "source-pane.pper" and "chunks.json", but nothing else.
For other folders, where I don't have any of that, the transfer goes fine, meaning that the command I'm using is OK.
Any suggestions on what is happening here and how to solve it?
Just to give you all the information: my local machine runs Windows.
Thanks a lot,
Francesca
I'm unsure why this is failing, unfortunately. However, an alternative may be to look at using rsync:
rsync <options> <source> <destination>
rsync -azv source_dir/ username@server:~/path/to/folder
I've not tried this while specifying an SSH key explicitly, but the answer linked below explains how to do so. Alternatively, you can just type in the password manually if you know it.
https://unix.stackexchange.com/questions/127352/specify-identity-file-id-rsa-with-rsync
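Based on that answer, the key can be passed via rsync's -e option, which specifies the remote shell command to use (the placeholder paths here mirror the ones from the question):

rsync -azv -e "ssh -i pathtokey.txt" pathtomyfiles username@server:~/path/to/folder

One rsync quirk worth knowing: a trailing slash on the source (source_dir/) copies the directory's contents, while omitting it (source_dir) copies the directory itself.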
FYI, I'm a complete newbie with Perl (as in, I can spell it and only a little more), so I'm trying to learn. What I'm trying to accomplish is using SFTP to transfer files from a Windows machine to a Linux machine.
I've noticed that Perl issues the SFTP get command but doesn't wait for the transfer to finish, so when the Perl script tries to use a file, it can't find it. I know there is the sleep command, but the number and size of files will vary on a weekly basis, so using sleep(600) seems a little silly.
Is there a standard way to pause a Perl script until SFTP finishes transferring all necessary files?
TIA.
Using Net::SFTP might have solved this dilemma, but my workplace won't allow me to download and install stuff, especially in production. So rather than waiting on the typical bureaucracy, I did some more digging around and discovered this:
By calling SFTP in batch mode, with a separate file that contains the SFTP commands, the Perl script has to wait for SFTP to finish executing the commands in that "command" file. So by using the batch-mode option, the Perl script is paused for as long as it takes SFTP to finish its file transfers. A sketch of the approach follows.
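Here is a minimal sketch of that approach; the host, remote directory, and file pattern are made-up placeholders, and note that sftp -b requires non-interactive (e.g. key-based) authentication:

#!/usr/bin/perl
use strict;
use warnings;

# Write the SFTP commands to a batch file (all names are placeholders).
my $batch = 'sftp_commands.txt';
open my $fh, '>', $batch or die "Cannot write $batch: $!";
print $fh "cd /remote/incoming\n";
print $fh "get *.csv\n";
print $fh "bye\n";
close $fh;

# -b runs sftp in batch mode; system() blocks until sftp exits,
# so the script only continues once every transfer has finished.
system('sftp', '-b', $batch, 'user@linuxhost') == 0
    or die "sftp failed: $?";

# The downloaded files are safe to use from this point on.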
I downloaded symbols for Windows Server 2003 from here: http://msdn.microsoft.com/en-us/windows/hardware/gg463028
I did what is described here: http://blogs.msdn.com/b/johan/archive/2007/11/13/getting-started-with-windbg-part-i.aspx. But when I try to run !threadpool, it says:
0:024> !threadpool
Failed to load data access DLL, 0x80004005
Verify that 1) you have a recent build of the debugger (6.2.14 or newer)
2) the file mscordacwks.dll that matches your version of mscorwks.dll is
in the version directory
3) or, if you are debugging a dump file, verify that the file
mscordacwks___.dll is on your symbol path.
4) you are debugging on the same architecture as the dump file.
For example, an IA64 dump file must be debugged on an IA64
machine.
You can also run the debugger command .cordll to control the debugger's
load of mscordacwks.dll. .cordll -ve -u -l will do a verbose reload.
If that succeeds, the SOS command should work on retry.
If you are debugging a minidump, you need to make sure that your executable
path is pointing to mscorwks.dll as well.
This occurs because you have a different minor version of .NET on your computer than the server has. I don't mean .NET 3.5 vs 4.0; I mean version a.b.c.d vs e.f.g.h of the DLL.
You need to get a copy of c:\windows\microsoft.net\framework\v2.0.50727\mscordacwks.dll from the Windows Server 2003 machine.
Then, follow the steps in this post: http://blogs.msdn.com/b/dougste/archive/2009/02/18/failed-to-load-data-access-dll-0x80004005-or-what-is-mscordacwks-dll.aspx.
Try this first:
!sym noisy
.symfix c:\mylocalsymcache
.cordll -ve -u -l
If that doesn't work, then rename the mscordacwks.dll file, copy it to the symbol location specified on your machine, and try again.
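The rename follows the convention described in the linked post, mscordacwks_<debuggerArch>_<dumpArch>_<fileVersion>.dll. The version number below is only an illustration; use the actual file version of the DLL you copied from the server:

ren mscordacwks.dll mscordacwks_x86_x86_2.0.50727.3603.dll
copy mscordacwks_x86_x86_2.0.50727.3603.dll c:\mylocalsymcache

Here c:\mylocalsymcache is the symbol location set with .symfix above.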
Please do not overwrite the file on your computer with the one from the Windows Server 2003 machine. :)