I have noticed that MATLAB (R2011b on Windows 7, 64 bit) tends to slow down if I am in debugging mode for a long period of time (e.g. 3 hours). I don't recall this happening on previous versions of MATLAB.
The slow down is small, but significant enough to have an impact on my productivity (sometimes MATLAB needs to wait for up to 1 sec before I can type on the command line or on the editor).
I usually spend hours in debug mode (e.g. after stopping at a keyboard statement), coding full projects in this mode. I find working in debug mode convenient for growing my code organically while being able to inspect it at any point during execution.
The odd thing is my machine has 16 GB of RAM and the total size of all workspaces while in debugging mode is usually less than 4 GB. I don't have any other large process running in the background, and my system reports ~8GB of free RAM.
Also, unfortunately MATLAB does not let me call pack from debug mode; it complains with:
Warning: PACK can only be used from the MATLAB command line.
I have reproduced this behavior after restarting MATLAB, rebooting my system, and on different days. With that, my questions are:
Has anybody else noticed this? Is there anything I could do to prevent this slowdown without exiting debugging mode?
Are there any technical notes or statements from Mathworks addressing this issue?
In case it matters, my code is on a network drive, so I added the following to my startup.m file, which should mitigate any performance impact from the network location:
system_dependent('RemoteCWDPolicy', 'None');
system_dependent('RemotePathPolicy', 'None');
system_dependent('DirChangeHandleWarn','Never');
I have experienced some similar issues. The problem ended up being that MathWorks changed how MATLAB caches files. For some users, it now stores data in the TMP folder defined by the environment variables. That folder was being scanned by antivirus software, causing a lot of performance problems. Of course, IT wouldn't let us exclude the TMP folder from scans, so we added a line to our startup script that points the TMP environment variable to a location inside an already-excluded folder.
You don't have to worry about changing the variable back or messing up other programs. When an application launches, it copies the environment variables into its own local copy; any changes you make affect only that local copy, not the system-wide values.
Here is the call you will need:
setenv('TEMP', 'C:\TEMP');
I'm not sure if it was TMP or TEMP. Check your environment variables to be sure.
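For example, here is a minimal startup.m sketch of what was described above, assuming C:\TEMP exists (or can be created) and is excluded from antivirus scanning; the folder path is only an example:

% Point MATLAB's temporary-file location to a folder excluded from antivirus scans.
% C:\TEMP is just an example path; use whatever excluded folder your IT allows.
if ~exist('C:\TEMP', 'dir')
    mkdir('C:\TEMP');
end
setenv('TEMP', 'C:\TEMP');   % set both TEMP and TMP, since it is unclear which one applies
setenv('TMP',  'C:\TEMP');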
I am using MATLAB R2011 on Linux 10 and on Windows 7 (32-bit).
I experienced MATLAB slowing down while printing simple variables in the command window.
It turned out that there was one .m file loaded in my Editor.
It was a big file with 10,000 lines; the lines were plain data that should have been saved as a .mat file. When I closed this file, the Editor was back to its normal speed.
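For reference, a rough sketch of that conversion; the script and variable names here are made up:

% Run the data-defining script once (hypothetical name), then persist its variables.
run('huge_data_script.m');        % the 10,000-line file of plain assignments
save('mydata.mat', 'x', 'y');     % x and y are placeholder variable names
% From then on, load the data instead of keeping the .m file open in the Editor:
load('mydata.mat');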
Last week, MATLAB crashed on my computer and gave a message regarding the figure driver or graphics engine being changed (unfortunately, I do not remember exactly what it said). Ever since then, all of my figures have been extremely laggy and appear to have lower resolution.
Has anyone run into this or have any troubleshooting advice?
I was able to solve this by creating a startup.m file in my MATLAB directory (located in my Documents folder in my case). This file is run every time MATLAB starts. In this file I typed: opengl('save','hardware'). After starting MATLAB once with this startup.m file, I was able to delete that line of code from the file (this command changes the OpenGL setting permanently, unless it is manually changed back).
Alternatively, I could have put opengl hardware in the startup.m file. In that case the line would need to be left in the startup file.
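In other words, startup.m would contain one of these two variants (just a sketch of what was described above):

% Variant 1: change the OpenGL preference once; this line can be deleted
% from startup.m after the next MATLAB start, because the setting is saved.
opengl('save', 'hardware');

% Variant 2 (alternative): apply hardware OpenGL on every start;
% this line has to stay in startup.m.
% opengl hardware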
I'm trying to analyze some mini crash dumps. I'm using Windows 10 Pro Build 1607 and WinDbg 10.0.14321.1024. I have my symbol file path set to
SRV*C:\SymCache*https://msdl.microsoft.com/download/symbols
Basically, whenever I load up a minidump (all < 1 MB .dmp files), it takes WinDbg forever to actually analyze them. I understand the first run can take long, but it took mine almost 12 hours before it would let me enter a command. I assumed that, since the symbols were cached, it wouldn't take long at all to re-open the same .dmp. This is not the case. It loads up, goes pretty much instantaneously to "Loading Kernel Symbols", then takes another 30 minutes before it prints the "BugCheck" line. It's been another 30 minutes, and I still can't enter commands into it.
My PC has a 512 GB SSD, 8 GB of RAM, and an i5-4590. I don't think it should be this slow.
What am I doing wrong?
These kinds of complaints seem to occur more often lately, and I can reproduce the issue on my PC. This is not your fault but some issue with the Internet connection or with the symbol server on Microsoft's side.
Monitoring the traffic with Wireshark and watching how the symbol cache on my disk gets populated, I can say:
only one file is being downloaded at a time.
the problem also occurs with older WinDbg versions (6.2.9200).
the problem occurs with both HTTP and HTTPS.
when symbols are found, the transfer speed is very slow at first and then increases. The effective transfer rate is down at 11 kb/s to 20 kb/s (on a line that can handle 6500 kb/s).
there is quite a high number of out-of-order packets, duplicate packets etc., especially during the "lookup phase" where no file is downloaded yet. Such a lookup phase can easily take 8 minutes.
even if the file already exists on disk, the "lookup phase" is performed.
the HTTP round-trip time (request to response) is 8 to 9 seconds.
This is the symbol server being really slow. Others have noticed it as well: https://twitter.com/BruceDawson0xB/status/772586358556667904
Your symbol path contains a local cache, so it should load faster the next time around, but it seems the cache is not effective; I can't really tell why (I suspect the downloaded symbols are not a perfect match and are therefore downloaded again every time).
I would recommend changing _NT_SYMBOL_PATH (or however your sympath is initialized) to SRV*C:\SymCache only, i.e. do not attempt to download automatically; just use the symbols you already have cached locally. The dump should then open fairly fast. Only re-enable the symbol server if you discover that symbols are missing.
I ran into the same problem (extremely slow WinDbg), but loading/reloading/fixing/caching symbols did not help. By accident, I figured out that the problem shows up when I try to print memory at an address taken from a register, like
db rax
The rule of thumb is to always prefix the register name with @:
db @rax
Without the prefix, the debugger treats rax as a symbol name, spends time looking it up (how long depends on the number of symbols you have loaded), eventually fails to find it, and only then falls back to treating it as a register name. Printing memory from a register with the @ prefix works instantly, even if you have gigabytes of symbols loaded in memory. As you can see, this problem is also symbol-related, but in a different way.
Emacs 24 on Ubuntu 14.
I have the file open only in Emacs, yet it keeps showing me this message after every save, which is annoying.
This is strange, because everything worked fine earlier, and I can hardly guess what I could have broken in the meantime. I'm a total newbie in Ubuntu, using it according to instructions found on the internet.
For now I am using Emacs 23 and everything is fine. I guess I need the open buffer to be automatically synchronized with the saved file right after saving. Anyway, how can I fix this?
It sounds like some other program on your computer is reading the file when it changes, and possibly even introducing changes (perhaps just to the modification time, rather than to the contents). It's hard to say off-hand just what that would be.
As a workaround, try M-x global-auto-revert-mode. It will only auto-revert buffers that have no local modifications since the last save. This is generally a nice mode to turn on if you use multiple editors, and I keep it enabled all the time.
Other ideas:
Check whether any other process currently has the file open, using fuser /path/to/filename.txt (note: it only shows open file descriptors, not processes that hold the file contents in memory and write them back later)
Do you use any non-standard filesystem? (check with df -h /path/to/filename.txt and mount)
Is your system time stable? (Manually check date, scan the output of dmesg for obvious errors concerning timekeeping, and look for NTP-related errors in the log files in /var/log/.)
I left MATLAB running on a simple ode45 + plot, and when I came back I saw that the 5 GB of free space I had on my C: drive was gone! MATLAB had stopped due to "no memory".
Can someone please tell me what happened and how I can get my space back?
Thank you.
You can visually inspect hard disk usage and find folders and files which take up a lot of space with a tool such as TreeSize Free.
P.S. You can also try clearing temporary folders, either through the built-in disk cleaner or with other tools such as CCleaner.
MATLAB is one of those apps that bundles a whole world of computing science when you only want to work on a tiny island of it, and its Help folder alone is huge. Anyway, here are some things you can do to make it slimmer on disk:
Install only the packages you need.
Use JPEGMini to compress the JPEG collection in the huge help folder.
Use Pngyu to compress the huge collection of PNG files down to 8-bit depth.
Steps 2 and 3 will get you back a gigabyte, if not more.
Use NTFS compression on the MATLAB folder.
It will get you back about another 2 gigabytes.
Steps 2 and 3 must be done with admin privileges, and dragging and dropping the folders onto those tools must also be done from an application running with admin privileges; you can use Explorer++ as an alternative to Windows File Explorer.
I have two binary files that I'm trying to compare using MATLAB's built-in function visdiff, but it only displays the first 2000 bytes by default. Is there any way to force the comparison tool to display the entire contents of both files side by side?
Edit the file matlabroot\toolbox\shared\comparisons\private\bindiff.m, where matlabroot is your MATLAB installation directory. On line 149, you'll see it sets the variable MAXLEN to 2000. Change this to something bigger (even Inf seems to work).
You may need to type rehash toolboxcache after making this change, in order to get MATLAB to notice.
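If it helps, this is roughly how to open that file and refresh the toolbox cache from the MATLAB command line (the path matches R2011b and may differ in newer releases):

% Open the comparison implementation in the Editor (R2011b folder layout).
edit(fullfile(matlabroot, 'toolbox', 'shared', 'comparisons', 'private', 'bindiff.m'))
% After changing MAXLEN, refresh the toolbox cache so MATLAB notices the edit.
rehash toolboxcache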
Please note:
As you're making a change to the MATLAB source, this is at your own risk (it seems fine to me though). Keep a backup of the file you've edited.
That truncation at 2000 bytes is there for a reason - comparing the whole of larger binary files does seem to take quite a while, so be patient. Maybe try gradually increasing MAXLEN, rather than going straight to Inf.
I only have R2011b available to me right now, so if you're on a newer version the file path and line number I mentioned above may have changed. It was very easy to trace through the code from visdiff to comparisons_private to bindiff though, so unless they've changed the deeper structure of the Comparisons Tool between 11b and now, it will probably be very similar.