Memory dump in NetLogo, is it possible?

I searched the NetLogo dictionary and didn't find it. Does anyone know if I can clear the NetLogo memory (memory dump)?
Thanks

You can erase the current environment (kill turtles, clear patch variables, erase globals, etc.) with CLEAR-ALL. Is that what you mean?

Related

When generating a network of 500 nodes, BehaviorSpace goes wrong; how can I solve it?

I use the NW extension to generate small-world networks and run an experiment in BehaviorSpace to control the number of nodes. In the experiment, I set ["nb-nodes" 50 100 500] (nb-nodes: the number of nodes).
Everything goes well until n = 500: CPU usage becomes too high, causing the program to become unresponsive. But when I run a single simulation with n = 500 from the interface, it keeps working. Only when I try it in BehaviorSpace does it go wrong.
How can I solve it?
Solution:
Thanks to Jasper and Steve Railsback for their advice; it helped a lot :)
In the FAQ (http://ccl.northwestern.edu/netlogo/docs/faq.html#how-big-can-my-model-be-how-many-turtles-patches-procedures-buttons-and-so-on-can-my-model-contain) it says:
"If you are using BehaviorSpace, note that doing runs in parallel will multiply your RAM usage accordingly"
So I just reduced the number of parallel runs from 16 to 8. I know this slows the program down, but it also reduces RAM usage, and it works.
Of course, adding more RAM is another option, and it can accommodate higher computational demand.
I agree that you should start with increasing NetLogo's memory allocation as directed in the FAQ that Jasper referred you to. (At least in Windows, you must edit the NetLogo.cfg file using administrator privileges.) You can start by doubling or quadrupling the allocation, but the only real limitation is how much RAM you have on your machine.
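For illustration only (the exact layout of NetLogo.cfg varies between NetLogo versions, so treat this as a sketch): the line to change is the JVM maximum-heap flag, which looks like
-Xmx1024m
Doubling or quadrupling it, e.g. to -Xmx4096m, lets NetLogo use up to 4 GB of RAM.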
We keep notes and a publication on NetLogo performance issues here:
http://www.railsback-grimm-abm-book.com/jasss-models/

Indirect Addressing in Brainfuck

How would I move the memory pointer to a location described in a memory cell? Super confused.
So if cell 4 holds 10, how would I set the memory pointer to 10, given the address of cell 4? Absolutely no idea where to start.
I figured something out using a [>] where all cells were 0 between the two cells, but otherwise I'm completely lost.
You would need to implement some sort of memory model for your program; Brainfuck does not support indirect addressing natively. But since it is Turing complete, it is certainly possible.
You're thinking along the wrong lines: you want to simulate indirect addressing in bf, but before you can do that, you need to think about simulating RAM in the first place. Even direct addressing is a problem. You can't just access "the 5th memory location" unless you know exactly where you are, which you don't always know unless you're extremely careful... because it's Brainfuck.
You might want to take a look at some C to brainfuck projects floating around. They do a similar sort of thing.
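To make the idea concrete, here is a minimal MATLAB-style sketch of the "travelling counter" trick (the tape layout, the 1-based offsets, and all names are illustrative, not part of any of these projects): reaching a cell whose offset is stored in the current cell, using only the single-step pointer moves a Brainfuck program is limited to:
tape = zeros(1, 32);   % the simulated bf tape
tape(1) = 10;          % cell 1 stores the offset we want to jump by
ptr = 1;
counter = tape(ptr);   % in bf, even this copy is a loop of moves and +/-
while counter > 0      % in bf, the counter has to travel along with you
    counter = counter - 1;
    ptr = ptr + 1;     % only local, one-step moves exist
end
% ptr now sits 10 cells to the right of the start: the indirectly addressed cell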

Matlab Process Memory Leak Over 16 Days

I'm running a real-time data assimilation program written in Matlab, and there seems to be a slow memory leak. Over the course of about 16 days, the average memory usage has increased by about 40% (see the figure below) from about 1.1GB to 1.5GB. The program loops every 15 minutes, and there is a peak in memory usage for about 30 seconds during the data assimilation step (visible in the figure).
At the end of each 15 minute cycle, I'm saving the names, sizes, and types of all variables in the currently active workspace to a .mat file using the whos function. There are just over 100 variables, and after running the code for about 16 days, there is no clear trend in the amount of memory used by any of the variables.
Some variables are cleared at the end of each cycle, but some of them are not. I'm also calling close all to make sure there are no figures sitting in memory, and I made sure that when I'm writing ASCII files, I always fclose(fileID) the file.
I'm stumped... I'm wondering if anyone here has any suggestions about things I should look for, or tools that could help track down the issue. Thanks in advance!
Edit, system info:
RHEL 6.8
Matlab R2014b
I figured out the problem. It turns out that the figure handles were hidden, and close('all') only works on figures whose handles are visible. I assume they're hidden because the figures are created outside the scope of where I was trying to close them. The solution was to replace close('all') with close all hidden, which closes all figures, including those with hidden handles.
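A minimal illustration of the difference (assuming the figures were created with hidden handles via the HandleVisibility property, which is one common way this happens):
f = figure('HandleVisibility', 'off');  % a figure whose handle is hidden
close all          % f survives this and keeps holding memory
close all hidden   % closes every figure, hidden handles included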
I'll go ahead and restate what @John and @horchler mentioned in their comments, in case their suggestions can help people with similar issues:
Reusing existing figures can increase performance and reduce the potential for memory leaks.
MATLAB has an undocumented memory profiler that could help debug performance-related issues.
For processes that run indefinitely, it's good practice to separate data collection/processing from product generation (figures, etc.). The first part reads in and processes the data and saves it to a database or file; the second lets you view/access/query the data.
If you are calling compiled mex functions in your code, the memory leak could be coming from the Fortran or C/C++ code. Not cleaning up a single variable could cause a leak, and would explain linear memory growth.
The MATLAB function whos is great for looking at the size in memory of each variable in the workspace. This can be useful for tracking down which variable is the culprit of a memory leak; a logging sketch follows below.
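A per-cycle logging sketch along those lines (the file-name pattern is made up):
info = whos;                      % name, size, bytes, and class of every variable
totalBytes = sum([info.bytes]);   % total workspace footprint this cycle
save(sprintf('memlog_%s.mat', datestr(now, 'yyyymmdd_HHMMSS')), 'info', 'totalBytes');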
Thanks @John and @horchler!

MATLAB out of memory on Linux despite regular "clear all"

I am batch processing a bunch of files (~200) in MATLAB. In essence:
for i = 1:n, process(i); end
where process(i) opens a file, reads it and writes out the output to another file. (I am not posting details about process here because it is hundreds of lines long and I readily admit I don't fully understand the code, having obtained it from someone else).
This runs out of memory after every dozen files or so. Of course, on Linux the memory function is not available, so we have to figure it out "by hand". Well, I thought there was some memory leak, so let's issue a clear all after every run, i.e.
for i = 1:n, process(i); clear all; end
No luck; this still runs out of memory. At the point where this happens, whos says there are just two small arrays in memory (<100 elements). Note that quitting MATLAB and restarting solves the problem, so the computer certainly has enough memory to process a single item.
Any ideas to help me detect where the error comes from would be welcome.
This is probably not the solution you are hoping for, but as a workaround you could have a shell script that loops over several calls to MATLAB.
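A sketch of that workaround, assuming a hypothetical wrapper function process_chunk (adjust the ranges to your ~200 files):
function process_chunk(first, last)
% Process one contiguous slice of the files, then let MATLAB exit,
% so each slice starts from a fresh memory footprint.
for i = first:last
    process(i);
end
end
From the shell, run a short-lived MATLAB session per slice, e.g.:
matlab -nodisplay -r "process_chunk(1,50); exit"
matlab -nodisplay -r "process_chunk(51,100); exit"
and so on.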

Growing memory usage in MATLAB

I use MATLAB for programming some meta-heuristics. Recently, I have been working on an algorithm for solving an industrial engineering problem. My problem with MATLAB is getting "out of memory" errors. Now I'm trying some suggestions from MathWorks and Stack Overflow (I hope they will work). However, there is one thing I do not understand.
During a run of the algorithm in MATLAB (it takes 4000-5000 CPU seconds for a medium-sized problem), even though I preallocate variables and the code neither resizes arrays dynamically nor adds new variables, I observe that the memory usage grows continuously. The main function calls some other functions written by me. What could be the reason for the growing memory usage?
The computer I use to run the algorithm has 8 GB of memory and 64-bit Windows 8 installed.
The only way to figure this out is to see where the memory is going.
I think you may be accidentally storing results that you don't need, or underestimating the size of your output/intermediate variables.
Here is how I would proceed:
Turn on dbstop if error
Run the code till you get the out of memory error
See how much memory is being used (make sure to check all workspaces)
By now you probably know where the extra memory is going. If you don't find much memory being used, continue with the following:
Check the memory command to see how much memory is still available
Carefully look at the line being executed; perhaps you actually need a huge amount of memory for it.
If all else fails, share your findings here and others can help you look for it.
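A minimal sketch of that workflow (run_my_algorithm is a hypothetical entry point; note the memory command is Windows-only, which matches the asker's setup):
dbstop if error             % drop into the debugger at the point of failure
run_my_algorithm();         % run until the out-of-memory error is thrown
% once stopped in the debugger:
%   whos                    % per-variable memory in the current workspace
%   [usr, sys] = memory;    % overall MATLAB memory usage (Windows only)
%   dbup, dbdown            % move between workspaces on the call stack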
The reason for the memory usage growth was CPLEX. I tried many alternatives, but I couldn't find any solution more useful than increasing virtual memory to several hundred GBs. If you don't have special reasons to insist on CPLEX (commercial usage, licensing, etc.), I would suggest that anyone who encounters this problem use Gurobi. It is free and unlimited for academic use and integrates fully with MATLAB. That's the solution I found for my problem with CPLEX. I hope it works for everybody.
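For reference, a toy problem via Gurobi's MATLAB interface, sketched under the assumption that the interface is on your MATLAB path (the field names follow Gurobi's documented model struct):
model.A     = sparse([1 1; 1 0]);  % constraint matrix
model.obj   = [1; 2];              % objective coefficients (minimized by default)
model.rhs   = [4; 2];              % right-hand sides
model.sense = '<';                 % all constraints are <=
params.outputflag = 0;             % keep the solver quiet
result = gurobi(model, params);    % result.x, result.objval, result.status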