Matlab incorrect out of disk error when compiling script

I have built a TreeBagger model in Matlab which I am trying to compile in R2016a using the Application Compiler. I did this successfully a few days ago, even though the file for the TreeBagger model was about 2 GB.
I retrained my model because I had made some small changes to my data, and now when I try to compile I get an error saying that I am out of disk space, even though I have around 250 GB of free disk space on the drive. More precisely, the message was "Out of disk space. Failed to create CTF file. Please free -246249051088 bytes and re-run Compiler."
I even retried on a drive with about 2.5 TB of free space and got the same issue. Any ideas? Thanks for any help.
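The negative byte count in the message suggests the compiler's free-space check is overflowing rather than the disk really being full, which points at the roughly 2 GB model file rather than at either drive. One possible workaround, sketched below on the assumption that the trained model is in a variable called model and that shipping it as a separate .mat file next to the compiled application is acceptable (file, variable, and path names are illustrative), is to keep the model out of the CTF archive and load it at run time:

    % Before compiling: shrink the model and save it to its own file.
    % compact() drops the training data but keeps the trained ensemble.
    compactModel = compact(model);
    save('treebaggerModel.mat', 'compactModel', '-v7.3');   % -v7.3 handles variables over 2 GB

    % In the application code: load the model from a known folder at run time
    % instead of letting the compiler package the large .mat into the CTF.
    modelFile = fullfile(pwd, 'treebaggerModel.mat');        % or a path passed in as an argument
    S = load(modelFile, 'compactModel');
    predictions = predict(S.compactModel, newData);          % newData: your feature matrix

Keeping the data file outside the packaged archive also means the model can be retrained and swapped without recompiling the application.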

Related

How can I allocate more space to a Solana program in order to upgrade it?

When I try to upgrade a Solana program on mainnet using a buffer, I run into a limit: when you deploy a program on Solana, the amount of space allocated for that program is 2x the original program size, so each upgrade needs more space in the original program.
When the size limit is reached, it throws an error:
Program returned error: "account data too small for instruction"
Is there any way to allocate more space to the original program, or any other way to upgrade my program as many times as I need?
PS: I don't want to deploy it again in order to upgrade.
Currently you cannot increase the account size. This is a known issue that will be fixed in 1.11: https://github.com/solana-labs/solana/issues/26385

Converting extremely large JSON file to object

I have very large JSON files that I would like to explore the structure of before processing. Some of these can be in excess of 4-5 GB, but I've picked a smaller one just for exploration purposes. The bonus difficulty here is that the files were minified, so all of the data is on a single line.
I am attempting to do this with Get-Content in PowerShell 7 (x64), but am getting this error:
Get-Content: Exception of type 'System.OutOfMemoryException' was thrown.
My MaxMemoryPerShellMB is set to the Windows 11 default of 2147483647. If I monitor Task Manager while this is running, the pwsh process hits about 3,599 MB and then throws the exception. For fun, I also lowered MaxMemoryPerShellMB to 8096 to put it within the boundaries of my machine (32 GB), but that obviously had no effect.
Any thoughts on why 64-bit PowerShell is throwing a max memory exception at the 32-bit limit? And is there a better method for loading this much data with PowerShell or is it just impossible?
The exact size of this file is 1,832,252,369 bytes.

Getting a Memory allocation error in Dymola

I'm using Dymola 2019 and have to use more than 50 instances of CombiTimeTable in my model to load a CSV file larger than 200 MB (yearly weather data with a resolution of 60 s).
Increasing the model further resulted in the following error message in Dymola:
Error: The following error was detected at time: 0
Memory allocation error
FixInitials:Init: Integrator failed to start model.
A dirty fix to this problem is possible if I split the big CSV file up into smaller chunks, but this is obviously not the best solution to my problem.
How can I increase the memory available to Dymola, and what is the best practice for loading big CSV files? Is another format more efficient?
Setting Advanced.CompileWith64=2 inside Dymola should generate a 64-bit dymosim-executable that avoids this issue.
Specifically, the message "Memory allocation error" only occurs when malloc runs out of dynamic memory.

Questions on Program execution flow

I was studying operating system concepts from Galvin's sixth edition and I have some questions about the flow of execution of a program. A figure explains the processing of a user program as follows:
We get an executable binary file when we reach the linkage editor. As the book says, "The program must be brought into memory and placed within a process for it to be executed." Now some of my stupid questions are:
1) Before the program is loaded into memory, the binary executable file generated by the linkage editor is stored on the hard disk. Is the address where the binary executable file is stored on the hard disk the logical address generated by the CPU?
2) If the answer to the previous question is yes, why does the CPU have to generate the logical address? I mean, the executable file is already stored somewhere on the hard disk, which corresponds to an address; why does the CPU have to do this separately? The CPU's main aim is processing, after all!
3) Why does the executable file need to be in physical memory, i.e. RAM, and why can it not be executed from the hard disk? Is it due to speed issues?
I know I am being stupid in asking these questions, but trust me, I can't find the answers! :|
1) The logical address where the binary file is stored on the hard disk is determined by the file system, the operating-system component that manages files on disks, not by the CPU.
2) & 3) The Hard Disk is not a) fast enough b) does not support word addressing. The hard disks are addressed in sectors blocks. Usually the sector size is 512 bytes. The CPU need to be able to address each machine word in a program to execute it. So, the program is stored in the hard disk, that retains its content even being powered off (in contrast to the RAM that losts its content when it is powered off). Then the program is loaded into RAM to be executed. After program finished and possibly stored the result of its execution in the hard disk, the memory is freed for running another programs. The Compiler and the Linkage Editor in your sample are also programs. They are kept in the hard disk. The compiler get its input (the source text of your program) from the file in the hard disk. Then it stores the object file. The linkage editor, or linker for short does the same: it reads the object file and necessary library files and then produces a file with a binary representation of your program.

Does locating Matlab and loaded data on different hard drives slow down the execution?

The Matlab program is installed on hard drive C together with Windows, whereas the scripts and the data they load are saved on hard drive D. Could that be a cause of slower loading of data and slower execution of scripts?
Until someone provides hard evidence to the contrary, I don't think this is something you need to be concerned with. If there is any impact on execution speed from locating data and Matlab on different disks, it will be unnoticeably small.
Once the Matlab program is loaded (from drive C in your case) it will sit in memory, ready and waiting for your commands. It's possible that some of the non-core functionality will be read from disk on demand, but you are unlikely to notice, and would find it very difficult to measure, the time this takes. Whether you then read data and programs from C or D is immaterial.
I look forward to the data that proves me wrong.
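If you want to gather that kind of evidence yourself, timing load directly is straightforward. A minimal sketch, assuming the same (hypothetical) data.mat file has been copied to both drives and the paths are adjusted to your machine:

    % Time loading the same .mat file from each drive (paths are illustrative).
    tic; S = load('C:\work\data.mat'); tC = toc;
    clear S                                   % release the loaded data between runs
    tic; S = load('D:\work\data.mat'); tD = toc;
    fprintf('Load from C: %.2f s, load from D: %.2f s\n', tC, tD);

Repeat each load a couple of times: the operating system caches recently read files, so a second load of the same file can be much faster than the first.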
I compared loading a 140 MB .mat file from an external USB 2.0 drive and from an internal (IDE or S-ATA) drive.
Loading time from the external drive: > 15 minutes
Loading time from the internal drive: a few seconds
However, sometimes loading from the external drive is fast as well.