TI-84 corrupted programs - calculator

So I got a TI-84 calculator a few months ago. As of this morning, I had 30 programs stored on it that I wrote myself. The largest program was slightly over 200, and the vast majority were under 100. RAM Free was about 14900, and ARC Free has always been 1919K.
This evening, when I went to check the Memory on it, I noticed that one of my programs (for the surface area of a rectangular pyramid) showed a size of 200+. I took a look at the program: its commands were scrambled and mixed with commands from other programs. I went back to the Memory management section and deleted the program, figuring that if it was corrupted, deleting it was the wisest choice.
I looked through the rest of my programs and, to my horror, saw that my program for the volume of a cylinder (the first program I ever wrote) had a size of 17000+. I decided to delete it too, but when I pushed ENTER to select the program, the TI-84 froze and the contents of the screen slowly faded to an all-white screen. The calculator was completely unresponsive at that point. So, after some research, I pushed the reset button on the back of the TI-84, and that seemed to solve the problem, although it erased all of my programs except the one that was at 17000+ (which I immediately deleted).
I have no idea why this occurred, as my research did not find any similar instances. I know my programs became corrupted, but I want to know what happened and why so I can prevent this from happening again. I already plan on backing up any future programs I write.

Sometimes programs can be corrupted by faulty assembly code in assembly programs and in apps. However, if you have only been using TI-Basic, it is unlikely to be the code. The hardware can also get messed up by dropping or hitting the calculator, and my own calculator has behaved very strangely when running on low batteries or on batteries of different ages (some more charged than others). It also helps to keep plenty of free RAM and Archive memory (although that doesn't seem to be your problem).
As far as solutions/preventative measures go: back your programs up, only download/run assembly programs that are known to be good (or none at all), and take good care of the calculator (fresh, matched batteries; no hard knocks; etc.).


Can I open and run from multiple command line prompts in the same directory?

I want to open two command line prompts (I am using CMDer) from the same directory and run different commands at the same time.
Would those two commands interrupt each other?
One is for compiling a web application I am building (takes like 7 minutes to compile), and the other is to see the history of the commands I ran (this one should be done quickly).
Thank you!
Assuming CMDer does nothing more than pass the same commands to the operating system that a standard cmd.exe console would, the answer is a clear "Yes, they do interfere, but it depends" :D
Breakdown:
The first part, "opening multiple consoles", is certainly possible. You can open N console windows and switch each of them to the same directory without any problems (except perhaps RAM restrictions).
The second part, "running commands that do or do not interfere", is the tricky one. If your mental model is that a console window gives you something like an isolated environment, where you can do what you like and closing the window puts everything back as if you had never touched anything (think of a virtual machine snapshot that is reverted when you close the VM), then the answer is: that is not the case. Cross-console effects will be observable.
Think about deleting a file in one console window and then opening that file in a second console window: it would not be very intuitive if the file had not vanished for the second console window as well.
However, there can be delays before changes to the file system become visible to another console window. You might delete a file in one console, run dir on its directory in another console, and still see the file in the listing. But if you try to access it, the operating system will refuse with an error along the lines of "File not found".
Generally, you should consider a console window to be a "View" on your system. If you do something in one window, the effect is present in the other, because you changed the underlying system, which exists only once (the system is the "Model", as in the "Model-View-Controller" design pattern you may have heard of).
An exception to this is environment variables. Each console window copies the environment as it is at the moment the window is started, so if you change the value of such a variable in one window, the other console windows stay unaffected.
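Here is a tiny sketch of that copy-on-start behaviour. Swift's Foundation Process and /bin/sh are used purely for illustration (an assumption of mine, not the asker's cmd.exe setup), but a cmd.exe window receives its copy of the environment in exactly the same way when it is launched:

```swift
import Foundation

// Every process receives a *copy* of its parent's environment when it starts.
// A change made inside the child is visible only to that child and never
// propagates back, which is why two console windows don't share variable changes.
setenv("DEMO_VAR", "parent-value", 1)

let child = Process()
child.launchPath = "/bin/sh"   // illustrative; a cmd.exe console behaves the same way
child.arguments = ["-c", "DEMO_VAR=child-value; echo child sees: $DEMO_VAR"]
child.launch()
child.waitUntilExit()

// The parent's copy is untouched by whatever the child did.
if let raw = getenv("DEMO_VAR") {
    print("parent sees: \(String(cString: raw))")   // still "parent-value"
}
```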
So, in your scenario: if a build/compile operation is running and, during it, files are created, read (and possibly locked), altered, or deleted, then there is a potential conflict whenever the other console window tries to access the same files. This would be a so-called "race condition": it is non-deterministic which state of a file the second console window will see (or both windows, if the second one also changes files that the first one wants to work with).
If there is no interference at the file level (reading the same files is fine; writing to the same file is not), then there should be no problem letting both tasks run at the same time.
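If the two tasks really do need to write the same file, the usual way out is to serialize access with a lock rather than hope the timing works out. A minimal sketch, assuming a POSIX system (flock); on Windows the analogous primitive is LockFileEx, and the file path is just a placeholder:

```swift
import Foundation
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif

// Take an exclusive advisory lock before writing a file that another
// console/process might also touch. A second writer simply blocks until
// the first one releases the lock, instead of racing with it.
let path = "/tmp/build-output.log"                    // placeholder path
let fd = open(path, O_WRONLY | O_CREAT | O_APPEND, 0o644)
guard fd >= 0 else { fatalError("could not open \(path)") }

flock(fd, LOCK_EX)                                    // blocks while someone else holds it
let message = Array("one writer at a time\n".utf8)
message.withUnsafeBytes { _ = write(fd, $0.baseAddress, $0.count) }
flock(fd, LOCK_UN)
close(fd)
```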
However, at a very detailed level both processes do interfere, in that they need the same limited (but usually plentiful) CPU and RAM resources of your system. This should not pose any problem with today's PC computing power, given multiple separate cores, 16 GB of RAM, terabytes of hard drive storage or fast SSDs, and so on.
Unless, for example, there is a very demanding, highly parallelizable, high-priority task to consider, one that eats up 98% of the CPU time; then there may be a considerable slowdown for other processes.
Normally, the operating system's scheduler does a good job of giving each user process enough CPU time to finish as quickly as possible, while still presenting a responsive mouse cursor, playing some music in the background, allowing Chrome to run with more than 2 tabs ;) and uploading the newest telemetry data to some servers on the internet, all at the same time.
There are also techniques that make a file available as a snapshot at a given timestamp; the keyword under Windows is "Shadow Copy". Without going into details, this allows, for example, defragmenting a file while it is being edited in some application, or letting a backup copy a (large) file while a delete operation runs on the same file. The operating system takes the access time into account when a process requests a file, so it could let the backup finish first before scheduling the delete operation (which, in this example, was started after the backup), or do even more sophisticated things to present a consistent file-system state even while it is actually changing.

Meaning of SigQuit in Swift 3, Xcode 8.2.1

I am trying to create a custom keyboard in iOS 10 that acts like a T-9 keyboard. When switching to my custom keyboard, the app extension reads in a list of about 10,000 words from a txt file and builds a trie out of them.
However, I keep getting a "SigQuit" error the first time I try to use the keyboard. Rerunning the keyboard right after it fails usually works. Xcode doesn't give me any explanation for why it failed other than the SigQuit error on some line of assembly code.
So, my question is: for what reasons might Xcode throw a SigQuit error? I have tried debugging to no avail, and googling SigQuit does not seem to return any useful information. I considered that my keyboard might be using too many resources / taking too much time on startup, but I checked the CPU usage and it peaked at less than 1%. Similarly, the memory used was something like 25 MB, which doesn't seem terrible.
Keyboard extensions have a much lower memory limit than apps. Your extension was likely killed by the operating system.
See: https://developer.apple.com/library/content/documentation/General/Conceptual/ExtensibilityPG/ExtensionCreation.html
Memory limits for running app extensions are significantly lower than the memory limits imposed on a foreground app. On both platforms, the system may aggressively terminate extensions because users want to return to their main goal in the host app. Some extensions may have lower memory limits than others: For example, widgets must be especially efficient because users are likely to have several widgets open at the same time.
Yeah, seems like you have to Run, then Stop, and it'll run fine on the simulator or device.
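For reference, here is a minimal sketch of the kind of load-and-build step the question describes, written so that words are inserted while the file is scanned instead of being split into an intermediate array first. The class names and the "words.txt" resource name are assumptions for illustration, not taken from the original extension; the point is that a dictionary-of-children trie costs real memory per node, so with ~10,000 words it is the allocation count, not CPU time, that brushes up against the extension's limit:

```swift
import Foundation

// Minimal trie: each node stores its children in a dictionary keyed by character.
final class TrieNode {
    var children: [Character: TrieNode] = [:]
    var isWord = false
}

final class Trie {
    private let root = TrieNode()

    func insert(_ word: String) {
        var node = root
        for ch in word.characters {   // Swift 3 spelling; in Swift 4+ iterate the String directly
            if let next = node.children[ch] {
                node = next
            } else {
                let next = TrieNode()
                node.children[ch] = next
                node = next
            }
        }
        node.isWord = true
    }

    func contains(_ word: String) -> Bool {
        var node = root
        for ch in word.characters {
            guard let next = node.children[ch] else { return false }
            node = next
        }
        return node.isWord
    }
}

// Hypothetical word list bundled with the extension.
let trie = Trie()
if let url = Bundle.main.url(forResource: "words", withExtension: "txt"),
   let contents = try? String(contentsOf: url, encoding: .utf8) {
    // enumerateLines feeds words in one at a time rather than splitting
    // the whole file into a 10,000-element array first.
    contents.enumerateLines { line, _ in
        let word = line.trimmingCharacters(in: .whitespaces)
        if !word.isEmpty { trie.insert(word) }
    }
}
```

Whether that footprint is small enough only the device will tell you; the memory gauge in Xcode's Debug navigator, attached to the extension process rather than the host app, shows the extension's actual usage while it runs.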

Unusual spikes in CPU utilization in CentOS 6.6 while starting pycharm

My system has been behaving strangely for the last couple of days. I am a regular user of PyCharm, and it used to run on my system very smoothly, with no hiccups at all. But for the last couple of days, whenever I start PyCharm, my CPU utilization behaves strangely, as in the image: Unusual CPU util
I am confused because when I look at the processes, or try ps/top in a terminal, there is no process using more than 1 or 2% CPU, so I am not sure where these resources are being consumed.
By unusual CPU utilization I mean that first CPU1 is at 100% for a couple of minutes, then CPU2. That is, only one CPU's utilization goes to 100% for a while, followed by another's. This goes on for 10 to 20 minutes, then the system comes back to normal.
P.S.: I don't think this problem is specific to PyCharm, as I face similar issues while doing other work as well; it's just that with PyCharm I hit it every single time.
POSSIBLE CAUSE: I suspect you have a thrashing problem. The CPU usage of your applications is low because none of them are actually getting much useful work done; all the processing is being taken up by moving memory pages to and from the disk. Your CPU usage probably settles down after a while because your application has entered a state where its working set has shrunk to the point where it can all be held in memory at once.
This has probably happened because one of the apps on your machine is handling a larger data set than before, and so requires more addressable memory. Another possibility is that, for some reason, a lot more apps are running on your machine.
POTENTIAL SOLUTION: There are several ways you can address this. The simplest is to put more RAM on your machine. If this doesn't work or isn't possible, you'll have to figure out which app is the memory hog. You may simply have to work with smaller problems/data-sets or offload some of the apps onto a different box.
MIGRATING CPU LOAD: Operating systems will move tasks (user apps, kernel work) between processors for many different reasons, ranging from essentially arbitrary scheduling decisions to certain apps having more of their addressable memory in one bank than another. Given that you are probably doing a lot of thrashing, I'm not surprised that the processor your app runs on appears to change over time.
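If you want to confirm the thrashing diagnosis, watch the swap-in/swap-out counters while a spike is happening; vmstat 1 (the si/so columns) or sar -B shows them directly. Purely as an illustration of what those tools read (and assuming, for illustration only, a machine with a Swift toolchain), here is a sketch that samples the same counters from /proc/vmstat:

```swift
import Foundation

// Read the kernel's cumulative swap-in/swap-out page counters.
// These are the numbers vmstat derives its "si"/"so" columns from.
func swapCounters() -> (pswpin: Int, pswpout: Int) {
    guard let text = try? String(contentsOfFile: "/proc/vmstat", encoding: .utf8) else {
        return (0, 0)
    }
    var pin = 0, pout = 0
    for line in text.split(separator: "\n") {
        let parts = line.split(separator: " ")
        guard parts.count == 2, let value = Int(String(parts[1])) else { continue }
        switch String(parts[0]) {
        case "pswpin":  pin = value
        case "pswpout": pout = value
        default: break
        }
    }
    return (pin, pout)
}

let before = swapCounters()
Thread.sleep(forTimeInterval: 5)
let after = swapCounters()

// Large deltas while only one CPU sits at 100% point at paging/thrashing,
// which per-process %CPU in top will not show you directly.
print("pages swapped in  over 5s: \(after.pswpin - before.pswpin)")
print("pages swapped out over 5s: \(after.pswpout - before.pswpout)")
```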

Leaks on the phone but not on the emulator?

Hey all, I have a problem and I'd like some advice.
I'm working on a document viewer that's composed of the following major parts:
zip library which unpacks the document container (minizip)
xml library which parses the document (libxml2)
UI code which renders the document on screen
Nothing too complicated or fancy.
On the emulator, everything works beautifully; the viewer performs as expected. I've run it through Instruments and there are no leaks. ObjectAlloc reports about 5.5 MB allocated over the lifetime of the viewer (that's with my test document being opened over and over).
Unfortunately, on the device (iPhone 3G, iOS 3.1.2) things aren't as clear. Fairly frequently, repeatedly opening the test file causes an out-of-memory error and the file fails to open. The initial file open always works. Even though emulator testing highlighted no leaks and the overall memory footprint was modest, I'm forced to conclude that there IS indeed a leak on the iPhone (why else would repeated opening cause an out-of-memory error?).
I've attempted to run Instruments on the device, but the app stalls (?!) halfway through the run, so I've had no success running Leaks.
I believe there's a significant leak somewhere that only shows up on the device. So, I'm left with two options (in no particular order):
Refactor my code to avoid using the zip library. That would eliminate a potential source of leaks. Time-consuming and inconclusive.
Reformat and reinstall everything on my phone (maybe something there is causing the problem). Pretty much as above: time-consuming, and losing my phone data would suck. Maybe it would let me run Leaks, though.
As you can see, I'm reaching here. Is there anything obvious that I'm missing?
Thanks in advance guys.
Maybe you should try running not Leaks but the Allocations instrument on your device, and search for leaks with it manually?
+ (maybe this sounds stupid) remove the app from the device and repeat Clean-Build-Run with Leaks (why not?).
About the manual leak search:
Just start the Allocations instrument and, while using your app, perform every action several times (for example, press a button twice or more; navigate to some panel and back several times; and so on). Memory should increase significantly only once, or increase when an action starts and decrease when it ends (of course, some variation is possible, but it should involve only small amounts of memory). You will see this on the graph.
Also take heapshots (there is a button for this in Instruments' left panel while the Allocations instrument is selected); they will help you detect "still alive" objects that you assumed had been destroyed (there will be a lot of objects, but the first and easiest step is to check objects of your own classes).
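If Instruments keeps stalling on the device, one more fallback (my suggestion, not part of the answer above) is to have the app log its own resident memory around the repeated document opens; if the number keeps climbing on the device even though the emulator shows no leaks, you've confirmed the growth without Instruments. A sketch using the Mach task_info call, shown in modern Swift for brevity; on an iOS 3.x-era project you would call the older TASK_BASIC_INFO flavor from Objective-C, but the idea is identical:

```swift
import Foundation
import Darwin

// Resident memory (in bytes) of the current process, straight from the kernel.
func residentBytes() -> UInt64 {
    var info = mach_task_basic_info()
    var count = mach_msg_type_number_t(MemoryLayout<mach_task_basic_info>.size
                                       / MemoryLayout<natural_t>.size)
    let result = withUnsafeMutablePointer(to: &info) {
        $0.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(MACH_TASK_BASIC_INFO), $0, &count)
        }
    }
    return result == KERN_SUCCESS ? info.resident_size : 0
}

// Hypothetical usage around the suspect code path (openTestDocument is a placeholder):
// print("before open: \(residentBytes() / 1024) KB")
// openTestDocument()
// print("after open:  \(residentBytes() / 1024) KB")
```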

iPhone and Vertex Buffer Objects

I've just started playing around with OpenGL ES on the iPhone over the past couple of weeks, and I'm looking at refactoring some of my code to use Vertex Buffer Objects (VBOs). Before I do, though, I would like to make sure it'll be worth it. The problem is that, as far as I know, the only reason you create VBOs is to shift a chunk of data onto the graphics card so that it doesn't need to be fetched from system RAM when it's used. The iPhone, however, does not have any dedicated video RAM that I'm aware of, so I'm struggling to see why I would benefit at all from using VBOs. I have seen talk around the internet with conflicting opinions, and Apple certainly wants devs to use them, so there's probably still a reason to, but I just wanted to see if anyone on SO had an opinion to add.
I saw no performance improvement on an iPhone 3G. I moved a bunch of stuff to VBOs, but eventually backed it out as it made it more difficult for me to pursue other performance gains. It's not the quick 25% performance increase that I was hoping for.
I've read somewhere that it can make a difference on the newer hardware (3GS), but I don't have references to back that up.
It depends. (sorry).
Rob didn't see an improvement for his setup, but here is an interesting post that did see a large improvement.
The main reason for the existence of VBOs is the presence of static data in 3D models. The first bottleneck you encounter is the slowness of copying data to video memory (whether through the glBegin/glEnd blocks that aren't available in ES, or through glVertexPointer, glBufferData and friends).
Let's imagine the old "flying toasters" screensaver. All the toasts are static (only their position changes), so why waste resources copying them from CPU memory to GPU memory every frame? Copy them once into buffers and draw them with a single command. And, depending on how you do your animations, even the animated toasters can be described in a static fashion.
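As a sketch of that "copy once, draw many times" idea on iOS (shown in Swift for brevity; the GL calls are the same ones you would make from the C/Objective-C code of that era, and the buffer-creation part is identical in ES 1.1 and ES 2.0; a current EAGL context is assumed):

```swift
import OpenGLES   // iOS; exposes the OpenGL ES C API

// Static geometry for the mesh (a single triangle as a stand-in for a toaster).
let vertices: [GLfloat] = [
     0.0,  1.0, 0.0,
    -1.0, -1.0, 0.0,
     1.0, -1.0, 0.0,
]

var vbo: GLuint = 0

// One-time setup: copy the static vertex data into a GPU-managed buffer.
glGenBuffers(1, &vbo)
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vbo)
glBufferData(GLenum(GL_ARRAY_BUFFER),
             GLsizeiptr(vertices.count * MemoryLayout<GLfloat>.stride),
             vertices,
             GLenum(GL_STATIC_DRAW))

// Every frame: bind the buffer and draw. There is no per-frame copy from CPU
// memory; the vertex pointer/attribute setup takes a byte offset (nil here)
// instead of a client-side array, so the data stays where the driver put it.
glBindBuffer(GLenum(GL_ARRAY_BUFFER), vbo)
// glVertexPointer(3, GLenum(GL_FLOAT), 0, nil)   // ES 1.1-style pointer setup
glDrawArrays(GLenum(GL_TRIANGLES), 0, GLsizei(vertices.count / 3))
```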
I started my first 2D game without VBOs. When I changed to VBOs: no difference (like Rob). But when I refactored to use more static buffers, the FPS went from 20 to 40. Since my goal was to reach 30, I was satisfied. I had some ideas to refactor even more and leave everything static, but I don't have time right now (the game is in review, with the next one to come).