Bitdefender detects my C++ file as a virus - antivirus

I am learning how to code in C++ and at the moment I am creating some basic programs that calculate something or generally do something connected with math. I am using Code::Blocks for this, and every time I compile a harmless program, my antivirus, Bitdefender, detects it as a virus and immediately deletes it. I have tried whitelisting it, but I make programs often, and having to whitelist every directory or program takes too much time. Can somebody explain why Bitdefender, which I bought and which usually works fine, is mistakenly detecting a harmless file as a virus? (The virus is described as
Gen:Variant.Ursu.'number'.)

The vast majority of users (of an anti-virus program) will never run a legitimate/safe program that the anti-virus hasn't seen before (less true for people on this site).
Much malware, on the other hand, is polymorphic, altering itself every time it is deployed.
Therefore, a useful heuristic for an anti-virus is to block any executable the first time it is seen. Unfortunately, this hits software developers rather hard. Fortunately, this group is likely to be able to work out how to use exclusions to help themselves.

Related

List the four steps that are necessary to run a program on a completely dedicated machine—a computer that is running only that program

In my OS class, we use the textbook "Operating System Concepts" by Silberschatz.
I ran into this question and answer in a practice exercise and wanted further explanation.
Q. List the four steps that are necessary to run a program on a completely dedicated machine—a computer that is running only that program.
A.
1. Reserve machine time
2. Manually load program into memory
3. Load starting address and begin execution
4. Monitor and control execution of program from console
Actually, I don't understand the first step, "Reserve machine time". Could you explain what each step means here?
Thank you in advance.
If the computer can run only a single program, but the computer is shared between multiple people, then you will have to agree on a time that you get to use the computer to run your program. This was common up through the 1960s. It is still common in some contexts, such as very expensive super-computers. Time-sharing became popular during the 1970s, enabling multiple people to appear to share a computer at the same time, when in fact the computer quickly switched from one person's program to another.
In my opinion, teaching about old batch systems in today's OS classes is not very helpful. You should use a text that is more relevant to contemporary OS design, such as the Minix book.
Apart from that, if you really want to learn about old systems, Wikipedia has a pretty good explanation:
Early computers were capable of running only one program at a time. Each user had sole control of the machine for a scheduled period of time. They would arrive at the computer with program and data, often on punched paper cards and magnetic or paper tape, and would load their program, run and debug it, and carry off their output when done.

Is it possible to write a program that will set the computer on fire?

Let's assume you have administrator access, and that this is a run-of-the-mill laptop or desktop. Is it possible to write a program that will result in a fire or something equally destructive?
EDIT:
To the "how do you think bombs work" answer: a valid answer, but I'm asking: if I have a pocket universe with just a laptop, is it possible to have a program that, when run, will set the computer on fire?
It isn't impossible, but with most off-the-shelf goods, it is unlikely you will find a deterministic way to do it. Groups like CSA, Underwriters Laboratories, and ETL are pretty careful about what they give their stamp of approval to.
Depending on the last time you flew in the US, you may have heard various warnings that you are not to carry certain Samsung phone or Apple laptop models on board; further, you are not allowed to store them in your checked luggage, and if you drop one between the seats, you are to notify the attendants.
These are all precautions because the FAA has determined that these devices pose a fire risk, presumably due to overheating. So, if you ran caffeinate (the macOS command that prevents the machine from sleeping) along with a heavy workload, you could, in principle, induce temperatures high enough to cause ignition.
But heavy on the could. There are a lot of defenses built into the batteries themselves to prevent this; then there are system-management components in the computer to prevent this; then there are monitoring components on the CPU to prevent this. So whatever you do has to line up failure modes in all of these systems simultaneously.
Not impossible, but maybe not far from it.
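For concreteness, "heavy workload" just means something like pinning every core. A minimal sketch on a POSIX system (run alongside caffeinate or similar) might look like the following; as the rest of the answer says, the thermal-management layers will throttle it long before anything gets anywhere near ignition:

    /* Busy-spin on every core: the kind of "heavy workload" meant above.
       Thermal management will throttle this long before it is dangerous. */
    #include <pthread.h>
    #include <unistd.h>

    static void *spin(void *arg)
    {
        (void)arg;
        volatile unsigned long x = 0;
        for (;;)
            x++;                 /* pure CPU burn, never returns */
    }

    int main(void)
    {
        long cores = sysconf(_SC_NPROCESSORS_ONLN);   /* online CPUs */
        for (long i = 0; i < cores; i++) {
            pthread_t t;
            pthread_create(&t, NULL, spin, NULL);      /* one spinner per core */
        }
        pause();                 /* keep the process alive while threads spin */
        return 0;
    }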

TI-84 corrupted programs

So I got a TI-84 calculator a few months ago. As of this morning, I had 30 programs that I wrote myself stored on it. The largest program was slightly over 200 bytes, with the vast majority being under 100. The RAM Free was about 14900, and the ARC Free has always been 1919K.
This evening, when I went to check the Memory on it, I noticed that one of my programs (for the surface area of a rectangular pyramid) showed that it had a size of 200+. I took a look at the program, and its commands were scrambled, and had commands from other programs in it. I went back to the Memory management section and deleted the program, thinking that if it was corrupted, then deleting it would be the wisest choice.
I looked through the rest of my programs, and, to my horror, I saw that my program for the volume of a cylinder (the first program I ever wrote) had a size of 17000+. I decided to delete it too, but when I pushed the ENTER button to select the program, the TI-84 froze and the contents on the screen slowly faded into an all white screen. The calculator was completely unresponsive at this point. So, after some research, I pushed the reset button on the back of the TI-84, and that seemed to solve the problem, despite erasing all of my programs, except for the one that was at 17000+ (which I immediately deleted).
I have no idea why this occurred, as my research did not find any similar instances. I know my programs became corrupted, but I want to know what happened and why so I can prevent this from happening again. I already plan on backing up any future programs I write.
Sometimes programs can be corrupted by faulty assembly code in assembly programs and in apps. However, if you have only been using TI-Basic, it is unlikely to be the code. Also, the hardware can sometimes get messed up by dropping or hitting the calculator. My calculator has also behaved very strangely while operating with low batteries or batteries of different ages (some more charged than others). Also, it is good to have plenty of free RAM and Archive memory (although that doesn't seem to be your problem).
As far as solutions/preventative measures go, back your programs up, make sure you only download/use correct assembly (or none at all), and take good care of the calculator (batteries, jolts, etc.).

Perl scripts, to use forks or threads?

I am writing a couple of scripts that go and collect data from a number of servers; the number will grow, and I'm trying to future-proof my scripts, but I'm a little stuck.
So to start off with, I have a script that looks up an IP in a MySQL database, then connects to each server, grabs some information, and then puts it into the database again.
What I have been thinking is that there is a limited amount of time to do this, and if I have 100 servers it will take a little bit of time to go out to each server, get the information, and then push it to a DB. So I have thought about either using forks or threads in Perl.
Which would be the preferred option in my situation? And has anyone got any examples?
Thanks!
Edit: OK, so a bit more information is needed: I'm running on Linux, and what I thought was I could get the master script to collect the DB information, then send off each sub-process / task to connect and gather information, then push the information back to the DB.
Which is best depends a lot on your needs; but for what it's worth here's my experience:
Last time I used perl's threads, I found it was actually slower and more problematic for me than forking, because:
Threads copied all data anyway, as a fork would, but did it all upfront
Threads didn't always clean up complex resources on exit, causing a slow memory leak that wasn't acceptable in what was intended to be a server
Several modules didn't handle threads cleanly, including the database module I was using, which got seriously confused.
One trap to watch for is the "forks" library, which emulates "threads" but uses real forking. The problem I faced here was many of the behaviours it emulated were exactly what I was trying to get away from. I ended up using a classic old-school "fork" and using sockets to communicate where needed.
Issues with forks (the library, not the fork command):
Still confused the database system
Shared variables still very limited
Overrode the 'fork' command, resulting in unexpected behaviour elsewhere in the software
Forking is more "resource safe" (think database modules and so on) than threading, so you might want to end up on that road.
Depending on your platform of choice, on the other hand, you might want to avoid fork()-ing in Perl. Quote from perlfork(1):
Perl provides a fork() keyword that corresponds to the Unix system call of the same name. On most Unix-like platforms where the fork() system call is available, Perl's fork() simply calls it.

On some platforms such as Windows where the fork() system call is not available, Perl can be built to emulate fork() at the interpreter level. While the emulation is designed to be as compatible as possible with the real fork() at the level of the Perl program, there are certain important differences that stem from the fact that all the pseudo child "processes" created this way live in the same real process as far as the operating system is concerned.
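As that quote says, on Unix Perl's fork() is just the system call, so the fork-per-server approach from the first answer is the classic Unix pattern. Purely as an illustration (the server list and the per-server "work" are placeholders, and a pipe stands in for the sockets used to hand results back), it boils down to something like this C sketch:

    /* Classic fork()/wait() fan-out: one child per server, results sent
       back to the parent over a pipe.  Placeholder data only. */
    #include <stdio.h>
    #include <string.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        const char *servers[] = { "10.0.0.1", "10.0.0.2", "10.0.0.3" };
        int n = sizeof(servers) / sizeof(servers[0]);
        int fds[2];

        if (pipe(fds) == -1)
            return 1;

        for (int i = 0; i < n; i++) {
            pid_t pid = fork();
            if (pid == 0) {                    /* child: handle one server */
                char line[128];
                close(fds[0]);                 /* child only writes */
                /* ...connect to servers[i] and gather data here... */
                snprintf(line, sizeof(line), "%s: ok\n", servers[i]);
                write(fds[1], line, strlen(line));
                _exit(0);
            }
        }

        close(fds[1]);                         /* parent only reads */
        char buf[256];
        ssize_t got;
        while ((got = read(fds[0], buf, sizeof(buf) - 1)) > 0) {
            buf[got] = '\0';
            fputs(buf, stdout);                /* real code would update the DB */
        }
        while (wait(NULL) > 0)                 /* reap all the children */
            ;
        return 0;
    }

In Perl itself, the same shape (fork, write to a pipe or socketpair, wait) is what the "classic old-school fork" suggestion above amounts to.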

Reliably detect 16 bit process

Due to the need to run a 15+ year old application, I wish to create a watchdog program to ensure a 16-bit application is running on a 32-bit version of Windows XP Pro and to start it if necessary. Normally I'd use EnumWindows() to look for the application's window. Unfortunately, this doesn't work, or at least not reliably, for 16-bit apps.
Given that I have absolutely no control over the code in the application in question, how can I reliably detect whether or not it's running? I'm coding this in C (not C++ or C#).
You'll probably need to enumerate processes (e.g., EnumProcesses with GetModuleFileNameEx, or CreateToolhelp32Snapshot with Process32First and Process32Next). If you don't find an instance of ntvdm.exe, then no 16-bit process is running, and you can stop there. If you do find an instance of ntvdm.exe, you can use VDMEnumTaskWOWEx to enumerate the 16-bit processes running in it.
Back when it was still under the original owners, I posted an article on CodeGuru demonstrating how to do all of this. It'd take a bit of work to make it compile under a modern compiler (e.g., it's old enough that it uses iostream.h instead of iostream), but the process enumeration part should still be pretty much right (though, looking at things, you'll also need to enable SeDebugPrivilege for it to work on Windows 7).
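For reference, here is a rough C sketch of that approach. Treat it as an outline rather than a drop-in solution: error handling and the SeDebugPrivilege step are omitted, it needs to link against vdmdbg.lib, and the VDMDBG callback details should be checked against the headers in your SDK:

    /* Sketch: find ntvdm.exe via a Toolhelp snapshot, then ask VDMDBG to
       list the 16-bit tasks it is hosting.  Link with vdmdbg.lib. */
    #include <windows.h>
    #include <tlhelp32.h>
    #include <vdmdbg.h>
    #include <stdio.h>
    #include <string.h>

    /* Called once per 16-bit task inside a given ntvdm.exe instance;
       returning FALSE keeps the enumeration going. */
    static BOOL WINAPI on_task(DWORD thread_id, WORD hmod16, WORD htask16,
                               PSZ mod_name, PSZ file_name, LPARAM user)
    {
        printf("16-bit task: %s (%s)\n", mod_name, file_name);
        return FALSE;
    }

    int main(void)
    {
        HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPPROCESS, 0);
        PROCESSENTRY32 pe;

        if (snap == INVALID_HANDLE_VALUE)
            return 1;

        pe.dwSize = sizeof(pe);
        if (Process32First(snap, &pe)) {
            do {
                /* Every 16-bit program runs inside an ntvdm.exe host. */
                if (_stricmp(pe.szExeFile, "ntvdm.exe") == 0)
                    VDMEnumTaskWOWEx(pe.th32ProcessID, on_task, 0);
            } while (Process32Next(snap, &pe));
        }

        CloseHandle(snap);
        return 0;
    }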