Are library files downloaded from sunfreeware for the Solaris OS prone to viruses, or is it safe to download from these sites?
I ask because I had a space issue where /proc consumed too much space (even though it is a VFS) and
my / showed 100% full.
Afterwards I was also unable to log in.
Please comment.
Downloading any binary from any site, regardless of OS, requires you to trust the people who run that site to not build in malicious code, and to protect their site well enough that someone else can't break in and insert malicious code.
That's likely completely unrelated to the problems you hit, though; filling a disk can happen through many normal operations if you're not paying attention to disk usage.
Windows Defender and AVG/Avast pick up our software application as a virus/false positive every time we release. We have a code signing certificate and add a taggant as well.
Every time we release the software we have to go through the process of filing a false-positive form on multiple AV vendors' sites.
How can we get our company code signing cert marked as safe, or avoid this time-consuming false-positive reporting process on each release?
Edit: Is there any premier support we can pay for to have this done automatically?
Edit 2: We actually had our certificate revoked due to "malware distribution" as a result of these false positives. It seems there is no recourse other than to buy another one.
A signing cert doesn't help most of the time; it's probably a coding pattern that is similar to a virus listed in their databases. The best you can do is contact the AV vendors and ask them to whitelist you so you can get past that.
My recommendation is to contact the AV vendors and tell them about your problem. Your software probably has some strings or patterns defined that trigger the AV's heuristics. You can try to find those strings in your code base, base64/XOR/encrypt them, and see what happens with the AV; that may help solve your problem.
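For example, if a literal string such as an update URL seems to be what trips the heuristics, you could store it encoded and only decode it at runtime. A minimal illustrative sketch in Java (your codebase may well be in another language, and the URL below is just a placeholder I made up):

    import java.nio.charset.StandardCharsets;
    import java.util.Base64;

    public class ObfuscatedStrings {
        // Base64-encoded form of a string that an AV heuristic might flag
        // (here: a made-up update URL). Keeping it encoded means the literal
        // never appears as plain text in the binary.
        private static final String UPDATE_URL_B64 =
                "aHR0cHM6Ly9leGFtcGxlLmNvbS91cGRhdGVz"; // "https://example.com/updates"

        static String updateUrl() {
            byte[] decoded = Base64.getDecoder().decode(UPDATE_URL_B64);
            return new String(decoded, StandardCharsets.UTF_8);
        }

        public static void main(String[] args) {
            System.out.println(updateUrl()); // prints the decoded URL
        }
    }

If the detection goes away after encoding a particular string, you have at least identified which part of your program the heuristic is reacting to, which is useful information to include in the false-positive report.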
While it is certainly possible that your software shares some characteristics with known malware, I would guess that it is a "cloud" detection.
Cutting through the marketing speak, it basically means that (among other possible causes) your file is flagged as suspicious if it has not been seen on many other PCs.
Try removing anything that could trip antivirus flags, like self-extracting archives, UPX packing, file encryption, suspicious website requests, or suspicious behaviour.
Why remove these?
Self-extracting archives are flagged because they are suspicious behaviour (not something normal programs usually do).
UPX is flagged because some malware hides itself by being compressed with UPX, which antiviruses then have to decompress.
File encryption can easily be detected as Riskware / EncoderTool / Ransomware.
Suspicious websites: avoid downloading files from strange URLs.
I had this problem with a program auto-updater; an antivirus detected it as a TrojanDownloader.
If your program doesn't do any of these things, I can't help you more, as this is a problem the whole programmer community has.
I hope that helps.
I’m trying to profile a Perl website that I have to work on, which runs under IIS. The website uses Catalyst, and I’m using Devel::NYTProf to profile it.
By default, the profile file is written to ./nytprof.out. I don’t have access to the command line used to launch the Perl process, nor can I pass arguments (I enable profiling with a use Devel::NYTProf statement in my Perl file).
But I can’t find the file… Do you have an idea where it would be? Is there a better way to profile my website with NYTProf?
I assume you mean IIS.
Have you checked that the user the web server is running as has write permission to the likely folders? It used to run as IANONUSR (IIRC) or similar, which had very controlled permissions for obvious reasons.
The IIS FastCGI module lets you set environment variables for the FastCGI processes, which should let you set out_file for NYTPROF. If all else fails you can hack Run.pm in NYTPROF and change the location that way; it's crude, but at least you'll know where it's trying to write to.
I salute your efforts, but I would probably just port the application to run under Linux. Getting NYTProf working under Linux for the first time was hard enough, especially as the processes have to terminate normally, so the FastCGI processes got a method added to make them die when I fetched a specific URL, which I'd keep fetching till all the processes were dead.
That said, NYTProf was well worth the effort on Linux: I was able to track down a regular expression that was eating vast amounts of CPU and didn't even need to be called 99.9% of the time it was. My experience on Windows was that "fork" was a performance killer, but I think Microsoft has fixed that somewhat since my IIS days.
I decided to migrate from Joomla 1.5.23 to 1.7 and, like almost everyone, I ran into problems (good thing I backed up my site).
The problem I am facing is that jUpgrade gets stuck at 'Migrating undefined'. The 1.7 package downloads completely and also extracts correctly. I think I am running into this because I somehow run out of space during the installation. What I wanted to know is: how much disk space does the migration require?
I have about 25 MB free on my server and I am allowed only 100 MB in total.
Thank you.
By the way, I also unchecked the 'skip downloads' option; that didn't work for me.
You will probably need more disk space than you have available. Your current site, plus the downloaded zip file, plus space for extracting the files plus any backups you have on the server are likely to exceed your 100MB.
I'd recommend taking a backup of your site, setting up the site on a localhost (XAMPP, WAMP, etc.) server on your own machine, and running the migration there. This has the benefit of not hitting the arbitrary limits of what sounds like a very low-budget web host.
Obviously you'll have the extra complexity of setting up your own server on your PC, but there are many tutorials out there that will walk you through the process, and learning new skills is always good.
I have written an application in Java using the Eclipse IDE, and I now need to know the minimum JRE version required to run it. I know that certain methods are only available in later JREs, but I was wondering what the easiest way is to find out the highest version my application actually requires, so any suggestions would be appreciated.
Also, while I am on the topic of requirements, I would appreciate any advice or methods for determining the minimum system requirements for my software in general, e.g. the minimum amount of RAM.
Thanks in advance
Method 1: For minimum JRE version, that's going to be tough. The easiest way is to simply require the same version that you're building against, or later, e.g. JRE 6.x.x or higher.
Method 2: Install multiple JDKs, make them available in Eclipse, and just change the version you're building against, running your app's test suite each time and making sure the tests all pass. The earliest version of the JDK that allows all your tests to pass is the lowest JRE it can run against. Simply having your app compile successfully isn't enough, because previous versions of the JRE/JDK might have bugs that allow for successful compilation but don't allow for proper program execution.
Method 3: Always require the latest version on the client side, because Oracle is constantly patching security holes; if you have that kind of control, it may ultimately be best to require the latest version there.
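If you do go the "require version X or later" route (Method 1 or 3), one option is to fail fast at startup with a clear message. Here is a rough sketch, assuming a minimum of Java 6; the class name and the simplified version-string parsing are mine, not anything Eclipse generates for you:

    public class JreCheck {
        // Minimum feature version we require (illustrative; use whatever you build against).
        private static final int MIN_FEATURE_VERSION = 6;

        static int currentFeatureVersion() {
            // "java.specification.version" is "1.6", "1.7", "1.8" on older JREs
            // and "9", "11", ... on newer ones.
            String spec = System.getProperty("java.specification.version");
            String[] parts = spec.split("\\.");
            return parts[0].equals("1") ? Integer.parseInt(parts[1]) : Integer.parseInt(parts[0]);
        }

        public static void main(String[] args) {
            if (currentFeatureVersion() < MIN_FEATURE_VERSION) {
                System.err.println("This application requires Java " + MIN_FEATURE_VERSION
                        + " or later; you are running " + System.getProperty("java.version"));
                System.exit(1);
            }
            // ... start the real application here ...
        }
    }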
As far as RAM, that's easy. When the JVM starts it sets a 'maximum' amount of RAM (I believe the default may be 128MB), and that's a hard limit that your application cannot exceed without crashing. Profile your app over time, tweaking the memory settings on the JVM, and find out what the minimum amount of RAM is that you'll need for your app to run both (a) with acceptable performance, and (b) without throwing an OutOfMemoryError, and you're done.
Ref: How to configure JVM options and memory?
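To see what ceiling your JVM actually gives you, and how much of it is in use at a given moment, you can ask the Runtime object. A small sketch (the numbers it prints depend on the -Xmx value you launch with):

    public class MemoryReport {
        public static void main(String[] args) {
            Runtime rt = Runtime.getRuntime();
            long mb = 1024L * 1024L;
            // maxMemory() is the -Xmx ceiling the JVM will attempt to use;
            // totalMemory() is what it has currently claimed from the OS;
            // freeMemory() is the unused part of totalMemory().
            System.out.println("Max heap:   " + rt.maxMemory() / mb + " MB");
            System.out.println("Total heap: " + rt.totalMemory() / mb + " MB");
            System.out.println("Used heap:  " + (rt.totalMemory() - rt.freeMemory()) / mb + " MB");
        }
    }

Running it with, say, java -Xmx64m MemoryReport shows how the reported maximum follows the flag, which makes it easier to narrow down the smallest heap your app can live with.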
For other requirements such as CPU req., things get a little fuzzier. There are a lot of CPUs out there, and the throughput that a given system produces can vary not just based on CPU speed, but the speed of the hard drive, the amount of RAM installed in the system, the speed of the network interface (if you're writing a network app), and other things. For requirements such as that, you'll want to just test it on a variety of systems and sort of draw a line somewhere, and say, "You can expect acceptable performance if you have hardware that is at least as powerful as X, Y, Z".
The other thing you could do is build in a benchmark, or some kind of performance logging, and have that performance data sent back to you. Lots of apps do this. You know that "May we send anonymous usage data back to the mothership?" question you get when installing some software? Well, common among that data are system-specific details such as RAM, CPU, hard drive model, and other hardware details (whatever data you determine is relevant to your app), along with performance logging data. By taking that kind of approach, what you get is a lot of performance data from lots of different system configurations without needing to have a huge number of differently configured machines in-house.
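As a sketch of what such an anonymous snapshot might contain, here is the kind of coarse system data you could collect in Java; the field names are mine, and printing the report stands in for whatever upload mechanism you build (only after the user has opted in):

    import java.util.LinkedHashMap;
    import java.util.Map;

    public class UsageSnapshot {
        // Collects coarse, non-identifying system details that help correlate
        // performance numbers with hardware; extend with whatever is relevant to your app.
        static Map<String, String> collect() {
            Map<String, String> info = new LinkedHashMap<>();
            Runtime rt = Runtime.getRuntime();
            info.put("os.name", System.getProperty("os.name"));
            info.put("os.arch", System.getProperty("os.arch"));
            info.put("java.version", System.getProperty("java.version"));
            info.put("cpu.cores", String.valueOf(rt.availableProcessors()));
            info.put("heap.max.mb", String.valueOf(rt.maxMemory() / (1024 * 1024)));
            return info;
        }

        public static void main(String[] args) {
            // In a real app this would be sent to your collection endpoint
            // alongside your performance figures, not printed.
            collect().forEach((k, v) -> System.out.println(k + " = " + v));
        }
    }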
You can do the same thing for program crashes and bugs - have the stack traces, system info, and other relevant data dumped to a log file that is sent back to you - but of course, only if your users have said it's okay to send that data back to you.
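For the crash side, a minimal sketch of hooking uncaught exceptions and appending the stack trace plus basic system info to a local log file (the crash.log name and the install() helper are arbitrary choices of mine, and actually sending the file anywhere should again be opt-in):

    import java.io.IOException;
    import java.io.PrintWriter;
    import java.io.StringWriter;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;

    public class CrashLogger {
        public static void install() {
            Thread.setDefaultUncaughtExceptionHandler((thread, error) -> {
                // Capture the stack trace as text along with a little system info.
                StringWriter sw = new StringWriter();
                error.printStackTrace(new PrintWriter(sw));
                String entry = "Thread: " + thread.getName() + "\n"
                        + "OS: " + System.getProperty("os.name") + " " + System.getProperty("os.arch") + "\n"
                        + "Java: " + System.getProperty("java.version") + "\n"
                        + sw + "\n";
                try {
                    Files.write(Paths.get("crash.log"), entry.getBytes(StandardCharsets.UTF_8),
                            StandardOpenOption.CREATE, StandardOpenOption.APPEND);
                } catch (IOException ignored) {
                    // Nothing sensible to do if we can't even log the crash.
                }
            });
        }

        public static void main(String[] args) {
            install();
            throw new IllegalStateException("demo crash"); // triggers the handler
        }
    }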
Any tips on how to figure out where the memory leak is in my Facebook app? It's ASPX using the Facebook toolkit DLL, and I am afraid the bug may be in that C# library.
The problem is that after one week of uptime the server runs out of memory and needs to be rebooted. There are quite a few users, and I cannot run a debugger on this "production server", so I would need to somehow simulate the usage on my local PC.
Try taking a dump (sorry for my phrasing) of the application.
There are some tips on this blog on how to do it, and how to read the dump files.
Then you can see which objects are allocated, and which ones are preventing garbage collection.