Say we have a SystemC model of a decade counter, and I want to verify the SystemVerilog counter RTL against the SystemC model. How can we connect the two in an SV/UVM-based testbench so that they can communicate?
Mentor developed a free package called UVM Connect specifically for the application you are asking about. See https://verificationacademy.com/topics/verification-methodology/uvm-connect. You will need a simulator that supports SystemVerilog and SystemC simulating together, like Questa.
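For a flavor of the hookup: on the SystemC side you register a TLM socket under a lookup string, and the SystemVerilog side binds to the same string. Below is a minimal sketch with hypothetical module and channel names; under Questa the top module would be exported with SC_MODULE_EXPORT rather than driven from sc_main, and the UVM Connect docs cover the converter/packing details:

    // Minimal sketch of the SystemC side of a UVM Connect binding.
    // Names like counter_ref_model and "cnt_chan" are made up here.
    #include <systemc.h>
    #include <tlm.h>
    #include <tlm_utils/simple_target_socket.h>
    #include "uvmc.h"                 // from Mentor's UVM Connect package
    using namespace uvmc;

    // Hypothetical SystemC reference model exposing a TLM-2.0 target socket.
    struct counter_ref_model : sc_module {
        tlm_utils::simple_target_socket<counter_ref_model> in;
        SC_CTOR(counter_ref_model) : in("in") {
            in.register_b_transport(this, &counter_ref_model::b_transport);
        }
        void b_transport(tlm::tlm_generic_payload& gp, sc_time& delay) {
            // Compare the incoming RTL transaction against the model here.
            gp.set_response_status(tlm::TLM_OK_RESPONSE);
        }
    };

    int sc_main(int argc, char* argv[]) {
        counter_ref_model model("model");
        // Register the socket under a lookup string. The SV side binds
        // to the same string with something like:
        //   uvmc_tlm #()::connect(env.agent.drv.socket, "cnt_chan");
        uvmc_connect(model.in, "cnt_chan");
        sc_start();
        return 0;
    }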
If you're using QuestaSim, I think UVM Connect from Mentor is the way to go. When I first used it (4 years ago) it was very buggy and gave the most cryptic segfault errors I've ever seen. But with help from Mentor support I managed to overcome them and get stuff done. It should be more stable now, but if you have problems with it, don't hesitate to contact Mentor support. They are very responsive.
However, if you're using Cadence tools and/or the e language, I think that UVM-ML from Cadence is a much more comprehensive solution. It allows you to connect components written in any combination of languages (SV-SC, SV-e, SC-e), and it has nicer documentation and examples. I understand it's also compatible with all simulators now. You can find it here: http://forums.accellera.org/files/file/65-uvm-ml-open-architecture/
Not sure what the Synopsys folks recommend for their tool suite. Maybe someone who has used it can offer more information on this. But I'm guessing that both UVM-ML and UVM Connect could work, since their makers claim that they are portable.
And lastly, if you're planning to use SystemC as a verification language (very unlikely, but just for the sake of diversity), there is something called UVM-SystemC, which is basically a clone of SV-UVM written in C++/SystemC. It's currently in its alpha release and it lacks many features (register modeling, constrained randomization, coverage collection, etc.). It feels a lot like SV-UVM, and I think it's a nice toy to play with in your spare time if you can't afford a commercial simulator license. You can find it here: http://accellera.org/images/downloads/drafts-review/uvm-systemc-1.0-alpha1.tar.gz
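For a flavor of what it looks like, here is a rough sketch of a do-nothing UVM-SystemC test against the alpha API (hedged: details may change between releases, so check the documentation shipped in the tarball):

    // Rough sketch of a minimal UVM-SystemC test (alpha-release API).
    #include <systemc>
    #include <uvm>                    // UVM-SystemC header from Accellera

    class my_test : public uvm::uvm_test {
    public:
        UVM_COMPONENT_UTILS(my_test);
        my_test(uvm::uvm_component_name name) : uvm::uvm_test(name) {}
        virtual void run_phase(uvm::uvm_phase& phase) {
            phase.raise_objection(this);
            UVM_INFO(get_name(), "hello from UVM-SystemC", uvm::UVM_LOW);
            phase.drop_objection(this);
        }
    };

    int sc_main(int, char*[]) {
        uvm::run_test("my_test");     // mirrors SV-UVM's run_test()
        return 0;
    }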
I have been trying, with no luck, to get EZSIM to work. It is a tool for building discrete-event simulators in a graphical DOS environment. In this software, my simulator and many others (from the other people in the course I'm taking) don't work, but the teacher's simulator (and the examples in the downloaded files) does work.
So I began to distrust the software.
Do you know of any software that solves the same kind of problems but actually works? It would be good if it were free, or if I could download an evaluation copy or something like that.
If you don't know of any software, do you know of any library that might work? Preferably in C#, ANSI C, Java, or Delphi.
This may be more than what you're looking for, but check out NS2. It's the standard for open source network simulations, and will allow you to simulate all kinds of network layer behavior.
I've also used JUNG in the past. It's very flexible, although it also doesn't offer much out of the box.
I used Möbius in my computer systems analysis class. It is free for educational use (which sounds like what you're doing). It's a Java GUI which generates C++ code.
The R package queuecomputer implements a computationally efficient method for simulating queues with arbitrary arrival and service times. There is a submitted paper on arXiv describing the algorithm used in the package; examples can be found in the arXiv paper and the vignette. A web app based on the package is available at https://ace-ebert.shinyapps.io/queue_simulator_mmk/ .
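If you just want the core idea behind that kind of queue simulator, the single-server FIFO case reduces to one recursion over the sorted arrivals: each departure is the service time added to the later of the arrival and the previous departure. A small self-contained illustration (my own, in C++ rather than R, and not the package's code):

    // Single-server FIFO queue via the departure-time recursion
    // D[i] = max(A[i], D[i-1]) + S[i]. The input data is made up.
    #include <algorithm>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<double> arrival = {0.5, 1.0, 1.2, 3.0};  // sorted arrivals
        std::vector<double> service = {1.0, 0.4, 2.0, 0.3};  // service times
        std::vector<double> departure(arrival.size());
        double last = 0.0;  // departure time of the previous customer
        for (std::size_t i = 0; i < arrival.size(); ++i) {
            last = std::max(arrival[i], last) + service[i];  // wait if busy
            departure[i] = last;
        }
        for (std::size_t i = 0; i < departure.size(); ++i)
            std::cout << "customer " << i << " departs at "
                      << departure[i] << "\n";
        return 0;
    }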
The organization that I currently work for seems to be heading in the direction of dictating to software developers which tools, languages, frameworks, etc. must be used. However, nobody has convinced me that this is a good thing. The main argument I have heard is that it will make training easier. But, after developing software for over 10 years, I've never relied on training to learn how to use an IDE, programming language, or anything else; so I just can't relate.
With the rapid speed at which technology evolves, and the s-l-o-w-n-e-s-s at which I know the standards will adapt, I am concerned that my customers will have requirements that I won't be able to easily implement or won't be able to implement as efficiently as I should. For example, if there is a UI requirement for an auto-complete feature in a web app, and no API has been approved for this yet, I would need to implement auto-complete myself as opposed to using one of the many APIs that provide it out of the box.
A more radical example is if my customers wanted to have Google Wave features. In that case I would want the flexibility of configuring my development environment (including the IDE) and selecting appropriate frameworks (ex: GWT) to use.
Please provide feedback on whether or not you think software development tools, languages, etc. should be standardized, along with a few points to support your argument.
There is a lot of benefit to standardization. My organization has fairly set standards on what technology we will use. We realize strong benefits in the following areas:
Hiring. It is easy to describe what technologies we are looking for and make sure our recruiters are looking for the right people.
License/Software costs. I can buy enterprise licenses easily. It gives me the opportunity to keep costs down by letting me spend more with a smaller number of vendors and thus get more leverage.
Consistency of delivery. Our teams have a very good idea of what projects will take to build, rollout and maintain because they have done it with success before (and they know the pitfalls too).
Agility. I can have one team take over for another or one individual take over for another more easily because of standardization.
Quality. We have peer reviews across teams as well as QA across teams.
Without a consistent use of a technology stack, tools, languages and frameworks, these types of benefits would be more difficult to realize. I am not closed off to new technologies, but there has to be a concrete reason beyond "what if I want to ..."
A major issue with standardization is that once standards are out there, they get set in concrete and become difficult to change. This is why our corporate IT environment is stuck on IE 6, and the best change control system we have access to is CVS. Given this situation, some developers break the rules, and some find jobs at more innovative companies.
You have a mixed bag here.
I wouldn't standardize on IDEs, because every developer works differently. Those who are insanely proficient in emacs may see their performance suffer if forced to use Visual Studio. I optimize my Visual Studio experience with a 30" monitor and find it incredibly productive.
However, standardizing on some tools, such as SCons or make or something to build products is perfectly reasonable.
Banning some libraries and having a process where new libraries are either approved or not is also very reasonable. I know lots of companies that ban Boost, or jQuery, or open source libraries in general, etc. And they had good reasons for doing it. I know I got fairly upset when an intern incorporated some random "security" library he found on the internet without running it by anyone.
In the end every company is different. You have to be standardized enough to avoid serious complications and issues as people come and go, or as new products are formed and organizational structures change. But you have to be flexible enough to avoid re-inventing every wheel you need.
The important thing is to have clear reasons for adopting a certain tool or banning some other tool or library. You can't just have management dictate that thou shalt use this and not that without consulting the engineering team and making the decision for good reasons. And once decisions are made those reasons should be written down and clearly communicated.
And also, if, in the end, your favorite tool or library isn't adopted, please don't whine about it. Be adaptable and do your job, or find a new one that makes you happier.
I once worked for a manager who felt the need to innovate at every level of his software development operation. Every development tool had to be cutting edge (preferably in beta). Many of the tools he asked us to use didn't have good documentation, and training was not available. Ultimately, most of the technology we tried simply didn't work. We wasted a lot of time churning through new technologies, only to dump them when it became clear we couldn't make progress.
I tried to make the case that innovation belongs in the area where your value proposition lies. Innovation can also be used judiciously where standard techniques fail. But for most mundane tasks, using tried-and-true tools and methods should be the default: less risk, less cost, less management attention needed. That way you can focus time and energy on the areas where innovation has the most benefit.
So I think standardization has an important role. But blindly saying everything must be standard is just as sure to fail as my manager who thought everything must be innovative.
The number one argument in favor of standardization is that it maximizes the ability of the organization as a whole to use a common body of knowledge. Don't know how custom web controls are built in ASP.NET/C#? Ask Bill down the hall who has the knowledge. If you use different tools, such organizational wisdom is cut off at the knees. While it is not good to be restricted to a least common denominator (and hopefully your management will realize this) you should not overlook the benefits of shared experience!
UPDATE: I do not agree that innovation and standardization are polar opposites. Indeed, would we have nearly the level of web innovation if we still had the mishmash of networking standards characteristic of the 1980s? No we would not. Of course, we might have more innovation on new low-level networking protocols but is that really worth it? In its place, we've had an explosion of creativity within the bounds of TCP/IP and the Web standards (http, html, etc.)
The trick is knowing how to standardize without using it as an argument for closing down all new exploration. For example, we use only ASP.NET/C#/SQL Server in my company but I'm perfectly open to the use of new tools within this framework (we recently adopted the DevExpress reporting package, for example, supplanting the earlier standard).
Standardization is a must for a productive development team. However, that doesn't mean you can't revisit the standards from time to time and adjust them to new technologies and trends.
Whether you develop operations software for internal clients, or products for external clients, there is no compelling reason not to standardize. You certainly did not give one.
If you had seen how companies struggle to hold together heterogeneous products that have been maintained for 10 years or more, and that are now a conglomerate of various technologies that developers at some point thought made sense, you would not have asked this question.
Off the top of my head, I could name at least two well-known software companies that will be driven out of business because their cost of maintenance has become so high that they can no longer compete (but I won't).
I think the misconception here is that suppressing individualism would suppress innovation. That is simply not true. It is poor technical leadership that suppresses innovation.
One unpleasant consequence of standardization is that it tends to stifle innovation.
Innovation is scary. It involves cost and risk.
Standardization is not scary. It reduces cost and risk in the short term. Until your competitors have created a game-changing innovation. Then standardization is very costly.
It depends on the organization I think. One like Microsoft, yes, there should be a bit of a standard. A small business with one IT department, no. A larger business with several offices around the world ... maybe.
it all depends :-P
Assuming the organization has a broad suite of enterprise applications to manage, I'd say no for the following reasons, though I may be taking the message of everything being the same a bit too literally:
Compromise on using best-of-breed systems, e.g. if all the databases are to be MS SQL, then any Oracle DB solution is thrown out. The same applies when everyone who uses an IDE has to use the same one, whether they are doing data warehouse report development, web applications, console applications or WinForms. I'm thinking of systems like ERP, CRM, SCM, CMS, SSO and various other TLAs, FLAs, and SLAs. (LA = letter acronyms, as a decoding hint if you need it)
Upgrading by committee is another interesting issue. If each team can choose its own tools, one person can decide when to upgrade, e.g. to start using Visual Studio 2008 instead of Visual Studio 2005. With organization-wide standards, you have to determine at what threshold it is worth upgrading everyone simultaneously, which may be a big headache if there are more than a few developers. For example, over the past 10 years, when would the IDE changes, framework changes, etc. have happened?
Exceptions to the standards. Could a contractor bring in something not used in the organization if they believe it helps them build better software, e.g. ReSharper or other add-ons that some contractors find very worthwhile but that the organization doesn't want to spend the money on? What about legacy systems that may make the standard a bit unwieldy, e.g. this was built in ASP.NET 1.1, so everyone has to have VS 2003 installed even if most will never use it?
Just my thoughts on this.
There are several good reasons to standardize.
First, it allows the enterprise better organizational flexibility, if everybody is more or less familiar with the same things. It also allows people to help each other better. I can't help with problems in the ASP.NET stuff, and there's not all that many people who can help me on the C++ side.
Second, it reduces support problems and expenses. Oracle and SQL Server are both decent products, but using both for similar functions is only going to cause problems. Not to mention that I've been in shops using several widely different platforms to do similar things, and it wasn't fun.
Third, there are some things that just have to be standardized. We couldn't operate half on VS 2005 and half on VS 2008, since we keep project files under source control. We had to pick a time and convert over.
Fourth, in some businesses, it simplifies the regulatory problems. I don't know what business you're in. I work at a place where we can get away with making mistakes right now, but I've also contracted at a bank and a utility, where it's necessary to be able to show auditors that everything is going in a standard way.
Fifth, it can simplify procurement, if you're dealing with software that costs money.
This doesn't particularly limit us, since if there's something we need that isn't standardized on we just go ahead and get it or do it.
If you want to make a business case against standardization, you'll need to have a business-related argument. Your argument seems to be that you won't be able to implement features the user wants, and that is a consideration. Got another argument?
There's nothing wrong with standardizing on an IDE that is rich enough to be configured for individual developers.
However, do make sure that you don't prevent individual developers from using additional tools, as long as the tools are licensed and that the use of the tool by one developer doesn't require all other developers to use it.
For instance, I happen to use NORMA to help me design databases. The output is SQL Server DDL (or anything else I want). I can make the DDL part of the project without making my NORMA source part of it. Later developers do not need to use NORMA to work on the project.
On the other hand, if I decided to use the Configuration Section Designer to create configuration sections, then future developers would also have to use it. A decision would need to be made about whether to use that tool.
The company I work for uses C#, ASP.NET and JavaScript, and generates HTML. Beyond the advantages mentioned above, there is a perception of improved velocity for maintenance and adaptive changes. The disadvantages include generating some boredom for people who are technically savvy (geeky) and prefer to mix and match languages, depending on what they fancy is better suited, or for "performance reasons".
Technical and personal supervision is always good to have when you are developing as fast as you can to meet tight deadlines and competing in a highly saturated market for web development.
Are there any technical benefits to Windows/Microsoft as a platform to use while developing, over a Unix dialect such as Linux or Solaris?
I know that companies choose Microsoft at times because there's simply not enough programmers available that know Unix, or that these programmers are much more expensive to hire.
So assuming all developers knew Unix and Microsoft equally well, would there still be cases where you are better off developing in Windows?
To me there are only two arguments for using Windows as a dev platform:
You have to because you're doing .Net/Windows development (or because the company simply gives you no choice); or
The apps, specifically Microsoft Office/Exchange. I'm sorry but OpenOffice is dreadful in comparison to Word/Excel.
Apart from that, imho Linux has every other advantage, including:
MUCH faster filesystem (particularly important when dealing with lots of small files). Last year I went from a build time of 8-10 minutes to 2-3 just by this switch (an Ant build of the same code base);
Typically your dev environment then matches your production environment (if your production environment is Windows, your dev environment will almost certainly be Windows too), which can be useful. We've had issues with Java classpath visibility because of differences between JBoss on Windows and on Linux; and
A much better set of command-line tools (yes, I know you can use Cygwin, etc., but it's not as good).
That's one reason why I find the idea of a Mac as my next dev workstation so appealing: you can look at it as either Unix with applications (i.e. Office) or Windows with a decent filesystem (it will be even better if/when OS X adopts ZFS); either way it's a win. The only thing that's really put me off is that Apple does stupid things like delaying the Java 6 release by a year just so they can put in the Leopard look and feel.
Just off the top of my head:
.NET (even though Mono is really great)
Visual Studio - probably the best IDE around
Excellent documentation (the MSDN Library is much more developer-friendly than man pages, in my opinion)
Huge user base (that's more of a business consideration, but it's still a very important factor)
Binary compatibility (it's much easier to support 4-5 kernel and standard C library versions than the infinite number of combinations you can find in Linux distros)
One of the best things you can do is keep your options open. Choose a platform-independent technology and you'll be able to run your software on any OS or implementation. From a technical standpoint this makes a lot of sense, as well as from a business one.
As for specific technical advantages of the Windows platform, other than the large developer community, the store of development information, and widely supported IDEs like Visual Studio, I'd say you'll be hard pressed to find one. Even there, Eclipse can do just as good a job with a platform-independent technology.
Microsoft systems tend to have much better integration between different parts - there's a lot less heterogeneity to worry about if you're using binary-only software (x86 and comctl3d are a lot easier to support than everything *nix runs on).
The learning curve on Windows is shallow to begin with but has a longer overall distance. On Unix/Linux the beginning is a struggle but getting stuff done becomes easier later on, when the inner workings of the OS begin to make sense.
At least that's been my experience with them. Windows for quick payoff, Linux if you're going to be doing something much longer-term. And virtual machines if you can't decide :)
I think this question presents a false dichotomy. There's no reason you have to choose windows over unix or vice versa. Virtualization is free and easy. It's the best of both worlds!
One reason we have Windows development platforms (even though our production is on Linux or Solaris) is to provide a common environment for everyone.
That means all the different groups involved in the realization of the software:
are not all developers (business and functional people are also concerned with the working environment)
are all on the same platform (Windows)
all use the same tools to write and communicate (such as Word and PowerPoint)
can have the same environment on a laptop
In short: uniformity of environment for all (developers and non-developers alike).
The other reason is depreciation: it is easy to manage depreciation for PCs, where the service contracts are lighter than for a full-scale Unix server (like a Sun Fire, an F15K or F50K, ...): the latter needs expensive assistance contracts ("bronze", "silver" or "gold", depending on the level needed). A PC is easier to fix or replace, and it is not as critical if a developer messes up on it and crashes it utterly ;)
That being said, the downside is that you do not change PCs every day: it means managing a large fleet of desktops, and you cannot just decide to upgrade on a whim (and that goes for the OS too).
So where the other answers are all about "virtual machines", when your set of PCs is from 2003, with only 40 GB of hard drive and 1, maybe 2 GB of memory..., you realize "virtualization" is not always an obvious solution.
Hence, some Unix "integration" servers are required for developers to test their products in an environment closer to the target. In a way, this is better, since those integration servers are managed in a uniform way, avoiding the "it works for me™" syndrome, as opposed to virtual machines, where each developer is the root/administrator of his own little world/server ;).
I can give you some common arguments that Windows folks might make, though not ones I necessarily agree with.
People sometimes think that Windows boxes are easier to maintain and deploy in production. That is because there are a lot of visual tools available to the admin. Therefore they prefer .NET or a Windows-specific development language for easy integration.
If your customers or internal clients all use Windows desktop computers, some would argue that it's less legwork to do stuff with Windows servers. This includes Microsoft Office document sharing (i.e. SharePoint) and Windows file sharing. Obviously it's easier to write a .NET application to deal with such Microsoft-specific constraints.
I can't really think of any other reasons. The latter one is probably the most valid: there just might be some Microsoft-specific technology that is hard to integrate with unless you use MSFT development tools.
Peripheral reasons for some specific kinds of development:
you need to see how things look in both Firefox and Internet Explorer
you're working with Flash (which AFAIK you can't develop on Linux, and the Linux players are terrible)
you're working on a project that involves MS Office integration
your office has some godawful mail or notes system that you can't log into any other way; ditto for some VPN setups
I consider all of these things to be regrettable.
Why not use both?
In either scenario, you could run a virtual machine with Windows or Linux/Unix for basically nothing using VirtualBox or VMware Player. Or you could remote-desktop/VNC to the other platform from your development box. If you develop in .NET, you would probably be better off on Windows for dev. If you develop for LAMP, either Windows or *nix would be fine.
Give me Apache, MySQL (OK, Postgres in a pinch), PHP and Eclipse... who cares about the OS?
I've been involved in embedded operating systems of one flavor or another, and have generally had to work with whatever the legacy system had. Now I have the chance to start from scratch on a new embedded project.
The primary constraints on the system are:
It needs a web-based interface.
Inputs are required to be processed in real-time (so a true RTOS is needed).
The memory available is 32MB of RAM and FLASH.
The operating systems that the team has used previously are VxWorks, ThreadX, uCos, pSOS, and Windows CE.
Does anyone have a comparison or trade study regarding operating system choice?
Are there any other operating systems that we should consider? (We've had eCos and RT-Linux suggested).
Edit - Thanks for all the responses to date. A pity I can't flag all as "accepted".
I think it would be wise to evaluate carefully what you mean by "RTOS". I have worked for years at a large company that builds high-performance embedded systems, and they refer to them as "real-time", although that's not what they really are. They are low-latency and have deterministic schedulers, and 9 times out of 10, that's what people are really after when they say RTOS.
True real-time requires hardware support and is likely not what you really mean. If all you want is low latency and deterministic scheduling (again, I think this is what people mean 90% of the time when they say "real-time"), then any Linux distribution would work just fine for you. You could probably even get by with Windows (I'm not sure how you control the Windows scheduler though...).
Again, just be careful what you mean by "Real-time".
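To make the "low latency and deterministic scheduling" point concrete, here is a minimal sketch of a Linux process requesting the SCHED_FIFO real-time class and locking its memory (it needs root or CAP_SYS_NICE, and it is still soft real-time, not hard):

    // Soft real-time setup on stock Linux: lock memory and switch the
    // current thread to the deterministic SCHED_FIFO scheduler.
    #include <pthread.h>
    #include <sched.h>
    #include <sys/mman.h>
    #include <cstdio>
    #include <cstring>

    int main() {
        // Lock all pages into RAM to avoid page-fault latency spikes.
        if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0)
            std::perror("mlockall");

        sched_param sp = {};
        sp.sched_priority = 80;   // 1..99 on Linux; chosen arbitrarily here
        int rc = pthread_setschedparam(pthread_self(), SCHED_FIFO, &sp);
        if (rc != 0)
            std::fprintf(stderr, "pthread_setschedparam: %s\n",
                         std::strerror(rc));
        else
            std::printf("running under SCHED_FIFO\n");
        return 0;
    }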
It all depends on how much time your team has been allocated to learn a "new" RTOS.
Are there any reasons you don't want to use something that people already have experience with?
I have plenty of experience with VxWorks and I like it, but disregard my opinion as I work for Wind River.
uC/OS-II has the advantage of being fully documented (as in the source code is actually explained) in Labrosse's book. I don't know about web support, though.
I know pSOS is no longer available.
You can also take a look at this list of RTOSes
I worked with QNX many years ago, and have nothing but great things to say about it. Even back then, QNX 4 (which is positively chunky compared to the Neutrino microkernel) was perfectly suited for low memory situations (though 32MB is oodles compared to the 1-2MB that we had to play with), and while I didn't explicitly play with any web-based stuff, I know Apache was available.
I purchased some development hardware from NetBurner.
It has been very easy to work with and is very well documented. It is an RTOS running uClinux. The company is great to work with.
It might be a wise decision to select an OS your team is experienced with. However, I would like to promote two good open-source options:
eCos (as you mentioned)
RTEMS
Both have a lot of features and drivers for a wide variety of architectures. You haven't mentioned which architecture you will be using. They provide POSIX layers, which is nice if you want to stay as portable as possible.
Also, the license for both eCos and RTEMS is the GPL, but with an exception so that the executable produced by linking against the kernel is not covered by the GPL.
The communities are very active and there are companies which provide commercial support and development.
We've been very happy with the Keil RTX system: light and fast, and it meets all of our tight real-time constraints. It also has some nice debugging features built in to monitor stack overflow, etc.
I have been pretty happy with Windows CE, although it is 'heavier'.
Posting to agree with Ben Collins: you really need to determine whether you have a soft real-time requirement (primarily for human interaction) or a hard real-time requirement (for interfacing with timing-sensitive devices).
Soft can also mean that you can tolerate some hiccups every once in a while.
What are the reliability requirements? My experience with more general-purpose operating systems like Linux in embedded use is that they tend to experience random hiccups due to their smart average-case optimizations, which try to avoid starvation and the like for individual tasks.
VxWorks is good:
good documentation;
friendly development tools;
low latency;
deterministic scheduling.
However, I suspect that Wind River will shift its main attention to Linux, and that Wind River Linux will eat into the market for Wind River VxWorks.
A smaller market means less demand for engineers.
Here is the latest study. The previous one was done more than 8 years ago, so this is the most relevant available. The tables can be used to add additional RTOS choices. You'll note that this comparison is focused on lighter machines, but it is equally applicable to heavier machines provided virtual memory is not required.
http://www.embedded.com/design/operating-systems/4425751/Comparing-microcontroller-real-time-operating-systems