Linux Software

First mixed-HDL Simulator for Linux

Gino writes "Model Technology Inc. will be making their two most powerful tools ModelSim(TM) EE and SE available for the Linux platform in VHDL, Verilog and mixed HDL configurations. Demand for Linux support has been tremendous, apparently due to customer migration to simulator farm environments - multiple node machines configured with Linux as simulation farms proofed to be quite effective. It is good news to see the big HDL boys paying attention to Linux at last. "
  • If you are spending 20K on software, I think you can do better than consumer-space processors (Intel, AMD, PPC G4, etc.). This is a definite case for the big boys like Alpha, PA-RISC, or IBM's high-end versions of PPC. This is simply where those processors are far better, and more economical, than anything else out there.

    And I do not want to hear anything about a Beowulf cluster; you simply cannot scale multiple boxes as well as you can a single box with a dedicated backplane like the big boys use...
  • Well, I've never used Cadence, but I've been using Renoir with ModelSim (on Solaris, using VHDL) for a few months now, and it seems to work pretty well. Granted, I haven't been doing any extremely large projects, but it does seem fairly easy to use Renoir, and the simulator is pretty good. Only thing is, from this press release I'm not sure if they've ported Renoir as well, or just the compiler/simulator. If it's just the latter, that would kind of suck...

    And I think the reason it's so expensive is because they can get it. People designing hardware have the money to get it, and they kind of need it... and I think the alternatives are just as pricey. Not that I don't wish it was cheaper...
  • *smashes head on keyboard*

    ARGH! Did you even read my previous post?

    (1) It is unprofessional and unethical to edit submissions of any sort without clear notification that they may be edited. There is no such notification here, so I would be far more concerned if editing *were* taking place.

    (2) I don't recall Hemos being an editor, anyway. He is allowed to post stories *without* being an editor, you know.

    (3) The submissions posted in the stories are *not* submissions in the sense of articles in a journal. They are quotes taken from emails. Go try a test submission. The preview says something like:

    James Lanfear wrote in to say "blah blah blah."

    That is *not* a submitted article (which should probably be edited), but a quote from the text of my message, which happens to contain a submission of some sort. Do you understand the difference?
  • I read it. I just know you are wrong. :)
  • No, it isn't. That was a direct quote (note the italics); changing quotes is a no-no. The most he should have done is drop a '(sic)' after "proofed".
  • How, exactly, am I wrong? Repeating that you would like Hemos and Taco to do something is not an argument in its favor. If you could actually provide any reason for me to believe that those aren't direct quotes, or that Hemos is an editor, or that editing without notification is OK, I might be more inclined to agree.

    Otherwise, you are, IMNSHO, little more than an above-average troll.
  • Repeating that you would like Hemos and Taco to do something is not an argument in its favor.

    This is more than what I want them to do. This is what the editors of other publications do.

    Hemos is an editor

    Then what else is he?

    editing without notification is OK

    I for one assume that the posts should be edited, notification is not required.


    Or have I gone too far by suggesting that Slashdot is a member of the reputable media? :)
  • This is more than what I want them to do. This is what the editors of other publications do.

    I never said it wasn't. Yet. It isn't.

    Then what else is he?

    A maintainer. Poster. Webmaster. Whatever. But he's more of an editor than I am.

    I for one assume that the posts should be edited, notification is not required.

    Except that notification *is* what professional and reputable journals/newspapers/etc. do. It's part of what makes them professionals. Go look at your local newspaper's blurb on letter submissions. It should say that they will (or will not) be edited for length, content, etc.

    The reason that is there is simple, to most of us with developed cerebrums: you said "I for one assume...". You aren't supposed to *assume* anything; the editors are supposed to make very clear what will be done with your work (if for no other reason than because you own it). I could assume that my paper will automatically be published, but it won't be. Editors tell you that, very plainly. I could assume that submitting it on a stone tablet is OK; that's why most journals specify media. I could assume that mailing stories to Taco is fine; that's why he tells you to use the Submit Story link. Getting a clue yet?

    There, I've fed you enough text to feed a family of trolls. Move on.
  • 'More of an editor' should be 'no more of an editor'. Apparently, I'm correct (either way).
  • Finally, a VHDL platform for Linux. This news means inexpensive software to do large-scale VLSI chip design and embedded system layouts. I'm curious to find out exactly how much the package will go for if it isn't free for Linux.

    For most engineers it is common knowledge how expensive just the software and testbed environment is for developers. I am tired of thumbing through code for Ocean or Electric that only works in certain environments on a smaller scale. This isn't your granddaddy's PSpice.
  • While companies use HDLs to design application-specific integrated circuits (ASICs), most of us can use an HDL to program a reconfigurable logic chip called an FPGA (Field Programmable Gate Array). An FPGA is a chip that looks like whatever you want it to. You program which gates are connected and how, and then the FPGA loads this configuration information straight from an EEPROM. If you mess up, you can reprogram the EEPROM and reboot the FPGA.

    Most of the FPGA vendors offer free (for Windows users :-( ) software for programming only their own FPGA line. Most of these software offerings are crippled versions of a full software package like ModelSim. Hopefully we will soon see a limited version of ModelSim too.
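    To give a rough feel for the kind of design that gets loaded into an FPGA, here is a minimal Verilog sketch of a counter driving an LED; the module and signal names (blinker, clk, led) are made up for illustration, and a real design would of course depend on the target part.

    // Hypothetical sketch: a free-running counter that toggles an LED,
    // the sort of small design that gets synthesized and written into
    // an FPGA's configuration EEPROM.
    module blinker (input clk, output led);
      reg [23:0] count = 24'd0;

      always @(posedge clk)
        count <= count + 1;         // count up on every clock edge

      assign led = count[23];       // LED toggles once per 2^23 cycles
    endmodule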
  • what is it?
  • Hemos, that should be "proved" not "proofed."
  • by Signal 11 ( 7608 )
    Well, no Beowulf for the ACs in this article... but on the plus side, you'll be able to visualize what your room would look like if you had one... =)

    My tongue is firmly in my cheek right now.

    --

  • Ok, what does HDL stand for?
  • For some strange reason, that is a function of the URL. If you remove a small part of the URL ("fq=Linux&", working from memory, could be wrong), it makes Linux red at every occurrence (and not a bitmap). To try it, check out this URL: http://www.newsalert.com/bin/story?StoryId=Cob0EubKbytaWmte&Nav=na-search-&StoryTitle=Linux [newsalert.com]
  • Finally, something a Beowulf cluster might be useful for...
  • For those of you who aren't sure what an HDL is...

    HDL stands for Hardware Description Language. HDLs are used by engineers to simulate computer chips (and other ICs) before they are created. They are used mainly for testing to see whether what was designed truly does what it is meant to do. Very neat. You can simulate an entirely different computer architecture using an HDL.
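    As a rough illustration (not from the article), here is what a tiny Verilog description and its testbench might look like; the module and signal names are made up.

    // Hypothetical example: a 2-to-1 multiplexer plus a small testbench --
    // the kind of model an HDL simulator such as ModelSim executes.
    module mux2 (input a, input b, input sel, output y);
      assign y = sel ? b : a;       // select between the two inputs
    endmodule

    module mux2_tb;
      reg a, b, sel;
      wire y;

      mux2 dut (.a(a), .b(b), .sel(sel), .y(y));

      initial begin
        a = 0; b = 1; sel = 0;
        #10 $display("sel=%b y=%b", sel, y);   // expect y = a = 0
        sel = 1;
        #10 $display("sel=%b y=%b", sel, y);   // expect y = b = 1
        $finish;
      end
    endmodule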

  • It's Gino who wrote that, not Hemos. I don't think it's his job to grammar- or spell-check submissions.

  • Hopefully they'll have a demo version out... these things are not for the average hardware tinkerer. Some of these licenses can run well over ten grand for the better HDL packages out there. And the few demo ones I've seen don't include VHDL (ugh) or Verilog entry, and only run on Windows.
  • Did it strike anyone else as odd that throughout the entire article, every instance of the word 'Linux' was enlarged and, in fact, a colored bitmap?

    Can you boys and girls say 'hype'? I knew that you could!

    Seriously though, isn't it about time that people stop worrying so darn much about the operating system the application is running on, and worry about the actual application?

    As for the actual article, kudos to Model Technology Inc. I know that this will make quite a few people happy.

    --
  • On the face of it, VHDL/HDL simulators would be the first to jump on cheap parallel systems. However, bleeding-edge hardware companies can't afford to place their trust in hardware/OSes for simulations until the platform has proven itself. Too much money is at stake for a company to build, for example, an ASIC based on designs and simulations run on a dubious platform. The cost of failure is enormous.

    This is one more indication that Linux has become a mainstream OS and another shot in the arm for the increased utilization of Linux outside of the mailserver realm.

  • Yeah, I see that.

    I bet whoever found this did so through some kind of search.

    Yay. Scratch one comment.
  • Maybe instead of trying as hard as you can to get a first post (unannounced or not), you could learn something on the subject first. Verilog and its ilk are circuit-design languages, where you create functional blocks of circuitry and connect them together, and then you can simulate them, accounting for latency and the like. Very good for circuit design, especially on the IC scale, including CPUs.
    ---
    "'Is not a quine' is not a quine" is a quine.
  • He is an editor; by definition, it is his job.
  • Hardware Description Language :)
  • For those of you who aren't sure what an HDL is...
    HDL stands for Hardware Description Language. HDLs are used by engineers to simulate computer chips (and other ICs) before they are created. They are used mainly for testing to see whether what was designed truly does what it is meant to do. Very neat. You can simulate an entirely different computer architecture using an HDL.


    HDLs are used for more than just simulation. They are also used to synthesize logic into gates. They are mainly used in ASIC (Application Specific Integrated Circuit) development. The typical design flow of an ASIC is as follows:

    1) Define Specs.
    2) Develop an HDL model and a simulation model.
    3) Verify the HDL model by simulation (using the ModelTech tool described in the article) and compare to the simulation model. If there are any mismatches, regress to step two.
    4) Synthesize HDL into a gate level netlist.
    5) Verify the gate level netlist against the simulation model (almost like step 3).
    6) Begin the physical design (place and route, achieving timing convergence, etc.)

    As you can see, HDLs are used during the complete design process. A quality simulator is just one step in this process, but is very important to the design flow.
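    To make step 3 concrete, a self-checking testbench might compare the HDL model's output against expected values along these lines; the adder module and the test values here are made up for illustration, not taken from any particular design flow.

    // Hypothetical sketch of step 3: simulate the HDL model and flag
    // any mismatch against expected results.
    module adder (input [3:0] a, input [3:0] b, output [4:0] sum);
      assign sum = a + b;
    endmodule

    module adder_tb;
      reg  [3:0] a, b;
      wire [4:0] sum;

      adder dut (.a(a), .b(b), .sum(sum));

      initial begin
        a = 4'd3;  b = 4'd5;  #1;
        if (sum !== 5'd8)  $display("MISMATCH: 3+5 gave %d", sum);
        a = 4'd15; b = 4'd1; #1;
        if (sum !== 5'd16) $display("MISMATCH: 15+1 gave %d", sum);
        $display("simulation done");
        $finish;
      end
    endmodule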

  • yup - I'm a daily Verilog user (and a one-time Verilog compiler writer many years back ... anyone want a compiler for their 68k Mac? :-) - we buy the biggest, meanest boxes to run our stuff on - they're cheap compared with the software. However, some things about Verilog performance:
    • little locality of reference - big TLBs are important - caches of any size don't seem to be big enough (except for really small simulations - think of it this way - every clock tick, everything in memory has to wake up and do something)
    • as a result, the speed of your main memory subsystem is very important
    • Beowulf is out - Verilog is amenable to a couple of solution spaces which Beowulf is sort of in the middle of:
      • tight SMP (think of Verilog as lots of tiny processes)
      • farms
    • most modern pipelined architectures suck because there's lots of pointer chasing underlying the simulation engine
    • it probably doesn't matter much how many registers your CPU has - most of the time is not spent calculating expressions but in doing the business of simulation
  • HDL stands for Hardware Description Language. It's most commonly used to design digital ICs. Everything from very simple logic functions to graphics processors gets designed with some sort of HDL. It's essentially a type of programming language, except instead of executable code, a hardware design is produced.

    Two of the more common ones are VHDL and Verilog. If you're really interested, there's a VHDL FAQ [vhdl.org] you can read. The VHDL acronym is interesting: the V stands for VHSIC, which in turn stands for Very High Speed Integrated Circuits.

    -Ed
  • I think that its main competitors (VCS or Cadence) go for list prices of $40k/license. A lot more people use those tools than ModelSim. There's also a bunch of cheapo Verilog implementations that I don't know of anyone using for real work.

    This is one area where the open-source model isn't going to play quite so easily - basically because there's zero tolerance for bugs when you're taping out a $500k chip - people are very conservative about what they use - look how long it took for VCS to catch on (and it was probably 20 times faster than the interpreted standard of its day).

    But remember that these numbers are also 'list' prices - there's a LOT of discounting that goes on - hit the sales guy up at the end of the quarter when he's up against his quota, or bundle it with some other purchase, and you can get a much better price. Oh yeah, and make SURE they see that you're trying out the competitor's product :-)

  • My very limited experience with it (I'm more of a software rat) bears out the rumor that it actually stands for Very Hard Description Language.
  • No, he is an editor. His job is to screen, edit, and approve submissions. Haven't you people ever published anything?
  • I've got the source code, but I'll have to reinstall gcc 2.7.2 to compile it (I love you Debian). The static binary core dumped immediately.

    Has anyone used VBS? How does it compare to the commercial packages -- where on the scale from Wellspring VeriWell (sucks) to Cadence Verilog (powerful, expensive, runs on SPARCs) is it?

    I currently use the free (limited circuit size) version of VeriWell on my Linux box and my SPARC (we can use Cadence on the ECE computers here, but I have yet to design anything that complex). I'd just as soon use something free (or better, Free) that doesn't suck.
  • This is a function of their search engine. If you go to the main page of www.newsalert.com, and search for 'HDL', and click on this article, you'll find every instance of HDL highlighted.

    Likely, the submitter was searching news sites for 'Linux', in hopes of finding something to submit to /.

  • I think the word Linux was highlighted in the press release by the newswire service, not the writers of the release. My theory is that this release was discovered by searching for Linux, and the wire service automatically highlights the matching terms when it displays the results of a search.
  • Likely, the submitter was searching news sites for 'Linux', in hopes of finding something to submit to /.

    Yeah, that thought occurred to me after I posted it. Kind of sad, really.
  • by jabber ( 13196 ) on Tuesday November 02, 1999 @05:21AM (#1570136) Homepage
    Yes, I'm very glad to finally see Linux applied to the field of medicine. Even if it is for something so common as cholesterol monitoring.

    You know, you just can't overlook the benefits of having your High-Density Lipoprotein levels measured on a robust, scalable, secure and (most importantly for the HMOs) FREE Operating System...

    Wha?? Oh, never mind.
  • For any HDL hobbyists out there, or anyone who would be interested in a bare-bones Verilog simulator, check out the VBS [flex.com] (Verilog Behavioral Simulator) project. You can download the Linux binaries and/or the source code. There's a pre-processor available too. This program works pretty well; I had it running on a Linux box of mine for a bit last year. I even wrote a CGI front end for it so I could run some simulations remotely over the web. That was cool. :)
  • There has been demand for Linux in the EDA/HDL world from the engineering side of the house for years (EETimes has covered it for three years now, and it has to be big enough for them to stumble over it). This was a response in most design houses to management wanting to dump UNIX workstations for NT, and most of the software companies were doing that as NT became more reliable for intensive tasks (still not there yet).

    Software like this will allow EE programs in schools to have a choice. Many of them were phasing out UNIX for NT because of the software available for VHDL and other courses.

    Also, many of the most intense users of computers are the designers that do VHDL and simulation. These people buy a lot of hardware and spend a lot on software, so they tend to carry a bit of influence in where the overall computer market is going.
  • >Maybe instead of trying as hard as you can to
    >get a first post (unannounced or not), you could
    >learn something on the subject first

    This is the kind of comment that gives /. a reputation for being full of hormone-ridden, trigger-happy teenage flamers.
    He asked an on-topic question about what the initials stand for. You managed to not actually say what HDL stands for.
    Yes, he could have looked it up, just as he could have read the news directly instead of coming here, but that defeats the purpose of /., doesn't it?
    No, I can't spell!
    -"Run to that wall until I tell you to stop"
    (tagadum,tagadum,tagadum .... *CRUNCH*)
    -"stop...."
  • At $20k/license, this is one thing you go out and buy that cryo-Athlon to run on...
  • ...I don't feel inclined to apologise. I can't believe the waste of time (and here I am wasting even more...) just because you got upset about one spelling mistake.

    I speak four languages and English is not my mother tongue - so the only way I can double-check for spelling mistakes is with a spelling checker, which of course would not pick up "proofed" as an error. Is it the past tense of proofreading something? I think so.

    I'll try to be positive - at least I'll know the difference between proved and proofed in future ;-p

    ...by the pricking of my thumbs,

  • It was written by a university student for his thesis, so it's not going to even compare to Cadence. The only reason to use VBS is because it's free. It's a command-line program. When you say VeriWell sucks, I presume that's because of its lack of features and its bare interface, not because it makes mistakes. I expect that you're going to decide that VBS really sucks. Justifiably, I guess. But you can't really judge a free program based on a comparison to a commercial program. VBS is best for (as I tried to say in my original post) people who are just playing around with Verilog; students and such. Basically, if you're serious enough to be willing to pay real money for a good program, then VBS is not for you. That having been said, there are a couple of things that I did find annoying about it. The biggest one was that it doesn't seem to have any built-in gate-level primitives (i.e. basic and, or, not, etc.). I had to build those with behavioral code (a rough sketch of what that looks like is below). That was annoying. Oh well. It's free. :)
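    For what it's worth, a behavioral replacement for a missing gate primitive can be as simple as the following Verilog; this is just an illustrative sketch, not code from VBS itself.

    // Hypothetical sketch: an AND gate written as behavioral code,
    // the kind of workaround described above when a simulator lacks
    // built-in gate-level primitives.
    module and_gate (input a, input b, output reg y);
      always @(a or b)
        y = a & b;    // recompute the output whenever an input changes
    endmodule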
  • Yes and I don't expect submissions of this type to be edited. If you would actually read the submissions some time, you would see that they are *quoted* from the original text, thus the italics and "x wrote in to say...". It doesn't matter where something is being submitted, if you quote it, you don't edit it, except to make it match the original quote. This isn't like a Letter to the Editor; these are quotes within the body of a larger article (the story header itself). I'm getting tired of repeating myself....

    That being said, it isn't stated anywhere that submissions will be edited. Regardless of what you think is correct, this is the way it is done, and the only way we have any reason to expect it to be done.
  • I figured with the buyout by Andover, they'd gain a level of professionalism around here. Mayhaps that was a poor assumption.
  • (1) This isn't a matter of professionalism. In many situations (such as this one, IMO), it is *very* unprofessional to do what you've asked (editing quotes). Unethical, even.

    (2) Given that there is no indication on the submissions page that editing will be done, I would be very disappointed if it were. Once again, *that* would be unprofessional.
  • I would agree with that if it were not for the fact that you/me/each of us is submitting a story. I would rather see my post edited than not, personally. That is what the editor is there for. (I cannot beat on this point hard enough.)
  • In defense of Dunkel, I had no idea either, and the article I was reading did not appear to be in English when I first read it. After reading the explanations here it makes a little more sense. Yeah, I'm probably just stupid, but I had no idea you could farm computers...
    :-)



    mcrandello@my-deja.com
    rschaar{at}pegasus.cc.ucf.edu if it's important.
  • There's also V2000 at:

    http://www.v-ms.com/
  • by gid ( 5195 )
    Thanks to everyone for actually answering this guy's question and NOT flaming him. I had no idea what this news item was about; I got intrigued, but not intrigued enough to read that entire article. Some of us have a life outside of keeping up on /. articles, remember. :)

    I usually come to the comments for the "good" information, but usually refrain from posting, because *gasp*!!! I don't actually read the entire articles. A good skim is what they usually get. :)

  • Farm computers? Sure! Ever heard of Intel's [intel.com] Jones Farm? :)
  • It isn't the OS doing the number crunching; it's the application.

    The HDLs are, AFAIK, languages in their own right, so you probably wouldn't say that the application was written in C/FORTRAN, either. (I've seen examples of one HDL that looks exactly like Ada, with a few keywords changed.)

    --
    It's October 6th. Where's W2K? Over the horizon again, eh?
  • Every HDL simulator I've tried other than Cadence Verilog has pretty much sucked. Granted, Cadence's user interface isn't as nice as, say, VeriLogger, but it never crashes. So how is this going to be in terms of stability? Even the simulator that ships with my professor's book "The Verilog Hardware Description Language" sucks.

    And I haven't looked yet, but did I see $20k? Ugh, why is this stuff so darn expensive?
