A lot of semiconductor companies have built their own simulation environments because the EDA vendors' tools are limited in what they can do and difficult to use. Many users fall back to Python, Excel, or MATLAB because the programming and math capabilities of the EDA tools are inadequate. Over time these home-grown environments become a burden: the original developer disowns them, and they turn into a maintenance headache.
The EDA tools have no ecosystem you can hook into, and the vendors don't really care about users trying to put the simulators into a flow to solve their problems. It's a bunch of point tools, each with its own embedded interpreter, that don't play together. I sure hope there is a plan to create a better set of tools so I can write custom netlist checks, do something novel (like get derivatives out of the simulator), keep everything in memory (no slow disks), and run custom Julia checks during simulation. Julia is a much better match here because running Python or MATLAB code within the simulator is way too slow. I'll keep watch for sure.
I had to leave a company over that. I had made a personal FPGA construct generator for various easily parameterizable modules. It worked pretty well but it was a mess and needed to be redone pretty badly; still, it was good enough. I shared it upon request with a couple of engineers, and before I knew it there were probably 25 EEs using it. They wanted "more" and I told them it was a personal tool and they were more than welcome to extend it. This actually angered a few of them, and they "reported" me to a manager a couple of levels higher. He told me that I had to maintain/extend it and to use as much time as it took. (I was a junior engineer at the time.) I told him I was there to be an electrical engineer, not a legacy code maintainer. He gave me an ultimatum, and being a person who doesn't like ultimatums, I resigned on the spot. I gathered the few items I had on my desk, waved at a couple of buddies, and went out the door. I never regretted that decision. Moral (I guess?): be careful what you share :) .
If they didn't lay claim to the IP, you should polish it and offer it for sale, or use it as interview material at someplace developing similar technology.
They could argue that since the tool was made to help with his job at the company, it counts as internally developed. Without clear grounds or easily presentable evidence (and even with it!), he's out in the wild with a liability.
The only way I can see him in the clear is if he had a repo going back to before his time at the company, proving the tool was a personal project unrelated to the job. Even then, the company could still sue and burden him with legal fees and process until they tire him out and, BINGO, now they own his IP.
The alternative could be starting a new project, completely open source from the start (probably with one of the more liberal licenses) and get crowdfunding to develop and maintain it. Assuming they are interested in doing that, of course.
Oh, the creation of modules has gotten much better with languages like SystemVerilog and SystemC :) . What I did was a very rudimentary effort in Perl (later ported to Python). It's kind of like how people become "Excel gods" at a company when they script away some mundane manual task and take a 3-day job down to 15 minutes. I just happened to be the only EE with the soul of a coder while I was there. I was under a very heavy NDA at the time; I'm sure others were doing the same at their companies. tl;dr: it wasn't anything special and is much more common these days. Basically it eliminated a ton of copy-pasta and manual editing.
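To make the idea concrete, here's a toy sketch (in Python, like the later port mentioned above) of the kind of parameterizable module generation being described. The module name, template, and `make_register` helper are all hypothetical illustrations, not the original tool:

```python
def make_register(name, width):
    """Emit a simple parameterized Verilog register module from a template.
    Scripting this kind of boilerplate is exactly the copy-paste
    elimination described above (toy example, hypothetical template)."""
    return f"""module {name} #(parameter WIDTH = {width}) (
  input                  clk,
  input  [WIDTH-1:0]     d,
  output reg [WIDTH-1:0] q
);
  always @(posedge clk) q <= d;
endmodule
"""

print(make_register("byte_reg", 8))
```

A few dozen lines like this, driven by a spreadsheet or config file of module parameters, can replace hours of manual editing.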
I can attest first-hand to the "headache" that comes from semi company simulation environments. Not only are they horribly outdated (in Perl/Tcl), but they're different at every company you work at. There's no gold standard because the standard that these EDA companies ought to be making doesn't exist.
There needs to be an open initiative between semi companies to create a standard simulation environment -- with compilers, unit-test frameworks, and all sorts of simulation (gate-level, analog/mixed signal, emulation, etc). Hell, just give me a free IDE plugin for SystemVerilog that actually works.
This lack of a standard seems to me like the critical path in hardware design. I'm trying to support projects to fix this like SVLS (A language server for SystemVerilog: https://github.com/dalance/svls) but these are all hard problems to solve. This industry is relatively niche and doesn't seem to have many engineers interested in FOSS.
It's an absolute nightmare. Cadence added support for MATLAB calculations on simulator outputs, but it's clunky and inconsistent. Don't even get me started on how long it takes to do basic calculations on numbers that should already be in memory...
My worst experience: computing a simple min/max of each signal took 7x longer than the simulation itself. I'd be so happy to toss Tcl in the trash. I once spent a long time debugging because Tcl's expr doesn't do -2^2 the way I expected. The error messages don't tell you the line number, and I found no good way to debug. Things like that are just the tip of the iceberg of time wasted fighting arcane tools. I'm sure others have their own stories.
^ stands for bitwise XOR: so [expr {-2^2}] results in -4
** stands for exponentiation: so [expr {-2**2}] results in 4
Both seem correct to me, taking into account how integers are represented in binary (two's complement for the negative ones).
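For contrast, a quick Python sketch of the same expressions (my addition, assuming the Tcl behavior described above: in Tcl, unary minus binds tighter than **, so -2**2 is (-2)**2 = 4; Python applies ** first, which is the behavior the original poster expected):

```python
# ^ is bitwise XOR in Python too; unary minus binds tighter,
# so this is (-2) XOR 2 in two's complement, matching Tcl.
print(-2 ^ 2)     # -4

# In Python, ** binds tighter than unary minus: -(2**2).
print(-2 ** 2)    # -4

# This parenthesized form is what Tcl's [expr {-2**2}] computes.
print((-2) ** 2)  # 4
```

So both languages agree on the XOR case; they differ only on where unary minus sits relative to exponentiation.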
With regard to debugging, dynamic programming languages are different from their static counterparts, since much is deferred to runtime (as opposed to compile time). But that also opens up possibilities (introspection, the ability to intervene in scripts while they run, ...). It requires a different mindset.
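As one small illustration of the introspection being described, here's a Python sketch (the function and names are made up for the example):

```python
import inspect

def scale(x, factor=2):
    """A trivial function to introspect (hypothetical example)."""
    return x * factor

# At runtime, a dynamic language lets tooling discover signatures,
# defaults, and source without a compile step.
sig = inspect.signature(scale)
print(list(sig.parameters))                     # parameter names
print(sig.parameters["factor"].default)         # default value
```

The same machinery is what lets REPLs, debuggers, and live-patching tools intervene in a running script.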
I've always been astounded that much of the rage against Tcl seems to stem from the fact that it works as documented, rather than according to the rules of other languages.
Mmmn, HN stripped out half of the double asterisks, and now I've made it confusing myself... Too late to edit my comment above, but it should read:
** stands for exponentiation: so [expr {-2**2}] results in 4
We are building something along these lines that the DARPA work sits on top of, so if you have a current need, do feel free to reach out (email is in profile).