After looking at a bunch of other languages I'm back to Python. The bit manipulations aren't too bad and seem to speed things up over the high level gyrations you are forced to do in Excel VBA. I just finished describing the ALU as a class and am now debugging it. The interpreted shell IDLE (named a la Eric, I assume, as Python is a la Monty?) is really pretty nice for this: just cut and paste your code into it and interact with it via small iterated statements. I wish every language had this exceedingly valuable debug feature; there's nothing like getting to know your code via simulation. Like the SystemVerilog description, I made the ALU sim code data width a variable, so even exhaustive checking is simply a matter of reducing the bit width to something narrow like 4 and going to town.
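As a rough idea of what a width-parameterized ALU class plus a narrow exhaustive check can look like, here is a minimal sketch (the class name, operation set, and masking helper are my own illustration, not the actual code from this project):

```python
# Hypothetical sketch of a width-parameterized ALU class - the names and
# operation set here are illustrative, not the author's actual code.

class ALU:
    def __init__(self, width=16):
        self.width = width
        self.mask = (1 << width) - 1          # modulo mask for the data width

    def to_signed(self, x):
        """Interpret a raw bit pattern as a two's complement signed value."""
        x &= self.mask
        return x - (1 << self.width) if x & (1 << (self.width - 1)) else x

    def add(self, a, b):
        return (a + b) & self.mask            # result wraps modulo 2**width

    def sub(self, a, b):
        return (a - b) & self.mask


# Exhaustive checking is cheap at a narrow width: drop to 4 bits and sweep
# every (a, b) pair against Python's native modular arithmetic.
alu = ALU(width=4)
for a in range(16):
    for b in range(16):
        assert alu.add(a, b) == (a + b) % 16
        assert alu.sub(a, b) == (a - b) % 16
print("4-bit exhaustive add/sub check passed")
```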
When you do processor simulator design you find out how math oriented languages are rather than hardware oriented - modulo integers have a consistency and magic all their own. You also find out how complex simple seeming things like add, subtract, multiply - and particularly division - are (with Python 3 the / operator became true division and // is the explicit integer floor division, resolving the old int vs. float situational ambiguity - this broke backward compatibility but at least they addressed the issue).
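To make the division point concrete, here is a small illustration (my own, not from the original post) of one of the traps: Python's // floors toward negative infinity, while most hardware signed dividers truncate toward zero, so a simulator has to pick its convention deliberately:

```python
# Illustrative only: Python's floor division vs. the truncate-toward-zero
# behavior most hardware signed dividers implement.

def hw_div(a, b):
    """Signed divide that truncates toward zero, like a typical hardware divider."""
    q = abs(a) // abs(b)
    return -q if (a < 0) != (b < 0) else q

print(7 / 2)          # 3.5 - Python 3 true division always returns a float
print(7 // 2)         # 3   - explicit floor division
print(-7 // 2)        # -4  - floors toward negative infinity
print(hw_div(-7, 2))  # -3  - matches the usual hardware convention
```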
When doing math in hardware it all ends up modulo the register width, and sign is only a convention, so you pick a convention and stick with it. For instance my ALU simulation code uses signed as the default, and I picked this over unsigned as it makes testing the MSB trivial - the signed form provides for automatic lead padding, just as it does in hardware. Some kind of sign extension is always going on even when doing unsigned stuff; our number system notation is biased towards leading zero suppression, so when leading ones are called for (2's complement negative) it seems unnatural.
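As an illustration of that signed-default convention (again my own sketch, not the post's code): once a field is held as a Python signed int, the MSB test collapses to a comparison with zero, and widening is automatic because the leading ones of a negative two's complement number are implicit:

```python
# Illustrative sketch of the signed-default convention: keep values as Python
# signed ints so the MSB test is just "< 0" and sign extension comes for free.

def to_signed(x, width):
    """Reinterpret the low `width` bits of x as a two's complement value."""
    x &= (1 << width) - 1
    return x - (1 << width) if x & (1 << (width - 1)) else x

def to_unsigned(x, width):
    """Map a signed value back onto its raw bit pattern, modulo the width."""
    return x & ((1 << width) - 1)

v = to_signed(0b1110, 4)      # 4-bit pattern 1110
print(v)                      # -2: MSB set, so the value reads as negative
print(v < 0)                  # True - the MSB test is a trivial sign check
print(bin(to_unsigned(v, 8))) # 0b11111110: widening to 8 bits supplies the
                              # leading ones automatically (sign extension)
```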