
$signed ignored under generate block #999

Closed
veripoolbot opened this issue Nov 12, 2015 · 3 comments
Labels: area: wrong runtime result (issue involves an incorrect runtime result from the Verilated model) · resolution: fixed (closed; fixed)

Comments



Author Name: Clifford Wolf (@cliffordwolf)
Original Redmine Issue: 999 from https://www.veripool.org
Original Date: 2015-11-12
Original Assignee: Wilson Snyder (@wsnyder)


A PicoRV32 user reported problems simulating the PicoRV32 CPU with Verilator. I investigated and found the following bug in Verilator. Sorry for not posting a minimal test case; I tried to create one from scratch, but was unable to reproduce the problem that way.

Here is the problem (waveform screenshot): http://i.imgur.com/evkRLNx.png

alu_lts clearly should be high when reg_op1 is -1 and reg_op2 is 0. But it seems that Verilator incorrectly merges the alu_lts and alu_ltu signals.

Using current git head of verilator (4e4bc7b):

$ wget https://raw.githubusercontent.com/cliffordwolf/picorv32/master/picorv32.v

$ verilator -Wno-fatal --trace --cc picorv32.v

$ grep -r alu_lt obj_dir/Vpicorv32__Trace__Slow.cpp
	vcdp->declBit  (c+8,"v picorv32_core alu_ltu",-1);
	vcdp->declBit  (c+8,"v picorv32_core alu_lts",-1);
	vcdp->fullBit  (c+8,(vlTOPp->v__DOT__picorv32_core__DOT__alu_lts));

Interestingly, in the larger testbench that includes PicoRV32, the alu_lts and alu_ltu signals are not merged, but they are still assigned the same expression. Looking at the generated C++ code:

     // ALWAYS at picorv32.v:570
     vlTOPp->v__DOT__uut__DOT__alu_lts = (vlTOPp->v__DOT__uut__DOT__reg_op1
                                          < vlTOPp->v__DOT__uut__DOT__reg_op2);
     // ALWAYS at picorv32.v:571
     vlTOPp->v__DOT__uut__DOT__alu_ltu = (vlTOPp->v__DOT__uut__DOT__reg_op1
                                          < vlTOPp->v__DOT__uut__DOT__reg_op2);

In this case, manually changing the assignment for alu_lts fixes the problem and the testbench runs successfully:

     vlTOPp->v__DOT__uut__DOT__alu_lts = VL_LTS_III(1, 32, 32, vlTOPp->v__DOT__uut__DOT__reg_op1, vlTOPp->v__DOT__uut__DOT__reg_op2);



Original Redmine Comment
Author Name: Clifford Wolf (@cliffordwolf)
Original Date: 2015-11-12T17:50:40Z


I have now successfully created a minimal test case:

module test (
         input [31:0] in_op1, in_op2,
         output reg alu_ltu, alu_lts
);
         generate if (1) begin
                 always @* begin
                         alu_lts = $signed(in_op1) < $signed(in_op2);
                         alu_ltu = in_op1 < in_op2;
                 end
         end endgenerate
endmodule

This produces the following (incorrect) code for alu_lts and alu_ltu:

 vlTOPp->alu_lts = (vlTOPp->in_op1 < vlTOPp->in_op2);
 vlTOPp->alu_ltu = (vlTOPp->in_op1 < vlTOPp->in_op2);

The generated code is correct when the `generate if (1) begin` and `end endgenerate` lines are removed:

 vlTOPp->alu_lts = VL_LTS_III(1,32,32, vlTOPp->in_op1, vlTOPp->in_op2);
 vlTOPp->alu_ltu = (vlTOPp->in_op1 < vlTOPp->in_op2);

(This is with "verilator -Wno-fatal --trace --cc test.v".)



Original Redmine Comment
Author Name: Wilson Snyder (@wsnyder)
Original Date: 2015-11-13T03:32:36Z


Thanks for the test case; as you surmised, this requires a $signed underneath a generate block.

Fixed in git towards 3.880.



Original Redmine Comment
Author Name: Wilson Snyder (@wsnyder)
Original Date: 2015-12-19T15:33:18Z


In 3.880.

@veripoolbot added the labels "area: wrong runtime result" and "resolution: fixed" on Dec 22, 2019.