I can't figure out why, in the example below, the signal assignment to "sig2" never takes effect, while the one to "sig1" does. On every rising clock edge, "sig2" goes to 'X'!
What is the reason?
library IEEE;
use IEEE.STD_LOGIC_1164.ALL;
use IEEE.NUMERIC_STD.ALL;
use IEEE.STD_LOGIC_UNSIGNED.all;

entity Test_tb is
end entity Test_tb;

architecture Structural of Test_tb is
    signal sig1 : std_logic_vector (3 downto 0) := (others => '0');
    signal sig2 : std_logic_vector (7 downto 0) := (others => '0');
    signal clk  : std_logic := '0';
begin
    clk_generate : process is
    begin
        wait for 5 ns;
        clk <= not clk;
    end process clk_generate;

    gen_label : for gen_indx in 0 to 3 generate
    begin
        process (clk) is
        begin
            if clk = '1' and clk'event then
                sig1(gen_indx) <= '1';
                for loop_indx in 0 to 1 loop
                    sig2(gen_indx * 2 + loop_indx) <= '1';
                end loop;
            end if;
        end process;
    end generate gen_label;
end architecture Structural;
This is because, when a signal is assigned inside a for loop, the driver is assumed to affect every element of the array (or record). The loop parameter is not a static expression, so the index cannot be evaluated at elaboration time, and the driver therefore covers the whole signal (VHDL's "longest static prefix" rule). A generate loop is different: its bounds are known at elaboration, so each generated process gets a driver only for the single indexed element. As a result, all four generated processes drive every bit of sig2; the bits a given process never assigns keep driving their initial value '0', and resolving that '0' against the '1' driven by another process yields 'X'.
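A sketch of the rule, annotating one of the question's own generated processes (the comments are mine, not part of the original code):

```vhdl
process (clk) is
begin
    if rising_edge(clk) then
        -- gen_indx is a generate parameter (static at elaboration):
        -- the driver created here covers sig1(gen_indx) only.
        sig1(gen_indx) <= '1';

        -- loop_indx is a loop parameter (not static): the longest static
        -- prefix of the target is sig2 itself, so this process acquires a
        -- driver for ALL eight bits of sig2. The six bits it never assigns
        -- keep driving '0', which resolves to 'X' against the '1' driven
        -- by one of the other three generated processes.
        for loop_indx in 0 to 1 loop
            sig2(gen_indx * 2 + loop_indx) <= '1';
        end loop;
    end if;
end process;
```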
So you need to either remove the for loop from the process, or declare a local signal inside the generate loop and assign it to the corresponding slice of sig2 with a concurrent assignment. For example:
gen_label : for gen_indx in 0 to 3 generate
    -- initialized so sig2 is not 'U' before the first rising edge
    signal local_sig : std_logic_vector(1 downto 0) := (others => '0');
begin
    process (clk) is
    begin
        if clk = '1' and clk'event then
            sig1(gen_indx) <= '1';
            for loop_indx in 0 to 1 loop
                -- local_sig has a single driver per generate instance,
                -- so the whole-signal driver created here is harmless
                local_sig(loop_indx) <= '1';
            end loop;
        end if;
    end process;

    -- static slice bounds: this drives only these two bits of sig2
    sig2(gen_indx*2 + 1 downto gen_indx*2) <= local_sig;
end generate gen_label;
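Alternatively, a sketch of the first option (dropping the for loop entirely): since gen_indx is static, a slice target with static bounds keeps the driver limited to that slice, so no local signal is needed:

```vhdl
gen_label : for gen_indx in 0 to 3 generate
begin
    process (clk) is
    begin
        if rising_edge(clk) then
            sig1(gen_indx) <= '1';
            -- static slice bounds: the driver covers only these two bits,
            -- so the four generated processes no longer conflict on sig2
            sig2(gen_indx*2 + 1 downto gen_indx*2) <= (others => '1');
        end if;
    end process;
end generate gen_label;
```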