Bug Report for https://neetcode.io/problems/weight-initialization
This bug concerns the order in which the random parameters are generated in check_activations. The expected solution first sets torch.manual_seed(0), then initialises the weights and collects them into a list, and only after all of the weights exist does it generate the random input.
In principle, generating the random input first and then generating each weight matrix on the fly during the forward pass should work just as well, but under a fixed seed the values a tensor receives depend on the order in which random tensors are drawn, so any such reordering fails the tests.
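To illustrate why the order matters, here is a minimal sketch (the shapes are arbitrary placeholders, not the ones used by the problem): under the same seed, every call to torch.randn consumes draws from the shared RNG stream, so swapping two calls changes the values both tensors receive.

```python
import torch

# Same seed, different generation order -> different tensors.
torch.manual_seed(0)
w_first = torch.randn(3, 3)   # weights drawn first ...
x_after = torch.randn(4, 3)   # ... input drawn second

torch.manual_seed(0)
x_first = torch.randn(4, 3)   # input drawn first ...
w_after = torch.randn(3, 3)   # ... weights drawn second

# Each randn call advances the shared RNG stream, so the "same"
# input tensor differs between the two orderings.
print(torch.equal(x_after, x_first))  # False
```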
This is more of a product issue than a correctness bug, and fixing it would improve the user experience. The problem description should clearly instruct the user that they must:
- initialise and collect all the weights first, without calling self.xavier_init or self.kaiming_init (since doing so would disturb the RNG state)
- generate the random input with torch.randn, not with any other sampling function
- follow this exact order, since any deviation will fail the tests (see the sketch after this list)
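For concreteness, below is a minimal sketch of the ordering the checker appears to expect. Everything other than the ordering is an assumption made to keep the example runnable: the standalone signature, the layer_sizes parameter, the Kaiming-style scaling, and the ReLU forward pass are illustrative, not the problem's actual API.

```python
import torch

def check_activations(layer_sizes: list[int]) -> torch.Tensor:
    # Hypothetical sketch: only the RNG ordering reflects this report;
    # the signature, scaling, and forward pass are illustrative assumptions.
    torch.manual_seed(0)

    # 1. Draw ALL weight matrices first and collect them in a list.
    #    Calling helpers such as self.xavier_init here could consume extra
    #    (or re-seeded) RNG draws and desynchronise from the expected stream.
    weights = [
        torch.randn(fan_in, fan_out) * (2.0 / fan_in) ** 0.5  # Kaiming-style scale
        for fan_in, fan_out in zip(layer_sizes, layer_sizes[1:])
    ]

    # 2. Only once every weight exists, draw the random input with torch.randn.
    x = torch.randn(1, layer_sizes[0])

    # 3. Forward pass using the pre-collected weights.
    for w in weights:
        x = torch.relu(x @ w)
    return x
```

Any reordering, e.g. drawing the input before the weights, shifts the RNG stream and produces different tensors, which is exactly why deviations fail the tests.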