In the context of neural networks, what is batch normalization, and why is it used? - Study24x7
03 May 2024 09:54 AM


A. It normalizes the input features before feeding them to the network, enhancing model convergence

B. It normalizes the output of each layer in a mini-batch, improving training stability and speed

C. It balances the number of samples in each training batch, preventing bias in model updates

D. It is a technique to regularize neural networks by adjusting the batch size during training
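The correct choice is B: batch normalization standardizes each layer's activations using the mean and variance computed over the current mini-batch, then applies a learnable scale and shift. A minimal NumPy sketch of the forward pass (the helper name `batch_norm_forward` is illustrative, not from any particular library):

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalize activations x of shape (batch_size, features)."""
    mu = x.mean(axis=0)                     # per-feature mean over the mini-batch
    var = x.var(axis=0)                     # per-feature variance over the mini-batch
    x_hat = (x - mu) / np.sqrt(var + eps)   # ~zero mean, unit variance per feature
    return gamma * x_hat + beta             # learnable scale (gamma) and shift (beta)

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(32, 4))   # a mini-batch of raw activations
out = batch_norm_forward(x, gamma=np.ones(4), beta=np.zeros(4))

print(np.allclose(out.mean(axis=0), 0.0, atol=1e-6))  # normalized mean is ~0
print(np.allclose(out.std(axis=0), 1.0, atol=1e-2))   # normalized std is ~1
```

At inference time, frameworks typically replace the per-batch statistics with running averages accumulated during training, so single examples can be normalized consistently.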
