How to make initializers i.i.d. across all layers

I need to build an MLP whose weights (W and b) are initialized as independent and identically distributed Gaussians across all layers: all W entries are Gaussian i.i.d., and all b entries are i.i.d. with a distribution different from that of W. Now, if I specify a keras.initializers.RandomNormal(...), I can do this only for one layer, right? If I want to apply it across all layers, is the only option to build a custom initializer?
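To make the setup concrete, here is a minimal sketch of what I mean (the layer sizes and standard deviations are placeholders): the same Gaussian spec is passed as `kernel_initializer` to every layer, and a second Gaussian spec as `bias_initializer`. My understanding is that each layer still draws its own fresh samples when it is built, so the entries end up independent across layers rather than copied.

```python
import tensorflow as tf
from tensorflow import keras

# One Gaussian spec for all W, a different one for all b.
# (mean/stddev values here are arbitrary placeholders.)
w_init = keras.initializers.RandomNormal(mean=0.0, stddev=0.05)
b_init = keras.initializers.RandomNormal(mean=0.0, stddev=0.01)

model = keras.Sequential([
    keras.Input(shape=(16,)),
    keras.layers.Dense(32, activation="relu",
                       kernel_initializer=w_init, bias_initializer=b_init),
    keras.layers.Dense(32, activation="relu",
                       kernel_initializer=w_init, bias_initializer=b_init),
    keras.layers.Dense(1,
                       kernel_initializer=w_init, bias_initializer=b_init),
])

# Every kernel was drawn from w_init, every bias from b_init.
first_kernel = model.layers[0].get_weights()[0]   # shape (16, 32)
first_bias = model.layers[0].get_weights()[1]     # shape (32,)
```

Is repeating the initializer objects like this per layer the intended way, or does a custom initializer buy anything beyond it?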