All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so every convolution within a dense block uses a stride of one. Pooling layers, which reduce the spatial dimensions, are instead inserted between dense blocks.
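A minimal NumPy sketch of this idea: a naive stride-1, same-padded convolution keeps the height and width of its input, so each layer's output can be concatenated channel-wise with all earlier feature maps. Batch normalization is omitted for brevity, and all shapes and weights here are illustrative assumptions, not a full DenseNet implementation.

```python
import numpy as np

def conv2d_same(x, weight):
    """Naive stride-1 'same'-padded 2-D convolution.
    x: (C_in, H, W); weight: (C_out, C_in, k, k) with odd k.
    Stride 1 plus 'same' padding preserves H and W."""
    c_out, c_in, k, _ = weight.shape
    pad = k // 2
    xp = np.pad(x, ((0, 0), (pad, pad), (pad, pad)))
    _, H, W = x.shape
    out = np.zeros((c_out, H, W))
    for o in range(c_out):
        for i in range(H):
            for j in range(W):
                out[o, i, j] = np.sum(weight[o] * xp[:, i:i + k, j:j + k])
    return out

def dense_block(x, weights):
    """Each layer receives the channel-wise concatenation of the
    block input and all earlier layer outputs (batch norm omitted)."""
    features = [x]
    for w in weights:
        inp = np.concatenate(features, axis=0)    # channel-wise concat
        out = np.maximum(conv2d_same(inp, w), 0)  # ReLU activation
        features.append(out)
    return np.concatenate(features, axis=0)

# Hypothetical shapes: 3 input channels, growth rate 4, two layers.
x = np.random.randn(3, 8, 8)
w1 = np.random.randn(4, 3, 3, 3)   # sees 3 channels
w2 = np.random.randn(4, 7, 3, 3)   # sees 3 + 4 = 7 channels
y = dense_block(x, [w1, w2])
print(y.shape)  # (11, 8, 8): 3 + 4 + 4 channels, H and W unchanged
```

Because H and W never change inside the block, only the channel count grows; downsampling is deferred to the pooling layers between blocks.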