Table 1. Performance evaluation of activation functions

| Dataset  | Activation function           | mAP   |
|----------|-------------------------------|-------|
| Original | SiLU (original)               | 0.863 |
|          | SELU                          | 0.781 |
|          | ELU                           | 0.857 |
|          | LeakyReLU                     | 0.856 |
|          | Mish                          | 0.860 |
|          | ReLU                          | 0.856 |
|          | Proposed method (SiLU + Mish) | 0.866 |
mAP, mean average precision; SiLU, sigmoid-weighted linear unit; SELU, scaled exponential linear unit; ELU, exponential linear unit; ReLU, rectified linear unit.
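For reference, the two activation functions combined in the proposed method have the standard definitions SiLU(x) = x · sigmoid(x) and Mish(x) = x · tanh(softplus(x)). The sketch below only illustrates these definitions; how the paper actually combines SiLU and Mish within the network is not specified by this table, and the function names here are illustrative.

```python
import numpy as np

def silu(x):
    # SiLU / Swish: x * sigmoid(x) = x / (1 + exp(-x))
    return x / (1.0 + np.exp(-x))

def mish(x):
    # Mish: x * tanh(softplus(x)); logaddexp(0, x) is a stable softplus
    return x * np.tanh(np.logaddexp(0.0, x))

# Quick check on a few sample inputs (illustrative only)
x = np.array([-2.0, 0.0, 2.0])
print(silu(x))  # [-0.238  0.     1.762] (approx.)
print(mish(x))  # [-0.252  0.     1.944] (approx.)
```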