Expensive 'black-box' optimization problems involve functions whose analytic expression is unavailable and which are computationally intensive to evaluate. Such problems are often solved with frameworks that combine evolutionary algorithms with surrogate models. A common surrogate variant is the radial basis function (RBF) model, which combines the outputs of several basis functions. These basis functions depend on a hyperparameter that affects their shape and, consequently, the overall surrogate prediction. Typically the optimal hyperparameter value is unknown and is therefore estimated by additional numerical procedures. This raises the question of whether the additional computational resources spent on hyperparameter calibration are justified, namely, whether they translate into meaningful differences in the overall search effectiveness of the algorithm. This aspect has largely gone unexamined in the literature, where the impact of the hyperparameter has typically been assessed based only on the prediction accuracy of the stand-alone surrogate model. This paper therefore addresses this open question and analyzes the impact of the hyperparameter on the overall search performance based on an extensive set of numerical experiments. A detailed analysis shows that modifying the hyperparameter strongly affected this performance, and that the extent of the impact was related to the basis function type, the objective function modality, and the problem dimension.
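To illustrate the dependence described above, the following is a minimal sketch (not code from the paper) of a Gaussian-RBF surrogate in which a shape hyperparameter, here called `gamma`, controls the width of the basis functions; the same training data yields different predictions at unsampled points for different `gamma` values:

```python
import numpy as np

def rbf_surrogate(X_train, y_train, gamma):
    """Fit a Gaussian-RBF interpolant; gamma is the shape hyperparameter."""
    # Pairwise squared distances between all training points
    d2 = np.sum((X_train[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
    Phi = np.exp(-gamma * d2)          # basis-function (Gram) matrix
    w = np.linalg.solve(Phi, y_train)  # interpolation weights

    def predict(X_new):
        d2_new = np.sum((X_new[:, None, :] - X_train[None, :, :]) ** 2, axis=-1)
        return np.exp(-gamma * d2_new) @ w
    return predict

# Toy 1-D objective sampled at a handful of points
X = np.linspace(0.0, 1.0, 6).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
x_query = np.array([[0.35]])  # a point not in the training sample

# Same data, two hyperparameter values: the surrogate predictions differ
for gamma in (1.0, 50.0):
    model = rbf_surrogate(X, y, gamma)
    print(gamma, model(x_query)[0])
```

Both models interpolate the training data exactly, yet they disagree away from the sample points, which is precisely why the choice of the hyperparameter can propagate into the surrogate-assisted search.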