The performance of reliability inference strongly depends on how the product's lifetime distribution is modeled. Many products have complex lifetime distributions whose optimal settings are not easily found, so practitioners prefer a simpler lifetime distribution to facilitate the data modeling process, even when it differs from the true distribution. The effect of model misspecification on the product's lifetime prediction is therefore an interesting research area. This article presents results on the behavior of the relative bias (*RB*) and relative variability (*RV*) of the *p*th quantile in accelerated life test (ALT) experiments when the generalized Gamma (*GG*_{3}) distribution is incorrectly specified as a Lognormal or Weibull distribution. Both complete and censored ALT models are analyzed. First, analytical expressions for the expected log-likelihood function of the misspecified model with respect to the true model are derived. The best-fitting (quasi-true) parameters of the incorrect model are then obtained directly via numerical optimization. The results demonstrate that the tail quantiles are significantly overestimated (underestimated) when the data are wrongly fitted by a Lognormal (Weibull) distribution. Moreover, the variability of the tail quantiles is significantly enlarged when the model is incorrectly specified as Lognormal or Weibull. In particular, the effect on the tail quantiles is more pronounced when the sample size and censoring ratio are not large enough. Supplementary materials for this article are available online.
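The quasi-true parameter idea above can be illustrated numerically: the parameters of the wrong family that minimize the expected negative log-likelihood under the true distribution (equivalently, the KL divergence) are found by optimization, and the relative bias of a tail quantile follows by comparison. This is only a minimal sketch with arbitrary, hypothetical *GG*_{3} parameters; it uses a Monte Carlo approximation of the expected log-likelihood rather than the article's analytical expressions, and ignores the ALT stress covariates and censoring.

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical "true" GG_3 distribution (illustrative parameters only).
true_dist = stats.gengamma(a=2.0, c=1.5, scale=1.0)

# Monte Carlo sample from the true model, used to approximate
# E_true[-log f_wrong(X)]; minimizing this over the wrong model's
# parameters yields the quasi-true parameters.
rng_sample = true_dist.rvs(size=200_000, random_state=0)

def expected_nll(params, wrong_family):
    """Approximate expected negative log-likelihood of the wrong model."""
    shape, scale = params
    if shape <= 0 or scale <= 0:
        return np.inf  # keep the optimizer inside the valid region
    return -np.mean(wrong_family(shape, scale=scale).logpdf(rng_sample))

# Quasi-true Weibull parameters (lognormal works the same way with
# stats.lognorm, whose first argument is also a shape parameter).
res = optimize.minimize(expected_nll, x0=[1.5, 1.0],
                        args=(stats.weibull_min,), method="Nelder-Mead")
shape_q, scale_q = res.x
wrong_dist = stats.weibull_min(shape_q, scale=scale_q)

# Relative bias of the p-th quantile under misspecification.
p = 0.01
rb = (wrong_dist.ppf(p) - true_dist.ppf(p)) / true_dist.ppf(p)
print(f"quasi-true Weibull: shape={shape_q:.3f}, scale={scale_q:.3f}, "
      f"RB of {p}-quantile = {rb:.3f}")
```

With censored data, the same construction applies after replacing the log-density by the appropriate mix of log-density and log-survival terms in the expected log-likelihood.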