Variable selection in regression models has been studied extensively, and many non-Bayesian and Bayesian methods are available. An important class of regression models is generalized linear models, which accommodate discrete response variables. To add flexibility, generalized additive partial linear models can be considered, in which some predictors may have a non-linear effect while the others have a strictly linear effect. We consider Bayesian variable selection in these models. The functions in the non-parametric additive part of the model are expanded in a B-spline basis, and a multivariate Laplace prior with a point mass at zero is placed on the corresponding coefficients. The coefficients of the strictly linear components are assigned univariate Laplace priors with point masses at zero. The product of the prior and the likelihood is analytically intractable, but we obtain an approximation by expanding around the posterior mode, which for this choice of prior is the group lasso solution in the generalized linear model setting. We thereby avoid Markov chain Monte Carlo methods entirely, which are extremely slow and can be unreliable in high-dimensional models. We evaluate the performance of the Bayesian method through simulation studies and real data analyses.
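To make the setup concrete, the following is a minimal sketch (not the paper's actual implementation) of the two ingredients the abstract describes: a B-spline basis expansion of a non-linear predictor, and an L1-penalized generalized linear model fit whose solution is the posterior mode under independent Laplace priors. Note this sketch uses the ordinary lasso via `sklearn`; the paper's method uses a *group* lasso that penalizes each smooth term's spline coefficients jointly. All data, knot choices, and penalty values below are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import BSpline
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
x = rng.uniform(0, 1, n)            # predictor with a non-linear effect
z = rng.normal(size=(n, 3))         # strictly linear predictors

# Cubic B-spline basis for the non-linear component:
# boundary knots repeated k+1 times, 5 interior knots -> 9 basis functions.
k = 3
interior = np.linspace(0, 1, 7)[1:-1]
t = np.r_[[0.0] * (k + 1), interior, [1.0] * (k + 1)]
B = BSpline.design_matrix(x, t, k).toarray()   # shape (n, len(t) - k - 1)

# Simulated binary response: smooth effect of x, sparse linear effect of z.
eta = np.sin(2 * np.pi * x) + 1.5 * z[:, 0]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))

# L1-penalized logistic regression: the posterior mode under independent
# Laplace (double-exponential) priors on the coefficients. The paper instead
# groups the spline coefficients of each smooth term (group lasso).
X = np.hstack([B, z])
fit = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
print(B.shape, fit.coef_.shape)
```

The point-mass-at-zero component of the prior is what lets the posterior put positive probability on exact exclusion of a predictor; the L1 fit above only gives the mode around which the paper's approximation is expanded.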