Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors / Calvetti, Daniela; Pragliola, Monica; Somersalo, Erkki; Strang, Alexander. - In: INVERSE PROBLEMS. - ISSN 0266-5611. - 36:2(2020). [10.1088/1361-6420/ab4d92]
Sparse reconstructions from few noisy data: analysis of hierarchical Bayesian models with generalized gamma hyperpriors
Pragliola, Monica;
2020
Abstract
Solving inverse problems with sparsity-promoting regularizing penalties can be recast in the Bayesian framework as finding a maximum a posteriori (MAP) estimate with sparsity-promoting priors. In the latter context, a computationally convenient choice of prior is the family of conditionally Gaussian hierarchical models for which the prior variances of the components of the unknown are independent and follow a hyperprior from a generalized gamma family. In this paper, we analyze the optimization problem behind the MAP estimation and identify hyperparameter combinations that lead to a globally or locally convex optimization problem. The MAP estimation problem is solved using a computationally efficient alternating iterative algorithm. Its properties in the context of the generalized gamma hypermodel and its connections with some known sparsity-promoting penalty methods are analyzed. Computed examples elucidate the convergence and sparsity-promoting properties of the algorithm.
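The alternating scheme summarized above admits a compact illustration. Below is a minimal sketch, assuming the gamma member (r = 1) of the generalized gamma family of hyperpriors, a linear forward model with additive Gaussian noise, and dense linear algebra; the names (`ias_map_gamma`, `A`, `b`, `sigma`, `beta`, `vartheta`) are illustrative assumptions, not taken from the paper. Each sweep alternates a quadratic least-squares update of the unknown with a closed-form componentwise update of the prior variances.

```python
import numpy as np

def ias_map_gamma(A, b, sigma, beta=1.6, vartheta=1e-2, n_iter=50):
    """Alternating MAP estimation for a conditionally Gaussian model
    with independent gamma hyperpriors on the prior variances.

    Minimizes, over (x, theta),
        1/(2 sigma^2) ||A x - b||^2
        + sum_j [ x_j^2 / (2 theta_j) + theta_j / vartheta - eta * log(theta_j) ],
    with eta = beta - 3/2 (beta > 3/2 keeps the theta-step well posed).
    Illustrative sketch only; names and defaults are assumptions.
    """
    m, n = A.shape
    eta = beta - 1.5
    theta = np.full(n, vartheta)   # initial prior variances
    x = np.zeros(n)
    for _ in range(n_iter):
        # x-step: weighted least squares, written as a stacked system
        # ||A x - b||^2 + ||diag(sigma/sqrt(theta)) x||^2 and solved
        # densely for clarity (a Krylov solver would be used at scale).
        D = np.diag(sigma / np.sqrt(theta))
        M = np.vstack([A, D])
        rhs = np.concatenate([b, np.zeros(n)])
        x, *_ = np.linalg.lstsq(M, rhs, rcond=None)
        # theta-step: componentwise minimizer, available in closed form.
        theta = vartheta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * vartheta)))
    return x, theta
```

The closed-form variance update is what makes such a scheme computationally attractive: the only nontrivial cost per sweep is the least-squares solve for the unknown, which in practice would be handled by an iterative method such as CGLS rather than a dense factorization.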
File | Access | License | Size | Format
---|---|---|---|---
2020_INVERSE_PROBLEMS_Sparse reconstruction.pdf | Authorized users only | Publisher's copyright | 4.91 MB | Adobe PDF