Armando Bellante
@ikiga1.bsky.social
Postdoctoral researcher in quantum algorithms at the Max Planck Institute of Quantum Optics, Munich. PhD from Politecnico di Milano. Reverse engineering and binary exploitation with Tower of Hanoi and mhackeroni.
This mirrors the classical #CompressedSensing vs. #ShannonNyquist setting: lower bounds for dense objects stay intact, but sparsity in the right dictionary changes the sample/query complexity.

A huge thanks to Stefano Vanerio, @raistolo.bsky.social, and to everyone who discussed this with me. 6/6
Quantum Sparse Recovery and Quantum Orthogonal Matching Pursuit
We study quantum sparse recovery in non-orthogonal, overcomplete dictionaries: given coherent quantum access to a state and a dictionary of vectors, the goal is to reconstruct the state up to $\ell_2$...
arxiv.org
October 9, 2025 at 12:21 PM
In favorable regimes (e.g., m≈N dictionary vectors, K=Õ(1) sparsity, and well-conditioned support), QOMP lowers the query cost of pure-state #quantum tomography from Θ̃(N/ε) to Õ(√N/ε), circumventing the known tight lower bounds thanks to the sparsity assumptions. 5/n
October 9, 2025 at 12:21 PM
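To make the scaling in the post above concrete, here is a back-of-the-envelope comparison; the numbers N = 10^6 and ε = 10^-2 are illustrative choices of ours, and the polylogarithmic factors hidden in the tildes are dropped:

\[
  \frac{N}{\varepsilon} = \frac{10^{6}}{10^{-2}} = 10^{8}
  \qquad \text{vs.} \qquad
  \frac{\sqrt{N}}{\varepsilon} = \frac{10^{3}}{10^{-2}} = 10^{5},
\]

i.e., roughly a thousand-fold fewer queries in such favorable sparse instances.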
We prove that, under standard mutual-incoherence and well-conditioning assumptions on the dictionary, QOMP recovers the optimal support in polynomial time! 4/n
October 9, 2025 at 12:21 PM
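For reference, the classical benchmark behind the incoherence assumption mentioned above is Tropp's exact recovery condition for OMP: with unit-norm dictionary vectors d_i, classical OMP recovers the support of any K-sparse signal whenever

\[
  \mu(D) \;=\; \max_{i \neq j} \bigl|\langle d_i, d_j \rangle\bigr| \;<\; \frac{1}{2K - 1}.
\]

This is the classical condition only; the paper's quantum statement may use different constants or additional conditioning assumptions.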
To overcome this, we introduce #QOMP: a greedy, iterative #quantumalgorithm that applies block-encoded projections to isolate the residual, estimates overlaps, and identifies one dictionary vector per round, using an error-resetting strategy to prevent error propagation across iterations. 3/n
October 9, 2025 at 12:21 PM
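For intuition, here is a classical skeleton of the greedy loop described above. This is a plain numpy sketch and not the paper's QOMP: overlaps are computed exactly rather than estimated from a quantum state, and the projection step is an explicit least-squares solve rather than a block-encoded projector. Function and variable names are ours.

import numpy as np

def omp(D, psi, K, tol=1e-8):
    """Greedy sparse recovery of psi over the columns of dictionary D.

    D   : (N, m) array with unit-norm columns (possibly overcomplete, m >= N)
    psi : (N,) target vector (classical stand-in for the quantum state)
    K   : sparsity budget (maximum number of dictionary vectors to select)
    """
    support = []                 # indices of selected dictionary vectors
    residual = psi.copy()
    coeffs = np.zeros(0)
    for _ in range(K):
        # 1) Overlap estimation: correlate the residual with every column.
        overlaps = np.abs(D.conj().T @ residual)
        overlaps[support] = 0.0  # never reselect a column
        support.append(int(np.argmax(overlaps)))
        # 2) "Error reset": refit all coefficients on the current support from
        #    the original vector instead of updating the previous residual,
        #    so per-iteration errors do not accumulate across rounds.
        Ds = D[:, support]
        coeffs, *_ = np.linalg.lstsq(Ds, psi, rcond=None)
        residual = psi - Ds @ coeffs
        if np.linalg.norm(residual) <= tol:
            break
    return support, coeffs, residual

# Tiny usage example with a random, roughly incoherent dictionary.
rng = np.random.default_rng(0)
N, m, K = 64, 128, 3
D = rng.standard_normal((N, m))
D /= np.linalg.norm(D, axis=0)
idx = rng.choice(m, K, replace=False)
x_true = np.zeros(m)
x_true[idx] = rng.standard_normal(K)
psi = D @ x_true
support, coeffs, residual = omp(D, psi, K)
print(sorted(support), sorted(idx), np.linalg.norm(residual))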
We formalize and study the problem of #QuantumSparseRecovery: given coherent access to a state and a dictionary, reconstruct the state up to ℓ₂ error ε using as few dictionary vectors as possible. We prove the general problem is #NP-hard, showing that efficiency needs structure. 2/n
October 9, 2025 at 12:21 PM
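One plausible way to write the optimization behind the problem stated above (our paraphrase of the post, not the paper's exact formulation): given |ψ⟩ and dictionary vectors |d_1⟩,…,|d_m⟩,

\[
  \min_{S \subseteq \{1,\dots,m\}} |S|
  \quad \text{s.t.} \quad
  \exists\, x \in \mathbb{C}^{|S|}:\;
  \Bigl\| \, |\psi\rangle - \sum_{j \in S} x_j \, |d_j\rangle \Bigr\|_2 \le \varepsilon,
\]

i.e., find the smallest set of dictionary vectors whose span contains an ε-close ℓ₂ approximation of the state; the analogous classical sparse-approximation problem over general dictionaries is already NP-hard.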