Ayush Bharti
@ayushbharti.bsky.social
830 followers 190 following 18 posts
Academy Research Fellow at the Dept. of Computer Science, Aalto University, Finland. Affiliated with the Finnish Center for Artificial Intelligence. Website: http://bharti-ayush.github.io
Reposted by Ayush Bharti
hpesonen.bsky.social
I'm looking for a Doctoral Researcher (PhD student) to work with me on simulation-based inference at the Data Science Research Centre, Tampere University. Check the link for details and send an application before October 10th.

tuni.rekrytointi.com/paikat/?o=A_...
Doctoral Researcher (simulation-based inference)
tuni.rekrytointi.com
Reposted by Ayush Bharti
fxbriol.bsky.social
Just finished delivering a course on 'Robust and scalable simulation-based inference (SBI)' at Greek Stochastics. This covered an introduction to SBI, open challenges, and some recent contributions from my own group.

The slides are now available here: fxbriol.github.io/pdfs/slides-....
ayushbharti.bsky.social
Thank you so much! Glad you liked it.
ayushbharti.bsky.social
Doing so saves hours of computation time for the radio propagation model without any degradation in performance. (5/5)
ayushbharti.bsky.social
Sampling from the cost-aware proposal is done via rejection sampling, and self-normalised importance weights are used to target the SBI posterior. (4/5)
ayushbharti.bsky.social
We propose to sample from a cost-aware proposal to encourage sampling from the cheaper parameterisations of the model. (3/5)
ayushbharti.bsky.social
Oftentimes, this computational cost varies with the parameter value, as is the case with this model from the wireless communications field, where the cost increases linearly. (2/5)
ayushbharti.bsky.social
Thread below:

Popular SBI methods such as approximate Bayesian computation (ABC), neural posterior estimation (NPE), and neural likelihood estimation (NLE) require running the simulator thousands of times, which can be a computational bottleneck. (1/5)
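The scheme sketched in this thread (sample from a cost-aware proposal via rejection sampling, then correct back to the target with self-normalised importance weights) can be illustrated with a toy example. This is a hypothetical sketch, not the paper's code: the linear cost function, the uniform prior, and the stand-in likelihood are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed linear simulation cost, as in the thread's wireless example.
def cost(theta):
    return 1.0 + theta

# Assumed Uniform(0, 1) prior over the parameter.
def prior_pdf(theta):
    return np.where((theta >= 0) & (theta <= 1), 1.0, 0.0)

# Cost-aware proposal: q(theta) proportional to prior(theta) / cost(theta),
# so cheaper parameterisations are sampled more often.
def unnorm_proposal_pdf(theta):
    return prior_pdf(theta) / cost(theta)

# Rejection sampling from q, using the prior as the envelope:
# since cost >= 1, accepting a prior draw with probability 1/cost(theta)
# yields samples from q.
def sample_proposal(n):
    samples = []
    while len(samples) < n:
        theta = rng.uniform(0.0, 1.0)
        if rng.uniform() < 1.0 / cost(theta):
            samples.append(theta)
    return np.array(samples)

# Self-normalised importance weights targeting the posterior
# (proportional to likelihood * prior): w = likelihood * prior / q.
def snis_weights(thetas, likelihood):
    w = likelihood(thetas) * prior_pdf(thetas) / unnorm_proposal_pdf(thetas)
    return w / w.sum()

# Toy "likelihood" standing in for an expensive simulator-based comparison.
lik = lambda th: np.exp(-0.5 * ((th - 0.3) / 0.1) ** 2)

thetas = sample_proposal(2000)
w = snis_weights(thetas, lik)
post_mean = np.sum(w * thetas)  # weighted posterior mean, near 0.3 here
```

Note the weights simplify to likelihood times cost: draws from cheap regions are upweighted less, which is exactly what restores the original posterior despite the biased sampling.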
ayushbharti.bsky.social
Looking forward to this!
upicchini.bsky.social
Reminder that the next OWABI seminar www.warwick.ac.uk/owabi is scheduled for Thursday 24th April at 11am UK time. Our next speaker is @ayushbharti.bsky.social (Aalto University), who will talk about "Cost-aware simulation-based inference".
Reposted by Ayush Bharti
mummitrollet.bsky.social
Multi-Head Latent Attention vs Group Query Attention: We break down why MLA is a more expressive memory compression technique AND why naive implementations can backfire. Check it out!
datacrunch.io
⚡️Multi-Head Latent Attention is one of the key innovations that enabled @deepseek_ai's V3 and the subsequent R1 model.

⏭️ Join us as we continue our series into efficient AI inference, covering both theoretical insights and practical implementation:

🔗 datacrunch.io/blog/deepsee...
DeepSeek + SGLang: Multi-Head Latent Attention
Multi-Head Latent Attention (MLA) improves upon Group Query Attention (GQA), enabling long-context reasoning models and wider adoption across open-source LLMs.
datacrunch.io
ayushbharti.bsky.social
Looking forward to speaking at the @approxbayesseminar.bsky.social!
fxbriol.bsky.social
My collaborator @ayushbharti.bsky.social will be presenting our recently accepted AISTATS paper on 'cost-aware simulation-based inference' at the next One World ABI Seminar on the 27th February.

Full details of the seminar series: warwick.ac.uk/fac/sci/stat...
arXiv paper: arxiv.org/abs/2410.07930
One World ABI Seminar
warwick.ac.uk
Reposted by Ayush Bharti
approxbayesseminar.bsky.social
Our next talk will be on Thursday 27th February at 11am UK time. The speaker is Ayush Bharti (Aalto University), who will talk about "Cost-aware simulation-based inference". To receive the link, sign up here: listserv.csv.warwick...
Reposted by Ayush Bharti
liza-semenova.bsky.social
If you are interested in doing a #PhD with me at Imperial College London and qualify as a home student, please reach out (before end of 2024)! Potential topics: spatial statistics, applied deep generative models, probabilistic programming and more.
ayushbharti.bsky.social
Congratulations Matias!
Reposted by Ayush Bharti
huangdaolang.bsky.social
Optimizing decision utility in Bayesian experimental design is key to improving downstream decision-making.

Excited to share our #NeurIPS2024 paper on Amortized Decision-Aware Bayesian Experimental Design: arxiv.org/abs/2411.02064

@lacerbi.bsky.social @samikaski.bsky.social

Details below.
Reposted by Ayush Bharti
lacerbi.bsky.social
@huangdaolang.bsky.social just joined here and you should follow him if you are interested in probabilistic machine learning, (Bayesian) exp. design and AI-assisted decision making. Not to mention that he has *several* NeurIPS papers already while in his 3rd PhD year...
ayushbharti.bsky.social
Looks very interesting. This goes to the top of my reading list 😀
ayushbharti.bsky.social
Hi, I'd like to join if that's ok.