Lucija Batinovic
@metahag.bsky.social
330 followers 750 following 73 posts
Meta-science | Open Science | PhD student | Disability Research | Editorial Assistant @ Meta-Psychology | https://elucidatescience.netlify.app
Reposted by Lucija Batinovic
rezekjoe.bsky.social
I fucking love buying books.
Reposted by Lucija Batinovic
mekline.bsky.social
Hi #openscience friends! As Psych-DS moves forward (stay tuned for an R package update...), we're applying in parallel for renewal of the NIH grant that funds this work.

If you (a) have NIH funding and (b) plan or would like to use Psych-DS for resulting datasets, please drop me a line!
Reposted by Lucija Batinovic
aufdroeseler.bsky.social
ReplicationResearch.org is now open for submissions!

Submit replications and reproductions from many different fields, as well as conceptual contributions. With diamond OA, open and citable peer review reports, and reproducibility checks, we push the boundaries of open and fair publishing.
metahag.bsky.social
Putting high hopes in this one. Happy birthday to me 🥳
Picture of the book “Model to Meaning”
metahag.bsky.social
I think time is just a proxy for cognitive bandwidth
metahag.bsky.social
Feedback is very welcome, particularly if you spot mistakes or unclear interpretations!
metahag.bsky.social
Understanding how different parts of evidence synthesis inform the final result is crucial for trusting meta-analytic findings, which often carry more weight than single studies when it comes to shaping policies and opinions.
metahag.bsky.social
This post is an attempt to introduce meta-analytic concepts and relate them to the GRADE approach, using a few simple examples to show how different parts of a meta-analysis can inform different GRADE domains. That mapping is something I often struggle with, as it feels so subjective and nuanced.
metahag.bsky.social
Many meta-analyses in education research (and probably other fields outside of medicine) overlook aspects such as heterogeneity, publication bias, and quality of evidence when drawing conclusions about the state of the field.
metahag.bsky.social
Thank you! I didn’t realize I had 4.6.0 because I recently updated packages so I assumed that was the latest version. Updating to 4.8.0 worked.
metahag.bsky.social
Is anyone else getting this error or am I missing something: Error: 'se' is not an exported object from 'namespace:metafor'?
#rstats @wviechtb.bsky.social
Reposted by Lucija Batinovic
ianhussey.mmmdata.io
Yet another example of original authors being allowed to say anything they want in replies to critiques unconstrained by verifiable facts.

Reply states we didn’t consider things we explicitly did, and says the original article never said things it explicitly did.

Read Jamie’s thread for details:
jamiecummins.bsky.social
This has now been published in PNAS!

www.pnas.org/doi/10.1073/...

The original authors also posted a reply:

www.pnas.org/doi/10.1073/...

Quick thread on some additional thoughts and then I'm probably done talking about this one 🧵
Reposted by Lucija Batinovic
ianhussey.mmmdata.io
I wrote an R package that creates standardized R project structures that are compliant with @mekline.bsky.social's psych-DS...ish.

It also creates additional features for reproducibility and teaching like a readme, license, .gitignore and Quarto templates

+ can validate existing projects
Creating and validating standardized R project structures that are psych-DS compliant-ish
Making psychological code and data FAIR is hard, in part because different projects organize their code and data very differently. Sometimes this is for good reasons, such as due to the demands of a g...
mmmdata.io
Reposted by Lucija Batinovic
jamiecummins.bsky.social
ERROR @error.reviews was awarded a Commendation from the Society for the Improvement of Psychological Science!
metahag.bsky.social
For anyone who’s published systematic reviews, how often did you succeed in getting unpublished data by reaching out to authors? I don’t mean additional/IP data from published papers, but new data or data they could not get published. #evidencesynthesis
metahag.bsky.social
I asked with the same prompt and z-curve was not mentioned at all. The reply I got was too long, but it mentioned:

Among commonly used tests, Peters’ test seems to strike a good balance:
– better power than Begg’s,
– lower false positive rate than Egger’s when heterogeneity is present.
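For readers unfamiliar with how these asymmetry tests differ, both Egger's and Peters' tests are weighted regressions of the effect size on a measure of study precision; they differ mainly in the predictor and weights (Egger's uses the standard error weighted by inverse variance, Peters' uses 1/n weighted by sample size). A minimal numpy-only sketch of that distinction, with made-up example data (the study values are purely illustrative, not from any real meta-analysis):

```python
import numpy as np

# Hypothetical data for 5 studies (assumed for illustration only):
yi = np.array([0.20, 0.35, 0.10, 0.50, 0.05])  # effect estimates
vi = np.array([0.04, 0.09, 0.02, 0.16, 0.01])  # sampling variances
ni = np.array([120, 60, 250, 40, 400])         # total sample sizes

def wls_slope(y, x, w):
    """Slope from a weighted least-squares fit of y on an intercept and x."""
    s = np.sqrt(w)
    X = np.column_stack([np.ones_like(x), x]) * s[:, None]
    beta, *_ = np.linalg.lstsq(X, y * s, rcond=None)
    return beta[1]

# Egger-style test: regress effect on its standard error, weighted by precision
egger_slope = wls_slope(yi, np.sqrt(vi), 1 / vi)

# Peters-style test: regress effect on 1/n, weighted by sample size
peters_slope = wls_slope(yi, 1 / ni, ni)

print(egger_slope, peters_slope)
```

In practice you would test whether the slope differs from zero (e.g. with `metafor::regtest()` in R, which implements both predictor choices); the point here is just that the two tests ask slightly different versions of the same "does small-study size predict effect size?" question.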
Reposted by Lucija Batinovic
lakens.bsky.social
A lot of discussions about open science in psychology happened on blogs. I have been archiving them and will make their content available. I remember 37 blogs (with 2827 posts!) but which blogs did I forget, and are not on my list? Please share and reply with any that are missing!
Reposted by Lucija Batinovic
syeducation.bsky.social
New post! "A Tale of Two Science Reform Movements," in which I compare the recent #Metascience2025 and #SIPS2025 conferences, and find that I am much more at home at one than the other. getsyeducated.substack.com/p/a-tale-of-...
A Tale of Two Science Reform Movements
Reflections from meetings of Metascience 2025 and the Society for the Improvement of Psychological Science
getsyeducated.substack.com
Reposted by Lucija Batinovic
metahag.bsky.social
Agreed, I feel like the overall direction of the entire conference felt too abstract and almost like it’s engaging in “meta-washing”
metahag.bsky.social
Yes, very slow, no matter which browser I use. It’s been like that since they had the crash…
Reposted by Lucija Batinovic
cosig.net
COSIG @cosig.net · Jun 4
Anyone can do post-publication peer review.
Anyone can be a steward of the scientific literature.
Anyone can do forensic metascience.
Anyone can sleuth.

That's why we are launching COSIG: the Collection of Open Science Integrity Guides, an open source resource for all of the above.

cosig.net
COSIG logo:
COSIG (Collection of Open Science Integrity Guides)

Now available at cosig.net!