2/ Lots of people hype AI as 'transformative', but few internalise how crazy it could really be. There are three different types of possible acceleration, which are much more grounded in empirical research than they were a couple of years ago.

3/ These accelerations bring a range of major risks: not just misalignment, but also concentration of power, new weapons of mass destruction, great power conflict, the treatment of digital beings, and more.

4/ Under 10,000 people work full-time on reducing the most important of these risks – tiny compared to the millions working on established issues like climate change, or to the number of people trying to deploy the technology as quickly as possible.
6/ The chance of building powerful AI is unusually high between now and around 2030, making the next 5 years especially critical.

If AGI emerges in the next 5 years, you'll be part of one of the most important transitions in human history. If not, you'll have time to return to your previous path.

It's often possible to transition with just ~100 hours of reading and speaking to people in the field. You don't need to be technical – there are many other ways to help.