When the Turing Prize (later renamed AI Alignment Awards) visited India, I helped them hold seminars and workshops at different universities, trying to nerd-snipe talented students into working on the alignment problem.
One of the most impactful things I could do is get more people to read HPMOR. The Harry Potter fan fiction Harry Potter and the Methods of Rationality has been a pipeline through which some of the best thinkers in the alignment community found the problem and came to care about it.
Most of the people who love it express the wish that they had found its ideas earlier. So I am applying for this grant to fund the printing of additional copies of HPMOR.
I have also found Superintelligence to be an excellent book to give away to the right crowd in the right manner: at the end of talks, as a prize for students or participants who engage and answer questions.
Giving away books on the street to random people is not very impactful and is mostly a waste of paper. But people are much more invested and likely to engage with the material when the giveaway is planned thoughtfully.
I am working with my friends Madhumitha (who recently gave a talk on AI safety at Dev Fest, followed by book distribution to those who answered questions) and Bhishmaraj (who works at Google Hyderabad and gave a talk on AGI x-risk at IIIT Hyderabad with me). They can help me effectively distribute at least 500 copies of HPMOR.
A few months ago, when I worked with the AI Alignment Awards organization, the rate was around 10 USD per copy; I did not try to optimize for the quality of the paper, binding, etc. This time I would like to budget 15 USD per copy to improve the quality of the printing.
I am excited to distribute 500 more copies of HPMOR, but I am also interested in giving away copies of Superintelligence, Brian Christian's The Alignment Problem, Nate Soares's Replacing Guilt, or the Sequences, depending on the funding amount.
I am currently a recipient of a Community Builders Grant from CEA to do AI safety field building in India.
I have already worked with Akash Wasil (currently at the Center for AI Safety) and Olivia when they ran similar book giveaways in India.
I am currently a Ph.D. student at the Indian Institute of Science, India's top-ranked fundamental research university, and have contacts in academia, so I can get the book into the hands of prospective graduate students who can contribute to the technical problem.
I have friends at FAANG companies working on LLMs; it would be impactful if they, too, were made aware of the nuanced risks behind the technologies they help bring to market.
https://twitter.com/adityaarpitha
https://blog.adityaarpitha.com/
https://t.me/everythingisrelative
Funding range: 4,500 USD (300 copies at 15 USD each) to 9,000 USD (600 copies at 15 USD each).
Success metric: at least 85% of distributed copies being read fully.