
Hu Jiaqi: Please! Our Leaders, Can You Have a Sense of Crisis? —The 9th Open Letter to Leaders of Mankind

To the leaders of various countries, the Secretary-General of the United Nations, top scientists and scholars, prominent entrepreneurs, and leading media figures:

Distinguished leaders,

I take up my pen once again to issue this, my ninth open letter. Amid a profound sense of regret, a glimmer of hope still remains. For the past forty-four years, I have studied and advocated against the irrational development of science and technology, which will soon exterminate mankind. More than sixteen years have passed since I first wrote to the leaders of mankind, yet as I observe a world that still stagnates, I am filled with profound sorrow.

In today’s world, technology is advancing at a rapid pace, and unimaginable new inventions are born every day. However, as I have mentioned in previous open letters, the development of science and technology has reached a point where it holds the potential to destroy humanity. We must take immediate action, or it will be too late.

Many scientists have also expressed deep concern about this. Following the joint open letter in which thousands of scientists called for a pause on advanced AI experiments, more than 350 leading artificial intelligence experts recently signed another open letter, demanding that the existential risk posed by AI be reduced and that this issue be treated alongside other large-scale risks such as pandemics and nuclear war.

This warning is not without basis. The power of science and technology has the potential to surpass our control and may even lead to our self-destruction.

Despite my repeated warnings and proposed solutions, it is regrettable that our leaders have not taken any substantial action on this matter. What I want to emphasize is that there is no issue more critical than the imminent threat of human extinction caused by science and technology.

Therefore, I reiterate my proposal: to achieve the great unification of humanity and utilize the power of a universal government to control science and technology, collectively lowering its level of risk. We cannot afford to push humanity to the edge of the cliff; we need to provide enough braking distance. It is time for our human leaders to wake up and take action.

I understand that implementing this proposal comes with significant challenges. However, the survival of humanity as a whole comes above all else. Faced with the possibility of our own destruction, we have no other choice: global technological risks must be countered through global effort.

Our leaders need to take a higher perspective to examine and understand this issue. The development of science and technology is not limitless but should be conducted within the boundaries and limits of human rationality. We cannot allow technologies to progress endlessly at the expense of our survival. Human history has shown that unlimited progress does not always yield positive results; on the contrary, it can lead to our destruction. We must adopt a more comprehensive perspective on technological advancement, acknowledging its benefits while also recognizing the risks and threats it may pose.

As the custodians of Earth, we have a responsibility and an obligation to protect our home and our future. This goes beyond the well-being of the current generation of humanity; it extends to the welfare of our future generations. We must be clear about our priorities, which are to safeguard human survival and continuity.

In light of this, I urge all human leaders to set aside their disputes and come together to address this global issue. I hope our leaders can truly comprehend the severity of this problem and take appropriate actions. Once again, I appeal to you to make the right decisions and take necessary actions with your wisdom and courage. The survival, well-being, and progress of humanity depend on your choices.

Lastly, what I want to say is that there is no issue more critical than the imminent threat of human extinction posed by science and technology. It is high time for our leaders to wake up and take action. I sincerely hope that you hear my appeal and make the right and responsible choices for the future of humanity.

Please! Our leaders, can you have a sense of crisis?!

Yours truly,

Hu Jiaqi

Founder of Save Human Action Organization

June 20, 2023

Join Us: http://savinghuman.org/pages/join-us.html

Email: join@savinghuman.org

Data & News supplied by www.cloudquote.io
Stock quotes supplied by Barchart
Quotes delayed at least 20 minutes.
By accessing this page, you agree to the following
Privacy Policy and Terms and Conditions.