From 46074f7f9aa4e107f23bda6e8fe2d49f9e7bf451 Mon Sep 17 00:00:00 2001
From: Joep Meindertsma
Date: Mon, 15 Jan 2024 14:56:31 +0100
Subject: [PATCH] Adjacent proposal

---
 src/posts/proposal.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/src/posts/proposal.md b/src/posts/proposal.md
index 7a6c5efd8..7181b3fca 100644
--- a/src/posts/proposal.md
+++ b/src/posts/proposal.md
@@ -15,7 +15,7 @@ This is why we need a **global Pause**.
 ## Implementing a global Pause
 
 An international agreement is typically established through a summit, where leaders of countries meet to discuss the issue and make a decision.
-The UK has stepped up and is has hosted an AI safety summit in the autumn of 2023.
+The UK has stepped up and hosted an AI safety summit in the autumn of 2023, and two more summits have been announced.
 
 [More about the summits](/summit)
 
@@ -47,6 +47,8 @@ Read more about [how these risks can be mitigated](/mitigating-pause-failures).
 - **Ban training of AI systems on copyrighted material**. This helps with copyright issues, slows growing inequality and slows down progress towards superintelligence.
 - **Hold AI model creators liable** for criminal acts committed using their AI systems. This gives model creators more incentives to make sure their models are safe.
+- **Renegotiate work contracts** to prevent corporations from forcing employees to automate their work with AI.
+- **Support the climate movement** to investigate (and sue over) the pollution from data centers, chip fabs, and mines.
 
 ## Long term policy