A brain made of circuits floating above a road in the desert


AI alignment refers to the process of designing AI systems so that they act in accordance with human values and goals, or with the goals of their creators. It is an open problem for modern AI systems and an active research field within AI. Typically, alignment requires that the AI system advance broader objectives such as human values and other ethical principles.

An AI system is referred to as aligned when it advances its designers' intended goals and interests, and misaligned (or not aligned) when it does not.
