Published on October 1, 2025 9:09 PM GMT

I think alignment by itself is a complex phenomenon, but a big part of it is sharing the same, or at least similar, ethics.

And ethics itself is also a very complex phenomenon, often fuzzy and inconsistent, but to a certain approximation, from a certain perspective, I see a lot of ethics in large part as a software protocol evolved by evolution for mutual cooperation between "same enough kinds".

And different ethical frameworks define "same enough kind" differently: whether you include people from your neighborhood, your country, the whole Earth, your ideology, your personality cluster, your interests cluster, and so on; whether you include animals (and which animals you include or exclude); whether you include more aligned or less aligned AGIs; in short, whom your ethics counts as someone you want to cooperate with.

You can cooperate more locally or more globally, depending on how big your circle is and what kinds of people or entities in general you cooperate with, and on a smaller or bigger timescale, in order to collectively survive. You can use a lot of methods for that, for example maximizing overall good in utilitarianism, which can potentially be quantified over time to a certain degree (as in a lot of Effective Altruism), or following formal rules/duties in deontology. I think this can have some approximate formal basis in more than one form.

But I think a lot of people also operate more on fluid vibes that are hard to ground in any formalism, and there is still a lot of cooperation involved.
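Still, as one illustration of the "quantified over time" direction mentioned above, here is a toy formalization. This is purely my own sketch: the symbols $C$, $w_i$, $u_i(t)$, $\delta$, and $T$ are illustrative assumptions, not something any of these frameworks actually commits to.

$$
W \;=\; \sum_{i \in C} w_i \sum_{t=0}^{T} \delta^{t}\, u_i(t)
$$

Here $C$ is the circle of entities counted as "same enough kind", $w_i$ is how much weight entity $i$ gets, $u_i(t)$ is its well-being at time $t$, and $\delta$ sets how much the future is discounted over the timescale $T$ you care about. Widening the moral circle changes $C$ and the weights $w_i$; a utilitarian would try to maximize something like $W$, while a deontologist would instead constrain which actions are permissible regardless of the resulting $W$.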