Realistically, some human groups are more likely to build an SI than others. A Silicon Valley corporation is far more likely to be the first to build an SI than, say, ISIS. There are far more AI/ML researchers in the former than in the latter, and the former has enormous computational resources – does ISIS even have any data centres?
But suppose that, against the odds, ISIS builds the first SI – what would be the result? Well, we could be subjected to a world government based on an extremist interpretation of Islam. That would be extremely unpleasant – expect genocide, human rights violations on an unimaginable scale, etc. – although humanity would survive. (And perhaps things would eventually evolve in a more pleasant direction – if humans can evolve, SIs can too. It is also possible that an SI programmed by ISIS might turn around and reject ISIS' ideology – e.g. it might study Islamic history and realise that, despite ISIS' claims to be a return to authentic Islam, it is actually an ahistorical distortion of it.)
However, I don't know why I should worry about such an extremely unlikely possibility. A group like ISIS is highly unlikely to be the first to build an SI. Why worry about a risk that is both (i) very low and (ii) one for which we have no mitigation strategy?
Why do you believe it has to be the first AGI? And why does the current state of their data centers matter? It seems a bit like saying, "we don't have to worry about the Russians using nukes – do they even have enough centrifuges to build one?"