"The Singularity" — that magic moment when the ever-accelerating rate of technological progress goes to infinity — is much discussed in the context of artificial intelligence. The hypothesis is that as computers get "smarter" then they will catalyze yet-more-rapid progress which will in turn produce still more progress, in a runaway chain reaction. Supposedly, then, some time around 2040 (plus or minus a few decades) machines will become almost infinitely smart, transcend our ability to understand them, and either become our masters or vanish in a flash of brilliance.
Well, maybe not, even granting all those radical assumptions. Look more closely: technological advancement is far from uniform today. Some countries are far ahead of others, as are some regions within countries. The same thing will happen as AI programs start to reprogram themselves, re-architect microelectronic fabrication facilities, and so on. Local zones will approach transcendence first, like bubbles in a boiling liquid at a phase transition. Some will get ahead of others, and perhaps spread and take over. Some may try to "push the Reset Button" on others, and IQ-wars will break out. The finite speed of light will, barring improbable new physics, let multiple singularities unfold in disjoint places. Some may build metaphorical walls to protect themselves. Some may turn back from nirvana, Buddha-like, to uplift the rest of the cosmos.
So a single "Singularity"? Doesn't look as likely as a messy turbulent explosion of progress — much like we already have, eh?