"which will then inevitably become a superbeing." My question is why you think this will be dangerous. Sure, itf it thought like a humans does, as a product of evolution and its drive towards blind competition, it would likely be very dangerous. That is not what it is a product of though.
I like Larry Niven's take, explained at Harry's Terran Bar by a couple of very advanced aliens: "Sorry, our friend who gave you the plans for the super-sentient computer is a great jokester. Everyone knows that a machine like that eventually just shuts itself off."