Yes, a machine or computer is only as smart as the person or people who program it, so our current AI is flawed to a point. Having more than one person program it can reduce those flaws and bring it closer to a perfect AI. Now, if it is self-aware and has the ability to learn and do away with its inherent flaws, then it could help make sure the next generation of AI is less flawed and better able to help humans in the long run.
As Elon Musk has pointed out, once an AI reaches a certain level of complexity, it will be able to evolve itself in a matter of days at worst, becoming god-like in intelligence relative to us. If we don't put proper restraints on our 'genie', it will be beyond us, and then we can only hope it doesn't care what we do, either way. If it did, we'd be squished like ants, or coddled like puppies.
What if AI already exists today but is not allowed online or into the mainstream until the fear factor among ordinary people can be brought down to a manageable level? What if some of the science-fiction movies coming out are actually showing current technology so that we become less fearful of it?
Computer systems can predict objects' responses to physical forces. New research examines the fundamental cognitive abilities an intelligent agent requires to navigate the world: discerning distinct objects and inferring how they respond to physical forces. Source 4q.
Having the ability to think and reason will be the main test of true Artificial Intelligence. I do not think we are very far from seeing this happen.
A superconducting synapse may be the missing piece for 'artificial brains'. Researchers have built a superconducting switch that 'learns' like a biological system and could connect processors and store memories in future computers that operate like the human brain. Source 6g.