A technology expert has highlighted the real dangers artificial intelligence poses both now and in the future, stressing the need for humans to remain cautious. Adrian Conzel, Chief Technology Officer at the data management company Own, shared his positive and negative views of the technology in an interview with The US Sun.
Conzel explained that artificial intelligence can be extremely useful: “I can cite some examples that I believe are an amazing use of artificial intelligence. When dealing with confirmed data like MRI and medical imaging, artificial intelligence’s ability to detect cancer better than doctors has already been proven in a remarkable way.”
He went on to note that “human input” in data can sometimes be dangerous, as can training artificial intelligence on old data, pointing out that “there is a real risk in using historical data sets to train things because there is real bias in historical data.”
He added: “Therefore, these are issues that are really important for us to consider.” The use of artificial intelligence to reinforce harmful biases has also alarmed other experts, and a recent study revealed how artificial intelligence may shape biased human decisions.
A statement describing that research raised concerns about humans being caught in a “dangerous loop.” Conzel also warned that companies are still at an early stage of adopting artificial intelligence, and that experts do not yet fully understand its true impact.
Conzel concluded: “I believe we are only at the beginning and have a lot to gain. We need to be cautious, and we certainly need to be careful with individuals’ privacy and information.” He suggested that corporate data is a good area in which to apply artificial intelligence programs, rather than using artificial intelligence to analyze personal data.