
The AI complexity paradox: More productivity, more responsibilities



Does artificial intelligence (AI) make working life easier or more complicated? Experts suggest the answer depends on the context.

In a recent IDC-hosted interview, SAIC CEO Toni Townes-Whitley described AI as the ultimate weapon against system complexity, noting that her company is employing AI to reduce tech complexity in some of the most complex technology environments on the planet -- within the US Department of Defense.

Also: Amazon's Andy Jassy says AI will take some jobs but make others more 'interesting'

Her team has been able to cut the time required for mission planning and other operational functions at the department from "hours to minutes," she said. AI can have the same impact in commercial businesses, "reducing time and toil for business development, proposal development, searching, and creating new documents and content." On the developer side, AI has reduced the time spent on code generation.

These results are positive. However, other voices advise caution, as AI is just as capable of increasing complexity as reducing it. The impact depends on where and how AI is applied, and on whether the right kind of governance is in place.

Also: You've heard about AI killing jobs, but here are 15 new ones AI could create

"The integration of AI into our technological landscape brings with it a host of new complexities," said Supriya Bachal, program manager for R&D at Siemens.

"These complexities exist on multiple levels. Individual engineers and developers who are tasked with integrating AI into the tools we use must deal with new levels of complexity, as do the organizations that must manage these new AI systems."

Skill requirements may complicate the situation. While AI may reduce the need for headcount in many areas, particularly in coding and IT management, applying the technology requires expertise in AI-friendly programming languages and frameworks, machine learning, deep learning, natural language processing (NLP), analytics, math, statistics, algorithm design, and deductive reasoning.
