Artificial Intelligence Will Not Replace Humans
October 23, 2024
…but manufacturers embracing artificial intelligence (AI) will replace those who don’t.
This column’s title and the words above form the tagline of a book I recently read: Artificial Intelligence Will Revolutionize Manufacturing, by Dr. Markus Guerster. The book focuses on understanding AI and on the strategic advantage of adopting advanced-manufacturing systems in which AI does not replace human expertise but augments it.
Today’s manufacturing plants blend manual, semi-automatic and fully automated processes, with tremendous amounts of data flowing through PLCs that use sensors and actuators to capture every detail of an operation. A portion of this data may be displayed on a control panel for the operator to use in decision-making; other data may be stored elsewhere for later analysis. The problem for manufacturers: Collecting and storing massive amounts of process data has become so easy and cost-effective that extracting and leveraging any actionable insight from it can prove all but impossible.
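To make the problem concrete, here is a minimal sketch of the kind of lightweight analysis that turns raw process data into an actionable signal: flagging readings that drift outside the recent statistical trend. The press-force values, window size and threshold here are hypothetical illustrations only; a production system would read live values from the plant’s PLCs or data historian rather than simulate them.

```python
from collections import deque
from random import gauss, seed
from statistics import mean, stdev

def flag_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) for readings that deviate more than
    `threshold` standard deviations from the trailing window."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        history.append(value)

# Hypothetical press-force readings; a real system would pull these
# from a PLC tag or plant historian rather than simulate them.
seed(42)
press_force = [100.0 + gauss(0, 0.5) for _ in range(200)]
press_force[150] = 108.0  # injected spike standing in for a tooling fault

for i, v in flag_anomalies(press_force):
    print(f"Reading {i}: {v:.1f} deviates from the recent trend")
```

A rolling window keeps a check like this responsive to gradual process drift while still catching sudden excursions; the point is that the data is already being collected, and what many plants lack is the layer that turns it into decisions.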
The emergence of AI has created a unique opportunity for industries to transform. Guerster asserts that the tools and systems necessary for this evolution are within reach. It’s an ideal time to transition from the conventional factory settings of today to the intelligent, data-driven factories of tomorrow.
The History of AI
AI is not a new phenomenon; its history arguably stretches back to the early 20th century.
In 1936, Alan Turing published a paper, On Computable Numbers, with an Application to the Entscheidungsproblem. The Entscheidungsproblem (literally, “decision problem”) originally was posed by German mathematician David Hilbert in 1928. Turing proved that a hypothetical computing device could perform any conceivable mathematical computation, provided that the computation could be represented as an algorithm. The device eventually became known as the Turing machine. During World War II, Turing helped the British government at Bletchley Park pioneer the technology used to decrypt Nazi Germany’s secret Enigma code.
In 1955, American computer scientist John McCarthy and three colleagues coined the term “artificial intelligence” in their proposal for the famous Dartmouth conference, held during the summer of 1956. That conference essentially birthed AI as a field.
Despite initial excitement, funding for AI dried up and interest from the private and public sectors waned. Research persisted, though often in academic settings away from the limelight. During these “quiet years,” many fundamental aspects of AI emerged, among them machine learning, neural networks and natural language processing.