For a number of years, industry thought leaders have been talking about the concept of the IT or software factory — or software industrialization — in which code is produced in an automated, building-block effect. It’s debatable as to whether enterprises have been able to fully transform into software factories, but the idea of moving away from hand-crafted, one-at-a-time applications and services to something more scalable has merit.
Now, there’s talk of producing an “AI factory.” This involves “industrializing data gathering, analytics, and decision making to reinvent the core of the modern firm,” write Marco Iansiti and Karim Lakhani, both Harvard University professors, in their new book, Competing in the Age of AI. “The AI factory is the scalable decision engine that powers the digital operating model of the 21st century firm,” they state. “Managerial decisions are increasingly embedded in software.”
The good news is “you don’t have to be a Netflix to build an AI factory,” they write. Here are the four essential components that go into building and operating an AI factory:
Shore up the data pipeline: This is the process of gathering, inputting, cleaning, integrating, processing, and safeguarding data “in a systematic, sustainable, and scalable way.” Iansiti and Lakhani point to Netflix as a company that has “datafied” its business, “systematically extracting data from activities and transactions that are naturally ongoing in any business.” Cleaning and integrating data can be a major challenge, they caution. The first order of business in building an AI factory, they state, is investing in a well-functioning data pipeline.
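The gather-clean-integrate flow the authors describe can be sketched in a few lines of Python. The field names, CSV input, and helper functions below are illustrative assumptions, not a design from the book; a production pipeline would add validation, monitoring, and storage layers.

```python
import csv
import io

def gather(raw_csv: str) -> list:
    """Ingest raw transaction records (here, from a CSV string)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def clean(rows: list) -> list:
    """Drop incomplete records and normalize field types."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id") or not row.get("amount"):
            continue  # discard records missing required fields
        cleaned.append({"customer_id": row["customer_id"],
                        "amount": float(row["amount"])})
    return cleaned

def integrate(rows: list) -> dict:
    """Aggregate cleaned records into per-customer totals."""
    totals = {}
    for row in rows:
        key = row["customer_id"]
        totals[key] = totals.get(key, 0.0) + row["amount"]
    return totals

# Hypothetical raw feed: one record is missing a customer_id and gets cleaned out.
raw = "customer_id,amount\nc1,10.0\nc1,5.5\n,3.0\nc2,7.25\n"
totals = integrate(clean(gather(raw)))
print(totals)  # {'c1': 15.5, 'c2': 7.25}
```

The point of the chain is that each stage has one responsibility, so the pipeline stays “systematic, sustainable, and scalable” as new sources are added.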
Develop algorithms: This is the process of developing predictive capabilities. Algorithms “can be used for a variety of applications, from generating relatively simple predictions like a sales forecast to suggesting stocks to pick for high-frequency trading, to complex image recognition and language translation tasks,” Iansiti and Lakhani explain.
Most AI systems develop accurate predictions using one of three general machine learning approaches: supervised learning, unsupervised learning, and reinforcement learning. Supervised learning seeks to “come as close as possible to a human expert in predicting an outcome,” based on labeled datasets. Unsupervised learning algorithms “aim to find natural groupings in the data, without labels, and uncover structures that may not be obvious to the observer.” The most advanced form of machine learning, reinforcement learning “requires only a starting point and a performance function,” the co-authors state.
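The contrast between the first two approaches can be made concrete with two toy examples, a 1-nearest-neighbor classifier (supervised: it learns from labeled examples) and a tiny 1-D two-means clustering (unsupervised: it finds groupings without labels). Both are deliberately minimal illustrations, not the methods the authors discuss.

```python
# Supervised: predict a label for a new point from labeled training examples.
def predict_label(labeled, x):
    """Return the label of the training point nearest to x."""
    return min(labeled, key=lambda pair: abs(pair[0] - x))[1]

labeled = [(1.0, "low"), (2.0, "low"), (9.0, "high"), (10.0, "high")]
print(predict_label(labeled, 8.5))  # high

# Unsupervised: find two natural groupings in unlabeled 1-D data (tiny k-means).
def two_means(points, iters=10):
    """Partition points into two clusters around iteratively refined centroids."""
    c1, c2 = min(points), max(points)  # initialize centroids at the extremes
    for _ in range(iters):
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        c1, c2 = sum(g1) / len(g1), sum(g2) / len(g2)
    return [sorted(g1), sorted(g2)]

print(two_means([1.0, 2.0, 9.0, 10.0]))  # [[1.0, 2.0], [9.0, 10.0]]
```

The classifier needs the `"low"`/`"high"` labels up front; the clustering recovers the same two groups from the raw numbers alone, which is exactly the labeled-versus-unlabeled distinction the authors draw.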
Add a robust experimentation platform: An experimentation platform is the mechanism “through which hypotheses regarding new prediction and decision algorithms are tested to ensure that changes suggested are having an intended effect.” This is essential to the AI factory, Iansiti and Lakhani state, and requires a state-of-the-art platform — “traditional, ad-hoc approaches to experimentation simply cannot handle the impact of what is required.”
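The statistical core of such hypothesis testing is standard; a two-proportion z-test for an A/B experiment is one common building block. The conversion counts below are made-up illustration values, and a real experimentation platform wraps this in assignment, logging, and guardrail metrics.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic and two-sided p-value for a difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 120/1000, variant 150/1000.
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(f"z={z:.2f}, p={p:.4f}")  # ship the variant only if p is below the chosen threshold
```

Running many such tests concurrently, without interference between them, is what pushes companies past “traditional, ad-hoc approaches” toward a dedicated platform.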
Modernize the software infrastructure: This is the collection of systems that embed the pipeline into consistent, componentized applications or services that are made available to end-users. “After the data is aggregated, cleaned, refined, and processed, it is made available through consistent interfaces such as APIs, allowing applications to rapidly subscribe, sample what they need, test, and deploy,” the co-authors explain. “All of this lets an agile development team build a new application in weeks, sometimes days.”
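The subscribe-and-sample pattern the authors mention can be sketched as a small in-process facade over processed datasets. The class and method names here are assumptions for illustration; in practice this role is played by a networked API gateway or data platform.

```python
import random

class DataAPI:
    """A consistent interface over processed datasets: subscribe and sample."""

    def __init__(self, datasets):
        self._datasets = datasets  # dataset name -> list of records
        self._subscribers = {name: [] for name in datasets}

    def subscribe(self, dataset, callback):
        """Register a callback invoked whenever the dataset gains a record."""
        self._subscribers[dataset].append(callback)

    def publish(self, dataset, record):
        """Append a new record and notify every subscriber."""
        self._datasets[dataset].append(record)
        for cb in self._subscribers[dataset]:
            cb(record)

    def sample(self, dataset, k, seed=0):
        """Return a reproducible random sample of k records."""
        return random.Random(seed).sample(self._datasets[dataset], k)

# Hypothetical usage: an application subscribes to click data and samples it.
api = DataAPI({"clicks": [{"user": "u1"}, {"user": "u2"}]})
seen = []
api.subscribe("clicks", seen.append)
api.publish("clicks", {"user": "u3"})
print(len(seen), len(api.sample("clicks", 2)))  # 1 2
```

Because every application goes through the same interface, a new consumer can be wired up quickly, which is the “weeks, sometimes days” effect the authors describe.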
The authors provide additional structural advice, emphasizing that well-designed APIs are a key ingredient to AI factories. “APIs throttle the flow of data in and out of software factory systems,” they state. “APIs control access to some of the most critical and private assets within the organization.” Ultimately, they continue, “the data, software and connectivity underlying an AI factory must reside in a secure, robust, and scalable computing infrastructure, increasingly on the cloud, scalable on demand, and built using standard off-the-shelf components and open-source software.”
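One way to read the authors’ point that APIs “throttle the flow of data” and “control access” is as rate limiting at the API boundary. A token bucket is a common mechanism for this; the sketch below is a minimal single-threaded illustration, not anything from the book.

```python
import time

class TokenBucket:
    """Token-bucket throttle: at most `rate` requests/sec, bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate            # tokens refilled per second
        self.capacity = capacity    # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self):
        """Spend one token if available; refill based on elapsed time."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5.0, capacity=2)
print([bucket.allow() for _ in range(4)])  # [True, True, False, False] in a tight loop
```

A per-client bucket in front of each endpoint is one concrete way an API layer protects “the most critical and private assets within the organization” from abusive or runaway consumers.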