
How to Lead Your Company With AI-Packaged Solutions

Business leaders weighing how to adopt artificial intelligence (AI) must ultimately choose among buying, building, or partnering for AI packaged solutions, according to Beena Ammanath, an executive director at Deloitte. Which one should they choose right now?

Buying pre-packaged solutions, for one, presents an efficient starting point for constructing an organization’s AI ecosystem. This is because AI platforms focused on back-office functions and automated customer engagement have reached advanced stages of maturity, potentially eliminating the need for reinvention. 

On the other hand, building AI capabilities in-house transforms a business in profound ways beyond just implementing applications. As time progresses, an organization can upskill its workforce, make technology investments, and transition to more innovative use cases. Organizations with seasoned AI practitioners are more likely to develop their AI solutions in-house.

AI partnerships 

AI partnerships between business units and data scientists are imperative, though. In such a scenario, business units identify improvement opportunities, while data scientists devise feasible models and solutions. With the right technologies, processes, governance, and talent, they can craft applications that confer distinctive advantages.

Partnering with other innovative entities can also expedite access to AI capabilities and applications. 

This partnership can take various forms, such as collaborating with service vendors, product companies, or startups that specialize in the desired use case or offer AI-as-a-Service. 

Partnerships can also involve investments in promising startups or collaboration with academic institutions engaged in AI research and development. The enterprise’s AI strategy, goals, and alignment with the marketplace should guide the exploration of potential partnerships.

But the question now is which type of AI system to implement: should you buy, build, or collaborate?

AI’s complex amalgamation 

AI is a complex amalgamation of hardware, software, data, enabling technologies, and highly skilled human expertise. The presence and accessibility of these components guide the most strategic approach to AI adoption for a particular business. Navigating this decision demands leadership, as it impacts multiple facets of the organization. 

Discrepancies may arise among stakeholders’ perspectives on the organization’s AI strategy, with technologists favoring in-house development, marketers seeking easily deployable platforms, and CFOs and global mobility companies opting for AI-as-a-Service to optimize financial outcomes and hiring efficiency, respectively.

Effective leadership is required to harmonize all these inclinations. A cohesive strategy and governance structure, orchestrated within the C-suite, are necessary for all AI stakeholders to operate in unison. When building a robust AI ecosystem, the enterprise must consider the circumstances under which buying, building, or partnering is most suitable.

This brings us to the low-hanging fruit for company leaders to tap into: generative AI (GenAI).

Generative AI use case from PwC

Adopting the “AI factory” approach that PwC is implementing could enable you to deploy a single GenAI model swiftly across multiple functions and tasks. This could lead to a substantial boost in productivity and efficiency right away, while also paving the way for innovative business models in the near future.

With a significant three-year, $1 billion investment, PwC is rolling out GenAI across its organization and assisting clients in doing the same. It believes GenAI offers extensive benefits throughout businesses, characterized by high levels of repeatability. Its implementation is most effective when viewed as an enterprise-wide capability. 

Here are key considerations for establishing your own AI factory based on the PwC approach:

Embrace GenAI boldly

Exploit the potential of a single, replicable “pattern” of generative AI training and deployment that can be applied across your value chain. GenAI’s pre-trained models can be quickly adapted and customized, allowing for rapid proof of concept or even direct piloting. A 90-day “sprint” is often sufficient to launch initial use cases and prepare for scaling.

Develop a roadmap

Bring together technology and business leaders to identify use cases, assess their value, and seek replicable patterns that deliver value across the organization. It’s worth noting that a moderately valuable use case with repeatability may yield a higher ROI than a high-value, one-time use case.
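As a rough back-of-the-envelope illustration of that point (the figures below are purely hypothetical assumptions, not PwC data), compare a modest use case reused across many business units with a single high-value, one-off build:

```python
# Hypothetical numbers: a modest use case reused across 12 business units
# versus a single high-value, one-time use case.
repeatable_value_per_unit = 150_000   # value created per deployment
deployments = 12                      # same pattern reused across units
repeatable_cost = 400_000             # build once, cheap to replicate

one_time_value = 1_000_000            # a single flagship use case
one_time_cost = 700_000               # bespoke build, no reuse

roi_repeatable = (repeatable_value_per_unit * deployments - repeatable_cost) / repeatable_cost
roi_one_time = (one_time_value - one_time_cost) / one_time_cost

print(f"Repeatable use case ROI: {roi_repeatable:.0%}")  # 350%
print(f"One-time use case ROI:   {roi_one_time:.0%}")    # 43%
```

Under these assumed figures, the repeatable pattern wins comfortably even though each individual deployment is worth far less than the flagship project.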

Identify gaps in data, technology, and skills, and create a plan to close them, with costs and timeframes. Common gaps include insufficient or disorganized data sets and shortfalls in cloud-based computing power, cloud engineering capabilities, specialized data science skills, and a comprehensive understanding of responsible AI practices.

Your roadmap should incorporate a “trust-by-design” approach, embedding responsible AI principles into technology and processes from the outset.

Choose your model and prepare data 

Various specialized companies, including major cloud platforms such as AWS, Google, and Microsoft, as well as OpenAI, offer generative AI “foundation models” pre-trained on extensive public data. After selecting a model, collaborate with your cloud provider to establish a private version of the model within your firewall. A private foundation model can be customized with your data, context, intellectual property, and expertise, while maintaining security.
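As a minimal sketch of what querying such a private model might look like, assuming your provider exposes it behind an internal HTTPS endpoint (the URL, payload fields, environment variable, and helper function below are hypothetical placeholders, not any vendor’s actual API):

```python
import os
import requests

# Hypothetical internal endpoint for a privately hosted foundation model;
# the actual URL and request schema depend on your cloud provider.
PRIVATE_MODEL_URL = "https://genai.internal.example.com/v1/generate"

def ask_private_model(prompt: str, context: str = "") -> str:
    """Send a prompt (plus optional company context/IP) to the private model."""
    response = requests.post(
        PRIVATE_MODEL_URL,
        headers={"Authorization": f"Bearer {os.environ['INTERNAL_MODEL_TOKEN']}"},
        json={"prompt": prompt, "context": context, "max_tokens": 512},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]

# Because the model sits inside the firewall, proprietary context can be included.
print(ask_private_model(
    prompt="Summarize the key risks in this supplier contract.",
    context="<excerpt of internal contract text>",
))
```

The design point is simply that requests never leave your environment, so internal documents and IP can flow into prompts without exposing them to a public service.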

For instance, experienced specialists can embed your data and IP into a GenAI model, making that expertise accessible throughout your organization. As you upgrade internal technology and cloud capabilities for GenAI, pay particular attention to data.

Although generative AI can make sense of unstructured data, some cleansing and organization may still be necessary to mitigate bias risks. Creating data pipelines with new APIs ensures continuous access to up-to-date data. Enhancing data governance and cybersecurity is essential to manage potential risks associated with GenAI.
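One way to picture such a pipeline, as a sketch under assumed internal endpoints (the records API, cleansing rule, and embedding service below are placeholders you would swap for your own stack):

```python
import requests

# Hypothetical internal records API and embedding service; replace with your own.
RECORDS_API = "https://data.internal.example.com/v1/records?updated_since=2024-01-01"
EMBEDDING_API = "https://genai.internal.example.com/v1/embeddings"

def fetch_fresh_records() -> list[dict]:
    """Pull up-to-date records from an internal API so the model never works from stale data."""
    resp = requests.get(RECORDS_API, timeout=30)
    resp.raise_for_status()
    return resp.json()["records"]

def cleanse(record: dict) -> str | None:
    """Light cleansing: drop empty or near-empty text before it reaches the model."""
    text = (record.get("text") or "").strip()
    return text if len(text) > 20 else None

def embed(texts: list[str]) -> list[list[float]]:
    """Turn cleansed text into embeddings for retrieval by the private model."""
    resp = requests.post(EMBEDDING_API, json={"inputs": texts}, timeout=60)
    resp.raise_for_status()
    return resp.json()["embeddings"]

texts = [t for r in fetch_fresh_records() if (t := cleanse(r))]
vectors = embed(texts)  # store alongside the texts in your vector index of choice
```

Running a job like this on a schedule is what keeps the model’s context current, rather than a one-time data dump.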

Establish an operational model based on pods

To ensure swift, repeatable deployment and execution, constructing an AI factory is key. It is an operational model based on pods that identify and assess use cases within a specific domain or line of business, adapting the foundation model within each pod to deliver value.

Each pod comprises six roles (see the sketch after this list):

  • Pod Leader: Oversees business objectives and value assessment for potential use cases.
  • Business Analyst: Defines use case objectives, gathers requirements, and monitors deliverables.
  • Prompt Engineer: Designs prompts to customize model outputs, collaborates with others to validate results.
  • Model Mechanic: Customizes the AI model to enhance results, assumes technical responsibility after deployment.
  • Data Engineer: Accesses, prepares, and organizes data and embeddings for the AI model.
  • Data Scientist: Collaborates with the prompt engineer and model mechanic to maximize accuracy and performance.
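
To make the pod model concrete, here is a minimal sketch of how a pod, its six roles, and its use-case backlog could be represented in code; the fields and scoring are illustrative assumptions rather than PwC’s actual tooling:

```python
from dataclasses import dataclass, field

# Role names mirror the six pod roles listed above.
POD_ROLES = [
    "Pod Leader", "Business Analyst", "Prompt Engineer",
    "Model Mechanic", "Data Engineer", "Data Scientist",
]

@dataclass
class UseCase:
    name: str
    est_value: float        # estimated annual value of one deployment
    repeat_count: int = 1   # how many times the same pattern can be reused

    def total_value(self) -> float:
        return self.est_value * self.repeat_count

@dataclass
class Pod:
    domain: str                                              # line of business the pod serves
    staffing: dict[str, str] = field(default_factory=dict)   # role -> person
    backlog: list[UseCase] = field(default_factory=list)

    def prioritized_backlog(self) -> list[UseCase]:
        """Rank use cases by total value, which naturally favors repeatable patterns."""
        return sorted(self.backlog, key=lambda uc: uc.total_value(), reverse=True)

# Example: a finance pod weighing a repeatable summarizer against a one-off report.
pod = Pod(domain="Finance", staffing={role: "TBD" for role in POD_ROLES})
pod.backlog = [
    UseCase("Contract summarizer", est_value=150_000, repeat_count=12),
    UseCase("One-off audit report", est_value=1_000_000),
]
print([uc.name for uc in pod.prioritized_backlog()])
```

Even a lightweight structure like this makes it easier to compare pods and keep the factory’s focus on patterns that can be reused across the business.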

Creating a GenAI factory might necessitate hiring specialists and upskilling existing business and tech roles. GenAI’s scalability can alleviate workforce demands, as technology specialists can often manage multiple pods concurrently. 

Pre-built toolkits with ready-to-use software code and prompts can accelerate deployment and reduce GenAI costs for personalized customer experiences, content creation, research, software delivery, support services, report generation, deep retrieval, smart summaries, and Q&A engines. (Dennis Clemente)