Chakri Gottemukkara, co-founder and CEO of o9, began his entrepreneurial journey 10 years ago when he and several colleagues launched a supply chain management technology platform. The company now serves 29 industries and has grown to 3,000 employees. The unicorn, currently valued at around $4 billion after a $116 million infusion last year, remains majority-owned by its two promoters. Mr. Gottemukkara, who was in Hyderabad for the inauguration of o9’s facility, spoke about the challenges businesses face in delivering algorithm-based solutions amid complex macroeconomic and geopolitical conditions. Excerpts:
What challenges do you see across the supply chain, and how can technology address them, especially in the post-pandemic environment with its macroeconomic and geopolitical issues?
There are two main drivers of value leakage in large enterprises: volatility and complexity. Volatility refers to the unpredictability of supply and demand caused by supply chain disruptions, business model changes, and geopolitical events. Complexity arises from the large number of products, markets, and intricate supply chains. The combination of volatility and complexity demands intelligent solutions and rapid decision-making. We address these challenges by constantly evolving our software and technology to model complex scenarios and automate decision-making. For example, our platform enables hyper-automation of daily decisions such as production, shipping, and customer order commitments. It acts as the digital brain that unifies all decisions for a company, powering intelligent, connected decision-making. Technology enables faster scenario planning and a faster response to the volatile and complex realities of today’s business environment.
If you are already using algorithms, how will LLMs change the way you work?
The main business problem we are trying to solve is simplifying implementation. Currently, configuring solutions for different industries requires a great deal of knowledge and time. With LLMs, we can digitize this knowledge and create AI-powered agents that assist organizations with planning and consulting. For example, a planner can ask a question in natural English, and the AI-powered system will understand the request and run the necessary algorithms to provide an answer. This reduces dependence on expert knowledge and speeds up decision-making.
However, there are hallucination and security challenges when using LLMs. How do you deal with these?
We address the hallucination problem by not using the LLM to answer the entire question. Instead, we harness its power to connect the dots and transform questions into structured queries that the system can answer. When it comes to security, we leverage existing frameworks that ensure only authorized personnel can access data. These frameworks apply even when queries are made in natural English. Essentially, we leverage the language-processing capabilities of LLMs while ensuring security and reliable computation through the system.
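The pattern described above can be sketched in a few lines. This is a hedged illustration, not o9's actual API: all names (`StructuredQuery`, `llm_translate`, `authorized`) are hypothetical stand-ins. The key points are that the LLM only translates free text into a structured query, the deterministic planning system computes the answer (avoiding hallucinated numbers), and the same access-control check gates the request as on any other path.

```python
# Hedged sketch of LLM-as-query-translator with access control.
# All names here are hypothetical, not o9's implementation.
from dataclasses import dataclass

@dataclass
class StructuredQuery:
    metric: str   # e.g. "forecast_accuracy"
    region: str   # e.g. "EMEA"

def llm_translate(question: str) -> StructuredQuery:
    """Stand-in for an LLM call that maps free text to a structured query."""
    # A real implementation would prompt an LLM; here one mapping is hard-coded.
    if "forecast" in question.lower() and "emea" in question.lower():
        return StructuredQuery(metric="forecast_accuracy", region="EMEA")
    raise ValueError("could not parse question")

def authorized(user_roles: set, query: StructuredQuery) -> bool:
    """Existing security framework: region-scoped access control."""
    return f"planner:{query.region}" in user_roles

def answer(question: str, user_roles: set) -> str:
    query = llm_translate(question)        # LLM connects the dots
    if not authorized(user_roles, query):  # same rule as any other access path
        return "access denied"
    # Deterministic computation by the planning system, not the LLM.
    data = {("forecast_accuracy", "EMEA"): 0.87}
    return f"{query.metric} in {query.region}: {data[(query.metric, query.region)]:.0%}"
```

Because the LLM never produces the final number itself, a hallucinated answer cannot reach the user, and an unauthorized query fails at the same checkpoint as any direct data access would.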
How do you train your primarily technical workforce to ask the right questions to leverage the power of LLM in business applications?
The key is to transform tribal knowledge into digital knowledge in the enterprise. Tribal knowledge, knowledge that resides in people’s heads, creates silos, prevents learning from past experiences, and dissipates knowledge when employees leave. Our platform, Digital Brain, enables the digitization of this tribal knowledge. This means every prediction, every decision, every outcome is stored and learned from, enabling fact-based decision-making and continuous improvement. This digital brain can also identify patterns and biases in individual decision-making styles, leading to more reliable and consistent decision-making across the organization. We are essentially bridging the gap between human thought processes and systems understanding by transforming tribal knowledge into a dynamic, evolving, and accessible digital knowledge base.
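The idea of storing every prediction, decision, and outcome so the system can learn from them might look like the following. This is an illustrative sketch under assumed names (`DecisionRecord`, `DecisionLog`, `bias`), not o9's Digital Brain itself: a log of decisions makes it possible to measure, for example, whether a planner systematically over-forecasts.

```python
# Illustrative sketch (hypothetical names) of digitizing tribal knowledge:
# each prediction, decision, and outcome is recorded so future decisions
# can be checked against history rather than living in someone's head.
from dataclasses import dataclass, field

@dataclass
class DecisionRecord:
    planner: str
    prediction: float  # e.g. forecast units
    decision: str      # e.g. "expedite shipment"
    outcome: float     # actual units

@dataclass
class DecisionLog:
    records: list = field(default_factory=list)

    def add(self, rec: DecisionRecord) -> None:
        self.records.append(rec)

    def bias(self, planner: str) -> float:
        """Mean (prediction - outcome) for one planner: a persistently
        positive value suggests systematic over-forecasting."""
        rs = [r for r in self.records if r.planner == planner]
        return sum(r.prediction - r.outcome for r in rs) / len(rs)
```

A simple per-planner bias statistic like this is one way patterns in individual decision-making styles become visible once the knowledge is digital rather than tribal.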
Are you building your own LLM or using a third-party LLM?
We employ a multi-model approach. We are creating an architecture that can use both public LLMs and private LLMs trained on domain-specific data. This allows us to leverage the power of large language models for general knowledge while using specialized models for complex, industry-specific tasks. We aim to develop a system that can intelligently route questions to the most appropriate LLM, ensuring efficiency, accuracy, and security.
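The routing idea above can be sketched minimally. The model names and the keyword-based rule here are illustrative assumptions (a production router would likely use a classifier or an LLM itself), but they show the shape of the decision: domain-specific questions go to a private, domain-tuned model, everything else to a general-purpose public model.

```python
# Minimal sketch of multi-model routing (names and rules are assumptions,
# not o9's implementation).
DOMAIN_TERMS = {"safety stock", "lead time", "s&op", "fill rate"}

def route(question: str) -> str:
    """Pick a model: domain-specific terms go to the private model."""
    q = question.lower()
    if any(term in q for term in DOMAIN_TERMS):
        return "private-supply-chain-model"  # trained on domain data
    return "public-general-model"            # broad general knowledge

route("How should we set safety stock for seasonal items?")
# → "private-supply-chain-model"
route("Summarize this meeting note.")
# → "public-general-model"
```

Routing at this layer also supports the security goal: sensitive, domain-specific questions never need to leave the private model.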
What are your growth plans for the next 1-2 years? Is an IPO part of the plan?
We are aiming for an IPO within the next two to three years. We have put in place the necessary processes, including ensuring predictability in our operations and implementing more rigorous reporting and auditing procedures. However, knowing both the advantages and disadvantages, we retain the flexibility to decide when to go public. Our long-term plan is to achieve 10x growth over the next 5-10 years. We believe this is achievable given the significant opportunities in the market.
Published January 20, 2025