Over the past few weeks, we’ve been releasing details about our generative AI efforts, an initiative that is part of our larger proactive product strategy. All of Evalueserve’s AI efforts, from foundational tools to services and products, rely on our larger platform, AI for Research and Analytics (AIRA).
AIRA offers Evalueserve product and operations teams a foundational platform with a wide set of domain-specific, reusable AI components. These components interoperate via APIs, allowing teams to evolve modules proactively and release them once their outputs have been tested and verified.
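To make the idea of interoperable modules concrete, here is a minimal sketch of how such components might compose. The component names, interfaces, and pipeline below are hypothetical illustrations, not the actual AIRA APIs; real components would sit behind versioned service endpoints and call production models.

```python
# Illustrative sketch only: component names and interfaces are hypothetical,
# not the actual AIRA APIs.
from dataclasses import dataclass, field
from typing import Protocol


class Component(Protocol):
    """Minimal contract a reusable component might expose."""
    name: str

    def run(self, payload: dict) -> dict: ...


@dataclass
class EntityExtractor:
    """Hypothetical domain component: tags entities in raw text."""
    name: str = "entity-extractor"

    def run(self, payload: dict) -> dict:
        text = payload.get("text", "")
        # Placeholder logic; a real component would call a model endpoint.
        payload["entities"] = [word for word in text.split() if word.istitle()]
        return payload


@dataclass
class Summarizer:
    """Hypothetical domain component: produces a short summary."""
    name: str = "summarizer"

    def run(self, payload: dict) -> dict:
        payload["summary"] = payload.get("text", "")[:120]
        return payload


@dataclass
class Pipeline:
    """Composes independently versioned components into one product workflow."""
    components: list = field(default_factory=list)

    def run(self, payload: dict) -> dict:
        for component in self.components:
            payload = component.run(payload)
        return payload


if __name__ == "__main__":
    pipeline = Pipeline(components=[EntityExtractor(), Summarizer()])
    result = pipeline.run({"text": "Evalueserve released Research Bot last week."})
    print(result["entities"], result["summary"], sep="\n")
```

Because each module honors the same contract, a team can swap in an improved version of one component without touching the products built on top of it.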
While the company has been using generative AI for years, AIRA’s modular architecture, built on our Knowledge Solutions Framework, has allowed us to create new products and augment existing ones quickly. For example, just last week we released Research Bot, and last year we added generative AI to established products like Insightsfirst, with several successes already proven.
A Journey to Rapid Evolution
In the early days, Evalueserve operated as a “mind+machine” company that invested in technology primarily as the needs of our customers grew. Our approach to product development was reactive, with our team building applications and solutions in response to common issues faced by clients. In essence, product strategy was driven by customer demands and the necessity to solve recurring problems.
Before 2019, when I joined the company as CTO, we would experiment, roll out a product to a few clients, and then refine it based on their input. For example, the first version of our Researchstream application was only available to two large financial firms. After receiving their feedback, we made improvements and launched version 2 for a slightly larger client base.
In addition to creating reactive products, we developed bespoke applications for specific client needs. We considered the potential for reselling these custom solutions to other clients, but we didn’t always fully grasp the complexities involved. As a result, we sometimes created applications with intellectual property (IP) rights belonging to us or the client, without fully capitalizing on the potential for reuse.
When I joined the company, our vision and goals shifted. We aimed to transform our product strategy from reactive to proactive: instead of just responding to customer requests, we wanted to anticipate market demands, much like a typical product company. We set out to identify market needs and develop products that integrate with our existing service capabilities, an approach we call Product-Led Services, ultimately providing more value to our clients.
Embracing Reusability and Modularity in Product Development
One area where our company previously struggled was considering reusability when developing solutions. We often approached specific problems with one-off solutions, without evaluating whether the issue was widespread or whether the solution could be adapted for multiple clients.
To address adaptability, we shifted toward a reusable and modular approach to product development. Modular AI tools, micro-frontends, algorithms, and models form the foundational base of AIRA, creating a resource base for the larger enterprise. For example, when we recognized that risk model documentation was a common pain point for our financial clients, we developed Raptor, a product that can be easily configured for different clients and industry subsegments.
Another application of AIRA, MagnifAI, was launched in 2022. Co-developed by domain experts, data scientists, and engineers, the platform accelerates time to insights and actions for B2B customer and campaign analytics through:
- A collection of technology accelerators for top use cases
- Partner plug-ins
- Low-code modules that unify disjointed workflows for standard use cases out of the box, configurable to client environments within weeks
On the roadmap are modules for adjacent go-to-market use cases such as digital analytics, voice of customer (VoC), branding, pricing, and forecasting. MagnifAI integrates with its Usecasehub governance platform to enforce business and strategy best practices.
In any successful product ecosystem, there is typically some level of abstraction. Take Microsoft, for instance: Word, PowerPoint, and Excel all share common objects that can interoperate between products. This type of abstraction allows for the seamless integration of different components. And so it is with all of our offerings that reside on AIRA.
With our new product-led vision, we aimed to develop an enterprise component layer that could act as a platform-as-a-service for multiple products. This approach is essential for creating a cohesive product ecosystem that is both adaptable and efficient for our clients. By embracing reusability and modularity in our product development, we are better positioned to meet the diverse needs of our clients and provide value-added solutions.
Domain-Specific Knowledge Delivers Usability
Ultimately, what makes the AIRA platform work is our integration of domain-specific expertise, both on the data science level and on the client side. We are able to create products and tools that empower our clients to achieve real outcomes because people with strong expertise in their industries and professions guide our work.
When powerful tools are harnessed to resolve specific use cases for clients within their particular industries, we are able to create models that deliver results. Human guidance is absolutely necessary for success.
Domain experts develop AIRA-powered solutions that meet the growing need to combine various data types (internal, external, structured, unstructured), analyses (qualitative, quantitative), and views (strategy, marketing, etc.). For example, one client invested $2 million annually to integrate customer and market insights into a single platform. Such high-customization needs are better addressed by AI-enabled services than by traditional SaaS players.
Applying domain-specific approaches to our proactive embrace of generative AI has been essential. When you consider how powerful GPT-4 is as a large language model, its potential feels limitless. Having the discipline to apply it to specific needs and outcomes is another matter, however. It is within domain applications that the real impact is to be had; as they are, tools like ChatGPT and Bard are too unpredictable to deliver meaningful outcomes at an enterprise level.
The domain-specific approach to AI addresses some of the limitations of large generative AI models and is more relevant to clients’ needs. Large models like GPT-4 and PaLM are trained on terabytes of data, but the data inputs are static and cannot be controlled by the user. These models require further fine-tuning to answer prompts in a dependable and actionable manner for clients.
Evalueserve takes the extra step of training these foundation models on domain-specific data for high-ROI use cases. We specify the data inputs for a specific use case and integrate them with dynamic data sources so that model accuracy improves over time. Then we add proven generative AI tools to AIRA so they can be reused for similar functions.
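As a rough illustration of this grounding step, the sketch below shows one way a use case’s prompts might be assembled from curated, refreshable domain sources rather than from a model’s static training data. It is a simplification under stated assumptions, not Evalueserve’s actual pipeline: the document store, keyword-overlap scoring, and prompt template are hypothetical placeholders, and a production system would rely on embeddings, fine-tuned models, and live data connectors.

```python
# Illustrative sketch only: data sources, scoring, and the model call are
# hypothetical placeholders, not Evalueserve's implementation.
from dataclasses import dataclass


@dataclass
class DomainDocument:
    source: str   # e.g. an internal research note or a refreshed market feed
    text: str


def score(query: str, doc: DomainDocument) -> int:
    """Crude keyword-overlap relevance score; a real system would use embeddings."""
    query_terms = set(query.lower().split())
    doc_terms = set(doc.text.lower().split())
    return len(query_terms & doc_terms)


def build_grounded_prompt(query: str, corpus: list[DomainDocument], top_k: int = 2) -> str:
    """Attach the most relevant domain snippets so the model answers from
    controlled, refreshable inputs rather than its static training data."""
    ranked = sorted(corpus, key=lambda doc: score(query, doc), reverse=True)[:top_k]
    context = "\n".join(f"[{doc.source}] {doc.text}" for doc in ranked)
    return f"Use only the context below to answer.\n{context}\n\nQuestion: {query}"


if __name__ == "__main__":
    corpus = [
        DomainDocument("risk-note", "Model documentation must cover validation scope."),
        DomainDocument("market-feed", "Q2 demand for B2B analytics rose sharply."),
    ]
    prompt = build_grounded_prompt("What must model documentation cover?", corpus)
    print(prompt)  # In practice this prompt would go to a fine-tuned foundation model.
```

Because the corpus is supplied at query time, refreshing the underlying sources improves answers without retraining the model, which is what makes dynamic data integration attractive for this kind of use case.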
Domain specificity allows for a robust, proactive approach to adding generative AI capabilities, rather than simply integrating GPT and hoping it works out. When large global clients like ours consider AI partners, they need more than potential; they require AI models that meet their use-case requirements and achieve outcomes. AIRA provides the foundation to do just that.