IBM is taking on the likes of Microsoft, AWS, and Google by introducing Watsonx, a new generative AI platform, which will help enterprises design and tune large language models (LLMs) for their operational and business requirements.

Watsonx comes with a suite of tools for tuning LLMs, a data store built on lakehouse architecture, and an AI governance toolkit, the company said.

Watson AI is IBM’s artificial intelligence engine, which the company built on a combination of machine learning algorithms and techniques such as question analysis, natural language processing, feature engineering, and ontology analysis. Watsonx can be seen as the evolution of Watson AI.

With the Watsonx platform, the company said it is aiming to meet enterprises’ requirements in five areas: interacting and conversing with customers and employees, automating business workflows and internal processes, automating IT processes, protecting against threats, and tackling sustainability goals.

As part of the generative AI platform, IBM will offer a development studio for AI builders to train, test, tune, and deploy traditional machine learning and new generative AI capabilities with the help of built-in foundation models, the company said.

The AI studio, dubbed Watsonx.ai, will come with a foundation model library and necessary tools for data preparation, model development, and model monitoring, it added.

Components of Watsonx.ai include a Tuning Studio, a Prompt Lab, the foundation model library, developer libraries, and APIs.

“Foundation models make deploying AI significantly more scalable, affordable, and efficient. With IBM Watsonx, clients can quickly train and deploy custom AI capabilities across their entire business, all while retaining full control of their data,” IBM CEO Arvind Krishna said in a statement.

The current foundation models included in the library have been trained to understand not only natural language but also code, time-series data, tabular data, geospatial data, and IT event data, IBM said, adding that the initial set of foundation models will be made available in a beta tech preview to select clients.

Some of the foundation models in Watsonx.ai’s library are fm.code, fm.NLP, and fm.geospatial.

While fm.code can be used to train models that generate code for developers from natural language prompts, fm.geospatial, built on climate and remote-sensing data, can be used to predict weather and climate conditions.

The fm.geospatial model was built by IBM Research in collaboration with NASA, the company said. The AI studio’s components, such as the Prompt Lab, currently support tuning of text and code foundation models only.

Under fm.NLP, the company is offering a collection of LLMs that can be customized with client data for better natural language understanding, since language and expressions can vary from one industry to another.

IBM has also partnered with Hugging Face to provide datasets and models built on Hugging Face’s open source libraries within Watsonx.ai.
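For context on what that partnership exposes to developers, Hugging Face’s open source transformers library loads and runs models along the following lines. This is a minimal, generic sketch of the library on its own, not of the Watsonx.ai integration, and the model named below is only an illustrative choice.

```python
# Minimal sketch of Hugging Face's open source transformers library on its own,
# not of the Watsonx.ai integration; "distilgpt2" is just an example model.
from transformers import pipeline

# Download a small text-generation model from the Hugging Face Hub and
# wrap it in a ready-to-use inference pipeline.
generator = pipeline("text-generation", model="distilgpt2")

# Generate a short completion for a natural language prompt.
output = generator("Enterprise AI governance means", max_new_tokens=30)
print(output[0]["generated_text"])
```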

Watsonx.ai, which will initially be offered as SaaS, is expected to be released in July this year, the company said.

Along with Watsonx.ai, IBM is introducing a data store for AI workloads built on an open lakehouse architecture.

The data store, dubbed Watsonx.data, will support open data formats and give enterprises additional capabilities such as data querying and data governance.

It can reduce data warehousing costs by 50%, IBM said, adding that the data store offers integrations with an enterprise’s existing databases.

Watsonx.data, which is expected to be generally available in July 2023, supports both on-premises and multicloud environments, the company said.

Big Blue is also adding an AI governance toolkit inside Watsonx, dubbed Watsonx.governance, to enable enterprises to build trusted workflows.

Watsonx.governance, which is expected to be generally available later this year, operationalizes governance to help mitigate the risk, time, and cost associated with manual processes, the company said, adding that it provides the documentation necessary to drive transparent and explainable outcomes.

IBM’s move to add generative AI-building capabilities comes at a time when rival vendors such as Microsoft, AWS, and Google have already announced similar services.

In March, Microsoft released Azure OpenAI APIs with support for prompt engineering. The following month, AWS released a new service, dubbed Amazon Bedrock, that provides multiple foundation models designed to allow companies to customize and create their own generative AI applications, including programs for general commercial use.

IBM’s efforts to cash in on the generative AI wave with Watsonx might not move the needle too much, according to Andy Thurai, principal analyst at Constellation Research.

“Among the Watsonx announcements, AI studio and data store won't move the needle much. They are catching up, if that, with other vendors who have been offering better AI data stores and data lakes for many years,” Thurai said.

However, Thurai seemed optimistic about the AI governance toolkit that IBM is offering as part of the platform.

“The toolkit allows companies to build trusted AI workflows and help with building explainable and transparent AI workflows. Though it is not available currently, slated to be released late this year, this can be differentiation in the crowded AI market,” the principal analyst said.

IBM is also planning to infuse Watsonx.ai foundation models throughout its major software products going forward, including the Watson Code Assistant.

Watson Code Assistant, which is expected to be made available later this year, will allow developers to generate code from natural language prompts.

“Currently, Watson Code Assistant is focused on increasing developer productivity for IT automation with Red Hat Ansible. We anticipate expanding to other domains in the future,” an IBM spokesperson said.

“Other features such as content discovery, code optimization, and explanation of code are all part of IBM’s vision for Watson Code Assistant as the product expands its capabilities,” the spokesperson added.

In contrast, rival products such as Amazon CodeWhisperer, Google’s Bard AI, and Microsoft-owned GitHub Copilot already offer such capabilities and are trained on programming languages other than Python.

In addition to the Code Assistant, IBM will offer an AIOps Insights tool for visibility into IT operations.

Watsonx.ai will also be integrated into IBM’s digital labor products, the company said, adding that a new generative AI-powered suite for managing environmental goals will be offered separately later this year.

IBM’s consulting unit, IBM Consulting, has announced a Center of Excellence for generative AI, staffed with over 1,000 generative AI experts.

With the help of the new center, IBM plans to build a Watsonx-focused practice, which will actively build and deploy Watsonx for clients, the company said.

The new practice around Watsonx can help IBM generate revenue, according to Thurai.

“This practice, with over 1000 experts, who are focused on WatsonX practices, can be developer advocates in creating traction for the technology with the customers. This is one of the areas where most of the other AI companies struggle — to create an army of practitioners to help customers,” Thurai said.
