IBM Launches AI Platform After Hiring Freeze On Jobs Replaceable By AI

International Business Machines Corp. (NYSE: IBM) unveiled watsonx, a new artificial intelligence and data platform designed to help businesses integrate AI into their operations, on Tuesday.

The debut of the new AI platform comes more than a decade after IBM’s Watson supercomputer gained fame for winning the game show Jeopardy. Watson could “learn” and process human language, according to IBM at the time. However, according to Reuters, Watson’s high cost at the time made it difficult for businesses to use.

A decade later, the sudden success of the chatbot ChatGPT has made AI adoption a priority at organizations, and IBM is seeking new business. This time, IBM CEO Arvind Krishna told Reuters ahead of the company’s annual Think conference that the decreased cost of building large language AI models means the prospects of success are strong.

“When something becomes 100 times cheaper, it really sets up an attraction that’s very, very different,” said Krishna. “The first barrier to create the model is high, but once you’ve done that, to adapt that model for a hundred or a thousand different tasks is very easy and can be done by a non-expert.”

IBM said the launch was driven by the difficulties that many businesses continue to face when integrating AI in the workplace. According to an IBM poll, 33% of corporate leaders see trust and transparency difficulties as impediments to AI adoption, while 42% cite privacy concerns, notably surrounding generative AI.

Watsonx promises to provide customers with the tools, infrastructure, and consulting resources they need to build their own AI models or to fine-tune and adapt existing AI models on their own data. Users may also validate, deploy, and monitor models through the platform, ostensibly streamlining these procedures in one place.

While the new product appears to be similar to Amazon’s SageMaker Studio, Google’s Vertex AI, and Azure’s AI Platform, IBM claims that Watsonx is the only AI tooling platform on the market that offers a range of pretrained, developed-for-the-enterprise models and “cost-effective infrastructure.”

IBM provides seven pretrained models to enterprises, some of which are open source. It’s also collaborating with the AI firm Hugging Face to integrate hundreds of models, datasets, and libraries produced by Hugging Face.

The three models IBM is showcasing at Think are fm.model.code, which generates code; fm.model.NLP, a collection of large language models; and fm.model.geospatial, which is built on NASA climate and remote-sensing data.

fm.model.code, like code-generation models such as GitHub’s Copilot, lets a user issue a command in plain language and then constructs the corresponding coding workflow. fm.model.NLP includes text-generating models for specialized and industry-relevant topics, such as organic chemistry. fm.model.geospatial predicts changes in natural-disaster patterns, biodiversity, land use, and other geophysical processes.

“We allow an enterprise to use their own code to adapt [these] models to how they want to run their playbooks and their code,” Krishna said. “It’s for use cases where people want to have their own private instance, whether on a public cloud or on their own premises.”

IBM also launched watsonx.data, a “fit-for-purpose” data store designed for both regulated data and AI workloads, under the watsonx brand umbrella. According to IBM, it enables users to access data from a single point of entry while utilizing query engines, as well as governance, automation, and integrations with an organization’s existing databases and tools.

Added to the slate is Watsonx.governance, a toolbox that provides measures to preserve client information, detect model bias and drift, and assist enterprises in meeting ethics standards.

Alongside the watsonx announcements, IBM highlighted a new GPU offering in IBM Cloud that is geared toward compute-intensive applications, notably training and serving AI models.

The IBM Cloud Carbon Calculator, an “AI-informed” dashboard that allows users to measure, track, manage, and report the carbon emissions generated by their cloud usage, was also demonstrated. According to IBM, it was created in partnership with Intel and is based on technology from IBM’s research group. It can help visualize greenhouse gas emissions across workloads, down to the cloud-service level.

According to the firm, AI will add $16 trillion to the global economy by 2030, and 30% of back-office jobs will be automated over the next five years.

Back-office jobs

The launch comes on the heels of IBM’s move to pause hiring for roles it thinks could be replaced with artificial intelligence in the coming years.

Krishna said in an interview that hiring in back-office functions such as human resources will be halted or slowed. According to Krishna, these non-customer-facing roles employ over 26,000 people.

“I could easily see 30% of that getting replaced by AI and automation over a five-year period,” the chief executive said.

The firm, however, asserts that adapting to AI is central to its strategy, especially for the employees whose roles are deemed replaceable by the evolving technology.

“AI may not replace managers, but the managers that use AI will replace the managers that do not,” Rob Thomas, chief commercial officer at IBM, said in a roundtable with reporters. “It really does change how people work.”

IBM SVP Dario Gil furthered the argument by saying that a firm like IBM still needs “a very large organization and team to be able to bring [AI] innovation in a way that enterprises can consume.”

“That is a key element of the horizontal capability that IBM is bringing to the table,” he added.

IBM, based in New York, exceeded earnings projections in the most recent quarter due to cost-cutting measures, including previously announced job cutbacks. New productivity and efficiency measures are expected to save $2 billion per year by the end of 2024, according to Chief Financial Officer James Kavanaugh on earnings day.

AI fast-food ordering

Joining IBM’s foray into AI, fast food chain Wendy’s is automating its drive-through service with an artificial-intelligence chatbot powered by Google’s natural-language engine and trained to recognize the plethora of ways people order from the menu.

Wendy’s chatbot will be officially launched in June at a company-owned restaurant in Columbus, Ohio. According to CEO Todd Penegor, the idea is to shorten the ordering process and prevent excessive queues in drive-through lanes from turning consumers away.

Wendy’s did not disclose the cost of the program, only that it has been in collaboration with Google in areas such as data analytics, machine learning, and cloud tools since 2021.

“It will be very conversational,” Penegor said. “You won’t know you’re talking to anybody but an employee.”

Wendy’s software engineers have collaborated with Google to develop and fine-tune a generative AI application on top of Google’s own large language model, or LLM, a massive model trained on words, phrases, and popular expressions in various dialects and accents and designed to recognize and mimic the syntax and semantics of human speech.

The customized language model includes terminology, phrases, and acronyms that customers have grown to use while ordering burgers, fries, and other items, such as “JBC” for junior bacon cheeseburger or “biggie bags” for various combinations of burgers, chicken nuggets, and soft drinks. To complicate matters further, Wendy’s milkshakes are known as Frosties, though customers may not always use the marketed name.
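The kind of slang-to-menu mapping described above can be illustrated with a minimal sketch. This is purely hypothetical code, not Wendy’s or Google’s actual implementation; the alias table and function name are invented for illustration.

```python
# Hypothetical sketch: normalize colloquial order terms to canonical
# menu items before they reach downstream order-handling logic.
MENU_ALIASES = {
    "jbc": "Junior Bacon Cheeseburger",
    "frosty": "Frosty",
    "milkshake": "Frosty",   # customers may not use the marketed name
    "biggie bag": "Biggie Bag",
}

def normalize_order_term(term: str) -> str:
    """Map a spoken term to its canonical menu item; pass through unknowns."""
    return MENU_ALIASES.get(term.strip().lower(), term)

print(normalize_order_term("JBC"))        # Junior Bacon Cheeseburger
print(normalize_order_term("milkshake"))  # Frosty
```

In practice, an LLM-backed system would handle such variation statistically rather than through a fixed lookup table, but the table conveys the basic normalization problem the model is being tuned for.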

The application can also upsell customers by offering larger portions, Frosties, or daily promotions. When the chatbot receives an order, it displays it on a screen for line cooks. A worker then relays the prepared meals to the pickup window and hands them off to drivers.

Separately, Wendy’s launched a cost-cutting plan in March, aiming for systemwide revenue growth in the mid-single digits through 2025. Under the plan, the corporation intends to hold general and administrative costs flat for the next two years, and it expects global revenues to grow 6% to 8% this year.

Penegor stated that the deployment of the drive-through chatbot was unrelated to the restructuring efforts and that the company does not want to replace employees with the chatbot. He added that the new technology is expected to assist staff by automating many of the manual chores associated with taking drive-through orders.

Information for this story was found via Reuters, TechCrunch, Bloomberg, The Wall Street Journal, and the sources mentioned. The author has no securities or affiliations related to the organizations discussed. Not a recommendation to buy or sell. Always do additional research and consult a professional before purchasing a security. The author holds no licenses.
