Couchbase IPO: The Company Looks For Database Riches

During the past few years, the IPO market has seen a variety of database companies hit the markets. And the latest one came this week: Couchbase. The company issued 8.3 million shares at $24 each, which was above the initial price range of $20 to $23. On the first day of trading, the shares jumped by 27%.

The origins of Couchbase go back to the development of an open-source project called CouchDB (the name is an acronym for "cluster of unreliable commodity hardware"). The developer of this technology was Damien Katz, who formerly worked at IBM. He started the project in 2005 and launched the first stable version in 2010.

Startups like CouchOne and Membase saw the potential of this technology and began to build their own enhancements. The two companies merged in 2011 to form Couchbase, which combined the core database technology with caching systems. The goal was to make the platform highly scalable and reliable for enterprise customers.

The Technology

The relational database is the most dominant model. While it continues to be robust, the technology has had difficulty handling millions of users and managing spikes in workloads. Relational databases are also far from cheap.

“Relational databases were built for large and monolithic applications,” said Matt Cain, who is the CEO of Couchbase. “But today’s applications involve many datasets and heavy interactions.”

To address these problems, a new model has emerged: the NoSQL database. It is often based on a document model and can work effectively with many types of data, such as structured, unstructured, and time-series data.
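The flexibility of the document model can be sketched in a few lines (the records and the tiny in-memory "collection" below are invented for illustration, not Couchbase's actual API): each record is a self-describing document, so two documents in the same collection need not share a schema the way rows in a relational table must.

```python
import json

# Two documents in the same hypothetical "users" collection: unlike rows in
# a relational table, they do not need to share the same columns.
user_a = {"id": "u1", "name": "Ada", "email": "ada@example.com"}
user_b = {"id": "u2", "name": "Ben", "logins": [1626000000, 1626100000],
          "prefs": {"theme": "dark"}}

def store(collection, doc):
    """Serialize a document as JSON, as a document database would."""
    collection[doc["id"]] = json.dumps(doc)

collection = {}
store(collection, user_a)
store(collection, user_b)

# Each document round-trips with its own shape intact.
print(json.loads(collection["u2"])["prefs"]["theme"])  # dark
```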

However, the first generation of NoSQL databases was not suited to mission-critical applications. Part of this was due to the underlying technology. But implementation was also a problem, since database administrators had to be retrained.

As for Couchbase, the company has focused on bringing enterprise-grade capabilities to its platform. Consider that it works in a myriad of configurations, whether for cloud, hybrid, or on-premises environments.

“At Couchbase, we are combining the best of both relational databases and the NoSQL model,” said Cain. 

A key factor for Couchbase is that the system has been built for seamless deployment. This means there is minimal downtime. As a result, the company has been successful in migrating mainframe and relational database implementations. For about 80% of its customers, the database is used as a source of truth or system of record for some or all of their business.

Market Opportunity

The database market is enormous. Keep in mind that it’s one of the largest undisrupted markets for enterprise software. 

According to IDC, spending was about $42.9 billion last year and is forecast to hit $62.2 billion by 2024. One of the main catalysts for growth is the need for digital transformation, such as to offer mobile apps, manage edge systems and leverage Artificial Intelligence. Such technologies simply do not work well with relational databases.

“With digital transformation, it’s not just about creating new applications,” said Cain. “There is also a need for re-platforming existing ones.”

In other words, the future does look bright for Couchbase. The company has a robust technology that is used by many large customers and handles enormous workloads. More importantly, there is a real need for companies to transition to other approaches—and this may mean that the database market is finally at a critical point for major change.

DevOps: What You Need To Know

DevOps is a blend of two terms: development and operations. Traditionally, both of these departments were isolated and this often created problems. For example, it could take longer to release software or to test it effectively.

“For many large enterprises that have applications and services spanning the mainframe, cloud, and everything in between, complexity and myriad dependencies can get in the way of speed,” said Margaret Lee, who is the Senior Vice President and General Manager of Digital Service Operations Management at BMC. “This creates uncertain, unpredictable, and in some cases unintended results.”

Then how can DevOps help? Well, it’s about having a more collaborative approach as well as a focus on agility.

“Developers are contributing to deployment and production management, as opposed to just coding it and throwing it over the wall to an Ops team,” said Matt Groves, who is the Technical Marketing Manager at Couchbase.  “It’s removing the barriers between developers and operations.”

A Case Study In DevOps

To get a sense of the transformative power of DevOps, take a look at Liberty Mutual Insurance. It is ranked No. 71 on the Fortune 500 and is the sixth largest property and casualty insurer in the world. But it also has a complex IT infrastructure that includes many legacy systems. So five years ago, the company began a transition towards DevOps for its more than 5,000 technology employees. 

“With the implementation of DevOps, our team is now deploying code 200 times faster, creating more stability, enabling us to experiment more, and allowing us to launch new products and features on a much faster timeline,” said Justin Stone, who is the Senior Director of DevOps Platforms at Liberty Mutual Insurance.  “Fundamentally, DevOps empowers developers to own, run, and manage the end-to-end delivery of an application or piece of software. It eliminates the confusion around ownership and drives developers toward a single automated, developer-managed infrastructure.”

Keep in mind that DevOps is not just about spinning up some software. Rather, the main goal is to change the culture of an organization—and of course, this is no easy feat. 

“DevOps is all about system-level thinking, looking at the end-to-end value delivery process and not at the individual silos of effort that make it up,” said Bruno Kurtic, who is the founding Vice President of Strategy and Solutions at Sumo Logic. “Because of this, good communication and alignment against shared and measurable goals is critical.”

Now there are various process methodologies to help with DevOps. Perhaps the most popular is agile, which involves small teams developing applications in small increments (say, every two weeks or so), with each increment defined by user stories. The idea is to get ongoing feedback from customers and not get bogged down in creating a “big bang” program.

Another popular approach is Kanban. Here, a board divides a project into categories like user stories, to-dos, testing and so on. This provides a visual way to track progress.
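At its core, a Kanban board is just cards grouped by stage, with progress made visible by moving cards between columns. A toy sketch (the card names and column set below are invented):

```python
# A hypothetical Kanban board: columns map stage names to lists of cards.
board = {
    "user stories": ["login page", "search filter"],
    "to do": [],
    "in progress": [],
    "testing": [],
    "done": [],
}

def move(board, card, src, dst):
    """Move a card between columns, mirroring a drag on a physical board."""
    board[src].remove(card)
    board[dst].append(card)

move(board, "login page", "user stories", "in progress")
move(board, "login page", "in progress", "testing")

# The board now shows at a glance where each piece of work stands.
print({stage: cards for stage, cards in board.items() if cards})
```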

Automation for DevOps

Automation is another key factor for DevOps. There are many robust tools that can lead to major strides in productivity, such as with orchestration, staging, testing, deployment and tracking. 

“Don’t re-write code that you can repeat through automation,” said Stephen DeWitt, who is the CEO of CloudBees. “If there’s one thing developers hate, it’s busy work. With automation, you can remove the roadblocks, the manual work and the toil that frustrate developers and take their time away from writing code. When developers spend their time on low-value tasks, they are not challenged and they leave. Developers are expensive to keep but they are more expensive to hire and onboard.”

But given that cultural change is paramount, the DevOps journey should be taken with care and planning. “We took a ‘crawl, walk, run’ approach by introducing some common quality gates and pre-requisites for automated deployments,” said Christine Hales, who is the Vice President of Technology at Capital One. “When we started, we had to rely on manual on-boarding and after-the-fact data validation. From there, we’ve been continuously layering in automation to reduce human error and simplify audit and compliance. As a result, we have been able to accelerate the volume of new innovations for customers like Eno, CreditWise and AutoNavigator.”

How To Evaluate AI Software

Buying off-the-shelf AI (Artificial Intelligence) software is a good first step for companies that are new to the technology. There should be little need to invest in technical infrastructure or to hire expensive data scientists. There is also the benefit of getting a solution that has been tested by other customers. For the most part, there should be confidence in the accuracy levels, as the algorithms will probably be implemented properly.

But there is a nagging issue: there are many AI applications on the market and it is extremely difficult to determine which is the best option. After all, it seems that most tech vendors are extolling their AI capabilities as a way to stand out from the crowd.

Then what are some factors to consider when evaluating a new solution? Let’s take a look at the following:

Data Connectors: AI is useless without data. It is the fuel for the insights. 

But when it comes to a new AI solution, it can be tough to find the right sources of data, wrangle it and integrate it. Thus, when evaluating an application, you need to make sure that there are ways to handle this process. 

“The most complex task in an AI solution is not to implement the machine learning algorithm anymore—this is usually available as a set of functions in every tool—but to collect the data,” said Rosaria Silipo, a Ph.D. and a principal data scientist at KNIME. “That is, to connect to a variety of data sources, on premise, on the web, or on the cloud, and extract the data of interest.”
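Silipo's point can be made concrete with a stripped-down sketch (the file formats and field names below are invented): the hard work is less the algorithm than normalizing records from different sources into one shape.

```python
import csv
import io
import json

# Two hypothetical sources describing the same entity in different shapes.
csv_source = "id,amount\n1,9.5\n2,12.0\n"
json_source = '[{"order_id": 3, "total": "7.25"}]'

def from_csv(text):
    """Connector for a CSV feed: normalize each row to a common record."""
    for row in csv.DictReader(io.StringIO(text)):
        yield {"id": int(row["id"]), "amount": float(row["amount"])}

def from_json(text):
    """Connector for a JSON API: map its field names onto the same shape."""
    for rec in json.loads(text):
        yield {"id": rec["order_id"], "amount": float(rec["total"])}

# One unified stream of records, ready for a model or a dashboard.
records = list(from_csv(csv_source)) + list(from_json(json_source))
print(sum(r["amount"] for r in records))  # 28.75
```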

Flexibility: AI does not have general scope. Instead, it is focused on particular use cases. This is known as “weak AI.”

This is why it is important to see if the application is built to handle your particular vertical or situation. 

“Take search, for instance, where AI can be used to re-rank results and improve relevance,” said Ciro Greco, who is the Vice President of Artificial Intelligence at Coveo. “When applied to ecommerce, search is searching on semi-structured records, such as products with little text available and we can count on reasonable amounts of behavioral data produced by users who browse the website. A strategy based on user behavior can be very effective, because we can count on having enough data to learn from.”

Yet AI-search for customer service use cases is often much different. It’s often about finding technical documents. “There is plenty of unstructured text, such as Knowledge Articles, and fewer behavioral data points, because customer service websites usually are less visited than e-commerce platforms,” said Greco. “So in that case, a strategy based on NLP for topic modeling will probably be more effective, because we need to maximize the gain from the information we have, in this case free text.”
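The behavioral strategy Greco describes can be sketched in a few lines (the documents, scores and click counts below are invented): an e-commerce ranker blends text relevance with signals from past user behavior, while a text-heavy customer-service ranker would lean on content features instead.

```python
# Hypothetical search results with a base relevance score from text matching.
results = [
    {"doc": "red sneakers", "text_score": 0.70},
    {"doc": "blue sneakers", "text_score": 0.65},
    {"doc": "sneaker cleaner", "text_score": 0.60},
]
# Behavioral signal: invented click counts from past visitors.
clicks = {"red sneakers": 3, "blue sneakers": 40, "sneaker cleaner": 1}

def rerank(results, clicks, weight=0.01):
    """Blend text relevance with a click-based boost."""
    return sorted(results,
                  key=lambda r: r["text_score"] + weight * clicks[r["doc"]],
                  reverse=True)

# With enough behavioral data, the popular item moves to the top.
print([r["doc"] for r in rerank(results, clicks)])
```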

Ease-of-use: This is absolutely critical. The user of AI is often a non-technical person. If the application is complex, there could easily be little adoption.

Ethical AI: Even if the application is accurate, there could be risks. The data may have inherent biases, which could skew the results. This is why you should get an explanation of the data and how it is used. 

“What many forget when evaluating an AI solution is the potential damage or risk it could pose to your organization,” said Michael Mazur, who is the founder and CEO of AI Clearing. “What if your organization is sued for deploying this solution?”

Costs: “If you’re the first customer in a specific industry for an AI vendor, then you’re a very valuable customer to have and you can use that as negotiating leverage for a beneficial contract,” said Brian Jackson, who is an analyst and research director at Info-Tech Research Group.

Xometry IPO: Looking To Be The Airbnb Of On-Demand Manufacturing

It was a busy week for IPOs, as 18 companies issued shares. And one of the standout offerings was from Xometry, which operates an online marketplace for on-demand manufacturing.

The IPO was priced at $44, which was above the $38-to-$42 price range, and the shares soared nearly 100% on the first day of trading (the current market value is close to $3 billion). T. Rowe Price and Capital World Investors purchased $70 million of the shares in the offering.

The CEO and cofounder of Xometry is Randy Altschuler, who is a serial entrepreneur. He launched two other startups that were sold to public companies.

For Xometry, he teamed up with Laurence Zuriff, who is now the company’s Chief Strategy Officer. Before this, Zuriff was the managing partner at Granite Capital International Group.

Both Altschuler and Zuriff were intrigued by the custom manufacturing industry. But they did not immediately create a company. Instead, they spent months researching the market by talking to many small manufacturers. “We saw certain themes emerge,” said Altschuler.

For example, smaller manufacturers were usually dependent on larger customers that were local, which posed considerable risk. They also spent much time responding to requests for custom parts that did not turn into business. 

The buyers of custom manufacturing parts also had challenges. It was difficult to find the best vendors and to come up with the right pricing. 

To solve these problems, Altschuler and Zuriff saw the need for building a two-sided marketplace. But it took some time to get to critical mass. While sellers were interested, there was skepticism from the buyers. 

Then there was the issue of the pricing for the marketplace. Custom jobs do not have SKUs. Rather, each one is unique.

This meant Xometry needed to build an automation system. One approach was to use brute-force and go through the possibilities. “The problem is that it would take too long to build such a system,” said Altschuler. “It would also be difficult and time-consuming to maintain it.”

The next approach then? It was to leverage AI (Artificial Intelligence). Xometry created a proprietary engine—backed with patents—to provide instant quoting based on factors like volume, material, location and the manufacturing process.
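As an illustration only (the rates, discount and factors below are invented, not Xometry's model), an instant-quote engine maps job attributes like volume, material, location and process to a price; in practice this mapping is learned from historical jobs rather than hand-written.

```python
# Hypothetical per-unit rates by material and manufacturing process.
MATERIAL_RATE = {"aluminum": 4.0, "steel": 6.0}
PROCESS_RATE = {"cnc": 12.0, "3d_print": 8.0}

def instant_quote(volume, material, process, shipping_zone):
    """Toy quote: unit cost falls with volume; shipping scales with zone."""
    unit = MATERIAL_RATE[material] + PROCESS_RATE[process]
    volume_discount = 0.9 if volume >= 100 else 1.0
    shipping = 5.0 * shipping_zone
    return round(volume * unit * volume_discount + shipping, 2)

print(instant_quote(100, "aluminum", "cnc", shipping_zone=2))  # 1450.0
```

A learned model replaces the hand-tuned rates with predictions, which is what lets quoting scale to jobs no human has priced before.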

“What really got us excited about Xometry was that they were using incredible technology,” said Daniel Docter, who is a Managing Director at Dell Technologies Capital and an investor in the company. “Over time their AI algorithms got better, their modeling became more accurate, and their breadth of capabilities grew.”

The result is that the company has been able to efficiently process transactions for more than 6 million parts since inception. Currently there are over 43,000 buyers and 5,000 sellers on the platform (the customers include roughly 30% of the Fortune 500).

In terms of growth, it has been accelerating. From 2018 to 2020, the compound annual growth rate was 92%, with revenues hitting $141.4 million.
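Those two figures can be checked with the standard formula, CAGR = (end / start)^(1/years) − 1. Working backward from the reported 92% over the two years from 2018 to 2020 implies 2018 revenue of roughly $38 million:

```python
# CAGR = (end / start) ** (1 / years) - 1, rearranged to recover the
# starting value from the reported growth rate.
end_revenue = 141.4   # 2020 revenue in $M (from the article)
cagr = 0.92           # reported 2018-2020 compound annual growth rate
years = 2

implied_start = end_revenue / (1 + cagr) ** years
print(round(implied_start, 1))  # ~38.4, the implied 2018 revenue in $M
```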

Much of this would not have been possible without the AI. The technology has been strategic to Xometry.

“The AI learns and predicts how to make a part, how much it should cost, how long it should take, how much material would be needed, how would it likely yield, and so on,” said Docter. “Xometry literally transforms an old, manual, grossly human limited process, to an automated, much more accurate and predictable, and hugely more efficient AI-driven process. Manufacturers win; customers win; and frankly the world wins as we collectively can do so much more with so much less waste.”

SentinelOne IPO: How The Company Is Riding The AI Wave

SentinelOne, which develops AI-powered software for cybersecurity, launched its IPO today. There was certainly substantial demand from investors. The initial price range was $26-to-$29 but this was lifted to $31-to-$32. The offering was then priced at $35 and the amount raised came to about $1.2 billion. Tiger Global, Insight Venture Partners, Third Point Ventures, and Sequoia Capital also participated in a $50 million concurrent private placement for the stock.

The CEO and cofounder of SentinelOne is Tomer Weingarten.  Before launching the company in 2013, he had helped to create several other tech startups. His background was mostly in analytics.

He would team up with Almog Cohen, who was a security expert at Check Point Software Technologies. Cohen and Weingarten were actually childhood friends and went to the same college.

As for SentinelOne, the vision was to build a next-generation cybersecurity platform that leveraged AI. But interestingly enough, the timing was too early. “In the first few years, it was an absolute battle to get the trust of customers,” said Weingarten.

Yet things started to change as the cybersecurity threats became more frequent and dangerous. The reality was that traditional systems—such as those based on human-powered signatures—were failing even more. It was “akin to bringing a knife to a gunfight,” according to the SentinelOne S-1 filing.

While building the AI system, Weingarten learned some important lessons. First of all, success would not involve building better algorithms. The reason? They tend to be similar, standardized and open source. 

Next, success with AI would not be about having huge amounts of data either. The focus instead should be on having the right data that produces signals that can be modeled. “We look at it as a contextual narrative, such as like telling a story,” said Weingarten. “We even received a patent on this approach.”

Building the platform has required using the latest in data systems to process petabytes of data in real-time. Every second counts when it comes to fending off cyberattacks. “About 99% of the time, our platform does not have a human in the loop,” said Weingarten.

The SentinelOne system is flexible as well. For example, it can be deployed on environments like Windows, macOS, Linux, and Kubernetes.

The Growth Engine

Consider that none of the company’s customers were impacted by the SolarWinds Sunburst cyberattack. This was definitely a major validation of the AI approach.

And yes, growth has been outstanding for SentinelOne. During the latest quarter, revenues soared by 108% to $37.4 million. There are currently more than 4,700 customers and a majority of them are large enterprises.

Some of the other metrics include:

  • The dollar-based gross retention rate: 97%.
  • The dollar-based net retention rate: 124%.
  • Annual recurring revenue: $164 million.
  • Customer satisfaction: 97%.

Note that the SMB (small-and-medium size business) category has shown even more growth. A key has been the leveraging of MSPs (Managed Service Providers).

The IPO

This was the first public offering for Weingarten. “It was a lot of hard work,” he said.  “But we thought that an IPO was critical. I think it is about becoming a more mature company.”

Being public also helps with the trust of customers. After all, there are stringent disclosure and audit requirements. In fact, some larger enterprise companies will not even purchase cybersecurity software from private companies.

Now it’s true that SentinelOne faces intense competition. Just some of the key rivals include CrowdStrike and Palo Alto Networks.

Yet the market is massive. Based on the analysis from IDC, the spending is expected to reach $40.2 billion by 2024, which represents a compound annual growth rate of nearly 12%.

Confluent IPO: Remaking The Massive Database Industry

Confluent, which develops database technologies, launched its IPO this week. The company issued 23 million shares at $36 apiece—above the $29-to-$33 price range—and the price rose 25% on the first day of trading. The market capitalization hit $11.5 billion.

Jay Kreps, Jun Rao, and Neha Narkhede cofounded the company in 2014. Before this, they were software engineers at LinkedIn and had faced the tough challenges of scaling the IT infrastructure. One of the main issues was how to effectively process data in real time.

The cofounders searched for a solution, but nothing viable existed. For the most part, database technologies were about storing information efficiently, not handling data in motion.

So the founders developed their own platform, which they called Kafka, and made it open source. From the start, it saw significant adoption. 

As of now, Kafka has a developer community of over 60,000 and the software is used by over 70% of the Fortune 500.  In other words, the cofounders were ideally positioned to capitalize on this growth with their own startup. “Confluent is part of a new wave of data companies that offer faster, more scalable solutions for the modern digital era,” said Jedidiah Yueh, who is the CEO of Delphix.

The Technology

An app like Uber or Lyft may seem simple. But the underlying technology is exceedingly complex, as it needs to manage enormous amounts of streaming data to connect customers. 

“In such situations, to provide a delightful customer experience, data needs to be analyzed in-motion even before it is saved in databases,” said Ashish Kakran, who is a Principal at Thomvest Ventures. “Such a feat was almost impossible in the past because data would need to be stored before analysis. This is where Confluent’s open-source Kafka stands out as it turns a sequential analysis process into a fluid one. It provides an event streaming platform and a rich set of APIs that developers can use to quickly be productive.”
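The shift Kakran describes, analyzing events as they arrive instead of after they are stored, can be sketched with a toy in-memory stream (this illustrates the pattern only; it is not Kafka's actual API, and the wait times are invented):

```python
from collections import deque

# A toy event stream: a rolling average over rider wait times, updated
# as each event arrives, before anything is written to a database.
window = deque(maxlen=3)

def on_event(wait_seconds):
    """Consume one event and update the in-motion metric immediately."""
    window.append(wait_seconds)
    return sum(window) / len(window)

stream = [30, 60, 90, 120]          # invented wait times, in seconds
averages = [on_event(e) for e in stream]
print(averages[-1])  # 90.0, the average of the last three events
```

A store-then-analyze pipeline would only see these numbers after a batch load; the streaming version reacts to each event as it happens.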

The reliance on developer communities has been critical. It has allowed for a bottom-up strategy for adoption. Let’s face it, selling a complex technology like Kafka directly to Chief Information Officers would likely be tough.

“Now as the same hard-to-sell products grow organically within an organization, they become hard-to-replace and their value becomes self-evident to CIOs,” said Kakran. “Companies like Confluent then step in with innovative business models to monetize their offerings and help organizations get the most out of innovative open-source tools.”

The opportunity for Confluent is still in the early stages. Keep in mind that it estimates the market at about $50 billion. That is, the need for real-time data spans many industries. For example, a retailer can use it for accurate inventory tracking so as to make sure customers get what they want when they want it. Or in manufacturing, a company can leverage IoT (Internet-of-Things) data to help with predictive maintenance, which can mean less downtime.

According to Kreps, in his letter to shareholders: “Today the data architecture of a company is as important in the company’s operations as the physical real estate, org chart, or any other blueprint for the business. This is the underpinning of a modern digital customer experience, and the key to harnessing software to drive intelligent, efficient operations. Companies that get this right will be the leaders in their industries in the decades ahead.”

Biden’s AI Initiative: Will It Work?

The Biden administration has recently put into action its initiative on AI (Artificial Intelligence). This is part of legislation that was passed last year and included a budget of $250 million (over a period of five years). The goals are to provide easier access to the troves of government data as well as advanced systems for creating AI models.

No doubt, this effort is a clear sign of the strategic importance of the technology. It is also a recognition that the U.S. does not want to fall behind other nations, especially China. 

The AI task force has 12 distinguished members who are from government, private industry and academia. This diversity should help provide for a smarter approach.

But the focus on data will also be critical.  “In areas of social importance such as housing, healthcare, education or other social determinants, the government is the only central organizer of data,” said Dr. Trishan Panch, who is the co-founder of Wellframe. “As such, if AI is going to deliver gains in these areas, the government has to be involved.”

Yet there will certainly be challenges. Let’s face it, the U.S. government often moves slowly and is burdened with various levels of local, state and federal authorities. 

“To achieve the initiative’s vision, government entities will need to go beyond sharing best practices and figure out how to share more data across departments,” said Justin Borgman, who is the CEO of Starburst. “For instance, expanding open data initiatives which today are largely siloed by departments, would greatly improve access to data. That would give Artificial Intelligence systems more fuel to do their jobs.”

If anything, there will be a need for a different mindset from the government. And this could be a heavy lift. “Based on my experience in the public sector, the major challenge for the government is addressing the ‘Missing Middle,’” said Jon Knisley, who is the Principal of Automation and Process Excellence at FortressIQ. “There are a number of very advanced programs on one end, and then there are a lot of emerging programs on the other end. The greatest opportunity lies in closing that gap and driving more adoption. To be successful, there should be a focus as much as possible on applied AI.”

But the government initiative can do something that has been difficult for the private sector to achieve—that is, to help reskill the workforce for AI. This is perhaps one of the biggest challenges for the U.S. 

“The question is: How do we create a large AI data science force that is integrated across every industry and department in the US?” said Judy Lenane, who is the Chief Medical Officer at iRhythm. “To start, we’ll need to begin AI curriculum early and encourage its growth in order to build a comprehensive workforce. This will be especially critical for industries that are currently behind in technological adoption, such as construction and infrastructure, but it also needs to be accessible.”

In the meantime, the Biden AI effort will need to deal with the complex issues of privacy and ethics. 

“Presently there is significant resistance on this subject given that most consumers feel that their privacy has been compromised,” said Alice Jacobs, who is the CEO of convrg,ai. “This is the result of a lack of transparency around managing consents and proper safeguards to ensure that data is secure. We will only be able to be successful if we can manage consents in a way where the consumer feels in control of their data. Transparent unified consent management will be the path forward to alleviate resistance around data access and can provide the US a competitive advantage in this data and AI arms race.”

Can AI Solve Your Hiring Problems?

In April, the number of job openings hit a record 9.3 million, according to data from the Labor Department. With the pandemic fading away, there has been a scramble to hire new employees and this has become a major challenge for companies.

So then can AI (Artificial Intelligence) help out? Well, it definitely can. The irony is that many companies are using the technology—and don’t even realize it! The reason is that AI is built into the top online job sites. 

“For example, when you type in a search for a job title, say with the phrase ‘job manager,’ the LinkedIn engine will not only look for the title itself, but also people with relevant skills like time management, team coordination, risk assessment and so on,” said Sakshi Jain, who is the Engineering Manager on LinkedIn’s Responsible AI team. “This means that a recruiter gets more results than just the people who already have the exact title or role.”
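Query expansion of this kind can be sketched simply (the query, skill map and profiles below are invented for illustration): the engine matches not just the title string but also candidates whose profiles overlap enough with skills related to it.

```python
# Invented mapping from a job-title query to related skills.
RELATED_SKILLS = {
    "project manager": {"time management", "team coordination",
                        "risk assessment"},
}

profiles = [
    {"name": "Ada", "title": "project manager", "skills": set()},
    {"name": "Ben", "title": "engineer",
     "skills": {"time management", "risk assessment"}},
    {"name": "Cai", "title": "designer", "skills": {"illustration"}},
]

def search(query, profiles, min_overlap=2):
    """Match on exact title OR on enough overlap with related skills."""
    skills = RELATED_SKILLS.get(query, set())
    return [p["name"] for p in profiles
            if p["title"] == query or len(p["skills"] & skills) >= min_overlap]

# The engineer surfaces despite never holding the queried title.
print(search("project manager", profiles))  # ['Ada', 'Ben']
```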

One of the key powers of AI is that it can detect complex patterns in huge data sets.  In a way, this can simulate the capabilities of a recruiter. And this is definitely important when it comes to finding passive job candidates. 

“In a study we recently published, 74% of talent leaders told us they’ve increased outreach to passive candidates in the past year,” said Hari Kolam, who is the CEO of Findem. “AI greatly speeds up the passive recruiting process–one where it can take upward of ten hours to fill a single role. It can index and surface information on people from hundreds of sources as passive candidates typically aren’t on job or career sites, and many people tend to only include piecemeal information on their LinkedIn and other profiles.”

AI can also help with personalization. This can be an effective way to make a good first impression with candidates.

“Currently, there are many hiring workflows that are incredibly inefficient, including the scheduling of interviews and follow-up emails for candidates,” said Vivek Ravisankar, who is the CEO of HackerRank. “With the use of automated scheduling and email follow-ups, AI can help free up valuable time and solve the major pain point of extensive back-and-forth coordination with candidates and interviewers.”

Yet AI is not without its risks. After all, there is inherent bias in datasets and this can result in outcomes that are unfair and discriminatory. 

“AI-driven HR software that is using years old data on previous hires to determine ideal candidates for job openings is a perfect example of where algorithms can go wrong,” said Ingrid Burton, who is the Chief Marketing Officer at Quantcast. “This is especially true in roles that have historically been dominated by men, such as software engineers, which would risk the hiring algorithm to arbitrarily exclude most women and minorities from advancing during the hiring process.”

To guard against this, there must be good governance as well as explainability of the AI models. There also needs to be people in the loop for critical parts of the process. 

“It is never acceptable to set up an AI process and simply leave it to run,” said Ian Cook, who is the Vice President of People Analytics at Visier. “While there is no need to inspect every transaction or process run by the AI, there is a need to constantly review the performance of the AI steps to ensure that the outputs are in line with expectations. Validation, updating and retraining are constant requirements of running any AI process.”

C3.ai’s Tom Siebel: How To Scale AI

C3.ai, which is a top provider of enterprise AI software and services, is a newly public company. It pulled off its deal in December and issued 15.1 million shares at $42 each. On the first day of trading, the shares spiked 120%.

And this would not be the end of the gains. Within a couple months, the stock price would hit an all-time high of $183.

But as the markets started to cool off, so did the shares of C3.ai. Consider that the stock price is now at $64.

Despite this, the future does look bright for the company. “The total addressable market is huge,” said Tom Siebel, who is the CEO and founder of C3.ai. “It’s a third of a trillion dollars.”

Keep in mind that Siebel is a veteran of the enterprise software world. In the early 1980s, he worked as an executive at Oracle and helped make the company the dominant player in relational databases. Then in 1993, he started Siebel Systems and pioneered the CRM (Customer Relationship Management) category.

As for C3.ai, he launched this company in 2009. Siebel was early in recognizing that AI would be a megatrend. 

But he also crafted a solid approach to building the platform. “We were novel in using a model-driven architecture to enable organizations to rapidly design, develop, provision and operate enterprise AI applications at scale,” said Siebel. “We spent about a billion dollars inventing this in the last decade and it is our secret sauce.”

This was in contrast to traditional techniques, such as structured programming, that involve a mishmash of open source and proprietary solutions. That approach usually brings too much complexity to scale effectively.

Even some of the world’s top companies have suffered major blunders and failures with AI. IBM’s Watson, for example, has fallen well short of expectations. Then there is GE, which has spent billions on AI and has seen little return. 

The Future Of Enterprise AI

The C3.ai platform can handle applications for global enterprises as well as small businesses. To get a sense of its power: the platform currently manages over 4.8 million concurrent production AI models and processes more than 1.5 billion AI predictions per day. 

Now, another key factor for the success of C3.ai is that the company takes a partnership approach with customers. This is essential for AI since it is important to leverage vertical-specific data and insights. 

A case study for this is Shell. “The company is reinventing itself as the fifth largest in the world,” said Siebel. “They want to get to a zero net carbon footprint by 2050, which is no mean trick, right? This is about applying AI to the entire value chain, about delivering cleaner, safer, lower cost, more reliable energy. This is maybe a $4 billion a year economic benefit.”

Granted, the temptation for companies is to build their own systems. But this is really the wrong approach. “Believe it or not, people tried to build their own relational database systems during the 1980s and I don’t think anybody succeeded,” said Siebel. “We are seeing it again with AI. Companies will try it once, twice, three times. Then they’ll wind up firing the CIO and buy the technology from a professional.”

Ultimately, Siebel thinks that AI will be similar to CRM or ERP. In other words, it will be a technology that’s a necessity for a large number of businesses. “Companies that do not adopt AI will no longer exist,” said Siebel. 

AI (Artificial Intelligence): How Non-Tech Firms Can Benefit

Even though AI continues to thrive and grow, there remain challenges to using the technology. These include finding data scientists, identifying the right problems to focus on, getting quality data and scaling the models.

No doubt, these problems are even worse for non-tech companies. They generally do not have the expertise or sufficient resources to make AI a success.

“Research shows non-tech companies in particular have struggled to take their AI programs beyond the proof of concept and pilot phases–with just 21% of retail, 17% of automotive, 6% of manufacturing, and 3% of energy companies successfully scaling their AI use cases,” said Jerry Kurtz, who is the Executive Vice President of Insights and Data at Capgemini North America.

But despite all this, there are still a myriad of companies that are beating the odds. And they are becoming much more competitive. “There are many opportunities for non-tech companies to leverage AI to improve efficiency and provide a better customer and employee experience,” said Margaret Lee, who is the Senior Vice President and General Manager of Digital Service Operations Management at BMC.

So what are some of the non-tech companies that have been able to move the needle with their AI efforts? Here’s a look at two and the lessons learned.

John Deere: Keep in mind that the company has a long history of innovation, going back to the invention of the steel plow in 1837. The result is that John Deere is a world leader, with a market capitalization of $120 billion and annual sales of over $35 billion.   

The company’s AI efforts began with machine vision, building on the GPS connectivity in its equipment and its rich datasets. During the spring, for example, peak data ingestion reached 425MB per second, or about 50 million sensor measurements per second. 
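Those two figures imply roughly 8.5 bytes per sensor measurement, a derived number the article does not state. A quick back-of-the-envelope check, assuming decimal megabytes:

```python
# Sanity-check of John Deere's reported ingestion figures from the article.
# 425 MB/s and ~50 million measurements/s are stated; the bytes-per-measurement
# figure is derived here, and decimal megabytes (10**6 bytes) are assumed.
peak_bytes_per_sec = 425 * 10**6      # 425 MB/s
measurements_per_sec = 50 * 10**6     # ~50 million measurements/s

bytes_per_measurement = peak_bytes_per_sec / measurements_per_sec
print(bytes_per_measurement)  # 8.5
```

That is about the size of a single double-precision value, which is consistent with a stream of raw sensor readings.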

“Our projects have resulted in products in the market, or soon to be in the market, that reduce chemical inputs by sensing weeds from non-weeds and selectively applying chemical to the weeds only,” said Jahmy Hindman, who is the Chief Technology Officer of John Deere. “In addition, this work has led to vision-based automated control systems in combine harvesters that optimize the processing settings of the ‘factories on wheels’ to minimize the grain lost during harvest.”

A key lesson for the company has been the importance of keeping customer needs and wants in mind. “We work tirelessly to help our customers advance their business and feed the world,” said Hindman.

Levi Strauss: An early project for this company came in response to the Covid-19 lockdowns in Europe, as Levi Strauss looked for ways to manage the inventory pile-up. To this end, the company gathered unique datasets on dynamic price elasticities and then applied AI to them. 

It was a small project but it scaled quickly. What started as a test in 11 stores in Germany in May 2020 grew to 17 countries across Europe by October. The system was also used for the 11/11 Singles’ Day in China. 
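The article does not describe Levi Strauss's model, but the core idea behind pricing from elasticity data can be sketched in a few lines: fit a constant price elasticity from (price, units sold) observations, then pick the markdown that maximizes expected revenue. The data and function names below are invented for illustration.

```python
import math

# Hypothetical sketch of elasticity-based markdown pricing (not Levi's system):
# fit log(demand) = a + e*log(price) by least squares, then choose the
# candidate price that maximizes predicted revenue.
def estimate_elasticity(observations):
    """Return the slope e of a log-log fit over (price, units_sold) pairs."""
    xs = [math.log(p) for p, _ in observations]
    ys = [math.log(q) for _, q in observations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return cov / var

def best_price(observations, candidate_prices):
    """Predict demand at each candidate price; return the revenue maximizer."""
    e = estimate_elasticity(observations)
    p0, q0 = observations[0]  # anchor the demand curve at one observed point
    demand = lambda p: q0 * (p / p0) ** e
    return max(candidate_prices, key=lambda p: p * demand(p))

# Toy data: units sold rise as the price is marked down (elasticity near -2).
obs = [(100, 50), (90, 62), (80, 78), (70, 102)]
print(round(estimate_elasticity(obs), 2))      # about -2: demand is elastic
print(best_price(obs, [60, 70, 80, 90, 100]))  # 60
```

With elasticity below -1, revenue rises as the price falls, so the deepest markdown wins, which matches the lockdown scenario of clearing piled-up inventory.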

“I have three pieces of advice from this experience,” said Louis DiCesari, who is the Global Head of Data, Analytics and AI at Levi Strauss. “First, choose real, commercial problems that are aligned to your company’s strategic priorities, and chunk them into actionable steps. Second, don’t get too hung up on having perfect data or technology, or using the latest algorithms. Instead, embrace agile, deliver minimum viable products, continuously measure the impact, and continue to iterate and add new features. And, of course, set a vision, communicate the progress toward that vision throughout the organization, and invite feedback.”

DiCesari credits AI with helping the company accelerate innovation and move faster than ever before. “In 2021, we aim to deliver more value and support all countries and functions of the enterprise, infuse data and AI throughout the business, enable new ways of working and continue streamlining processes, while also digitizing assets,” he said.