Tom Taulli

Why Emergence Invested In Zoom In 2015

Zoom’s second-quarter results, reported earlier in the week, were off the charts, as the company’s platform has become a must-have for consumers and businesses alike. Revenues came to $663.5 million, up from $145.8 million during the same period a year ago. There was also a profit of $185.7 million. Wall Street was looking for only $500 million on the top line and $134 million in earnings.

With the drop in the markets this week, Zoom stock has taken a hit. But the return is still nearly 10X since the IPO in April 2019. 

Keep in mind that Zoom is not a typical Silicon Valley startup. CEO and founder Eric Yuan did not raise large amounts of capital in the early days. He actually spent two years developing the Zoom app. And when he launched it, there was virtually no spending on marketing.

The Courting

Before Emergence Capital’s Santi Subotovsky met with Eric in 2014, Subotovsky had spent about three years evaluating conferencing startups. He believed that the trend of cloud computing would lead to a transformation in collaboration.

“People wanted a tool that they could love, not something a CIO has mandated that everyone should use,” said Subotovsky. “There was also a shift in how people were using devices. They would switch from hardwired to Wifi to cellphone networks. But the old school collaboration tools were not designed for this.”

Subotovsky’s own life story was also key to the investment thesis. Having grown up in Argentina, he knew that many people lacked sufficient bandwidth for high-quality communications platforms. In other words, the market for conferencing was still untapped.

Subotovsky evaluated a myriad of startups, but Zoom was the one that clearly stood out. There was a problem, though: Eric did not want to raise capital.

He thought bringing on institutional investors would be too distracting. He instead wanted to be laser focused on making the best product.

Still, Subotovsky did not give up. He went on to build a relationship with Eric and talked about the benefits of having more scale, which would be essential if Zoom wanted to sell to large enterprises.

It’s All About The Product

Eric would eventually make a pitch to Subotovsky and his partners. Consider that there was no investor deck or financials. Rather, Eric did a live demo of Zoom—which he pulled off flawlessly. “He pitched the product, not the company,” said Subotovsky. “But he gave us complete access to all the data. And once we saw it, we were blown away. We had never seen something as capital efficient as Zoom was.”

One of the main advantages of the system was that it was video-first. At the time, the rival systems were mostly about screen sharing. So Zoom was truly innovative. There was also a deep technology foundation, which was easy to integrate and configure.

It certainly helped that Eric had an extensive background with conferencing. Back in 1997, he joined WebEx as a founding engineer. But he would leave the company in 2011 because management ignored many of his suggestions.

The Investment

In February 2015, Zoom announced a $30 million Series C round. Emergence led the investment, with participation from existing investors such as Li Ka-shing’s Horizons Ventures, Yahoo co-founder Jerry Yang, Qualcomm Ventures, and serial biotech entrepreneur Dr. Patrick Soon-Shiong.

To get a sense of Zoom’s traction: the company had grown its customer base from 4,500 to 65,000 over the prior two years, and the number of meeting participants went from 3 million to 40 million.

Emergence wrote a check for $20 million, which was the biggest one in the company’s history. The equity percentage was also below its normal threshold. “This investment was nerve-wracking,” said Subotovsky. “I was trying to find investors to see the vision. But people would not even take a meeting.”

No doubt, the investment has turned out to be one of the most successful during the past decade, if not in the history of venture capital. For the most part, the deal highlights how important it is to focus on emerging market trends and to not give up on your convictions. After all, Zoom’s market value is roughly $101 billion.

Note: If you want to see my interview with Subotovsky—of course, which was on Zoom—you can check it out here.

AI Startup: What You Need For Your Investor Pitch Deck

The funding environment for AI startups remains robust—and some of the rounds have been substantial. Just recently Dataiku, which operates a machine learning platform, announced a $100 million Series D investment. 

AI truly represents a transformation in the tech world. “AI is making software much more dynamic and improves as it understands user behavior,” said Gordon Ritter, who is the founder and General Partner at Emergence.

OK then, what are VCs looking for when evaluating an AI deal? What should be in your pitch deck?

Well, to answer these questions, I talked to a variety of VCs. Here’s what they said:

Sri Chandrasekar, partner at Point72 Ventures:

A slide on Why Now. What technology has recently been developed that makes solving this customer problem possible now? It might be “speech-to-text technology has gotten good enough to use in 95% of a call center’s communications.” Or “recent deep learning focused processors have made it possible to do computer vision on the camera instead of in the cloud.” It’s rare for people to identify a customer problem that nobody has heard about before. Usually, what creates the potential for a large new company is that technology is now available to solve that problem in a new or differentiated way.

I also want to see a slide about solving the “Cold Start” problem. This matters most for the earliest stage companies, but AI companies need access to data to start training their algorithms. I like to see that they’ve thought about this problem and have a clear way to get access to enough data to build their business. The answer can be anything from buying data to partnering with an ancillary business to “faking it until they make it,” where they deliver the product or service with humans until they have enough data to build the AI model.

Mark Rostick, who is a Vice President and Senior Managing Director of Intel Capital:

When looking at a presentation of a potential AI deal, we look closely at the specific problem in AI/ML that they are solving and why solving that problem is important enough to build a company—not just a feature or tool. We also take a look at why the team is uniquely positioned to understand the problem they are trying to solve and how they are equipped to execute on it. The team must have line-of-sight to an economic model they can create that is capable of driving growth at “venture scale”.

Jake Saper, a partner at Emergence:

When evaluating companies that use AI to augment workers, I like to see charts that show the percentage of AI-generated suggestions that are taken by the user over time. For strong companies, this portion may start relatively low as the model is training and the UI is being tweaked. As both improve, you want to see the “coaching acceptance rate” improve to >75% and stay consistent.
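The “coaching acceptance rate” Saper describes is simple to compute. Here is a minimal Python sketch with hypothetical weekly counts (the numbers are illustrative, not from any real company):

```python
# Hypothetical weekly counts of AI suggestions shown vs. accepted by users.
shown    = [200, 220, 250, 240, 260]
accepted = [ 90, 130, 170, 185, 210]

# Acceptance rate per week: fraction of suggestions the user took.
acceptance_rate = [a / s for a, s in zip(accepted, shown)]

# A healthy trajectory climbs toward (and then stays above) the 75% mark.
above_threshold = [r > 0.75 for r in acceptance_rate]
```

Plotted over time, a chart like this starts low while the model trains and the UI is tweaked, then settles above the threshold.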

Kenn So, a venture capitalist at Shasta Ventures:

There are a couple of slides I like to see:

#1: A high level architecture/diagram of how the data flows from source to training to AI predictions to the product that the users interact with. This helps brush away some of the AI pixie dust.

#2: Quantifying the value of the product or model for the user. For example, radiologists save 1 hour per day because the AI automates report writing, and only 10% of radiologists make adjustments to what the AI writes.

#3: Defensibility of the data now or in the future. There are different ways to achieve this from proprietary data rights agreements to data network effects. In the end, ML is all about data. One thing to note is that data defensibility is just a minimum and not a sufficient condition of defensibility for AI companies.

Jeremy Kaufmann, a principal at Scale Venture Partners:

One of the most valuable metrics to show investors when pitching an AI company is how the accuracy rate of the underlying algorithm is improving over time. It’s important that investors see this improvement over time, particularly if humans are in the loop, as this analysis points to the fundamental solvability of the underlying problem. Investors are scared of the potentially asymptotic nature of AI algorithms (that they will never get good “enough”), so it’s very important to define “good enough” in a business context (what a business user will accept in terms of error rate) and then overlay this underlying expectation with the quantitative measure of how an algorithm is performing over time.

Snowflake IPO: What You Need To Know

With the equity markets surging and interest rates at historically low levels, the environment is ideal for IPOs. But for Snowflake, which has recently filed for an offering, it would likely do well in just about any market environment. This tech startup is growing like a weed and the market opportunity is enormous.

Founded in 2012, Snowflake pioneered the category for cloud-native data warehouses. The founders actually spent two years developing the software.

And yes, the timing proved to be spot-on. The market was ripe for disruption, as traditional data warehouses have a myriad of disadvantages: the inability to handle unstructured data and huge workloads, high costs, complex interfaces, problems with data consistency and integrity, and issues with data sharing.

Note that the founders—Thierry Cruanes, Benoit Dageville, and Marcin Zukowski—were veterans of the traditional data warehouse market. They had worked at companies like Oracle, IBM and Google (by the way, the name “Snowflake” was chosen because the founders like to ski!) In other words, the founders had a strong understanding of the weaknesses of legacy systems—but also had the creativity to build a much better alternative.

“My company uses Snowflake and also competes with it in some cases,” said Sam Underwood, who is the VP of Business Strategy with Futurety. “Snowflake is growing rapidly, and justifiably so, because it’s filling a gaping hole in the market—namely, a huge need to unify data sources to form a single source of truth across an organization. There are many, many tools that already do this—Google BigQuery among others—however, Snowflake has combined the technical effectiveness with the UI simplicity to really excel among both technical users and high-level decision makers who may not want or need as much granular detail.”

So how fast is Snowflake growing? During the first six months of this year, revenues spiked from $104 million to $242 million on a year-over-year basis. While there continues to be significant net losses, the company has still been able to greatly improve gross margins. 

Consider that a key technology decision for Snowflake was to separate compute from storage. “This offers great performance to customers without the high cost, so they get the best of both worlds,” said Venkat Venkataramani, who is the co-founder and CEO of Rockset. “This was phenomenally compelling at the time and years ahead of even the likes of Amazon with Redshift and Google.”

But of course, Snowflake is more than just about whiz-bang technology. The company has also assembled an experienced executive team, led by CEO Frank Slootman. Prior to joining, he was at the helm of ServiceNow, which he took from $100 million in revenues to $1.4 billion. The current market cap of the company is $93 billion. 

True, Snowflake does have customer concentration, with Capital One accounting for roughly 11% of overall revenues. But then again, this does show the strategic importance of the technology. 

“This IPO underscores a significant change in thinking about the increasing importance of the database market,” said Raj Verma, who is the Co-CEO of MemSQL. “Data has never been more important than it is right now. In the last 25 years, only one company in this sector other than Snowflake went public. And I’m sure we’ll see a couple more companies go out in the next few years as well. There was an iron grip on the database market for more than two decades, with IBM, Oracle and SAP HANA. Now we are seeing a changing of the guard, which gives customers the option of deciding what is best for their business. I can tell you that the technology of yesterday will not solve the data challenges of tomorrow, and this IPO brings newer technology solutions to the forefront.”

Quantum Computing: What Does It Mean For AI (Artificial Intelligence)?

While quantum computing is still in the early phases, there have already been many innovations and breakthroughs. Companies like IBM, Microsoft, Google and Honeywell have been investing aggressively in the technology. 

So then what is quantum computing? Well, it is similar to traditional computing, which relies on bits (the 0’s and 1’s that encode information). But quantum computing has its own version of this: the quantum bit, or qubit, whose information can exist in multiple states at the same time. The reason is the effects of quantum mechanics, like superposition and entanglement. Yes, this is all about the spooky world of Schrödinger’s cat, which is both alive and dead at the same time!
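To make superposition a bit more concrete, here is a minimal NumPy sketch (my own illustration, not from IBM) that represents a qubit as a two-component state vector, applies a Hadamard gate, and computes measurement probabilities via the Born rule:

```python
import numpy as np

# A qubit is a unit vector in C^2; the |0> state corresponds to [1, 0].
ket0 = np.array([1.0, 0.0])

# The Hadamard gate rotates |0> into an equal superposition of |0> and |1>.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

state = H @ ket0

# Measurement probabilities are the squared amplitudes (Born rule):
# roughly [0.5, 0.5], i.e. each outcome is equally likely.
probs = np.abs(state) ** 2
```

A classical simulation like this scales exponentially with the number of qubits, which is exactly why real quantum hardware is interesting.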

“Quantum computing is a new kind of computing, using the same physical rules that atoms follow in order to manipulate information,” said Dr. Jay Gambetta, who is an IBM Fellow and vice president of IBM Quantum. “At this fundamental level, quantum computers execute quantum circuits—like a computer’s logical circuits, but now using the physical phenomena of superposition, entanglement, and interference to implement mathematical calculations out of the reach of even our most advanced supercomputers.”

One of the fertile areas for quantum computing is AI (Artificial Intelligence), which relies on processing huge amounts of complex datasets. There is also a need to evolve algorithms to allow for better learning, reasoning and understanding.

Then what are some of the things we may see with quantum computing and AI? Let’s take a look:

Christopher Savoie, who is the CEO and founder of Zapata Computing:

Generative models are those models that don’t just limit themselves to answering a question, but that actually generate output such as an image, music, video, etc. As an example, imagine you have a lot of pictures of the side of a face, but not a lot of pictures of the front of a face. If you want security detection capabilities to be able to recognize dual facial recognition on the front side of a face, you can actually use these generative models very accurately to create more samples of frontal views of a face. Inserting quantum processing units into the classical framework has the potential to boost the quality of the images generated. And how does this help us with classical machine learning? Well, traditional machine learning algorithms are as good as the data you feed them. If you try to train a classical face detection model with a small dataset of faces, this model won’t be very good. However, you can use quantum-enhanced generative models to enlarge this dataset with more images (both in terms of quantity and variety), which can significantly improve the detection model. This isn’t limited to generating faces, you can also generate fake molecules, cancer cells, or MRI scans, which are very similar to the real thing. This allows us to train better machine learning models, which can then apply to real data and real-world problems.

Ilyas Khan, who is the CEO of Cambridge Quantum Computing:

For the first time, a Natural Language Processing (NLP) algorithm is “meaning aware” and has been executed on a quantum computer. When we refer to meaning aware we mean that computers can actually understand whole sentences and not just individual words and that the awareness can be expanded to whole phrases and ultimately real time speech without requiring stochastic guesswork that is the state of the art today and which is computationally so expensive. Full scale implementation is dependent on quantum computers becoming much larger than is currently the case. This development of research in NLP is a prime example of the fact that as realistic quantum computers become available, more use cases will also become apparent. Of course, this has been proven to be the case in the past 30 years on classical computers as a precedent.

Dr. Itamar Sivan, who is the CEO and co-founder of Quantum Machines:

Roughly speaking, AI and ML are good ways to ask a computer to provide an answer to a problem based on some past experience. It might be challenging to tell a computer what a cat is, for instance. Still, if you show a neural network enough images of cats and tell it they are cats, then the computer will be able to correctly identify other cats that it did not see before. It appears that some of the most prominent and widely used AI and ML algorithms can be sped up significantly if run on quantum computers. For some algorithms, we even anticipate exponential speed-ups, which does not just mean performing a task faster, but rather taking a previously impossible task and making it possible, or even easy. While the potential is undoubtedly immense, this still remains to be proven and realized with hardware.

Tony Uttley, who is the President of Honeywell Quantum Solutions:

One of the areas being looked at currently is in the area of artificial intelligence within financial trading. Quantum physics is probabilistic, meaning the outcomes constitute a predicted distribution. In certain classes of problems, where outcomes are governed by unintuitive and surprising relationships among the different input factors, quantum computers have the potential to better predict that distribution, thereby leading to a more correct answer. Dr. Hayes states: “The basic idea is that there are problems that require an AI to generate new data that it hasn’t seen before in order to make a decision. Solving this problem may require coming up with an underlying model for the probability distribution in question that it could use in new situations.”

Daniel Newman, who is the Principal Analyst and Founding Partner at Futurum Research:

As it pertains to AI/ML, I think what I’m most encouraged by is the potential for classical and quantum to work together leveraging the elastic nature of the cloud and the powerful, specific problem-solving capabilities of quantum computing. I get the sense that a lot of people are looking at quantum versus classical computing, but in reality, it will be the two working together harmoniously to solve challenging and complex problems. Both have strengths and the development now is for quantum computing to function as part of the solution. Over time, both computing formats will continue to advance, but the ability to accelerate workloads on traditional GPUs and ASICs while also leveraging the power of quantum computing is a recipe for faster, more robust results, which is what the market should be eager to see as quantum computing becomes more widely accessible.

For me, I see a few applications for quantum computing in the immediate future that will gain popularity, but of course there will be many more. Financial Services and Healthcare are immediate applications where Quantum Computing can take advantage of speed and specificity to help tackle complexities. Fraud detection and drug compound identifications have been touted as some of the most exciting use cases. Given the current state of cybercrime and the attention to healthcare in the wake of the pandemic, this makes a lot of sense.

How To Create An AI (Artificial Intelligence) Startup

According to research from IDC, the global spending on AI (Artificial Intelligence) is expected to hit $97.9 billion by 2023, up from $37.5 billion in 2019. This represents a compound annual growth rate of 28.4%.

No doubt, this is an enormous opportunity for startups. “These days, almost every company needs to leverage AI in order to thrive and build a meaningful future,” said Saar Yoskovitz, who is the CEO of Augury. “This is true for younger startups, and it is true for the largest companies, even in the most traditional and nascent industries like manufacturing and insurance. In a sense, AI has become another layer in the tech stack, like databases, and not a business model.”

All this is true. But then again, finding the right product-market fit is extremely challenging. Consider that AI models generally require a lot of customization. In fact, there is often much variation even for companies in the same industry. 

And yes, there must be a high-quality dataset. But of course, this is far from easy for a startup to develop.

“Data creates an interesting chicken-and-egg problem,” said Yoskovitz. “Without a customer, you don’t have data, which means you cannot train your algorithms. Without algorithms, you are not able to provide value to your customers and compete in the market. Therefore you will not have customers to provide data.”

What to do? You can look at forming partnerships, such as with companies that may not have strong AI capabilities. Or another approach is to create a free app that collects data.

“What we have found is that to build a successful AI startup, the key is sourcing and building proprietary data,” said Saniya, who is the CEO and co-founder of Pilota. “This is what makes your business defensible and attractive to investors. Almost every investor has asked us where our data comes from and what would stop others who can access this data from replicating what we do. So when creating a business centered around AI, it is extremely important to make sure that your data is proprietary and you are not just building a business on analyzing public datasets.”

But even with a good dataset, this is still not enough. For example, does the AI model make a real difference? Or are the results mostly minor improvements?

“Investors are interested in startups that are building tailored AI solutions for previously unsolvable problems,” said Jay Srinivasan, who is the CEO and co-founder of atSpoke. “So focus on areas where there are many inefficiencies and repetitive human processes, such as call centers and back-office paperwork processing. Investors want successful AI solutions that address specific workflows and problems, such as a legal document review.”

Then there is the issue of adoption.  Let’s face it, customers are still leery of the powers of AI. It does not help that the models are often complex and opaque. There are also the nagging problems with bias.

“Know your customer,” said Vasiliy Buharin, who is the Associate Director of Product Innovation at Activ Surgical. “You may have the algorithm to solve the worst LA traffic or shorten airplane boarding time 10-fold. But if the solution requires people to behave like preprogrammed automatons, it will never be adopted, and your company will fail. Your customer has a certain way of doing things. Your product must fit this workflow.”

Given all the challenges, a startup will likely undergo multiple pivots.  This is why you need a top-notch team that works well together. 

“If you start with a problem and leave yourself open to many different solutions, you will learn from the market and find the best solution to build your business around,” said Sean Byrnes, who is the CEO of Outlier. “Spending six months exploring and selecting the right problem can save you six years of wasted effort trying to build a business around a flawed idea.”

MLOps: What You Need To Know

MLOps is a relatively new concept in the AI (Artificial Intelligence) world and stands for “machine learning operations.” It’s about how best to manage data scientists and operations people to allow for the effective development, deployment and monitoring of models.

“MLOps is the natural progression of DevOps in the context of AI,” said Samir Tout, who is a Professor of Cybersecurity at the Eastern Michigan University’s School of Information Security & Applied Computing (SISAC). “While it leverages DevOps’ focus on security, compliance, and management of IT resources, MLOps’ real emphasis is on the consistent and smooth development of models and their scalability.”

The origins of MLOps go back to a 2015 paper entitled “Hidden Technical Debt in Machine Learning Systems.” Since then, the growth has been particularly strong. Consider that the market for MLOps solutions is expected to reach $4 billion by 2025.

“Putting ML models in production, operating models, and scaling use cases has been challenging for companies due to technology sprawl and siloing,” said Santiago Giraldo, who is the Senior Product Marketing Manager and Data Engineer at Cloudera. “In fact, 87% of projects don’t get past the experiment phase and therefore, never make it into production.”

Then how can MLOps help? Well, the handling of data is a big part of it.

“Some key best practices are having a reproducible pipeline for data preparation and training, having a centralized experiment tracking system with well-defined metrics, and implementing a model management solution that makes it easy to compare alternative models across various metrics and roll back to an old model if there is a problem in production,” said Matei Zaharia, who is the chief technologist at Databricks. “These tools make it easy for ML teams to understand the performance of new models and catch and repair errors in production.”
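Zaharia’s point about centralized experiment tracking can be sketched in a few lines of Python (the run names, parameters, and metrics below are all hypothetical):

```python
# A toy centralized experiment log: one record per training run,
# so alternative models can be compared and an older one rolled back to.
experiments = []

def log_run(model_name, params, metrics):
    """Record a run's parameters and evaluation metrics."""
    experiments.append({"model": model_name,
                        "params": params,
                        "metrics": metrics})

log_run("v1", {"lr": 0.1},  {"auc": 0.81})
log_run("v2", {"lr": 0.01}, {"auc": 0.86})

# Compare alternatives across a well-defined metric (AUC here).
best = max(experiments, key=lambda run: run["metrics"]["auc"])
```

Production tools such as MLflow do this with persistent storage and a UI, but the underlying idea is the same: every run is logged against the same metrics so comparisons and rollbacks are cheap.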

Something else to consider is that AI models are subject to change. This has certainly been apparent with the COVID-19 pandemic. The result is that many AI models have essentially gone haywire because of the lack of relevant datasets. 

“People often think a given model can be deployed and continue operating forever, but this is not accurate,” said Randy LeBlanc, who is the VP of Customer Success at RapidMiner. “Like a machine, models must be continuously monitored and maintained over time to see how they’re performing and shifting with new data–ensuring that they’re delivering real, ongoing business impact. MLOps also allows for faster intervention when models degrade, meaning greater data security and accuracy, and allows businesses to develop and deploy models at a faster rate. For example, if you discovered an algorithm that will save you a million dollars per month, every month this model isn’t in production or deployment costs you $1 million.”

MLOps also requires rigorous tracking that is based on tangible metrics. If not, a project can easily go off the rails. “When monitoring models, you want to have standard performance KPIs as well as those that are specific to the business problem,” said Sarah Gates, who is an Analytics Strategist at SAS. “This should be through a central location regardless of where the model is deployed or what language it was written in. That tracking should be automated–so you immediately know and are alerted—when performance degrades. Performance monitoring should be multifaceted, so you are looking at your models from different perspectives.”
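The automated alerting Gates describes might look like this minimal sketch, where a business-defined accuracy floor defines “good enough” (all names and numbers are illustrative):

```python
BUSINESS_FLOOR = 0.90  # accuracy the business has agreed is "good enough"

def degraded_weeks(weekly_accuracy, floor=BUSINESS_FLOOR):
    """Return indices of weeks where accuracy fell below the floor."""
    return [i for i, acc in enumerate(weekly_accuracy) if acc < floor]

# A drifting model: accuracy erodes as new data diverges from training data.
alerts = degraded_weeks([0.95, 0.93, 0.91, 0.87, 0.84])
if alerts:
    print(f"ALERT: model below floor in weeks {alerts}")
```

A real monitoring setup would track several such KPIs per model and fire notifications automatically, but the core check is this simple.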

While MLOps tools can be a huge help, there still needs to be discipline within the organization. Success is more than just about technology. 

“Monitoring/testing of models requires a clear understanding of the data biases,” said Michael Berthold, who is the CEO and co-founder of KNIME. “Scientific research on event, model change, and drift detection has most of the answers, but they are generally ignored in real life. You need to test on independent data, use challenger models and have frequent recalibration. Most data science toolboxes today totally ignore this aspect and have a very limited view on ‘end-to-end’ data science.”

Python Language: What You Need To Know

Python is one of the world’s most popular computer languages, with over 8 million developers (according to research from SlashData). The creator of Python is Guido van Rossum, a computer scientist and academic. Back in the late 1980s, he saw an opportunity to create a better language and also realized that the open source model would be ideal for bolstering innovation and adoption (by the way, the name came from his favorite comedy, Monty Python’s Flying Circus).

“Python is a high-level programming language, easy for beginners and advanced users to get started with,” said Jory Schwach, who is the CEO of Andium.com. “It’s forgiving in its usage, allowing coders to skip learning the nuances that are necessary in other, more structured languages like Java. Python was designed to be opinionated about how software should be created, so there’s often just a single appropriate way to write a piece of code, leaving developers with fewer design decisions to deliberate over.”
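A small taste of what Schwach describes, using a dict comprehension, one of the idioms the language encourages:

```python
# Idiomatic Python: a comprehension maps each word to its length
# in a single readable line, with no boilerplate.
def word_lengths(words):
    """Return a mapping from each word to its character count."""
    return {w: len(w) for w in words}

lengths = word_lengths(["zoom", "python", "ai"])
```

Most experienced Python programmers would write this the same way, which is the “single appropriate way” point in practice.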

A way to get started with the language is to use a platform like Anaconda, which handles the configurations and installs various third-party modules. There are also cloud-based editors, such as Repl.it (I also have my own course on Python, which is focused on the fundamentals).

“Python has become the most popular language of choice for learning programming in school and university,” said Ben Finkel, who is a CBT Nuggets Trainer. “This is true not just in computer science departments, but also in other areas as programming has become more prevalent. Statistics, economics, physics, even traditionally non-technical fields such as sociology have all started introducing programming and data analysis into their curriculum.”

No doubt, a major catalyst for the growth of the language has been AI (Artificial Intelligence) and ML (Machine Learning), which rely on handling huge amounts of data and the use of sophisticated algorithms. 

“Because Python is easy to use and fast to iterate with, it was picked up early on by academics doing research in the ML/AI field,” said Mark Story, who is a principal developer at Sentry. “As a result, many libraries were created to build workflows in Python, including projects like TensorFlow and OpenAI.”

Python has also proven effective for a myriad of other areas, such as building websites and creating scripts for DevOps. Yet it is with AI/ML that the language has really shined.

“Analytics libraries such as NumPy, Pandas, SciPy, and several others have created an efficient way to build and test data models for use in analytics,” said Matt Ratliff, who is a Senior Data Science Mentor at NextUp Solutions. “In previous years, data scientists were confined to using proprietary platforms and C, and custom-building machine learning algorithms. But with Python libraries, data solutions can be built much faster and with more reliability. SciKit-Learn, for example, has built-in algorithms for classification, regression, clustering, and support for dimensionality reduction. Using Jupyter Notebooks, data scientists can add snippets of Python code to display calculations and visualizations, which can then be shared among colleagues and industry professionals.”
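As an illustration of the built-in algorithms Ratliff mentions, here is a short scikit-learn sketch that trains a classifier on the bundled Iris dataset (a standard toy example, not tied to any company in this article):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load a bundled toy dataset and fit one of scikit-learn's
# ready-made classification algorithms.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
score = model.score(X_test, y_test)  # accuracy on held-out data
```

In a custom-built world this would take hundreds of lines; here the algorithm, train/test splitting and evaluation are all library calls.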

Granted, Python is certainly not perfect. No language is. 

“Due to its interpreted nature, Python does not have the most efficient runtime performance,” said Story. “A Python program will consume more memory than a similar program built in a compiled language like C++ would. Python is not well suited for mobile, or desktop application development as well.”

But despite all this, there are many more pros than cons–and Python is likely going to continue to grow. 

“Python is an excellent choice for most people to learn the basics of code, in the same way that everyone learns how to read and write,” said Tom Hatch, who is the CTO of SaltStack. “But the real beauty of Python is that it is also a language that can scale to large and complex software projects.”

UiPath’s $225M Round: What Does This Mean For RPA (Robotic Process Automation)?

This week UiPath announced a $225 million Series E round at a $10.2 billion valuation. Alkeon led the deal, and the other investors included Accel, Coatue, Dragoneer, IVP, Madrona Venture Group, Sequoia Capital, Tencent, Tiger Global, Wellington, and T. Rowe Price Associates.

UiPath is the leader in the RPA (Robotic Process Automation) space, with ARR (Annual Recurring Revenue) of more than $400 million. RPA technology automates tedious, repetitive corporate processes, and the segment is the fastest growing in the enterprise software market.

“Their growth story is pretty simple, but often gets lost in Silicon Valley as we all strive for the shiny new object,” said Pankaj Chowdhry, who is the CEO of FortressIQ. “They provide a great product that helps their customers get more out of their existing legacy investments.”

The irony is that the core technology for RPA is based on something fairly simple: screen scraping. This makes it possible to record a user's input patterns and replay them against applications.
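The record-and-replay idea at the heart of RPA can be sketched in plain Python. The names below (`FakeInvoiceForm`, the action script) are hypothetical stand-ins, not UiPath's API; a real bot would drive an actual application's screen rather than a simulated form:

```python
# Illustrative sketch of RPA-style input replay: a recorded sequence of
# user actions is played back against an application, here simulated by
# a simple form object standing in for a legacy app screen.
from dataclasses import dataclass, field

@dataclass
class FakeInvoiceForm:
    """Hypothetical stand-in for the app screen a bot would drive."""
    fields: dict = field(default_factory=dict)
    submitted: bool = False

    def type_into(self, name, value):
        self.fields[name] = value

    def click(self, button):
        if button == "submit":
            self.submitted = True

# A "recording" of a user's input pattern, captured once, replayed many times.
script = [
    ("type", "vendor", "Acme Corp"),
    ("type", "amount", "1250.00"),
    ("click", "submit", None),
]

def replay(form, actions):
    """Drive the form exactly as the recorded user did."""
    for kind, target, value in actions:
        if kind == "type":
            form.type_into(target, value)
        elif kind == "click":
            form.click(target)

form = FakeInvoiceForm()
replay(form, script)
print(form.submitted)  # -> True
```

The value comes from running that recorded script unattended, thousands of times, against systems that were never designed with an API.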

Yet companies like UiPath have pushed RPA well beyond that, layering on machine learning, computer vision, process mining, intelligent OCR (Optical Character Recognition), and NLP (Natural Language Processing). The result is that the bots are getting smarter.

The COVID-19 pandemic has also accelerated growth, at least for the larger players. “Companies today need to ensure they have the right structure and right processes in place to withstand whatever comes their way,” said Vijay Khanna, who is the Chief Corporate Development Officer at UiPath. “With that, they are increasingly turning to automation to accelerate their digital transformation efforts and ensure they can operate as productively and flexibly as possible moving forward.”

The RPA industry has been the subject of various criticisms. For example, the technology is often perceived as a Band-Aid that cannot scale and requires too much training to create useful bots.

“RPA disruption needs to be consumerized to provide self-service for users,” said Muddu Sudhakar, who is the CEO and co-founder of Aisera.

The competitive environment is getting more intense as well. Keep in mind that Microsoft is investing heavily in its own RPA platform. And there is buzz that Amazon and Google will make a play for the opportunity.

“The funding for UiPath is consistent with our view of a technology at the proverbial inflection point,” said Adib Ghubril, who is the research director at Info-Tech Research Group. “IBM has actively sought to strengthen its position in this market since the beginning of the year. Indeed, many believed that Automation Anywhere or UiPath would be the target. The fact that IBM opted for a much smaller player–WDG Automation–suggests that the big three–Automation Anywhere, BluePrism, and UiPath–are liking their odds of going-it-alone, at least for now.”

The UiPath round also highlights that the RPA market is massive and still in the early stages. “The adoption of RPA keeps growing as enterprises realize the benefits automation brings to operational efficiency, cost reduction, and employee and customer experience–making RPA one of the highest priorities for tech investments,” said Barry Cooper, who is the Enterprise Group President at NICE. “With the existing economic reality causing some enterprises to cut down on their workforce and look for cost efficiencies, RPA helps to maintain business continuity by complementing and assisting the existing workforce to do more with less. Robotic software can either take over some of the manual tasks to free up employee’s time for other important areas, or assist the human workforce, side by side, with attended bots.”

But with the large RPA firms bolstering their balance sheets with large amounts of venture capital, does this mean that there are fewer opportunities for startups? Is there a crowding out effect?

Well, not necessarily.

“There is opportunity for small operators to provide unique and price competitive products to disrupt the digital world,” said Vadim Tabakman, who is the Director of Technical Evangelism at Nintex. “With the world focusing on how machine learning and artificial intelligence can help—there are interesting opportunities that small operators can investigate that larger ones will have a harder time to implement and modify their existing tools.”