6 outsourcing questions to ask during an M&A

Merger and acquisition activity is at record levels, requiring selling companies to renegotiate outsourcing arrangements for their divested entities.

Merger and acquisition deal announcements hit an all-time high in 2015, from Anthem and Cigna to EMC and Dell. And experts expect robust M&A activity this year. But just as major mergers lead to major integration efforts for IT, they also spell significant work around outsourcing arrangements. In fact, the selling company is typically responsible for negotiating new sourcing services agreements before a divestiture is complete.

Not only is the seller often obligated contractually to ensure that the divested entity can operate once it is removed from the seller’s IT infrastructure, “there is always a risk that the sale may be delayed or cancelled, and unless the seller negotiates the right to terminate the new agreement if the sale fails to occur or exercises its right to terminate for convenience, it is now contractually bound to receive those [IT] services,” says Derek Schaffner, an attorney in Mayer Brown’s Washington DC office and member of its business and technology sourcing practice. What’s more, if the seller is actively involved in M&A transactions, the company will want to handle the issue of outsourced IT services well in order to maximize future sales prospects, Schaffner says.

However, addressing outsourced IT contracts prior to a merger or acquisition can be tricky. There is often a lack of clarity around both the previous consumption of IT services by the divested entity and the future requirements of the purchasing company. Here are six questions the selling company must consider to do this well.

1. How accurate is your historical IT outsourcing information?

The volumes of IT services used by the divested entity may not have been measured individually, which can be problematic for pricing. For example, it may not be possible to discretely identify invoice volume in a finance and accounting deal. In such cases, the buyer and the IT service provider may ask for a period of time (typically six months) to develop a baseline of IT services volume followed by an adjustment in pricing at a later date.

“The seller should push to include contractual provisions that describe a formulaic method to set the new volume baseline,” says Schaffner. “If the new volume baseline is significantly higher, the new owner may be faced with a large one-time invoice. To the extent that the new volume baseline deviates significantly from the initial assumption in either direction, the provider’s solution may not be appropriately sized and the parties will need to engage in further contract negotiations which may increase the absolute price or the cost per unit price.”
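A formulaic true-up of this kind might look like the following sketch. The function, volumes, and prices here are hypothetical illustrations, not terms from any actual contract.

```python
# Hypothetical illustration of a formulaic volume-baseline true-up.
# All names and numbers are assumptions, not terms from a real agreement.

def true_up_charge(assumed_volume, measured_volume, unit_price, months_elapsed):
    """Retroactive adjustment once the measured baseline is known.

    A positive result is a one-time invoice to the new owner; a negative
    result would be a credit if actual volumes came in under the assumption.
    """
    delta = measured_volume - assumed_volume
    return delta * unit_price * months_elapsed

# Example: pricing assumed 10,000 invoices/month at $1.50 each, but the
# six-month measurement period shows 12,500 invoices/month on average.
adjustment = true_up_charge(10_000, 12_500, 1.50, 6)
print(f"One-time true-up invoice: ${adjustment:,.2f}")  # $22,500.00
```

This also shows why the seller wants the formula in the contract up front: the size of that one-time invoice follows mechanically from the measured baseline rather than from a renegotiation.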

2. Do you understand the buyer’s unique requirements?

“The buyer may have a different risk profile and set of policies that the provider will need to adhere to,” says Schaffner. However, the acquirer does not have any contractual standing to negotiate terms until the new agreement is assigned to them, and the seller is not always aware of the buyer’s specific positions. The service provider may be willing to accommodate changes after assignment, but bargaining power may be reduced at that point.

Schaffner suggests that the seller solicit buyer input upfront on key legal, commercial, and technical terms. “Subject matter experts of the acquirer should have the opportunity to vet the statements of work, service levels, and pricing since operational ownership will transfer to them after the sale of the divested entity,” he says. “Likewise, the acquirer’s legal representatives should be consulted to provide input on items such as liability caps, termination rights, and intellectual property rights.”

3. Will the terms of a cloud-based deal work for the new entity?

Companies are often eager to start up new IT services for an acquired entity as soon as possible, so cloud computing options may seem like the natural choice. But, notes Schaffner, “as with any cloud agreement, there are tradeoffs between the ability to negotiate customized terms vs. the speed to deploy the solution using the provider’s standard terms.” Typically, cloud contracts will stipulate terms that may prove problematic for the buyer, such as service-level credits as the sole and exclusive remedy, limited audit rights, the right to store and process client data anywhere, and the provision of minimal disengagement services.

“The economics of a cloud solution dictate that the provider is not able to offer the more robust terms that the divested entity enjoyed under a master agreement while it was owned by the seller,” says Schaffner. “Consequently, the divested entity (and buyer) needs to evaluate these reduced terms and decide whether or not the proposed solution will meet its requirements.”

4. Is the provider adequately prepared for the ‘hypercare’ period?

In any outsourcing transition, services should be closely monitored during the first three months after the handoff to make sure work is proceeding as planned. Providers typically offer extra resources during this time (for an additional fee) to help ensure successful implementation. This so-called “hypercare” period is even more important in M&A transactions, says Schaffner, due to the number of new players involved.

5. What happens if the acquisition fails?

Companies should include a clause that says the selling entity can cancel the new outsourcing agreement if the sale does not go through, says Schaffner. However, the provider may look to recover business development expenses. “Some, but not all, providers are open to including such a clause,” he says. “Those that oppose including this type of clause typically argue that the seller will most likely want to find another buyer for the entity and, therefore, would still need to contract for the services. These providers also cite that the seller can terminate the agreement for convenience—with payment of termination charges.”

6. What if we can’t deliver a new outsourcing agreement before divestiture?

All is not necessarily lost. A well-negotiated services contract between the seller and the original provider might have included the right for the divested entity to continue to receive services under that master agreement, says Schaffner.

Source: cio.com – 6 outsourcing questions to ask during an M&A by Stephanie Overby

Why Privileged Identity Management is Critical for Secure IT Outsourcing

We visited with David McNeely, VP of Product Strategy at Centrify, to talk about why secure privileged identity management is critical as more of today’s businesses outsource IT functions and rely on vendors to troubleshoot systems and applications.

Centrify recently released a new privileged identity management solution supporting federated privileged access across an organization’s entire security eco-system, including secure outsourcing of IT and application development.

ADM: What is privileged identity management?

McNeely: Privileged identity management reduces the risk of security breaches by minimizing the attack surface. Essentially “the keys to the kingdom,” privileged accounts provide elevated access to an organization’s most critical data, applications, systems and network devices. And as more enterprises embrace the cloud, privileged accounts increasingly lie outside the corporate perimeter and are frequently shared by both internal IT and often remote third parties such as contractors and vendors. Therefore it is no surprise that privileged accounts are top targets for hackers and malicious insiders alike.

Privileged identity management is made up of a set of solutions to enable organizations to control user access and privileges. The first challenge is to consolidate user identities into a centralized identity management platform such as Active Directory. Next, organizations should lock down local administrative accounts and their passwords so that they can control who can use these accounts, a practice known as shared account password management.

When a user does need privileges to perform their duties, organizations should grant a very granular set of rights on the specific systems where those privileges are required, also known as super user privilege management. And finally, there are other applications or batch jobs that may need to log in to another computer or application in order to perform their duties, and these passwords should be carefully managed and periodically rotated, also known as application-to-application password management.

ADM: Can you provide an overview of the recent advancements Centrify announced to its privileged identity management solution?

McNeely: Centrify updated its privileged identity management solution to support federated privileged access across an organization’s entire security ecosystem, including secure outsourcing of IT and application development. Centrify is the first vendor in the industry to do this.

Federated identity management enables an organization to establish trusted identity relationships with its outsourcing partners so that employees of the outsourcing partner only need to authenticate to their own company’s identity management system, without requiring them to remember yet another user ID and password for each of their company’s clients.
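The trust relationship McNeely describes can be sketched in a few lines. This sketch assumes a shared-secret HMAC signature in place of the X.509/SAML machinery a real federation protocol (SAML 2.0, OIDC) would use, and every identifier in it is illustrative; it is not Centrify's implementation.

```python
import hashlib
import hmac
import json
import time

# The client company keeps a trust registry of partner identity providers,
# not individual accounts for each partner employee.
TRUSTED_PARTNERS = {"acme-outsourcing": b"pre-shared-secret"}  # IdP id -> key

def verify_assertion(partner_id, payload, signature):
    """Accept a login assertion only if a trusted partner IdP signed it."""
    key = TRUSTED_PARTNERS.get(partner_id)
    if key is None:
        return False  # unknown identity provider: no trust relationship
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, signature):
        return False  # tampered or forged assertion
    claims = json.loads(payload)
    return claims.get("exp", 0) > time.time()  # reject expired assertions

# The partner's IdP, not the client company, authenticates the employee and
# issues the signed assertion; the client only verifies the signature.
payload = json.dumps({"sub": "jdoe@acme", "exp": time.time() + 300}).encode()
sig = hmac.new(b"pre-shared-secret", payload, hashlib.sha256).hexdigest()
print(verify_assertion("acme-outsourcing", payload, sig))  # True
```

The point of the pattern is visible in the registry: deprovisioning a departed partner employee happens at the partner's IdP, and the client never has a stale local account to clean up.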

The company that is contracting services from the outsourcing partner no longer has to create individual accounts for each partner employee. It also doesn’t need to worry about deleting accounts for partner employees who leave or change job roles.

Centrify also provides secure remote access for these outsourcing partners so that they can perform their duties without requiring VPN access. Secure remote access is provided through the Centrify Privilege Service portal, which provides web-based access to server console interfaces for both UNIX/Linux and Windows, all without requiring any software or plugins to be installed. Application developers can also access internal web-based application interfaces through the Centrify portal.

ADM: Why is it critical to govern and secure federated access by outsourced IT, vendors and other third parties?

McNeely: Since more of today’s businesses are outsourcing IT functions and relying on vendors to troubleshoot systems and applications, it is critical that organizations protect privileged access. Recently there have been several high-profile breaches in which compromised third-party business partners led to the data breach.

Centrify recently commissioned a survey and found that every one of the respondents reported outsourcing at least one IT administrative function as well as one development project. With outsourcing increasing, and expected to be a $335 billion industry by 2019 according to Gartner, it is increasingly important to establish secure processes that enable outsourcing organizations to both authenticate and securely access enterprise resources.

ADM: How does Centrify’s solution differ from traditional privileged identity management solutions?

McNeely: Traditional privileged identity management solutions require organizations to create and manage identities for outsourced IT administrators within their internal environment and grant VPN access. This increases risk as the number of privileged accounts disconnected from an authoritative identity provider grows and more laptops establish VPN connections to internal networks. The result is an expansion of potential attack points for hackers, disgruntled insiders and malware.

Centrify’s approach is unique. It enables an organization to reduce risk by enabling secure remote access through a web-based portal for outsourced IT administrators and outsourced developers to its infrastructure through federated authentication. The outsourcing service retains management of their employee identities, and the customer organization uses Centrify to grant web-based access and privilege for systems and applications.

Privileged access is governed through request and approval workflows, monitoring with optional termination of privileged sessions and reconciliation of approved access versus actual access to critical infrastructure. The solution supports businesses outsourcing to more than one service organization while ensuring identity lifecycle management for outsourced IT administrators and developers remains with their employer, including the disabling of their enterprise identity upon employment termination.

ADM:  What are some of the other new product features?

McNeely: There are two other new features that we announced, Multi-Factor Authentication (MFA) for Linux servers and Application to Application Password Management (AAPM). By configuring MFA for IT administrators who access Linux systems and require elevated privileges, organizations can protect against hackers using stolen passwords and credentials.

Centrify enables multi-factor authentication to be applied granularly based on a centralized policy, enabling IT to determine whether MFA should be applied at login or individually for specific privileged commands, such as every command that requires root or oracle permissions.

Application password management is even more important for developers building multi-tiered applications or applications that run on top of clusters such as in Big Data environments. Centrify provides both the ability to centrally define and locally provision these accounts as well as enable centralized password management. This enables server account passwords to be periodically rotated so that passwords are no longer hard coded within client applications.

The Centrify CLI Toolkit or REST APIs enable a client application to request checkout of a server account password so that it can continue to perform as desired, all without having a hard-coded password embedded in the application. This helps organizations meet compliance and security policies as well as protect against cyber threats.
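The checkout-and-rotate pattern described here can be sketched in a few lines. This is not Centrify's actual API; the class and method names below are invented to illustrate why run-time checkout removes hard-coded passwords from client applications.

```python
import secrets

# Illustrative vault, NOT Centrify's API: every name here is hypothetical.
class PasswordVault:
    def __init__(self):
        self._store = {}

    def register(self, account):
        """Provision a managed account with a random initial password."""
        self._store[account] = secrets.token_urlsafe(24)

    def checkout(self, account):
        """Client apps fetch the current password at run time instead of
        embedding it in source code or configuration files."""
        return self._store[account]

    def rotate(self, account):
        """Periodic rotation: the old password becomes invalid everywhere
        at once, without touching any client application's code."""
        self._store[account] = secrets.token_urlsafe(24)

vault = PasswordVault()
vault.register("batch-job-db")
before = vault.checkout("batch-job-db")
vault.rotate("batch-job-db")
after = vault.checkout("batch-job-db")
print(before != after)  # True: clients always see the current secret
```

Because every caller checks out the secret on demand, rotation is a single vault operation rather than a hunt for every script and config file where a password was pasted.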

ADM: What are the benefits for enterprises and developers?

McNeely: Centrify’s cloud-based security solution enables organizations to minimize attack surfaces, thwart in-progress attacks and achieve continuous compliance. Centrify makes it easier to provide outsourced IT and developers with access to the systems and applications they need without having to manage the developer’s identity or passwords as well as provide access without having to give out VPN access.

Developers can also use Centrify to manage the relationships they have with their clients, making it easier to access each client’s systems and applications without having to remember different accounts and passwords for each one. Developers only need to log in to their own portal with their company credentials in order to see all of their clients’ applications and servers.

Source: appdevelopermagazine – Why Privileged Identity Management is Critical for Secure IT Outsourcing

The Automation Threat to Nearshore

The automation threat is real, but Latin American markets can be expected to navigate through the digital transformation with innovation.

In 2013, a pair of Oxford University economists found that a whopping 47% of jobs were at “risk” of displacement due to the coming computerization of the U.S. labor market. Alan Winfield, a professor specializing in robotics who spoke this week at the global economic summit in Davos, Switzerland, recently put this fear into words in an interview with Bloomberg. “If some of the predictions about tech and employment come true, then we should all be worried,” said Winfield. “There need to be solutions.”

From a global perspective, one of the industries that will feel the most pressure is the BPO and IT domain. The threat is particularly menacing to offshore markets that have relied principally on low wages to win outsourcing contracts.

Business Rationale vs. Historical Reality

There are two competing narratives here. The first is the transformative power of technology when it comes to business processes and IT outsourcing. The ratio of human-powered work to automation-based delivery can — and will — shrink. A report by management consulting firm A.T. Kearney concludes that nearly all of these processes can be done more cheaply and more efficiently through software. A.T. Kearney estimates that robotic process automation can yield cost savings of 25%-50% for many business processes.

The second narrative is the historical record. This isn’t the first time we have been told the world will be upended by technological change. In fact, such warnings have been a part of each industrial revolution going back to the first, when the Luddites attacked the weaving machinery that displaced England’s textile weavers. Later, in the wake of the Second Industrial Revolution, Cambridge economist John Maynard Keynes anticipated more pain on the way for workers. He predicted wholesale unemployment, what he termed “technological unemployment,” for those whose hands competed against machines to produce the same goods. Economies would power along thanks to a larger set of machines, he projected, while restricting the role for human labor to one of oversight of the machines and the provision of services.

So where are we this time? Is one technology bound to change outsourcing as we know it? And where does the nearshore fit in?

The Nearshore Challenge from Automation

Latin America is not immune from the disruption of digital transformation. Although business process and IT operations in Latin America rely less on wage arbitrage in order to win outsourcing contracts — instead coupling cost savings vis-à-vis the United States with the draw of strategic geography — BPO in the region remains a human-intensive enterprise. Repetition of tasks characterizes business processing across much of the region, putting many workers at real risk of losing their jobs to automation. The firms and markets most reliant on low cost as a differentiator are most at risk.

So far, the vendors that have begun incorporating BPaaS (business process as a service) in Latin America have done so in only a few pockets of the region. But those with established links to major global outsourcers are likely to lead the way in incorporating greater levels of automation. This puts the likes of Mexico and Costa Rica at an advantage, because they have a history of high-end BPO delivery and they enjoy the presence of market-leading outsourcing firms.

Uruguay is also well situated, albeit for different reasons. Uruguay has never been able to compete globally on the basis of labor cost, and this has created a niche market. Uruguay emerged as a delivery destination for higher-value BPO, so it is now positioned to avoid the disturbing effects of automation by delivering knowledge-process outsourcing (KPO), including legal process outsourcing.

Even with regard to the broader disrupting effect of automation, there’s reason to be upbeat. Talent gels when facing crisis. Think of the formation of social media companies amid the Great Recession. Many Latin American markets are chock-full of tech talent, and they can be expected to navigate their way through the digital transformation with innovation. There is a lot more innovation — in all sectors — occurring in Latin America than most realize, with companies leading the way on everything from renewable energy and software development to streaming video and water purification.

Some industry firms will seize the opportunity created by automation, while others will sidestep the technology to move up the value ladder and deliver higher-level outsourcing solutions. Because BPaaS erodes the advantage of economies of scale and has lower fixed costs than BPO has had in the past, expect new entrants into the delivery market. As much as anything, automation will open up the field: even as BPaaS upends staid firms, it will expand the playing field for newcomers.

Source: nearshoreamericas – The Automation Threat to Nearshore

New Business Models Changing Outsourcing Market

According to the 2016 A.T. Kearney Global Services Location Index report, two new business models are expanding the options available to companies looking for alternatives to traditional outsourcing. The report—in addition to analyzing and ranking the top 55 countries for outsourcing worldwide—focuses on robotic process automation (RPA) and business process as a service (BPaaS) and concludes that these two new models are challenging the traditional offshoring outsourcing model.

RPA uses software robots to process rules-based, repetitive operations three times faster than the average human. These software robots are tailored to a company’s own user interfaces for the various operations that the robot is processing. The software robots can work “24/7/365” with no errors, absences, or diminishing returns, while lessening the need to hire and train personnel. Additionally, the report found that an RPA license, on average, costs one third as much as an offshore employee and one fifth as much as onshore staff, estimating that RPA can save a company up to 50% in select back-office processes.
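The "rules-based, repetitive" behaviour described above can be illustrated with a toy rule table. Real RPA tools drive existing user interfaces rather than standalone scripts, and the rules and thresholds below are invented purely for illustration.

```python
# Toy sketch of rules-based, repetitive processing in the RPA sense.
# The invoice fields, thresholds, and action names are all hypothetical.

RULES = [
    (lambda inv: inv["amount"] > 10_000, "escalate_to_human"),
    (lambda inv: not inv["po_matched"], "return_to_sender"),
    (lambda inv: True, "auto_approve"),  # default rule always matches last
]

def process(invoice):
    """Apply the first matching rule, the same way every time: this
    determinism is why software robots show no errors or fatigue."""
    for condition, action in RULES:
        if condition(invoice):
            return action

batch = [
    {"id": 1, "amount": 250.0, "po_matched": True},
    {"id": 2, "amount": 25_000.0, "po_matched": True},
    {"id": 3, "amount": 90.0, "po_matched": False},
]
print([process(inv) for inv in batch])
# ['auto_approve', 'escalate_to_human', 'return_to_sender']
```

The loop never improvises: any case the rule table does not anticipate has to be routed to a person, which is exactly the division of labour the report describes.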

BPaaS is exclusively based in the cloud and uses a standardized interface for multiple customers. Whereas RPA costs are fixed, BPaaS costs are variable based on output or usage. The report states that switching to BPaaS could save a company approximately 10% versus traditional business process outsourcing (BPO). In 2014, the report states, global BPaaS accounted for almost $18 billion of the approximately $160 billion global BPO market. Because of the minimal fixed costs involved, the key growth sector for BPaaS is smaller and midsized companies that do not have the volume or need to enter into large outsourcing contracts.

Given the continued growth of (and potential savings produced by) these models, it appears that all potential outsourcing customers will need to consider the viability of RPA and BPaaS when developing their outsourcing strategies.

Source: Natlawreview – Business Models Changing Outsourcing Market

IBM in 10-year IT outsourcing deal with Telefonica

Enterprise IT vendor IBM on Wednesday announced its 10-year IT outsourcing deal with telecom network operator Telefonica.

IBM will modernize and manage different Telefonica Human Resources and Finance Management processes as part of the agreement announced on Wednesday.

IBM will also be acquiring three companies of Tgestiona, a Telefonica company and provider of finance and human resources process management for the communications sector in Spain, Argentina and Peru.

Tgestiona provides business process outsourcing (BPO) services, with offices in Spain, Argentina and Peru serving clients across Europe, Latin and Central America.

Telefonica, which has 327 million subscribers across 21 countries, aims to simplify operations, drive efficiencies, and improve the client experience as part of the IT outsourcing contract. IBM did not reveal the size of the IT contract. IBM is currently working with telecoms, including Bharti Airtel, in emerging telecom markets.

IBM said it differentiated with its consult-to-operate approach and its digital reinvention point of view, which aligns with Telefonica’s transformation strategy.

“IBM was chosen as our strategic partner based on its ability to demonstrate market-leading best practices in finance and HR, deliver a superior user experience to Telefonica, and demonstrate automation and digital innovation while respecting the cultural diversity of our clients,” said Javier Delgado, director of Planning, Projects and Global Services at Telefonica.

Jesus Mantas, general manager of IBM Consulting and Global Process Services, said: “Our deal with Telefonica represents the future of process transformation in the digital age. It delivers efficiencies while addressing the cultural and human elements of digital change, reducing risks and operational disruption.”

Source: Telecomlead.com – IBM in 10-year IT outsourcing deal with Telefonica

Government to review all Atos contracts over £10m – but can it really point the finger?

The Public Accounts Committee recently criticized the outsourcing firm after it emerged that Atos had not delivered on an NHS data extraction system, costing millions of pounds.

The British government is to hold a review of all contracts worth more than £10 million held with Atos, following a scathing report from the Public Accounts Committee that found that the outsourcing company did not show an “appropriate duty of care to the taxpayer” when working on an NHS IT project.

However, whilst Atos has proven to be an ineffective supplier in a number of cases, when the government is conducting its review it should also consider the role it has played in managing these agreements.

It’s worth remembering that this isn’t the first supplier to come under fire for poor performance following a botched IT outsourcing deal. CSC, G4S and Serco have all faced similar scrutiny.

And so, whilst Atos likely deserves its fair share of criticism for its role, equally I think we need to remember that there are two parties involved here and that the government cannot place all responsibility for failure on its private providers.

Moreover, the government’s commercial capability has regularly been highlighted as lacking when it comes to contract management, which often leaves it in a tight corner when things go wrong.

Equally, how effective are these reviews in driving change? Obviously something needs to be done, but it seems that the long term results are never particularly significant. For example, those suppliers that have been investigated or criticised in the past still continue to win government contracts.

All eyes on Atos

The project that has prompted this review is the General Practice Extraction Service (GPES), which was intended to allow eight NHS organisations to extract data from all GP practice computer systems in England. The data extracted was supposed to allow for better monitoring of quality, better planning of health services, and to keep up with medical research.

However, the Public Accounts Committee found that the project overran by a number of years, the costs increased from £14 million to £40 million (with at least £5.5 million of write-offs) and that the service was only delivering about half of what it was specified to do.

The report stated:

“We are not satisfied Atos provided proper professional support to an inexpert client and are very concerned that it appears to have acted solely with its own short term best interests in mind.

We found that Atos’s chief executive, Mr Adrian Gregory—the company’s witness in our enquiry—appeared rather indifferent to the plight of the client; we expect more from those contracting with government and receiving funds from the taxpayer.”

In addition, the Committee recommended that the Cabinet Office should “undertake a full review” of Atos’s relationship as a supplier to the crown. Today the Cabinet Office said:

“In line with the Committee’s recommendation Cabinet Office is undertaking a review of all current ATOS contracts with Central Government with an annual spend over £10 million.”

Estimates of how much the government spends with Atos vary, but it is thought that this could be anywhere between £500 million and £3 billion.

All eyes on government

However, as noted above, Atos wasn’t the only party to come under fire from the Public Accounts Committee. Yes, it has a duty of care to the taxpayer, but if the government is going to insist on outsourcing much of its capability to the private sector, it needs to realise that, more often than not, those companies are going to be mostly concerned with their bottom line.

And if it wants to get the most out of its suppliers, it needs a commercial capability that matches what can be found in the private sector.

For example, the Committee’s report also noted:

“The Department accepts that NHS IC did not have the expertise or capability required to run this project and that the governance arrangements were not fit for purpose. There was an exceptionally high level of staff turnover in key roles with ten project managers over a five year period and three Project Board Chairs over three years. The Department did nothing about this despite concerns raised by their own gateway review team. The Department also raised concerns about the adequacy of the testing, but NHS IC did not act on them but instead chose to accept the risk and sign off the system.”

And:

“Whitehall is not learning from past failures in IT projects, and is still repeating the same mistakes. This project exhibits many weaknesses common to other high profile IT failures such as the National Programme for IT in the NHS, the Single Payment Scheme and Tax Credits. These include: lack of staff continuity, inadequate testing, the wrong contracting approach and a governance structure which was not fit for purpose. Whitehall has to start learning from these failures and make real changes to how IT projects are managed and delivered.”

As the Committee highlighted, on this particular project, plus many others, the government did not fulfil its duties to the taxpayer by acting as the best buyer it could to deliver on requirements.

We have seen similar criticisms across a number of IT projects in the past, where the government’s commercial skills have been so poor that suppliers have been allowed to fail without any repercussions. The government’s inability to properly manage contracts has often meant that much of the responsibility (and cost) has fallen on the buyer.

We are all too aware of the skills problem facing the public sector at present, where the need to attract some of the best tech talent is being balanced against the need to slash budgets. And whilst certain gaps are being filled and there is evidence of some top talent being attracted, it’s clear that the skills problem is still very real.

And until that problem is solved (which is likely to come not only from bringing in top talent, but also from up-skilling current civil servants and figuring out ways to change behaviours and culture), these project failures will persist.

My take

Whilst the Atos review is probably entirely necessary, I hope that when the government is assessing all of the contracts it holds with the company, it is prepared to find that it, too, is not performing as well as it should.

Both the outsourcing provider and the government buyer have a duty of care to the taxpayer. But it seems that for the government it is sometimes easier to point the finger of blame outwards than to accept that long-term change requires it to address its internal faults.

Want long lasting change? Fix the problems internally first.

Source: diginomica.com-Government to review all Atos contracts over £10m – but can it really point the finger?

Artificial intelligence and robotics: Separating reality from the hype

You don’t need me to tell you that two of the biggest technology trends in business right now are Robotic Process Automation and Artificial Intelligence. But if I said to you that they were Intelligent Automation and Cognitive Computing, or Service Delivery Automation and Autonomics, and that these were actually pretty much the same thing, then you might start to question me. I could even take it further and combine both of these into a super-trend of Cognitive Robotic Process Automation, at which point you would have given up all hope in me.

But this is exactly what is happening in the world of technology marketing at the moment. Just when you thought the industry had finally settled on some common terms for something, somebody muddies the waters by inventing a new one. And not because it describes something fundamentally different, but just so that their offering or product sounds different from everyone else’s.

Now, this obviously isn’t a new practice, but the danger today is that the technologies we are talking about really are game-changers, and the need for clarity is crucial. Buyers need to understand whether they are getting, for example, a genuine Robotic Process Automation capability or a macro-driven piece of software that has been rebadged as RPA. They need to understand whether the Artificial Intelligence software really does self-learn or whether it is some convoluted logic dressed up as AI. Knowing these things will make the difference between success and failure of the project, and whether the investment you worked so hard to secure will actually deliver the benefits that were promised.

So, here are some pointers that will help you cut through the marketing hype and identify “real” RPA and “real” AI applications.

Robotic Process Automation

The term “robot” is useful here because the software replaces (or enhances) the work that a human being would normally do. Process automation has been around for a long while (even something like SAP can be described as process automation software), but the difference with RPA is the focus on human tasks. RPA software’s real value lies in the fact that it works at the “presentation layer” (the user interface) of the vast majority of computer systems and can be trained to read from and write to them relatively simply. This sort of “simple complexity” hasn’t been available before.

It is important to remember that the RPA software robots are effectively dumb: they will do exactly what you have trained them to do, 100 per cent of the time. But there is no “intelligence” in them. So, if anyone talks about Intelligent Automation, or Cognitive Robotic Process Automation, then start putting on your cynic’s hat.
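That determinism can be sketched in a few lines. The following is a purely illustrative toy, not any real RPA product: a “robot” that replays a trained sequence of presentation-layer steps verbatim, with no judgement of its own (all field names and values here are hypothetical).

```python
# Toy sketch of an RPA-style robot: it repeats its trained steps exactly,
# every run, and nothing else. There is no "intelligence" involved.

def make_robot(trained_steps):
    """Return a robot that executes its trained steps verbatim."""
    def run(screen):
        for action, field, value in trained_steps:
            if action == "read":
                # Read from the user interface, exactly as a person would.
                screen["clipboard"] = screen[field]
            elif action == "write":
                # Write exactly what it was taught, nothing more.
                screen[field] = value
        return screen
    return run

# "Train" the robot once; it will repeat this, and only this, forever.
steps = [("read", "invoice_total", None),
         ("write", "ledger_amount", "£1,250.00")]
robot = make_robot(steps)

screen = {"invoice_total": "£1,250.00", "ledger_amount": ""}
result = robot(screen)
```

If the screen layout changes or an unexpected value appears, a robot like this simply fails or does the wrong thing, which is exactly why the “dumb but reliable” framing matters.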

Artificial Intelligence

The opportunity for obfuscation with AI is enormous, and many people have openly taken that opportunity. The challenge comes because there is no single definition of AI – my favourite is that it is any technology that is 20 years from fruition. But if you think of AI capabilities in three different categories, then it should become somewhat clearer.

Firstly, there are AI technologies that are great at capturing information. This could be done through Vision Recognition (e.g. recognising a face in a photo), Sound Recognition (e.g. transcribing words that someone is saying), Search (e.g. extracting data from unstructured or semi-structured documents) or Data Analysis (e.g. identifying clusters of behaviours in customer data). The first three of these rely on what is called Supervised Learning, i.e. they require large labelled data sets to learn the necessary patterns, whereas the fourth uses Unsupervised Learning, which means that it can come up with the answers without you telling it what the question is. But all of these essentially turn (unstructured) data into information, and this is the most mature application of AI in business today.
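The Unsupervised Learning idea can be made concrete with a toy example. The sketch below runs a deliberately simple one-dimensional k-means on hypothetical customer spend figures: no labels are supplied, yet the algorithm still separates the data into groups. Real systems use far richer data and proper libraries; this is only meant to show the “no labels” principle.

```python
# Toy Unsupervised Learning sketch: cluster customer spend without
# ever telling the algorithm what the groups "mean".

def kmeans_1d(values, k=2, iterations=10):
    # Naive initialisation: the two extremes of the data.
    centres = [min(values), max(values)]
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest centre.
            nearest = min(range(k), key=lambda i: abs(v - centres[i]))
            clusters[nearest].append(v)
        # Move each centre to the mean of its cluster.
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return centres, clusters

# Monthly spend of ten customers -- no labels supplied.
spend = [12, 15, 11, 14, 13, 210, 190, 205, 198, 220]
centres, clusters = kmeans_1d(spend)
# Two groups emerge by themselves: low spenders around 13,
# high spenders around 205.
```

Contrast this with the supervised cases (vision, sound, search), where each training example would have to arrive with a human-provided answer attached.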

The second AI capability turns that information into something useful: it works out what is happening. This is done through Natural Language Processing (e.g. extracting the meaning from an email), Reasoning (e.g. how should I act based on the information given) or Prediction (e.g. predicting buying behaviours based on previous purchases). Some of these applications, such as Prediction, are more mature than others, but all of these can provide real value to a business.
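To make the Prediction example tangible, here is a deliberately trivial frequency model that guesses a customer’s next purchase from which items most often followed which in past purchase histories. The product names and histories are invented for illustration; a production recommender would be far more sophisticated.

```python
# Toy Prediction sketch: predict the next purchase from past co-purchases.
from collections import Counter

def train(histories):
    """Count which item most often follows each item across histories."""
    follows = {}
    for purchases in histories:
        for a, b in zip(purchases, purchases[1:]):
            follows.setdefault(a, Counter())[b] += 1
    return follows

def predict_next(follows, last_item):
    """Return the most frequent successor of last_item, or None."""
    counts = follows.get(last_item)
    return counts.most_common(1)[0][0] if counts else None

# Hypothetical purchase histories.
histories = [["laptop", "mouse", "bag"],
             ["laptop", "mouse"],
             ["phone", "case"]]
model = train(histories)
```

Even a model this crude “turns information into something useful”: given that a customer just bought a laptop, it predicts the mouse, because that is what the historical data says usually happens next.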

Finally, there is the capability to understand why something is happening. This area of AI feeds off most of the others I have mentioned. It is the least advanced area of AI and is not yet ready for business applications, but it will obviously have a huge impact once it is.

All AI applications will fit into one or more of the above categories. If what is being described to you feels like a bit of a round peg compared to my square holes, then you should start to question those capabilities. And if the talk is all about “neural networks” or “machine learning” (both of which are underlying AI technologies) then simply seek to understand what it does, rather than what it is.

RPA and AI are two very different technologies and should be treated as such (if you remember one thing, make it this: there is no such thing as Cognitive Robotic Process Automation). That said, the two complement each other very well (for example, with AI providing structured outputs from unstructured inputs, which can then be processed by RPA), which is why they can be deployed very effectively together. But please don’t be fooled by the hyperbole of the marketing speak that surrounds them – seek to understand or, if it’s still not clear enough for you, seek advice.

Source: SourcingFocus-Artificial intelligence and robotics: Separating reality from the hype