How to prepare IT workers for the impact of automation

What happens to technology professionals as artificially intelligent machines take over increasingly complex IT tasks?

Seven out of 10 corporate executives say they are making significantly more investments in artificial intelligence (AI) than just two years ago, according to Accenture’s recent Technology Vision survey. And more than half (55 percent) say they plan to use machine learning and embedded AI solutions extensively.

But rapid advancement of AI and robotics in the workplace has many professionals on edge—no more so than in IT. As machines become more sophisticated and able to learn and make decisions, they are becoming an increasingly important aspect of the IT ecosystem. And that creates fear, uncertainty and doubt in the human workforce.

Chetan Dube, founder and CEO of AI systems maker IPsoft, has said that the IT infrastructure of the future will be managed not by people but by smart systems. Engineering “chores” will be automated, and technologists will focus on creativity and innovation. However, the organization charged with delivering technology-enabled change to the rest of the company is hardly enthusiastic about such a transformation for itself. “While developments in artificial intelligence-fueled automation continue to advance, IT as a business function has been reluctant to change,” Dube says. “Instead of designing business operations around emerging technologies, IT is typically forced to work around infrastructure that already exists.”

CIO.com talked to Dube about the immediate impact of increased automation on IT professionals, mitigating disruption with top-down leadership, the obligations companies and governments have to retrain their labor forces for the future, and the changes we’ll all have to make as the work we do becomes fodder for the robots.

CIO.com: Are IT leaders considering how to manage the impact on the individuals on their teams as they incorporate more AI and robotics?

Chetan Dube, CEO, IPsoft: Because the hybrid AI workforce is already the direction that most industries are headed, IT leaders are definitely considering how to manage these changes and concerns directly related to their workforce. However, with many companies still operating on legacy platforms and outdated technologies, not every IT professional knows the best course of action.

Leadership can pave the way, but success will greatly depend on each individual’s willingness to adapt. Those who are willing to embrace change and work at updating their skillset to match the requirements of new roles will reap the rewards.

Within the IT function particularly, IT managers need to think about how they adapt to managing automated processes rather than organizing their functions around the more limited capacity of manual workers. This is a massive shift in mindset, but it will lead to greater opportunity for more highly skilled and varied job roles.

CIO.com: When people ask where they fit into a machine-driven future, how do you reply?

Dube: To overcome resistance to change, our perceptions of artificial intelligence and automation must change from fear to acceptance. Instead of focusing on how machines can replace humans, we should create scenarios where people work alongside them. The [issue] is no longer ‘machine vs. human,’ but rather ‘machine and human.’

We need to remember that people’s skills will be evolving, not static. Each new technological development creates roles that were unheard of a decade earlier. For example, the Twitter and Facebook revolution has led to the creation of tens of thousands of social media experts and even entire agencies dedicated to a new business need.

This adaptation can help shift IT leaders from maintaining day-to-day operations to dedicating themselves to strategic change. An engineer, for example, can develop new algorithms or systemic improvements that have the potential to completely transform a business and impact its productivity.

CIO.com: What are the most immediate challenges when introducing increased AI and automation?

Dube: The immediate challenge is properly guiding the shift in perception of an AI-driven workforce and reskilling your workforce to adapt to changes in roles and responsibilities. To do so, workers must be given the tools they need to succeed, and this starts with education. Educating the workforce about how automation can result in more enjoyable, strategic, and creative job roles, and how it will bring more productivity and increased revenues, must be a priority. This starts with CIOs and business leaders, who can help mitigate the risk of transition. For the transition to be successful, it must be supported completely from the top down within the organization.

While any kind of restructuring will inevitably be met with a degree of resistance, in the medium to longer term the new model has the potential to deliver far greater benefits to a company’s entire employee base.

CIO.com: Might there be some areas where automation is possible but the disruption would just be too great?

Dube: Business leaders must ask themselves a vital question: “What will happen if this change is not embraced?” [There may be] a domino effect. If a business fails to adapt, it is likely to die. If the business dies, then there will be no jobs to protect. Of course, that’s a worst-case scenario; however, it is plausible. Failing to embrace change and undertake automation can affect revenues and hinder competitiveness, which can both have a huge impact on an organization.

CIO.com: What sorts of new roles do you see emerging as a result of increasingly advanced automation capabilities?

Dube: The introduction of AI will see the creation of new executives such as the Chief Data Intelligence Officer. These highly skilled workers will be able to filter and interpret the vast amounts of data stemming from automated procedures and transform it into valuable intelligence. The ability to see the bigger picture and spot trends and opportunities will become a key skill within IT, as it maximizes the varying abilities of both people and machines.

Data scientist roles didn’t exist (under that name, at least) 10 years ago. Demand for commercially savvy data analysts is already vastly outstripping supply, with companies including Microsoft and Google on massive hiring sprees.

Another example is a demand for candidates within AI organizations who combine skills in mathematical logic with knowledge of the human mind in order to automate activities that can improve quality and speed of service across all industries. At its core, driving ROI and fostering innovation has been—and will continue to be—driven by people.

According to Accenture’s recent Technology Vision 2016 report, the qualifications for leading employees are changing. Only a few years ago, ‘deep expertise for the specialized task at hand’ ranked as one of the most important hiring factors for IT and business executives. Today, it is only the fifth most important characteristic required of employees. [Today, skills such as] the ability to quickly learn, the ability to multitask, and the willingness to embrace change [lead]. This strongly indicates that CEOs and CIOs are placing a premium on candidates whom they believe will evolve with their business and have the aptitude to learn quickly. This can only be achieved through proper education and training.

CIO.com: How can business leaders win the trust of their workforces amid this expansion of automation?

Dube: Building trust in an organization involves transparency in workforce changes and providing the right tools for both current and future employees to succeed. Leaders must prepare today as if tomorrow were already here. What jobs would exist? What would be the needs of corporations in a cognitive era? In order to build trust amongst employees, show them that the roles of the future are not to be replaced by AI.

For instance, rather than classic systems engineers, develop automation engineer roles today. By doing so, your engineers can be a part of galvanizing the transformation to a more efficient ecosystem. When your entire organization is embracing the change, it provides a smoother path to the future with less fear of the unknown.

CIO.com: What are forward thinking companies doing to ease uncertainty during this transition?

Dube: There are two core elements to effectively enabling change: education and incentives. Successful leadership will involve educating the workforce about how automation and overhauling the existing operational model can result in more enjoyable, strategic, and creative job roles, and how people and machines working together will bring more productivity and increased revenues.

Once the correct technology is in place, staff must be incentivized to assist in the smooth running of the transition. Incentives may come in many forms including financial rewards and the offer of additional training or better roles within the organization. This needs to be communicated from the highest level and filtered down throughout the workforce. Alongside incentives, workers must be given the tools to be able to succeed—be they financing, time, technology or management support.

CIO.com: What about the ethical considerations involved in creating policies to ensure that these changes have a positive impact on workers in the long term? What responsibilities do companies have to train their employees for different roles?

Dube: All stakeholders have a role to play. In addition to organizations investing in skills development, local and central government bodies need to look at the policies and incentives they can provide to accelerate reskilling. It’s also unquestionably the responsibility of all of us as individuals to adapt and seek out ways to take our own steps in learning more about the new skills we need to thrive—not just survive—in this new world. Labor flexibility to adapt will be a primary factor in the ability of entire economies to reorient themselves in order to prosper in this smart machine world. Galvanizing all stakeholders to collaborate positively is the only way forward.

CIO.com: What sorts of policies and processes will be more effective in helping professionals make this transition?

Dube: By hiring employees who have the aptitude to learn quickly, leaders can build a more adaptable workforce by providing these fast learners with the right tools to drive growth.

Although technology is what is driving this disruption, it is also at the center of creating the solutions to mitigate these risks. This includes massive online open courses (MOOCs) that are enterprise-focused and scalable. In addition, internal collaboration tools that foster group thinking and problem solving will also build a more effective workforce.

But this education must also go beyond the four walls of the office. There is a great opportunity for corporations to align their corporate social responsibility initiatives with STEM education [efforts] to integrate futuristic technologies into the education of our youth and better prepare them for the future. This includes exposing them to robotics and AI at a younger age and giving them opportunities to experiment with emerging technology. With AI already entering the workplace, we must prepare the next generation of the workforce to work seamlessly with these technologies as partners, rather than seeing them as a threat.

Source: CIO.com-How to prepare IT workers for the impact of automation


IT pros don’t fear rise of the robots

IT professionals are more likely to view robotics and other automation as an asset that can free them up from menial tasks than a threat to their jobs, according to a recent survey.

Robotic process automation (RPA), artificial intelligence, machine learning and autonomics — which can automate a number of IT tasks and enable those outside of IT to create software and intelligently manage IT infrastructure — have been touted as an alternative to traditional human-based IT services outsourcing for years. This automation of IT and business functions traditionally performed by skilled professionals has been called a job killer.

But a recent survey reveals that IT workers are not worried about the impact of RPA, artificial intelligence, or other forms of automation on their job security. The survey, conducted by Spiceworks and commissioned by Arago (an enterprise IT automation provider), targeted 196 IT workers at the administrative level of medium to large enterprises who were not purchase decision makers.

Automation can make IT departments more strategic

Nearly all respondents (93 percent) said that automated tools do not put their jobs at risk, and a similar number (91 percent) do not believe that automated IT tools are the beginning of an AI takeover. In fact, 88 percent said that automation either has freed or likely will free up their time to focus on more strategic initiatives. The biggest anticipated benefits of increased automation were increased free time to focus on important IT initiatives (88 percent), increased efficiency (85 percent), faster problem solving (54 percent), and a reduction in errors or reinforced compliance (54 percent).

The majority of IT workers surveyed said they already use traditional automation in some capacity. The most common automation processes or tools being used are group policies (91 percent), custom scripts or process (81 percent), and deployment or update tools (63 percent).

An overwhelming majority (85 percent) indicated that intellectually stimulating and learning or development activities are most important to their job satisfaction. Today, their time is split between menial and more stimulating tasks—most often troubleshooting or assisting users (92 percent), modernizing technology (92 percent), routine maintenance (91 percent), future IT planning and strategy (84 percent), and ticket documentation and reporting (75 percent). Their most preferred tasks are creating innovative technology or solutions (72 percent) and strategic planning (60 percent) and their least favorite work is ticket documentation and reporting (59 percent) and routine maintenance (39 percent).

Those surveyed said that they would prefer to spend their free time during the workday innovating around technology and solutions (72 percent), modernizing existing technology (63 percent), and undergoing training or other development (55 percent).

A separate report by outsourcing consultancy Everest Group said that spending on RPA grew at over 100 percent over the last two years and is expected ultimately to impact 30 to 40 percent of total IT and business process services spending. Everest estimates that RPA can reduce costs 35 to 65 percent for onshore IT services and 10 to 30 percent for offshore delivery while also offering benefits in the form of improved process quality, governance, security and continuity. Half of global in-house IT service centers surveyed by Everest Group said that they are actively planning or pursuing RPA pilots while more than a quarter (28 percent) have implemented RPA across multiple processes.
Source: CIO.com-IT pros don’t fear rise of the robots

The business processes behind services management

We take a look at the essential steps an organisation must take to multi-source IT services or bring them in-house.

Where and how are corporate IT services best located: in-house or outsourced?

Enterprises are constantly re-evaluating their IT sourcing using outcome-focused commercial models. Multi-sourcing gives an organisation an alternative to maintaining in-house technical teams or relying on large, single-source suppliers.

It can also be more adaptable by exploiting competitive market behaviour, which can cut costs and capitalise on innovation.

But Gartner published a study of IT sourcing capabilities in 2014, which found only 11% of respondents had mastered their approach to sourcing. The remaining 89% needed to improve competencies and significantly raise their maturity levels to manage multi-sourcing successfully.
The drivers

There has always been an ebb and flow of in-house/outsource use – often outsourcing for cost reasons, and then finding the consequent loss of control costs the organisation dearly. Alongside this, many executives are considering taking traditionally outsourced IT services back in-house for a variety of economic, service quality, responsiveness and flexibility reasons.

Technology developments in mobility, virtualisation and cloud computing – as well as the ubiquitous “X-as-a-service” offerings – redefine the operational/capital expenditure equation and directly sway IT sourcing decisions. Similarly, business developments – such as acquiring assets from mergers and acquisitions – affect IT sourcing preferences. These strategic business changes all require the corporate IT department to have both plans and tools in place to adapt quickly to such changes.

With many more, smaller suppliers providing commodity services based on open standards and internet access, the trend towards multi-sourcing has gathered pace, resulting in average sourcing deal sizes shrinking in price and time, while the number of contracts expands dramatically.

When an organisation takes back control of its IT from a lead outsourcer, there are several tasks its IT department must accomplish. These are to:

  • Establish and run its own datacentres with virtualisation and hybrid cloud – the remit of IT service management;
  • Manage multiple suppliers using consistent internal metrics. Experienced and qualified sourcing staff will be required to maintain the longer-term supplier relationships;
  • Undertake performance measurements and risk assessments to determine value for money. IT needs to regularly review its supply chain, as technology and other environmental factors disrupt established procedures;
  • Maintain a wider supplier overview to replace existing providers which are underperforming.

To address these challenges consistently and over longer periods of time, the IT department needs an analytical framework. The most commonly used frameworks for best-practice service management involve two major components:

  • Service integration and management (SIAM);
  • The information technology infrastructure library (ITIL) V3.

Of course, other frameworks related to service management may apply as well: the application service library (ASL), COBIT 5 governance, ISO/IEC 20000, Agile, SSADM, RAD, CMMI, Six Sigma quality measures, TQM, and PRINCE2 for project management – but these are generally just adjuncts to SIAM and/or ITIL.
Meeting the challenges of SIAM

Unlike ITIL V3 (for IT service management), SIAM is not a formalised framework and does not yet have an established body of knowledge, due in part to a general lack of published best practice. This presents some challenges to the IT organisation. The risks include confusion between systems and service integration, confusion between SIAM and IT service management of devices and users, uninformed decisions to outsource SIAM, and the use of multiple, best-of-breed suppliers – incurring large management overhead costs and difficulty in managing end-to-end services.

SIAM is both a model and a function which manages in-house corporate applications and the migration of applications to cloud hosting. It is seen as a way for large IT organisations to better manage and control multi-sourced operations, by compiling (and then sharing between themselves) best practice and their most successful management methods. The SIAM framework covers the complete lifecycle of services, both on the service supply and service demand side. A SIAM structure relies on a set of working practices with clear bounds of responsibility.

SIAM can manage application processing and storage requirements. With cloud, there is generally considerable pay-as-you-go capacity available, but customers need to be able to scale up and down to make the best use of it. That involves setting up auto-scaling such as provided by Amazon’s AWS EC2 and learning how to optimise the cost and flexibility benefits of cloud hosting with the service levels of a fully managed service.
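The scale-up/scale-down decision behind target tracking can be sketched in a few lines. The following is an illustrative, simplified analogue of the proportional rule that services such as AWS EC2 Auto Scaling apply (keep a metric near a target by resizing the fleet), not the actual implementation; the function name, bounds and thresholds here are hypothetical.

```python
import math

def desired_capacity(current_capacity: int, metric_value: float,
                     target_value: float, min_cap: int = 1,
                     max_cap: int = 20) -> int:
    """Return the instance count that should bring the metric back to target.

    Proportional rule: if average utilisation is double the target, roughly
    double the fleet; if it is half the target, halve the fleet.
    """
    if metric_value <= 0:
        # No measurable load: shrink to the floor rather than divide by zero.
        return min_cap
    raw = current_capacity * (metric_value / target_value)
    new_cap = math.ceil(raw)  # round up: scale out aggressively, in conservatively
    return max(min_cap, min(max_cap, new_cap))

# Four instances averaging 90% CPU against a 60% target -> grow to six.
print(desired_capacity(4, 90.0, 60.0))
```

Clamping to explicit minimum and maximum bounds is what keeps pay-as-you-go flexibility from turning into either an outage or a runaway bill.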

The task list for a full SIAM service developed by third parties – notably KPMG and ISG – is based on the ability to provide clear and concise interfaces between the service providers, users of ICT services and the contracting organisation. These may include:

  • Core service integration, management and tooling;
  • Service desk;
  • IT information security support;
  • Service transition planning and support;
  • Service knowledge management;
  • Service/provider assurance;
  • Service validation and testing;
  • Transformation design and change services relating to the SIAM/tower service model.

The SIAM services provided by system integrators such as Accenture, Capgemini, CSC, Atos, TCS, HP and IBM may be disaggregated into service elements, and these elements may be procured or delivered separately, according to individual authority requirements.

Finnish supplier Sofigate offers an open approach to SIAM. The firm launched an open source SIAM model in 2015, focusing on enabling managed service integration through standardised processes and tools. Atos, CGI, Cognizant, Elisa Appelsiini, Enfo, Fujitsu, HCL, L&T Infotech, Sonera, Tech Mahindra, Tieto and Wipro have since joined Sofigate’s SIAM ecosystem.
The ITIL V3 contribution

ITIL V3 encompasses 35 sets of non-prescriptive best practice, such as the e-competence framework and the management of portfolios. IT departments choose the parts suitable for them, often with implementation support from system integrators such as Accenture, IBM Emptoris/Procurement, Atos, Capgemini, TCS, Cognizant and CGI.
Contract management

More granular service management systems – such as contract management systems (the e-competence framework in ITIL) – are much more affordable for mid-range companies, and come from hundreds of providers, such as SAP Ariba, Agilsoft, Concord, Novatus, Gimmal and GEP. Contract management software automates the creation, tracking and monitoring of contracts and agreements. This allows for the pan-organisational sharing of contract documents for top-down control and monitoring of contract status, key dates and personnel responsibilities.
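The core of that monitoring – tracking key dates so renewals and terminations are flagged in time – can be sketched simply. This is a toy illustration of the idea, not any vendor's product; the class and field names are hypothetical.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Contract:
    supplier: str
    owner: str       # person responsible inside the organisation
    end_date: date

def expiring_contracts(contracts, today, notice_days=90):
    """Return contracts whose end date falls within the notice window,
    soonest first, so renewal or termination work can be scheduled."""
    horizon = today + timedelta(days=notice_days)
    due = [c for c in contracts if today <= c.end_date <= horizon]
    return sorted(due, key=lambda c: c.end_date)

book = [
    Contract("Acme Hosting", "J. Doe", date(2016, 5, 15)),
    Contract("Globex Networks", "A. Smith", date(2017, 1, 1)),
    Contract("Initech Desktop", "B. Lee", date(2016, 4, 10)),
]
for c in expiring_contracts(book, today=date(2016, 4, 1)):
    print(c.supplier, c.end_date, "owner:", c.owner)
```

A real contract management system adds workflow, document storage and alerting on top, but the status/key-date/responsibility triple above is the data it is built around.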
Supplier management

Supplier management plays a key role. This includes third parties as well as internal suppliers, and the services they provide. The aim is to ensure quality, consistency and value for money. In ITIL V3, supplier management is part of the service design process, to allow for a better integration into the service lifecycle.

Relationships are managed in a supplier and contract database (SCD) such as the Enlighta ITIL suite, to ensure consistent efficiency and effectiveness when implementing the corporate supplier policy. In an ideal world the SCD would make up part of the overall configuration management system and the united information access repository from providers such as Attivio, Search Technologies and BA Insight which extends the core capabilities of Microsoft SharePoint and Fast search. Data stored in these repositories should hold details about contracts, their length, what services are provided by suppliers and any associated configuration items.
SME approaches

Size matters in this context – frameworks can be too heavy for small companies (for example, those with fewer than 200 employees), where the overhead of incorporating formal best practice often outweighs the benefits gained. The effort spent on setting up and running internal control-based methodologies is probably better spent on giving customers what they actually want. Conversely, the effective use of commodity standards-based IT should mean that integration and support requirements are much less onerous than managing a more labour-intensive system.

Some SMEs choose to adopt an intermediate, layered approach with a third-party service integrator such as Edifit to manage the SME’s IT suppliers. Others may adopt a tower sourcing model for different service areas – desktops, applications, hosting and networks. Each tower can have a lead supplier, with a strategic partner appointed to manage the whole group, with responsibilities from early planning to the implementation, monitoring and support of IT services. This effectively extends the role of a third-party service integrator such as HP or Capgemini to take on more day-to-day management over the long term. Besides improving service quality, it adds the system integrator’s risk-management skills and expands the ability to scan the market for innovative suppliers.
IoT – the new entrant

There are many devices and applications making their way into corporate service management. The biggest challenge is no doubt the service management of the internet of things (IoT). With continued virtualisation, our whole understanding of “what is a service” will face a significant challenge. Next-generation service management must include IoT systems as consumers of services. Best practice will be needed to support this entirely new and massive type of support request. This will be qualitatively different from those that were developed to support the classic SIAM interaction. Best practice requires baselines and solid exchange of operations experience – and so a lot more work lies ahead for the various forums dedicated to SIAM and ITIL.
Essential elements of the information technology infrastructure library V3 SCD

Populating the ITIL V3 SCD requires:

  • Identifying business requirements. Produce a programme of requirements, secure agreement by documenting a strategy and policy, and finally develop a business case;
  • Categorising suppliers and contracts relative to business areas and criticality;
  • Evaluating existing and potential suppliers. This requires gaining references, assessing supplier abilities relative to corporate needs, and also assessing the potential supplier’s financial viability and robustness. It is key to utilise the change management process to introduce new suppliers into the SCD. Other service management processes such as IT service continuity management (ITSCM), availability management and information security management will all have an interest in the new supplier and their abilities to provide service;
  • Managing performance of suppliers and contracts. Typically, this is performed at the operational level and based on service level agreements (SLAs), in line with service management policies and processes and supported by an appropriate service level reporting regime. The performance of suppliers will also need to be measured and managed, with appropriate action taken if the supplier is underperforming. The supplier needs to provide the service that the business requires, and if that changes, then the supplier will need to change service provision accordingly;
  • Renewing/ending a contract. This is a strategic function and requires the existing contract to be reviewed for its future relevance if the business model changes. If the supplier is no longer required, an accurate assessment of the repercussions needs to be carried out, from both a financial and an operational perspective.
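The SLA comparison at the heart of that reporting regime is mechanically simple. The sketch below is a minimal illustration, assuming "higher is better" percentage metrics; the metric names and targets are hypothetical examples, not taken from any real SLA.

```python
def sla_breaches(measurements, targets):
    """Compare measured service levels against SLA targets and return the
    metrics that are out of tolerance, as (actual, target) pairs.

    Assumes higher-is-better metrics (availability %, fix rate %); a missing
    measurement also counts as a breach, since it cannot be evidenced.
    """
    breaches = {}
    for metric, target in targets.items():
        actual = measurements.get(metric)
        if actual is None or actual < target:
            breaches[metric] = (actual, target)
    return breaches

targets = {"availability_pct": 99.5, "first_time_fix_pct": 85.0}
measured = {"availability_pct": 99.7, "first_time_fix_pct": 82.0}
print(sla_breaches(measured, targets))
```

The hard part in practice is not this comparison but agreeing measurement points and evidence with each supplier so both sides trust the numbers.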

Source: computerweekly.com-The business processes behind services management

Outsourcing Data Science: What You Need To Know

More companies are creating data science capabilities to enable competitive advantages. Because data science talent is rare and the demand for such talent is high, organizations often work with outsourced partners to fill important skill gaps. Here are a few reasons to consider outsourcing. What can go right and wrong along the way?

A great number of companies are investing in data science, but the results they’re getting are mixed. Building internal capabilities can be time-consuming and expensive, especially since the limited pool of data scientists is in high demand. Outsourcing can speed an organization’s path to developing a data science capability, but there are better and worse ways to approach the problem.

“The decision to outsource is always about what the core competency of your business is, and where you need the speed,” said Tony Fross, VP and North American practice leader for digital advisory services at Capgemini Consulting. “If you don’t have the resources or the ability to focus on it, sometimes outsourcing is a faster way to stand up a capability.”

A recent survey by Forbes Insights and Ernst & Young (EY) revealed that most of the 564 executive respondents from large global enterprises still do not have an effective business strategy for competing in a digital, analytics-enabled world.

“Roughly 70% said that data science and advanced analytics are in the early stages of development in their organization,” said John Hite, director, analytic architect, and go-to-market leader for the Global Analytics Center of Excellence at EY. “They said they had critical talent shortfalls and inconsistent skills and expertise across the organization.”

Unfortunately, data science projects and initiatives can fail simply because organizational leaders don’t think hard enough about what the business is trying to accomplish. They also need to consider what resources, if any, are already in place, and how the project or initiative will affect people, processes, technology, and decision-making.
Have a Goal in Mind

Businesses are building their data science capabilities with the goal of driving positive business outcomes. However, success must be defined more specifically, and the results of the effort must be measurable.

“A lot of times, the client feels like the faster they launch a project, the faster they’ll achieve the outcome without defining first what needs to be achieved,” said Ali Zaidi, research manager at IDC.

Goal-setting, particularly at a departmental or business unit level or for a one-off project, may actually work against a company’s best interests, especially when the strategic goals of the organization have not been contemplated.

“The first conversation [shouldn’t] just focus on the fire that needs to be put out, but the key challenges faced at the top level of the organization,” said Eric Druker, a principal in the strategic innovation group at Booz Allen Hamilton. “You also need to understand how analysis is currently done, in stove pipes, or whether data is being shared across the organization. You also need a coherent strategy for linking subject matter experts to data scientists.”

Even if the business problem is well-defined, the data science team, whether wholly or partially outsourced, needs to work backward from the goal to understand how the planned change will impact end-users, business processes, and decision-making processes.

For example, an EY client built a customer churn model that was capable of identifying which customers would defect in two weeks. Unfortunately, the marketing and sales teams needed 4- to 6-week lead times to take appropriate action, so the model had to be re-tuned.
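The re-tuning in that anecdote amounts to changing the horizon used when labelling training data. A minimal sketch of that labelling step, with hypothetical names and dates chosen purely for illustration:

```python
from datetime import date, timedelta

def will_churn_within(churn_date, scoring_date, horizon_days):
    """Training label: does the customer defect within `horizon_days`
    of the date the prediction is made?

    Widening the horizon (e.g. 14 -> 42 days) gives downstream teams
    enough lead time to act on the prediction.
    """
    if churn_date is None:          # customer never churned
        return False
    return scoring_date < churn_date <= scoring_date + timedelta(days=horizon_days)

scored_on = date(2016, 3, 1)
left_on = date(2016, 3, 20)   # defects 19 days after scoring
print(will_churn_within(left_on, scored_on, horizon_days=14))  # too short a window
print(will_churn_within(left_on, scored_on, horizon_days=42))  # actionable window
```

Relabelling and retraining against the wider window is cheap compared with discovering, post-deployment, that the predictions arrive too late to use.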

“Starting with the end-user and how the [business] process is going to change can sometimes be overlooked,” said EY’s Hite. “Even if you get that right, do the end-users have the skills required, and are they incented to take the action you want?”

One company built a predictive model capable of identifying the customers who were likely to pay late. However, the customer service representatives tasked with sending payment reminders to those customers were compensated on customer satisfaction levels, not whether customers were paying their bills, Hite said.
Choose Your Partner Wisely

The growing demand for data science and data scientists is creating a market ripe for consultant organizations that now include the big consulting firms, systems integrators, traditional tech vendors, boutique firms, startups, and firms focused on specific vertical industries. One option is extending the relationship with a current service provider, assuming that provider actually has the level of expertise the organization requires.

“[If] you have a trusted partner relationship, you have everything contract-wise you need. Speed is paramount,” said Capgemini’s Fross. “You also need to consider who will give you the best resources immediately.”

Different parts of an organization may be outsourcing different data science projects or initiatives to different parties to achieve different goals. Sometimes the lack of orchestration among the various operating units can have an adverse effect on the enterprise.

“Data science is a cultural change in the way we make decisions. Firms that come in to solve an ad hoc problem miss all these great opportunities to understand the context for decision-making and how decision-makers use data,” said Booz Allen Hamilton’s Druker. “[If you’re working on an ad hoc basis,] it creates an impression that progress is being made. But because it’s firefighting, it may inhibit the movement on a data science capability down the road.”

Some organizations choose to work with outsourcing partners who specialize in a particular industry or who have consultants with specialized business domain knowledge. Others are looking for expertise that is best found outside one’s own industry.

“People in your own industry will be laggards in the same way you are,” said Capgemini’s Fross. “If you want to understand customer context, you want to consider someone from a retailer, because they know context better than anyone. If you’re a pharma company and you’re trying to get your act together around data and MDM (master data management), you probably want to look at someone from financial services.”

Regardless of which types of firms are on the short list, companies should put more effort into due diligence than they often do.

“As I’m talking to the vendors, I’m asking them about recurring business,” said Jennifer Belissent, principal analyst at Forrester Research, in an interview. “How many of these projects are one-offs? And how often are you engaged on a retainer basis, so you’re not just doing a project, but you’re available to answer questions [and] be an extension of [the] strategy organization.”
Get Ready for Change

Outsourced data science projects and initiatives may be strategic or tactical, depending on the nature of the work and the mindset of the people hiring the help. A strategic engagement will involve an assessment of the business and what it is trying to achieve; an understanding of where the company is now, and what needs to happen when; and a concept of how the changes will affect the organization, its people, and processes.

Tactical engagements tend to address a specific problem, sometimes in isolation. Either way, the project will likely effect some level of change that should be understood and managed.

“The senior leaders need to understand the value that data scientists and analytics can provide, but we also need to have the broader community and managers at all levels understand the value, see the benefit, and [leverage] training for the take-up and use of the capabilities,” said Martin Fleming, VP, chief analytics officer, and chief economist at IBM, in an interview.

Only 10% of the executives responding to the Forbes/EY survey recognize data analytics as one of their core competencies. Those companies share three traits:

They’re using data analytics in decision-making most of the time or all of the time.
They’ve seen a significant shift in their organization’s ability to meet competitive challenges.
They consider themselves “advanced” or “leading” in their ability to apply data analytics to business issues and opportunities.

“Many companies have reached a reasonable point in terms of being able to produce analytics insights, but they’re having trouble driving that into business processes,” said EY’s Hite.

Companies are also having trouble retaining the data science capabilities they have in house, either because the roles are not adequately supported or because the individuals hired are not challenged with work they consider interesting enough.

“We often find data scientists are not part of a larger team. They’re sort of sole practitioners,” said IBM’s Fleming. “They either don’t have the level of support they need, or they don’t have the functionality that’s necessary, so they struggle with effectiveness and career development.”

Similarly, players on the outsourced team may leave. Even if they don’t, a lot of knowledge that could have been transferred doesn’t get transferred because knowledge transfer wasn’t part of the engagement.

“At some point, if customers realize their data science needs are increasing, they need to start hiring some of the skill sets internally,” said IDC’s Zaidi. “There has to be a stream of knowledge transfer, because when the project is done, the customer will have some of that knowledge inside.”

Continuing to outsource can get very expensive, and so can hiring underutilized specialists in-house. This is another reason why companies need to understand their goals, their own ability to address those goals, and the resources they require to meet those goals — all of which are dynamic factors.

Stop Looking for a Silver Bullet

The savviest outsourcing resources have difficulty making a positive impact when the client organization is change-resistant and unclear about its goals. Tools can help facilitate good data science, but they’re no replacement for the human side of data science, at least not yet.

“You need to have a combination of human capital and software. If you just use human capital, it will take much longer to deliver data science solutions. If you just use tools, there’s no tool that is so cognitive it can provide business insight on its own, so you need human capital,” said Zaidi.

Some of the large consulting firms are building platforms and other software capabilities that supplement their service offerings. For example, EY recently launched its Synapse analytics-as-a-service platform, which expands the company’s managed services capabilities.

“Most IT organizations are struggling with traditional BI warehousing. Now we’re throwing big data constructs at them, and the way data science wants to leverage the technology is different from BI/OLAP environments and processes,” said EY’s Hite. “Finding the right mix of skills is important, but you also need the constructs to make it [work].”

There is no shortage of open source and commercial tools available. The constant stream of innovation is making it difficult to keep up. Outsourcing partners are often brought in to assess the environment and to make technology recommendations based on a client’s business objectives.

Tools are making it easier to operationalize data science, but the underlying data science must be sound in the first place.
Bottom Line

Outsourcing partners are available in all shapes and sizes, but when it comes to data science, not all of them can solve the same problem equally well. Many outsourcing relationships are less successful than they could be because the client failed to consider its own objectives or the client organization resisted change.

Source: InformationWeek, “Outsourcing Data Science: What You Need To Know”

Outsourcing versus insourcing, trust innovation to the experts

It’s that time of year again when key decision makers get together to scrutinise every aspect of their business model. As they prepare to execute on strategies for the next quarter, they’ll be looking at how their service delivery models may need to adapt to meet ever-changing business needs and customer requirements.

While the topic of outsourcing versus insourcing has been the fuel for much debate, it’s time to recognise both methods of IT service provisioning for what they really are: the enabling force that drives business change, growth and development. Both outsourcing and insourcing have their respective benefits, and neither is superior to the other, except to the degree that it helps the business meet strategic goals and business objectives.

A change in thinking is necessary
Even though we have seen the emergence of bimodal IT, in which the traditional approach to enterprise technology (one that emphasises efficiency, stability, accuracy and scalability) coexists with a second approach focused on agility, speed and innovation, we’re still fixed on the idea that outsourcing the ‘house-keeping’ component of IT will cut costs and free up resources so that innovation can be addressed in-house.

Essentially, multi-sourcing can give an organisation the freedom to focus on developing new products, applications and solutions, which is undoubtedly important in today’s highly competitive business environment. Being first to market is a significant differentiator; however, that doesn’t mean innovation is best kept in-house, as outsourcing it might prove to be a smart move after all.

It’s not about superiority, it’s about suitability
Because many driving factors might influence a company’s decision to outsource an asset or a service, arguing over which mode of delivery is better wastes valuable time and effort that should be invested in engaging with the business and its needs. At the end of the day, the debate shouldn’t be over which is better, but rather which is more suited to business needs, which are as unique as each organisation.

Until recently, companies were generally inclined to look beyond their own IT departments in a number of scenarios. The first is where the organisation wants to cut costs and gain a competitive edge: significant business changes are taking place, and the company brings in short-term experts to address those changing needs, keeping that function in-house while outsourcing the day-to-day IT activities.

The second scenario typically involves an organisation that has been around for a few decades, with long-standing systems and policies already in place. A few key IT professionals are chosen to align with the business, while the rest of the work, such as managing infrastructure and applications, is outsourced to a specialist service provider. This enables those key IT professionals to get on with the business of innovating.

Given that technology is changing faster than ever before, the question of whether to outsource the business transformation aspects of IT delivery becomes more and more relevant. The best justification for outsourcing is that a company might not have all the skills it needs, or might want to replicate something that has already been done elsewhere; then it becomes simply a matter of finding the right outsourcing partner for the job.

There is general consensus that the perks of outsourcing can be expressed as a cost benefit. This is because the outsourced service is already performed for other customers, which is where the scale and commercial advantage come in. Companies get to draw on case studies, learn from the mistakes and successes of other deployments, and get assistance in determining whether international IT trends have local relevance.

There’s also the critical benefit of access to skills that can be difficult to hire and retain internally. Most importantly, by outsourcing the innovative aspects to a specialist, an organisation can speed up its go-to-market strategies, for example by exploring automation possibilities for large-scale innovation projects. In the same vein, smaller organisations can hand this work over to outsourcing companies and in the process become more agile and responsive to market needs.

In conclusion, regardless of whether an organisation chooses to outsource, insource or multisource, it’s important to bear in mind the goals and objectives of the business. While cost-cutting can be a benefit, it’s critical that this is not the sole deciding factor. Organisations should choose whichever mode of IT delivery will help them deliver a seamless customer experience once they’ve won that customer over with first-to-market innovative products and solutions.

Source: IT News Africa, “Outsourcing versus insourcing, trust innovation to the experts”

 

Outsourcing key as enterprise moves core infrastructure to hybrid IT

“It will eventually make sense for businesses to move core IT components to the cloud and outsource the majority of important infrastructure.”

IT professionals are under increasing pressure to maximise IT agility in support of new business needs, such as digital customer experience, while also holding down IT costs.

Such pressure is resulting in a re-imagining of what IT infrastructure can be, with hybrid IT providing a way forward for businesses that need to get more out of their IT budget.

“Conventional IT infrastructure traditionally required a substantial amount of equipment to be deployed and managed internally,” says Stuart Mills, Regional Director of Australia and New Zealand, CenturyLink.

“That is now changing, with more and more IT infrastructure being deployed externally as a service, helping to free up internal resources, and provide a greater return for IT budgets.”

For Mills, each company needs to determine the optimal mix of services, working with an implementation partner to analyse business needs and implement appropriate solutions.

“This lets organisations combine outsourced services and internally-managed infrastructure to get the greatest value from every element in the overall IT footprint,” Mills adds.

“This approach, often referred to as hybrid IT or bimodal IT, is fast becoming the go-to IT model. It is ideal for businesses that need to scale up their IT infrastructure quickly or that may have already made extensive capital investments in infrastructure they need to depreciate over time.”

With hybrid IT, Mills says existing infrastructure can continue to operate with new, outsourced services, for optimal cost-effectiveness and performance.

“The hybrid IT model has the potential to replace high capital expenditure (capex) requirements for internal IT infrastructure with much lower ongoing operational expenditure (opex) for IT services,” Mills adds.

“This can dramatically lower the barrier of entry for organisations of all sizes, and makes it easier to project budgetary requirements.”
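Mills’ capex-versus-opex point lends itself to a back-of-the-envelope break-even calculation: a large upfront build plus modest upkeep, versus no upfront cost but a higher recurring fee. A minimal sketch with entirely hypothetical figures (the function names and dollar amounts are illustrative, not from the article):

```python
def cumulative_capex(months, upfront, monthly_maintenance):
    """In-house build: large upfront investment plus ongoing upkeep."""
    return upfront + monthly_maintenance * months

def cumulative_opex(months, monthly_fee):
    """Outsourced service: no upfront cost, higher monthly fee."""
    return monthly_fee * months

def breakeven_month(upfront, maintenance, fee, limit=120):
    """First month at which the opex model becomes the more expensive
    cumulative option, or None if it stays cheaper within `limit`."""
    for m in range(1, limit + 1):
        if cumulative_opex(m, fee) > cumulative_capex(m, upfront, maintenance):
            return m
    return None

# Hypothetical figures: a $500k data centre build with $5k/month
# upkeep versus a $20k/month outsourced service.
print(breakeven_month(500_000, 5_000, 20_000))  # -> 34
```

In this illustration the outsourced route stays cheaper for nearly three years, which is exactly the budgeting trade-off Mills describes: lower barriers to entry and predictable monthly spend, weighed against long-run cumulative cost.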

For example, Mills believes a growing number of businesses are choosing data centre colocation services to outsource first because it is a fast way to scale up capacity. It also means they can avoid the high expense and expertise needed to build a data centre.

As IT service providers become more diverse and provide a greater variety of enterprise products and services, Mills claims that businesses will be able to make the move from outsourcing non-core IT elements, such as development environments, to mission-critical operations like enterprise resource planning platforms or integrated security.

“It will eventually make sense for businesses to move core IT components to the cloud and outsource the majority of important infrastructure as they become more comfortable with hybrid IT,” Mills adds.

“As long as a company starts on the hybrid IT journey with an implementation partner that offers a broad range of managed IT services, it will get the most out of its IT budget now, and in the future.”

Source: reseller, “Outsourcing key as enterprise moves core infrastructure to hybrid IT”

How long can IT outsourcing deliver more for less?

Digital transformation is driving even smaller and shorter deals for IT service providers, but how low can providers go?

IT and business process outsourcing deals have been getting smaller in dollar value and shorter in duration for some time. And, according to analysis of 2015’s deal activity, enterprise digitization is strengthening that trend. The number of outsourcing contracts signed last year hit an all-time high, but annual contract values were down 8 percent, according to outsourcing consultancy Information Services Group (ISG).

“Digital helps fuel a fire that was already burning very well,” says ISG partner John Keppel. “Digital impacts across industries are fundamentally changing the way business is done—not just with implications on IT but on entire business processes. It even impacts places where businesses interact with their customers, deliver their products and services and potentially even drive growth and profits. In times of radical change or disruption, speed of response is everything. So flexibility is key, and shorter deals provide this flexibility.”
How emerging technologies are shaping the IT outsourcing market

And because companies are exploiting emerging technologies, they’re also engaging with newer and smaller providers. “These firms are less capable of scaling or supporting huge implementations,” Keppel says. “Therefore smaller, tactical deployments are used as proof points and pilots.”

Outsourcing spending overall is increasing among companies of all sizes, although ISG’s research shows that small firm spending is up the most—increasing five-fold since 2011.

Restructured contracts reached a new high and new scope awards nearly did the same, even as their annual values declined 6 percent and 9 percent, respectively, for the year. “Deals are coming up for renewal at an ever-increasing pace,” Keppel says. “As you would expect from a growing industry, these renewals represent the increasing churn in the market.” In the past, these awards might go to one or two providers; today, they’re being split among multiple suppliers. “Those pieces are then leveraging emerging technologies which are coming to market at lower price points and utilizing the shortest of durations,” Keppel says.

The as-a-service model offered by the likes of AWS, Microsoft Azure, Google and Rackspace has disrupted infrastructure pricing models as well. “On-premises, fixed-pricing models have given way to dynamic, on-demand models, which represent more change for service provider business models, which in some cases are in danger of requiring complete reinvention,” says Keppel. “These top-tier public cloud vendors have been leveraging economies of scale to conduct an ongoing price war.”

Keppel estimates that traditional service providers lose three to four dollars in traditional hardware and software revenue for every dollar they make on cloud computing offerings. “They must manage this cannibalization of their existing revenues,” he says. “Also, as cloud services change the existing industry dynamic, service providers must note that organizations are moving not just toward shorter-term contracts but to effectively pay-as-you-go models to ensure more flexibility in terminating contracts and procuring new services.”

The average outsourcing deal was 3.5 years long, a full 15 percent shorter than three years ago, and ISG notes that there are many deals of three years or less in the marketplace as well. How much shorter and smaller can IT outsourcing deals get? “Now that clients have realized the flexibility and value inherent in shorter, smaller contracts, they won’t give that up,” says Keppel. “We expect the trend toward lower contract values and shorter-term commitments to continue. In fact, as we move toward an on-demand norm, it is likely the whole concept of a deal term might feel antiquated and contractual relationships will look more like framework agreements with negotiated terms rather than volume and duration-driven contracts.”

There were nine mega-relationships (worth $100 million or more annually) signed during the last quarter of 2015. However, the total number of such large deals awarded in 2015 — 23 in all — was the lowest in a decade, according to ISG.

The next stage may be one in which providers combine IT and business process services to radically re-engineer a core business function—either via a software-as-a-service solution or through automation. “This is where the next real growth leap is likely to come from,” says Keppel. “Buyers recognize the value of outsourcing and continue to participate enthusiastically in the market. Revenue growth in the future will come from using sourcing to solve new problems.”

Source: CIO.com, “How long can IT outsourcing deliver more for less?”