Why M&A due diligence flounders in the face of Big Data and how smart technology can help


Financier Worldwide Magazine

August 2017 Issue


Is due diligence actually working in today’s M&A transactions? To what extent does it help companies achieve the strategic business goals of the deals in question and pave the way for successful integration? Roger L. Martin, writing about M&A in the Harvard Business Review, suggests that “typically 70 to 90 percent of acquisitions are abysmal failures”.

If a recent Axiom survey focused on due diligence is any indication, many in-house lawyers who work on M&A deals for large global enterprises would likely agree with that assessment. Survey respondents were remarkably sceptical about the value of the typical due diligence process today, with only 21 percent indicating they felt the outputs of due diligence in M&A transactions were “very effective” in helping a deal reach its expected synergy targets.

Respondents cited several important factors that contribute to these perceptions, including a lack of standardised processes, inadequate data alignment and output, and underuse of technology. As deals increase in size and complexity – and as the volumes of data that due diligence teams have to sift through and analyse continue to grow exponentially in this era of Big Data – it seems likely the significance of these factors will only increase.

Exact projections vary, but most experts agree that the size of the digital universe is now doubling every two years. This explosion of data has coincided with a global boom in M&A activity, driven by abundant capital, low interest rates and investor pressure on corporations to find ways to grow. That means that teams of financial advisers and lawyers performing due diligence have to review increasingly large volumes of information held in disparate data sets from distinct organisational silos. While due diligence is a distinctly human enterprise that depends on the skills, experience, judgement and collaborative efforts of highly trained legal and business professionals, it is also clear that the teams performing this important work in the age of Big Data need better tools to do the job right.

Why does Big Data matter in due diligence workflows?

Due diligence teams typically focus on identifying the risks and opportunities that a proposed transaction might entail. It is not simply an exercise in verifying the accuracy of the target company’s financial statements. Verification is certainly important, but a well-executed diligence process should also be able to shed light on the strategic rationale of a deal, anticipate downstream integration challenges and opportunities, and evaluate the prospects for a transaction to realise long-term value. As data volumes increase, it is easy for these qualitative considerations to fall by the wayside.

Due diligence has traditionally focused on information like the target company’s assets, contracts, customers and sales, past and current litigation, employee agreements and employee benefits, as well as environmental, product, supplier and tax issues. But in the age of Big Data where information is considered a valuable corporate asset, the scope of due diligence has broadened considerably. For example, it now often includes operational areas related to data assets and associated risks and liabilities. Reviewers may examine a target’s information governance policies and procedures around protecting the target’s data, such as protocols related to data privacy and data security, as well as insurance policies covering data-related assets and practices.

Reviewers for the buyer are also asked to gather information about, and provide analysis of, the target’s strategic fit with the buyer as well as the target’s competitive landscape. They will need to understand any antitrust and regulatory issues that the target company has dealt with in the past and be aware of any recent changes in the regulatory environment that may have a bearing on the proposed transaction.

This is where due diligence can fail to live up to its promise. In the face of overwhelming data volumes combined with tight disclosure schedules, inflexible budgets and a narrow focus on the completion of checklists, it is not uncommon for teams to neglect more qualitative imperatives like thoroughness, objective analysis, strategic insight and synthesis of complex information. Yet in the end, accurate valuation depends on all of these things.

The role of technology in Big Data due diligence

Deploying data analytics to structure and filter large, complex data sets before they make their way into the data room is one promising approach to the problem. But such tools can be expensive and difficult for non-technical stakeholders to evaluate.
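By way of illustration only, the sketch below shows – in Python, with hypothetical file names and keyword rules – the kind of pre-filtering such analytics tools perform: discarding exact duplicates and bucketing documents into broad categories before anything is uploaded to the data room. Commercial offerings are far more sophisticated, but the sketch makes the underlying idea concrete.

```python
import hashlib
from pathlib import Path

# Purely illustrative keyword rules for bucketing documents by category.
CATEGORY_RULES = {
    "contracts":  ["agreement", "amendment", "licence"],
    "litigation": ["claim", "complaint", "settlement"],
    "hr":         ["employment", "benefits", "pension"],
}

def classify(text: str) -> str:
    """Assign a document to the first category whose keywords appear in it."""
    lowered = text.lower()
    for category, keywords in CATEGORY_RULES.items():
        if any(k in lowered for k in keywords):
            return category
    return "uncategorised"

def prefilter(folder: str):
    """Yield (path, category) for unique text files, skipping exact duplicates."""
    seen_hashes = set()
    for path in Path(folder).rglob("*.txt"):
        content = path.read_text(errors="ignore")
        digest = hashlib.sha256(content.encode()).hexdigest()
        if digest in seen_hashes:        # exact duplicate: skip it
            continue
        seen_hashes.add(digest)
        yield path, classify(content)

# Hypothetical usage against an exported document set:
# for path, category in prefilter("target_company_export"):
#     print(category, path)
```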

There is, however, another way to address the challenge of Big Data in M&A due diligence – by streamlining the intensely collaborative, high-skill human processes and workflows that will always be necessary for deals to be evaluated and executed effectively. Reviewing teams need to be able to examine large quantities of data in various formats and from multiple sources quickly and efficiently. They need tools that will help them organise disparate forms of data, analyse it and effectively communicate its significance to inform accurate valuation and set the stage for a fruitful integration.

Unfortunately, the results of the Axiom survey suggest that corporate due diligence teams – as well as the outside firms sometimes brought in to perform due diligence – still rely far too heavily on inefficient processes. Meanwhile, general counsel, legal executives and corporate leadership may not have a good grasp of the technological requirements they should look for in solutions designed to streamline these highly specialised workflows.

So what would a more effective process look like? How can due diligence workflows be improved so that teams on both sides of the transaction can arrive at a more accurate and nuanced understanding of the proposed deal?

A better virtual data room (VDR): technology for meeting the due diligence challenge

While many data rooms today still rely on a patchwork of discrete solutions and time-consuming processes – extensive printing, copying and paper-shuffling – to get the job done, technologically sophisticated, comprehensive solutions created specifically for M&A due diligence are now available and are making a big difference. Some of the most important factors to consider when evaluating virtual data rooms are discussed below.

Security. Security features and protocols should be on a par with those in force at the best-run financial institutions and government agencies. A cloud infrastructure is preferred, in part because sensitive data and work product can be contained within the shared online workspace. This means it is always accessible to authorised users, and there is no temptation to copy documents and take them outside the workspace – a serious security risk. Copying and printing of documents is much harder to control when teams rely on multiple tools installed on local machines.

Look for a platform that allows you to set up private, secure ‘meeting rooms’ in which individual buyer teams can collaborate privately using their own preferred evaluation methodologies and create annotation threads on any document. Likewise, seller representatives should be able to conduct their own private conversation threads to plan updates or log observations about the diligence process. Tiered permissions systems are also essential for sellers who need to control which tools, folders and documents are accessible to individual users or parties at each stage of the process.
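To make the notion of tiered permissions concrete, the following is a minimal sketch in Python – using hypothetical tier names and deal stages – of how a data room might decide whether a given user can see a given document at the current stage of the process. It illustrates the concept rather than any particular vendor's implementation.

```python
from dataclasses import dataclass, field
from enum import IntEnum

# Hypothetical permission tiers, ordered from least to most access.
class Tier(IntEnum):
    NO_ACCESS = 0
    VIEW_ONLY = 1      # preview in the browser, no download or print
    ANNOTATE = 2       # may also add private notes and tags
    FULL = 3           # may also download, print and respond to Q&A

@dataclass
class Document:
    path: str
    stage: str                       # e.g. "teaser", "phase_1", "phase_2"
    minimum_tier: Tier = Tier.VIEW_ONLY

@dataclass
class User:
    name: str
    party: str                       # e.g. "buyer_A", "seller"
    tiers_by_stage: dict = field(default_factory=dict)

def can_access(user: User, doc: Document, current_stage: str) -> bool:
    """A document is visible only once its stage has opened and the
    user's tier for that stage meets the document's minimum tier."""
    if doc.stage != current_stage:
        return False
    granted = user.tiers_by_stage.get(doc.stage, Tier.NO_ACCESS)
    return granted >= doc.minimum_tier

# Example: a buyer-side analyst who may only preview phase-1 material.
analyst = User("analyst", "buyer_A", {"phase_1": Tier.VIEW_ONLY})
contract = Document("contracts/supply_agreement.pdf", "phase_1", Tier.VIEW_ONLY)
print(can_access(analyst, contract, "phase_1"))   # True
print(can_access(analyst, contract, "phase_2"))   # False
```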

Speed. In data rooms bursting with data to review, ‘little things’ add up quickly. Advanced data rooms feature sophisticated previewing capabilities to speed up the process. If reviewers can scroll through and preview individual pages to make a decision about relevance without having to open the entire document – an advantage multiplied across thousands of documents – the time and money savings can be considerable. When a reviewer does open a document, even the largest should open in seconds. For their part, sellers benefit from the ability to ‘drag and drop’ individual items or entire folder structures into the online data room. Again, cloud infrastructure is best because it can scale quickly to absorb massive influxes of data without affecting performance.

Search. Fast, effective searching is at the heart of due diligence work, and inadequate search tools are simply no longer acceptable in the age of Big Data. Users need to be able to execute and save keyword searches across the entire body of documents in the data room so they can instantly locate matching terms within open or closed documents, system tags, team annotations or the Q&A panel. Advanced search capabilities, such as automatic matching of word variants and Google-style filters, should be available as well.
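As a rough illustration of what a saved search with word variants and filters amounts to, the sketch below – Python, with hypothetical document records and a deliberately crude approach to variant matching – re-runs a stored query against a document set and restricts the results to a single folder. Production data room search relies on proper full-text indexing; this simply shows the shape of the feature.

```python
import re
from dataclasses import dataclass

@dataclass
class Doc:
    name: str
    folder: str          # e.g. "contracts", "litigation"
    text: str

def variants(term: str) -> str:
    """Very crude stemming: trim common English suffixes so that
    'indemnify' also hits 'indemnified', 'indemnifies', 'indemnification'."""
    stem = re.sub(r'(ication|ies|ied|ing|es|s|y)$', '', term.lower())
    return rf'\b{re.escape(stem)}\w*'

def search(docs, keywords, folder=None):
    """Return documents matching any keyword variant, optionally
    restricted to one folder (a simple Google-style filter)."""
    patterns = [re.compile(variants(k), re.IGNORECASE) for k in keywords]
    hits = []
    for d in docs:
        if folder and d.folder != folder:
            continue
        if any(p.search(d.text) for p in patterns):
            hits.append(d.name)
    return hits

# A saved search is just the query plus its filters, re-run as new
# material is added to the data room.
saved_search = {"keywords": ["indemnify", "warranty"], "folder": "contracts"}

docs = [
    Doc("supply_agreement.pdf", "contracts", "Seller shall indemnify Buyer..."),
    Doc("claim_2015.pdf", "litigation", "breach of warranty alleged..."),
]
print(search(docs, **saved_search))   # ['supply_agreement.pdf']
```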

Organisational tools. Users should be able to attach margin notes and create hyperlinks to relevant passages simply by highlighting text with a cursor and clicking. Custom tagging of documents and notes facilitates efficient collaboration and creates a tight feedback loop between parties. Buyer and seller dashboards and reporting tools allow managers to immediately identify new materials that have been added to the dataset, as well as monitor project status, assign responsibilities and manage workflow. Effective organisation is probably the most important weapon a due diligence team can use in analysing very large data sets.
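The following sketch – again Python, with invented document records – shows roughly what such a dashboard summarises from tagged documents: new material added in the past week, review status counts and items flagged as high risk. It is an assumption-laden illustration, not a description of any specific product.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical document records, as a dashboard might see them.
documents = [
    {"name": "supply_agreement.pdf", "tags": ["contract", "high-risk"],
     "status": "reviewed",   "added": datetime(2017, 7, 28)},
    {"name": "employee_benefits.xlsx", "tags": ["HR"],
     "status": "unassigned", "added": datetime(2017, 8, 1)},
    {"name": "claim_2015.pdf", "tags": ["litigation", "high-risk"],
     "status": "in-review",  "added": datetime(2017, 8, 2)},
]

def dashboard(docs, since_days=7, today=datetime(2017, 8, 3)):
    """Summarise what a manager needs at a glance: new material,
    outstanding work, and how flagged items are distributed."""
    cutoff = today - timedelta(days=since_days)
    return {
        "new_this_week": [d["name"] for d in docs if d["added"] >= cutoff],
        "status_counts": Counter(d["status"] for d in docs),
        "high_risk": [d["name"] for d in docs if "high-risk" in d["tags"]],
    }

print(dashboard(documents))
```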

Due diligence is, or should be, more than a verification and box-ticking exercise. The formidable challenges of Big Data should not become an excuse to retreat from the qualitative objectives of a sound diligence process. A virtual data room that provides a secure cloud environment, fast-loading documents, powerful searching capabilities and purpose-built collaborative tools ensures a more efficient, cost-effective and satisfactory due diligence process.

A sophisticated data room can help due diligence teams on both sides of the transaction consume large quantities of complex data in a more structured way. This facilitates more consistent and effective workflows, producing valuable analysis to illuminate the strategic logic of a transaction. This should ensure the potential synergies between the buyer and the target are actually realised once the deal is completed.

 

Josh Kirk is an associate at Opus 2 Forum. He can be contacted on +44 (0)20 3008 5900 or by email: jkirk@opus2.com.

© Financier Worldwide



