Data Governance Archives - Thomson Reuters Institute
https://blogs.thomsonreuters.com/en-us/topic/data-governance/

How to improve handling of law firm rate increase requests through data: A view from in-house counsel
https://www.thomsonreuters.com/en-us/posts/legal/handling-law-firm-rate-increase-requests/
Wed, 28 Dec 2022

For years, the in-house legal team at Volkswagen Group of America, Inc. (VWGoA) used a manual, time-consuming approach to reviewing law firm rate increase requests. Law firms would email proposals to various in-house attorneys, who in turn coordinated with legal operations professionals and leadership.

This process then kicked off a volley of communications — internal and external — and necessitated forwarding emails, PDF letters, and spreadsheets for analysis and follow-up. The legal operations team provided some central support, but this was often challenging because data limitations made it difficult to account for past rate increases and freezes across different firms. Overall, the efforts felt somewhat ad hoc and very time-consuming.

“It has always been important to us to get this right,” says Antony Klapper, Deputy General Counsel in Product Liability & Regulatory at VWGoA. “We want to be fair to our law firms, whom we view as trusted partners. At the same time, we must manage our company’s finances responsibly — and execute all of this efficiently with a leanly-staffed team.”

Trisha Fletcher, Legal Operations Specialist at VWGoA, emphasizes these points as well. “Collectively, our team had a strong desire to find a better way to do this.”

Taking a new approach

The VWGoA team launched a new initiative to process rate increase requests more effectively for 2022 and beyond — one that would ultimately win them an ACC Value Champion Award.

The first step, the team decided, was to establish a more centralized, uniform approach. This would be managed by legal operations with strategic guidance from legal leadership. Of course, there would still be coordination with in-house counsel, but in a more efficient way — built around a centralized process, featuring stronger use of data analytics, benchmarking, and core decision governance from leadership.

The next step, then, was to improve the intake process. Outside law firms were asked to submit their rate increases within a designated window of time and through a common portal. This allowed the team to consider them all together, performing side-by-side comparisons of similar firms to ensure more consistent treatment under then-current market conditions. This commonality also enabled the use of greater analytics capabilities to assess past rate increase history, as well as internal and external benchmarking comparisons.

Within this framework, the team also began examining firms’ compound annual growth rate (CAGR). A law firm’s billing rate CAGR shows a multi-year view of the firm’s rate increase history, accounting for past increases and rate freezes. Standardizing the figures this way enabled better side-by-side comparisons across the portfolio, and showed which law firms were high or low outliers based on their multi-year rate history.
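The CAGR arithmetic itself is simple: divide the current rate by the rate at the start of the period, raise the result to one over the number of years, and subtract one. A minimal sketch in Python, using invented firm names and rates rather than any real VWGoA data, shows how this standardizes multi-year histories for side-by-side comparison:

```python
def billing_rate_cagr(start_rate: float, end_rate: float, years: int) -> float:
    """Compound annual growth rate of a billing rate over `years` years.

    Rate freezes are captured automatically: a frozen year leaves the
    ending rate unchanged, which pulls the multi-year CAGR down.
    """
    return (end_rate / start_rate) ** (1 / years) - 1

# Hypothetical portfolio: firm -> (rate four years ago, current rate)
portfolio = {
    "Firm A": (450.0, 505.0),
    "Firm B": (500.0, 610.0),
    "Firm C": (400.0, 415.0),
}

for firm, (start, end) in portfolio.items():
    print(f"{firm}: {billing_rate_cagr(start, end, years=4):.1%}")
```

A firm whose CAGR sits well above the portfolio median is a high outlier worth a closer look, even if no single year's increase seemed large.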

The VWGoA team also found it very helpful to use data to model the dollar impact of the requested increases per timekeeper for the coming year. This was instrumental in identifying the most impactful requests in order to focus on managing costs.
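That impact modeling amounts to multiplying each requested increment by projected hours and ranking the results. A sketch along those lines, with hypothetical timekeepers, rates, and hours (not actual VWGoA figures):

```python
# Hypothetical timekeepers: (name, current rate, requested rate, projected annual hours)
timekeepers = [
    ("Partner A", 700.0, 770.0, 300),
    ("Associate B", 450.0, 500.0, 850),
    ("Paralegal C", 200.0, 210.0, 400),
]

def dollar_impact(current: float, requested: float, hours: float) -> float:
    """Incremental annual cost if the requested rate is approved as-is."""
    return (requested - current) * hours

# Rank requests so the most impactful ones get attention first
ranked = sorted(
    ((name, dollar_impact(cur, req, hrs)) for name, cur, req, hrs in timekeepers),
    key=lambda pair: pair[1],
    reverse=True,
)
for name, impact in ranked:
    print(f"{name}: ${impact:,.0f}")
```

Note that a modest increment on a heavily used timekeeper can outweigh a large increment on a rarely used one, which is exactly why ranking by dollars rather than percentages matters.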

Seeing the benefits

Through this new approach, VWGoA legal leadership and legal operations were able to implement more effective governance and decision logic, streamlining rate decisions in light of portfolio metrics and company financial considerations. By consolidating the process into one common workflow, they freed up considerable hours that staff had previously spent responding to rate increase requests as they came in. They saved further time by setting auto-approval thresholds for certain rate increase increments.
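An auto-approval threshold of that kind reduces to a one-line routing rule. The sketch below is purely illustrative; the 3% cutoff is an assumed figure, not VWGoA's actual policy:

```python
AUTO_APPROVE_PCT = 0.03  # assumed threshold; the real increment would be policy-specific

def route_request(current_rate: float, requested_rate: float) -> str:
    """Route a rate increase request based on its percentage increment."""
    increment = requested_rate / current_rate - 1
    return "auto-approved" if increment <= AUTO_APPROVE_PCT else "manual review"

print(route_request(500.0, 512.0))  # 2.4% increase -> auto-approved
print(route_request(500.0, 550.0))  # 10% increase -> manual review
```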

In the end, the projected savings for the coming year were significant, with rate increases for various timekeepers, for example, trimmed to about one-half of the increment originally sought. The VWGoA team devoted particular attention to adjusting high outliers and managing the impact on budget in a sustainable way.

Beyond time and money savings, the team built a process that leveraged better data to drive better decisions. The result is a strong business case showing how those in legal can use technology and data more effectively to increase productivity and execute against business metrics.

From law firms’ perspective, understanding the data that informs a client’s financial position is a helpful way to focus rate increase conversations toward a productive outcome for both sides.

“We recognize that, in this economy, many clients are facing challenging headwinds,” says Susan Vargas, Partner at King & Spalding. “As trusted partners, we are glad to talk about goals and metrics to strengthen our relationship in mutually beneficial ways — and we welcome informative data to help us do that.”

Is your cyber coverage ready? Cyber insurance uptake is rising, but coverage questions remain
https://www.thomsonreuters.com/en-us/posts/news-and-media/cyber-insurance-coverage/
Wed, 21 Dec 2022

Just because cyber-attacks are no longer all over the news doesn’t mean that they’ve gone away. In fact, the opposite may be true, as cyber-attacks have become an expected part of doing business. Cyber-attacks against tax & accounting firms increased 80% between 2014 and 2020, according to the Association of International Certified Professional Accountants (AICPA), while the American Bar Association (ABA) reported in 2021 that 25% of US law firms had been breached at some point.

As those cyber risks have increased, so too has the growth of insurance coverage for cyber incidents. But while cyber insurance has begun to receive more uptake, increasingly stringent standards for coverage as well as confusion about the options available for cyber incidents could leave some companies in the lurch.

According to the 2022 Cyber Readiness Report from insurance provider Hiscox, almost two-thirds (64%) of companies now have cyber insurance as either a standalone insurance policy or as part of another policy. This represents a small rise from 58% two years ago. The highly regulated financial services sector has the highest rate of cyber insurance adoption at 74%, while the construction and travel/leisure industries have the lowest adoption at 53% each.

Crimes of opportunity

Judy Selby, a partner in the insurance practice at law firm Kennedys and a regular speaker on cyber issues, said she is beginning to see improvement in companies’ general awareness that current hacking incidents are largely “crimes of opportunity,” rather than dependent on the industry in which a company operates.

“I think for years, there was a thought process that nobody would be interested in my data, my company’s data,” Selby said. “And if you remember the days of the big retail incidents, the data breaches, I remember companies saying to me personally, well, we don’t have credit cards, so nobody’s going to want our information.”

Now however, she added, “I think the uptake is getting higher now than it used to be. And part of that was this realization that yes, it can happen to us, which is a really big deal. And also recognizing that the exposures come from so many different angles.”

Indeed, the Hiscox survey found a strong correlation between exposure to a breach and a desire for cyber insurance. Of the firms that did not have cyber insurance or did not plan to get it, nearly 80% had not experienced a cyber-attack within the past year. Just over half (51%) of those were also considered “novices” in cyber readiness, according to the Hiscox scale.

Even among those companies that had cyber insurance, however, there remained some stratification between the types of coverages they held. Notably, companies were split roughly down the middle as to whether they held a standalone cyber policy or covered cyber as part of a larger policy. Among companies with 250 or more employees, 35% had a standalone cyber policy in place, and 40% had cyber coverage as part of another policy. At companies with under 250 employees, those figures were 28% and 29%, respectively.


Selby said she is a proponent of standalone coverage, if possible, for a few reasons. First is simply “because the coverage is so comprehensive, you have all this great first-party coverage for dealing with an incident.” Particularly with more sophisticated cyber-attacks, policies that include business interruption coverage, regulatory coverage, and liability coverage are coming into play.

Concerning the latter, Selby noted that many companies are “not technically or financially able to respond to an incident on their own.” When a network is encrypted and the company’s access to it is blocked, for example, even the simplest of questions become complicated: How do we communicate with each other? How do we hire vendors to come in and help us? And even if we wanted to pay a ransom, how would we do that?

“These are things you don’t want to have to learn on your own,” she explained. “And so, the first-party coverage can be a real lifeline to companies to efficiently and effectively manage this incident from [not only] a financial standpoint [and] an operational standpoint, but also from a reputational standpoint.”

Preparing for a cyber incident

Outside help on cyber incidents may be increasingly necessary because overall cyber readiness is falling, the Hiscox survey notes. Respondents’ self-assessment of overall cyber readiness fell by 2.6% overall during the past year, with the number of companies qualifying as “experts” falling from 20% to 4.5%. The survey attributed those decreases to awareness of new vulnerabilities such as the Apache Log4j logging library vulnerability, as well as a continued talent crunch for cybersecurity experts.

That’s why Selby said she tells clients to not only get to know the details of their insurance providers’ coverage options (and subsequent limits on policies), but also what she calls providers’ “cyber squad” team. A typical cyber insurance provider will have a mix of panel firms, forensic analysts, notification vendors, and more that can be a godsend in a pinch, often provided at discounted rates.

This extra value can be important when making a business case for cyber insurance as well, she added, as the insurance has become more expensive and the scrutiny for coverage has gotten more intense. Some security measures, such as multi-factor authentication, are now table-stakes for coverage, which could scare off some businesses. However, Selby drew an analogy to property insurance: Every provider is going to ask not only about fire incidents that happened in the past, but sprinkler systems and fire exits that could help prevent them in the future.

“It always surprises me when people… complain about having to provide the information,” Selby said. “It’s like, if you don’t understand your own risk, why would you expect another company to say, okay, we’ll insure that for you, we’ll take that risk on your behalf when you don’t know what it is? And then when you say that, they go, oh yeah, that makes sense.”

Ultimately, cyber issues aren’t going away, particularly as the Hiscox survey found the median cost of a cyber-attack nearly doubled in both the United States and the United Kingdom last year. That means cyber insurance will, by necessity, continue to be a piece of companies’ risk mitigation profiles.

“The issues that people have with applying for the coverage, that shouldn’t stand in the way,” Selby said. “I think people should proceed and get the coverage, and when you get it, keep it, even if the price has gone up.”

Visibility into supply chains takes center stage as regulatory, corporate pressures mount
https://www.thomsonreuters.com/en-us/posts/international-trade-and-supply-chain/supply-chains-esg-visibility/
Thu, 08 Dec 2022

As supply chains have become a primary growth driver and a key activator for environmental, social & governance (ESG) initiatives, they have simultaneously gained importance in the boardroom at many companies.

As a result, visibility into supply chain actions and outcomes has catapulted to the top of many corporate wish lists — but many business leaders become frustrated when their operations and technologies don’t deliver. Still, experts say, better visibility into corporate supply chains can be achieved, but only if companies are willing to think about their sustainable supply chain initiatives in a more innovative way.

According to a September EY report on sustainable supply chains, visibility has become one of the top priorities among supply chain leaders. Of the 525 large corporations surveyed, 58% said that increased end-to-end visibility in their supply chain was among their top two priorities in both the past two years and the upcoming two years. However, despite that desire, just 37% of supply chain leaders reported achieving supply chain visibility over the past two years, indicating a large gap between the desire for more visibility and the progress many organizations are practically achieving.

Rae-Anne Alves, ESG & Sustainability Supply Chain Leader at EY Americas and co-author of the report, said that visibility is the key first step to compliance. “When companies are thinking through their supply chain and trying to make it more sustainable, they need end-to-end visibility to know what is happening,” Alves said. “Companies are lacking the transparency that they need from their suppliers through logistics, especially in areas outside of their four walls. Achieving this transparency will give them the visibility they need across their supply chain.”

Recent research from the Thomson Reuters Market Research & Competitive Insights team mirrored these findings. In interviews with senior leaders of US-based companies charged with tracking ESG efforts, many said they have established dedicated ESG programs, but that collecting data and measuring those efforts remains disconnected and inconsistent.

The issues in raising visibility

When it comes to trying to raise the visibility of supply chain practices and outcomes, many corporate leaders have run into an unfortunate reality: the difficulty of gathering and combining data that lives in disparate systems. One public company ESG head explained that a common supply chain review pulls data from systems as varied as risk management and operations software, human resources software, and procurement and supplier-oriented software.

Combining all of these types of data into a single source of truth remains difficult. “I don’t even know how they collect their data,” said the supply chain head of another public company. “Every vendor has their own process.”

This problem is only growing as companies begin to scale up the types of data they collect, EY’s Alves added. To get a firmer grip on their supply chains, many companies are looking to catalog not only emissions from scope 1 (sources directly owned by the company) and scope 2 (indirect use of energy the company purchases), but increasingly scope 3 emissions, which arise both up and down the company’s value chain. Indeed, the more a company’s data collection scope expands, the more complex the visibility question becomes. Many supply chain-centric software providers have arisen in recent years to try to compile and display all of these data sources; however, no single provider has yet captured a substantial share of the market.
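At its core, the scope breakdown described above is a categorization-and-aggregation problem. A minimal sketch with invented emissions records (the sources and tonnages are illustrative only) groups tonnage by scope:

```python
from collections import defaultdict

# Hypothetical emissions records: (source, scope, tonnes CO2e)
records = [
    ("company fleet", 1, 1200.0),         # scope 1: directly owned sources
    ("purchased electricity", 2, 800.0),  # scope 2: indirect purchased energy
    ("supplier logistics", 3, 5400.0),    # scope 3: up the value chain
    ("product use", 3, 3100.0),           # scope 3: down the value chain
]

totals = defaultdict(float)
for _source, scope, tonnes in records:
    totals[scope] += tonnes

for scope in sorted(totals):
    print(f"Scope {scope}: {totals[scope]:,.0f} tCO2e")
```

The arithmetic is trivial; the hard part, as the experts quoted here note, is that the scope 3 rows live in suppliers' and customers' systems rather than the company's own.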


“It’s unclear yet whether there will be a provider that is able to deliver the end-to-end capability needed for a digitally network-connected supply chain,” explained Gaurav Malhotra, Partner and Americas Supply Chain Technology Leader at EY. “There are many factors that have to come together, versus just a singular platform from a control tower or visibility standpoint to enable the orchestration.”

Instead, many companies have tried to apply other technological fixes to the issue, often without much success. “Almost everything is run on Excel. It’s truly terrible,” a public company’s supply chain head told Thomson Reuters Institute. “We have very few tools for environmental stuff. Everything is reported through Excel, everything is measured in Excel, everything is rolled up in Excel and it’s extremely inefficient because we have all these different teams.”

Supplying more visibility

Still, some companies have been able to achieve more supply chain visibility. EY’s report designated certain companies as sustainable supply chain “trailblazers” and noted that one of the traits they have in common is an “extreme focus on transparency” through which “[t]hey can significantly or moderately peer into Tier 2 and 3 supply networks.”

EY’s Malhotra said these leaders often undertake two simultaneous shifts to aid this transparency. One involves automating individual supply chain functions so that they can run more efficiently and be consistently reliable. The second involves integrating those individual functions and making sure their output data is portable to enable the needed effective real-time communication, both internally and with external supply chain ecosystem partners.

Currently, he explained, most supply chain networks are “not digitally integrated in their true sense” because they operate in multiple stages. Data is processed by one organization that controls its section of the supply chain ecosystem, then transmitted so it can be consumed or processed by other organizations. While Malhotra concedes that it takes “time and effort to ultimately get to a mostly autonomous state,” he believes combining, integrating, and automating these steps will be the future of supply chain management.

“What we have found is that some leading companies have moved towards an integrated process and singular platform that allows the right level of visibility, orchestration and actioning with their supply chain network partners,” Malhotra said. “Enabling trust, effective execution and accountability with the overall network in play, resulting in a highly efficient, highly integrated, differentiated and reliable supply chain.”

Leading companies are also pushing for data standardization among common supply chain suppliers, Alves added. Many sustainability frameworks are available, and increased regulatory attention continues to add more complexity. Increased standardization can make supply chain data more actionable, and auditable, potentially lowering a company’s risk profile. When asked about top supply chain priorities for the coming year, the ESG head of one public company was clear: “We want to make sure that we have auditable processes in place, that the data is sound.”

However, Alves added that for sustainable supply chain measurement and reporting, businesses are “definitely not there yet.” As public and regulatory attention in the space continues, expect visibility into supply chain processes and data to become even more important, and leading organizations to keep investing resources and personnel to get their supply chain data house in order.

ACAMS: Fighting financial crime in the Metaverse
https://www.thomsonreuters.com/en-us/posts/investigation-fraud-and-risk/acams-2022-financial-crime-metaverse/
Mon, 05 Dec 2022

LAS VEGAS — As lawmakers continue to debate how to regulate digital assets and fight new forms of financial crime made possible by the current version of the internet, many tech and financial-crime experts are concerned that the next iteration of the internet — Web 3.0 and the Metaverse — may be an even more welcoming playground for criminal activity.

“Technology advancements are a great thing, but the Metaverse combined with Web 3.0 allows people to be more anonymous than ever,” says Jim Lee, chief of the Internal Revenue Service Criminal Investigations unit (IRS-CI). “As a result, we all know that the criminal element is going to come out somewhere, somehow.”

Is prevention possible?

Lee spoke at the recent 21st Annual Anti-Money Laundering & Anti-Financial Crime Conference, held by the Association of Certified Anti-Money Laundering Specialists (ACAMS), where the challenge of preventing Web 3.0 from becoming a safe haven for criminals was discussed in a number of forums.

The ACAMS conference is attended primarily by bank regulators and other defenders of the traditional financial system, and the consensus among this crowd is that anticipating how criminals could exploit Web 3.0 is the key to preventing it. Mistakes made in building the current internet should have taught us that addressing content problems after the fact is a losing game, many experts say, so it is essential to build controls and safeguards into Web 3.0 before criminals even have the opportunity to commit a crime.

Or so the thinking goes.

There are several holes in that proposition, however. Among them: i) regulators and Web 3.0 technologists would need to find a way to work together somehow; ii) not everyone agrees on the nature of the problem or how to prevent it; iii) lawmakers have a dismal record when it comes to recognizing and addressing issues involving technology before they happen; and iv) criminal activity in the Metaverse is already on the rise, so the clock is ticking.

What is Web 3.0?

Though the terms are sometimes used interchangeably, Web 3.0 and the Metaverse are not the same thing. Web 3.0 is the underlying architecture of the Metaverse, which itself is the immersive, three-dimensional digital world that proponents of the technology (such as Meta CEO Mark Zuckerberg) claim is the future of the internet.

Though the Metaverse is still in the early stages of development, elements of Web 3.0 are already being used today in the world of cryptocurrencies and other digital assets (such as non-fungible tokens and stablecoins), all of which are based on blockchain technology. One of the key differences between today’s internet (Web 2.0) and Web 3.0, however, is that the latter is built entirely on blockchain smart-ledger technology, driven by machine learning and artificial intelligence.

The key features of Web 3.0 that most concern government officials and law enforcement are decentralization and anonymity. Not coincidentally, these are the same features that make crypto-based crimes and crypto-enabled criminal networks so hard to thwart.

The core idea of Web 3.0, and hence the Metaverse, is that it is entirely decentralized, meaning that no central power or government controls it. For many Web 3.0 evangelists, that is the central selling point: freedom from governmental control.

From a government regulator’s perspective, however, total decentralization is a huge problem. What it essentially means is that anyone can do anything, anonymously, and with no accountability, and there’s very little that conventional law enforcement can do to stop it.

Virtual crime, real-world victims

That’s not all. The trouble really starts when criminal activity in the Metaverse leaks over into the real world. At ACAMS, Lee asked his audience to imagine strapping on some virtual-reality (VR) goggles and walking into a building in the Metaverse: “Floor 1 is the ID theft room, where you exchange some sort of digital asset and they instantly give you a driver’s license, a date of birth — Personal Identifiable Information (PII) — that you can then go use for credit-card fraud, bank fraud, or whatever crime you can think of using PII.”

Floor 2 is the firearms floor in Lee’s digital dystopia. There, you can purchase the location of a gun in the real world, with no background check, “and now you’ve got a person who shouldn’t have a weapon,” Lee says. Floor 3 is devoted to human trafficking. Floor 4 to money laundering. Floor 5 to terrorism. And so on.

“It’s an ugly picture,” Lee warns.

Anjana Rajan is the Chief Technology Officer for Polaris, the largest anti-human trafficking NGO in the United States. At ACAMS, she explained that Congress should be concerned about the rise of Web 3.0 because of the “philosophy” of institutional distrust behind it. “It’s really about society and the future of our political system,” Rajan explains. “In its best form, this technology can create economic inclusion and a more secure internet, but in its worst form it can also drive the same thing that happened on January 6.”

Proponents of Web 3.0 have a distressing amount in common with anti-government violent extremists, namely, that “they don’t trust US institutions, they don’t trust the US dollar, and they don’t trust the corporations and oligarchs who run the economy,” she adds.

The difference is that Web 3.0 and the Metaverse are being built by some of the richest, most powerful people — and the largest tech companies (such as Meta, Google, and Microsoft) — in the world.

Re-thinking trust

A lawless, entirely unregulated Metaverse is not inevitable, these experts say, but it will require a re-thinking of some of the basic concepts upon which financial institutions and society at large are currently based, such as identity and trust. For example, our concept of identity in the real world revolves around a person’s PII, such as date of birth, social-security number, driver’s license number, address, etc. However, it may be time for the government “to start moving away from normal concepts of identity-based trust and instead move to concepts of trust within the ecosystem,” notes Frederick Reynolds, Chief Compliance Officer for the fin-tech start-up Brex.

In the ecosystem of the Metaverse, one’s identity is defined by the metadata on their blockchain, and trust within the ecosystem is built through blockchain activity that is independently verified by a decentralized network of fellow users. So in a sense, blockchains build trust by eliminating the need for it.

For better or worse, this is how the Metaverse works. Yet, if we’re not careful, these experts warn, criminals will figure out how to make it work for themselves before law enforcement can figure out how to stop them.

New communications demand a new approach to compliance
https://www.thomsonreuters.com/en-us/posts/investigation-fraud-and-risk/new-communications-demand-a-new-approach-to-compliance/
Mon, 28 Nov 2022

Modern unified communication (UC) tools have become a critical part of the communications infrastructure for many organizations. The use of Short Message Service (SMS), collaboration, and chat applications to conduct business is powering the work-from-anywhere era.

Yet, mistakes, data breaches, and data exposure tend to happen when people communicate and share information digitally, and firms need to make it as straightforward as possible for employees to leverage modern UC tools while remaining compliant and secure.

“Increased reliance on simple, easy-to-access but unauthorized chat and text platforms will pose a significant challenge for many types of entities operating in our markets. Internal compliance programs must adopt internal controls consistent with this new landscape. Firms must inculcate a culture of compliance at all levels of their organization to mitigate the risks associated with using unauthorized chat and text platforms.”

Kristin N. Johnson, commissioner, US Commodity Futures Trading Commission (CFTC), September 2022

In its 4th annual survey report on modern communications compliance and security, security and compliance software firm Theta Lake highlights the complex challenges faced by professionals tasked with maintaining compliance, security, and data privacy within firms. The report is based on the views and experiences of more than 500 compliance and security professionals from the heavily regulated financial services, healthcare, and government sectors across the United States, the United Kingdom, and Canada. It provides a snapshot of how communication platforms are being used and the issues organizations are struggling with, and it can help organizations benchmark their own practices and expectations against those of the wider industry.

Heightened regulatory focus on modern communications

The survey findings come against the backdrop of fines of more than $2 billion already levied by the US Securities and Exchange Commission (SEC) and the CFTC for organizations’ failures to capture, retain, and supervise communications. The situation underscores that a lack of visibility and oversight is one of the biggest risks firms face in a modern hybrid workplace. For example, the survey showed that two-thirds (66%) of financial services leaders believe employees are using unmonitored channels, posing heightened compliance and security risks.

“As technology changes, it’s even more important that registrants appropriately conduct their communications about business matters within only official channels, and they must maintain and preserve those communications.”

Gary Gensler, chair, SEC, September 2022 

The crackdown on non-compliant communications is the clearest indicator yet that regulators have lost patience with firms that have yet to address supervision and record-keeping risks that were exacerbated by the pandemic.

Attempts to offset these risks are made harder by the limitations of legacy supervision and archiving approaches, which themselves pose real risks and costs to businesses. As a case in point, 39% of survey respondents cited gaps in coverage as a top challenge with their existing archiving tools, while only 9% reported having no issues. Another 45% said they needed to be able to selectively archive written in-meeting communications like chat without having to record the video or audio. A mismatch between legacy tools built for email and today’s workplace, where 81% use chat and 63% use video equally or more than email, has created critical gaps in records. It has also put a spotlight on dated compliance tools that are unable to capture, retain, and supervise dynamic communications data.

“The time is now to bolster your record retention processes and to fix issues that could result in similar future misconduct by firm personnel.”

Sanjay Wadhwa, senior associate director of enforcement, SEC, September 2022

As a result, organizations face growing challenges to both enable communications across the platforms that employees and customers use while deploying technologies to appropriately capture, retain, and supervise these interactions to meet regulatory obligations.

“The [survey report] findings show just how integral modern communication platforms have become in today’s workplace, but there’s a lot of catching up to do when it comes to the compliance and security tools currently being used. The more than $2 billion in fines is the biggest wake-up call yet that compliance and unified communications teams need to be in lockstep to ensure a comprehensive approach to record-keeping and supervision.”

Stacey English, director of regulatory intelligence, Theta Lake

Proactive compliance needs modern tools

The views and experiences of survey participants highlighted numerous challenges that organizations need to overcome in order to stay safe and compliant in an increasingly complex communications environment.

Organizations are seeking specific capabilities in modern compliance tools, including the ability to capture contextual information such as reactions, emojis, GIFs, edits, or deletions as well as features like whiteboards. Tools also need proactive compliance functionality, including the capability to automatically post disclaimers and remove problematic content.

“Let me be clear here: I am talking about more than putting together a stock policy and giving a check-the-box training. This requires proactive compliance, and this type of approach has never been more important than today — a time of rapid and profound technological change.”

Gurbir S. Grewal, director, SEC Division of Enforcement, October 2021

Unsurprisingly, the control environment across all organizations is varied and complex, as approaches evolve to meet the rapid and constantly changing nature of communications and regulatory expectations.

Some 66% of survey respondents in the financial services industry are using documented usage policies as controls, with 65% using internally built platform controls, and 62% using specialist software to enforce policies. Almost half (45%) of organizations take a more draconian approach, however, by disabling features to limit the risk of new channels. Perhaps not surprisingly, the most frequently disabled features are camera functionality, file sharing, and screen sharing.

[Chart: the communication features most frequently disabled by organizations. Source: Theta Lake]

In the short term, bans and blocks may work as a control. Given that the features being disabled are essential, however, it is only a matter of time before employees circumvent such policies — an observation reinforced by the recent regulatory enforcement action.

Organizations need modern compliance and security technology to give them the confidence and assurance to unlock the value of the platforms in which they have invested, rather than disable them, allowing staff and customers access to the features they want to use.


For more, you can download a copy of Theta Lake’s 2022 Modern Communications Compliance and Security Report here.

The Shearman Analytics model: 6 steps before beginning your law firm tech implementation
https://www.thomsonreuters.com/en-us/posts/legal/shearman-legal-tech-implementation/
Wed, 02 Nov 2022

In this quarter’s International Legal Technology Association (ILTA) Peer-to-Peer magazine, the Thomson Reuters Institute sat down with law firm Shearman & Sterling to explore the firm’s technology implementation and decision-making process. There, firm tech leaders mapped out a journey that Shearman has dubbed Shearman Analytics.

The end goal of Shearman Analytics, says Glenn LaForce, the firm’s Global Director of Knowledge and Research, “is really modernizing firm systems.” After he and his fellow tech leaders joined the firm in early 2019, they enacted a plan to remove and replace legacy tech systems within a three-to-five-year window, re-architect them with a data lake at the center, and then filter that data in and out across the organization “to provide greater transparency, decreased cost, decreased risk, [and] increased profit.”

Even if the concept may sound simple, the execution is anything but. The Shearman team is still executing the Shearman Analytics modernization program, only recently undertaking some larger-scale implementations after focusing on the firm’s underlying tech infrastructure and data governance. Indeed, Chief Knowledge and Client Value Officer Meredith Williams-Range jokes that over the last several years, the firm was not “bringing the sexy back — now we’re getting to the sexy.”

On the way, Shearman learned some lessons about implementation that other firms can model. Here are the six steps of the Shearman Analytics model that the team undertakes at the beginning of every tech implementation project.

1. Governance — Law is a heavily-regulated industry, after all. Before actually implementing a piece of technology, Shearman says it’s crucial to determine what regulations will cover its use. “Do we need any new policies in place? How are we going to regulate this data? How are we understanding the governance aspects of that?” Williams-Range says. “Because if you put technology in place with zero governance, it is a crapshoot at that point.”

2. Change management — Before starting the technical work of implementing technology, Shearman begins its communication strategy early, explaining why it is making the change. Williams-Range dubs this an “engagement plan,” which solicits more active feedback than a training or communications plan. “If it’s going to take them out of their norm day-to-day, we have to have an engagement plan to do that,” she says.

Lawrence Baxter, Shearman’s chief technology officer, agrees, adding that leadership backing is crucial to effect change. “We don’t do stuff without sponsorship,” Baxter explains. “You’re going to fail, and you work harder than you thought.”

3. Rip & replace — With the baseline governance and change management underway, now comes the beginning of the technology portion, specifically how to remove a legacy technology system and replace it with something new. By necessity, this comes with a technology analysis of not only how the new system will work, but also how it will interoperate with the firm’s pre-existing technology stack. “It’s an octopus with 42 arms that are the other systems. So you have to look at it holistically, otherwise you’re going to lose a leg,” Baxter notes.

4. Process analysis — Simultaneously with the technology change, Shearman analyzes whether the new technology will change firm processes and how the firm’s employees actually extract value from the tool. Or as Baxter puts it, “If you throw technology at a bad process, you just end up with a really fast bad process, right?” The firm will map out what type of processes interact with the piece of technology; and if any can be re-architected to provide more efficiency and less risk, the firm will map out a plan to begin that change. “Wherever possible, it is easier to change your processes to fit the technology as opposed to changing the technology to fit your processes,” he adds.

5. Data analysis — This type of data analysis is less tracking the metrics of the tool’s use or its ROI, and more the actual data that the technology uses. Determining what data is actually being utilized can provide an opportunity to dispose of data that could provide another risk vector for the firm. Williams-Range notes that with Shearman’s recent financial system implementation, “we are literally going point-by-point of data. Why is it here? ‘Well, because it’s always been.’ That is not the answer. The answer is, should it be here? Is this the right placement for this? Is this the golden source for that data architecture, and should it go somewhere else?”

6. Architecture — Finally, the technology team determines the method of implementation and what is driving the technology on the back-end. Increasingly, the answer is the cloud. In recent years, the firm has implemented a new global backbone based on SD-WAN [Software-Defined Wide Area Network]; Office 365 across the organization; and an Azure-based Active Directory single sign-on.

Jeff Saper, Shearman’s Global Director of Enterprise Architecture and Delivery Services, says the firm’s tech leadership intends for the cloud to continue to be the architecture answer moving forward. “We had the very similar mindset of saying, it gives us greater agility,” Saper says. “We become less reliant on capital expenditures and more reliant on agile services.”

Emerging Legal Technology Forum: Building stronger client relationships requires balance
https://www.thomsonreuters.com/en-us/posts/legal/emerging-legal-technology-forum-building-stronger-client-relationships/
Thu, 27 Oct 2022

TORONTO — Since the start of the COVID-19 pandemic, a shift has occurred in how clients and their law firms interact. What was once a regular set of in-person meetings suddenly shifted to a calendar filled with Zoom calls, and although some in-person meetings have resumed, the mix between the in-person and virtual has been irrevocably altered.

At the same time, a parade of collaboration technologies such as Microsoft Teams and Slack began to take even more prominence, creating new touchpoints for law firms to track and measure.

The result has been an explosion of customer relationship data to help firms make decisions and better establish connections with their clients. In order to best take advantage of this new paradigm, however, it’s still important to utilize both this new data and a more traditional, personal touch, said panelists at the Thomson Reuters Institute’s recent 5th annual Emerging Legal Technology Forum. The key, of course, is finding the right balance.

The data in hand

Joy Cruz, Director of Business Intelligence & Data Analytics at management consulting company RSM US, said during the Forum’s panel, Ascendant Engineering: Emergent Techniques in Data Analytics and Strategic Account Management, that some of the common metrics that law firms should be using to measure their client relationships haven’t changed: profitability, productivity, client satisfaction, realization rates, and related data “bringing that whole story together in terms of understanding what you have, what you’re doing, how you operate historically, [and] what you can do.”

But what’s different since the pandemic is that data sources have exploded, meaning that even knowing where all of the necessary data resides is a harder challenge than ever before. For a law firm trying to gather a response to an RFP, 85% of the time may be spent hunting for the relevant answers, Cruz estimated. And while many law firms are talking about executing a data plan, many can’t even take the first step of gaining insight into their data.

Joy Cruz, of RSM US

“The goal is to flip that so it becomes easily accessible to you,” Cruz explained. “One of the things we’re missing is that we’re not able to do the analysis piece yet, because it’s not available to you.” Indeed, without the data-gathering step, “you’re making decisions based off of data that’s provided to you, but that might not be the full story,” she added.

Panelist Olalekan (Wole) Akinremi, a partner at law firm Deeth Williams Wall, noted that from his days on the corporate side, clients have already begun to take that step in evaluating their outside firms — particularly when it comes to tracking costs. He said that tech-enabled analysis can better look into outside counsel time and billing, contracts, and automation to free up time for more complex matters that are becoming more commonplace. Law firms also can take cues from their clients about how to use data to augment their arguments, Akinremi noted.

For example, “you can also go to management and say, we have two paralegals handling 1,000 requests, we need more support,” he said. “The proof is in the results.”

With the rise in data-driven decision-making, however, can come a tantalizing misstep: Over-reliance on data at the expense of other tools in the relationship-building toolbox. Panelist Philipp Thurner, CEO of relationship management software company Nexl, said that while raw data figures certainly help, “that might not tell you the quality of the relationship.

“Data can tell a story,” Thurner added. “But you can have one data set and can tell a million different stories from it.”

Thurner gave the example of counting email interactions: a hundred emails back and forth between a firm and their client could be construed as a strong relationship, particularly if those emails are increasing over time. But if those emails are surface-level interactions or about administrative tasks, the raw number may not reveal a relationship on rocky ground. “How do you judge a relationship?” he asked. “I think it’s up to us as human beings.”

Where data & relationships collide

In a later panel, titled Journey’s End: Maximizing Value in Client Experience, the discussion elaborated on that general premise. Suzanne Donnels, Chief Business Development & Marketing Officer at law firm Davies Ward Phillips & Vineberg, said she has noticed a difference between corporate clients who are actively involved in the firm/client relationship, and those purely focusing on data. “It’s harder for Davies to compete when you’re dealing with procurement departments, [because] they’re just looking at a number next to a name,” she explained, adding that a closer relationship means differentiation with “understanding their clients and the business that they’re in, and really figuring out solutions.”

Olalekan (Wole) Akinremi, of Deeth Williams Wall

Panelist Janet Sullivan, eDiscovery Counsel and Global Director of Practice Technology at White & Case, agreed with Donnels, noting that success metrics will inherently be different for different clients. Her firm’s strategy is called LIFT — Local Information, Firmwide Transformation — which establishes a standardized firm goal of how to drive success, but with the flexibility for bespoke solutions for each client.

To actually measure whether a firm relationship is successful, Sullivan said that repeat business is of course important, but that is just the baseline metric. What can set a firm apart, she said, is consistently gauging and collecting those success metrics throughout the life of a matter. “Not waiting until the end to say, ‘How did I do?’, then having to do a post-mortem and go back to all the things we might have done wrong.”

Sullivan admitted that it can be a fine line between asking for this data while not placing an undue burden on the client; however, there’s more than one way to tackle the issue depending on the type of data that’s needed.

However, panelist Fernando Garcia, who has served as General Counsel for a number of smaller legal departments, noted that law firms should approach this process with caution because of the time and personnel resources needed, as well as another hidden danger in soliciting client feedback.

Firms then need to respond to what they’ve learned, Garcia explained. “Be careful when you ask,” he said. “Because you’re going to get answers, and you have to act on those answers when you get them.”


You can learn more about how to create the kind of partnerships that will drive the strategic, financial, and operational priorities of your corporate law department here.

Practice Innovations: Zero trust — Never trust, always verify
https://www.thomsonreuters.com/en-us/posts/legal/practice-innovations-migrating-zero-trust/
Fri, 21 Oct 2022

How can you best secure your computer systems in today’s world? “Trust no one or anything — and always verify.” This is the basic idea behind zero trust, a new way to look at computer security. Zero trust works on the assumption that your networks are already breached, your computers are already compromised, and all users are potential risks.

For years, traditional systems security has followed the “trust but verify” method, in which users are automatically trusted once they are logged into a system. The emphasis is on protecting internal systems and information from outside attackers by using firewalls and passwords.

Unfortunately, as technology and attackers have grown more sophisticated, the “trust but verify” method has become harder to maintain and less effective. Organizations have had to change their approaches to systems security to accommodate traveling users, users who work from home, users who bring in their own devices, as well as cloud-based software, other repositories, and more. The traditional boundaries of a network perimeter are drastically changing.


Migrating to a zero trust model can be done gradually, which is a benefit for smaller organizations that cannot afford a large initial investment.


With the growth of cloud computing, organizations are globally connected, and their digital information is stored and used in private and public clouds of data and applications. Conventional boundaries for an organization’s network have expanded and become ever more obscure, opening the potential for cybersecurity problems. Zero trust offers a new way of viewing our computers and information that may make securing them easier.

With zero trust, implicit trust is eliminated, and continuous verification is required. By always assuming that a security breach has likely already occurred, a zero trust system will constantly limit access to only what is needed while continuously looking for malicious activity. Zero trust can reduce an organization’s risk from data breaches, ransomware, and insider threats. While zero trust is clearly more restrictive, it can simplify an organization’s cybersecurity defensive posture and provide a more easily secured system environment to better protect the organization’s data and assets.

In a security breach, trust is a vulnerability that is exploited. By eliminating implicit trust, an organization’s systems become more secure and data breaches become less likely. This lack of trust doesn’t mean you don’t trust your users; rather, it is akin to requiring users to use a key card every time they access a building.

Zero trust recognizes the reality that today’s computer systems are hostile places. Yet, zero trust is not a product or an application. It is a set of principles that help you define a cybersecurity strategy based on an acknowledgement that threats exist both inside and outside traditional network boundaries.

The first step with zero trust, as with any new method or technology, is to understand how it addresses your organization’s unique business problems. What outcomes do you expect? How does zero trust address your needs? Without understanding your business needs and problems first, any new method or technology will ultimately fail.

Building zero trust

Migrating to a zero trust model can be done gradually, which is a benefit for smaller organizations that cannot afford a large initial investment. According to the US National Institute of Standards and Technology (NIST), many organizations may continue operating their newer zero trust systems in tandem with their older perimeter-based systems for years. To plan and architect your zero trust network, the following initial steps are suggested:

      • Start by building leadership trust — You need to seek understanding, support, and input from your firm’s leadership. Management support is critical to a successful transition to zero trust.
      • Define your most vulnerable attack surfaces — Start by identifying your biggest risk areas both now and in the foreseeable future, and work to apply initial zero trust initiatives that encompass processes, people, and your existing technology. Moving gradually will keep your firm from becoming overwhelmed with implementing new technology and policies across entire systems.
      • Map how your data flows — Document how your data moves around your devices, applications, and assets. It is essential to understand this data flow. Who is using it? Where is it coming from? To identify which data flows should not be trusted, you need to know which are critical to your firm and should be allowed. This mapping of data flow is the key to making zero trust work.
      • Harden your identity management — Users are the weakest link in any security system. Review your user authentication process and implement multi-factor authentication and tougher password policies. Also, regularly review login accounts and make sure they match active users.
      • Assign minimum rights (least privilege) — Review how your systems and data are secured and assign the minimum rights to the minimum number of accounts needed to access data or systems. The default access should be no access.
      • Whom do you trust? — Build a whitelist of whom to trust. This includes users, devices, applications, processes, and network traffic.
      • Micro-segment your security — Dividing your security into smaller segments allows you to minimize any damage in case of a breach or compromise of any one area.
      • Define your zero trust policies — After you have architected your new system, write the needed policies to match. Define who, what, when, where, why, and how for every user, device, and network that gains access to your system.
      • Monitoring is critical — As you build your zero trust system, it is critical to have an aggressive monitoring system in place. For zero trust to be effective, you will need to continuously monitor access, looking for any area where trust should be revoked and for any unwanted access that should be identified.
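The steps above describe these controls conceptually. As a rough illustration, a deny-by-default access check that combines least privilege, whitelisting, and continuous verification might look like the following Python sketch. All names, fields, and the policy structure here are hypothetical, invented for this example, and not drawn from any particular zero trust product:

```python
# Minimal, hypothetical sketch of a zero trust access check.
# Illustrates deny-by-default, least privilege, and per-request verification.
from dataclasses import dataclass, field


@dataclass
class AccessRequest:
    user: str
    device_compliant: bool   # e.g., patched, managed device
    mfa_verified: bool       # multi-factor authentication passed
    resource: str
    action: str


@dataclass
class Policy:
    # Whitelist of least-privilege grants, keyed by (user, resource).
    grants: dict = field(default_factory=dict)

    def allow(self, user: str, resource: str, actions: set) -> None:
        self.grants[(user, resource)] = actions

    def evaluate(self, req: AccessRequest) -> bool:
        # Continuous verification: identity and device posture are
        # re-checked on every request, not only at login.
        if not (req.mfa_verified and req.device_compliant):
            return False
        # Default is no access; only explicit grants are honored.
        allowed = self.grants.get((req.user, req.resource), set())
        return req.action in allowed


policy = Policy()
policy.allow("t.fletcher", "billing-db", {"read"})

# A granted action on a verified, compliant device is allowed;
# anything outside the explicit grant is denied.
granted = policy.evaluate(
    AccessRequest("t.fletcher", True, True, "billing-db", "read"))
denied = policy.evaluate(
    AccessRequest("t.fletcher", True, True, "billing-db", "write"))
```

Note the two design choices the steps above call for: access defaults to no access unless an explicit grant exists, and identity and device posture are re-evaluated on every request rather than only at first login.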

Zero trust is a journey that will take years to complete. “Never trust, always verify” is a fundamental shift in how we currently think about security, but it is a necessary shift. Security breaches are on the rise, and our old paradigms of security are not working as more devices come online and local networks evolve to cloud networks. Our data is increasingly at risk, and zero trust is a new and more effective way to protect ourselves.

Practice Innovations: Knowledge management strategies in a zero trust model
https://www.thomsonreuters.com/en-us/posts/legal/practice-innovations-knowledge-management-zero-trust/
Tue, 18 Oct 2022

We understand that knowledge management (KM) is the preservation and sharing of what we know, and that what we know is gained through individual experience as well as tacit and implicit knowledge. Therefore, organizations and leadership might infer that the zero trust model and zero trust architecture — a security framework that assumes no traditional network edge and requires all users, even those in-network, to be authenticated and continuously authorized before being granted access — are an impediment to a mature KM culture.

Yet, what is considered an impediment and barrier to KM is often the result of confusing KM with information management (IM).

Instead, KM and IM should be considered more alike in their value systems than as competing priorities in which an organization must choose between securing information and data and sharing them. In accepting that there are both enablers and barriers to any organizational priority, a strong KM culture includes many of the same enablers that zero trust is tasked with supporting. KM, when it is aligned with zero trust, creates an even stronger KM value in the organization. And zero trust, like KM, succeeds best when working from the position of the four KM enablers: people, process, technology, and governance — as well as a strong organizational policy, which is critical for zero trust.

The successful implementation of KM and zero trust should be:

      • business focused;
      • supported by senior management;
      • embedded with the strategic vision and principles of the organization;
      • focused on higher value knowledge and higher value data;
      • able to demonstrate measurable benefits, such as competitive advantage and process improvement in tandem with risk mitigation and security; and
      • employed as a full organizational change.

Despite the decades-held belief that most security threats are external, insider threats have risen to become a serious cause for concern, driven most recently by the extension of network access across mobile devices, cloud users, and employees working in hybrid or fully remote environments.

Behind the emergence of zero trust is a broad concept that applies to technologies, networks, IT architectures, and security policies. This concept holds that users within a network should be treated as if they could pose a threat. Therefore, enterprise resources and data are to be protected individually and access to these resources should be evaluated and analyzed continuously.

The zero trust future

Zero trust is not a particularly novel approach. IT professionals would consider the principles of this model to be good housekeeping practice for any healthy, secure enterprise. Most IT professionals have long taken great pains to design systems that treat inside risk as being as dangerous as any other risk. Therefore, zero trust systems have been developed to behave as an integrated platform that contextualizes information based on identity, shifting security from traditional perimeter models (e.g., firewalls) to an identity-centric one. Through this process, key questions emerge, such as: Who has access to what information? When do they have access? How much access is given, and what business purpose does their access support?

This identity-centric approach is consistent with KM mapping. KM mapping outlines the business challenge of what we know with strategic goals that can then be supported with KM interventions, such as a knowledge base, intranet, sales wikis, and CRM platforms. Additionally, to be successful, both KM and zero trust require agreed-to measurable outcomes.

This simplified explanation of zero trust in a KM world is consistent with KM values that improve business agility, which brings with it the priority of protecting internal data and internal assets.

Strategies to overcome the perceived KM barriers brought on by a commitment to zero trust overlap with the implementation of zero trust models themselves. These strategies include:

      • mapping “need to know” information (KM) alongside “need to secure” (zero trust);
      • finding common alignment with strategic goals;
      • outlining business objectives and agility with business security; and
      • agreeing upon measurable benchmarks and outcomes, remembering that i) not all measures are monetary values; ii) not all measures should be targets; and that iii) common solutions can be identified to overcome “imposed” targets.

Much like KM, zero trust is a new mindset that requires sweeping changes to be implemented effectively. On the surface this seems daunting, but after evaluating KM and zero trust, both can be implemented to improve organizational value and effectiveness.

How will the landscape for ESG tech & data analytics solutions evolve?
https://www.thomsonreuters.com/en-us/posts/investigation-fraud-and-risk/esg-tech-data-analytics-solutions/
Mon, 10 Oct 2022

The ecosystem of technology and data analytics solutions around environmental, social & governance (ESG) activities is vast and fragmented, yet new developing areas in this landscape continue to emerge.

Before any organization begins the procurement process for the solutions it may need for its ESG strategy and tools, there are critical elements that first need to be worked through in order to ensure that the best solutions are evaluated and fit for purpose, and that valuable resources are not wasted.

Defining ESG for a specific organization

There are several aspects to the challenge of defining ESG for a specific organization. The first step is to figure out which elements (climate risks, health and safety, and governance structure, for example) of the multitude of possible frameworks best apply to a particular company within a specific sector. It is also necessary to understand which parts of these frameworks could have a material financial impact on the organization’s operations. For public companies, it is important to go one step further and ascertain each factor’s influence on the company’s overall creditworthiness as assessed by external rating agencies.

“The definitional problem exists within the companies, the rating agencies that seek to judge them, and certainly within the models and the processes of investors who are looking to compare and contrast companies’ ESG risks and opportunities and investment worthiness,” says Andrew Archer, Head of ESG Advisory at European shareholder intelligence advisory firm Investor Update.

Ensure data quality with standard processes, controls & governance — Good data that provides transparency and accountability is critical to any ESG initiative. The challenge, however, is that sources of ESG data can sit across multiple corporate functions in siloed information systems. The richness, yet isolated nature, of that data leads to complexity that complicates information integrity, because the data often comes from different sources and is housed on different platforms.

Make it comparable with industry peers — One of the ongoing complaints about ESG from a variety of corporate investors is that a lack of standardization makes it difficult to analyze a point-by-point comparison for investment opportunities. This in turn makes it challenging to determine whether a company within a particular industry is doing a more effective job at reducing ESG risks relative to its competitors.

The data analytics solution

ESG technology and data analytics solutions can be valuable in extracting ESG data from multiple information systems into a common platform and in creating the kind of comparative basis for assessing and judging companies that operate in multiple jurisdictions with many business lines across different geographical markets.

Despite the multilayered, convoluted requirements and landscape, ESG is graduating toward the highest common denominator, not the lowest, at least from an access-to-capital-markets perspective. And this is a key difference when compared to past financial regulatory and compliance-related endeavors, explains Archer.

Yet problems in this process remain. The internal enterprise technology systems and processes currently used to collect, analyze, and communicate ESG information are inadequate and need review, tweaking, or an overhaul. And while the technology capability exists, the market of technology companies each claiming to solve a piece of the ESG complexity problem remains fragmented.

Below are several examples of technology solutions that take on some aspect of the ESG data complexity problem.

Follow the capital — Companies too often do not know how to judge their own performance or where best practices really exist, especially around their ESG information and strategy. To cut through the noise, some solutions provide transparency into where ESG-related capital is flowing, showing how it is being allocated by investors across a specific sector or a set of peers.

For example, a company can see exactly how much capital is being allocated by investors to its competitors or peers through Investor Update’s platform, Archer says. Those companies attracting the most capital from specialist ESG investors are also the ones that demonstrate best-practice ESG disclosure, and it is those behaviors that should be emulated, he adds.
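The "follow the capital" idea described above amounts to aggregating holdings data by investor type. The sketch below is a hypothetical illustration of that aggregation, not Investor Update's actual platform or data model; all investors, companies, and figures are invented:

```python
# Hypothetical sketch: given holdings data, total how much capital
# specialist ESG investors allocate to each company in a peer set,
# then rank the peers by that total.
from collections import defaultdict

holdings = [
    {"investor": "GreenFund",  "esg_specialist": True,  "company": "Acme", "usd_m": 120},
    {"investor": "GreenFund",  "esg_specialist": True,  "company": "Bolt", "usd_m": 40},
    {"investor": "BroadIndex", "esg_specialist": False, "company": "Acme", "usd_m": 500},
    {"investor": "EcoCap",     "esg_specialist": True,  "company": "Bolt", "usd_m": 90},
]

def esg_capital_by_company(holdings):
    """Sum allocations from ESG-specialist investors, highest first."""
    totals = defaultdict(float)
    for h in holdings:
        if h["esg_specialist"]:
            totals[h["company"]] += h["usd_m"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(esg_capital_by_company(holdings))
# Bolt attracts $130M and Acme $120M from ESG specialists, so Bolt
# ranks first; BroadIndex's $500M is excluded as non-specialist capital.
```

The point of the ranking, per Archer, is that the peers attracting the most specialist ESG capital are the ones whose disclosure practices are worth emulating.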

Improve transparency by industry — The corporate legal function plays an important role in a specific company’s ESG strategy and direction. Indeed, the legal team provides advice and contributes to the company’s overall ESG objectives through its internal activities and use of outside legal counsel.

As part of this legal supply chain, the general counsel selects law firms based on fees and skills, and increasingly factors ESG into the selection process as well. Law firms are not known for their transparency on ESG, however, due mostly to the difficulty of collecting and comparing information across firms, according to Yannick Hausmann, a former Group General Counsel and co-founder of impactvise.

To increase transparency for the legal industry as a whole, Hausmann and Adrian Peyer, a former in-house counsel executive and current company secretary, created impactvise, a platform that includes a database of publicly available ESG information on more than 1,000 law firms across the globe. The platform also gives users the ability to compare data by geography and peers, using the World Economic Forum’s framework.

By focusing on the legal industry, the platform lets both general counsel and law firms understand where they rank within their peer group on key ESG factors. It also includes scoring related to “skills of the future” specifically for lawyers, tracks how law firms treat employees and care for their well-being, and uses media sentiment analysis to ensure ESG is more than a branding exercise for the firm. From a governance perspective, impactvise also scores how a firm navigates conflicts of interest and its client acceptance policies, according to Hausmann and Peyer.

Mitigate ESG risk — Another area evolving in the ESG data landscape is identifying ESG risk and predicting such risk in the future. For example, a partnership among Kona AI, the Massachusetts Institute of Technology (MIT), and several law firms established Integrity Distributed, a not-for-profit shared technology platform that allows organizations all over the world to contribute their ESG and corruption intelligence as a mechanism to train algorithms to better detect patterns of fraud and corruption in their respective industries. The goal of the collaborative project is to predict improper or corrupt payments with up to 90% statistical accuracy.
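Integrity Distributed's actual models are trained on shared intelligence contributed by participating organizations; as a much simpler stand-in for the underlying idea of flagging payments that break from historical patterns, here is a minimal anomaly-detection sketch. The threshold, data, and method are invented for illustration and are not the project's actual approach:

```python
# Minimal sketch of payment-pattern anomaly detection: flag payments
# whose amounts deviate sharply from the historical norm via a z-score.
import statistics

def flag_suspicious(payments, threshold=2.5):
    """Return payments more than `threshold` std deviations from the mean."""
    mean = statistics.mean(payments)
    stdev = statistics.pstdev(payments)
    if stdev == 0:
        return []  # no variation, nothing stands out
    return [p for p in payments if abs(p - mean) / stdev > threshold]

# Nine routine payments and one that breaks the pattern (invented data).
history = [100, 105, 98, 102, 99, 101, 103, 97, 100, 5000]
print(flag_suspicious(history))  # → [5000]
```

Production systems like the one described would use trained models over many features (counterparty, geography, timing, description text) rather than a single univariate threshold, which is what the shared training data is meant to enable.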

Divergent systems and platforms continue to come to market without standardization in mind, but that is improving as the various definitions and frameworks converge. Only time will tell how fast or slow that merging occurs, yet the willingness and necessity to create a positive impact will set the pace for change around ESG data analytics going forward.
