The Fully Outsourced Fund Manager

Guest Contributor: Scott Price, Regional Director, Americas, Custom House Global Funds Services

There will come a time, possibly in the not-too-distant future, when a fund management firm will look to outsource virtually all of its operations – even trading – for funds with the right strategy.

Of the challenges fund managers face today in running their businesses, performance is the obvious and ongoing one. Putting that aside, from a business standpoint it is uncertainty in the regulatory environment that is causing the most pressure. New regulations such as FATCA require investment management firms to invest in new technology and systems to cope, and managers will have to either build that technology themselves or outsource it. From a compliance standpoint this is very expensive, because the regulatory environment is always changing.

Costs continue to grow across investment management. Building technology is obviously expensive, and it is no longer viewed as an expense that is always necessary. The majority of a management firm’s spending should be focused on research, and many managers believe that anything post-trade should be at least partly outsourced. Whether the function is regulatory reporting, risk reporting or portfolio accounting, outsourcing it lets the management firm spend its money and effort on research, execution and, of course, raising capital.

Another major challenge facing management firms is responding to investors’ demands for transparency. During operational due diligence, managers are required to show how they calculate risk, and it is very expensive to build that type of infrastructure. Going forward, more regulation will drive the need for more transparency. Given the uncertainty of the regulatory environment, successful managers will focus on their core activities and outsource many of these functions to their administrator.

Not only are costs rising, but fund management firms are also being pushed to lower fees. The 2/20 and 2/30 models barely exist anymore; the new paradigm is closer to 1/15 or even zero management fees. As performance fees come under increasing pressure, other costs of running a fund are being passed along to investors at the fund level rather than being absorbed at the management level. As they are asked to reduce fees, management firms are looking across their operations and deciding whether they need a full-time compliance officer, a full-time trader or a full-time COO. The answer may be “no, no, no.”
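
To put the fee compression in rough perspective, here is a small illustrative calculation; the fund size, return and fee schedules below are hypothetical, and the simple no-hurdle, net-of-management-fee convention is only one of many ways performance fees are actually charged.

    # Illustrative only: hypothetical fund size, gross return and fee schedules.
    aum = 500_000_000          # assets under management, USD
    gross_return = 0.08        # 8% gross return for the year

    def annual_fees(aum, gross_return, mgmt_rate, perf_rate):
        """Management and performance fees for a simple no-hurdle year."""
        mgmt_fee = aum * mgmt_rate
        perf_fee = max(aum * gross_return - mgmt_fee, 0) * perf_rate
        return mgmt_fee, perf_fee

    for label, mgmt, perf in [("2/20", 0.02, 0.20), ("1/15", 0.01, 0.15)]:
        m, p = annual_fees(aum, gross_return, mgmt, perf)
        print(f"{label}: management ${m:,.0f}, performance ${p:,.0f}")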

Having all of your services bundled into one relationship is the new concept. For today’s fund management firm, simply making trading decisions and raising capital with everything else being outsourced is very much the play. Fund management firms are rationalizing their business not just to reduce cost but to focus on their core responsibilities.

As a result, various stakeholders within the fund management firm are looking at which functions can be outsourced. Funds are looking to outsource the COO responsibility entirely to an external firm – compliance, reporting, daily administration and technology infrastructure included – allowing the principals to concentrate purely on capital raising and portfolio management. Increasingly, broker services are also being built to outsource trading for funds that only need to execute smaller daily orders, allowing some funds to outsource front-office functions as well.

Outsourcing is both a time and a cost question, and having one vendor or provider can limit costs, whether the function is risk reporting, compliance services or administration such as striking the actual NAV (net asset value). It’s about aligning priorities. By bundling all these services with one provider, a fund management firm can significantly push down its total cost of ownership.
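
For readers less familiar with what “striking the NAV” involves, the core arithmetic is simple; the positions, cash, liabilities and share count below are hypothetical.

    # Minimal, illustrative NAV strike with made-up positions and figures.
    positions = [
        {"asset": "Equity A", "quantity": 10_000, "price": 52.30},
        {"asset": "Bond B",   "quantity":  5_000, "price": 98.75},
    ]
    cash = 250_000.00
    accrued_liabilities = 35_000.00      # management fees, expenses payable, etc.
    shares_outstanding = 100_000

    gross_assets = cash + sum(p["quantity"] * p["price"] for p in positions)
    nav_total = gross_assets - accrued_liabilities
    nav_per_share = nav_total / shares_outstanding

    print(f"Total NAV: ${nav_total:,.2f}  |  NAV per share: ${nav_per_share:.4f}")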

At the same time, fund managers want to avoid the complexities and headaches of dealing with multiple vendors. It’s much easier to manage one service level agreement with one entity than multiple service level agreements with multiple entities.

 


When is the Cloud Not the Cloud?

Guest Contributor: Mal Cullen, Head of the Americas and Eagle ACCESS℠, Eagle Investment Systems

Companies in the financial services sector are increasingly moving to cloud-based systems for their software solutions, and with good reason. Understandably, cloud-based solutions were initially treated with some circumspection, with question marks hanging over data privacy and security.

As time has gone on, the widespread adoption and success of cloud computing in other sectors have allayed these concerns to a large extent, and financial services firms that two or three years ago were reluctant to entertain web-hosted technology solutions are now keen to embrace the technology and the wide-ranging benefits it offers. Last year, 95% of our clients’ solutions were deployed over our private cloud, and it is now very much the exception to receive requests for on-premise installations.

True cloud-based solutions can deliver numerous tangible business benefits. For example, a platform that has been developed for the cloud using an intelligent open architecture opens up the possibility of vendors providing complementary products and services – such as performance measurement tools – that can plug into the platform. Clients can therefore have access to integrated solutions, with fewer partners to manage and less complexity.
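
As a rough sketch of what “plugging into” an open platform can look like, consider the following; the interface and service names are hypothetical illustrations, not Eagle’s actual API.

    # Hypothetical sketch of an open plug-in architecture, not any vendor's real API.
    from abc import ABC, abstractmethod

    class PlatformService(ABC):
        """Contract a complementary service implements to plug into the platform."""
        name = "base"

        @abstractmethod
        def run(self, portfolio_data: dict) -> dict:
            ...

    class PerformanceMeasurement(PlatformService):
        name = "performance-measurement"

        def run(self, portfolio_data: dict) -> dict:
            start, end = portfolio_data["start_value"], portfolio_data["end_value"]
            return {"period_return": end / start - 1}

    registry = {}                          # service name -> plugged-in service

    def register(service):
        registry[service.name] = service

    register(PerformanceMeasurement())
    print(registry["performance-measurement"].run({"start_value": 100.0, "end_value": 104.5}))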

Furthermore, cloud-based solutions can be accessed, and services provided, from anywhere in the world, at any time. They also offer the possibility of variable, ‘pay-as-you-go’ pricing structures that are popular with many clients.

Any solution should be underpinned by a robust set of transparent SLAs outlining its capabilities, availability, disaster recovery and security.

Note that I said true cloud-based solutions… The Eagle technology has been specifically enhanced and optimized to integrate into a larger cloud platform. The fact is, many software providers claim to offer hosted or cloud-based solutions when in fact they are offering either a solution that is still installed on-premise and supported remotely, or an on-premise solution that has simply been housed in a data center. Neither approach is a truly cloud-based solution, and both come with significant drawbacks.

Because they haven’t been designed from the bottom up with an intelligent open architecture, the benefits around flexible pricing and complementary product plug-ins cannot be achieved. At the same time, these solutions may come with risks attached. Where servers are sitting on a network somewhere on the internet that is perhaps managed by a third party, keeping the solution secure is a huge challenge. As well as having industry-leading security controls, we organize our clients’ data so that it remains segregated or isolated, meaning the data stays identifiable and easily retrievable as well as completely secure.

In addition, a true cloud provider should offer comprehensive support to help clients make the transition from an on-premise, managed solution to the cloud-based solution. A successful implementation relies on helping the client adapt their processes to take full advantage of the service’s functionality and ensure maximum benefit to their business.

Listed below are seven questions to ask any cloud-based solution provider.

1. Is there an ability to provide capacity on demand and at a variable cost basis?
2. Do you have the ability to integrate with other providers under an intelligent open architecture?
3. Do you have security controls implemented around SOC1, SOC2, ISO 27001/2 or NIST and are you continuously monitoring these controls?
4. Can the solution be accessed from anywhere in the world, at any time?
5. Are systems managed and upgraded by you, remotely?
6. Will you provide a comprehensive and transparent set of SLAs?
7. Will you actively support implementation by matching the business processes with the solution’s functionality?

Unless they can answer ‘yes’ to all of them, they are not truly cloud-based.


Greek Mythology and the Boulder of Compliance

Guest Contributor: Stephen Taylor, Senior Market Manager, U.S. Enterprise Risk & Compliance, Wolters Kluwer Financial Services

In Greek mythology, Sisyphus, the king of Ephyra, was punished by the gods for his wrongdoing. His sentence? Roll a very heavy boulder up a steep hill, only to watch it roll back down. Repeat for eternity.

Sometimes, the compliance world feels as if it has been punished by the Greek gods. Regulations are released and implemented, only to change again. The process seems never-ending.

Across the financial services industry, regulatory and industry change is inevitable. But the volume of change has expanded dramatically since the financial crisis. Dodd-Frank shook things up significantly, and thousands of changes have been announced on top of thousands already underway.

All rules are important, but some are more important than others. And, because of the volume and velocity of change, it’s easy to miss the crucial changes in all the noise. The most critical changes are the ones that can affect the operational effectiveness of an organization, especially those that can lead to regulatory censure, reputational damage, internal losses and lots of finger-pointing by senior management.

Many organizations have experienced sleepless nights over the developments proposed by the Volcker rule. Likewise, the regulation of OTC derivatives globally is causing considerable heartburn. Multiple agencies are involved in these developments, sometimes proposing exactly the same changes but in different documents. The margin for error is huge, especially since for many this is uncharted territory. Developing even a basic understanding of the requirements is difficult, but figuring out how to implement those requirements against firm-specific trading and compliance processes can cause anxiety in its purest form.

For many, managing regulatory change is done manually: organizations go to a regulator’s website and see what’s new. For organizations regulated by multiple federal agencies and states, this is a very onerous task (Sisyphus might think he got off lucky). Add the global dimension into the mix and the resource requirements for manual tracking become huge in both time and cost. But tracking the regulations is one thing; understanding which ones are relevant and how they impact the business is another entirely, which makes this a very high-risk process.

There is no easy way around the challenges presented by such an environment of regulatory change. Employing a manual process is not the answer. It takes more than a group of dedicated individuals to roll this particular boulder up the hill. Those firms that are becoming successful at managing regulatory change employ technology to do a lot of the heavy lifting, but, again, technology without the right initial planning, processes and culture may prove pointless.

When thinking about success, firms should consider doing an audit of the rules that will affect their business. For some, especially the larger institutions operating in multiple jurisdictions across multiple product lines, this is a massive task. Unfortunately, there is no way around it. Regulators demand that organizations know the rules they are beholden to and how they are managing these compliance obligations. This task shouldn’t be seen as one that simply satisfies the regulators: mapping the rules that affect your business can have significant upstream benefits over time, and it could mitigate regulatory action.

Once firms understand their obligations, the obligations can then be mapped to policies, procedures, lines of business, desks, functions, jurisdictions, groups and individuals. Again, this is a huge initial undertaking but one which can ultimately save considerable time and cost when it comes to managing ongoing changes.

Once the mapping is complete, it will most likely be out of date immediately. In fact, for some, it can become out of date during the initial mapping process! This is when technology blended with the appropriate regulatory content can become incredibly beneficial. With the right platform and the right content update feeds, maintaining the compliance infrastructure can be achieved far more easily than trying to manage it all through Excel pivot tables.
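
As a minimal sketch of what that mapping looks like in a structured form rather than in a spreadsheet, consider the following; the rules, desks and owners are made up for illustration.

    # Illustrative obligation map: the rules, desks and owners are hypothetical.
    from collections import defaultdict

    obligations = [
        {"rule": "Volcker - proprietary trading", "desk": "Equities", "owner": "Compliance-NY"},
        {"rule": "OTC derivatives clearing",      "desk": "Rates",    "owner": "Compliance-LDN"},
        {"rule": "OTC derivatives reporting",     "desk": "Rates",    "owner": "Ops-Reporting"},
    ]

    # Index the map both ways, so a rule change can be traced to the affected desks
    # and a desk can list every obligation it is beholden to.
    by_rule, by_desk = defaultdict(list), defaultdict(list)
    for o in obligations:
        by_rule[o["rule"]].append(o)
        by_desk[o["desk"]].append(o["rule"])

    def impacted(rule_change):
        """Which desks and owners must act when an external rule change arrives."""
        return [(o["desk"], o["owner"]) for o in by_rule.get(rule_change, [])]

    print(impacted("OTC derivatives reporting"))   # [('Rates', 'Ops-Reporting')]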

Critical to success is the ability to quickly identify what has changed in the external environment, what the key elements of that development are, where they map to the business and what the potential impact (if any) is and, finally, who is on the hook to ensure the right implementation plans are initiated, tracked, chased and ultimately closed out. Each phase needs to be carefully managed, and each phase needs transparency, so the entire end-to-end process is visible to program managers.

There are professional publishers that can help alert firms to regulatory changes, but few technology systems can easily identify where those changes could impact the business and then provide the right workflow tools to manage each obligation through to implementation in a meaningful way. This is the most difficult piece. Getting visibility into the rule changes isn’t enough: firms need to pull each obligation through the business and clearly demonstrate that relevant action has been taken. The ultimate verification is that controls have been put in place to mitigate any potential risk and that these controls have been positively tested.

But if you need a PhD in astrophysics to use the solution, the business will not adopt the technology, and as a result you could have a huge white elephant staring at you for years to come. One of the biggest roadblocks to implementing software is its ease of use… or lack of it. Firms need to pay particular attention to this. If end users can see how the technology saves them time and effort, they are more likely to use it and a virtuous circle is created. Make it difficult to use and you create a barrier that no amount of training will cure.

Creating that virtuous circle of effective regulatory change management is critical to success. For this to happen, there has to be the right “tone at the top.” Having the right culture for compliance is crucial, and it improves when effective compliance is seen not as an ineffective cost center but as a way of running an ethical business, one that can improve both the strategic direction of the organization and the firm’s reputation in the market.

Sisyphus may still be rolling that boulder up the hill, but now is the time to ensure your regulatory change management program isn’t.


Using FATCA Compliance to Improve (Not Hinder) Customer Relationships

Guest Contributor: Haydon Perryman, Director of Compliance Solutions, Strevus

Achieving compliance and creating a positive customer experience are not mutually exclusive.

Many financial institutions still rely on disconnected, incomplete approaches to regulatory compliance, preventing them from gathering accurate, up-to-date information about clients and creating a customer experience crisis. Institutions are damaging relationships by repeatedly asking clients for the same information.

There’s no better, or more timely, example of the mounting pressures on financial institutions than the Foreign Account Tax Compliance Act (FATCA), which aims to stop Americans from using offshore accounts to evade tax by requiring foreign financial institutions (FFIs) to report information about accounts held by U.S. taxpayers.

Central to FATCA is customer due diligence (CDD) outreach and documentation, which depends on efficiently collecting, validating and reporting client data. If financial institutions get this right, they’ll have a well-run FATCA program; if they get it wrong, it will nosedive into a remediation program.

Creating a systematic, client-driven FATCA program

A last-minute approach to a program that requires iterative CDD outreach, combined with status quo data management solutions, is woefully impractical. Dismal, delayed response rates and incomplete, unreliable client input will disrupt the flow of business and lead to additional, intrusive customer outreach.

As a result, financial institutions need to prepare for CDD immediately. In addition, a structured, consistent approach to client outreach can ease the path to compliance and simultaneously support interactions that contribute to, instead of detracting from, strong client relationships. A client-driven FATCA program is characterized by systematically collecting customer information, then validating, reporting and provisioning it to counterparties, clients and regulatory agencies. It should include centralized access to all client-related information assets, a persistent communication link with clients, and secure information rights management for all compliance-related information exchanges.

Once a client-based program is in place, financial institutions can also leverage it to drive efficiencies and gain insight from client analytics. Firms can then engage with clients in new ways that grow the business, even amid widespread and changing regulations.

Evidencing outreach

FATCA regulatory requirements are onerous and deadlines are imminent, making 100% compliance unlikely for most banks. While FATCA CDD response rates and validation may be outside institutions’ control, documenting outreach is within their control, and can be a proactive defense against enhanced regulatory scrutiny, fines and reputational damage.

Rather than “smiling and dialing,” institutions should implement a system that provides a full audit trail detailing every attempt to reach out to the client and the handling of every client attempt to reach back. That way, institutions can always prove, at any level (e.g., IGA scenario, jurisdiction, client), where they are on their journey to compliance.
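
A minimal sketch of such an audit trail, with simple roll-ups by client and jurisdiction, might look like the following; the fields, statuses and identifiers are illustrative, not a description of any particular vendor’s system.

    # Illustrative outreach audit trail; fields, statuses and IDs are hypothetical.
    from collections import Counter
    from datetime import date

    outreach_log = [
        {"client": "FFI-001", "jurisdiction": "UK", "iga": "Model 1", "date": date(2014, 3, 3),  "channel": "email", "status": "no_response"},
        {"client": "FFI-001", "jurisdiction": "UK", "iga": "Model 1", "date": date(2014, 3, 17), "channel": "phone", "status": "docs_received"},
        {"client": "NFFE-07", "jurisdiction": "BR", "iga": "None",    "date": date(2014, 3, 5),  "channel": "email", "status": "no_response"},
    ]

    def latest_status(log):
        """Most recent status per client - the view a regulator would ask to see."""
        status = {}
        for entry in sorted(log, key=lambda e: e["date"]):
            status[entry["client"]] = entry["status"]
        return status

    print(latest_status(outreach_log))
    print(Counter(e["jurisdiction"] for e in outreach_log))   # outreach attempts by jurisdiction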

Mitigating withholding requirements

Under FATCA, institutions are required to terminate the relationship or impose a 30% tax withholding on Non-Participating FFIs and Recalcitrant Account Holders in Non-IGA countries. To address this, many institutions are spending tens of millions of dollars on a withholding engine in order to be FATCA compliant – a capability that, while necessary, shouldn’t be the first line of defense.

Instead of focusing on the penalty, institutions will benefit from focusing on the solution: which customers to retain, and under what circumstances. By whittling down the pool of clients for which outreach is required and focusing resources on the right clients, institutions shrink the population on which the 30% withholding is imposed and, in turn, the withholding engine is used less.
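
A rough illustration of why shrinking that population matters, using entirely made-up figures:

    # Hypothetical figures: how narrowing the recalcitrant population shrinks withholding.
    withholdable_payments_per_client = 2_000_000   # USD of withholdable payments, illustrative
    withholding_rate = 0.30                        # FATCA withholding rate

    def total_withheld(recalcitrant_clients):
        return recalcitrant_clients * withholdable_payments_per_client * withholding_rate

    print(f"No targeted outreach:       ${total_withheld(400):,.0f}")
    print(f"After focused CDD outreach: ${total_withheld(60):,.0f}")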

To understand the scale of the task, the first step is to get a “first cut” FATCA view of the institution’s client base in order to size and scope an iterative outreach campaign. The delta between reaching out and getting a valid response (as required by FATCA) means that some customers and countries may present a particular burden: Non-Excepted NFFEs, for example, can take six times more outreach effort than an FFI already familiar with FATCA.

Secondly, institutions should create a judicious off-boarding program. This doesn’t mean simply pursuing a “zero tolerance” approach and off-boarding all non-participating institutions and recalcitrant account holders to avoid withholding altogether. Instead, institutions should review which customers and jurisdictions to retain, and under what circumstances the bank would retain a non-participating or recalcitrant customer. Since withholding and closing accounts do not always apply in IGA countries, for example, it probably doesn’t make sense to apply resources in those areas.

Thirdly, institutions will have to make well-informed assumptions about IGA status. Contrary to popular belief, financial institutions do not have two years commencing from July 1, 2014 to complete FATCA CDD. In non-IGA countries, CDD must be completed on FFIs by December 31, 2014. As a result, institutions will need to use their best judgment as to which jurisdictions are likely to become IGA countries (e.g. G20 and OECD countries) before July 2014.

Finally, it’s important to start FATCA education early enough to give clients who may not know how to comply – e.g. a local bakery in Europe that rarely talks to its financial institution and has never heard of FATCA – time to understand what they need to do in order to respond. Because financial institutions control when to start this process, delays are arguably unfair to the client. The later the outreach, the lower the conversion rate, the larger the population of unhappy clients, and the larger the amount of withholding that could have been avoided with better customer care.

By building a proactive, client-focused FATCA program, financial institutions will simplify compliance without burdening clients with repeat requests for information. In turn, institutions will be better positioned to achieve regulatory compliance quickly and predictably, while driving long-term customer satisfaction and business growth, even as requirements evolve amid the tsunami of new regulations.


Cloud Computing Adoption in Asset Management

Guest Contributor: Jonathan Schapiro, Global Head of Cloud Solutions, BI-SAM

These days, the term “Cloud” has become ubiquitous – even the least tech-savvy users have a general concept of what it alludes to. But while “the Cloud” is really just a metaphor for the Internet, “cloud computing” means much more – and has become a disruptive force within the “business of IT.” Cloud computing is a form of service provisioning – the delivery of IT resources and services within a cost-effective and scalable framework. Across various industries, cloud computing has become the norm in the way many companies operate their businesses.

Within asset management, growth in cloud computing is just as explosive. According to a recent CEB TowerGroup report, more than 71% of firms confirmed their intention to adopt cloud computing or increase its usage by 2017, and 2015 is expected to be the year when the majority of applications will be delivered through a cloud computing model. Another key finding from the report suggested that cloud adoption is potentially highest for post-trade activities, such as accounting, reporting and performance measurement.

Historically, asset management firms have been reluctant to outsource IT services to third-party providers, primarily because of concerns about information security and business continuity risks and about the ability of third parties to meet their requirements for application performance, platform scalability and availability. This has since changed: these concerns have been quelled through significant improvements in the data security, encryption and isolation techniques employed by cloud providers within sophisticated, global tier-4 data centers.

Despite the expected growth in adoption of cloud computing within asset management over the next few years, some firms will still retain the traditional on-premise deployment model. As a result, we can expect to see some polarization over the next five years, with global, super-tier firms continuing with the traditional on-site approach, while others adopt cloud computing models.

Because of this, the industry needs system vendors to provide a continuum of deployment options rather than a one-size-fits-all approach. The real value, however, will be delivered by vendors who can go beyond managed technology services and provide managed business services as well. By taking on operational processes like data sourcing and validation, third-party providers can enable business teams to focus more effort on core activities.

It is our view that cloud computing will lead to a new type of system vendor within the asset management industry: one that will provide a full spectrum of services, from the traditional model of applications deployed on-premise by in-house teams – through a range of managed IT cloud services – to business processing systems leveraging cloud technology. Vendors able to do this will be in the best position to meet the demands of a diverse range of asset management firms.

The disruptive force of cloud computing is based not on technology but on the business model under which IT operates within individual firms. The rationale for whether to self-manage or to use the Cloud through a third-party provider will vary. Due to the complexity and scale of system platforms today, smaller organizations will increasingly lack the resources to manage them, while larger firms will focus their staff on higher-value activities in support of their core business.

Most importantly, given the increasing commoditization of hardware, the emergence of cloud computing underscores the fact that the value of technology is in the software, not the hardware. Cloud computing can achieve a utility model with economies of scale far beyond what most companies can achieve with traditional on-premise IT platforms.


Time for Asset Managers to Modernize Performance Measurement Systems

Guest Contributor: Bob Leaper, Head of Business Development, North America, DST Global Solutions

Research from CEB TowerGroup finds that almost half (45%) of asset managers say they will adapt or replace their performance measurement system by 2018[1]. The same research found that four out of 10 performance measurement systems used by buy-side institutions date back to 2006 or earlier.

These powerful statistics indicate that many asset managers have been using fragmented performance measurement technology or processes for a number of years. System entrenchment can contribute to a number of operational risks, including loss of efficiency for business users, inconsistency, degradation of data integrity and, most importantly, an inability to provide true exposure calculations from underlying holdings. This is because many older specialty systems lack the enterprise data integration, advanced look-through and data visualization functionality that enable operational cohesion and support better investment decision making. Given the dramatic changes to the global investment landscape, the rise of new, complicated investment vehicles that offer greater potential for yield, the growth in diversification strategies and the numerous technology improvements in the market, it is perhaps not surprising that buy-side institutions are primed to modernize their performance measurement infrastructure.
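
To illustrate what a look-through exposure calculation involves, here is a minimal sketch that rolls a fund-of-funds position down to asset-class exposures; the holdings and weights are hypothetical.

    # Illustrative look-through: aggregate exposures from underlying fund holdings.
    portfolio = {"Fund A": 0.60, "Fund B": 0.40}          # top-level weights

    underlying = {                                        # each fund's own holdings
        "Fund A": {"US Equity": 0.70, "EU Equity": 0.30},
        "Fund B": {"US Equity": 0.20, "EM Debt":   0.80},
    }

    exposure = {}
    for fund, weight in portfolio.items():
        for asset_class, sub_weight in underlying[fund].items():
            exposure[asset_class] = exposure.get(asset_class, 0.0) + weight * sub_weight

    print(exposure)   # approximately {'US Equity': 0.50, 'EU Equity': 0.18, 'EM Debt': 0.32}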

Replacing or upgrading a performance measurement system comes with a litany of legitimate concerns, including integration, scale and cost. These concerns, however, should be weighed against the overall business risks of not considering advanced performance measurement and attribution functionality. Internal and external demands for transparency have compounded the pressure on the middle office to deliver better information faster. New performance and attribution technology can offer a holistic view of data, decompose the sources of risk and return for any asset class and enable data to be analyzed with robust dashboard visualization features. Innovative new technology features like these can provide a sustainable competitive advantage, especially as firms seek strategies to expand their global footprint. To download the full CEB TowerGroup research report, click here.
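
As a simple example of “decomposing the sources of return,” here is a minimal Brinson-style allocation/selection sketch; the sector weights and returns are hypothetical, and the formulas follow one common convention (variants differ in how the interaction term is treated).

    # Illustrative Brinson-style attribution; all inputs are hypothetical.
    sectors = {
        #             (portfolio wt, benchmark wt, portfolio ret, benchmark ret)
        "Tech":       (0.40, 0.30, 0.10, 0.08),
        "Financials": (0.60, 0.70, 0.03, 0.04),
    }

    benchmark_return = sum(wb * rb for (wp, wb, rp, rb) in sectors.values())

    for name, (wp, wb, rp, rb) in sectors.items():
        allocation = (wp - wb) * (rb - benchmark_return)   # effect of over/underweighting the sector
        selection  = wp * (rp - rb)                        # effect of picks within the sector
        print(f"{name}: allocation {allocation:+.4f}, selection {selection:+.4f}")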

Join DST at FTF’s Performance Measurement Conference on March 19th for a look at this and more on the panel “Systems and Options for Performance Measurement.” 


Reducing Risk and Gaining Control of Corporate Actions Processing

Guest Contributor: Brendan P. Farrell, Jr., Executive Vice President, SunGard’s XSP

How can the industry turn the tables on risk in corporate actions processing? Industry insiders estimate that $1 billion is lost every year through missed or mismanaged corporate actions events. Losses like these result from current industry conditions and longstanding practices that impede optimal processing performance. Doing nothing to address them prolongs problems and ultimately incubates risk.

There are several primary causes, each with specific implications: the higher volume and complexity of corporate actions are increasing risk; inefficient manual processes and workflows add to the burden; and outdated or ill-equipped home-grown systems create an underlying liability.

Relying on manual methods to manage the corporate actions lifecycle makes for a costly, labor-intensive activity. Human error can infiltrate the workflows for both relatively simple announcements such as stock splits and more intricate multi-part events such as cash/stock options or rights issues, which require a complex, multi-step process to complete the related notifications. At the same time, when processing steps come into question, their resolution may be left open to interpretation.

An improperly handled corporate action can put an unwanted public spotlight on poorly executed processing. For example, under U.S. regulations, companies could be fined a combined maximum of $3 million per incident for unintentionally failing to furnish correct information in a timely manner to both shareholders and nominees. By streamlining and centrally managing the processing of corporate events through automation, organizations can work more productively, efficiently, consistently and securely to process events within the required time frames.

An automated corporate actions process differs from a manual one in a number of ways. Corporate actions information is automatically compiled from many disparate sources to create a single event using configurable logic, and when new information becomes available, it is automatically captured and applied to the event upon receipt. All differences in critical information across sources are clearly visible and prioritized for review by staff, which highlights errors and inconsistencies for correction and increases staff efficiency by reducing manual tasks. Event information, including the deadline for a voluntary choice event, is also automatically distributed to all investors affected by the announcement. Applied through election processing, event settlement and the final reconciliation or closing out of the event, automation helps minimize risk while significantly reducing the time to complete a voluntary corporate action.
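
A minimal sketch of the first of those steps – compiling one “golden” event from several sources and flagging disagreements for review – might look like the following; the sources, fields and priority order are hypothetical.

    # Illustrative event composition: merge announcements and flag disagreements.
    announcements = [
        {"source": "Vendor A",  "event_id": "XYZ-SPLIT-2014", "ratio": "2:1", "ex_date": "2014-05-01"},
        {"source": "Vendor B",  "event_id": "XYZ-SPLIT-2014", "ratio": "2:1", "ex_date": "2014-05-01"},
        {"source": "Custodian", "event_id": "XYZ-SPLIT-2014", "ratio": "2:1", "ex_date": "2014-05-02"},
    ]

    def compose_event(records, priority=("Custodian", "Vendor A", "Vendor B")):
        """Build a single golden event and list any fields that need analyst review."""
        golden, exceptions = {}, []
        for field in ("ratio", "ex_date"):
            values = {r["source"]: r[field] for r in records}
            # take the value from the highest-priority source that supplied one
            golden[field] = next(values[s] for s in priority if s in values)
            if len(set(values.values())) > 1:
                exceptions.append((field, values))   # surface the disagreement for review
        return golden, exceptions

    event, to_review = compose_event(announcements)
    print(event)       # golden record built by source priority
    print(to_review)   # ex_date differs across sources and is flagged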

When evaluating a corporate actions processing solution, the following criteria should be considered:

  • End-to-end corporate actions processing
  • Comprehensive data management and scrubbing
  • Integration via an array of data interfaces
  • Adherence to XML and SWIFT ISO messaging standards
  • Flexible trade and positions functionality
  • Easy-to-use dashboards and interfaces with support for mobile
  • Cloud-hosted managed services, Software-as-a-Service (SaaS) or local deployment options

Implementing an automated process for all corporate actions can be difficult for a firm to attempt on its own. The implementation requires a combination of technical and domain expertise so that the resulting solution is tailored to the firm’s specific requirements and risk factors. In addition, the solution needs to be designed with the flexibility and security to support new standards, technologies and ways of doing business going forward. Through this approach, companies can run more agile, smarter operations that can readily adapt to whatever comes their way.

Concerns about the improper processing of corporate actions notices are real. Organizations can spend years trying to recover from the effects of a failed corporate actions solution implementation. Firms that act to deploy scalable, best-practice solutions for this function can expect to achieve noticeable improvements in productivity, accuracy, straight-through processing (STP) and customer service. What’s more, vulnerability to risk and reputational damage will be significantly reduced as manual steps are replaced with a streamlined, automated process that contains built-in governance.

As a result, these organizations can accomplish corporate actions processing more confidently and with fewer resources, even as the complexity and volume of events grows. Freed from manual tasks, staff can subsequently refocus their efforts on high-priority assignments that add strategic value to the business. Ultimately, these organizations can make significant strides toward improving and sustaining their competitive position through greater agility, smarter operations and reduced risk.

For more on reducing risk and boosting organizational effectiveness by automating corporate actions, read SunGard’s new insight report, “Turn the Tables on Risk: How to Gain Control of Corporate Actions Processing.”
