www.waterstechnology.com/ird
August 2011 Volume 6 Number 5
Swift, DTCC Developing ISO 20022 Standard for Corporate Actions in Multi-Year Plan NEW YORK—Swift and the Depository Trust & Clearing Corporation (DTCC) are looking forward to the migration of corporate actions messages to the ISO 20022 standard, which is expected to conclude in 2015 with the complete phase-out of DTCC proprietary formats, according to New York-based Malene McMahon, senior business manager at Swift. A pilot of ISO 20022 messaging for corporate actions announcements began in April and runs through November. By mid-2012, Swift and DTCC will add other corporate actions, such as eligibility balances, elections, instructions, payments and other parts of the corporate actions lifecycle, to the ISO 20022 messaging testing, explains McMahon. All these forms of corporate actions will then be live in production in early 2013, she adds.
Inside
Emerging Potential 18 When dealing with evaluated prices in emerging markets, it’s not just the availability and timeliness of pricing data you need to be concerned about
Mapping LEI Links 12 The new LEI standard has created burgeoning data demands for the securities industry
“Users will be able to automate the whole election and payment process, where clients send automated messages back to DTCC,” says McMahon. “This will have clients really gaining some of the benefits of automation they have been waiting for.” From 2013 onward, Swift and DTCC intend to roll out the aforementioned uses of ISO 20022 for corporate actions to more and more firms, says McMahon. Even before then, Swift and DTCC expect to be gradually adding more firms to the four currently halfway through piloting ISO 20022 for corporate actions. These are: Brown Brothers
Harriman, J.P. Morgan, BNY Mellon and National Financial Services (NFS). NFS is partnered with Fidelity ActionsXchange to provide corporate actions services, and therefore the organizations are working together on the ISO 20022 corporate actions pilot. Working with Swift and ISITC on corporate actions messaging, Fidelity ActionsXchange is contributing to and seeing development of corporate actions, according to Deborah Culhane, chief operating officer, Fidelity ActionsXchange. “It’s very difficult to standardize, based on the complexity of the markets,” she says. “We’re trying to address that through the model and the extensions now called supplementary data. We want to make sure there’s a good process around that so it > continued on page 4
SEC’s Large Trade Reporting Rules Promote Centralized Data Models, Says Deloitte CHICAGO—Reporting requirements for large traders issued by the US Securities and Exchange Commission (SEC) on July 26 will require firms whose trading is above certain daily or monthly share or value thresholds to strengthen their data governance capabilities, and move away from decentralized reference data management operations models, according to Chicago-based Matthew Schlatter, securities practice information strategy leader at Deloitte Consulting. “This [regulatory request] places additional stress on whether you have strong data governance capabilities that are actually well-structured, well-communicated and enforced,” he says. “If you have a reasonably well-centralized reference data management operations layer, the reporting becomes fairly easy to include. If you have a widely distributed or decentralized model, this becomes a much more invasive request, because you have to touch many more systems and operational groups. From a controls
perspective, it will be more difficult to determine if you actually covered all your bases.” The SEC’s new rules require large traders to register using a new form, 13H, and set recordkeeping, reporting and monitoring requirements on registered broker-dealers that large traders use to execute transactions. The SEC will assign each trader an identification number to give to the broker-dealers, who must report information collected to the SEC on request. “Any time you introduce another identifier, you’re creating another level of complexity,” says Ed Ventura, president of Ventura Management Associates, based in Princeton, NJ. “It’s a good idea to have a separate identifier in this situation.” Large traders are defined by the rules as those who trade more than 2 million shares or $20 million in value of exchange-listed securities daily, or more than 20 million shares or $200 million in value monthly. > continued on page 4
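The daily and monthly thresholds quoted above are simple enough to express as a check. The sketch below is purely illustrative of the rule as described in this article, not a compliance implementation; the function and parameter names are hypothetical.

```python
# Illustrative check of the SEC large trader thresholds described above:
# more than 2 million shares or $20 million in value daily, or more than
# 20 million shares or $200 million in value monthly, in exchange-listed
# securities. All names here are hypothetical, for illustration only.

DAILY_SHARES = 2_000_000
DAILY_VALUE = 20_000_000
MONTHLY_SHARES = 20_000_000
MONTHLY_VALUE = 200_000_000

def is_large_trader(daily_shares, daily_value, monthly_shares, monthly_value):
    """Return True if any one of the four thresholds is exceeded."""
    return (daily_shares > DAILY_SHARES
            or daily_value > DAILY_VALUE
            or monthly_shares > MONTHLY_SHARES
            or monthly_value > MONTHLY_VALUE)

# A trader below the daily limits can still qualify on monthly activity.
print(is_large_trader(1_500_000, 15_000_000, 21_000_000, 150_000_000))  # True
```

Note the thresholds are exceeded by trading *more than* the stated amounts, so activity exactly at a limit would not, on this reading, trigger registration.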
Accurate, Accessible and Actionable Information.
Speed to Market Asset Control provides the information needed for targeted product innovation and faster time to market to deliver increased revenues and create strong competitive differentiation. To learn more visit: www.asset-control.com
Asset Control and the Asset Control logo are trademarks or registered trademarks of Asset Control N.V. or its subsidiaries or affiliates in the U.S. and/or other countries. All other trade names are trademarks or registered trademarks of their respective holders.
NEW YORK: +1 212 445 1076 LONDON: +44 207 743 0320 HONG KONG: +852 3798 2570 Follow us on Twitter: www.twitter.com/asset_control
Golden Copy
A Standards Battle Is Joined Michael Shashoua
[email protected] Last month, I wrote that Inside Reference Data readers were anticipating the issuance of legal entity identifier (LEI) standards, which has come to pass, along with a reaction from the Global Financial Markets Association (GFMA) favoring Swift and DTCC/Avox to administer the standard. This answered some of the questions asked here in July, making it likely that the two organizations would indeed be caretakers of the LEI. These answers set the stage for service providers, who are beginning to clamor for the opportunity to support market participants in managing LEIs. With the foundations now in place, competition among the providers will heat up. They may not have a piece of administering the standard itself, but they are raring to go on other functions surrounding it. Chris Pickles, head of industry initiatives for global banking and financial markets at British Telecom (BT), who shares his perspective on the road to the LEI on page 20, points to a principle that sounds familiar from my current graduate school business studies. In marketing, the role of a sales representative can include gathering information from customers on what they’re looking for out of a product or service, and channeling that to management to better allocate resources and design products—not just selling blindly. Pickles says BT doesn’t intend to tell the industry to migrate to a standard it dictates, but rather seeks to help users—the customers—deploy the message formats and
standards they believe are best for their business. Non-profit organizations could still keep a stake in the LEI game, the lessons of GSTPA described here last month notwithstanding. Standards organization GS1, which had partnered with Financial InterGroup to seek the GFMA’s recommendation to become the registration authority for the LEI that ultimately was bestowed on Swift, is advancing a collaborative model for implementing the standard. Allan Grody, president of Financial InterGroup, who discusses the challenges of managing the LEI on page 12, says all business interests, data vendors and financial market utilities can co-exist in a non-profit and collaborative model by synchronizing proprietary numbering systems—under the GS1 registry, of course. It appears the next battle to be joined in the progress of the LEI standard is whether implementation is best achieved by non-profit co-operatives or by market-driven service providers focused on how they can best serve the market and derive profit that validates the job they are doing. That’s separate from the question of whether the standard itself achieves the goals intended by its creators. It will be more than just the overall competence of service providers or collectives that determines the LEI’s success. It’s the performance of the companies or concerns that emerge as leaders in providing LEI services that will make the difference.
Contents
Frank dos Santos, Standard & Poor’s Securities Evaluations, p18
FEATURES
NEWS
12 Mapping LEI Links
Regulation & Standards 1 SEC’s Large Trade Reporting
Corporate Actions 1 Swift, DTCC Developing ISO
The new LEI standard has created burgeoning data demands for the securities industry. How can they be addressed, and which service providers are up to the task?
15 Time for Action Asset Control president and CEO Phil Lynch discusses data management, current trends and the firm’s strategy moving forward
Rules Promote Centralized Data Models, Says Deloitte 4 CDMG Raising Legal Entity Identifier Awareness to Avoid Reg Disconnect 6 ISO Chooses Anna as Registration Authority for 6166 Standard 8 Firms Struggle to Meet Regulatory Risk Data Challenges, Say Panelists at JWG Seminar
16 Putting Enterprise into Data Management Embedding enterprise data management platforms and data governance into projects and initiatives is becoming increasingly important
18 Emerging Potential When dealing with evaluated prices in emerging markets, it’s not just the availability and timeliness of pricing data you need to be concerned about
Special Report, p14
A panel of industry experts brought together by Inside Market Data and Inside Reference Data discuss what is needed to accurately estimate exposure to entities, in a webinar sponsored by S&P Valuation and Risk Strategies
20022 Standard for Corporate Actions in Multi-Year Plan
Data Management 4 UBS Sets ATS Data Record Straight
6 Lepus/SAS Near-Real-Time Risk Report Highlights Data Management Pitfalls 7 Debt Ceiling’s Market Volatility Fallout Could Increase Data Processing Issues 9 Bloomberg Raises Client Hackles With Category Changes to Data Fee Model
COLUMNS 6 News Download 10 Interview With... Karla McKenna, Citi
20 Industry Warehouse 22 People Moves and Calendar
Corporate Actions
Michael Shashoua, Editor Tel: +1 646 490 3969
[email protected] Carla Mangado, European Editor Tel: +44 (0)20 7316 9122
[email protected] Tine Thoresen, Executive Editor Tel: +44 (0)20 7316 9744
[email protected] Lee Hartt, Publishing Director Tel: +44 (0)20 7316 9443
[email protected] Jo Garvey, Commercial Director Tel: +1 212 457 7768
[email protected] Claire Light, Marketing Manager Tel: +44 (0)20 7004 7450
[email protected] Jon Lloyd, Chief Subeditor Lorna Graham, Group Production Manager Tel: +44 (0)20 7316 9707
[email protected] Gill Harker, Subscriptions Manager Tel: +44 (0)20 7968 4618 Dominic Clifton, Subscriptions Manager Tel: +44 (0) 20 7968 4634
[email protected] Incisive Media (US) 55 Broad Street, 22nd Floor, New York, NY 10004. Tel: +1 646 736 1888 Fax: +1 646 390 6612 Incisive Media (UK) 32–34 Broadwick Street, London, W1A 2HG Tel: +44 (0)870 240 8859 Fax: +44 (0)20 7316 9250 E-mail:
[email protected] UK Tel: +44 (0)870 787 6822 +44 (0) 1858 438 421 (overseas) US CS: +1 212 457 9400 Incisive Media (Hong Kong) 20th Floor Admiralty Centre, Tower 2, 18 Harcourt Road, Admiralty Tel: +852 3411 4888 Read it first online: Articles from this issue are published first on our website, as well as other news articles that aren’t published in the issue. Set up your online access or re-set your password here: http://www.waterstechnology.com/home/forgot_password and enter your email address.
© 2011 Incisive Media Investments Limited. Unauthorized photocopying or facsimile distribution of this copyrighted newsletter is prohibited. All rights reserved. ISSN: 1750-8517
Swift, DTCC Developing ISO 20022 for Corporate Actions in Multi-Year Plan > continued from page 1 doesn’t interfere with the objective to get these messages standardized and straight-through. We’re paying particular attention to the extensions and supplementary data to make sure there’s industry agreement, so we’re not creating a ‘non-standard’ standard.” The pilot phase is being used to develop the ISO 20022 standard, says McMahon. “We gather feedback on what [fields] can be shifted, what’s needed and what’s not,” she says. “We get a lot of feedback on the actual structure of the messages, which will eventually benefit the whole change request process and the messages will start to evolve. Then we also have an industry consultation group running in parallel, helping design the next phase.” The group includes broker-dealer input, she adds. The pilot firms expect to start using ISO 20022 for corporate actions by the end of this year, or at least early 2012, says Culhane.
“We’re going through extensive volume testing and we’re near completion on all messaging,” she says. “People are generally pleased with the amount of progress that has been made, particularly in the last two months.” Although the Swift and DTCC migration plan will phase out DTCC proprietary corporate actions message formats, the ISO 15022 standard will remain for the foreseeable future. “There will be a need for co-existence of 15022 and 20022 because of the nature of the custodians and clients we support,” says Culhane. “We cannot migrate off those until the industry continues progress to do so.” Michael Shashoua
Regulation & Standards
SEC’s Large Trade Reporting Rules Promote Centralized Data Models, Says Deloitte > continued from page 1 The purpose of the rules is to prevent a repeat of the May 2010 flash crash, says Schlatter, and although they may not ultimately be able to meet that stated goal, they should create more transparency in the markets, he adds. “They will be able to respond very quickly to an anomaly in the market,” he says. “The rules provide next-day information and previous-day information across that entire span of large traders. If they need to do an investigation, they get exactly the data they require with the access they need, so they can decompose the situation and take action. It took months to decompose the spaghetti they had underneath when they went through flash crash analysis.” Having identification of large traders allows the SEC to see what is happening across multiple broker-dealers at once, explains Schlatter. Before the rule, regulators could not go out to all the broker-dealers and get all the activity by large traders for certain time periods to understand, across all the broker-dealer relationships, what the full scope of activity is by a large trader, he says. The SEC is expected to allow 60 to 90 days, until October or November, for adoption in the industry. “It’s going to be a massive thing for
companies to do,” says Ventura. “But it will have some impact because they will have to track a new identifier.” Individual account numbers must be correlated with cross-reference numbers, he adds. “It’s a good idea for this purpose, and it takes the name of the entity out of the equation to only agnostically view transaction information,” says Ventura. “Firms need clarity on what level to report at, and some may have multiple entities, each with multiple reporting requirements,” he says. Reconstructing each day’s trading provides leverage and prevents traders from “playing games,” because they know they will be watched, says Ventura. Additionally, under the new requirements, large traders will have to maintain a new class of reference data for all the traders, in the form of those identifications, for all the traders with whom they interact, explains Schlatter. “They need to be able to create a reporting facility on an on-request basis, where if the SEC wants trading activity for a certain time period for a certain large trader, they can provide that,” he says. “That will be challenging, because it’s not just trading and clearing systems that need to be addressed, but reporting solutions as well.” Michael Shashoua
Regulation & Standards
CDMG Raising Legal Entity Identifier Awareness to Avoid Reg Disconnect
LONDON—The Customer Data Management Group (CDMG), led by London-based regulatory think tank JWG, is looking to raise greater awareness among industry groups and supervisors around the need to create a more homogeneous vision of the business requirements of the legal entity identifier (LEI) and other G20 identifiers, Inside Reference Data has learned. The group, which is currently tracking the different regulatory requirements that impact the legal entity data space, aims to create a holistic view of what will be expected from firms once the LEI is in place. London-based PJ Di Giammarino, chief executive officer at JWG, says there are currently disconnects between the various regulatory requirements and industry efforts. Other regulatory requirements at this stage do not refer to the LEI, he explains, adding that unless it is adopted in detailed rules, embedding it into financial institutions’ operating models will be difficult. “We are talking to the rulemaking bodies and trying to make them more aware of the need for joined-up thinking. We want to raise awareness with the different industry groups to co-ordinate agendas and consider when it will be appropriate to bring in the LEI, as well as what the business requirements for it will be,” explains Di Giammarino.
He also explains that prior to looking into the semantic requirements, the industry must have a clear understanding of what the objectives are, what decisions will need to be taken and the scenarios those decisions will be made in. “As far as the LEI goes, we have identified the areas that are highest priority for regulatory purposes. We are looking into trade and position reporting, clearing and transaction reporting, as well as risk reporting,” he says. Carla Mangado
ISO 17442 Wins Industry Support as LEI Standard
The Global Financial Markets Association (GFMA), a federation of global financial services trade associations, recommended the ISO 17442 standard for the legal entity identifier (LEI) last month. A draft of ISO 17442 is expected to be published as an ISO International Standard by January 2012. The standard is being developed by the ISO Technical Committee for Financial Services (known as TC 68) and is currently in the Draft International Standard stage. GFMA asserts that ISO 17442 is a basis for a viable, uniform and global LEI. Some of the key attributes of the standard include being able to define open governance of the issuance and maintenance of the LEI scheme that is scalable, global and free from assignment limitations. Brett Lancaster, managing director, securities sales and initiatives, Americas, at Swift, says the new ISO 17442 standard was developed in “record time.” Swift and the Depository Trust & Clearing Corporation (DTCC)/Avox are working with ISO, the International Organization for Standardization, on implementation of the LEI. “The usual criticism I’ve seen is that Swift can be relatively slow, but this is absolutely not the case,” says Lancaster. “We went to the board early on to get buy-in from the industry.” Carla Mangado, Michael Shashoua
Data Management
UBS Sets ATS Data Record Straight NEW YORK—By disclosing monthly volume data and performance metrics from its alternative trading system (ATS) for US stocks, UBS can demonstrate the nature of the liquidity on its ATS and prevent miscounting and mischaracterizations of its activity, officials tell Inside Reference Data. “We wanted to make sure we were publishing clear numbers for ourselves, the way we think they should be published,” says Charlie Susi, co-head of global direct execution in the electronic trading group at UBS. “We make a big point of knowing what kind of participants and flows are coming into our venue. By posting this data, people can see the diverse flow we have.” The UBS ATS, part of the firm’s equities business, offers electronic trading users nondisplayed crossing and price improvement. It executes an average of 90 million shares daily. The monthly metrics to be published will include average daily volume executed, average daily notional executed, total symbols active, total symbols traded, peak day and average price improvement. UBS is developing reports for its website and for consultants and the media, says Susi. “A lot of clients want to source specific data needs. Depending on what they’re looking for, we’re able to hand over their post-trade data,” he says. “There’s a lot more focus now from clients on getting detailed trading data. We’re very transparent and able to do it all through FIX. We can also do specific ranges of historical data.” While UBS will now disclose monthly data for the ATS, the ATS order book will remain undisplayed and confidential. Michael Shashoua
News Download
Regulation & Standards
SmartStream TLM Corona Awarded SwiftReady 2011 Labels
ISO Chooses Anna as Registration Authority for 6166 Standard
SmartStream’s TLM Corona version 7 platform has been awarded the SwiftReady Reconciliations and SwiftReady Payments Exceptions and Investigations labels for 2011, in recognition of its Swift compliance with the E&I and Reconciliation SwiftReady criteria.
Indexium Chooses Cadis
Zurich-based index services provider Indexium, a joint venture between SIX Group and Deutsche Börse, has selected Cadis to streamline its data management process and calculate and distribute all indices for the Stoxx, SIX Swiss Exchange and Deutsche Börse Group. The move will enable the firm to create new data feeds, validate data and establish new workflows. Cadis will also roll out its security master and allow Indexium to manage corporate actions.
JWG’s Latest Report Highlights Costs of Lack of Risk Data Consistency
A preliminary report by London-based regulatory think tank JWG, entitled “Strong Foundations for Regulatory Reform?”, highlights the need to define regulatory data standards to avoid immense economic implications. The report suggests firms that fail to establish a consistent data management plan may face huge costs down the line, and states that initial studies put this cost in the region of $390 million per firm over the next five years. The report also explores to what extent the industry defines global data standards to meet G-20 regulation, a key issue affecting the total cost of compliance. The full report will be released in September.
S&P Adds Financial Institution Data and Risk Analytics to RatingsDirect
S&P Valuation and Risk Strategies has launched CreditStats Direct for Financial Institutions, a data and analytics application that provides direct access to creditadjusted financial data with up to three years of history for banks rated by S&P.
ZURICH—The Association of National Numbering Agencies (Anna) has reached agreement with the International Organization for Standardization (ISO) to become the registration authority for the ISO 6166 standard, which governs ISIN numbers used to uniquely identify securities, according to Zurich-based Nourredine Yous, chair of ISO sub-committee 4 (SC4), and responsible for strategic support in the data division of SIX Telekurs. Anna is responsible for implementing ISO 6166 and ISO 10962, which is the classification of financial instruments (CFI) code. It is also seeking registration authority for ISO 10962; the abbreviation of securities terms known as ISO 18773; and the financial instrument short names known as ISO 18774. Financial instrument short names indicate the different types
of instruments that all deal with one specific security, such as redeemables, preferred types and types of shares. “All these standards, if Anna reaches the same agreement with ISO as for ISO 6166, are very good news for the industry, as these standards have been waiting for implementation on a large scale for many years,” he says. “If they are implemented, they will facilitate substantially the communications between market players, as well as reporting to the regulators of each country.” In May, ISO selected Swift as registration authority for its legal entity identification standard. Michael Shashoua
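An ISIN under ISO 6166 is a 12-character code: a two-letter country prefix, a nine-character national security identifier and a final check digit computed with a Luhn-style sum after letters are expanded to numbers (A=10 … Z=35). The sketch below illustrates that widely documented validation scheme; it is a simplified illustration, not Anna's reference implementation.

```python
def isin_is_valid(isin: str) -> bool:
    """Validate an ISO 6166 ISIN: 12 characters ending in a Luhn check digit.

    Letters expand to two digits (A=10 ... Z=35) before the Luhn sum.
    """
    if len(isin) != 12 or not isin.isalnum() or not isin[:2].isalpha():
        return False
    if not isin[-1].isdigit():
        return False
    # int(c, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35.
    expanded = "".join(str(int(c, 36)) for c in isin.upper())
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 1:           # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

print(isin_is_valid("US0378331005"))  # True: a valid US ISIN
```

Changing the final digit, or truncating the code, makes the check fail, which is exactly the transposition and truncation protection the check digit exists to provide.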
Data Management
Lepus/SAS Near-Real-Time Risk Report Highlights Data Management Pitfalls LONDON—A siloed mentality and poor data quality levels within firms remain two of the outstanding data management challenges institutions have to overcome to enable a more near-real-time risk management environment, according to a report released by research firm Lepus and business analytics software provider SAS. The report highlights that despite the fact firms are looking into data management and quality improvement, more has to be done to facilitate a real-time environment. London-based Duncan Ash, marketing manager, financial services at SAS, explains: “There is a big data quality and timeliness issue at the moment, and effective data aggregation is a must prior to being able to run any analytics... you can’t do the analytics until you are confident you have an accurate view of all your different positions and trades.” And while the financial institutions that were approached did have plans, and in many cases were already focusing on data management and quality improvement, the real-time approach is yet to be fully embraced in some cases. “Firms do focus on data quality across the board, but the batch overnight mentality remains,” says Ash. “They are focusing on high-quality data for tomorrow, rather than for right now.” London-based Dale Stevens, head of capital
markets at SAS, says: “While we noticed a number of organizations are addressing fundamental data quality issues, many of the systems financial institutions use only reconcile their books at the end of the day and lack the concept of continuous reconciliation or data quality checking.” Meanwhile, the research also focuses on how banks are still struggling to pull together an aggregated view of their exposure across asset classes, and highlights how despite
“Many of the systems financial institutions use only reconcile their books at the end of the day” Dale Stevens, SAS
efforts to overcome the siloed mentality, this feature remains a barrier to achieving the necessary transparency and enabling the management of positions and portfolios on a near-real-time basis. “[The siloed] approach continues to exist within risk management, and it’s not unusual to find that the market or credit risk teams do not communicate with the liquidity or operational risk teams, for example,” says Ash. Carla Mangado
Data Management
Debt Ceiling’s Market Volatility Fallout Could Increase Data Processing Issues NEW YORK—If a downgrade of US Treasury bonds still occurs following the 11th-hour resolution of the federal budget and debt ceiling crisis earlier this month, or if future economic travails affect the bonds, the ramifications for data management could be more than just noting ratings changes, according to a data vendor preparing for the possibility of such events. The challenge will not come from cascading credit events such as downgrades to municipal bonds, housing bonds backed by Fannie Mae and Freddie Mac, state credit ratings and various other fixed-income securities—all of which could be caused by a downgrade of US Treasuries. Instead, it will come from ensuing market volatility, says Liz Duggan, managing director of global evaluations at Interactive Data.
“Once the reference data changes happen because of the downgrading of a security, the operational challenge will be market volatility” Liz Duggan, Interactive Data
“The reference data and pricing are linked,” she says. “Once the reference data changes happen because of the downgrading of a security, the operational challenge will be market volatility. If there is large market volatility, there will be an increase in price challenges and an increase of tolerance breaks for clients as they process their pricing files. Operationally, those challenges could be large or small, depending on the market’s reaction.” Duggan says Interactive Data is recommending that its clients review systems and quality control processes, especially lessons learned from the volatility, tolerance breaks and price challenges that happened in the 2008 financial crisis. For its part, the provider is reviewing its quality control processes to ensure they “are not diluted due to increased volume,” she says. Three critical scenarios Duggan identifies that could occur with a downgrade of US Treasuries are negative spreads in Treasury yields, changes in benchmarks that are used, and bonds pricing on a pure dollar or yield basis—no longer pricing based on Treasuries. Interactive Data has also adjusted its pricing applications to account for any of these scenarios and their effects, according to Duggan. Michael Shashoua
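Duggan’s “tolerance breaks” are the flags raised when a day-over-day price move exceeds a preset review threshold as clients process their pricing files. A minimal sketch of such a check, assuming hypothetical per-asset-class thresholds and invented security identifiers (none of these figures are Interactive Data’s):

```python
# Illustrative price-tolerance check: flag securities whose day-over-day
# price move exceeds a per-asset-class threshold ("tolerance break").
# Thresholds and sample data are hypothetical, not any vendor's actual values.

TOLERANCES = {"treasury": 0.02, "corporate": 0.05, "municipal": 0.03}

def tolerance_breaks(prev_prices, curr_prices, asset_class):
    """Return {security_id: relative_move} for moves beyond tolerance."""
    limit = TOLERANCES[asset_class]
    breaks = {}
    for sec_id, prev in prev_prices.items():
        curr = curr_prices.get(sec_id)
        if curr is None or prev == 0:
            continue  # missing price or no basis: handle separately
        move = abs(curr - prev) / prev
        if move > limit:
            breaks[sec_id] = round(move, 4)
    return breaks

prev = {"T-10Y": 98.50, "T-30Y": 101.20}
curr = {"T-10Y": 95.10, "T-30Y": 101.90}
print(tolerance_breaks(prev, curr, "treasury"))
```

In a production pipeline the flagged securities would be routed to an evaluator for a price challenge rather than printed.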
Regulation & Standards
Firms Struggle to Meet Regulatory Risk Data Challenges, Say Panelists at JWG Seminar LONDON—Despite efforts to tackle the risk data challenge, firms are a long way from ideal scenarios for compliance with upcoming regulatory reforms worldwide, according to speakers at a seminar held in London in July by regulatory think tank JWG. Panelists agreed that even as the focus on risk data has increased, fuelled by a regulatory push and growing interest, firms are far from being able to overcome their internal risk data challenges. The bar has been set extremely high. “Right now, it is the highest it has been in the 20 years I have been in the industry. The cost of failure is stoppage of trading and massive fines, so I think we clearly see the risk-reward potential of not being in the game correctly,” said one of the speakers at the event. London-based John Matthews, chairman, Asset and Liability Management Association and speaker at the event, said: “We are a long way from where we need to be and we need to be practical about it. The bar we need to reach is much higher and the operational dependencies are more complex. “We are a long way from defining many of the elements we want to put together... I would like to see a data dictionary including business definitions at granular level, for example,” said Matthews, adding that the chances of banks being able to produce a uniform data set that supports a group-level cross-risk stress test with the necessary accuracy and granularity are minimal. London-based Christopher Clack, founder of the UCL Systemic Risk Group and speaker at the event, highlighted how the semantics of data are essential to data collection across different jurisdictions. “Data collection from different regimes can be problematic, especially if the same piece of data is interpreted in different ways in different regimes. So you have to look at how the request was interpreted and what is being compared,” he explained.
Data Volumes & Regulatory Approach
Speakers discussed to what extent the volumes of data being collected for regulatory purposes will have an impact on the levels and granularity of analysis that can be carried out. UCL Systemic Risk Group’s Clack said he was concerned regulators might collect data without knowing exactly what it will be used for. “The types of analysis we could do with a small specific set of data, we just cannot carry out with a large and cumbersome amount of data,” he said, adding that it is not ideal to gather large amounts of data while only understanding later how it will be used. “If you collect so much data that you cannot possibly analyze it in real time, months after an event may have taken place, someone would tell you that you should have known, as you had that data,” he said. “But actually it was buried under a huge mountain of data. I see it as a dangerous, expensive, unmanageable strategy.” Risk factor granularity should still be part of the equation, according to the speakers. Eventually, regulators will need a canonical representation of data they receive, whether granular or aggregate. One of the speakers said: “Having the ability to stress test certain scenarios that would be prohibited to do so within the firms... I think it’s important for regulators to have an infrastructure such that those stress tests could be conducted,” adding that this would need to be done in such a way that the data would not be polluted with modeling assumptions as it flows through the various systems.
Moving Ahead
While firms no longer face a technology challenge, speakers agreed today’s challenge has more of an internal politics twist: having reached a higher maturity
level, they still face a challenge of mustering political will within divisions of their firms to complete projects. “How do we get through the techno politics to solve some of these challenges?” asked one of the speakers. “We have thousands of legacy systems we are dealing with—siloism, divisional and cultural boundaries, merger-related challenges—so while we are heading in the right direction, we are not quite ready yet. The data that is important to me within the electronic trading business is different from the data important to anyone else. Inevitably firms continue to struggle with siloism internally,” he added. However, speakers agreed firms are now in a better position to tackle the data challenge. A survey conducted by JWG in February found that 90% of 120 professionals from more than 45 financial institutions indicated that improving risk data is a priority for their firm and that they are confident their organization fully understands the regulatory and business requirements for better risk management. “We have never been as well positioned as we are today. Every single one of the top 10 firms is currently sorting out its customer data, reference data and legal entity data as one of their flagship initiatives,” explained one speaker, adding most firms have already supplied the budget, mandate and management necessary as well as having the right people in place. “Prior to 2008, we all knew data was the lifeblood of the firm. In 2010, all the initiatives that had been dormant for a couple of years started to really get the funding and the attention they deserved,” he said. “Just because we do have management attention, doesn’t mean our job is less complex. In many ways, the extreme focus on what we need to do and the passing along of data to the regulators has slowed down some of our internal machinery.”
Carla Mangado
Data Management
Bloomberg Raises Client Hackles With Category Changes to Data Fee Model Bloomberg has resurrected plans to charge securities services firms that use the vendor’s data to calculate fund net asset values (NAVs) for the right to distribute data to their clients, according to sources familiar with the situation. Bloomberg called off plans to introduce the data redistribution fee for fund administrator clients early last year, but has put forward fresh proposals that would link the right to distribute some of its data to the new pricing model—dubbed the New Commercial Model (NCM), which came into effect from the start of June for existing contracts facing renewal and from April 1 for new accounts—for its Bloomberg Data License service. As part of the NCM, the data giant now recognizes “securities services”—covering custodians and fund administrators—as a new business unit, whereas it previously only differentiated between buy-side and sell-side clients, and is now looking to request a premium fee for clients to redistribute some of its data. Opponents of the redistribution charge say that—coupled with the NCM—securities services firms face a double-whammy of fee increases. Under the old commercial model, customers paid a monthly charge per security, with prices based on six categories of instrument type and three categories of data type—a security master incorporating corporate actions and prices; derived data; and issuer data. Under the NCM, Bloomberg has retained the monthly charges and the link between prices and data instrument type, but has replaced existing categories with a greater number of new categories, which results in higher fees overall than in the old model. For example, the security master, corporate actions data and prices for a corporate security were previously bundled together for $1.50 per security per month, but are now sold separately for $1.70, $0.50 and $0.75 per security per month respectively—a total of $2.95 per security per month. 
Bloomberg has expanded the six instrument categories—including a category covering corporate, government and money market assets; one for municipals; agency pools; collateralized mortgage obligations, commercial mortgage-backed securities, whole loans and asset-backed securities; equity options, futures, warrants, funds indexes and currencies; and economic statistics—to 11 categories, by splitting out different asset types into new, individual
categories, such as separate categories for funds, US government and syndicated loans. Meanwhile, the vendor has divided issuer data into three component categories—credit risk data, fundamentals and estimates—meaning monthly fees for a corporate security have increased from $2.50 to $6.50 in the NCM, an increase of almost 160 percent. The cost of derived data has risen by up to 50 percent depending on the asset class, while the vendor now charges for accompanying corporate actions data, regardless of whether a corporate action event actually occurred that month. Under the NCM, firms that wish to view the data more than once per month will now also be charged between one and three cents per security per day for those multiple requests, depending on the asset class and data type. Previously, the first multi-request was free.
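The fee arithmetic in the preceding paragraphs can be checked directly from the published per-security figures (a back-of-the-envelope sketch using only the numbers quoted above):

```python
# Old bundled model vs. New Commercial Model (NCM), monthly cost for one
# corporate security, using only the per-security figures quoted in the article.
old_bundle = 1.50  # security master + corporate actions + prices, bundled

ncm_parts = {"security master": 1.70, "corporate actions": 0.50, "prices": 0.75}
ncm_total = sum(ncm_parts.values())  # the same content, now sold separately

old_issuer = 2.50  # issuer data under the old model
ncm_issuer = 6.50  # credit risk data + fundamentals + estimates, combined
issuer_rise_pct = (ncm_issuer / old_issuer - 1) * 100

print(f"bundle: ${old_bundle:.2f} -> ${ncm_total:.2f} per security per month")
print(f"issuer data: ${old_issuer:.2f} -> ${ncm_issuer:.2f} (+{issuer_rise_pct:.0f}%)")
```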
Flexibility or Smoke and Mirrors?
Bloomberg officials say the new model is intended to provide more flexibility and value, and to allow clients to “only pay for the data they want and need.” But one market data manager at a European asset manager firm calls the change a “pure slicing and dicing” exercise, adding that if a business needs to subscribe to all the content,
“Those who have already made an investment to rationalize Bloomberg face a rise of 100 percent” Jean-Pierre Gottdiener, Lucidine Conseil
“You get nothing new or extra—you just have to pay a lot more for the same data.” To soften the impact of the changes for existing clients, Bloomberg’s Data Solutions group will provide enterprise data license consultants to help clients manage their data usage, and is phasing in the increases, so clients renewing their Data License contract this year and early next will see stepped cost increments, limited to a total increase of no more than 7 percent in the first year and a further 7 percent in the second. Some clients praise this softly-softly approach but are concerned about the impact after that initial two-year period. “In our peer group, we are sharing knowledge on how much it will impact us. For some, it’s 2 percent, for others it’s 30 or 100 percent, depending on what data you
take and how exposed you are to certain services,” says a market data vendor manager at a second European asset manager. “Seven percent in the first year, then another 7 percent in the second is fine, but after that, when it hits you fully, that’s what we’re worrying about.” In addition to incremental rises, Bloomberg will offer “optimization,” whereby if a firm has multiple contracts with the vendor across different branches or business units and requests the same data on the same security in the same month via those contracts, then— excluding intraday and derived data—the vendor will only charge between one and three cents for the second request, rather than twice the full price, which it expects to deliver better value for clients. However, Jean-Pierre Gottdiener, manager at Paris-based consultancy Lucidine Conseil, says firms who have made the biggest efforts so far to reduce costs and administration by consolidating multiple contracts across branches will not be eligible to take advantage of optimization, and will have to pay the most. “If you only have one contract because you have already rationalized your request to Bloomberg, there will be no optimization, and you will support nearly the full increase of the prices,” he says. “Some firms have made no optimization on Bloomberg and their increase was only 30 percent, whereas those who have already made an investment to rationalize Bloomberg face a rise of 100 percent.” Some say the vendor’s prices are fair, given that data volumes have increased considerably since the last time it increased prices—more than a decade ago, according to Bloomberg officials—but Gottdiener says Bloomberg’s leading position in the market means “the industry is facing a real issue from the policy, and will probably need to find alternative solutions.” In fact, the NCM has prompted dissatisfied buy- and sell-side firms to reassess their data consumption. 
Some participants have even said they will look to alternative parties for cheaper data for some parts of the Data License, such as corporate actions, where plenty of alternative providers exist. “Often with Bloomberg, you just absorb the whole universe and pump it everywhere, so it’s good that we now have to look at what data we use, where we use it, and why,” adds the source at the second asset manager. Faye Kilburn
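The stepped phase-in described above compounds: a client whose full NCM bill is, say, 30 percent higher pays at most 7 percent more in each of the first two renewal years, then the full amount. A quick sketch of that schedule, with the cap mechanics inferred from the article and a hypothetical base bill of 100 (the 30 percent rise is one of the peer-group figures quoted):

```python
# Stepped phase-in of an NCM price rise: annual increases capped at 7%
# for two years, then the full new price applies. The schedule logic is
# inferred from the article's description, not Bloomberg's actual terms.
def phase_in(base, total_rise, cap=0.07, capped_years=2):
    """Yield (year, bill) as capped rises compound toward the full price."""
    target = base * (1 + total_rise)
    bill = base
    for year in range(1, capped_years + 1):
        bill = min(bill * (1 + cap), target)
        yield year, round(bill, 2)
    yield capped_years + 1, round(target, 2)  # cap expires: full price

for year, bill in phase_in(100.0, 0.30):
    print(f"year {year}: {bill}")
```

The jump between year two and year three is exactly the cliff the quoted data manager is worried about.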
Interview With...
Forging Consensus The securities industry is putting the pieces together for compliance with the legal entity identifier standard, a crucial element being the ISO 17442 standard. Michael Shashoua speaks with Citi’s Karla McKenna, who is also chair of ISO TC 68, about how the functions and players are coming together under the standard
With the ISO 17442 standard winning the blessing of the Global Financial Markets Association (GFMA) for the legal entity identifier (LEI) standard set by the US Treasury’s Office of Financial Research (OFR), the International Organization for Standardization (ISO) has taken its place beside Swift and Depository Trust & Clearing Corporation (DTCC)/Avox as caretakers of LEI implementation for the financial industry. Karla McKenna, director of market practice and standards in the Global Transaction Services unit of Citi, is chair of the ISO Technical Committee for Financial Services (TC 68) and also participates in Citi’s internal group to prepare for the LEI standard. The group brings together representatives from all units within the bank that will be affected, such as data, technology, risk, compliance and legal departments. For all these entities working together, the standards have been
kept relatively flexible, according to McKenna. “The solution endorsed by GFMA based on the ISO 17442 supports self-registration, third-party registration and also, at the suggestion of the industry, a federation of the registration process and assignment process,” she says. “We view this as one of the strengths of the proposal we submitted.” ISO 17442 is scheduled to be finalized and published in January, setting the stage to institute a uniform, global LEI. The current work on the standard is a “ramp-up” process, according to McKenna. “We see players in the derivatives markets and systemically important financial institutions all needing LEIs, and perhaps the regulators will cast the net wider at this particular point,” she says. “We see an implementation, a peak and then a little bit of a fall as we do regular registrations and go back into business as usual. If there are other requirements with other deadlines and other groups of parties gearing up to need LEIs, we will see another peak. We may also see this from a geographical perspective as well.” Identification has been an ongoing issue for years, well before the OFR put the wheels in motion for the LEI standard. And the basic identification needs of firms haven’t changed much with the addition of regulatory needs, explains McKenna. “We all need legal entity information to be able to manage our business relationships, what we call know-your-customer processes, to assess and manage counterparty and concentration risk—and to link the LEI with the transaction chain to support STP,” she says. “Internally, we need to link the LEI with our own internal codes, and also link LEI with counterparty IDs, namely the BIC.” What happens with implementation of LEIs in the US is likely to be replicated, or at least faced again, in other markets worldwide. Yet a country focus will not necessarily drive adoption of the standard. “Global regulators will look at the exposure from a systemic risk perspective for derivatives, for example,” says McKenna. “It may be driven by asset class.” While some time remains before publication of ISO 17442 and institution of a uniform global LEI, that time is likely to go by fast for McKenna and her colleagues at ISO and Citi. They certainly have their work cut out for them, but they also have the chance to set an example for the industry.
“We all need legal entity information to be able to manage our business relationships, what we call know-your-customer processes, to assess and manage counterparty and concentration risk” Karla McKenna, Citi
Regulation & Standards
Mapping LEI Links The new legal entity identifier standard has created burgeoning data demands for the securities industry. How can they be addressed, and which service providers are up to the task? Michael Shashoua hears from practitioners and observers on how handling LEIs is likely to play out
The legal entity identification (LEI) standard put in place by the Office of Financial Research (OFR) in July will generate more data for firms to track and manage, creating opportunities for service providers while burdening investment firms, according to industry observers and service provider executives. The fields that must be filled in for the LEI will only be a couple more than were previously used, such as counterparty names and addresses, according to Stephen Engdahl, senior vice president of product strategy at New York-based enterprise data management software vendor GoldenSource. The need to archive positions and transactions, however, will create much greater data warehousing capacity needs, adds Engdahl. Clearing firms’ role in supporting the new LEI standard remains undefined, says Sean Culbert, a partner and co-lead of the finance, risk and compliance practice at financial industry consultancy Capco. “Will a clearing firm allow a bank to continue to use the current LEI structure and will they provide a mapping service for the bank, or will the bank be responsible for doing those conversions themselves?” he asks. Some firms may not do anything themselves regarding LEI because they expect their clearing firm to shoulder the responsibility, according to Culbert. Large firms realize it sooner if LEI data service either is not available externally or will have a cost. Firms must also consider that data quality can be affected by relying on third parties to interpret LEI needs. Culbert sees a “cottage industry” growing around mapping the LEI data. “You can see small firms out there asking to let them deal with chasing this down for you,” he says. “Some of the bigger banks offering clearing services are looking for ways to be stickier, because the margin compression within the clearing industry will clearly be an issue. Bigger banks have to get stickier so the buy side doesn’t just hop from one clearing firm to another looking for the lowest-cost transaction. This may be a way for them to add value-added services just on the mechanics of clearing and settling a trade.” Mapping solves the issue of managing LEI data but creates a large burden of perpetually having to perform mapping work, explains Culbert. “Why not just convert it? Identify where all your LEIs are and convert old to new,” he says. “The second effect is that all the financial reporting, risk management, management information systems, regulatory reporting, legal reporting and everything else with a reference to the legacy LEI has to be converted as well. You face this massive convergence exercise. These are the volume issues we’re looking at.”
Identifying Structures
Industry associations have made moves to determine how the LEI standard will be administered, with the Global Financial Markets Association (GFMA) last month recommending Swift as registration authority, DTCC to collect requests for new LEIs and store reference data on each LEI, and Avox as validator of the LEIs. Yet these organizations are not the only ones interested in taking on these roles. Standards organization GS1, which administers an identification system using barcodes and electronic messages for 25 sectors including healthcare, manufacturing and retail, teamed up with New York-based Financial InterGroup Holdings in 2010 to work on using GS1’s identification numbering system to identify products and entities in the financial industry, both in LEI form and in other securities identifier forms. Additionally, Financial InterGroup and GS1 sought consideration from regulators to become the registration authority for LEI. “In the end, these identifiers are needed so valued position data can be aggregated and made available to regulators. The volume of such data would not be any more than financial institutions put together now,” says Allan Grody, president of Financial InterGroup. “They have to aggregate their information for doing their own enterprise risk management, and they do it with a huge amount of mapping services and processing to get this information collectively looking like it is in a standard format. Our proposed approach is not to demand the information gets sent to any government agency, but rather that it will be available in a standard database within a financial institution. Regulators could sweep the databases using search technology to do aggregations on the fly and look for hot spots. When they find something, they can go into the organization itself; they always have access and look deeper.” Structured data such as LEI data is easier to organize, according to Grody. “We have a whole different gestalt to our solution,” he says. “We don’t have to go through elaborate data dumping and cleansing. If the data is just left where it is, we can sweep the databases and get it. You don’t have this option to solve the problem with unstructured data.”
The Value of Identification
Aside from registration authority and other LEI administration duties, the industry will need more help to contend with all the communications LEI will require, says Tim
Lind, global head of strategy for enterprise content at Thomson Reuters. The company intends to support the objectives of the LEI initiative and the requirements of its global clientele, to establish LEI as the definitive key in linking financial data with entities. Certain core parts of a securities identifier will become commoditized, according to Lind. These include name, country of origin, legal form, address and core identifiers. “Our focus is on the value-added content that will be linked to the core LEI record,” he says. “In the future, LEI will be the key to automate the exchange and synchronize legal entity data between Thomson Reuters and its customers. That’s why we’re embracing this.” The new standard will drive the creation of dashboards of information, says Lind. “First, we will discover connections within complex corporate structure, then link entities to value-added information—securities, news, historic information and financials—then track it and give the data that our customers need to manage changes that may impact credit-worthiness or the risk profile of their counterparties,” he says. “The precision with which we integrate data with our customers will be enhanced with the presence of the LEI.” Currently, if a user sought information on 1,000 entities, either the provider or the user has to adopt a proprietary identification scheme to link records between
disparate databases, according to Lind. The LEI will function like a social security number for entities, he adds. The need to create these linkages, in turn, will “spawn a new set of business applications that can consider a broader set of information to model market and credit risk,” says Lind. Linkages must happen to support modernization of risk models and help prevent a recurrence of the 2008 crisis, which was why the OFR supported the LEI standard. With the OFR still developing infrastructure to assess systemic risk, risk management providers must also develop an application or model that considers data components that weren’t previously included in identifications, says Lind. New models must capture all the information now available, including media reports coming at a speed not previously possible, notes Lind. Key information can also include supply chain impact on creditworthiness, exposure in the repo market and value of collateral. “All these can be leading indicators of an episode of risk,” he says. “The impact will be a new renaissance of risk management assessments and credit assessments, [definitions of] what is systemic risk and how do we set limits, embrace certain counterparties and set a universe of people we will do business with. All that has to be rationalized through some kind of assessment and model. There will be a lot of interesting work there and those models are all going to need a lot of data.”
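Lind’s “social security number for entities” analogy amounts to a shared join key: instead of each provider and customer maintaining a proprietary crosswalk for the same 1,000 entities, both sides key their records by LEI. A toy illustration, with all identifiers and content invented:

```python
# Toy illustration of LEI as a shared join key between two datasets that
# previously required proprietary crosswalks. All identifiers are invented
# (the 20-character code is fabricated, not a real LEI).

# Vendor-side value-added content, keyed by LEI.
vendor_news = {
    "5493001KJTIIGC8Y1R12": ["ratings watch", "supply-chain disruption"],
}

# Internal counterparty master, keyed by an in-house code but carrying the LEI.
counterparties = {
    "CP-0042": {"name": "Example Bank AG", "lei": "5493001KJTIIGC8Y1R12"},
}

def news_for(internal_id):
    """Join an internal record to vendor content via the LEI, no mapping table."""
    lei = counterparties[internal_id]["lei"]
    return vendor_news.get(lei, [])

print(news_for("CP-0042"))
```

The design point is that neither side invented the key: because the identifier is an industry standard, the same lookup works against any provider that also keys by LEI.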
Future Challenges
With the LEI generating more data, and service providers devising ways to map, structure and manage that data—enabled by the industry indicating a preference for registration authority and administrative functions—firms that need to consume identification data must have dashboards, linkages and models. They need to create these internally or choose the right provider for the job. Putting an LEI standard in place is only the beginning.
“Some of the bigger banks offering clearing services are looking for ways to be stickier, because the margin compression within the clearing industry will clearly be an issue” Sean Culbert, Capco
Special Report
Unlocking Exposures A turbulent economic environment and world events impacting businesses have in recent years highlighted the obstacles related to accurately estimating exposure to entities. Inside Reference Data and Inside Market Data gathered a panel of industry experts to discuss the requirements for achieving a complete picture of a business in a webinar on June 30, sponsored by Standard & Poor’s
1. Has your firm made any changes to improve its ability to unlock exposures in the past three years? Yes: 55%; No: 35%; I don’t know: 10%
2. How important is effective data integration for a firm to be able to unlock exposure and mitigate risk? Very important: 66%; Quite important: 32%; Slightly important: 2%; Not important: 0%
3. Do you currently review both equity and fixed income/credit exposures? Yes: 51%; No: 26%; I don’t know: 23%
4. What is the main reason for firms incorrectly calculating exposure to entities? Missing information: 32%; Poor data quality: 23%; The complexity of the problem: 20%; also offered: Poor data integration; Missing or insufficient technology
Following the financial crisis and, more recently, an array of natural disasters impacting the economy, it is now widely recognized firms need to integrate data assets and unlock exposures. Financial data professionals have been left with the challenge of identifying ways to improve data integration and ensuring firms capture sufficient data elements to mitigate risk. The importance placed on this has resulted in many firms reviewing processes and systems in recent years, opting to invest in improvements. In fact, in a poll held during the webinar, 55% said their firm has in the past three years made changes to improve the ability to unlock exposures (figure 1). London-based Stuart Mead, vice president, S&P Valuation and Risk Strategies, said some organizations might find this less important, but the score in the poll is a good number. “In my experience, this is an ongoing process in the market,” he said, explaining that the primary drivers for this trend are risk mitigation, an opportunity to derive profit from the information available, and regulation. With raised awareness of how closely businesses are connected in today’s market, monitoring exposure has become increasingly complex. Denver-based JP Tremblay, CFA, senior director, S&P Valuation and Risk Strategies, said: “We’re seeing a convergence of asset classes, and I think exposure again comes from multiple perspectives inside your portfolio.” There is a clear trend firms are focusing on adapting to the new landscape. “We’re seeing this conversion and people wanting to understand all the moving parts,” he said. One of the challenges though is to source, integrate and maintain the data needed for managing exposure
risk. In a poll, 98% of listeners said effective data integration is either quite important or very important for a firm to be able to unlock exposure and mitigate risk (figure 2). London-based Rickie Glasgow, associate director, S&P Valuation and Risk Strategies, said: “It’s important to try and assess your risk exposure and do that using the tools you have available.” The focus needs to be on getting the complete—and accurate— picture, and this is where many firms have recently made changes. More than 50% of listeners said they
now review both equity and fixed income/credit exposures (figure 3). In addition, many favor receiving data from a primary source rather than a derivative or aggregated feed. Still, there are a number of reasons why strategies for unlocking exposures continue to be taxing. It is not easy to get it right. In a poll, 32% said the main reason for firms incorrectly calculating exposure to entities is missing information and 23% said poor data quality (figure 4). But for many, it still comes back to the complexity—which 20% of listeners said was the number one challenge. To manage the complexity and prepare for future events, it is at least clear that the challenge cannot be ignored. In today’s market, it is vital for firms to focus on building connections between different data sets to enable the business to unlock exposures.
Q&A
Time for Action
Carla Mangado speaks to Asset Control's president and CEO Phil Lynch about data management, current trends and the firm's strategy moving forward.

How do you think the reference data space has developed in the past two to three years?
It has become a pressing business issue. Firms are realizing that to be successful, they must make it a strategic concern. Whereas the focus has traditionally centered on ensuring low latency in the front office, firms are realizing that details matter. The most successful will be those that elevate the role of reference data in the organization to a C-level function and remember that the devil (and the competitive advantage) is in the details.

What have been the highlights for Asset Control so far this year?
We've had a great first half of the year, culminating in Dean Goodermote and Frank Fanzilli Jr. being appointed to our board of directors as we look to build the strongest possible management team and further shape our strategic direction. Dean has considerable operational expertise and an impressive track record of repeated success in building value at software companies. Frank brings a great deal of relevant insight into what is important to our clients from both a business and a technology standpoint. Crucially, we've successfully transitioned to being a multi-product solutions company, launching a viable product for the mid-market that brings agility, efficiency and control to players where low total cost of ownership is key. We've also broadened our penetration from large, tier-one financial institutions and hedge funds to the middle market, at the same time establishing operations across continental Europe and throughout Asia.

Regulation has become part of every data conversation at the moment and all eyes are on legal entity identifiers. How will this change the data space?
It enables independent data providers to go to market more
easily, because the coding issue that has been the domain of the large aggregators is now in the hands of the industry. The fact that it is no longer proprietary means it is easier to integrate direct data sources using common identifiers owned by the industry.

To what extent do cost constraints remain a challenge for the back office, and which areas have been most affected in recent years?
The back office faces a huge challenge when it comes to the cost of sourcing information efficiently and the manual processes used to manage it. Indeed, these factors become significant barriers to growth given the explosion of information and the adoption of more aggressive investment strategies. This underscores the urgent requirement for many firms to invest in the technology that enables the back office to catch up with the abilities of the front office. Financial institutions have money tied up in legacy processes that have become bottlenecks to their business. It is essential that firms invest in technology, and they should be willing to make this investment because it is cheaper, more robust and delivers a much better return in the long run than other measures.

While the data space develops, some question to what extent firms have properly dealt with ownership and governance challenges. What are the outstanding challenges firms are facing?
I think very few firms have achieved a true partnership between IT, operations and ownership of data management. This is partly because the middle to back office has been overwhelmed by demand, and because there has been a lack of understanding of and respect for the challenges faced. It has traditionally been an area firms throw more resources at rather than approaching it as a strategic enabler.

Industry experts are saying it's time for action rather than theory. What should be high on firms' to-do lists? What steps should they take to tackle these challenges?
You need a comprehensive and robust infrastructure for managing the data your company relies upon. Frankly, there has not been nearly enough focus on the middle and back office. Firms must look to leverage technology strategically to be more agile, efficient and transparent. It is vital to recognize that tactical or 'one-size-fits-all' solutions simply don't work. Firms need a flexible and scalable solution that can grow with their business and allow them to respond efficiently and easily to customer demands, new regulations and changing market landscapes. In this way they can increase revenues and create competitive advantage, while ensuring risk management, audit, control and compliance systems are fuelled by accurate, accessible and actionable data.
Data Management
Putting Enterprise into Data Management
As firms prepare for the changes the legal entity identifier standard will bring to data management, embedding enterprise data management platforms and data governance into projects and initiatives will become more prevalent. Carla Mangado speaks to industry experts about what firms should focus on and how to tackle governance to succeed.
Since late 2010, data management discussions have inevitably focused on legal entity identification, the need for a global legal entity identifier (LEI), who would have to push the adoption of this standard, and what requirements the standard would have to meet to succeed. However, LEI or no LEI, firms continue to face an ongoing data challenge internally. Industry experts fear time and budget constraints could squeeze out necessary data management projects, and scupper any plans to replace out-of-date systems. Traditional issues that have been tackled for years, such as data ownership, data integration and how to launch data governance initiatives, remain problematic for many. Enterprise data management (EDM) continues to be a tough nut to crack, despite ongoing efforts and the fact that regulation has placed data on the radar. Inevitably, as the LEI discussions progress and regulatory pressure increases, firms yet to tackle these more traditional challenges are bound to fall behind.

However, while the motivation is there, current activity is disappointing. London-based Gert Raeves, who joined TowerGroup as research director in April, says implementation of large-scope data management programs is still at an early stage. "The excitement around data management might be high, there is enthusiasm, but when it gets to implementing platforms, many firms are still going back to basics," explains Raeves. "The early adopters do have a security master in place, but the industry as a whole has yet to see that as a mainstream need."

"As soon as you have gone through the steps of external market influence, internal data management governance, ownership awareness and program implementation, you still face having to overcome the siloed approach," he explains, adding that firms still tend to implement asset-class-specific solutions.
Looking for the Right Governance
Data governance has long been an ongoing discussion within the data space, but it seems only now are firms looking to tackle it directly. Often seen as a traditional challenge, it is now regarded as something other than a buzzword or a long-term project without a clear return on investment. It has slowly made its way into most data management debates and, more importantly, into actual data management initiatives.

London-based Colin Rickard, managing director, Europe, the Middle East and Africa at DataFlux, says firms should not tackle data governance with one big-bang approach. "I don't think the place to start is to set up an enterprise data governance team and committee and then take on a large corporate-wide approach," he explains, adding that top-down-driven approaches usually end up getting no further than the committee stage. Hong Kong-based Stuart Plane, managing director at Cadis Software, says: "The answer to the question 'how should a data governance initiative be approached' is unequivocal—data governance needs to be embedded in the organization's culture, and it does take time." He adds that it would
be a huge challenge to conquer data governance without a data management roadmap. But while Plane says data governance is “very much about culture within the organization, more than a specific piece of functionality,” Rickard explains there are still many people in the industry who have yet to realize they need to do something more systematic and robust in this area. “Currently, we are looking at a 20–80% divide. In 20% of the conversations I have, people do recognize how regulation will make them have to look into this space more closely, but I am still having plenty of conversations where people claim data validations are not necessary. They are still in business-as-usual mode.”
Re-committing to EDM
While there are plenty of challenges, experts agree data has come a long way since the days when buy-in was a challenge and data was not on regulators’ radar. One of the most positive effects of the regulatory push is the leading role industry associations have taken, explains Raeves. And the LEI is a good example. “They have shown they want to proactively come
up with recommendations of the ideal model and how things should work on legal entity identification, rather than sitting on the fence and limiting themselves to reactive discussions," he adds. Experts agree it is now time for executives to adopt and commit to an active role within the industry, to get involved and actively contribute to the industry debate. "If they don't do so, their larger competitors will," adds Raeves, noting that associations provide smaller firms in particular with the amplifier to make their voices heard. And with the upcoming decisions around the LEI, this need to speak up and have a say is not expected to change anytime soon.
Ongoing Pitfalls & The Greatest Successes

Colin Rickard, DataFlux
"In my view, the biggest issue revolves around the fact that some firms continue to struggle to recognize data governance is not something that should be tackled as part of a BI or analytics initiative, but a factor that has to be embedded as part of business-as-usual. It needs to be an operational matter, something that needs to have business rules embedded within it."
Biggest Success: An ever-growing minority of firms are viewing data governance as a business-as-usual program rather than an activity that sits in the analytical world.

Stuart Plane, Cadis Software
"One of the main challenges remains being able to implement an EDM platform with a reasonable budget while meeting the goals the firm has set out to achieve," says Plane. He explains that prior to embarking on a data strategy, firms should have a clear view of what they want to achieve. "They have to know what success means for them and what they want from the initiative...it's not all about putting in place a security master but discussing and knowing the destination of the data from there onwards," he adds.
Biggest Success: "Organizations' greater confidence when embarking on a data management project. They are not scared of failure as they were even just two years ago. This is due to the large number of companies that have proven that doing a data management project is achievable and will not destroy budgets and careers."

Steve Engdahl, GoldenSource
"Firms have to stop trying to bite off more than they can chew. EDM is an ongoing process... as you retire different systems you have to implement new ones; you have to deal with mergers and acquisitions, as well as react to a constantly changing landscape due to events in the marketplace." "Firms do recognize they need more than what they have in place... and they need to be able to develop an enterprise view for regulatory and risk purposes. In the past, keeping things at a lower level would have been OK. Not any more. Now it's all about having the right cross-references and linkages."
Biggest Success: Progress around the practice of independent price verification.

Gert Raeves, TowerGroup
While all eyes are on data, this focus should not be taken for granted, Raeves explains. "There is a danger that data management may just be squeezed out of a list of projects because there isn't time and money to cover all the 'must do's' in the time-frame necessary." "As an industry, we have to do more to link market and infrastructure changes with specific data management projects—a clear opportunity for managers to make sure this is the case," he explains, adding that he is currently having several conversations with professionals in the enterprise architect role. "It is clear there is a strong need for business alignment between business managers of data and data architects—there is a huge risk of disconnect in terms of prioritizing, focusing on technology versus data management," adds Raeves.
Biggest Success: Greater collaboration, discussion and the will to take on an active role and have a say.

Fritz McCormick, Aite Group
"The biggest issue that remains is actually driving the initiatives from the top level to the project level and understanding the environment—what is strategic, what is tactical and how can we make a more elegant and sophisticated data management program."
Biggest Success: "The establishment of data management as a true discipline. The challenge was creating enough awareness of the problem to force firms to institutionalize the discipline itself."
Pricing & Valuation
Emerging Potential
When dealing with evaluated prices in emerging markets, it's not just the availability and timeliness of pricing data you need to be concerned about. James Rundle looks at what solutions are being developed to help firms diversify their investment strategies and maintain their focus on data management.

With regulation, market crashes, sovereign instability and lashings of fiscal uncertainty of late, it's no wonder the more developed markets aren't quite the attractive prospect they once were. More and more, investment firms are seeking to diversify their businesses for more profitable returns, and the emerging markets are often the first port of call. For fixed-income bonds, securities and other asset classes, however, evaluated prices can be an altogether more complex issue.

The largest stumbling block for evaluated prices in emerging markets is the availability of information, its accessibility and the timeliness of data. Source scarcity, a lack of automation and other areas were also highlighted as obstacles to providing accurate and fast pricing data, but naturally, these vary depending on the specific region you're dealing with. "It's difficult to generalize because different markets have different nuances," says Malcolm Oldham, head of evaluated pricing for EMEA and Asia at Thomson Reuters. "Some emerging markets have set up effective trading and information-sharing infrastructures to develop the availability and accessibility of data within these markets, but generally where a market is new, information is going to be scarce."

These kinds of conflicts can emerge from the entire spectrum of an emerging market, and not just the pricing data. Phil Lynch, New York-based chief executive and president at Asset Control, says prices aren't the only area of concern. "Data has to be accurate, accessible and actionable. That challenge is considerable for both pricing and reference data, because the business needs to understand the details behind it to make good decisions," he explains. "The firm will want to know the type of bond, the features of that bond, such as the currencies in which it trades, and other outstanding debt issues of that issuer. All those details are every bit as important as the price because they put the valuation into context. For all markets, you need a lot more information than that to accurately understand trading instruments and their various components. In emerging markets, sourcing that information is even more challenging, but the importance of accessing the data in a timely fashion is not diminished."
Old tricks, new dogs
The real issue here is timeliness. Brokers used to trading in developed markets will encounter a high level of automation that delivers information very rapidly to their fingertips, often
measured in nanoseconds. This fluid transfer of knowledge has become endemic to trading practices in, particularly, American and European markets, but in the emerging arena there rarely is the kind of infrastructure that can support that, or participants have to account for differing regulations, local conventions and many other issues.

Increasingly, to provide the reference data so crucial to the operation of evaluated pricing, firms are looking to vendor-provided, commercially neutral networks of information. "In the past, where markets are emerging or had very limited availability, vendors may have walked away due to a lack of available information to work with," says Oldham. "Over the past two or three years, however, Thomson Reuters has worked with local partners, industry associations and trading firms that are active in those regions to create a benchmark of information that can then be used as an independent source of reference for traders. Vietnam and India are examples of where we have done this. As a business, we can stand in a market as an independent provider of pricing information, which is an attractive proposition, particularly for dealer associations and regulators where neutrality can be beneficial."

Of course, vendors aren't the only providers of data. Some companies are offering their own information. "Firms continue to search for additional data sources, in some cases, mining the terms and conditions data themselves," says Frank dos Santos, New York-based vice president at Standard & Poor's Securities Evaluations. "With any market, as it matures and becomes broader,
the availability of data increases because liquidity generally increases.”
Timeliness and transparency
The way forward in dealing with these obstacles, say some, is for firms to refocus their investment. Typically, a heavy level of priority is placed on the front office. It makes the money, after all, but in emerging markets it is not the only priority. According to Asset Control's Lynch, firms need to look at their back and middle offices, and their ability to provide this crucial data.

"Over the past 10 years, there's been a huge focus on timeliness in the front office, around a very small number of attributes—the last price, the bid, the ask—the attributes that are critical to execution," he explains. "But once you execute, you need all the details, and there hasn't been nearly as much focus in the middle and back office on timeliness around the details. Firms' technology investments have been inordinately focused on front-office low latency. And now the middle and back offices are dealing with electronic trading in very high volumes, and realizing they need the details in a very timely fashion as well, but there hasn't been that needed investment. It's typical that the investment leads in the front office, but also, for that to be effective, it has to be followed up with a significant investment in the back office as well."

Other considerations will necessarily factor into the amount of adjustment a firm will need to make, and given the relative size and complexity of the pricing space, there isn't a one-size-fits-all application that can
be effectively implemented. "I think the size of company and investment in an asset class would dictate how a firm's infrastructure is focused," says Standard & Poor's' Dos Santos. "Properly pricing securities is a very complex business. For it to work properly, there needs to be a fair degree of transparency in the market. There must be a level of transparency in the information and in the people or conditions that supply it, so that it is sufficient and reliable."

Anthony Belcher, London-based director of European fixed income for Interactive Data, agrees with this while highlighting the importance of external factors in any approach to evaluated prices in emerging markets. "Given the difficulties in the data and formats for emerging markets, it very much depends on the approach used," he argues. "If there is a black-box model or algorithmic approach, this can suffer with the issues that arise from emerging markets. An approach that can deal with the differences in emerging markets should cope with the differing needs of each market, such as understanding the economies and politics, which are as important in evaluating these markets as the fixed income markets themselves."

The enigma of emergence

Overall, the same issues seem to be pertinent to evaluated prices in emerging markets. It's important to note, though, that these aren't exclusively confined to this area but face the industry as a whole. "Evaluated pricing for [Interactive Data] is a combination of data, systems and, above all, evaluators who understand the markets," says Belcher. "Emerging markets is no different from that. Having people who understand the markets and data from those markets remains key."

Timeliness remains a major concern, but it can be abated with the correct investment in key areas—personnel, infrastructure and operational professionalism. All this must be tailored to the needs of the individual firm, however. An over-reliance on a narrow section of the market will not reap the same kind of rewards as a flexible approach, looking not just at pricing data in and of itself, but at the periphery that surrounds and contributes to it. The smart money, as always, is on a wider view.
Industry Warehouse
The Long and Winding Road to LEI
After years of effort, the pieces of the puzzle of an ISO-standard legal entity identifier are falling into place. British Telecom's Chris Pickles looks at how the LEI can complete its final mile toward adoption.

At last there is a new ISO standard for a unique legal entity identifier—ISO 17442. For a considerable part of the last decade, people working on standards around the world battled toward achieving such a standard. The international initiative to create an ISO-standard International Business Entity Identifier (IBEI) was brought to a standstill, and instead the ISO-standard Bank Identifier Code (BIC) was massaged to turn it into a "Business Identifier Code"—one reason being that banks wouldn't then have to change their legacy BIC-based systems. However, that didn't end up meeting the overall requirement either, and work had already been moving ahead on an ISO Issuer and Guarantor Identifier (IGI) standard—an adaptation of the IBEI—when the US Office of Financial Research (OFR) demanded an appropriate, workable result, and fast. So the ISO Legal Entity Identifier (LEI) standard was born.

Although the central administrators for this new standard have been declared, it is not yet clear to everyone how the process of obtaining and issuing LEIs will work. On the one hand, the concept of having one central body for the whole world issuing unique identifiers seems optimistic. On the other hand, there are already international bodies that have addressed and solved very similar problems: think of article numbering for bar-coding, for example. The financial sector has a known tendency to reinvent wheels and build do-it-yourself solutions where larger, better and more cost-effective solutions are not only already available, but are also already
being used in a different part of the same financial institution.

Unique entity identifiers are not new. To become a legal entity, organizations generally must register with a national body that is often a department of the relevant national government, which then issues an identification number for that legal entity. Each such identifier is nationally unique. The difference with the ISO LEI is that each identifier will be globally unique and will contain no "information": the ISO LEI should not give a clue as to what country the legal entity is registered in. That implies that only one central body will issue all LEIs.

For a one-country system, such as a solution for the US OFR's requirement, this wouldn't appear to be a problem. The US government can require any organization to register for an ISO LEI, regardless of whether the organization is a US legal entity, a non-US legal entity doing business in the US, or a non-US entity doing business with any US organization anywhere in the world. Any country can make its own rules. The difference here is that the ISO LEI standard needs to be adopted globally. Adoption is the true measure of success of any standard, and if the ISO LEI registration process is not made totally transparent and cost-efficient, adoption will end up being made more difficult than necessary.

But there are other operations besides financial markets and the management of systemic risk that need unambiguous entity identifiers. One very sizeable area is international business and e-commerce. Being able to reconcile the suppliers and customers that relate to electronic orders, e-invoices, remittance advices, electronic payments and VAT is critical to an efficient e-commerce environment. Banks are now seeing their revenues from payments processing decline rapidly as a result of legislation such as the EU Payment Services Directive (PSD) and of increasing activity by both domestic and foreign competitors. Major banks are now looking to the business opportunities presented by supporting their corporate clients in their e-commerce activities as a replacement revenue stream for the future.

And that brings us back to the enterprise data model and the single golden copy of reference data for a financial institution to use. ISO LEIs may have been created to meet a regulatory and risk management issue, but their applicability is much broader. Rather than addressing the adoption of ISO LEIs in a siloed manner, as though they are just data elements to be used in a firm's financial markets activities, financial institutions need to recognize how ISO LEIs can help them address the whole range of the enterprise's activities, so they can plan to maximize benefits at least cost, rather than just building yet another reference data silo.
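To make the "no embedded information" point concrete: because an LEI carries no country or registry clue, checking one reduces to a pure checksum. The sketch below assumes the 20-character layout proposed for ISO 17442—an 18-character alphanumeric base followed by two numeric check digits computed with ISO 7064 MOD 97-10, the same scheme IBANs use; the sample prefix is hypothetical, and the final published standard should be consulted for the normative rules.

```python
# Sketch of LEI check-digit handling, assuming the proposed ISO 17442
# layout: an 18-character alphanumeric base plus two numeric check
# digits computed with ISO 7064 MOD 97-10 (as used for IBANs).

def _to_number(code: str) -> int:
    # Map 0-9 to themselves and A-Z to 10-35, then read the result
    # as one large decimal integer.
    return int("".join(str(int(c, 36)) for c in code))

def lei_check_digits(base18: str) -> str:
    # Append "00", take the value mod 97, and subtract from 98;
    # zero-pad so the result is always two digits.
    return f"{98 - _to_number(base18 + '00') % 97:02d}"

def lei_is_valid(lei: str) -> bool:
    # A well-formed LEI is 20 alphanumeric characters whose mapped
    # value reduces to 1 modulo 97.
    return len(lei) == 20 and lei.isalnum() and _to_number(lei) % 97 == 1

base = "5493001KJTIIGC8Y1R"          # hypothetical 18-character base
lei = base + lei_check_digits(base)  # 20-character candidate LEI
print(lei, lei_is_valid(lei))
```

Note what the check does not tell you: nothing about jurisdiction, issuer or entity type. All of that has to come from the reference data linked to the identifier, which is exactly why the LEI debate keeps returning to the golden copy.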
Chris Pickles is head of industry initiatives—global banking and financial markets, at British Telecom
People Moves

Thomson Reuters Cuts Several Execs
Thomson Reuters announced a major reorganization of its financial information organization on July 22, cutting senior management including Devin Wenig, chief executive of the markets division. Chief executive Tom Glocer has taken over the markets division. Shanker Ramamurthy, formerly head of sales and trading, now leads the new financial professionals and marketplaces department. Eric Frank, head of the investment and advisory unit merged into this department, has left. Susan Taylor Martin has taken the reins of the group’s media department. She had been president of global investment focus accounts, and replaced Chris Ahearn in the role. Other departures in the reorganization included chief marketing officer Lee Ann Daly, managing director of global sales and customer service Joerg Floeck and global head of human resources for the markets division, John Reid-Dodick.
Madsen Leaves Saxo Bank

John Visti Madsen, head of market data sourcing and strategy at Saxo Bank, has left the firm after four years. He joined as head of data management in December 2007 and was appointed head of market data in May 2009. Previously, Visti Madsen was team lead, sales support and client relations at VP Financial Information from February 2007 to November 2007, and a customer consultant at VP Securities Services from May 2005 to January 2007. He was also a customer consultant at Telekurs Financial, a role he held for three years. Visti Madsen is an expert on vendor sourcing management, data quality and governance, and has had an active role in the development of the Financial Information Associate (FIA) exam, an FISD-led and industry-recognized professional accreditation for practitioners in the global financial information industry. He is a member of the FISD Financial Information Associate Curriculum Committee.

DTCC Hires Wolfe, Wysota

The Depository Trust & Clearing Corporation (DTCC) has appointed two new managing directors: Bari Jane Wolfe, who will serve as head of regulatory relations, and Adam Wysota, who will serve as chief technology officer. Wolfe will lead the DTCC's regulatory relations functions, interacting and working with US and international regulatory bodies. She will report to Michael Bodson, chief operating officer. Previously, Wolfe was a managing director at Barclays Capital, managing the group responsible for litigation, disputes and regulatory matters. She also spent 23 years at Lehman Brothers in key positions in the legal and compliance groups. In his role with DTCC, Wysota will be responsible for strategic direction and oversight of information technology infrastructure, as well as a technology risk mitigation initiative. He will review and approve technology upgrades and supervise maintenance of DTCC systems. Wysota reports to Robert Garrison, chief information officer. He also joins DTCC from Barclays Capital, where he had been a managing director in prime services information technology since 2008. Wysota's experience also includes work for Lehman Brothers and Morgan Stanley. He is replacing Lea Moskowitz, who is retiring from DTCC in September after 34 years there.
Broadridge Names Three in Europe
Broadridge Financial Solutions has hired three new senior executives in its investor communication solutions business: Naren Patel, director and regional head of relationship management; Keir Tutt, senior director of customer services, international; and Jonathan Ford, director of sales in London. Patel will focus on strategic account management for Broadridge's European clients. He has held senior positions over a 25-year career in the securities business at firms including National Westminster Bank, Royal Bank of Canada/Royal Trust (now RBC Dexia), Swift and Fiserv. Tutt was previously director of client services at Thomson's Online Benefits unit, while Ford was head of the mid- and small-cap sales team and a client director at Investis, a consultancy. Ford's experience also includes work for Thomson Financial and the London Stock Exchange.
MarketPrizm Taps Kirkup
Market data and trading infrastructure services provider MarketPrizm has hired Tom Kirkup as client director on its European commercial team. His past experience in financial markets includes several roles at Reuters, including global account, country and sales management, in locations such as London, New York, Tokyo, Frankfurt and the Gulf region.
Calendar
September 13: Inside Market Data Chicago. Chicago. Organized by Inside Market Data. Details at: www.financialinformationsummit.com.
September 14: Tokyo Financial Information Summit. Tokyo. Organized by Inside Reference Data and Inside Market Data. Details at: www.financialinformationsummit.com.
September 19–23: Sibos 2011. Toronto. Organized by Swift. Details at: www.swift.com.
September 27: European Financial Information Summit. London. Organized by Inside Reference Data and Inside Market Data. Details at: www.financialinformationsummit.com.
October 8–12: World Financial Information Conference 2011. San Francisco. Organized by FISD. Details at: www.fisd.net.
October 25: Frankfurt Financial Information Summit. Frankfurt. Organized by Inside Reference Data and Inside Market Data. Details at: www.financialinformationsummit.com.
November 7–9: Asia-Pacific Financial Information Conference. Hong Kong. Organized by Inside Reference Data, Inside Market Data and FISD. Details at: www.financialinformationsummit.com.
December 1: FISD General Meeting & Networking Reception. London. Organized by FISD. Details at: www.fisd.net.
December 2: FISD Women's Group Luncheon. London. Organized by FISD. Details at: www.fisd.net.
December 15: FISD Issue Brief & Networking Reception. New York. Organized by FISD. Details at: www.fisd.net.
European Financial Information Summit 2011
Market data | Reference data | Enterprise data management | Corporate actions | Latency | Data architecture
Hear from keynote speaker
Chris Donnan, European Head of Automated Trading Technology, BARCLAYS CAPITAL
FREE attendance for qualified delegates from financial institutions
London, September 27
The annual European Financial Information Summit brings together market data, reference data and data management executives from leading financial institutions across Europe to discuss the challenges facing their business.
Hear from leading practitioners:
Steve Ellenberg, Market Data Strategic Sourcing Consultant, BANK OF NEW YORK MELLON
Alberto Ricciotti, Head of Global Modeling & Warehousing CFO Data Governance Planning, Finance & Administration, UNICREDIT
Neil Boosey, Head of Market Data Technology, ROYAL BANK OF SCOTLAND GROUP
Hans Christian Reinhardt, EMEA Head of Quantitative Trading Services, MORGAN STANLEY
Chad Giussani, Global Head, IT Quality and Regulatory Reporting, HSBC
Nigel Matthews, Global Reference Data, NOMURA INTERNATIONAL
Ian Davidson, EMEA Product Head, Electronic Markets, CITI
Julia Sutton, Director, Global Head of Customer Accounts and On-Boarding, ROYAL BANK OF CANADA CAPITAL MARKETS
Lead sponsors
Panel sponsors
Co-sponsors
Hosted by
For sponsorship opportunities please contact Jo Garvey via
[email protected] or call +44 (0)20 7316 9474
www.financialinformationsummit.com/eu
SIX Telekurs Solutions: Energising your business.
SIX Telekurs designs, implements and maintains solutions that allow our clients to take maximum advantage of the full scope of data we offer. We pay close attention to your individual requirements and energise your business by offering you the right solution for your business needs. For more information, e-mail us at: [email protected] or visit www.six-telekurs.com/solutions