MARY ANN DOWLING: Good afternoon. Treasury Strategies is very pleased to be part of the Alexander Hamilton Awards, and to moderate the Solution of the Year segment. The finalists here to my left are: Jamie Ballingall, Courtlandt Gates and Ron Chakravarti. Thank you.
I’ve been asked to give our vision, Treasury Strategies’ vision, of what treasury is. Treasury today is the nerve center of a company, and it interfaces with both internal and external stakeholders. Now more than ever, it needs to provide business intelligence and analysis to these stakeholders. Because of changes in regulations and disclosure, treasury is called upon now more than ever to provide essential analysis for decision-making, and to be a payments, liquidity and risk adviser to the business.
In order to do this, treasury has to rely on technology. Without technology, without systems to capture the day-to-day activities, applications like Bloomberg, Reuters, or the applications or solutions that these three companies are going to present today, treasury would not be able to provide the initial data or analysis that management needs in a timely fashion.
A strategic treasury focuses on these five key areas, and hopefully your companies are focusing on them too. From what I heard today, I think this is where most treasuries are looking, especially when we heard this morning about funding and liquidity, and the issues that arose during the financial crisis: being able to go to market and raise the funds when you need to, or knowing where your cash is. What we’ve seen is that there are companies that have gone out and prefunded 2011 and 2012, and are sitting on, as we heard this morning, significant cash, and are very reluctant or conservative about using it.
So therefore, they have to invest these funds in securities that preserve their principal, and have it ready if they can’t go to market to fund. Part of liquidity management is just cash forecasting, and we work with companies on enhancing their cash forecasting capabilities. One of the things we hear all the time is: our businesses are not forecasting accurately; we don’t know what’s going to happen; we get receipts in before we’re supposed to remit payments out.
On the risk management side, everyone has focused on market and operational risk, but because of the Lehman debacle, treasuries had to decide how else they were going to assign limits to their counterparties. In the past, they were using the credit ratings from Moody’s and S&P, but after Lehman went under over a weekend, an investment-grade rating of A wasn’t sufficient information to allocate a line against, so they had to identify other criteria, and there was more focus on what those criteria are, and then how you assign lines.
The focus on payments is to have a straight-through process, so that businesses are putting in payments through some sort of Web-based program, and the payments go straight through out to the banks to avoid fraud. And controls are important, because if you look at the last 20 years, a lot of the financial disasters that happened in the 90s and early 2000s were due to a breakdown in controls at companies. So focus on controls, and make sure that there is no fraudulent activity occurring.
Treasury management has evolved from Generation 1.0 to a Generation 3.0. In the Generation 1.0, we’re looking at a very operational and clerical environment, and I guess the best way to describe that is if you looked at a before-1970 treasury, there were no foreign exchange issues, because exchange rates were fixed; there were no money market funds, there were no workstations. It was very clerical and operational.
As you moved into the 70s and the 80s, you had foreign exchange rates being floated, you had the dollar come off the gold standard. In the early 80s, you had options being introduced to the market, financial futures, interest rates. All of this warranted treasury to become more analytical and proactive in managing the operations for the company.
Treasury Generation 3.0 is a much more advisory, strategic treasury, and if all of the solutions that we heard today lead to that, that’s where treasury needs to move in order to provide the information for management to make informed decisions. It’s a payments and liquidity adviser, and it’s a top-down approach using a single repository. What we’re seeing companies do is set up data warehouses so that they can capture the information in one location -- so they can then generate reports for management in a more timely and effective manner.
If we were to look at the technology for Generation 1.0, it’s very limited. It’s usually very manual, it’s very convoluted, and data doesn’t reside in one place. In Generation 2.0, we have a more automated process. You have treasury workstations, you have the ability to send information back and forth electronically, but there are still manual processes occurring. You have Excel spreadsheets, and you have some of the businesses using the phone to request a transaction. This is where we find a lot of treasuries today: stuck in this Treasury 2.0, and what we work on with these treasuries is to move them into the architecture for Treasury 3.0.
We do best practices studies with other companies. We look at enhancing their policies and strengthening them, and we look at what technology will help to automate the manual processes.
The technology for Generation 3.0 is the implementation of a data warehouse, which would capture all of the information and enable those reports. Treasury Generation 3.0 uses technology to the fullest extent, and it gives the enterprise a broad view of the company. It’s a journey for companies. It’s not something that happens today -- it takes time to get there, and I wonder where your corporation is from a generation standpoint. Are you in a 2.0 or a 3.0? What you need to do to move into Generation 3.0 is to identify the obstacles that might prevent you from automating, and then put a plan in place, and we’ve heard about that today. Lewis talked about a plan. Where are you going to go? And then you can move forward so that you can have a strategic treasury operation.
So we’re going to move on to the solutions of the year. The Bronze winner is Clearwater Analytics, and it’s a Web-based system that provides investment portfolio reporting and analytics for institutional investors, asset managers and custody institutions. It enables companies to monitor performance, as well as measure asset managers against a standardized index, with accounting data consistent throughout the entire system, and it allows for real-time strategy decisions.
CEO Courtlandt Gates brings to Clearwater more than 25 years of experience in the financial service sector. Prior to joining Clearwater, Gates founded Vesper Investment Co. in California. From 1997 to 2006, he served as president, providing enterprises with merger-and-acquisitions, strategic and private equity advisory services. Prior to this, he was engaged in private-equity general business management and corporate development activities. He also served as a high-yield bond trader at Goldman Sachs, and a financial analyst at Morgan Stanley in New York and Tokyo. Gates holds an M.B.A. from Harvard Business School and an A.B. from Harvard College. Congratulations.
COURTLANDT GATES: Thank you very much. In case you haven’t heard, bronze is the new platinum as far as I’m concerned. In all seriousness, it’s an honor to be named a finalist with Citi and Credit Suisse for the 2010 Alexander Hamilton Solution of the Year Award. I’m going to try to stay within the boundaries of the fine line of sales promotion. But I’m very excited about the product we offer, and I need to spend some time telling you a little bit about what we do.
My primary message today is that technology and outsourcing can make the best and highest use of your precious, precious human resources. I’ve been involved in outsourcing for a long time. My first summer job in the 1970s, I would fetch golf balls out of a water hazard in the Long Island Sound in real time, and I would charge golfers a per ball fee to get their golf ball back. Golfers outsourced to an expert, who happened to be 11 years old, using technology swim goggles. They saved time and money, and they kept their clothes on, for which everyone was very grateful.
Fast-forward 40 years. Clearwater provides aggregated reconciled integrated investment-portfolio reporting and analytics. We aggregate cash positions and transactions from safekeeping entities for both internally and externally managed portfolios of our clients. We pull in security master information from third-party data providers, and using that information, we reconstruct our client’s portfolios on our servers.
The next step is vital. We don’t just aggregate and regurgitate that information. We take those virtual portfolios, re-reconcile them against the custody entity, and identify and resolve exceptions. This creates a solid tax-lot-level foundation, on which we construct integrated daily Web-based accounting, compliance, performance and risk reporting and analytics. We’re currently reporting on over $500 billion in assets for 2,500 institutional investors, and we serve as the accounting book of record for the majority of our clients.
Clients include institutional investors, as well as investment managers, custody banks and portals for which we provide private label solutions. Eighty percent of the assets on the system are corporate operating funds, and we’re currently reporting on between 30% to 40% of the cash and short-term investments of all public U.S. non-financial corporations.
We’ve been successful because we solve an urgent and pervasive problem: and that is the need for accurate, timely, actionable investment portfolio reporting and analytics. That is the need to know and understand what you own. As mentioned a number of times today, the financial crisis has elevated the profile of treasury within the organization, and it has also elevated the importance of understanding the investment portfolio. The financial crisis has also exposed the shortcomings of spreadsheets and manual processes in generating actionable information. Finally, treasury departments are understaffed, and it’s vital to make the best and highest uses of those precious human resources.
As I mentioned, solving the market problem has been a major driver of our success. There frankly was probably some luck involved. One of the founders of Clearwater had experience in a large corporate treasury department in Silicon Valley, where he developed a deep understanding of the challenges of reporting on portfolios for treasury departments, and where he was also surrounded by other peers in sophisticated treasury departments who were open to outsourcing and applying technology to solve problems.
The other three founders had deep domain knowledge in fixed-income sales and trading, a passion for portfolio transparency, financial wherewithal and a background in technology. The combination of the founders’ skills was certainly very powerful, and since inception, hiring great people has been critical to our success. To make the highest and best use of personnel, it is vital to leverage technology and external expertise. And using us as an example, we have a lot of very smart software developers at our company, and they could write a customer relationship management program for our sales team and sales manager to use. Our sales manager could track activity and pipeline status in Excel.
We chose an outsource solution, Salesforce.com, which is built on best practices, and delivered software-as-a-service. Our developers focus on improving our product, and our sales manager focuses on driving revenue.
This year we submitted our name for consideration as Solution of the Year with a case study about Cisco. Cisco has been a client and development partner for many years. In addition to its greenness, which we learned about a little earlier, Cisco is distinguished by the size of its portfolio and the efficiency of its treasury department, as measured by assets per person. Since inception, Clearwater has continuously improved its offering. I see this award more as a lifetime achievement award than an award for one performance, but this year we rolled out functionality that was of particular use to Cisco, including performance contribution at the individual tax lot level and a broader selection of benchmark indices against which to measure Cisco’s portfolio’s performance.
This is not the first diagram you’ve seen like this today. Treasury reports on the investment portfolio to a number of important constituents ranging from the board of directors to tax, accounting and audit. Meeting the needs of those constituents is cumbersome, involving data aggregation, data management, spreadsheets and manual processes. Reporting is, frankly, very messy. It’s a Sarbanes-Oxley nightmare. And as you know, it’s impossible to write controls around spreadsheets.
With an optimal solution, all constituents -- whether it’s the board of directors, the CFO, tax, audit or accounting -- extract information built on the same aggregated and reconciled holdings, and the information in accounting, compliance, performance and risk ties out. The aggregation and reconciliation is outsourced, and the reporting is backed by a SAS 70 Type II audit. Your employees are liberated to make informed decisions. Technology and outsourcing can offer automation, simplification, efficiency, and the delivery of actionable investment portfolio information. I’d like to thank Treasury & Risk and the sponsors for putting together this very informative event. Thank you very much.
DOWLING: The Silver award goes to Citibank for its treasury diagnostics. Citi has developed a comprehensive online benchmarking tool that uses comparative data collected by Citi to measure performance across six pillars of treasury operations. It enables companies to compare their policies, practices and processes against their peers and best-in-class companies worldwide, and it provides a yardstick for prioritizing investments and operational changes and improvements, with the information collected online.
Ron Chakravarti will be representing Citi. He is responsible for global solutions management for Citi’s enterprise liquidity management. He joined Citi in 2006. Prior to Citi, he was with ABN AMRO, heading its North and Latin American liquidity management advisory teams, and with Treasury Strategies, my organization, as a principal in the global corporate consulting practice.
RON CHAKRAVARTI: Thank you, Mary Ann. I’d actually first like to thank our clients. What I’m going to describe was developed for our clients, but it’s really because of them that it became useful to them and ultimately sort of led to this. Of course, I’d like to thank the Alexander Hamilton panel of judges for awarding us this. So thank you very much.
Lewis Booth of Ford said this morning, I’ll just quote, he said that, ‘Good advisors need to be included in the process,’ and this was his key takeaway at the end. That sort of brings us to what Citi Treasury Diagnostics as we call it is about. I’ll give you a little bit of background.
Citi’s large-corporate client base is very diverse, but it is essentially large multinationals, or companies that are growing into large multinationals. And throughout the years they’ve asked us: ‘Help us. You bank other companies like us. Help us take a look at our practices, our policies, our processes within treasury, and give us feedback on what we should be doing better. Not just in terms of bank product usage -- yes, we get that. I mean, ultimately there’s a commercial relationship, but as banking partners we would like, and in fact expect, you to do more than that.’
Now, there is a lot of work that happens, obviously, on the corporate finance and capital markets side. It’s really treasury operations that we decided to focus on -- classic treasury operations, admittedly -- where these questions have come up and where there isn’t necessarily a systematic process, and thus was born treasury diagnostics. If you look at the cash conversion cycle, which that diagram is meant to represent, and look across at where treasury operations fits in, you’ve got the classic areas of liquidity management and working capital processes. And then within treasury, you’ve obviously got risk management, subsidiary funding and repatriation, and liquidity management. And then ultimately the foundational layer: your policy and governance, and, as Mary Ann was talking about, the technology, which is not just there for efficiency and automation, but really is a key enabler in making all of this work.
So if you think about it, and if you agree that those are the six pillars, as we call them, of treasury operations, essentially what we did was create two things: a secure online tool where our clients can go in and answer a series of questions across these six areas around policies, processes and practices; and an analytical engine that takes the output, slices and dices it, and creates output in terms of how they compare against other companies -- against the universe, against peer group companies, against self-selected samples, against peer groupings that we create. And the result really is what you see at the bottom. It gives our clients a benchmark of how their practices, how what they’re doing in treasury operations, compare against other companies that they regard as peers, or against the universe as a whole.
None of our clients believe that it gives the answer to anything. The point is that clients get a data point, get a series of information points about how they compare against others, and that’s what clients find incredibly valuable, and there are certainly clients in this audience today who have used this.
So just to recap: we created this because clients were asking for it. What clients are using it for is a series of information points about how they compare in their treasury operations, policies, practices and processes. And the real use our clients have put it to is as a data point -- a series of information points -- in prioritizing where you put your budget and what you do. And that’s where clients have found it useful, again, because it gives you some useful, systematic and analytical information about how you compare. And then ultimately, it’s your strategy, your organization and your business that are going to determine which of those data points are useful, and which of those are not.
For Citi, it’s been useful as well, of course. First of all, it lets us meet what we said at the top, which is it’s something that they expect us to do. So this isn’t a service that we charge for. And quite frankly, when we look at the results overall, naturally it gives us information on trends that are emerging around the world, across particular industries that help us decide how we should be delivering solutions to our clients, not just today, but how they should be evolving as clients needs emerge. As clients go into new markets, as emerging-markets multinationals get bigger and bigger, how they should evolve and what we should be doing.
In a nutshell, going forward, we’re certainly going to continue to provide it to our clients, but we’re looking at additional cuts; we’re providing additional industry cuts. The first version was very much oriented toward nonfinancial corporates. Our financial institution clients and nonbank financial institution clients have been expressing interest, and we’ve started doing cuts for them. The public sector -- governments, NGOs; it’s a very wide field -- has expressed a lot of interest. We’ll be continuing to develop that, and it’s really evolving into a whole series of other initiatives and discussion forums. So that’s it in a nutshell.
I wanted to end by just saying that this wouldn’t have been possible, like I said before, without our clients. It also wouldn’t have been possible without a couple of people in the audience. First is someone I call our rocket scientist, because he comes to us from MIT -- Joe Morrow is sitting over there. And the other is Cindy Gerhard, a senior member of our team who joined Citi a couple of years ago, and has proven to be crucial to rolling this out and making it useful to clients. Thank you.
DOWLING: Okay. The Gold winner is Credit Suisse Securities. Credit Suisse Securities developed a tool that provides practical, actionable advice on corporate FX risk management strategies in a comprehensive and rigorous framework that captures the uncertainty around the correlations between FX rates, and employs robust risk measures. It is a Monte Carlo-based system that generates simulations of FX rates against which different risk management strategies can be systematically tested and compared.
Jamie Ballingall is here representing Credit Suisse, and he’s a director in the investment banking department in New York. He is the lead quantitative analyst for the corporate finance and risk solutions group, which provides state-of-the-art analysis and analytics for optimal corporate financial policy questions. Jamie joined Credit Suisse in July 2008 from Merrill Lynch in New York, where he spearheaded the development of the innovative Merrill Lynch building blocks analytical suite. Prior to that, he held various positions at Deutsche Bank in London, working on optimal corporate financial policy, rating advisory and sovereign debt management. Jamie is originally from the United Kingdom, and currently resides in New York. He holds a BS in mathematical science from the University of Bath, and an MS in mathematics and finance from Imperial College London. He is also an adjunct associate professor in finance and economics at Columbia University, where he teaches quantitative corporate finance.
JAMIE BALLINGALL: Well, I almost feel like I don’t need to say anything more after that. So let me just give you a little bit more color. So I should start out of course by thanking Treasury & Risk magazine and thanking everybody for coming out today to listen to all of this nonsense. And thanking a name that I know you’ve heard a lot, and you’re going to hear again -- Cisco Systems, who is providing our case study today.
So I work in the corporate finance and risk solutions group, and we have a mandate to assist any of our key corporate clients with any questions. Today I want to focus on risk management, and I specifically want to focus on FX rate risk management.
Towards the beginning of this year, we sat down and we did an annual review of all of the analytics that we had available to our clients, and we came to the very odd conclusion regarding FX rate risk management. We looked at what we had and we said, ‘We think this is state of the art; we think this is the best on the Street, and it’s also not good enough. We need to make this a lot better.’ We’d just been through the financial crisis, and that had really fundamentally challenged all of the assumptions that everybody had been making when they were building risk management models and we thought that we needed to do something about that.
It turns out that at almost exactly the same time, the treasury team at Cisco Systems was also sitting around having a broadly similar conversation that went something along the lines of: ‘We think we’re state of the art, but we need to get better.’ They reached out to us as a partner, and said, ‘What can we do together in order to come up with a better solution here?’
That was critical from my perspective, because now I had a client, and now I could actually go and grab some resources and get this project under way. And also they brought on a wealth of expertise about the pragmatics and even some of the theory that we needed to look at. So what was the question that we wanted to look at very specifically? The question was quite straightforward: Cisco has a very extensive portfolio of foreign currency exposures, and they have a specified budget for buying options in order to hedge those exposures. That’s a number that is specified in their annual budget; it is to a certain extent fixed, and they’ve got more exposures than they can feasibly hedge with that budget alone. So the question becomes, how does Cisco most efficiently allocate that option premium spend to the different currencies that they have available?
So in the interest of time, and simplicity and confidentiality, let’s imagine three currencies for a hypothetical company here. So I’m sitting there in the treasury department, I’ve got these three different currencies, and I’ve got $100 million worth of exposure to each of them. And I do some simulation -- maybe I look historically and do a historical simulation, maybe, as we do, I calibrate to forward-looking option prices -- and I come up with some kind of simulation of the euro-U.S. dollar exchange rate, of sterling and of yen.
I look at those, and I can look at the individual risks, and then I say, well if I look at my portfolio of risk obviously I expect the portfolio risk to be lower than the sum of the individual risks, because I have this diversification benefit. Not all of these currencies move precisely in line, the correlation is not exactly one, so I get some certain amount of diversification.
So first of all, I need to make sure that I fully understand my risk picture, and that I’m confident in my diversification number. But secondly, I need to understand how individual risks and also the diversification move if I start to apply hedges to this. So if I go out and start hedging U.S. dollar-yen, well that looks like that should reduce risk. But how much risk will it really reduce? Because it will also reduce my diversification. So we do the standard sort of thing. We go out, we calibrate a model, some Monte Carlo simulations, we test out various strategies, and we come up with a model to assist doing this. We look at the inputs that are going into that model, and we realize the correlations are absolutely critical. They’re really what drive the diversification factor, and therefore what’s driving a lot of the hedging decisions.
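The setup described above -- three currency exposures, correlated simulation, and a diversification benefit that hedging decisions depend on -- can be sketched in a few lines of code. This is purely a hypothetical illustration, not the actual Credit Suisse or Cisco model: the exposures, volatilities and correlation matrix are made up, and the risk measure here is a simple simulated 95% loss percentile.

```python
import math
import random

random.seed(42)

# Hypothetical inputs: $100MM of exposure to each of three currencies,
# with assumed volatilities and an assumed correlation matrix.
exposures = [100.0, 100.0, 100.0]      # $MM notional per currency
vols = [0.10, 0.09, 0.11]              # assumed annualized volatilities
corr = [[1.0, 0.6, 0.3],
        [0.6, 1.0, 0.4],
        [0.3, 0.4, 1.0]]

def cholesky(a):
    """Lower-triangular L with L L^T = a, used to correlate normal draws."""
    n = len(a)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(a[i][i] - s) if i == j else (a[i][j] - s) / L[j][j]
    return L

L = cholesky(corr)
N, PATHS = 3, 20000
pnl = []
for _ in range(PATHS):
    z = [random.gauss(0.0, 1.0) for _ in range(N)]
    shocks = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(N)]
    pnl.append([exposures[i] * vols[i] * shocks[i] for i in range(N)])

def risk95(samples):
    """Simulated 95% loss percentile: distance from zero to the 5th pct."""
    s = sorted(samples)
    return -s[int(0.05 * len(s))]

individual = [risk95([row[i] for row in pnl]) for i in range(N)]
portfolio = risk95([sum(row) for row in pnl])
diversification = sum(individual) - portfolio

# Because the correlations are below one, portfolio risk comes in below
# the sum of the stand-alone risks -- that gap is the diversification benefit.
assert portfolio < sum(individual)
```

Applying a hedge in this framework would mean adding an offsetting position to one of the columns and re-running the same two risk numbers, which is how the effect on both individual risk and diversification can be tested.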
If I’ve got a currency that is completely uncorrelated with anything else, that’s one of the last currencies I want to hedge. In fact, currencies that have got a negative correlation, I maybe do not want to hedge at all. So the correlation is an extremely important input into this model. If we look back here, this is a correlation on a one-year rolling window between euro-U.S. dollar and yen-U.S. dollar; if it looks upside down, it’s because we’re quoting everything in a direct-quote fashion.
And what’s important here is that it’s jumping around a fair amount and it’s jumping around very, very quickly. So if I do my diversification calculation with today’s numbers I get 10, but if I do my diversification calculation at some other moment in time, here in 2007, I get a completely different number. And importantly with those correlation assumptions I get a completely different answer as to what the optimal hedging strategy is. This was causing us some significant concern, and given the volatility in the markets, we were not comfortable just plugging a simple correlation number into our models.
So the approach we took is a Bayesian correlation approach. It’s a Black-Litterman-inspired approach. The top of what we’ve got here is traditionally how people do it. You pick a correlation number based on some historical time period. You hope you picked the right historical time period -- that you didn’t need to use more data, or shouldn’t have used less data. You just pick one that feels right to you. You get your number, and you go ahead and generate your Monte Carlo simulation paths, and do all your risk analytics and all your optimization based on that.
The approach that we’re using says: I want to treat that single number as a full probability distribution. I want to recognize that I don’t really know what the correlation between euro and yen is. I need to explicitly account for that uncertainty in the correlation. So I’m going to model it as a full probability distribution, and then, before I do my path simulation, I’m going to generate 10,000 different versions of the correlation number itself. And for each of those correlations I can then go ahead and generate my paths. That seems a lot more robust to me, because now what I’ve got is Monte Carlo simulations that cover all of those correlation circumstances, not just the circumstances that I happen to find myself in at the moment. Now, of course, in practice we’re not just dealing with two or three currencies, we’re dealing with a portfolio, so these individual numbers obviously become matrices and so forth, but the principle remains the same.
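The correlation-uncertainty idea can be sketched as follows. This is a simplified two-currency illustration with an assumed prior on the correlation -- it is not the actual Black-Litterman-inspired calibration, just the mechanical contrast between a point-estimate correlation and a sampled one.

```python
import math
import random

random.seed(0)

# Two-currency sketch with made-up numbers: instead of fixing the
# correlation at a point estimate, treat it as uncertain and draw a
# fresh correlation for every simulated path.
def draw_rho(mean=0.5, sd=0.3):
    """Assumed prior on the unknown correlation, truncated to stay valid."""
    return max(-0.95, min(0.95, random.gauss(mean, sd)))

def correlated_pair(rho):
    """One draw of two standard normals with correlation rho."""
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    return z1, rho * z1 + math.sqrt(1.0 - rho * rho) * z2

PATHS = 20000
fixed, bayesian = [], []
for _ in range(PATHS):
    # Traditional: one point-estimate correlation drives every path.
    a, b = correlated_pair(0.5)
    fixed.append(a + b)
    # Correlation-uncertain: each path gets its own sampled correlation,
    # so the simulation covers low- and high-correlation regimes alike.
    a, b = correlated_pair(draw_rho())
    bayesian.append(a + b)

def risk95(samples):
    s = sorted(samples)
    return -s[int(0.05 * len(s))]   # 95% tail measure of the two-asset sum

risk_fixed = risk95(fixed)
risk_uncertain = risk95(bayesian)
```

A hedging strategy tested against the correlation-uncertain paths is exposed to high-correlation regimes even when today's measured correlation is low, which is what makes the resulting answer more robust.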
What is the benefit of doing that? Well, it gives you a better read on what your risk measures actually are. It gives you a better read on what the diversification actually is. But it also gives you a more robust answer when you go out and start applying hedging strategies. In particular, it tends to tell you to hedge currencies that you may otherwise have overlooked. There could easily be currencies out there with low or negative correlations that a traditional model would say, well, don’t worry about, because the correlation is low or negative. But in this kind of framework, because there are some paths in which the correlation is very high, the model comes back and says, although it looks fine today, it may not look fine six months from now, so you need to hedge that anyway. So you get a set of results out of this model that feel a great deal more robust, and are covering a lot more of the bases.
I’ve thrown around the word ‘risk’ quite a lot here without really discussing it or explaining what it might mean, because it’s a room full of treasury professionals -- we all know what risk means. But we need to quantify risk, particularly within any kind of modeling framework. And the standard on the Street at the moment is very much to use Value-at-Risk. It’s been mentioned a couple of times already today. On a diagram like this -- this distribution is supposed to be a distribution of gains or losses versus some benchmark, so you’re benchmarking off of zero; I’ve already done the benchmarking part here -- Value-at-Risk is very simply the distance between the zero line and the fifth percentile. And that’s fine, it’s widely used. I think we’ve all been fighting a battle to get it recognized as a useful risk measure over the last couple of years.
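As a minimal sketch of that definition: 95% Value-at-Risk is just the distance from the zero line to the fifth percentile of the gain/loss distribution. The normal shape and the $MM scale below are assumptions purely for illustration.

```python
import random

random.seed(7)

# Simulated quarterly gain/loss distribution in $MM; the normal shape
# and the standard deviation of 10 are assumptions for illustration.
pnl = [random.gauss(0.0, 10.0) for _ in range(100000)]

def value_at_risk(pnl, level=0.95):
    """95% VaR: distance from the zero line to the 5th percentile."""
    s = sorted(pnl)
    return -s[int((1.0 - level) * len(s))]

var95 = value_at_risk(pnl)
# For a normal P&L with sd 10, this lands near 1.645 * 10.
```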
There are a couple of problems, though, with Value-at-Risk. The first one is that if you measure the risk of an individual portfolio, and you measure the risk of some other portfolio, you would expect the risk of the consolidated portfolio to be less than or equal to the sum of the individual risks. Diversification should be risk-reducing, and with Value-at-Risk that’s true most of the time, but not all of the time. When we’re optimizing over a very large space of possible hedging strategies, wouldn’t you know it, my optimization algorithm always finds those couple of situations where that doesn’t happen to work out, falls down a pit, and goes and finds me the wrong answer. So that’s a very technical kind of reason why we might want to steer away from Value-at-Risk, but there’s another, more profound one, and that has to do with the left-hand tail.
Ninety-five percent Value-at-Risk is a very common measure, but let’s think about what it really means. I’m telling you that there is this downside case that happens 5% of the time. If I’m looking at quarterly risk measures, that’s going to happen once every 20 quarters, i.e., once every five years, and I’m telling you what it looks like at the edge of that downside. So I’m defining these downside scenarios, which happen once every 20 quarters, and I’m telling you about the best downside scenario.
Well, I don’t know about you, but I’d kind of like to be in my job for at least five years, so I’m going to see at least one of those events, and all VaR has told me is that it’s going to be worse than this. It’s told me a number, and I know that during my five-year career, I’m going to see something that’s worse than that.
So that caused us a little bit of concern, and it’s a particular concern when we look at any kind of distribution with fat tails, like this one here; I know it looks like a normal distribution, but I spent hours making sure it had fat tails. So we used a slightly different, closely related risk measure, and we tried to hold on to some of the key strengths that VaR has: it’s very communicable within the organization, it’s very robust, it’s a one-tail measure, all of those kinds of things.
But we need to fix those two problems. So we use a risk measure that we call expected-tail-loss; in the academic literature it’s also called expected shortfall or conditional VaR. Instead of measuring to the fifth percentile, we take the average of everything to the left of the fifth percentile, and we measure to that. And that’s telling you something slightly different. First of all, it fixes the problem with diversification. But the second thing is it’s telling you not about the best downside scenario, but about the average downside scenario, and we think that’s considerably more informative.
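[Editor's note: the relationship between the two measures can be sketched as below. The fat-tailed Student-t distribution is an assumed stand-in for the kind of distribution the speaker describes; it is not the model from the talk. VaR marks the edge of the 5% tail, while expected tail loss averages everything beyond it, so it always comes out at least as large.]

```python
import numpy as np

rng = np.random.default_rng(0)
# Fat-tailed P&L: Student-t with 3 degrees of freedom (an assumed example)
pnl = rng.standard_t(df=3, size=100_000)
losses = -pnl

var95 = np.quantile(losses, 0.95)        # 95% VaR: the edge of the 5% loss tail
etl95 = losses[losses >= var95].mean()   # expected tail loss: average loss beyond VaR

print(var95, etl95)  # ETL is always at least as large as VaR
```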
But I should now talk about the benefits of OPAM. So a lot of the benefits are kind of internal: we’re allocating capital more efficiently within the business. That’s a great benefit.
We’re thinking about contingency planning a little bit more, because we’re looking at that different risk measure, and we’re thinking about scenarios that we might not otherwise have considered. We’re reducing administrative expenses. One of the things I didn’t mention is that if you’ve got a portfolio of 40-some currencies, you probably don’t need to hedge them all. You probably only need to hedge five or six of the big ones, and this kind of model can really help you to understand which those five or six are, and you can just forget about the others, because you really don’t want to be having to hedge Zambian dollars or whatever it is.
It frees up treasury resources, which -- hands up anybody in the room who thinks that treasury operations is overstaffed. We can free up a few resources with a project like this. It promotes a better understanding of risk within the organization. But the one I find that CFOs tend to focus on is that it reduces the expense you have to pay out to investment banks in the form of hedging costs. As for numbers -- we can’t discuss specific figures, but we typically see that you can achieve exactly the same risk reduction with a more efficient hedging strategy, and reduce the direct cost of that hedging strategy by about 30% to 40%.
So again thank you very much Treasury & Risk magazine, thank you very much to everybody here for listening to my speech. Thank you.
DOWLING: Question time. I’ll kick it off for the panel, the finalists. What challenges did you encounter as you were developing the solutions that you talked about today?
GATES: I recently actually answered that question in an all-employee meeting. When we get together semi-annually the employees generally want to know what I’m worried about, and my biggest concern is continuing to hire really high-quality people to maintain our 50-plus% annual growth rate. And when I think about it that’s a worry or a problem that I think most CEOs would want to have, but that’s really it -- hiring great people.
CHAKRAVARTI: Probably the biggest challenge actually was keeping the scope within limits. The intellectual property so to speak wasn’t very difficult, because we have a lot of ex-corporate treasury practitioners on the team. It was really how do we develop it into something that is useful, but not make it so vast, so big and so deep, that it wasn’t really going to be something that would lead to useful actionable output, and that probably was the biggest challenge.
BALLINGALL: I think my biggest concern anytime we build any kind of quantitative model like this is the label ‘black box’; I hate to build models that nobody understands except me and two other guys on my team. That’s not what we’re trying to achieve. So we went to some very significant lengths to make sure that it was extremely clear what was going on inside this model. It was extremely clear where the inputs were coming from, how the calculations were being done, and why the results were the way they were. So the treasury team at Cisco not only understood it, which frankly I don’t think would have been a problem anyway, but could communicate it within the treasury team and the organization. So I wanted to avoid black boxes at all costs.
DOWLING: Questions from the audience.
AUDIENCE: The model that Jamie from Credit Suisse presented was focused specifically on currencies: Are there applications to commodities as well?
BALLINGALL: This kind of technology is definitely something that I am desperate to apply to commodities, and I haven’t had the opportunity yet, because I haven’t had a client relationship yet where the client has said, I want to do these commodities, so that I can push forward with that. It depends on the commodities -- I’m going to caveat that a little bit. Some are more difficult than others. You’d have to have a bit of a deeper think about seasonality, for example, which you don’t have in FX, but fundamentally it’s all the same.
DOWLING: Question for Jamie. In developing your model you did it for FX and for Cisco, are there other clients that you are talking to about this model, and is it something that you’re going to share with the rest of the market?
BALLINGALL: Absolutely, and if by the rest of the market you mean other corporations, yes, not other investment banks.
DOWLING: You don’t want to give your secret away?
BALLINGALL: Not yet, no. Absolutely, we’ve already used it with maybe a half dozen or a dozen other firms, and I think it’s been very useful. We’re also tweaking it. There are applications of it that include FX forwards and other kinds of financial instruments, and there are also applications of it to balance sheet hedging and so forth. I think the one challenge that we haven’t entirely ironed out yet is that some of our clients are asking for straight price optimization, and that’s somewhat tricky, but we’re working on it.
DOWLING: Ron, on your tool, is that available globally, and is that in different languages or is it just English?
CHAKRAVARTI: It is available globally, actually. Probably just under half are actually U.S.-parented companies, U.S.-headquartered companies, and the other half are companies parented in other developed and in fact emerging markets, be it Europe, Japan, Brazil or whatever. And probably about two-thirds use this to look at their global practices, so let’s say a U.S. company in particular would take a look at their global practices, or a Swiss company about their global practices. But about a third actually take it at a regional level. So let’s say an Asian treasurer for XYZ Co. might take it to benchmark against other companies, other multi-nationals operating in Asia. And so that sort of gives you an idea.
DOWLING: Ron, your submission for Cisco was because they needed something to index their investment traders. Are there other institutions -- I’m sorry Cortlandt -- are there other companies using it for the same application?
GATES: Yes, our platform is software-as-a-service, and it’s the same solution for virtually all of our clients. The process of aggregating and reconciling the foundation of tax lots and security master information is the same for everybody. Depending on the vertical market we’re serving -- whether it’s corporate treasury, or insurance, or private wealth -- the output can look a little bit different, but by virtue of our solution’s SaaS nature, we deploy upgrades across all of our clients on a monthly basis.