A Guide to Currency Conversions in SSAS

In this post I will describe different ways to do currency conversions in SSAS, which should cover most, if not all, requirements. Hopefully, it will also provide some best practice advice along the way.

As background and reference, I will use some other blog posts, the most notable of which are Christian Wade’s:

http://consultingblogs.emc.com/christianwade/archive/2006/08/24/currency-conversion-in-analysis-services-2005.aspx

http://consultingblogs.emc.com/christianwade/archive/2006/07/25/Measure-expressions_3A00_-how-performant-are-they_3F00_.aspx

The first one deals with currency conversions and compares the Business Intelligence Wizard approach with Measure Expressions, clearly favouring the Measure Expressions (MEs). The second post explores MEs and their strengths as opposed to MDX calculations. Both posts are very interesting and useful. There is a slight inaccuracy in the summary section of the first post, which describes MEs as stored on disk. In fact, MEs are not stored on disk but are calculated at runtime. Teo Lachev explains their behavior here:

http://prologika.com/CS/forums/p/835/3064.aspx

And another reference to MEs can be found in the SQL Server 2005 Performance Guide:

http://download.microsoft.com/download/8/5/e/85eea4fa-b3bb-4426-97d0-7f7151b2011c/SSAS2005PerfGuide.doc

The last, and possibly most helpful, piece of evidence for me was the confirmation of their behavior I got from Gerhard Brueckl, Philip Stephenson and Darren Gosbell in this MSDN Forum thread:

http://social.msdn.microsoft.com/Forums/en/sqlanalysisservices/thread/61cc5840-f8f1-45b6-9a9b-f9af4b21513e

Darren Gosbell also emailed me with another small clarification, which could have a big impact on your solutions: no aggregations are used for a measure group in SSAS if at least one of its measures has a ME defined. This could be very important in some cases. Teo Lachev has blogged about this limitation here:

http://prologika.com/CS/blogs/blog/archive/2010/05/22/in-search-of-aggregations.aspx

Now that we have some background on how currency conversions can be done, I will continue with a brief discussion of how currency amounts can be stored in a data mart.

In the vast majority of cases, a business works with a “base currency”, the default currency to which all currency amounts across all transactions are conformed. When a currency conversion needs to be made, we would typically multiply or divide the “base currency amount” by some “currency rate”, which gives us the amount in a non-base currency. To implement this approach, we could simply follow Christian Wade’s ideas of using Measure Expressions, which would give us the best performance (keeping in mind, of course, the limitations of Measure Expressions).
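To illustrate the rate-multiplication idea, a Measure Expression is set as a property of a physical measure and has the simple form of one measure multiplied or divided by a rate measure from a measure group at a common grain. A minimal sketch (the measure names here are assumptions, not from any real cube) could look like:

```mdx
-- Hypothetical MeasureExpression property value for a [Trade Amount] measure,
-- converting the base currency amount row by row using a rate held in a
-- Currency Rates measure group at the same grain:
[Base Trade Amount] * [End Of Day Rate]
```

The engine evaluates this expression at the leaf level before aggregation, which is what makes MEs faster than an equivalent MDX calculation over aggregated cells.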

Another approach is to store the base currency amount, as well as an amount for each of the most commonly used currencies throughout the organisation, as physical measures. As a result we end up with a few measure columns corresponding to the different currencies (e.g. USDAmount, EURAmount, AUDAmount). Then we just add these to our measure group and build a SCOPE statement, which gives us the correct measure when using our Currency dimension. If we want to convert to a currency other than the ones we have already converted to, we need to resort to the previously mentioned approach, treating one of these currencies as a base currency. Because we work with physical measures in the majority of cases, this implementation solves some performance problems. However, it requires more storage space, which could (for a very large implementation) be severe. Also, if we have multiple measures to convert, we need to store [Number of Measures] x [Number of Frequently Used Currencies – 1] additional measure columns in our database, and subsequently in our cube. And when I say “solves some performance problems”, our problems are in fact solved only when we use the currencies for which we have pre-converted amounts. In all other cases, we hit the worst possible case: a complete conversion calculation over our measures.
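As a sketch of that SCOPE statement (the measure and attribute names are assumptions for illustration), the cube script could map each pre-converted physical measure to its member of the Currency dimension:

```mdx
-- Hypothetical scoped assignments: each pre-converted physical measure
-- is surfaced under the matching member of the Currency dimension.
SCOPE ([Measures].[Amount]);
    SCOPE ([Currency].[Currency Code].[USD]);
        THIS = [Measures].[USDAmount];
    END SCOPE;
    SCOPE ([Currency].[Currency Code].[EUR]);
        THIS = [Measures].[EURAmount];
    END SCOPE;
END SCOPE;
```

Any currency without its own physical measure would simply fall through these assignments and have to be handled by a rate-based conversion instead.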

There is a third tactic, which I have recently been working on. Together with the previous two, it could potentially yield the best possible results. The idea is to store the base currency amount, as well as a local currency amount in another column, and to add a Local Currency dimension to the cube. The Local Currency measure contains the base amount converted to the currency which is “local” for the data. For example, if we have a company with offices in Sydney and London, the local amounts stored against the Australian offices (based on business unit or geography) will be in AUD, while the amounts for the English ones will be in GBP. The base currency, however, could be AUD, in which case the amounts in our BaseAmount column will always be in AUD. Once we have set this up we can do the following:

  1. For reports in the base currency: Use the base currency amount only
  2. For reports in non-base currency:
    1. Take the LocalAmount from the fact table, where the local currency is the selected currency
    2. Convert the BaseAmount from the fact table, where the local currency is not the selected currency
    3. Sum the amounts we get from the previous two steps

In my current solution I have the following cube (showing only the Dimension Usage tab, which I believe illustrates the structure best):

[Screenshot: the cube’s Dimension Usage tab]

Note that I have a Currency Conversion measure group, which stores conversion rates from base currency for each date. It has a many-to-many relationship with the other two measure groups in my cube, while these two measure groups have a Local Currency dimension related to them as described above.

Then the MDX in the cube script for my Deal measure (the physical measure is called [Amount - Trade]) is:

/* This calculation picks the Local Currency Amount from the fact table with no currency
     conversion applied */
CREATE MEMBER CURRENTCUBE.[Measures].[Local Trade Currency Amount]
AS
    (LinkMember([Currency].[Currency Code].CurrentMember,
                [Local Currency].[Currency Code]),
     [Currency].[Currency Code].[AUD],
     [Measures].[Local Trade Amount]),
ASSOCIATED_MEASURE_GROUP = 'Deal',
VISIBLE = 0;

/* Here we pick all other amounts in Base Currency (other than the one selected by the user)
     and then pull the converted amounts for them (calculation done in a Measure Expression) */
CREATE MEMBER CURRENTCUBE.[Measures].[Converted Local Trade Currency Amount]
AS
    SUM([Local Currency].[Currency Code].[Currency Code] -
            LinkMember([Currency].[Currency Code].CurrentMember,
                       [Local Currency].[Currency Code]),
        [Measures].[Trade Amount]),
ASSOCIATED_MEASURE_GROUP = 'Deal',
VISIBLE = 0;

/* This combined measure adds the previous two to retrieve the overall amount
     in the selected currency */
CREATE MEMBER CURRENTCUBE.[Measures].[Amount - Trade]
AS
    SUM({[Measures].[Local Trade Currency Amount],
         [Measures].[Converted Local Trade Currency Amount]}),
FORMAT_STRING = "Currency",
ASSOCIATED_MEASURE_GROUP = 'Deal',
VISIBLE = 1;

When we generate a report in the base currency, performance is the best possible, as no conversions are made (if we utilize the Direct Slice property as described in Christian Wade’s blog posts above). The same holds true for reports in a non-base currency where 100% of the data can be retrieved from the LocalAmount measure. The worst case comes when we request a currency which is not the Local Currency for any part of the slice requested by our query. In this case, we resort to the first approach, converting every amount and then summing up. If some of the data can be retrieved as a LocalAmount and some cannot, we are somewhere between the best and worst cases, and performance depends on how many conversions need to be performed.

I consider this a useful solution, as in many cases reports are either specific to an office/country and generated in the local currency only, or are “global” and use only the base currency for the organisation. Therefore, using this approach, we get the best possible performance with no severe trade-off in storage space.

In the end, it depends on the specific organisation and requirements, but we can always combine the second and third solutions: store a few converted amounts as physical measures, include a LocalAmount as another one, and convert to a currency different from these with Measure Expressions only when absolutely necessary. This way, we essentially take a “best of both worlds” approach and can obtain the best possible results from our solution.


10 thoughts on “A Guide to Currency Conversions in SSAS”

  1. Nice to see all of that laid out in one place and an interesting read. I can’t help but wonder though, what happens if you set the “Disable Prefetch Facts” property to true? Does AS still not bother to use aggregations on measures with no measure expressions? I’m guessing it does, even though it probably has no reason to.


  2. This is kind of an inflexible, limited solution. The best, most powerful approach to doing this is to utilize a well-understood construct in the cube engine: dimensionalize the currencies, and have different parts of the hierarchies hold members with member formulas that apply the present currency exchange rates to the measures, depending on what kind of reporting you’re attempting. Just because SSAS allows you to use measure groups (which are only kind of half-baked) does not mean that you should grasp at them and use them for anything they’re not well meant for. Lastly, dimensionalizing this means that you can also provide a true structure to the currencies, where you could group them into currency groups and have alternative ways of representing them to the end users. Just a different approach, but the one you painted falls flat with me.


  3. One more note: I think if you want the results to truly aggregate (and not do the work at query time), I’d put the exchange rates in the DSV or in the ETL layer, since they relate to a single currency (not many-to-many), and since those exchange rates would at most be modified daily. So if you processed the cube once per day, you would just reference those DSV-driven members with the rates set in them. A nice custom AMO app providing a simple interface to let a normal end user enter the daily exchange rates would be a task that would not have to be relegated to the cube dev.

    Also, since you would turn the currency structure into a true dimension, you could make the default member the dollar, or set the default member with some dynamic MDX in the default member expression driven by the locale setting.


  4. AD, it does not really make sense to me. I would not mind discussing alternative methods of building this, but I’ll need to get a better picture of what you are talking about. Please let me know if you have a concrete sample of what you have in mind and I’d love to see how it works.


  5. Boyan,
    Thanks for your well-thought-out approach to this often confusing area. I have implemented the one-to-many measure expression approach a couple of times, once replacing a calculation script approach for a client for whom it wasn’t performing at all reasonably.
    I like the idea of persisting the measures in both original and base currencies which would be an option for a client that previously only required a set of reporting currencies, but now needs to report by original currency.
    One area you haven’t mentioned is the approach used for conversion to base currency. I have found the best way to achieve this is to load your fact table with the original currency FK and original currency amount. Have a one-to-many currency rates fact table with FromCurrencyKey, ToCurrencyKey, DateKey and Rate. It is preferable to have the “one” currency be your base currency, and then have a view on top of this table that triangulates it to provide a many-to-many when necessary without physically storing all the rates. Then simply join to this view during your cube process. This approach allows you to store the minimum amount of data and remain flexible.
    I think there would be some value in articulating this with more detail, so will put a post together and reference this post.
    One additional complication is a situation where you need measures to be converted at different dates, i.e. BilledAmount converted at either BillDate, WorkDate or DeliveryDate. In the situations I’ve faced there has only been one date per measure group, but I imagine this requirement could easily arise. Possibly a combination of this and your previous post about loading multiple facts with an “event” dimension could work.


  6. Thank you, Boyan,
    It is a great post.
    I am currently working on a cube with somewhat complex currency conversion requirements. Transactions are at the international profit center level; for a specific day and a specific currency, different centers use different exchange rates. Furthermore, one transaction record has three different business activities, and each activity has its own rate. Do you have any suggestions on how to structure the currency conversion in an efficient way?
    Here is the transaction record’s schema:
    TransactionId, ProfitCenterId, TradeAmount, TradeCurrencyId, LoanAmount, LoanCurrencyId, BorrowAmount, BorrowCurrencyId.
    The exchange rate fact schema is listed below:
    DateId, ProfitCenterId, CurrencyId, ExchangeRate.
    The Currency dimension has the following schema:
    CurrencyId, CurrencyISOCode, CurrencyName.

    Thank you.
    Joe

