In Part I of the SSAS Tabular vs. SSAS Multidimensional – Which One Do I Choose? series, we began to set the stage by introducing five high-level considerations for answering this question. In Part II, we explored the Scalability and Performance considerations for the two technologies. In Part III, we explored the characteristics of the data and the model needed to deliver the required analytics and reporting. In this installment, we will cover how to extend the capabilities and characteristics of your data model through the use of business logic.
Business logic can add tremendous value to any data model in the form of calculations and business rules that enhance data for end-user analysis.
Both tabular and multidimensional modeling offer rich formula languages to implement business logic. Multidimensional modeling leverages MDX and tabular modeling leverages DAX. Before delving into some of the advanced business logic scenarios of each paradigm, it is important to establish a baseline understanding of how business logic can be applied using row-level transformations, aggregated values, and calculations in multidimensional and tabular modeling.
Row-Level Transformations
You may need to perform calculations and data transformations that are not readily available in your source data. For example, your source data may have a Sales Amount column and a Daily Exchange Rate column, but be missing sales converted to the foreign currency, or your source data may have Employee First Name and Employee Last Name but be missing a concatenated Employee Full Name. Note that in these examples the calculation or data manipulation must occur on row-level, un-aggregated data.
In multidimensional modeling, row-level transformations on un-aggregated data must be performed either before the data is loaded into the model or when the model is queried. You can transform dimension attributes, like employee names, either by applying the transformation in the source system or by writing a SQL expression that is applied when Analysis Services queries the source database. Row-level transformations of numeric data can be performed using a SQL expression before the data is loaded into Analysis Services, or they can be applied using an MDX expression within a Scope statement so that the calculation occurs at the row level. If the transformation is applied before the data is loaded, Analysis Services can pre-aggregate the numeric values. If the transformation is applied using a Scope statement, aggregation occurs at query time.
In tabular modeling, row-level transformations are created using calculated columns. When you create a calculated column, you add the column to a specific table in your model and you use DAX formulas to define the column’s values. The formula is then evaluated for every record in that table and is loaded into memory just like any other column in the model. This flexibility allows you to enhance your data directly in the tabular model based on your specific analysis requirements and lessens the need to perform tweaks to upstream data sources that may or may not be able to accommodate your changes in a timely manner. Calculated columns provide a very convenient way to create and persist calculations that must be performed at a detailed level in your data before being aggregated. While this flexibility is powerful, note that calculated columns are not intended to perform the heavy data cleansing or data transformations that you would find in Extract, Transform, and Load (ETL) processes.
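For example, the employee name and currency conversion scenarios described earlier could be handled with calculated columns along these lines (the table and column names are illustrative, not from an actual model):

```dax
-- Calculated column on the Employee table: concatenated full name
Full Name = Employee[First Name] & " " & Employee[Last Name]

-- Calculated column on the Sales table: row-level currency conversion,
-- evaluated for every row before any aggregation takes place
Sales Amount Local = Sales[Sales Amount] * Sales[Daily Exchange Rate]
```

Because both columns are computed row by row and stored in memory with the rest of the table, they can be sliced and aggregated like any source column.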
Aggregated Values

In multidimensional modeling, you use measures to create aggregated values. The Analysis Services OLAP engine pre-aggregates a cube’s measures using aggregate functions such as SUM, COUNT, MIN, MAX, and DISTINCT COUNT. During cube processing, each measure is aggregated from bottom to top across all hierarchies. Because this processing happens prior to end-user analysis, the pre-aggregated measures can provide tremendous benefits to query performance.
When you create a measure in your cube, there is a one-to-one relationship between a cube measure and a numeric column in your source data. As such, in multidimensional modeling, measures are useful when you need to perform bottom-up aggregation of numeric data elements that (1) exist in your source data at the lowest level of detail and (2) require a rollup that leverages one of the native cube aggregate functions.
In tabular modeling, you also use measures to create aggregated values. You create a measure by selecting a column and then specifying the aggregation function (SUM, COUNT, DISTINCT COUNT, MIN, MAX, or AVERAGE), or you can write a DAX expression that specifies the function you want to use to aggregate the measure. In tabular modeling, the row-level data is stored in memory and aggregates are calculated at query time. As explained in the next section, in tabular modeling measures can also be used to apply calculations, including calculations that are based on multiple aggregated columns.
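A couple of simple tabular measures might look like the following (the Sales table and its columns are illustrative):

```dax
-- Measure that aggregates a single column
Total Sales := SUM(Sales[Sales Amount])

-- Measure based on multiple aggregated columns; DIVIDE guards
-- against division by zero
Sales per Unit := DIVIDE(SUM(Sales[Sales Amount]), SUM(Sales[Order Quantity]))
```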
Calculations

In multidimensional modeling you use MDX to create calculations. MDX is both an expression and a query language with functions that natively understand the design of a cube’s dimensions, hierarchies, attribute relationships, and measures. This native understanding enables you to create succinct and powerful expressions that apply business logic across multiple data contexts. You create and store MDX calculations in the cube’s calculation script, where you can control the order in which the logic is applied.
Calculated members are the most common MDX calculations. Calculated members are evaluated at query time after the data is pre-aggregated. Calculated members can be created in any dimension. When they are created in the measures dimension they are often referred to as calculated measures. Calculated members can be fairly simple with basic arithmetic operations such as sales per unit (sales / unit) or spend per person (spend / headcount). They can also be more complex when you need to apply specific business rules such as Rolling 3 Period Average Sales or YTD Margin. For example, if you want to calculate sales for the current time period as a percent of the parent time period you can use the following MDX calculation:
[Measures].[Sales Amount] / ([Date].[Calendar].CurrentMember.Parent, [Measures].[Sales Amount])
Creating a calculated member in a dimension other than the measures dimension adds a value to an attribute in the dimension. For example, if you had a dimension attribute that contained a list of colors, you might want to add the calculated member Primary Colors which would sum the values of the colors red, green, and blue. In tabular modeling, creating a measure is similar to creating a calculated member in the measures dimension in a multidimensional model. In tabular modeling you cannot add a value to a column in a table, so tabular modeling does not support the equivalent of creating a calculated member in a dimension other than the measures dimension in a multidimensional model.
Scope assignments are more advanced than calculated measures, but they are also more powerful. As mentioned in the “Row-Level Transformations” section earlier, you can use a Scope statement so that calculations are applied at the row level. However, you can also use a Scope statement to specify any range of cube cells where you want to apply a calculation. Scope assignments are compiled ahead of query time and enable Analysis Services to provide an optimized execution path when the calculation is queried. Given their strength, scope assignments can not only do the work of multiple calculated measures, they can do it more efficiently. For example, in a budgeting solution, suppose you want next year’s budget for the East region to be 90 percent of its current-year budget, the West region’s to be 75 percent, the South region’s to be 105 percent, and the North region’s to be the same as its current-year budget.
Rather than writing a single complex calculated measure with nested IF statements, or multiple calculated measures that handle each budget scenario individually, you can use scope assignments to apply these ratios at the Region level and then aggregate the totals. Scope assignments also handle the row-level example from earlier: to convert sales amounts into a foreign currency using daily exchange rates, you could use the following MDX expression:
This = [Measures].[Sales Amount] * [Measures].[Daily FX Rate];
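The budgeting scenario above could be sketched with a set of nested scope assignments along these lines (the measure, dimension, and member names are illustrative):

```mdx
SCOPE ([Measures].[Next Year Budget]);
    SCOPE ([Region].[Region].&[East]);
        THIS = [Measures].[Budget Amount] * 0.90;
    END SCOPE;
    SCOPE ([Region].[Region].&[West]);
        THIS = [Measures].[Budget Amount] * 0.75;
    END SCOPE;
    SCOPE ([Region].[Region].&[South]);
        THIS = [Measures].[Budget Amount] * 1.05;
    END SCOPE;
    // North keeps the current year's budget unchanged
    SCOPE ([Region].[Region].&[North]);
        THIS = [Measures].[Budget Amount];
    END SCOPE;
END SCOPE;
```

Because the assignments are made at the Region level, the adjusted values roll up naturally to the all-region total.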
In tabular modeling you use DAX to create calculations. As mentioned previously, in tabular modeling you apply row-level calculations by creating calculated columns. You can also apply calculations when you create a measure by writing a DAX expression. Because you explicitly use a combination of DAX row-level and aggregation functions, measures in tabular models are very flexible. You can apply row-level functions and then apply an aggregation function so that your measure applies calculations before aggregation, or you can apply aggregation functions first and then apply row-level functions so that your measure applies calculations after aggregation.
DAX can dynamically evaluate a formula in different data contexts (not just the current view of an Excel worksheet or a PivotTable) using a special set of functions called FILTER functions. In the broadest sense, these functions have a similar purpose to Analysis Services scope assignments in that they enable you to define and perform a calculation over a specific set of rows. For example, you can use FILTER functions to handle the budgeting example described previously.
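As a sketch, the same budget logic could be written as a single DAX measure that iterates the regions and applies the appropriate ratio (the table, column, and measure names are illustrative, and [Budget Amount] is assumed to be an existing measure):

```dax
Next Year Budget :=
SUMX(
    VALUES(Region[Region]),
    [Budget Amount]
        * SWITCH(
            Region[Region],
            "East", 0.90,
            "West", 0.75,
            "South", 1.05,
            1.00  -- North, and any other region, keeps its current budget
        )
)
```

Because the ratio is applied per region inside SUMX, the measure still rolls up correctly to an all-region total.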
Business Logic Scenarios
Hierarchies

Hierarchies are a way to make it easier to work with common attributes that are usually linked or nested together. You can build what are referred to as natural hierarchies, like Year > Quarter > Month, or business-defined hierarchies, like Product Category > Product SubCategory > Product.
Hierarchies make the lives of your end users easier by allowing them to navigate large data sets by rolling up or drilling down. End users can still simulate hierarchies that are not defined in the data model by nesting columns in a PivotTable however they want, but defining the hierarchies in the model makes this easier.
Part III of this series compared and contrasted the various hierarchical structures to facilitate drilldown analysis.
Both MDX and DAX provide functions to work with data that is organized into a hierarchy and create calculations like percent of parent; however, the MDX functions tend to be more straightforward and easy to use.
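To illustrate the difference, the percent-of-parent calculation shown earlier as a single hierarchy-aware MDX expression requires a more level-specific formulation in DAX. A rough sketch for the month level of a Year > Quarter > Month hierarchy (table and column names are illustrative):

```dax
-- Percent of the parent quarter for the current month: removing the
-- month filter while keeping year and quarter yields the quarter total
Pct of Quarter :=
DIVIDE(
    SUM(Sales[Sales Amount]),
    CALCULATE(SUM(Sales[Sales Amount]), ALL('Date'[Month]))
)
```

Unlike the MDX version, this formula only works at the month level; each level of the hierarchy needs its own variant, which is one reason the MDX hierarchy functions tend to feel more straightforward.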
Although uniform data summarization is applicable in many scenarios, there are also situations where you want finer-grained control over how your data rolls up. For example, I worked on a project where the requirement was to convert a Hyperion-based general ledger (GL) cube’s account structure to an SSAS Multidimensional solution. Because specific rollup logic was required for various account types, we implemented a parent-child hierarchy, leveraging SSIS to dynamically create the dimension table entries and the native support for parent-child hierarchies to deliver the requested solution.
Semi-Additive Measures

Semi-additive measures are measures that do not uniformly aggregate across all dimensions. Most commonly this occurs when values cannot be aggregated consistently over a date dimension. The most common scenarios are those that deal with opening and closing balances.
Multidimensional models provide out-of-the-box support for semi-additive measures with special aggregate functions like FirstChild, LastChild, FirstNonEmpty, and LastNonEmpty. If these aggregate functions do not satisfy your specific logic requirements, you can also write custom MDX formulas.
Tabular models provide similar functions, such as CLOSINGBALANCEMONTH and OPENINGBALANCEMONTH, along with additional functions that apply to other date granularities like quarter and year.
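A minimal sketch of a semi-additive closing balance measure, assuming an Account table with a Balance column and a 'Date' table that has been marked as a date table (both names are illustrative):

```dax
-- Returns the balance on the last date of the month in the current
-- filter context instead of summing the daily balances
Month End Balance :=
CLOSINGBALANCEMONTH(SUM(Account[Balance]), 'Date'[Date])
```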
Time Intelligence

Almost every BI solution that you encounter will require time intelligence, such as calculating year-to-date summaries and performing prior-year comparisons. Both MDX and DAX provide time-series functions; however, each uses a slightly different data model design.
Multidimensional models provide out-of-the-box time intelligence through the Analysis Services Business Intelligence Wizard. Using this wizard, time calculations can be added to the design of the time dimension and applied to all measures in the model. However, there is significant overhead to using the wizard: it lets you pick and choose the measures and time-based calculations and creates the necessary MDX calculation script for each one, but if you later need to add a new measure to your time-based calculations you will have to either re-run the wizard or write the MDX yourself. There are other options for creating more dynamic and easier-to-manage time intelligence within the multidimensional model. For example, I have used the technique covered in “A Different Approach to Implementing Time Calculations in SSAS” to provide more dynamic and easier-to-manage time intelligence functionality. Although the wizard is one way to build time calculations, you can also write your own MDX calculations within the multidimensional model.
In tabular models there is no wizard to create time intelligence calculations, but you can create them manually with DAX formulas that leverage a variety of functions, including TOTALMTD, TOTALYTD, and SAMEPERIODLASTYEAR. The key thing to remember is that the table used for your date dimension must be marked as a date table for the DAX date functions to work properly.
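For example, year-to-date and prior-year measures might be sketched as follows (the Sales and 'Date' tables are illustrative):

```dax
Sales YTD := TOTALYTD(SUM(Sales[Sales Amount]), 'Date'[Date])

Sales Prior Year :=
CALCULATE(SUM(Sales[Sales Amount]), SAMEPERIODLASTYEAR('Date'[Date]))
```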
Key Performance Indicators

Key performance indicators (KPIs) identify special measures that allow for monitoring against a target value or metric within your organization, typically surfaced to users as a visual indicator such as an icon. Both multidimensional and tabular models provide support for KPIs. Both provide the ability to assign a target for a measure and to use the comparison of actual to target to assess the performance status of the measure. Multidimensional models provide the added ability to assess the KPI’s trend and assign a separate visual indicator to represent how the KPI performs over time.
Currency Conversion

Currency conversions require you to convert currency data from one or more source currencies into one or more reporting currencies.
In either modeling experience, the primary requirement is having data that provides currency conversion rates.
In multidimensional models, the Business Intelligence Wizard can be used to create the MDX script for the necessary calculations. In a tabular model, DAX is used to create the formulas for the currency conversion calculations.
Named Sets

In multidimensional modeling, named sets provide a way for you to return a set of dimension members that are commonly used in reporting applications, such as the top 10 products sold or the last 12 months. A named set is defined within the cube as central logic that can be leveraged and accessed by various reporting applications.
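A named set for the top-10-products example might be defined in the cube’s calculation script like this (the dimension and measure names are illustrative):

```mdx
CREATE SET CURRENTCUBE.[Top 10 Products Sold] AS
    TopCount(
        [Product].[Product].[Product].Members,
        10,
        [Measures].[Sales Amount]
    );
```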
In tabular models, named sets are not available.
Business Logic Summary
In this post we have explored the business logic aspects of the decision by comparing and contrasting how SSAS Tabular and SSAS Multidimensional extend the characteristics of the data model to deliver the required analytics and reporting through the business logic capabilities provided by each modeling experience.
From a business logic perspective, the edge goes to SSAS Multidimensional.
It is important to take the time to determine what business logic capabilities are necessary to deliver the analytics and reporting that your SSAS solutions need to provide.
ABOUT THE AUTHOR
I am an IT professional with over 20 years of progressive technical experience managing, consulting on, and developing software, data integration architecture, data warehousing, and business intelligence projects. My focus is working closely with stakeholders to develop architecture frameworks that align with organizational strategy, processes, and IT assets. I specialize in Business Intelligence (BI), data warehousing, database architecture, design and implementation, and performance tuning and troubleshooting.