PowerPivot Books

Teo Lachev just published a review of Marco Russo and Alberto Ferrari's book PowerPivot for Excel 2010: Give Your Data Meaning. Instead of a full-blown review, I will add a visual one:

[Photo: three copies of the book]

Yes, I ordered three copies from Amazon. Ok, they are not all mine, but you can see what I think about this book…

Thoughts on BISM, SSAS and MDX

I missed PASS this year but I am addicted to all the blog posts coming from the attendees. I am reading, sending links to colleagues and mostly thinking about the future – with disappointment (rarely) and excitement (99% of the time). There were two very significant blog posts by Chris Webb and Teo Lachev in the last couple of days, and I would like to share my thoughts on the future of SSAS.

Chris and Teo are both moderately unhappy (that is the mood I got from what they wrote) with the future of SSAS because of the shift of focus from “traditional” MOLAP/HOLAP/ROLAP storage to the in-memory Vertipaq engine, as well as the switch from MDX to DAX. My initial feeling was similar – I like MDX and the way SSAS works. After all, SSAS is the best OLAP tool on the market right now, and it got there after a long evolution. As Donald Farmer mentioned at Tech Ed in Australia this year, SSAS is the best-selling OLAP tool for Oracle; it is also the best-selling OLAP tool in general. Leaving a tool like that and moving on to a completely new technology is a stunning move, in my opinion.

There are a few things I think are important to consider before declaring this the wrong move. First, in today’s affordable 64-bit world, there are few limits on in-memory OLAP. Also, MDX, while powerful, different, interesting and sometimes elegant, is HARD. I personally know only three or four developers in Australia who can write good MDX and grasp the concepts behind it, and the vast majority of feedback on MDX is that it is “too hard”. Therefore, the two main selling points of SSAS – ease of building analytics and performance – are hard to improve without radical action. The only way to make Microsoft OLAP easier to use is to move on from MDX, and the easiest way to improve performance is to go in-memory. This is why I am very optimistic about the future of SSAS and BISM. If a small competitor like QlikView can build a technologically good product, which can in some cases outperform SSAS, I am confident that the SSAS team, with its vast expertise in building the most successful OLAP tool in the world, can beat them within a few years. It is better to start sooner rather than later.
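To make the “too hard” point concrete, consider a same-period-last-year comparison. In MDX this typically means a calculated member juggling ParallelPeriod and the Time hierarchy; in DAX it is close to a one-liner. A minimal sketch, assuming hypothetical Sales[Amount] and Dates[Date] columns in a model with a proper date table:

    -- Sales for the same period in the previous year (DAX time intelligence).
    -- Table and column names here are illustrative, not from a real model.
    = CALCULATE( SUM( Sales[Amount] ), SAMEPERIODLASTYEAR( Dates[Date] ) )

Whether this is truly easier is a matter of taste, but it is certainly closer to the Excel formulas most analysts already know.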

On the flip side, the handling of the whole matter is puzzling and disorienting. Chris had a few really good points about the problems that may arise from such a swift transition. How do we sell traditional pre-Vertipaq SSAS when it is being replaced with a new technology, which will require at least a couple of releases to beat everyone else and become enterprise-ready? This is a question I would like answered: as an advocate of Microsoft BI, apart from the minor inconvenience of a weaker case in arguments with advocates of other BI suites/products, it will no doubt cause me a major headache when talking to prospective clients. This is, in fact, my only concern.

To recap – I am very happy about the ongoing evolution of SSAS, and I will be very happy when Vertipaq becomes the SSAS of in-memory BI, when we have an easier language to query it, and when the stack gets better integration between its components. However, the transition is too abrupt for my taste. The “security through obscurity” approach to marketing I am observing may not be able to convince enterprise customers with complex needs and strategic plans to adopt a tool in transition. I hope that Microsoft puts more effort into a seamless migration path from SSAS to Vertipaq, so that the #1 spot of Microsoft OLAP technologies is preserved through this drastic change.

Jumping straight to Amazon now to buy a book or two on DAX – a great opportunity for me to get a head start in DAX and PowerPivot before we get the technology in SSAS :) My previous concerns about PowerPivot adoption rates because of a few little things are now replaced with excitement over the fact that we will get this new technology on a full client-server architecture (if not fully featured in Denali, then soon after that).

UPDATE: Please note Amir Netz’s response to Chris’ post in the comments – it explains the intent behind BISM and paints a brighter future for MOLAP and MDX.

UPDATE #2: TK Anand has a blog post regarding this topic here.

UPDATE #3: Chris Webb has a follow-up with a more optimistic tone here.

PowerPivot DAX Measures in a Data Source

OK, my doubts aside, this is my first post about PowerPivot. Obviously the demand is high and the prospects bright – so here we go…

There are a number of posts about using PowerPivot as a data source in PerformancePoint and Reporting Services. You can find two links below (both at TechNet), and I will not repeat the content there:

[two TechNet links]

There you can find a detailed step-by-step approach to using PowerPivot workbooks published to SharePoint in your reports. One thing which is not explained there, though, is that you can also use any published DAX measures as report data sources. They appear as physical measures in the PowerPivot Analysis Services instance and can be used directly in reports. I found this fascinating, as now we can integrate even user-defined logic into our reports.
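As a rough sketch of how the connection works (the SharePoint URL below is hypothetical): in Report Builder you create a regular Analysis Services data source, but instead of a server name you point it at the published workbook:

    Data Source=http://sharepoint/PowerPivotGallery/SalesDemo.xlsx

The DAX measures defined in the workbook then show up in the query designer’s metadata pane alongside everything else in the model.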

For example, today I was working on a small demo of PowerPivot, and I created a PowerPivot report integrating a relational data source and an OLAP data source from two completely different systems, which I then related to each other by State/Province. After that, I created a DAX ratio measure showing the relationship between the Sales measures from the two databases. After publishing the workbook to SharePoint, I opened Report Builder 3.0 and, to my surprise (I had expected to see only physical measures for some reason), I was able to pull in the DAX measure I had just created and beautify the map of the USA based on its value.
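The ratio measure itself was nothing fancy. A minimal sketch, assuming hypothetical RelationalSales and OlapSales tables linked through a common State/Province table:

    -- Hypothetical ratio of the two Sales measures; all names are illustrative.
    -- BLANK() guards against division by zero where one source has no sales.
    = IF( SUM( OlapSales[SalesAmount] ) = 0,
          BLANK(),
          SUM( RelationalSales[SalesAmount] ) / SUM( OlapSales[SalesAmount] ) )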

The integration between the various components of the Microsoft BI stack is continually improving. With the latest versions of those components (labelled 2010 or R2), we are being empowered with richer and more powerful tools covering a larger array of users – proving that the toolset is the best out there for both enterprise-level and relatively smaller customers.