First PerformancePoint Services 2010 Project Recap
Posted by Alan on September 27, 2010
The post below was originally set to be used as a guest blogger article on the Microsoft PPS team blog. Unfortunately due to schedules and then role/responsibility changes it never got published. Now that I am getting back into doing the blog thing and given that I am lazy by nature, I have decided to re-use it. Keep in mind that this was written almost 8 months ago about a project that had finished up 2 months prior to that.
We recently completed Phase 1 of a PerformancePoint 2010 deployment for one of the largest insurance companies in Canada. In 2009, TGO Consulting engaged with them to deploy Dynamics GP as their new ERP system. Towards the middle of November, we heard through the grapevine that the client was also looking for a business intelligence solution and had already narrowed the list down to two well-known players in the industry – neither of which was Microsoft.
We contacted the client and asked for a chance to meet to discuss what we might be able to offer. As it turns out, the omission of Microsoft was not due to a lack of confidence, but due to the simple fact that the client was unclear about what Microsoft offered in this area. While they associated Microsoft with the back-end infrastructure of business intelligence (e.g., SQL, SSAS, etc.), they were fuzzy on the current state of the front-end applications Microsoft offered. In that meeting they presented a mock-up, done in Excel, of what they wanted. It was a fairly robust dashboard that pulled together information from their ERP system as well as their various insurance-industry-specific systems. To give you some idea of the complexity: if you printed it out, it could be made to fit on a single 8 ½ x 14 piece of paper, but you would have to shrink everything down to 4-point font to do it! Then they dropped the bombshell on us – a mandate had come down from the executive team that a solution had to be deployed by the end of the year, so if we wanted the project we had to guarantee we could have them up and running by then.
As we talked further, we discovered that they had already invested in Microsoft Office SharePoint Server 2007 for their corporate intranet. This meant they already owned a business intelligence solution – they just didn’t know it. We brought them up to speed on the changes that had happened at the start of the year and how PerformancePoint was now a part of SharePoint. Once they realized they already owned a solution that fit with their current infrastructure and could be deployed by their deadline they gave us the green light.
As we discussed the project scope more in-depth and reviewed all the variables and issues, it began to make more and more sense on both our parts to deploy on SharePoint 2010 utilizing PerformancePoint Services rather than use MOSS 2007 and PerformancePoint Server 2007. Luckily the client’s Chief Information Technology Officer is fairly progressive, and once we came to an understanding of how a project using beta code would work, we got rolling.
We started the project in earnest towards the end of November. The initial installation did not go as smoothly as we would have liked. The issues we faced were not caused by problems with the product, but by the fact that the whole backend setup of SharePoint and PerformancePoint has changed. The concept of the Shared Services Provider in SharePoint and the separate installation of PerformancePoint Server are no more, both having been replaced by the concept of Service Applications. Thankfully we found the “Soup to Nuts” blog post by the PerformancePoint Services Team, and after uninstalling, reinstalling, and actually following the instructions, everything worked right off the bat. Who would have thought that following directions might get you somewhere? As a side note, I highly recommend that everyone who will be working with PerformancePoint Services bookmark that page. In addition, if you have not traditionally worked with SharePoint, you will need to learn more about it if you want to be successful with PerformancePoint Services.
Build, Deploy and New Features
All things considered the deployment of the solution went rather smoothly. We did have some hurdles to overcome (see below), but the new features of 2010 saved us a few headaches as many of the client’s requirements would have been problematic in the 2007 world. Here are some of the aspects of 2010 that stick out in my mind about this project:
- Dashboard Designer
If you worked with PerformancePoint Server 2007 then you will be able to jump right in and get rolling, as the design environment is almost identical to 2007’s, only with more features and functionality. We also get the added benefit that objects (i.e., KPIs, Scorecards, etc.) are now stored in SharePoint, so we can take advantage of things like versioning and security. The only thing I wish they had done is set up the system such that when you save your workspace it is automatically saved in SharePoint as well.
- Multiple Actuals/Budgets/Targets
This feature is huge. For our client, we had requirements where we had to display actuals for the current time period, actuals for prior periods and then actuals over a different time horizon (i.e., full year last year vs. current month this year). We then had to include different budget and forecast numbers and do different combinations of which budget or forecast went against which actual to determine which target and threshold. Even if we could have accomplished it in 2007 it would have been ugly. In 2010 it was a breeze.
- Calculated Fields
Every client is going to want to show a variance on their KPIs and Scorecards, and some will want to show a percentage change, some sort of standard deviation, or even something more. In 2010 you can now create user-defined calculations without having to touch your underlying data source. Again, it is a little thing, but it had a big impact on the quality of the final product we were able to deliver to the client.
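PerformancePoint defines these calculations through its own calculated-metrics editor rather than in code, but the arithmetic behind the metrics mentioned above is simple. Here is a minimal Python sketch of the three, purely for illustration (the function names are my own, not anything from the product):

```python
# Illustrative only: these mirror the kinds of user-defined calculations
# you would build in the calculated-metrics editor, not a real PPS API.

def variance(actual: float, target: float) -> float:
    """Absolute variance between an actual and its target."""
    return actual - target

def percent_change(actual: float, target: float) -> float:
    """Change of actual relative to target, as a percentage."""
    return (actual - target) / target * 100

def sample_std_dev(values: list[float]) -> float:
    """Sample standard deviation of a series of periodic actuals."""
    n = len(values)
    mean = sum(values) / n
    return (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5

# Example: monthly revenue actual vs. budget
print(variance(110_000, 100_000))        # → 10000
print(percent_change(110_000, 100_000))  # → 10.0
```

The point of the 2010 feature is that these live in the presentation layer, so you get the metric without asking your cube or database team to add a calculated measure.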
- Reusable Filters
We probably have 10 or 12 different Filters (i.e., department, geographic region, time, etc.) that come into play in the client’s solution. These Filters are used in various combinations over 8 or so different Dashboard pages. In 2007 we would have had to create these Filters anew with each and every Dashboard we created. In 2010 we can create them once and reuse them multiple times. Talk about a simple change that is a huge time saver.
- Publishing Dashboards and Master Pages
In PerformancePoint Server 2007 I was not a big fan of the look and feel of the Dashboards created by default when you published them from within Dashboard Designer: everything had to use the same format, and if you changed them in SharePoint, all your modifications were lost when you republished. Because of this we tended to do most of our layout design in the SharePoint environment by creating blank web part pages, pulling in PerformancePoint web parts, and manually configuring them. However, in 2010 we can reference which Master Page to use when deploying a Dashboard. Now we are able to create our own Master Pages which include pre-configured traditional SharePoint web parts. This new feature allows users of Dashboard Designer to publish more robust and sophisticated dashboards without having to really know SharePoint.
- Decomposition Tree
Hands down my favorite new feature is the Silverlight based Decomposition Tree. A couple of our client’s team members had experience with ProClarity in prior lives and the fact that they could have this function back made them really excited. The implementation in 2010 is excellent and the fact that you can pick and choose which of your actual or budget values to tie it to is really great. I would hope that the PerformancePoint Services Team has it on their radar to do this same type of implementation for Heat/Tree Maps (sore spot for me that we lost these) and Bing Maps for the next release.
I would be lying if I said that we didn’t experience issues here and there, and we did find a couple of “undocumented features”, but none of them were show stoppers. When it was all said and done we really had to overcome just two hurdles. The first dealt with how we had to handle the presentation of information that was being sourced from a relational table. In 2007, we had the report types of Pivot Chart and Pivot Table which we could use for presenting relational data in a PerformancePoint Dashboard. These OWC objects were not the prettiest, but they were quick and easy to work with and they got the job done. In 2010, our only choices for presenting relational data are Excel Services or SQL Reporting Services. The problem with Excel Services is that while Excel can connect to a relational data source and present the data in a table format (as was our requirement) with automatic refresh, that same spreadsheet, when published to Excel Services, is non-refreshable unless you open the underlying spreadsheet. This same limitation carries forward to Excel Services reports presented within PerformancePoint Services. That left us with two options: either create the report in SRS, or have the client update the data manually in Excel and bypass the relational tables completely. Due to budget and deadlines we chose the latter. Down the road we will either create SRS reports to present this data or write a custom UDF so we can present it via Excel Services.
However, the biggest hurdle we had to deal with had nothing to do with PerformancePoint Services at all. In order to meet the budget and deadlines it was jointly agreed that we would try to use the out-of-the-box cubes that the customer already had in place. Unfortunately, while these cubes are solid and have great data in them, their design is not a perfect match for the type of analysis the client needs. Because of this we had to put extra time into the project to deal with modifications, additions, and jury-rigging the cubes to get them what they wanted. However, this also means that some of their Drill Up/Drill Down and Decomposition capabilities have been limited. It is a good lesson to keep in mind – no matter how good your presentation tool is, if your back-end data is in a bad format, there is only so much you can do. In our next phase we will go back and build a custom cube from scratch that exactly matches the client’s analysis requirements.
PerformancePoint Services vs. Excel/Excel Services vs. Reporting Services
One of the questions I have received during this endeavor is why we chose to use PerformancePoint Services as the base of this project rather than just building everything in Excel, Excel Services, or SQL Reporting Services. To be honest, when it is all said and done we probably could have completed the project using any one of those solutions. However, I really believe that by using PerformancePoint Services as the base, we were able to deploy a more robust solution in a shorter amount of time. You can create a KPI or a Scorecard or a Dashboard in any of the products listed above, but only PerformancePoint Services was designed from the ground up to accomplish those types of tasks. Keep in mind, though, that creating business intelligence solutions is not an all-or-nothing proposition. All the tools listed above have relative strengths and weaknesses, and they all play wonderfully well with each other. It is great to have options when trying to develop your solution, and the options in the Microsoft stack really let us pick the right tool for the job at hand.
Client’s Response and Next Steps
I am happy to report that the client’s project team and executives really like what we have done in Phase 1 of the project. In fact, they like it so much that they have agreed to be a case study for Microsoft. And now that we have proven that all our concepts are sound, the solution works, and the client sees the value they are getting out of it, we will be expanding the footprint. We will be going back and creating some custom cubes for the ERP data that are more in line with what they need to see, and most likely will be creating cubes for the data found in their various insurance-related systems. We are also looking at implementing some data visualization utilizing Visio Services and Bing Maps, as well as rolling out PowerPivot for some of the advanced users who want to be able to do some deep, deep dives into the numbers and really play with them on an ad-hoc basis.
All in all, everyone considers this project a success and before I sign off, I would like to take a moment to recognize a couple of people from our team – Richard Mintz and Jerry Bierema. This project would not have been a success without them as they are the ones who do the real work around here; and I am just the pretty face.