Alan Whitehouse's Ramblings

Continuing to work until my heavy investment in lottery tickets finally pays off….


First PerformancePoint Services 2010 Project Recap

Posted by Alan on September 27, 2010

The post below was originally set to be used as a guest blogger article on the Microsoft PPS team blog. Unfortunately, due to schedules and then role/responsibility changes, it never got published. Now that I am getting back into doing the blog thing, and given that I am lazy by nature, I have decided to re-use it. Keep in mind that this was written almost 8 months ago, about a project that had finished up 2 months prior to that.

Project Background
We recently completed Phase 1 of a PerformancePoint 2010 deployment for one of the largest insurance companies in Canada. In 2009, TGO Consulting engaged with them to deploy Dynamics GP as their new ERP system. Towards the middle of November, we heard through the grapevine that the client was also looking for a business intelligence solution and had already narrowed the list down to two well-known players in the industry – neither of which was Microsoft.

We contacted the client and asked for a chance to meet to discuss what we might be able to offer. As it turns out, the omission of Microsoft was not due to a lack of confidence, but simply because the client was unclear about what Microsoft offered in this area. While they associated Microsoft with the back-end infrastructure of business intelligence (e.g., SQL, SSAS, etc.), they were fuzzy on the current state of the front-end applications Microsoft offered. In that meeting they presented a mock-up, done in Excel, of what they wanted. It was a fairly robust dashboard that pulled together information from their ERP system as well as their various insurance-industry-specific systems. To give you some idea of the complexity, if you printed it out, it could be made to fit on a single 8 ½ x 14 piece of paper, but you would have to shrink everything down to a 4-point font to do it! Then they dropped the bombshell on us: if we wanted the project, we had to guarantee that we could have them up and running by the end of the year, a mandate that had come straight from the executive team.

As we talked further, we discovered that they had already invested in Microsoft Office SharePoint Server 2007 for their corporate intranet. This meant they already owned a business intelligence solution – they just didn't know it. We brought them up to speed on the changes that had happened at the start of the year and how PerformancePoint was now a part of SharePoint. Once they realized they already owned a solution that fit with their current infrastructure and could be deployed by their deadline, they gave us the green light.

As we discussed the project scope in more depth and reviewed all the variables and issues, it began to make more and more sense on both our parts to deploy on SharePoint 2010 utilizing PerformancePoint Services rather than use MOSS 2007 and PerformancePoint Server 2007. Luckily the client's Chief Information Technology Officer is fairly progressive, and once we came to an understanding of how a project using beta code would work, we got rolling.

Software Installation
We started the project in earnest towards the end of November. The initial installation did not go as smoothly as we would have liked. The issues we faced were not caused by problems with the product, but by the fact that the whole back-end setup of SharePoint and PerformancePoint has changed. The Shared Services Provider in SharePoint and the separate installation of PerformancePoint Server are no more, both having been replaced by the concept of Service Applications. Thankfully we found the "Soup to Nuts" blog post by the PerformancePoint Services Team, and after uninstalling, reinstalling, and actually following the instructions, everything worked right off the bat. Who would have thought that following directions might get you somewhere? As a side note, I highly recommend that everyone who will be working with PerformancePoint Services bookmark that page. In addition, if you have not traditionally worked with SharePoint, you will need to learn more about it if you want to be successful with PerformancePoint Services.

Build, Deploy and New Features
All things considered the deployment of the solution went rather smoothly. We did have some hurdles to overcome (see below), but the new features of 2010 saved us a few headaches as many of the client’s requirements would have been problematic in the 2007 world. Here are some of the aspects of 2010 that stick out in my mind about this project:

  • Dashboard Designer
    If you worked with PerformancePoint Server 2007 then you will be able to jump right in and get rolling, as the design environment is almost identical to 2007, only with more features and functionality. But now we have the added benefit that objects (i.e., KPIs, Scorecards, etc.) are stored in SharePoint, so we can take advantage of things like versioning and security. The only thing I wish they had done is set up the system so that when you save your workspace it is automatically saved in SharePoint as well.
  • Multiple Actuals/Budgets/Targets
    This feature is huge. For our client, we had requirements where we had to display actuals for the current time period, actuals for prior periods, and actuals over a different time horizon (i.e., full year last year vs. current month this year). We then had to include different budget and forecast numbers and pair different budgets or forecasts against different actuals to determine the appropriate target and threshold. Even if we could have accomplished it in 2007, it would have been ugly. In 2010 it was a breeze.
  • Calculated Fields
    Every client is going to want to show a variance on their KPIs and Scorecards, and some will want to show a percentage change, some sort of standard deviation, or something more. In 2010 you can now create user-defined calculations without having to touch your underlying data source. Again, it is a little thing, but it had a big impact on the quality of the final product we were able to deliver to the client. (There is a rough sketch of this kind of variance and threshold logic just after this list.)
  • Reusable Filters
    We probably have 10 or 12 different Filters (i.e., department, geographic region, time, etc.) that come into play in the client’s solution. These Filters are used in various combinations over 8 or so different Dashboard pages. In 2007 we would have had to create these Filters anew with each and every Dashboard we created. In 2010 we can create them once and reuse them multiple times. Talk about a simple change that is a huge time saver.
  • Publishing Dashboards and Master Pages
    In PerformancePoint Server 2007, I was not a big fan of the look and feel of the Dashboards created by default when you published them from within Dashboard Designer: everything had to use the same format, and if you changed a Dashboard in SharePoint, all of your modifications were lost the next time you republished. Because of this we tended to do most of our layout design in the SharePoint environment by creating blank web part pages, pulling in PerformancePoint web parts, and manually configuring them. In 2010, however, we can reference which Master Page to use when deploying a Dashboard. Now we are able to create our own Master Pages which include pre-configured traditional SharePoint web parts. This new feature allows users of Dashboard Designer to publish more robust and sophisticated dashboards without really having to know SharePoint.
  • Decomposition Tree
    Hands down, my favorite new feature is the Silverlight-based Decomposition Tree. A couple of our client's team members had experience with ProClarity in prior lives, and the fact that they could have this function back made them really excited. The implementation in 2010 is excellent, and the fact that you can pick and choose which of your actual or budget values to tie it to is really great. I would hope that the PerformancePoint Services Team has it on their radar to do this same type of implementation for Heat/Tree Maps (it is a sore spot for me that we lost these) and Bing Maps in the next release.
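
To make the Multiple Actuals/Budgets/Targets and Calculated Fields items above a little more concrete, here is a rough, purely illustrative sketch (written in Python, with made-up numbers and hypothetical names) of the kind of variance, percent-change, and threshold logic a scorecard like this ends up encoding. None of this is hand-written code in the real solution; PerformancePoint lets you configure the equivalent declaratively in Dashboard Designer as KPIs with multiple actual/target pairs, each with its own calculation and thresholds.

    # Illustrative sketch only -- hypothetical names and numbers, not part of
    # the actual PerformancePoint solution described in this post.
    from dataclasses import dataclass

    @dataclass
    class KpiValue:
        actual: float   # e.g. current-month actual from the cube
        target: float   # the budget or forecast paired with this actual

        @property
        def variance(self) -> float:
            # "Calculated field": absolute variance against the paired target
            return self.actual - self.target

        @property
        def pct_change(self) -> float:
            # "Calculated field": percentage variance against the paired target
            return (self.actual - self.target) / self.target if self.target else 0.0

        def status(self, warn: float = -0.05, bad: float = -0.10) -> str:
            # Threshold banding, similar in spirit to scorecard indicators
            if self.pct_change >= warn:
                return "on target"
            if self.pct_change >= bad:
                return "warning"
            return "off target"

    # One KPI row can pair several different actuals against several budgets/forecasts:
    pairings = {
        "Current month vs. budget": KpiValue(actual=1_150_000, target=1_200_000),
        "Full year last year vs. forecast": KpiValue(actual=13_400_000, target=13_000_000),
    }

    for name, kpi in pairings.items():
        print(f"{name}: variance={kpi.variance:,.0f}, "
              f"pct={kpi.pct_change:+.1%}, status={kpi.status()}")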

Project Hurdles
I would be lying if I said that we didn't experience issues here and there, and we did find a couple of "undocumented features", but none of them were show-stoppers. When all was said and done, we really had to overcome just two hurdles. The first dealt with how we had to handle the presentation of information that was being sourced from a relational table. In 2007, we had the Pivot Chart and Pivot Table report types, which we could use for presenting relational data in a PerformancePoint Dashboard. These OWC objects were not the prettiest, but they were quick and easy to work with and they got the job done. In 2010, our only choices for presenting relational data are Excel Services or SQL Reporting Services. The problem with Excel Services is that while Excel can connect to a relational data source and present the data in a table format (as was our requirement) with automatic refresh, that same spreadsheet, once published to Excel Services, is not refreshable unless you open the underlying spreadsheet. This same limitation carries forward to Excel Services reports presented within PerformancePoint Services. So that left us with two options: either create the report in SRS or have the client update the data manually in Excel and bypass the relational tables completely. Due to budget and deadlines we chose the latter. Down the road we will either create SRS reports to present this data or write a custom UDF so we can present it via Excel Services.

However, the biggest hurdle we had to deal with had nothing to do with PerformancePoint Services at all. In order to meet the budget and deadlines, it was jointly agreed that we would try to use the out-of-the-box cubes that the customer already had in place. Unfortunately, while these cubes are solid and have great data in them, their design is not a perfect match for the type of analysis the client needs. Because of this we had to put extra time into the project to deal with modifications, additions, and jury-rigging the cubes to give the client what they wanted. It also means that some of their Drill Up/Drill Down and Decomposition capabilities have been limited. It is a good lesson to keep in mind: no matter how good your presentation tool is, if your back-end data is in a bad format then there is only so much you can do. In our next phase we will go back and write a custom cube from scratch that exactly matches the client's analysis requirements.

PerformancePoint Services vs. Excel/Excel Services vs. Reporting Services
One of the questions I have received during this endeavor is why we chose to use PerformancePoint Services as the base of this project rather than just building everything in Excel, Excel Services, or SQL Reporting Services. To be honest, when it is all said and done, we probably could have completed the project using any one of those solutions. However, I really believe that by using PerformancePoint Services as the base, we were able to deploy a more robust solution in a shorter amount of time. You can create a KPI, a Scorecard, or a Dashboard in any of the products listed above, but only PerformancePoint Services was designed from the ground up to accomplish those types of tasks. Keep in mind, though, that creating business intelligence solutions is not an all-or-nothing proposition. All of the tools listed above have relative strengths and weaknesses, and they all play wonderfully well with each other. It is great to have options when developing your solution, and the options in the Microsoft stack really let us pick the right tool for the job at hand.

Client’s Response and Next Steps
I am happy to report that the client's project team and executives really like what we have done in Phase 1 of the project. In fact, they like it so much that they have agreed to be a case study for Microsoft. And now that we have proven that our concepts are sound, the solution works, and the client sees the value they are getting out of it, we will be expanding the footprint. We will be going back and creating some custom cubes for the ERP data that are more in line with what they need to see, and we will most likely be creating cubes for the data found in their various insurance-related systems. We are also looking at implementing some data visualization utilizing Visio Services and Bing Maps, as well as rolling out PowerPivot for some of the advanced users who want to be able to do some deep, deep dives into the numbers and really play with them on an ad-hoc basis.

All in all, everyone considers this project a success, and before I sign off I would like to take a moment to recognize a couple of people from our team: Richard Mintz and Jerry Bierema. This project would not have been a success without them, as they are the ones who do the real work around here; I am just the pretty face.


4 Responses to “First PerformancePoint Services 2010 Project Recap”

  1. bhavikmerchant said

    Hi Alan,

    Thanks for a very informative post on your first real-world experience with PPS 2010!

    Regarding your comment on PPS 2007 – “The problem with Excel Services is that while Excel can connect to a relational data source and present data in a table format (as was our requirement) with automatic refresh”.

    I didn't find this to be an issue with Excel Services 2007. We set up the Connection Properties in the file and turned on the "Refresh data when opening file" option, and this works. The setting is honoured when using ODC files deployed to SharePoint.

    I've implemented this successfully at 2 customer sites. Believe me, they would have complained about the usability issue of having to open up the Excel file from the dashboard in order to refresh, and we would have looked for some alternative.

    Is this setting something you haven't come across in the Excel data connection properties, perhaps?

  2. Prabjot said

    Hi Alan,

    Thanks for sharing your experience with PPS 2010. I am also working with PPS 2010. I have a requirement from a client where I need to implement a forecasting chart in PPS 2010. As per my research, the forecasting and trending chart type was available in PPS 2007 but was dropped in 2010. I can create trends in an analytical chart, but I am really not able to understand how I can implement forecasting in PPS 2010. My client has a very complex forecasting algorithm which they want to implement in PPS 2010. This is my first project in PPS 2010 and my performance really matters a lot.

    Please help me if you have any idea how I can achieve the above requirement.

    Thanks,
    Prabjot

  3. sanjay shah said

    Hi Alan,

    Interesting post. That was in Sept 2010. Have there been further developments in PPS 2010 with your client?
    All the best for winning the lottery soon 😉

    Sanjay

