DoubleCheck Software presents GRC Implementation Success, a guest blog series by Blue Hill Research Principal Analyst David Houlihan. This series draws on five years of Blue Hill studies in GRC in order to highlight key lessons for purchasing and implementing GRC software.
Part 4 of this series examines the importance of asking "how" in addition to "what" in the vendor evaluation process.
One of the issues that I encounter most in studying GRC purchases and implementations is the problem of the Presentation Gap: the difference between what software may be shown to do and what is required to accomplish those activities in the real world. It is one thing for a vendor to have an effective package of RFP answers and demos; it is another for a buyer to uncover the mechanics that lie behind the layers of polish. This is not to suggest that vendors necessarily misrepresent themselves.
To illustrate, consider a photo of the Apollo 11 launch (below). It is a small task for me to find the image and copy it into this post to demonstrate the sort of grand achievement that is obtainable by human effort. However, it does very little to make you appreciate, with any tangible weight, the learning, labor, cost, and complexity involved in getting a rocket off the ground and landing it on the moon. In other words, it shows you that a desired outcome may be achieved, but not how it may be achieved.
Figure 1: A Photo of the Launch of the Saturn V Rocket Carrying the Apollo 11 Mission
Courtesy of NASA. Not Pictured: Research, Development, Testing, Training, Complexity of Operation
The same problem occurs in a software purchase cycle, where a vendor often helps us to understand what might be obtained with its software (and where we should expect limitations), but gives us very little appreciation for how to achieve the end result. However, as we observed in Part 1, the implementation process is a crucial contributor to the organization's overall success with GRC.
How, then, do we approach vendor assessments in a way that helps us understand how to make that journey and deploy the vendor's software in a way that will meet our organization's needs?
Blue Hill's Contributors to GRC Implementation Success: Avoiding the Worst-Case Scenario benchmark report identified two important steps:
- Focus on process change required over functionality desired
- Involve IT early to assess technical limitations and potential pitfalls
To give this a bit more context, we'll return to the example of professional services and technology firm KBR, Inc. In Part 3, we discussed how KBR used a comprehensive business needs analysis to define a robust set of technical requirements for a SOX controls management platform. (Read the full case study here.) In defining this list, KBR looked beyond solution functionality to non-functional factors that would help it understand how it could most effectively deploy the solution. This list was then the starting point for all engagements with the vendors that KBR considered.
A “Show Me” Approach
The primary difference between KBR’s evaluation process and most of those that Blue Hill encounters was KBR’s focus on having vendors demonstrate their ability to satisfy its requirements.
As the project lead on KBR's effort, Patricia Pavlick, explains: "We required the vendors we evaluated to provide specific responses regarding our business requirements, not only whether or not the system met them, but how. When you get an RFP response from a vendor, most give you 'yes' or 'no' for an answer. The truth is a vendor can often answer 'yes' honestly when the way that it is done makes you want to say 'no'."
This focus appeared both in the RFP process and in vendor demonstrations.
The RFP Process
KBR approached the RFP through its standard organizational process, giving vendors approximately six weeks to submit responses to questions addressing:
- Application cost and total cost of ownership
- Financial viability of the vendor
- Support offerings and processes
- Technical platform and maturity
- Customer retention and reviews
- Length of vendor experience in SOX support
Notably, the organization's Financial Controls Group (the GRC consumers) played the primary role in developing these questions and in reviewing vendors' RFP submissions. Its IT organization identified key questions to include with respect to the technical characteristics to be assessed, so that the evaluation would help the organization deploy the software effectively.
Vendor Demonstrations
KBR structured its vendor demonstration process around asking each vendor to demonstrate how it would structure and configure its application to meet the organization's requirements. KBR provided a detailed "demo script" that each vendor was to follow as closely as possible. To create a formalized and consistent evaluation methodology, KBR also developed an evaluation template and ranking system for its demo participants to use.
KBR’s evaluation factors and priorities at this stage closely mirrored those of the business case and RFP evaluation processes, albeit at a more granular level of detail. Primary factors for review included:
- Ease of use
- Implementation process
- First-year cost
- Total cost of ownership
This process favored vendors that could demonstrate, in person, how KBR's requirements could be implemented and configured in the product. In this analysis, KBR gave significant consideration to how the system would handle changes to controls, business processes, and organizational structure, and to the projected impact of those changes on a closed assessment cycle.
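To make the idea of an evaluation template and ranking system concrete, here is a minimal sketch of how such a scorecard might be tallied. The criteria names mirror the four primary factors listed above; the weights, the 1–5 scale, and the reviewer scores are hypothetical assumptions for illustration, not KBR's actual template.

```python
# Hypothetical demo-evaluation scorecard: each reviewer rates every
# vendor on the same weighted criteria (weights sum to 1.0), so
# rankings are consistent and comparable across reviewers.
CRITERIA_WEIGHTS = {
    "ease_of_use": 0.30,
    "implementation_process": 0.30,
    "first_year_cost": 0.20,
    "total_cost_of_ownership": 0.20,
}

def weighted_score(scores):
    """Combine one reviewer's per-criterion scores (1-5) into a single value."""
    missing = set(CRITERIA_WEIGHTS) - set(scores)
    if missing:
        # Force reviewers to score every criterion before ranking.
        raise ValueError(f"unscored criteria: {sorted(missing)}")
    return sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)

# Two reviewers' (invented) scorecards for a single vendor demo.
reviewer_scores = [
    {"ease_of_use": 4, "implementation_process": 3,
     "first_year_cost": 5, "total_cost_of_ownership": 4},
    {"ease_of_use": 5, "implementation_process": 4,
     "first_year_cost": 4, "total_cost_of_ownership": 3},
]
# The vendor's overall ranking value is the mean of reviewer scores.
vendor_score = sum(weighted_score(s) for s in reviewer_scores) / len(reviewer_scores)
```

The point of the fixed weights and shared criteria is the same as KBR's template: every demo is judged on the same terms, so differences in scores reflect the vendors, not the reviewers.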
What Did KBR Do Right?
KBR's process enabled it not only to effectively assess the value of the software, but also to understand what would be required to implement and manage the solution. Uncritical acceptance of vendor answers, and a lack of attention to the concrete details of how a solution delivers its functionality, result in scope creep, compromises, and workarounds that retrofit processes to the limitations of a tool. All of these outcomes lengthen deployment and erode value, sometimes by dramatic margins.
For KBR, this focus on how vendors responded to requests to demonstrate how requirements could be implemented provided a preview of what to expect both in the implementation and in the potential working relationship. Vendors may push back on this process. However, even their willingness to engage is, in itself, an indication of how easily the tool can be used to deliver on an organization's needs.
It bears repeating: the ability to take this sort of "show me" approach requires that the organization have a very clear understanding of what it is trying to accomplish and how it will impact its business.
KBR had 75 functional and non-functional technical requirements that were derived from its business and process analysis. Without so clear a view, it would not have been able to apply this level of scrutiny. Nor would a vendor have had sufficient information to provide meaningful responses.
Next, we look at: tailoring a solution to your organization's needs and the importance of configuration over customization.
Before, we discussed: Why implementation success is investment success