Make Your System Demo Count!

As a continuation of our blog series on system selection, it’s time to discuss some tips for running a successful product demonstration. Organizing and managing the entire process requires upfront preparation, and if you drive the process, your demo evaluations will be far more effective.

Demonstrations are one of the most critical components of the software selection process. Seeing a system in action can be a great learning experience. But not all demos are created equal. Let’s talk about how you can level the playing field. To make the most of everyone’s time, CC Pace recommends the following best practices for product evaluations.

Tip One – Keep your process manageable by evaluating no more than five systems. If you evaluate too many vendors, it becomes difficult to drill down deep enough into each offering. You will inevitably suffer from memory loss and start asking questions like, “Which system was it that had that cool fee functionality that would be really helpful?”

Tip Two – For each software vendor, set a well-thought-out date and time for the on-site demo. Depending on your team’s travel schedule, try to space the demos a few days apart so that you have time to prepare and properly analyze between sessions.

Tip Three – Logistics play a big role in how a system looks and functions, so do your part to help your vendors present well. Whenever possible, arrange for a high-quality projector or large HD screen for the attendees in the room, and provide a hard-wired internet connection rather than relying on wireless. There’s nothing worse than being told, “the screen issues are because of a resolution problem” or “it’s running slow because the air card only has one bar.” Providing these two items removes any doubt about external factors causing appearance and performance issues.

Tip Four – Involve the right people from your organization. It’s important to have executive sponsorship as well as hands-on managers involved to assess the software modules. This is also the best opportunity to get “buy-in” from all parts of your organization.

Tip Five – Be sure to head into these demonstrations knowing your key requirements. Visualize a day in the life of a loan and follow its natural progression from initial lead through funding. Jumping around causes confusion and is difficult for the vendor.

Build a list of requirements based on the bulk of your business. Asking to see how the software handles the most complicated scenarios can send the demo down needless paths. No one wants to watch a salesperson jump through a bunch of unnecessary hoops for a low-volume loan product.

If you highlight which functional capabilities are most important to your organization, the vendors can spend more time demonstrating those capabilities in their software. Communicate how you think their software can help. Be careful, though, not to justify why something is done a certain way today; focus instead on how it should be done in the future.

Tip Six – The easiest way to take control of the demo process is to draft demo scripts for your vendors. Start by identifying the ‘must-have’ processes that the software should automate. Don’t worry about seeing everything during this demo. Set the expectation that if the demo goes well, the vendor will likely be called back again for a deeper dive. Provide a brief description of each process and send it to the vendor participants so they can show how their software automates each process. The best vendor partners will have innovative ways to automate your processes, so give them a chance to show their approach.

As you watch the demos, keep track of how many screens are navigated to accomplish a specific task. The fewer clicks and screens, the better. Third-party integrations can significantly help with the data collection and approval process. Always have an open mind regarding different ways to accomplish tasks and don’t expect your new software to look or act just like your legacy system.

Simple scorecards should be completed immediately following each demonstration. This makes it easier to remember what you liked and disliked, and proves invaluable when you compare all the systems side-by-side once your demos are complete.
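To make this concrete, here is a minimal sketch of what such a scorecard could look like as a structured record. The vendor name, fields, and rating are hypothetical, and a paper form or spreadsheet works just as well.

```python
# A minimal post-demo scorecard, filled out while impressions are fresh.
# All field names and values below are illustrative.
from dataclasses import dataclass, field

@dataclass
class DemoScorecard:
    vendor: str
    likes: list = field(default_factory=list)
    dislikes: list = field(default_factory=list)
    overall: int = 0  # 1-5 gut rating recorded right after the demo

card = DemoScorecard(
    vendor="Vendor A",  # hypothetical vendor
    likes=["clean fee setup", "few screens per task"],
    dislikes=["clunky reporting module"],
    overall=4,
)
print(card)
```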

One final suggestion: always request copies of the presentations. Not only will these help you remember what each system offers, they will also be useful when the time comes to create presentations for senior management.

 


An important part of any system selection process is the vendor product demonstration. This is a pivotal time, when the dry responses to the RFP become something that can be seen, and the staff can begin to visualize themselves using the system in their daily work. Selection team members walk out of a demonstration with their preconceptions turned into expectations of what the product can or cannot do, and what benefits it may bring to the organization. These impressions stick with the audience; it is hard to move someone away from what they’ve seen or heard during a demonstration.

I’ve always considered the demonstration, along with the set-up and coordination activities around it, to be where I earn most of my fee for managing a selection process. It is important not to view it as a one-off meeting or standalone activity, but as an integral part of the overall selection process, one that uses information already collected and provides input to the next steps as well as the final decision.

Steps prior to the demonstration include defining and prioritizing the business requirements, creating a potential product list, developing and distributing an RFP, and assessing the vendor responses. That assessment should narrow the field to the 2-4 vendors that best meet your baseline requirements and are most worthy of being invited in for a demonstration.

Recommended activities surrounding the demonstration include:

Schedule: I try to group the demonstrations within a one- to two-week period, without significant gaps between sessions. This is rough on the individual calendars of those attending, but worth it to keep the purpose, critical requirements, and comparisons top of mind throughout.

Agenda: Using the most critical requirements identified previously, the agenda is set to walk through all key aspects of the functionality, with extra time on any area of particular concern to the selection committee. The agenda also lets users manage their time, so they are only present when the demo covers their functional areas, without being tied up for the full session. Importantly, a well-thought-out agenda ensures the vendor spends adequate time on all the aspects of the system the team is interested in, with little opportunity to gloss over areas of weakness.

Scorecard: Every attendee should complete a scorecard for the parts of the demo they participated in. The scorecards must be completed before the participant exits the room, as thoughts quickly get mixed between systems and other priorities pull attention and time away from completing the scorecard. The scorecard is never overly long, but serves to provide a quantitative view of the participant’s impression of specific functionality in the system, and to capture any comments or questions still pending at the end of the demo. To avoid skewing the quantitative results, participants should only score those sections in which they have expertise. Entries on the scorecard are aligned with the agenda for easy following, and are weighted by priority for quantitative comparison across products.
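As a rough illustration of that weighted comparison, the sketch below tallies one participant’s scorecard. The agenda items, priority weights, and 1-5 scores are all hypothetical.

```python
# Hypothetical weighted scorecard tally: each agenda item carries a
# priority weight, and a participant scores each item from 1 to 5.

weights = {        # priority weight per agenda item (illustrative)
    "lead capture": 3,
    "underwriting": 5,
    "fee processing": 4,
    "funding": 5,
}

scores = {         # one participant's 1-5 scores for one product
    "lead capture": 4,
    "underwriting": 3,
    "fee processing": 5,
    "funding": 4,
}

def weighted_score(scores, weights):
    """Return the weight-adjusted average score for one scorecard."""
    total = sum(scores[item] * weights[item] for item in weights)
    return total / sum(weights.values())

print(f"Weighted score: {weighted_score(scores, weights):.2f}")
```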

Attendees: I discourage selection team members from looking at the systems early in the process, before their requirements are known and prioritized, to avoid any preset leanings in one direction or another. The size of the group varies with the size of the organization, the breadth of functionality of the system being selected, and the amount of time devoted to the selection. The preference is to keep the participating audience at a manageable size and consistent across all systems being considered. All audience members should be prepped beforehand on how the meeting will run, the agenda, and the scorecard.

Facilitation: The facilitator role is an active one, ensuring the focus remains on the agenda and covers all the topics in the scorecard. Questions may be tabled, conversations cut short (particularly those that serve a small part of the audience present at that time), and information prompted out of the audience or the vendor. Another role is that of translator and interpreter. It always stuns me how we all say the same things in entirely different ways within and across financial sectors. It is important that the vendor’s presentation is translated into the audience’s terminology whenever possible for maximum appreciation of what is being presented. It is equally important to interpret what the vendor says into how the audience members think. The facilitator’s knowledge of the industry, the available products, implementation, maintenance, and so on is leveraged to steer the discussion so that the audience appreciates not only what they are seeing, but also what they will need to contribute for configuration and maintenance, and whether the system has the flexibility to meet their needs in different ways. This leads to a more mutually fulfilling discussion between the vendor and the audience, as everyone is on the same page.

Post-Meeting Roundtable: A facilitated session of key audience members should quickly follow each demonstration, to avoid confusion about which functionality belonged to which product or when a particular comment came up. A review of the scorecards should be completed prior to this session, so disparities can be addressed. This meeting is the opportunity to discuss the demo and the questions raised, and to establish a general consensus about where the product stands and that the functionality represents similar things to everyone. It is not unknown to find a score of 1 and a score of 5 (on a 1-5 scale) for the same functionality line item on the scorecards of two different participants. There is no expectation that everyone will score things the same, but scores should be in a similar ballpark. Large disparities like this indicate a misunderstanding by one or both team members, and those need to be put on the table for clarification as soon as possible, before perceptions are cemented and expectations set that cannot be met.
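A quick way to surface such disparities is to flag any line item where two participants’ scores are far apart before the roundtable. The participants, scores, and 2-point threshold in this sketch are all assumptions for illustration.

```python
# Hypothetical disparity check: flag agenda items where two participants'
# 1-5 scores differ by more than a chosen threshold.

participant_a = {"underwriting": 1, "fee processing": 4, "funding": 5}
participant_b = {"underwriting": 5, "fee processing": 3, "funding": 4}

THRESHOLD = 2  # anything wider than a 2-point spread gets discussed

for item in participant_a:
    spread = abs(participant_a[item] - participant_b[item])
    if spread > THRESHOLD:
        print(f"Discuss '{item}': scores of {participant_a[item]} and "
              f"{participant_b[item]} suggest a misunderstanding")
```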

I know of companies that failed to follow one or more of the steps above during their selection process, and the result was typically missed expectations and buyer’s remorse. Allowing the vendors free rein in their demonstrations causes confusion when comparing products, as the vendors may approach the discussion from totally disparate functional areas. Lack of a schedule requires a larger investment of time, as people with only a small area of functionality to observe sit in for much longer periods (or the meeting keeps stopping and starting while new people are called in and others leave). Most importantly, what someone hears versus what was intended may be completely different messages that are never caught prior to a final recommendation. That results in not getting what you thought you were getting.

While key to the overall selection effort, the demonstration is not the final task in the process. A quantitative comparison of RFP responses and demo results can be used to further reduce the short list of candidates before moving into an in-depth due diligence process. Targeted system demonstrations or question-and-answer sessions with the vendor may occur during this period to collect additional information or clarify any points.
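As an illustration of blending those two quantitative inputs, the sketch below ranks a hypothetical shortlist. The vendor names, scores, and the 60/40 weighting are assumptions for the example, not figures from the process described above.

```python
# Hypothetical shortlist ranking: blend each vendor's RFP score and
# demo score into a single figure for comparison.

vendors = {
    "Vendor A": {"rfp": 4.2, "demo": 3.8},
    "Vendor B": {"rfp": 3.9, "demo": 4.5},
    "Vendor C": {"rfp": 3.1, "demo": 3.0},
}

RFP_WEIGHT, DEMO_WEIGHT = 0.6, 0.4  # assumed relative importance

def combined(s):
    return s["rfp"] * RFP_WEIGHT + s["demo"] * DEMO_WEIGHT

for name, s in sorted(vendors.items(), key=lambda kv: combined(kv[1]), reverse=True):
    print(f"{name}: {combined(s):.2f}")
```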

Once the due diligence is completed, the qualitative and quantitative results are assessed to identify the final recommendation from the selection process.