System Selection: The Vendor Demo
An important part of any system selection process is the vendor demonstration. This is a pivotal time: the dry responses to the RFP become something that can be seen, and staff can begin to visualize themselves using the system in their daily work. Selection Team members walk out of a demonstration with their preconceptions turned into expectations of what the product can or cannot do, and what benefits it may bring to the organization. These impressions stick with the audience; it is hard to move someone away from what they’ve seen or heard during a demonstration.
I’ve always considered the demonstration, along with the set-up and coordination activities around it, to be where I earn most of my fee for managing a selection process. It is important not to view this as a one-off meeting or a standalone activity, but as integral to the overall selection process: it uses information already collected and provides input to the next steps, as well as to the final decision.
Steps prior to the demonstration include defining and prioritizing the business requirements, creating a potential product list, developing and distributing an RFP, and assessing the vendor responses. That assessment should narrow the field to the 2-4 vendors that best meet your baseline requirements and are most worthy of being invited in for a demonstration.
Recommended activities surrounding the demonstration include:
Schedule: I try to group the demonstrations within a 1-2 week time period, without significant time gaps between sessions. This is rough on the individual calendars of those attending the meetings, but worth it to keep the purpose, critical requirements and comparisons top of mind throughout.
Agenda: Using the most critical requirements identified previously, the agenda is set to walk through all key aspects of the functionality, with a focus on any areas of particular concern to the selection committee. The agenda is also structured so users can manage their time and are only present when the demo is covering their functional areas, without tying them up for the full session. Importantly, a well-thought-out agenda ensures the vendor spends adequate time on all the aspects of the system the team is interested in, with little opportunity to gloss over areas of weakness.
Scorecard: Every attendee should complete a scorecard for the parts of the demo they participated in. The scorecards must be completed before the participant leaves the room; thoughts quickly get mixed up between systems, and other priorities take the attention and time needed to complete them. The scorecard is never overly long, but serves to provide a quantitative view of the participant’s impression of specific functionality in the system, and to capture any comments or questions still pending at the end of the demo. To avoid skewing the quantitative results, participants should only score those sections in which they have expertise. Entries on the scorecard are aligned with the agenda for easy following, and are weighted based on priority for quantitative comparison across products (a simple scoring sketch follows these activities).
Attendees: I discourage the selection team members from looking at the systems early in the process, before their requirements are known and prioritized, to avoid any preset leanings in one direction or another. The size of the group varies with the size of the organization, the breadth of functionality for the system being selected, and the amount of time devoted to the selection. The preference is to keep the participating audience at a manageable size and consistent across all systems being considered. All audience members should be prepped beforehand on how the meeting will run, the agenda, and the scorecard.
Facilitation: The facilitator role is an active one, ensuring the focus remains on the agenda and covers all the topics in the scorecard. Questions may be tabled, conversations cut short (particularly those that serve only a small part of the audience present at that time), and information prompted out of the audience or the vendor. Another role is that of translator and interpreter. It always stuns me how we all say the same things in entirely different ways within and across financial sectors. It is important that the vendor’s presentation be translated into the audience’s terminology whenever possible for maximum appreciation of what is being presented. It is equally important to interpret what the vendor says into how the audience members think. The facilitator’s knowledge of the industry, the available products, implementation, maintenance, etc. is leveraged to steer the discussion so that the audience appreciates not only what they are seeing, but also what they will need to contribute for configuration and maintenance, and whether the system has the flexibility to meet their needs in different ways. This leads to a more mutually fulfilling discussion between the vendor and the audience, as everyone is on the same page.
Post-Meeting Roundtable: A facilitated session of key audience members should quickly follow each demonstration (to mitigate crossover confusion about which functionality went where or when a particular comment came up). A review of the scorecards should be completed prior to this session, so disparities can be addressed. This meeting is the opportunity to discuss the demo and the questions raised, and to establish a general consensus about where the product stands and that the functionality means the same thing to everyone. It is not unknown to find a score of 1 and a score of 5 (using a 1-5 range) for the same functionality line item on the scorecards of two different participants. There is no expectation that everyone will score things the same, rather that scores should be in a similar ballpark. Large disparities like this one indicate a misunderstanding by one or both team members, and those need to be put on the table for clarification as soon as possible, before perceptions are cemented and expectations are set in someone’s mind that cannot be met.
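To make the priority weighting and the roundtable disparity check concrete, here is a minimal sketch in Python of how the scorecard roll-up might work. The requirement names, priority weights, participants, and 1-5 scores below are purely illustrative, not drawn from any actual selection: each line item is averaged across the people who scored it, weighted by priority for comparison across products, and flagged for roundtable discussion when two scorers land far apart.

```python
from statistics import mean

# Hypothetical priority weights per requirement line item (higher = more critical).
WEIGHTS = {
    "general_ledger": 5,
    "reporting": 4,
    "workflow": 3,
    "user_administration": 2,
}

# Hypothetical scores on a 1-5 scale; participants only score the sections
# they have expertise in, so not every item appears on every scorecard.
scorecards = {
    "Vendor A": {
        "analyst_1": {"general_ledger": 4, "reporting": 5},
        "analyst_2": {"general_ledger": 1, "workflow": 3, "user_administration": 4},
    },
    "Vendor B": {
        "analyst_1": {"general_ledger": 3, "reporting": 3},
        "analyst_2": {"general_ledger": 4, "workflow": 4, "user_administration": 5},
    },
}

DISPARITY_THRESHOLD = 3  # e.g. a 1 and a 5 on the same line item


def scores_by_item(participants):
    """Collect every participant's score for each line item."""
    items = {}
    for scores in participants.values():
        for item, score in scores.items():
            items.setdefault(item, []).append(score)
    return items


def weighted_total(participants):
    """Average each line item across its scorers, then apply the priority weights."""
    return sum(WEIGHTS[item] * mean(scores)
               for item, scores in scores_by_item(participants).items())


def disparities(participants):
    """Flag line items whose score spread is wide enough to raise at the roundtable."""
    return {item: scores
            for item, scores in scores_by_item(participants).items()
            if max(scores) - min(scores) >= DISPARITY_THRESHOLD}


for vendor, participants in scorecards.items():
    print(vendor, round(weighted_total(participants), 1), disparities(participants))
```

Run as-is, the sketch would print a weighted total per vendor and, for Vendor A, flag the general ledger item (scored 4 and 1) as exactly the kind of disparity the roundtable exists to talk through.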
I know companies that failed to follow one or more of the steps above during their selection process, and the result was typically missed expectations and buyer’s remorse. Allowing the vendors free rein in their demonstrations causes confusion when comparing products, as the vendors may approach the discussion from totally disparate functional areas. Lack of a schedule requires a larger investment of time, as people with only a small area of functionality to observe sit in for much longer periods (or the meeting stops and starts while new people are called in and others leave). Most importantly, what someone hears versus what was intended may be completely different messages that are not caught prior to a final recommendation. The result is “not getting what you thought you were getting.”
While key to the overall selection process, the demonstration is not the final task. A quantitative comparison of RFP responses and demo results can be used to further reduce the short list of candidates prior to moving into an in-depth due diligence process. Targeted system demonstrations or question-and-answer sessions with the vendor may occur during this period to collect additional information or clarify any open points.
Once the due diligence is completed, the qualitative and quantitative results are assessed to identify the final recommendation from the selection process.
In the first installment of this blog entry, we laid out a scenario that portrays an unhappy software development team. It has become obvious that developers are wasting time fixing and re-fixing bugs (and in many cases becoming increasingly frustrated) and not contributing to the team’s true velocity. Testers are also becoming frustrated, as they are seemingly wasting time retesting and tracking easily identifiable defects, thereby increasing risk by reducing the time they have to test more complex code and scenarios.
At this point, if they haven’t already, the daily stand-up meetings and/or retrospectives have most likely taken on an ornery tone. It’s become obvious that time is being wasted and team members are unhappy.
One potential solution? The post-deployment demo.
I have seen in MANY cases that a significant number of these issues can be identified up front – before the retesting effort even begins – by simply scheduling an informal “post-deployment demo”. The format is similar to the aforementioned sprint demo, yet in this case the “presenter” is the developer(s) who worked on the fixes, and the “customer(s)” are the testers actually testing those fixes. The goal of the demo is simple and also similar to the sprint demo – the “customer” should leave the demo confident that the work completed to fix each defect is complete and satisfactory.
Basically, we can attempt to answer two important questions. First, “have we truly fixed what we said we’ve fixed?” Second, “have we fixed what actually needed to be fixed, without breaking something else in the process?” The second question is obviously a lot more difficult to answer.
It’s important to keep the deployment demo as informal and efficient as possible, yet consistent. Ideally, if accepted by the team as a valuable practice, it should become a team norm, thereby making it repeatable. Usually, QA deployments are on a set schedule; in that case, the demos should adhere to that schedule. Demos should not be considered “painful”: over the course of a project, significant time and effort are saved when issues are uncovered and agreed on between testers and developers before the fixes are even retested. Here are some helpful hints and ideas to get started:
- Keep things as simple and efficient as possible – no PowerPoint slides or agenda needed. All the demo requires is a meeting room, a laptop (or even better, a projector), a listing of defect fixes (along with high-level descriptions) and the essential team members.
- Include essential team members (ideally, those developers and testers associated with the functionality contained within the deployment) as often as possible. My teams have referred to this meeting in the past as a “DAT Session”. “DAT”, you may ask? Developer, Analyst, Tester. (Sometimes, it really is that simple.)
- If applicable, why not include Business Analysts for clarification or feedback?
- Even better, if you have an eager and co-located customer or client (i.e., Product Owner), why not solicit his or her feedback to save time? Non-complex scenarios or questions can be answered directly, saving even more time (and helping to avoid additional time-consuming meetings later!)
- Again, while the goal is to keep the meeting as efficient as possible, the possibilities really are endless. A forum for direct, face-to-face collaboration goes a long way, and the team should take advantage of the opportunity.
- Similar to another oft-practiced Agile concept – the “daily stand-up meeting” – try to “time box” the demo as much as possible, stick to the topic, and do not attempt to resolve issues. This is not a requirements meeting or a story-writing workshop. Yes, simple questions or clarification points can be agreed on and action taken. But if requirements are questioned or further clarification is needed, take it offline. The demo is not the forum for solving these issues.
- Obviously, not every defect/fix needs to be demoed – usually, high-level demonstrations of simple functionality are adequate, and you’ll find they can uncover numerous issues before fixes are moved over to “ready to test”, which saves valuable time over the life of a project.
The Post-Deployment Demo – Team Acceptance
One final point – the first few iterations of a deployment demo run the risk of morphing into a “he said, she said” (or, more to the point, “testers vs. developers”) conflict within the team. Employing this type of feedback session poses a bit more risk to newly formed teams whose members are unfamiliar with each other and still working through the “Forming” and “Storming” stages of Tuckman’s Stages of Group Development.
The reality is that in this forum, names are tied to tasks (in this case, to bugs), and people’s work is more often than not questioned (and sometimes indirectly criticized). Every effort should be made to avoid being overly critical when one person thinks something has been fixed while another believes the functionality is still incorrect. Remember, it’s a team effort and everyone is in this together with a common goal of team and project success. I feel this simple yet valuable practice is a great tool to help get you there!