An alchemist's view from the bar

Network Security Alchemy


Technology Evaluation Do’s and Don’ts


As an engineer who has been involved in many IPS evaluations, I have first-hand experience of the deciding factors that make the difference between an optimal test and a test process that becomes drawn out, consumes far too many resources, and returns low-confidence results to the business. In my opinion, a successful evaluation should produce a repeatable, deterministic test report that can serve as the final justification for the selection, or non-selection, of a product.

To offer some advice on how to optimize the evaluation process without influencing its result or a customer's test criteria, I have created a list of "Do's and Don'ts" to consider during the planning and testing process. Following these points will avoid many common pitfalls I have witnessed first-hand and should result in a much better test experience, regardless of vendor or technology.

This list draws on my Intrusion Prevention, Vulnerability Assessment, and network discovery background; however, I believe the concepts hold true for most other technologies. I have purposely avoided sharing specific test methods, as they do (and should) vary with the customer's needs. Contact me if you would like some help with those as well.

I hope the list below is of use and improves your evaluation experience, regardless of the technology or vendors involved.

  1. Do: Understand the business needs for implementation and keep them the top priority during test plan creation.
  2. Do: Create a list of functional requirements ahead of the testing process.
  3. Do: Create a list of how each test should be performed (method).
  4. Do: Have the test list reviewed by peers and the vendors involved. They may catch impossible, irrelevant, or incorrect scenarios.
  5. Do: Work through the test list assessing a compliant/not-compliant state next to each test with comments.
  6. Don’t: Waste time testing a function of a device that the vendor states is not supported, not included, or doesn’t work.
  7. Don’t: Change the test criteria during the testing.
  8. Don’t: Enforce how (method) a capability should be achieved; assess that it can be achieved (result).
  9. Do: Score and provide a weighting for each test. Not all test criteria are, or should be, of equal importance.
  10. Do: Make use of vendor/reseller/consultant expertise during a test. An evaluation of technology should evaluate solely the technology, not the evaluator’s knowledge of how to use that equipment.
  11. Do: Test the support and after-sales capability of the vendor/reseller chain. These will be your primary “help” contacts if selected, so make sure they can help you!
  12. Do: Check that the DUT (device under test) can integrate with all systems and services that will be in use in the production environment.
  13. Do: Test the unique capabilities (USPs) of the DUT if they are relevant to the functional requirements.
  14. Don’t: Waste time on irrelevant unique capabilities (USPs) that are not part of your functional requirements.
  15. Don’t: Try to “Level the playing field” by only comparing capabilities offered by both devices in order to achieve an “apples-to-apples” test. Focus on the business requirements.
  16. Do: Remember that no two systems are the same; there will be differences in both function and implementation. Don’t fear change from a system you are used to, and don’t assume that something different is something wrong.
  17. Don’t: Assume that multiple products from a single vendor work well together and integrate, either now or in the future. Test them to get your own opinion of the integration maturity.
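To illustrate point 9, scoring and weighting can be as simple as a small script that tallies the weighted compliance of a DUT against the test list. This is just a minimal sketch; the test names, weights, and results below are hypothetical, and your own criteria and weightings should come from the functional requirements.

```python
# Minimal sketch of weighted evaluation scoring (point 9).
# All test names, weights, and pass/fail values are hypothetical examples.

def weighted_score(results):
    """Each entry is (test name, weight, compliant?).

    Returns the percentage of achievable weighted points the DUT earned.
    """
    total = sum(weight for _, weight, _ in results)
    earned = sum(weight for _, weight, compliant in results if compliant)
    return 100.0 * earned / total

# Example test list for one device under test.
dut_results = [
    ("Blocks known exploit traffic",   5, True),
    ("Integrates with existing SIEM",  3, True),
    ("Fails open on power loss",       4, False),
    ("Exports reports to CSV",         1, True),
]

print(f"Weighted compliance: {weighted_score(dut_results):.1f}%")
```

Because each criterion carries its own weight, a device that fails one high-priority test can score lower than one that fails several trivial tests, which is exactly the behaviour you want when comparing vendors against business requirements rather than raw feature counts.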

If you have any more to share from your own experience, I’m open to listening and learning from others.



Written by leonward

July 22, 2010 at 9:29 am

Posted in Security, Sourcefire