Implementing Planned Security Technology


As new security risks emerge and technologies improve, more sophisticated electronic security measures become necessary. This feature originally appeared in Automation 2022: Cybersecurity & Connectivity Volume 2.

When done correctly, enhanced electronic security can lead to enhanced security posture, increased operational efficiencies and even reduced insurance premiums. Yet, adopting the technologies that provide security and operational benefits can be overwhelming.

In many ways, the electronic security industry resembles the dot-com companies of the 1990s. In the last 10 years, the world has been flooded with more security technology than ever. According to an IHS Markit video surveillance report, the number of professional surveillance cameras shipped annually increased more than tenfold, from 9.9 million in 2006 to more than 106 million in 2016, and was expected to reach 160 million per year by 2020. Other security-related products have seen similar growth. As with the dot-com boom, many companies in the market today promise technologies that, sadly, do not perform as advertised and will not be around 10 years from now.

Investing in the right security technologies can bring significant security, operational, and financial benefits to an organization. Investing in the wrong technologies can result in wasted capital, frustrated stakeholders, and increased security vulnerabilities. Developing an effective technology strategy prior to the procurement and deployment of electronic security systems is key to avoiding these negative outcomes. Proper planning prior to design or deployment of the technology is necessary and must consider the following:

If the project planning is done correctly, it sets the course for a successful overall security system enhancement.

Once the decision is made to enhance electronic security, there is a tendency to start researching viable technologies immediately; don’t do it. As any good real estate agent will tell you, don’t go house shopping until you know what you want, what you can afford and what you really need. Otherwise, you’re likely to end up with a flashy house that is too expensive and doesn’t meet your needs. In much the same way, many technologies look enticing but could become costly and restrict your ability to implement other technologies in the future.

Before you start looking at solutions, it’s wise to develop a list of metrics for scoring potential solutions. Performance metrics should be developed with feedback from as many stakeholders as possible. Stakeholders vary from organization to organization, but typical roles include systems and security operators, operations, information technology (IT), compliance, law enforcement liaisons, engineering and executive leadership. Not all stakeholders need to be involved in the day-to-day execution of the project. However, soliciting their feedback on the system criteria they would like to see helps an organization obtain buy-in on the project. It also may be possible to earn goodwill by providing a benefit beyond security to some of the stakeholders. The more each metric can be understood and quantified relative to the individual stakeholders, the better. This will better inform decisions on which technologies to implement and how to implement them to maximize system effectiveness across the entire organization. Performance metrics can be broken down into seven categories:

Without fail, each organization deals with stakeholders who are generally resistant to increased security measures. This resistance is not unfounded, as security can create burdens such as decreased operational effectiveness and increased costs with no offsetting revenue. These biases can be difficult to overcome, but identifying operational benefits or introducing cost-saving and/or revenue-producing measures as an added benefit of the increased security can go a long way toward establishing good rapport and gaining stakeholder buy-in. Consider these examples of the value security technologies can add to other aspects of operations:

While these examples may not apply directly to your business, more than likely some operational benefits can be identified with any given security program, further supporting stakeholder buy-in.

Before investigating new technologies, an organization should determine whether any technologies currently deployed can be used to meet the new security requirements. This can be accomplished by comparing the currently deployed technologies against the metrics identified earlier. The goal should be to minimize the number of technologies implemented, reducing the overall complexity of installing and maintaining the system.

Once an organization has determined that currently deployed technologies will not work or are cost prohibitive, it is time to begin evaluating outside products to meet the new performance requirements. The list of security technology providers is generally too long to allow testing of every potential solution, so before investing significant resources in any level of design or testing, the list of potential products must be narrowed to a manageable number. To do this, create a list of qualifying “yes/no” questions that can be answered with minimal time spent researching specific products. The questions should reflect the performance metrics outlined earlier. An independent security consultant can assist in developing the questionnaire and shortlisting technologies, if needed. Once the questions have been answered, any technology that fails to meet the criteria should be removed from further consideration. The goal at this point is to shortlist two to three times the number of technologies it is practical to test; in most cases, this is somewhere between five and 15 technologies. If the list is too long, additional qualifying “yes/no” questions may be required. If the “yes/no” questions eliminate too many technologies, the shortlist criteria may need to be broadened to increase the number of viable candidates.

Next, develop a scoring matrix to rank the technologies that pass the “yes/no” questionnaire. The matrix should incorporate all of the performance metrics already discussed and may be constructed using either a ranking system or a weighted point system based on which criteria have the highest priority. Regardless of how the scoring system is constructed, the result should be a justifiable ranking of each candidate technology.
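The article leaves the form of the scoring matrix open; as a minimal sketch, a weighted point system could look like the following, where the metric names, weights, and 1-to-5 scores are purely illustrative assumptions:

```python
# Hypothetical weighted scoring matrix for shortlisted security technologies.
# Metric names, weights (summing to 1.0), and 1-5 scores are illustrative only.
weights = {
    "detection_performance": 0.30,
    "integration_with_existing_systems": 0.20,
    "installation_and_maintenance_cost": 0.20,
    "operator_usability": 0.15,
    "vendor_stability": 0.15,
}

candidates = {
    "Technology A": {"detection_performance": 4, "integration_with_existing_systems": 3,
                     "installation_and_maintenance_cost": 2, "operator_usability": 5,
                     "vendor_stability": 4},
    "Technology B": {"detection_performance": 5, "integration_with_existing_systems": 2,
                     "installation_and_maintenance_cost": 3, "operator_usability": 3,
                     "vendor_stability": 3},
}

def weighted_score(scores: dict) -> float:
    """Combine per-metric scores into a single weighted total."""
    return sum(weights[m] * scores[m] for m in weights)

# Rank candidates from highest to lowest weighted score.
for name, scores in sorted(candidates.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

A spreadsheet works just as well; the point is simply that weights make the priorities explicit and the resulting ranking defensible to stakeholders.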

Once a few technologies have been shortlisted, the next step is to conduct an onsite test, or pilot test, of the top two or three technologies in the ranked list. The pilot serves a distinct purpose: to determine whether the technology can deliver what it has been promoted to accomplish. The technology needs to operate in its designated environment as expected. If it doesn’t, it may be a waste of time and money to implement; worse, it may increase, rather than decrease, a facility’s overall vulnerability.

The pilot should assess not just performance but also system integration, ease of installation and operation, reliability, environmental protection, and maintenance requirements. Writing test procedures for these metrics requires an understanding of how the technology works. This knowledge helps an organization better grasp the limitations of the technology and write tests that reveal its limitations and highlight its strengths. Every technology has weaknesses, so revealing a limitation should not preclude a technology from use; identifying the weakness helps anticipate the need for supplementary technologies and procedures to mitigate the risks it presents.

As tests are conducted, an organization should record the results of each test and document unexpected conditions or results in a test report. The test report also should include a description of the testing procedure, a description of the technology being tested, how it was deployed for the test, and how each test phase was performed. Once performance testing has been completed on each technology, results can be compared and incorporated into the scoring matrix developed earlier.
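The article does not prescribe a format for the test report. As a minimal sketch, assuming results are captured per test phase, a record like the following could be kept for each technology and later summarized into the scoring matrix; all field names here are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PilotTestRecord:
    """One test-phase result for a candidate technology (illustrative fields only)."""
    technology: str
    test_phase: str                # e.g. "detection range", "ease of installation"
    procedure: str                 # summary of how the test was performed
    deployment_notes: str          # how the technology was deployed for the test
    passed: bool
    unexpected_conditions: list = field(default_factory=list)
    test_date: date = field(default_factory=date.today)

# Example record that could later feed the comparison across technologies.
record = PilotTestRecord(
    technology="Technology A",
    test_phase="perimeter detection",
    procedure="Walk tests at 5 m intervals along the fence line",
    deployment_notes="Sensor mounted per manufacturer guidance at 3 m height",
    passed=True,
    unexpected_conditions=["nuisance alarms during high wind"],
)
print(record)
```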

After performance testing is complete, an organization should implement a more in-depth burn-in test to evaluate environmental factors and additional functionality. A burn-in test entails long-term testing of performance and system robustness. If the system is deployed outdoors, burn-in phases should test the system’s ability to perform during both the hottest and coldest months of the year. Running performance tests during periods of extreme weather, including heavy rain, snow, and fog, is also advised, as many systems experience decreased performance during adverse weather.

The burn-in phase is also a time to closely monitor undesirable system behavior such as downtime, false alarms, nuisance alarms, poor image or video quality, incorrect classifications, or lost data. Monitoring this information helps stakeholders predict what effort will be required to properly program and tune the system and may also reveal system vulnerabilities not identified during initial functionality testing. What is learned during the burn-in phase assists in planning for system deployment and, in some cases, may affect the decision to deploy a system at all. The burn-in phase is a time-intensive effort, and budgets and timelines may necessitate an accelerated or modified burn-in. A security technology consultant can assist with developing a burn-in plan that meets timeline and budget constraints.

Integrating New Technology

Integration is essential to seamless operation whenever multiple technologies are involved. Whether the integration is a simple relay trigger from the sensor to the access control software, or a software integration bringing geospatial data into a map interface and triggering different events based on criteria defined during programming, the integrated system should be tested as part of the pilot. If the technology is a sensor, a tester may monitor alarms at the sensor level and the system level during the burn-in phase, then compare the two to confirm that all alarms are making it through to the head-end operating system. During the integration phase, it can be worthwhile to begin incorporating and testing integrations with any other systems that may also benefit from the new technology. If devices will have shared access and/or shared control, organizations should work out the policies and procedures for accessing and controlling the devices so that everyone with a stake in the system understands their access rights and limitations before the system goes live.
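For the sensor-to-head-end alarm check described above, a simple comparison of the two alarm logs collected during burn-in can flag events that never reached the head-end. The alarm identifiers and log format below are assumptions made purely for illustration:

```python
# Hypothetical comparison of sensor-level and head-end alarm logs collected during
# burn-in, to confirm every alarm raised at the sensor reached the head-end system.
sensor_alarms = {"ALM-0001", "ALM-0002", "ALM-0003", "ALM-0004"}   # IDs logged at the sensor
headend_alarms = {"ALM-0001", "ALM-0002", "ALM-0004"}              # IDs logged at the head-end

missing_at_headend = sensor_alarms - headend_alarms
if missing_at_headend:
    print(f"{len(missing_at_headend)} alarm(s) never reached the head-end: {sorted(missing_at_headend)}")
else:
    print("All sensor alarms were received by the head-end system.")
```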

For large-scale projects, organizations may want to conduct a scale test, especially if the project calls for a quantity of systems greater than the manufacturer has deployed in previous scenarios. A scale test assesses the system’s capacity to handle the traffic produced by many devices. This confirms the load-bearing capability of the software and identifies functionality issues that may not present themselves with just a few units. Scale testing also helps uncover issues that may occur when many systems are integrated. Setting up sufficient hardware to run a full-scale test can be difficult, especially when the system must be deployed at many sites. To simplify this process, virtual devices can often be replicated in a simulated software environment at minimal cost. Scale testing can require significant effort, but working with the manufacturer and a security consultant can help ensure a successful scale test.
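As a rough sketch of the simulated-device approach, the loop below generates event traffic from a configurable number of virtual devices; the device count, event mix, and event volume are hypothetical, and a real scale test would push these events to the system under test rather than simply counting them:

```python
import random
import time

# Hypothetical scale-test driver: simulate event traffic from many virtual devices
# to observe how the head-end software handles the aggregate load.
DEVICE_COUNT = 500          # assumed number of virtual devices
EVENTS_PER_DEVICE = 10      # assumed events generated per device during the test window

def generate_events():
    """Yield simulated (device_id, timestamp, event_type) tuples."""
    for device in range(DEVICE_COUNT):
        for _ in range(EVENTS_PER_DEVICE):
            yield (f"device-{device:04d}", time.time(),
                   random.choice(["alarm", "heartbeat", "tamper"]))

# Here we only count the simulated events to illustrate the traffic volume involved.
total = sum(1 for _ in generate_events())
print(f"Simulated {total} events from {DEVICE_COUNT} virtual devices.")
```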

Training on system operation is an often overlooked key to system implementation strategy. Organizations should avoid waiting until after a technology is decided on to conduct operator training. Instead, training should be included as part of the evaluation criteria. Operators can provide vital insight into the overall usability of a system. Many technologies have been purchased and installed only to be abandoned shortly thereafter because they were too complex to operate, or the operators were never trained properly. Depending on the size and skill of an operating group, an organization may consider assigning only a few operators to train on and test the new technology during the pilot or burn-in phase. If the same operators test multiple technologies, they may be asked to provide feedback on which they prefer and why. This information can be incorporated into the overall scoring matrix. If other stakeholders beyond security will have access to the system, now is the time to train them on how to properly access and interface with the system.

Developing a strategy before deploying any electronic security technology is key to any implementation effort. Clear and measurable performance metrics allow organizations to identify and thoroughly vet technologies, and comprehensive testing increases the odds of successful product selection and implementation. By incorporating a strategy for the continual adoption of new technologies into your existing security technology strategy, your organization lowers costs, invests more effectively, and gains stakeholder buy-in. Ultimately, these efforts made on the front end of the project provide benefits later, such as faster and more predictable implementation, improved overall functionality of the integrated security system, and additional business value to the organization. For more information about implementing physical security, visit 1898 & Co.

Brock Josephson, PSP, is a physical security consultant with 1898 & Co., part of Burns & McDonnell. His work includes helping clients assess and mitigate their physical security vulnerabilities, as well as plan their electronic security systems. He has extensive firsthand experience in physical security system installation management, with more than eight years of experience across the full spectrum of security upgrades, including system implementation and commissioning.

