How vulnerable is your data center to system failure? Can you assess how resilient your data center is by identifying how many single points of failure you have, or where your weakest links are? In today's always-on, fully connected digital lifestyle, the cost of data center downtime is measured both financially and in the impact to a company's reputation. According to the Uptime Institute's seventh annual Data Center Industry Survey, downtime matters: more than 90 percent of data center and IT professionals believe their corporate management is more concerned about outages now than it was just 12 months ago. However, only 60 percent report that they measure the cost of downtime as a business metric.
Significant hardware redundancy, a backup for the backup of literally everything, could make a data center more resilient. However, this is not a good strategy for a company's bottom line, especially in light of the exponential growth of data from IoT. Means of eliminating downtime, or mitigating it to non-harmful levels, should therefore be top of mind for IT management. One approach is to let facilities managers experiment in safe offline environments, creating virtual prototypes to troubleshoot "what-if" simulations of potential risks such as power failure or critical systems going offline.
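The trade-off between single points of failure and blanket redundancy can be made concrete with basic availability arithmetic: components in series multiply their availabilities (any one failing takes the system down), while redundant components in parallel fail only if all of them fail. The sketch below is a hypothetical illustration, not from the article; the component names and availability figures are invented for the example.

```python
# Hypothetical sketch: how duplicating the weakest link changes
# overall availability, using standard series/parallel formulas.

def series_availability(avails):
    """Components in series: all must be up, so availabilities multiply."""
    result = 1.0
    for a in avails:
        result *= a
    return result

def parallel_availability(avails):
    """Redundant components: system is down only if every copy fails."""
    down = 1.0
    for a in avails:
        down *= (1.0 - a)
    return 1.0 - down

# Invented example figures for three critical subsystems.
ups, cooling, network = 0.999, 0.995, 0.998

# Baseline: each subsystem is a single point of failure.
baseline = series_availability([ups, cooling, network])

# Duplicate only the weakest link (cooling) instead of everything.
cooling_pair = parallel_availability([cooling, cooling])
improved = series_availability([ups, cooling_pair, network])

print(f"baseline: {baseline:.5f}, with redundant cooling: {improved:.5f}")
```

Targeting redundancy at the weakest link captures most of the resilience gain without paying for "a backup for the backup of literally everything", which is the kind of trade-off the "what-if" simulations described above are meant to explore.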



As a first step toward addressing the potentially massive security problem posed by IoT, lawmakers in the U.S. Senate introduced a bill in early August that would set baseline security standards for vendors supplying the US federal government with a broad range of Internet-connected devices, including computers, routers, and security cameras. The new bill,
