I read an interesting article on Computerworld.com titled “Worm may create an Internet of Harmful Things.” It discusses how, as our world becomes filled with Internet-connected devices, concerns over security grow. There is one quote from the article that stands out for me: “Security expert Bruce Schneier…is concerned about the broader risks to the Internet of Things. In many cases, IoT connected systems are using firmware that can be hard to patch. In fact, ‘in many cases, [it’s] unpatchable,’ he said.” Unpatchable.
When it comes to software security, if it cannot be updated, it should not be used.
This reminds me of an interesting (perhaps unrealistic) software development methodology called “cleanroom engineering.” With this approach, the focus is on preventing bugs or vulnerabilities from ever making it into production code. The SDLC is heavily weighted toward the phases before actual coding begins, because (per this methodology) all you have to do is code to the design, since the design should be defect-free.
This approach to software development seems like an apt analogy for vendors who release products that cannot be updated. There are three reasons I can see for this. The vendor
- assumes its product is truly 100% secure, and no vulnerabilities will ever be discovered
- hopes that significant security issues with its product will not be discovered within a reasonable (or obligatory) timeframe
- does not offer patchable products, but offers to sell replacement products that address the security vulnerabilities found in “last year’s model.”
Replacing products used in production is costly and disruptive, and I can’t see the news of any unpatchable security vulnerability endearing a vendor to a customer. In our rush to Internet-ize everything, security may take a back seat again, just like the early days of the Internet or smartphones. As a result, weaknesses will be exposed that give attackers the opportunity to do embarrassing, destructive, or even dangerous things.
Vendors should operate under the assumption that something will not go according to plan. At some point, a security vulnerability will be discovered in a product even if the vendor did not find it pre-release. And when that vulnerability is found, the vendor should be able to fix (patch) that vulnerability ASAP. Any other approach, such as selling software products that cannot be patched, is downright irresponsible.
I’m finally taking the time to figure out how to properly sell my book, Cyber Security Basics. I have updated the pricing, updated the Kindle version, and started paying attention to the sales dashboard. To date I’ve sold 215 copies of the physical book and 58 Kindle copies.
Kindle Sales for 2016
There has recently been an upward trend that I hope will continue as I dig into the as-yet untapped marketing options available to self-published authors. And there are a lot!
Thank you to everyone who has purchased a copy. Please review it on Amazon if you have a spare minute; it would be a huge help.
It’s official: I got my Splunk Certified Architect 6.3 badge today. It was a lot more work than I thought it would be; it’s definitely not just symbolic. Six classes at about 55 hours total (not including studying), capped off by a final lab that we had 24 hours to complete (I think it took me about 8 hours, with the first day running well past midnight).
The best way to learn many technologies, I feel, is to get certified. Preparing for a certification forces you to study the details, including features you might never have touched before but that may come in handy later. These classes and tests helped me understand how Splunk works and the depth of the tools and options it offers. It’s a massive solution, and now I have a better handle on how to apply its various features in almost any type of environment.
The classes were very good, as were their instructors. The education program clearly cares as much about the quality of its training materials and delivery as the engineers who build and support Splunk products care about theirs. I have always appreciated how Splunk was created and designed; it just felt like it made sense. Now that I demonstrably know a bit more, my instinct has been validated.
Received my Raspberry Pi 3 at a recent IBM Linux event. Now looking for inspiration. Might use it to create a mini arcade. Stay tuned.
Do we have a moral obligation to respond if we see someone using a clearly outdated operating system like Windows XP? Is it along the same lines as “if you see something, say something”? I guess it comes down to risk management. If the deprecated OS is observed handling sensitive or personal data (such as in a doctor’s office), the need to do something is elevated. If you see it driving an electronic billboard, well, maybe not so much.