
Monday, July 22, 2024

#CrowdStrike Causes a Global Tech Outage - what happened, why, and (how) can it be prevented?

While the memes are amazingly good, and there's a lot of jest being spewed across the interwebs, this is a serious event with massive implications. So, in all seriousness, let's review the facts of the #CrowdStrike situation from 19-Jul-2024: 

As reported across global news outlets and the internets, a security company called CrowdStrike caused some chaos. There are cascading impacts across many industries. 

We are already seeing impacts:
- courier service delays (UPS, FedEx, DHL, etc.)
- flight delays/cancellations at the airport
- small businesses closing for the day
- websites being inaccessible
- hospitals cancelling surgeries/treatments
- municipalities being closed
- government services being delayed
among many other cascading effects that could last days, or weeks.

While a major inconvenience, the bug was quickly resolved within CrowdStrike's system, so (as of publish date) the latest binaries are stable. Recovery will be slow and tedious, especially for larger networks, but the world will recover from this. 

What happened? As is being reported, a bug introduced during a routine content update to their Falcon EDR software (endpoint protection/anti-virus software run by millions and millions of customers) caused what amounts to a kernel panic within the Windows operating system - we are seeing this manifest as a "bugcheck error" (aka the Blue Screen Of Death, or #BSOD) on Windows machines. It does not affect #Apple or #Linux devices. Note: It is NOT a #Microsoft problem.

How can we prevent this? Short answer, WE as users can't. However, this isn't the first time a large global tech vendor has caused major outages across the globe, and it won't be the last. 

How can CrowdStrike, or any other company, prevent this? Simply put: adhere to SDLC methodologies, do adequate QA testing, and never do a full production rollout without fully testing in the field. A common practice is to deploy to 10% of the network and see how systems and users respond (yes sysadmins, you can do targeted deployments even if you don't have network segmentation in place). If all goes well, push to 25% and test again, then 50% and test again, then do the full push. That way, when a problem does occur, it doesn't take out everything and can be quickly fixed before a full production push. It's really IT Ops 101 - not that difficult. This is also a good example of why you should back up your critical data frequently, whether to an external device or a cloud storage service (Google Drive, Dropbox, OneDrive, etc.). You should do this personally as often as you feel is necessary. Most companies have policies governing backup types, schedules, and testing methodologies.
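
To make that concrete, here's a rough sketch of the gating logic I'm describing - hypothetical device list and health checks, written in Python rather than whatever deployment tooling you actually run. Deploy to a ring, verify the ring is healthy, and only then widen it:

```python
# Illustrative ring/canary deployment gate - NOT any vendor's real pipeline.
# deploy_to() and is_healthy() are hypothetical stand-ins for your RMM/EDR tooling.
import random
import time

RINGS = [0.10, 0.25, 0.50, 1.00]  # 10% -> 25% -> 50% -> full push

def deploy_to(devices):
    """Pretend to push the update to a batch of devices."""
    print(f"Deploying update to {len(devices)} devices...")

def is_healthy(device):
    """Pretend health check: did the device boot and check in after the update?"""
    return random.random() > 0.001  # assume ~0.1% unrelated failures

def staged_rollout(fleet, soak_seconds=0):
    already_done = 0
    for ring in RINGS:
        target = int(len(fleet) * ring)
        batch = fleet[already_done:target]
        deploy_to(batch)
        time.sleep(soak_seconds)  # let the ring soak before evaluating
        failures = [d for d in batch if not is_healthy(d)]
        if len(failures) > max(1, len(batch) // 100):  # >1% failures: stop the push
            print(f"Halting rollout at {int(ring * 100)}%: {len(failures)} unhealthy devices")
            return False
        already_done = target
    print("Full production push complete")
    return True

if __name__ == "__main__":
    staged_rollout([f"device-{i}" for i in range(10_000)])
```

The point isn't the code, it's the gate: every wave has to prove itself healthy before the blast radius gets bigger.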

For my enterprise admins reading this, I hope you have a solid (and tested) backup methodology in place. Yes, you should test-restore your backups at least once per year, if not more often. If you can't restore the data, then what is the point of backing it up? 
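
And since I just preached about test-restores, here's a minimal sketch of what an automated restore check can look like: restore last night's backup to a scratch location (the paths below are hypothetical), then compare checksums against the live data. Your backup product almost certainly has its own verification features; this just illustrates the principle.

```python
# Minimal restore-verification sketch: restore to a scratch location and
# compare file hashes against the live data. Paths are hypothetical examples.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_restore(source_dir: str, restored_dir: str) -> bool:
    source, restored = Path(source_dir), Path(restored_dir)
    ok = True
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        copy = restored / src_file.relative_to(source)
        if not copy.exists():
            print(f"MISSING in restore: {copy}")
            ok = False
        elif sha256_of(src_file) != sha256_of(copy):
            print(f"CHECKSUM MISMATCH: {copy}")
            ok = False
    return ok

if __name__ == "__main__":
    # Example: compare live data against a test restore of last night's backup.
    if verify_restore(r"D:\CriticalData", r"E:\RestoreTest\CriticalData"):
        print("Backup verified: restore matches source.")
```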

So now the big question is, how does this issue get fixed? Well, it's a hands-on-machine fix (which means long days/nights and weekends for IT staffers for a bit). Since the devices are unable to boot, there's no back-of-house configuration that we admins can set to fix this. We literally have to put our hands on the device. The methodology is simple, and only takes about 5 minutes to do - but multiply that over hundreds, thousands, or even hundreds of thousands of devices and you can quickly see this is not a quick fix at scale. It is an even bigger nightmare for remote workers, who would need to be walked through the fix via telephone, making it a 30-minute fix (at best). In those cases, from my perspective, it makes more sense to send them a replacement machine that is not bricked, then reset the troubled device once it's back in hand. Hopefully you have the inventory ready and waiting, otherwise you need to grab a company credit card and hit up every electronics store in your city. What a fucking PITA.
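
For the curious, the widely reported manual workaround was to boot the affected machine into Safe Mode or the Windows Recovery Environment (often after entering a BitLocker recovery key), delete the faulty Falcon channel file - the one matching C-00000291*.sys under the CrowdStrike drivers folder - and reboot. The toy sketch below only illustrates that one deletion step; in reality this was done by hand at the console, so follow CrowdStrike's official guidance linked below rather than any script:

```python
# Illustration only: the widely reported manual workaround, expressed as code.
# In practice this was done by hand from Safe Mode / WinRE, not with Python.
from pathlib import Path

# Assumes the affected Windows volume is mounted and visible at this path.
CROWDSTRIKE_DIR = Path(r"C:\Windows\System32\drivers\CrowdStrike")

def remove_bad_channel_file(crowdstrike_dir: Path) -> None:
    matches = list(crowdstrike_dir.glob("C-00000291*.sys"))
    if not matches:
        print("No matching channel file found - nothing to do.")
        return
    for f in matches:
        print(f"Deleting {f}")
        f.unlink()
    print("Done - reboot the machine normally.")

if __name__ == "__main__":
    remove_bad_channel_file(CROWDSTRIKE_DIR)
```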

CrowdStrike's official guidance can be found on their webpage here: https://www.crowdstrike.com/falcon-content-update-remediation-and-guidance-hub/ (external link). 

While all of this is happening, most of my peers and I agree that CrowdStrike is still a quality vendor offering quality security products and services. This was just a BIG fuckup from whoever pushes out their updates. Clearly, someone did not follow protocol. 

As of this writing, CrowdStrike is the second largest security vendor in the world, which is why the impact of this was as massive as it was...and the cascade effect isn't done yet. There will be more fallout from this, not to mention the legal cases that could be brought against them in the aftermath due to the downtime. 

One of the biggest fallouts of this mess is phishing attacks - threat actors spinning up malicious domains claiming to fix the issue (they won't, they just want your money), and emails claiming to be able to fix the issue with "a click" (using a piggy-back technique to install a payload on your machine to do god knows what; oh, and steal your money too). Please do not fall for the phish. It won't end well for you, or your employer. 

There is no "easy button" here peeps. Just a massive Pain In The Ass. 

#StayCyberSecure 
#BeCyberAware

Friday, April 3, 2015

Premise Control and Environmental Factors

"Premise control is the systematic recognition and analysis of assumptions upon which a strategic plan is based, to determine if those assumptions remain valid in changing circumstances and in light of new information." Planning premises are primarily concerned with environmental and industry factors, I will focus on the environmental factors. These are your intangibles, the uncontrollable factors the pose great influence on the success or failure of a strategy. One of the biggest influences on corporate strategy in relation to IT is Web 3.0 - the Internet of Things. The entire world is connected to everything at the speed of light through a complex mesh network of connected devices and computers dubbed the Internet. This poses a massive challenge to the status quo of conducting business in that consumers have a "get it now" mentality. With a few taps of a smartphone, or a few clicks of a mouse, consumers have access to not only information, but nearly anything you can imagine is available instantly. Satisfying this need for instant gratification poses a real threat to older, traditional methods of conducting business, while creating immense opportunities for the organizations that puts together the right mix of goods and services. The younger generations are leading the demand curve, as well as sharing feedback in real-time about the viability and quality of goods and services offered in the digital marketplace, and many companies are having trouble keeping pace. CTO's and business strategists are now having to create ways to remain relevant in this new digital world. It is not longer as simple as creating an online presence, or e-commerce capabilities. With social media taking center stage on real-time feedback, and the sheer volume of information that is shared across these networks, companies need to take advantage of this marketing free-for-all and become engaging with their consumers, vendors, and service providers to form a cohesive and comprehensive ecosystem where all parties can interact with the goal of not only improving the quality of the products being manufactured but also provide instantaneous and enriching C2B & B2B interactions. The Internet provides the medium for anyone to access any information that is available, anywhere they are geographically, and this has dictated a new business landscape. The strategy is no longer to appeal to a particular demographic or region, which used to be very successful for product placement and marketing plans. The strategy has shifted to being universally understandable across every demographic, and every region, everywhere. Established best practices and presentation methods of digital information has created an expectation for all e-commerce providers, whether they be Walmart, or the local eye doctor. Every consumer, old and young, demands that every company they interact with have not only an online presence, but also a mobile app, a multitude of customer service options - phone, email, IM, chat, Skype, FAQ's, an online knowledge base, automated troubleshooting tools, and with larger corporations like Walmart the ability to get in-store service for products purchased online. Companies like Walmart are large enough and have enough resources to remain relevant in the every changing, fast evolving, Internet of Things world, however many companies, even the largest ones, are struggling to remain relevant. The opening example of the ups and downs of Dell Corporation are a perfect example. 
Dell was late to the table with the mobile computing market as they were focused on consumer PC's, a brief stint in consumer peripherals (printers, etc, which were highly unsuccessful), then the Enterprise where they are still very strong with their Server and managed services platforms. However, as the formal PC becomes something of a niche product group anymore, where old home PC's are being transformed into home servers and left stagnant as tablets, convertible laptops, and smart devices proliferate the global market, Dell feel behind the 8-ball, as it were. As their PC sales dwindled, inventory accumulated, and users looked elsewhere for their technology needs, knocking Dell from the top of the market after a multi-decade dominance globally. The problem is they never had good premise control, thus they did not reevaluate and either change or abandon their existing strategy. The ultimate failure in my opinion was their inability to recognize, ore react to, the paradigm shift in how consumers and businesses conducted business. They stood firm that the PC would remain the dominant force in the market, however Apple was the catalyst of a massive movement with the release of the iPhone - they helped push the mobile convergence to consumers in an easy and pragmatic way. They created a device that was so simple to use, and so connected with everything, that everyone jumped on board, and quickly. Smaller companies and startups rode their coat tails as the mobile marketplace was born. In a short time frame, the entire world was connected via mobile devices. Companies like Google and Samsung were quick to recognize the paradigm shift and started creating systems and services specifically designed for mobile platforms. Fast froward to modern day and the lines are gone - there is no separate platform for desktops and mobiles, all systems now see the same information in the same way. As web development technologies evolved, and modern mobile systems became more powerful than the servers of just a few years ago, what used to be a significant difference in computing power become insignificant on all counts. Premise control has become the name of the game, and the evaluation and management of environmental factors have become the lifeblood of any organization that wants to be successful in the modern business world.







Pearce, John A., and Richard B. Robinson. Strategic Management: Planning for Domestic & Global Competition. 13th ed. New York: McGraw-Hill/Irwin, 2013.

Thursday, November 13, 2014

Digital Security Discussion

Topic of discussion in my Enterprise Models class tonight: Digital Security.  Something I touched on earlier this year.

Our text postulated: "Increasingly opening up their networks and applications to customers, partners, and suppliers using an ever more diverse set of computing devices and networks, businesses can benefit from deploying the latest advances in security technologies."

My Professor said: "My thoughts on this are opposite: by opening up your network, you are inviting trouble and the more trouble you invite in, the more your data will be at risk. I understand what they are hinting at, with a cloud based network, the latest security technologies are always available, therefore, in theory, your data is more secure. Everyone needs to keep in mind though, that for every security patch developed, there are ways around them."

He went on to mention how viruses could affect the cloud as a whole and that companies and individuals moving to cloud-based platforms will become the next target for cyber attacks as the model continues to thrive.

That is all relevant; however, I have a different perspective on digital security. My counterargument is that user education is the key. I have debated this topic - security and system users - many times over the years. Like most of us in the industry, I consider information security paramount. With the multiple terabytes of data we collect on our home systems, and even more in our online interactions, keeping our data safe is really our last line of defense for privacy and security. As more companies and individuals place their corporate and personal data on cloud platforms, there is an uneasy sense of comfort for many people, including some seasoned pros. Companies like Google and Microsoft, which both have highly successful cloud models across the board, have taken responsibility for ensuring they have more than adequate digital and physical security in their data centers, which to an extent leads to the assumption that the data and applications they warehouse and host are generally safe from intrusion. Users are the key to this whole ecosystem we have created, and this is where user education becomes critical. As most seasoned techies know, in the beginning systems and system operations were highly technical in nature, and only the most highly trained or technically creative individuals could initiate and manipulate computer systems. Viruses were something you caught from kids at school or coworkers, not a daily blitz of digital infections numbering in the hundreds of millions, perpetually attacking in various forms. As systems got more complex in design but simpler in use, the user's technical ability level eventually became irrelevant. People ages 1 to 100, and even some very well trained animals, can all navigate systems and digital networks with very little effort. Our systems now do all the work for us; users simply need to provide basic instructions and gentle manipulations, instead of the hard-coded instruction sets and inventive on-the-fly programming that were the status quo in the '70s, '80s, and '90s. This idle-user mindset is the reason criminal hackers are still traversing firewalls and breaking encryption algorithms, and they are growing in numbers, as is evident from the number of new malware detections and infections quantified annually across all digital platforms and all continents. Educating users on general best practices for system use and maintenance, how to identify potential scams, how to detect spoofing and malformed websites, what to avoid when reading emails or reviewing search results, and which security software is functionally the best (whether free or paid) is more critically important today than it has ever been. The problem is that the industry has created the lazy user by essentially conveying that security is a given. Microsoft even made a concerted effort by including the Windows Firewall and Windows Defender as part of its operating system by default, so that there was some protection for users out of the box. This was in response to a large number of users, who had been infected by one or more viruses, assuming they were protected because "it's from Microsoft, it has to be safe" - which was further from the truth than they realized. As an educated user who knows how to secure systems and networks, I take it upon myself to ensure that users appreciate why they have to set passwords when logging into various systems and services. I teach them about the importance of digital security and how to be more security conscious in their everyday interactions.
I teach them how to correctly navigate Internet search results (avoiding "ads"), how to understand various security prompts and what they look like so they don't ignore them, which security solutions should be installed and how to identify them, and so on. This improved knowledge has created a culture of awareness for my users both at work and at home. I am regularly consulted by my peers on how to secure their own families' systems and how to explain it all to their children. This creates a more intelligent user, and thereby a more intelligent user community at large, making the Internet a bit more secure. All of that said, it only takes a single missing character in source code to break a program and cause havoc, or a single user inadvertently installing malware. Even the most seasoned users make these mistakes from time to time because we are all human, and as such we are fundamentally flawed - which means no security solution is 100% secure, because they are all developed and used by humans. The best you can do is make every effort to educate and secure, and hope no one targets you, because if they want to get in badly enough, they will get in and you won't be able to stop them.

~Geek

Sunday, March 16, 2014

Hackers & Stuxnet: Education & Best Practices can Change Perspectives

Speaking of hackers in general, the video "Creating panic in Estonia" was well done.  It speaks to aspects of cyber security I have touched on personally with peers and users who are generally unaware of how dangerous the Internet can be, and who do not understand how they should be protecting themselves and, in turn, the rest of the user community at large.  The global dependency on the Internet as a necessary aspect of daily life can, and may, eventually lead to its demise.  It used to be seen as a tool, something that made research easier or enabled more efficient delivery of goods and services to customers.  The global Internet is far more interconnected than most people can comprehend.  We, as IT pros and field experts, find ourselves at a unique crossroads when it comes to the cyber realm.  On one hand, we are users who find entertainment and conduct business transactions through the Internet.  We keep in touch with friends, relatives, and associates, pay bills, and send gifts to people, all through the Internet.  We curiously investigate other perspectives on anything we can think of, reachable with a simple search string.  On another hand, we develop and/or service information systems and are responsible for ensuring that users are not their own worst enemy, and that executive stakeholders understand why expenses are necessary to ensure seamless operations while maintaining data security and integrity through digital interactions across the company or across the Internet.  Yet another proverbial fork is that of the hacker.  Not necessarily one who breaks into systems with malicious intent - the hackers (criminals) most people hear about - but the white hat hacker who, like the man who works for Kaspersky in Russia, looks to improve the quality and safety of the Internet.  In order to beat a hacker, one must be able to think like a hacker, and have intimate, specific knowledge of software and systems, how they interconnect, and how users interact with them.  This whole-view perspective on digital communications is necessary in order to properly safeguard oneself, and also the global user community.  Education and repetitive reinforcement have been the successful combination for me in getting users at all levels to start investing in cyber security and to take a different view of what they share on the web.  People contact me regularly to clean infections from their systems and networks.  Unfortunately, most of these users are of the mindset that "it should just work, and never give me problems, regardless of what I do," which has no substantive basis in reality.  In reality, it takes the combination of dozens, if not hundreds, of software applications to make a system function the way it does today.  Since no user situation is ever the lab-tested "ideal" situation, users must be educated not only on how to use their system, but also given a basic understanding of how their use affects everyone connected to their system, and the subsequent systems the collective interacts with.  As experiences and exposure to different interactive scenarios accumulate, continued education is the key, which not only makes a better user but a better system as a whole.

The Stuxnet event could have been avoided with the right engineer designing security protocols, establishing policies, and integrating hardware solutions designed specifically to deny access to unauthorized devices on a network.  Granted, Estonia may not have had access to the technology necessary to effect such a system, but those types of systems do exist.  This is the reason companies like Symantec, McAfee, and Kaspersky have integrated a feature into their anti-everything software packages to instantly scan any removable device attached to a protected system.  Granted, Stuxnet did not yet have a known signature and thus could not be specifically scanned for, but those packages also have zero-day detection capabilities, meaning they have an algorithm designed into the software to detect virus-like patterns and flag them as suspicious - which is how Stuxnet was ultimately found, through a zero-day detection algorithm.  While such detections are highly effective, threats can only be caught on a system with this type of software installed.  Unfortunately, there are a large number of users who do not have protective software, let alone hardware solutions, installed on their systems and/or networks, which leaves them, and anyone they connect to, highly vulnerable.  Here again, education is the key - once people can be made to understand the risks involved, they will be willing to learn how to best safeguard themselves, which protects everyone else they interact with digitally.  It is like getting an immunization for a disease - if everyone gets the shot, then no one can transfer it or catch it from someone who is infected or has not been immunized.  The shot cures any carriers of the disease, prevents spreading of the disease to others, and does not allow the inoculated to become carriers again.  That is the same philosophy behind security software, and it is important for the same reasons.

~Geek

Reference: Video On Demand - http://digital.films.com/PortalPlaylists.aspx?aid=7967&xtid=50121&loid=182367

Sunday, October 7, 2012

Integrating Information System Knowledge

Integrating Information Systems Knowledge within an Organization

The first discussion question for my last class went something like this:

"What is the best approach to integrating IS knowledge domain-specific and professional core competency needs with your organization? Why is this approach better than others?"

After reading my fellow classmates' responses, and pondering my own perspective on the question, here is what I came up with as my response:

The theme I see in my classmates' responses mirrors my own opinion on this topic - the best approach is to evaluate the IS resource to determine its usefulness, or kind of knowledge, and apply the resource based on the needs and goals of the organization.  As information systems have evolved into the complex and interconnected platforms of the modern era, more domain-specific competencies have emerged, resulting in a need for more specialization from IT professionals.  My family's business has been scaled down over the past two years due to the economic downturn; however, we continue to thrive because of my unique understanding of information systems and how they can help us be more efficient with fewer knowledge workers for our type of business.  The reason we remain successful is information systems.  Having grown up watching my father and grandfather build the business to what it was over more than 60 years (dozens of workers, four locations around the globe, representing more than 100 major brand names for all aspects of construction and hospitality supply), being a part of the downsizing effort was sobering.  Before the downsize, our core competencies were spread among departments, much as at any corporation, based on the level of skill and domain-specific knowledge.  Since I am the systems engineer, it is my responsibility to connect the dissimilar and unconnected workers to each other.  Understanding the core competencies of each position, and its importance to the whole of the company, coupled with my IT experience and system prowess, I was able to create a cohesive environment of wares and solutions that not only integrated all domain-specific workers into the larger whole, but also helped refine our core competencies into what they have evolved into today: a mostly interchangeable workforce that can be effective and efficient across all knowledge bases.  As another classmate noted, it would be unconventional to have an IT tech move into the accounting role, especially within a larger organization, but in small businesses workers need to have a mix of knowledge because more is required of fewer people.  Integrating information systems is the only way to accomplish this with any level of organization and efficiency.  Without properly planned and implemented information systems, and skilled workers to use them, most companies would fail before they get going because they could not remain competitive in the market.  Staying "lean and mean" in modern business is the only way most companies are still around after the harsh economic downturn, and the low cost of technology has enabled them to stay open, as well as paved the way for start-ups to reach a profitable state faster than ever.  The trick is to plan, plan, and plan.  Going into an information system migration/integration takes careful planning, testing, redesigning, retesting, and finally implementation and maintenance.  Trying to integrate the same without a plan almost always results in catastrophic failure.

...damn I love this shit...

Do you have any thoughts on the subject?  Or any questions on my contribution?  Post a comment, let's discuss.


~Geek

Tuesday, July 31, 2012

I have decided to post some of my papers from class in hopes that my knowledge can be passed on to those of you looking to learn more about the world I am a part of.  This...is my passion ~Geek



Design Patterns
Abstract
            The purpose of this paper is to define, compare and contrast three software architecture design patterns, considering how or when they could be implemented, and why one method might be favored over another.  The Adapter, Factory Method, and Façade will be reviewed.  Before discussing these design patterns, structural patterns and creational patterns must be defined for clarity.

Structural Patterns
            Structural patterns define the makeup of objects and manage access control to subsystems of objects.  In contrast to architectural patterns, which define an entire solution or subsystem, there are usually a number of structural design patterns in a single framework (Hofstader, 2006).

Creational Patterns
            Creational patterns are concerned with the instantiation of objects.  The creational patterns focus on the composition of complex objects and the encapsulation of creational behavior (Hofstader, 2006).

Adapter
            The Adapter, a structural design pattern, converts the interface of a class into another interface clients expect ("Adapter Design Pattern In C# And Vb .net", 2012).  Functionally, the Adapter acts as the bridge that lets classes with incompatible interfaces work together.  The four participants in this pattern are the Target, which defines the domain-specific interface that the Client uses; the Adapter, which adapts the Adaptee's interface to the Target interface; the Adaptee, which defines an existing interface that needs adapting; and the Client, which collaborates with objects conforming to the Target interface.  The brilliance behind the Adapter pattern is that it can take incompatible classes from different libraries or frameworks and act as a translator between the two, morphing the data so it can be manipulated by the Client.  The Adapter design pattern could be an ideal solution when connecting legacy databases to newly created objects, such as in the aerospace industry where databases may be decades old and the data has to be accessed through a modern workstation.  The Adapter acts as the translator between the two frameworks so the Client can interact with the data and manipulate it as necessary.
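
Since the sources above present the pattern in C# and VB .NET, here is a minimal Python sketch of the four participants just described; the legacy-database scenario and class names are illustrative only:

```python
# Minimal Adapter sketch: the Client talks to the Target interface while the
# Adapter translates calls to a legacy Adaptee with an incompatible interface.
from abc import ABC, abstractmethod

class Target(ABC):
    """Target: domain-specific interface the Client expects."""
    @abstractmethod
    def fetch_record(self, record_id: int) -> dict: ...

class LegacyDatabase:
    """Adaptee: existing interface that needs adapting."""
    def query(self, raw_key: str) -> str:
        return f"ROW|{raw_key}|1978-06-01"   # legacy pipe-delimited format

class LegacyDatabaseAdapter(Target):
    """Adapter: converts the Adaptee's interface into the Target interface."""
    def __init__(self, adaptee: LegacyDatabase):
        self._adaptee = adaptee

    def fetch_record(self, record_id: int) -> dict:
        raw = self._adaptee.query(f"REC{record_id:06d}")
        _, key, created = raw.split("|")
        return {"key": key, "created": created}

def client(store: Target) -> None:
    """Client: works with any object conforming to Target."""
    print(store.fetch_record(42))

client(LegacyDatabaseAdapter(LegacyDatabase()))
```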

Factory Method
            The Factory Method, a creational design pattern, defines an interface for creating an object, but lets subclasses decide which class to instantiate.  The Factory Method lets a class defer instantiation to subclasses ("Factory Design Method Pattern In C# And Vb .net", 2012).  This pattern attempts to standardize the architectural model for a range of applications by creating a superclass and delegating details to Client-supplied subclasses, as is the case when assigning values to variables by way of user prompts ("Factory Method Design Pattern", n.d.).  According to "Factory Design Method Pattern In C# And Vb .net" (2012), there are four classes and/or objects participating in this pattern: the Product defines the interface of objects the factory method creates; the ConcreteProduct implements the Product interface; the Creator declares the factory method, which returns an object of type Product; and the ConcreteCreator overrides the factory method to return an instance of a ConcreteProduct.  The Factory Method can aid in design customization with minimal increase in complexity to the overall product.  Where other design patterns require new classes, the Factory Method only requires a new operation to introduce a new subclass into the superclass.  The Factory Method design pattern could be used to offer flexibility in creating documents of different classes that share the same basic properties as all documents.  The default template offered in word processing programs is an example of this superclass ideal, where elements such as margins and fonts are predefined and the user can further classify the document into a report or letter based on the values input into the subclass interfaces.
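
Again as a Python sketch of the participants described above, using the word-processor template example; the class names are illustrative, not drawn from any real product:

```python
# Minimal Factory Method sketch mirroring the document-template example:
# the Creator defines the workflow, subclasses decide which Product to build.
from abc import ABC, abstractmethod

class Document(ABC):                        # Product
    margins_cm = 2.5                        # properties shared by all documents
    font = "Times New Roman"
    @abstractmethod
    def body(self) -> str: ...

class Report(Document):                     # ConcreteProduct
    def body(self) -> str:
        return "Executive summary..."

class Letter(Document):                     # ConcreteProduct
    def body(self) -> str:
        return "Dear Sir or Madam..."

class DocumentCreator(ABC):                 # Creator
    @abstractmethod
    def create_document(self) -> Document:  # the factory method
        ...
    def render(self) -> str:
        doc = self.create_document()        # instantiation deferred to subclass
        return f"[{doc.font}, {doc.margins_cm}cm margins] {doc.body()}"

class ReportCreator(DocumentCreator):       # ConcreteCreator
    def create_document(self) -> Document:
        return Report()

class LetterCreator(DocumentCreator):       # ConcreteCreator
    def create_document(self) -> Document:
        return Letter()

print(ReportCreator().render())
print(LetterCreator().render())
```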

Façade
The Façade, another structural design pattern, provides a unified interface to a set of interfaces in a subsystem.  The Façade defines a higher-level interface that makes the subsystem easier to use ("Facade Design Pattern In C# And Vb .net", 2012).  This pattern attempts to provide a simple and uniform interface to a large subsystem of classes.  According to "Facade Design Pattern In C# And Vb .net" (2012), there are two participants in this pattern: the Façade, which knows which subsystem classes are responsible for a request and delegates client requests to the appropriate subsystem objects, and the Subsystem classes, which implement subsystem functionality, handle work assigned by the Façade object, and have no knowledge of the Façade and keep no reference to it.  An interface designed to access random segments of a complex subsystem, such as a relational database, could be created using the Façade pattern to bring together all the details that need to be viewed in a single interface - for example, at an automobile manufacturer.  The factory floor must bring together thousands of dissimilar components to construct a vehicle, and developing a unified interface for all workers to access can make managing the production process, as well as the quality assurance process, easier because each subsystem (factory worker or robot in this case) can see which components have already been assembled and determine which subsystem is next in the manufacturing process.  Accessing subsystems through a remote server could also lend itself to the Façade pattern, since there are a number of subsystems being utilized in the background that the user is unaware of.
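
And a minimal Python sketch of the Façade participants, using the assembly-line example from this section; all class names are hypothetical:

```python
# Minimal Facade sketch based on the assembly-line example: one simple
# interface in front of several subsystem classes the caller never sees.
class ChassisStation:                 # Subsystem class
    def mount(self) -> str: return "chassis mounted"

class EngineStation:                  # Subsystem class
    def install(self) -> str: return "engine installed"

class QualityCheck:                   # Subsystem class
    def inspect(self) -> str: return "inspection passed"

class AssemblyLineFacade:
    """Facade: knows which subsystem handles each request and delegates."""
    def __init__(self):
        self._chassis = ChassisStation()
        self._engine = EngineStation()
        self._qa = QualityCheck()

    def build_vehicle(self) -> list[str]:
        # The caller asks for one high-level operation; the facade sequences
        # the subsystem calls and hides their details.
        return [self._chassis.mount(), self._engine.install(), self._qa.inspect()]

for step in AssemblyLineFacade().build_vehicle():
    print(step)
```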

Conclusion
            Design patterns exist to simplify the architecture process and assist developers in creating architectures that are easy to maintain or adapt.  The three patterns discussed here are each useful in their own way, and could all be used within the scope of a larger, more complex solution to achieve a cohesive system that can dynamically respond to users’ needs while minimizing administrative overhead and system complexity.  To say that one design pattern is preferred over another is misleading, because any of these patterns could arguably be used in most situations, or they could all be used together in certain situations, and if implemented intelligently they could result in a stable system architecture.


 References
Adapter Design Pattern in C# and VB .NET. (2012). Retrieved from http://www.dofactory.com/Patterns/PatternAdapter.aspx#inten
Facade Design Pattern in C# and VB .NET. (2012). Retrieved from http://www.dofactory.com/Patterns/PatternFacade.aspx#intent
Factory Design Method Pattern in C# and VB .NET. (2012). Retrieved from http://www.dofactory.com/Patterns/PatternFactory.aspx
Factory Method Design Pattern. (n.d.). Retrieved from http://sourcemaking.com/design_patterns/factory_method
Hofstader, J. (2006). Using Patterns to Define a Software Solution. Retrieved from http://msdn.microsoft.com/en-us/library/bb190165.aspx#sysptrn_topic2
instantiation. (n.d.). The Free On-line Dictionary of Computing. Retrieved May 28, 2012, from Dictionary.com website: http://dictionary.reference.com/browse/instantiation

...more to come...