Showing posts with label Microsoft. Show all posts

Thursday, November 13, 2014

Digital Security Discussion

Topic of discussion in my Enterprise Models class tonight: Digital Security.  Something I touched on earlier this year.

Our text postulated: "Increasingly opening up their networks and applications to customers, partners, and suppliers using an ever more diverse set of computing devices and networks, businesses can benefit from deploying the latest advances in security technologies."

My Professor said: "My thoughts on this are opposite: by opening up your network, you are inviting trouble and the more trouble you invite in, the more your data will be at risk. I understand what they are hinting at, with a cloud based network, the latest security technologies are always available, therefore, in theory, your data is more secure. Everyone needs to keep in mind though, that for every security patch developed, there are ways around them."

He went on to mention how viruses could affect the cloud as a whole and that companies and individuals moving to cloud-based platforms will become the next target for cyber attacks as the model continues to thrive.

Which is all relevant; however, I have a different perspective on digital security. My counter-argument is that user education is the key. I have debated this topic, security and system users, many times over the years. Like most of us in the industry, I consider information security paramount. With the multiple terabytes of data we collect on our home systems, and even more in our online interactions, keeping our data safe is really our last line of defense in privacy and security. As more companies and individuals entrust their corporate and personal data to cloud platforms, there is an uneasy sense of comfort for many people, including some seasoned pros. Companies like Google and Microsoft, which both have highly successful cloud models across the board, have taken responsibility for ensuring more than adequate digital and physical security in their data centers, which to an extent leads to the assumption that the data and applications they warehouse and host are generally safe from intrusion.

Users are the key to this whole ecosystem we have created, and this is where user education becomes critical. As most seasoned techies know, in the beginning systems and system operations were highly technical in nature, and only the most highly trained or technically creative individuals could operate and manipulate computer systems. Viruses were something you caught from kids at school or coworkers, not a daily blitz of digital infections numbering in the hundreds of millions, perpetually attacking in various forms. As systems got more complex in design but simpler in use, the user's technical ability level eventually became irrelevant. People from ages 1 to 100, and even some very well trained animals, can now navigate systems and digital networks with very little effort.
Our systems now do all the work for us; users simply provide basic instructions and gentle manipulations, instead of the hand-coded instruction sets and inventive on-the-fly programming that were the status quo in the 70's, 80's, and 90's. This idle-user mindset is part of the reason criminal hackers are still traversing firewalls and breaking encryption algorithms, and their numbers are growing, as is evident from the count of new malware detections and infections tallied annually across all digital platforms and all continents. Educating users on general best practices for system use and maintenance, how to identify potential scams, how to detect spoofed and malformed websites, what to avoid when reading emails or reviewing search results, and which security software (free or paid) is functionally the best is more important today than it has ever been.

The problem is that the industry has created the lazy user by essentially conveying that security is a given. Microsoft even made a concerted effort by including the Windows Firewall and Windows Defender in its operating system by default, so there was some protection for users out of the box. This was in response to the large number of users, who had been infected by one or more viruses, who assumed they were protected because "it's from Microsoft, it has to be safe," which was far from the truth. As an educated user who knows how to secure systems and networks, I take it upon myself to ensure that users understand they have to set strong passwords when logging into various systems and services. I teach the importance of digital security and how to be more security conscious in everyday interactions.
I teach them how to correctly navigate Internet search results (avoiding "ads"), how to recognize and understand various security prompts so they don't ignore them, what security solutions should be installed and how to identify them, and so on. This improved knowledge has created a culture of awareness for my users both at work and at home. I am regularly consulted by my peers on how to secure their own families' systems and how to explain it to their children. This creates a more intelligent user and, in turn, a more intelligent user community at large, making the Internet a bit more secure. All of that said, it only takes a single missing character in source code to break a program and cause havoc, or a user inadvertently installing malware. Even the most seasoned users make these mistakes from time to time because we are all human, and as such we are fundamentally flawed; no security solution is 100% secure because they are all developed and used by humans. The best you can do is make every effort to educate and secure, and hope no one targets you, because if they want in badly enough, they will get in and you won't be able to stop them.
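One of the spoof-detection habits described above can even be sketched in code. This is a hypothetical illustration only (the `TRUSTED` list and `looks_suspicious` helper are mine, not any real product's API): the same mental check I teach users, namely "is this host really the domain it claims to be, or just a lookalike?"

```python
from urllib.parse import urlparse

# Hypothetical allow-list for illustration; a real deployment would be far larger.
TRUSTED = {"microsoft.com", "google.com"}

def looks_suspicious(url):
    """Flag a URL whose host is neither a trusted domain nor a true subdomain of one.

    Catches lookalikes such as 'rnicrosoft.com' (rn imitating m) and
    'microsoft.com.evil.example' (trusted name buried in an attacker's domain).
    """
    host = urlparse(url).hostname or ""
    for good in TRUSTED:
        if host == good or host.endswith("." + good):
            return False  # exact match or genuine subdomain
    return True
```

A heuristic like this is deliberately crude; its value is in making the reasoning explicit, which is exactly what user education does for people.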

~Geek

Sunday, October 21, 2012

What is an enterprise system, and how can its design support testing processes?




An enterprise system is a collection of separate but related software modules integrated with a single cohesive database, through multiple interfaces, to achieve the business purpose of an organization across multiple departments, in an effort to consolidate separate legacy systems and improve overall efficiency. Enterprise systems are complex by nature, but with the right planning and proper execution, success can be reached and the benefits to an organization can be immense.

The enterprise system paradigm supports testing processes by offering a wide range of potential test cases for every aspect of an organization, effectively enabling developers and systems engineers to improve the system as a whole for the entire enterprise in specific ways. The concept behind enterprise systems is integration through modularity, empowering organizations to perform any function required and change the system on demand and/or based on local need. For example, an accounting module that is functionally sound for US locations will calculate salaries differently than what is required for a European location: currencies are different, taxes are different, pay scales are different, and so on. Modified, localized versions of the accounting module let the organization deploy per-locale variants while still integrating data into the central database. This allows executives from any locale to gain insight into labor trends and costs across the enterprise and make more intelligent decisions about the direction of the company on a global scale.

Test cases can be created to compare modules and sub-modules to see which are transferable to other locations of the organization, and then run in tandem to determine functionality. Since enterprise systems are sold by the module, and then allow for some customization on the customer's part, the accounting module in general should be transferable to any locale with some minor modifications for local laws and practices, which saves the company money overall. It is far easier and less expensive to modify an existing module to allow for proper payroll calculations based on local laws, for example, than it is to have the developer write a completely new module for each location that requires it and then figure out how to integrate that data without adding too much to the already complex central database.
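The localized-module idea above can be sketched as a shared interface with per-locale implementations that all feed the same central record shape. This is a minimal, hypothetical sketch, assuming a flat illustrative tax rate and fixed currency conversion (real payroll modules use full tax tables and live rates); the class and function names are mine.

```python
# One payroll contract, many locale-specific implementations,
# all reporting into the same central record format.
class PayrollModule:
    def net_pay(self, gross):
        raise NotImplementedError

class USPayroll(PayrollModule):
    TAX_RATE = 0.22  # illustrative flat rate, not a real tax table
    def net_pay(self, gross):
        return round(gross * (1 - self.TAX_RATE), 2)

class EUPayroll(PayrollModule):
    TAX_RATE = 0.35      # illustrative
    EUR_PER_USD = 0.92   # illustrative fixed conversion
    def net_pay(self, gross):
        return round(gross * self.EUR_PER_USD * (1 - self.TAX_RATE), 2)

def record(locale, module, gross):
    """Every locale writes the same row shape to the central database,
    so executives can compare labor costs across the enterprise."""
    return {"locale": locale, "gross": gross, "net": module.net_pay(gross)}
```

The point of the design is that swapping `USPayroll` for `EUPayroll` changes the local calculation without changing the integration contract, which is exactly why a localized module is cheaper to test and deploy than a from-scratch rewrite.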
This testing model applies to any aspect of an enterprise system (inventory, human resources, manufacturing, etc.), just with different data sets. Having a global infrastructure also allows administrators to tap into collective resources to evaluate and gain feedback on any proposed update or upgrade. Sometimes asking workers simple questions can eliminate the need for many costly test cases, which, when performed in excess, can actually result in project failure because the project never gains momentum and gets stuck in procedure or policy. Sometimes just listening to the users is an administrator's best test case, as long as they are willing to hear what is said and then make sure the executives buy into the concept.



Have a question? Have a comment? Comment below, let's start a dialog.






~Geek

Sunday, October 7, 2012

Integrating Information System Knowledge

Integrating Information Systems Knowledge within an Organization

The first discussion question for my last class went something like this:

"What is the best approach to integrating IS knowledge domain-specific and professional core competency needs with your organization? Why is this approach better than others?"

After reading my fellow classmates' responses, and pondering my own perspective on the question, here is what I came up with as my response:

The theme I see in my classmates' responses mirrors my own opinion on this topic: the best approach is to evaluate the IS resource to determine its usefulness, or kind of knowledge, and apply the resource based on the needs and goals of the organization.  As information systems have evolved into the complex and interconnected platforms of the modern era, more domain-specific competencies have emerged, resulting in a need for more specialization from IT professionals.

My family's business has been scaled down over the past two years due to the economic downturn; however, we continue to thrive because of my unique understanding of information systems and how they can help us be more efficient with fewer knowledge workers for our type of business.  The reason we remain successful is information systems.  Having grown up watching my father and grandfather build the business to what it was over more than 60 years (dozens of workers, four locations around the globe, representing more than 100 major brand names for all aspects of construction and hospitality supply), being a part of the downsize effort was sobering.  Before the downsize, our core competencies were spread among departments, much as at any corporation, based on level of skill and domain-specific knowledge.  Since I am the systems engineer, it is my responsibility to connect dissimilar and unconnected workers to each other.  Understanding the core competencies of each position, and its importance to the whole of the company, coupled with my IT experience and system prowess, I was able to create a cohesive environment of wares and solutions that not only integrated all domain-specific workers into the larger whole, but also helped refine our core competencies into what they are today: a mostly interchangeable workforce that can be effective and efficient across all knowledge bases.
As another classmate noted, it would be unconventional to have an IT tech move into an accounting role, especially within a larger organization, but in small businesses workers need a mix of knowledge because more is required of fewer people.  Integrating information systems is the only way to accomplish this with any level of organization and efficiency.  Without properly planned and implemented information systems, and skilled workers to use them, most companies would fail before they got going because they could not remain competitive in the market.  Staying "lean and mean" in modern business is the only way most companies are still around after the harsh economic downturn, and the low cost of technology has enabled them to stay open, as well as paved the way for start-ups to reach profitability faster than ever.  The trick is to plan, plan, and plan.  An information system migration/integration takes careful planning, testing, redesigning, retesting, and finally implementation and maintenance.  Attempting the same without a plan almost always results in catastrophic failure.

...damn I love this shit...

Do you have any thoughts on the subject?  Or any questions on my contribution?  Post a comment, let's discuss.


~Geek

Tuesday, April 24, 2012

What security issues must be resolved now which cannot wait for the next version of Windows® to arrive?

A recent discussion in my Operating Systems class prompted an interesting response on my part to what the main security issues afflicting Microsoft's systems are. Here's my 2 cents:

The most common threats to Microsoft systems, on both the consumer and business side, come through Internet Explorer and Microsoft Office vulnerabilities. Reviewing this month's Microsoft Security Bulletin, there are critical updates patching various vulnerabilities across all releases of the Windows OS for Internet Explorer versions 6 through 9 and Microsoft Office versions 2003 through 2010 SP1. The less visible critical updates, for the .NET Framework (which supports interactive sessions with users through browser windows), are related to the same Internet Explorer issue: an attacker being able to execute remote code by having a user visit a spoofed website and/or click on a link or banner containing malformed code. What I find interesting in this month's report is that the majority of the notices Microsoft put out concern the same vulnerability, namely allowing an attacker to remotely execute code through a browser session.

As many of my classmates mentioned in their posts this week, this is part of the evolution of operating system software in particular. Microsoft spends millions of dollars and thousands of man-hours developing, and hardening, its kernels. With the user base of Microsoft software reported at over 1 billion world-wide, there are only so many scenarios that can be built into software testing labs, making it impossible to correct every problem before the product is released to manufacturing. Moreover, many of these users have advanced knowledge of systems and software and can find vulnerabilities under scenarios impossible to reproduce in lab conditions. Many of them report those vulnerabilities to the development team so a patch can be created and released to the masses; others are not discovered until a virus or some piece of malware is put into the wild to exploit them.
Security companies like Symantec and McAfee use intuitive software to track these attacks and inform the developers of the issues while generating their own patch, or cure as it were, for the impending exploit. In the case of major global attacks, Microsoft works with the security companies, and government entities, to create a cohesive solution that not only cures infected systems but also protects unexposed systems from future exploitation.

With the great investment it takes to create a major operating system release, patching makes the best sense for delivering important updates to a system without disrupting the flow of user adoption and education on best practices. In some instances, such as when a system issue deals with deadlocked processes, a workaround can be implemented, such as adding a forced delay in processing time for processes competing for the same resource. This is why some Windows Updates involve changing registry values or adding a batch file to effect a process workaround while the development team evaluates whether the issue is isolated or potentially widespread. If they determine that the issue can be replicated on a majority of systems, they will create a patch to permanently change the process and resolve the condition; otherwise they will leave the workaround in place for the limited cases that do come up under unusual scenarios.
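The deadlock workaround pattern mentioned above (a forced delay instead of waiting forever on a contended resource) can be sketched generically. This is my own illustrative sketch in Python's threading terms, not how Windows implements it: acquire each lock with a timeout, release and retry on failure, so two competitors can never hold one resource while blocking indefinitely on the other.

```python
import threading

def transfer(src_lock, dst_lock, action, timeout=0.1, retries=5):
    """Run `action` while holding both locks, without risking deadlock.

    Instead of blocking forever, each acquire uses a timeout; if the second
    lock cannot be obtained, the first is released and the whole attempt is
    retried, breaking the hold-and-wait condition that causes deadlock.
    """
    for _ in range(retries):
        if src_lock.acquire(timeout=timeout):
            try:
                if dst_lock.acquire(timeout=timeout):
                    try:
                        return action()  # both resources held: safe to proceed
                    finally:
                        dst_lock.release()
            finally:
                src_lock.release()
        # Back off and retry rather than hold one lock while waiting on the other.
    raise TimeoutError("could not acquire both resources")
```

The trade-off matches the post's point: a timeout-and-retry workaround costs a little latency under contention but keeps the system moving while a permanent fix is evaluated.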

What do you think? Am I on point, or way off base?

~Geek