Having a generalist knowledge base of the IT landscape, including its systems and concepts, is integral to being a successful Cyber Security expert; it forms the backbone of every analyst, consultant or CISO. Regardless of where you are in life when you decide to enter the Cyber Security industry, it goes without saying that fluency in computer operations, applications, infrastructure and other forms of digital literacy is at the forefront.
This page in no way aims to be an IT Fundamentals 101 crash course; instead, it provides direction for those who may be looking to understand what baseline of knowledge is required to start a career in Cyber Security. Concepts will be briefly discussed, and you're certainly urged to dig deeper and conduct more research into those topics with which you aren't too familiar.
There is a well known psychological pattern called 'Imposter Syndrome' in which an individual doubts their skills, talents or accomplishments and has a persistent internalized fear of being exposed as a "fraud". Imposter Syndrome is prevalent within the Cyber Security industry due to the inherently vast range of systems, concepts and ideologies that our knowledge coverage is spread across. It's important to remember that it's near impossible to be a perfect expert in every subject or field, so try your best and strive to learn more!
Generally speaking, we'll focus on modern computer technologies, as I highly doubt there will be a need to investigate malware on a Turing machine. Windows, Mac OS and Unix are the likely subjects you'll see these days, and each has a prolonged history with which I certainly won't bore you; however, if you have the time and interest, check out these links! History of Windows, History of Mac OS, History of Unix.
The concept of networking computers and technologies together was first developed back in the 1960s. It was primarily designed for use by the US Military under the name ARPANET (Advanced Research Projects Agency Network) to share all the secret military things. After its obvious success, and with visions of extended functionality, RFCs (Requests for Comments) started to flutter around, which in turn prompted many smart engineers to begin strategizing solutions to problems and developing enhancements to computer networking. The Internet came to be during a late night session at UCLA on October 29, 1969, when the first ARPANET message was sent, and it grew exponentially from that point onwards, with technologies such as WiFi, IPv6 and WPA3 being developed.
The operability of a system comes from the underlying program that controls and manages the hardware and other installed software: the Operating System. All computers, mobile devices, TVs and home appliances (the list goes on and on) require an operating system. To simplify things for the basis of this blog and the material herein, we will focus on the major players: Microsoft Windows (server and workstation), Linux and MacOS. Of course, there are several others that you may come across, particularly IoT-specific builds, however this is somewhat of a niche area and one that I am not too familiar with myself.
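As a quick illustration (a minimal sketch of my own, not tied to any particular tool), Python's standard platform module can report which of these operating systems a script is running on, which is often the first thing a triage or automation script needs to know:

```python
# Minimal sketch: identify the operating system a script is running on
# using Python's standard "platform" module.
import platform

print("Operating system :", platform.system())   # e.g. 'Windows', 'Linux', 'Darwin' (MacOS)
print("Release          :", platform.release())  # e.g. '10', '5.15.0-91-generic'
print("Architecture     :", platform.machine())  # e.g. 'AMD64', 'x86_64', 'arm64'
```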
Each OS differs in how it manages files, permissions and the processing of data. Therefore you will notice not-so-subtle differences in the tools available and the formatting of system information. A majority of organizations will run Microsoft Windows as the primary operating system for both servers and end user workstations alike. Linux machines are commonly found hosting specific services such as databases or web services. Macs are for hipsters and are likely found in coffee shops.
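To make those differences concrete, here's a hedged sketch (the file name is just a placeholder) showing that even a simple question like "what are this file's permissions?" is answered with different native tooling depending on the OS:

```python
# Sketch only: the same "show me this file's permissions" question needs
# different native tooling per OS. "example.txt" is a hypothetical file.
import platform
import subprocess

target = "example.txt"

if platform.system() == "Windows":
    # Windows exposes permissions as ACLs; icacls prints them per user/group.
    subprocess.run(["icacls", target])
else:
    # Linux and MacOS use Unix mode bits (rwx for owner/group/other).
    subprocess.run(["ls", "-l", target])
```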