Intro to information security


Hello and welcome to Digital Cortex.

This new cyber security page aims to contribute to the cyber security community, as well as the IT industry as a whole, by going over security concepts and tools.

The aim is to explain industry best practices, vendor and technology specifics, as well as the wider social and ethical implications of the use and abuse of information technology.


This page is not affiliated with any vendor or organisation and stands as a collective effort of cyber security professionals to enhance the digital world.

What is cyber security?

For our first topic we will go over what cyber security is and why it is so important.

Cyber security is the art of protecting information in the digital age. It is built on three primary concepts:

  • Confidentiality
  • Integrity
  • Availability

These three pillars encompass the safe and viable operation of information technology systems, from the PC and smartphone to the most sophisticated elements of the data centre. These concepts have been widely adopted by government organisations and the IT industry as a whole. Some may consider them relatively basic, but there is actually a wide gap between what is considered best practice and what is practised in real life.

It is therefore important to be educated not only on what these concepts represent, but also on why it is so important to follow and enhance these practices.



Confidentiality

Confidentiality means that information exchanged between two points, whether those points are systems or people operating systems, is withheld from the rest of the world, so that only they are able to access that information.

This has frequently been misrepresented as the only aspect of cyber security ("if we keep it safe then job done"), which obviously is not the case; the reasons will be discussed further along. Many products and systems have been developed by multiple vendors in order to safeguard information.

That being said, the most important aspect of the equation is the end user. Adding technical controls goes a long way towards securing information, but the primary factor in information governance and leakage is the user.

The best example of this is writing down a very complex password and then sticking it on a post-it under your keyboard.

User education and industry safety practices (and why it is important to follow them) are just as important as implementing sophisticated controls. As ethical hackers / penetration testers we are always looking for the simple solutions, or what is referred to in the business as "low hanging fruit", which considerably reduce the effort required to penetrate a system or environment.

As security practitioners we try to adhere to secure practices and to educate users within an organisation to follow those same practices; these practices may be enforced by either technical or procedural controls.

The primary examples of such security practices are:

Zero trust principle

The zero trust principle means that no system or human entity is considered trusted; therefore every entity or person within an organisation needs to have validated access to an information resource.

By limiting access to resources by default, we ensure that only authorised entities may actually access resources, blocking out everybody else.

This is an often neglected principle due to the excessive administrative overhead it can potentially introduce; however, it is very important, especially in very complex or large organisations where a lot of cogs are moving at any given time.
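The default-deny idea behind zero trust can be sketched in a few lines of Python. Everything here (the grant table, the identity and resource names) is illustrative, not taken from any product:

```python
# Toy default-deny access check: a request succeeds only if there is an
# explicit grant for this exact identity/resource pair. Nothing is
# implicitly trusted.

ACCESS_GRANTS = {
    ("alice", "hr-database"): {"read"},
    ("backup-svc", "hr-database"): {"read", "snapshot"},
}

def is_allowed(identity: str, resource: str, action: str) -> bool:
    """Deny by default: an unknown identity, resource or action
    gets an empty permission set and is therefore refused."""
    return action in ACCESS_GRANTS.get((identity, resource), set())
```

Note that the failure mode is safe: a lookup miss yields an empty set, so anything not explicitly granted is refused.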

Least privilege principle

The principle of least privilege is a very basic but very important one: the concept is that no user or system is granted more access or permissions than is absolutely necessary for them to perform their tasks.

Limiting each entity to the bare minimum of access allows for more granular security control over the environment or the information being handled, which leaves less room for abuse of authority.

A typical example would be administrators working in a banking environment who have access to perform tasks on the database holding the financial information of all the bank's customers, but are not able to actually view sensitive information within the database.

Security modelling

There are many security models that apply to different situations and are used to develop an access control schema with regard to the information being secured.


What that basically means is that there are different ways to describe access to information; some are regarded as top-down and others as bottom-up. The most popularly known classification of information is the Bell-LaPadula model, made famous by movies and other pop culture: the designation "Top Secret" derives from that security model.


These security models describe the different access levels within an organisation and how they apply to the information being handled. For example, a person with "Secret" level clearance may read "Secret" files but not "Top Secret" ones, while somebody with "Top Secret" clearance may view both "Secret" and "Top Secret" files.
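The "no read up" rule just described (and its companion "no write down") can be sketched as a pair of comparisons over ordered levels. The level names follow the classic Bell-LaPadula examples; the function names are illustrative:

```python
# Toy Bell-LaPadula check. Levels are totally ordered; a subject may
# read an object only if its clearance dominates the object's label.
LEVELS = {"Unclassified": 0, "Confidential": 1, "Secret": 2, "Top Secret": 3}

def may_read(subject_clearance: str, object_label: str) -> bool:
    # "No read up": you cannot read above your clearance.
    return LEVELS[subject_clearance] >= LEVELS[object_label]

def may_write(subject_clearance: str, object_label: str) -> bool:
    # "No write down": prevents leaking high data into low documents.
    return LEVELS[subject_clearance] <= LEVELS[object_label]
```

So a "Secret" subject may read "Secret" but not "Top Secret", exactly as in the prose above.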

Different security models have different focuses, aimed at more than confidentiality; other models, like the Biba model, are more focused on the integrity portion of the CIA triad. It is therefore very important to understand each security model and where it applies.

In modern security systems, role-based access is very important: administrators and operators are usually given access based on the certain set of tasks that they are allowed to perform, which are bundled into what is referred to as a role.

Different users have different kinds of access depending on their roles within the organisation, and therefore the actual business case dictates the level of access to each and every system.

For example, a database administrator can perform privileged tasks on the database but not on the operating system underneath or the network that hosts the database; respectively, a network engineer can perform administrative tasks on the network but is not allowed to touch the database.
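The role-bundling idea can be sketched as a mapping from roles to permission sets; a user holds permissions only through their roles. Role and permission names here are made up for illustration:

```python
# Toy role-based access control: permissions are bundled into roles,
# and users acquire permissions only via role membership.
ROLES = {
    "db-admin":  {"db:backup", "db:tune", "db:create-user"},
    "net-admin": {"net:configure", "net:monitor"},
}
USER_ROLES = {"dana": ["db-admin"], "nikos": ["net-admin"]}

def permitted(user: str, permission: str) -> bool:
    """A user is permitted an action only if one of their roles
    bundles that permission -- mirroring the DBA/network-engineer
    separation described in the text."""
    return any(permission in ROLES[r] for r in USER_ROLES.get(user, []))
```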


Encryption

Encryption, derived from the ancient Greek word "kryptos" which means "hidden", is the cornerstone of confidentiality in information security. It dates back to the ancient Greeks in its earliest forms, such as steganography, and cryptography is a whole field of science which has demonstrably changed the course of history.

A short description of encryption is the ongoing battle between the people or systems that encrypt information, the cryptographers, and those who try to break the encryption and decrypt the information, the cryptanalysts.

There are numerous examples where encryption has determined the rise and fall of a nation. A popular, recently appreciated cryptanalysis feat is the cracking of the Enigma machine by British codebreakers during World War II. There is a wonderful book that analyses the impact of encryption throughout history, for cryptography enthusiasts and for anyone interested in history overall, which can be found in the reference section.

In the information age it is therefore very important that information exchanged across the web is encrypted. The most popular form of encryption running on most web pages is SSL/TLS, which can easily be recognised by the padlock in your browser's address bar, hopefully green, indicating that the session is encrypted.


Especially when transporting information over the internet, many ciphers and algorithms are used to encrypt information. The primary cipher types are:

Symmetric algorithms

Symmetric algorithms use the same key to encrypt and decrypt the information. These are the first ciphers ever developed, and they still find abundant use because their calculations are far faster than those of asymmetric ciphers.

If, however, the security key is compromised, then all communication can be read in the clear; moreover, the offender can impersonate the legitimate owner of the key.
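The single-shared-key idea can be illustrated with a deliberately toy XOR cipher: the same function, with the same key, both encrypts and decrypts. This is NOT secure and is only a sketch; real systems use vetted ciphers such as AES:

```python
import hashlib
from itertools import cycle

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR the data with a keystream derived
    from the key. Applying it twice with the same key restores the
    plaintext -- that symmetry is the defining property. Insecure
    (the short keystream repeats); for illustration only."""
    keystream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(keystream)))
```

This also demonstrates the weakness described above: anyone holding the key can both read the traffic and forge messages as either party.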

Asymmetric algorithms

Asymmetric algorithms use a different key on each side to encrypt and decrypt the information. This ensures that if one side's key is compromised, that side cannot impersonate the other or access the information without a verified key exchange.

In modern security systems, continuous rotation of the encryption key renders a compromised key useless over time, therefore minimising the security risk of compromised keys.

The downside of this form of encryption is the severe performance overhead on machines of constantly renewing and maintaining the keys being exchanged, so it is not advised for certain use cases.
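The two-key idea can be illustrated with textbook RSA using toy-sized primes (the classic 61/53 schoolbook example). Real RSA uses 2048-bit or larger moduli, padding schemes and a vetted library; this is a sketch of the mathematics only:

```python
# Textbook RSA with toy primes: the public key (e, n) encrypts,
# the private key (d, n) decrypts. Knowing only (e, n) is not
# enough to decrypt -- that asymmetry is the whole point.
p, q = 61, 53
n = p * q                 # modulus, 3233
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime with phi
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

def encrypt(m: int) -> int:
    return pow(m, e, n)   # anyone with the public key can do this

def decrypt(c: int) -> int:
    return pow(c, d, n)   # only the private-key holder can do this
```

Modular exponentiation is also why asymmetric crypto carries the performance overhead mentioned above: it is far more expensive than a symmetric cipher round, which is why real protocols use it mainly to exchange a symmetric session key.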

The most prevalent encryption algorithm at this point in time is AES-256, a symmetric algorithm widely used across web pages, SSL/TLS and IPsec VPN implementations, as well as for encrypting information on local machines and hard drives across different platforms.


Passwords

Passwords are the oldest form of information security, developed early on, and they effectively employ cryptography to safeguard information both locally and over a network.

Still widely used today, they remain the dominant form of securing information. There are many ways to bypass password security on files and services, by either discovering or inferring the password, or simply tricking the user into providing it.

There are various ways in which that can be done, technically or through social manipulation (widely known in the information security industry as social engineering). On the opposite side of that front, multiple security mechanisms have been developed to counter the various threats to password safety.

As mentioned before, this is a two-part composite: technical controls only complement the security awareness of an individual or an organisation and are no substitute for good security practices. The most common ways of securing passwords are:

Frequent rotation

By frequently changing one's password, we ensure that even if compromised, the password is no longer valid; an attacker would therefore need to repeat the whole process of obtaining it, impeding their efforts and hopefully leaving them open to interception and mitigation.

Multi-factor authentication

Multi-factor authentication introduces the next level of security by factoring in multiple inputs, as well as the time when the information is being accessed, in order to authorise access to that piece of information.

What is also very important is that each form of entity authentication is independent of the others. The most commonly used multi-factor authentication would be a password plus a token that generates seemingly random numbers, based on an algorithm that can be verified and replicated, in order to ensure that the access is legitimate at that point in time.

Any discrepancy in the time of access, the token code or the password leads to denial of access to the required resource.

Password management systems

Password management systems are the latest trend in authentication technology: they generate a very long password per connection, unique to that application or session. In this way there is no timeframe for an attacker to predict the possible passwords or perform social engineering, as even the authenticating user is oblivious to what the transmitted password is.

After the password is used, any interception that might have happened is rendered useless the next time a connection or access is attempted. This is a very robust method for managing passwords, and it is frequently used by all types of IT professionals and administrators.
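The core trick of a password manager, generating a long random secret the user never has to know, can be sketched with Python's standard library (the function name is illustrative):

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Draw each character from the OS cryptographic RNG, the way a
    password manager might: long, unique, and unknown even to the
    user who authenticates with it."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

Using `secrets` rather than `random` matters here: the former is designed for security-sensitive randomness, the latter is predictable.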


Integrity

Integrity ensures that information passed from one point to another is unaltered and intact. There are various methods and technologies with which to ensure this; the main goal is that no unauthorised tampering with information is allowed and that the chain of custody remains intact at all times.

Integrity is extremely important, especially in the field of digital forensics, where information collected is admissible as evidence in court. It is also very important for maintaining an audit trail of activities performed in various systems, ensuring that no malicious party can alter logs to cover their tracks and conceal what has been wrongfully done.


One of the best tools for ensuring integrity, used widely across many devices, whether aimed at the end user, the network or the infrastructure, is the hash. The process of hashing creates a digest of a message or file: a fixed-length alphanumeric sequence unique to that item.


Any change to the file or message results in a change of the hash. By comparing the two digest values we can determine whether they differ and therefore detect tampering with the specific item.

In cryptography, hashing validates that an encryption key has not been tampered with and that a message is authentic: since hash values can be recalculated with the same algorithm by the recipient of the message or encryption key, the recipient can verify that what arrived matches what was sent.
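The tamper-detection process described above can be sketched with Python's hashlib; the message contents are of course made up:

```python
import hashlib

def digest(data: bytes) -> str:
    """SHA-256 digest: a fixed-length value that changes completely
    if even a single byte of the input changes."""
    return hashlib.sha256(data).hexdigest()

# The sender publishes the digest alongside the message; the
# recipient recomputes it and compares.
sent = b"transfer 100 EUR to account 42"
published = digest(sent)
```

Recomputing `digest(received)` and comparing it with `published` detects any alteration in transit: a one-byte change to the message yields a completely different digest.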

In digital forensics the same calculations occur to make sure that key files within systems have not been tampered with prior to, during or after the investigation, therefore ensuring that the chain of custody is intact. United States legislation states clearly that tampered evidence is considered inadmissible in court. It is therefore evident why it is so important to maintain the integrity of information.


There are multiple technologies available to protect the integrity of files and information in general, but we will not dive deeper into that field for the time being. It is sufficient to say that it is a very important requirement, worthy of attention, as it has implications in the real world as much as in the digital one.


Availability

Availability is, as the name implies, the capability to have a resource available to us at any given time. The more important that resource is to its owner or user, the greater the need to have it always available.


For example, the frustration of many users across the internet when a social media site fails to load has reportedly resulted in calls to the local police station; imagine what would happen if the availability of a resource like the power company, the emergency services or an organ transplant service were lost.

The social and economic implications of the loss of availability can be grave, resulting in the loss of revenue or even life. Multiple technologies are invested in providing "high availability", as it is referred to in the business, and the limits of these technologies are constrained only by the budgets of the organisations putting them in place.

Redundancies and convergence

Redundancy in any type of system means having a spare system ready to assume the responsibilities of its partner once it goes down or is otherwise compromised. This form of redundancy is widely known as clustering and is implemented by various vendors in different types of devices, applications and services.


The main idea behind it is a transparent, or near transparent, failover from one device to the next, so that the end user is oblivious to what has happened behind the scenes, and the confidentiality and integrity of the information traversing the system at that point in time are preserved.

Members of a cluster can be geographically joined or separated, spanning vast distances across the globe; modern networking technology provides fast and reliable networks able to facilitate this need.


Convergence is the term used in clustering to describe the time required to seamlessly redirect traffic from one destination to another: when the original host serving the traffic, whether a server or a network device, is no longer available, the redundant host assumes that role. The time gap for this transition from host A to host B is what we call convergence.
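The failover-and-convergence idea can be sketched as follows; `fetch_with_failover`, the host names and the loose notion of "convergence time" here are illustrative, not from any vendor:

```python
import time

def fetch_with_failover(hosts, fetch):
    """Try each redundant host in order; on failure, fail over to
    the next. Returns the first successful result together with the
    elapsed time -- loosely, the convergence time the text describes."""
    start = time.monotonic()
    for host in hosts:
        try:
            return fetch(host), time.monotonic() - start
        except ConnectionError:
            continue  # converge onto the next member of the cluster
    raise ConnectionError("no host in the cluster is available")
```

A real cluster does this below the application layer so the end user never sees the switch, but the shape is the same: detect failure, redirect, measure the gap.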

Load balancing

Load balancing technologies enhance performance and provide availability. Load balancers are usually network devices that receive traffic destined for a resource and redirect it to a pool of hosts ready to serve that traffic.

Load balancers are capable of performing diagnostics on the availability and performance metrics of the hosts receiving the traffic, and of making very accurate estimations of whether those hosts are ready to perform their role.

Through advanced configuration, administrators can set criteria for how the load balancing is performed and thereby ensure that traffic is only redirected to hosts able to perform their functions adequately.

Modern load balancers can also introduce security checks when performing the load balancing and inspect the traffic destined for the end hosts, in order to protect the infrastructure from malicious attacks or resource exhaustion. This level of next-gen inspection allows granular control as well as a clear audit trail of the information exchanged between points.
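As a toy model of what a load balancer does, not any vendor's implementation, round-robin distribution over health-checked hosts might look like:

```python
from itertools import cycle

def balanced(hosts, healthy, requests):
    """Distribute a number of requests round-robin across the subset
    of hosts that currently pass a health check, mirroring the
    diagnostics-then-redirect behaviour described in the text."""
    pool = [h for h in hosts if healthy(h)]   # only hosts fit to serve
    if not pool:
        raise RuntimeError("no healthy hosts in the pool")
    rr = cycle(pool)
    return [next(rr) for _ in range(requests)]
```

Real devices add weighting, session persistence and traffic inspection on top, but the essential loop, check health then spread load, is this one.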

Security approach

I hope it is clear by now that this is a multifaceted and highly demanding science where there is always something new around the corner. The question that usually arises when discussing cyber security is: "What is required in order to secure information?

Is it the network, the application, or user control?" The answer is all of it, and then some.

The approach to security has to be holistic, covering many areas from physical security to digital and procedural controls. It is important to understand that each security concept complements the others, and very few are actually mutually exclusive.

Security industry professionals have been tasked with fulfilling a multitude of needs under one big umbrella called security. It is therefore important to understand that specialisation in the various fields of security is mandatory, while having an overview of what is required in the basic concepts is even more so.

Specialisation gives a security professional the ability to understand an area of security in depth, but without a broad overview of what is required and how each piece interacts with the others, it is a partial effort.

When it comes to cyber security, we all have a role to play, whether it is behind a keyboard, in the data centre, or at home educating others about the basics of information disclosure and why we shouldn't share personal information on the internet.


These are some basic cyber security concepts, and we have barely scratched the surface of what cyber security is. I hope you enjoyed this post. I will continue to delve into technologies, tools and concepts in order to explain this multifaceted science. I hope you find this information useful, and if there is something specific you would like me to cover, please contact me.



References

Zero trust:

Security Models:

The Code Book – Simon Singh




Tunneling Cheatsheet

All the methods below use a pivot host in order to gain access to a victim that is not directly accessible.

Windows (native)

Map a local port on victim1 in order to target victim2.


 netsh interface portproxy add v4tov4 listenaddress= listenport= connectaddress= connectport=



Create a dynamic SSH tunnel towards a remote host using a local port. Usually used with proxychains or another SOCKS proxy. For simplicity, the port that the SOCKS proxy will connect to is 9050.

ssh -ND 9050 user@victim

Chain 2 hosts together and make victim2 available via victim1 using a local port.

ssh -tt -v -L9050:localhost:8157 user@victim1 ssh -t -D 8157 user2@victim2 -p 228

My OSCP Journey

To establish my street cred and give an insight into where my perspective comes from: my background is mostly in perimeter security, where I have been working as a blue team engineer / consultant for the last 10 years, primarily with network and application firewalls of multiple vendors (Check Point, Fortinet, Cisco, Juniper, Palo Alto, Imperva, McAfee), along with multiple other security products.

After acquiring an interest in offensive security, in 2014 I got the eCPPT, or PTP certification as it is referred to now. A week after I got the eCPPT certification my sons were born. As a father of twin toddlers my time is always borrowed, so I decided to finally tackle OSCP in 2016, when I felt I had the time to commit.


In preparation for the course itself I revisited all my notes from eCPPT, all the material surrounding buffer overflows and Metasploit, as well as scripting, where I was rustiest after not doing much apart from bash scripts in my day job.

To that end I decided to buy a couple of books, just to sharpen my skills a bit and get on the front foot with the basics.

The 3 books I bought and read:

Black Hat Python – Great Python book; I developed some great tools because of this.

Web Hacking Exposed – Good book, more on the basics side, but a great start for beginners.

Gray Hat Hacking, 4th Edition – A great all-rounder; haven't gotten around to finishing it.

One fine day, when I felt ready to kick the tires on this project, I enrolled in the course with a start date of early September. I opted for the 60-day lab access, with high hopes of taking the exam before Christmas.

After a month of reading and waiting for my company to approve the cost of the course (they were kind enough to foot the bill; who am I to insult them), I finally got confirmation that my registration was complete and I would be starting on the requested date.

My heart skipped a beat with both anticipation and sheer anxiety as to how well I would cope; this course has a reputation for being intense, and I had a lot to balance my time between.

Course and Labs

I started the course on the day the email came through. After going briefly into the material, I started downloading the custom Kali image, logged in to the forum, joined the chat room and finally read the material. I felt giddy and a bit lost at the same time.

So, being a simple git, I decided to keep it simple and not overstress about the lost lab time, but start by going through the material and deal with the lab itself later. I went through all the exercises and videos within 7 days, and then it was on to the lab.

The material is very basic and serves to point you in the right direction rather than hold your hand through it. This is where I disagree with Offsec: I believe their material should be more entry-level friendly, as this is an entry-level exam. e-Learn Security was more educational, although it did take some shortcuts that Offsec didn't; yay to me for having both.

The labs… What can I say… I felt like a lost puppy the first few days, looking for the low hanging fruit and struggling to map out the lab, as it is vast. So I created a spreadsheet with the live hosts I could detect, as well as the progress I had made on them.

After spending a few hours enumerating a host, if I didn't have any obvious lead I would park it and move on. Slowly and steadily the low hanging fruit fell, and some machines hinted at their dependencies on others; it's all about becoming methodical and thorough.

It cannot be stated enough: enumerate, enumerate, enumerate! If you're stuck somewhere, you haven't researched it enough!!!

My spidey sense became attuned to which hosts had more for me to enumerate, either pre- or post-exploitation, and which seemed to be dependent on something else. A huge boon here was the forum, where people gave hints, vague at first, but after a few hours of struggling with a host you get that Eureka! moment where you understand "Oh, that's what they meant".

The admins will give you a hint if you are hopelessly stuck, but will never give you an outright solution; also, moaning about it will get you the opposite of what you want. If you generally display a decent effort you will be rewarded… by none other than yourself. This course teaches self-reliance if anything.

Research into alternative tools (e.g. fuzzing) enhanced my arsenal, and soon I was using 3-4 different tools of the same scope to attack a host, with varied results. What was beautiful about the lab, though, is that the best results were yielded by all-out hands-on work. A good number of hosts require you to exploit a misconfiguration rather than a known skr1pt k1dd13 exploit, which forces you to research and understand the intricacies of both OS and applications.

The big four (Pain, Sufferance, Gh0st, Humble) kicked my teeth in many times and took me through the emotional rollercoaster of:

  1. Let's have some fun
  2. Oh sh1t, what's this
  3. I'm Fn stupid
  4. Kneel before r00t!

The admins will not give you any hints on these; you will need to man up and take them on all by yourself.

What was surprising is that although I pretty much left those for last, I found forum posts from people who had sliced and diced the big four yet had a hard time with some machines I found relatively easy, and here cometh the lesson:

Your exposure and experience determine what is difficult. Each student faces different challenges depending on the knowledge and experience unique to them; you may be good at web exploitation, for example, but not at networks, scripting, puzzles, etc.

Offsec did an awesome job creating a monster of a lab that will test its students in a variety of ways; furthermore, many machines are vulnerable to multiple vectors, so you have a lot to play with. It's all about patience and persistence.

After extending my lab access, and a total of 80 days of gruelling, relentless battle in the labs, all hosts were checked off my list :) Woohoo!!! Time to book the exam.

The Exam

Game time! After all you go through in the labs you don't feel ready, and you shouldn't: complacency is not something you can afford.

I spent my last 10 days in the labs nitpicking the hosts I had selected for my lab report, and found a few holes in my notes (it's hard to think straight at 4 in the morning) where I couldn't remember what exactly I had done. Good notes are uber valuable for the exam itself (as my eCPPT experience also taught me), so I kicked myself in the backside to keep better ones.

I also created a monster cheat sheet of all the topics, one I could quickly reference at the drop of a hat during the exam; I polished my scripts, and then waited.

At 09:00 on the Monday I had selected for the exam the email arrived, or rather didn't in my case, thanks to our new anti-spam (note to self: drop lots of spit in my mail admins' coffee). Luckily I had registered my other email with the admins, and after a quick chat with support, at 09:15 I was off to exam land.

The first few hours were about keeping calm, reading through the exam objectives and following the overall enumeration process I had developed and documented for myself. I selected the "easy" hosts of the exam to get the ball rolling, while in the background running iterations of tools to capture all the information I could obtain from the automated and semi-automated tooling.

Three hours later the first host fell and I was midway into another. That was the end of the fast track, though, as I stumbled on my first big hurdle; in short, the next 10 hours were spent on one host. After that it felt downhill, although I did not let myself get cocky. I spent another 8 hours going after the rest of the hosts; in the end I had only one host left, on which I managed to get only a low-priv shell.

In between I ate, cried, laughed, did some push-ups, and took a few rides on the previously mentioned rollercoaster. Finally, at about 05:00 in the morning, I was done; I had done as much damage in the lab as I could and had collected as many screenshots, notes and pieces of evidence as I could.

After a 3-hour nap it was on to the report. I spent about 6 hours writing it; in the words of Armando Romeo, founder of e-Learn Security, writing the report is like telling a story you already know. This is where the detailed documentation came in handy.

I spent another 4 hours revising it and going through all the stupid mistakes one makes when one is more than 24 hours into it, and with a slight sense of dread I uploaded my reports (lab and exam). 8 hours later I got a simple email confirming my submission.

I chose to report on 15 machines in the lab, and that ran me 220 pages of fully documented material (with screenshots and code); the exam report was an even 64-page document. Some people tend to be more brief; I chose to go fully documented.

The worst was ahead: the wait… I was climbing the walls for 3 days, until one fine Thursday evening I got confirmation that I had passed… A huge weight lifted off me, and I could enjoy Christmas with the kids and be jolly.

Lessons learned

  1. Never give up, ever! If you can’t find a solution you haven’t looked hard enough.
  2. Document your notes in a clean way. Pass on what you learned (without spoilers).
  3. An hour of careful enumeration saves two days of… frustration.
  4. Nothing is impossible except turning back time, so manage it carefully.
  5. Don’t be intimidated but also don’t get cocky.
  6. I never realised how much I wanted this until it was 03:00 and I hadn’t peed for 6 hours.
  7. Buy tons of flowers /gifts for the wife as she will need to back you up big time!
  8. Have fun !!! Even if you fail that’s ok, a setback = a setup for a comeback.