

Introduction to Cloudflare: Empowering Secure and Efficient Internet Experiences

Cloudflare has emerged as a critical player in enhancing the performance, reliability, and security of the internet.

Its suite of tools, including web application firewalls (WAFs), content delivery networks (CDNs), DDoS mitigation, and Zero Trust services, addresses the diverse needs of businesses in a digitally transforming world.

The company’s emphasis on automation and scalability makes it a go-to solution for both small enterprises and global corporations.


The Importance of Learning and Practice in Cloudflare’s Ecosystem

Understanding the capabilities of Cloudflare is essential for

AWS IAM Policy Simulator

How to Validate AWS IAM Policies with the AWS Policy Simulator: A Deep Dive

Introduction

Brief overview of the AWS Policy Simulator

An underutilised yet powerful tool that lets you verify the effect of IAM policies before they are deployed.

Highlight the importance of policy validation to ensure the principle of least privilege, especially in environments with strict compliance requirements.
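As a sketch of what that validation can look like in practice, the simulator is also scriptable through the AWS CLI's `simulate-custom-policy`. The bucket name, actions, and file path below are hypothetical, and the live call needs AWS credentials, so it is shown commented out:

```shell
# Write a candidate policy to a file (hypothetical bucket name)
cat > /tmp/candidate-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
EOF

# Simulate the policy before deploying it (requires credentials, hence commented out):
# aws iam simulate-custom-policy \
#   --policy-input-list file:///tmp/candidate-policy.json \
#   --action-names s3:GetObject s3:PutObject \
#   --resource-arns arn:aws:s3:::example-bucket/report.csv

cat /tmp/candidate-policy.json
```

With a policy like this, one would expect the simulation to report `s3:GetObject` as allowed and `s3:PutObject` as implicitly denied, which is exactly the least-privilege check the post describes.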

Hashcat on AWS NVIDIA GPU instances and password cracking, including performance benchmarks

Cracking Passwords with Hashcat - Performance Benchmarks and Security Implications

Important

DISCLAIMER - this is for educational purposes only! You are responsible for your own actions.

Alert

GPU instances can be expensive! Make sure you know, and can afford, the cost of the instances you are selecting.


Introduction

Hashcat is a widely-used, high-performance tool for cracking password hashes.

Its versatility across various platforms, including local machines and cloud instances, highlights how quickly seemingly complex passwords can be broken.

This post delves into the practical applications of Hashcat, explores password security risks, and presents benchmark comparisons between Apple's M1 chip and AWS GPU instances.
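To make the workflow concrete, here is a minimal sketch of how a hash target is prepared for Hashcat. The password and wordlist name are illustrative only; the Hashcat invocations are shown as comments because they require Hashcat (and ideally a GPU) to run:

```python
import hashlib

# Example target: the MD5 hash of a weak password (Hashcat hash mode 0)
target = hashlib.md5(b"password123").hexdigest()
print(target)  # 482c811da5d5b4bc6d497ffa98491e38

# A typical dictionary attack against this hash, once saved to hash.txt:
#   hashcat -m 0 -a 0 hash.txt rockyou.txt
#
# And a benchmark run, useful for comparing an M1 against AWS GPU instances:
#   hashcat -b -m 0
```

Because `password123` appears in every common wordlist, a dictionary attack like the one above cracks it almost instantly, which is the point the benchmarks in this post illustrate.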

CrowdStrike - Leading Cybersecurity through Major Incidents

CrowdStrike

CrowdStrike is a leading cybersecurity company specializing in endpoint protection, threat intelligence, and incident response. Their flagship product, CrowdStrike Falcon, is a cloud-native platform offering comprehensive security solutions.


Content below

The content below is the result of an interaction between Antonio Feijao UK and ChatGPT. Reminder: the content on this website reflects my own opinions. Use it at your own risk.


Major Events and Contributions

The Power of Knowledge Sharing and Education

Image generated by ChatGPT's image generator. Prompt:

Create an image that symbolizes the power of knowledge sharing and education. The scene should feature a diverse group of people exchanging books, ideas, and technology in a vibrant, modern setting. In the background, there could be elements representing historical knowledge, such as ancient scrolls and the Library of Alexandria, merging seamlessly with modern educational tools like computers and tablets. The atmosphere should be bright and inspiring, with visual metaphors for growth and enlightenment, such as light bulbs, open books with glowing pages, and interconnected networks symbolizing the spread of knowledge.

About this post

This post is the result of an interaction between Antonio Feijao UK and ChatGPT.

timeline of major historical events

https://www.antoniofeijao.com/timeline/ - an example timeline of major historical events that, had they been prevented, could have reshaped history as we know it.


Preserving the Library of Alexandria

Historical Context

The Library of Alexandria, established in

Basic web scraping using Python3 with BeautifulSoup4, then converting to Markdown

pip install requests
pip install beautifulsoup4
pip install markdownify

import requests
from bs4 import BeautifulSoup
import markdownify

def beautifulsoup_web_scrape_url(url):
  response = requests.get(url)
  soup = BeautifulSoup(response.content, 'html.parser')
  return str(soup)

url = "https://www.antoniofeijao.com/"

data = beautifulsoup_web_scrape_url(url)

print(data)


# convert the HTML to Markdown
h = markdownify.markdownify(data, heading_style="ATX")

print(h)


# write the Markdown to a file ("w" mode overwrites any existing content)
f = open("result.txt", "w")
f.write(h)
f.close()

# open and read the file back after writing
f = open("result.txt", "r")
print(f.read())



Happy learning

by Antonio Feijao UK

List all AWS VPCs or subnets with their tags, using jq

Example AWS CLI commands with output formatted using [jq](https://jqlang.github.io/jq/).

This command lists all AWS VPCs in the account with their VpcId, CidrBlock and Tags.

aws ec2 describe-vpcs | jq -r '.Vpcs[] | "\(.VpcId) \t \(.CidrBlock) \t \(.Tags[])" '

It is also possible to select a specific tag.

aws ec2 describe-vpcs | jq -r '.Vpcs[] | "\(.VpcId) \t \(.CidrBlock) \t \(.Tags[] | select(.Key == "Application") | .Value)" '

Example: select the VPC Name tag and sort by VPC name.

aws ec2 describe-vpcs | jq -r '.Vpcs[] | "\(.VpcId) \t \(.CidrBlock) \t \(.Tags[] | select(.Key == "Name")| .Value)" ' | sort -nk2

Documentation: https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/describe-vpcs.html
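To see what the Name-tag selection produces without querying a live account, here is a small Python sketch that applies the same logic as the jq filter to a hypothetical describe-vpcs payload (the VPC IDs, CIDRs, and tag values are invented for illustration):

```python
import json

# Hypothetical output of `aws ec2 describe-vpcs` (IDs and tags invented)
payload = json.loads("""
{"Vpcs": [
  {"VpcId": "vpc-0abc", "CidrBlock": "10.0.0.0/16",
   "Tags": [{"Key": "Name", "Value": "prod-vpc"}]},
  {"VpcId": "vpc-0def", "CidrBlock": "10.1.0.0/16",
   "Tags": [{"Key": "Name", "Value": "dev-vpc"}]}
]}
""")

# Same idea as the jq filter: VpcId, CidrBlock, and the Value of the Name tag
for vpc in payload["Vpcs"]:
    name = next(t["Value"] for t in vpc["Tags"] if t["Key"] == "Name")
    print(f'{vpc["VpcId"]} \t {vpc["CidrBlock"]} \t {name}')
```

The `select(.Key == "Name") | .Value` part of the jq expression corresponds to the `next(...)` line: both pick out the Value of the tag whose Key is `Name`.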


List all subnets, with a selection of fields picked from the JSON output.

The `sort -nk5` at the end sorts numerically on the fifth field, `.AvailableIpAddressCount`, so the subnet with the fewest available IPs comes first.

aws ec2 describe-subnets | jq -r '.Subnets[] | "\(.AvailabilityZone); \(.AvailabilityZoneId); \(.VpcId); \(.CidrBlock); \t \(.AvailableIpAddressCount); \t \(.Tags[] | select(.Key == "Name")| .Value)  "  ' | sort -nk5

Documentation: https://awscli.amazonaws.com/v2/documentation/api/latest/reference/ec2/describe-subnets.html


Next, why not rotate through the other AWS accounts in your Organization, if you have them, and through all the regions?! :)

I have been there, done that, so leave the challenge for you :)
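As a starting point for that challenge, the region rotation can be sketched like this. The region list below is hypothetical (normally you would discover it with `describe-regions`), and the live AWS calls are commented out since they need credentials:

```shell
# Hypothetical region list; with credentials you would discover it dynamically:
# regions=$(aws ec2 describe-regions --query 'Regions[].RegionName' --output text)
regions="eu-west-1 eu-west-2 us-east-1"

# Rotate the VPC listing across every region
for region in ${regions}; do
  echo "== ${region} =="
  # aws ec2 describe-vpcs --region "${region}" \
  #   | jq -r '.Vpcs[] | "\(.VpcId) \t \(.CidrBlock)"'
done
```

Rotating through accounts is similar: assume a role in each member account (for example with `aws sts assume-role`) and repeat the loop with the temporary credentials.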


Happy learning,

Antonio Feijao UK

lime-linux-ubuntu-step-by-step

LiME on Ubuntu Linux, live memory capture.

sources and learning material:


LiME step by step

My adaptation for manually testing LiME in a step-by-step method.

USE AT YOUR OWN RISK

## check if the LiME kernel module is already loaded, and remove it if so

if [[ $(lsmod | grep lime | wc -l) -gt 0 ]]; then
	sudo rmmod lime.ko
fi

kernel_release=$(uname -r)
kernel_name=$(uname -s)

echo "
kernel_release : ${kernel_release}
kernel_name    : ${kernel_name}
"

## function - I executed one line at a time
installLimeApt() {
	sudo apt-get -y update
	sudo apt-get -y install git
	
	sudo apt-get install -y linux-headers-$1
	#sudo apt-get install -y linux-headers-${kernel_release}
	
	sudo apt-get install -y build-essential
	
	cd /tmp && sudo rm -rf LiME
	
	git clone https://github.com/504ensicsLabs/LiME
	# >> could not clone, so I copied one file at a time <<
	
	cd LiME/src
	
	make
	
	lime_path=$(pwd)/lime-$1.ko
	#lime_path=$(pwd)/lime-${kernel_release}.ko
	echo "lime_path : ${lime_path}"
}


# I ran the commands one by one
#installLimeApt $kernel_release

# loading the kernel module
sudo insmod $lime_path path=tcp:4444 format=lime localhostonly=1 &

# confirm the LiME kernel module is "listening" on port 4444
netstat -patnl | grep 4444

#sleep 120

if [[ `lsmod|grep lime|wc -l` -gt 0 ]] ; then
	echo "LiME has been loaded"
fi

MEMSIZE=`awk '/MemTotal/ {print $2/1024/1024}' /proc/meminfo`
echo "MEMSIZE: ${MEMSIZE}"

METADATA_FLAG="--metadata uncompressed-size=$MEMSIZE,kernel-name=$kernel_name,kernel-release=$kernel_release"
echo "METADATA_FLAG : ${METADATA_FLAG}"
# sample output >>> `METADATA_FLAG : --metadata uncompressed-size=31.0748,kernel-name=Linux,kernel-release=4.4.0-184-generic`


# copying memory dump into S3
#s3cp() {
# aws s3 cp - {{s3ArtifactLocation}}/linux_memcapture$1 $2 $3 $4
#}

# original command
# cat < /dev/tcp/127.0.0.1/4444 | tee >(gzip | s3cp ".lime.gz" "$EXPECTED_SIZE_FLAG" "$METADATA_FLAG" "$ACL_FLAG") | sha256sum | s3cp "_sha256.txt" "$ACL_FLAG"

# compressed memory
#cat < /dev/tcp/127.0.0.1/4444 | tee >(gzip > ./linux_memcapture.lime.gz)

# raw memory dump

cat < /dev/tcp/127.0.0.1/4444 > ./linux_memcapture.lime
sha256sum linux_memcapture.lime >> _sha256.txt

# remove the kernel module
# most of the time I tested, the kernel module `lime.ko` "removed" itself.

sudo rmmod lime.ko
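The checksum recorded in `_sha256.txt` lets you re-verify the capture's integrity later. A minimal Python sketch of that verification, streaming the file in chunks so a multi-gigabyte capture never has to fit in RAM (the demo file name is a stand-in for the real `linux_memcapture.lime`):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Stream a file in 1 MiB chunks and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a small stand-in file; for a real capture, compare the result
# against the first field of the line saved in _sha256.txt
with open("demo.bin", "wb") as f:
    f.write(b"not a real memory image")

print(sha256_of("demo.bin"))
```

If the digest matches the recorded one, the capture has not been altered since acquisition, which matters if the memory image is ever used as forensic evidence.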

Happy learning,

Antonio Feijao UK