<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0"><channel><title><![CDATA[Angel's Cloud Blog]]></title><description><![CDATA[Angel's Cloud Blog]]></description><link>https://blog.it-anc.cloud</link><generator>RSS for Node</generator><lastBuildDate>Thu, 16 Apr 2026 21:22:38 GMT</lastBuildDate><atom:link href="https://blog.it-anc.cloud/rss.xml" rel="self" type="application/rss+xml"/><language><![CDATA[en]]></language><ttl>60</ttl><item><title><![CDATA[Weekly Reflection]]></title><description><![CDATA[Week 1
This week marked the beginning of my weekly learning-in-public journey in Cloud and DevOps. It’s amazing how much you can build just by showing up every day and doing one thing at a time.
What’s up y’all I’m Angel. This is IT with ANC and welc...]]></description><link>https://blog.it-anc.cloud/weekly-reflection</link><guid isPermaLink="true">https://blog.it-anc.cloud/weekly-reflection</guid><category><![CDATA[Homelab]]></category><category><![CDATA[debian]]></category><category><![CDATA[proxmox]]></category><category><![CDATA[gitops]]></category><category><![CDATA[ArchLinux]]></category><category><![CDATA[#learning-in-public]]></category><category><![CDATA[windows server]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Fri, 26 Sep 2025 14:15:24 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/ZoXCoH7tja0/upload/f842d77644255ecd31063be266ebbf99.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<h1 id="heading-week-1">Week 1</h1>
<p>This week marked the beginning of my weekly learning-in-public journey in Cloud and DevOps. It’s amazing how much you can build just by showing up every day and doing one thing at a time.</p>
<p>What’s up y’all, I’m Angel. This is IT with ANC and welcome to my blog. Let’s get into it!</p>
<hr />
<h2 id="heading-what-i-worked-on">What I Worked On</h2>
<p>Here’s a closer look at what I built, configured, and learned from Sunday through Saturday.</p>
<h3 id="heading-debian-13-home-server-setup">Debian 13 Home Server Setup</h3>
<p>I repurposed an old Surface Laptop 2 as a home server. I installed Debian 13, set up Docker Compose alongside Portainer for container management, and created a homepage dashboard using “Meet-Homepage”. On top of that I configured it as a <strong>Tailscale exit node</strong>, which came in handy when the network at my job was acting funny whenever we tried to access work services behind SSO. Routing my internet traffic through my home server let me reach those systems when my job’s Wi-Fi or LAN wouldn’t. It also inspired others on the team to do the same.</p>
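<p>As a rough sketch (assuming Tailscale is already installed and logged in on both ends), setting up the exit node looks something like this; <code>home-server</code> is a placeholder for your node’s name:</p>
<pre><code class="lang-bash"># On the home server: allow forwarding, then advertise this node as an exit node
sudo sysctl -w net.ipv4.ip_forward=1
sudo tailscale up --advertise-exit-node

# On the client (e.g., the work laptop): route all traffic through the home server
sudo tailscale set --exit-node=home-server
</code></pre>
<p>Note that the exit node also has to be approved in the Tailscale admin console before clients can use it.</p>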
<h3 id="heading-fog-imaging-server">FOG Imaging Server</h3>
<p>Together with my colleague Joseph, I deployed a FOG server to enable imaging/cloning PCs on the floor using iPXE with a “silver image.” This standardizes computer setup, speeds up deployments across the floor, and lets us capture new images as needed.</p>
<h3 id="heading-kasm-workspaces">KASM Workspaces</h3>
<p>I single-handedly deployed a KASM Workspaces server so that our team can stream containerized apps or desktops to their browser. We also demoed its isolation features to representatives of another school. Super useful for forensic research, testing, and ensuring a clean sandboxed environment.</p>
<h3 id="heading-windows-server-2022">Windows Server 2022</h3>
<p>I spun up a Windows Server 2022 instance and assisted Joseph with AD, DNS, and DHCP setup. Even though I’m not a Windows guy anymore, it’s always good to get hands-on with core Windows infra to complement the Linux side of things.</p>
<h3 id="heading-proxmox-cluster-documentation">Proxmox Cluster Documentation</h3>
<p>I started writing documentation for our Proxmox cluster, which is where many of these services (KASM, FOG, etc.) are running. Documentation is becoming one of my priorities, because without it, future work becomes fragile and onboarding others becomes difficult.</p>
<h3 id="heading-github-repo-amp-gitops-workflows">GitHub Repo &amp; GitOps Workflows</h3>
<p>I set up a plan for a professional infrastructure repository in GitHub. Part of that was practicing Git workflows, starting with feature branching for documentation changes. This lays the foundation for more formal GitOps practices as we add our IaC, our config playbooks, and future projects.</p>
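<p>The feature-branch flow I’ve been practicing can be sketched end to end in a throwaway repo (the branch name here is just an example):</p>
<pre><code class="lang-bash"># Toy repo to illustrate a docs feature-branch workflow
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "angel@example.com"
git config user.name "Angel"

echo "# Cluster Docs" > README.md
git add README.md
git commit -q -m "Initial commit"

# Branch off for a documentation change, commit, then merge (or push and open a PR)
git checkout -q -b docs/proxmox-cluster
echo "## Proxmox Cluster" >> README.md
git add README.md
git commit -q -m "docs: start Proxmox cluster notes"
git branch --list
</code></pre>
<p>In the real repo, the last step would be <code>git push -u origin docs/proxmox-cluster</code> followed by a pull request for review.</p>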
<hr />
<h2 id="heading-key-takeaway">Key Takeaway</h2>
<p>Small, consistent steps are powerful. Even basics like documenting, installing tools, and setting up supporting infrastructure compound into real skill and confidence. Also, doing things in public adds accountability, clarity, and momentum.</p>
<hr />
<h2 id="heading-whats-next">What’s Next</h2>
<p>Here’s what I’ll be focusing on in Week 2:</p>
<ul>
<li><p>Make the <strong>Master repo</strong> for our cluster infra</p>
</li>
<li><p>Finalize and clean up the documentation structure and add it to the master repo</p>
</li>
<li><p>Write the first meaningful Ansible playbook (beyond just pinging nodes)</p>
</li>
<li><p>I will also be (attempting) to install Arch Linux on a PC I picked up recently</p>
</li>
</ul>
<p>Thanks for following along! I’ll be back next Friday with Week 2’s reflection and what I learn.</p>
]]></content:encoded></item><item><title><![CDATA["Python Crash Course" by Eric Matthes - A Review]]></title><description><![CDATA[Howdy y’all this is IT with ANC and welcome to my Cloud Blog. Let’s get into it.
It’s been quite a while since my last entry where I talked about Terraform. That entry was an intro into using Terraform to automate the provisioning of cloud resources,...]]></description><link>https://blog.it-anc.cloud/python-crash-course-by-eric-matthes-a-review</link><guid isPermaLink="true">https://blog.it-anc.cloud/python-crash-course-by-eric-matthes-a-review</guid><category><![CDATA[Python 3]]></category><category><![CDATA[python beginner]]></category><category><![CDATA[Programming Blogs]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Wed, 02 Jul 2025 22:14:21 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/ZIPFteu-R8k/upload/8b8e62f6e7459eff90bf9330db73887c.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Howdy y’all this is IT with ANC and welcome to my Cloud Blog. Let’s get into it.</p>
<p>It’s been quite a while since my last entry where I talked about Terraform. That entry was an intro into using Terraform to automate the provisioning of cloud resources, with hopes of diving deeper into it. This is still the plan, however things change, and currently I’ve been working a lot more with Python than IaC. So today I thought I’d share with y’all my experience learning the syntax of Python over the past month.</p>
<p>So my main resource for learning the syntax has been, you guessed it, “Python Crash Course” by Eric Matthes from No Starch Press. If you don’t already know, No Starch Press puts out exceptional resources on a plethora of IT topics. PCC takes you from noob to apprentice in chapters 1–12, where you read some theory and explanations and do code challenges at the end of each chapter, followed by projects in the remaining chapters (this is where I am).</p>
<h3 id="heading-what-why-how">What, Why, How</h3>
<p>Let’s talk day 0. The book starts off real nice, with the author stating how he has been coding since he was 5 and that if you didn’t start then, you’re doomed to never get past “Hello World”… haha nah, I’m just kidding.</p>
<p>Matthes starts by saying that his dad was a major reason he started coding. His father worked for a pioneer company in the computing realm, Digital Equipment, and Matthes started coding as a child because of this. He then goes on to state who the book is for, why you should learn Python, what to expect from the book, and some resources. In short, this book is for those looking to start building as fast as possible; it’s a book for all levels of expertise, but it’s particularly good for noobs (like me).</p>
<h3 id="heading-part-i">Part I</h3>
<p>The book is split into two parts. Part 1 covers the “basics,” starting with creating a programming environment tailored to Python whether you’re on a Mac, Windows, or Linux machine. The book then takes the reader from variables, conditionals, loops, and functions to data structures, OOP, tests, and best practices. My impression of this part was that it’s perfect for someone like me getting started with Python. Because I, like most, am a hands-on learner, I can’t just read it, I have to do it. Matthes has the reader covered with challenge labs at the end of each chapter: no hand-holding, only the text and minimal, simple examples. The challenges are actually challenging, which is something I really appreciated.</p>
<p>Here is an example of one of the challenges, where I needed to create a program that checks for stored data (a username) and retrieves it, or prompts the user for it if none is stored, then stores that data in a JSON file:</p>
<pre><code class="lang-python"><span class="hljs-keyword">from</span> pathlib <span class="hljs-keyword">import</span> Path
<span class="hljs-keyword">import</span> json

<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">get_stored_username</span>(<span class="hljs-params">path</span>):</span>
    <span class="hljs-string">"""get stored username if available"""</span>

    <span class="hljs-keyword">if</span> path.exists():
        contents = path.read_text()
        username = json.loads(contents)
        <span class="hljs-keyword">return</span> username
    <span class="hljs-keyword">else</span>:
        <span class="hljs-keyword">return</span> <span class="hljs-literal">None</span>

<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">get_new_username</span>(<span class="hljs-params">path</span>):</span>
    <span class="hljs-string">"""prompt for a new username"""</span>

    username = input(<span class="hljs-string">"What is your name? "</span>)
    contents = json.dumps(username)
    path.write_text(contents)
    <span class="hljs-keyword">return</span> username

<span class="hljs-function"><span class="hljs-keyword">def</span> <span class="hljs-title">greet_user</span>():</span>
    <span class="hljs-string">"""greets user"""</span>

    path = Path(<span class="hljs-string">'FilesAndExceptions/StoringData/username.json'</span>)
    username = get_stored_username(path)

    <span class="hljs-keyword">if</span> username:
        print(<span class="hljs-string">f'Welcome back, <span class="hljs-subst">{username}</span>'</span>)
    <span class="hljs-keyword">else</span>:
        username = get_new_username(path)
        print(<span class="hljs-string">f"We'll remember you when you come back, <span class="hljs-subst">{username}</span>"</span>)

greet_user()
</code></pre>
<h3 id="heading-part-ii">Part II</h3>
<p>Part 2 is where the fun/building really begins. It consists of 3 different projects, each with a specific topic/focus in mind. You get to choose between building a game, a data visualization tool(s), or a Web App, all with pure Python. This is currently where I am at, and I must say I am proud of how far I’ve come. Now by no means does this book make you a professional, not in the slightest. What it does is give you a strong foundation so that you can get good with the fundamentals and, if you choose to put in the work, become an expert.</p>
<p>If you enjoyed this read or you’re inspired to give this book a try because of my review, I appreciate that, and I’m glad I was able to offer some value. Check out my GitHub, where I am building my first game in Python inspired by the book. I also have some other cool stuff there.</p>
<p><a target="_blank" href="https://github.com/El-Padre12">My GitHub</a></p>
<p>See y’all next time✌️</p>
]]></content:encoded></item><item><title><![CDATA[Getting Started with Terraform: Your First Steps into Infrastructure as Code]]></title><description><![CDATA[Salutations Fellow Cloud Enthusiasts!
It's been a while, and I felt like I really needed to get back to making my learning journey public again. Today, we're talking about Infrastructure as Code (IaC), more specifically, Terraform!
First, what is IaC...]]></description><link>https://blog.it-anc.cloud/getting-started-with-terraform-your-first-steps-into-infrastructure-as-code</link><guid isPermaLink="true">https://blog.it-anc.cloud/getting-started-with-terraform-your-first-steps-into-infrastructure-as-code</guid><category><![CDATA[Terraform]]></category><category><![CDATA[#Iac #terraform #devops #aws]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Cloud Computing]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Fri, 10 Jan 2025 15:00:34 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1736482752331/2a2392ac-13e9-4909-a16f-a081a28d0fa8.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Salutations Fellow Cloud Enthusiasts!</p>
<p>It's been a while, and I felt like I really needed to get back to making my learning journey public again. Today, we're talking about Infrastructure as Code (IaC), more specifically, Terraform!</p>
<p>First, what is IaC? It's actually several things - it can be an ad hoc script you've whipped up, configuration management like Ansible or Chef, or provisioning tools like Terraform or Pulumi. To me, IaC is a method or tool used to automate and provision cloud infrastructure. Here's what a quick Google search says: "Infrastructure as Code (IaC) is a process that uses code to provision and manage infrastructure instead of manual processes."</p>
<h3 id="heading-overview-setup">Overview + Setup</h3>
<p>Common patterns include:</p>
<ul>
<li><p>Terraform + Config Management (i.e., Ansible)</p>
</li>
<li><p>Terraform + Server Templates(i.e., Packer)</p>
</li>
<li><p>Terraform + Orchestration (i.e., Kubernetes)</p>
</li>
</ul>
<p>To get started, you'll need to add the HashiCorp repository from the CLI using your local package manager. Once that's done, install Terraform. Verify the installation by checking your local repo list or running <code>terraform --version</code>.</p>
<p>If you're using Fedora 40 with the DNF package manager, here are the steps:</p>
<pre><code class="lang-bash">dnf install -y dnf-plugins-core
dnf config-manager --add-repo https://rpm.releases.hashicorp.com/fedora/hashicorp.repo
dnf install terraform -y
terraform --version
</code></pre>
<p>If you're using a different OS, just google "how to install terraform from the CLI on [your OS]."</p>
<p>Next, create a basic "bare bones" main.tf file somewhere in your filesystem. Here's a basic example:</p>
<pre><code class="lang-hcl">terraform {
    required_providers {
        aws = {
            source  = "hashicorp/aws"
            version = "~&gt; 3.0"
        }
    }
}

provider "aws" {
    region = "us-east-1"
}

resource "aws_instance" "example" {
    ami           = "ami-0b0ea68c435eb488d" # Ubuntu Xenial Xerus 16.04 LTS
    instance_type = "t2.micro"
}
</code></pre>
<p>In this example, the first code block defines which provider we're going to use (AWS), followed by defining a default region. In the third code block, we're defining an EC2 instance named "example," using an Ubuntu 16.04 AMI and a t2.micro instance type.</p>
<p><strong>Important</strong>: <strong>Make sure you have the AWS CLI installed with the correct user credentials/secrets and that the user has proper permissions in AWS to create an EC2 instance, or this won't work!</strong></p>
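<p>A quick sanity check before running anything (assuming the AWS CLI is installed) is to confirm which identity Terraform will pick up:</p>
<pre><code class="lang-bash">aws configure                # set access key, secret key, and default region
aws sts get-caller-identity  # confirms which IAM user/role your credentials map to
</code></pre>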
<p>Finally, in the directory containing your main.tf file, run the following commands:</p>
<pre><code class="lang-bash">terraform init
terraform plan  <span class="hljs-comment"># to see what changes you're making</span>
terraform apply
</code></pre>
<p>Once done, if you have no error messages, check the AWS EC2 Console to see your instance created and running! This is real power! No longer do you need to create and deploy your cloud infrastructure from the console - the console should only be used for verification purposes!</p>
<h3 id="heading-avoiding-unwanted-cloud-bills">Avoiding Unwanted Cloud Bills</h3>
<p>To avoid acquiring a hefty AWS bill overnight, make sure to run <code>terraform destroy</code> so that everything you created is destroyed and not left running:</p>
<pre><code class="lang-bash">terraform destroy
</code></pre>
<p>Congrats! You just used Terraform to provision AWS resources all from the CLI. You can now build on this, as will I in the coming weeks.</p>
<p>Stay tuned for upcoming blog entries where I'll dive into:</p>
<ul>
<li><p>Language Features</p>
</li>
<li><p>Variables &amp; Outputs</p>
</li>
<li><p>Testing</p>
</li>
<li><p>Developer Workflows</p>
</li>
<li><p>And more!</p>
</li>
</ul>
<p>See y'all in the next one!</p>
]]></content:encoded></item><item><title><![CDATA[Simplified Guide to User and Group Management in Linux]]></title><description><![CDATA[Salutations Fellow Cloud Enthusiasts,
Today we get to talk about the fundamentals of User and Group account management specifically in Linux, since that’s what I use daily and since that’s what the cloud runs on. Whether you're configuring access con...]]></description><link>https://blog.it-anc.cloud/simplified-guide-to-user-and-group-management-in-linux</link><guid isPermaLink="true">https://blog.it-anc.cloud/simplified-guide-to-user-and-group-management-in-linux</guid><category><![CDATA[linux for beginners]]></category><category><![CDATA[linux-basics]]></category><category><![CDATA[AWS]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Wed, 02 Oct 2024 01:26:20 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/wh-RPfR_3_M/upload/14337080ea002a6ccb84f0a148b4049d.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Salutations Fellow Cloud Enthusiasts,</p>
<p>Today we get to talk about the fundamentals of User and Group account management specifically in Linux, since that’s what I use daily and since that’s what the cloud runs on. Whether you're configuring access control for an EC2 instance or organizing users within a secure environment, understanding how Linux handles user and group accounts will lay the foundation for effective cloud administration. We'll be covering everything from creating new users to managing group permissions, ensuring you're well-prepared to handle user management in a Linux/cloud context.</p>
<p>With that being said, hey, I'm Angel. This is IT with ANC, and welcome to my cloud blog. Let's get into it!</p>
<hr />
<h3 id="heading-overview">Overview</h3>
<p>As a system administrator, one of your core responsibilities is managing user and group accounts. This task encompasses everything from creating new users to ensuring proper password security and system access. In this post, we'll dive into the foundational tasks for managing <strong>user and group accounts</strong> on a Linux system, including working with special purpose accounts, password management, and more. By the end of this guide, you'll have a solid understanding of how to efficiently manage these accounts and tailor them to meet your organization's security policies.</p>
<hr />
<h3 id="heading-understanding-key-files-for-user-and-group-management">Understanding Key Files for User and Group Management</h3>
<p>Managing users and groups involves interacting with several critical system files. Here’s a breakdown of two essential ones:</p>
<ol>
<li><p><strong>/etc/passwd</strong>:<br /> This file contains details about all user accounts, including special system accounts used to run services. It is readable by all users but only writable by the root user. The information in this file includes usernames, user IDs (UIDs), and the location of user home directories.</p>
</li>
<li><p><strong>/etc/shadow</strong>:<br /> For added security, Linux uses the <code>/etc/shadow</code> file to store encrypted passwords, ensuring that only the root user has access. Each entry in <code>/etc/passwd</code> corresponds to an entry in <code>/etc/shadow</code>. This file also contains additional fields, such as password expiration information, allowing you to enforce password policies.</p>
</li>
</ol>
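<p>You can see the structure of these entries without opening the files directly. Each <code>/etc/passwd</code> line has seven colon-separated fields, and <code>awk</code> makes it easy to pull one out:</p>
<pre><code class="lang-bash"># Fields: name:password:UID:GID:comment:home:shell
getent passwd root

# For example, root's home directory is the sixth field:
getent passwd root | awk -F: '{print $6}'
# → /root
</code></pre>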
<hr />
<h3 id="heading-special-purpose-system-accounts">Special Purpose System Accounts</h3>
<p>Some system accounts are critical for services but should not be accessible by regular users. These are referred to as <strong>special purpose system accounts</strong>, and they are generally used to run system services.</p>
<p>Administrators can control the range of regular user accounts through the <code>/etc/login.defs</code> file, ensuring that system accounts have their own UID range, distinct from regular users.</p>
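<p>You can check those ranges on your own system with a read-only peek at the file (the exact values vary by distro; Debian, for example, defaults to <code>UID_MIN 1000</code>):</p>
<pre><code class="lang-bash"># Show the UID ranges reserved for regular vs. system accounts
grep -E '^(UID_MIN|UID_MAX|SYS_UID_MIN|SYS_UID_MAX)' /etc/login.defs || true
</code></pre>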
<hr />
<h3 id="heading-creating-and-managing-user-accounts">Creating and Managing User Accounts</h3>
<p>Adding a user account is a two-step process:</p>
<ol>
<li><p><strong>Create the user account</strong>: To create a new user, you can use the <code>useradd</code> command. This command adds a new entry to <code>/etc/passwd</code> and generates a corresponding entry in <code>/etc/shadow</code>.</p>
<pre><code class="lang-bash"> useradd newuser
</code></pre>
</li>
<li><p><strong>Set the user password</strong>: After creating the account, you need to set a password for the new user using the <code>passwd</code> command. Users can change their passwords, but only the root user can set or reset other users' passwords.</p>
<pre><code class="lang-bash"> passwd newuser
</code></pre>
<p> On its own, <code>passwd</code> just sets the password; the password-aging policy is what requires users to change their password after a specified period, ensuring security compliance.</p>
</li>
</ol>
<hr />
<h3 id="heading-using-template-initialization-files-for-new-users">Using Template Initialization Files for New Users</h3>
<p>When a new user account is created, a default set of files from the <strong>/etc/skel</strong> directory is copied to their home directory. These files provide a basic configuration for the new user’s environment. Admins can customize these files to provide pre-configured settings for new users. For example, you can add default bookmarks in a browser or set up default environment variables.</p>
<p>Note that changes to <code>/etc/skel</code> after user creation won’t affect existing users.</p>
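<p>A quick illustration (the aliases file name here is hypothetical; anything you drop into <code>/etc/skel</code> is copied for future users only):</p>
<pre><code class="lang-bash"># See what new users will start with
ls -A /etc/skel

# Give all *future* users a default aliases file (example file name)
sudo cp company-aliases.sh /etc/skel/.bash_aliases
</code></pre>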
<hr />
<h3 id="heading-updating-and-managing-user-passwords">Updating and Managing User Passwords</h3>
<p>Linux allows users to change their passwords with the <code>passwd</code> command, but as a system administrator, you may need to manage password policies and lock or unlock user accounts:</p>
<ul>
<li><p>To <strong>lock</strong> a user account:</p>
<pre><code class="lang-bash">  passwd -l username
</code></pre>
</li>
<li><p>To <strong>unlock</strong> a user account:</p>
<pre><code class="lang-bash">  passwd -u username
</code></pre>
<p>  Additionally, you can enforce password policies by controlling how often users must change their passwords using the <code>chage</code> command. This command lets you specify the number of days between password changes and when a password expires.</p>
</li>
</ul>
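<p>A small sketch of what that looks like with <code>chage</code> (<code>username</code> is a placeholder):</p>
<pre><code class="lang-bash"># Require a password change every 90 days, warning the user 7 days in advance
sudo chage -M 90 -W 7 username

# Review a user's current password-aging settings
sudo chage -l username
</code></pre>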
<h3 id="heading-removing-user-accounts">Removing User Accounts</h3>
<p>When a user no longer needs access to the system, it's important to remove their account. This can be done using the <code>userdel</code> command:</p>
<pre><code class="lang-bash">userdel username
</code></pre>
<p>You can also delete a user’s home directory and associated files by adding the <code>-r</code> flag:</p>
<pre><code class="lang-bash">userdel -r username
</code></pre>
<h3 id="heading-understanding-group-accounts">Understanding Group Accounts</h3>
<p><strong>Groups</strong> are an essential part of managing users in Linux. A group allows you to assign collective permissions to a set of users, making resource management easier. For example, you might have groups like <strong>admin</strong>, <strong>developers</strong>, or <strong>finance</strong>, each with specific permissions to access certain files or directories.</p>
<ul>
<li><p><strong>Primary Group</strong>:<br />  Every user is automatically assigned a primary group when their account is created. Typically, this is a group with the same name as the user.</p>
</li>
<li><p><strong>Secondary Groups</strong>:<br />  Users can also belong to additional groups, granting them access to other system resources.</p>
</li>
</ul>
<p>The <strong>/etc/group</strong> file holds information about groups, including which users belong to which group. You can add, modify, or delete groups with the <code>groupadd</code>, <code>groupmod</code>, and <code>groupdel</code> commands.</p>
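<p>Putting those commands together, a typical flow looks something like this (the group and user names are examples):</p>
<pre><code class="lang-bash"># Create a group, add an existing user to it as a secondary group, and verify
sudo groupadd developers
sudo usermod -aG developers username   # -a appends instead of replacing groups
id username
getent group developers
</code></pre>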
<hr />
<h3 id="heading-best-practices-for-managing-users-and-groups-in-linux">Best Practices for Managing Users and Groups in Linux</h3>
<p>Here are some best practices for managing user and group accounts in Linux:</p>
<ol>
<li><p><strong>Follow the Principle of Least Privilege</strong>:<br /> Ensure that users are only granted the access they need to perform their tasks, and no more. Use groups to control access to resources efficiently.</p>
</li>
<li><p><strong>Regularly Update Password Policies</strong>:<br /> Use tools like <code>chage</code> to enforce regular password changes and ensure that passwords are complex enough to resist brute-force attacks.</p>
</li>
<li><p><strong>Monitor User Activity</strong>:<br /> Regularly audit user and group information by reviewing the <code>/etc/passwd</code>, <code>/etc/shadow</code>, and <code>/etc/group</code> files. Remove inactive users or accounts that are no longer in use.</p>
</li>
<li><p><strong>Use Automation for Account Management</strong>:<br /> For larger organizations, consider using scripts to automate the creation, modification, and deletion of user accounts. This ensures consistency and reduces the potential for errors.</p>
</li>
</ol>
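<p>Expanding on the automation point, here’s a minimal sketch of a bulk user-creation script. It runs in dry-run mode by default (it only prints the commands it would run); the usernames are made up, and you’d set <code>DRY_RUN=0</code> and run it as root to actually apply it:</p>
<pre><code class="lang-bash">#!/usr/bin/env bash
# Bulk-create users from a list, one username per line.
DRY_RUN=1

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "would run: $*"
    else
        "$@"
    fi
}

printf '%s\n' alice bob carol | while read -r user; do
    run useradd -m "$user"
    run passwd -e "$user"   # force a password change at first login
done
</code></pre>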
<hr />
<h3 id="heading-cloud-specific-best-practices-for-aws"><strong>Cloud-Specific Best Practices for AWS</strong></h3>
<p>When working in an AWS environment, user and group management becomes even more crucial for securing your cloud resources. Here are some additional best practices tailored to AWS:</p>
<ol>
<li><p><strong>Leverage IAM for User Management:</strong> Instead of managing individual Linux accounts for each user across multiple EC2 instances, use AWS Identity and Access Management (IAM) to manage permissions centrally. IAM roles can be assigned to EC2 instances, allowing users to interact with other AWS services without needing to manually configure credentials.</p>
</li>
<li><p><strong>Use AWS Systems Manager for Access Control:</strong> AWS Systems Manager Session Manager allows you to securely connect to your EC2 instances without needing to manage individual SSH keys. This way, you can control user access centrally from the AWS Management Console, which simplifies auditing and improves security.</p>
</li>
<li><p><strong>Integrate with AWS Directory Service:</strong> If you need centralized authentication, AWS Directory Service can help manage Linux users via Microsoft AD, making it easier to apply uniform user policies across different environments, including on-premises and cloud.</p>
</li>
<li><p><strong>Automate with CloudFormation or Terraform:</strong> Automate the creation and configuration of EC2 instances, including user and group settings, by using infrastructure-as-code tools like AWS CloudFormation or Terraform. This ensures consistent configurations and simplifies deploying new resources with the correct user and group permissions.</p>
</li>
<li><p><strong>Monitor with CloudWatch and CloudTrail:</strong> Use Amazon CloudWatch and AWS CloudTrail to monitor user activity across EC2 instances. Set up alarms to notify you of suspicious behavior, such as repeated failed login attempts or unauthorized access to sensitive files.</p>
</li>
<li><p><strong>Isolate Workloads with VPC and Security Groups:</strong> In addition to Linux group permissions, use Virtual Private Clouds (VPC) and Security Groups in AWS to isolate workloads and control traffic to your instances. This provides an additional layer of access control beyond just user/group management at the OS level.</p>
</li>
</ol>
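<p>As one concrete example of the Session Manager point above (assuming the SSM agent and session plugin are set up; the instance ID is a placeholder):</p>
<pre><code class="lang-bash"># Open a shell on an instance with no SSH keys or open port 22 required
aws ssm start-session --target i-0123456789abcdef0
</code></pre>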
<p>These cloud-specific practices help extend traditional Linux user and group management into the cloud, ensuring that you follow both the least privilege principle and maintain strong security standards across your AWS infrastructure.</p>
<hr />
<h3 id="heading-conclusion">Conclusion</h3>
<p>Managing users and groups is a foundational task for any system administrator. Whether you're creating new user accounts, updating passwords, or managing group access, understanding the basics of user and group management ensures that your system runs smoothly and securely. By following best practices and leveraging the right tools, you can streamline these processes and maintain a secure environment. As I continue this blog, I am going to keep focusing on the fundamentals: Linux basics, networking basics, and scripting. You’ll find that being “Mr. Fundamentals” when it comes to cloud will take you far and future-proof your career in tech. Additionally, as I learn more from my classes at Northwest Vista and start my job as a datacenter technician, these posts will continue to get more and more technical.</p>
]]></content:encoded></item><item><title><![CDATA[Mastering Shell Customization: Advanced Features for Power Users]]></title><description><![CDATA[Salutations Fellow Cloud Enthusiasts,
In this entry, we'll dive into how customizing your shell environment can enhance your productivity in the Cloud, covering everything from managing environment variables to automating tasks with Bash functions, a...]]></description><link>https://blog.it-anc.cloud/mastering-shell-customization-advanced-features-for-power-users</link><guid isPermaLink="true">https://blog.it-anc.cloud/mastering-shell-customization-advanced-features-for-power-users</guid><category><![CDATA[Cloud Computing]]></category><category><![CDATA[Linux]]></category><category><![CDATA[shell]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Productivity]]></category><category><![CDATA[techtips]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Mon, 02 Sep 2024 21:33:08 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/xbEVM6oJ1Fs/upload/383694912192120c72293f0a1f897407.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Salutations Fellow Cloud Enthusiasts,</p>
<p>In this entry, we'll dive into how customizing your shell environment can enhance your productivity in the Cloud, covering everything from managing environment variables to automating tasks with Bash functions, bringing you closer to being a Linux power user. That will serve you well, since the vast majority of the servers that make up the cloud run some flavor of Linux.</p>
<p>Hey, I'm Angel and this is IT with ANC, welcome to my cloud blog. Let's get into it!</p>
<p><strong>Overview</strong></p>
<p>As you dive deeper into cloud computing, understanding advanced features like environment variables, custom functions, and profile modifications becomes essential. In this post, we'll explore how to customize your shell environment, making it more efficient and tailored to your specific needs. Whether you're managing EC2 instances, automating deployments with scripts, or configuring environments, mastering shell customization can significantly enhance your efficiency and effectiveness.</p>
<h4 id="heading-understanding-variables-the-building-blocks-of-shell-customization"><strong>Understanding Variables: The Building Blocks of Shell Customization</strong></h4>
<p>Variables in the shell are a fundamental concept, serving as placeholders for values that can be used across commands and scripts. In Bash, variable names must start with a letter or underscore and can include letters, numbers, and underscores. In the dynamic environment of cloud computing, variables allow you to create flexible and reusable scripts. Whether you're storing critical configuration data or passing parameters between commands, mastering variable management is essential. Remember, they are case-sensitive!</p>
<p><strong>Local vs. Environment Variables:</strong></p>
<ul>
<li><p><strong>Local Variables:</strong> Available only within the shell in which they are created.</p>
</li>
<li><p><strong>Environment Variables:</strong> Passed into all other commands and programs started by the shell, making them globally accessible.</p>
</li>
</ul>
<p>A common convention is to use lowercase for local variables and uppercase for environment variables, helping you distinguish between the two at a glance.</p>
<p>Example:</p>
<pre><code class="lang-bash">local_variable=<span class="hljs-string">"test"</span>
ENVIRONMENT_VARIABLE=<span class="hljs-string">"TEST"</span>
</code></pre>
<p><strong>Displaying Variable Values:</strong></p>
<ul>
<li><p><code>set</code>: Displays all variables (local and environment).</p>
</li>
<li><p><code>env</code> &amp; <code>export -p</code>: Displays only environment variables.</p>
</li>
<li><p><code>echo $VARIABLE</code>: Displays the value of a specific variable.</p>
</li>
</ul>
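<p>A quick way to see the difference for yourself (the variable name here is just a placeholder):</p>
<pre><code class="lang-bash"># A plain assignment creates a local variable, visible to this shell only
MY_VAR="hello"
echo "$MY_VAR"                             # prints: hello
env | grep MY_VAR || echo "not exported"   # local only, so: not exported
export MY_VAR                              # promote it to an environment variable
env | grep MY_VAR                          # now prints: MY_VAR=hello
</code></pre>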
<h4 id="heading-creating-environment-variables-flexibility-and-control"><strong>Creating Environment Variables: Flexibility and Control</strong></h4>
<p>By default, variables in Bash are local. However, you can easily promote a local variable to an environment variable using a few different methods:</p>
<ol>
<li><p><strong>Exporting an Existing Variable:</strong></p>
<pre><code class="lang-bash"> <span class="hljs-built_in">export</span> LOCAL_VAR
</code></pre>
</li>
<li><p><strong>Creating and Exporting in One Step:</strong></p>
<pre><code class="lang-bash"> <span class="hljs-built_in">export</span> NEW_VAR=<span class="hljs-string">"value"</span>
</code></pre>
</li>
</ol>
<p><strong>Using</strong> <code>declare</code> or <code>typeset</code>: these commands can also mark a variable for export to the environment:</p>
<pre><code class="lang-bash"><span class="hljs-built_in">declare</span> -x ANOTHER_VAR=<span class="hljs-string">"value"</span>
</code></pre>
<p>For temporary changes, the <code>env</code> command allows you to set environment variables for the duration of a command or script, without permanently altering your environment.</p>
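<p>For example, you can hand a variable to a single command without it ever touching your current session (the variable name is made up):</p>
<pre><code class="lang-bash"># TMP_REGION exists only for the duration of this one command
env TMP_REGION="us-east-1" bash -c 'echo "region is $TMP_REGION"'   # prints: region is us-east-1
echo "${TMP_REGION:-unset}"   # prints: unset (the parent shell never saw it)
</code></pre>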
<h4 id="heading-unsetting-variables-avoiding-pitfalls"><strong>Unsetting Variables: Avoiding Pitfalls</strong></h4>
<p>If you need to delete a variable, use the <code>unset</code> command:</p>
<pre><code class="lang-bash"><span class="hljs-built_in">unset</span> VARIABLE_NAME
</code></pre>
<p>Be cautious with critical system variables like <code>$PATH</code>—unsetting these can cause serious issues!</p>
<p>To safeguard against errors, you can enable the <code>nounset</code> option:</p>
<pre><code class="lang-bash"><span class="hljs-built_in">set</span> -o nounset
</code></pre>
<p>This will cause an error if a script tries to reference an unset variable, preventing unexpected behavior.</p>
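<p>You can watch the safeguard trip in a throwaway subshell (the variable name is invented, and the <code>||</code> keeps the demo from aborting your session):</p>
<pre><code class="lang-bash"># With nounset on, referencing an unset variable is a hard error
bash -c 'set -o nounset; echo "value: $NOT_DEFINED"' || echo "caught an unbound variable"
</code></pre>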
<h4 id="heading-the-path-variable-the-backbone-of-command-execution"><strong>The PATH Variable: The Backbone of Command Execution</strong></h4>
<p>One of the most critical environment variables in your shell is <code>$PATH</code>. It contains a list of directories that the shell searches when you enter a command. Understanding how <code>$PATH</code> works is crucial for customizing your environment and ensuring that commands are executed correctly.</p>
<p><strong>Absolute Path vs. Relative Path:</strong></p>
<ul>
<li><p><strong>Absolute Path:</strong> Specifies the exact location from the root directory, starting with <code>/</code>.</p>
</li>
<li><p><strong>Relative Path:</strong> Specifies the location relative to the current directory, making it more flexible but potentially less clear.</p>
</li>
</ul>
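<p>A common customization is appending your own scripts directory to <code>$PATH</code> (the <code>~/bin</code> directory here is just an example):</p>
<pre><code class="lang-bash">echo "$PATH"                   # a colon-separated list of directories searched for commands
export PATH="$PATH:$HOME/bin"  # append a personal scripts directory
# add that export line to ~/.bashrc to make it stick across sessions
</code></pre>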
<hr />
<h4 id="heading-why-shell-customization-is-essential-for-cloud-professionals"><strong>Why Shell Customization is Essential for Cloud Professionals</strong></h4>
<p>In a cloud environment, your ability to quickly adapt and automate tasks is crucial. Shell customization allows you to create a tailored environment that can streamline repetitive tasks, reduce errors, and optimize your workflow. For instance, setting environment variables, writing custom Bash functions, and understanding critical variables like <code>$PATH</code> can save you time and prevent costly mistakes.</p>
<p><strong>Practical Applications in AWS:</strong></p>
<ul>
<li><p><strong>Managing Environment Variables:</strong> In AWS, environment variables are often used to store sensitive information like API keys, database credentials, or configuration settings for applications. Understanding how to properly set and manage these variables ensures that your applications run smoothly and securely.</p>
</li>
<li><p><strong>Bash Functions for Automation:</strong> Automating tasks such as starting/stopping EC2 instances, uploading files to S3, or deploying applications can be streamlined using custom Bash functions. This allows you to execute complex commands with a single, simple command, saving time and reducing the likelihood of errors.</p>
</li>
<li><p><strong>Maintaining Consistent Environments:</strong> When deploying across different environments (e.g., development, staging, production), it's vital to maintain consistency. Customizing your shell profiles and using environment variables can help ensure that your environments are consistently configured, reducing the chance of unexpected behavior when scaling in the cloud.</p>
</li>
</ul>
<h4 id="heading-key-concepts-for-aws-professionals"><strong>Key Concepts for AWS Professionals</strong></h4>
<p><strong>1. Variables:</strong></p>
<ul>
<li><strong>Local vs. Environment Variables:</strong> In cloud computing, understanding the scope of variables is crucial. For example, a local variable might be used within a script to perform a specific task, while an environment variable could store a value like a region or instance type that needs to be accessible by multiple scripts or applications.</li>
</ul>
<p><strong>2. The PATH Variable:</strong></p>
<ul>
<li><strong>Navigating AWS CLI:</strong> The <code>$PATH</code> variable is particularly important when working with the AWS Command Line Interface (CLI). Ensuring that your PATH includes the directories where the AWS CLI and other essential tools are installed allows you to execute commands from any location within your terminal, improving your workflow efficiency.</li>
</ul>
<p><strong>3. Unsetting Variables:</strong></p>
<ul>
<li><strong>Avoiding Misconfigurations:</strong> In AWS, incorrect or missing environment variables can lead to failed deployments, broken connections, or even security vulnerabilities. Understanding how to unset variables safely, and knowing when to do so, is critical for maintaining the integrity of your cloud environments.</li>
</ul>
<h4 id="heading-real-world-examples"><strong>Real-World Examples:</strong></h4>
<ul>
<li><p><strong>Automating EC2 Instance Management:</strong> Suppose you frequently start and stop EC2 instances for development purposes. You can create a Bash function that encapsulates this process, setting necessary environment variables like instance IDs and regions within the function. This not only saves time but also reduces the chance of errors.</p>
</li>
<li><p><strong>Configuring Multi-Environment Deployments:</strong> When deploying applications across multiple AWS environments, you can use environment variables to manage configuration differences between environments. By customizing your shell profile, you can easily switch contexts and ensure that you're deploying to the correct environment with the right settings.</p>
</li>
</ul>
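<p>To make the EC2 example above concrete, here is a minimal sketch of such a function. It assumes the AWS CLI is installed and configured; the usage instance ID and the default region are placeholders:</p>
<pre><code class="lang-bash"># Hypothetical helper: start a dev instance with one short command
start_dev() {
    local instance_id="${1:?usage: start_dev INSTANCE_ID}"
    local region="${AWS_REGION:-us-east-1}"   # placeholder default region
    aws ec2 start-instances --instance-ids "$instance_id" --region "$region"
}
# usage: start_dev i-0123456789abcdef0
</code></pre>
<p>Drop a function like this into your shell profile and starting a dev box becomes a one-word command instead of a trip to the console.</p>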
<hr />
<p><strong>Conclusion</strong></p>
<p>By mastering these shell features, you'll improve your efficiency in AWS, gain the skills needed to handle complex cloud environments with confidence, and move one step closer to becoming a Linux power user. Start customizing your shell today and experience the difference in your cloud workflows!</p>
]]></content:encoded></item><item><title><![CDATA[AWS S3 CLI Uploader]]></title><description><![CDATA[Salutations Fellow Cloud Enthusiasts,
Welcome to today’s blog, where we’ll dive into the fascinating world of the Command Line Interface (CLI) and Bash scripting. If you’re on a journey to enhance your cloud computing skills, mastering these tools is...]]></description><link>https://blog.it-anc.cloud/aws-s3-cli-uploader</link><guid isPermaLink="true">https://blog.it-anc.cloud/aws-s3-cli-uploader</guid><category><![CDATA[CloudUploader]]></category><category><![CDATA[bash scripting]]></category><category><![CDATA[AWS]]></category><category><![CDATA[Cloud Computing]]></category><category><![CDATA[Tech community]]></category><category><![CDATA[aws cli]]></category><category><![CDATA[ #TechLearning]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Mon, 26 Aug 2024 21:17:12 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/stock/unsplash/zGuBURGGmdY/upload/82d86a861a7a8c2f7c1c9006193b8169.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Salutations Fellow Cloud Enthusiasts,</p>
<p>Welcome to today’s blog, where we’ll dive into the fascinating world of the Command Line Interface (CLI) and Bash scripting. If you’re on a journey to enhance your cloud computing skills, mastering these tools is a crucial step. Whether you’re just starting out or looking to sharpen your existing knowledge, this post will offer you practical insights through a project you can complete, using the Linux CLI and Bash scripting to interact with AWS services.</p>
<h3 id="heading-what-to-expect">What to Expect</h3>
<p>In this post, I’ll walk you through:</p>
<ol>
<li><p><strong>The Project</strong>: How I used the CLI and Bash to create the CloudUploader, a tool for uploading files to cloud storage.</p>
</li>
<li><p><strong>Lessons Learned</strong>: Key takeaways and best practices for working with the CLI and Bash in a cloud environment.</p>
</li>
</ol>
<p>By the end of this entry, I hope you’ll feel inspired to take on your own projects and gain hands-on experience with these powerful tools. So, grab your preferred selection of caffeine (coffee for me), and let’s get started!</p>
<h3 id="heading-getting-started-my-experience-with-the-cli-and-bash">Getting Started: My Experience with the CLI and Bash</h3>
<p>First things first: going into this project I had already played around with the CLI and Bash and was somewhat comfortable with both, so my experience will differ from yours depending on your skill level with these tools. Regardless, I still learned a lot and ran into my own problems, mainly with the Bash portion, since I'm still a novice. Let me also say that no matter what level you're at, you need to be comfortable with solving problems on your own, and eventually get good at it. Think about it: most System Administrators, Jr and Sr Developers, and even IT Support folks just get really good at identifying the problem, searching for possible solutions, and testing/implementing. As a Jr Developer or SysAdmin, solving problems independently and documenting them will fast-track your career. You're going to get ahead a lot quicker and land that promotion or position a lot faster than the other guy when the time comes. Be a problem solver, not a problem bringer.</p>
<p>Alright, enough of that, let's get into it.</p>
<h3 id="heading-the-clouduploader-project-an-overview"><strong>The CloudUploader Project: An Overview</strong></h3>
<p>The CloudUploader CLI is a Bash-based tool for seamlessly uploading files to a specified cloud storage service; in my case, of course, I used AWS S3 Standard. It's also the capstone project for the first phase of the Learn to Cloud study guide. Phase 1 focuses on learning the Linux CLI, Bash scripting, and computer networking, with good resources for all three and, obviously, a project to top it off.</p>
<p>Learn more details or start your Learn to Cloud journey here: <a target="_blank" href="https://learntocloud.guide/phase1/">Learn to Cloud</a></p>
<p><strong>Disclaimer: Prerequisites for this project.</strong> Get familiar with the basics of AWS Cloud and working in the console. I recommend studying the material found in the Cloud Practitioner exam. Also get familiar with the AWS docs as this will become your best friend and eventually go-to resource when working in the cloud.</p>
<p>Get familiar with the documentation here: <a target="_blank" href="https://docs.aws.amazon.com/">AWS DOCS</a></p>
<h3 id="heading-step-1-provisioning-cloud-resources"><strong>Step 1: Provisioning Cloud Resources</strong></h3>
<p>Pretty straightforward: to start building this for yourself, begin by provisioning your cloud resources for this project. If you don't already have a free-tier AWS account (most cloud providers have this tier), I highly recommend creating one first. There's plenty of tutorials and documentation on how to get started, so I'll let you pick a resource for that.</p>
<p>After creating your very own AWS free-tier account you will be logged in as "root" and free to start building, but wait a second! I seriously advise that you go ahead and create an admin account to work with, as it is not best practice, and generally unsafe, to work as root (it's also best to set up Multi-Factor Authentication for both root and admin. DO NOT FORGET YOUR LOGIN CREDENTIALS FOR ROOT). If we keep security in mind first, we don't have to deal with more headaches later.</p>
<p>Your next step after creating an admin account will be to create an S3 bucket that you want to use for uploading files. You'll be able to use and upload to more than just one S3 bucket, but for now let's just provision one. Call it "cli-uploader" or "cloud-uploader-project", whatever makes sense to you.</p>
<h3 id="heading-step-2-installing-and-configuring-the-aws-cli"><strong>Step 2: Installing and Configuring the AWS CLI</strong></h3>
<p>Next, you're going to want to download and install the AWS CLI. To configure it you will need IAM credentials (an access key and secret key); you can either use your admin account or create a specific IAM user for this project. Do not, I repeat, please do NOT use the root account for this. Your root account shouldn't even have access key credentials to begin with.</p>
<p>I chose to create an IAM user specifically for this project with access to the CLI only, not the console, and with admin privileges. Best practice here would be to give the user access to only what it needs (least privilege), but in this case I went ahead and gave admin rights.</p>
<p>Helpful links:</p>
<p><a target="_blank" href="https://aws.amazon.com/cli/">CLI Installation</a></p>
<p><a target="_blank" href="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html">Configure CLI</a></p>
<h3 id="heading-step-3-bash-scripting-and-next-steps"><strong>Step 3: Bash Scripting and Next Steps</strong></h3>
<p>Alright great! You have your cloud environment set up, AWS CLI installed and configured, you are ready to start that Bash script. For this part I recommend doing some of your own research and practicing with Bash before you start coding.</p>
<p>The big issue I ran into during this step was learning the syntax. I had already taken Programming I &amp; II in Java, so I was familiar with most programming concepts. To really get familiar with Bash syntax I followed a YouTube course from <a class="user-mention" href="https://hashnode.com/@madebygps">Gwyneth Peña S.</a>, as well as looking on GitHub for people who did the same project and seeing how they did it.</p>
<div class="embed-wrapper"><div class="embed-loading"><div class="loadingRow"></div><div class="loadingRow"></div></div><a class="embed-card" href="https://youtube.com/playlist?list=PLlrxD0HtieHh9ZhrnEbZKhzk0cetzuX7l&amp;si=SoY1Z_5EJTbuz5Dr">https://youtube.com/playlist?list=PLlrxD0HtieHh9ZhrnEbZKhzk0cetzuX7l&amp;si=SoY1Z_5EJTbuz5Dr</a></div>
<p>You can look at my GitHub to help you start, but the fun part about this is you're only limited by what you know. That makes this project scalable in a sense, since you can always work on bugs, add new features, and refactor as you learn over time. Hell, if you wanted, you could take this project all the way to the finish line: setting up a CI/CD pipeline, configuring Infrastructure as Code, and even containerizing it with Docker. Like I said, you're only limited by what you know.</p>
<p><a target="_blank" href="https://github.com/El-Padre12">My GitHub</a></p>
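<p>To give you a feel for the shape of it, here is a stripped-down sketch of the core upload logic, not my actual script. The default bucket name is a placeholder and it assumes the AWS CLI is already configured:</p>
<pre><code class="lang-bash"># clouduploader (sketch): upload a single file to an S3 bucket
upload_file() {
    local file="${1:?usage: upload_file FILE [BUCKET]}"
    local bucket="${2:-cli-uploader}"   # placeholder bucket name
    if [ ! -f "$file" ]; then
        echo "Error: '$file' is not a file"
        return 1
    fi
    aws s3 cp "$file" "s3://$bucket/" || return 1
    echo "Uploaded $file to s3://$bucket/"
}
</code></pre>
<p>The real script grows from there: argument parsing, progress output, shareable links, multiple files, and so on.</p>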
<h3 id="heading-engage-with-me-share-your-experience"><strong>Engage with Me: Share Your Experience!</strong></h3>
<p>I’d love to hear from you!</p>
<ul>
<li><p><strong>What’s your experience with CLI and Bash?</strong></p>
</li>
<li><p><strong>Have you encountered any specific challenges while working with these tools?</strong></p>
</li>
</ul>
<p>Feel free to share your thoughts in the comments below or reach out to me directly. Engaging with fellow learners is a great way to enhance our collective knowledge, so don’t hesitate to join the conversation! If you enjoyed this blog, consider subscribing to my newsletter to stay up-to-date with AWS and IT in general. Let’s continue this journey of learning together!</p>
]]></content:encoded></item><item><title><![CDATA[Completing the AWS Cloud Resume Challenge: A Beginner's Experience]]></title><description><![CDATA[Salutations Fellow Cloud Enthusiasts,
It’s the middle of 2024, and time seems to be flying by. Looking back, I’m proud of how far I’ve come since April 2023 when I started learning networking as a complete beginner. My journey began with repeated fai...]]></description><link>https://blog.it-anc.cloud/completing-the-aws-cloud-resume-challenge-a-beginners-experience</link><guid isPermaLink="true">https://blog.it-anc.cloud/completing-the-aws-cloud-resume-challenge-a-beginners-experience</guid><category><![CDATA[AWS]]></category><category><![CDATA[Cloud Computing]]></category><category><![CDATA[Tech community]]></category><category><![CDATA[Learning Journey]]></category><category><![CDATA[cloud-resume-challenge]]></category><dc:creator><![CDATA[Angel Chavez]]></dc:creator><pubDate>Thu, 01 Aug 2024 17:23:37 GMT</pubDate><enclosure url="https://cdn.hashnode.com/res/hashnode/image/upload/v1722410463170/9c636423-79c6-4487-b7b0-279e477f2835.jpeg" length="0" type="image/jpeg"/><content:encoded><![CDATA[<p>Salutations Fellow Cloud Enthusiasts,</p>
<p>It’s the middle of 2024, and time seems to be flying by. Looking back, I’m proud of how far I’ve come since April 2023 when I started learning networking as a complete beginner. My journey began with repeated failures on the Juniper Networks Certified Internet Associate (JNCIA) exam—not once, but twice! However, by the end of the year, I turned things around after enrolling in a cloud bachelor’s program and acquiring the CompTIA Network+ certification. I also joined my school’s cybersecurity club, where I participated in capture-the-flag competitions (CTFs) and earned two more certifications (CCST &amp; AWS Cloud Practitioner). Despite these accomplishments, by early 2024, I still hadn’t landed an entry-level IT job, let alone a cloud position. Every job I applied for required professional experience, which I lacked. Sound familiar?</p>
<p>Enter The Cloud Resume Challenge (CRC), created by Forest Brazeal. The CRC is a project designed to give you hands-on experience with cloud technologies, enhance your resume, and demonstrate your ability to manage and deploy cloud-based solutions. In a nutshell, the challenge involves hosting a static website with a visitor counter (powered by a serverless function) and your resume info, all on a cloud service provider like AWS. The project also includes setting up a CI/CD pipeline, which I accomplished using GitHub Actions. The final piece is configuring Infrastructure as Code (IaC), which I plan to complete with Terraform. This is a simplified overview, but I don’t want to overwhelm you with technical jargon just yet!</p>
<p>You can learn more about the challenge here: <a target="_blank" href="https://cloudresumechallenge.dev/">Forest's Website</a></p>
<p>Now that you have some context, let me share my personal experience. This won’t be a step-by-step guide, but I do plan to create something similar in the future, broken down into digestible chunks to avoid information overload. Trust me, there’s a lot more to this challenge than meets the eye, so stay tuned for those tutorials!</p>
<hr />
<h3 id="heading-the-beginning">The Beginning</h3>
<p>I first learned about the CRC from <a class="user-mention" href="https://hashnode.com/@madebygps">Gwyneth Peña S.</a>, a cloud YouTuber I admire. Since she specializes in Azure, my initial attempt was with Azure. Unfortunately, I ran into issues, particularly with the visitor counter. I tried using C#, but quickly realized it wasn’t the right fit for me. Overwhelmed by new technologies and college technical classes in Java and AWS, I had to reassess my approach. Looking back, I see that I had spread myself too thin. So, I made an "executive decision" to push my code to GitHub and recreate/restart the project in AWS.</p>
<p>If you’ve made it this far, kudos to you! (And I guess I’m doing alright so far.) Check out my completed AWS serverless resume website here: <a target="_blank" href="https://awsresume.it-anc.cloud/">My Resume Website</a></p>
<h3 id="heading-recreation">Recreation</h3>
<p>Recreating the project in AWS was much smoother. I reused the same HTML and CSS for the frontend so the site looked the same. I uploaded my files to S3, created a CloudFront distribution for dynamic caching and HTTPS support, and registered a domain in Route 53. I then wrote a Lambda function in Python 3.11, which worked seamlessly (C# was definitely not for me). This function interacts with a DynamoDB database to keep track of the visitor counter. I could've used API Gateway for secure communication between the function and the database, though Lambda's auto-created "functionURL" worked fine for me. A bit of JavaScript was needed to invoke the Lambda function and display the updated data. However, when I visited the site, the counter wasn’t displaying due to a timing issue with the DOM content loading. Fortunately, a one-line fix with ChatGPT’s help resolved the issue. (Again, this isn’t a detailed guide, so if you want to be notified when I publish those tutorials, subscribe to my newsletter!)</p>
<p>The final step was creating a CI/CD pipeline for the frontend files using GitHub Actions, which turned out to be surprisingly simple. The project was complete, except for the IaC component, which I plan to finish soon. I also intend to learn Docker and containerize the project for fun.</p>
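<p>For the curious, that frontend pipeline boils down to a couple of AWS CLI calls on every push. This is a rough sketch rather than my actual workflow; the bucket name, distribution ID, and <code>./site</code> directory are all placeholders:</p>
<pre><code class="lang-bash"># Roughly what the GitHub Actions workflow runs after each push
deploy_frontend() {
    local bucket="${1:-my-resume-bucket}"    # placeholder bucket
    local dist_id="${2:-E1234567890ABC}"     # placeholder CloudFront distribution ID
    aws s3 sync ./site "s3://$bucket" --delete || return 1
    aws cloudfront create-invalidation --distribution-id "$dist_id" --paths "/*"
}
</code></pre>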
<hr />
<h3 id="heading-final-thoughts-and-key-takeaways"><strong>Final Thoughts and Key Takeaways</strong></h3>
<p>Recreating this project in AWS turned out to be a smooth process, thanks to the lessons I learned from my initial attempt in Azure. Here’s a quick summary of what I accomplished:</p>
<ul>
<li><p><strong>Reused Frontend Code</strong>: I used the same HTML and CSS, ensuring consistency in design.</p>
</li>
<li><p><strong>AWS Setup</strong>: I uploaded files to S3, set up CloudFront for dynamic caching and HTTPS, and registered a domain with Route 53.</p>
</li>
<li><p><strong>Serverless Function</strong>: I wrote a Lambda function in Python 3.11, which interacts with DynamoDB to manage the visitor counter.</p>
</li>
<li><p><strong>CI/CD Pipeline</strong>: I implemented a straightforward CI/CD pipeline using GitHub Actions for the frontend.</p>
</li>
</ul>
<p><strong>Key Takeaways</strong>:</p>
<ol>
<li><p><strong>Flexibility is Key</strong>: Switching from Azure to AWS was a strategic decision that aligned better with my skills and project requirements. Sometimes, changing tools or platforms can make a significant difference.</p>
</li>
<li><p><strong>Hands-On Experience Matters</strong>: Completing the CRC not only enhanced my technical skills but also provided practical experience that is highly valued by employers.</p>
</li>
<li><p><strong>Embrace Learning</strong>: Challenges like these are opportunities to learn and grow. Even if things don’t go as planned initially, perseverance and adaptability can lead to success.</p>
</li>
</ol>
<p>Whew! I haven’t written this much in a while (hopefully it doesn’t show), lol XD.</p>
<p>Thank you for following along with my journey through the Cloud Resume Challenge. Stay tuned for more posts, beginning with an in-depth, step-by-step guide on the frontend portion of this project. If you’re interested in learning more or have any questions, don’t hesitate to subscribe to my newsletter!</p>
]]></content:encoded></item></channel></rss>