Beginner Tools, Expert Resources

As a Cloud Architect and somebody who has a lot of past experience with virtualization, one thing that is both an advantage and a disadvantage is the need to know a little bit about a slew of different technologies. Having to answer questions an inch deep across a mile of topics one day and a mile deep on a square inch of topics the next can be challenging. If anybody in IT denies using Google as their primary tool much of the time, they’re lying. This applies to amateurs and veterans alike.

Just as carpenters keep using the tried-and-true tools of their apprenticeships, so too do the masters of any trade. They don’t toss their hammers aside when they move on to power tools. They add to their toolsets with the new and advanced, supplementing rather than supplanting.

In the same vein, one of the most important things an infotech professional can do is build and cultivate their own toolsets. I use the term “cultivate” because, ideally, the relationship won’t be one of simply consumption. But, I’m getting ahead of myself. Going back to the carpentry analogy, the basic tools will always have a place. But what are some power tools, expert resources, that could jumpstart your toolset?

Here are several tools that are part of my own toolset that I find helpful.

  1. The Documentation Site
    Honestly, this is probably a cheap start, but I’ll stand by it. My current focus is on AWS and, more often than not, their docs have an answer or can point me in the right direction. The site is broken down by topic and then divided by service.
  2. AWS Prescriptive Guidance
    From the Prescriptive Guidance page, “These resources were developed by experts at AWS Professional Services, based on years of experience helping customers realize their business objectives on AWS.” I almost feel as if I’m giving away trade secrets by sharing this one. There are guides and tested patterns for a couple of hundred things you might want to do. Want guidance on migrating SQL workloads into EC2 or RDS? Here it is. Want to set up a logging and monitoring solution on CloudWatch? Here you go.
  3. AWS QuickStarts
    From the Quick Start page, “Quick Starts are automated reference deployments [that] help you deploy popular technologies on AWS based on AWS best practices for security and high availability. These accelerators reduce hundreds of manual procedures into just a few steps so that you can build your production environment in minutes and start using it immediately […] includes AWS CloudFormation templates that automate the deployment and a guide that describes the architecture and provides deployment instructions.”

    If you want to get something up and running fast and architected correctly, begin with a Quick Start. There are over 300 reference architectures for you to start with, including things like setting up a centralized logging solution or an automated security response and remediation using Security Hub. There’s even a Quick Start for a .NET CI/CD pipeline.
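Since every Quick Start ships as a CloudFormation template, you can launch one from the AWS CLI instead of clicking through the console. This is only a sketch: the stack name, template URL, and parameter names below are placeholders, and each Quick Start’s deployment guide lists the real ones.

```shell
# Hypothetical sketch of launching a Quick Start's CloudFormation
# template with the AWS CLI. The template URL and parameters here are
# placeholders -- copy the real values from the Quick Start's guide.
aws cloudformation create-stack \
  --stack-name my-quickstart-demo \
  --template-url https://example-bucket.s3.amazonaws.com/quickstart-template.yaml \
  --parameters ParameterKey=KeyPairName,ParameterValue=my-key \
  --capabilities CAPABILITY_IAM
```

The `--capabilities CAPABILITY_IAM` flag is there because most Quick Starts create IAM roles as part of the deployment, and CloudFormation requires you to acknowledge that explicitly.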
  4. AWS Workshops
    Over a hundred workshops developed by AWS to demonstrate various technologies across all skill levels. Here, you can get some real experience with a load of AWS’s services.
  5. AWS Modernization Workshops
    Created by AWS and AWS partners, this is a site full of guided labs and hands-on workshops focusing on specific services and related vendor technologies.

I briefly touched on it earlier, but start to build your own tools. Write your own guides and share architectures you’ve built or improved upon. Chances are you’ll either (a) help somebody else or (b) get some free advice. 🙂

Anyway, enough of me for now. These are some of the resources I use on a regular basis, and I’ll try to share some others I’ve found useful. Let me know if you have any recommendations of your own.

AWS Certification Journey: Solutions Architect Associate

This time last year, I was counting down the last few days until I was to sit my first AWS certification exam, the Solutions Architect Associate. In October of 2018, I decided this would be the course I’d take next and began working to skill myself up in all things AWS. There is a lot of conceptual overlap between virtualization and Cloud, so personal experience combined with about three months of study led to my passing and achieving the cert.

I thought I’d share the things I used on the off chance anybody else finds it useful. What I find most valuable is identifying more than one resource to get different perspectives on the objectives and, in the case of the book, it was also nice to be able to study offline. With the SA Associate certification, I used a combination of the exam blueprint, a textbook, multiple online classes, and (most importantly) hands-on experience. All told, it was a personal investment of only about $350, including the exam fee and any usage that exceeded the free tier.

The guidance from the two online courses was excellent and also provided a good deal of hands-on experience.

After you identify the resources you intend to use to study, the next thing you should do is pick a date for the exam and actually schedule it. The commitment of placing actual money is a great motivator. Scheduling will give you a deadline to work towards, not only making the goal real but providing a sense of urgency.

The “Right” Way to Access Private EC2 Instances

Accessing private EC2 instances through a public jump host presents a potential security issue. Each instance must be accessed using its private key, but storing that key on the jump host is a bad (actually, very bad) idea. If the jump host were compromised, the stolen key could be used to access any of your systems configured with it.

Try using ‘ssh-agent’ on your local system to store and use your private keys. This allows your jump host to forward authentication requests back to your local agent, so the key never has to be uploaded or shared.

# Step 1 is to actually start 'ssh-agent'
# Run directly, it prints the commands that set the necessary
# environment variables and displays the PID of the agent.
# (Tip: eval "$(ssh-agent -s)" starts the agent and sets the
# variables in your current shell in one step.)
#
# Example:
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-wlp2yfs0IEW8/agent.16882; export SSH_AUTH_SOCK;
SSH_AGENT_PID=16883; export SSH_AGENT_PID;
echo Agent pid 16883;

# Step 2 is to store your private key using 'ssh-add'
# ssh-add privateKeyName.pem
#
# Example:
$ ssh-add kpSuperSecret.pem
Identity added: kpSuperSecret.pem (kpSuperSecret.pem)

Once the private key is stored in your local agent, you can connect to your jump host with SSH as normal, adding the “-A” flag to enable agent forwarding. From there, you can hop to the private hosts without any key ever living on the jump host:

$ ssh ec2-user@ec2-somename-compute-1.amazonaws.com -A

# And then, on into the private instances:
$ ssh ec2-user@ip-someInternalIP.ec2.internal
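If you use this pattern regularly, the forwarding can live in your local SSH client configuration instead of being typed as a flag each time. A minimal sketch of a ~/.ssh/config entry, reusing the placeholder hostname from the example above:

```
# ~/.ssh/config on your LOCAL machine -- hostname is a placeholder
Host jump
    HostName ec2-somename-compute-1.amazonaws.com
    User ec2-user
    ForwardAgent yes
```

With this in place, a plain `ssh jump` behaves like the `-A` example above. One caveat worth knowing: agent forwarding is only as trustworthy as the jump host itself, so enable it only for hosts you control.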