Discover Your IAM Role With: sts get-caller-identity

When I’m working with AWS managed services like Beanstalk, ECS, Lambda, CodePipeline, CodeBuild, or whatever, I often have difficulty remembering which roles and policies these managed services are operating under. The aws sts get-caller-identity command provides a quick solution to this problem. As the documentation says, it…

Returns details about the IAM user or role whose credentials are used to call the operation.

aws sts get-caller-identity documentation

You’ll typically find the AWS CLI already installed on the services I mentioned, so you can just run the command (no permissions are required to call it) and it will very clearly display the role (or user) you’re currently operating as.

I recently figured this out while debugging some permissions issues with a CodePipeline pipeline. I had a relatively simple pipeline that checked source out of CodeCommit, built it with CodeBuild and then deployed it to S3 from CodePipeline. When I added a Terraform command to the CodeBuild script I started seeing Access Denied errors.

I was working on the assumption that the CodeBuild operations were being performed with the role I assigned to the CodeBuild job, but then I started thinking maybe I had that wrong and it was actually the CodePipeline role. My confidence was sinking.

When I continued to have no luck adding the necessary permission, I decided I needed to verify that I was making changes to the correct role. A little digging around got me to the get-caller-identity API, so I added it to my buildspec.yml as follows:
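A minimal buildspec along these lines does the trick (the phase layout and Terraform commands here are illustrative placeholders, not the exact file from my pipeline):

```yaml
# Illustrative buildspec.yml -- phases and Terraform commands are
# placeholders; the key line is the sts call in pre_build.
version: 0.2

phases:
  pre_build:
    commands:
      # Log the role (or user) this build is actually running as
      - aws sts get-caller-identity
  build:
    commands:
      - terraform init
      - terraform apply -auto-approve
```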

A quick check of the CloudWatch logs for the build confirmed that I was in fact making changes to the correct role (it was using the role assigned to CodeBuild).

Feeling a little more confident, I dug back into the policies attached to the role and discovered that s3:ListObjects is not a valid action in an IAM Policy statement. It’s a little unfortunate because the failure occurred on a ListObjects API call but the correct statement action ended up being s3:ListBucket.
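For reference, a policy statement granting that action looks something like this (the bucket name is a placeholder). Note that s3:ListBucket is granted on the bucket ARN itself, not on the objects beneath it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-example-bucket"
    }
  ]
}
```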

Although my call to get-caller-identity only confirmed what I had suspected to be true, it made me more confident to keep digging into the policy to get it right. Too often the desire is to just throw a * in there, see that it works and move on. A little more context can help keep your policies only as permissive as they need to be.

Locking Down Your S3 Buckets With Terraform

Every time I hear about a company’s critical data being left exposed on the internet, I start thinking about my S3 buckets. I recently started creating some buckets with Terraform and realized acl = "private" isn’t as private as we would like. With that setting, it’s still possible for objects to be put into the bucket with less restrictive ACLs. If "private" is all you have set, you’ll see the text “Objects can be public” next to the bucket in the console.

To lock down your bucket, you’ll want to use the aws_s3_bucket_public_access_block resource. The full details can be found in the AWS S3 Block Public Access documentation. Here’s a full Terraform example:

resource "aws_s3_bucket" "private-bucket" {
  bucket = "private-bucket-sample"
  acl    = "private"
}

resource "aws_s3_bucket_public_access_block" "private-bucket-public-access-block" {
  bucket                  = aws_s3_bucket.private-bucket.id
  block_public_acls       = true
  block_public_policy     = true
  ignore_public_acls      = true
  restrict_public_buckets = true
}

When you’ve applied that block, you should see “Bucket and objects not public” next to your bucket in the console.

If you need public access to your S3 objects, use CloudFront or maybe signed URLs. If you really must have public S3 objects, I would suggest moving that data to a separate AWS account. It’s pretty easy to manage multiple AWS accounts these days, and having a separate account for your public bucket makes a lot of sense.
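If all you need is to hand someone temporary access to a single object, a pre-signed URL is the lightest option. The AWS CLI can generate one directly (bucket and key here are placeholders):

```shell
# Generate a URL granting access to one object for an hour
# (bucket and key are placeholders)
aws s3 presign s3://my-example-bucket/report.pdf --expires-in 3600
```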

The iPhone Switch

I’ve been using my iPhone XS for about 3 weeks now after my hellish Google support experience. So far the transition has been mostly painless. Here’s what I’m enjoying from the hardware side of things.

Battery Life

There’s a Google billboard off the 101 on your way out of San Francisco claiming the Pixel 3’s battery life is superior to the iPhone XS’s. My experience has been quite the opposite. I would typically get range anxiety with my Pixel 3 if it was going to be away from a charger for a full day. My iPhone XS has been lasting about 1.5 days of typical usage before I get down to around 30% remaining.

Bluetooth

I’m not sure how they do it, but all my Bluetooth devices sound better connected to my iPhone than they did with any of my Pixels. They connect quickly and I rarely get any dropouts, even with the phone in my front pocket (i.e. my body between the phone and my headphones). I also have a Wahoo Tickr heart rate monitor, and it connected right away with Strava. I can’t believe Google makes this seem so hard.

Face ID

I’ve been using fingerprint readers on my phones since the Nexus 6P and I’ve found them to work pretty well. I was a little surprised the iPhone XS didn’t have a fingerprint reader, and I wasn’t sure what to think about using my face to unlock my phone. According to Apple, “Face ID data doesn’t leave your device and is never backed up to iCloud or anywhere else.” I’m not willing to give my face to the government, but I am willing to trust Apple (at least for now) when they say my data is staying on the device.

So far the results have been pretty impressive: it picks up my face when I’m looking at the front of the phone, but (more importantly to me) it doesn’t when I’ve got my sunglasses on or I’m looking away. It even works well in low light. I experimented with dialing down the security level by turning off attention awareness. It works as advertised, but I prefer the added security that attention awareness offers.

Camera

I haven’t taken a load of pictures but the few I have taken have turned out well enough. I took this shot six miles into the Bridge to Bridge run and yet it’s still crisp and looks like I had a steady hand (I didn’t!).

Fort Point during the Bridge to Bridge Run

By most standards the quality should be phenomenal, but the Pixel camera was pretty good… when it worked. That’s the important thing: I haven’t had the iPhone camera fail on me once when I wanted to pop off a shot. Even before my recent debacle with Google, the Pixel camera was somewhat unreliable.

Conclusions

I’m pretty happy with my switch to the iPhone. Aside from the dent in the wallet, it has been a pretty smooth transition for a decade-long Android user.

Don’t Wait for the SFO Hole

I took this morning off work to take advantage of the “crummy” weather we’re having, so I could shoot some approaches in real Instrument Meteorological Conditions (IMC). The ceiling was 900′ and visibility was 3 miles in mist at San Carlos (SQL) when I was ready to depart. These were probably the lowest conditions I’ve departed in since receiving my instrument rating, but I was confident I could make it back to San Carlos if I had any problems during takeoff. On my way down from the city, I also noticed a sizable break in the cloud cover over SFO, so that was an option as well.

For this flight I filed from SQL to Charles M. Schulz - Sonoma County Airport (KSTS) in Santa Rosa. Usually the weather is clear up there, but today the ceilings were around…

Flying the Bay Tour

We had some amazingly clear weather for the first weekend in March. It gave me an opportunity to take some friends on a couple of trips over San Francisco for what is commonly referred to as the “Bay Tour”. When I got back into flying a few years ago, I was eager to fly the Bay Tour, but most of the information I found online talked about how incredible the flight was rather than the practical details of what to expect as a pilot. Having now flown dozens of Bay Tours out of San Carlos (SQL), I thought it was a good time to share my perspective and experiences.

OSUN, Where Art Thou… Going?

While my flight instructor Martin and I were working with the autopilot last weekend, we also spent some time working on flying with a partial panel.  This practice is intended to simulate the failure of one or more instruments on the panel. For this particular “failure,” we simulated losing the directional gyro and attitude indicator by applying round covers that obscured the instrument faces.  These are gyroscopic instruments that rely on a vacuum pump to operate, so we were essentially simulating a vacuum pump failure.


Letting George Fly

When I was shopping for a plane two years ago, my goal was to get a capable IFR platform for my instrument training and eventual instrument flight.  Not being an instrument rated pilot, I leaned on my friends and flight instructor for advice.  The overwhelming response was “you gotta have an autopilot.”

The point simply boils down to safety. While you’re on an instrument flight in instrument meteorological conditions (IMC), you have your hands full. You’re writing down air traffic control (ATC) clearances, responding to ATC commands, monitoring the health of your aircraft and, oh yeah, flying the plane. Nothing beats a good autopilot for lightening your load when things get tricky.

As luck would have it, after a few months of “close but not quite what I want” planes, I discovered N96988 at Skywagons in Placerville, just outside of Sacramento. There’s a great story there that I’ll write up one day, but the point is that I found the plane I wanted, and it came with a great GPS (GNS 480), an MX20 multifunction display (MFD) and, best of all, an STEC-55x autopilot. When I purchased the plane, I knew it was a capable autopilot, but only from what I had read; I didn’t have enough experience to put it through its paces. Without an instrument rating, my use of the autopilot had been limited to flying VFR flight plans on long trips (an all too rare occurrence). That all changed this past weekend when Martin, my instructor, and I gave that amazing little box a workout.