Merge pull request #41 from cosad3s/master
Add GCP Bucket Storage enumeration script
carlospolop authored Mar 29, 2024
2 parents ab50265 + b416853 commit 667af9d
Showing 1 changed file with 25 additions and 0 deletions.
pentesting-cloud/gcp-security/gcp-services/gcp-storage-enum.md (25 additions, 0 deletions)
@@ -109,6 +109,31 @@ If you get a permission denied error listing buckets, you may still have access to the content:
for i in $(cat wordlist.txt); do gsutil ls -r gs://"$i"; done
```
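
If `gsutil` is not available, the same brute force can be attempted unauthenticated against the XML API. This is a minimal sketch, assuming a `wordlist.txt` file with candidate bucket names; the status-code interpretation (200 for a publicly listable bucket, 403 for an existing bucket that denies anonymous listing, 404 for a non-existent one) is the commonly observed behaviour, not a guarantee:

```python
import requests

# Candidate bucket names, one per line (hypothetical wordlist file)
with open('wordlist.txt') as f:
    candidates = [line.strip() for line in f if line.strip()]

for name in candidates:
    # The GCS XML API answers for each bucket at https://storage.googleapis.com/<bucket>
    r = requests.get(f"https://storage.googleapis.com/{name}")
    if r.status_code == 200:
        print(f"[+] {name}: publicly listable")
    elif r.status_code == 403:
        print(f"[+] {name}: exists, but anonymous listing is denied")
    # Anything else (typically 404) is treated as non-existent
```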

With the permissions `storage.objects.list` and `storage.objects.get`, you should be able to enumerate all the folders and files of the bucket and then download them. You can do so with the following Python script:

```python
import requests
import xml.etree.ElementTree as ET

def list_bucket_objects(bucket_name, prefix='', marker=None):
    # The GCS XML API exposes bucket listings at https://storage.googleapis.com/<bucket>
    url = f"https://storage.googleapis.com/{bucket_name}?prefix={prefix}"
    if marker:
        url += f"&marker={marker}"
    # If the bucket is not public, an "Authorization: Bearer <access-token>" header may be needed
    response = requests.get(url)
    xml_data = response.content
    root = ET.fromstring(xml_data)
    ns = {'ns': 'http://doc.s3.amazonaws.com/2006-03-01'}
    # Print the key (path) of every object returned in this page of results
    for contents in root.findall('.//ns:Contents', namespaces=ns):
        key = contents.find('ns:Key', namespaces=ns).text
        print(key)
    # Listings are paginated: if a NextMarker is present, recurse to fetch the next page
    next_marker = root.find('ns:NextMarker', namespaces=ns)
    if next_marker is not None:
        next_marker_value = next_marker.text
        list_bucket_objects(bucket_name, prefix, next_marker_value)

list_bucket_objects('<storage-name>')
```
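
Once the object keys are known, each one can be fetched from the same endpoint and saved locally. The following is a sketch, not part of the original script: the `download_object` helper and the local path handling are illustrative, and for non-public buckets an `Authorization: Bearer <access-token>` header (e.g. obtained via `gcloud auth print-access-token`) may be required:

```python
import os
import requests

def download_object(bucket_name, key, token=None):
    # Individual objects are served at https://storage.googleapis.com/<bucket>/<key>
    url = f"https://storage.googleapis.com/{bucket_name}/{key}"
    headers = {"Authorization": f"Bearer {token}"} if token else {}
    response = requests.get(url, headers=headers)
    if response.status_code != 200:
        print(f"[-] Could not download {key} ({response.status_code})")
        return
    # Recreate the bucket's folder structure locally under a directory named after the bucket
    local_path = os.path.join(bucket_name, key)
    os.makedirs(os.path.dirname(local_path) or '.', exist_ok=True)
    with open(local_path, 'wb') as f:
        f.write(response.content)
    print(f"[+] Saved {local_path}")

# Example usage with a key printed by list_bucket_objects:
# download_object('<storage-name>', 'some/folder/file.txt')
```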

### Privilege Escalation

On the following page you can check how to **abuse storage permissions to escalate privileges**: