StackQL is a simple way to automate the deployment, configuration, management, and removal of Google Cloud Storage buckets, as well as an easy way to query the buckets in your environment, whether for asset management or to look for misconfigurations such as public access, non-conformant encryption configuration, and more. It may be useful to first review Exploring GCP services and resources using StackQL, which walks through the StackQL resource hierarchy, including the SHOW and DESCRIBE commands.
Generate an INSERT template for a new bucket
The SHOW INSERT command in StackQL can be used to generate an INSERT template which can be used to create any resource in GCP. The easiest way to use this command is from the command line, using the text output type and suppressing column headers with the -H flag, although the StackQL interactive shell can be used as well.
- Interactive Shell:

SHOW INSERT INTO google.storage.buckets;

- Command Line:

stackql exec "SHOW INSERT INTO google.storage.buckets" --output text -H
To show only the mandatory fields for a resource, you can use the /*+ REQUIRED */ query hint.
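A sketch of the hinted command is shown below; the placement of the hint here follows StackQL's Oracle-style comment hints, but is an assumption, so check the SHOW command documentation if your version expects it elsewhere:

SHOW INSERT /*+ REQUIRED */ INTO google.storage.buckets;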
You could also use the --outfile flag to write the template to a new IQL file, or pipe the results to a file using the appropriate shell operator (e.g. > or >>).
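For instance, a minimal example combining the command-line form above with a shell redirect (the file name bucket_template.iql is illustrative):

# write the generated INSERT template to a file for later editing
stackql exec "SHOW INSERT INTO google.storage.buckets" --output text -H > bucket_template.iql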
More information on generating templates can be found at Creating Infrastructure Templates.
Create a bucket
Now that you have a template, you can curate it down to just the fields you need. Executing an INSERT command to create a bucket is as easy as:
-- change this for your project and bucket names
INSERT INTO google.storage.buckets(
  project,
  data__name,
  data__location,
  data__locationType,
  data__labels
)
SELECT
  'stackql-demo',
  'stackql-demo-bucket',
  'US',
  'multi-region',
  '[{"env":"demo"}]'
;
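To confirm the bucket was created, you can run a quick query against the resource (the SELECT syntax is covered in more detail later in this article):

-- confirm the new bucket exists in the project
SELECT name, location, timeCreated
FROM google.storage.buckets
WHERE project = 'stackql-demo';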
Removing buckets
In some cases you may want to retire and remove buckets that are no longer needed. Doing so is easy in StackQL using the DELETE command, and it can be done individually or in a batch (as we will demonstrate later).
NOTE: buckets cannot be deleted if they contain any objects; we will show examples of working with objects in future posts.
To delete the bucket you created in the previous step, run the following:
DELETE FROM google.storage.buckets
WHERE bucket = 'stackql-demo-bucket';
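To verify the bucket has been removed, a count scoped to the bucket name (using the same query shape shown later in this article) should now return 0:

-- should return 0 after the DELETE above
SELECT COUNT(*) as 'Number of Buckets'
FROM google.storage.buckets
WHERE project = 'stackql-demo'
AND name = 'stackql-demo-bucket';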
Using variables to automate bucket creation
Although the previous example was simple, providing static values for attributes is not very scalable. In most cases, configuration for resource creation and modification is provided as data associated with a StackQL script or module using json or jsonnet files. This can be supplied inline in the IQL script or provided as a separate file, e.g. vars.jsonnet. For more information, see Using Variables.
The example below demonstrates how to create 3 different buckets using configuration data provided using jsonnet.
TIP: Use Jsonnet where possible, as it is a much more powerful and feature-rich configuration and data templating language, supporting inheritance, external variables, parameterization, functions and operators, and much more. For more information visit https://jsonnet.org/.
Step 1: Save the following jsonnet configuration to a file named cloud_storage_vars.jsonnet:
local project = 'stackql-demo';
{
  project: project,
  location: 'US',
  locationType: 'multi-region',
  buckets: [
    {
      name: project + '-bucket1',
      label: [{"env":"demo1"}],
    },
    {
      name: project + '-bucket2',
      label: [{"env":"demo2"}],
    },
    {
      name: project + '-bucket3',
      label: [{"env":"demo3"}],
    },
  ]
}
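If you have the standalone jsonnet command-line tool installed (an assumption; it is not required by StackQL), you can preview the rendered configuration before using it:

# render the configuration to plain JSON for inspection
jsonnet cloud_storage_vars.jsonnet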
Step 2: Save the following StackQL script to a file named deploy_buckets.iql:
{{range .buckets}}
INSERT INTO google.storage.buckets(
  project,
  data__location,
  data__locationType,
  data__name,
  data__labels
)
SELECT
  '{{ $.project }}',
  '{{ $.location }}',
  '{{ $.locationType }}',
  '{{ .name }}',
  '{{ .label }}'
;
{{end}}
Step 3: (Optional) Perform a dry run to see what the resultant query would be.
With any modification operation which sources data from a jsonnet file, it is useful to perform a dry run first to ensure the variable substitution is as you intended. This is done using the --dryrun flag, as shown here:
stackql exec --iqldata ./cloud_storage_vars.jsonnet --infile ./deploy_buckets.iql --dryrun --output text -H
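The dry run prints the rendered statements without executing them; the first of the three INSERT statements should look something like the following (the exact serialization of the label value depends on how StackQL renders the jsonnet array, so treat that line as illustrative):

INSERT INTO google.storage.buckets(
  project,
  data__location,
  data__locationType,
  data__name,
  data__labels
)
SELECT
  'stackql-demo',
  'US',
  'multi-region',
  'stackql-demo-bucket1',
  '[{"env":"demo1"}]'
;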
Step 4: Run the query.
If you are satisfied with the results of the dry run, remove the --dryrun and output flags (which are no longer needed) and run the command, as shown here (note that if you are creating or deploying resources in GCP you will need to supply a valid service account key file or authenticate interactively):
stackql exec --iqldata ./cloud_storage_vars.jsonnet --infile ./deploy_buckets.iql --keyfilepath /mnt/c/tmp/stackql-demo.json
Querying buckets using SELECT statements
This is easy...
SELECT id, name, location, locationType, timeCreated
FROM google.storage.buckets
WHERE project = 'stackql-demo';
To order the result set by timeCreated, latest first...
SELECT id, name, location, locationType, timeCreated
FROM google.storage.buckets
WHERE project = 'stackql-demo'
ORDER BY timeCreated DESC;
Or a simple count:
SELECT COUNT(*) as 'Number of Buckets'
FROM google.storage.buckets
WHERE project = 'stackql-demo'
AND name LIKE '%stackql-demo-bucket%';
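You can also aggregate in the usual SQL fashion; for example, a sketch counting buckets by location type (assuming standard GROUP BY support in your StackQL version):

-- count buckets in the project, grouped by location type
SELECT locationType, COUNT(*) as 'Number of Buckets'
FROM google.storage.buckets
WHERE project = 'stackql-demo'
GROUP BY locationType;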
Cleanup
To clean up the environment and remove the test buckets, copy the following code to a file named bucket_cleanup.iql.
{{range .buckets}}
DELETE FROM google.storage.buckets
WHERE bucket = '{{.name}}';
{{end}}
Since we are in the danger zone here (deleting multiple persistent data objects), you may want to be sure you haven't fat-fingered anything and don't remove anything you didn't intend to... To do this, use the --dryrun option shown earlier to preprocess the input data only and show the templated commands before you run them.
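For example, mirroring the dry-run command from Step 3 but pointing at the cleanup script:

stackql exec --iqldata ./cloud_storage_vars.jsonnet --infile ./bucket_cleanup.iql --dryrun --output text -H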
When you are ready to go you can run this command:
stackql exec --iqldata ./cloud_storage_vars.jsonnet --infile ./bucket_cleanup.iql --keyfilepath /mnt/c/tmp/stackql-demo.json
Now you should see the filtered count return 0 resources, as shown here:
SELECT COUNT(*) as 'Number of Buckets'
FROM google.storage.buckets
WHERE project = 'stackql-demo'
AND name LIKE '%stackql-demo-bucket%';
Future posts will demonstrate more advanced query and deployment operations with Cloud Storage, such as working with encryption using CMEK in KMS, and configuring logging, CORS, website access, ACLs and IAM. See ya then!