· 2 min read
What is StackQL?

stackql is a dev tool that allows developers and analysts to query and manage cloud and SaaS resources using SQL; use cases include CSPM, assurance, user access management reporting, IaC, XOps and more.

The setup-stackql action is now available in the GitHub Marketplace. setup-stackql downloads and installs the latest stackql binary in your Actions runner. The supported runner operating systems are fully tested, including ubuntu-latest, windows-latest and macos-latest.

You can embed stackql in an Actions workflow for assurance, compliance, or provisioning use cases. An example workflow is shown here:

name: 'Setup StackQL Example'

on:
  pull_request:

defaults:
  run:
    shell: bash

jobs:
  stackql-test-matrix:
    name: Stackql local run on ubuntu-latest with wrapper
    runs-on: 'ubuntu-latest'
    steps:
      - name: Checkout
        uses: actions/checkout@v3

      - name: Setup Stackql
        uses: ./
        with:
          use_wrapper: true

      - name: Validate Stackql Version
        run: |
          stackql --version

      - name: Prep Google Creds (bash)
        run: | ## use the secret to create json file
          sudo echo ${{ secrets.GOOGLE_CREDS }} | base64 -d > sa-key.json

      - name: Use Google Provider
        run: |
          stackql exec "SELECT name, status
          FROM google.compute.instances
          WHERE project = 'stackql-demo' AND zone = 'australia-southeast1-a';" --auth='{ "google": { "type": "service_account", "credentialsfilepath": "sa-key.json" }}'

The log output shows:

|------------------|------------|
|       name       |   status   |
|------------------|------------|
| stackql-demo-001 | TERMINATED |
|------------------|------------|
| stackql-demo-002 | RUNNING    |
|------------------|------------|
| stackql-demo-003 | RUNNING    |
|------------------|------------|
| stackql-demo-004 | RUNNING    |
|------------------|------------|

Additional output formats are available through the --output argument, including json and csv.
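
For example, the same query from the workflow above could be run with JSON output for downstream processing (a minimal sketch based on the step shown earlier):

# assumes sa-key.json was created as in the "Prep Google Creds" step above
stackql exec "SELECT name, status
FROM google.compute.instances
WHERE project = 'stackql-demo' AND zone = 'australia-southeast1-a';" --auth='{ "google": { "type": "service_account", "credentialsfilepath": "sa-key.json" }}' --output json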

Source code for the action can be found at stackql/setup-stackql.

Much more to come, stay tuned!

· 6 min read

This exercise will show you how to run a real-time query across your AWS and Google cloud environments. You may do this for inventory analysis, security analysis, or any other reason you can think of. We will use stackql to query the state of your cloud resources across your AWS and Google environments. You can also use stackql to provision, de-provision or manage resources across different cloud and SaaS providers.

The steps we will take are:

  1. Prepare your environment for stackql usage.
  2. Use stackql to provision some resources in the cloud (optional).
  3. Use stackql to query resources present in the cloud.
  4. Use stackql to tear down resources created in step (2), if any. Important: you must destroy any resources created through this exercise, or you will incur ongoing charges.

Preparation

For this exercise, credentials with privileges against google and aws are required. It is outside the scope of this document to go into great detail on the relevant topics and options. Instead, the steps below provide (i) references to vendor documentation and (ii) suggestions to get you going.

For old hands

All the materials required for this exercise are:

  1. A current stackql executable.
  2. A Google Service Account Key JSON file, where the corresponding Service Account possesses permissions sufficient to create, interrogate and delete compute block storage.
  3. AWS credentials stored in the traditional AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, where the corresponding user possesses permissions sufficient to create, interrogate and delete ec2 block storage.

Step by step

First, please do the following:

  1. Download and install stackql from our website.
  2. For google:
    • (i) Create and download a Google Service Account Key as per Google documentation. Remember the location of your key file.
    • (ii) You will need to grant the Service Account at least read, list, create, and delete privileges. For more information about google iam and Service Accounts in particular, please consult the documentation. For this exercise, granting your Service Account the roles/compute.storageAdmin role would be adequate.
  3. For AWS:
    • (i) Create and download AWS user credentials as per AWS documentation. We will require long-lived credentials. In keeping with vendor advice, we strongly recommend against using root user credentials. We have created a dedicated CICD user for this exercise.
    • (ii) Set up the AWS CLI environment variables as per the documentation (a minimal example is shown after this list).
    • (iii) The user will need create / read / delete privileges against ec2 volumes. This can be done through the AWS IAM console in various ways. For example, one can use groups and permission policies. Adding your user to a group with AmazonEC2FullAccess will certainly work, although lesser privileges may be adequate.
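
For step 3 (ii), exporting the two environment variables named earlier is sufficient for this exercise (a minimal sketch; the placeholder values are yours to substitute with the credentials created in step 3 (i)):

# replace the placeholders with your downloaded AWS user credentials
export AWS_ACCESS_KEY_ID="<YOUR_AWS_ACCESS_KEY_ID>"
export AWS_SECRET_ACCESS_KEY="<YOUR_AWS_SECRET_ACCESS_KEY>"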

Then, create some shell variables:

# you will need to edit the file path as appropriate

GOOGLE_DOWNLOADED_KEY_FILE_PATH="/path/to/your/downloaded/key.json"

AWS_AUTH_FRAGMENT='{ "type": "aws_signing_v4", "credentialsenvvar": "AWS_SECRET_ACCESS_KEY", "keyIDenvvar": "AWS_ACCESS_KEY_ID" }'

GOOGLE_AUTH_FRAGMENT='{ "credentialsfilepath": "'"${GOOGLE_DOWNLOADED_KEY_FILE_PATH}"'", "type": "service_account" }'

export STACKQL_AUTH_CTX='{ "aws": '"${AWS_AUTH_FRAGMENT}"', "google": '"${GOOGLE_AUTH_FRAGMENT}"' }'

Setting up Provider Auth in PowerShell
$GOOGLE_DOWNLOADED_KEY_FILE_PATH = "C:\path\to\your\downloaded\key.json"

$AWS_AUTH_FRAGMENT = '{ "type": "aws_signing_v4", "credentialsenvvar": "AWS_SECRET_ACCESS_KEY", "keyIDenvvar": "AWS_ACCESS_KEY_ID" }'

$GOOGLE_AUTH_FRAGMENT = '{ "credentialsfilepath": "' + $GOOGLE_DOWNLOADED_KEY_FILE_PATH + '", "type": "service_account" }'

$env:STACKQL_AUTH_CTX = '{ "aws": ' + $AWS_AUTH_FRAGMENT + ', "google": ' + $GOOGLE_AUTH_FRAGMENT + ' }'

Start a stackql shell session

To start an interactive shell session, run the following in the same shell in which you set up your environment variables:

stackql --auth="${STACKQL_AUTH_CTX}" shell

You can exit at any time with ctrl + C.
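
Alternatively, you can run a single statement non-interactively with stackql exec, using the same auth context (a minimal sketch using a meta query described below):

# run one statement and exit, reusing the auth context set up above
stackql exec "show providers;" --auth="${STACKQL_AUTH_CTX}"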

Setup and meta queries to get started

StackQL providers are installed from the StackQL Provider Registry using the REGISTRY command. StackQL supports meta queries such as SHOW and DESCRIBE, which can be used to explore the services, resources, fields, and operations available in a given cloud or SaaS provider.

-- see available providers
registry pull list;

-- pull the required providers
registry pull google;

registry pull aws;

-- show the installed providers
show providers;

-- some meta queries
show services in google;

show resources in google.compute;

describe google.compute.disks;
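
The same style of meta queries works for the aws provider; a quick sketch you can use to confirm the fields referenced in the queries below:

-- explore the aws provider in the same way
show services in aws;

show resources in aws.ec2;

describe aws.ec2.volumes;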

Create block storage (optional)

You will need to replace the items in <ANGLE_BRACKETS>.

-- create a google volume, await and verify creation completes successfully
insert /*+ AWAIT */ into google.compute.disks(
project,
zone,
data__name,
data__sizeGb
)
select
'<YOUR_GCP_PROJECT>',
'australia-southeast1-a',
'my-stackql-demo-disk-01',
'10' ;

-- create an aws volume, operation despatched on a BEST EFFORT basis
insert into aws.ec2.volumes(
AvailabilityZone,
Size,
region)
select
'ap-southeast-2a',
10,
'ap-southeast-2';

Interrogate cloud block storage


-- query one resource from google
select
name,
split_part(split_part(type, '/', 11), '-', 2) as type,
status,
sizeGb as size
from google.compute.disks
where project = '<YOUR_GCP_PROJECT>'
and zone = 'australia-southeast1-a';

-- query the equivalent from aws
select
volumeId as name,
volumeType as type,
status,
size
from aws.ec2.volumes
where region = 'ap-southeast-2';

-- union the equivalent resources across clouds
select
'google' as vendor,
name,
split_part(split_part(type, '/', 11), '-', 2) as type,
status,
sizeGb as size
from google.compute.disks
where project = '<YOUR_GCP_PROJECT>'
and zone = 'australia-southeast1-a'
union
select
'aws' as vendor,
volumeId as name,
volumeType as type,
status,
size
from aws.ec2.volumes
where region = 'ap-southeast-2';

-- create a view for convenience
create view dual_cloud_block_storage as
select
'google' as vendor,
name,
split_part(split_part(type, '/', 11), '-', 2) as type,
status,
sizeGb as size
from google.compute.disks
where project = '<YOUR_GCP_PROJECT>'
and zone = 'australia-southeast1-a'
union
select
'aws' as vendor,
volumeId as name,
volumeType as type,
status,
size
from aws.ec2.volumes
where region = 'ap-southeast-2';

-- select from the newly created view, with ordering
select * from dual_cloud_block_storage order by name desc;

Delete block storage (if required)

This will only work if the disks are deletable. For example, aws.ec2.volumes must have status = available; you can check this with the view we created above.
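
For instance, before attempting the aws delete you could confirm the volume is in the available state using the view (a quick sketch):

-- check volume status via the view created above
select vendor, name, status from dual_cloud_block_storage where vendor = 'aws';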

/* delete a google volume, await and verify deletion completes successfully.
One at a time only... */
delete /*+ AWAIT */ from google.compute.disks
where project = '<YOUR_GCP_PROJECT>'
and zone = 'australia-southeast1-a'
and disk = 'my-stackql-demo-disk-01';

-- delete an aws volume (replace the VolumeId below with your own, e.g. from the view above); operation despatched on a BEST EFFORT basis
delete from aws.ec2.volumes
where VolumeId = 'vol-049ee07b31aff451a'
and region = 'ap-southeast-2';

Verify the cleanup was successful

select * from dual_cloud_block_storage order by name desc;

That's it for the scripted demo!

Get involved

We Need Your Help!

If you find bugs, want features, or have tech questions, go to github.com/stackql/stackql/issues and raise the appropriate issue 🙏

· 2 min read

The StackQL Sumologic provider is now available in the public StackQL Provider Registry. Docs are available at sumologic provider docs.

StackQL is an intelligent API client which uses SQL as a front-end language. StackQL can be used for querying cloud and SaaS providers, as well as provisioning and lifecycle operations.

The StackQL Sumo provider can query, create, update and delete Sumologic collectors and sources, view and manage ingest budgets, health events, dashboards, user and account access and activity, and more.

Some example queries include:

SELECT id, name FROM sumologic.collectors.collectors WHERE region = 'au';

or using built-in functions to simplify and format query outputs, such as:

SELECT alive, datetime(lastSeenAlive/1000, 'unixepoch') AS lastSeenAliveUtc,
datetime(lastSeenAlive/1000, 'unixepoch', 'localtime') AS lastSeenAliveLocal
FROM sumologic.collectors.collectors
WHERE region = 'au' AND id = 116208196;

Another example:

SELECT id, email,
firstName || ' ' || lastName AS fullName,
isMfaEnabled,
lastLoginTimestamp,
round(julianday('now') - julianday(lastLoginTimestamp), 0) as daysSinceLastLogin
FROM sumologic.users.users WHERE region = 'au';

An example using StackQL with the Sumologic provider to query users and roles and join the results to get a list of users and their roles:

SELECT u.email as email, r.name AS role
FROM sumologic.users.users u
JOIN sumologic.roles.roles r
ON JSON_EXTRACT(u.roleIds, '$[0]') = r.id
WHERE u.region = 'au' AND r.region = 'au';

An example using StackQL and Jupyter is shown here (see stackql/stackql-jupyter-demo):

Use StackQL and Jupyter to query SumoLogic

StackQL can also be used to provision objects in Sumologic; for instance, the following query can be used to create a collector:

INSERT INTO sumologic.collectors.collectors(region, data__collector)
SELECT 'au',
'{ "collectorType":"Hosted", "name":"My Hosted Collector", "description":"An example Hosted Collector", "category":"HTTP Collection" }';

Let us know what you think!

· One min read

StackQL is an intelligent API client which uses SQL as a front-end language, with support for multi-cloud and SaaS provider environments. You can find more information at github.com/stackql/stackql.

StackQL can provide valuable insights into your cloud and SaaS estates, whether for security posture management, cross-cloud entitlements reporting, cost optimization, or asset/inventory management.

As an interactive analysis tool, Jupyter notebooks can use StackQL as a data source for cloud and SaaS provider data.

GCP Nodes

We've recently added magic function support for running StackQL queries in Jupyter notebooks, making the integration between StackQL and Jupyter more seamless. StackQL magic can be used on a single line in a cell or on the entire cell itself, as shown here:

The stackql-jupyter-demo Docker image is available from Docker Hub. You can find instructions to run using the Docker Hub image and instructions to run using docker-compose at github.com/stackql/stackql-jupyter-demo.

Give us your feedback!

· 2 min read

We're proud to announce the release of the first version of our middleware server and playground for StackQL.

StackQL allows you to query and interact with cloud services and APIs using SQL grammar and an ORM that is a direct reflection of the provider API; no database is required or implemented.

StackQL Middleware Server

Our middleware solution allows you to use StackQL as a query language to interact with APIs, much like GraphQL - however, the query DSL is SQL, providing a friendlier, more data-centric experience for developers. As shown in the example below, developers can POST queries to a /stackql endpoint; the queries are parsed and executed by a StackQL runner via the middleware server.
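
As a rough illustration, a client could POST a StackQL query to the /stackql endpoint as follows (a sketch only; the host, port and request body shape here are assumptions, check the repository linked below for the actual API contract):

# hypothetical request shape; confirm the actual schema and port in the middleware server repo
curl -X POST http://localhost:3000/stackql \
  -H "Content-Type: application/json" \
  -d '{ "query": "show providers;" }'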

As the StackQL middleware API server and runner are stateless, the solution is horizontally scalable. Once caching and authorization are implemented, the solution will provide a flexible, robust, scalable, and performant back end for applications.

Furthermore, using SQL semantics, developers can JOIN data between API providers and perform projections, filtering, aggregation, windowing operations, and more on simple or complex data types.

StackQL Playground

The StackQL Playground is a TypeScript app that connects to a StackQL Middleware Server, which provides access to backend APIs using SQL. Features of the playground include...

Explore API Providers

The StackQL Playground allows you to explore and query providers, services, resources, fields, and methods.

Submit Queries

Using the Playground you can submit queries to the /stackql endpoint of the StackQL Middleware Server.

View, Sort, and Filter Results

Query results can be sorted or filtered in the grid result set in the StackQL Playground app; JSON results can also be viewed or copied from the JSON results tab.

Save Results as CSV or JSON

Users can save results to local CSV or JSON files as well.

Generate Types

Furthermore, after modeling a query you can export the TypeScript types using the Get Types button.

Huge thanks to Yuncheng Yang for the work he put in on this!

You can find the complete code here to launch an environment using docker-compose, which includes the StackQL Middleware Server, a StackQL runner (runs the queries on the back end), and the StackQL Playground app.

Let us know what you think.