
Updated Google Providers for StackQL Available

· 2 min read
Technologist and Cloud Consultant

The latest versions of the Google-related providers for StackQL (google, googleadmin, googleworkspace, and firebase) are now available. These updates include the latest services, resources, and methods available from Google.

What's New

The latest release introduces several new services to the google provider, expanding your ability to manage and query Google Cloud resources:

  • API Hub: Centrally manage and discover APIs across your organization
  • Area Insights: Access location-based insights and analytics
  • Cloud Location Finder: Identify optimal Google Cloud regions for your workloads
  • Gemini Cloud Assist: Leverage Google's AI assistant for cloud operations
  • Managed Kafka: Work with Google's fully-managed Apache Kafka service
  • Observability: Enhanced monitoring and observability services
  • Parallel Store: Interact with Google's high-performance storage solution
  • Parameter Manager: Manage configuration parameters across services
  • SaaS Service Management: Tools for managing SaaS offerings on Google Cloud
  • Secure Source Manager: Google's secure, fully-managed source control service
  • Security Posture: Assess and improve your cloud security posture
  • Storage Batch Operations: Perform batch operations on Cloud Storage resources
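
You can explore any of these new services interactively from the stackql shell using the built-in metadata queries. For example (the managedkafka service identifier below is an assumption; run SHOW SERVICES IN google to confirm the exact name):

```sql
-- list the resources available in the new Managed Kafka service
SHOW RESOURCES IN google.managedkafka;
```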

Enhanced Documentation

We've also released enhanced user documentation to help you get the most out of these providers; check out the comprehensive docs on the StackQL provider registry site.

Getting Started

To start using these updated providers, simply pull the latest versions from the stackql shell using the registry pull command:

registry pull google;
registry pull googleadmin;
registry pull googleworkspace;
registry pull firebase;

Then you can begin querying your Google resources with SQL:

SELECT name, region, status 
FROM google.compute.instances
WHERE project = 'my-project';

Use Cases for the Google Provider

The Google provider for StackQL opens up numerous possibilities:

  1. Infrastructure as Code: Manage your Google resources alongside other cloud providers in a unified IaC approach; see stackql-deploy.

  2. Cost Optimization: Identify unused resources and opportunities for cost savings.

  3. Security and Compliance: Audit account roles, permissions, and access patterns to ensure compliance with security policies.

  4. Performance Monitoring: Track resource performance and utilization, and identify optimization opportunities.

  5. Cross-Provider Orchestration: Build workflows that span Google and other cloud providers, enabling sophisticated data and infrastructure pipelines.

  6. Automated Reporting: Create automated reports on Google usage, performance, and costs.
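
As a simple sketch of the cost-optimization use case (the project name and status value here are illustrative assumptions), you could surface stopped instances that may be candidates for cleanup:

```sql
-- find TERMINATED compute instances that may be deletion candidates
SELECT name, status
FROM google.compute.instances
WHERE project = 'my-project'
AND status = 'TERMINATED';
```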

⭐ us on GitHub and join our community!

Snowflake Provider for StackQL Released

· 3 min read
Technologist and Cloud Consultant

We're excited to announce the release of the Snowflake provider for StackQL! This new provider enables you to query and interact with your Snowflake resources using familiar SQL syntax, bridging the gap between data analytics and infrastructure management.

The Snowflake provider for StackQL gives you the ability to:

  • Query Snowflake metadata and statistics using SQL
  • Monitor warehouse, database, and query performance
  • Analyze resource usage and optimize costs
  • Integrate Snowflake management with your existing cloud infrastructure
  • Build cross-provider workflows and automation

Full documentation for the Snowflake provider is available here.

Getting Started

Getting started is as easy as...

REGISTRY PULL snowflake;

Example Queries

Let's explore some powerful examples of what you can do with the Snowflake provider for StackQL.

Analyzing Warehouses

SELECT
  size,
  count(*) as num_warehouses
FROM snowflake.warehouse.warehouses
WHERE endpoint = 'OKXVNMC-VH34026'
GROUP BY size;

Other fields for the warehouses resource include: name, warehouse_type, state, scaling_policy, auto_suspend, auto_resume, resource_monitor, enable_query_acceleration, query_acceleration_max_scale_factor, max_concurrency_level, owner, warehouse_credit_limit, target_statement_size, and more.
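
For instance, the fields above can be combined to review suspension settings and ownership across your warehouses (a sketch, reusing the example endpoint from the query above):

```sql
-- review auto-suspend configuration and ownership per warehouse
SELECT name, state, auto_suspend, owner
FROM snowflake.warehouse.warehouses
WHERE endpoint = 'OKXVNMC-VH34026';
```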

Table Analysis

SELECT
  name,
  bytes,
  data_retention_time_in_days,
  table_type
FROM snowflake.table.tables
WHERE database_name = 'SNOWFLAKE_SAMPLE_DATA'
  AND schema_name = 'TPCH_SF10'
  AND endpoint = 'OKXVNMC-VH34026'
ORDER BY bytes DESC;

Other Services and Resources

Other notable resources which can be provisioned, managed or queried using the snowflake provider for stackql include:

alerts, api_integrations, catalog_integrations, compute_pools, databases, database_roles, dynamic_tables, event_tables, external_volumes, functions, grants, iceberg_tables, image_repositories, network_policies, notebooks, notification_integrations, pipes, procedures, roles, schemas, stages, streams, tasks, users, user_defined_functions, views, and more!
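
You can discover the full service and resource hierarchy for yourself from the stackql shell using the built-in metadata queries (the database service name below is an assumption; the first command will show the actual service names):

```sql
-- list all services in the snowflake provider
SHOW SERVICES IN snowflake;
-- then drill into a service to list its resources
SHOW RESOURCES IN snowflake.database;
```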

Use Cases for the Snowflake Provider

The Snowflake provider for StackQL opens up numerous possibilities:

  1. Infrastructure as Code: Manage your Snowflake resources alongside other cloud providers in a unified IaC approach; see stackql-deploy.

  2. Cost Optimization: Identify unused resources, inefficient warehouses, and opportunities for cost savings.

  3. Security and Compliance: Audit account roles, permissions, and access patterns to ensure compliance with security policies.

  4. Performance Monitoring: Track query performance, warehouse utilization, and identify optimization opportunities.

  5. Cross-Provider Orchestration: Build workflows that span Snowflake and other cloud providers, enabling sophisticated data and infrastructure pipelines.

  6. Automated Reporting: Create automated reports on Snowflake usage, performance, and costs.

⭐ us on GitHub and join our community!

(Quickly) Identify Old Node Runtimes in AWS Lambda

· 3 min read
Technologist and Cloud Consultant

Have you been sent one of these?

[Action Required] AWS Lambda end of support for Node.js 18 [AWS Account: 824123456789] [EU-CENTRAL-1]

If you are like me and manage AWS accounts with numerous Lambda functions, potentially deployed across multiple regions, you need to identify the affected resources: in this case, Lambda functions using Node.js runtimes that will be discontinued later this year.

With stackql this task is easy...

  1. Open AWS CloudShell in your AWS account (any region - it doesn't matter)
  2. Download stackql:
curl -L https://bit.ly/stackql-zip -O && unzip stackql-zip
  3. Open an authenticated stackql command shell:
sh stackql-aws-cloud-shell.sh
  4. Run some analytic queries using stackql; here are some examples...

🔍 List all functions and runtimes across regions

Run a stackql query to get the details about functions, runtimes, and so on, deployed at any given time across one or more AWS regions. You can include as many AWS regions as you like; each regional query is performed asynchronously, speeding up the results.

SELECT
  function_name,
  region,
  runtime
FROM aws.lambda.functions
WHERE region IN ('us-east-1', 'eu-west-1');

📊 Group by runtime and region

Perform an analytic query like a group by aggregate query such as...

SELECT
  runtime,
  region,
  count(*) as num_functions
FROM aws.lambda.functions
WHERE region IN ('us-east-1', 'eu-west-1', 'ap-southeast-2')
GROUP BY runtime, region;
tip

You can easily visualise this data using a notebook; see stackql-codespaces-notebook or stackql-jupyter-demo.

Using StackQL you can:

  • Quickly spot functions running on runtimes like nodejs18.x that are approaching end of support.
  • Plan your upgrades region-by-region with confidence.
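
The first point can be expressed directly as a filter on the runtime column, reusing the query above:

```sql
-- list only functions on the Node.js 18 runtime flagged in the AWS notice
SELECT function_name, region
FROM aws.lambda.functions
WHERE region IN ('us-east-1', 'eu-west-1', 'ap-southeast-2')
AND runtime = 'nodejs18.x';
```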

⭐ us on GitHub and join our community!

New AWS Provider Available (Jan 2025)

· 2 min read
Technologist and Cloud Consultant
info

To get started with the aws provider for stackql, pull the provider from the registry as follows:  

registry pull aws;

For more detailed provider documentation, see here.

Happy New Year 🎉. The latest AWS provider for StackQL is now available.  The StackQL AWS Provider by the numbers:

  • 230 services
  • 3174 resources
  • 3917 methods

with additional new support for the following services:

  • amazonmq - Managed message broker service for Apache ActiveMQ and RabbitMQ that simplifies setup and operation of open-source message brokers on AWS.
  • applicationsignals - CloudWatch Application Signals automatically provides a correlated view of application performance that includes real user monitoring data and canaries.
  • apptest - AWS Mainframe Modernization Application Testing
  • connectcampaignsv2 - Amazon Connect Outbound Campaigns V2
  • invoicing - Deploy and query invoice units, allowing you to separate AWS account costs and configure your invoice for each business entity
  • launchwizard - Easily size, configure, and deploy third-party applications on AWS
  • pcaconnectorscep - AWS Private CA Connector for SCEP
  • pcs - AWS Parallel Computing Service, easily run HPC workloads at virtually any scale
  • rbin - Recycle Bin is a resource recovery feature that enables you to restore accidentally deleted snapshots and EBS-backed AMIs.
  • s3tables - Amazon S3 Tables, enabling tabular data storage at scale
  • ssmquicksetup - AWS Systems Manager Quick Setup

And 150 new resources with some notable additions including:

  • aws.apigateway.domain_name_access_associations
  • aws.appconfig.deployments, aws.appconfig.deployment_strategies
  • aws.batch.job_definitions
  • aws.bedrock.flows, aws.bedrock.prompts
  • aws.chatbot.custom_actions
  • aws.cloudformation.guard_hooks, aws.cloudformation.lambda_hooks
  • aws.cloudfront.anycast_ip_lists
  • aws.cloudtrail.dashboards, aws.cloudwatch.dashboards
  • aws.codepipeline.pipelines
  • aws.cognito.user_pool_identity_providers
  • aws.ec2.security_group_vpc_associations, aws.ec2.vpc_block_public_access_exclusions, aws.ec2.vpc_block_public_access_options
  • aws.glue.crawlers, aws.glue.databases, aws.glue.jobs, aws.glue.triggers
  • aws.guardduty.malware_protection_plans
  • aws.iot.commands
  • aws.memorydb.multi_region_clusters
  • aws.rds.db_shard_groups
  • aws.redshift.integrations
  • aws.sagemaker.clusters, aws.sagemaker.endpoints
  • aws.secretsmanager.resource_policies, aws.secretsmanager.rotation_schedules, aws.secretsmanager.secret_target_attachments
  • aws.workspaces.workspaces_pools
  • aws.wisdom.ai_agents, aws.wisdom.ai_prompts, aws.wisdom.ai_guardrails, aws.wisdom.message_templates
  • and much more!
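
To inspect the available fields for any of these new resources before querying, you can use a DESCRIBE statement from the stackql shell, for example:

```sql
-- show the fields available on one of the new resources
DESCRIBE aws.glue.jobs;
```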

⭐ us on GitHub and join our community!

Databricks Provider for StackQL Available

· 3 min read
Technologist and Cloud Consultant

We are pleased to announce the release of the Databricks provider for StackQL. The Databricks provider is actually two providers: databricks_account and databricks_workspace.

info

Check out the registry docs at databricks_account or databricks_workspace.

To get started, pull the providers from the registry as follows:  

registry pull databricks_account;
registry pull databricks_workspace;

databricks_account provider

The databricks_account provider is used for account-level operations, including provisioning or managing users, groups, unity catalog metastores, workspaces, and account-level cloud resources used by workspaces (such as networking resources).  Services include:

stackql  >>show services in databricks_account;
|----------------------------|---------------|--------------------------------|
|             id             |     name      |             title              |
|----------------------------|---------------|--------------------------------|
| billing:v00.00.00000       | billing       | Account Billing                |
|----------------------------|---------------|--------------------------------|
| iam:v00.00.00000           | iam           | Identity and Access Management |
|----------------------------|---------------|--------------------------------|
| logging:v00.00.00000       | logging       | Log Delivery                   |
|----------------------------|---------------|--------------------------------|
| oauth:v00.00.00000         | oauth         | OAuth Integrations             |
|----------------------------|---------------|--------------------------------|
| provisioning:v00.00.00000  | provisioning  | Account Provisioning           |
|----------------------------|---------------|--------------------------------|
| settings:v00.00.00000      | settings      | Account Settings               |
|----------------------------|---------------|--------------------------------|
| unity_catalog:v00.00.00000 | unity_catalog | Unity Catalog                  |
|----------------------------|---------------|--------------------------------|

Some example databricks_account queries are shown here:

stackql  >>select * from databricks_account.iam.users where account_id = 'ebfcc5a9-9d49-4c93-b651-b3ee6cf1c9ce' and active = true;
|--------|--------------|-------------------------------------------------------------|------------|------------------|---------------------------------------------|---------------------------------------------|------------------|
| active | displayName  |                           emails                            | externalId |        id        |                    name                     |                    roles                    |     userName     |
|--------|--------------|-------------------------------------------------------------|------------|------------------|---------------------------------------------|---------------------------------------------|------------------|
| true   | Jeffrey Aven | [{"primary":true,"type":"work","value":"javen@stackql.io"}] | null       | 5728205706991489 | {"familyName":"Aven","givenName":"Jeffrey"} | [{"type":"direct","value":"account_admin"}] | javen@stackql.io |
|--------|--------------|-------------------------------------------------------------|------------|------------------|---------------------------------------------|---------------------------------------------|------------------|

or...

stackql  >>SELECT applicationId,  displayName
stackql  >>FROM databricks_account.iam.service_principals, JSON_EACH(roles)
stackql  >>WHERE account_id = 'ebfcc5a9-9d49-4c93-b651-b3ee6cf1c9ce'
stackql  >>AND JSON_EXTRACT(json_each.value, '$.value') = 'account_admin';
|--------------------------------------|-------------|
|            applicationId             | displayName |
|--------------------------------------|-------------|
| 0b7b23de-3e7d-4432-812c-cf517e079a22 | stackql     |
|--------------------------------------|-------------|

or...

stackql  >>select
stackql  >>workspace_id,
stackql  >>workspace_name,
stackql  >>deployment_name,
stackql  >>workspace_status,
stackql  >>pricing_tier,
stackql  >>aws_region,
stackql  >>credentials_id,
stackql  >>storage_configuration_id
stackql  >>from
stackql  >>databricks_account.provisioning.workspaces where account_id = 'ebfcc5a9-9d49-4c93-b651-b3ee6cf1c9ce';
|------------------|----------------|-------------------|------------------|--------------|------------|--------------------------------------|--------------------------------------|
|   workspace_id   | workspace_name |  deployment_name  | workspace_status | pricing_tier | aws_region |            credentials_id            |       storage_configuration_id       |
|------------------|----------------|-------------------|------------------|--------------|------------|--------------------------------------|--------------------------------------|
| 1583879855205171 | stackql-test   | dbc-ddbc0f51-c9cf | RUNNING          | PREMIUM      | us-west-2  | dcacd875-c782-46ea-9d3e-8307975d758a | e52e029f-24bb-4a75-99c3-7796c202dd89 |
|------------------|----------------|-------------------|------------------|--------------|------------|--------------------------------------|--------------------------------------|

databricks_workspace provider

The databricks_workspace provider is used for workspace-level operations, such as provisioning and managing clusters, dashboards, and workflow jobs (including delta live table pipelines).  Services include:  

stackql  >>show services in databricks_workspace;
|------------------------------|-----------------|-----------------|
|              id              |      name       |      title      |
|------------------------------|-----------------|-----------------|
| apps:v24.12.00279            | apps            | Apps            |
|------------------------------|-----------------|-----------------|
| cleanrooms:v24.12.00279      | cleanrooms      | Cleanrooms      |
|------------------------------|-----------------|-----------------|
| compute:v24.12.00279         | compute         | Compute         |
|------------------------------|-----------------|-----------------|
| dbsql:v24.12.00279           | dbsql           | Dbsql           |
|------------------------------|-----------------|-----------------|
| deltalivetables:v24.12.00279 | deltalivetables | Deltalivetables |
|------------------------------|-----------------|-----------------|
| deltasharing:v24.12.00279    | deltasharing    | Deltasharing    |
|------------------------------|-----------------|-----------------|
| filemanagement:v24.12.00279  | filemanagement  | Filemanagement  |
|------------------------------|-----------------|-----------------|
| iam:v24.12.00279             | iam             | Iam             |
|------------------------------|-----------------|-----------------|
| lakeview:v24.12.00279        | lakeview        | Lakeview        |
|------------------------------|-----------------|-----------------|
| machinelearning:v24.12.00279 | machinelearning | Machinelearning |
|------------------------------|-----------------|-----------------|
| marketplace:v24.12.00279     | marketplace     | Marketplace     |
|------------------------------|-----------------|-----------------|
| realtimeserving:v24.12.00279 | realtimeserving | Realtimeserving |
|------------------------------|-----------------|-----------------|
| repos:v24.12.00279           | repos           | Repos           |
|------------------------------|-----------------|-----------------|
| secrets:v24.12.00279         | secrets         | Secrets         |
|------------------------------|-----------------|-----------------|
| unitycatalog:v24.12.00279    | unitycatalog    | Unitycatalog    |
|------------------------------|-----------------|-----------------|
| vectorsearch:v24.12.00279    | vectorsearch    | Vectorsearch    |
|------------------------------|-----------------|-----------------|
| workflows:v24.12.00279       | workflows       | Workflows       |
|------------------------------|-----------------|-----------------|
| workspace:v24.12.00279       | workspace       | Workspace       |
|------------------------------|-----------------|-----------------|

An example query could be:

stackql  >>select
stackql  >>cluster_id,
stackql  >>aws_attributes,
stackql  >>node_type_id,
stackql  >>state
stackql  >>from
stackql  >>databricks_workspace.compute.clusters
stackql  >>where deployment_name = 'dbc-ddbc0f51-c9cf';
|----------------------|---------------------------------------------------------------------------------------------------------|--------------|------------|
|      cluster_id      |                                             aws_attributes                                              | node_type_id |   state    |
|----------------------|---------------------------------------------------------------------------------------------------------|--------------|------------|
| 1218-233957-q9v9oi86 | {"availability":"SPOT_WITH_FALLBACK","first_on_demand":1,"spot_bid_price_percent":100,"zone_id":"auto"} | m5d.large    | TERMINATED |
|----------------------|---------------------------------------------------------------------------------------------------------|--------------|------------|
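
Since aws_attributes is returned as a JSON document, individual keys can be extracted using the same JSON functions shown in the iam example earlier; a sketch, reusing the example deployment name:

```sql
-- extract the availability setting from the aws_attributes JSON column
SELECT
  cluster_id,
  JSON_EXTRACT(aws_attributes, '$.availability') as availability,
  state
FROM databricks_workspace.compute.clusters
WHERE deployment_name = 'dbc-ddbc0f51-c9cf';
```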

To use either provider, set the following environment variables (either locally or as secrets in your preferred CI tool):

  • DATABRICKS_ACCOUNT_ID - a UUID representing your Databricks account ID; you can get this from the Databricks UI
  • DATABRICKS_CLIENT_ID - obtained after creating a service principal through the Databricks UI
  • DATABRICKS_CLIENT_SECRET - obtained after creating a service principal secret through the Databricks UI, using the "Generate Secret" function

These are the same variables used by Terraform, the Databricks SDKs, and the Databricks CLI.

stackql-deploy examples coming soon, stay tuned!  

⭐ us on GitHub and join our community!