
78 posts tagged with "stackql"


Run StackQL Queries from the Databricks Web Terminal

· 2 min read
Technologist and Cloud Consultant

If you have access to a Databricks workspace, you can run StackQL queries directly from the Databricks Web Terminal using your Databricks identity.

How It Works

Download the latest release of stackql, then run the included convenience script (similar scripts are provided for other cloud provider terminals, e.g. AWS CloudShell).

curl -L https://bit.ly/stackql-zip -O && unzip stackql-zip
sh stackql-databricks-shell.sh

Example Queries

Here are the sample queries run in the video; just change the deployment_name to match your workspace.

User entitlements

SELECT
deployment_name,
id,
userName,
displayName,
entitlement
FROM databricks_workspace.iam.vw_user_entitlements
WHERE deployment_name = 'dbc-74aa95f7-8c7e';

All workspace settings

SELECT * FROM
databricks_workspace.settings.vw_all_settings
WHERE deployment_name = 'dbc-74aa95f7-8c7e';

Tag policies filtered by key prefix

SELECT
tag_key as key,
description
FROM databricks_workspace.tags.tag_policies
WHERE deployment_name = 'dbc-74aa95f7-8c7e'
AND key LIKE 'class%';

Catalog count by type

SELECT
catalog_type,
COUNT(*) as num_catalogs
FROM databricks_workspace.catalog.catalogs
WHERE deployment_name = 'dbc-74aa95f7-8c7e'
GROUP BY catalog_type;

Provider Coverage

The databricks_workspace provider covers workspace-related services, while the databricks_account provider covers account-level operations including provisioning, billing, and account IAM.

The web terminal flow covers workspace-scoped queries using the token of the logged-in user. For account-level queries (provisioning, billing, account IAM), you need a Databricks service principal with account admin rights and OAuth2 credentials:

export DATABRICKS_ACCOUNT_ID="your-account-id"
export DATABRICKS_CLIENT_ID="your-client-id"
export DATABRICKS_CLIENT_SECRET="your-client-secret"

These are the same variables used by the Databricks CLI and Terraform provider, so if you already have those configured, authentication works identically.
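A quick preflight check can confirm the variables are set before attempting account-level queries. This is a minimal sketch; the `check_databricks_env` function name is ours, not part of StackQL or the Databricks tooling:

```shell
#!/bin/sh
# Sketch: verify the Databricks service-principal variables are exported
# before running account-level StackQL queries in this shell.
check_databricks_env() {
  missing=0
  for var in DATABRICKS_ACCOUNT_ID DATABRICKS_CLIENT_ID DATABRICKS_CLIENT_SECRET; do
    eval "val=\${$var}"
    if [ -z "$val" ]; then
      echo "missing: $var"
      missing=1
    fi
  done
  return $missing
}

check_databricks_env || echo "set the variables above before running account-level queries"
```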

Get Started

Full provider documentation is available at databricks-account-provider.stackql.io and databricks-workspace-provider.stackql.io.

Visit StackQL on GitHub.

New Databricks Providers for StackQL Released

· 2 min read
Technologist and Cloud Consultant

Updated StackQL providers for Databricks are now available: databricks_account and databricks_workspace, giving you SQL access to the full Databricks control plane across account-level and workspace-level operations.

Provider Structure

The following updated providers are available:

| Provider | Scope | Services |
| --- | --- | --- |
| databricks_account | Account | 8 |
| databricks_workspace | Workspace | 26 |

Coverage

There are over 30 services, 300+ resources, and 983 operations spanning IAM, compute, catalog, billing, jobs, ML, serving, sharing, vector search, and more.
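Once the providers are pulled into your local registry, you can explore this coverage from the StackQL shell using the built-in metadata commands (SHOW and DESCRIBE):

```sql
-- Enumerate services and resources, then inspect a resource's fields
SHOW SERVICES IN databricks_workspace;
SHOW RESOURCES IN databricks_workspace.iam;
DESCRIBE databricks_workspace.catalog.catalogs;
```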

Example Queries

List workspaces in an account

SELECT
workspace_id,
workspace_name,
workspace_status,
aws_region,
compute_mode,
deployment_name,
datetime(creation_time/1000, 'unixepoch') as creation_date_time
FROM databricks_account.provisioning.workspaces
WHERE account_id = 'ebfcc5a9-9d49-4c93-b651-b3ee6cf1c9ce';

Query account users and roles

SELECT
id as user_id,
displayName as display_name,
userName as user_name,
active,
IIF(JSON_EXTRACT(roles,'$[0].value') = 'account_admin', 'true', 'false') as is_account_admin
FROM databricks_account.iam.account_users
WHERE account_id = 'ebfcc5a9-9d49-4c93-b651-b3ee6cf1c9ce';

List catalogs in a workspace

SELECT
full_name,
catalog_type,
comment,
datetime(created_at/1000, 'unixepoch') as created_at,
created_by,
datetime(updated_at/1000, 'unixepoch') as updated_at,
updated_by,
enable_predictive_optimization
FROM databricks_workspace.catalog.catalogs
WHERE deployment_name = 'dbc-36ff48e3-4a69';

Download billable usage to CSV

This one is worth calling out. You can pull billable usage data for a given period and write it straight to a CSV file:

./stackql exec \
-o text \
--hideheaders \
-f billable_usage.csv \
"SELECT contents
FROM databricks_account.billing.billable_usage
WHERE start_month = '2025-12'
AND end_month = '2026-01'
AND account_id = 'your-account-id'"

Authentication

Both providers authenticate using OAuth2 with a Databricks service principal. Set the following environment variables:

export DATABRICKS_ACCOUNT_ID="your-account-id"
export DATABRICKS_CLIENT_ID="your-client-id"
export DATABRICKS_CLIENT_SECRET="your-client-secret"

These are the same variables used by Terraform, the Databricks SDKs, and the Databricks CLI.

Get Started

Pull the providers:

registry pull databricks_account;
registry pull databricks_workspace;

Start querying via the shell or exec:

SELECT * FROM databricks_account.iam.account_groups WHERE account_id = 'your-account-id';

Full documentation is available at databricks-account-provider.stackql.io and databricks-workspace-provider.stackql.io. Let us know what you think on GitHub.

New Dedicated AWS Cloud Control Provider Released

· 2 min read
Technologist and Cloud Consultant

We've released a new dedicated StackQL AWS Cloud Control provider, providing full CRUDL operations across AWS services via the Cloud Control API, with purpose-built resource definitions that leverage Cloud Control's consistent schema.

Resource Naming Convention

Resources follow a clear pattern to differentiate operations:

| Resource Pattern | Operations | Use Case |
| --- | --- | --- |
| `{resource}` (e.g., s3.buckets) | SELECT, INSERT, UPDATE, DELETE | Full CRUD with complete resource properties |
| `{resource}_list_only` (e.g., s3.buckets_list_only) | SELECT | Fast enumeration of resource identifiers |

This separation means listing thousands of resources won't trigger rate limits from individual GET calls:

-- Fast enumeration (list operation only)
SELECT bucket_name
FROM awscc.s3.buckets_list_only
WHERE region = 'us-east-1';

-- Full resource details (get operation)
SELECT *
FROM awscc.s3.buckets
WHERE region = 'us-east-1'
AND data__Identifier = 'my-bucket';

Provider Coverage

The awscc provider includes:

  • 237 services and 2371 resources covering the breadth of AWS
  • Full CRUDL support for all Cloud Control compatible resources
  • Consistent schema derived from AWS CloudFormation resource specifications
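The same shell metadata commands are useful for discovering what's available before writing queries:

```sql
-- Explore awscc coverage from the StackQL shell
SHOW SERVICES IN awscc;
SHOW RESOURCES IN awscc.s3;
DESCRIBE awscc.s3.buckets;
```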

Example Operations

Create an S3 Bucket

INSERT INTO awscc.s3.buckets (
BucketName,
region
)
SELECT
'my-new-bucket',
'us-east-1';

Query EC2 Instances

SELECT 
instance_id,
instance_type,
tags
FROM awscc.ec2.instances
WHERE region = 'ap-southeast-2'
AND data__Identifier = 'i-1234567890abcdef0';

Delete a Resource

DELETE FROM awscc.lambda.functions
WHERE data__Identifier = 'my-function'
AND region = 'us-east-1';

Enhanced Documentation

The provider documentation at awscc.stackql.io now features:

  • Interactive schema explorer with expandable nested property trees
  • Complete field documentation including complex object structures
  • Ready-to-use SQL examples for SELECT, INSERT, and DELETE operations
  • IAM permissions reference for each resource operation

Get Started

Pull the new provider:

stackql registry pull awscc

Query your AWS resources:

stackql shell
>> SELECT region, bucket_name FROM awscc.s3.buckets_list_only WHERE region = 'us-east-1';

Let us know your thoughts! Visit us and give us a star on GitHub.

StackQL Joins the Linux Foundation and Agentic AI Foundation

· 2 min read
Technologist and Cloud Consultant

StackQL Studios has joined the Linux Foundation and the Agentic AI Foundation (AAIF) as a Silver Member.

As agentic AI moves from experimentation toward production workloads, we believe the infrastructure layer matters. Open, interoperable standards will be essential for AI agents to work reliably across cloud providers, data platforms, and enterprise systems.

StackQL provides a SQL-based interface for querying, provisioning, and managing cloud infrastructure across providers. Our work on the StackQL MCP Server brings this capability to AI agents through the Model Context Protocol, enabling agents to interact with cloud resources using natural language while maintaining the governance and auditability that production systems require.

Joining AAIF aligns with our long-standing commitment to open-source infrastructure tooling. We look forward to contributing to the foundation's work on MCP, agent runtimes, and the broader ecosystem of standards that will shape how AI agents interact with the systems they manage.

More information about the Agentic AI Foundation is available at aaif.io.

Google Provider Update - December 2025

· 3 min read
Technologist and Cloud Consultant

We've released a major update to the StackQL Google provider with a new service, enhanced AI/ML capabilities, and improvements across 177 service files.

New Service: Speech-to-Text v2

The speechv2 service brings Cloud Speech-to-Text API v2 to StackQL with 6 resources:

| Resource | Description |
| --- | --- |
| recognizers | Manage speech recognition configurations with create, list, get, patch, delete, undelete, recognize, and batch_recognize methods |
| custom_classes | Create custom vocabulary classes for improved recognition accuracy |
| phrase_sets | Define phrase hints to boost recognition of specific terms |
| config | Manage location-level Speech-to-Text configuration |
| locations | Query available service locations |
| operations | Track long-running operations |

Key features include support for multiple audio encodings (WAV, FLAC, MP3, OGG, WebM, MP4/AAC), translation capabilities, denoiser config, and KMS encryption support.
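An illustrative query against the new service, as a sketch: the column names follow the Speech-to-Text v2 Recognizer schema, and the projectsId/locationsId parameters are assumptions based on other google provider services; check the provider docs for the actual fields:

```sql
SELECT
name,
displayName,
model
FROM google.speechv2.recognizers
WHERE projectsId = 'my-project'
AND locationsId = 'global';
```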

Vertex AI / AI Platform

The largest update in this release (87,000+ lines changed) introduces powerful new RAG and evaluation capabilities:

  • RAG Resources: rag_corpora, rag_files, rag_engine_config for Retrieval-Augmented Generation
  • Conversational AI: New chat resource
  • Model Evaluation: evaluation_sets and evaluation_items for systematic model assessment
  • New Resources: science, invoke, and openapi resources
  • Performance: Enhanced cache_config for caching configurations
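For example, listing RAG corpora might look like the following (hypothetical: the column and parameter names are assumptions; consult the provider docs for the actual schema):

```sql
SELECT
name,
displayName,
createTime
FROM google.aiplatform.rag_corpora
WHERE projectsId = 'my-project'
AND locationsId = 'us-central1';
```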

Discovery Engine

Major enhancements (50,000+ line changes) for search and conversational AI:

  • New assistants resource
  • New sitemaps resource for site search
  • New custom_models resource
  • Enhanced sessions and answers for conversational search
  • New authorized_views and authorized_view_sets for access control

Contact Center AI Insights

Quality assurance and analytics improvements (20,000+ line changes):

  • New qa_questions and qa_question_tags for quality assurance workflows
  • New analysis_rules resource
  • New segments resource
  • New authorized_views with IAM policy support
  • New datasets and views resources

BigQuery

Enhanced governance and access control (18,000+ line changes):

  • New routines_iam_policies for stored procedure/function IAM
  • Enhanced row_access_policies

Healthcare API

Expanded metrics and data mapping (15,000+ line changes):

  • New data_mapper_workspaces_iam_policies
  • Enhanced metrics: hl7_v2_store_metrics, dicom_store_metrics, series_metrics, study_metrics
  • New instances_storage_info resource

Cloud Spanner

Backup and security enhancements (14,000+ line changes):

  • New backup_schedules with IAM support
  • New databases_split_points resource
  • New database_roles with IAM policies

Cloud SQL Admin

New integration and management features (12,000+ line changes):

  • New instances_entra_id_certificate for Microsoft Entra ID integration
  • New instances_disk_shrink_config
  • New instances_latest_recovery_time

GKE On-Prem

Enhanced IAM across VMware and Bare Metal clusters (9,000+ line changes):

  • Enhanced VMware cluster resources with IAM policies
  • Enhanced Bare Metal cluster resources with IAM policies
  • New vmware_node_pools and bare_metal_node_pools with IAM

Developer Connect

Git integration improvements (3,500+ line changes):

  • New git_repository_links_git_refs resource
  • New users_self and users_access_token resources
  • New token resources: read_token, read_write_token

Text-to-Speech

Enhanced voices and text resources with new capabilities.

Get Started

Update to the latest Google provider:

stackql registry pull google

Let us know your thoughts! Visit us and give us a star on GitHub.