
Google Provider Update - December 2025

We've released a major update to the StackQL Google provider with a new service, enhanced AI/ML capabilities, and improvements across 177 service files.

New Service: Speech-to-Text v2

The speechv2 service brings Cloud Speech-to-Text API v2 to StackQL with 6 resources:

| Resource | Description |
|---|---|
| recognizers | Manage speech recognition configurations with create, list, get, patch, delete, undelete, recognize, and batch_recognize methods |
| custom_classes | Create custom vocabulary classes for improved recognition accuracy |
| phrase_sets | Define phrase hints to boost recognition of specific terms |
| config | Manage location-level Speech-to-Text configuration |
| locations | Query available service locations |
| operations | Track long-running operations |

Key features include support for multiple audio encodings (WAV, FLAC, MP3, OGG, WebM, MP4/AAC), translation capabilities, denoiser config, and KMS encryption support.
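
As a quick example (a sketch only: the column and path parameter names below are assumptions based on typical Google provider conventions, so run DESCRIBE google.speechv2.recognizers to confirm the actual schema), listing recognizers in a project might look like this:

-- illustrative query; column and parameter names are assumptions
SELECT
  name,
  model,
  state
FROM google.speechv2.recognizers
WHERE projectsId = 'my-project' AND locationsId = 'global';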

Vertex AI / AI Platform

The largest update in this release, with 87,000+ line changes, introduces powerful new RAG and evaluation capabilities (an example query follows the list):

  • RAG Resources: rag_corpora, rag_files, rag_engine_config for Retrieval-Augmented Generation
  • Conversational AI: New chat resource
  • Model Evaluation: evaluation_sets and evaluation_items for systematic model assessment
  • New Resources: science, invoke, and openapi
  • Performance: Enhanced cache_config for caching configurations
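
Once the updated provider is pulled, you could explore the new RAG resources with a query along these lines (an illustrative sketch: the aiplatform service path, column names, and parameters are assumptions, so verify with DESCRIBE before relying on them):

-- illustrative query; service, column, and parameter names are assumptions
SELECT
  name,
  displayName,
  createTime
FROM google.aiplatform.rag_corpora
WHERE projectsId = 'my-project' AND locationsId = 'us-central1';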

Discovery Engine

Major enhancements (50,000+ line changes) for search and conversational AI:

  • New assistants resource
  • New sitemaps resource for site search
  • New custom_models resource
  • Enhanced sessions and answers for conversational search
  • New authorized_views and authorized_view_sets for access control

Contact Center AI Insights

Quality assurance and analytics improvements (20,000+ line changes):

  • New qa_questions and qa_question_tags for quality assurance workflows
  • New analysis_rules resource
  • New segments resource
  • New authorized_views with IAM policy support
  • New datasets and views resources

BigQuery

Enhanced governance and access control (18,000+ line changes):

  • New routines_iam_policies for stored procedure/function IAM
  • Enhanced row_access_policies

Healthcare API

Expanded metrics and data mapping (15,000+ line changes):

  • New data_mapper_workspaces_iam_policies
  • Enhanced metrics: hl7_v2_store_metrics, dicom_store_metrics, series_metrics, study_metrics
  • New instances_storage_info resource

Cloud Spanner

Backup and security enhancements (14,000+ line changes):

  • New backup_schedules with IAM support
  • New databases_split_points resource
  • New database_roles with IAM policies

Cloud SQL Admin

New integration and management features (12,000+ line changes):

  • New instances_entra_id_certificate for Microsoft Entra ID integration
  • New instances_disk_shrink_config
  • New instances_latest_recovery_time

GKE On-Prem

Enhanced IAM across VMware and Bare Metal clusters (9,000+ line changes):

  • Enhanced VMware cluster resources with IAM policies
  • Enhanced Bare Metal cluster resources with IAM policies
  • New vmware_node_pools and bare_metal_node_pools with IAM

Developer Connect

Git integration improvements (3,500+ line changes):

  • New git_repository_links_git_refs resource
  • New users_self and users_access_token resources
  • New token resources: read_token, read_write_token

Text-to-Speech

Enhanced voices and text resources with new capabilities.

Get Started

Update to the latest Google provider:

stackql registry pull google

Let us know your thoughts! Visit us and give us a star on GitHub.

Window Functions and CTEs Now Available in StackQL

Window functions and Common Table Expressions (CTEs) are now generally available in StackQL. These features work with both the embedded SQLite backend and PostgreSQL backend.

Window Functions

Window functions allow you to perform calculations across sets of rows related to the current row. Supported functions include:

  • Ranking: ROW_NUMBER(), RANK(), DENSE_RANK(), NTILE()
  • Offset: LAG(), LEAD(), FIRST_VALUE(), LAST_VALUE(), NTH_VALUE()
  • Distribution: PERCENT_RANK(), CUME_DIST()
  • Aggregates as window functions: SUM(), COUNT(), AVG(), etc. with OVER clause

Example: Ranking Contributors

SELECT
  login,
  contributions,
  DENSE_RANK() OVER (ORDER BY contributions DESC) as rank
FROM github.repos.contributors
WHERE owner = 'stackql' AND repo = 'stackql';

Example: Running Totals

SELECT
  login,
  contributions,
  SUM(contributions) OVER (ORDER BY contributions DESC) as running_total,
  ROUND(100.0 * contributions / SUM(contributions) OVER (), 2) as pct_of_total
FROM github.repos.contributors
WHERE owner = 'stackql' AND repo = 'stackql';

Common Table Expressions (CTEs)

CTEs let you define temporary named result sets using the WITH clause. This simplifies complex queries by breaking them into logical components.

Example: Aggregating Across Multiple Resources

WITH all_contributors AS (
  SELECT login, contributions
  FROM github.repos.contributors
  WHERE owner = 'stackql' AND repo = 'stackql'
  UNION ALL
  SELECT login, contributions
  FROM github.repos.contributors
  WHERE owner = 'stackql' AND repo = 'stackql-deploy'
)
SELECT
  DENSE_RANK() OVER (ORDER BY SUM(contributions) DESC) as rank,
  login,
  SUM(contributions) as total_contributions
FROM all_contributors
GROUP BY login
ORDER BY total_contributions DESC;

Documentation

Full documentation for window functions and CTEs is available in the StackQL docs.

Let us know your thoughts! Visit us and give us a star on GitHub.

StackQL Provider Development Skill for Claude

We've published a Claude Skill for StackQL provider development. It provides Claude with the context needed to help you build providers using the any-sdk library (the interface StackQL uses to interact with cloud providers).

What's in the Skill

The skill covers the full provider development workflow:

  • Provider document structure - The provider.yaml schema and service document layout
  • OpenAPI extensions - All x-stackQL-* extensions including x-stackQL-resources, x-stackQL-config, x-stackQL-objectKey, and others
  • Resource and method definitions - How to define resources, map methods to operations, and configure responses
  • SQL verb mappings - Connecting REST operations to SELECT, INSERT, UPDATE, and DELETE
  • Authentication patterns - API keys, service accounts, OAuth, and custom auth schemes
  • Pagination handling - Request/response token configuration
  • Response processing - JSONPath extraction, transformations, and schema handling

What is a Claude Skill?

A Claude Skill is a markdown file that provides Claude with specialized knowledge for a particular domain or task. When you add a skill to your project, Claude can reference it during conversations to give more accurate, context-aware responses.

To use this skill:

  1. Download the skill file (stackql-provider-development.md)
  2. In Claude, go to your Project settings
  3. Add the skill file to your Project Knowledge

Once added, Claude will automatically apply this knowledge when you're working on StackQL provider development.

Download

Grab the skill from the any-sdk repository:

Download stackql-provider-development.md

Usage

With the skill loaded, you can ask Claude things like:

  • "Create a provider definition for this API"
  • "How do I map this endpoint to a SELECT operation?"
  • "What's the correct objectKey syntax for this nested response?"
  • "Help me configure pagination for this API"

Claude will have the full context of StackQL's OpenAPI extensions and provider patterns to work from.

Example

Here's a sample interaction after loading the skill:

You: I have an API endpoint GET /users that returns {"data": {"users": [...]}}. 
How do I set up the resource?

Claude: For that response structure, you'd configure the method like this:

methods:
  list:
    operation:
      $ref: '#/paths/~1users/get'
    response:
      mediaType: application/json
      openAPIDocKey: '200'
      objectKey: $.data.users

The objectKey uses JSONPath to extract the users array from the nested response.

Resources

⭐ Star us on GitHub

Markdown-KV Output Format Available in pystackql

pystackql now includes a markdownkv output format optimized for LLM processing of control plane and data plane data from cloud providers.

Background

Recent research from ImprovingAgents.com tested 11 data formats to determine which ones LLMs parse most accurately. Using 1,000 synthetic employee records and 1,000 randomized queries, they measured how well different formats preserved data integrity through LLM processing.

The results:

| Format | Accuracy | 95% CI |
|---|---|---|
| Markdown-KV | 60.7% | 57.6%–63.7% |
| JSON | 52.3% | 49.2%–55.4% |
| Markdown Tables | 51.9% | 48.8%–55.0% |
| JSONL | 45.0% | 41.9%–48.1% |
| CSV | 44.3% | 41.2%–47.4% |


Markdown-KV showed a 37% relative improvement over CSV and 16% over JSON. The tradeoff: it uses approximately 2.7x more tokens than CSV.

What is Markdown-KV?

Markdown-KV uses hierarchical markdown headers with code blocks for key-value pairs:

# Query Results

## Record 1

id: i-1234567890abcdef0
name: prod-web-01
region: us-east-1
instance_type: t3.large
state: running


## Record 2

id: i-0987654321fedcba0
name: staging-web-01
region: us-west-2
instance_type: t3.medium
state: stopped

The format combines clear hierarchy, explicit key-value pairs, and readability for both humans and LLMs.

Usage

from pystackql import StackQL

stackql = StackQL()

# Query with Markdown-KV output
result = stackql.execute(
    """
    SELECT instanceId, instanceType, state, availabilityZone
    FROM aws.ec2.instances
    WHERE region = 'us-east-1'
    """,
    output='markdownkv'
)

# Use with LLMs
response = llm_client.complete(
    f"Identify instances that should be stopped:\n\n{result}"
)

Works in server mode too:

stackql = StackQL(server_mode=True)

result = stackql.execute(
    "SELECT name, region, encryption FROM google.storage.buckets WHERE project = 'my-project'",
    output='markdownkv'
)

When to Use It

Markdown-KV is useful when:

  • Feeding infrastructure data to LLMs for analysis, security reviews, or recommendations
  • Building RAG pipelines that need to accurately retrieve and reason about infrastructure
  • Accuracy matters more than token efficiency (for infrastructure decisions, it usually does)
  • Query results are focused datasets (most StackQL queries are)

The token cost is a real tradeoff, but infrastructure queries typically return targeted result sets, not massive datasets. When you're asking an LLM to analyze your production environment, accuracy matters.

Getting Started

Update pystackql:

pip install --upgrade pystackql

Add output='markdownkv' to your execute calls, or set it when instantiating the StackQL object:

result = stackql.execute(query, output='markdownkv')

Resources

The Markdown-KV output format is available in pystackql v3.8.2 and later.

⭐ Star us on GitHub and join our community!

StackQL MCP Server Now Available

StackQL now supports the Model Context Protocol (MCP). This integration enables AI agents and assistants to query and manage cloud infrastructure across multiple providers using natural language.

What is the Model Context Protocol?

The Model Context Protocol is an open standard that enables AI applications to securely connect to external data sources and tools. By running StackQL as an MCP server, AI agents like Claude, ChatGPT, and other LLM-based assistants can interact with your cloud infrastructure using StackQL's powerful SQL-based query capabilities.

Why MCP + StackQL?

Combining MCP with StackQL creates a powerful interface for AI-assisted infrastructure management:

  • Natural Language Infrastructure Queries: Ask questions about your cloud resources in plain English and get structured data back
  • Multi-Cloud Support: Access resources across AWS, Google Cloud, Azure, and 100+ other providers through a single interface
  • Secure and Standardized: MCP provides a secure, standardized way for AI agents to interact with your infrastructure
  • SQL-Powered Analytics: Leverage StackQL's full SQL capabilities including joins, aggregations, and complex queries through AI agents

Deployment Options

StackQL's MCP server supports three flexible deployment modes to suit different architectural requirements:

1. Standalone MCP Server

Perfect for development and AI agent integration:

stackql mcp \
  --mcp.server.type=http \
  --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}'

2. Dual-Protocol Server (In-Memory)

Run both MCP and PostgreSQL wire protocol simultaneously with high-performance in-memory communication:

stackql srv \
  --mcp.server.type=http \
  --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}' \
  --pgsrv.port 5665

This mode is ideal when you need both AI agent access and traditional database client connectivity.

3. Reverse Proxy with TLS

For production environments requiring distributed deployments and encrypted connections:

stackql srv \
  --mcp.server.type=reverse_proxy \
  --mcp.config '{"server": {"tls_cert_file": "/path/to/cert.pem", "tls_key_file": "/path/to/key.pem", "transport": "http", "address": "127.0.0.1:9004"}, "backend": {"dsn": "postgres://stackql:stackql@127.0.0.1:5446?default_query_exec_mode=simple_protocol"}}' \
  --pgsrv.port 5446

Available MCP Tools

When running as an MCP server, StackQL exposes several tools that AI agents can invoke:

| Tool | Description |
|---|---|
| greet | Test connectivity with the MCP server |
| list_providers | List all available StackQL providers |
| list_services | List services for a specific provider |
| list_resources | List resources within a provider service |
| list_methods | List available methods for a resource |
| query_v2 | Execute StackQL queries |

Integration with Claude Desktop

To integrate StackQL with Claude Desktop, add this configuration to your MCP settings file (~/Library/Application Support/Claude/claude_desktop_config.json on macOS):

{
  "mcpServers": {
    "stackql": {
      "command": "stackql",
      "args": [
        "mcp",
        "--mcp.server.type=http",
        "--mcp.config",
        "{\"server\": {\"transport\": \"http\", \"address\": \"127.0.0.1:9912\"}}"
      ]
    }
  }
}

Example Use Cases

Once configured, you can ask your AI assistant questions like:

  • "Show me all my EC2 instances across all AWS regions"
  • "List all Google Cloud Storage buckets with public access"
  • "Find all Azure virtual machines that haven't been updated in 30 days"
  • "Compare compute costs across AWS, Azure, and GCP"
  • "Show me IAM policies that grant admin access in my Google Cloud projects"

The AI agent will use StackQL's MCP server to execute the appropriate queries and return structured results.

Example Query Flow

Here's how an AI agent interacts with StackQL via MCP:

# AI agent lists available providers
Tool: list_providers
Response: ["google", "aws", "azure", "github", ...]

# AI agent explores a provider's services
Tool: list_services
Args: {"provider": "google"}
Response: ["compute", "storage", "cloudresourcemanager", ...]

# AI agent executes a query
Tool: query_v2
Args: {"sql": "SELECT name, status FROM google.compute.instances WHERE project = 'my-project' AND zone = 'us-east1-a'"}
Response: [{"name": "instance-1", "status": "RUNNING"}, ...]

Getting Started

  1. Download StackQL version 0.9.250 or later from stackql.io/install

  2. Set up provider authentication:

export GOOGLE_CREDENTIALS=$(cat /path/to/credentials.json)
export AWS_ACCESS_KEY_ID=your-access-key
export AWS_SECRET_ACCESS_KEY=your-secret-key

  3. Start the MCP server:

stackql mcp \
  --mcp.server.type=http \
  --mcp.config '{"server": {"transport": "http", "address": "127.0.0.1:9912"}}'

  4. Configure your AI assistant to use the StackQL MCP server (see the MCP documentation for details)

Documentation

For comprehensive documentation on configuring and using the MCP server, including:

  • Detailed configuration options
  • TLS/mTLS setup
  • Architecture considerations
  • Testing and troubleshooting

Visit the MCP command documentation.

What's Next?

We're actively developing additional MCP capabilities and welcome your feedback. Future enhancements may include:

  • Enhanced resource provisioning and lifecycle management through MCP
  • Built-in prompt templates for common infrastructure queries
  • Extended tool catalog for specialized operations
  • Support for additional MCP transport protocols

Try It Out!

The MCP server feature is available now in StackQL 0.9.250. We'd love to hear about your experiences integrating StackQL with AI agents. Share your use cases, provide feedback, or contribute to the project on GitHub.

⭐ Star us on GitHub and join our community!