Wherobots Cloud provides multiple ways to develop and run spatial analytics workloads — from interactive notebooks to automated pipelines.

Interactive Development

In Wherobots, you can develop interactively in Jupyter notebooks or SQL sessions.

Notebooks

Start pre-configured JupyterLab environments with scalable runtimes. No setup required.

SQL Sessions

Execute SQL queries on demand or on a schedule via the Spatial SQL API.

AI-Assisted Development

The Wherobots MCP server lets you interact with your spatial data catalogs, generate spatial SQL queries, and execute those queries through natural-language prompts in AI-powered editors such as VS Code with GitHub Copilot. Commands executed through the MCP server are categorized as SQL Session workloads.

Configure MCP Server

Set up the Wherobots MCP server for VS Code Copilot integration.

MCP Usage & Best Practices

Explore data and generate spatial SQL through natural language prompts.

MCP Demo

Watch how the MCP server automates data discovery and query generation.

Spatial AI Coding Assistant

The Spatial AI Coding Assistant by Wherobots turns VS Code and other Code OSS-based editors into an agentic geospatial engineering workspace with cloud-scale compute, notebook integration, MCP-powered SQL execution, and AI-assisted workflows. The extension works with VS Code, Cursor, Windsurf, Kiro, Positron, Antigravity, Trae, VSCodium, and other Code OSS editors, and is also available on the Open VSX Registry. The extension itself is available to all Organization Editions, but job submission, the Wherobots MCP server, and additional compute features require a Professional or Enterprise Edition Organization.

Extension Setup

Install and configure the Spatial AI Coding Assistant.

Workspaces & Usage

Create, start, stop, and manage workspaces from your editor.

AI-Assisted Notebooks

Create and connect local notebooks to remote Wherobots compute with your editor’s AI assistant.

Job Submission

Submit Python scripts as Wherobots job runs from your editor.

Automated Development

In Wherobots, automated workloads take two forms: Job Runs, which execute Python or JAR scripts on demand or on a schedule, and SQL Sessions, which execute SQL queries via the SQL Operator. You can submit Job Runs from the UI, the REST API, or the Airflow provider.
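To make the on-demand vs. scheduled distinction concrete, here is a minimal Python sketch of assembling a job-run submission body. The field names (script, runtime, schedule), the "tiny" runtime name, and the S3 paths are all illustrative assumptions, not the documented schema; consult the Job Runs REST API reference for the actual request format.

```python
from typing import Optional

def job_run_request(script_uri: str, runtime: str = "tiny",
                    cron: Optional[str] = None) -> dict:
    """Assemble an illustrative job-run submission body: a Python or JAR
    script to execute, plus an optional cron schedule for recurring runs.
    Field names here are assumptions, not the documented API schema."""
    body = {"script": script_uri, "runtime": runtime}
    if cron is not None:
        body["schedule"] = {"cron": cron}  # omit for a one-off, on-demand run
    return body

# On-demand run of a Python script (path illustrative):
on_demand = job_run_request("s3://my-bucket/jobs/etl.py")

# The same script every night at 02:00 UTC:
nightly = job_run_request("s3://my-bucket/jobs/etl.py", cron="0 2 * * *")
```

The only difference between the two workloads is the presence of a schedule; everything else about the script and runtime stays the same.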

Job Runs

Execute Python or JAR scripts on demand or on a schedule via the REST API.

SQL Operator

Execute SQL queries against Wherobots catalogs from Airflow.

SQL API & SDKs

The Wherobots Spatial SQL API lets you execute spatial queries programmatically from Python, Java, and REST. Use it to integrate Wherobots into your applications, automate workflows, or run queries from your local machine.
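As a rough illustration of calling the API over REST from a local machine, the sketch below POSTs a query with only the Python standard library. The endpoint URL, payload fields, and WHEROBOTS_API_KEY environment variable name are assumptions for illustration; the Spatial SQL API reference documents the real URL, authentication, and response format.

```python
import json
import os
import urllib.request

# Hypothetical endpoint -- check the Spatial SQL API reference for the real one.
API_URL = "https://api.cloud.wherobots.com/sql/session"

def build_payload(statement: str) -> dict:
    """Wrap a Spatial SQL statement in a minimal request body (illustrative)."""
    return {"statement": statement}

def run_query(statement: str, token: str) -> dict:
    """POST a query to the (assumed) endpoint and parse the JSON reply."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(statement)).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Only attempt a live call when an API key is present in the environment:
if os.environ.get("WHEROBOTS_API_KEY"):
    result = run_query(
        "SELECT ST_AsText(ST_Point(-122.33, 47.61)) AS geom",
        os.environ["WHEROBOTS_API_KEY"],
    )
    print(result)
```

For production use, the Python and Java SDKs wrap this request/response cycle for you; the raw-REST form is mainly useful for integrating from other languages or tools.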

Spatial SQL API

Execute Spatial SQL queries programmatically via Python, Java, and REST.

Storage

Wherobots Cloud offers multiple storage options for your spatial data, including built-in managed storage and integration with your own Amazon S3 buckets. In notebooks, you can seamlessly access data from the file system, managed storage, and S3 in a unified way.
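The "unified way" amounts to everything resolving to a URI: local notebook files, managed storage, and your own buckets are addressed the same way downstream. A minimal sketch of that idea, in which every prefix is a hypothetical placeholder (the actual managed-storage URI for your organization is shown in the Wherobots Storage UI):

```python
# Hypothetical placeholder prefixes -- not real Wherobots paths.
MANAGED_PREFIX = "s3://wherobots-managed/my-org/"  # managed storage (illustrative)

def resolve(path: str) -> str:
    """Return a full URI for a notebook path. Local paths, bare managed-storage
    keys, and s3:// URIs all end up as URIs that downstream readers treat alike."""
    if path.startswith(("s3://", "file://")):
        return path                      # already a full URI (e.g. your own bucket)
    if path.startswith("/"):
        return f"file://{path}"          # notebook-local file system
    return MANAGED_PREFIX + path         # bare key -> managed storage

print(resolve("/tmp/local.geojson"))       # file:///tmp/local.geojson
print(resolve("landing/parcels.parquet"))  # managed-storage URI (hypothetical prefix)
print(resolve("s3://my-company-bucket/raw/roads.parquet"))  # passed through unchanged
```

In practice you do not write this helper yourself; the point is only that a notebook can read from any of the three locations with the same call once the path is a URI.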

Storage Overview

Understand the storage options available in Wherobots Cloud.

Managed Storage

Built-in S3 storage included with every organization (5 GB free).

S3 Integration

Connect your own Amazon S3 buckets with IAM role authentication.

Notebook & Data Storage

How file system, managed storage, and S3 work together in notebooks.

Unity Catalog Connection

Read Databricks Unity Catalog Delta tables directly in Wherobots without data migration.

RasterFlow

End-to-end inference engine for large-scale raster processing and geospatial ML workflows.

Get Started

Overview of RasterFlow: mosaics, inference, and vectorization.

Datasets

Built-in datasets and how to bring your own imagery.

Models

Built-in models and how to bring your own PyTorch models.

Orchestration

Schedule and automate Wherobots workloads from Apache Airflow DAGs. Use the Airflow provider to orchestrate notebooks, SQL sessions, and job runs alongside your other data workflows.

Airflow Provider

Orchestrate Wherobots workloads from Apache Airflow DAGs.

Job Run Operator

Submit Python or JAR scripts as Job Runs from Airflow.

SQL Operator

Execute SQL queries against Wherobots catalogs from Airflow.

Monitoring

Workload History provides a comprehensive view of all your Wherobots workloads, including notebooks, SQL sessions, job runs, and automated workloads executed via the MCP server. Monitor resource consumption, track performance, and troubleshoot issues in one place.

Workload History

Monitor notebooks, SQL sessions, and job runs to track Spatial Unit (SU) consumption.

Runtimes & Compute

Runtimes

Understand runtime types (General Purpose, Memory Optimized), billing in Spatial Units, and how to request compute access.