Interactive Development
In Wherobots, you can develop interactively in Jupyter notebooks or SQL sessions.
Notebooks
Start pre-configured JupyterLab environments with scalable runtimes. No setup required.
SQL Sessions
Execute SQL queries on demand or on a schedule via the Spatial SQL API.
Agentic Tools
Use Wherobots agentic components — the VS Code Extension, MCP Server, Agent Skills, and CLI — to bring AI-assisted geospatial development to your editor or terminal.
Agentic Tools Overview
See which components are installed automatically and which require manual setup for VS Code, Cursor, and Claude Code.
VS Code Extension
Install the extension for workspace management, notebooks, job submission, and MCP-powered AI workflows.
MCP Server
Configure the MCP server for spatial data exploration and query generation.
CLI
Submit jobs, stream logs, and access the full Wherobots API from your terminal.
Automated Development
In Wherobots, automated workloads are defined as Job Runs, which can execute Python or JAR scripts on demand or on a schedule, and SQL Sessions, which can execute SQL queries via the SQL Operator. You can submit Job Runs from the UI, API, or Airflow provider.
Job Runs
Execute Python or JAR scripts on demand or on a schedule via the REST API.
SQL Operator
Execute SQL queries against Wherobots catalogs from Airflow.
SQL API & SDKs
The Wherobots Spatial SQL API lets you execute spatial queries programmatically from Python, Java, and REST. Use it to integrate Wherobots into your applications, automate workflows, or run queries from your local machine.
Spatial SQL API
Execute Spatial SQL queries programmatically via Python, Java, and REST.
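To make the programmatic workflow concrete, here is a minimal Python sketch that builds a Spatial SQL query and a JSON request body for it. The payload field name (`statement`) and the catalog, schema, and table names are illustrative assumptions, not the documented API contract; consult the Spatial SQL API reference for the actual endpoints, request schema, and client libraries.

```python
import json

# A Spatial SQL query using Sedona-style ST_ functions.
# Catalog, schema, and table names below are placeholders.
query = """
SELECT name, ST_Area(geometry) AS area
FROM my_catalog.my_db.parcels
WHERE ST_Intersects(
    geometry,
    ST_GeomFromText('POLYGON((0 0, 10 0, 10 10, 0 10, 0 0))')
)
"""

# The request body shape is an assumption for illustration only;
# the real API may name this field differently.
payload = json.dumps({"statement": query.strip()})
print(payload)
```

The same query string works unchanged across the Python, Java, and REST entry points, since all three ultimately submit Spatial SQL text to the engine.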
Storage
Wherobots Cloud offers multiple storage options for your spatial data, including built-in managed storage and integration with your own Amazon S3 buckets. In notebooks, you can seamlessly access data from the file system, managed storage, and S3 in a unified way.
Storage Overview
Understand the storage options available in Wherobots Cloud.
Managed Storage
Built-in S3 storage included with every organization (5 GB free).
S3 Integration
Connect your own Amazon S3 buckets with IAM role authentication.
Notebook & Data Storage
How file system, managed storage, and S3 work together in notebooks.
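To illustrate the "unified access" idea, the stdlib-only sketch below classifies the kinds of storage URIs a notebook session might encounter. The bucket and path names are made up for illustration; in practice, a notebook passes any of these URI styles to the same read and write APIs.

```python
from urllib.parse import urlparse

def storage_kind(path: str) -> str:
    """Classify a storage URI the way a notebook session sees it.

    Illustrative only: notebooks accept all of these path styles
    through the same read/write APIs.
    """
    scheme = urlparse(path).scheme
    if scheme in ("s3", "s3a"):
        return "s3"     # managed storage or your own S3 bucket
    if scheme in ("", "file"):
        return "local"  # the notebook's local file system
    return scheme

# Bucket and directory names below are placeholders.
print(storage_kind("s3://my-org-managed-storage/data/roads.parquet"))  # s3
print(storage_kind("/home/user/scratch/roads.parquet"))                # local
```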
Unity Catalog Connection
Read Databricks Unity Catalog Delta tables directly in Wherobots without data migration.
RasterFlow
End-to-end inference engine for large-scale raster processing and geospatial ML workflows.
Get Started
Overview of RasterFlow: mosaics, inference, and vectorization.
Datasets
Built-in datasets and how to bring your own imagery.
Models
Built-in models and how to bring your own PyTorch models.
Orchestration
Schedule and automate Wherobots workloads from Apache Airflow DAGs. Use the Airflow provider to orchestrate notebooks, SQL sessions, and job runs alongside your other data workflows.
Airflow Provider
Orchestrate Wherobots workloads from Apache Airflow DAGs.
Job Run Operator
Submit Python or JAR scripts as Job Runs from Airflow.
SQL Operator
Execute SQL queries against Wherobots catalogs from Airflow.
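As a sketch of how the SQL Operator fits into a DAG, the fragment below defines a daily Airflow DAG with one Wherobots SQL task. The import path, operator name, and parameters follow the Wherobots Airflow provider's conventions but should be verified against the Airflow Provider page; the catalog, schema, and table names are placeholders.

```python
from datetime import datetime

from airflow import DAG
# Import path assumed from the Wherobots Airflow provider package;
# verify against the Airflow Provider documentation.
from airflow_providers_wherobots.operators.sql import WherobotsSqlOperator

with DAG(
    dag_id="wherobots_daily_summary",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Catalog, schema, and table names are placeholders.
    refresh_summary = WherobotsSqlOperator(
        task_id="refresh_summary",
        sql="""
            INSERT INTO my_catalog.my_db.daily_summary
            SELECT county, COUNT(*) AS n
            FROM my_catalog.my_db.events
            GROUP BY county
        """,
    )
```

Because this is a standard Airflow operator, the task participates in dependencies, retries, and scheduling like any other task in your DAGs.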
Monitoring
Workload History provides a comprehensive view of all your Wherobots workloads, including notebooks, SQL sessions, job runs, and automated workloads executed via the MCP server. Monitor resource consumption, track performance, and troubleshoot issues in one place.
Workload History
Monitor notebooks, SQL sessions, and job runs to track Spatial Unit (SU) consumption.
Runtimes & Compute
Runtimes
Understand runtime types (General Purpose, Memory Optimized), billing in Spatial Units, and how to request compute access.

