Interactive Development
In Wherobots, you can develop interactively in Jupyter notebooks or SQL sessions.
Notebooks
Start pre-configured JupyterLab environments with scalable runtimes. No setup required.
SQL Sessions
Execute SQL queries on demand or on a schedule via the Spatial SQL API.
AI-Assisted Development
The Wherobots MCP server lets you interact with your spatial data catalogs, generate spatial SQL queries, and execute those queries through natural-language prompts in AI-powered IDEs like VS Code Copilot. Commands executed through the MCP server are categorized as SQL Session workloads.
Configure MCP Server
Set up the Wherobots MCP server for VS Code Copilot integration.
MCP Usage & Best Practices
Explore data and generate spatial SQL through natural language prompts.
MCP Demo
Watch how the MCP server automates data discovery and query generation.
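In VS Code, MCP servers are typically registered in a `.vscode/mcp.json` file. The shape below follows VS Code's standard MCP configuration format, but the package name, command, and environment variable are placeholders — consult the Configure MCP Server guide for the actual values:

```json
{
  "servers": {
    "wherobots": {
      "command": "uvx",
      "args": ["<wherobots-mcp-package>"],
      "env": {
        "WHEROBOTS_API_KEY": "<your-api-key>"
      }
    }
  }
}
```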
Automated Development
In Wherobots, automated workloads take two forms: Job Runs, which execute Python or JAR scripts on demand or on a schedule, and SQL Sessions, which execute SQL queries via the SQL Operator. You can submit Job Runs from the UI, the API, or the Airflow provider.
Job Runs
Execute Python or JAR scripts on demand or on a schedule via the REST API.
SQL Operator
Execute SQL queries against Wherobots catalogs from Airflow.
SQL API & SDKs
The Wherobots Spatial SQL API lets you execute spatial queries programmatically from Python, Java, and REST. Use it to integrate Wherobots into your applications, automate workflows, or run queries from your local machine.
Spatial SQL API
Execute Spatial SQL queries programmatically via Python, Java, and REST.
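As a sketch of what a REST call to the Spatial SQL API involves, the snippet below builds (but does not send) an authenticated HTTP request using only the Python standard library. The host, path, payload fields, and auth header name are illustrative assumptions, not the documented schema — consult the Spatial SQL API reference for the real endpoint:

```python
import json
import urllib.request

# Placeholder values — the real endpoint, payload schema, and auth
# header are documented in the Spatial SQL API reference.
API_URL = "https://api.cloud.wherobots.com/sql"  # illustrative path
API_KEY = "<your-api-key>"

payload = {
    # A spatial join counting points of interest per city;
    # ST_Contains is a standard Spatial SQL predicate.
    "statement": (
        "SELECT c.name, COUNT(*) AS pois "
        "FROM cities c JOIN pois p ON ST_Contains(c.geom, p.geom) "
        "GROUP BY c.name"
    )
}

request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-API-Key": API_KEY,  # header name is an assumption
    },
    method="POST",
)

# urllib.request.urlopen(request) would execute the call; it is
# omitted here so the sketch stays runnable without credentials.
print(request.method, request.full_url)
```

For production use, prefer the official Python or Java SDK over hand-rolled HTTP requests.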
Storage
Wherobots Cloud offers multiple storage options for your spatial data, including built-in managed storage and integration with your own Amazon S3 buckets. In notebooks, you can access data from the file system, managed storage, and S3 in a unified way.
Storage Overview
Understand the storage options available in Wherobots Cloud.
Managed Storage
Built-in S3 storage included with every organization (5 GB free).
S3 Integration
Connect your own Amazon S3 buckets with IAM role authentication.
Notebook & Data Storage
How file system, managed storage, and S3 work together in notebooks.
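The "unified way" described above comes down to URI schemes: local file-system paths, managed storage, and your own S3 buckets are all addressed as URIs. The helper below is a hypothetical illustration of that normalization — `to_uri` is not a Wherobots API, just a sketch of how the three path forms map onto a common shape:

```python
from pathlib import Path
from urllib.parse import urlparse

def to_uri(path: str) -> str:
    """Normalize a user-supplied path to a URI (illustrative helper).

    s3:// paths (managed storage or your own bucket) pass through
    unchanged; bare file-system paths become file:// URIs.
    """
    if urlparse(path).scheme in ("s3", "s3a", "file"):
        return path
    return Path(path).resolve().as_uri()

print(to_uri("s3://my-org-bucket/flights.parquet"))
print(to_uri("/tmp/local.parquet"))
```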
Unity Catalog Connection
Read Databricks Unity Catalog Delta tables directly in Wherobots without data migration.
RasterFlow
End-to-end inference engine for large-scale raster processing and geospatial ML workflows.
Get Started
Overview of RasterFlow: mosaics, inference, and vectorization.
Datasets
Built-in datasets and how to bring your own imagery.
Models
Built-in models and how to bring your own PyTorch models.
Orchestration
Schedule and automate Wherobots workloads from Apache Airflow DAGs. Use the Airflow provider to orchestrate notebooks, SQL sessions, and job runs alongside your other data workflows.
Airflow Provider
Orchestrate Wherobots workloads from Apache Airflow DAGs.
Job Run Operator
Submit Python or JAR scripts as Job Runs from Airflow.
SQL Operator
Execute SQL queries against Wherobots catalogs from Airflow.
Monitoring
Workload History provides a comprehensive view of all your Wherobots workloads, including notebooks, SQL sessions, job runs, and automated workloads executed via the MCP server. Monitor resource consumption, track performance, and troubleshoot issues in one place.
Workload History
Monitor notebooks, SQL sessions, and job runs to track Spatial Unit (SU) consumption.
Runtimes & Compute
Runtimes
Understand runtime types (General Purpose, Memory Optimized, GPU Optimized), billing in Spatial Units, and how to request compute access.

