RasterFlow workflows can be submitted as Job Runs using the Wherobots Runs REST API. This lets you run RasterFlow workflows as automated, standalone jobs outside of a notebook, which is useful for production pipelines, scheduled processing, and CI/CD integration.
RasterFlow is currently in Private Preview. Wherobots is rolling out RasterFlow to a select group of Organizations. If you are interested in gaining early access to these new capabilities and helping shape the future of the product, register your interest here.
Create a Python script that contains your RasterFlow workflow. This script is executed by the Wherobots Job Run environment, so it must be self-contained: all imports, configuration, and processing logic must live in the script itself. The following example runs the Fields of the World (FTW) model on Haskell County, Kansas using RasterFlow.
rasterflow_ftw_job.py

```python
#!/usr/bin/env python3
"""RasterFlow FTW Job Script - Extract field boundaries using Fields of the World model

This script runs the Fields of the World (FTW) model on Haskell County, Kansas
and vectorizes the results using Wherobots RasterFlow.
"""

import os
from datetime import datetime

import geopandas as gpd
import wkls
from rasterflow_remote import RasterflowClient
from rasterflow_remote.data_models import ModelRecipes, VectorizeMethodEnum


def main():
    print("Starting RasterFlow FTW Job...")

    # Initialize RasterFlow client
    rf_client = RasterflowClient(cache=False)

    # Generate AOI for Haskell County, Kansas
    print("Generating AOI for Haskell County, Kansas...")
    gdf = gpd.read_file(wkls['us']['ks']['Haskell County'].geojson())
    aoi_path = os.getenv("USER_S3_PATH") + "haskell_job.parquet"
    gdf.to_parquet(aoi_path)
    print(f"AOI saved to: {aoi_path}")

    # Run FTW model
    print("Running Fields of the World model...")
    model_outputs = rf_client.build_and_predict_mosaic_recipe(
        aoi=aoi_path,
        start=datetime(2023, 1, 1),
        end=datetime(2024, 1, 1),
        crs_epsg=3857,
        model_recipe=ModelRecipes.FTW,
    )
    print(f"Model outputs saved to: {model_outputs}")

    print("RasterFlow FTW Job completed successfully!")
    print("Results:")
    print(f"  Model outputs: {model_outputs}")

    return {
        "model_outputs": model_outputs,
    }


if __name__ == "__main__":
    main()
```
Show how to customize this workflow
Update the following for your use case:

- Replace `wkls['us']['ks']['Haskell County']` in the AOI step with your own area of interest.
- Replace `"haskell_job.parquet"` with your desired output filename.
- In the `build_and_predict_mosaic_recipe()` call, update the `start`, `end`, `crs_epsg`, and `model_recipe` parameters to match your workflow.
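When pointing the script at a different AOI, keep the output path under your Wherobots managed storage. The following is a minimal sketch of the path construction the job script uses; it assumes `USER_S3_PATH` is set and ends with a trailing slash, and the `my_region.parquet` filename is a hypothetical example:

```python
import os

def aoi_output_path(filename: str) -> str:
    # Build the AOI parquet path under Wherobots managed storage.
    # Assumes USER_S3_PATH ends with a trailing slash, matching how
    # the job script concatenates os.getenv("USER_S3_PATH") + filename.
    base = os.environ["USER_S3_PATH"]
    return base + filename
```

For example, `aoi_output_path("my_region.parquet")` yields a path directly under your managed storage root.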
Once the script is uploaded, submit it as a Job Run using the Runs REST API. The following Python script demonstrates how to submit the job:
Never hardcode your API key in scripts or source code. Always load it from an environment variable or a secrets manager. The example below reads the key from the WHEROBOTS_API_KEY environment variable.
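One way to fail fast when the key is missing is to wrap the lookup in a small helper. This is a sketch; the function name and error message are illustrative:

```python
import os

def load_api_key() -> str:
    # Read the Wherobots API key from the environment instead of
    # hardcoding it; raise immediately if it is not configured.
    api_key = os.environ.get("WHEROBOTS_API_KEY", "")
    if not api_key:
        raise RuntimeError(
            "WHEROBOTS_API_KEY is not set; export it or load it "
            "from your secrets manager before submitting jobs."
        )
    return api_key
```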
The submission script uses the requests library. If it is not already installed in your environment, install it with pip install requests.
submit_rasterflow_job.py
```python
#!/usr/bin/env python3
"""Submit a RasterFlow workflow as a Job Run using the Wherobots Runs REST API.

This script submits a job that runs the Fields of the World (FTW) model
on Haskell County, Kansas.

This is a subset of the code included in the Solution Notebook for the
Fields of the World model.
See https://cloud.wherobots.com/model-hub/fields-of-the-world for more details.
"""

import os
from datetime import datetime

import requests


def submit_rasterflow_job(
    api_key: str,  # Loaded from WHEROBOTS_API_KEY env var in main()
    script_s3_path: str,
    region: str = "aws-us-west-2",
    runtime: str = "micro",
    job_name: str = "rasterflow-ftw-job",
    timeout_seconds: int = 3600,
) -> dict:
    """
    Submit a RasterFlow workflow using the Wherobots Runs REST API.

    Args:
        api_key: Wherobots API key
        script_s3_path: S3 path to the Python script file
        region: Compute region (default: aws-us-west-2)
        runtime: Wherobots runtime size (default: micro). RasterFlow manages
            its own compute, so use micro unless also running WherobotsDB
            workloads.
        job_name: Name for the job run
        timeout_seconds: Job timeout in seconds (default: 3600)

    Returns:
        dict: API response containing job run details
    """
    # API endpoint
    url = f"https://api.cloud.wherobots.com/runs?region={region}"

    # Headers
    headers = {
        "accept": "application/json",
        "X-API-Key": api_key,
        "Content-Type": "application/json",
    }

    # Job payload
    payload = {
        "runtime": runtime,
        "name": job_name,
        "runPython": {"uri": script_s3_path},
        "timeoutSeconds": timeout_seconds,
        "environment": {"dependencies": []},
    }

    # Make the request
    print(f"Submitting job to: {url}")
    print(f"Job name: {job_name}")
    print(f"Runtime: {runtime}")
    print(f"Script path: {script_s3_path}")

    response = requests.post(url, headers=headers, json=payload)

    # Handle response
    if response.status_code in (200, 201):
        result = response.json()
        print("Job submitted successfully!")
        print(f"Job ID: {result.get('id')}")
        print(f"Status: {result.get('status')}")
        return result
    else:
        print("Job submission failed!")
        print(f"Status Code: {response.status_code}")
        print(f"Response: {response.text}")
        response.raise_for_status()


def main():
    """Main function to run the job submission script."""
    # Configuration - loaded from the environment
    API_KEY = os.getenv("WHEROBOTS_API_KEY")
    USER_S3_PATH = os.getenv("USER_S3_PATH")

    # Validate required environment variables
    if not API_KEY:
        raise ValueError(
            "WHEROBOTS_API_KEY environment variable is required. "
            "Get your API key from https://cloud.wherobots.com/settings/api-keys"
        )
    if not USER_S3_PATH:
        raise ValueError(
            "USER_S3_PATH environment variable is required. "
            "This should be your Wherobots managed storage path."
        )

    SCRIPT_S3_PATH = USER_S3_PATH + "rasterflow_ftw_job.py"

    print("=== RasterFlow FTW Job Submission ===")
    print("This script will submit a job to run the Fields of the World model")
    print("on Haskell County, Kansas using Wherobots RasterFlow.\n")
    print(f"Make sure you have uploaded rasterflow_ftw_job.py to: {SCRIPT_S3_PATH}\n")

    # Submit the job
    result = submit_rasterflow_job(
        api_key=API_KEY,
        script_s3_path=SCRIPT_S3_PATH,
        job_name=f"rasterflow-ftw-{datetime.now().strftime('%Y%m%d-%H%M%S')}",
    )

    print("\nJob submission completed!")
    print("Monitor your job at: https://cloud.wherobots.com/job-runs")


if __name__ == "__main__":
    main()
```
Show how to customize this job submission
Update the following values for your use case:

- Replace `"aws-us-west-2"` in the `region` parameter with your desired compute region.
- Replace `"micro"` in the `runtime` parameter with the runtime size for your workload. For RasterFlow-only jobs, use `micro`; see the tip below.
- Replace `"rasterflow-ftw-job"` in the `job_name` parameter with a name for your job run.
- Replace `3600` in the `timeout_seconds` parameter with your desired timeout in seconds.
- Set the `WHEROBOTS_API_KEY` environment variable to your API key. The script reads it at runtime; never hardcode it.
- Replace `"rasterflow_ftw_job.py"` in `SCRIPT_S3_PATH` with the filename of your uploaded script.
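The submission script derives a unique run name by suffixing a timestamp. As a standalone sketch (the `rasterflow-ftw` prefix is just this example's convention):

```python
from datetime import datetime

def make_job_name(prefix: str = "rasterflow-ftw") -> str:
    # Suffix a timestamp so repeated submissions get distinct run names
    # that stay within the 8-255 character limit for job run names.
    return f"{prefix}-{datetime.now().strftime('%Y%m%d-%H%M%S')}"
```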
RasterFlow manages its own compute resources for raster processing, so the Wherobots Runtime size does not affect RasterFlow workflow performance. Use the Micro runtime to minimize cost. Only select a larger runtime if your job also performs vector processing with WherobotsDB (e.g., spatial SQL with SedonaContext).
The key parameters in the job submission payload are:
| Parameter | Description |
| --- | --- |
| `runtime` | The Wherobots runtime size for the job. RasterFlow manages its own compute resources, so the runtime size does not affect RasterFlow workflow performance. Use `micro` unless you are also performing vector processing with WherobotsDB in the same job. |
| `name` | A unique name for the job run. Must be 8–255 characters matching `^[a-zA-Z0-9_-.]+$`. |