Spatial analysis

Here is a list of 100 command prompts for spatial analysis, categorized by the typical GIS and spatial data science workflow. Each category is followed by a short, hedged Python sketch illustrating how a few of its prompts might be carried out; all names in the sketches are placeholders.

1. Data Import, Export & Conversion

  1. Load [vector_layer.shp] (shapefile) into the GIS environment.

  2. Read [raster_file.tif] (GeoTIFF) as a data layer.

  3. Import [data.csv] with latitude/longitude columns and create a point layer.

  4. Connect to a [PostGIS/Spatialite] database and load the [table_name] layer.

  5. Convert the [shapefile] layer to a [GeoJSON] file.

  6. Export the final [map/layer] as a [GeoPackage/KML] file.

  7. Reproject the [layer_name] layer from [EPSG:4326/WGS84] to [EPSG:3857/Web Mercator].

  8. Convert the [raster_data] to a polygon vector layer (raster-to-vector).

  9. Convert the [polygon_data] to a raster layer (vector-to-raster).

  10. Geocode a list of [addresses] to create a new point layer.
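
To make these concrete, here is a minimal sketch of a few of the import, conversion, and reprojection prompts above, assuming a Python environment with GeoPandas and pandas; every file, column, and layer name below is a placeholder.

```python
# Illustrative only: assumes GeoPandas/pandas; file, column, and layer names are placeholders.
import geopandas as gpd
import pandas as pd

# Load a shapefile into a GeoDataFrame
parcels = gpd.read_file("vector_layer.shp")

# Build a point layer from a CSV with latitude/longitude columns
df = pd.read_csv("data.csv")
points = gpd.GeoDataFrame(
    df,
    geometry=gpd.points_from_xy(df["longitude"], df["latitude"]),
    crs="EPSG:4326",
)

# Reproject from WGS84 to Web Mercator
points_3857 = points.to_crs("EPSG:3857")

# Export to GeoJSON and GeoPackage
parcels.to_file("vector_layer.geojson", driver="GeoJSON")
points_3857.to_file("output.gpkg", layer="points", driver="GPKG")
```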

2. Data Management & Preprocessing

  1. Check and repair any invalid geometries in the [vector_layer].

  2. Simplify the [polygon/line] layer's geometry to improve performance.

  3. Clip the [input_layer] using the [boundary_polygon] as a cookie-cutter.

  4. Erase a portion of [layer_A] using the features of [layer_B].

  5. Merge the [states_layer_1] and [states_layer_2] into a single layer.

  6. Dissolve the [counties_layer] based on the [STATE_NAME] attribute to create a state layer.

  7. Select all features from [layer_A] that have [attribute='value'].

  8. Select all features from [layer_A] that are [within/intersecting] [layer_B].

  9. Create a spatial index for the [large_layer] to speed up queries.

  10. Join the [data.csv] table to the [parcels_layer] using the [PARCEL_ID] as the key.

  11. Perform a spatial join: count all [points] that fall inside each [polygon].

  12. Add a new field named [AREA_KM2] to the attribute table.

  13. Calculate the [area/perimeter/length] for all features in the [layer_name] layer.

  14. Update the [POP_DENSITY] field by dividing [POPULATION] by [AREA].

  15. Resample the [high_res_raster] to a [lower_resolution] grid.

  16. Mosaic (combine) the [raster_tile_1] and [raster_tile_2] into a single raster.

  17. Mask (clip) the [raster_layer] using the [boundary_polygon].
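
A minimal sketch of a few of these preprocessing prompts, assuming GeoPandas (0.12 or newer for make_valid); the layer and field names are placeholders.

```python
# Illustrative only: assumes GeoPandas; layer and field names are placeholders.
import geopandas as gpd

counties = gpd.read_file("counties_layer.gpkg")
points = gpd.read_file("points_layer.gpkg")

# Repair invalid geometries
counties["geometry"] = counties.geometry.make_valid()

# Dissolve counties into states on the STATE_NAME attribute
states = counties.dissolve(by="STATE_NAME")

# Spatial join: count points falling inside each polygon
joined = gpd.sjoin(points, counties, predicate="within")
counts = joined.groupby("index_right").size()
counties["point_count"] = counts.reindex(counties.index, fill_value=0)

# Add an area field in square kilometers (measured in an equal-area CRS)
counties["AREA_KM2"] = counties.to_crs("EPSG:6933").area / 1e6
```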

3. Vector Analysis (Points, Lines, Polygons)

  1. Generate a buffer of [100 meters] around all features in the [points_layer].

  2. Create variable-sized buffers based on the [population] attribute.

  3. Perform an overlay: intersect the [roads_layer] and the [counties_layer].

  4. Perform an overlay: union the [landuse_layer] and the [floodplain_layer].

  5. Identify all [schools] within [1 km] of a [park].

  6. Find the nearest [hospital] to each [incident_location].

  7. Calculate the distance from every [point_A] to the nearest [point_B].

  8. Generate a Thiessen polygon (Voronoi diagram) layer from the [weather_stations] points.

  9. Calculate the centroid (center point) for each polygon in the [countries_layer].

  10. Convert the [polygon_layer] to a [lines_layer] (polygon to lines).

  11. Generate random points within the [study_area_polygon].

  12. Summarize [points] within [polygons]: count the number of [crimes] in each [neighborhood].

  13. Calculate the minimum/maximum/average [property_value] for each [zoning_district].
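
A possible way to run a few of these vector-analysis prompts with GeoPandas; the layers, field names, and the projected CRS shown are all placeholders.

```python
# Illustrative only: assumes GeoPandas and meter-based projected data; names and CRS are placeholders.
import geopandas as gpd

schools = gpd.read_file("schools.gpkg").to_crs("EPSG:32617")
parks = gpd.read_file("parks.gpkg").to_crs("EPSG:32617")

# 100-meter buffers around all point features
school_buffers = schools.copy()
school_buffers["geometry"] = schools.buffer(100)

# Schools within 1 km of any park
park_zones = gpd.GeoDataFrame(geometry=parks.buffer(1000))
schools_near_parks = gpd.sjoin(schools, park_zones, predicate="within")

# Nearest park to each school, with the separating distance in meters
nearest_park = gpd.sjoin_nearest(schools, parks, distance_col="distance_m")

# Polygon centroids
park_centroids = parks.centroid
```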

4. Raster Analysis (Surfaces & Grids)

  1. Calculate the [slope] from the [Digital Elevation Model (DEM)].

  2. Calculate the [aspect] (direction) from the [DEM].

  3. Generate a [hillshade] map from the [DEM] to visualize topography.

  4. Perform a viewshed analysis: identify all areas visible from the [observation_point].

  5. Generate contour lines (elevation isolines) at [10-meter intervals] from the [DEM].

  6. Perform a raster calculation (map algebra): [Raster_A] + [Raster_B].

  7. Calculate the Normalized Difference Vegetation Index (NDVI) using the [Red] and [NIR] bands.

  8. Reclassify the [land_cover_raster] into [3 new categories: Urban, Rural, Water].

  9. Calculate the Zonal Statistics: find the [mean/max/sum] of [raster_values] for each [polygon_zone].

  10. Create a cost-weighted distance surface from the [origin_point] using the [friction_raster].

  11. Find the least-cost path from [Point_A] to [Point_B] using the [cost_surface].

  12. Delineate a watershed (catchment area) from the [DEM].

  13. Calculate the flow direction and flow accumulation from the [DEM].

  14. Extract the raster values at specific [point_locations].

  15. Perform a raster overlay to find areas where [land_use == 'Forest'] AND [slope < 15%].

  16. Calculate the focal statistics (e.g., mean, max) within a [3x3 neighborhood/kernel].

  17. Interpolate [point_data] to a continuous surface using [Inverse Distance Weighting (IDW)].

  18. Interpolate [point_data] to a continuous surface using [Kriging].
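
A rough sketch of the NDVI and zonal-statistics prompts, assuming rasterio, NumPy, and the rasterstats package; the band order and file names are assumptions.

```python
# Illustrative only: assumes rasterio, numpy, rasterstats; band order and file names are placeholders.
import numpy as np
import rasterio
from rasterstats import zonal_stats

# NDVI = (NIR - Red) / (NIR + Red)
with rasterio.open("multispectral.tif") as src:
    red = src.read(3).astype("float32")   # assumed Red band index
    nir = src.read(4).astype("float32")   # assumed NIR band index
    profile = src.profile

denom = nir + red
ndvi = np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

profile.update(count=1, dtype="float32")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)

# Zonal statistics: mean and max NDVI within each polygon zone
stats = zonal_stats("polygon_zones.gpkg", "ndvi.tif", stats=["mean", "max"])
```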

5. Network Analysis (Roads, Rivers, etc.)

  1. Build a routable network dataset from the [roads_layer].

  2. Find the shortest (quickest) route from [Location_A] to [Location_B].

  3. Generate an [N-minute] drive-time polygon (isochrone) from the [fire_station].

  4. Create an Origin-Destination (OD) cost matrix for all [warehouses] to all [stores].

  5. Solve the Traveling Salesperson Problem (TSP) for the [list_of_stops].

  6. Perform a Service Area analysis: find all [streets] within [5 minutes] of the [hospital].

  7. Find the [N] closest facilities (e.g., find the 3 closest [parks] to the [user_location]).

  8. Trace the flow downstream from a [spill_location] on the [river_network].
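
One way the routing prompts might look, assuming OSMnx and NetworkX; the place name and coordinates are placeholders.

```python
# Illustrative only: assumes OSMnx + NetworkX; place name and coordinates are placeholders.
import networkx as nx
import osmnx as ox

# Build a routable drive network from OpenStreetMap roads
G = ox.graph_from_place("Piedmont, California, USA", network_type="drive")

# Shortest route (by edge length) between two lon/lat locations
orig_node = ox.distance.nearest_nodes(G, X=-122.231, Y=37.824)
dest_node = ox.distance.nearest_nodes(G, X=-122.236, Y=37.827)
route = nx.shortest_path(G, orig_node, dest_node, weight="length")
```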

6. Spatial Statistics & Pattern Analysis

  1. Calculate the mean center and standard distance for the [crime_incidents] points.

  2. Calculate the directional distribution (standard deviational ellipse) for the [disease_outbreak].

  3. Perform an Average Nearest Neighbor (ANN) analysis to determine if [points] are clustered or dispersed.

  4. Perform a quadrat analysis to test for spatial randomness.

  5. Calculate Ripley's K-function to analyze point patterns at multiple scales.

  6. Perform a kernel density estimation (KDE) to create a "hotspot" heatmap for [crime_incidents].

  7. Identify statistically significant hot spots and cold spots using Getis-Ord Gi*.

  8. Identify spatial outliers (high-low clusters) using Anselin Local Moran's I.

  9. Calculate the Global Moran's I to test for spatial autocorrelation across the entire study area.

  10. Generate a spatial weights matrix (e.g., queen contiguity, k-nearest neighbor).

  11. Perform a Cluster and Outlier Analysis (Anselin Local Moran's I) to find [HH, LL, HL, LH] clusters.

  12. Run a Geographically Weighted Regression (GWR) to model how [relationships] vary across space.

  13. Run a simple Ordinary Least Squares (OLS) regression and analyze the spatial distribution of residuals.

  14. Perform a Colocation Analysis: are [liquor stores] spatially co-located with [crime incidents]?

  15. Analyze spatiotemporal data: detect emerging hot spots in [crime] over the past [N] months.
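
A minimal sketch of the spatial-weights, Moran's I, and Getis-Ord Gi* prompts, assuming the PySAL stack (libpysal and esda) alongside GeoPandas; the layer and the crime_rate field are placeholders.

```python
# Illustrative only: assumes GeoPandas + libpysal + esda; layer and field names are placeholders.
import geopandas as gpd
from libpysal.weights import Queen
from esda.moran import Moran
from esda.getisord import G_Local

tracts = gpd.read_file("tracts.gpkg")

# Spatial weights matrix: queen contiguity, row-standardized
w = Queen.from_dataframe(tracts)
w.transform = "R"

# Global Moran's I for spatial autocorrelation of a rate
mi = Moran(tracts["crime_rate"], w)
print(mi.I, mi.p_sim)

# Local Getis-Ord Gi* hot/cold spot z-scores
gi_star = G_Local(tracts["crime_rate"], w, star=True)
tracts["gi_z"] = gi_star.Zs
```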

7. Geomodeling & Visualization

  1. Generate a 3D visualization of the [DEM] with the [building_footprints] extruded.

  2. Create a 2D map layout with a title, legend, scale bar, and north arrow.

  3. Symbolize the [roads_layer] based on the [CLASS] attribute (e.g., 'Highway', 'Local').

  4. Symbolize the [population_layer] using a graduated color ramp (choropleth).

  5. Symbolize the [cities_layer] using graduated symbols based on [population].

  6. Symbolize the [landuse_layer] using unique categories.

  7. Create a dot density map to show the distribution of [population_by_race].

  8. Create and label contour lines on the map.

  9. Create a map series (map book) for each [county] in the [state_layer].
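
A small sketch of the choropleth prompt, assuming GeoPandas with matplotlib (and mapclassify for the classification scheme); the layer, field, and output names are placeholders.

```python
# Illustrative only: assumes GeoPandas + matplotlib + mapclassify; names are placeholders.
import geopandas as gpd
import matplotlib.pyplot as plt

counties = gpd.read_file("population_layer.gpkg")

# Choropleth: graduated color ramp over population density, five quantile classes
fig, ax = plt.subplots(figsize=(8, 6))
counties.plot(
    column="POP_DENSITY",
    cmap="viridis",
    scheme="Quantiles",
    k=5,
    legend=True,
    ax=ax,
)
ax.set_title("Population density")
ax.set_axis_off()
fig.savefig("choropleth_map.png", dpi=300, bbox_inches="tight")
```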

8. Lidar & Remote Sensing Specific

  1. Classify the [Lidar point cloud] into [ground, vegetation, buildings].

  2. Create a Digital Surface Model (DSM) from the raw [Lidar data].

  3. Create a Digital Terrain Model (DTM) by filtering [non-ground points].

  4. Calculate the Canopy Height Model (CHM) by subtracting the [DTM] from the [DSM].

  5. Perform an unsupervised classification (e.g., ISO Cluster) on the [satellite_imagery].

  6. Perform a supervised classification (e.g., Random Forest) on [satellite_imagery] using [training_samples].

  7. Calculate the accuracy of the classification using a [confusion matrix].

  8. Perform a change detection analysis between [image_2010] and [image_2020].

  9. Pan-sharpen the [multispectral_image] using the [panchromatic_band].

  10. Extract [building_footprints] from the [high-resolution_imagery/Lidar].
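
A minimal sketch of the Canopy Height Model prompt, assuming rasterio and NumPy with co-registered, same-resolution DSM and DTM rasters; the file names are placeholders.

```python
# Illustrative only: assumes rasterio/numpy and aligned DSM/DTM rasters; file names are placeholders.
import numpy as np
import rasterio

# Canopy Height Model: CHM = DSM - DTM
with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")
    dtm = dtm_src.read(1).astype("float32")
    profile = dsm_src.profile

chm = np.clip(dsm - dtm, 0, None)  # clamp small negative differences to zero

profile.update(dtype="float32", count=1)
with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```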
