Manual Classification for Fractional Cover of Water Project
Compatibility: Notebook currently compatible with the DEA Sandbox environment
Products used: This notebook uses no DEA products. It uses either example drone data or uploaded drone data, in WGS84 and GeoTIFF format.
Background
The Fractional Cover of Water Project is a Geoscience Australia - Australian Rivers Institute project. The project requires the collection of drone field data to validate the development of a new algorithm that will classify a Landsat or Sentinel 2 image pixel into sub-pixel fractions of water and wet vegetation.
This notebook allows a user to manually classify collected drone imagery as input to the algorithm development and validation process. Load the data in, run all the cells, and a bokeh interactive plot allows the user to zoom in, click 1m x 1m cells, select the land cover type, and save the results to file. Open water cells can have a Forel-Ule water color recorded for the cell, where this was measured at the site using the EyeOnWater application.
The image at right shows the variations in wet vegetation fractional cover type that will be collected by drone: Floating, Emergent, OpenWater, GreenVeg (PV), DryVeg (NPV), BareSoil (BS), and Forel-Ule water colour (if the type is OpenWater). Water height is additionally recorded, though it may not be retrievable from the algorithm due to the inherent limitations of top-down satellite imagery.
Additional documentation is provided below.
Description
The notebook uses drone imagery to classify the land cover type, including Floating, Emergent, OpenWater, GreenVeg (PV), DryVeg (NPV), BareSoil (BS), and Forel-Ule water colour (if the type is OpenWater). The workflow is to:
- First, bring in the drone data at resolution (1, -1) (unit: metre) as a GeoTIFF raster image. This can be copied into the DEA Sandbox by dragging and dropping the file.
- Classify the pixels into the categories listed above.
- Save the results as a raster into a GeoTIFF.
To use the notebook, please refer to the instructions linked below:
- Written Workflow Instructions
- Video Instructions
Get started
1. Upload drone image
Set the variable drone_tif_path accordingly e.g. drone_tif_path = "./DataAndFigures/drone_wetland_sml4.tif"
[1]:
# change the file name if necessary
drone_tif_path = "./Data_and_figures/Drone_wetland_sml.tif"
2. Name your output file
The default output file name is results_tif_path = './test.tif'
[2]:
# This sends the output file to the current directory and a file named `test.tif`.
# You may want to rename this file
results_tif_path = './test.tif'
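If you want a quick sanity check before running everything, the sketch below (an optional addition, not part of the original workflow) confirms the input image exists and warns if the output file already exists. The paths are the defaults from the cells above; adjust them to match your own files.

```python
import os

drone_tif_path = "./Data_and_figures/Drone_wetland_sml.tif"
results_tif_path = "./test.tif"

# the drone image must already exist before the notebook can load it
if not os.path.exists(drone_tif_path):
    print("drone image not found: %s - check the upload step" % drone_tif_path)

# an existing results file is backed up (not lost) when you press Save results
if os.path.exists(results_tif_path):
    print("note: %s exists; it will be renamed with a .bak suffix on save" % results_tif_path)
```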
3. Run the rest of the notebook
Now you should be all set up. Run the cells below by pressing Shift + Enter until the drone image shows up near the end. Alternatively, select Run -> Run All Cells from the drop-down menu in JupyterLab.
[3]:
import sys
import os
from os import path
import datacube
import rasterio.features
import numpy as np
import geopandas as gpd
import matplotlib.pyplot as plt
from shapely.geometry import shape
import xarray as xr
import re
from datetime import datetime
import urllib
from datacube.utils.cog import write_cog
from datacube.utils.geometry import assign_crs
from datacube.utils.geometry import GeoBox
from odc.algo import xr_reproject
sys.path.append("../../Scripts")
from rasterio import warp
from rasterio.features import geometry_mask
from rasterio.transform import from_bounds
from shapely.geometry import Polygon
from bokeh.io import curdoc, output_notebook, show
from bokeh.layouts import layout, column, row
from bokeh.models import (CheckboxGroup, Select, ColumnDataSource, HoverTool, YearsTicker, Legend,
CustomJS, LegendItem, field, Range1d, Circle, Button, RadioGroup, TextInput, WheelZoomTool,
ResetTool, BoxZoomTool, SaveTool, LinearColorMapper, CategoricalColorMapper,
Label, PreText, FileInput, Toggle, MultiPolygons)
from bokeh.models.formatters import DatetimeTickFormatter
from bokeh.events import SelectionGeometry
from bokeh.models.glyphs import Text
from bokeh.palettes import Blues256
from bokeh.colors import RGB, named
from bokeh.plotting import figure
[4]:
# only the Sandbox requires this import and the create_local_dask_cluster() call in the next cell
from dea_dask import create_local_dask_cluster
[5]:
create_local_dask_cluster()
[14]:
# required by bokeh
output_notebook()
[7]:
# drone imagery resample resolution (metres)
# don't change it unless required
drone_res_tgt = (1, 1)
# This is the Forel-Ule color scale
furgb = np.array([
[1,33,88,188],
[2,49,109,197],
[3,50,124,187],
[4,75,128,160],
[5,86,143,150],
[6,109,146,152],
[7,105,140,134],
[8,117,158,114],
[9,123,166,84],
[10,125,174,56],
[11,149,182,69],
[12,148,182,96],
[13,165,188,118],
[14,170,184,109],
[15,173,181,95],
[16,168,169,101],
[17,174,159,92],
[18,179,160,83],
[19,175,138,68],
[20,164,105,5],
[21,161,44,4]], dtype='uint8')
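Each row of furgb maps a Forel-Ule index (column 0) to an RGB triple (columns 1-3). As an illustrative sketch (not used by the notebook itself), the lookup below converts an index to a hex colour string; only the first and last rows of the table above are included here.

```python
import numpy as np

# first and last rows of the Forel-Ule table above: [index, R, G, B]
fu_excerpt = np.array([[1, 33, 88, 188],
                       [21, 161, 44, 4]], dtype='uint8')

def fu_to_hex(fu_index):
    """Return the hex colour for a Forel-Ule index, or None if it is not in the excerpt."""
    row = fu_excerpt[fu_excerpt[:, 0] == fu_index]
    if row.size == 0:
        return None
    _, r, g, b = row[0]
    return '#%02x%02x%02x' % (r, g, b)

print(fu_to_hex(1))   # → #2158bc (deep blue end of the scale)
print(fu_to_hex(21))  # → #a12c04 (brown end of the scale)
```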
[8]:
def load_drone_tif(fname, res_tgt):
    """
    Load drone imagery with given file name and resolution
    input:
        fname: file name of drone imagery
        res_tgt: resample resolution of output, type: tuple, e.g. res_tgt=(1, 1)
    output:
        xarray.DataArray of drone imagery
    """
    drone = xr.open_rasterio(fname, parse_coordinates=True, chunks={'band': 1, 'x': 1024, 'y': 1024})
    drone = assign_crs(drone)
    affine, width, height = warp.calculate_default_transform(drone.crs, 'EPSG:3577', drone.shape[1], drone.shape[2],
                                                             *drone.geobox.extent.boundingbox)
    tgt_affine, tgt_width, tgt_height = warp.aligned_target(affine, width, height, res_tgt)
    drone_geobox = GeoBox(tgt_width, tgt_height, tgt_affine, 'EPSG:3577')
    drone_tgt = xr_reproject(drone, drone_geobox, resampling='bilinear')
    return drone_tgt.load()


def load_results_tif(fname):
    """
    Load geotiff of classification results
    input:
        fname: file name of classification results
    output:
        xarray.DataArray of geotiff
    """
    results = xr.open_rasterio(fname, parse_coordinates=True, chunks={'band': 1, 'x': 1024, 'y': 1024})
    return results.load()
[9]:
# note to user: just run this cell
# it loads the drone imagery, converts it to an RGBA image, and sets up the coordinates and affine transform
drone_tgt = load_drone_tif(drone_tif_path, drone_res_tgt)
rgba_image = np.empty((drone_tgt.shape[1], drone_tgt.shape[2]), dtype='uint32')
view = rgba_image.view(dtype='uint8').reshape(drone_tgt.shape[1], drone_tgt.shape[2], drone_tgt.shape[0])
for i in range(3):
    view[:, :, i] = (drone_tgt.data[i]).astype('uint8')
view[:, :, 3] = 255
scale_factor = np.array([drone_tgt.x.data.min(),
                         drone_tgt.y.data.min() if drone_tgt.y.data.min() > 0 else drone_tgt.y.data.max()])
xr_results = xr.DataArray(data=[np.zeros(rgba_image.shape, dtype='uint8')] * 2, dims=['band', 'y', 'x'],
                          coords={'band': [1, 2],
                                  'y': drone_tgt.coords['y'], 'x': drone_tgt.coords['x']},
                          attrs={'nodata': 0})
select_poly_affine = from_bounds(drone_tgt.x.data.min()-scale_factor[0],
                                 drone_tgt.y.data.max()-scale_factor[1],
                                 drone_tgt.x.data.max()-scale_factor[0],
                                 drone_tgt.y.data.min()-scale_factor[1],
                                 drone_tgt.shape[2], drone_tgt.shape[1])
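The uint32/uint8 view in the cell above packs each pixel's four RGBA bytes into a single 32-bit integer, which is the layout bokeh's image_rgba glyph expects. A minimal sketch of the same trick, assuming a little-endian host (as the Sandbox is):

```python
import numpy as np

# a 1x1 "image" stored as one uint32 per pixel
packed = np.empty((1, 1), dtype='uint32')

# a byte-level view of the same memory: one R, G, B, A byte per pixel
view = packed.view(dtype='uint8').reshape(1, 1, 4)
view[0, 0] = [10, 20, 30, 255]

# on a little-endian host the packed value is A<<24 | B<<16 | G<<8 | R
print(hex(packed[0, 0]))  # → 0xff1e140a
```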
[10]:
# note to user: just run this cell
# it defines the (rather monstrous) function that builds the bokeh interactive plot and
# redirects it to a JupyterLab port, or the Jupyter notebook port, via proxy
def plot_doc(doc):
    drone_source = ColumnDataSource(data={'img': [np.flip(rgba_image, axis=0)]})
    results_source = ColumnDataSource(data={'category': [np.zeros(rgba_image.shape, dtype='uint8')],
                                            'forel_ule': [np.zeros(rgba_image.shape, dtype='uint8')]})
    drone_file_input = TextInput(value=drone_tif_path, title="Load drone imagery", height=50, width=600,
                                 sizing_mode='fixed')

    def open_drone_tif(attrname, old, new):
        fname = drone_file_input.value
        if not path.exists(fname):
            print("file doesn't exist!")
            return
        drone_tgt = load_drone_tif(fname, drone_res_tgt)
        rgba_image = np.empty((drone_tgt.shape[1], drone_tgt.shape[2]), dtype='uint32')
        view = rgba_image.view(dtype='uint8').reshape(drone_tgt.shape[1], drone_tgt.shape[2], drone_tgt.shape[0])
        for i in range(3):
            view[:, :, i] = (drone_tgt.data[i]).astype('uint8')
        view[:, :, 3] = 255
        scale_factor = np.array([drone_tgt.x.data.min(),
                                 drone_tgt.y.data.min() if drone_tgt.y.data.min() > 0 else drone_tgt.y.data.max()])
        drone_source.data['img'] = [np.flip(rgba_image, axis=0)]
        global xr_results
        xr_results = xr.DataArray(data=[np.zeros(rgba_image.shape, dtype='uint8')] * 2, dims=['band', 'y', 'x'],
                                  coords={'band': [1, 2],
                                          'y': drone_tgt.coords['y'], 'x': drone_tgt.coords['x']},
                                  attrs={'nodata': 0})
        results_source.data['category'] = [np.flip(xr_results[0].data, axis=0)]
        results_source.data['forel_ule'] = [np.flip(xr_results[1].data, axis=0)]
        global select_poly_affine
        select_poly_affine = from_bounds(drone_tgt.x.data.min()-scale_factor[0],
                                         drone_tgt.y.data.max()-scale_factor[1],
                                         drone_tgt.x.data.max()-scale_factor[0],
                                         drone_tgt.y.data.min()-scale_factor[1],
                                         drone_tgt.shape[2], drone_tgt.shape[1])

    drone_file_input.on_change('value', open_drone_tif)

    result_file_input = TextInput(value=results_tif_path, title="Results tiff file", height=50, width=600,
                                  sizing_mode='fixed')
    result_save_button = Button(label="Save results", button_type="success")
    result_load_button = Button(label="Load results", button_type="success")

    def open_result_tif(event):
        fname = result_file_input.value
        if not path.exists(fname):
            print("file doesn't exist!")
            return
        global xr_results
        xr_results = load_results_tif(fname)
        results_source.data['category'] = [np.flip(xr_results[0].data, axis=0)]
        results_source.data['forel_ule'] = [np.flip(xr_results[1].data, axis=0)]

    def save_result_tif(event):
        fname = result_file_input.value
        if path.exists(fname):
            # back up an existing results file before overwriting it
            time_now = str(datetime.now()).replace(':', '').replace(' ', '').replace('-', '')
            os.rename(fname, fname + '.bak' + time_now)
        write_cog(xr_results, fname)

    result_load_button.on_click(open_result_tif)
    result_save_button.on_click(save_result_tif)

    drone_fig = figure(tooltips=[('x-coord', "$x{0.0}"), ('y-coord', "$y{0.0}")], title="image %s" % ("drone"),
                       x_axis_type='auto', y_axis_type='auto', x_minor_ticks=10, y_minor_ticks=10,
                       x_axis_label="x origin %s" % scale_factor[0],
                       y_axis_label="y origin %s" % scale_factor[1],
                       tools="box_zoom, wheel_zoom, pan, tap, poly_select, reset")
    drone_fig.toolbar.active_scroll = drone_fig.select_one(WheelZoomTool)
    drone_tag = ['drone', 1]
    drone_fig.image_rgba(image='img', source=drone_source,
                         x=drone_tgt.x.data.min()-scale_factor[0],
                         y=drone_tgt.y.data.min()-scale_factor[1],
                         dw=drone_tgt.shape[2], dh=drone_tgt.shape[1],
                         tags=drone_tag,
                         level="image")

    transparent_white = RGB(255, 255, 255, 0)
    cats_color = [named.violet.to_rgb(), named.indigo.to_rgb(), named.tomato.to_rgb(),
                  named.deepskyblue.to_rgb(), named.yellow.to_rgb(), named.beige.to_rgb(),
                  named.brown.to_rgb()]
    cats_color_mapper = LinearColorMapper(cats_color, low=1, high=len(cats_color), low_color=transparent_white)
    water_color = [RGB(f[1], f[2], f[3], 255) for f in furgb]
    water_color_mapper = LinearColorMapper(water_color, low=1, high=21, low_color=transparent_white)
    water_tag = ['forel_ule', 21]
    cats_tag = ['cats', 10]
    water_image = drone_fig.image(image='forel_ule', source=results_source,
                                  x=drone_tgt.x.data.min()-scale_factor[0],
                                  y=drone_tgt.y.data.min()-scale_factor[1],
                                  dw=drone_tgt.shape[2], dh=drone_tgt.shape[1],
                                  color_mapper=water_color_mapper,
                                  global_alpha=0.8,
                                  level="image", tags=water_tag)
    cats_image = drone_fig.image(image='category', source=results_source,
                                 x=drone_tgt.x.data.min()-scale_factor[0],
                                 y=drone_tgt.y.data.min()-scale_factor[1],
                                 dw=drone_tgt.shape[2], dh=drone_tgt.shape[1],
                                 color_mapper=cats_color_mapper,
                                 global_alpha=0.8,
                                 level="image", tags=cats_tag)

    coords_label = PreText(text="null", width=100, sizing_mode='fixed')
    select_poly_source = ColumnDataSource(data=dict(x=[[[[]]]], y=[[[[]]]]))
    js_code = """
    if (cb_obj.final == false)
    {
        return;
    }
    const geometry = cb_obj.geometry;
    const data = {'x': [[[[]]]], 'y': [[[[]]]]};
    if (geometry.type == "point")
    {
        var ind_x = Math.floor(geometry.x);
        var ind_y = Math.floor(geometry.y);
        console.log("x:", ind_x);
        console.log("y:", ind_y);
        label.text = "x=" + ind_x.toString() + ";" + "y=" + ind_y.toString();
    }
    else if (geometry.type == "poly")
    {
        var array_len = geometry.x.length;
        for (var i=0; i<array_len; i++)
        {
            data['x'][0][0][0].push(Math.floor(cb_obj.geometry.x[i]));
            data['y'][0][0][0].push(Math.floor(cb_obj.geometry.y[i]));
        }
        label.text = "null";
        console.log("y:", poly.data);
    }
    poly.data = data;
    """
    js_callback = CustomJS(args={'label': coords_label, 'poly': select_poly_source},
                           code=js_code)
    drone_fig.js_on_event(SelectionGeometry, js_callback)
    select_poly = MultiPolygons(xs="x", ys="y", line_width=2)
    drone_fig.add_glyph(select_poly_source, select_poly)

    def get_ind_from_coords():
        ind_list = []
        poly_list = []
        if coords_label.text != "null":
            coords = coords_label.text.split(';')
            ind_list += [[abs(int(coords[1].split('=')[1])), abs(int(coords[0].split('=')[1]))]]
            return ind_list
        elif len(select_poly_source.data['x'][0][0][0]) > 0:
            for x, y in zip(select_poly_source.data['x'][0][0][0], select_poly_source.data['y'][0][0][0]):
                poly_list += [[x, y]]
            poly_shape = Polygon(poly_list)
            poly_mask = geometry_mask([poly_shape], out_shape=xr_results[0].data.shape,
                                      transform=select_poly_affine, invert=True)
            return np.flip(poly_mask, axis=0)
        else:
            return None

    cats = ["NA", "Overstory", "Emergent", "Floating", "OpenWater", "GreenVeg", "DryVeg", "Bare"]
    radio_group = RadioGroup(labels=cats, active=0, height=800, height_policy="fixed", aspect_ratio=0.1)
    forel_ule_scale = TextInput(value="0", title="Forel-Ule Water Colour", width=100, sizing_mode='fixed')

    def choose_cat(attrname, old, new):
        ind_list = get_ind_from_coords()
        if ind_list is None:
            return
        if attrname == 'active':
            if isinstance(ind_list, list):
                for ind_y, ind_x in ind_list:
                    xr_results[0][ind_y, ind_x] = radio_group.active
            else:
                xr_results[0].data[ind_list] = radio_group.active
            results_source.data['category'] = [np.flip(xr_results[0].data, axis=0)]
        if attrname == 'value':
            check_numbers = re.match(r'^[0-9]+$', forel_ule_scale.value)
            if check_numbers is None:
                print("only input numbers", forel_ule_scale.value)
                forel_ule_scale.value = "0"
            elif int(forel_ule_scale.value) < 0 or int(forel_ule_scale.value) > 21:
                print("invalid value, please check!")
                forel_ule_scale.value = "0"
            elif radio_group.active != 4 and int(forel_ule_scale.value) > 0:
                forel_ule_scale.value = "0"
                print("cannot set value for non-water")
            if isinstance(ind_list, list):
                for ind_y, ind_x in ind_list:
                    xr_results[1][ind_y, ind_x] = int(forel_ule_scale.value)
            else:
                xr_results[1].data[ind_list] = int(forel_ule_scale.value)
            results_source.data['forel_ule'] = [np.flip(xr_results[1].data, axis=0)]

    radio_group.on_change('active', choose_cat)
    forel_ule_scale.on_change('value', choose_cat)

    def coords_change(attrname, old, new):
        ind_list = get_ind_from_coords()
        if ind_list is None:
            return
        if isinstance(ind_list, list):
            radio_group.active = xr_results[0].data[ind_list[0][0], ind_list[0][1]]
            forel_ule_scale.value = str(xr_results[1].data[ind_list[0][0], ind_list[0][1]])
        else:
            radio_group.active = xr_results[0].data[ind_list][0]
            forel_ule_scale.value = str(xr_results[1].data[ind_list][0])

    coords_label.on_change('text', coords_change)
    select_poly_source.on_change('data', coords_change)

    overlay_toggle_category = Toggle(label="Overlay category", button_type="success",
                                     height=50, width=150, sizing_mode='fixed', active=True)
    overlay_toggle_water_color = Toggle(label="Overlay water color", button_type="success",
                                        height=50, width=150, sizing_mode='fixed', active=True)

    def overlay_results(event):
        if overlay_toggle_water_color.active:
            water_image.visible = True
        else:
            water_image.visible = False
        if overlay_toggle_category.active:
            cats_image.visible = True
        else:
            cats_image.visible = False

    overlay_toggle_category.on_click(overlay_results)
    overlay_toggle_water_color.on_click(overlay_results)

    control_group = column(coords_label, forel_ule_scale, radio_group)
    result_group = row(result_file_input, column(result_load_button, result_save_button))
    layouts = layout([drone_file_input, result_group, [overlay_toggle_category, overlay_toggle_water_color],
                      [control_group, drone_fig]], sizing_mode='scale_height')
    doc.add_root(layouts)
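One recurring detail in plot_doc is np.flip(..., axis=0) before every hand-off to bokeh: a GeoTIFF stores row 0 at the top of the image, while bokeh draws image glyphs with the origin at the bottom-left, so the row order must be reversed in each direction. A small illustrative sketch with a made-up 2x3 array:

```python
import numpy as np

raster = np.array([[1, 2, 3],
                   [4, 5, 6]])           # row 0 is the top row of the raster

for_bokeh = np.flip(raster, axis=0)      # row 0 is now the bottom row, as bokeh expects
restored = np.flip(for_bokeh, axis=0)    # flipping again restores raster order

print(for_bokeh[0])                      # → [4 5 6]
print(np.array_equal(restored, raster))  # → True
```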
[11]:
def remote_jupyter_proxy_url(port):
    """
    Callable to configure Bokeh's show method when a proxy must be
    configured.

    If port is None we're asking about the URL
    for the origin header.
    """
    base_url = "https://app.sandbox.dea.ga.gov.au/"
    host = urllib.parse.urlparse(base_url).netloc

    # If port is None we're asking for the URL origin
    # so return the public hostname.
    if port is None:
        return host

    service_url_path = os.environ['JUPYTERHUB_SERVICE_PREFIX']
    proxy_url_path = 'proxy/%d' % port

    user_url = urllib.parse.urljoin(base_url, service_url_path)
    full_url = urllib.parse.urljoin(user_url, proxy_url_path)
    return full_url
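To make the URL construction concrete, here is the same urljoin logic with a hypothetical JUPYTERHUB_SERVICE_PREFIX and port (the real prefix is set by JupyterHub in your session, so these values are illustrative only):

```python
import urllib.parse

base_url = "https://app.sandbox.dea.ga.gov.au/"
service_url_path = "/user/example/"   # hypothetical JUPYTERHUB_SERVICE_PREFIX
port = 5006                           # hypothetical bokeh server port

# same two-step join as remote_jupyter_proxy_url above
user_url = urllib.parse.urljoin(base_url, service_url_path)
full_url = urllib.parse.urljoin(user_url, 'proxy/%d' % port)
print(full_url)  # → https://app.sandbox.dea.ga.gov.au/user/example/proxy/5006
```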
4. If everything above worked, you should see your drone imagery below the next cell as well as an output file path and some green buttons.
- When selecting polygons of multiple pixels, double-click to complete the polygon selection
- Completing a new polygon will hide the old polygon
- Hit the green Save results button frequently to make sure your results are saved during your session
- You can switch between water color and category being displayed over your imagery by clicking the Overlay category or Overlay water color buttons
- You probably want to have your imagery open in a GIS on another screen to compare with this imagery while classifying, otherwise you'll be scrolling in and out a lot
[ ]:
# if you know your url
# notebook_url = "http://localhost:8888"
show(plot_doc, notebook_url=remote_jupyter_proxy_url)
Additional information
License: The code in this notebook is licensed under the Apache License, Version 2.0. Digital Earth Australia data is licensed under the Creative Commons by Attribution 4.0 license.
Contact: If you need assistance, please post a question on the Open Data Cube Slack channel or on the GIS Stack Exchange using the open-data-cube tag (you can view previously asked questions here). If you would like to report an issue with this notebook, you can file one on GitHub.
Last modified: October 2020
Compatible datacube version:
[13]:
print(datacube.__version__)
1.8.2
Tags
Browse all available tags on the DEA User Guide’s Tags Index
Tags: sandbox compatible, bokeh, COG, classification, DEA Sandbox, drones, Forel-Ule, GeoTIFF, inland water, water