From 448be509d78d8417e736b6049d42e9a535b45dec Mon Sep 17 00:00:00 2001
From: Kurt Schwehr
Date: Tue, 23 Jul 2024 08:14:02 -0700
Subject: [PATCH] codespell

PiperOrigin-RevId: 655169946
---
 experimental/ee_genie.ipynb | 2 +-
 guides/linked/Earth_Engine_AutoML_Vertex_AI.ipynb | 4 ++--
 guides/linked/Earth_Engine_PyTorch_Vertex_AI.ipynb | 2 +-
 guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb | 2 +-
 .../Earth_Engine_TensorFlow_logistic_regression.ipynb | 2 +-
 .../Earth_Engine_training_patches_computePixels.ipynb | 2 +-
 tutorials/imad-tutorial-pt1/index.ipynb | 2 +-
 tutorials/imad-tutorial-pt2/index.ipynb | 6 +++---
 8 files changed, 11 insertions(+), 11 deletions(-)

diff --git a/experimental/ee_genie.ipynb b/experimental/ee_genie.ipynb
index c3e199c51..79e2ed3f1 100644
--- a/experimental/ee_genie.ipynb
+++ b/experimental/ee_genie.ipynb
@@ -635,7 +635,7 @@
 "\n",
 "Make sure you have enough justification to definitively declare the analysis\n",
 "relevant - it's better to give a false negative than a false positive. However,\n",
- "the image analysis identtifies specific matching landmarks (eg, the\n",
+ "the image analysis identifies specific matching landmarks (eg, the\n",
 "the outlines of Manhattan island for a request to show NYC), believe it.\n",
 "\n",
 "Do not assume too much (eg, that the presence of green doesn't by itself mean the\n",
diff --git a/guides/linked/Earth_Engine_AutoML_Vertex_AI.ipynb b/guides/linked/Earth_Engine_AutoML_Vertex_AI.ipynb
index a54b12dcb..f1e72bd42 100644
--- a/guides/linked/Earth_Engine_AutoML_Vertex_AI.ipynb
+++ b/guides/linked/Earth_Engine_AutoML_Vertex_AI.ipynb
@@ -97,7 +97,7 @@
 "\n",
 "REGION = \"us-central1\" # @param {type: \"string\"}\n",
 "\n",
- "# The diplay name of your model (this can be any string).\n",
+ "# The display name of your model (this can be any string).\n",
 "MODEL_NAME = \"[model-name]\" # @param {type: \"string\"}"
 ],
 "metadata": {
@@ -163,7 +163,7 @@
 "\n",
 "Creating data is a long-running operation. This next step can take a while. The `create()` method waits for the operation to complete, outputting statements as the operation progresses. The statements contain the full name of the dataset that you use in the following section.\n",
 "\n",
- "**Note**: You can close the noteboook while you wait for this operation to complete."
+ "**Note**: You can close the notebook while you wait for this operation to complete."
 ],
 "metadata": {
 "id": "A1ZdO3ueKLsd"
diff --git a/guides/linked/Earth_Engine_PyTorch_Vertex_AI.ipynb b/guides/linked/Earth_Engine_PyTorch_Vertex_AI.ipynb
index 043d6a688..fdb7b876a 100644
--- a/guides/linked/Earth_Engine_PyTorch_Vertex_AI.ipynb
+++ b/guides/linked/Earth_Engine_PyTorch_Vertex_AI.ipynb
@@ -468,7 +468,7 @@
 {
 "cell_type": "markdown",
 "source": [
- "Now we need to specify a handler for our model. We could use a Torchserve default handler or write a custom one. Here, our model returns per-class probabilities, so we'll write a custom handler to call argmax on the probabilites and return the highest-probability class value to Earth Engine."
+ "Now we need to specify a handler for our model. We could use a Torchserve default handler or write a custom one. Here, our model returns per-class probabilities, so we'll write a custom handler to call argmax on the probabilities and return the highest-probability class value to Earth Engine."
 ],
 "metadata": {
 "id": "STWSevy7gJga"
diff --git a/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb b/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb
index e6c04cf5e..4e743f453 100644
--- a/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb
+++ b/guides/linked/Earth_Engine_TensorFlow_AI_Platform.ipynb
@@ -426,7 +426,7 @@
 "source": [
 "## Create the Keras model\n",
 "\n",
- "Before we create the model, there's still a wee bit of pre-processing to get the data into the right input shape and a format that can be used with cross-entropy loss. Specifically, Keras expects a list of inputs and a one-hot vector for the class. (See [the Keras loss function docs](https://keras.io/losses/), [the TensorFlow categorical identity docs](https://www.tensorflow.org/guide/feature_columns#categorical_identity_column) and [the `tf.one_hot` docs](https://www.tensorflow.org/api_docs/python/tf/one_hot) for details).\n",
+ "Before we create the model, there's still a small bit of pre-processing to get the data into the right input shape and a format that can be used with cross-entropy loss. Specifically, Keras expects a list of inputs and a one-hot vector for the class. (See [the Keras loss function docs](https://keras.io/losses/), [the TensorFlow categorical identity docs](https://www.tensorflow.org/guide/feature_columns#categorical_identity_column) and [the `tf.one_hot` docs](https://www.tensorflow.org/api_docs/python/tf/one_hot) for details).\n",
 "\n",
 "Here we will use a simple neural network model with a 64 node hidden layer. Once the dataset has been prepared, define the model, compile it, fit it to the training data. See [the Keras `Sequential` model guide](https://keras.io/getting-started/sequential-model-guide/) for more details."
 ]
diff --git a/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb b/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb
index 25c0eda18..ba9aeec28 100644
--- a/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb
+++ b/guides/linked/Earth_Engine_TensorFlow_logistic_regression.ipynb
@@ -278,7 +278,7 @@
 "source": [
 "# Generate training data\n",
 "\n",
- "This is a multi-step process. First, export the image that contains the prediction bands. When that export completes (several hours in this example), it can be reloaded and sampled to generate training and testing datasets. The second step is to export the traning and testing tables to TFRecord files in Cloud Storage (also several hours)."
+ "This is a multi-step process. First, export the image that contains the prediction bands. When that export completes (several hours in this example), it can be reloaded and sampled to generate training and testing datasets. The second step is to export the training and testing tables to TFRecord files in Cloud Storage (also several hours)."
 ]
 },
 {
diff --git a/guides/linked/Earth_Engine_training_patches_computePixels.ipynb b/guides/linked/Earth_Engine_training_patches_computePixels.ipynb
index 8305e73e3..19e768175 100644
--- a/guides/linked/Earth_Engine_training_patches_computePixels.ipynb
+++ b/guides/linked/Earth_Engine_training_patches_computePixels.ipynb
@@ -222,7 +222,7 @@
 "source": [
 "## Image retrieval functions\n",
 "\n",
- "This section includes functions to compute a Sentinel-2 median composite and get a pacth of pixels from the composite, centered on the provided coordinates, as either a numpy array or a JPEG thumbnail (for visualization). The functions that request patches are retriable and you can do that automatically by decorating the functions with [Retry](https://googleapis.dev/python/google-api-core/latest/retry.html)."
+ "This section includes functions to compute a Sentinel-2 median composite and get a patch of pixels from the composite, centered on the provided coordinates, as either a numpy array or a JPEG thumbnail (for visualization). The functions that request patches are retriable and you can do that automatically by decorating the functions with [Retry](https://googleapis.dev/python/google-api-core/latest/retry.html)."
 ],
 "metadata": {
 "id": "vbEM4nlUOmQn"
diff --git a/tutorials/imad-tutorial-pt1/index.ipynb b/tutorials/imad-tutorial-pt1/index.ipynb
index 4c409daa4..56aae5dec 100644
--- a/tutorials/imad-tutorial-pt1/index.ipynb
+++ b/tutorials/imad-tutorial-pt1/index.ipynb
@@ -586,7 +586,7 @@
 "id": "qksWAxsrIV4g"
 },
 "source": [
- "The next cell codes the MAD transformation itself in the funcion *mad_run()*, taking as input two multiband images and returning the _canonical variates_\n",
+ "The next cell codes the MAD transformation itself in the function *mad_run()*, taking as input two multiband images and returning the _canonical variates_\n",
 "\n",
 "$$\n",
 "U_i, \\ V_i, \\quad i=1\\dots N,\n",
diff --git a/tutorials/imad-tutorial-pt2/index.ipynb b/tutorials/imad-tutorial-pt2/index.ipynb
index e8f6cb27b..1fbe773c6 100644
--- a/tutorials/imad-tutorial-pt2/index.ipynb
+++ b/tutorials/imad-tutorial-pt2/index.ipynb
@@ -575,7 +575,7 @@
 " rhos = ee.String.encodeJSON(ee.List(result.get('allrhos')).get(-1))\n",
 " Z = ee.Image(result.get('Z'))\n",
 " niter = result.getNumber('niter')\n",
- " # Export iMAD and Z as a singe image, including rhos and number of iterations in properties.\n",
+ " # Export iMAD and Z as a single image, including rhos and number of iterations in properties.\n",
 " iMAD_export = ee.Image.cat(iMAD, Z).rename(imadnames).set('rhos', rhos, 'niter', niter)\n",
 " assexport = ee.batch.Export.image.toAsset(iMAD_export,\n",
 " description='assetExportTask',\n",
@@ -718,7 +718,7 @@
 },
 "source": [
 "Gray pixels point to no change, while the wide range of color in the iMAD variates\n",
- "indicates a good discrimination of the types of change occuring.\n",
+ "indicates a good discrimination of the types of change occurring.\n",
 "\n",
 "**Aside:** We are of course primarily interested in extracting the changes in the iMAD\n",
 "image, especially those which mark clear cutting, and we'll come back to them in a moment.\n",
@@ -783,7 +783,7 @@
 "id": "22554e72"
 },
 "source": [
- "Here we display the four clusters overlayed onto the two Sentinel 2 images:"
+ "Here we display the four clusters overlaid onto the two Sentinel 2 images:"
 ]
 },
 {