***NOTE ABOUT THE UNBUFFERED VALIDATION ACCURACY TABLES BEGINNING IN 2016: The training and validation data used to create and assess the accuracy of the CDL have traditionally been based on ground truth data that is buffered inward 30 meters. This was done 1) because satellite imagery (as well as the polygon reference data) in the past was not georeferenced to the same precision as it is now (i.e. everything "stacked" less perfectly), 2) to eliminate spectrally-mixed pixels at land cover boundaries from training, and 3) to be spatially conservative during the era when coarser 56 meter AWiFS satellite imagery was incorporated. Ultimately, all of these scenarios created "blurry" edge pixels through the seasonal time series, and excluding those pixels from training was found to improve the quality of the CDL. However, the accuracy assessment portion of the analysis also used buffered data, meaning those same edge pixels were not assessed along with the rest of the classification. This would be inconsequential if those edge pixels were similar in nature to the rest of the scene, but they are not: they tend to be more difficult to classify correctly. Thus, the accuracy assessments as previously presented are somewhat inflated. Beginning with the 2016 CDL season we are creating CDL accuracy assessments using unbuffered validation data. These "unbuffered" accuracy metrics now reflect the accuracy of field edges, which have not been represented previously. Beginning with the 2016 CDLs we published both the traditional "buffered" accuracy metrics and the new "unbuffered" accuracy assessments. The purpose of publishing both versions is to provide a benchmark for users interested in comparing the different validation methods. For the 2019 CDL season we are publishing only the unbuffered accuracy assessments within the official metadata files and offering the full "unbuffered" error matrices for download on the FAQs webpage. Both the metadata and FAQs are accessible at <https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>. We plan to continue producing these unbuffered accuracy assessments for future CDLs. However, there are no plans to create unbuffered accuracy assessments for past years. It should be noted that accuracy assessment is challenging, and the CDL group has always strived to provide robust metrics of usability to the land cover community. This admission of modestly inflated accuracy measures does not render past assessments useless; they were all done consistently, so comparison across years and/or states is still valid. Moreover, providing both scenarios for 2016 gives guidance on the magnitude of the bias. If the following table does not display properly, then please visit the following website to view the original metadata file <https://www.nass.usda.gov/Research_and_Science/Cropland/metadata/meta.php>.
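For readers who wish to reproduce the buffered-versus-unbuffered distinction on their own reference data, the short sketch below shows one way to erode field polygons inward by 30 meters. This is an illustrative example only, not the official CDL workflow: the file names and the use of the geopandas library are assumptions.

import geopandas as gpd

# Hypothetical reference polygons; must be in a projected CRS with meter units
# so that the buffer distance of -30 corresponds to 30 meters.
truth = gpd.read_file("reference_fields.shp")
assert truth.crs is not None and truth.crs.is_projected

# Erode each field boundary inward by 30 m so spectrally mixed edge pixels
# are excluded from training (the traditional "buffered" scenario).
truth["geometry"] = truth.geometry.buffer(-30)

# Negative buffering can collapse small fields into empty geometries; drop those.
buffered = truth[truth.geometry.notna() & ~truth.geometry.is_empty]
buffered.to_file("reference_fields_buffered_30m.shp")

Skipping the negative buffer and sampling validation pixels from the original, uneroded polygons corresponds to the "unbuffered" assessment described above.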
USDA, National Agricultural Statistics Service, 2019 California Cropland Data Layer
STATEWIDE AGRICULTURAL ACCURACY REPORT
Crop-specific covers only *Correct Accuracy Error Kappa
------------------------- ------- -------- ------ -----
OVERALL ACCURACY** 407,250 81.1% 18.9% 0.794
Cover Attribute *Correct Producer's Omission Cond'l User's Commission Cond'l
Type Code Pixels Accuracy Error Kappa Accuracy Error Kappa
---- ---- ------ -------- ----- ----- -------- ----- -----
Corn 1 12,486 76.0% 24.0% 0.756 77.0% 23.0% 0.766
Cotton 2 23,427 92.1% 7.9% 0.919 90.4% 9.6% 0.902
Rice 3 72,657 98.3% 1.7% 0.981 99.1% 0.9% 0.990
Sorghum 4 333 27.8% 72.2% 0.278 68.4% 31.6% 0.683
Sunflower 6 4,066 77.1% 22.9% 0.770 82.0% 18.0% 0.819
Sweet Corn 12 124 40.8% 59.2% 0.408 34.8% 65.2% 0.348
Pop or Orn Corn 13 0 0.0% 100.0% 0.000 n/a n/a n/a
Mint 14 188 92.2% 7.8% 0.922 81.7% 18.3% 0.817
Barley 21 3,984 68.0% 32.0% 0.679 76.5% 23.5% 0.763
Durum Wheat 22 1,611 72.5% 27.5% 0.724 79.0% 21.0% 0.790
Spring Wheat 23 194 25.2% 74.8% 0.252 43.0% 57.0% 0.430
Winter Wheat 24 17,033 71.2% 28.8% 0.704 70.3% 29.7% 0.695
Rye 27 93 12.1% 87.9% 0.120 22.0% 78.0% 0.220
Oats 28 1,744 37.2% 62.8% 0.370 56.2% 43.8% 0.560
Millet 29 0 0.0% 100.0% 0.000 n/a n/a n/a
Canola 31 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Safflower 33 1,873 66.3% 33.7% 0.662 76.3% 23.7% 0.763
Alfalfa 36 57,250 90.2% 9.8% 0.895 85.0% 15.0% 0.840
Other Hay/Non Alfalfa 37 9,013 68.6% 31.4% 0.682 68.4% 31.6% 0.680
Sugarbeets 41 1,954 76.8% 23.2% 0.767 85.4% 14.6% 0.853
Dry Beans 42 844 57.0% 43.0% 0.569 61.1% 38.9% 0.611
Potatoes 43 792 62.2% 37.8% 0.622 84.3% 15.7% 0.843
Other Crops 44 527 56.9% 43.1% 0.568 76.7% 23.3% 0.767
Sugarcane 45 0 n/a n/a n/a 0.0% 100.0% 0.000
Sweet Potatoes 46 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Misc Vegs & Fruits 47 1 12.5% 87.5% 0.125 3.4% 96.6% 0.034
Watermelons 48 133 34.5% 65.5% 0.344 56.8% 43.2% 0.568
Onions 49 1,830 71.1% 28.9% 0.711 79.3% 20.7% 0.793
Cucumbers 50 155 30.8% 69.2% 0.308 51.5% 48.5% 0.515
Chick Peas 51 584 49.4% 50.6% 0.494 73.7% 26.3% 0.737
Peas 53 142 26.5% 73.5% 0.265 38.9% 61.1% 0.389
Tomatoes 54 18,905 87.2% 12.8% 0.869 82.2% 17.8% 0.818
Herbs 57 171 59.2% 40.8% 0.592 80.3% 19.7% 0.803
Clover/Wildflowers 58 3,138 90.3% 9.7% 0.903 88.6% 11.4% 0.886
Sod/Grass Seed 59 740 54.1% 45.9% 0.540 76.9% 23.1% 0.769
Fallow/Idle Cropland 61 24,783 79.8% 20.2% 0.790 71.8% 28.2% 0.708
Cherries 66 1,992 82.4% 17.6% 0.824 89.9% 10.1% 0.899
Peaches 67 150 42.4% 57.6% 0.424 60.7% 39.3% 0.607
Apples 68 34 97.1% 2.9% 0.971 23.9% 76.1% 0.239
Grapes 69 9,057 91.9% 8.1% 0.917 70.4% 29.6% 0.701
Other Tree Crops 71 115 45.5% 54.5% 0.454 21.1% 78.9% 0.211
Citrus 72 73 76.0% 24.0% 0.760 6.3% 93.7% 0.063
Pecans 74 117 31.6% 68.4% 0.316 78.5% 21.5% 0.785
Almonds 75 66,294 79.9% 20.1% 0.783 96.2% 3.8% 0.958
Walnuts 76 19,169 88.4% 11.6% 0.881 81.7% 18.3% 0.813
Pears 77 361 88.3% 11.7% 0.883 94.0% 6.0% 0.940
Pistachios 204 18,231 81.9% 18.1% 0.815 95.9% 4.1% 0.958
Triticale 205 1,389 42.9% 57.1% 0.428 53.6% 46.4% 0.535
Carrots 206 581 51.4% 48.6% 0.514 50.9% 49.1% 0.508
Garlic 208 1,263 71.3% 28.7% 0.712 74.7% 25.3% 0.746
Cantaloupes 209 234 26.9% 73.1% 0.269 64.8% 35.2% 0.648
Olives 211 1,690 84.1% 15.9% 0.840 79.6% 20.4% 0.796
Oranges 212 723 43.7% 56.3% 0.436 52.0% 48.0% 0.519
Honeydew Melons 213 265 51.0% 49.0% 0.509 80.5% 19.5% 0.805
Broccoli 214 194 48.3% 51.7% 0.482 28.2% 71.8% 0.281
Avocados 215 90 64.7% 35.3% 0.647 35.4% 64.6% 0.354
Peppers 216 73 50.0% 50.0% 0.500 49.7% 50.3% 0.497
Pomegranates 217 2,342 97.7% 2.3% 0.977 90.9% 9.1% 0.909
Nectarines 218 17 18.5% 81.5% 0.185 29.3% 70.7% 0.293
Greens 219 384 45.5% 54.5% 0.455 68.8% 31.2% 0.688
Plums 220 882 62.8% 37.2% 0.627 55.0% 45.0% 0.549
Strawberries 221 26 51.0% 49.0% 0.510 54.2% 45.8% 0.542
Squash 222 20 66.7% 33.3% 0.667 64.5% 35.5% 0.645
Vetch 224 50 39.1% 60.9% 0.391 76.9% 23.1% 0.769
Dbl Crop WinWht/Corn 225 12,770 74.7% 25.3% 0.742 68.0% 32.0% 0.674
Dbl Crop Oats/Corn 226 2,529 62.7% 37.3% 0.626 72.5% 27.5% 0.724
Lettuce 227 881 71.8% 28.2% 0.717 39.3% 60.7% 0.392
Dbl Crop Triticale/Corn 228 3,547 48.9% 51.1% 0.486 60.2% 39.8% 0.599
Pumpkins 229 7 13.5% 86.5% 0.135 41.2% 58.8% 0.412
Dbl Crop Lettuce/Cantaloupe 231 0 0.0% 100.0% 0.000 n/a n/a n/a
Dbl Crop WinWht/Sorghum 236 768 52.5% 47.5% 0.524 72.1% 27.9% 0.721
Dbl Crop Barley/Corn 237 90 31.8% 68.2% 0.318 68.2% 31.8% 0.682
Dbl Crop WinWht/Cotton 238 0 0.0% 100.0% 0.000 n/a n/a n/a
Blueberries 242 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Cabbage 243 44 50.6% 49.4% 0.506 86.3% 13.7% 0.863
Cauliflower 244 23 9.2% 90.8% 0.091 16.2% 83.8% 0.162
Celery 245 0 0.0% 100.0% 0.000 n/a n/a n/a
Turnips 247 0 n/a n/a n/a 0.0% 100.0% 0.000
*Correct Pixels represents the total number of independent validation pixels correctly identified in the error matrix.
**The Overall Accuracy represents only the FSA row crops and annual fruit and vegetables (codes 1-61, 66-80, 92 and 200-255).
FSA-sampled grass and pasture, non-agricultural, and NLCD-sampled categories (codes 62-65, 81-91 and 93-199) are not included in the Overall Accuracy.
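As a concrete illustration of how the Overall Accuracy above is restricted to the crop-specific attribute codes, the sketch below tallies agreement only for validation pixels whose reference code falls in the ranges listed in the footnote. The arrays and values are toy data, not CDL results.

import numpy as np

# Codes included in the Overall Accuracy: 1-61, 66-80, 92, 200-255.
crop_codes = np.r_[1:62, 66:81, 92, 200:256]

predicted = np.array([1, 36, 121, 75, 61])   # toy CDL classification values
reference = np.array([1, 37, 121, 75, 24])   # toy validation values

# Ignore non-agricultural / NLCD-sampled codes when computing the tally.
mask = np.isin(reference, crop_codes)
overall_accuracy = (predicted[mask] == reference[mask]).mean()
print(f"Overall accuracy (crop-specific codes only): {overall_accuracy:.1%}")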
The accuracy of the non-agricultural land cover classes within the Cropland Data Layer is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover. For more information on the accuracy of the NLCD please reference <https://www.mrlc.gov/>.
Attribute_Accuracy_Value:
Classification accuracy is generally 85% to 95% correct for the major crop-specific land cover categories. See the 'Attribute Accuracy Report' section of this metadata file for the detailed accuracy report.
Attribute_Accuracy_Explanation:
The strength and emphasis of the CDL is crop-specific land cover categories. The accuracy of the CDL non-agricultural land cover classes is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover.
These definitions of accuracy statistics were derived from the following book: Congalton, Russell G. and Kass Green. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, Florida: CRC Press, Inc. 1999. The 'Producer's Accuracy' is calculated for each cover type in the ground truth and indicates the probability that a ground truth pixel will be correctly mapped (across all cover types); it measures 'errors of omission'. An 'Omission Error' occurs when a pixel is excluded from the category to which it belongs in the validation dataset. The 'User's Accuracy' indicates the probability that a pixel from the CDL classification actually matches the ground truth data; it measures 'errors of commission'. A 'Commission Error' occurs when a pixel is included in an incorrect category according to the validation data. It is important to consider errors of omission and commission together. For example, if you classify every pixel in a scene as 'wheat', then you would have 100% Producer's Accuracy for the wheat category and a 0% Omission Error. However, you would also have a very high Commission Error, as all other crop types would be included in the incorrect category. The 'Kappa' is a measure of agreement based on the difference between the actual agreement in the error matrix (i.e., the agreement between the remotely sensed classification and the reference data as indicated by the major diagonal) and the chance agreement, which is indicated by the row and column totals. The 'Conditional Kappa Coefficient' is the agreement for an individual category within the entire error matrix.
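The following is a small worked example of the statistics defined above, computed from a toy 3-class error matrix (rows = CDL classification, columns = reference data). The formulas follow the standard Congalton and Green formulations; the class counts are illustrative only and do not come from the CDL.

import numpy as np

# error_matrix[i, j] = validation pixels classified as class i that are class j in the reference data
error_matrix = np.array([
    [50,  3,  2],
    [ 4, 60,  6],
    [ 1,  2, 40],
], dtype=float)

n = error_matrix.sum()
diag = np.diag(error_matrix)
row_tot = error_matrix.sum(axis=1)   # classified totals (user's side)
col_tot = error_matrix.sum(axis=0)   # reference totals (producer's side)

producers_accuracy = diag / col_tot  # 1 - omission error
users_accuracy = diag / row_tot      # 1 - commission error
overall_accuracy = diag.sum() / n

# Overall kappa: observed agreement versus chance agreement from the marginals.
chance = (row_tot * col_tot).sum()
kappa = (n * diag.sum() - chance) / (n**2 - chance)

# Conditional kappa for each class, computed here on the user's (row) side.
cond_kappa_users = (n * diag - row_tot * col_tot) / (n * row_tot - row_tot * col_tot)

print("Producer's accuracy:", np.round(producers_accuracy, 3))
print("User's accuracy:    ", np.round(users_accuracy, 3))
print("Overall accuracy:   ", round(overall_accuracy, 3))
print("Overall kappa:      ", round(kappa, 3))
print("Conditional kappa (user's side):", np.round(cond_kappa_users, 3))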