***NOTE ABOUT THE UNBUFFERED VALIDATION ACCURACY TABLES BEGINNING IN 2016: The training and validation data used to create and assess the accuracy of the CDL have traditionally been based on ground truth data buffered inward by 30 meters. This was done 1) because satellite imagery (as well as the polygon reference data) was not georeferenced as precisely in the past as it is now (i.e., everything "stacked" less perfectly), 2) to eliminate spectrally mixed pixels at land cover boundaries from training, and 3) to be spatially conservative during the era when coarser 56 meter AWiFS satellite imagery was incorporated. Ultimately, all of these scenarios created "blurry" edge pixels through the seasonal time series, and it was found that excluding them from training improved the quality of the CDL. However, the accuracy assessment portion of the analysis also used buffered data, meaning those same edge pixels were not assessed with the rest of the classification. This would be inconsequential if those edge pixels were similar in nature to the rest of the scene, but they are not: they tend to be more difficult to classify correctly. Thus, the accuracy assessments as previously presented are somewhat inflated. Beginning with the 2016 CDL season we are creating CDL accuracy assessments using unbuffered validation data. These "unbuffered" accuracy metrics now reflect the accuracy of field edges, which had not been represented previously. Beginning with the 2016 CDLs we published both the traditional "buffered" accuracy metrics and the new "unbuffered" accuracy assessments. The purpose of publishing both versions is to provide a benchmark for users interested in comparing the different validation methods. For the 2018 CDL season we are publishing only the unbuffered accuracy assessments within the official metadata files and offer the full "unbuffered" error matrices for download on the FAQs webpage.
Both metadata and FAQs are accessible at <https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>. We plan to continue producing these unbuffered accuracy assessments for future CDLs. However, there are no plans to create unbuffered accuracy assessments for past years. It should be noted that accuracy assessment is challenging, and the CDL group has always strived to provide robust metrics of usability to the land cover community. This admission of modestly inflated accuracy measures does not render past assessments useless. They were all done consistently, so comparison across years and/or states is still valid. Further, providing both scenarios for 2016 gives guidance on the magnitude of the bias. If the following table does not display properly, then please visit this internet site <https://www.nass.usda.gov/Research_and_Science/Cropland/metadata/meta.php> to view the original metadata file.
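In practical terms, the traditional inward 30 meter buffer is equivalent to eroding each rasterized ground-truth field by one pixel at the CDL's 30 meter resolution, so boundary pixels carry no class label. The sketch below illustrates that idea only; the function name, the use of a 0 code for "no reference data," and the choice of scipy's binary erosion are assumptions of this example, not a description of NASS's actual processing software.

```python
import numpy as np
from scipy.ndimage import binary_erosion

def buffer_inward(labels, iterations=1):
    """Erode every labeled region by `iterations` pixels.

    At 30 m resolution, one iteration approximates a 30 m inward
    buffer: the edge pixels of each ground-truth field are reset to
    0 (treated here as "no reference data", an assumption of this
    sketch) so spectrally mixed boundary pixels are excluded.
    `labels` is a 2-D integer array of cover type codes.
    """
    out = np.zeros_like(labels)
    for code in np.unique(labels):
        if code == 0:
            continue
        mask = labels == code
        # Pixels that survive erosion keep their class code;
        # eroded edge pixels remain 0 in the output.
        out[binary_erosion(mask, iterations=iterations)] = code
    return out
```

Running an unbuffered assessment simply means skipping this erosion step, so the validation sample includes the harder-to-classify edge pixels.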
USDA, National Agricultural Statistics Service, 2018 Oregon Cropland Data Layer
STATEWIDE AGRICULTURAL ACCURACY REPORT
Crop-specific covers only *Correct Accuracy Error Kappa
------------------------- ------- -------- ------ -----
OVERALL ACCURACY** 394,884 87.9% 12.1% 0.849
Cover Attribute *Correct Producer's Omission User's Commission Cond'l
Type Code Pixels Accuracy Error Kappa Accuracy Error Kappa
---- ---- ------ -------- ----- ----- -------- ----- -----
Corn 1 10,518 83.7% 16.3% 0.835 90.8% 9.2% 0.906
Sorghum 4 4 9.3% 90.7% 0.093 44.4% 55.6% 0.444
Soybeans 5 - n/a n/a n/a 0.0% 100.0% 0.000
Sunflower 6 139 33.0% 67.0% 0.330 59.4% 40.6% 0.594
Pop or Orn Corn 13 - 0.0% 100.0% 0.000 n/a n/a n/a
Mint 14 117 92.1% 7.9% 0.921 56.0% 44.0% 0.560
Barley 21 3,431 56.2% 43.8% 0.560 74.4% 25.6% 0.742
Spring Wheat 23 7,802 67.5% 32.5% 0.672 76.5% 23.5% 0.762
Winter Wheat 24 134,964 96.6% 3.4% 0.960 96.7% 3.3% 0.961
Rye 27 30 19.9% 80.1% 0.199 22.1% 77.9% 0.220
Oats 28 401 30.6% 69.4% 0.305 61.5% 38.5% 0.614
Speltz 30 - 0.0% 100.0% 0.000 n/a n/a n/a
Canola 31 348 64.1% 35.9% 0.641 67.8% 32.2% 0.678
Flaxseed 32 4 57.1% 42.9% 0.571 36.4% 63.6% 0.364
Mustard 35 696 80.4% 19.6% 0.804 93.9% 6.1% 0.939
Alfalfa 36 38,289 87.0% 13.0% 0.864 81.6% 18.4% 0.807
Other Hay/Non Alfalfa 37 13,785 62.2% 37.8% 0.615 76.4% 23.6% 0.758
Camelina 38 - n/a n/a n/a 0.0% 100.0% 0.000
Buckwheat 39 - n/a n/a n/a 0.0% 100.0% 0.000
Sugarbeets 41 1,039 73.7% 26.3% 0.737 89.9% 10.1% 0.899
Dry Beans 42 1,473 66.0% 34.0% 0.659 67.9% 32.1% 0.679
Potatoes 43 5,459 83.2% 16.8% 0.831 92.6% 7.4% 0.926
Other Crops 44 400 42.9% 57.1% 0.428 71.3% 28.7% 0.713
Sweet Potatoes 46 34 52.3% 47.7% 0.523 77.3% 22.7% 0.773
Misc Vegs & Fruits 47 10 20.4% 79.6% 0.204 37.0% 63.0% 0.370
Watermelons 48 4 16.0% 84.0% 0.160 23.5% 76.5% 0.235
Onions 49 2,740 78.3% 21.7% 0.783 91.1% 8.9% 0.911
Lentils 52 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Peas 53 2,235 75.5% 24.5% 0.754 80.2% 19.8% 0.802
Caneberries 55 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Hops 56 628 88.3% 11.7% 0.883 90.6% 9.4% 0.906
Herbs 57 1,299 70.1% 29.9% 0.700 81.5% 18.5% 0.815
Clover/Wildflowers 58 3,412 68.1% 31.9% 0.679 77.8% 22.2% 0.776
Sod/Grass Seed 59 35,453 88.4% 11.6% 0.879 90.2% 9.8% 0.897
Fallow/Idle Cropland 61 121,579 95.0% 5.0% 0.942 97.2% 2.8% 0.968
Cherries 66 1,899 80.0% 20.0% 0.799 88.8% 11.2% 0.888
Peaches 67 - 0.0% 100.0% 0.000 n/a n/a n/a
Apples 68 343 53.8% 46.2% 0.537 79.8% 20.2% 0.798
Grapes 69 335 44.0% 56.0% 0.440 65.0% 35.0% 0.650
Christmas Trees 70 294 42.5% 57.5% 0.425 75.6% 24.4% 0.756
Other Tree Crops 71 2,323 73.8% 26.2% 0.737 79.7% 20.3% 0.796
Walnuts 76 6 8.0% 92.0% 0.080 100.0% 0.0% 1.000
Pears 77 1,254 69.4% 30.6% 0.693 82.6% 17.4% 0.826
Triticale 205 789 28.2% 71.8% 0.281 66.3% 33.7% 0.662
Carrots 206 433 80.2% 19.8% 0.802 74.4% 25.6% 0.744
Garlic 208 104 36.6% 63.4% 0.366 63.8% 36.2% 0.638
Cantaloupes 209 1 5.0% 95.0% 0.050 100.0% 0.0% 1.000
Broccoli 214 - 0.0% 100.0% 0.000 n/a n/a n/a
Peppers 216 - n/a n/a n/a 0.0% 100.0% 0.000
Greens 219 35 32.4% 67.6% 0.324 35.7% 64.3% 0.357
Plums 220 11 28.9% 71.1% 0.289 26.8% 73.2% 0.268
Strawberries 221 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Squash 222 69 62.7% 37.3% 0.627 36.7% 63.3% 0.367
Vetch 224 29 27.9% 72.1% 0.279 93.5% 6.5% 0.935
Lettuce 227 - n/a n/a n/a 0.0% 100.0% 0.000
Pumpkins 229 5 4.1% 95.9% 0.041 27.8% 72.2% 0.278
Blueberries 242 363 73.3% 26.7% 0.733 71.7% 28.3% 0.717
Cabbage 243 - 0.0% 100.0% 0.000 n/a n/a n/a
Cauliflower 244 - n/a n/a n/a 0.0% 100.0% 0.000
Radishes 246 246 42.1% 57.9% 0.420 77.8% 22.2% 0.778
Turnips 247 52 39.1% 60.9% 0.391 70.3% 29.7% 0.703
Gourds 249 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
*Correct Pixels represents the total number of independent validation pixels correctly identified in the error matrix.
**The Overall Accuracy represents only the FSA row crops and annual fruit and vegetables (codes 1-61, 66-80, 92 and 200-255) but not FSA-sampled grass and pasture. Non-agricultural and NLCD-sampled categories (codes 62-65, 81-91 and 93-199) are not included in the Overall Accuracy.
The accuracy of the non-agricultural land cover classes within the Cropland Data Layer is entirely dependent upon the USGS, National Land Cover Database (NLCD 2011). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover. For more information on the accuracy of the NLCD please reference <https://www.mrlc.gov/>.
Attribute_Accuracy_Value:
Classification accuracy is generally 85% to 95% correct for the major crop-specific land cover categories. See the 'Attribute Accuracy Report' section of this metadata file for the detailed accuracy report.
Attribute_Accuracy_Explanation:
The strength and emphasis of the CDL is crop-specific land cover categories. The accuracy of the CDL non-agricultural land cover classes is entirely dependent upon the USGS, National Land Cover Database (NLCD 2011). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover.
These definitions of accuracy statistics were derived from the following book: Congalton, Russell G. and Kass Green. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, Florida: CRC Press, Inc. 1999. The 'Producer's Accuracy' is calculated for each cover type in the ground truth and indicates the probability that a ground truth pixel will be correctly mapped (across all cover types); it measures 'errors of omission'. An 'Omission Error' occurs when a pixel is excluded from the category to which it belongs in the validation dataset. The 'User's Accuracy' indicates the probability that a pixel from the CDL classification actually matches the ground truth data and measures 'errors of commission'. A 'Commission Error' occurs when a pixel is included in an incorrect category according to the validation data. It is important to take both errors of omission and commission into consideration. For example, if you classify every pixel in a scene as 'wheat', then you have 100% Producer's Accuracy for the wheat category and 0% Omission Error. However, you would also have a very high error of commission, as all other crop types would be included in the incorrect category. The 'Kappa' is a measure of agreement based on the difference between the actual agreement in the error matrix (i.e., the agreement between the remotely sensed classification and the reference data as indicated by the major diagonal) and the chance agreement, which is indicated by the row and column totals. The 'Conditional Kappa Coefficient' is the agreement for an individual category within the entire error matrix.
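The statistics defined above all derive from the error matrix. The following is a minimal sketch of those calculations following the Congalton and Green formulations; the function name and the row/column convention (rows = classified pixels, columns = reference pixels) are choices of this example, not part of the metadata standard.

```python
import numpy as np

def accuracy_metrics(cm):
    """Accuracy statistics from an error (confusion) matrix.

    Convention assumed here: rows are the classified (map)
    categories, columns are the reference (ground truth)
    categories; the major diagonal holds correct pixels.
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()                     # total validation pixels
    diag = np.diag(cm)               # correctly classified pixels
    rows = cm.sum(axis=1)            # classified (map) totals
    cols = cm.sum(axis=0)            # reference totals

    overall = diag.sum() / n
    producers = diag / cols          # 1 - omission error
    users = diag / rows              # 1 - commission error

    # Kappa: actual agreement corrected for chance agreement,
    # where chance agreement comes from the row/column totals.
    chance = (rows * cols).sum() / n**2
    kappa = (overall - chance) / (1 - chance)

    # Conditional kappa for each individual category
    # (user's-accuracy side, i.e., per classified row).
    cond_kappa = (n * diag - rows * cols) / (n * rows - rows * cols)
    return overall, producers, users, kappa, cond_kappa
```

For instance, a two-class matrix [[40, 10], [5, 45]] (100 pixels, 85 on the diagonal) yields an Overall Accuracy of 85% and a Kappa of 0.70. Categories with zero row or column totals (the 'n/a' and dash entries in the table above) would divide by zero here; a production implementation would need to guard those cases.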