CIFAR-100


Notes

Wikidata

Corpus

  1. The 100 classes in the CIFAR-100 are roughly grouped into 20 superclasses.[1]
  2. layer { name: "cifar100" type: "Data" top: "data" top: "label" include { phase: TRAIN } transform_param { mean_file: "../mean.binaryproto" } data_param { source: "..[1]
  3. The 100 classes in the CIFAR-100 are grouped into 20 superclasses.[2]
  4. There are the following classes in the CIFAR-100 dataset: S. No Superclass Classes 1.[3]
  5. The 100 classes in the CIFAR-100 are grouped into 20 super-classes.[4]
  6. CIFAR-100 data set is just like the CIFAR-10, except it has 100 classes containing 600 images each.[5]
  7. Our EfficientNets also transfer well and achieve state-of-the-art accuracy on CIFAR-100 (91.7%), Flowers (98.8%), and 3 other transfer learning datasets, with an order of magnitude fewer parameters.[6]
  8. Our method achieves state-of-the-art accuracy on CIFAR-10, CIFAR-100, SVHN, and ImageNet (without additional data).[6]
  9. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet).[6]
  10. Our RoR-3-WRN58-4+SD models achieve new state-of-the-art results on CIFAR-10, CIFAR-100 and SVHN, with test errors 3.77%, 19.73% and 1.59%, respectively.[6]
  11. The CIFAR-100 dataset contains 50,000 training and 10,000 test images of 20 object classes, along with 100 object subclasses.[7]
  12. Then, we’ll actually build one – by using the CIFAR-10 and CIFAR-100 datasets.[8]
  13. They come in two ways: the CIFAR-10 datasets, with ten classes, and the CIFAR-100 dataset, with one hundred classes.[8]
  14. Now, let’s load some CIFAR-100 data.[8]
  15. Instead of cifar10 , you’ll import cifar100 .[8]

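The notes above give the key facts about the dataset: 100 fine-grained classes grouped into 20 coarse superclasses, 600 images per class, and a 50,000/10,000 train/test split of 32×32 color images. A minimal loading sketch, assuming TensorFlow/Keras is installed (the keras.datasets.cifar100 loader exposes both label granularities through its label_mode argument):

  # A minimal sketch, assuming TensorFlow/Keras is available.
  from tensorflow.keras.datasets import cifar100

  # label_mode="fine" returns the 100 class labels;
  # label_mode="coarse" returns the 20 superclass labels.
  (x_train, y_fine_train), (x_test, y_fine_test) = cifar100.load_data(label_mode="fine")
  (_, y_coarse_train), (_, y_coarse_test) = cifar100.load_data(label_mode="coarse")

  print(x_train.shape)  # (50000, 32, 32, 3): 50,000 training images
  print(x_test.shape)   # (10000, 32, 32, 3): 10,000 test images
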
Sources

Metadata

Wikidata

Spacy pattern list

  • [{'LEMMA': 'CIFAR-100'}]
  • [{'LEMMA': 'CIFAR100'}]
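
The patterns above are token patterns for spaCy's rule-based Matcher, keyed on the LEMMA attribute. A minimal sketch of registering them, assuming spaCy v3 and the en_core_web_sm pipeline (both are assumptions, not stated on this page):

  # A minimal sketch, assuming spaCy v3 and en_core_web_sm are installed.
  import spacy
  from spacy.matcher import Matcher

  nlp = spacy.load("en_core_web_sm")
  matcher = Matcher(nlp.vocab)

  # Register both LEMMA patterns listed above under one rule name.
  matcher.add("CIFAR100", [[{"LEMMA": "CIFAR-100"}], [{"LEMMA": "CIFAR100"}]])

  doc = nlp("The 100 classes in the CIFAR-100 are grouped into 20 superclasses.")
  for match_id, start, end in matcher(doc):
      print(doc[start:end].text)  # spans whose lemma matches either pattern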