
Can we get GPT2 to play chess... as me?

Trying with DistilGPT2

Unfortunately, on both CPU and GPU, it appears to be un-fine-tuneable.
main
Zach Nation 1 year ago
parent df65965e3d
commit 73ad3dd31c
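The notebooks aren't rendered in this view, but the fine-tuning attempt described in the message above would look roughly like the sketch below, assuming the Hugging Face Trainer API. The dataset file, prompt format, and hyperparameters are hypothetical stand-ins, not values taken from the notebook.

# A minimal sketch, assuming the Hugging Face Trainer API; "games.txt" and all
# hyperparameters are hypothetical stand-ins for whatever the notebook used.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 tokenizers ship no pad token
model = AutoModelForCausalLM.from_pretrained("distilgpt2")

# Hypothetical training corpus: one chess game in movetext per line.
dataset = load_dataset("text", data_files={"train": "games.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="zach_model",
        per_device_train_batch_size=4,
        num_train_epochs=3,
    ),
    train_dataset=tokenized["train"],
    # mlm=False gives plain causal-LM labels (inputs shifted by one).
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
# save_model() writes config.json, generation_config.json, pytorch_model.bin,
# and training_args.bin (the four files under zach_model/ in this commit).
trainer.save_model("zach_model")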
20 changed files (total size: 6.4 GiB → 2.8 GiB)
File                                                     Changes  Size
07_BackToGPT2.ipynb                                          190  32 KiB → 0 B
07_OnToDistilGPT2.ipynb                                      250  0 B → 7.6 KiB
distilgpt2/64.tflite                                           3  0 B → 310 MiB
distilgpt2/README.md                                         182  0 B → 11 KiB
distilgpt2/config.json                                        38  0 B → 762 B
distilgpt2/coreml_model.mlmodel                                3  0 B → 460 MiB
distilgpt2/flax_model.msgpack                                  3  0 B → 312 MiB
distilgpt2/generation_config.json                              6  0 B → 124 B
distilgpt2/generation_config_for_text_generation.json          8  0 B → 165 B
distilgpt2/merges.txt                                          3  0 B → 446 KiB
distilgpt2/model.safetensors                                   3  0 B → 336 MiB
distilgpt2/pytorch_model.bin                                   3  0 B → 336 MiB
distilgpt2/rust_model.ot                                       3  0 B → 484 MiB
distilgpt2/tf_model.h5                                         3  0 B → 313 MiB
distilgpt2/tokenizer.json                                      3  0 B → 1.3 MiB
distilgpt2/vocab.json                                          3  0 B → 1018 KiB
zach_model/config.json                                        61  805 B → 1.0 KiB
zach_model/generation_config.json                              5  132 B → 119 B
zach_model/pytorch_model.bin                                   4  6.4 GiB → 318 MiB
zach_model/training_args.bin                                   2
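Note that the zach_model/ checkpoint above contains only the weights and configs, with no tokenizer files, so reloading it for generation means pulling the tokenizer from distilgpt2 separately. A minimal sketch, with a hypothetical movetext prompt:

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("zach_model")
tokenizer = AutoTokenizer.from_pretrained("distilgpt2")  # no tokenizer in zach_model/

prompt = "1. e4 e5 2. Nf3"  # hypothetical prompt format
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))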
