skyehigh committed · verified · Commit 195624e · 1 Parent(s): 51eba55

Update README.md

Files changed (1): README.md (+11 −3)
README.md CHANGED

```diff
@@ -1,3 +1,11 @@
----
-license: mit
----
+---
+license: mit
+size_categories:
+- 10B<n<100B
+---
+`sample-10BT` version of the [FineWeb dataset](https://huggingface.co/datasets/HuggingFaceFW/fineweb), tokenized with the gpt2 tokenizer and split into 100M-token binary shards.
+
+A shard is simply a 1D stream of `np.uint16` numbers: the tokenized samples from the dataset, stored contiguously.
+Each sample was prefixed with the `<|endoftext|>` special token before tokenization.
+
+There are 103 training shards (under the `train/` directory) and 1 validation shard (under `val/`).
```
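
Since each shard is just a flat, headerless stream of `uint16` token ids, plain NumPy can read one back directly. A minimal sketch, assuming a hypothetical shard filename (the actual file names under `train/` and `val/` are not shown in this commit):

```python
import numpy as np
import tiktoken

# Hypothetical path: the commit does not show the real shard file names.
shard_path = "train/shard_000.bin"

# Each shard is a flat, headerless stream of uint16 token ids,
# so it can be read straight from disk with no parsing.
tokens = np.fromfile(shard_path, dtype=np.uint16)

# Decode a short prefix to inspect the text. gpt2's <|endoftext|>
# token (id 50256) marks the start of every sample.
enc = tiktoken.get_encoding("gpt2")
print(enc.decode(tokens[:128].tolist()))
```

For large shards, `np.memmap(shard_path, dtype=np.uint16, mode="r")` avoids loading a full 100M-token shard (~200 MB at 2 bytes per token) into memory at once.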
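
The preprocessing script itself is not part of this commit, but from the description above, a sketch of how each sample was presumably produced, assuming `tiktoken`'s gpt2 encoding (the actual pipeline may differ):

```python
import numpy as np
import tiktoken

enc = tiktoken.get_encoding("gpt2")
eot = enc.eot_token  # 50256, the <|endoftext|> id in the gpt2 vocabulary

def tokenize_sample(text: str) -> np.ndarray:
    # Prefix the sample with <|endoftext|>, then encode its text,
    # treating any special-token strings in the document as plain text.
    ids = [eot] + enc.encode_ordinary(text)
    # The gpt2 vocabulary (50257 ids) fits in uint16.
    return np.array(ids, dtype=np.uint16)
```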