Commit ebcdcb9

Update transformer.egg-info

1 parent f0c4078

2 files changed

Lines changed: 28 additions & 1 deletion


transformer.egg-info/PKG-INFO

Lines changed: 27 additions & 1 deletion
````diff
@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: transformer
-Version: 0.1.0
+Version: 0.2.0
 Summary: A polished PyTorch implementation of the current State-Of-The-Art (SOTA) Transformer
 Author-email: Leinier Orama <lof310w@gmail.com>
 Maintainer-email: Leinier Orama <lof310w@gmail.com>
@@ -23,8 +23,10 @@ Classifier: Topic :: Software Development :: Libraries :: Python Modules
 Classifier: Operating System :: OS Independent
 Requires-Python: >=3.9
 Description-Content-Type: text/markdown
+License-File: LICENSE
 Requires-Dist: torch>=1.12.0
 Requires-Dist: transformers>=4.30.0
+Dynamic: license-file
 
 # Transformer
 
@@ -89,6 +91,30 @@ input_ids = torch.randint(low=0, high=config.vocab_size, size=(B, N))
 output = model(input_ids, return_states=False)
 ```
 
+## Default Configuration
+The default configuration implements the latest SOTA Transformer design.
+
+```python
+from transformer import TransformerConfig
+
+TransformerConfig(
+    n_layers = 12,
+    d_model = 1536,
+    n_heads = 32,
+    n_kv_heads = None,  # GQA disabled
+    vocab_size = 50000,
+    d_ff = None,  # chosen automatically: math.ceil(d_model * 2.666)
+    attn_type = "MHA",
+    attn_bias = False,
+    ffn_bias = True,
+    attn_qk_norm = True,
+    lm_head_bias = False,
+    tied_weights = False,
+    seq_len = 1024,
+    max_seq_len = 4096
+)
+```
+
 ## Documentation
 
 Full Documentation available at [This Page](https://lof310.github.io/transformer)
````

transformer.egg-info/SOURCES.txt

Lines changed: 1 addition & 0 deletions
```diff
@@ -1,3 +1,4 @@
+LICENSE
 README.md
 pyproject.toml
 transformer/__init__.py
```
