DIvkov575/PFold

TODO
- "He" init instead of Xavier
- Increase model capacity
- Improve LR scheduling
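The He-init item could look like this in PyTorch (a sketch; it assumes the model's trainable layers are `nn.Linear`, which may not match the actual architecture):

```python
import torch.nn as nn

def he_init(module):
    # He (Kaiming) init keeps activation variance stable under ReLU,
    # where Xavier's symmetric-activation assumption breaks down.
    if isinstance(module, nn.Linear):
        nn.init.kaiming_normal_(module.weight, nonlinearity="relu")
        if module.bias is not None:
            nn.init.zeros_(module.bias)

# usage (hypothetical model name): model.apply(he_init) after construction
```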

TODO perf
- Prioritize functional/structural hotspots when masking
- Embedding dim
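One way to sketch hotspot-prioritized masking: sample mask positions with probability proportional to a per-position importance weight instead of uniformly. The `weights` tensor here is hypothetical (it would come from annotations of functional/structural hotspots, which this repo's data format is not shown to provide):

```python
import torch

def weighted_mask(weights, mask_frac=0.15):
    # Choose mask positions with probability proportional to `weights`,
    # e.g. uprating residues at (hypothetical) annotated hotspots.
    seq_len = weights.numel()
    n_mask = max(1, int(seq_len * mask_frac))
    idx = torch.multinomial(weights, n_mask, replacement=False)
    mask = torch.zeros(seq_len, dtype=torch.bool)
    mask[idx] = True
    return mask
```

With uniform weights this reduces to ordinary random masking, so it can be dropped in without changing behavior until hotspot weights are available.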







1/11/26
- 500 epochs; experimented with the LR (4e-4 seems solid); model capacity seemingly exhausted at 4 layers
- 0.1877 accuracy **revisited: I'm not confident that model capacity was necessarily exhausted


1/14/26
- 597 epochs, switched file type midway through, 8 layers - capacity not yet exhausted...
- train loss: 2.77, val loss: 2.73, accuracy: 0.1894


1/15/26
- Switched to sinusoidal positional encoder -> significant improvements
- 17 epochs in -> 0.2 accuracy, 2.73 train loss, 2.7 val loss -> (re)paused model training in hopes of improving the LR schedule (decay)
- 24 epochs in -> 0.204 -> increased LR to 8
- 26 epochs end -> 0.205
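The sinusoidal positional encoder referenced above is presumably the fixed sin/cos encoding from the original Transformer paper; a minimal sketch:

```python
import math
import torch

def sinusoidal_encoding(max_len, d_model):
    # Fixed sin/cos encoding: each position gets a unique phase pattern
    # across frequencies, with no learned parameters.
    pos = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
    div = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float)
                    * (-math.log(10000.0) / d_model))
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe  # added to token embeddings: x + pe[: x.size(1)]
```

Unlike a learned position embedding, this generalizes to sequence positions never seen during training, which may explain part of the improvement.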


1/16/26
- limiting to 1 sequence per file - using only good sequences
- (also with increased LR = 8) achieved 0.2035 accuracy in 13 epochs
- achieved 0.2077 accuracy @ epoch 50 -> 2.6651 training loss, 2.6567 val loss

1/21
- increased number of layers to 16 & decreased batch size to 64
- pause/resume @ epoch 17 - 0.2027 accuracy, 2.702 train loss, 2.7044 val loss
- re/pause @ epoch 30 - 0.2070 accuracy, 2.6668 training loss, 2.6661 validation loss
- re/pause @ epoch 45 - 0.2096 accuracy, training loss 2.6522, val loss 2.6482 -> LR 4
  observed val loss < training loss
- re/pause @ epoch 54 - 0.2105 accuracy, training loss 2.6476, val loss 2.6425
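The pause/resume cycle above needs the optimizer state saved alongside the weights; a checkpoint sketch (function and key names are hypothetical, not the repo's actual format):

```python
import torch

def save_ckpt(path, model, optimizer, epoch):
    # Persist weights + optimizer state (e.g. Adam moments) together,
    # so resuming continues the same trajectory.
    torch.save({"epoch": epoch,
                "model": model.state_dict(),
                "optim": optimizer.state_dict()}, path)

def load_ckpt(path, model, optimizer):
    ckpt = torch.load(path, map_location="cpu")
    model.load_state_dict(ckpt["model"])
    optimizer.load_state_dict(ckpt["optim"])
    return ckpt["epoch"] + 1  # epoch to resume from
```

When the LR is changed on resume (as at epoch 45 above), it can be overridden after loading via `optimizer.param_groups[0]["lr"] = new_lr`.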


1/28
- observed val loss < training loss => set dropout = 0
- introduced mixed precision
- repause @ epoch 100 (reached quickly) - 2.615 accuracy, 2.6157 val loss, 2.6150 train loss
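The mixed-precision step is presumably torch AMP; a sketch of one training step (model, batch, and hyperparameter names are placeholders; scaling is disabled on CPU, where it is unnecessary):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"

model = nn.Linear(16, 4).to(device)              # placeholder model
optimizer = torch.optim.AdamW(model.parameters(), lr=4e-4)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

x = torch.randn(8, 16, device=device)            # placeholder batch
y = torch.randint(0, 4, (8,), device=device)

optimizer.zero_grad()
with torch.amp.autocast(device_type=device, enabled=use_amp):
    loss = criterion(model(x), y)                # forward in fp16 on CUDA
scaler.scale(loss).backward()                    # scale to avoid fp16 underflow
scaler.step(optimizer)                           # unscales grads, then steps
scaler.update()
```

On CUDA this halves activation memory and speeds up matmuls on tensor cores, which fits the "achieved quickly" note above.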



TODO
- increase number of heads to 12
- Try Nsight gpu profiling
- Tunnel tensorboard -> local reliably
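For the TensorBoard tunnel item, a plain SSH local-forward usually suffices (hostnames and ports here are placeholders; 6006 is TensorBoard's default):

```shell
# Forward remote TensorBoard (port 6006) to the local machine;
# -N: no remote shell, just the tunnel. Replace user@gpu-host.
ssh -N -L 6006:localhost:6006 user@gpu-host
# then browse http://localhost:6006 locally
```

For reliability across dropped connections, wrapping the same command in `autossh` or an SSH config entry with `ServerAliveInterval` set is a common approach.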
