
SQuAD 2.0 Question-Answer w/ DistilBERT

Link

WandB: https://wandb.ai/arth-shukla/SQuAD2.0%20with%20Fine-Tuned%20DistilBERT

Resources Used

I used the Stanford Question Answering Dataset (SQuAD) 2.0: https://rajpurkar.github.io/SQuAD-explorer/

Technologies Used

Algorithms/Concepts: Transformers, Transfer Learning, BERT/DistilBERT, Question Answering

AI Development: PyTorch (Torch, Datasets, DataLoaders, CUDA), Hugging Face Transformers library, DistilBERT
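
Below is a minimal sketch of how these pieces fit together, assuming the distilbert-base-uncased checkpoint and the Hugging Face `datasets` copy of SQuAD 2.0 (`squad_v2`); the repo's actual training code may differ in details like sequence length and preprocessing.

```python
# Minimal setup sketch: DistilBERT with a span-prediction (QA) head on SQuAD 2.0.
# Assumptions: distilbert-base-uncased checkpoint, `squad_v2` from the HF hub.
import torch
from datasets import load_dataset
from transformers import DistilBertTokenizerFast, DistilBertForQuestionAnswering

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

tokenizer = DistilBertTokenizerFast.from_pretrained("distilbert-base-uncased")
model = DistilBertForQuestionAnswering.from_pretrained("distilbert-base-uncased").to(device)

# SQuAD 2.0 includes unanswerable questions, unlike SQuAD 1.1.
squad = load_dataset("squad_v2")

# Tokenize one (question, context) pair; the model's two heads predict the
# start and end token of the answer span within the context.
sample = squad["train"][0]
enc = tokenizer(
    sample["question"], sample["context"],
    truncation="only_second", max_length=384, return_tensors="pt",
).to(device)
out = model(**enc)
print(out.start_logits.shape, out.end_logits.shape)  # (1, seq_len) each
```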

Evaluation and Inference

The full results table and other runtime info are available on WandB: https://wandb.ai/arth-shukla/SQuAD2.0%20with%20Fine-Tuned%20DistilBERT
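
For inference, a hedged sketch using the Hugging Face question-answering pipeline is shown below; the checkpoint path is a placeholder for wherever the fine-tuned model is saved.

```python
# Inference sketch: load a fine-tuned DistilBERT QA checkpoint (path is a
# placeholder, not the repo's actual directory) and answer a question.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="path/to/fine-tuned-distilbert",      # hypothetical local checkpoint
    tokenizer="path/to/fine-tuned-distilbert",
)

result = qa(
    question="What dataset was the model fine-tuned on?",
    context="The model was fine-tuned on the Stanford SQuAD 2.0 dataset.",
    handle_impossible_answer=True,  # SQuAD 2.0 allows "no answer" predictions
)
print(result["answer"], result["score"])
```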

Future Experiments

I want to try something similar to my BERT Sentiment Analysis project (https://github.com/arth-shukla/sentiment140-bert-transfer-learning): there, I retrained the original model on its failure cases by selecting all examples whose loss exceeded a threshold. I expect the same approach would improve this model as well; a rough sketch of the selection step follows below.
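
This is only an illustrative sketch of that selection step, not code from either repo: it assumes `model`, `device`, and a `train_loader` whose batches carry `start_positions`/`end_positions`, and the loss threshold is a tunable assumption.

```python
# Hypothetical hard-example selection: keep training examples whose per-example
# loss exceeds a threshold, for a second round of fine-tuning on failure cases.
# `model`, `device`, and `train_loader` are assumed from the training setup.
import torch
import torch.nn.functional as F

LOSS_THRESHOLD = 2.0  # assumption; would be tuned against the loss distribution
hard_examples = []

model.eval()
with torch.no_grad():
    for batch in train_loader:
        batch = {k: v.to(device) for k, v in batch.items()}
        out = model(input_ids=batch["input_ids"],
                    attention_mask=batch["attention_mask"])
        # Per-example span loss: mean of start- and end-position cross-entropy,
        # computed with reduction="none" so each example keeps its own value.
        start_loss = F.cross_entropy(out.start_logits, batch["start_positions"],
                                     reduction="none")
        end_loss = F.cross_entropy(out.end_logits, batch["end_positions"],
                                   reduction="none")
        per_example_loss = (start_loss + end_loss) / 2
        mask = per_example_loss > LOSS_THRESHOLD
        hard_examples.extend(batch["input_ids"][mask].cpu())

# `hard_examples` would then feed a second fine-tuning pass on the failure cases.
```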

About Me

Arth Shukla Site | GitHub | LinkedIn
