BERT_NER_CLI Step by Step Guide

Before getting started, I would like to thank the Google Research Team and @Kaiyinzhou for their previous work here.

Environment

  • Python 3.5+
  • TensorFlow 1.11+

Folder structure

| Item | Description |
| --- | --- |
| NERdata | training / evaluation dataset |
| bert | BERT code, downloaded from here |
| bert_ner.py | training code |
| ner_predict.py | prediction code |
| predict_cli.py | simple command-line program for testing purposes |

Fine-Tune model

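The original README shows the fine-tuning step only as a screenshot. As a rough sketch, training is usually launched with flags in the style of Google's original BERT fine-tuning scripts; the paths and flag values below are placeholder assumptions and the flag names accepted by bert_ner.py may differ:

```
python bert_ner.py \
  --do_train=true \
  --do_eval=true \
  --data_dir=./NERdata \
  --vocab_file=./uncased_L-12_H-768_A-12/vocab.txt \
  --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json \
  --init_checkpoint=./uncased_L-12_H-768_A-12/bert_model.ckpt \
  --max_seq_length=128 \
  --train_batch_size=32 \
  --learning_rate=2e-5 \
  --num_train_epochs=3.0 \
  --output_dir=./output
```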

Training with GCP GPU/TPU

I found this pretty detailed guide on how to deploy code, mount folders, and execute .py files with Google Colab while taking advantage of its free TPU/GPU capabilities.

BERT-Base, Uncased or BERT-Large, Uncased needs to be unzipped, uploaded to a folder in your Google Drive, and mounted in Colab.

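Mounting is done with Colab's standard Drive helper; a minimal sketch (the bert_ner folder name is only an example, use whichever Drive folder you uploaded the unzipped model to):

```python
# Run inside a Colab notebook cell to make your Google Drive visible to the VM.
from google.colab import drive

drive.mount('/content/drive')

# After mounting, the uploaded checkpoint lives under "My Drive", e.g. (example path only):
BERT_BASE_DIR = '/content/drive/My Drive/bert_ner/uncased_L-12_H-768_A-12'
```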

I used a Colab GPU (K80) to fine-tune the model, which took around 30 minutes.

Evaluating

An evaluation script can be found here. A quick evaluation with the uncased 12-layer model yields an F1 score of 93.26. Results for the 24-layer model will be tried and added here later.
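For reference, the F1 reported above is presumably the usual entity-level (CoNLL-style) score. A quick way to compute that metric from gold and predicted tag sequences, independent of the linked script, is the seqeval package (illustrative example only, not part of this repository):

```python
# pip install seqeval
from seqeval.metrics import f1_score, classification_report

# Gold and predicted IOB tag sequences, one inner list per sentence (toy example).
y_true = [['B-PER', 'I-PER', 'O', 'B-LOC']]
y_pred = [['B-PER', 'I-PER', 'O', 'B-LOC']]

print(f1_score(y_true, y_pred))               # entity-level F1
print(classification_report(y_true, y_pred))  # per-entity-type breakdown
```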

Predicting

A simple command-line program is provided here for testing purposes. Simply run

python predict_cli.py

The program first loads the model and then waits for input.
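Loading the checkpoint is the slow part, so it is done once up front before the prompt loop starts. A minimal sketch of that structure (the function names below are hypothetical placeholders, not the actual predict_cli.py code):

```python
# Skeleton of a load-once, predict-many command-line loop (hypothetical names).

def load_model():
    # Load the fine-tuned BERT NER checkpoint and tokenizer; done only once.
    raise NotImplementedError("repo-specific loading code goes here")

def predict_tags(model, sentence):
    # Tokenize the sentence, run inference, and return one NER tag per token.
    raise NotImplementedError("repo-specific inference code goes here")

if __name__ == "__main__":
    model = load_model()  # slow step: the model is loaded a single time
    while True:
        sentence = input("Enter a sentence (empty line to quit): ")
        if not sentence.strip():
            break
        print(predict_tags(model, sentence))
```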

Some test results:

(screenshots of sample prediction outputs)
