Fine-tuning BERT for text classification of emotions.
Updated Dec 22, 2022 · Jupyter Notebook
Uses BERT to analyze whether each tweet is neutral, positive, or negative.
• Trained and deployed an emotion detector using Word Embeddings, LSTM, and BERT with TensorFlow and Transformers. • Fine-tuned the bert-base-cased encoder Transformer with a TensorFlow classification head; it predicts one of 6 emotions for a given input tweet with 95% accuracy.
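The classification-head step described above can be sketched in plain Python: a pooled BERT embedding is projected to one logit per emotion, then softmaxed into probabilities. The label set, weights, and embedding below are illustrative placeholders, not values from the repository.

```python
import math

# Toy sketch of a softmax classification head on top of an encoder:
# pooled embedding -> dense projection -> probabilities over 6 emotions.
# EMOTIONS and all numbers here are illustrative placeholders.

EMOTIONS = ["sadness", "joy", "love", "anger", "fear", "surprise"]

def softmax(logits):
    """Numerically stable softmax."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(pooled, weights, bias):
    """Project a pooled embedding to one logit per emotion, then softmax."""
    logits = [
        sum(w * x for w, x in zip(row, pooled)) + b
        for row, b in zip(weights, bias)
    ]
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return EMOTIONS[best], probs

# Tiny 4-dim "pooled embedding" and fixed demo weights.
pooled = [0.2, -0.1, 0.4, 0.05]
weights = [[0.1 * (i - j) for j in range(4)] for i in range(6)]
bias = [0.0] * 6
label, probs = classify(pooled, weights, bias)
```

In a real fine-tuning run, `weights` and `bias` are a trainable dense layer updated jointly with the encoder; only the softmax-over-logits logic is shown here.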
In this implementation, I identify spam and ham messages with BERT, obtaining 93% accuracy.
Fake News Headlines Detection using different NLP strategies: BOW, FastText Embedding, Transformers.
BERT Q&A API to answer prompts based on context (using a CDN).
My implementation of various popular transformer architectures
API to classify the sentiment of a review as either positive or negative, developed using bert-base-uncased.
Fine-tune a BERT model for spam detection.
Multi-class news topic classification with BERT.
Flask app for Semantic Similarity of sentences using BERT model.
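The core similarity step in an app like the one above is usually cosine similarity between fixed-size sentence vectors (e.g. mean-pooled BERT outputs). A minimal sketch, with illustrative stand-in vectors rather than real BERT embeddings:

```python
import math

# Cosine similarity between two sentence embeddings.
# The vectors below are illustrative stand-ins, not real BERT outputs.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

emb_a = [0.3, 0.8, 0.1]    # pretend embedding of sentence A
emb_b = [0.25, 0.75, 0.2]  # pretend embedding of sentence B
score = cosine_similarity(emb_a, emb_b)  # close to 1.0 for similar sentences
```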
Korean cursing-expression detection with fine-tuned klue_BERT (2023-1 KWU TextMining term project).
SMS Spam Classification with BERT
This repo provides a guide and code examples to preprocess text for BERT, build TensorFlow input pipelines for text data, and fine-tune BERT for text classification using TensorFlow 2 and TensorFlow Hub.
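A central piece of the preprocessing mentioned above is BERT's WordPiece tokenization, which splits each word greedily into the longest subword pieces found in the vocabulary. A minimal sketch of that longest-match-first algorithm follows; the tiny vocabulary is illustrative (the real bert-base vocab has about 30k entries and is loaded from the published vocab file):

```python
# Greedy longest-match-first WordPiece tokenization, as used by BERT's
# tokenizer during preprocessing. VOCAB here is a toy placeholder.

VOCAB = {"[UNK]", "the", "play", "##ing", "##ed", "fine", "tun", "##e"}

def wordpiece(word, vocab=VOCAB, max_chars=100):
    """Split one lowercase word into WordPiece subtokens."""
    if len(word) > max_chars:
        return ["[UNK]"]
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        while start < end:  # try the longest remaining piece first
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation-piece marker
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return ["[UNK]"]  # whole word is unknown if any piece fails
        tokens.append(cur)
        start = end
    return tokens

print(wordpiece("playing"))  # → ['play', '##ing']
```

After tokenization, the real pipeline maps pieces to vocabulary ids and adds the `[CLS]`/`[SEP]` special tokens before batching into TensorFlow input tensors.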
This repository implements a sentiment classifier in Google JAX using a BERT transformer as the backbone. It also demonstrates model checkpointing and loading for inference.
A collection of tools for BERT question answering on SQuAD: generating and evaluating predictions, making 1.1 models compatible with 2.0, and everything you need for submission to the SQuAD leaderboard.
Implemented BERT from scratch in PyTorch using the Stanford Sentiment Treebank (SST) and CFIMDB datasets. Inspired by "The Annotated Transformer" and "The Illustrated BERT".
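The heart of any from-scratch BERT implementation like the one above is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A pure-Python sketch on toy 2-dimensional inputs (real implementations use batched tensor ops):

```python
import math

# Scaled dot-product attention on lists of row vectors.
# Q, K, V each hold one d_k-dimensional vector per token.

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Return one output row per query: softmax(q·K / sqrt(d_k)) weighted sum of V."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: two tokens with orthogonal queries/keys.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
out = attention(Q, K, V)  # each row is a convex combination of V's rows
```

Each query attends most to its matching key, so the first output row stays closer to V's first row and the second to V's second row.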