Adaptive Notes Generator 📑

> Description:

Adaptive Notes Generator is a tool that helps you attend online classes effectively:star_struck:. With the shift to online classes, taking notes with pen and paper is impractical; the only options left are clicking screenshots or struggling to note down everything in your notebook:unamused:. Our application makes your life easier: once a meeting video:film_projector: is provided, we generate notes that save you the time:stopwatch: spent on research and gathering resources. We divide your meeting into useful segments and add supplementary material that makes any concept easy to understand.:bookmark_tabs::bookmark_tabs:

> Problem we are solving:

During the pandemic, many meetings moved to online platforms:computer:, and many continue to be held there. The transition from the blackboard:white_square_button: to PowerPoint:desktop_computer: has brought some problems, including:

1. Not being able to keep up with the pace:hourglass_flowing_sand::hourglass_flowing_sand: because each slide packs in concise information.

2. Not being able to write down:black_nib::black_nib: everything the teacher explains.

We plan to address these issues through our project:innocent::wink:.


The Named Entity Recognition system features a sophisticated word-embedding strategy using subword features and "Bloom" embeddings, a deep convolutional neural network with residual connections, and a novel transition-based approach to named entity parsing. The system is designed to strike a good balance between efficiency, accuracy, and adaptability.

Source: https://spacy.io/universe/project/video-spacys-ner-model

BART is a denoising autoencoder for pretraining sequence-to-sequence models. It is trained by corrupting text with an arbitrary noising function and learning a model to reconstruct the original text. It uses a standard Transformer-based seq2seq/NMT architecture with a bidirectional encoder (like BERT) and a left-to-right decoder (like GPT). This means the encoder's attention mask is fully visible, like BERT's, and the decoder's attention mask is causal, like GPT-2's.

Source: https://arxiv.org/abs/1910.13461

(Figure: BART architecture diagram)

Source: https://paperswithcode.com/method/bart?hcb=1
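A hedged sketch of summarizing a transcript segment with BART through the Hugging Face `transformers` pipeline. BART accepts at most 1024 tokens, so the hypothetical `chunk_words` helper splits long segments by word count first; the `facebook/bart-large-cnn` checkpoint (roughly 1.6 GB) is downloaded on first use.

```python
from transformers import pipeline

def chunk_words(text, max_words=350):
    """Split a long transcript segment into word-based chunks that fit BART's input limit."""
    words = text.split()
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), max_words)]

def summarize_segment(text, summarizer=None):
    """Summarize one transcript segment chunk by chunk and join the partial summaries."""
    if summarizer is None:
        # Downloads facebook/bart-large-cnn (~1.6 GB) on first use.
        summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
    parts = [
        summarizer(chunk, max_length=120, min_length=20, do_sample=False)[0]["summary_text"]
        for chunk in chunk_words(text)
    ]
    return " ".join(parts)
```

The word-count threshold of 350 is a conservative assumption chosen so that a chunk stays under the 1024-token limit after subword tokenization.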

(Figure: Silero speech-to-text models)

Source: https://pytorch.org/hub/snakers4_silero-models_stt/?hcb=1
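A sketch of the speech-to-text step with the Silero models, following the usage pattern from the PyTorch Hub page linked above. The audio file name is hypothetical (the audio track would be extracted from the meeting video beforehand), and the model weights are downloaded from `torch.hub` on first use.

```python
import torch

def transcribe(wav_paths, language="en", device=None):
    """Transcribe a list of WAV files with the Silero STT model from torch.hub."""
    device = device or torch.device("cpu")
    # Downloads the Silero model, decoder, and helper utilities on first use.
    model, decoder, utils = torch.hub.load(
        repo_or_dir="snakers4/silero-models",
        model="silero_stt",
        language=language,
        device=device,
    )
    read_batch, split_into_batches, read_audio, prepare_model_input = utils
    texts = []
    for batch in split_into_batches(wav_paths, batch_size=10):
        model_input = prepare_model_input(read_batch(batch), device=device)
        texts.extend(decoder(example.cpu()) for example in model(model_input))
    return texts

# Hypothetical usage, with audio extracted from the meeting video:
# print(transcribe(["lecture_audio.wav"]))
```

The resulting transcript is what the NER and BART summarization steps above operate on.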

> Problems faced:

  • Overwriting unwanted branches while testing
  • Finding an accurate speech-to-text model
  • Dealing with limited cloud storage
  • Downloading from and uploading to the ML backend
  • Not having enough computational resources to run bigger models
  • Compressing video files for upload
  • Adding a download button to the PDF viewer
