Investigate and analyze the NNTrainer modules that handle user text inputs.
Decide whether the knowledge must be transformed before it can be fed as input, and if so, specify the transformation logic from that knowledge to the inputs.
Devise a few sample Question Answering (QA) scenarios to analyze the effects of knowledge embeddings (or prompting).
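As a minimal sketch of the knowledge-prompting direction above: external knowledge (e.g., retrieved facts) is serialized into plain text and prepended to the question, so the model consumes it through its ordinary text input path. The function name and prompt format below are illustrative assumptions, not NNTrainer APIs.

```python
# Hypothetical sketch of "knowledge prompting" for QA.
# Assumption: knowledge arrives as (subject, relation, object) triples;
# we transform them into text and prepend them to the question.

def build_knowledge_prompt(question, facts):
    """Serialize knowledge triples into text and build a QA prompt."""
    # Turn each (s, r, o) triple into a short sentence.
    knowledge_text = " ".join(f"{s} {r} {o}." for s, r, o in facts)
    # Prepend the serialized knowledge so it reaches the model as text.
    return f"Knowledge: {knowledge_text}\nQuestion: {question}\nAnswer:"

facts = [("Seoul", "is the capital of", "South Korea")]
prompt = build_knowledge_prompt("What is the capital of South Korea?", facts)
print(prompt)
```

A scenario like this lets us compare model answers with and without the `Knowledge:` prefix, which is one way to measure the effect of knowledge prompting versus embedding-level fusion.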
References
AAAI'22 - DKPLM: Decomposable Knowledge-Enhanced Pre-trained Language Model for Natural Language Understanding
EMNLP'22 - Knowledge Prompting in Pre-trained Language Model for Natural Language Understanding
EMNLP'19 - Incorporating External Knowledge into Machine Reading for Generative Question Answering
Disclaimer
NOTE: We're open to suggestions. Please let us know in the comments if you have any feedback or guidance!