Page History
...
- Diego Collarana (FIT)
- Daniel Baldassare (doctima)
- Michael Wetzel (Coreon)
- Sabine Mahr (word b sign)
- ...
Draft from Daniel Baldassare :
Short definition/description of this topic: Verbalizing knowledge graphs for LLMs is the task of representing a knowledge graph as text so that it can be written directly into the prompt, the main input source of an LLM. Verbalization consists of finding textual representations for nodes, for the relationships between nodes, and for their metadata. Verbalization can take place at different stages of the LLM lifecycle: during training (pre-training, instruction fine-tuning) or during inference (in-context learning).
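As a minimal sketch of the core idea, the snippet below renders (subject, predicate, object) triples as plain sentences that can be placed in a prompt. The function name, the example IDs, and the label map are illustrative assumptions, not part of the draft.

```python
# Hypothetical helper (not from the draft): verbalize knowledge-graph
# triples as plain sentences for use as LLM prompt context.

def verbalize_triples(triples, labels):
    """Render (subject, predicate, object) triples as text lines.

    `labels` maps node/relation IDs to human-readable surface forms,
    i.e. the textual representations that verbalization must find.
    """
    lines = []
    for s, p, o in triples:
        # Fall back to the raw ID when no label is known.
        lines.append(f"{labels.get(s, s)} {labels.get(p, p)} {labels.get(o, o)}.")
    return "\n".join(lines)

# Example triples using Wikidata-style IDs (illustrative only).
triples = [("Q937", "P19", "Q3012"), ("Q937", "P106", "Q169470")]
labels = {
    "Q937": "Albert Einstein",
    "P19": "was born in",
    "Q3012": "Ulm",
    "P106": "worked as",
    "Q169470": "a physicist",
}
prompt_context = verbalize_triples(triples, labels)
print(prompt_context)
```

The resulting text can then be prepended to a user question as in-context knowledge.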
- Encoding Graphs in Prompt: Talk like a graph: Encoding graphs for large language models (research.google)
- System prompt vs user prompt
- Train or fine-tune a tokenizer with dedicated special tokens for graph data
- Combine graph embeddings with text embeddings: Joint Embeddings for Graph Instruction Tuning (arxiv.org)
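To illustrate the first bullet, the sketch below encodes the same graph in two textual formats (edge list and adjacency description), in the spirit of the "Talk like a graph" work referenced above. The exact phrasings and function names are assumptions for illustration, not the paper's prompts.

```python
# Illustrative sketch: two ways to encode a graph as text for a prompt.
# The wording of each encoding is an assumption, not a fixed standard.

def edge_list_encoding(edges):
    """Encode the graph as a flat list of edges."""
    return "G has edges: " + ", ".join(f"({u}, {v})" for u, v in edges) + "."

def adjacency_encoding(edges):
    """Encode the graph as per-node adjacency descriptions."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
    parts = [f"{u} is connected to {', '.join(vs)}" for u, vs in sorted(adj.items())]
    return "In G, " + "; ".join(parts) + "."

edges = [("A", "B"), ("A", "C"), ("B", "C")]
print(edge_list_encoding(edges))
print(adjacency_encoding(edges))
```

Which encoding works best can depend on the downstream reasoning task, which is exactly what such encoding studies compare.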
...