
Traian Rebedea

Towards Inference-time Category-wise Safety Steering for Large Language Models

Oct 02, 2024

"Vorbeşti Româneşte?" A Recipe to Train Powerful Romanian LLMs with English Instructions

Jun 26, 2024

Unsupervised Extraction of Dialogue Policies from Conversations

Jun 21, 2024

OpenLLM-Ro -- Technical Report on Open-source Romanian LLMs

May 17, 2024

CantTalkAboutThis: Aligning Language Models to Stay on Topic in Dialogues

Apr 04, 2024

Improving Legal Judgement Prediction in Romanian with Long Text Encoders

Mar 04, 2024

NeMo Guardrails: A Toolkit for Controllable and Safe LLM Applications with Programmable Rails

Oct 16, 2023

UPB at IberLEF-2023 AuTexTification: Detection of Machine-Generated Text using Transformer Ensembles

Aug 02, 2023

GEST: the Graph of Events in Space and Time as a Common Representation between Vision and Language

May 22, 2023

Distilling the Knowledge of Romanian BERTs Using Multiple Teachers

Jan 11, 2022