Model trained on this dataset: [RenlyH/CodeV-RL](https://huggingface.co/RenlyH/CodeV-RL) (Image-Text-to-Text, 8B).
Each row in the dataset has six fields:

| Field | Type | Notes |
|---|---|---|
| `data_source` | string (8 classes) | source benchmark, e.g. `FigureQA` |
| `prompt` | list (length 2) | two chat messages; the first carries the task instructions |
| `reward_model` | dict | reference `ground_truth` answer and reward `style` |
| `extra_info` | dict | question, image file name/path/size, split, index |
| `images` | unknown | raw PNG bytes for the figure |
| `env_name` | string (1 class) | always `code` |
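To explore the rows programmatically, a minimal sketch using the `datasets` library is shown below. The repository id is an assumption (reusing the model's name from this card); substitute the dataset's actual id if it differs.

```python
# Minimal sketch: load the dataset and inspect one row.
# The repo id "RenlyH/CodeV-RL" is assumed from this card, not confirmed;
# substitute the actual dataset id if it differs.
from datasets import load_dataset

ds = load_dataset("RenlyH/CodeV-RL", split="train")

row = ds[0]
print(row["data_source"])                   # e.g. "FigureQA"
print(row["env_name"])                      # "code" in every row
print(row["reward_model"]["ground_truth"])  # e.g. "No"
print(row["extra_info"]["question"])        # "<image>Does Midnight Blue ..."
```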
**Example row 1** — `data_source`: FigureQA, `env_name`: code

- `prompt`: two chat messages; the first begins:
  > You are a helpful assistant.
  > **Goal**:
  > - Answer the user's question based on provided image and question.
  > - You can optionally generate and execute Python code to help analyze or process the image before answering.
  > **Python execution rules**:
  > - The Python code will be execute… (truncated in the preview)
- `reward_model`: `{"ground_truth": "No", "style": "rule"}`
- `extra_info`: question `<image>Does Midnight Blue have the maximum area under the curve?`; image `22169.jpg` at `/home/xinhaih/hits/exp/renly/CodeV_images/22169.jpg`, size 588×392; split `train`; index `"0"`; answer `""`; style `rule`
- `images`: raw PNG bytes (the preview showed the byte array, beginning with the PNG signature `137, 80, 78, 71, 13, 10, 26, 10`)
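Since `images` holds raw PNG bytes, a row's figure can be decoded in memory with Pillow. A minimal sketch, assuming the column stores one flat byte sequence per row as the preview suggests (adjust if it actually nests a list of images):

```python
# Minimal sketch: decode the raw PNG bytes in `images` with Pillow.
# Assumes `row` comes from the loading snippet above and that `images`
# is one flat byte sequence per row (an assumption based on the preview).
import io
from PIL import Image

png_bytes = bytes(row["images"])
img = Image.open(io.BytesIO(png_bytes))
print(img.size)  # expected to match extra_info["image_size"], e.g. (588, 392)
```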
**Example row 2** — `data_source`: FigureQA, `env_name`: code

- `prompt`: same two-message structure and instructions as above
- `reward_model`: `{"ground_truth": "Yes", "style": "rule"}`
- `extra_info`: question `<image>Is Midnight Blue greater than Medium Blue?`; image `25799.jpg` at `/home/xinhaih/hits/exp/renly/CodeV_images/25799.jpg`, size 392×392; split `train`; index `"0"`; answer `""`; style `rule`
- `images`: raw PNG bytes (byte array omitted)
The remaining preview rows follow the same structure (prompt text and PNG payloads omitted; the viewer rendered the `images` column as base64-encoded PNG data):

| data_source | ground_truth | image_file_name | env_name |
|---|---|---|---|
| FigureQA | Yes | 17055.jpg | code |
| FigureQA | No | 29591.jpg | code |
| FigureQA | No | 24680.jpg | code |
| FigureQA | No | 34812.jpg | code |
| FigureQA | No | 39373.jpg | code |
| FigureQA | Yes | 232.jpg | code |
| FigureQA | No | 38489.jpg | code |
| FigureQA | No | 31719.jpg | code |
This dataset is used to train CodeV-RL, as described in the referenced paper.