GETMARKED JSON Quiz Format

Overview
Once you have uploaded a document for conversion and have received the GETMARKED JSON quiz data, you will need to process the data into a suitable form for storing and rendering. This section will help you understand the structure of our JSON quiz format.

Note: Please ensure you have read our getting started guide before continuing!
Structure of JSON Quiz
Below are abridged JSON examples of the most common quiz types we support. The "..." in the JSON blocks stands in for fields that are not relevant for understanding the structure of the format.
Open ended question

{
    "title": "...",
    "category": "OEQ",
    "stimulus": "...",
    "prompt": "<div>What is the color of the sky?</div>",
    "answers": [["Blue"]]
}

Multiple choice question

{
    "title": "...",
    "category": "MCQ",
    "stimulus": "...",
    "prompt": "<div>What is the color of the sky?</div>",
    "choices": [
        {"id": "a", "content": "<div>Red</div>"},
        {"id": "b", "content": "<div>Blue</div>"}
    ],
    "answers": [["b"]]
}

Cloze question (fill-in-the-blanks)

{
    "title": "...",
    "category": "CLOZE",
    "stimulus": "...",
    "prompt": "<div>Roses are %%%0_TEXT_ENTRY_AgJ82I%%%. Violets are %%%1_TEXT_ENTRY_GgowI3%%%.</div>",
    "interactions": [
        {
            "placeholder": "%%%0_TEXT_ENTRY_AgJ82I%%%",
            "category": "text-entry",
            "answers": ["red", "Red"]
        },
        {
            "placeholder": "%%%1_TEXT_ENTRY_GgowI3%%%",
            "category": "text-entry",
            "answers": ["blue", "Blue"]
        }
    ],
    "answers": [
        ["red", "Red"],
        ["blue", "Blue"]
    ]
}

Matching question

{
    "title": "...",
    "category": "MATCH",
    "stimulus": "...",
    "prompt": "<div>Match the following countries to their capital.</div>",
    "matching_interaction": {
        "left": [
            {"id": "1", "content": "<div>USA</div>"},
            {"id": "2", "content": "<div>France</div>"}
        ],
        "right": [
            {"id": "a", "content": "<div>Paris</div>"},
            {"id": "b", "content": "<div>London</div>"},
            {"id": "c", "content": "<div>Washington D.C.</div>"}
        ],
        "answers": [
            ["1", "c"],
            ["2", "a"]
        ]
    },
    "answers": [
        ["1 c", "2 a"]
    ]
}

Numerical response question

{
    "title": "...",
    "category": "NRQ",
    "stimulus": "...",
    "prompt": "<div>Provide an estimate of π using the Gregory-Leibniz series.</div>",
    // refer to answer section below for more details about the forms Numeric Answer can take.
    "answers": [["3.141±0.0005", "3.10⟷3.14", "3.141p4"]]
}
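
If you are writing your own processor for this format, the category field is the natural place to branch. The following is a minimal, non-authoritative TypeScript sketch (the type and function names are our own illustrations, not part of the API) of how the different quiz types might be routed:

type QuestionCategory = "MCQ" | "MRQ" | "OEQ" | "NRQ" | "CLOZE" | "MATCH" | "CONTENT" | "FILE";

interface AbridgedQuestion {
    category: QuestionCategory;
    stimulus?: string;
    prompt?: string;
}

// Route a question to the right handling based on its category.
function describeHandling(question: AbridgedQuestion): string {
    switch (question.category) {
        case "MCQ":
        case "MRQ":
            return "render choices as radio buttons (MCQ) or checkboxes (MRQ)";
        case "OEQ":
            return "render a free-text response box";
        case "NRQ":
            return "render a numeric input and grade against the numeric answer forms";
        case "CLOZE":
            return "replace placeholders in the prompt with inline inputs";
        case "MATCH":
            return "render the left and right items of matching_interaction";
        default:
            return "render stimulus and prompt only (FILE additionally collects an uploaded file)";
    }
}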

JSON Quiz Schema
Below is a full sample of the JSON response that you might receive from our Quiz Import API.

{
    "status": "success",
    "message": "Your file has been successfully converted.",
    "data": {
        "filename": "getmarked-practice-0.docx",
        "num_of_pages": 4.0,
        "questions": [
          {
            "id": "2568481",
            "title": "Q001 Use the following table to ans...Which is the most expensive car on the list?",
            "category": "MCQ",
            "stimulus": "<div> <div>Use the following table to answer the next 2 questions:</div>
                         <table style=\"border: 1px solid #eeeeee;\"><tr><td style=\"padding: 5px; border:
                         1px solid #eeeeee;\"><div>Brand of Car</div></td><td style=\"padding: 5px; border: 1px solid
                         #eeeeee;\"><div>Make of Car</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Cost ($)</div></td></tr><tr><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Toyota</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>RAV4</div></td>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>35,000</div></td></tr><tr>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Ford</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>F-Series</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>38,000</div></td></tr><tr>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Tesla</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Model 3</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>80,000</div></td></tr>
                         <tr><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Honda</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Civic</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>27,600</div></td></tr></table> <div>   </div> </div>",
            "prompt": "<div> <div> 1. Which is the most expensive car on the list? </div> </div>",
            "answers": [
              [
                "c"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> Toyota RAV4 </div>"
              },
              {
                "id": "b",
                "content": "<div> Ford F-Series </div>"
              },
              {
                "id": "c",
                "content": "<div> Tesla Model 3 </div>"
              },
              {
                "id": "d",
                "content": "<div> Honda Civic </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568482",
            "title": "Q002 Use the following table to ans...Which are the two cheapest cars on the list?",
            "category": "MRQ",
            "stimulus": "<div> <div>Use the following table to answer the next 2 questions:</div>
                         <table style=\"border: 1px solid #eeeeee;\"><tr><td style=\"padding: 5px; border:
                         1px solid #eeeeee;\"><div>Brand of Car</div></td><td style=\"padding: 5px; border: 1px solid
                         #eeeeee;\"><div>Make of Car</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Cost ($)</div></td></tr><tr><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Toyota</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>RAV4</div></td>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>35,000</div></td></tr><tr>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Ford</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>F-Series</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>38,000</div></td></tr><tr>
                         <td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Tesla</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Model 3</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>80,000</div></td></tr>
                         <tr><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>Honda</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\">
                         <div>Civic</div></td><td style=\"padding: 5px; border: 1px solid #eeeeee;\"><div>27,600</div></td></tr></table> <div>   </div> </div>",
            "prompt": "<div> <div> 2. Which are the two cheapest cars on the list?</div> </div>",
            "answers": [
              [
                "a",
                "d"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> Toyota RAV4 </div>"
              },
              {
                "id": "b",
                "content": "<div> Ford F-Series </div>"
              },
              {
                "id": "c",
                "content": "<div> Tesla Model 3 </div>"
              },
              {
                "id": "d",
                "content": "<div> Honda Civic </div>"
              }
            ],
            "metadata": {
              "Type": "Multiple Choice",
              "Difficulty": "Hard",
              "category": "Apply"
            },
            "points": 1.0
          },
          {
            "id": "2568483",
            "title": "Q003 The diagram below shows a child's swing. The swing is releas",
            "category": "MCQ",
            "prompt": "<div> <div> The diagram below shows a child's swing.</div> <div> <img src=\"https://digitaliserstorage.blob.core.windows.net/media/user_upload_files/user2__tHM1uEIE/23FV0TB7_KnCXfcEGBwUKgaQUjkLuaqtFkw4IqxUj.jpg\" /> </div> <div>The swing is released from point X.</div> <div>Which movement takes one period of oscillation?</div> </div>",
            "answers": [
              [
                "c"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> From X to Y </div>"
              },
              {
                "id": "b",
                "content": "<div> From X to Z </div>"
              },
              {
                "id": "c",
                "content": "<div> From X to Z and back to X </div>"
              },
              {
                "id": "d",
                "content": "<div> From X to Z and back to Y </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568484",
            "title": "Q004 Who was captain of the Titanic when it sank on April 15, 191",
            "category": "MCQ",
            "prompt": "<div> <div> Who was captain of the Titanic when it sank on April 15, 1912?</div> </div>",
            "answers": [
              [
                "a"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> <div> Edward Smith </div> </div>"
              },
              {
                "id": "b",
                "content": "<div> <div> Christopher Columbus </div> </div>"
              },
              {
                "id": "c",
                "content": "<div> <div> Jack Sparrow </div> </div>"
              },
              {
                "id": "d",
                "content": "<div> <div> David Greene </div> </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568485",
            "title": "Q005 ∫x+   sin⁡x1+   cos⁡x   dx is equal to",
            "category": "MCQ",
            "prompt": "<div> <div> <math><mrow><mo stretchy=\"false\">∫</mo><mrow><mfrac><mrow><mi>x</mi><mo>+</mo><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">sin</mi></mrow><mo>⁡</mo><mrow><mi>x</mi></mrow></mrow></mrow><mrow><mn>1</mn><mo>+</mo><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">cos</mi></mrow><mo>⁡</mo><mrow><mi>x</mi></mrow></mrow></mrow></mfrac><mi>   </mi><mi>d</mi><mi>x</mi></mrow></mrow></math> is equal to</div> </div>",
            "answers": [
              [
                "b"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> <math><mrow><mrow><mi mathvariant=\"normal\">log</mi></mrow><mo>⁡</mo><mrow><mfenced close=\"|\" open=\"|\" separators=\"|\"><mrow><mn>1</mn><mo>+</mo><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">cos</mi></mrow><mo>⁡</mo><mrow><mi>x</mi></mrow></mrow></mrow></mfenced></mrow></mrow><mo>+</mo><mi>c</mi></math> </div>"
              },
              {
                "id": "b",
                "content": "<div> <math><mrow><mrow><mi mathvariant=\"normal\">log</mi></mrow><mo>⁡</mo><mrow><mfenced close=\"|\" open=\"|\" separators=\"|\"><mrow><mn>1</mn><mo>+</mo><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">sin</mi></mrow><mo>⁡</mo><mrow><mi>x</mi></mrow></mrow></mrow></mfenced></mrow></mrow><mo>+</mo><mi>c</mi></math> </div>"
              },
              {
                "id": "c",
                "content": "<div> <math><mi>x</mi><mo>-</mo><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">tan</mi></mrow><mo>⁡</mo><mrow><mi>x</mi></mrow></mrow><mo>+</mo><mi>c</mi></math> </div>"
              },
              {
                "id": "d",
                "content": "<div> <math><mi>x</mi><mi>   </mi><mrow><mrow><mi mathvariant=\"normal\">tan</mi></mrow><mo>⁡</mo><mrow><mfrac><mrow><mi>x</mi></mrow><mrow><mn>2</mn></mrow></mfrac></mrow></mrow></math> </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568486",
            "title": "Q006 Select the word whose underlin...",
            "category": "MCQ",
            "stimulus": "<div> <div>Select the word whose underlined part differs from the other three in pronunciation in the next 2 questions.</div> <div>   </div> </div>",
            "prompt": "<div> <div> 6.</div> </div>",
            "answers": [
              [
                "B"
              ]
            ],
            "choices": [
              {
                "id": "A",
                "content": "<div> am<u>a</u>zing </div>"
              },
              {
                "id": "B",
                "content": "<div> ch<u>a</u>rge </div>"
              },
              {
                "id": "C",
                "content": "<div> fem<u>a</u>le </div>"
              },
              {
                "id": "D",
                "content": "<div> t<u>a</u>ste </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568487",
            "title": "Q007 Select the word whose underlin...",
            "category": "MCQ",
            "stimulus": "<div> <div>Select the word whose underlined part differs from the other three in pronunciation in the next 2 questions.</div> <div>   </div> </div>",
            "prompt": "<div> <div> 7.</div> </div>",
            "answers": [
              [
                "A"
              ]
            ],
            "choices": [
              {
                "id": "A",
                "content": "<div> br<u>ea</u>k </div>"
              },
              {
                "id": "B",
                "content": "<div> br<u>ea</u>th </div>"
              },
              {
                "id": "C",
                "content": "<div> thr<u>ea</u>d </div>"
              },
              {
                "id": "D",
                "content": "<div> tr<u>ea</u>d </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568488",
            "title": "Q008(a) 8(a) A student sets up a circu...Explain why in this circuit the current lamp Q is larger th",
            "category": "OEQ",
            "stimulus": "<div> 8(a) <div> A student sets up a circuit using a battery made of four cells, a resistor R, two identical lamps P and Q, and a switch. The circuit is shown in Fig. 6.1.</div> <div> <img src=\"https://digitaliserstorage.blob.core.windows.net/media/user_upload_files/user2__tHM1uEIE/CDAZNTFW_aeZWRNEHLWp7TIdUbcCIuAqlqnqnGRBn.jpg\" /> </div> <div>   </div> </div>",
            "prompt": "<div> <div> Explain why in this circuit the current lamp Q is larger than the current in lamp P.</div> </div>",
            "answers": [
              [
                "Lamp P is connected in parallel with resistor R hence it has lower current compare to Q."
              ]
            ],
            "points": 1.0
          },
          {
            "id": "2568489",
            "title": "Q008(b) 8(b) A student sets up a circu...Explain why in this circuit lamp Q has different resistance",
            "category": "OEQ",
            "stimulus": "<div> 8(b) <div> A student sets up a circuit using a battery made of four cells, a resistor R, two identical lamps P and Q, and a switch. The circuit is shown in Fig. 6.1.</div> <div> <img src=\"https://digitaliserstorage.blob.core.windows.net/media/user_upload_files/user2__tHM1uEIE/RVCJFZAF_onTGN0UG4ATMhPhVeFx5s67obKFQLWfe.jpg\" /> </div> <div>   </div> </div>",
            "prompt": "<div> <div> Explain why in this circuit lamp Q has different resistance from lamp P even though they are identical lamps.</div> </div>",
            "answers": [
              [
                "As V=IR, given that P has less effective resistance due to being connect in parallel with R."
              ]
            ],
            "points": 1.0
          },
          {
            "id": "2568490",
            "title": "Q009 Suddenly she came upon a little three-legged table, all made",
            "category": "CLOZE",
            "prompt": "<div> <div> <p>Suddenly she came upon a little three-legged table, all made of solid glass;
                        there was (9) %%%0_TEXT_ENTRY_yJpav7%%% on it except a tiny golden key, and Alice's first (10) %%%1_TEXT_ENTRY_YAqpHj%%% was
                        that it might belong to one of the doors of the hall; but, alas! either the locks were too large, or the key was too small,
                        but at any rate it would not open any of them. (11) %%%2_TEXT_ENTRY_Q3Wejt%%% , on the second time round, she came upon a low
                        curtain she had not noticed before, and behind it was a little door about fifteen inches high: she tried the little golden key in the lock,
                        and to her great delight it fitted! </p> <p>Alice (12) %%%3_TEXT_ENTRY_OzxF6g%%% the door and found that it led into a small passage,
                        not much larger than a rat-hole: she knelt down and looked along the passage into the loveliest garden you ever saw. How she longed to get
                        out of that dark hall, (13) %%%4_TEXT_ENTRY_Dz74Au%%% wander about among those beds of bright flowers and those cool fountains, but
                        she (14) %%%5_TEXT_ENTRY_IkN5Qv%%% not even get her head through the doorway; \"and even if my head would go through,\"
                        thought poor Alice, \"it would be of very little use without my shoulders. Oh, how I wish I could shut up like a telescope!
                        I think I could, if I only (15) %%%6_TEXT_ENTRY_aNQ2m8%%% how to begin.\" For, you see, so many out-of-the-way things had happened lately,
                        that Alice had begun to think that very few things indeed were really impossible. </p> <p>(16) %%%7_TEXT_ENTRY_zmoOEb%%% seemed
                        to be no use in waiting by the little door, so she went back to the table, half hoping she might find another key on it, or at any rate a
                        book of rules for shutting people up like telescopes: this time she found a little bottle on it, (\"which certainly was not here before,\"
                        said Alice,) and round the neck of the bottle was a paper label, (17) %%%8_TEXT_ENTRY_eeyetY%%% the words \"DRINK ME,\" beautifully printed
                        on it in large letters. </p> </div> </div>",
            "interactions": [
              {
                "category": "text-entry",
                "placeholder": "%%%0_TEXT_ENTRY_yJpav7%%%",
                "answers": [
                  "nothing"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%1_TEXT_ENTRY_YAqpHj%%%",
                "answers": [
                  "thought"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%2_TEXT_ENTRY_Q3Wejt%%%",
                "answers": [
                  "thankfully",
                  " luckily"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%3_TEXT_ENTRY_OzxF6g%%%",
                "answers": [
                  "opened"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%4_TEXT_ENTRY_Dz74Au%%%",
                "answers": [
                  "and"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%5_TEXT_ENTRY_IkN5Qv%%%",
                "answers": [
                  "could"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%6_TEXT_ENTRY_aNQ2m8%%%",
                "answers": [
                  "knew"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%7_TEXT_ENTRY_zmoOEb%%%",
                "answers": [
                  "there"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%8_TEXT_ENTRY_eeyetY%%%",
                "answers": [
                  "with"
                ]
              }
            ],
            "answers": [
              [
                "nothing"
              ],
              [
                "thought"
              ],
              [
                "thankfully",
                " luckily"
              ],
              [
                "opened"
              ],
              [
                "and"
              ],
              [
                "could"
              ],
              [
                "knew"
              ],
              [
                "there"
              ],
              [
                "with"
              ]
            ],
            "points": 1.0
          },
          {
            "id": "2568491",
            "title": "Q015 The Grand Canyon is a steep-sided canyon carved by the Color",
            "category": "CLOZE",
            "prompt": "<div> <p>The Grand Canyon is a steep-sided canyon carved by the Colorado River in %%%0_INLINE_CHOICE_LIzVgb%%%. The canyon is a result of %%%1_TEXT_ENTRY_rbLMO1%%% which exposes one of the most complete geologic columns on the planet.</p> </div>",
            "interactions": [
              {
                "category": "inline-choice",
                "placeholder": "%%%0_INLINE_CHOICE_LIzVgb%%%",
                "choices": [
                  {
                    "id": "a",
                    "content": "Arizona"
                  },
                  {
                    "id": "b",
                    "content": "Utah"
                  },
                  {
                    "id": "c",
                    "content": "California"
                  },
                  {
                    "id": "d",
                    "content": "Nevada"
                  }
                ],
                "answers": [
                  "a"
                ]
              },
              {
                "category": "text-entry",
                "placeholder": "%%%1_TEXT_ENTRY_rbLMO1%%%",
                "answers": [
                  "erosion",
                  "weathering"
                ]
              }
            ],
            "answers": [
              [
                "a"
              ],
              [
                "erosion",
                "weathering"
              ]
            ],
            "points": 1.0
          },
          {
            "id": "2568492",
            "title": "Q016 The United States has 52 states.",
            "category": "MCQ",
            "prompt": "<div> <div>The United States has 52 states.</div> </div>",
            "answers": [
              [
                "b"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> true </div>"
              },
              {
                "id": "b",
                "content": "<div> false </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568493",
            "title": "Q017 Neil Armstrong was the first person to walk on the moon.",
            "category": "MCQ",
            "prompt": "<div> <div>Neil Armstrong was the first person to walk on the moon.</div> </div>",
            "answers": [
              [
                "a"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> true </div>"
              },
              {
                "id": "b",
                "content": "<div> false </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568494",
            "title": "Q018 United States of America entered WWI in 1917.",
            "category": "MCQ",
            "prompt": "<div> <div> United States of America entered WWI in 1917. </div> </div>",
            "answers": [
              [
                "a"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> true </div>"
              },
              {
                "id": "b",
                "content": "<div> false </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568495",
            "title": "Q019 Paris is the capital of Italy.",
            "category": "MCQ",
            "prompt": "<div> <div>Paris is the capital of Italy.</div> </div>",
            "answers": [
              [
                "b"
              ]
            ],
            "choices": [
              {
                "id": "a",
                "content": "<div> true </div>"
              },
              {
                "id": "b",
                "content": "<div> false </div>"
              }
            ],
            "points": 1.0
          },
          {
            "id": "2568496",
            "title": "Q020 Match the following landmarks with the states they are in:",
            "category": "MATCH",
            "prompt": "<div> <div>Match the following landmarks with the states they are in:</div> </div>",
            "matching_interaction": {
              "left": [
                {
                  "id": "L0",
                  "content": "<div><div> Statue of Liberty </div></div>"
                },
                {
                  "id": "L1",
                  "content": "<div><div> Grand Canyon </div></div>"
                },
                {
                  "id": "L2",
                  "content": "<div><div> Mount Rushmore </div></div>"
                },
                {
                  "id": "L3",
                  "content": "<div><div> Golden Gate Bridge </div></div>"
                },
                {
                  "id": "L4",
                  "content": "<div><div> Empire State Building </div></div>"
                }
              ],
              "right": [
                {
                  "id": "R0",
                  "content": "<div><div> New York </div></div>"
                },
                {
                  "id": "R1",
                  "content": "<div><div> Nevada </div></div>"
                },
                {
                  "id": "R2",
                  "content": "<div><div> South Dakota </div></div>"
                },
                {
                  "id": "R3",
                  "content": "<div><div> California </div></div>"
                }
              ],
              "answers": [
                [
                  "L0",
                  "R0"
                ],
                [
                  "L1",
                  "R1"
                ],
                [
                  "L2",
                  "R2"
                ],
                [
                  "L3",
                  "R3"
                ],
                [
                  "L4",
                  "R0"
                ]
              ]
            },
            "answers": [
              [
                "L0 R0",
                "L1 R1",
                "L2 R2",
                "L3 R3",
                "L4 R0"
              ]
            ],
            "points": 1.0
          }
        ]
  }
}

The questions field contains a list of the extracted questions. Each question has the following schema:

Question Item Schema

Field Description
id
(string)
An id that uniquely identifies the question. Must be treated as a string, not an integer.
title
(string)
The question title is generally not presented to the student. It is used as a reference for the teacher.
category
(enum)
The type of question. The different categories are:
Enum Description
MCQ
Multiple choice question. Only one of the choices is correct. Includes True-False questions as well.
MRQ
Multiple response question. One or more of the choices may be correct.
OEQ
Open ended question. Other names include essay question or short answer question.
NRQ
Numerical response question. The answer can only be a numeric value and/or a numeric range.
CLOZE
Cloze questions are questions with inline interactions that are part of the question text. Those interactions are either fill-in-the-blank or select-drop-down fields. Sometimes called embedded-answers questions.
MATCH
Matching question. Consists of two sets of items, a left set and a right set. Learners are required to associate the right items with the left items.

The left items can be considered as sub-questions and learners must respond to all of them. Right items can be considered as choices and may be used once, multiple times or not at all.

Note: there is huge variance across learning platforms in how matching questions are rendered. Pick whatever rendering makes sense for your platform.
CONTENT
Content-only item. Does not contain any question and requires no response from the learner. Only the stimulus and prompt fields are used.
FILE
File upload question where the learner is expected to respond by uploading a file. Only the stimulus and prompt fields are used.
stimulus
(string)
XHTML content shared between different questions. The stimulus may be absent in some questions.

The content can be rendered directly in the browser as is (usually concatenated with the prompt).
prompt
(string)
XHTML content of the question task. The prompt may be absent in some questions.

The content can be rendered directly in the browser as is (usually concatenated with the stimulus).
choices
(List of Objects)
List of choices where every choice has an id and XHTML content.

The XHTML may be rendered in the browser as is.
Refer to the Choices section below for further help in processing choices.
interactions
(List of Objects)
List of cloze interactions. Every cloze interaction contains a placeholder text that can be found inside the question's body; the placeholder marks the position where the cloze interaction should be inserted.

You should not depend on the placeholder text to tell you the cloze interaction type. It simply serves as a location marker within the question body.

The cloze interaction will always contain the category attribute, which tells you the type of this cloze field, either text-entry or inline-choice. It may contain the answers attribute if answers are present. It may also contain a choices attribute if the interaction is an inline-choice interaction (i.e. a select drop-down).

If the category of the interaction is inline-choice, then it will contain a choices field. Here's an example:

"interactions": [
    {
        "placeholder": "%%%0_INLINE_CHOICE_ABCDEF%%%",
        "category": "inline-choice",
        "choices": [
            {"id": "a", "content": "California"},
            {"id": "b", "content": "Delaware"},
            {"id": "c", "content": "Texas"}
        ],
        "answers": ["a"]
    },
    ...
],



The interactions field will not be present if the question has no cloze interactions within its body.

Refer to the Cloze Interactions section below for further help in processing interactions.
matching_interaction
(JSON Object)
matching_interaction contains three attributes: left, right, and answers. Any of these attributes may be absent if it has no data or items.

The left and right attributes each contain a JSON array that holds the left and right set of items respectively for the matching question. They are structured like so:

"matching_interaction": {
    "left": [
        {"id": "1", "content": "<div>USA</div>"},
        {"id": "2", "content": "<div>France</div>"}
    ],
    "right": {
        {"id": "a", "content": "<div>Paris</div>"},
        {"id": "b", "content": "<div>London</div>"},
        {"id": "c", "content": "<div>Washington D.C.</div>"}
    ],
    ...
}

Each left and right item is an object whose id is the identifier and whose content is the XHTML content of the item.
(Note: it is possible for images to appear in either the left or right items.)

The left set of items are the sub-questions and the right set of items are the possible choices to be associated with the left items.

Learners must respond to all of the left items. Right items can be considered as choices and may be used once, multiple times or not at all.

The answers attribute contains a list of identifier pairs, each mapping a left item identifier to its correct right item identifier.

"matching_interaction": {
    ...
    "answers": [
        ["1", "c"],
        ["2", "a"]
    ]
}


matching_interaction will only be present in matching questions.
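
As an illustration, grading a learner's matching response amounts to comparing each chosen pair against the answers pairs. Below is a rough TypeScript sketch; the types, function name, and response shape are our own assumptions, not part of the API:

interface MatchItem {
    id: string;
    content: string;
}

interface MatchingInteraction {
    left?: MatchItem[];
    right?: MatchItem[];
    answers?: [string, string][];   // [leftId, rightId] pairs
}

// Grade a learner's matching response, given as a map of leftId -> chosen rightId.
// Returns the number of correctly matched left items.
function gradeMatching(
    interaction: MatchingInteraction,
    response: Record<string, string>
): number {
    const answerKey = new Map(interaction.answers ?? []);
    let correct = 0;
    for (const [leftId, chosenRightId] of Object.entries(response)) {
        if (answerKey.get(leftId) === chosenRightId) {
            correct += 1;
        }
    }
    return correct;
}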
answers
(array)
An array of arrays of answers (given as strings).

The inner array represents the list of possible correct answers for an interaction. The outer array represents the answers for each (cloze) interaction present in the question (a question may have multiple interactions).

For NRQ, answers will always contain either
  • a numeric tolerance/error ± (U+00B1) in the form X±Y, or
  • a numeric range (inclusive) ⟷ (U+27F7) in the form X⟷Y, or
  • a precision marker p (U+0070) in the form XpY.
You can always expect X and Y to be present in all of the above forms. In the case of XpY, Y will always be a positive integer and represents the precision required (i.e. the number of digits) for the answer to be graded correct when compared to X.

Finally, teachers overwhelmingly use X±Y over the other numeric answer forms. So, for simplicity, you may consider supporting only X±Y and ignoring the rest.

May not be present if no answers are detected in the file.
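
As an illustration, a numeric response could be checked against these forms roughly as follows. This TypeScript sketch is our own and only handles X±Y and X⟷Y (plus a plain-number fallback); the XpY precision form is omitted since, as noted above, X±Y is by far the most common:

// Check a learner's numeric response against a single answer string.
// Only the X±Y and X⟷Y forms are handled here.
function checkNumericAnswer(answer: string, response: number): boolean {
    if (answer.includes("\u00B1")) {                       // X±Y: value with tolerance
        const [x, y] = answer.split("\u00B1").map(Number);
        return Math.abs(response - x) <= y;
    }
    if (answer.includes("\u27F7")) {                       // X⟷Y: inclusive range
        const [x, y] = answer.split("\u27F7").map(Number);
        return response >= Math.min(x, y) && response <= Math.max(x, y);
    }
    return Number(answer) === response;                    // fallback: plain numeric value
}

// A response is correct if it satisfies any of the accepted forms in the
// inner answers array, e.g. ["3.141±0.0005", "3.10⟷3.14"].
function gradeNumericResponse(acceptedAnswers: string[], response: number): boolean {
    return acceptedAnswers.some((answer) => checkNumericAnswer(answer, response));
}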
metadata
(JSON object)
The keys are metadata tags found within the question text. Below is a non-exhaustive list of the metadata tags we accept.
Difficulty, Category, Objective, Topic, Feedback, Explanation, Bloom, aacsb, standard, Subject, Page Ref, sta, complexity
The values associated with each key are plaintext strings, not XHTML.
(e.g. {"Feedback": "Choice A is incorrect due to XYZ. Choice B is correct."})
Note: fields will be absent if there is no corresponding data.
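
To summarise the schema above, a question item can be modelled roughly as the following TypeScript types. This is our own non-authoritative reading of the fields described in this section, so adapt it to however your platform stores questions:

type QuestionCategory = "MCQ" | "MRQ" | "OEQ" | "NRQ" | "CLOZE" | "MATCH" | "CONTENT" | "FILE";

interface Choice {
    id: string;
    content: string;       // XHTML
}

interface ClozeInteraction {
    placeholder: string;
    category: "text-entry" | "inline-choice";
    choices?: Choice[];    // only present for inline-choice interactions
    answers?: string[];    // may be absent if no answers were detected
}

interface MatchingInteraction {
    left?: Choice[];
    right?: Choice[];
    answers?: [string, string][];
}

// Optional fields are absent when there is no corresponding data.
interface QuestionItem {
    id: string;                                  // treat as a string, never an integer
    title: string;
    category: QuestionCategory;
    stimulus?: string;                           // shared XHTML content
    prompt?: string;                             // XHTML content of the question task
    choices?: Choice[];                          // MCQ/MRQ
    interactions?: ClozeInteraction[];           // CLOZE
    matching_interaction?: MatchingInteraction;  // MATCH
    answers?: string[][];
    metadata?: Record<string, string>;
    points?: number;
}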

Assessments and Sections Schema

Where available, our Import API will also return information on the assessments and sections inside the uploaded file. It will look something like this:


{
    "status": "success",
    "message": "Your file has been successfully converted.",
    "data": {
        "filename": "canvas-export-module-1-quizzes.zip",
        "num_of_pages": 25.0,
        "questions": [...],
        "assessments": [
            {
            "id": "12",
            "title": "Semester 2 mid term test",
            "section_ids": ["330", "331", "332"]
            },
            {
            "id": "13",
            "title": "Semester 2 Exam",
            "section_ids": ["333", "334", "335"]
            }
        ],
        "sections": [
            {
            "id": "330",
            "title": "Semester 2 mid term test",
            "category": "ORD",
            "question_ids": ["330", "331", "332"]
            },
            {
            "id": "331",
            "title": "Semester 2 Exam",
            "category": "RAND",
            "select": 1,
            "question_ids": ["333", "334", "335"]
            },
            ...
        ]
    }
}

Assessment Schema

Field Description
id
(string)
An id that uniquely identifies the assessment. Must be treated as a string, not an integer.
title
(string)
The name of the assessment, usually visible to both learners and instructors.
section_ids
(array)
Array of strings containing the ids of the sections that the assessment is composed of. A section is a group of questions.

Section Schema

A section is a group of questions. Questions may be selected from it for an assessment depending on the category of the section.
Field Description
id
(string)
An id that uniquely identifies the section. Must be treated as a string, not an integer.
title
(string)
The name of the section, usually NOT visible to the learners, only visible to the instructors.
NOTE: this may not always be present
category
(enum)
The type of section. The different categories are:
Enum Description
ORD
Ordered section. All questions are to be presented in order.
RAND
Random section. N questions are randomly selected from this section, where N is given by the select field.
select
(integer)
The number of questions to be randomly selected from this section.

This field is only present in a RAND section.
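
To illustrate how assessments and sections fit together, the sketch below assembles the question list for one assessment: ORD sections contribute all of their questions in order, while RAND sections contribute select randomly chosen questions. The types, helper name, and naive shuffle are our own illustration, not part of the API:

interface Assessment {
    id: string;
    title: string;
    section_ids: string[];
}

interface Section {
    id: string;
    title?: string;
    category: "ORD" | "RAND";
    select?: number;          // only present for RAND sections
    question_ids: string[];
}

// Build the list of question ids a learner would see for one assessment.
function questionIdsForAssessment(
    assessment: Assessment,
    sectionsById: Map<string, Section>
): string[] {
    const result: string[] = [];
    for (const sectionId of assessment.section_ids) {
        const section = sectionsById.get(sectionId);
        if (!section) continue;
        if (section.category === "ORD") {
            // Ordered section: present every question in order.
            result.push(...section.question_ids);
        } else {
            // Random section: pick `select` questions (naive shuffle, for illustration only).
            const shuffled = [...section.question_ids].sort(() => Math.random() - 0.5);
            result.push(...shuffled.slice(0, section.select ?? shuffled.length));
        }
    }
    return result;
}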
Processing Response Fields
In this section we provide some examples to illustrate how the different fields may be processed before they are rendered in the browser.


Stimulus and Prompt
Stimulus and prompt form the body of the question. As they are provided as XHTML, you can simply concatenate the stimulus and prompt in that order and render the result directly in the browser.
Using the test endpoint's response for question 5a, the result should look like this:

<div>
    <div>
    5. A student sets up a circuit using a battery made of four cells, a resistor R, two identical lamps P and Q,
        and a switch. The circuit is shown in Fig. 6.1.
    </div>
    <div>
        <img src="https://digitaliserstorage.blob.core.windows.net/media/image/yhaojin/S8DK3WWI_uahPtvwErqHg0IQNwReTK5buPHFK9NmR78yPgprQ13l99AaR0XskUZgIJaNvRD0F.jpg"/>
    </div>
</div>
<div>
    Explain why in this circuit the current lamp Q is larger than the current in lamp P.
</div>
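
A minimal TypeScript sketch of this concatenation step might look like the following (the container element id is an illustrative placeholder):

// Concatenate the stimulus (if any) with the prompt and use the result
// as the XHTML body of the question.
function renderQuestionBody(question: { stimulus?: string; prompt?: string }): string {
    return (question.stimulus ?? "") + (question.prompt ?? "");
}

// Example usage in the browser ("question-body" is an illustrative element id):
// document.getElementById("question-body")!.innerHTML = renderQuestionBody(question);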



Choices
Choices are a series of XHTML strings. You may want to render them either as radio button inputs (MCQ) or checkbox inputs (MRQ). Using the test endpoint's response for question 2, the choices may be rendered in the browser like this:

<input type="radio" name="choices-selection" id="a">
<label for="a">
    <div>Toyota RAV4</div>
</label>
<input type="radio" name="choices-selection" id="b">
<label for="b">
    <div>Ford F-Series</div>
</label>
<input type="radio" name="choices-selection" id="c">
<label for="c">
    <div>Tesla Model 3</div>
</label>
<input type="radio" name="choices-selection" id="d">
<label for="d">
    <div>Honda Civic</div>
</label>
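
The same markup can be generated from the choices array by switching between radio and checkbox inputs based on the question category. Below is a rough TypeScript sketch under our own assumptions (one input group per question, no answer shuffling):

interface Choice {
    id: string;
    content: string;       // XHTML
}

// Render the choices of an MCQ (single answer, radio buttons) or
// MRQ (multiple answers, checkboxes) question as an HTML string.
function renderChoices(category: "MCQ" | "MRQ", choices: Choice[]): string {
    const inputType = category === "MCQ" ? "radio" : "checkbox";
    return choices
        .map(
            (choice) =>
                `<input type="${inputType}" name="choices-selection" id="${choice.id}">` +
                `<label for="${choice.id}">${choice.content}</label>`
        )
        .join("\n");
}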



Cloze Interactions
When a question has cloze interactions, the placeholder text inside the question prompt needs to be replaced with the appropriate input elements before the prompt is rendered in the browser.

Cloze interactions that are text-entry are fill-in-the-blank response fields, and interactions that are inline-choice are select-drop-down fields. Below is an example of how the placeholder text can be replaced with the relevant HTML tags to collect the learner's response:

<div>
    <span>
        The Grand Canyon is a steep-sided canyon carved by the Colorado River in
        <select name="1_INTERACTION_INLINE_CHOICE">
            <option value="a">Arizona</option>
            <option value="b">Utah</option>
            <option value="c">California</option>
            <option value="d">Nevada</option>
        </select>.
    </span>
    <span>
        The canyon is a result of <input name="1_INTERACTION_TEXT_ENTRY"> which
        exposes one of the most complete geologic columns on the planet.
    </span>
</div>
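
Since each interaction's placeholder appears verbatim in the prompt, the replacement itself can be done with simple string substitution. Below is a rough TypeScript sketch; the generated name attributes follow the convention used in the example above and are our own choice, not something the API prescribes:

interface Choice {
    id: string;
    content: string;
}

interface ClozeInteraction {
    placeholder: string;
    category: "text-entry" | "inline-choice";
    choices?: Choice[];
}

// Replace every placeholder in the prompt with an <input> (text-entry)
// or a <select> (inline-choice) so the learner can respond inline.
function renderClozePrompt(prompt: string, interactions: ClozeInteraction[]): string {
    let html = prompt;
    interactions.forEach((interaction, index) => {
        let widget: string;
        if (interaction.category === "inline-choice") {
            const options = (interaction.choices ?? [])
                .map((choice) => `<option value="${choice.id}">${choice.content}</option>`)
                .join("");
            widget = `<select name="${index}_INTERACTION_INLINE_CHOICE">${options}</select>`;
        } else {
            widget = `<input name="${index}_INTERACTION_TEXT_ENTRY">`;
        }
        html = html.replace(interaction.placeholder, widget);
    });
    return html;
}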

Check out our API endpoints
Now that you know how to process the converted job’s data, why not try converting some files using our API endpoints? Head on over to our API endpoints page and start using them!
Need some help?
Coding is hard. If you have problems rendering the converted data or with our endpoints in general, please do not hesitate to reach out and contact us.