GETMARKED JSON Quiz Format

Overview
Once you have uploaded a document for conversion and received the GETMARKED JSON quiz data, you will need to process the data into a form suitable for storing and rendering. This section will help you understand the structure of our JSON quiz format.

Note: Please ensure you have read our getting started guide before continuing!
Structure of JSON Quiz
Below are abridged JSON samples of the most common quiz types we support. The "..." in the JSON blocks stands for fields we are ignoring for now, as they are not relevant to understanding the structure of the format.
Open ended question

{
    "title": "...",
    "category": "OEQ",
    "stimulus": "...",
    "prompt": "<div>What is the color of the sky?</div>",
    "answers": [["Blue"]]
}

Multiple choice question

{
    "title": "...",
    "category": "MCQ",
    "stimulus": "...",
    "prompt": "<div>What is the color of the sky?</div>",
    "choices": {
        "a": "<div>Red</div>"
        "b": "<div>Blue</div>"
        },
    "answers": [["b"]]
}

Cloze question (fill-in-the-blanks)

{
    "title": "...",
    "category": "CLOZE",
    "stimulus": "...",
    "prompt": "<div>Roses are %%%0_INTERACTION_TEXT_ENTRY%%%. Violets are %%%1_INTERACTION_TEXT_ENTRY%%%.</div>",
    "interactions": {
        "%%%0_INTERACTION_TEXT_ENTRY%%%": {
            "category": "text-entry",
            "answers": ["red", "Red"]
        },
        "%%%1_INTERACTION_TEXT_ENTRY%%%": {
            "category": "text-entry",
            "answers": ["blue", "Blue"]
        }
    },
    "answers": [
        ["red", "Red"],
        ["blue", "Blue"]
    ]
}

Matching question

{
    "title": "...",
    "category": "MATCH",
    "stimulus": "...",
    "prompt": "<div>Match the following countries to their capital.</div>",
    "matching_interaction": {
        "left": {
            "1": "<div>USA</div>",
            "2": "<div>France</div>"
        },
        "right": {
            "a": "<div>Paris</div>",
            "b": "<div>London</div>",
            "c": "<div>Washington D.C.</div>"
        },
        "answers": {
            "1": "c",
            "2": "a"
        }
    },
    "answers": [
        ["1 c", "2 a"]
    ]
}
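Given the structure above, a learner's response to a matching question can be scored by comparing it against the matching_interaction answers map. A minimal sketch (the function name is ours, and the response shape — a dict of left id to chosen right id — is an assumption about how your platform collects responses):

```python
def grade_matching(matching_interaction, learner_response):
    """Score a learner's left-to-right associations against the answer key.

    learner_response maps each left item id to the right item id the
    learner chose, e.g. {"1": "c", "2": "b"}.
    Returns (number correct, number of left items).
    """
    key = matching_interaction.get("answers", {})
    correct = sum(
        1 for left_id, right_id in key.items()
        if learner_response.get(left_id) == right_id
    )
    return correct, len(key)
```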

JSON Quiz Schema
Below is a full sample of the JSON response that you might receive from our Quiz Import API.

{
    "status": "success",
    "message": "Your file has been successfully converted.",
    "data": {
        "filename": "getmarked-practice-0.docx",
        "num_of_pages": 3.0,
        "questions": [
            {
                "title": "Q6 Suddenly she came upon a little three-legged table, all made",
                "category": "CLOZE",
                "interactions": {
                    "%%%0_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%1_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%2_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%3_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%4_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%5_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%6_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%7_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    },
                    "%%%8_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                    }
                },
                "prompt": "<blockquote><div><div> <div> Suddenly she came upon a little three-legged table, 
                    all made of solid glass; there was (6) %%%0_INTERACTION_TEXT_ENTRY%%% on it except a tiny golden key, 
                    and Alice’s first (7) %%%1_INTERACTION_TEXT_ENTRY%%% was that it might belong to one of the doors of 
                    the hall; but, alas! either the locks were too large, or the key was too small, but at any rate it would 
                    not open any of them. (8) %%%2_INTERACTION_TEXT_ENTRY%%% , on the second time round, she came upon a low 
                    curtain she had not noticed before, and behind it was a little door about fifteen inches high: she tried 
                    the little golden key in the lock, and to her great delight it fitted! </div> <div> Alice 
                    (9) %%%3_INTERACTION_TEXT_ENTRY%%% the door and found that it led into a small passage, not much larger than 
                    a rat-hole: she knelt down and looked along the passage into the loveliest garden you ever saw. How she longed 
                    to get out of that dark hall, (10) %%%4_INTERACTION_TEXT_ENTRY%%% wander about among those beds of bright flowers 
                    and those cool fountains, but she (11) %%%5_INTERACTION_TEXT_ENTRY%%% not even get her head through the doorway; 
                    “and even if my head would go through,” thought poor Alice, “it would be of very little use without my shoulders. 
                    Oh, how I wish I could shut up like a telescope! I think I could, if I only (12) %%%6_INTERACTION_TEXT_ENTRY%%% 
                    how to begin.” For, you see, so many out-of-the-way things had happened lately, that Alice had begun to think that 
                    very few things indeed were really impossible. </div> <div> (13) %%%7_INTERACTION_TEXT_ENTRY%%% seemed to be no 
                    use in waiting by the little door, so she went back to the table, half hoping she might find another key on it, or 
                    at any rate a book of rules for shutting people up like telescopes: this time she found a little bottle on it, 
                    (“which certainly was not here before,” said Alice,) and round the neck of the bottle was a paper label, 
                    (14) %%%8_INTERACTION_TEXT_ENTRY%%% the words “DRINK ME,” beautifully printed on it in large letters. </div> 
                    </div></div></blockquote>"
            },
            {
                "title": "Q1 Use the following table to ans...Which is the most expensive car in the list?",
                "category": "MCQ",
                "answers": [
                    [
                        "c"
                    ]
                ],
                "stimulus": "<div> <div> Use the following table to answer the next 2 questions: </div> <table 
                    style=\"border: 1px solid #eeeeee;\"> <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 
                    Brand of Car </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Make of Car </div> 
                    </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Cost ($) </div> </td> </tr> 
                    <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Toyota </div> </td> <td style=\"padding: 5px; 
                    border: 1px solid #eeeeee;\"> <div> RAV4 </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 
                    35,000 </div> </td> </tr> <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Ford </div> </td> 
                    <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> F-Series </div> </td> <td style=\"padding: 5px; 
                    border: 1px solid #eeeeee;\"> <div> 38,000 </div> </td> </tr> <tr> <td style=\"padding: 5px; border: 1px solid 
                    #eeeeee;\"> <div> Tesla </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Model 3 </div> 
                    </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 80,000 </div> </td> </tr> <tr> <td 
                    style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Honda </div> </td> <td style=\"padding: 5px; border: 1px solid 
                    #eeeeee;\"> <div> Civic </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 27,600 </div> 
                    </td> </tr> </table> <div>   </div> </div>",
                "prompt": "<div> <div> 1. Which is the most expensive car in the list? </div> </div>" ,
                "choices": {
                    "a": "<div> Toyota RAV4 </div>",
                    "b": "<div> Ford F-Series </div>",
                    "c": "<div> Tesla Model 3 </div>",
                    "d": "<div> Honda Civic </div>"
                }
            },
            {
                "title": "Q2 Use the following table to ans...Which is the cheapest car in the list?",
                "category": "MCQ",
                "answers": [
                    [
                        "d"
                    ]
                ],
                "stimulus": ""<div> <div> Use the following table to answer the next 2 questions: </div>
                    <table style=\"border: 1px solid #eeeeee;\"> <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 
                    Brand of Car </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Make of Car </div> </td> 
                    <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Cost ($) </div> </td> </tr> <tr> <td style=\"padding: 
                    5px; border: 1px solid #eeeeee;\"> <div> Toyota </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> 
                    <div> RAV4 </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 35,000 </div> </td> </tr> 
                    <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Ford </div> </td> <td style=\"padding: 5px; 
                    border: 1px solid #eeeeee;\"> <div> F-Series </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> 
                    <div> 38,000 </div> </td> </tr> <tr> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Tesla </div> 
                    </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Model 3 </div> </td> <td style=\"padding: 5px; 
                    border: 1px solid #eeeeee;\"> <div> 80,000 </div> </td> </tr> <tr> <td style=\"padding: 5px; border: 1px solid 
                    #eeeeee;\"> <div> Honda </div> </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> Civic </div> 
                    </td> <td style=\"padding: 5px; border: 1px solid #eeeeee;\"> <div> 27,600 </div> </td> </tr> </table> <div> 
                    </div> </div>",
                "prompt": "<div> <div> 2. Which is the cheapest car in the list? </div> </div>" ,
                "choices": {
                    "a": "<div> Toyota RAV4 </div>",
                    "b": "<div> Ford F-Series </div>",
                    "c": "<div> Tesla Model 3 </div>",
                    "d": "<div> Honda Civic </div>"
                }
            },
            {
                "title": "Q3 The diagram below shows a child’s swing. The swing is releas",
                "category": "MCQ",
                "answers": [
                    [
                        "c"
                    ]
                ],
                "prompt": "<div> <div> The diagram below shows a child’s swing. </div> <div> 
                    <img src=\"https://digitaliserstorage.blob.core.windows.net/media/image/yhaojin/U9DG7RP2_3nQubtm7ijXr1GxrYS07gVuhuL
                    kjHOM5vB5Efv7x9ssRYXuAH80zhKDdYg3AqryA.jpg\" /></div> <div> The swing is released from point X. </div> <div> 
                    Which movement takes one period of oscillation? </div></div>",
                "choices": {
                    "a": "<div> From X to Y </div>",
                    "b": "<div> From X to Z </div>",
                    "c": "<div> From X to Z and back to X </div>",
                    "d": "<div> From X to Z and back to Y </div>"
                }
            },
            {
                "title": "Q4 ∫ x + sin ⁡ x 1 + cos ⁡ x d x is equal to",
                "category": "MCQ",
                "answers": [
                    [
                        "b"
                    ]
                ],
                "prompt": "<div> <div> <math> <mrow> <mo stretchy=\"false\">∫ </mo> <mrow> <mfrac> 
                    <mrow> <mi> x </mi> <mo> + </mo> <mi> </mi> <mrow> <mrow> <mi mathvariant=\"normal\">sin </mi> </mrow>
                    <mo> ⁡ </mo> <mrow> <mi> x </mi> </mrow> </mrow> </mrow> <mrow> <mn> 1 </mn> <mo> + </mo> <mi> 
                    </mi> <mrow> <mrow> <mi mathvariant=\"normal\">cos </mi> </mrow> <mo> ⁡ </mo> <mrow> <mi> x </mi> </mrow>
                    </mrow> </mrow> </mfrac> <mi> </mi> <mi> d </mi> <mi> x </mi> </mrow> </mrow> </math> is equal to 
                    </div> </div>",
                "choices": {
                    "a": "<div> <math> <mrow> <mrow> <mi mathvariant=\"normal\">log </mi> </mrow> <mo> ⁡ </mo> 
                        <mrow> <mfenced close=\"|\" open=\"|\" separators=\"|\"> <mrow> <mn> 1 </mn> <mo> + </mo> <mi> </mi> 
                        <mrow> <mrow> <mi mathvariant=\"normal\">cos </mi> </mrow> <mo> ⁡ </mo> <mrow> <mi> x </mi> 
                        </mrow> </mrow> </mrow> </mfenced> </mrow> </mrow> <mo> + </mo> <mi> c </mi> </math> </div>",
                    "b": "<div> <math> <mrow> <mrow> <mi mathvariant=\"normal\">log </mi> </mrow> <mo> ⁡ </mo> 
                        <mrow> <mfenced close=\"|\" open=\"|\" separators=\"|\"> <mrow> <mn> 1 </mn> <mo> + </mo> <mi> </mi> 
                        <mrow> <mrow> <mi mathvariant=\"normal\">sin </mi> </mrow> <mo> ⁡ </mo> <mrow> <mi> x </mi> 
                        </mrow> </mrow> </mrow> </mfenced> </mrow> </mrow> <mo> + </mo> <mi> c </mi> </math> </div>",
                    "c": "<div> <math> <mi> x </mi> <mo> - </mo> <mi> </mi> <mrow> <mrow> 
                        <mi mathvariant=\"normal\">tan </mi> </mrow> <mo> ⁡ </mo> <mrow> <mi> x </mi> </mrow> </mrow> <mo> + 
                        </mo> <mi> c </mi> </math> </div>",
                    "d": "<div> <math> <mi> x </mi> <mi> </mi> <mrow> <mrow> 
                        <mi mathvariant=\"normal\">tan </mi> </mrow> <mo> ⁡ </mo> <mrow> <mfrac> <mrow> <mi> x </mi> </mrow> 
                        <mrow> <mn> 2 </mn> </mrow> </mfrac> </mrow> </mrow> </math> </div>"
                }
            },
            {
                "title": "Q5(a) student sets up a circuit usin...Explain why in this circuit the current lamp Q is larger tha",
                "category": "OEQ",
                "answers": [
                    [
                        "Lamp P is connected in parallel with resistor R hence it has lower current compare to Q."
                    ]
                ],
                "stimulus": "<div> <div> 5. A student sets up a circuit using a battery made of four cells, a 
                    resistor R, two identical lamps P and Q, and a switch. The circuit is shown in Fig. 6.1. </div> <div> 
                    <img src=\"https://digitaliserstorage.blob.core.windows.net/media/image/yhaojin/S8DK3WWI_uahPtvwErqHg0IQN 
                    wReTK5buPHFK9NmR78yPgprQ13l99AaR0XskUZgIJaNvRD0F.jpg\" /></div> <div>   </div> </div>",
                "prompt": "<div> Explain why in this circuit the current lamp Q is larger than the current in lamp P. </div>"
            },
            {
                "title": "Q5(b) student sets up a circuit usin...Explain why in this circuit lamp Q has different resistance ",
                "category": "OEQ",
                "answers": [
                    [
                        "As V=IR, given that P has less effective resistance due to being connect in parallel with R."
                    ]
                ],
                "stimulus": "<div> <div> 5. A student sets up a circuit using a battery made of four cells, a 
                    resistor R, two identical lamps P and Q, and a switch. The circuit is shown in Fig. 6.1. </div> <div> 
                    <img src=\"https://digitaliserstorage.blob.core.windows.net/media/image/yhaojin/W40PXKB6_ad2DdYorsTR4wkLm
                    JhQ0cH1tszqkLYtFj8MnGT65YzFOmF9I1ta7gJV7jBjEhUgx.jpg\" /></div> <div>   </div> </div>",
                "prompt": "<div> Explain why in this circuit lamp Q has different resistance from lamp P even though 
                    they are identical lamps. </div>"
            },
            {
                "title": "Q15 The Grand Canyon is a steep-sided canyon carved by the Color",
                "category": "CLOZE",
                "answers": [
                    [
                        "a"
                    ],
                    [
                        "erosion", "weathering"
                    ]
                ],
                "interactions": {
                    "%%%0_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "a"
                        ],
                        "choices": {
                            "a": "Arizona",
                            "b": "Utah",
                            "c": "California",
                            "d": "Nevada"
                        }
                    },
                    "%%%1_INTERACTION_TEXT_ENTRY%%%": {
                        "category": "text-entry",
                        "answers": [
                            "erosion", "weathering"
                        ]
                    }
                },
                "prompt": "<blockquote><div><div> <div> The Grand Canyon is a steep-sided canyon carved
                    by the Colorado River in %%%0_INTERACTION_INLINE_CHOICE%%%. The canyon is a result of %%%1_INTERACTION_TEXT_ENTRY%%% which exposes
                    one of the most complete geologic columns on the planet. </div> </div></div></blockquote>"
            },
            {
                "title": "Q16 The United States has 52 states.",
                "category": "MCQ",
                "answers": [
                    [
                        "b"
                    ]
                ],
                "prompt": "<div> <div> The United States has 52 states. </div> </div>" ,
                "choices": {
                    "a": "<div> true </div>",
                    "b": "<div> false </div>"
                }
            },
            {
                "title": "Q17 Neil Armstrong was the first person to walk on the moon.",
                "category": "MCQ",
                "answers": [
                    [
                        "a"
                    ]
                ],
                "prompt": "<div> <div> Neil Armstrong was the first person to walk on the moon. </div> </div>" ,
                "choices": {
                    "a": "<div> true </div>",
                    "b": "<div> false </div>"
                }
            },
            {
                "title": "Q18 Match the following landmarks with the states they are in: S",
                "category": "CLOZE",
                "answers": [
                    [
                        "a"
                    ],
                    [
                        "b"
                    ],
                    [
                        "c"
                    ],
                    [
                        "d"
                    ],
                    [
                        "a"
                    ]
                ],
                "interactions": {
                    "%%%0_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "a"
                        ],
                        "choices": {
                            "a": "New York",
                            "b": "Nevada",
                            "c": "South Dakota",
                            "d": "California"
                        }
                    },
                    "%%%1_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "b"
                        ],
                        "choices": {
                            "a": "New York",
                            "b": "Nevada",
                            "c": "South Dakota",
                            "d": "California"
                        }
                    },
                    "%%%2_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "c"
                        ],
                        "choices": {
                            "a": "New York",
                            "b": "Nevada",
                            "c": "South Dakota",
                            "d": "California"
                        }
                    },
                    "%%%3_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "d"
                        ],
                        "choices": {
                            "a": "New York",
                            "b": "Nevada",
                            "c": "South Dakota",
                            "d": "California"
                        }
                    },
                    "%%%4_INTERACTION_INLINE_CHOICE%%%": {
                        "category": "inline-choice",
                        "answers": [
                            "a"
                        ],
                        "choices": {
                            "a": "New York",
                            "b": "Nevada",
                            "c": "South Dakota",
                            "d": "California"
                        }
                    }
                },
                "prompt": "<blockquote><div><div> <div> Match the following landmarks with the states 
                    they are in: </div> <div> Statue of Liberty %%%0_INTERACTION_INLINE_CHOICE%%% </div> <div> Grand Canyon 
                    %%%1_INTERACTION_INLINE_CHOICE%%% </div> <div> Mount Rushmore %%%2_INTERACTION_INLINE_CHOICE%%% </div> <div>
                    Golden Gate Bridge %%%3_INTERACTION_INLINE_CHOICE%%% </div> <div> Empire State Building %%%4_INTERACTION_INLINE_CHOICE%%% 
                    </div> </div></div></blockquote>"
            }
        ]
    }
}

The questions field contains a list of the extracted questions; each question has the following schema:

Question Item Schema

Field Description
id
(string)
An id to uniquely identify the question. Must be treated as a string, not an integer.
title
(string)
The question title is generally not presented to the student; it is used as a reference for the teacher.
category
(enum)
The type of question. The different categories are:
Enum Description
MCQ
Multiple choice question. Only one of the choices is correct. Includes true-false questions as well.
MRQ
Multiple response question. One or more of the choices may be correct.
OEQ
Open ended question. Other names include essay question or short answer question.
CLOZE
Cloze questions are questions with inline interactions embedded in the question text. These interactions are either fill-in-the-blank or select-drop-down fields. Sometimes called embedded-answers questions.
MATCH
Matching question. Consists of two sets of items, a left set and a right set. Learners are required to associate the right items with the left items.

The left items can be considered sub-questions and learners must respond to all of them. The right items can be considered choices and may be used once, multiple times, or not at all.

Note: there is huge variance between learning platforms in how matching questions are rendered. Pick whatever makes sense for your platform.
CONTENT
Content-only item. Does not contain any question and requires no response from the learner. Only stimulus and prompt are used.
FILE
File upload question where the learner responds by uploading a file. Only stimulus and prompt are used.
stimulus
(string)
Shared XHTML content between different questions. Stimulus may be absent in some questions.

Content can be dumped directly into the browser to be rendered (usually concatenated with prompt).
prompt
(string)
XHTML content of question task. Prompt may be absent in some questions.

Content can be dumped directly into the browser to be rendered (usually concatenated with stimulus).
choices
(JSON object)
The key is the choice identifier and the value is the corresponding XHTML content of the choice.

The XHTML may be rendered in the browser as is.
Refer to choices section for further help in processing choices.
interactions
(JSON object)
The keys are placeholder texts that can be found inside the question's body. Each placeholder text marks the position where the cloze interaction should be inserted.

You should not depend on the placeholder text to tell you the cloze interaction type. It simply serves as a location marker within the question body.

The values are JSON objects that provide more information about the cloze interaction. Each value will always contain the category attribute, which tells you the type of the cloze field: either text-entry or inline-choice. It may contain the answers attribute if answers are present. It may also contain a choices attribute if the interaction is an inline-choice interaction (i.e. a select drop-down).

If the category of the interaction is inline-choice, it will contain a choices field. Here's an example:

"interactions": {
    "%%%0_INTERACTION_INLINE_CHOICE%%%": {
        "category": "inline-choice",
        "choices": {
            "a": "California",
            "b": "Delaware",
            "c": "Texas",
        },
        "answers": ["a"]
    },
    ...
},



The interactions field will not be present if the question has no cloze interactions within its body.

Refer to interactions section for further help in processing interactions.
matching_interaction
(JSON Object)
matching_interaction contains three attributes: left, right, and answers. Any of these attributes may be absent if it has no data or items.

The left and right attributes each contain a JSON object holding the left and right set of items respectively for the matching question. They are structured like so:

"matching_interaction": {
    "left": {
        "1": "<div>USA</div>",
        "2": "<div>France</div>"
    },
    "right": {
        "a": "<div>Paris</div>",
        "b": "<div>London</div>",
        "c": "<div>Washington D.C.</div>"
    },
    ...
}

The left and right items consist of key-value pairs where the key is the identifier and the value is the XHTML content of the item.
(Note: it is possible for images to appear in either left or right items)

The left set of items are the sub-questions and the right set of items are the possible choices to be associated with the left items.

Learners must respond to all of the left items. Right items can be considered as choices and may be used once, multiple times or not at all.

The answers attribute contains a JSON object that maps each left item identifier to the correct right item identifier.

"matching_interaction": {
    ...
    "answers": {
        "1": "c",
        "2": "a"
    }
}


matching_interaction will only be present in matching questions.
answers
(array)
An array of arrays of answers (given as strings).

Each inner array lists the possible correct answers for one interaction. The outer array holds the answers for each (cloze) interaction present in the question (a question may have multiple interactions).

Will not be present if no answers are detected in the file.
metadata
(JSON object)
The keys are metadata tags found within the question text. Below is a non-exhaustive list of metadata we accept.
Difficulty, Category, Objective, Topic, Feedback, Explanation, Bloom, aacsb, standard, Subject, Page Ref, sta, complexity
The values associated with each key are plaintext string, not XHTML.
(e.g. {"Feedback": "Choice A is incorrect due to XYZ. Choice B is correct.})
Note: fields will be absent if there is no corresponding data.
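Given the shape of the answers field (outer array: one entry per interaction; inner array: every accepted string), a simple exact-match grader can be sketched like so. This is illustrative only; real graders often normalise case and whitespace first:

```python
def grade_answers(answers, learner_responses):
    """Return one boolean per interaction.

    answers is the question's "answers" field: answers[i] lists every
    accepted string for interaction i. learner_responses holds the
    learner's string for each interaction, in the same order.
    """
    return [
        response in accepted
        for response, accepted in zip(learner_responses, answers)
    ]
```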

Assessments and Sections Schema

Where available, our Import API will also return information on assessments and sections inside the uploaded file. It will look something like this:


{
    "status": "success",
    "message": "Your file has been successfully converted.",
    "data": {
        "filename": "canvas-export-module-1-quizzes.zip",
        "num_of_pages": 25.0,
        "questions": [...],
        "assessments": [
            {
            "id": "12",
            "title": "Semester 2 mid term test",
            "section_ids": ["330", "331", "332"]
            },
            {
            "id": "13",
            "title": "Semester 2 Exam",
            "section_ids": ["333", "334", "335"]
            }
        ],
        "sections": [
            {
            "id": "330",
            "title": "Semester 2 mid term test",
            "category": "ORD",
            "question_ids": ["330", "331", "332"]
            },
            {
            "id": "331",
            "title": "Semester 2 Exam",
            "category": "RAND",
            "select": 1,
            "question_ids": ["333", "334", "335"]
            },
            ...
        ]
    }
}
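Since assessments reference sections by id, and sections reference questions by id, resolving an assessment to its questions is a two-step lookup. A sketch, assuming the response has already been parsed into a Python dict (the function name is ours):

```python
def questions_for_assessment(data, assessment_id):
    """Collect the question ids for one assessment by walking its sections.

    data is the "data" object of the API response, containing the
    "assessments" and "sections" arrays described below.
    """
    sections_by_id = {s["id"]: s for s in data.get("sections", [])}
    for assessment in data.get("assessments", []):
        if assessment["id"] == assessment_id:
            question_ids = []
            for section_id in assessment["section_ids"]:
                section = sections_by_id.get(section_id, {})
                question_ids.extend(section.get("question_ids", []))
            return question_ids
    return []  # unknown assessment id
```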

Assessment Schema

Field Description
id
(string)
An id to uniquely identify the assessment. Must be treated as a string, not an integer.
title
(string)
The name of the assessment, usually visible to both learners and instructors.
section_ids
(array)
Array of strings containing the ids of the sections that the assessment is composed of. A section is a group of questions.

Section Schema

A section is a group of questions. Questions may be selected from it for an assessment depending on the category of the section.
Field Description
id
(string)
An id to uniquely identify the section. Must be treated as a string, not an integer.
title
(string)
The name of the section, usually NOT visible to the learners, only visible to the instructors.
NOTE: this may not always be present
category
(enum)
The type of section. The different categories are:
Enum Description
ORD
Ordered section. All questions are to be presented in order.
RAND
Random section. N number of questions are randomly selected from this section.
select
(integer)
The number of questions to be randomly selected from this section.

This field is only present in a RAND section.
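Under these definitions, drawing the questions for one attempt from a section depends on its category: ORD keeps the original order, while RAND samples select questions at random. A sketch in Python (names are ours):

```python
import random

def pick_questions(section, rng=random):
    """Return the question ids one attempt should use for a section.

    ORD sections present every question in order; RAND sections present
    a random sample of section["select"] questions.
    """
    question_ids = section.get("question_ids", [])
    if section.get("category") == "RAND":
        return rng.sample(question_ids, section["select"])
    return list(question_ids)  # ORD: keep the original order
```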
Processing Response Fields
In this section we provide some examples to illustrate how different fields may be processed before they can be rendered in the browser.
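Since the processing steps differ by question type, a renderer typically dispatches on the category field first. A minimal sketch (the handler names and bodies here are illustrative, not a prescribed implementation):

```python
def render_oeq(question):
    # Open-ended: only the prompt is shown; the response is free text.
    return question["prompt"]

def render_mcq(question):
    # Multiple choice: prompt followed by the choice bodies.
    return question["prompt"] + "".join(question["choices"].values())

HANDLERS = {
    "OEQ": render_oeq,
    "MCQ": render_mcq,
    # Handlers for "MRQ", "CLOZE", "MATCH", "CONTENT" and "FILE" go here too.
}

def render_question(question):
    handler = HANDLERS.get(question["category"])
    if handler is None:
        raise ValueError(f"unsupported category: {question['category']}")
    return handler(question)
```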


Stimulus and Prompt
Stimulus and prompt form the body of the question. As they are provided as XHTML, you can simply concatenate the stimulus and prompt, in that order, and render the result directly in the browser.
Using the response for question 5(a) from the test endpoint, the result should look like this:

<div>
    <div>
    5. A student sets up a circuit using a battery made of four cells, a resistor R, two identical lamps P and Q,
        and a switch. The circuit is shown in Fig. 6.1.
    </div>
    <div>
        <img src="https://digitaliserstorage.blob.core.windows.net/media/image/yhaojin/S8DK3WWI_uahPtvwErqHg0IQNwReTK5buPHFK9NmR78yPgprQ13l99AaR0XskUZgIJaNvRD0F.jpg"/>
    </div>
</div>
<div>
    Explain why in this circuit the current lamp Q is larger than the current in lamp P.
</div>
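This concatenation can be sketched in one line of Python; both fields are optional, so absent fields are treated as empty strings:

```python
def question_body(question):
    """Concatenate stimulus and prompt (both optional XHTML), in that order."""
    return question.get("stimulus", "") + question.get("prompt", "")
```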



Choices
Choices are a series of XHTML snippets. You may want to render them either as radio button inputs (MCQ) or checkbox inputs (MRQ). Using the response for question 2 from the test endpoint, choices rendered in the browser may look like this:

<input type="radio" name="choices-selection" id="a">
<label for="a">
    <div>Toyota RAV4</div>
</label>
<input type="radio" name="choices-selection" id="b">
<label for="b">
    <div>Ford F-Series</div>
</label>
<input type="radio" name="choices-selection" id="c">
<label for="c">
    <div>Tesla Model 3</div>
</label>
<input type="radio" name="choices-selection" id="d">
<label for="d">
    <div>Honda Civic</div>
</label>
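Generating that markup from the choices object can be sketched as follows. This is a simplified example: in a real page you would namespace the name and id attributes per question, since choice keys like "a" repeat across questions:

```python
def render_choices(choices, multiple=False):
    """Render the choices object as radio (MCQ) or checkbox (MRQ) inputs.

    choices maps a choice identifier to its XHTML body, e.g.
    {"a": "<div>Toyota RAV4</div>", ...}.
    """
    input_type = "checkbox" if multiple else "radio"
    parts = []
    for key, xhtml in choices.items():
        parts.append(
            f'<input type="{input_type}" name="choices-selection" id="{key}">'
            f'<label for="{key}">{xhtml}</label>'
        )
    return "".join(parts)
```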



Cloze Interactions
When a question has cloze interactions, the placeholder texts inside the question prompt need to be replaced with form fields before the prompt is rendered in the browser.

Interactions that contain the substring INTERACTION_TEXT_ENTRY are fill-in-the-blank response fields, and interactions with the substring INTERACTION_INLINE_CHOICE are select-drop-down fields. Using question 15 from the test endpoint, you can replace each INTERACTION_TEXT_ENTRY placeholder with an <input> tag and each INTERACTION_INLINE_CHOICE placeholder with a <select> tag to create those fields like so:

<div>
    <span>
        The Grand Canyon is a steep-sided canyon carved by the Colorado River in
        <select name="1_INTERACTION_INLINE_CHOICE">
            <option value="a">Arizona</option>
            <option value="b">Utah</option>
            <option value="c">California</option>
            <option value="d">Nevada</option>
        </select>.
    </span>
    <span>
        The canyon is a result of <input name="1_INTERACTION_TEXT_ENTRY"> which
        exposes one of the most complete geologic columns on the planet.
    </span>
</div>
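A straightforward way to perform this substitution is to walk the interactions object and replace each placeholder key inside the prompt, switching on the interaction's category. A sketch (no HTML escaping is shown; the choice bodies are already XHTML):

```python
def render_cloze(question):
    """Replace each interaction placeholder in the prompt with a form field."""
    html = question["prompt"]
    for placeholder, info in question.get("interactions", {}).items():
        # Strip the surrounding %%% markers to get a usable field name.
        name = placeholder.strip("%")
        if info["category"] == "inline-choice":
            options = "".join(
                f'<option value="{key}">{body}</option>'
                for key, body in info["choices"].items()
            )
            field = f'<select name="{name}">{options}</select>'
        else:  # text-entry
            field = f'<input name="{name}">'
        html = html.replace(placeholder, field)
    return html
```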

Check out our API endpoints
Now that you know how to process the converted job’s data, why not try converting some files using our API endpoints? Head on over to our API endpoints page and start using them!
Need some help?
Coding is hard. If you have problems rendering the converted data or with our endpoints in general, please do not hesitate to reach out and contact us.