Retrieving submissions from the FreeTextResponse XBlock

The FreeTextResponse XBlock is a useful tool for quick, automated grading of free-text answers, as well as for self-reflection exercises. But it's not immediately clear how to retrieve what your learners have submitted through this XBlock. In this article, we'll show you exactly how to do so.

Note: To follow these steps, you need to have first enabled and added the FreeTextResponse XBlock to your course. Tahoe customers can find instructions in our article listing the XBlocks available on Tahoe. Enterprise customers will need to contact support to ensure the XBlock has been enabled on their site before it can be added to a course.

Step 1: Find the location ID of your component

First, navigate to the location of the XBlock within your course in the LMS and select Staff Debug Info underneath where it appears.

In the popup that appears, copy the text that appears after "location=". This is the component's location ID.

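To give you a sense of what to expect, a location ID is an Open edX "usage key" string. The organization, course name, run, and block hash below are purely hypothetical placeholders, and the `freetextresponse` block type tag is an assumption; the real value is whatever you copied from Staff Debug Info. A quick sketch of what the string looks like and how to sanity-check it before pasting:

```python
# Hypothetical example of a location ID; yours will come from Staff Debug Info.
# Every part after "block-v1:" (org, course, run, block hash) is a placeholder.
location_id = (
    "block-v1:MyOrg+MyCourse+2024+type@freetextresponse+block@0123456789abcdef"
)

# Sanity checks: a usage key starts with "block-v1:" and contains
# "+type@" and "+block@" separators.
assert location_id.startswith("block-v1:")
assert "+type@" in location_id and "+block@" in location_id

# Split out the readable components.
course_part, _, rest = location_id.partition("+type@")
block_type, _, block_id = rest.partition("+block@")
print(block_type)  # freetextresponse
```

If your copied string fails these checks, you most likely copied too little or too much of the text after "location=".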
Step 2: Download the submission data

Now that we have the location ID (the unique identifier for exactly where this component sits within the course), we can use the Instructor Dashboard to generate a report of responses to this question.

First, navigate to the Data Download tab of the Instructor Dashboard.

Next, scroll down to Reports. In this section is the line "To generate a CSV file that lists all student answers to a given problem, enter the location of the problem (from its Staff Debug Info)."

Paste the location ID into the box below this line, then click the Download a CSV of problem responses button beneath it.

Step 3: Download and analyze the report

Scroll down to the bottom of this page. After a short delay while the data is assembled, you should see a report appear whose title contains the words "student state from block-v1". This is your new report, so click the link to download it.

This is a CSV file, pairing each learner's username with a JSON blob of their state, so your data will be in the following format:

student_username,"{""student_answer"": ""Here is my answer text"", ""score"": 1.0, ""count_attempts"": 1}"
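If you want to analyze the submissions programmatically rather than in a spreadsheet, the JSON state column can be parsed with standard library tools. A minimal sketch, using the sample row above; note that real reports may include additional columns (such as the block location), so inspect the header row of your own CSV before relying on column positions:

```python
import csv
import io
import json

# Sample row in the format shown above: a username column followed by a
# JSON "state" column (with CSV-escaped double quotes).
sample = (
    'student_username,"{""student_answer"": ""Here is my answer text"", '
    '""score"": 1.0, ""count_attempts"": 1}"\n'
)

# csv handles the doubled quotes; json decodes the state column.
for username, state_json in csv.reader(io.StringIO(sample)):
    state = json.loads(state_json)
    print(f"{username}: {state['student_answer']} (score {state['score']})")
```

From here you could aggregate scores, count attempts, or export the answer text for qualitative review.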

Still need help? Contact Us