
Text Answer Question

The text answer question resource creates open-ended questions where users type free-form text responses. These questions allow for flexible answers and can test deeper understanding beyond multiple choice options.

As a lab author, you use text answer questions to assess deeper understanding:

  • Conceptual Understanding: Test ability to explain concepts, principles, or reasoning in the user’s own words
  • Command Recall: Test knowledge of specific commands, syntax, or procedures that must be typed exactly
  • Troubleshooting Skills: Assess ability to diagnose problems and articulate solution approaches

Text answer questions provide rich insight into user understanding beyond what multiple choice questions can reveal.

resource "text_answer_question" "name" {
  question = "Explain the main benefit of using containers in software development."
  answer   = "portability"
}

resource "text_answer_question" "name" {
  question = "What is the primary purpose of a Dockerfile?"
  answer   = "define container image instructions"
  hints = [
    "Think about what Docker uses to build images",
    "Consider the relationship between Dockerfile and image creation"
  ]
  exact = false
  tags  = ["docker", "containers", "dockerfile"]
}

text_answer_question

| Field    | Required | Type         | Description                                                                 |
|----------|----------|--------------|-----------------------------------------------------------------------------|
| question | Yes      | string       | The question text presented to users                                        |
| answer   | Yes      | string       | The primary acceptable answer                                               |
| exact    | No       | bool         | Whether answers must match exactly (no partial matching). Defaults to false. |
| hints    | No       | list(string) | Hints shown to users when hints are enabled. Defaults to an empty list.     |
| tags     | No       | list(string) | Tags for categorizing and organizing questions. Defaults to an empty list.  |
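Putting the fields together, a single resource can set all five attributes at once. The following sketch is illustrative; the question, answer, hints, and tag values are made up for this example:

```hcl
resource "text_answer_question" "name" {
  # Required fields
  question = "What Kubernetes command displays the nodes in a cluster?"
  answer   = "kubectl get nodes"

  # Optional fields (shown here with non-default values)
  exact = true
  hints = ["The command starts with kubectl"]
  tags  = ["kubernetes", "commands"]
}
```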

The following defaults are applied automatically:

resource "text_answer_question" "name" {
  exact = false
  hints = []
  tags  = []
}
resource "text_answer_question" "define_pod" {
  question = "What is a Pod in Kubernetes?"
  answer   = "smallest deployable unit"
  tags     = ["kubernetes", "concepts", "pods"]
}

resource "text_answer_question" "docker_command" {
  question = "What Docker command lists all running containers?"
  answer   = "docker ps"
  exact    = true
  tags     = ["docker", "commands"]
}

Text answer questions are referenced in quiz resources:

resource "quiz" "conceptual_understanding" {
  questions = [
    resource.text_answer_question.define_pod,
    resource.text_answer_question.docker_command
  ]
  show_hints = true
  attempts   = 3
}
When writing text answer questions, keep these practices in mind:

  1. Clear Instructions: Indicate the expected answer length or format in the question
  2. Appropriate Matching: Use exact matching for commands, flexible matching for concepts
  3. Educational Focus: Use text questions to test understanding, not memorization
  4. Reasonable Expectations: Don’t expect users to guess exact wording
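The matching guidance above can be illustrated with two contrasting questions: a command-recall question sets exact = true, while a conceptual question leaves exact at its default of false. The question wording, answers, and tags below are illustrative:

```hcl
# Command recall: the answer must be typed exactly, and the
# question states the expected format up front.
resource "text_answer_question" "list_images" {
  question = "What Docker command lists locally stored images? (Answer with the command only.)"
  answer   = "docker images"
  exact    = true
  tags     = ["docker", "commands"]
}

# Conceptual question: exact is left unset, so flexible matching applies.
resource "text_answer_question" "explain_volume" {
  question = "In one short phrase, what is a Docker volume used for?"
  answer   = "persistent storage"
  hints    = ["Think about what happens to container data when the container is removed"]
  tags     = ["docker", "storage"]
}
```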