Text Answer Question
The text answer question resource creates open-ended questions where users type free-form text responses. These questions allow for flexible answers and can test deeper understanding than multiple-choice options.
Use Cases
As a lab author, you use text answer questions to assess deeper understanding:
- Conceptual Understanding: Test ability to explain concepts, principles, or reasoning in the user’s own words
- Command Recall: Test knowledge of specific commands, syntax, or procedures that must be typed exactly
- Troubleshooting Skills: Assess ability to diagnose problems and articulate solution approaches
Text answer questions provide rich insight into user understanding beyond what multiple choice questions can reveal.
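For instance, a troubleshooting-style question might look like the sketch below; the resource name, question wording, and answer are illustrative, not part of any shipped lab:

```hcl
resource "text_answer_question" "diagnose_crashloop" {
  question = "A Pod repeatedly restarts with status CrashLoopBackOff. Which kubectl subcommand would you run first to read the container's output?"
  answer   = "logs"

  tags = ["kubernetes", "troubleshooting"]
}
```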
HCL Syntax
Basic Syntax

```hcl
resource "text_answer_question" "name" {
  question = "Explain the main benefit of using containers in software development."
  answer   = "portability"
}
```
Full Syntax
```hcl
resource "text_answer_question" "name" {
  question = "What is the primary purpose of a Dockerfile?"
  answer   = "define container image instructions"

  hints = [
    "Think about what Docker uses to build images",
    "Consider the relationship between Dockerfile and image creation"
  ]

  exact = false

  tags = ["docker", "containers", "dockerfile"]
}
```
Fields
Core Configuration

Field | Required | Type | Description |
---|---|---|---|
question | ✓ | string | The question text presented to users |
answer | ✓ | string | The primary acceptable answer |
exact | | bool | Whether answers must match exactly (no partial matching). Defaults to `false`. |
hints | | list(string) | Hints shown to users when hints are enabled. Defaults to an empty list. |
tags | | list(string) | Tags for categorizing and organizing questions. Defaults to an empty list. |
Default Values
The following defaults are applied automatically:

```hcl
resource "text_answer_question" "name" {
  exact = false
  hints = []
  tags  = []
}
```
Examples
Simple Definition Question

```hcl
resource "text_answer_question" "define_pod" {
  question = "What is a Pod in Kubernetes?"
  answer   = "smallest deployable unit"

  tags = ["kubernetes", "concepts", "pods"]
}
```
Command Usage Question
```hcl
resource "text_answer_question" "docker_command" {
  question = "What Docker command lists all running containers?"
  answer   = "docker ps"

  exact = true

  tags = ["docker", "commands"]
}
```
Usage in Quizzes
Text answer questions are referenced in quiz resources:

```hcl
resource "quiz" "conceptual_understanding" {
  questions = [
    resource.text_answer_question.define_pod,
    resource.text_answer_question.docker_command
  ]

  show_hints = true
  attempts   = 3
}
```
Best Practices
- Clear Instructions: Indicate expected answer length or format in the question
- Appropriate Matching: Use exact matching for commands and flexible matching for concepts
- Educational Focus: Use text questions to test understanding, not memorization
- Reasonable Expectations: Don’t expect users to guess exact wording
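The matching guidance above can be sketched as a pair of questions: one that demands an exact command and one that accepts flexible conceptual wording. The resource names and question text are hypothetical examples, not part of the resource specification:

```hcl
# Exact matching: a command must be typed precisely, so exact = true is appropriate.
resource "text_answer_question" "list_images" {
  question = "What Docker command lists images stored locally? (Type the exact command.)"
  answer   = "docker images"

  exact = true

  tags = ["docker", "commands"]
}

# Flexible matching: conceptual answers tolerate variation in wording,
# so the default exact = false avoids penalizing reasonable phrasings.
resource "text_answer_question" "namespace_purpose" {
  question = "In one phrase, what do Linux namespaces provide to containers?"
  answer   = "isolation"

  tags = ["linux", "concepts"]
}
```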