Add per-question LLM/Standard mode toggle for async peer instruction (discussion) #1173
sethbern wants to merge 18 commits into RunestoneInteractive:main
Conversation
I'm torn. I know @barbarer wants this, but very few people use async PI, and adding more complexity to this page is worrisome. I understand it is extra clicks to enable the async mode, but we cannot have everything on the front page. I'm thinking about it....

I think it might make more sense, and reduce interface clutter, to integrate the async option into the Visibility status dropdown rather than having a separate toggle. Instead of a standalone async toggle, what if the Visibility dropdown could include an option like "Visible + Async PI (after X time)" that enables async peer instruction when the visibility conditions are met?
What about adding a switch to the peer instruction dashboard, so that an instructor could make the asynchronous questions available at the end of the peer instruction session? |
components/rsptx/db/models.py
```python
activities_required = Column(
    Integer
)  # only reading assignments will have this populated
use_llm = Column(Web2PyBoolean, default=False)
```
Why would this be on a per question basis? Shouldn't this be on an assignment ?
The per-question setting is specifically for A/B testing in research studies, where we need to assign different conditions to different questions within the same assignment. For regular classroom use, instructors would just set it once; they have the ability to select the same option for all questions, similar to how selecting how items will be graded works.
Hey @bnmnetp, I removed the async toggle from the assignment builder and moved it to the PI instructor dashboard, so it now shows up as a "Release After-Class PI" button on the last question instead of Next Question. I also fixed a question counter bug I found where clicking Next Question on the last question would save an out-of-bounds index to the database, causing it to show "Question 3 of 2". Let me know what you think!
Thanks for the updates. I am still struggling to understand why each question would need an async toggle. We have select questions which are made for A/B testing, but maybe I don't understand what you are trying to accomplish, or whether they don't work in this context. What I don't want to do is needlessly complicate the interface for the 99% of instructors (which will cause great confusion for most of them) in order to support A/B testing by the 1%. So maybe we should have a Zoom call where we can talk through all of this so that I can fully understand. Tagging @barbarer, as I think she should be part of the conversation.
The unit test that is failing now is related to your changes to the model.
Pull request overview
Adds course- and question-level controls for async Peer Instruction discussion mode, enabling instructors to (a) toggle “after-class” release and (b) choose Standard vs LLM discussion per question (with API-key gating and student-facing messaging/fallbacks).
Changes:
- Adds a `use_llm` flag to assignment questions and wires it through validation/types.
- Introduces a course setting (`enable_async_llm_modes`) and an instructor API endpoint to report API-key + setting status.
- Updates instructor and student UI flows to support async release toggling and LLM/Standard discussion behavior with fallback.
Reviewed changes
Copilot reviewed 15 out of 15 changed files in this pull request and generated 5 comments.
Summary per file:
| File | Description |
|---|---|
| migrations/versions/c1d2e3f4a5b6_add_use_llm_to_assignment_questions.py | Adds DB column for per-question LLM enablement. |
| components/rsptx/db/models.py | Adds use_llm field to SQLAlchemy AssignmentQuestion. |
| components/rsptx/validation/schemas.py | Allows use_llm in assignment question update payloads. |
| components/rsptx/templates/admin/instructor/course_settings.html | Adds course setting toggle for enabling async LLM modes. |
| bases/rsptx/admin_server_api/routers/instructor.py | Exposes enable_async_llm_modes in course settings page context. |
| bases/rsptx/web2py_server/applications/runestone/models/questions.py | Adds DAL field use_llm on assignment_questions. |
| bases/rsptx/web2py_server/applications/runestone/controllers/peer.py | Adds instructor endpoint to toggle async release; gates LLM mode per-question when enabled. |
| bases/rsptx/web2py_server/applications/runestone/views/peer/dashboard.html | Adds “Release After-Class PI” / “Undo” UI on last question. |
| bases/rsptx/web2py_server/applications/runestone/views/peer/peer_async.html | Updates student async PI step text, disclaimer, and fallback-to-standard behavior. |
| bases/rsptx/assignment_server_api/routers/instructor.py | Adds /has_api_key endpoint for UI gating. |
| bases/rsptx/assignment_server_api/assignment_builder/src/store/assignmentExercise/assignmentExercise.logic.api.ts | Adds RTK query hook useHasApiKeyQuery. |
| bases/rsptx/assignment_server_api/assignment_builder/src/types/exercises.ts | Adds use_llm to Exercise type. |
| bases/rsptx/assignment_server_api/assignment_builder/src/components/routes/AssignmentBuilder/components/exercises/AssignmentExercisesList/AssignmentExercisesTable.tsx | Adds per-row + bulk “Async Mode” dropdown UI when peer async + setting enabled. |
| bases/rsptx/assignment_server_api/assignment_builder/src/components/routes/AssignmentBuilder/AssignmentBuilder.module.css | Adjusts layout styling to accommodate UI changes. |
| bases/rsptx/book_server_api/routers/assessment.py | Minor robustness tweak when parsing poll results vote values. |
```python
def toggle_async():
    response.headers["content-type"] = "application/json"
    assignment_id = request.vars.assignment_id
    assignment = db(db.assignments.id == assignment_id).select().first()
    new_value = not (assignment.peer_async_visible or False)
    db(db.assignments.id == assignment_id).update(peer_async_visible=new_value)
    db.commit()
    return json.dumps({"peer_async_visible": new_value})
```
toggle_async updates an assignment solely by assignment_id without verifying the assignment belongs to the instructor’s current course (or even that the assignment exists). As written, an instructor could toggle peer_async_visible for any assignment ID they can guess, and a missing/invalid ID will raise an exception when accessing assignment.peer_async_visible. Recommend validating assignment_id, returning a JSON error for missing/unknown assignments, and checking assignment.course == auth.user.course_id before updating.
The @auth.requires decorator handles the security check to make sure that the caller is an instructor, but you are correct that it does not check that the assignment belongs to the instructor's current course.
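The missing ownership check could be folded in before the update. A minimal sketch of that validation logic, written as a standalone function so it runs outside web2py (the row is modeled as a plain dict here; in the actual controller it would be the `assignment` row and `auth.user.course_id`):

```python
import json


def toggle_async_checked(assignment, user_course_id):
    # Hypothetical helper illustrating the suggested validation; the
    # assignment row is modeled as a dict, or None when the id is unknown.
    if assignment is None:
        return json.dumps({"error": "assignment not found"})
    if assignment.get("course") != user_course_id:
        return json.dumps({"error": "assignment is not in your current course"})
    new_value = not assignment.get("peer_async_visible", False)
    # The real controller would persist new_value here, e.g.:
    # db(db.assignments.id == assignment_id).update(peer_async_visible=new_value)
    return json.dumps({"peer_async_visible": new_value})
```

With this shape, a missing or foreign assignment id produces a JSON error instead of an unhandled exception.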
```tsx
<Dropdown
  className="editable-table-dropdown"
  value={data.use_llm && hasApiKey ? "LLM" : "Standard"}
  onChange={(e) => updateAssignmentQuestions([{ ...data, use_llm: e.value === "LLM" }])}
  options={[
```
The per-row Async Mode dropdown calls updateAssignmentQuestions([{ ...data, use_llm: ... }]) but does not JSON-stringify question_json. The batch update endpoint expects question_json as a Pydantic Json (string), and other update paths in this codebase stringify question_json before sending. As-is, this update is likely to fail validation when question_json is an object. Recommend normalizing the payload (e.g., ensure question_json is a JSON string) and handling/displaying mutation errors.
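One way to address this, sketched with an assumed minimal payload shape (not the codebase's actual Exercise type), is to normalize `question_json` to a string before calling the mutation:

```typescript
// Assumed minimal payload shape, for illustration only.
interface QuestionUpdate {
  id: number;
  use_llm: boolean;
  question_json: unknown; // may arrive as a parsed object or a JSON string
}

// Ensure question_json is the JSON string the server-side Pydantic `Json`
// field expects, leaving already-stringified payloads untouched.
function normalizeQuestionUpdate(q: QuestionUpdate): QuestionUpdate {
  return {
    ...q,
    question_json:
      typeof q.question_json === "string"
        ? q.question_json
        : JSON.stringify(q.question_json),
  };
}
```

The per-row handler would then send `updateAssignmentQuestions([normalizeQuestionUpdate({ ...data, use_llm: e.value === "LLM" })])` and surface any mutation error to the instructor rather than failing silently.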
…stions.py Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
…shboard.html Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Hey @bnmnetp, I went through the Copilot suggestions, applied the ones that made sense, and addressed some manually. I also didn't know Copilot could review PRs and give direct suggestions; that's pretty cool! Let me know what you think.
This PR gives instructors the ability to toggle async peer instruction on and off and to choose which PI questions use an LLM. This is what I have so far; I would love any feedback, especially on the assignment dashboard since there have been recent changes. @bnmnetp @barbarer
- Instructors can toggle async peer mode on/off from the assignment list
- Each peer question can be set to LLM or Standard mode individually (defaults to Standard)
- The LLM option only shows up if there is a valid API key configured
- If the API key is broken, students get a fallback to standard discussion instead of an error
- Deleting an API key flips everything back to Standard; re-adding it restores the previous settings
- Students see a small disclaimer when chatting with the AI peer
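The fallback behavior described above boils down to a single mode-resolution rule. A sketch under assumed names (none of these identifiers are taken from the PR):

```python
def resolve_discussion_mode(use_llm, api_key_valid, course_llm_enabled=True):
    # Hypothetical illustration: a question runs in LLM mode only when it is
    # flagged per-question, the course-level setting is enabled, and a
    # working API key exists; in every other case students get the
    # standard discussion flow instead of an error.
    if use_llm and course_llm_enabled and api_key_valid:
        return "LLM"
    return "Standard"
```

Because the rule re-checks the key on every resolution, deleting a key degrades all questions to Standard without touching their stored `use_llm` flags, which is why re-adding the key restores the previous settings.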