diff --git a/source/img/guides/Grading-new-feature1.png b/source/img/guides/Grading-new-feature1.png index a971ebb3..546255d8 100644 Binary files a/source/img/guides/Grading-new-feature1.png and b/source/img/guides/Grading-new-feature1.png differ diff --git a/source/img/guides/assessment_sct_check.png b/source/img/guides/assessment_sct_check.png index 6a36c062..5d180e95 100644 Binary files a/source/img/guides/assessment_sct_check.png and b/source/img/guides/assessment_sct_check.png differ diff --git a/source/img/guides/assessment_sct_execution.png b/source/img/guides/assessment_sct_execution.png index 1a84241d..f892833e 100644 Binary files a/source/img/guides/assessment_sct_execution.png and b/source/img/guides/assessment_sct_execution.png differ diff --git a/source/img/guides/customizeSubmitbutton.png b/source/img/guides/customizeSubmitbutton.png new file mode 100644 index 00000000..94d27731 Binary files /dev/null and b/source/img/guides/customizeSubmitbutton.png differ diff --git a/source/img/guides/random-sync.png b/source/img/guides/random-sync.png index 279900dc..d222d305 100644 Binary files a/source/img/guides/random-sync.png and b/source/img/guides/random-sync.png differ diff --git a/source/img/guides/random-update.png b/source/img/guides/random-update.png index b12d74f9..1fac7971 100644 Binary files a/source/img/guides/random-update.png and b/source/img/guides/random-update.png differ diff --git a/source/img/guides/sql-helpers.png b/source/img/guides/sql-helpers.png new file mode 100644 index 00000000..ccf1b9a0 Binary files /dev/null and b/source/img/guides/sql-helpers.png differ diff --git a/source/img/guides/std-assessment-args.png b/source/img/guides/std-assessment-args.png index 71bebcc5..72044511 100644 Binary files a/source/img/guides/std-assessment-args.png and b/source/img/guides/std-assessment-args.png differ diff --git a/source/img/guides/std-assessment-error.png b/source/img/guides/std-assessment-error.png index 6894f61a..b3af63e5 100644 Binary 
files a/source/img/guides/std-assessment-error.png and b/source/img/guides/std-assessment-error.png differ diff --git a/source/img/guides/std-assessment-stdin-ignore.png b/source/img/guides/std-assessment-stdin-ignore.png index 1131c366..1d406456 100644 Binary files a/source/img/guides/std-assessment-stdin-ignore.png and b/source/img/guides/std-assessment-stdin-ignore.png differ diff --git a/source/img/guides/std-assessment-stdin.png b/source/img/guides/std-assessment-stdin.png index ca5864fa..2192450e 100644 Binary files a/source/img/guides/std-assessment-stdin.png and b/source/img/guides/std-assessment-stdin.png differ diff --git a/source/img/sql-helpers.png b/source/img/sql-helpers.png deleted file mode 100644 index 9c4f8fdf..00000000 Binary files a/source/img/sql-helpers.png and /dev/null differ diff --git a/source/instructors/authoring/assessments/ai-assessment-generation.rst b/source/instructors/authoring/assessments/ai-assessment-generation.rst index 82046fdf..c8bc078d 100644 --- a/source/instructors/authoring/assessments/ai-assessment-generation.rst +++ b/source/instructors/authoring/assessments/ai-assessment-generation.rst @@ -42,7 +42,8 @@ Assessments can be auto-generated using the text on the current guides page as c :alt: Assessment Generation Prompt -4. When you are satisfied with the result, press **Apply** and then **Create**. +4. When you are satisfied with the result, press **Apply** and then **Create**. If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, ``create assessment based on the first paragraph with 2 correct answers``. + .. important:: The generate assessment feature does not configure the page :ref:`layout `; you should specify the layout depending on how you want to present the information to the students. 
diff --git a/source/instructors/authoring/assessments/assessments.rst b/source/instructors/authoring/assessments/assessments.rst index dff80c48..a0f1bbbb 100644 --- a/source/instructors/authoring/assessments/assessments.rst +++ b/source/instructors/authoring/assessments/assessments.rst @@ -7,7 +7,9 @@ Assessments =========== -Assessments can be used to determine how well students understand course material, and can be automatically or manually graded. Codio offers a wide range of assessment types, including auto-graded code tests, multiple choice tests, fill in the blanks, drop-down selection, free text responses, and assignment grading. Assessments can be interspersed throughout tutorial materials or stand alone using an :ref:`assignment level script `. +Codio offers assessments that determine how well students understand course material and provide automatic or manual grading options. Codio offers a wide range of assessment types, including auto-graded code tests, multiple choice tests, fill in the blanks, drop-down selection, free text responses, and assignment grading. Assessments can be interspersed throughout tutorial materials or stand alone using an :ref:`assignment level script `. + +Codio provides a Starter Pack project that contains examples for all assessment types and a guides authoring cheat sheet. Go to **Starter Packs**, search for **Demo Guides and Assessments**, click **Use Pack**, then click **Create** to install it to your Codio account. You can view the results of assessments in a course from the teacher dashboard. 
diff --git a/source/instructors/authoring/assessments/random.rst b/source/instructors/authoring/assessments/random.rst index 6c5dea41..a7468365 100644 --- a/source/instructors/authoring/assessments/random.rst +++ b/source/instructors/authoring/assessments/random.rst @@ -6,55 +6,159 @@ Random Assessment ================= -The Random assessment type allows you to set up a group of assessments to then randomly assign one to each individual student assignment. Multiple Random assessments can be added on the same page but all of those random assessments must be of Simple layout type (1 Panel without tree). Random assessments with Complex layout (any layout other than 1 Panel without tree) can not be added on the same page or mixed with any other assessments. If you do mix Complex layout Random assessments with any other assessment, it may not function as intended and you will also get a warning when you Publish the assignment. +The Random assessment type allows you to define a pool of assessments, with each student receiving a randomly selected assessment from that pool. Note that specific layout requirements apply. -There is assignment level duplication prevention such that regardless of the query, Codio checks whether the library assessment IDs are unique. This prevents students from seeing the same assessment question multiple times in an assignment, as long as every question in the library is unique, and all randomized assessments are drawn from the same library. -If duplicate assessments are generated, it indicates that the assessment library does not have enough unique assessments for the set of random assessment queries in the assignment. 
-‌ +**Layout Requirements** + +* **Simple Layout (1-panel)**: Multiple Random assessments can be added on the same page +* **Complex Layout (multi-panel)**: + + * Cannot be added on the same page + * Cannot be mixed with any other assessments + * A warning displays when publishing if mixed with other assessments + * May not function as intended if layout requirements are violated + +**Duplication Prevention** + +Codio automatically prevents duplicate assessments at the assignment level by checking library assessment IDs for uniqueness. This ensures students won't see the same assessment question multiple times within an assignment. + +**Requirements for duplication prevention:** + +* All questions in the assessment library must be unique +* All randomized assessments must be drawn from the same library + +.. note:: + If duplicate assessments are generated, the assessment library does not contain enough unique assessments to satisfy all random assessment queries in the assignment. + +Creating a Random Assessment +~~~~~~~~~~~~~~~~~~~~~~~~~~~~ 1. On the **General** page, enter the name of your assessment that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. 2. On the **Grading** page, enter the number of points to assign to the assessment. Enter the score awarded for correctly answering the assigned question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). -- **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. +- **Use maximum score** - Enables selection of the highest score from all attempts as the final assessment score. 3. On the **Execution** page, browse to an assessment library where you can set up filters to define the range of assessments to randomly assign. You can work from any assessment library you have access to. +.. 
list-table:: Filter Categories and Inputs + :widths: 30 70 + :header-rows: 1 + + * - Category + - Available Inputs & Description + * - Bloom's Taxonomy Level + - .. tab-set:: + + .. tab-item:: Levels I-III + + * Level I - Remembering + * Level II - Understanding + * Level III - Applying + + .. tab-item:: Levels IV-VI + + * Level IV - Analyzing + * Level V - Evaluating + * Level VI - Creating + * - Assessment Type (auto-detected) + - .. tab-set:: + + .. tab-item:: Code-Based + + * Standard Code Test + * Advanced Code Test + * Parsons Puzzle + + .. tab-item:: Text-Based + + * Multiple Choice + * Fill in the Blanks + * Free Text Autograde + * - Programming Language + - Select the programming language for code-based assessments (e.g., Python, Java, C++, JavaScript) + * - Category (topic-level) + - Broad subject area or topic category for filtering assessments (e.g., variables, functions, loops). + * - Content (sub-topic level) + - Specific subtopic or concept within the category (e.g., modifying variables, creating functions, nesting for loops). + * - Learning Objective (SWBAT form) + - Define what "Students Will Be Able To..." accomplish after completing the assessment + + Example: "Students will be able to implement binary search algorithms efficiently" + + :ref:`Click here ` for more information on how to use Assessment Libraries. -Updating Random assessments ---------------------------- -If you wish to update, change or review the assessments assigned to the random assessment, select the **Update Search** button on the **Execution** tab and this will open the assessment library field with the saved search parameters. + +Updating Random Assessments +~~~~~~~~~~~~~~~~~~~~~~~~~~~ + + +Modifying Assessment Parameters +------------------------------- + +To update, change, or review the assessments assigned to a random assessment: + +1. Navigate to the **Execution** tab +2. Select the **Update Search** button +3. 
The assessment library field will open with your saved search parameters .. image:: /img/guides/random-update.png :alt: Update Random assessment + :width: 450px + + +Publishing Changes +------------------ + +After reviewing assessments, follow the appropriate publishing method based on your situation: + +**If ONLY random assessment changes were made:** + +* Students have not started: Publish normally +* Students have already started: Use the **Sync** button on the **Edit** tab (see Synchronizing Changes below) + +**If other assignment changes were also made:** + +* Publish normally first +* Then navigate to the **Edit** tab to sync if students have started + +Synchronizing Changes from the Course +------------------------------------- -You can then review the assesments and publish the assignment if you wish in the usual manner, but if the only changes made are in relation to the random assignment and there are students who may have already started the assignment you should do go to the **Edit** tab and use the **Sync** button. If you have made other changes to the assignment though, publish in the usual manner as well and then go to the **Edit** tab. If students have already started the assignment, the **Sync** button will show +When random assessment changes are made (either by you or another organization member), synchronize them from the **Edit** tab in the course. -Publishing/Synchronising changes from the **Course** ----------------------------------------------------- +A **Sync** button appears on the **Edit** tab when changes are available to synchronize. -If the only changes to a previously published assignment are for the random assessment(s), or if someone else in the organization has updated the assessments being used in the assignment, the changes made can be updated/synchronised from the **Edit** tab in the course. +.. 
image:: /img/guides/random-sync.png + :alt: Synchronize Random assessment + :width: 500px -A **Sync** button will be shown on the **Edit** tab for the assignment if there are changes that can be updated/synchronised. +.. warning:: + Students who have already started the assignment will not receive updates unless their assignment is reset. Resetting will cause them to start "as new" and **all previous work will be lost**. - .. image:: /img/guides/random-sync.png - :alt: Synchronise Random assessment +Sync Options +------------ -If there are students that have already started the assignment they will not get the updates/changes unless their assignments is also reset so they will start again 'as new' and any previous work will be lost. +When you press the **Sync** button, Codio will check if students have started and present appropriate options: -Pressing the **Sync** button will identify if there are students who have already started and then give you the option to reset and publish or just publish so then only students who have not started the assignment will get the update/changes +**No students have started** -**Synchronising where no students started assignment** +The assignment will sync and publish to all students. - .. image:: /img/guides/random-sync-nostudents.png - :alt: Synchronise Random assessment no students started +.. image:: /img/guides/random-sync-nostudents.png + :alt: Synchronize Random assessment - no students started + :width: 300px -**Synchronising where students have started assignment** +**Students have started** - .. image:: /img/guides/random-sync-studentsstarted.png - :alt: Synchronise Random assessment students started +You can choose to: +* **Reset and publish**: All students restart with new changes (previous work lost) +* **Publish only**: Only students who haven't started receive the changes +.. 
image:: /img/guides/random-sync-studentsstarted.png + :alt: Synchronize Random assessment - students started + :width: 450px \ No newline at end of file diff --git a/source/instructors/authoring/assessments/splice.rst b/source/instructors/authoring/assessments/splice.rst index af2ba5e8..ebcef644 100644 --- a/source/instructors/authoring/assessments/splice.rst +++ b/source/instructors/authoring/assessments/splice.rst @@ -8,10 +8,13 @@ SPLICE Assessments SPLICE (Standards, Protocols, and Learning Infrastructure for Computing Education) provides a protocol for information exchange between the assessment and the application in which it is embedded. To use a SPLICE assessment, provide a link to the assessment, and Codio will display it in an iframe. More information about SPLICE assessments may be found here: https://cssplice.org/. -**The assessment portion of a SPLICE assessment is created outside of Codio.** +.. note:: The assessment portion of a SPLICE assessment is created outside of Codio. + +How to Use a SPLICE Assessment in Codio +--------------------------------------- + +For more information on **Metadata** (Optional) and **Files** (Optional) see :ref:`Assessments `. -How to use a SPLICE Assesment in Codio -************************************** 1. On the **General** page, enter the name of your assessment that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. @@ -21,18 +24,4 @@ How to use a SPLICE Assesment in Codio - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. -4. (Optional) Click **Metadata** in the left navigation pane and complete the following fields: - - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata - - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. 
- - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. - -5. (Optional) Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list. - - .. image:: /img/guides/assessment_files.png - :alt: Files - -6. Click **Create** to complete the process. \ No newline at end of file +4. Click **Create** to complete the process. \ No newline at end of file diff --git a/source/instructors/authoring/assessments/standard-code-test.rst b/source/instructors/authoring/assessments/standard-code-test.rst index 145af1b4..d0f4b368 100644 --- a/source/instructors/authoring/assessments/standard-code-test.rst +++ b/source/instructors/authoring/assessments/standard-code-test.rst @@ -7,10 +7,9 @@ Standard Code Test ================== Standard code tests are dialog driven, where you specify input data and the expected output. Codio then executes the student code, supplies the specified input data, and compares the expected output to the student code's actual output. -.. Note:: **The output (including white space) of all the test cases in your Standard Code test cannot exceed 20,000 characters**. +.. Note:: The output (including white space) of all the test cases in your Standard Code test cannot exceed 20,000 characters. If your output will exceed this limit, or you need finer control of the tests, you can create custom code tests. 
See :ref:`Advanced Code Tests ` for more information. -Codio provides a Starter Pack project that contains examples for all assessment types and a guides authoring cheat sheet. Go to **Starter Packs** and search for **Demo Guides and Assessments** if not already loaded in your **My Projects** area. Click **Use Pack** and then **Create** to install it to your Codio account. For more information about adding a Standard Code Test, view this video @@ -18,193 +17,196 @@ For more information about adding a Standard Code Test, view this video
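A Standard Code Test passes when the expected output matches the student code's actual output, subject to options such as **Case Insensitive**, **Ignore White Space**, and **Substring Match**. As a rough illustration of how those options combine (a sketch only, not Codio's actual implementation), the matching semantics might look like:

```python
# Illustrative sketch of Standard Code Test output matching.
# This is NOT Codio's implementation; it only mirrors the documented
# behavior of the Case Insensitive, Ignore White Space, and
# Substring Match options.
def outputs_match(expected, actual,
                  case_insensitive=False,
                  ignore_whitespace=False,
                  substring_match=False):
    if case_insensitive:
        expected, actual = expected.lower(), actual.lower()
    if ignore_whitespace:
        # Strip all whitespace (spaces, tabs, carriage returns, newlines)
        # from both the expected output and the student output.
        expected = "".join(expected.split())
        actual = "".join(actual.split())
    if substring_match:
        # The entire expected output must appear contiguously
        # somewhere in the student output.
        return expected in actual
    return expected == actual

# With Ignore White Space enabled, a line-break difference between the
# expected and actual output no longer causes a failure:
print(outputs_match("Enter your Name:\nHi Ada",
                    "Enter your Name: Hi Ada",
                    ignore_whitespace=True))  # True
```

This is why the documentation recommends **Ignore white space** and **Substring match** for stdin-driven tests: both make the comparison more tolerant of formatting differences.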
-Assessment Auto-Generation -++++++++++++++++++++++++++ - -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Standard Code Test assessment: - -1. Select **Standard Code Test** from the Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - - - - The Standard Code Test assessment generation provides instructions for the student, the test cases, the solution, and an explanation of the solution. The test cases may use standard input or command line arguments, and the expected output is specified. - - - Codio also creates the code file for the student, with the solution bracketed within :ref:`solution templates `. - - - The generate assessment feature does not configure the page :ref:`layout; ` you should specify the layout depending on how you want to present the information to the students. - - If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* +Assessment Auto-Generation +-------------------------- +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. +The Standard Code Test assessment generation creates: -4. When you are satisfied with the result, press **Apply** and then **Create**. 
+- Instructions for the student +- Test cases using standard input or command line arguments +- Expected output specifications +- A solution with explanation +- A code file for the student with the solution bracketed within :ref:`solution templates ` -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. Assessment Manual Creation -++++++++++++++++++++++++++ +-------------------------- -Follow these steps to set up a standard code test: +Follow these steps to set up a standard code test. For more information on **General**, **Metadata** (Optional) and **Files** (Optional) see :ref:`Assessments `. -1. On the **General** page, enter the following information: - .. image:: /img/guides/assessment_general.png - :alt: General +1. Complete **General**. - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. - - Toggle the **Show Name** setting to hide the name in the challenge text the student sees. - - - **Instructions** - Enter text that is shown to the student using optional Markdown formatting. 2. Click **Execution** in the navigation pane and complete the following information: .. image:: /img/guides/assessment_sct_execution.png :alt: Execution + :width: 350px + + - **Command** - Executes the student code (usually run). + - **Pre-exec command** - Executes before each test runs (usually compile). If this fails, the main Command will not run. + - **Timeout** - Amend the timeout setting for code execution (up to 300 seconds via arrows, or manually enter a longer period). - .. Note:: If you store the assessment scripts in the **.guides/secure** folder, they run securely and students cannot see the script or the files in the folder. - The files can be dragged and dropped from the File Tree into the field to automatically populate the necessary execution and run code. 
+ **Language-Specific Commands** - - **Timeout** - Where you can amend the timeout setting for the code to execute. Arrows will allow you to set max 300 (sec), if you require longer, you can manually enter the timeout period. - - - **Command** - Enter the command that executes the student code. This is usually a run command. If the **Pre-exec command** fails, the **Command** will not run. + Select your programming language for command examples: + + .. tabs:: - - **Pre-exec command** - Enter the command to execute before each test is run. This is usually a compile command. - - - **Java** - - Compile: javac -cp path/to/file filename.java - - Run: java -cp path/to/file filename + .. tab:: Java - - **Python** - - Run: python path/to/file/filename.py + - Compile: ``javac -cp path/to/file filename.java`` + - Run: ``java -cp path/to/file filename`` - - **C** + .. tab:: Python - Compile: gcc filename.c -o filename -lm + - Run: ``python path/to/file/filename.py`` - Run: ./filename + .. tab:: C - - **C++** + - Compile: ``gcc filename.c -o filename -lm`` + - Run: ``./filename`` - Compile: g++ -o filename filename.cpp + .. tab:: C++ - Run: ./filename + - Compile: ``g++ -o filename filename.cpp`` + - Run: ``./filename`` - - **Ruby** + .. tab:: Ruby - Run: ruby filename.rb + - Run: ``ruby filename.rb`` - - **Bash** + .. tab:: Bash - Run: bash full_path.sh + - Run: ``bash full_path.sh`` - - **SQL** - - Codio provides three helper scripts to facilitate running queries on the database your students are modifying. These queries use ODBC to connect to and query the database and it is more strict about spacing than sqlcmd. 
- - Run: (depending on your version of SQL) - - python .guides/scripts/helper_mssql.py --db_host localhost --db_user SA --db_pass YourPassword --db_name DBNAME - - python .guides/scripts/helper_mysql.py --db_host localhost --db_user root --db_pass YourPassword --db_name DBNAME - - python .guides/scripts/helper_postgres.py --db_host localhost --db_port 5432 --db_user postgres --db_pass YourPassword --db_name DBNAME - - .. Note:: First you must use **Tools > Install Software** to install the appropriate helper script for your database (MSSQL,MySql,PostgreSQL). For example, if you are using MSSQL, you would download the Helper MSSql. - - .. image:: /img/sql-helpers.png - :alt: Install SQL Helper Script - + .. tab:: SQL + - Run: ``mysql EPDriver < /home/codio/workspace/codetest.sql --table`` + + Codio provides helper scripts to run queries via ODBC (stricter spacing than sqlcmd). + + .. Note:: First install the appropriate helper script via **Tools > Install Software** (MSSQL, MySQL, or PostgreSQL). + + .. image:: /img/guides/sql-helpers.png + :alt: Install SQL Helper Script + :width: 500px + + Run the appropriate command: + + - MSSQL: ``python .guides/scripts/helper_mssql.py --db_host localhost --db_user SA --db_pass YourPassword --db_name DBNAME`` + - MySQL: ``python .guides/scripts/helper_mysql.py --db_host localhost --db_user root --db_pass YourPassword --db_name DBNAME`` + - PostgreSQL: ``python .guides/scripts/helper_postgres.py --db_host localhost --db_port 5432 --db_user postgres --db_pass YourPassword --db_name DBNAME`` + + 3. Click **Grading** in the left navigation pane and complete the following fields: - .. image:: /img/guides/Grading-new-feature1.png - :alt: Grading +.. image:: /img/guides/Grading-new-feature1.png + :alt: Grading + :width: 500px - - **Points** - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. 
- - **Allow Partial Points** - Toggle to enable partial points, the grade is then based on the percentage of test cases the code passes. See :ref:`Allow Partial Points ` for more information. - - **Case Insensitive** - Toggle to enable a case insensitive output comparison. By default, the comparison is case sensitive. - - **Ignore White Space** - Toggle to enable stripping out any white space characters (carriage return, line feed, tabs, etc.) from both the expected output and the student output. - - **Substring Match** - Toggle to enable substring match when comparing the expected output to the student output. The entire expected output needs to be contiguous in the student output. - - **Add Item to Check** - Click to create another set of input/output fields. - - **Search** - Search the test cases by the number/index assigned to it. - - **Expand All/Collapse All** - Click to expand/collapse all test cases. - - **Input - Arguments** - Enter the command line arguments that are read by the student code. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - - **Delete** - Click to delete the test case. +.. list-table:: Grading Configuration Options + :widths: 25 75 + :header-rows: 1 - .. image:: /img/guides/std-assessment-args.png - :alt: Input Arguments + * - Setting + - Description + * - **Points** + - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. + * - **Allow Partial Points** + - Toggle to enable partial points; the grade is then based on the percentage of test cases the code passes. See :ref:`Allow Partial Points ` for more information. + * - **Use Maximum Score** + - Toggle to make the final assessment score the highest score attained across all runs. + * - **Case Insensitive** + - Toggle to enable a case insensitive output comparison. By default, the comparison is case sensitive. 
+   * - **Ignore White Space**
+     - Toggle to enable stripping out any white space characters (carriage return, line feed, tabs, etc.) from both the expected output and the student output.
+   * - **Substring Match**
+     - Toggle to enable substring match when comparing the expected output to the student output. The entire expected output needs to be contiguous in the student output.
+   * - **Search**
+     - Search the test cases by the number/index assigned to them.
+   * - **Expand All/Collapse All**
+     - Click to expand/collapse all test cases.
+   * - **Add Item to Check**
+     - Click to create another set of input/output fields.
+   * - **Delete**
+     - Click to delete the test case.
+   * - **Check All**
+     - Press to check all test cases at once so you can see how many of them passed or failed.
-  - **Input - Stdin** - Enter data that would be entered manually in the console. For example, Enter your Name:. If using this input method:
-    - The input data should have a new line if it would be expected in the actual program execution.
-    - In the **Output** field, the prompt text that is displayed to the user appears in ``stdout`` and should be reflected in your output field but without the data entered by the user. You do **not** need a new line in the output field between each input prompt as the new line character is part of the user input.
-    - **Ignore white space** and **Substring match** are recommended options as they make the test more tolerant. The following image shows how to format input and output fields if you are **not** ignoring white space or doing a **Substring match**. Note how the input field only supplied the values to be input, not the prompt itself (which is actually a part of `stdout`). It is important to accurately account for all spaces and carriage returns.
+
+.. image:: /img/guides/std-assessment-args.png
+   :alt: Input Arguments
+   :width: 450px
-  .. image:: /img/guides/std-assessment-stdin.png
-     :alt: Input and Output Example
+
+- **Input - Arguments and Input - Stdin**
-  The following image shows the more tolerant approach with the **Ignore whitespace** option set. In this case everything on its own line for readability. The whitespace characters will be stripped out of both the expected output and the student output at runtime.
+
+.. tabs::
-  .. image:: /img/guides/std-assessment-stdin-ignore.png
-     :alt: Ignore Whitespace
+
+   .. tab:: Input - Arguments
+
+      Enter the command line arguments that are read by the student code.
-  - **Generate** - Click this button to generate the expected output based on the input you provided in the left half of the box. You need to have the solution code in the code file in order for the output to be generated. If there is already some content in the output box then you will get a pop up asking you if you want to overwrite the existing output.
+
+   .. tab:: Input - Stdin
-  - **Check** - Use this to test whether running the solution code, using the optional input, will result in the value in the expected output box. If the test fails an output box will appear below showing the difference between the current output and the expected output.
+
+      Enter data that would be entered manually in the console. For example, ``Enter your Name:``. If using this input method:
+
+      - The input data should have a new line if it would be expected in the actual program execution.
+      - In the **Output** field, the prompt text that is displayed to the user appears in ``stdout`` and should be reflected in your output field but without the data entered by the user. You do **not** need a new line in the output field between each input prompt, as the new line character is part of the user input.
+      - **Ignore white space** and **Substring match** are recommended options, as they make the test more tolerant. The following image shows how to format input and output fields if you are **not** ignoring white space or doing a **Substring match**. Note how the input field only supplies the values to be input, not the prompt itself (which is actually a part of ``stdout``). It is important to accurately account for all spaces and carriage returns.
-  .. image:: /img/guides/assessment_sct_check.png
-     :alt: Check Test Case
-
+
+      .. image:: /img/guides/std-assessment-stdin.png
+         :alt: Input and Output Example
+
+      The following image shows the more tolerant approach with the **Ignore whitespace** option set. In this case, everything is on its own line for readability. The whitespace characters will be stripped out of both the expected output and the student output at runtime.
+
+      .. image:: /img/guides/std-assessment-stdin-ignore.png
+         :alt: Ignore Whitespace
+         :width: 400px
-  - **Check All** - Press to check all test cases at once so you can see how many of them are passed or failed.
-  - **Show Error Feedback** - Toggle to enable feedback to students about errors related to the specific test case.
+
+.. list-table:: Assessment Settings
+   :widths: 25 75
+   :header-rows: 1
-  .. image:: /img/guides/std-assessment-error.png
-     :alt: Show Error Feedback
+
+   * - Setting
+     - Description
+   * - **Run Test**
+     - Click this button to generate the expected output based on the input you provided in the left half of the box. You need to have the solution code in the code file for the output to be generated. If there is already some content in the output box, you will get a pop-up asking whether you want to overwrite the existing output.
+   * - **Check Test**
+     - Use this to test whether running the solution code, using the optional input, will result in the value in the expected output box. If the test fails, an output box appears below showing the difference between the current output and the expected output.
-  - **Show Expected Answer** - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting.
-  - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts.
-  - **Show Rationale to Students** - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always**
-  - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project.
+
+       .. image:: /img/guides/assessment_sct_check.png
+          :alt: Check Test Case
-4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up.
+
+   * - **Show Error Feedback**
+     - Toggle to enable feedback to students about errors related to the specific test case.
-5. Click **Metadata** in the left navigation pane and complete the following fields:
+
+       .. image:: /img/guides/std-assessment-error.png
+          :alt: Show Error Feedback
+          :width: 400px
-   .. image:: /img/guides/assessment_metadata.png
-      :alt: Metadata
-
+
+   * - **Show Expected Answer**
+     - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress the expected output, disable the setting.
+   * - **Define Number of Attempts**
+     - Enable to allow and set the number of attempts students can make. If disabled, the student can make unlimited attempts.
+   * - **Show Rationale to Students**
+     - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**.
+   * - **Rationale**
+     - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project.
-  - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment.
-  - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*.
-  - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values.
-6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list.
+
+4. Click **Create** to complete the process.
-   .. image:: /img/guides/assessment_files.png
-      :alt: Files
-7. Click **Create** to complete the process.
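The **Input - Stdin** formatting rules described above can be illustrated with a short sketch. The program, prompts, and values below are hypothetical examples, not part of the product:

```python
# Hypothetical student program used to illustrate the Input - Stdin and
# Output test case fields; prompts and values are made-up examples.
def main() -> None:
    name = input("Enter your Name: ")    # prompt text is written to stdout
    color = input("Enter your Color: ")  # typed values arrive via stdin
    print(f"{name} likes {color}")

# The corresponding test case fields would be:
#   Input - Stdin:  Ada        <- values only, one per line, no prompt text
#                   blue
#   Output:         Enter your Name: Enter your Color: Ada likes blue
#                   (no newline between the prompts, because the newline
#                    after each prompt comes from the user input, not stdout)
```

With **Ignore white space** enabled, the Output field could instead place each fragment on its own line for readability, since whitespace is stripped from both sides at runtime.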
-Output written to a file
+Output Written to a File
 ------------------------
 
 If the output of a program is written to a file rather than Standard Output (stdout) you can use the **Command** portion of the execution tab to run a bash command to list out the contents of the file.
diff --git a/source/instructors/authoring/assessments/student-submission.rst b/source/instructors/authoring/assessments/student-submission.rst
index 9fd0ba20..6f84a40e 100644
--- a/source/instructors/authoring/assessments/student-submission.rst
+++ b/source/instructors/authoring/assessments/student-submission.rst
@@ -3,51 +3,76 @@
 
 .. _student-submission:
 
-Student submission options
+Student Submission Options
 ==========================
 
-There are two important settings that control
+You can control the following:
 
-- the way a student submits individual questions
-- the way a student notifies the course instructors that an assignment is completed.
+- How students submit individual questions.
+- How students notify course instructors that an assignment is completed.
 
-The submit button
------------------
-By default each assessment has a submit button beneath the assessment. Once pressed, the answer is autograded.
+The Submit Button
+--------------------
+By default, each assessment has a Submit button beneath the assessment. Once pressed, the answer is autograded.
+
+.. note:: You can customize the Submit button label to any text you prefer. In the assessment markdown, update the text to the left of "|assessment".
+
+   .. image:: /img/guides/customizeSubmitbutton.png
+      :alt: Customize Submit Button
+      :width: 400px
 
-Each assessment type allows you to define the number of attempts students can make. On the last allowed attempt, the student will be warned that they will not be able to resubmit after that attempt.
+Each assessment type allows you to define the number of attempts students can make. Students can see the number of attempts they have left to the right of the Submit button.
-You can suppress the use of submit buttons for Advanced Code Test, Standard Code Test, MCQ, Fill in the Blank, Free Text and Free Text Autograde assessments. You cannot suppress the submit buttons for Parsons Puzzle assessments.
+Suppressing the Submit Button
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-This feature is handy for exams, students do not need to worry about the effect of pressing the button. They can provide a response and move on to other assessments or pages in the guide.
+
+You can suppress the Submit button for the following assessment types: Advanced Code Test, Standard Code Test, MCQ, Fill in the Blank, Parsons Puzzle, LLM Rubric Auto Grade, LLM Rubric, Free Text, and Free Text Autograde.
-To suppress the use of the **Submit** button, go to the **settings** button in the guide and disable **Use submit buttons**.
+
+This feature is useful for exams where students shouldn't need to worry about pressing a submit button. They can simply provide their response and move on to other assessments or pages.
+
+To suppress the use of the **Submit** button, go to the **Settings** button in the guide and disable **Use submit buttons**.
 
 .. image:: /img/guides/globalsettings.png
    :alt: Global Settings
 
-Once the project is marked as complete (see below) all assessment responses are submitted automatically. All students work must be marked as complete either manually or using the automated **Mark as Complete** option on the final deadline.
+Once the project is marked as complete (see below), all assessment responses are submitted automatically. All student work must be marked as complete either manually or using the automated **Mark as Complete** option on the final deadline.
+
+
 Mark as Complete
 ----------------
-To suppress the student **Mark as complete** action, go to the guide **settings** (see above screenshot) and disable **Use mark as complete**.
-A student can proactively mark an assignment as complete. This can trigger an :ref:`assignment level autograde script ` and it is also displayed in the teacher dashboard for that student.
+A student can proactively mark an assignment as complete. If there is an :ref:`assignment-level autograde script `, it will be run at this time, and the completion status will be displayed in the teacher dashboard.
+
+**Advantages:**
+Instructors can grade students' work as soon as they mark it as complete.
+
+**Drawbacks:**
+Once students mark an assignment as complete, they can no longer make changes to the assignment, including answering assessments.
+
+Viewing Completed Assignments
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-The drawback of using the student driven mark as complete option is that once students mark an assignment as complete, they are no longer able to make changes to the assignment, including answering assessments. The advantage is that instructors are able to grade those students' work ahead of a deadline.
+
+If a project has been marked as completed, students can:
-If the project has been marked as completed, students can click on the 'graded' button to access grade feedback. If they wish to view the project, they can click on the name of the project on the left hand side. As the assignment is completed they will not be able to edit anything but can still view the content.
+
+- Click on their grade to access grade feedback
+- Click on the project name on the left side to view the content (read-only)
-You can disable your students' ability to **Mark as Complete** entirely and eliminate instances of prematurely marking as complete or forgetting to do so. This means students won't need to request that their assignment be re-enabled if they submitted by mistake or decide they want to change something after marking as complete.
+Disabling Student Mark as Complete
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-If you choose to disable your students' ability to mark as complete, you can use one of the following methods to mark assignments as complete:
+
+You can disable students' ability to **Mark as Complete** entirely. This eliminates instances of prematurely marking as complete or forgetting to do so, and prevents students from needing to request that their assignment be re-enabled.
-- Once the assignment deadline has been reached and you want to start grading student work, :ref:`mark all students' work as complete ` from the assignment actions area.
+
+To disable this feature, go to the guide **Settings** and disable **Use mark as complete**.
-- Set an :ref:`end of assignment date ` and specify that once the date is reached, the students' work should be **Marked as Complete** automatically.
+
+Alternative Methods for Marking Complete
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+If you disable student mark as complete, you can use one of the following methods:
+
+- Once the assignment deadline has been reached, :ref:`mark all students' work as complete ` from the assignment actions area
+- Set an :ref:`end of assignment date ` and specify that student work should be **Marked as Complete** automatically when the date is reached
 
-Penalty deadlines
+Penalty Deadlines
 -----------------
 
-Another related feature is **Penalty deadlines** which allow you to specify deadlines, before the final grading deadline, where a percentage deduction of the final grade is made. :ref:`Click here ` for more information on managing penalty deadlines.
\ No newline at end of file
+Another related feature is **Penalty Deadlines**, which allows you to specify deadlines, before the final grading deadline, where a percentage deduction of the final grade is made. :ref:`Click here ` for more information on managing penalty deadlines.
diff --git a/source/instructors/authoring/assessments/ungraded-assessments.rst b/source/instructors/authoring/assessments/ungraded-assessments.rst
index aea76099..76f92735 100644
--- a/source/instructors/authoring/assessments/ungraded-assessments.rst
+++ b/source/instructors/authoring/assessments/ungraded-assessments.rst
@@ -5,5 +5,7 @@
 Ungraded Assessments
 ====================
 
-As an instructor, you can set up assessments that are not graded. They can be used as a consequence free way to check a student's understanding of the content. The assessment is not graded if the correct/incorrect points are set to zero (0). No points are added or deducted from the student's overall grade for the assignment.
+You can create assessments that are not graded to provide a consequence-free way for students to check their understanding of the content.
+
+To make an assessment ungraded, set both the correct and incorrect points to zero (0). No points will be added to or deducted from the student's overall grade for the assignment.
\ No newline at end of file
diff --git a/source/instructors/setupcourses/library/assessmentslibrary.rst b/source/instructors/setupcourses/library/assessmentslibrary.rst
index 62be8d4d..2c3e5289 100644
--- a/source/instructors/setupcourses/library/assessmentslibrary.rst
+++ b/source/instructors/setupcourses/library/assessmentslibrary.rst
@@ -28,6 +28,3 @@ You can organize your library by course number, programming language, department
-
-
-