diff --git a/source/img/guides/freetext-grading.png b/source/img/guides/freetext-grading.png index 82ce662a..c024b7d2 100644 Binary files a/source/img/guides/freetext-grading.png and b/source/img/guides/freetext-grading.png differ diff --git a/source/img/guides/freetext_navigate.png b/source/img/guides/freetext_navigate.png index 84db11f4..3806342e 100644 Binary files a/source/img/guides/freetext_navigate.png and b/source/img/guides/freetext_navigate.png differ diff --git a/source/img/guides/freetextanswer.png b/source/img/guides/freetextanswer.png index 93709531..d800f767 100644 Binary files a/source/img/guides/freetextanswer.png and b/source/img/guides/freetextanswer.png differ diff --git a/source/img/guides/freetexticon.png b/source/img/guides/freetexticon.png index ba2e9a7a..2695f2e0 100644 Binary files a/source/img/guides/freetexticon.png and b/source/img/guides/freetexticon.png differ diff --git a/source/img/guides/notpartial.png b/source/img/guides/notpartial.png index 278c675c..5e9fcf0d 100644 Binary files a/source/img/guides/notpartial.png and b/source/img/guides/notpartial.png differ diff --git a/source/img/guides/partial.png b/source/img/guides/partial.png index 229143d2..d89027ac 100644 Binary files a/source/img/guides/partial.png and b/source/img/guides/partial.png differ diff --git a/source/instructors/authoring/assessments/advanced-code-test.rst b/source/instructors/authoring/assessments/advanced-code-test.rst index d2b6ed36..82cbcc5e 100644 --- a/source/instructors/authoring/assessments/advanced-code-test.rst +++ b/source/instructors/authoring/assessments/advanced-code-test.rst @@ -46,11 +46,11 @@ Complete each section to set up your advanced code test. For more information on - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. 
- **Use maximum score** - Toggle to make the assessment's final score the highest score attained across all runs. -5. **(Optional)** Complete **Metadata**. +4. **(Optional)** Complete **Metadata**. -6. **(Optional)** Complete **Files**. +5. **(Optional)** Complete **Files**. -7. Click **Create** to complete the process. +6. Click **Create** to complete the process. See a Working Example diff --git a/source/instructors/authoring/assessments/auto-grade-scripts.rst b/source/instructors/authoring/assessments/auto-grade-scripts.rst index c21067db..ab02ae65 100644 --- a/source/instructors/authoring/assessments/auto-grade-scripts.rst +++ b/source/instructors/authoring/assessments/auto-grade-scripts.rst @@ -5,18 +5,19 @@ Assignment Level Scripts ======================== + You can use assignment level scripts to evaluate student code, normalize points, and mark for participation grading. Assignment level scripts are added in the **Script Grading** field on the :ref:`Script Grading ` settings page. These scripts can then transfer the grading value into the grading field. Assignment level scripts are run when an assignment is **Marked as Complete**. .. Note:: The script must execute within 3 minutes or a timeout error occurs. The feedback returned has a maximum size of 1 MB. If this limit is exceeded, the message **Payload content length greater than maximum allowed: 1048576** will be returned. If you are using an LMS platform with Codio, be sure to enter a percentage value in the **Grade Weight** field to maintain compatibility with LMS gradebooks. This value is then transferred into your LMS gradebook once you :ref:`release the grades `. -Secure scripts +Secure Scripts -------------- If you store grading scripts in the **.guides/secure** folder, they run securely and students cannot see the script or the files in the folder. Only instructors can access this folder. -You can find more information about assessment security :ref:`here `.
+You can find more information about assessment security in the :ref:`Assessment Security ` page. -Access assessment results +Access Assessment Results ------------------------- You can access student scores for the auto-graded assessments in an assignment. All this information is in JSON format and can be accessed in the ``CODIO_AUTOGRADE_ENV`` environment variable. The following tabs show the format of this data. The first tab shows just the assessment portion of the data and the second depicts all the available values. @@ -258,24 +259,26 @@ The student grade is calculated based on whether they answered the question, not - In the course assignment settings :ref:`Grade Weights ` section, enable **Script Grading** set **Set custom script path** to that file and disable **Assessments Grading**. -Regrade an individual student's assignment ------------------------------------------- -If students have clicked **Mark as Complete** and the custom script is triggered, you can regrade their work by resetting the `complete` switch, and then set it to *complete* again, which triggers the custom script to run again. +Regrade an Individual Student's Assignment +------------------------------------------- +If students have clicked **Mark as Complete** and the custom script has been triggered, you can regrade their work by clicking the three vertical dots next to the student's name to access additional actions. Select **Mark as Incomplete**, then click the three vertical dots again and select **Mark as Complete** to retrigger the custom script. -Regrade all student's assignments +Regrade All Students' Assignments --------------------------------- -You can regrade all student's assignments that have already been auto-graded from the **Actions** button on the assignment page. +You can regrade all students' assignments that have already been auto-graded by clicking the **Regrade Completed** button at the bottom of the assignment page. + +1. Navigate to the assignment and click it. 
+2. Then click **Regrade Completed** at the bottom of the page. This is useful if you have found a bug in your assignment level grading script. **Regrade Completed** does not run individual code test assessments. -1. Navigate to the assignment and open it. -2. Click the **Actions** button and then click **Regrade Completed**. This is useful if you have found a bug in your assignment level grading script. **Regrade Completed** does not run individual code test assessments. +Test and Debug Your Grading Scripts +------------------------------------ -Test and debug your grading scripts ----------------------------------- -.. Note:: Codio provides the ability to test your auto-grading scripts when creating your project, this should be done before publishing your project to a course. Once an assignment has been published to the course, any changes made to files in the student workspace (/home/codio/workspace) are not reflected in the published assignment. Grading scripts should be stored in the **.guides/secure** folder. Files in the .guides and guides/secure folders can be published even if students have already started. +Codio provides the ability to test your auto-grading scripts when creating your project; this should be done before publishing your project to a course. Once an assignment has been published to the course, any changes made to files in the student workspace (/home/codio/workspace) are not reflected in the published assignment. Grading scripts should be stored in the **.guides/secure** folder. Files in the **.guides** and **.guides/secure** folders can be published even if students have already started. -Test your script in the IDE ........................... +Test Your Script in the IDE +---------------------------- + You can test your auto-grading script in the Codio IDE from the **Education > Test Autograde Script** on the menu bar.
This option allows you to specify the location of your auto-grading script and run it against the current project content. It also allows you to simulate scores attained by any auto-graded assessments located in the Codio Guide and select which auto-graded assessments to test. .. image:: /img/autograde-test.png @@ -283,17 +286,18 @@ You can test your auto-grading script in the Codio IDE from the **Education > Te Be sure to take the following into account when using this feature: -- When you click **Test Script**: +1. When you click **Test Script**: + +- All output to ``stdout`` and ``stderr`` is displayed in the dialog. +- The grade returned by your test script is at the bottom of the output section. - - All output to ``stdout`` and ``stderr`` are displayed in the dialog. - - The grade returned by your test script is at the bottom of the output section. +2. ``stdout`` and ``stderr`` output is not available when running the actual auto-grading script (not in test mode) because it runs invisibly when the assignment is marked as complete. Because of this, you should only generate output for testing and debugging. +3. If you want your script to provide feedback to the student, you should output it to a file that can be accessed by the student when opening the project at a later date. In this case, you should allow read-only access to the project from the assignment settings after being marked as complete.
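The points above can be sketched as a minimal assignment level grading script: debug output goes to ``stdout`` (visible only via **Test Autograde Script**), while student-facing feedback is written to a file in the workspace. The JSON field, the feedback file name, and the placeholder grade below are illustrative assumptions, not the exact Codio schema:

```python
import json
import os

# Illustration only: a stand-in for the JSON normally supplied in the
# CODIO_AUTOGRADE_ENV environment variable (the real schema is shown in
# the tabs earlier on this page). "completedDate" is the field the docs
# mention for penalty calculations.
os.environ.setdefault("CODIO_AUTOGRADE_ENV", json.dumps({
    "completedDate": "2024-01-15T10:30:00Z",
}))

def main():
    env = json.loads(os.environ["CODIO_AUTOGRADE_ENV"])

    # Debug output: shown in the Test Autograde Script dialog, but
    # discarded when the script runs on Mark as Complete.
    print("autograde env:", env)

    grade = 80  # placeholder: compute this from the student's work

    # Student-facing feedback goes to a file the student can open later
    # (allow read-only access after completion in the assignment settings).
    with open("feedback.txt", "w") as fh:
        fh.write(f"Your score: {grade}%\n")

    return grade

if __name__ == "__main__":
    main()
```

In a real script, the computed grade would then be reported back to Codio using the grading functions described in the Sending Points to Codio section.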
+Test Your Script Using Bootstrap Launcher +------------------------------------------ -Test your script using bootstrap launcher -......................................... -You can also use a simple bootstrap launcher that loads and executes the script from a remote location so that you can edit and debug independently of the Codio box. The following example bash script shows a Python script that is located as a Gist on GitHub. This script might be called **.guides/secure/launcher.sh**. +You can also use a simple bootstrap launcher that loads and executes the script from a remote location so that you can edit and debug independently of the Codio box. The following example is a bash launcher script that downloads and runs a Python script from a GitHub Gist. This script would be saved as **.guides/secure/launcher.sh**. .. code:: bash @@ -312,7 +316,7 @@ Sending Points to Codio Codio provides a Python library to facilitate reporting points from your custom scripts. There are four functions in this library: `send_grade`, `send_grade_v2`, `send_partial` and `send_partial_v2`. - .. Note:: Partial points are not used in assignment level scripts, see :ref:`Allow Partial Points ` for more information about setting up partial points. +.. Note:: Partial points are not used in assignment level scripts. See :ref:`Allow Partial Points ` for more information about setting up partial points. In order to use this library you need to add the following code to the top of your grading script: @@ -336,23 +340,31 @@ The calls to use these functions are as follows: send_grade(grade) -`grade` - Should be the percent correct for the assessment. +or: .. code:: python send_grade_v2(grade, feedback, format=FORMAT_V2_TXT, extra_credit=None) -`grade` - Should be the percent correct for the assessment. - -`feedback` - The buffer containing the feedback for your student - maximum size is 1 Mb. -`format` - The format can be Markdown, HTML or text and the default is text. +.. 
list-table:: + :widths: 20 80 + :header-rows: 1 -`extra_credit` - Extra points beyond the value for doing this correctly. These do not get passed to an LMS system automatically, just the percentage correct. + * - Field + - Description + * - ``grade`` + - Should be the percent correct for the assessment. + * - ``feedback`` + - The buffer containing the feedback for your student - maximum size is 1 Mb. + * - ``format`` + - The format can be Markdown, HTML or text and the default is text. + * - ``extra_credit`` + - Extra points beyond the value for doing this correctly. These do not get passed to an LMS system automatically, just the percentage correct. .. _autograde-enhance: -Auto-grading enhancements +Auto-Grading Enhancements ------------------------- The V2 versions of the grading functions allow you to: @@ -362,14 +374,25 @@ The V2 versions of the grading functions allow you to: - Notify (instructors and students) and reopen assignments for a student on grade script failure. -If you don't use the send_grade_v2 functions, this URL (passed as an environment variable) can be used:```CODIO_AUTOGRADE_V2_URL``` +If you don't use the send_grade_v2 functions, this URL (passed as an environment variable) can be used:``CODIO_AUTOGRADE_V2_URL`` These variables allow POST and GET requests with the following parameters: -- **Grade** (```CODIO_AUTOGRADE_V2_URL```) - return 0-100 percent. This is the percent correct out of total possible points. -- **Feedback** - text -- **Format** - html, md, txt - txt is default -- **Penalty** - Penalty is number between 0-100, +.. list-table:: + :widths: 20 80 + :header-rows: 1 + + * - Field + - Description + * - **Grade** (``CODIO_AUTOGRADE_V2_URL``) + - Return 0-100 percent. This is the percent correct out of total possible points. 
+ * - **Feedback** + - Text + * - **Format** + - html, md, txt - txt is default + * - **Penalty** + - Penalty is a number between 0-100 + If you want to calculate penalties in the grading script, you can use the **completedDate** value (in UTC format) in ``CODIO_AUTOGRADE_ENV``. See the Python example below. @@ -430,7 +453,7 @@ These Python and Bash files that can be loaded by a bootstrap script or as expla main() -Example grading scripts +Example Grading Scripts ----------------------- This section provides example assignment level scripts using the older methods to send grades. diff --git a/source/instructors/authoring/assessments/edit-assessment-points.rst b/source/instructors/authoring/assessments/edit-assessment-points.rst index 56eef98e..52752e97 100644 --- a/source/instructors/authoring/assessments/edit-assessment-points.rst +++ b/source/instructors/authoring/assessments/edit-assessment-points.rst @@ -7,9 +7,11 @@ Edit Assessment Points ====================== To edit assessment points, follow these steps: -1. In the Guide Editor, click the **Assessments** button to view the list of all assessments. +1. In the Guide Editor, click the **Assessments** button. -2. Click the assessment to open it and modify the points. +2. Then click **"View Existing Assessments"** in the bottom right corner to view the list of all assessments. + +3. Modify the point value in the box to the left of the assessment you want to update. Once done, click the **Close** button on the bottom left. .. image:: /img/assessmentpoints.png :alt: Edit Assessment Points \ No newline at end of file diff --git a/source/instructors/authoring/assessments/edit-assessment.rst b/source/instructors/authoring/assessments/edit-assessment.rst index 2568e9c5..384368dd 100644 --- a/source/instructors/authoring/assessments/edit-assessment.rst +++ b/source/instructors/authoring/assessments/edit-assessment.rst @@ -8,12 +8,16 @@ Edit an Assessment To edit an assessment, either: -1.
In the Guide Editor, click the **Edit: ** button to the right of the assessment. +- In the Guide Editor, click the **Edit: ** button to the right of the assessment. .. image:: /img/guides/editassessmentbutton.png :alt: Edit Assessment -2. Click the **Assessment** button to view the list of all assessments and click the assessment to open it and make your changes. +Or, in the Guide Editor, use the following steps: + 1. Click the **Assessment** button. + 2. Click **"View Existing Assessments"** in the bottom right corner. + 3. Click the assessment to open it and make your changes. + .. image:: /img/guides/editassessmentlist.png :alt: Edit Assessment List \ No newline at end of file diff --git a/source/instructors/authoring/assessments/fill-in-blanks.rst b/source/instructors/authoring/assessments/fill-in-blanks.rst index b916c5a4..2b522de4 100644 --- a/source/instructors/authoring/assessments/fill-in-blanks.rst +++ b/source/instructors/authoring/assessments/fill-in-blanks.rst @@ -7,125 +7,96 @@ Fill in the Blanks ================== A **Fill in the blank question** can use either free text or offer options to be chosen from a drop-down list: - - Free Text Answers - Shows a question where the student must complete the missing words (fill in the blank) by entering the answers. - .. image:: /img/guides/assessments-fitb1.png - :alt: Free Text +.. list-table:: + :widths: 30 70 + :header-rows: 1 - - Drop-Down Answers - The possible answers are available in a drop-down list where the student chooses the correct answer. + * - Answer Type + - Description + * - Free Text Answers + - Shows a question where the student must complete the missing words (fill in the blank) by entering the answers. - .. image:: /img/guides/assessments-fitb2.png - :alt: Drop-Down + .. image:: /img/guides/assessments-fitb1.png + :alt: Free Text + * - Drop-Down Answers + - The possible answers are available in a drop-down list where the student chooses the correct answer. + + .. 
image:: /img/guides/assessments-fitb2.png + :alt: Drop-Down Assessment Auto-Generation ++++++++++++++++++++++++++ -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Fill in the Blanks assessment: - -1. Select **Fill in the Blanks** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - - If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. - -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. Assessment Manual Creation ++++++++++++++++++++++++++ -Follow these steps to set up fill in the blank assessments manually: - -1. On the **General** page, enter the following information: - - .. image:: /img/guides/assessment_general.png - :alt: General +Follow these steps to set up fill in the blank assessments manually. For more information on **General**, **Metadata** (optional) and **Files** (optional) see :ref:`Assessments `. - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. 
- - Toggle the **Show Name** setting to hide the name in the challenge text the student sees. - - - **Instruction** - Enter the instructions to be shown to the students. +1. Complete **General**. 2. Click **Execution** in the navigation pane and complete the following information: .. image:: /img/guides/assessment_fitb_exec.png :alt: Execution - - **Text** - Enter the question in markdown, including the correct answer specification. For example: +- **Text** - Enter the question in markdown, including the correct answer specification. For example: -``A prime number (or a prime) is a <<<natural>>> number greater than <<<1>>> that has no positive divisors other than <<<1>>> and <<<itself>>>.`` - - - - **Show Possible Values** - Toggle to display possible options for the correct answer: +.. code-block:: text + + A prime number (or a prime) is a <<<natural>>> number greater than <<<1>>> that has no positive divisors other than <<<1>>> and <<<itself>>>. + +- **Show Possible Values** - Toggle to display possible options for the correct answer: - - For text-free questions, blank fields are available for the student to enter the correct answer. - - For drop-down questions, all the correct values (anything within the `<<< >>>` chevrons) are provided in each of the answer positions in a randomized order. You can also add incorrect answers (one per line). + - For free text questions, blank fields are available for the student to enter the correct answer. + - For drop-down questions, all the correct values (anything within the `<<< >>>` chevrons) are provided in each of the answer positions in a randomized order. You can also add incorrect answers (one per line). .. image:: /img/guides/distractors.png :alt: Distractors - **Regular Expression Support** +**Regular Expression Support** - Codio regular expression matching follows the Java language standards. +Codio regular expression matching follows the Java language standards.
- Examples of regular expressions supported for blank fields: - - Answer allows any characters - ``<<<.*>>>`` - - Answer starts with word "begin" - ``<<<begin.*>>>`` - - Answer ends with word "end" - ``<<<.*end>>>`` - - Answer can contain one or more spaces in "this is" - ``<<<this\s+is>>>`` - - Answer contains 1 or 2 or 3 - ``<<<[123]>>>`` - - Answer allows color or colour - ``<<<colou?r>>>`` - - Answer allows yes or "yes" - ``<<<"yes", ""yes"">>>`` - - Answer allows hat or cat - ``<<<[hc]at>>>`` - - Answer allows i==0 or i == 0 - ``<<<i\s*==\s*0>>>`` - - Answer must only contain digits - ``<<<\d+>>>`` - - Answer must only contain non-digits - ``<<<\D+>>>`` - - Answer requires word character - ``<<<\w>>>`` - - Answer requires non-word character - ``<<<\W>>>`` - - Answer allows several answers (Place1 or Place2) - ``<<<"Place1", "Place2">>>`` - - Answer requires case sensitivity - ``<<>>`` or ``<<>>`` +Examples of regular expressions supported for blank fields: + +- Answer allows any characters - ``<<<.*>>>`` +- Answer starts with word "begin" - ``<<<begin.*>>>`` +- Answer ends with word "end" - ``<<<.*end>>>`` +- Answer can contain one or more spaces in "this is" - ``<<<this\s+is>>>`` +- Answer contains 1 or 2 or 3 - ``<<<[123]>>>`` +- Answer allows color or colour - ``<<<colou?r>>>`` +- Answer allows yes or "yes" - ``<<<"yes", ""yes"">>>`` +- Answer allows hat or cat - ``<<<[hc]at>>>`` +- Answer allows i==0 or i == 0 - ``<<<i\s*==\s*0>>>`` +- Answer must only contain digits - ``<<<\d+>>>`` +- Answer must only contain non-digits - ``<<<\D+>>>`` +- Answer requires word character - ``<<<\w>>>`` +- Answer requires non-word character - ``<<<\W>>>`` +- Answer allows several answers (Place1 or Place2) - ``<<<"Place1", "Place2">>>`` +- Answer requires case sensitivity - ``<<>>`` or ``<<>>`` 3. Click **Grading** in the navigation pane and complete the following fields: .. image:: /img/guides/assessment_fitb_grading.png :alt: Grading - - **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0).
+- **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Show Expected Answer** - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. +- **Show Expected Answer** - Toggle to show students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - -4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. - -5. Click **Metadata** in the left navigation pane and complete the following fields: +- **Define Number of Attempts** - enable to set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - .. 
image:: /img/guides/assessment_metadata.png - :alt: Metadata +- **Show Rationale to Students** - Toggle this option to display answer explanations to students. The rationale will appear after they submit their answer and whenever they view the assignment after marking it as complete. Control when students see the rationale by selecting: **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**. - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. +- **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. -6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list. +4. **(Optional)** Complete **Metadata**. - .. image:: /img/guides/assessment_files.png - :alt: Files +5. **(Optional)** Complete **Files**. -7. Click **Create** to complete the process. \ No newline at end of file +6. Click **Create** to complete the process. 
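Since fill-in-the-blank matching follows Java regular expression standards, simple answer patterns of the kind described above (optional characters, character classes, whitespace quantifiers) can be sanity-checked locally before being placed inside the `<<< >>>` chevrons. The sketch below uses Python's ``re`` module purely as an illustration; its behavior agrees with Java's for these basic constructs, and the patterns are assumed examples rather than values taken verbatim from the page:

```python
import re

# Assumed example patterns, paired with answers they should accept.
# These are checked with re.fullmatch so the entire answer must match,
# mirroring how a blank is graded against its pattern.
patterns = {
    r"colou?r": ["color", "colour"],     # optional character
    r"[hc]at": ["hat", "cat"],           # character class
    r"i\s*==\s*0": ["i==0", "i == 0"],   # optional whitespace
    r"\d+": ["42", "2024"],              # digits only
}

for pattern, answers in patterns.items():
    for answer in answers:
        assert re.fullmatch(pattern, answer), (pattern, answer)

# A non-matching answer is rejected the same way.
assert re.fullmatch(r"[hc]at", "bat") is None
```

Note that constructs beyond this subset (for example possessive quantifiers) differ between engines, so the final authority is Java's `java.util.regex` behavior.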
diff --git a/source/instructors/authoring/assessments/free-text.rst b/source/instructors/authoring/assessments/free-text.rst index 5f8db790..d23d9795 100644 --- a/source/instructors/authoring/assessments/free-text.rst +++ b/source/instructors/authoring/assessments/free-text.rst @@ -5,91 +5,66 @@ Free Text ========= -Free text assessments allow students to answer questions in their own words. Because Free Text assessments allow for LaTeX formatting, this type of assessment is recommended for math assessments. Teachers are then able to review and manually grade their answers. +Free text assessments allow students to answer questions in their own words. This type of assessment is recommended for math assessments since it allows for LaTeX formatting. Teachers are then able to review and manually grade their answers. -Assessment Auto-Generation -++++++++++++++++++++++++++ - -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Free Text assessment: - -1. Select **Free Text** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - -If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. +Assessment Auto-Generation +++++++++++++++++++++++++++ -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. 
+Assessments can be auto-generated using the text on the current guide page as context. For more information about generating assessments, see the :ref:`AI assessment generation ` page. Assessment Manual Creation ++++++++++++++++++++++++++ -Follow these steps to set up a free text assessment manually: - -1. On the **General** page, enter the following information: - - .. image:: /img/guides/assessment_free_general.png - :alt: General +Follow these steps to set up a free text assessment manually. For more information on **General**, **Metadata** (optional), and **Files** (optional), see :ref:`Assessments `. - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. - - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. - - - **Instruction** - Enter the instructions in markdown to be shown to the students. +1. Complete **General**. 2. Click **Grading** in the navigation pane and complete the following fields: .. image:: /img/guides/assessment_free_grading.png :alt: Grading - - **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers they correctly answer. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points, they will be subtracted from the total score of the assessment. +- **Points** - Enter the score for correctly answering the question. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Preview Type** - Choose the input (plaintext or markdown) to be provided by the student.
LaTex is also supported and is useful when students need to enter mathematical formulas in their answers. The following options are available: +- **Partial Points** - Toggle to enable a percentage of total points to be awarded based on the percentage of correct answers. Once you enable this, you can add rubric items. Rubric items are negative points that will be subtracted from the total score. - - **Plaintext** - Students enter ordinary text with no markdown formatting; there is no preview window. - - **Plaintext + LaTeX** - Students enter plaintext with no markdown formatting but offers support for LaTeX commands. A preview window is available where students can see the rendered LaTeX output. - - **Markdown + LaTeX** - Students enter markdown that also offers support for LaTex commands. A preview window is available where students can see the rendered markdown with LaTeX output. +- **Preview Type** - Choose the format for student input (plaintext, markdown, or LaTeX). LaTeX is useful when students need to enter mathematical formulas in their answers. The following options are available: - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - - **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. 
This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed. +.. list-table:: + :widths: 20 80 + :header-rows: 1 -3. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. + * - Format Type + - Description + * - **Plaintext** + - Students enter ordinary text with no markdown formatting; there is no preview window. + * - **Plaintext + LaTeX** + - Students enter plaintext with no markdown formatting but offers support for LaTeX commands. A preview window is available where students can see the rendered LaTeX output. + * - **Markdown + LaTeX** + - Students enter markdown that also offers support for LaTeX commands. A preview window is available where students can see the rendered markdown with LaTeX output. -4. Click **Metadata** in the left navigation pane and complete the following fields: +- **Define Number of Attempts** - Enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. + +- **Show Rationale to Students** - Toggle to display the rationale to students. This guidance information will be shown after they have submitted their answer and whenever they view the assignment after marking it as completed. You can control the display by selecting **Never**, **After x attempts**, **If score is ≥ x% of total**, or **Always**. - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata +- **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when viewing the student's project. It can also be shown to students after submission or when they revisit the assignment after marking it as completed. 
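As a purely illustrative sample (not from the Codio docs), a student answer typed in the **Markdown + LaTeX** preview type might look like the following, with inline math in `$...$` and display math in `$$...$$`:

```latex
The roots of $ax^2 + bx + c = 0$ are given by the quadratic formula:

$$x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$$

There are **two real roots** when the discriminant satisfies $b^2 - 4ac > 0$.
```

In this mode the preview window renders both the markdown emphasis and the LaTeX formulas as the student types.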
-   - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment.
-   - **Learning Objectives** specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*.
-   - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values.
+3. **(Optional)** Complete **Metadata**.
 
-5. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment. The files are then included in the **Additional content** list.
+4. **(Optional)** Complete **Files**.
 
-   .. image:: /img/guides/assessment_files.png
-      :alt: Files
+5. Click **Create** to complete the process.
 
-6. Click **Create** to complete the process.
 
-Grading free text assessments
+Grading Free Text Assessments
 -----------------------------
 
 To review and grade answers given by students in a free text assessment, follow these steps:
 
-1. Select the assignment to view the list of all assessments in the assignment for the student.
+1. Select the assignment.
+
+2. Find the student you want to grade and click the number under **Points** to view all assessments in the assignment.
 
    .. image:: /img/guides/freetext-grading.png
       :alt: Free Text Grading
@@ -99,36 +74,41 @@ To review and grade answers given by students in a free text assessment, follow
 
    .. image:: /img/guides/freetexticon.png
       :alt: Free Text Assessments Icon
 
-2. Click any line to view the question and the answer submitted by the student.
+3. Click any line to view the question and the answer submitted by the student.
 
-3. In the **Points** for answer field, perform one of the following depending on whether **Allow Partial Points** was enabled or disabled for the question:
+4. In the **Points** for answer field, perform one of the following depending on whether **Allow Partial Points** was enabled or disabled for the question:
 
    - If **Allow Partial Points** was disabled, click **Correct** or **Incorrect**:
 
     .. image:: /img/guides/notpartial.png
       :alt: Allow Partial Points Disabled
+      :width: 450px
 
-   - If **Allow Partial Points** was enabled, select the points to give for the answer up to the maximum points:
-
+   - If **Allow Partial Points** was enabled, assign points for the answer up to the maximum allowed. You can also add rubric items and specify the point value for each:
+
     .. image:: /img/guides/partial.png
      :alt: Allow Partial Points Enabled
+      :width: 450px
+
+
+5. In the **Comments** field, enter any information about the grade, which can be viewed by the student when the grade is released, and then click **Submit Comment**.
+
-4. In the **Comments** field, enter any information (in markdown + LaTeX) about the grade, which can be viewed by the student when the grade is released, and then click **Submit Comment**.
 
+Navigate Student Assessments
+----------------------------
-Navigate student assessments
-.............................
 
-You can navigate through student assessments using the left (**<**) and right (**>**) arrow buttons at the top of the **Assessments grading** dialog.
+You can navigate through student assessments using the left (**<**) and right (**>**) arrow buttons at the bottom of the **Assessments grading** dialog.
 
 .. image:: /img/guides/freetext_navigate.png
   :alt: Navigating Assessments
 
-View graded free text assessments
-.................................
+View Graded Free Text Assessments
+---------------------------------
 
 You can view the points given and the Correct column checked for all free text assessments that have been graded.
 
 .. image:: /img/guides/freetextanswer.png
    :alt: View Graded Assessment
 
-Free text assessments that are automatically graded as correct
-..............................................................
-You can do this with :ref:`Free Text Autograde `.
\ No newline at end of file
+Free Text Assessments That Are Automatically Graded as Correct
+--------------------------------------------------------------
+Free text assessments can be automatically graded as correct. To learn more, see :ref:`Free Text Autograde `.
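The rubric arithmetic described under **Allow Partial Points** above — rubric items are negative points subtracted from the assessment's total score — can be sketched as follows. This is a minimal illustration only: Codio performs this calculation for you in the grading dialog, and the function name and point values here are hypothetical.

```python
def rubric_score(total_points, deductions):
    """Hypothetical sketch of partial-points grading: subtract each
    rubric deduction (given as a positive magnitude) from the total
    score, never going below zero."""
    score = total_points - sum(deductions)
    return max(score, 0)

# A 10-point question with two rubric items applied (-2 and -3 points):
print(rubric_score(10, [2, 3]))  # -> 5
```

The clamp at zero reflects that a graded answer cannot be worth negative points, however many rubric items apply.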