diff --git a/source/img/guides/assessment_gb_rubric.png b/source/img/guides/assessment_gb_rubric.png index c767878e..e77b722b 100644 Binary files a/source/img/guides/assessment_gb_rubric.png and b/source/img/guides/assessment_gb_rubric.png differ diff --git a/source/img/guides/assessment_mc_exec.png b/source/img/guides/assessment_mc_exec.png index 859db428..94f9a203 100644 Binary files a/source/img/guides/assessment_mc_exec.png and b/source/img/guides/assessment_mc_exec.png differ diff --git a/source/img/guides/assessment_mc_grading.png b/source/img/guides/assessment_mc_grading.png index 683d158e..87d8a597 100644 Binary files a/source/img/guides/assessment_mc_grading.png and b/source/img/guides/assessment_mc_grading.png differ diff --git a/source/img/guides/parsonspuzzlegrading.png b/source/img/guides/parsonspuzzlegrading.png new file mode 100644 index 00000000..939e5aa6 Binary files /dev/null and b/source/img/guides/parsonspuzzlegrading.png differ diff --git a/source/instructors/authoring/assessments/advanced-code-test.rst b/source/instructors/authoring/assessments/advanced-code-test.rst index 82cbcc5e..0542c95e 100644 --- a/source/instructors/authoring/assessments/advanced-code-test.rst +++ b/source/instructors/authoring/assessments/advanced-code-test.rst @@ -10,10 +10,10 @@ The advanced code test assessment type allows you to easily implement unit tests To ensure that your test scripts run securely and to prevent student access to your testing files or executables, place them in the **.guides/secure** folder. Create a folder if one does not already exist. This folder is not available to students in their assignments and they cannot access it from the command line. Only teachers with editing privileges have access to the **.guides/secure** folder. -.. Note:: If your assignment will contain multiple assessments, Code files and Test Cases for individual assessments should be placed in separate folders to avoid compiling all files. +.. 
note:: If your assignment will contain multiple assessments, Code files and Test Cases for individual assessments should be placed in separate folders to avoid compiling all files. -Complete each section to set up your advanced code test. For more information on **General**, **Metadata** and **Files** see :ref:`Assessments `. +Complete each section to set up your advanced code test. For more information on **General**, **Metadata** (Optional) and **Files** (Optional) see :ref:`Assessments `. 1. Complete **General**. @@ -27,7 +27,7 @@ Complete each section to set up your advanced code test. For more information on - **Python**: `pycodestyle`_ or `UnitTest`_ - **JavaScript**: `JSHint and JSLint`_ -.. Note:: For more information, see the :ref:`test-types` section or click any test name above to navigate directly to that section. +.. note:: For more information, see the :ref:`test-types` section or click any test name above to navigate directly to that section. - **Language Assessment Subtype** - Click the drop-down and choose a subtype for the selected language type, if applicable. @@ -35,22 +35,18 @@ Complete each section to set up your advanced code test. For more information on 3. Click **Grading** in the top navigation pane and complete the following fields: - .. image:: /img/guides/ACTGradingScreen.png - :alt: Grading - :width: 500px - - - **Points** - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. - - **Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage correctly answered. Note that it's not enough to turn partial points on; the advanced code test has to be written to handle partial points. See :ref:`Partial Points ` for more information. - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. 
If disabled, the student can make unlimited attempts. - - **Show Rationale to Students** - Toggle to display the rationale for the answer to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. - -4. **(Optional)** Complete **Metadata**. +.. image:: /img/guides/ACTGradingScreen.png + :alt: Grading + :width: 500px -5. **(Optional)** Complete **Files**. +- **Points** - The score given to the student if the code test passes. You can enter any positive numeric value. If this assessment should not be graded, enter 0 points. +- **Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage correctly answered. Note that it's not enough to turn partial points on; the advanced code test has to be written to handle partial points. See :ref:`Partial Points ` for more information. +- **Define Number of Attempts** - Enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. +- **Show Rationale to Students** - Toggle to display the rationale for the answer to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. 
You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always**. +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. +- **Use Maximum Score** - Toggle to enable assessment final score to be the highest score attained of all runs. -6. Click **Create** to complete the process. +4. Click **Create** to complete the process. See a Working Example diff --git a/source/instructors/authoring/assessments/fill-in-blanks.rst b/source/instructors/authoring/assessments/fill-in-blanks.rst index 2b522de4..6602aaff 100644 --- a/source/instructors/authoring/assessments/fill-in-blanks.rst +++ b/source/instructors/authoring/assessments/fill-in-blanks.rst @@ -95,8 +95,4 @@ Examples of regular expressions supported for blank fields: - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. -4. **(Optional)** Complete **Metadata**. - -5. **(Optional)** Complete **Files**. - -6. Click **Create** to complete the process. +4. Click **Create** to complete the process. diff --git a/source/instructors/authoring/assessments/free-text.rst b/source/instructors/authoring/assessments/free-text.rst index d23d9795..abd6516b 100644 --- a/source/instructors/authoring/assessments/free-text.rst +++ b/source/instructors/authoring/assessments/free-text.rst @@ -51,11 +51,7 @@ Follow these steps to set up a free text assessment manually. For more informati - **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when viewing the student's project. 
It can also be shown to students after submission or when they revisit the assignment after marking it as completed. -3. **(Optional)** Complete **Metadata**. - -4. **(Optional)** Complete **Files**. - -5. Click **Create** to complete the process. +3. Click **Create** to complete the process. Grading Free Text Assessments diff --git a/source/instructors/authoring/assessments/grade-book.rst b/source/instructors/authoring/assessments/grade-book.rst index d3eec04d..6015c4e0 100644 --- a/source/instructors/authoring/assessments/grade-book.rst +++ b/source/instructors/authoring/assessments/grade-book.rst @@ -5,51 +5,39 @@ Grade Book ========== -A Grade Book assessment may be used to manually grade assignments, it will not appear in the student guides. When the instructor opens a student assignment, the Grade Book is available for rubric based grading. Comments, points, and rubric items are visible to the student when the assessment is graded and the grades are released. Grade Book assessments do not show in the total assessment count on student/teacher dashboard. +A Grade Book assessment may be used to manually grade assignments; it will not appear in the student guides. When the instructor opens a student assignment, the Grade Book is available for rubric-based grading. Comments, points, and rubric items are visible to the student when the assessment is graded and the grades are released. Grade Book assessments do not show in the total assessment count on the student/teacher dashboard. For more information on **General**, **Metadata** (optional) and **Files** (optional) see :ref:`Assessments `. -1. On the **General** page, enter a short **Name** that describes the test. This name is displayed in the teacher dashboard and when the teacher is viewing the student work. The name should reflect the challenge and thereby be clear when reviewing. 
- Leaving the name visible will make it easier to locate when grading and we do not recommend toggling the **Show Name** setting to disable it. - - .. image:: /img/guides/assessment_gradebook_general.png - :alt: General +1. Complete **General**. Keep the **Show Name** setting enabled to make submissions easier to locate when grading. 2. Click **Grading** in the navigation pane and complete the following fields: - .. image:: /img/guides/assessment_gradebook_grading.png - :alt: Grading +.. image:: /img/guides/assessment_gradebook_grading.png + :alt: Grading - - **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). +- **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Allow Partial Points** - Partial points must be toggled on for this type of assessment. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points, they will be subtracted from the total score of the assessment. +- **Allow Partial Points** - Partial points must be toggled on for this type of assessment. Once you toggle this on, you will be able to add rubric items. Rubric items are negative points; they will be subtracted from the total score of the assessment. .. image:: /img/guides/assessment_gb_rubric.png :alt: Rubric + :width: 350px -3. Click **Metadata** in the left navigation pane and complete the following fields: - - .. image:: /img/guides/assessment_metadata.png - :alt: Metadata - - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). 
For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. - -4. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment. The files are then included in the **Additional content** list. +3. Click **Create** to complete the process. - .. image:: /img/guides/assessment_files.png - :alt: Files -5. Click **Save** to complete the process. +Grading a Grade Book Assessment +------------------------------- -The instructor must :ref:`opening a student assignment ` to complete the Grade Book grading. Locate the Grade Book assessment in the student file and click **Grade**. +The instructor must :ref:`open a student assignment ` to complete the Grade Book grading. Locate the Grade Book assessment in the student file and click **Grade**. - .. image:: /img/guides/assessment_gb_opengb.png - :alt: Grade book assessment +.. image:: /img/guides/assessment_gb_opengb.png + :alt: Grade book assessment Click on the 0 or 1 to allocate initial points for overall correctness and then use the other rubric items to subtract points for missing items. There is also a **Points adjust** field if you wish to adjust total points upwards. - .. image:: /img/guides/assessment_gb_grade.png - :alt: Grade book assessment \ No newline at end of file +.. 
image:: /img/guides/assessment_gb_grade.png + :alt: Grade book assessment + :width: 700px \ No newline at end of file diff --git a/source/instructors/authoring/assessments/math-assessments.rst b/source/instructors/authoring/assessments/math-assessments.rst index fdb1b6a0..c654bac0 100644 --- a/source/instructors/authoring/assessments/math-assessments.rst +++ b/source/instructors/authoring/assessments/math-assessments.rst @@ -10,16 +10,16 @@ Math Assessments Codio allows you to set and grade math questions for any type and level of mathematics using the **Free Text** assessment. We only offer manual grading of mathematical expressions or proofs. -Manually graded assessments using free text -******************************************* +Manually Graded Assessments Using Free Text +------------------------------------------- -To create a manually graded math question, you can use the **Free text** assessment type. This allows the students to enter expressions or even full proofs and worked answers using Latex. For more information about Latex, please :ref:`click here `. +To create a manually graded math question, you can use the **Free text** assessment type. This allows the students to enter expressions or even full proofs and worked answers using LaTeX. For more information about LaTeX, see :ref:`LaTeX `. -You can enter Latex in the **Question** and **Rationale** fields. +You can enter LaTeX in the **Question** and **Rationale** fields. -You should also set the **Preview type** drop down to either **Plaintext + Latex** or **Markdown + Latex**. Both of these ensure that the student sees a preview pane beneath their answer entry fully rendered in markdown and/or Latex. Please - :ref:`click here ` to review the free text assessment. +You should also set the **Preview type** dropdown to either **Plaintext + LaTeX** or **Markdown + LaTeX**. Both of these ensure that the student sees a preview pane beneath their answer entry fully rendered in markdown and/or LaTeX. 
See :ref:`Free text assessments ` for more information. -Multiple choice -*************** +Multiple Choice +--------------- -You can also use the multiple choice assessment type to create answers containing properly rendered Latex expressions. +You can also use the multiple choice assessment type to create answers containing properly rendered LaTeX expressions. See :ref:`Multiple Choice ` for more information. diff --git a/source/instructors/authoring/assessments/multiple-choice.rst b/source/instructors/authoring/assessments/multiple-choice.rst index 47df8422..4b4f3dd2 100644 --- a/source/instructors/authoring/assessments/multiple-choice.rst +++ b/source/instructors/authoring/assessments/multiple-choice.rst @@ -5,94 +5,59 @@ Multiple Choice =============== -Multiple choice type assessments provide a question and then single or multiple response options. +Multiple choice assessments allow you to present a question with single or multiple response options. These questions can be created manually or generated using AI. Assessment Auto-Generation -++++++++++++++++++++++++++ +-------------------------- -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Multiple Choice assessment: - -1. Select **Multiple Choice** from the Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - - - If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. 
When you are satisfied with the result, press **Apply** and then **Create**. - - -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. Assessment Manual Creation -++++++++++++++++++++++++++ +-------------------------- -Follow these steps to set up multiple choice assessments manually: +Follow these steps to set up multiple choice assessments manually. For more information on **Metadata** (optional) and **Files** (optional), see :ref:`Assessments `. 1. On the **General** page, enter the following information: - .. image:: /img/guides/assessment_mc_general.png - :alt: General - - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. +.. image:: /img/guides/assessment_mc_general.png + :alt: General - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. +- **Name** - Enter a short name that describes the test. This name appears in the teacher dashboard for easy reference. To hide the name from the student's view of the challenge, toggle **Show Name** to disable it. - - **Question** - Enter the question instruction that is shown to the student. +- **Question** - Enter the question instruction that is shown to the student. 2. Click **Execution** in the navigation pane and complete the following information: - .. image:: /img/guides/assessment_mc_exec.png - :alt: Execution +.. image:: /img/guides/assessment_mc_exec.png + :alt: Execution + :width: 450px - - **Shuffle Answers** - Toggle to randomize the order of the answers so each student sees the answers in a different order. - - **Multiple Response** - Toggle to enable a user to select more than one answer. 
- - **Answers** - Mark the correct answer(s) to the question. You can add as many options as needed. For each answer, toggle to enable as correct answer (for multiple responses), or click the radio button for the correct single response. - - **Ordering** - Use the **Up** and **Down** arrows to change the order in which the answers are presented. +- **Shuffle Answers** - Toggle to randomize the order of the answers so each student sees the answers in a different order. +- **Multiple Response** - Toggle to enable a user to select more than one answer. +- **Answers** - Mark the correct answer(s) to the question. You can add as many options as needed. For each answer, toggle it on to mark it as a correct answer (for multiple responses), or click the radio button for the correct single response. +- **Ordering** - Use the **Up** and **Down** arrows to change the order in which the answers are presented. 3. Click **Grading** in the navigation pane and complete the following fields: - .. image:: /img/guides/assessment_mc_grading.png - :alt: Grading - - - **Correct Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - - **Incorrect Points** is the score to be deducted if the student makes an incorrect selection. Typically, this value will be 0 but you can assign any positive numeric value if you wish to penalize guessing. If this assessment is to be ungraded, set '0' points - - - **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers they correctly answer. +.. image:: /img/guides/assessment_mc_grading.png + :alt: Grading + :width: 450px - - **Show Expected Answer** - Toggle to show the students the expected output when they have submitted an answer for the question. To suppress expected output, disable the setting. 
- - - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts. - - - **Show Rationale to Students** - Toggle to display the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always** +- **Correct Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0). - - **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. - - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs. +- **Incorrect Points** - The score to be deducted if the student makes an incorrect selection. Typically, this value will be 0, but you can assign any positive numeric value if you wish to penalize guessing. If this assessment is to be ungraded, enter zero (0). -4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up. +- **Allow Partial Points** - Toggle to enable a percentage of total points to be given based on the percentage of answers they correctly answer. -5. Click **Metadata** in the left navigation pane and complete the following fields: +- **Use Maximum Score** - Toggle to enable assessment final score to be the highest score attained of all runs. - .. 
image:: /img/guides/assessment_metadata.png - :alt: Metadata +- **Show Expected Answer** - Choose when students can view the expected output: **never**, **when grades are released**, or **always**. - - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment. - - **Learning Objectives** The objectives are the specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then the Learning Objectives could be *SWBAT follow the flow of recursive execution*. - - **Tags** - The **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values. +- **Define Number of Attempts** - Toggle to enable attempt limits and specify the maximum number of attempts. If disabled, students can make unlimited attempts. -6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list. +- **Show Rationale to Students** - Toggle to display the rationale for the answer. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, **When grades are released**, or **Always**. - .. image:: /img/guides/assessment_files.png - :alt: Files +- **Rationale** - Enter guidance for the assessment. This is always visible to the teacher when the project is opened in the course or when opening the student's project. -7. 
Click **Create** to complete the process. +4. Click **Create** to complete the process. diff --git a/source/instructors/authoring/assessments/parsons-puzzle.rst b/source/instructors/authoring/assessments/parsons-puzzle.rst index 805e2909..e5b124f3 100644 --- a/source/instructors/authoring/assessments/parsons-puzzle.rst +++ b/source/instructors/authoring/assessments/parsons-puzzle.rst @@ -8,112 +8,90 @@ Parsons Puzzle Parson’s problems are available in Codio as Parsons Puzzles. Parson’s Puzzles are formative assessments that ask students to arrange blocks of scrambled code, allowing them to focus on the purpose and flow of the code (often including a new pattern or feature) instead of syntax. Codio uses `js-parsons `_ for Parson's Puzzles. Assessment Auto-Generation -++++++++++++++++++++++++++ +-------------------------- -Assessments can be auto-generated using the text on the current guides page as context. Follow the steps below to auto-generate a Parsons Puzzle assessment: +Assessments can be auto-generated using the text on the current guide page as context. For more information, see the :ref:`AI assessment generation ` page. -1. Select **Parson's Puzzle** assessment from Assessments list. - -2. Press the **Generate** button at bottom right corner. - - .. image:: /img/guides/generate-assessment-button.png - :alt: Generate assessment button - -3. The Generation Prompt will open, press **Generate Using AI** to preview the generated assessment. - - .. image:: /img/guides/assessment-generation-prompt.png - :alt: Assessment Generation Prompt - -If you are not satisfied with the result, select **Regenerate** to create a new version of the assessment. You can provide additional guidance in the **Generation Prompt** field. For example, *create assessment based on the first paragraph with 2 correct answers.* - -4. When you are satisfied with the result, press **Apply** and then **Create**. 
- -More information about generating assessments may be found on the :ref:`AI assessment generation ` page. Assessment Manual Creation -++++++++++++++++++++++++++ - - -Complete the following steps to set up a **Line Based Grader** Parsons Puzzle assessment. The **Line Based Grader** assessment treats student answers as correct if and only if they match the order and indentation found in **Initial Values**. For incorrect answers, it highlights the lines that were not ordered or indented properly. +-------------------------- -1. On the **General** page, enter the following information: - .. image:: /img/guides/assessment_general.png - :alt: General +Complete the following steps to set up a **Line Based Grader** Parsons Puzzle assessment. The **Line Based Grader** assessment treats student answers as correct if and only if they match the order and indentation found in **Initial Values**. For incorrect answers, it highlights the lines that were not ordered or indented properly. For more information on **General**, **Metadata** (optional) and **Files** (optional) see :ref:`Assessments `. - - **Name** - Enter a short name that describes the test. This name is displayed in the teacher dashboard so the name should reflect the challenge and thereby be clear when reviewing. - - If you want to hide the name in the challenge text the student sees, toggle the **Show Name** setting to disable it. - - - **Instruction** - Enter the instructions in markdown to be shown to the students. +1. Complete **General**. 2. Click **Execution** in the navigation pane and complete the following information: - .. image:: /img/guides/assessment_parsons_exec.png - :alt: Execution - - - **Code to Become Blocks** - Enter code blocks that make up the initial state of the puzzle for the students. - - **Code to Become Distractor Blocks** - Enter code blocks that serve as distractions. - - **Max Distractors** - Enter the maximum number of distractors allowed. 
- - **Grader** - Choose the appropriate grader for the puzzle from the drop-down list.
- - **Show Feedback** - Select to show feedback to student and highlight error in the puzzle. Deselect to hide feedback and not show highlight error in the puzzle.
- - **Require Dragging** - If you enter **Code to Become Distractor Blocks**, **Require Dragging** will automatically turn on. Without distractor blocks, you can decide whether or not you want students to drag blocks to a separate area to compose their solution.
- - **Disable Indentation** - If you do not want to require indention, check the **Disable Indentation** box.
- - **Indent Size** - Each indention defaults to 50 pixels.
+.. image:: /img/guides/assessment_parsons_exec.png
+   :alt: Execution
+
+.. list-table::
+   :widths: 30 70
+   :header-rows: 1
+
+   * - Option
+     - Description
+   * - **Code to Become Blocks**
+     - Enter the code solution in the correct order using proper indentation.
+   * - **Code to Become Distractor Blocks**
+     - Enter lines of code that serve as distractions.
+   * - **Max Distractors**
+     - Enter the maximum number of distractors allowed.
+   * - **Grader**
+     - Choose the appropriate grader for the puzzle from the drop-down list. For more information, see :ref:`Grader Options <grader-options>`.
+   * - **Show Feedback**
+     - Select to show feedback to students and highlight errors in the puzzle. Deselect to hide feedback and not highlight errors in the puzzle.
+   * - **Require Dragging**
+     - If you enter **Code to Become Distractor Blocks**, **Require Dragging** will automatically turn on. Without distractor blocks, you can decide whether you want students to drag blocks to a separate area to compose their solution.
+   * - **Disable Indentation**
+     - If you do not want to require indentation, check the **Disable Indentation** box.
+   * - **Indent Size**
+     - Each indentation defaults to 50 pixels.
 
 3. Click **Grading** in the navigation pane and complete the following fields:
 
-   .. image:: /img/guides/Grading-new-feature1.png
-      :alt: Grading
-
-   - **Points** - Enter the score if the student selects the correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0).
-
-   - **Define Number of Attempts** - enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts.
-   - **Show Rationale to Students** - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total** or **Always**
-
-   - **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed.
-   - **Use maximum score** - Toggle to enable assessment final score to be the highest score attained of all runs.
-
-4. Click on the **Parameters** tab if you wish to edit/change **Parameterized Assessments** (deprecated) using a script. See :ref:`Parameterized Assessments ` for more information. New parameterized assessments can no longer be set up.
+.. image:: /img/guides/parsonspuzzlegrading.png
+   :alt: Grading
+   :width: 450px
 
-5. Click **Metadata** in the left navigation pane and complete the following fields:
+- **Points** - Enter the score awarded for a correct answer. You can choose any positive numeric value. If this is an ungraded assessment, enter zero (0).
 
-   .. image:: /img/guides/assessment_metadata.png
-      :alt: Metadata
+- **Define Number of Attempts** - Enable to allow and set the number of attempts students can make for this assessment. If disabled, the student can make unlimited attempts.
+- **Show Rationale to Students** - Toggle to display the answer, and the rationale for the answer, to the student. This guidance information will be shown to students after they have submitted their answer and any time they view the assignment after marking it as completed. You can set when to show this by selecting from **Never**, **After x attempts**, **If score is greater than or equal to a % of the total**, or **Always**.
 
-   - **Bloom's Level** - Click the drop-down and choose the level of Bloom's Taxonomy: https://cft.vanderbilt.edu/guides-sub-pages/blooms-taxonomy/ for the current assessment.
-   - **Learning Objectives** specific educational goal of the current assessment. Typically, objectives begin with Students Will Be Able To (SWBAT). For example, if an assessment asks the student to predict the output of a recursive code segment, then its Learning Objectives could be *SWBAT follow the flow of recursive execution*.
-   - **Tags** - By default, **Content** and **Programming Language** tags are provided and required. To add another tag, click **Add Tag** and enter the name and values.
+- **Rationale** - Enter guidance for the assessment. This is visible to the teacher when the project is opened in the course or when opening the student's project. This guidance information can also be shown to students after they have submitted their answer and when they reload the assignment after marking it as completed.
+- **Use Maximum Score** - Toggle to make the final assessment score the highest score attained across all runs.
 
-6. Click **Files** in the left navigation pane and check the check boxes for additional external files to be included with the assessment when adding it to an assessment library. The files are then included in the **Additional content** list.
-
-   .. image:: /img/guides/assessment_files.png
-      :alt: Files
+4. Click **Create** to complete the process.
 
-7. Click **Create** to complete the process.
+.. _grader-options:
 
 Grader Options
 --------------
 
-**VariableCheckGrader** - Executes the code in the order submitted by the student and checks variable values afterwards.
+VariableCheckGrader
+~~~~~~~~~~~~~~~~~~~
+
+Executes the code in the order submitted by the student and checks variable values afterwards.
 
 .. raw:: html
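
The ``vartests`` option takes an array of variable test objects. A minimal sketch, assuming the tests are supplied as JSON-like data (only the option names come from this page; the surrounding structure and all values are illustrative, not the exact format Codio stores):

```python
# Hypothetical VariableCheckGrader configuration sketch.
# Documented option names: initcode, code, message, variables, variable, expected.
vartests = [
    {
        # Prepended before the learner's solution code
        "initcode": "x = 0",
        # Appended after the learner's solution code
        "code": "result = x + y",
        # Required: textual description of the test, shown to the learner
        "message": "After your code runs, x should be 3 and y should be 4.",
        # Object mapping each tested variable name to its expected value
        "variables": {"x": 3, "y": 4},
    },
    {
        "message": "result should equal 7.",
        # Alternative single-variable form
        "variable": "result",
        "expected": 7,
    },
]

# Every test object must carry the required "message" property.
assert all("message" in t for t in vartests)
```
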
-Expected and supported options:
+**Expected and supported options**:
 
 - ``vartests`` (required) array of variable test objects
 
-  Each variable test object can/must have the following properties:
+Each variable test object supports the following properties:
 
-  - ``initcode`` - code that will be prepended before the learner solution code
-  - ``code`` - code that will be appended after the learner solution code
-  - ``message`` (required) - a textual description of the test, shown to learner
+- ``initcode`` - code that will be prepended before the learner solution code
+- ``code`` - code that will be appended after the learner solution code
+- ``message`` (required) - a textual description of the test, shown to the learner
 
-Properties specifying what is tested:
+**Properties specifying what is tested**:
 
 - ``variables`` - an object with properties for each variable name to be tested; the value of the property is the expected value
 
@@ -122,33 +100,46 @@ Properties specifying what is tested:
 
 - ``variable`` - a variable name to be tested
 - ``expected`` - expected value of the variable after code execution
 
-**TurtleGrader** - for exercises that draw turtle graphics in Python. Grading is based on comparing the commands executed by the model and student turtle. If the ``executable_code`` option is also specified, the code on each line of that option will be executed instead of the code in the student constructed lines.
-
-.. Note:: Student code should use the variable ``myTurtle`` for commands to control the turtle in order for the grading to work.
 
+UnitTestGrader
+~~~~~~~~~~~~~~
+
+Executes student code and Skulpt unit tests. This grader is for Python problems where students create functions. Similar to traditional unit tests on code, this grader leverages a unit test framework where you set asserts, meaning this grader checks the functionality of student code.
 
 .. raw:: html
 
-
+
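
To illustrate the unit-test style the UnitTestGrader relies on, here is a plain-Python sketch. Skulpt runs a browser-side subset of Python, and the ``average`` function stands in for a hypothetical student solution, so treat this as an approximation rather than the exact Codio setup:

```python
import unittest

# Hypothetical student solution the puzzle blocks would assemble.
def average(nums):
    return sum(nums) / len(nums)

# Grader-side asserts that check the functionality of the student code.
class TestAverage(unittest.TestCase):
    def test_ints(self):
        self.assertEqual(average([1, 2, 3]), 2)

    def test_floats(self):
        self.assertAlmostEqual(average([1, 2]), 1.5)

# Run the checks the way a grader might, collecting the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAverage)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

The grader awards points based on whether the assertions pass, so the asserts should pin down exactly the behavior the puzzle is meant to teach.
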
-  Required options:
+LanguageTranslationGrader
+~~~~~~~~~~~~~~~~~~~~~~~~~
 
-- ``turtleModelCode`` - The code constructing the model drawing. The turtle is initialized to modelTurtle variable, so your code should use that variable. The following options are available:
-
-  - ``turtlePenDown`` - A boolean specifying whether or not the pen should be put down initially for the student constructed code
-  - ``turtleModelCanvas`` - ID of the canvas DOM element where the model solution will be drawn. Defaults to `modelCanvas`.
-  - ``turtleStudentCanvas`` - ID of the canvas DOM element where student turtle will draw. Defaults to `studentCanvas`.
+Code translating grader where Java or pseudocode blocks map to Python in the background. Selecting the language allows the Parson's problem to check for correct indentation and syntax.
 
+.. raw:: html
+
+
 
-**UnitTestGrader** - Executes student code and Skulpt unit tests. This grader is for Python problems where students create functions. Similar to traditional unit tests on code, this grader leverages a unit test framework where you set asserts - meaning this grader checks the functionality of student code.
-
-.. raw:: html
-
-
+TurtleGrader
+~~~~~~~~~~~~
 
-**LanguageTranslationGrader** - Code translating grader where Java or psuedocode blocks map to Python in the background. Selecting the language allows the Parson's problem to check for correct indentation and syntax.
+This is for exercises that draw turtle graphics in Python. Grading is based on comparing the commands executed by the model and student turtle. If the ``executable_code`` option is also specified, the code on each line of that option will be executed instead of the code in the student-constructed lines.
+
+.. note:: Student code should use the variable ``myTurtle`` for commands to control the turtle in order for the grading to work.
 
 .. raw:: html
 
-
+
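
Assembled from the required options documented for this grader, a configuration might look like the following sketch. Only the option names and their documented defaults come from this page; the dict wrapper and the drawing itself are assumptions:

```python
# Hypothetical TurtleGrader options sketch.
turtle_grader_options = {
    # Code constructing the model drawing; the grader initializes the
    # modelTurtle variable, so the model code must use it.
    "turtleModelCode": (
        "modelTurtle.forward(100)\n"
        "modelTurtle.right(90)\n"
        "modelTurtle.forward(100)"
    ),
    "turtlePenDown": True,                   # pen starts down for student code
    "turtleModelCanvas": "modelCanvas",      # documented default canvas ID
    "turtleStudentCanvas": "studentCanvas",  # documented default canvas ID
}

# The student-facing puzzle blocks would control myTurtle instead, e.g.:
student_line = "myTurtle.forward(100)"
```

Grading then compares the commands executed by the model turtle against those executed by the student's ``myTurtle``.
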
+
+**Required options**:
+
+- ``turtleModelCode`` - The code constructing the model drawing. The turtle is initialized to the ``modelTurtle`` variable, so your code should use that variable. The following options are available:
+
+  - ``turtlePenDown`` - A boolean specifying whether or not the pen should be put down initially for the student-constructed code
+  - ``turtleModelCanvas`` - ID of the canvas DOM element where the model solution will be drawn. Defaults to ``modelCanvas``.
+  - ``turtleStudentCanvas`` - ID of the canvas DOM element where the student turtle will draw. Defaults to ``studentCanvas``.
+
 Sample Starter Pack
 -------------------
 
diff --git a/source/instructors/authoring/assessments/partial-points.rst b/source/instructors/authoring/assessments/partial-points.rst
index 1bd16281..31bb8629 100644
--- a/source/instructors/authoring/assessments/partial-points.rst
+++ b/source/instructors/authoring/assessments/partial-points.rst
@@ -9,12 +9,12 @@ You can award partial points for student assessments in your testing scripts. Us
 
-Autograding enhancements for partial points
+Autograding Enhancements for Partial Points
 -------------------------------------------
 
-To provide you with more robust auto-grade scripts, you can send back feedback in different formats HTML, Markdown, or plaintext and a URL is passed as an environment variable ```CODIO_PARTIAL_POINTS_V2_URL```. These variables allow POST and GET requests with the following parameters:
+To provide you with more robust auto-grade scripts, you can send back feedback in different formats (HTML, Markdown, or plain text), and a URL is passed as the environment variable ``CODIO_PARTIAL_POINTS_V2_URL``. This URL accepts POST and GET requests with the following parameters:
 
-- **Score** (```CODIO_PARTIAL_POINTS_V2_URL```) - 0-100 percent for assessment, should be a percentage of total points possible.
+- **Score** (``CODIO_PARTIAL_POINTS_V2_URL``) - a percentage of total points possible, between 0 and 100.
 - **Feedback** - text - this is limited to 1 Mb
 - **Format** - html, md, or txt (default)
 
@@ -22,8 +22,8 @@ If you use Python, you can also use the function ``send_partial_v2``. Also, thro
 As a general rule, your script should always exit with ``0``; otherwise, the grade will be 0. If the student receives partial points, the results will display an orange percent sign rather than a green check mark or red x.
 
-Example auto-grading scripts with partial points
-................................................
+Example Auto-Grading Scripts With Partial Points
+------------------------------------------------
 
 The following examples in Python and JavaScript show how you can write your scripts in any language.
 
@@ -113,7 +113,7 @@ The Python script parses the student's file and then assigns points based on spe
 
     exit(0 if res else 1)
 
-Example grading script for partial points
+Example Grading Script for Partial Points
 -----------------------------------------
 
 These are examples of the older method of partial points reporting.
 
@@ -145,4 +145,5 @@ These are examples of the older method of partial points reporting.
 
     main()
 
-The score you award should be any value between 0 and the maximum score you specified when defining the assessment in the Codio authoring editor.
\ No newline at end of file
+.. important::
+   The score you award should be any value between 0 and the maximum score you specified when defining the assessment in the Codio authoring editor.
\ No newline at end of file
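
As a rough sketch of how a script could talk to the v2 endpoint: the ``CODIO_PARTIAL_POINTS_V2_URL`` variable and the Score, Feedback, and Format parameters are documented above, but the lowercase parameter names and helper functions below are assumptions, so verify them against your own setup (in Python you can use the documented ``send_partial_v2`` helper instead):

```python
import os
import urllib.parse
import urllib.request

def build_payload(score, feedback, fmt="txt"):
    """Assemble the documented parameters.

    ASSUMPTION: the lowercase keys are illustrative; the docs name the
    parameters Score (0-100), Feedback (text, up to 1 Mb), and
    Format (html, md, or txt).
    """
    return {"score": int(score), "feedback": feedback, "format": fmt}

def report_partial(score, feedback, fmt="txt"):
    """POST a partial-points result to the v2 URL, if one is set."""
    url = os.environ.get("CODIO_PARTIAL_POINTS_V2_URL")
    if not url:
        return  # not running inside a Codio assessment; do nothing
    data = urllib.parse.urlencode(build_payload(score, feedback, fmt)).encode()
    urllib.request.urlopen(urllib.request.Request(url, data=data))

# Example: report 75% with Markdown feedback (a no-op outside Codio).
report_partial(75, "**3 of 4** tests passed.", "md")
```

Remember the rule stated above: the script itself should still exit with ``0``, or the grade will be 0.
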