This part of the training will prepare you to judge FIRST LEGO League team robots and their design processes. Robot Design Judges need to be familiar with the Robot Game rules, missions, and updates that are regularly posted throughout the season. Teams and Judges can find the Challenge on the Central Valley Robotics website, and the full Robot Game Challenge document is also included in the Robot Design Judge Prep Pack. 

Layout of the Session 

Note: As with last year, there will not be Robot Game tables in the Robot Design Judging rooms. Please adjust your approach to the judging session and the questions you ask accordingly. The Robot Performance Award already exists to reward the robot (the product), and the new ranking formula in Judging Lite now equally incorporates the team's robot performance scores. This is meant to emphasize that the Robot Design Awards and Judging are about the Robot Design Process and the team's ability to articulate that process.

The Robot Design Judging session is a design review!  The teams have spent the season designing their robot to accomplish this year's challenge.  The judging session lasts 10 minutes, during which teams will demonstrate their design process, programming, strategies, and technical knowledge.

Paraphrased Advice from the Global Judge Advisor, Skip Gridley:

Purpose: The Robot Design Judging Session is not meant to judge the quality of the robot alone (that is the role of the Robot Game and the Robot Performance Award); it is meant to discover from the students their understanding of how they designed their robot for the challenge, the processes they went through, the challenges they had to tackle, and how they used the engineering design process.

The Format: Another way to think of this new judging session format is as an engineering design review: FIRST LEGO League / the Judges have released a request for proposal for a robot that performs the robot challenge, and the team is telling the "interested parties" about their design and how they designed their robot to meet the challenge. You then judge, as the "interested party," based on what you hear reported from the students. You are judging the same principles, just in a different format.

Judge Perspective: "If I cannot see it perform, how will I know it met the goals?" Review the team based on rubric criteria, not on performance. As a judge, you might see the robot run once on a table that might not be up to Robot Game specifications anyway. In "real life," the judge/interested party would ask, "What did you do to verify that design?" This is the question we want you to ask the students.

Example questions that follow this new perspective: What analysis or data collection did you do to verify the quality of your design? Did you use trial and error? How many times did it work out of ten? What was your strategy for selecting your design process? What is your average score on the table? 

More Question Topics: Ask for facts about the robot. You could also ask about interesting, fun, or challenging parts of the robot.

As always, from the team's code we want to see good programming practice in general, well-documented code, and evidence that the students really understand what's going on. Ask them how repeatable their programming is. Always discover whether a team uses sensors, mechanical hard stops, or just hand/eye alignment.

Out of your judging pair, one person often leads the discussion with the team by asking the team members questions, while the other judge sits back to write, fill out, and circle scores on the team's rubric. Ideally, the questions the first judge is asking are helping the second judge fill out the team's rubric. Keep in mind that this whole session is question and answer, so new teams and shy members might need to be encouraged a bit more in order for you to get the answers you need to fill out your rubric. Some teams might have a presentation prepared for you; let them perform it before asking further questions. That said, do not criticize teams for not having prepared a formal presentation. If a team is not prepared to start a presentation, go ahead and start asking questions to learn about what their robot does. Is a team's strategy to accomplish a mission eyeballing their targets or utilizing sensors? Why did the students design their robot a certain way? The students' ability to describe their mechanical design, strategy, and programming is an important takeaway. 

If you have time at the end of your judging session, feel free to ask more general questions to get a bigger picture of the team you are judging. Questions like "Did everyone on your team have a specific role, and if so, can you tell us about them?" or "What does everyone think the coolest part of the robot is?" can give you insight into what each member is proud of on the robot. Usually by the end of the session, the judge who was sitting down writing on the rubric has time to stand up as well and ask any final questions; this can be a chance to cover any points from the rubric that were not addressed earlier in the judging session. 

Evaluating Rubrics 

The Robot Design Rubric is new this year! Even experienced judges need to take a look at the new rubric and check out the new layout. The rubrics have changed to further accommodate the removal of Robot Game tables. Rubric content has also changed to focus on design instead of observation, and to reflect new programming software. There is now only one comment section at the end, so please make sure that what you write will help a team improve for the next level of competition. 

As a judge, you'll evaluate team performance against each rubric criterion.  From Beginning to Exemplary, each rubric area specifies the team behavior you should see at that level.  You can mark "ND" for "Not Demonstrated" if the team doesn't provide any information to help you assess what they did. Circle on the rubric the level at which you evaluate the team. 

Robot Design Rubric

When writing feedback for teams, recognize that teams work hard, and treat them with respect. Compliment the children's achievements with vocabulary appropriate for the subject matter.  Make sure you positively communicate opportunities to improve.  Keep all your comments constructive. When taking notes, discussing teams, and completing rubrics, be specific and share examples or evidence that support why the team achieved a particular evaluation.  Specific comments are more helpful to teams than general impressions. When you first meet in your judging pair, determine a system for keeping detailed notes, completing rubrics, and making comments in between teams so that you'll stay on time while giving quality feedback.

CVR will provide all of the rubrics you will need on an event day, so please do not print your own! You must use the rubrics provided by CVR on an event day.

Final Notes

Now that you have completed this training, be sure to review the Robot Design Judging Prep Pack, which includes the Robot Design Judging Primer.  

The Prep Pack contains a more in-depth discussion of each rubric criterion and provides additional tips for judges. Thank you for taking the time to complete Central Valley Robotics' Judge Training and for volunteering to become a Judge!

Please complete your certification to be a Judge by logging into MyCVR and taking the certification test.